Free Statistics


Author: The author of this computation has been verified.
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Tue, 13 Dec 2016 22:48:12 +0100
Cite this page as follows:
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2016/Dec/13/t1481665751k2wf11gq1wmtakw.htm/, Retrieved Sat, 04 May 2024 20:59:32 +0000
or
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=299247, Retrieved Sat, 04 May 2024 20:59:32 +0000

Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 68
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-       [Multiple Regression] [] [2016-12-13 21:48:12] [b2e25925e4919b0d6985405fcb461c0d] [Current]
Dataseries X:
4020
3540
3430
4200
3360
4440
4390
4940
3940
4560
4850
5070
6210
5200
4860
5160
5530
8830
4410
4850
8960
4620
5120
4520
8870
9470
6590
3970
3770
5500
6580
5280
8640
5510
5690
7620
4010
3570
4040
3600
4000
3070
3230
4060
3480
3750
3990
3100
3950
3010
3160
2960
2750
3590
3060
2970
3590
3450
2930
2660
3540
3160
2680
2900
2920
2900
3150
3150
3120
3720
3360
2740
3250
2700
2610
2410
2590
2630
2650
2600
3060
2650
2700
2620
2630
2850
2680
2430
2550
2570
2520
2500
2550
2790
2770
2460
2800
2770
2450
2370
2540
3470
2690
4110
3840
2860
3540
3370




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 6 seconds
R Server: Big Analytics Cloud Computing Center








Multiple Linear Regression - Estimated Regression Equation
HPC[t] = 5623.54 + 233.758 M1[t] - 70.2199 M2[t] - 458.642 M3[t] - 705.954 M4[t] - 674.376 M5[t] + 132.757 M6[t] - 316.777 M7[t] - 88.5324 M8[t] + 688.601 M9[t] - 88.7106 M10[t] + 57.3113 M11[t] - 30.4664 t + e[t]
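The equation contains eleven monthly dummies (M1 to M11, with December acting as the reference month) and a linear trend t. For readers who want to refit a model of this form, the lines below are a minimal R sketch, not the archived script (that script is reproduced in full at the bottom of this page); it assumes the endogenous series sits in a numeric vector named hpc whose first observation falls in January.

# Minimal sketch (assumption: the series is in a numeric vector 'hpc' starting in January)
n   <- length(hpc)
m   <- ((seq_len(n) - 1) %% 12) + 1                   # month index 1..12
dum <- sapply(1:11, function(j) as.numeric(m == j))   # dummies M1..M11; month 12 is the baseline
colnames(dum) <- paste0("M", 1:11)
dat <- data.frame(HPC = hpc, dum, t = seq_len(n))     # add a linear trend
fit <- lm(HPC ~ ., data = dat)                        # OLS fit of the equation above
summary(fit)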








Multiple Linear Regression - Ordinary Least Squares
Variable     Parameter  S.D.   T-STAT (H0: parameter = 0)  2-tail p-value  1-tail p-value
(Intercept)  +5624      473.1  +1.1890e+01                 1.667e-20       8.337e-21
M1           +233.8     585.5  +3.9920e-01                 0.6906          0.3453
M2           -70.22     585.2  -1.2000e-01                 0.9047          0.4524
M3           -458.6     585    -7.8400e-01                 0.435           0.2175
M4           -706       584.8  -1.2070e+00                 0.2303          0.1152
M5           -674.4     584.6  -1.1540e+00                 0.2516          0.1258
M6           +132.8     584.4  +2.2720e-01                 0.8208          0.4104
M7           -316.8     584.3  -5.4220e-01                 0.589           0.2945
M8           -88.53     584.2  -1.5160e-01                 0.8799          0.4399
M9           +688.6     584.1  +1.1790e+00                 0.2414          0.1207
M10          -88.71     584    -1.5190e-01                 0.8796          0.4398
M11          +57.31     584    +9.8140e-02                 0.922           0.461
t            -30.47     3.847  -7.9190e+00                 4.456e-12       2.228e-12
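A note on the last column: the 1-tail p-values are simply the 2-tail p-values divided by two, which is exactly how the archived script computes them (mysum$coefficients[i,4]/2). With a fitted object such as fit from the sketch above, the same table can be recovered as follows:

coefs <- summary(fit)$coefficients        # Estimate, Std. Error, t value, Pr(>|t|)
cbind(coefs, "1-tail p-value" = coefs[, 4] / 2)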








Multiple Linear Regression - Regression Statistics
Multiple R: 0.6559
R-squared: 0.4303
Adjusted R-squared: 0.3583
F-TEST (value): 5.979
F-TEST (DF numerator): 12
F-TEST (DF denominator): 95
p-value: 1.151e-07

Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 1239
Sum Squared Residuals: 1.458e+08
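All of the quantities in these two tables come straight from summary() of the fitted model. The sketch below (again assuming the illustrative object fit from the earlier sketch) shows where each figure comes from; the archived script obtains the F-test p-value in the equivalent form 1 - pf(...).

s <- summary(fit)
sqrt(s$r.squared)                     # Multiple R
s$r.squared                           # R-squared
s$adj.r.squared                       # Adjusted R-squared
s$fstatistic                          # F value with numerator and denominator DF
pf(s$fstatistic[1], s$fstatistic[2], s$fstatistic[3], lower.tail = FALSE)   # F-test p-value
s$sigma                               # Residual Standard Deviation
sum(residuals(fit)^2)                 # Sum Squared Residuals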








Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index  Actuals  Interpolation (Forecast)  Residuals (Prediction Error)
1  4020  5827  -1807
2  3540  5492  -1952
3  3430  5074  -1644
4  4200  4796  -595.7
5  3360  4797  -1437
6  4440  5574  -1134
7  4390  5094  -703.5
8  4940  5291  -351.3
9  3940  6038  -2098
10  4560  5230  -670.2
11  4850  5346  -495.7
12  5070  5258  -187.9
13  6210  5461  748.8
14  5200  5127  73.21
15  4860  4708  152.1
16  5160  4430  729.9
17  5530  4431  1099
18  8830  5208  3622
19  4410  4728  -317.9
20  4850  4926  -75.68
21  8960  5672  3288
22  4620  4865  -244.6
23  5120  4980  139.9
24  4520  4892  -372.3
25  8870  5096  3774
26  9470  4761  4709
27  6590  4342  2248
28  3970  4065  -94.53
29  3770  4066  -295.6
30  5500  4842  657.7
31  6580  4362  2218
32  5280  4560  719.9
33  8640  5307  3333
34  5510  4499  1011
35  5690  4615  1075
36  7620  4527  3093
37  4010  4730  -720
38  3570  4396  -825.6
39  4040  3977  63.29
40  3600  3699  -98.93
41  4000  3700  300
42  3070  4477  -1407
43  3230  3997  -766.7
44  4060  4194  -134.5
45  3480  4941  -1461
46  3750  4133  -383.4
47  3990  4249  -258.9
48  3100  4161  -1061
49  3950  4364  -414.4
50  3010  4030  -1020
51  3160  3611  -451.1
52  2960  3333  -373.3
53  2750  3334  -584.4
54  3590  4111  -521.1
55  3060  3631  -571.1
56  2970  3829  -858.9
57  3590  4576  -985.6
58  3450  3768  -317.8
59  2930  3883  -953.3
60  2660  3796  -1136
61  3540  3999  -458.8
62  3160  3664  -504.4
63  2680  3246  -565.5
64  2900  2968  -67.74
65  2920  2969  -48.85
66  2900  3746  -845.5
67  3150  3266  -115.5
68  3150  3463  -313.3
69  3120  4210  -1090
70  3720  3402  317.8
71  3360  3518  -157.7
72  2740  3430  -690
73  3250  3633  -383.2
74  2700  3299  -598.8
75  2610  2880  -269.9
76  2410  2602  -192.1
77  2590  2603  -13.25
78  2630  3380  -749.9
79  2650  2900  -249.9
80  2600  3098  -497.7
81  3060  3844  -784.4
82  2650  3037  -386.6
83  2700  3152  -452.1
84  2620  3064  -444.4
85  2630  3268  -637.7
86  2850  2933  -83.21
87  2680  2514  165.7
88  2430  2237  193.5
89  2550  2238  312.3
90  2570  3014  -444.3
91  2520  2534  -14.32
92  2500  2732  -232.1
93  2550  3479  -928.8
94  2790  2671  119
95  2770  2787  -16.54
96  2460  2699  -238.8
97  2800  2902  -102.1
98  2770  2568  202.4
99  2450  2149  301.3
100  2370  1871  499.1
101  2540  1872  667.9
102  3470  2649  821.3
103  2690  2169  521.3
104  4110  2366  1744
105  3840  3113  726.8
106  2860  2305  554.6
107  3540  2421  1119
108  3370  2333  1037








Goldfeld-Quandt test for Heteroskedasticity
p-values per Alternative Hypothesis:
breakpoint index  greater  2-sided  less
16 0.03386 0.06773 0.9661
17 0.01488 0.02975 0.9851
18 0.2446 0.4891 0.7554
19 0.324 0.6479 0.676
20 0.3545 0.709 0.6455
21 0.6844 0.6312 0.3156
22 0.6852 0.6295 0.3148
23 0.6508 0.6984 0.3492
24 0.6789 0.6422 0.3211
25 0.7786 0.4428 0.2214
26 0.9663 0.06742 0.03371
27 0.97 0.06002 0.03001
28 0.9937 0.01256 0.006278
29 0.9976 0.004748 0.002374
30 0.9993 0.001382 0.0006911
31 0.9996 0.0007452 0.0003726
32 0.9996 0.0008363 0.0004181
33 1 4.802e-06 2.401e-06
34 1 3.489e-06 1.744e-06
35 1 1.467e-06 7.336e-07
36 1 8.004e-14 4.002e-14
37 1 5.005e-16 2.503e-16
38 1 2.324e-17 1.162e-17
39 1 5.678e-18 2.839e-18
40 1 4.761e-18 2.381e-18
41 1 1.619e-18 8.097e-19
42 1 2.624e-19 1.312e-19
43 1 4.088e-19 2.044e-19
44 1 4.314e-19 2.157e-19
45 1 1.603e-19 8.017e-20
46 1 4.277e-19 2.139e-19
47 1 5.067e-19 2.534e-19
48 1 9.34e-19 4.67e-19
49 1 8.639e-19 4.32e-19
50 1 2.339e-18 1.169e-18
51 1 6.281e-18 3.141e-18
52 1 2.181e-17 1.09e-17
53 1 9.217e-17 4.609e-17
54 1 1.505e-16 7.526e-17
55 1 5.459e-16 2.729e-16
56 1 1.955e-15 9.773e-16
57 1 4.265e-15 2.132e-15
58 1 1.264e-14 6.319e-15
59 1 4.395e-14 2.197e-14
60 1 1.387e-13 6.937e-14
61 1 2.541e-13 1.27e-13
62 1 7.541e-13 3.77e-13
63 1 2.924e-12 1.462e-12
64 1 6.679e-12 3.339e-12
65 1 1.84e-11 9.198e-12
66 1 6.589e-11 3.294e-11
67 1 1.166e-10 5.832e-11
68 1 4.059e-10 2.03e-10
69 1 1.306e-09 6.532e-10
70 1 3.125e-10 1.563e-10
71 1 4.683e-10 2.342e-10
72 1 1.573e-09 7.862e-10
73 1 1.402e-09 7.008e-10
74 1 5.144e-09 2.572e-09
75 1 1.465e-08 7.323e-09
76 1 4.158e-08 2.079e-08
77 1 1.016e-07 5.081e-08
78 1 3.89e-07 1.945e-07
79 1 8.542e-07 4.271e-07
80 1 2.941e-06 1.471e-06
81 1 8.236e-06 4.118e-06
82 1 2.276e-05 1.138e-05
83 1 7.719e-05 3.86e-05
84 0.9999 0.0002133 0.0001067
85 0.9997 0.0006288 0.0003144
86 0.9993 0.001361 0.0006803
87 0.999 0.002039 0.00102
88 0.9982 0.003501 0.001751
89 0.9973 0.005336 0.002668
90 0.9906 0.01883 0.009417
91 0.9824 0.03525 0.01763
92 0.9723 0.05548 0.02774








Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity
Description             # significant tests  % significant tests  OK/NOK
1% type I error level   61                   0.7922               NOK
5% type I error level   65                   0.844156             NOK
10% type I error level  69                   0.896104             NOK
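The meta analysis counts how often the 2-sided Goldfeld-Quandt p-value from the table above falls below the 1%, 5% and 10% significance levels; NOK indicates that this share exceeds the nominal type I error rate, pointing towards heteroskedastic residuals. A compact sketch of that counting step, assuming the illustrative object fit from the earlier sketch and the lmtest package (the same gqtest call the archived script uses):

library(lmtest)
n  <- nobs(fit)
k  <- length(coef(fit))
bp <- (k + 3):(n - k - 3)                           # breakpoints examined above (16..92 here)
p2 <- sapply(bp, function(b)
  gqtest(fit, point = b, alternative = "two.sided")$p.value)
c("1%" = mean(p2 < 0.01), "5%" = mean(p2 < 0.05), "10%" = mean(p2 < 0.10))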








Ramsey RESET F-Test for powers (2 and 3) of fitted values
> reset_test_fitted
	RESET test
data:  mylm
RESET = 5.0202, df1 = 2, df2 = 93, p-value = 0.008503
Ramsey RESET F-Test for powers (2 and 3) of regressors
> reset_test_regressors
	RESET test
data:  mylm
RESET = 1.3138, df1 = 24, df2 = 71, p-value = 0.188
Ramsey RESET F-Test for powers (2 and 3) of principal components
> reset_test_principal_components
	RESET test
data:  mylm
RESET = 20.65, df1 = 2, df2 = 93, p-value = 3.792e-08
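The three tests above are produced by resettest() from the lmtest package, applied to powers 2 and 3 of, respectively, the fitted values, the regressors, and the first principal component of the regressors (see the R code at the bottom of this page). With a fitted object fit they read:

library(lmtest)
resettest(fit, power = 2:3, type = "fitted")      # powers of the fitted values
resettest(fit, power = 2:3, type = "regressor")   # powers of the regressors
resettest(fit, power = 2:3, type = "princomp")    # powers of the first principal component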








Variance Inflation Factors (Multicollinearity)
> vif
      M1       M2       M3       M4       M5       M6       M7       M8 
1.842962 1.841291 1.839779 1.838426 1.837232 1.836198 1.835323 1.834606 
      M9      M10      M11        t 
1.834049 1.833652 1.833413 1.012413 
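The variance inflation factors are computed with vif() from the car package, as in the archived script. With values of about 1.83 for the monthly dummies and 1.01 for the trend, multicollinearity is clearly not a practical concern here; common rules of thumb only start flagging VIFs above roughly 5 to 10. With a fitted object fit:

library(car)
vif(fit)   # one variance inflation factor per regressor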




Parameters (Session):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ; par4 = 0 ; par5 = 0 ;
Parameters (R input):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ; par4 = 0 ; par5 = 0 ;
R code (references can be found in the software module):
library(lattice)
library(lmtest)
library(car)
library(MASS)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
mywarning <- ''
par1 <- as.numeric(par1)
if(is.na(par1)) {
par1 <- 1
mywarning = 'Warning: you did not specify the column number of the endogenous series! The first column was selected by default.'
}
if (par4=='') par4 <- 0
par4 <- as.numeric(par4)
if (par5=='') par5 <- 0
par5 <- as.numeric(par5)
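# Data preparation: transpose the uploaded data block 'y' so that observations are in rows,
# drop rows with missing values, and move the endogenous column (par1) to the front.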
x <- na.omit(t(y))
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
if (par3 == 'First Differences'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'Seasonal Differences (s=12)'){
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'First and Seasonal Differences (s=12)'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
if(par4 > 0) {
x2 <- array(0, dim=c(n-par4,par4), dimnames=list(1:(n-par4), paste(colnames(x)[par1],'(t-',1:par4,')',sep='')))
for (i in 1:(n-par4)) {
for (j in 1:par4) {
x2[i,j] <- x[i+par4-j,par1]
}
}
x <- cbind(x[(par4+1):n,], x2)
n <- n - par4
}
if(par5 > 0) {
x2 <- array(0, dim=c(n-par5*12,par5), dimnames=list(1:(n-par5*12), paste(colnames(x)[par1],'(t-',1:par5,'s)',sep='')))
for (i in 1:(n-par5*12)) {
for (j in 1:par5) {
x2[i,j] <- x[i+par5*12-j*12,par1]
}
}
x <- cbind(x[(par5*12+1):n,], x2)
n <- n - par5*12
}
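# Seasonal dummies: add M1..M11 for monthly data (the 12th month is the omitted reference
# category) or Q1..Q3 for quarterly data.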
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
(k <- length(x[n,]))
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
print(x)
(k <- length(x[n,]))
head(x)
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
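# Goldfeld-Quandt test at every admissible breakpoint; the 2-sided p-values are later
# summarised in the meta analysis table (counts of significant tests at 1%, 5% and 10%).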
if (n > n25) {
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
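# Diagnostic plots: actuals vs. interpolation, residuals, histogram and density of the
# (studentized) residuals, QQ plot, residual lag plot, ACF/PACF, and standard lm diagnostics.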
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
sresid <- studres(mylm)
hist(sresid, freq=FALSE, main='Distribution of Studentized Residuals')
xfit<-seq(min(sresid),max(sresid),length=40)
yfit<-dnorm(xfit)
lines(xfit, yfit)
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqPlot(mylm, main='QQ Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
print(z)
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, signif(mysum$coefficients[i,1],6), sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, mywarning)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Multiple Linear Regression - Ordinary Least Squares', 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,formatC(signif(mysum$coefficients[i,1],5),format='g',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,2],5),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,3],4),format='e',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4],4),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4]/2,4),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a,formatC(signif(sqrt(mysum$r.squared),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$adj.r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a,formatC(signif(mysum$fstatistic[1],6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[2],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[3],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a,formatC(signif(1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a,formatC(signif(mysum$sigma,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a,formatC(signif(sum(myerror*myerror),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
myr <- as.numeric(mysum$resid)
myr
if(n < 200) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,formatC(signif(x[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(x[i]-mysum$resid[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$resid[i],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,1],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,2],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,3],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant1,6))
a<-table.element(a,formatC(signif(numsignificant1/numgqtests,6),format='g',flag=' '))
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant5,6))
a<-table.element(a,signif(numsignificant5/numgqtests,6))
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant10,6))
a<-table.element(a,signif(numsignificant10/numgqtests,6))
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}
}
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of fitted values',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_fitted <- resettest(mylm,power=2:3,type='fitted')
a<-table.element(a,paste('
',RC.texteval('reset_test_fitted'),'
',sep=''))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of regressors',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_regressors <- resettest(mylm,power=2:3,type='regressor')
a<-table.element(a,paste('
',RC.texteval('reset_test_regressors'),'
',sep=''))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of principal components',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_principal_components <- resettest(mylm,power=2:3,type='princomp')
a<-table.element(a,paste('
',RC.texteval('reset_test_principal_components'),'
',sep=''))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable8.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Variance Inflation Factors (Multicollinearity)',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
vif <- vif(mylm)
a<-table.element(a,paste('
',RC.texteval('vif'),'
',sep=''))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable9.tab')