Free Statistics

Author: *The author of this computation has been verified*
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Tue, 20 Nov 2012 15:57:25 -0500
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2012/Nov/20/t13534451195oymf38k0jq9b9y.htm/, Retrieved Mon, 29 Apr 2024 21:33:25 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=191291, Retrieved Mon, 29 Apr 2024 21:33:25 +0000
Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 74
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-       [Multiple Regression] [WS7 blog 1] [2012-11-20 20:57:25] [8dfbd7005bbb2cc4a26b500e2e7611ce] [Current]
Dataseries X:
2,45	2,44	2,46	2,45	2,45	2,43	2,42	2,43	2,46	2,47	2,48	2,49
0,55	0,55	0,55	0,55	0,56	0,55	0,55	0,56	0,55	0,56	0,56	0,56
1,41	1,37	1,4	1,38	1,44	1,43	1,45	1,47	1,47	1,44	1,45	1,45
1,1	1,1	1,1	1,09	1,09	1,08	1,08	1,08	1,08	1,08	1,08	1,06
1,65	1,65	1,64	1,64	1,64	1,63	1,63	1,63	1,63	1,63	1,62	1,61
1,65	1,65	1,65	1,65	1,64	1,64	1,64	1,64	1,64	1,63	1,63	1,62
4,15	4,14	4,13	4,13	4,12	4,11	4,11	4,1	4,1	4,1	4,08	4,06
5,64	5,64	5,63	5,61	5,59	5,58	5,58	5,57	5,56	5,55	5,51	5,44
6,64	6,61	6,63	6,62	6,59	6,57	6,52	6,51	6,51	6,48	6,49	6,47
1,6	1,6	1,6	1,59	1,58	1,58	1,58	1,57	1,58	1,57	1,57	1,56
0,98	0,98	0,98	0,98	0,97	0,97	0,97	0,97	0,97	0,97	0,97	0,96
1,03	1,03	1,02	1,02	1,02	1,02	1,02	1,03	1,03	1,02	1,02	1,02
4,72	4,6	4,62	4,63	4,65	4,61	4,65	4,59	4,65	4,65	4,63	4,48
3,17	3,16	3,16	3,15	3,19	3,2	3,2	3,2	3,22	3,21	3,2	3,15
7,3	7,29	7,27	7,26	7,2	7,19	7,18	7,15	7,14	7,12	7,11	7,08
3,25	3,23	3,23	3,25	3,25	3,27	3,27	3,29	3,27	3,28	3,29	3,24
0,65	0,65	0,65	0,65	0,65	0,65	0,66	0,65	0,66	0,65	0,65	0,65
4,1	4,1	4,11	4,12	4,15	4,13	4,12	4,12	4,18	4,19	4,2	4,19
6,58	6,57	6,56	6,43	6,45	6,41	6,33	6,39	6,39	6,39	6,39	6,4
15,21	15,19	15,17	15,19	15,18	15,15	15,2	15,06	14,97	15,13	15,05	15,16
10,99	11	10,9	10,99	11,04	11,03	10,99	11	10,87	10,88	10,91	10,92
8,19	8,3	8,18	8,24	8,3	8,34	8,3	8,27	8,22	8,22	8,22	8,12
5,89	5,88	5,89	5,91	5,89	5,87	5,87	5,84	5,83	5,83	5,83	5,8
15	15,18	15,18	15,18	15,05	15	15,13	15,08	14,98	15,1	14,95	15,12
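Note: the values above are tab-separated and use decimal commas, exactly as entered by the user. As a minimal sketch for offline use (the file name 'dataseries_x.txt' is a placeholder, not part of the original computation), such a block can be read into a numeric matrix with:

y <- as.matrix(read.table('dataseries_x.txt', header = FALSE, sep = '\t', dec = ','))  # dec = ',' converts the decimal commas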




Summary of computational transaction

Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 7 seconds
R Server: 'George Udny Yule' @ yule.wessa.net

Source: https://freestatistics.org/blog/index.php?pk=191291&T=0


Multiple Linear Regression - Estimated Regression Equation
2005/12[t] = + 0.00472891862713576 + 0.129933274148402`2005/11`[t] + 1.38134444423706`2005/10`[t] -0.689395298957021`2005/09`[t] + 1.06575281674565`2005/08`[t] -1.11502202842671`2005/07`[t] + 0.980661518292317`2005/06`[t] -0.107463644420801`2005/05`[t] -0.961060792363887`2005/04`[t] -1.19812820806339`2005/03`[t] + 2.09852327243365`2005/02`[t] -0.586487789996359`2005/01`[t] -0.000641674979176878t + e[t]

Source: https://freestatistics.org/blog/index.php?pk=191291&T=1
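The equation above is assembled from the coefficient estimates of an ordinary least squares fit. A minimal sketch of the fit itself, following the archived R code further down (the object x is assumed to already hold the dependent column `2005/12` first, the eleven other monthly columns, and the trend column t):

df <- as.data.frame(x)
mylm <- lm(df)           # lm() on a data frame regresses the first column on all remaining columns
mysum <- summary(mylm)   # mysum$coefficients provides the estimates shown in the tables below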







Multiple Linear Regression - Ordinary Least Squares

Variable | Parameter | S.D. | T-STAT (H0: parameter = 0) | 2-tail p-value | 1-tail p-value
(Intercept) | 0.00472891862713576 | 0.012293 | 0.3847 | 0.707801 | 0.353901
`2005/11` | 0.129933274148402 | 0.430036 | 0.3021 | 0.768178 | 0.384089
`2005/10` | 1.38134444423706 | 0.382726 | 3.6092 | 0.004103 | 0.002052
`2005/09` | -0.689395298957021 | 0.363395 | -1.8971 | 0.084364 | 0.042182
`2005/08` | 1.06575281674565 | 0.580856 | 1.8348 | 0.093699 | 0.04685
`2005/07` | -1.11502202842671 | 0.634499 | -1.7573 | 0.106621 | 0.05331
`2005/06` | 0.980661518292317 | 0.626425 | 1.5655 | 0.145764 | 0.072882
`2005/05` | -0.107463644420801 | 0.375197 | -0.2864 | 0.77988 | 0.38994
`2005/04` | -0.961060792363887 | 0.470669 | -2.0419 | 0.065888 | 0.032944
`2005/03` | -1.19812820806339 | 0.737364 | -1.6249 | 0.13247 | 0.066235
`2005/02` | 2.09852327243365 | 0.819942 | 2.5594 | 0.026549 | 0.013275
`2005/01` | -0.586487789996359 | 0.194042 | -3.0225 | 0.011604 | 0.005802
t | -0.000641674979176878 | 0.001173 | -0.547 | 0.595313 | 0.297656

Source: https://freestatistics.org/blog/index.php?pk=191291&T=2
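How these columns are obtained from the fitted model, per the archived module code (mysum is the summary object from the sketch above):

coefs <- mysum$coefficients   # columns: estimate, standard error, t statistic, 2-tail p-value
coefs[, 4]                    # 2-tail p-value, as reported in the table
coefs[, 4] / 2                # 1-tail p-value: the module simply halves the 2-tail value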







Multiple Linear Regression - Regression Statistics

Multiple R: 0.99999205700978
R-squared: 0.999984114082651
Adjusted R-squared: 0.999966783990997
F-TEST (value): 57702.1826573012
F-TEST (DF numerator): 12
F-TEST (DF denominator): 11
p-value: 0

Multiple Linear Regression - Residual Statistics

Residual Standard Deviation: 0.0242650456291831
Sum Squared Residuals: 0.0064767168332497

Source: https://freestatistics.org/blog/index.php?pk=191291&T=3
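These summary statistics come straight from the summary object, per the archived module code; the reported p-value of 0 simply means that 1 - pf(...) is indistinguishable from zero at double precision, given F of roughly 57702 with 12 and 11 degrees of freedom:

sqrt(mysum$r.squared)                                                   # Multiple R
mysum$r.squared                                                         # R-squared
mysum$adj.r.squared                                                     # Adjusted R-squared
mysum$fstatistic                                                        # F value, DF numerator, DF denominator
1 - pf(mysum$fstatistic[1], mysum$fstatistic[2], mysum$fstatistic[3])   # F-test p-value
mysum$sigma                                                             # Residual Standard Deviation
sum(mysum$resid^2)                                                      # Sum Squared Residuals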







Multiple Linear Regression - Actuals, Interpolation, and Residuals

Time or Index | Actuals | Interpolation (Forecast) | Residuals (Prediction Error)
1 | 2.45 | 2.46426526863613 | -0.0142652686361256
2 | 0.55 | 0.56542919313168 | -0.01542919313168
3 | 1.41 | 1.42190685348082 | -0.0119068534808228
4 | 1.1 | 1.12643127277518 | -0.0264312727751839
5 | 1.65 | 1.64025258062521 | 0.00974741937479462
6 | 1.65 | 1.64962190245401 | 0.000378097545989533
7 | 4.15 | 4.12214950736056 | 0.0278504926394451
8 | 5.64 | 5.63039497176511 | 0.00960502823488586
9 | 6.64 | 6.64873417208688 | -0.00873417208687876
10 | 1.6 | 1.60332316234763 | -0.00332316234763146
11 | 0.98 | 0.98045203267049 | -0.000452032670489602
12 | 1.03 | 1.00627362215215 | 0.0237263778478514
13 | 4.72 | 4.71477688236994 | 0.00522311763005661
14 | 3.17 | 3.15293269214808 | 0.0170673078519161
15 | 7.3 | 7.28439815650713 | 0.0156018434928701
16 | 3.25 | 3.23752863711519 | 0.0124713628848054
17 | 0.65 | 0.64314386759921 | 0.00685613240078949
18 | 4.1 | 4.09735327983454 | 0.00264672016545869
19 | 6.58 | 6.581539817959 | -0.00153981795900251
20 | 15.21 | 15.1987196059724 | 0.0112803940275865
21 | 10.99 | 10.9713845058381 | 0.0186154941619101
22 | 8.19 | 8.22411632046864 | -0.0341163204686408
23 | 5.89 | 5.92146044236279 | -0.0314604423627941
24 | 15 | 15.0134112523391 | -0.0134112523391101

Source: https://freestatistics.org/blog/index.php?pk=191291&T=4
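Per the archived module code, each row is derived directly from the fitted model: the interpolation (forecast) is the actual value minus the residual, and the residual is the prediction error itself:

actuals <- x[, 1]                   # first column of the design matrix = dependent variable
interp  <- actuals - mysum$resid    # Interpolation (Forecast) column
res     <- mysum$resid              # Residuals (Prediction Error) column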



Parameters (Session):
par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ;
Parameters (R input):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = Linear Trend ;
R code (references can be found in the software module):
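# How the R-input parameters above steer the code below (based on its branches):
# par1 = 1                                 -> column 1 of the transposed data block becomes the dependent variable
# par2 = 'Do not include Seasonal Dummies' -> neither the monthly nor the quarterly dummy block is added
# par3 = 'Linear Trend'                    -> a trend column t = 1, ..., n is appended to the design matrix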
library(lattice)
library(lmtest)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
par1 <- as.numeric(par1)
x <- t(y)
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
if (par3 == 'First Differences'){
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1,])
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
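# The Goldfeld-Quandt block below only runs when the sample size exceeds n25 = 25 observations;
# with n = 24 in this computation it is skipped, which is why no Goldfeld-Quandt tables or
# breakpoint plot appear in the output above.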
if (n > n25) {
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
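# The blocks below render the charts (test0.png through test9.png) that accompany this output on
# the original page: actuals vs. interpolation, residuals, residual histogram, density plot,
# normal Q-Q plot, lag plot with lowess and regression line, ACF/PACF, and the lm diagnostic panel.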
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
qqline(mysum$resid)
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
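# The remainder assembles the tables shown above using helpers (table.start, table.row.start,
# table.element, table.save, ...) loaded from the 'createtable' workspace file; these are part of
# the FreeStatistics/Wessa server framework, not base R functions.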
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, mysum$coefficients[i,1], sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,mysum$coefficients[i,1])
a<-table.element(a, round(mysum$coefficients[i,2],6))
a<-table.element(a, round(mysum$coefficients[i,3],4))
a<-table.element(a, round(mysum$coefficients[i,4],6))
a<-table.element(a, round(mysum$coefficients[i,4]/2,6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, sqrt(mysum$r.squared))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, mysum$r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, mysum$adj.r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, mysum$fstatistic[1])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[2])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[3])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, 1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, mysum$sigma)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, sum(myerror*myerror))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,x[i])
a<-table.element(a,x[i]-mysum$resid[i])
a<-table.element(a,mysum$resid[i])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,gqarr[mypoint-kp3+1,1])
a<-table.element(a,gqarr[mypoint-kp3+1,2])
a<-table.element(a,gqarr[mypoint-kp3+1,3])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,numsignificant1)
a<-table.element(a,numsignificant1/numgqtests)
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,numsignificant5)
a<-table.element(a,numsignificant5/numgqtests)
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,numsignificant10)
a<-table.element(a,numsignificant10/numgqtests)
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}