Free Statistics


Author's title:
Author: (the author of this computation has been verified)
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Fri, 20 Nov 2009 14:14:00 -0700
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2009/Nov/20/t1258751734qgedl8pl1jatgcn.htm/, Retrieved Fri, 19 Apr 2024 05:10:00 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=58469, Retrieved Fri, 19 Apr 2024 05:10:00 +0000

Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 152
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-     [Multiple Regression] [Q1 The Seatbeltlaw] [2007-11-14 19:27:43] [8cd6641b921d30ebe00b648d1481bba0]
- RM D  [Multiple Regression] [Seatbelt] [2009-11-12 14:10:54] [b98453cac15ba1066b407e146608df68]
-    D    [Multiple Regression] [ws 7] [2009-11-19 17:53:21] [b5908418e3090fddbd22f5f0f774653d]
-    D        [Multiple Regression] [WS 7-4] [2009-11-20 21:14:00] [a53416c107f5e7e1e12bb9940270d09d] [Current]
Dataseries X:
8.3	98.6	8.2	8.7	9.3	9.3
8.5	96.5	8.3	8.2	8.7	9.3
8.6	95.9	8.5	8.3	8.2	8.7
8.5	103.7	8.6	8.5	8.3	8.2
8.2	103.1	8.5	8.6	8.5	8.3
8.1	103.7	8.2	8.5	8.6	8.5
7.9	112.1	8.1	8.2	8.5	8.6
8.6	86.9	7.9	8.1	8.2	8.5
8.7	95	8.6	7.9	8.1	8.2
8.7	111.8	8.7	8.6	7.9	8.1
8.5	108.8	8.7	8.7	8.6	7.9
8.4	109.3	8.5	8.7	8.7	8.6
8.5	101.4	8.4	8.5	8.7	8.7
8.7	100.5	8.5	8.4	8.5	8.7
8.7	100.7	8.7	8.5	8.4	8.5
8.6	113.5	8.7	8.7	8.5	8.4
8.5	106.1	8.6	8.7	8.7	8.5
8.3	111.6	8.5	8.6	8.7	8.7
8	114.9	8.3	8.5	8.6	8.7
8.2	88.6	8	8.3	8.5	8.6
8.1	99.5	8.2	8	8.3	8.5
8.1	115.1	8.1	8.2	8	8.3
8	118	8.1	8.1	8.2	8
7.9	111.4	8	8.1	8.1	8.2
7.9	107.3	7.9	8	8.1	8.1
8	105.3	7.9	7.9	8	8.1
8	105.3	8	7.9	7.9	8
7.9	117.9	8	8	7.9	7.9
8	110.2	7.9	8	8	7.9
7.7	112.4	8	7.9	8	8
7.2	117.5	7.7	8	7.9	8
7.5	93	7.2	7.7	8	7.9
7.3	103.5	7.5	7.2	7.7	8
7	116.3	7.3	7.5	7.2	7.7
7	120	7	7.3	7.5	7.2
7	114.3	7	7	7.3	7.5
7.2	104.7	7	7	7	7.3
7.3	109.8	7.2	7	7	7
7.1	112.6	7.3	7.2	7	7
6.8	114.4	7.1	7.3	7.2	7
6.4	115.7	6.8	7.1	7.3	7.2
6.1	114.7	6.4	6.8	7.1	7.3
6.5	118.4	6.1	6.4	6.8	7.1
7.7	94.9	6.5	6.1	6.4	6.8
7.9	103.8	7.7	6.5	6.1	6.4
7.5	115.1	7.9	7.7	6.5	6.1
6.9	113.7	7.5	7.9	7.7	6.5
6.6	104	6.9	7.5	7.9	7.7
6.9	94.3	6.6	6.9	7.5	7.9
7.7	92.5	6.9	6.6	6.9	7.5
8	93.2	7.7	6.9	6.6	6.9
8	104.7	8	7.7	6.9	6.6
7.7	94	8	8	7.7	6.9
7.3	98.1	7.7	8	8	7.7
7.4	102.7	7.3	7.7	8	8
8.1	82.4	7.4	7.3	7.7	8
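For readers who want to load this series locally, here is a minimal R sketch. It assumes (based on the regression output below) that the six tab-separated columns are the dependent variable Y, the exogenous variable X, and the four lagged values Y1-Y4; the column names and the file name seatbelt_ws7.tsv are illustrative, not part of the original archive.

# Hypothetical file name; paste the data block above into it first.
# Assumed column order: Y, X, Y1, Y2, Y3, Y4 (not stated explicitly in the archive).
mydata <- read.table('seatbelt_ws7.tsv', sep = '\t',
                     col.names = c('Y', 'X', 'Y1', 'Y2', 'Y3', 'Y4'))
str(mydata)   # should show 56 observations of 6 numeric variables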




Summary of computational transaction
Raw Input	view raw input (R code)
Raw Output	view raw output of R engine
Computing time	3 seconds
R Server	'Gwilym Jenkins' @ 72.249.127.135

\begin{tabular}{lllllllll}
\hline
Summary of computational transaction \tabularnewline
Raw Input & view raw input (R code)  \tabularnewline
Raw Output & view raw output of R engine  \tabularnewline
Computing time & 3 seconds \tabularnewline
R Server & 'Gwilym Jenkins' @ 72.249.127.135 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=58469&T=0

[TABLE]
[ROW][C]Summary of computational transaction[/C][/ROW]
[ROW][C]Raw Input[/C][C]view raw input (R code) [/C][/ROW]
[ROW][C]Raw Output[/C][C]view raw output of R engine [/C][/ROW]
[ROW][C]Computing time[/C][C]3 seconds[/C][/ROW]
[ROW][C]R Server[/C][C]'Gwilym Jenkins' @ 72.249.127.135[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=58469&T=0

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=58469&T=0








Multiple Linear Regression - Estimated Regression Equation
Y[t] = + 2.42428661862928 -0.0117846612829053X[t] + 1.46617466855599Y1[t] -0.818010407458793Y2[t] -0.0404602516786089Y3[t] + 0.251899520029868Y4[t] + 0.0759909965765519M1[t] + 0.00922654169473366M2[t] -0.169015605044706M3[t] + 0.0500030192168811M4[t] -0.0214506584890375M5[t] -0.125440393745722M6[t] + 0.0393717349680023M7[t] + 0.336062911357132M8[t] -0.489194385002157M9[t] + 0.0466595620924516M10[t] + 0.149763779550361M11[t] -0.00298100774336668t + e[t]
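Written more compactly, the fitted specification regresses Y on X, four of its own lagged values, eleven monthly dummies and a linear trend (reading Y1-Y4 as lags of Y is an assumption based on the data layout above):

\[
Y_t = \beta_0 + \beta_1 X_t + \sum_{i=1}^{4}\gamma_i\, Y_{t-i} + \sum_{j=1}^{11}\delta_j\, M_{j,t} + \lambda\, t + e_t
\]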

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Estimated Regression Equation \tabularnewline
Y[t] =  +  2.42428661862928 -0.0117846612829053X[t] +  1.46617466855599Y1[t] -0.818010407458793Y2[t] -0.0404602516786089Y3[t] +  0.251899520029868Y4[t] +  0.0759909965765519M1[t] +  0.00922654169473366M2[t] -0.169015605044706M3[t] +  0.0500030192168811M4[t] -0.0214506584890375M5[t] -0.125440393745722M6[t] +  0.0393717349680023M7[t] +  0.336062911357132M8[t] -0.489194385002157M9[t] +  0.0466595620924516M10[t] +  0.149763779550361M11[t] -0.00298100774336668t  + e[t] \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=58469&T=1

[TABLE]
[ROW][C]Multiple Linear Regression - Estimated Regression Equation[/C][/ROW]
[ROW][C]Y[t] =  +  2.42428661862928 -0.0117846612829053X[t] +  1.46617466855599Y1[t] -0.818010407458793Y2[t] -0.0404602516786089Y3[t] +  0.251899520029868Y4[t] +  0.0759909965765519M1[t] +  0.00922654169473366M2[t] -0.169015605044706M3[t] +  0.0500030192168811M4[t] -0.0214506584890375M5[t] -0.125440393745722M6[t] +  0.0393717349680023M7[t] +  0.336062911357132M8[t] -0.489194385002157M9[t] +  0.0466595620924516M10[t] +  0.149763779550361M11[t] -0.00298100774336668t  + e[t][/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=58469&T=1

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=58469&T=1








Multiple Linear Regression - Ordinary Least Squares
Variable	Parameter	S.D.	T-STAT (H0: parameter = 0)	2-tail p-value	1-tail p-value
(Intercept)	2.42428661862928	1.067172	2.2717	0.028852	0.014426
X	-0.0117846612829053	0.004771	-2.4702	0.018106	0.009053
Y1	1.46617466855599	0.153913	9.526	0	0
Y2	-0.818010407458793	0.289833	-2.8224	0.007544	0.003772
Y3	-0.0404602516786089	0.288756	-0.1401	0.889306	0.444653
Y4	0.251899520029868	0.157692	1.5974	0.118457	0.059228
M1	0.0759909965765519	0.116485	0.6524	0.51809	0.259045
M2	0.00922654169473366	0.123248	0.0749	0.940718	0.470359
M3	-0.169015605044706	0.126806	-1.3329	0.190511	0.095255
M4	0.0500030192168811	0.11772	0.4248	0.673405	0.336702
M5	-0.0214506584890375	0.113407	-0.1891	0.850984	0.425492
M6	-0.125440393745722	0.107946	-1.1621	0.252457	0.126228
M7	0.0393717349680023	0.110079	0.3577	0.72257	0.361285
M8	0.336062911357132	0.148666	2.2605	0.0296	0.0148
M9	-0.489194385002157	0.156086	-3.1341	0.003316	0.001658
M10	0.0466595620924516	0.166212	0.2807	0.780445	0.390223
M11	0.149763779550361	0.14106	1.0617	0.295073	0.147536
t	-0.00298100774336668	0.002959	-1.0076	0.320033	0.160016

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Ordinary Least Squares \tabularnewline
Variable & Parameter & S.D. & T-STAT (H0: parameter = 0) & 2-tail p-value & 1-tail p-value \tabularnewline
(Intercept) & 2.42428661862928 & 1.067172 & 2.2717 & 0.028852 & 0.014426 \tabularnewline
X & -0.0117846612829053 & 0.004771 & -2.4702 & 0.018106 & 0.009053 \tabularnewline
Y1 & 1.46617466855599 & 0.153913 & 9.526 & 0 & 0 \tabularnewline
Y2 & -0.818010407458793 & 0.289833 & -2.8224 & 0.007544 & 0.003772 \tabularnewline
Y3 & -0.0404602516786089 & 0.288756 & -0.1401 & 0.889306 & 0.444653 \tabularnewline
Y4 & 0.251899520029868 & 0.157692 & 1.5974 & 0.118457 & 0.059228 \tabularnewline
M1 & 0.0759909965765519 & 0.116485 & 0.6524 & 0.51809 & 0.259045 \tabularnewline
M2 & 0.00922654169473366 & 0.123248 & 0.0749 & 0.940718 & 0.470359 \tabularnewline
M3 & -0.169015605044706 & 0.126806 & -1.3329 & 0.190511 & 0.095255 \tabularnewline
M4 & 0.0500030192168811 & 0.11772 & 0.4248 & 0.673405 & 0.336702 \tabularnewline
M5 & -0.0214506584890375 & 0.113407 & -0.1891 & 0.850984 & 0.425492 \tabularnewline
M6 & -0.125440393745722 & 0.107946 & -1.1621 & 0.252457 & 0.126228 \tabularnewline
M7 & 0.0393717349680023 & 0.110079 & 0.3577 & 0.72257 & 0.361285 \tabularnewline
M8 & 0.336062911357132 & 0.148666 & 2.2605 & 0.0296 & 0.0148 \tabularnewline
M9 & -0.489194385002157 & 0.156086 & -3.1341 & 0.003316 & 0.001658 \tabularnewline
M10 & 0.0466595620924516 & 0.166212 & 0.2807 & 0.780445 & 0.390223 \tabularnewline
M11 & 0.149763779550361 & 0.14106 & 1.0617 & 0.295073 & 0.147536 \tabularnewline
t & -0.00298100774336668 & 0.002959 & -1.0076 & 0.320033 & 0.160016 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=58469&T=2

[TABLE]
[ROW][C]Multiple Linear Regression - Ordinary Least Squares[/C][/ROW]
[ROW][C]Variable[/C][C]Parameter[/C][C]S.D.[/C][C]T-STAT (H0: parameter = 0)[/C][C]2-tail p-value[/C][C]1-tail p-value[/C][/ROW]
[ROW][C](Intercept)[/C][C]2.42428661862928[/C][C]1.067172[/C][C]2.2717[/C][C]0.028852[/C][C]0.014426[/C][/ROW]
[ROW][C]X[/C][C]-0.0117846612829053[/C][C]0.004771[/C][C]-2.4702[/C][C]0.018106[/C][C]0.009053[/C][/ROW]
[ROW][C]Y1[/C][C]1.46617466855599[/C][C]0.153913[/C][C]9.526[/C][C]0[/C][C]0[/C][/ROW]
[ROW][C]Y2[/C][C]-0.818010407458793[/C][C]0.289833[/C][C]-2.8224[/C][C]0.007544[/C][C]0.003772[/C][/ROW]
[ROW][C]Y3[/C][C]-0.0404602516786089[/C][C]0.288756[/C][C]-0.1401[/C][C]0.889306[/C][C]0.444653[/C][/ROW]
[ROW][C]Y4[/C][C]0.251899520029868[/C][C]0.157692[/C][C]1.5974[/C][C]0.118457[/C][C]0.059228[/C][/ROW]
[ROW][C]M1[/C][C]0.0759909965765519[/C][C]0.116485[/C][C]0.6524[/C][C]0.51809[/C][C]0.259045[/C][/ROW]
[ROW][C]M2[/C][C]0.00922654169473366[/C][C]0.123248[/C][C]0.0749[/C][C]0.940718[/C][C]0.470359[/C][/ROW]
[ROW][C]M3[/C][C]-0.169015605044706[/C][C]0.126806[/C][C]-1.3329[/C][C]0.190511[/C][C]0.095255[/C][/ROW]
[ROW][C]M4[/C][C]0.0500030192168811[/C][C]0.11772[/C][C]0.4248[/C][C]0.673405[/C][C]0.336702[/C][/ROW]
[ROW][C]M5[/C][C]-0.0214506584890375[/C][C]0.113407[/C][C]-0.1891[/C][C]0.850984[/C][C]0.425492[/C][/ROW]
[ROW][C]M6[/C][C]-0.125440393745722[/C][C]0.107946[/C][C]-1.1621[/C][C]0.252457[/C][C]0.126228[/C][/ROW]
[ROW][C]M7[/C][C]0.0393717349680023[/C][C]0.110079[/C][C]0.3577[/C][C]0.72257[/C][C]0.361285[/C][/ROW]
[ROW][C]M8[/C][C]0.336062911357132[/C][C]0.148666[/C][C]2.2605[/C][C]0.0296[/C][C]0.0148[/C][/ROW]
[ROW][C]M9[/C][C]-0.489194385002157[/C][C]0.156086[/C][C]-3.1341[/C][C]0.003316[/C][C]0.001658[/C][/ROW]
[ROW][C]M10[/C][C]0.0466595620924516[/C][C]0.166212[/C][C]0.2807[/C][C]0.780445[/C][C]0.390223[/C][/ROW]
[ROW][C]M11[/C][C]0.149763779550361[/C][C]0.14106[/C][C]1.0617[/C][C]0.295073[/C][C]0.147536[/C][/ROW]
[ROW][C]t[/C][C]-0.00298100774336668[/C][C]0.002959[/C][C]-1.0076[/C][C]0.320033[/C][C]0.160016[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=58469&T=2
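Note that the 1-tail p-values in this table are simply the 2-tail p-values halved, which is exactly how the archived R code at the bottom of this page computes them. A quick check in R, assuming mylm is the fitted model from that code:

mysum <- summary(mylm)
p2 <- mysum$coefficients[, 4]   # 2-tail p-values reported by summary()
p1 <- p2 / 2                    # 1-tail p-values as printed in the table
round(cbind(p2, p1), 6)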

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=58469&T=2








Multiple Linear Regression - Regression Statistics
Multiple R	0.97972581791468
R-squared	0.95986267828859
Adjusted R-squared	0.941906508049273
F-TEST (value)	53.455868678887
F-TEST (DF numerator)	17
F-TEST (DF denominator)	38
p-value	0
Multiple Linear Regression - Residual Statistics
Residual Standard Deviation	0.160319121295799
Sum Squared Residuals	0.976684384816172

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Regression Statistics \tabularnewline
Multiple R & 0.97972581791468 \tabularnewline
R-squared & 0.95986267828859 \tabularnewline
Adjusted R-squared & 0.941906508049273 \tabularnewline
F-TEST (value) & 53.455868678887 \tabularnewline
F-TEST (DF numerator) & 17 \tabularnewline
F-TEST (DF denominator) & 38 \tabularnewline
p-value & 0 \tabularnewline
Multiple Linear Regression - Residual Statistics \tabularnewline
Residual Standard Deviation & 0.160319121295799 \tabularnewline
Sum Squared Residuals & 0.976684384816172 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=58469&T=3

[TABLE]
[ROW][C]Multiple Linear Regression - Regression Statistics[/C][/ROW]
[ROW][C]Multiple R[/C][C]0.97972581791468[/C][/ROW]
[ROW][C]R-squared[/C][C]0.95986267828859[/C][/ROW]
[ROW][C]Adjusted R-squared[/C][C]0.941906508049273[/C][/ROW]
[ROW][C]F-TEST (value)[/C][C]53.455868678887[/C][/ROW]
[ROW][C]F-TEST (DF numerator)[/C][C]17[/C][/ROW]
[ROW][C]F-TEST (DF denominator)[/C][C]38[/C][/ROW]
[ROW][C]p-value[/C][C]0[/C][/ROW]
[ROW][C]Multiple Linear Regression - Residual Statistics[/C][/ROW]
[ROW][C]Residual Standard Deviation[/C][C]0.160319121295799[/C][/ROW]
[ROW][C]Sum Squared Residuals[/C][C]0.976684384816172[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=58469&T=3
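The reported p-value of 0 is the upper-tail probability of an F(17, 38) distribution evaluated at 53.46, which the archived code obtains as 1 - pf(F, df1, df2); at this F value the probability underflows to zero at the displayed precision:

1 - pf(53.455868678887, 17, 38)   # essentially 0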

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=58469&T=3








Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index	Actuals	Interpolation (Forecast)	Residuals (Prediction Error)
1	8.3	8.20765593790238	0.0923440620976185
2	8.5	8.74255708556345	-0.242557085563449
3	8.6	8.64892903463709	-0.0489290346370851
4	8.5	8.6260658933297	-0.126065893329691
5	8.2	8.34738139871594	-0.147381398715937
6	8.1	7.92162237796333	0.178377622036665
7	7.9	8.11248397671017	-0.212483976710174
8	8.6	8.47868184022043	0.121318159779571
9	8.7	8.67338829836609	0.0266117016339067
10	8.7	8.5651912081317	0.134808791868295
11	8.5	8.54016528076808	-0.0401652807680848
12	8.4	8.26057686797475	0.139423132025247
13	8.5	8.46886024758204	0.0311397524179634
14	8.7	8.64623153804867	0.0537684619513331
15	8.7	8.62775146543648	0.0722485345635147
16	8.6	8.50010735887091	0.0998926411290883
17	8.5	8.3833596017268	0.116640398273208
18	8.3	8.19713669956702	0.102863300432985
19	8	8.11269057050633	-0.112690570506327
20	8.2	8.41894308498233	-0.218943084982334
21	8.1	7.98379212717758	0.116207872822419
22	8.1	7.98436297366575	0.115637026334248
23	8	8.04844980006107	-0.0484498000610658
24	7.9	7.88129223955275	0.0187077604472520
25	7.9	7.91261296153314	-0.0126129615331387
26	8	7.9522838873875	0.0477161126124956
27	8	7.89653427292517	0.103465727074829
28	7.9	7.85709416452992	0.0429058354700816
29	8	7.72273787893555	0.277262121064455
30	7.7	7.84344934071757	-0.143449340717567
31	7.2	7.42757127300029	-0.227571273000291
32	7.5	7.49308545386603	0.00691454613397222
33	7.3	7.42729383809563	-0.127293838095629
34	7	7.21534532690719	-0.215345326907191
35	7	6.85752713528143	0.142472864718572
36	7	7.10101994588258	-0.101019945882580
37	7.2	7.24892085452927	-0.0489208545292653
38	7.3	7.3367386970635	-0.0367386970635021
39	7.1	7.1055338763524	-0.00553387635240147
40	6.8	6.91723107776859	-0.117231077768592
41	6.4	6.5975598924146	-0.197559892414603
42	6.1	6.19458906785141	-0.0945890678514077
43	6.5	6.16192687598934	0.338073124010659
44	7.7	7.5050638191059	0.194936180894101
45	7.9	7.9155257363607	-0.0155257363606966
46	7.5	7.53510049129535	-0.0351004912953519
47	6.9	6.95385778388942	-0.0538577838894216
48	6.6	6.65711094658992	-0.0571109465899182
49	6.9	6.96194999845318	-0.061949998453178
50	7.7	7.52218879193688	0.177811208063122
51	8	8.12125135064886	-0.121251350648857
52	8	7.89950150550089	0.100498494499114
53	7.7	7.74896122820712	-0.0489612282071236
54	7.3	7.34320251390068	-0.0432025139006756
55	7.4	7.18532730379387	0.214672696206132
56	8.1	8.2042258018253	-0.104225801825310

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Actuals, Interpolation, and Residuals \tabularnewline
Time or Index & Actuals & Interpolation (Forecast) & Residuals (Prediction Error) \tabularnewline
1 & 8.3 & 8.20765593790238 & 0.0923440620976185 \tabularnewline
2 & 8.5 & 8.74255708556345 & -0.242557085563449 \tabularnewline
3 & 8.6 & 8.64892903463709 & -0.0489290346370851 \tabularnewline
4 & 8.5 & 8.6260658933297 & -0.126065893329691 \tabularnewline
5 & 8.2 & 8.34738139871594 & -0.147381398715937 \tabularnewline
6 & 8.1 & 7.92162237796333 & 0.178377622036665 \tabularnewline
7 & 7.9 & 8.11248397671017 & -0.212483976710174 \tabularnewline
8 & 8.6 & 8.47868184022043 & 0.121318159779571 \tabularnewline
9 & 8.7 & 8.67338829836609 & 0.0266117016339067 \tabularnewline
10 & 8.7 & 8.5651912081317 & 0.134808791868295 \tabularnewline
11 & 8.5 & 8.54016528076808 & -0.0401652807680848 \tabularnewline
12 & 8.4 & 8.26057686797475 & 0.139423132025247 \tabularnewline
13 & 8.5 & 8.46886024758204 & 0.0311397524179634 \tabularnewline
14 & 8.7 & 8.64623153804867 & 0.0537684619513331 \tabularnewline
15 & 8.7 & 8.62775146543648 & 0.0722485345635147 \tabularnewline
16 & 8.6 & 8.50010735887091 & 0.0998926411290883 \tabularnewline
17 & 8.5 & 8.3833596017268 & 0.116640398273208 \tabularnewline
18 & 8.3 & 8.19713669956702 & 0.102863300432985 \tabularnewline
19 & 8 & 8.11269057050633 & -0.112690570506327 \tabularnewline
20 & 8.2 & 8.41894308498233 & -0.218943084982334 \tabularnewline
21 & 8.1 & 7.98379212717758 & 0.116207872822419 \tabularnewline
22 & 8.1 & 7.98436297366575 & 0.115637026334248 \tabularnewline
23 & 8 & 8.04844980006107 & -0.0484498000610658 \tabularnewline
24 & 7.9 & 7.88129223955275 & 0.0187077604472520 \tabularnewline
25 & 7.9 & 7.91261296153314 & -0.0126129615331387 \tabularnewline
26 & 8 & 7.9522838873875 & 0.0477161126124956 \tabularnewline
27 & 8 & 7.89653427292517 & 0.103465727074829 \tabularnewline
28 & 7.9 & 7.85709416452992 & 0.0429058354700816 \tabularnewline
29 & 8 & 7.72273787893555 & 0.277262121064455 \tabularnewline
30 & 7.7 & 7.84344934071757 & -0.143449340717567 \tabularnewline
31 & 7.2 & 7.42757127300029 & -0.227571273000291 \tabularnewline
32 & 7.5 & 7.49308545386603 & 0.00691454613397222 \tabularnewline
33 & 7.3 & 7.42729383809563 & -0.127293838095629 \tabularnewline
34 & 7 & 7.21534532690719 & -0.215345326907191 \tabularnewline
35 & 7 & 6.85752713528143 & 0.142472864718572 \tabularnewline
36 & 7 & 7.10101994588258 & -0.101019945882580 \tabularnewline
37 & 7.2 & 7.24892085452927 & -0.0489208545292653 \tabularnewline
38 & 7.3 & 7.3367386970635 & -0.0367386970635021 \tabularnewline
39 & 7.1 & 7.1055338763524 & -0.00553387635240147 \tabularnewline
40 & 6.8 & 6.91723107776859 & -0.117231077768592 \tabularnewline
41 & 6.4 & 6.5975598924146 & -0.197559892414603 \tabularnewline
42 & 6.1 & 6.19458906785141 & -0.0945890678514077 \tabularnewline
43 & 6.5 & 6.16192687598934 & 0.338073124010659 \tabularnewline
44 & 7.7 & 7.5050638191059 & 0.194936180894101 \tabularnewline
45 & 7.9 & 7.9155257363607 & -0.0155257363606966 \tabularnewline
46 & 7.5 & 7.53510049129535 & -0.0351004912953519 \tabularnewline
47 & 6.9 & 6.95385778388942 & -0.0538577838894216 \tabularnewline
48 & 6.6 & 6.65711094658992 & -0.0571109465899182 \tabularnewline
49 & 6.9 & 6.96194999845318 & -0.061949998453178 \tabularnewline
50 & 7.7 & 7.52218879193688 & 0.177811208063122 \tabularnewline
51 & 8 & 8.12125135064886 & -0.121251350648857 \tabularnewline
52 & 8 & 7.89950150550089 & 0.100498494499114 \tabularnewline
53 & 7.7 & 7.74896122820712 & -0.0489612282071236 \tabularnewline
54 & 7.3 & 7.34320251390068 & -0.0432025139006756 \tabularnewline
55 & 7.4 & 7.18532730379387 & 0.214672696206132 \tabularnewline
56 & 8.1 & 8.2042258018253 & -0.104225801825310 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=58469&T=4

[TABLE]
[ROW][C]Multiple Linear Regression - Actuals, Interpolation, and Residuals[/C][/ROW]
[ROW][C]Time or Index[/C][C]Actuals[/C][C]Interpolation (Forecast)[/C][C]Residuals (Prediction Error)[/C][/ROW]
[ROW][C]1[/C][C]8.3[/C][C]8.20765593790238[/C][C]0.0923440620976185[/C][/ROW]
[ROW][C]2[/C][C]8.5[/C][C]8.74255708556345[/C][C]-0.242557085563449[/C][/ROW]
[ROW][C]3[/C][C]8.6[/C][C]8.64892903463709[/C][C]-0.0489290346370851[/C][/ROW]
[ROW][C]4[/C][C]8.5[/C][C]8.6260658933297[/C][C]-0.126065893329691[/C][/ROW]
[ROW][C]5[/C][C]8.2[/C][C]8.34738139871594[/C][C]-0.147381398715937[/C][/ROW]
[ROW][C]6[/C][C]8.1[/C][C]7.92162237796333[/C][C]0.178377622036665[/C][/ROW]
[ROW][C]7[/C][C]7.9[/C][C]8.11248397671017[/C][C]-0.212483976710174[/C][/ROW]
[ROW][C]8[/C][C]8.6[/C][C]8.47868184022043[/C][C]0.121318159779571[/C][/ROW]
[ROW][C]9[/C][C]8.7[/C][C]8.67338829836609[/C][C]0.0266117016339067[/C][/ROW]
[ROW][C]10[/C][C]8.7[/C][C]8.5651912081317[/C][C]0.134808791868295[/C][/ROW]
[ROW][C]11[/C][C]8.5[/C][C]8.54016528076808[/C][C]-0.0401652807680848[/C][/ROW]
[ROW][C]12[/C][C]8.4[/C][C]8.26057686797475[/C][C]0.139423132025247[/C][/ROW]
[ROW][C]13[/C][C]8.5[/C][C]8.46886024758204[/C][C]0.0311397524179634[/C][/ROW]
[ROW][C]14[/C][C]8.7[/C][C]8.64623153804867[/C][C]0.0537684619513331[/C][/ROW]
[ROW][C]15[/C][C]8.7[/C][C]8.62775146543648[/C][C]0.0722485345635147[/C][/ROW]
[ROW][C]16[/C][C]8.6[/C][C]8.50010735887091[/C][C]0.0998926411290883[/C][/ROW]
[ROW][C]17[/C][C]8.5[/C][C]8.3833596017268[/C][C]0.116640398273208[/C][/ROW]
[ROW][C]18[/C][C]8.3[/C][C]8.19713669956702[/C][C]0.102863300432985[/C][/ROW]
[ROW][C]19[/C][C]8[/C][C]8.11269057050633[/C][C]-0.112690570506327[/C][/ROW]
[ROW][C]20[/C][C]8.2[/C][C]8.41894308498233[/C][C]-0.218943084982334[/C][/ROW]
[ROW][C]21[/C][C]8.1[/C][C]7.98379212717758[/C][C]0.116207872822419[/C][/ROW]
[ROW][C]22[/C][C]8.1[/C][C]7.98436297366575[/C][C]0.115637026334248[/C][/ROW]
[ROW][C]23[/C][C]8[/C][C]8.04844980006107[/C][C]-0.0484498000610658[/C][/ROW]
[ROW][C]24[/C][C]7.9[/C][C]7.88129223955275[/C][C]0.0187077604472520[/C][/ROW]
[ROW][C]25[/C][C]7.9[/C][C]7.91261296153314[/C][C]-0.0126129615331387[/C][/ROW]
[ROW][C]26[/C][C]8[/C][C]7.9522838873875[/C][C]0.0477161126124956[/C][/ROW]
[ROW][C]27[/C][C]8[/C][C]7.89653427292517[/C][C]0.103465727074829[/C][/ROW]
[ROW][C]28[/C][C]7.9[/C][C]7.85709416452992[/C][C]0.0429058354700816[/C][/ROW]
[ROW][C]29[/C][C]8[/C][C]7.72273787893555[/C][C]0.277262121064455[/C][/ROW]
[ROW][C]30[/C][C]7.7[/C][C]7.84344934071757[/C][C]-0.143449340717567[/C][/ROW]
[ROW][C]31[/C][C]7.2[/C][C]7.42757127300029[/C][C]-0.227571273000291[/C][/ROW]
[ROW][C]32[/C][C]7.5[/C][C]7.49308545386603[/C][C]0.00691454613397222[/C][/ROW]
[ROW][C]33[/C][C]7.3[/C][C]7.42729383809563[/C][C]-0.127293838095629[/C][/ROW]
[ROW][C]34[/C][C]7[/C][C]7.21534532690719[/C][C]-0.215345326907191[/C][/ROW]
[ROW][C]35[/C][C]7[/C][C]6.85752713528143[/C][C]0.142472864718572[/C][/ROW]
[ROW][C]36[/C][C]7[/C][C]7.10101994588258[/C][C]-0.101019945882580[/C][/ROW]
[ROW][C]37[/C][C]7.2[/C][C]7.24892085452927[/C][C]-0.0489208545292653[/C][/ROW]
[ROW][C]38[/C][C]7.3[/C][C]7.3367386970635[/C][C]-0.0367386970635021[/C][/ROW]
[ROW][C]39[/C][C]7.1[/C][C]7.1055338763524[/C][C]-0.00553387635240147[/C][/ROW]
[ROW][C]40[/C][C]6.8[/C][C]6.91723107776859[/C][C]-0.117231077768592[/C][/ROW]
[ROW][C]41[/C][C]6.4[/C][C]6.5975598924146[/C][C]-0.197559892414603[/C][/ROW]
[ROW][C]42[/C][C]6.1[/C][C]6.19458906785141[/C][C]-0.0945890678514077[/C][/ROW]
[ROW][C]43[/C][C]6.5[/C][C]6.16192687598934[/C][C]0.338073124010659[/C][/ROW]
[ROW][C]44[/C][C]7.7[/C][C]7.5050638191059[/C][C]0.194936180894101[/C][/ROW]
[ROW][C]45[/C][C]7.9[/C][C]7.9155257363607[/C][C]-0.0155257363606966[/C][/ROW]
[ROW][C]46[/C][C]7.5[/C][C]7.53510049129535[/C][C]-0.0351004912953519[/C][/ROW]
[ROW][C]47[/C][C]6.9[/C][C]6.95385778388942[/C][C]-0.0538577838894216[/C][/ROW]
[ROW][C]48[/C][C]6.6[/C][C]6.65711094658992[/C][C]-0.0571109465899182[/C][/ROW]
[ROW][C]49[/C][C]6.9[/C][C]6.96194999845318[/C][C]-0.061949998453178[/C][/ROW]
[ROW][C]50[/C][C]7.7[/C][C]7.52218879193688[/C][C]0.177811208063122[/C][/ROW]
[ROW][C]51[/C][C]8[/C][C]8.12125135064886[/C][C]-0.121251350648857[/C][/ROW]
[ROW][C]52[/C][C]8[/C][C]7.89950150550089[/C][C]0.100498494499114[/C][/ROW]
[ROW][C]53[/C][C]7.7[/C][C]7.74896122820712[/C][C]-0.0489612282071236[/C][/ROW]
[ROW][C]54[/C][C]7.3[/C][C]7.34320251390068[/C][C]-0.0432025139006756[/C][/ROW]
[ROW][C]55[/C][C]7.4[/C][C]7.18532730379387[/C][C]0.214672696206132[/C][/ROW]
[ROW][C]56[/C][C]8.1[/C][C]8.2042258018253[/C][C]-0.104225801825310[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=58469&T=4
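In this table the 'Interpolation (Forecast)' column is the in-sample fitted value and the 'Residuals (Prediction Error)' column is the actual minus the fitted value; squaring and summing the residuals reproduces the Sum Squared Residuals of 0.9767 reported above. A minimal check in R, assuming x and mysum from the archived code:

fit <- x[, 1] - mysum$resid   # fitted values, computed as in the archived code
res <- mysum$resid            # residuals
sum(res^2)                    # approximately 0.976684, as reported above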

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=58469&T=4








Goldfeld-Quandt test for Heteroskedasticity
p-values	Alternative Hypothesis
breakpoint index	greater	2-sided	less
21	0.724374317667148	0.551251364665705	0.275625682332853
22	0.597862304208288	0.804275391583425	0.402137695791712
23	0.462094661380365	0.92418932276073	0.537905338619635
24	0.323558929560569	0.647117859121138	0.676441070439431
25	0.242095008985189	0.484190017970377	0.757904991014811
26	0.147464648512949	0.294929297025899	0.85253535148705
27	0.0992976801490494	0.198595360298099	0.90070231985095
28	0.0593390775665934	0.118678155133187	0.940660922433407
29	0.398156282693252	0.796312565386504	0.601843717306748
30	0.406069258918802	0.812138517837604	0.593930741081198
31	0.646502891293402	0.706994217413196	0.353497108706598
32	0.582804787341909	0.834390425316182	0.417195212658091
33	0.496355725269014	0.992711450538028	0.503644274730986
34	0.432167775379326	0.864335550758652	0.567832224620674
35	0.444949414085393	0.889898828170786	0.555050585914607

\begin{tabular}{lllllllll}
\hline
Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
p-values & Alternative Hypothesis \tabularnewline
breakpoint index & greater & 2-sided & less \tabularnewline
21 & 0.724374317667148 & 0.551251364665705 & 0.275625682332853 \tabularnewline
22 & 0.597862304208288 & 0.804275391583425 & 0.402137695791712 \tabularnewline
23 & 0.462094661380365 & 0.92418932276073 & 0.537905338619635 \tabularnewline
24 & 0.323558929560569 & 0.647117859121138 & 0.676441070439431 \tabularnewline
25 & 0.242095008985189 & 0.484190017970377 & 0.757904991014811 \tabularnewline
26 & 0.147464648512949 & 0.294929297025899 & 0.85253535148705 \tabularnewline
27 & 0.0992976801490494 & 0.198595360298099 & 0.90070231985095 \tabularnewline
28 & 0.0593390775665934 & 0.118678155133187 & 0.940660922433407 \tabularnewline
29 & 0.398156282693252 & 0.796312565386504 & 0.601843717306748 \tabularnewline
30 & 0.406069258918802 & 0.812138517837604 & 0.593930741081198 \tabularnewline
31 & 0.646502891293402 & 0.706994217413196 & 0.353497108706598 \tabularnewline
32 & 0.582804787341909 & 0.834390425316182 & 0.417195212658091 \tabularnewline
33 & 0.496355725269014 & 0.992711450538028 & 0.503644274730986 \tabularnewline
34 & 0.432167775379326 & 0.864335550758652 & 0.567832224620674 \tabularnewline
35 & 0.444949414085393 & 0.889898828170786 & 0.555050585914607 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=58469&T=5

[TABLE]
[ROW][C]Goldfeld-Quandt test for Heteroskedasticity[/C][/ROW]
[ROW][C]p-values[/C][C]Alternative Hypothesis[/C][/ROW]
[ROW][C]breakpoint index[/C][C]greater[/C][C]2-sided[/C][C]less[/C][/ROW]
[ROW][C]21[/C][C]0.724374317667148[/C][C]0.551251364665705[/C][C]0.275625682332853[/C][/ROW]
[ROW][C]22[/C][C]0.597862304208288[/C][C]0.804275391583425[/C][C]0.402137695791712[/C][/ROW]
[ROW][C]23[/C][C]0.462094661380365[/C][C]0.92418932276073[/C][C]0.537905338619635[/C][/ROW]
[ROW][C]24[/C][C]0.323558929560569[/C][C]0.647117859121138[/C][C]0.676441070439431[/C][/ROW]
[ROW][C]25[/C][C]0.242095008985189[/C][C]0.484190017970377[/C][C]0.757904991014811[/C][/ROW]
[ROW][C]26[/C][C]0.147464648512949[/C][C]0.294929297025899[/C][C]0.85253535148705[/C][/ROW]
[ROW][C]27[/C][C]0.0992976801490494[/C][C]0.198595360298099[/C][C]0.90070231985095[/C][/ROW]
[ROW][C]28[/C][C]0.0593390775665934[/C][C]0.118678155133187[/C][C]0.940660922433407[/C][/ROW]
[ROW][C]29[/C][C]0.398156282693252[/C][C]0.796312565386504[/C][C]0.601843717306748[/C][/ROW]
[ROW][C]30[/C][C]0.406069258918802[/C][C]0.812138517837604[/C][C]0.593930741081198[/C][/ROW]
[ROW][C]31[/C][C]0.646502891293402[/C][C]0.706994217413196[/C][C]0.353497108706598[/C][/ROW]
[ROW][C]32[/C][C]0.582804787341909[/C][C]0.834390425316182[/C][C]0.417195212658091[/C][/ROW]
[ROW][C]33[/C][C]0.496355725269014[/C][C]0.992711450538028[/C][C]0.503644274730986[/C][/ROW]
[ROW][C]34[/C][C]0.432167775379326[/C][C]0.864335550758652[/C][C]0.567832224620674[/C][/ROW]
[ROW][C]35[/C][C]0.444949414085393[/C][C]0.889898828170786[/C][C]0.555050585914607[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=58469&T=5
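Each row above is a separate Goldfeld-Quandt test with the sample split at the indicated breakpoint, computed with gqtest() from the lmtest package (see the archived code below). A single breakpoint can be reproduced as follows, assuming mylm is the fitted model from that code:

library(lmtest)
# p-values for breakpoint 21 under the three alternative hypotheses
sapply(c('greater', 'two.sided', 'less'),
       function(alt) gqtest(mylm, point = 21, alternative = alt)$p.value)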

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=58469&T=5








Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity
Description	# significant tests	% significant tests	OK/NOK
1% type I error level	0	0	OK
5% type I error level	0	0	OK
10% type I error level	0	0	OK

\begin{tabular}{lllllllll}
\hline
Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
Description & # significant tests & % significant tests & OK/NOK \tabularnewline
1% type I error level & 0 & 0 & OK \tabularnewline
5% type I error level & 0 & 0 & OK \tabularnewline
10% type I error level & 0 & 0 & OK \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=58469&T=6

[TABLE]
[ROW][C]Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity[/C][/ROW]
[ROW][C]Description[/C][C]# significant tests[/C][C]% significant tests[/C][C]OK/NOK[/C][/ROW]
[ROW][C]1% type I error level[/C][C]0[/C][C]0[/C][C]OK[/C][/ROW]
[ROW][C]5% type I error level[/C][C]0[/C][C]0[/C][C]OK[/C][/ROW]
[ROW][C]10% type I error level[/C][C]0[/C][C]0[/C][C]OK[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=58469&T=6
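The OK/NOK verdict compares the observed fraction of significant 2-sided Goldfeld-Quandt tests with the nominal type I error level; since none of the 15 breakpoint tests is significant here, all three rows read OK. The rule, restated in R under the assumption that gqarr from the archived code is available:

ntests <- nrow(gqarr)
prop5  <- sum(gqarr[, 2] < 0.05) / ntests   # share of significant 2-sided tests
if (prop5 < 0.05) 'OK' else 'NOK'           # verdict at the 5% error level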

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=58469&T=6




Parameters (Session):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
Parameters (R input):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
R code (references can be found in the software module):
library(lattice)
library(lmtest)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
par1 <- as.numeric(par1)
x <- t(y)
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
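# Optionally replace the series by first differences (not used in this run: par3 = 'Linear Trend')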
if (par3 == 'First Differences'){
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
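# Add eleven monthly dummy variables M1..M11 (par2 = 'Include Monthly Dummies' in this run)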
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
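# Append a linear trend column t (par3 = 'Linear Trend' in this run)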
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1,])
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
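# Goldfeld-Quandt tests at every admissible breakpoint (only computed when n > 25)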
if (n > n25) {
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
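# Diagnostic plots: actuals vs. interpolation, residuals, histogram, density plot,
# normal Q-Q plot, lag plot, ACF/PACF, lm() diagnostics, and the GQ p-value plot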
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
qqline(mysum$resid)
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
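# Assemble the output tables shown above with the site's table.* helpers
# (regression equation, OLS estimates, regression/residual statistics,
#  actuals/interpolation/residuals, Goldfeld-Quandt tests and their meta analysis)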
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, mysum$coefficients[i,1], sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,mysum$coefficients[i,1])
a<-table.element(a, round(mysum$coefficients[i,2],6))
a<-table.element(a, round(mysum$coefficients[i,3],4))
a<-table.element(a, round(mysum$coefficients[i,4],6))
a<-table.element(a, round(mysum$coefficients[i,4]/2,6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, sqrt(mysum$r.squared))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, mysum$r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, mysum$adj.r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, mysum$fstatistic[1])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[2])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[3])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, 1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, mysum$sigma)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, sum(myerror*myerror))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,x[i])
a<-table.element(a,x[i]-mysum$resid[i])
a<-table.element(a,mysum$resid[i])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,gqarr[mypoint-kp3+1,1])
a<-table.element(a,gqarr[mypoint-kp3+1,2])
a<-table.element(a,gqarr[mypoint-kp3+1,3])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,numsignificant1)
a<-table.element(a,numsignificant1/numgqtests)
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,numsignificant5)
a<-table.element(a,numsignificant5/numgqtests)
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,numsignificant10)
a<-table.element(a,numsignificant10/numgqtests)
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}