Author: The author of this computation has been verified
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Mon, 23 Nov 2009 12:19:36 -0700
Cite this page as follows:
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2009/Nov/23/t12590041247d26hs51fgzkxc2.htm/, Retrieved Thu, 02 May 2024 22:07:11 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=58868, Retrieved Thu, 02 May 2024 22:07:11 +0000
Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 167
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-     [Multiple Regression] [Q1 The Seatbeltlaw] [2007-11-14 19:27:43] [8cd6641b921d30ebe00b648d1481bba0]
- RM D  [Multiple Regression] [Seatbelt] [2009-11-12 14:10:54] [b98453cac15ba1066b407e146608df68]
-    D    [Multiple Regression] [] [2009-11-20 07:49:33] [5d885a68c2332cc44f6191ec94766bfa]
-    D        [Multiple Regression] [VerbeteringModel5] [2009-11-23 19:19:36] [82f29a5d509ab8039aab37a0145f886d] [Current]
Dataseries X:
100.49	1.9	100.16	100.03
99.72	2	100.49	100.25
100.14	2.3	99.72	99.6
98.48	2.8	100.14	100.16
100.38	2.4	98.48	100.49
101.45	2.3	100.38	99.72
98.42	2.7	101.45	100.14
98.6	2.7	98.42	98.48
100.06	2.9	98.6	100.38
98.62	3	100.06	101.45
100.84	2.2	98.62	98.42
100.02	2.3	100.84	98.6
97.95	2.8	100.02	100.06
98.32	2.8	97.95	98.62
98.27	2.8	98.32	100.84
97.22	2.2	98.27	100.02
99.28	2.6	97.22	97.95
100.38	2.8	99.28	98.32
99.02	2.5	100.38	98.27
100.32	2.4	99.02	97.22
99.81	2.3	100.32	99.28
100.6	1.9	99.81	100.38
101.19	1.7	100.6	99.02
100.47	2	101.19	100.32
101.77	2.1	100.47	99.81
102.32	1.7	101.77	100.6
102.39	1.8	102.32	101.19
101.16	1.8	102.39	100.47
100.63	1.8	101.16	101.77
101.48	1.3	100.63	102.32
101.44	1.3	101.48	102.39
100.09	1.3	101.44	101.16
100.7	1.2	100.09	100.63
100.78	1.4	100.7	101.48
99.81	2.2	100.78	101.44
98.45	2.9	99.81	100.09
98.49	3.1	98.45	100.7
97.48	3.5	98.49	100.78
97.91	3.6	97.48	99.81
96.94	4.4	97.91	98.45
98.53	4.1	96.94	98.49
96.82	5.1	98.53	97.48
95.76	5.8	96.82	97.91
95.27	5.9	95.76	96.94
97.32	5.4	95.27	98.53
96.68	5.5	97.32	96.82
97.87	4.8	96.68	95.76
97.42	3.2	97.87	95.27
97.94	2.7	97.42	97.32
99.52	2.1	97.94	96.68
100.99	1.9	99.52	97.87
99.92	0.6	100.99	97.42
101.97	0.7	99.92	97.94
101.58	-0.2	101.97	99.52
99.54	-1	101.58	100.99
100.83	-1.7	99.54	99.92
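
To make the computation easier to reproduce offline, here is a minimal sketch of how this data series could be read into R. The column names Y, X, Y1 and Y4 are assumptions inferred from the variable names in the estimated regression equation further down, and 'dataseries.txt' is a hypothetical tab-separated file holding the values above.

df <- read.table('dataseries.txt', header = FALSE,
                 col.names = c('Y', 'X', 'Y1', 'Y4'))  # assumed column order
str(df)  # should report 56 observations of 4 numeric variables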




Summary of computational transaction
Raw Input	view raw input (R code)
Raw Output	view raw output of R engine
Computing time	3 seconds
R Server	'Gwilym Jenkins' @ 72.249.127.135

\begin{tabular}{lllllllll}
\hline
Summary of computational transaction \tabularnewline
Raw Input & view raw input (R code)  \tabularnewline
Raw Output & view raw output of R engine  \tabularnewline
Computing time & 3 seconds \tabularnewline
R Server & 'Gwilym Jenkins' @ 72.249.127.135 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=58868&T=0

[TABLE]
[ROW][C]Summary of computational transaction[/C][/ROW]
[ROW][C]Raw Input[/C][C]view raw input (R code) [/C][/ROW]
[ROW][C]Raw Output[/C][C]view raw output of R engine [/C][/ROW]
[ROW][C]Computing time[/C][C]3 seconds[/C][/ROW]
[ROW][C]R Server[/C][C]'Gwilym Jenkins' @ 72.249.127.135[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=58868&T=0

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=58868&T=0

Multiple Linear Regression - Estimated Regression Equation
Y[t] = + 57.3427945489103 -0.49999670903731X[t] + 0.571499960545084Y1[t] -0.139851953533137Y4[t] + 0.650113977389304M1[t] + 0.71191289906699M2[t] + 1.20339151560654M3[t] -0.388902182650499M4[t] + 1.70117362711875M5[t] + 1.07641684010856M6[t] -0.460083159944192M7[t] + 0.358538583379871M8[t] + 1.46442617656104M9[t] + 0.701154255385232M10[t] + 1.33629220612652M11[t] -0.00920527843402831t + e[t]

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Estimated Regression Equation \tabularnewline
Y[t] =  +  57.3427945489103 -0.49999670903731X[t] +  0.571499960545084Y1[t] -0.139851953533137Y4[t] +  0.650113977389304M1[t] +  0.71191289906699M2[t] +  1.20339151560654M3[t] -0.388902182650499M4[t] +  1.70117362711875M5[t] +  1.07641684010856M6[t] -0.460083159944192M7[t] +  0.358538583379871M8[t] +  1.46442617656104M9[t] +  0.701154255385232M10[t] +  1.33629220612652M11[t] -0.00920527843402831t  + e[t] \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=58868&T=1

[TABLE]
[ROW][C]Multiple Linear Regression - Estimated Regression Equation[/C][/ROW]
[ROW][C]Y[t] =  +  57.3427945489103 -0.49999670903731X[t] +  0.571499960545084Y1[t] -0.139851953533137Y4[t] +  0.650113977389304M1[t] +  0.71191289906699M2[t] +  1.20339151560654M3[t] -0.388902182650499M4[t] +  1.70117362711875M5[t] +  1.07641684010856M6[t] -0.460083159944192M7[t] +  0.358538583379871M8[t] +  1.46442617656104M9[t] +  0.701154255385232M10[t] +  1.33629220612652M11[t] -0.00920527843402831t  + e[t][/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=58868&T=1

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=58868&T=1
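
The equation above corresponds to an ordinary least squares fit of Y on X, the lagged terms Y1 and Y4, eleven monthly dummies (month 12 as the reference category) and a linear trend. A minimal sketch of an equivalent fit, assuming the data frame df from the sketch following the data series:

n <- nrow(df)
M <- sapply(1:11, function(i) as.numeric(((1:n) - i) %% 12 == 0))  # monthly dummies M1..M11
colnames(M) <- paste0('M', 1:11)
dft <- cbind(df, M, t = 1:n)                                       # append linear trend t = 1..n
fit <- lm(Y ~ X + Y1 + Y4 + M1 + M2 + M3 + M4 + M5 + M6 +
              M7 + M8 + M9 + M10 + M11 + t, data = dft)
summary(fit)  # coefficients should match the OLS table below, up to rounding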


Multiple Linear Regression - Ordinary Least Squares
Variable	Parameter	S.D.	T-STAT (H0: parameter = 0)	2-tail p-value	1-tail p-value
(Intercept)	57.3427945489103	13.295668	4.3129	0.000102	5.1e-05
X	-0.49999670903731	0.129826	-3.8513	0.000415	0.000207
Y1	0.571499960545084	0.132264	4.3209	1e-04	5e-05
Y4	-0.139851953533137	0.101748	-1.3745	0.176942	0.088471
M1	0.650113977389304	0.606476	1.072	0.290166	0.145083
M2	0.71191289906699	0.603108	1.1804	0.24481	0.122405
M3	1.20339151560654	0.607928	1.9795	0.054671	0.027335
M4	-0.388902182650499	0.589216	-0.66	0.513015	0.256508
M5	1.70117362711875	0.625057	2.7216	0.009573	0.004787
M6	1.07641684010856	0.589786	1.8251	0.07546	0.03773
M7	-0.460083159944192	0.596563	-0.7712	0.445107	0.222553
M8	0.358538583379871	0.615889	0.5821	0.563734	0.281867
M9	1.46442617656104	0.658338	2.2244	0.031827	0.015914
M10	0.701154255385232	0.640525	1.0947	0.280215	0.140108
M11	1.33629220612652	0.623133	2.1445	0.038124	0.019062
t	-0.00920527843402831	0.008074	-1.1401	0.261005	0.130503

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Ordinary Least Squares \tabularnewline
Variable & Parameter & S.D. & T-STAT (H0: parameter = 0) & 2-tail p-value & 1-tail p-value \tabularnewline
(Intercept) & 57.3427945489103 & 13.295668 & 4.3129 & 0.000102 & 5.1e-05 \tabularnewline
X & -0.49999670903731 & 0.129826 & -3.8513 & 0.000415 & 0.000207 \tabularnewline
Y1 & 0.571499960545084 & 0.132264 & 4.3209 & 1e-04 & 5e-05 \tabularnewline
Y4 & -0.139851953533137 & 0.101748 & -1.3745 & 0.176942 & 0.088471 \tabularnewline
M1 & 0.650113977389304 & 0.606476 & 1.072 & 0.290166 & 0.145083 \tabularnewline
M2 & 0.71191289906699 & 0.603108 & 1.1804 & 0.24481 & 0.122405 \tabularnewline
M3 & 1.20339151560654 & 0.607928 & 1.9795 & 0.054671 & 0.027335 \tabularnewline
M4 & -0.388902182650499 & 0.589216 & -0.66 & 0.513015 & 0.256508 \tabularnewline
M5 & 1.70117362711875 & 0.625057 & 2.7216 & 0.009573 & 0.004787 \tabularnewline
M6 & 1.07641684010856 & 0.589786 & 1.8251 & 0.07546 & 0.03773 \tabularnewline
M7 & -0.460083159944192 & 0.596563 & -0.7712 & 0.445107 & 0.222553 \tabularnewline
M8 & 0.358538583379871 & 0.615889 & 0.5821 & 0.563734 & 0.281867 \tabularnewline
M9 & 1.46442617656104 & 0.658338 & 2.2244 & 0.031827 & 0.015914 \tabularnewline
M10 & 0.701154255385232 & 0.640525 & 1.0947 & 0.280215 & 0.140108 \tabularnewline
M11 & 1.33629220612652 & 0.623133 & 2.1445 & 0.038124 & 0.019062 \tabularnewline
t & -0.00920527843402831 & 0.008074 & -1.1401 & 0.261005 & 0.130503 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=58868&T=2

[TABLE]
[ROW][C]Multiple Linear Regression - Ordinary Least Squares[/C][/ROW]
[ROW][C]Variable[/C][C]Parameter[/C][C]S.D.[/C][C]T-STAT (H0: parameter = 0)[/C][C]2-tail p-value[/C][C]1-tail p-value[/C][/ROW]
[ROW][C](Intercept)[/C][C]57.3427945489103[/C][C]13.295668[/C][C]4.3129[/C][C]0.000102[/C][C]5.1e-05[/C][/ROW]
[ROW][C]X[/C][C]-0.49999670903731[/C][C]0.129826[/C][C]-3.8513[/C][C]0.000415[/C][C]0.000207[/C][/ROW]
[ROW][C]Y1[/C][C]0.571499960545084[/C][C]0.132264[/C][C]4.3209[/C][C]1e-04[/C][C]5e-05[/C][/ROW]
[ROW][C]Y4[/C][C]-0.139851953533137[/C][C]0.101748[/C][C]-1.3745[/C][C]0.176942[/C][C]0.088471[/C][/ROW]
[ROW][C]M1[/C][C]0.650113977389304[/C][C]0.606476[/C][C]1.072[/C][C]0.290166[/C][C]0.145083[/C][/ROW]
[ROW][C]M2[/C][C]0.71191289906699[/C][C]0.603108[/C][C]1.1804[/C][C]0.24481[/C][C]0.122405[/C][/ROW]
[ROW][C]M3[/C][C]1.20339151560654[/C][C]0.607928[/C][C]1.9795[/C][C]0.054671[/C][C]0.027335[/C][/ROW]
[ROW][C]M4[/C][C]-0.388902182650499[/C][C]0.589216[/C][C]-0.66[/C][C]0.513015[/C][C]0.256508[/C][/ROW]
[ROW][C]M5[/C][C]1.70117362711875[/C][C]0.625057[/C][C]2.7216[/C][C]0.009573[/C][C]0.004787[/C][/ROW]
[ROW][C]M6[/C][C]1.07641684010856[/C][C]0.589786[/C][C]1.8251[/C][C]0.07546[/C][C]0.03773[/C][/ROW]
[ROW][C]M7[/C][C]-0.460083159944192[/C][C]0.596563[/C][C]-0.7712[/C][C]0.445107[/C][C]0.222553[/C][/ROW]
[ROW][C]M8[/C][C]0.358538583379871[/C][C]0.615889[/C][C]0.5821[/C][C]0.563734[/C][C]0.281867[/C][/ROW]
[ROW][C]M9[/C][C]1.46442617656104[/C][C]0.658338[/C][C]2.2244[/C][C]0.031827[/C][C]0.015914[/C][/ROW]
[ROW][C]M10[/C][C]0.701154255385232[/C][C]0.640525[/C][C]1.0947[/C][C]0.280215[/C][C]0.140108[/C][/ROW]
[ROW][C]M11[/C][C]1.33629220612652[/C][C]0.623133[/C][C]2.1445[/C][C]0.038124[/C][C]0.019062[/C][/ROW]
[ROW][C]t[/C][C]-0.00920527843402831[/C][C]0.008074[/C][C]-1.1401[/C][C]0.261005[/C][C]0.130503[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=58868&T=2

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=58868&T=2
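
The last column of the table is simply half of the two-tailed p-value reported by summary(); a sketch, assuming the fit object from the sketch above:

cf <- summary(fit)$coefficients      # Estimate, Std. Error, t value, Pr(>|t|)
cbind(cf, p.1tail = cf[, 4] / 2)     # 1-tail p-value = 2-tail p-value / 2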



Multiple Linear Regression - Regression Statistics
Multiple R	0.899016799661889
R-squared	0.808231206074305
Adjusted R-squared	0.73631790835217
F-TEST (value)	11.2389673631324
F-TEST (DF numerator)	15
F-TEST (DF denominator)	40
p-value	5.60008373007292e-10
Multiple Linear Regression - Residual Statistics
Residual Standard Deviation	0.870721422195195
Sum Squared Residuals	30.3262318027849

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Regression Statistics \tabularnewline
Multiple R & 0.899016799661889 \tabularnewline
R-squared & 0.808231206074305 \tabularnewline
Adjusted R-squared & 0.73631790835217 \tabularnewline
F-TEST (value) & 11.2389673631324 \tabularnewline
F-TEST (DF numerator) & 15 \tabularnewline
F-TEST (DF denominator) & 40 \tabularnewline
p-value & 5.60008373007292e-10 \tabularnewline
Multiple Linear Regression - Residual Statistics \tabularnewline
Residual Standard Deviation & 0.870721422195195 \tabularnewline
Sum Squared Residuals & 30.3262318027849 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=58868&T=3

[TABLE]
[ROW][C]Multiple Linear Regression - Regression Statistics[/C][/ROW]
[ROW][C]Multiple R[/C][C]0.899016799661889[/C][/ROW]
[ROW][C]R-squared[/C][C]0.808231206074305[/C][/ROW]
[ROW][C]Adjusted R-squared[/C][C]0.73631790835217[/C][/ROW]
[ROW][C]F-TEST (value)[/C][C]11.2389673631324[/C][/ROW]
[ROW][C]F-TEST (DF numerator)[/C][C]15[/C][/ROW]
[ROW][C]F-TEST (DF denominator)[/C][C]40[/C][/ROW]
[ROW][C]p-value[/C][C]5.60008373007292e-10[/C][/ROW]
[ROW][C]Multiple Linear Regression - Residual Statistics[/C][/ROW]
[ROW][C]Residual Standard Deviation[/C][C]0.870721422195195[/C][/ROW]
[ROW][C]Sum Squared Residuals[/C][C]30.3262318027849[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=58868&T=3

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=58868&T=3
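
These summary statistics can be recovered from the fitted model; in particular the reported p-value is the upper tail of the F distribution with 15 and 40 degrees of freedom. A sketch, assuming fit from the earlier sketch:

s <- summary(fit)
f <- s$fstatistic                                  # F value, numerator df, denominator df
c(Multiple.R = sqrt(s$r.squared),
  R.squared  = s$r.squared,
  Adj.R.sq   = s$adj.r.squared,
  F.value    = unname(f[1]),
  p.value    = unname(1 - pf(f[1], f[2], f[3])),   # upper-tail F probability
  Resid.SD   = s$sigma,
  SSR        = sum(s$residuals^2))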



Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index	Actuals	Interpolation (Forecast)	Residuals (Prediction Error)
1	100.49	100.285754636971	0.204245363029248
2	99.72	100.446176166513	-0.726176166513105
3	100.14	100.429299292084	-0.289299292084255
4	98.48	98.739514850325	-0.259514850324912
5	100.38	100.025542986104	0.354457013895707
6	101.45	100.63511652082	0.814883479180028
7	98.42	99.4421796960176	-1.02217969601759
8	98.6	98.752105523321	-0.152105523321035
9	100.06	99.5859397774459	0.474060222554142
10	98.62	99.4482112590477	-0.828211259047656
11	100.84	100.074932774605	0.765067225394747
12	100.02	99.922992179915	0.0970078200848986
13	97.95	99.6410887045464	-1.69108870454636
14	98.32	98.7120642425494	-0.392064242549426
15	98.27	99.095321229213	-0.825321229213055
16	97.22	97.8799238818143	-0.659923881814293
17	99.28	99.4502143147758	-0.170214314775844
18	100.38	99.8417976034398	0.538202396560212
19	99.02	99.0817338919404	-0.0617338919404416
20	100.32	99.3107546326027	1.00924536739731
21	99.81	100.912291542684	-1.10229154268390
22	100.6	99.8945108979245	0.705489102075445
23	101.19	101.262126537675	-0.0721265376749542
24	100.47	99.9220074775317	0.547992522468268
25	101.77	100.172761030293	1.59723896970728
26	102.32	101.057820262569	1.26217973743127
27	102.39	101.721906255486	0.668093744514241
28	101.16	100.261105682577	0.898894317423284
29	100.63	101.457223722848	-0.827223722848403
30	101.48	100.693446458391	0.786553541609285
31	101.44	99.62372650962	1.81627349038006
32	100.09	100.582300878934	-0.492300878933925
33	100.7	101.031579453221	-0.3315794532215
34	100.78	100.388843727234	0.391156272766464
35	99.81	100.666093107296	-0.856093107295881
36	98.45	98.6050431019502	-0.155043101950216
37	98.49	98.2834028211015	0.206597178898490
38	97.48	98.1476696228694	-0.667669622869383
39	97.91	98.1383847248478	-0.228384724847794
40	96.94	96.5728320207663	0.367167979233672
41	98.53	98.2437525249427	0.286247475057317
42	96.82	98.1597291607963	-1.33972916079632
43	95.76	95.226624913432	0.53337508656794
44	95.27	95.5159081441677	-0.245908144167734
45	97.32	96.3601892266487	0.959810773351254
46	96.68	96.9484341157943	-0.268434115794253
47	97.87	97.706847580424	0.163152419576088
48	97.42	97.909957240603	-0.489957240602947
49	97.94	98.2569928070887	-0.316992807088660
50	99.52	98.9962697054993	0.523730294500647
51	100.99	100.315088498369	0.674911501630864
52	99.92	100.266623564518	-0.346623564517751
53	101.97	101.613266451329	0.356733548671224
54	101.58	102.379910256553	-0.799910256553205
55	99.54	100.80573498899	-1.26573498898997
56	100.83	100.948930820975	-0.118930820974616

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Actuals, Interpolation, and Residuals \tabularnewline
Time or Index & Actuals & Interpolation (Forecast) & Residuals (Prediction Error) \tabularnewline
1 & 100.49 & 100.285754636971 & 0.204245363029248 \tabularnewline
2 & 99.72 & 100.446176166513 & -0.726176166513105 \tabularnewline
3 & 100.14 & 100.429299292084 & -0.289299292084255 \tabularnewline
4 & 98.48 & 98.739514850325 & -0.259514850324912 \tabularnewline
5 & 100.38 & 100.025542986104 & 0.354457013895707 \tabularnewline
6 & 101.45 & 100.63511652082 & 0.814883479180028 \tabularnewline
7 & 98.42 & 99.4421796960176 & -1.02217969601759 \tabularnewline
8 & 98.6 & 98.752105523321 & -0.152105523321035 \tabularnewline
9 & 100.06 & 99.5859397774459 & 0.474060222554142 \tabularnewline
10 & 98.62 & 99.4482112590477 & -0.828211259047656 \tabularnewline
11 & 100.84 & 100.074932774605 & 0.765067225394747 \tabularnewline
12 & 100.02 & 99.922992179915 & 0.0970078200848986 \tabularnewline
13 & 97.95 & 99.6410887045464 & -1.69108870454636 \tabularnewline
14 & 98.32 & 98.7120642425494 & -0.392064242549426 \tabularnewline
15 & 98.27 & 99.095321229213 & -0.825321229213055 \tabularnewline
16 & 97.22 & 97.8799238818143 & -0.659923881814293 \tabularnewline
17 & 99.28 & 99.4502143147758 & -0.170214314775844 \tabularnewline
18 & 100.38 & 99.8417976034398 & 0.538202396560212 \tabularnewline
19 & 99.02 & 99.0817338919404 & -0.0617338919404416 \tabularnewline
20 & 100.32 & 99.3107546326027 & 1.00924536739731 \tabularnewline
21 & 99.81 & 100.912291542684 & -1.10229154268390 \tabularnewline
22 & 100.6 & 99.8945108979245 & 0.705489102075445 \tabularnewline
23 & 101.19 & 101.262126537675 & -0.0721265376749542 \tabularnewline
24 & 100.47 & 99.9220074775317 & 0.547992522468268 \tabularnewline
25 & 101.77 & 100.172761030293 & 1.59723896970728 \tabularnewline
26 & 102.32 & 101.057820262569 & 1.26217973743127 \tabularnewline
27 & 102.39 & 101.721906255486 & 0.668093744514241 \tabularnewline
28 & 101.16 & 100.261105682577 & 0.898894317423284 \tabularnewline
29 & 100.63 & 101.457223722848 & -0.827223722848403 \tabularnewline
30 & 101.48 & 100.693446458391 & 0.786553541609285 \tabularnewline
31 & 101.44 & 99.62372650962 & 1.81627349038006 \tabularnewline
32 & 100.09 & 100.582300878934 & -0.492300878933925 \tabularnewline
33 & 100.7 & 101.031579453221 & -0.3315794532215 \tabularnewline
34 & 100.78 & 100.388843727234 & 0.391156272766464 \tabularnewline
35 & 99.81 & 100.666093107296 & -0.856093107295881 \tabularnewline
36 & 98.45 & 98.6050431019502 & -0.155043101950216 \tabularnewline
37 & 98.49 & 98.2834028211015 & 0.206597178898490 \tabularnewline
38 & 97.48 & 98.1476696228694 & -0.667669622869383 \tabularnewline
39 & 97.91 & 98.1383847248478 & -0.228384724847794 \tabularnewline
40 & 96.94 & 96.5728320207663 & 0.367167979233672 \tabularnewline
41 & 98.53 & 98.2437525249427 & 0.286247475057317 \tabularnewline
42 & 96.82 & 98.1597291607963 & -1.33972916079632 \tabularnewline
43 & 95.76 & 95.226624913432 & 0.53337508656794 \tabularnewline
44 & 95.27 & 95.5159081441677 & -0.245908144167734 \tabularnewline
45 & 97.32 & 96.3601892266487 & 0.959810773351254 \tabularnewline
46 & 96.68 & 96.9484341157943 & -0.268434115794253 \tabularnewline
47 & 97.87 & 97.706847580424 & 0.163152419576088 \tabularnewline
48 & 97.42 & 97.909957240603 & -0.489957240602947 \tabularnewline
49 & 97.94 & 98.2569928070887 & -0.316992807088660 \tabularnewline
50 & 99.52 & 98.9962697054993 & 0.523730294500647 \tabularnewline
51 & 100.99 & 100.315088498369 & 0.674911501630864 \tabularnewline
52 & 99.92 & 100.266623564518 & -0.346623564517751 \tabularnewline
53 & 101.97 & 101.613266451329 & 0.356733548671224 \tabularnewline
54 & 101.58 & 102.379910256553 & -0.799910256553205 \tabularnewline
55 & 99.54 & 100.80573498899 & -1.26573498898997 \tabularnewline
56 & 100.83 & 100.948930820975 & -0.118930820974616 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=58868&T=4

[TABLE]
[ROW][C]Multiple Linear Regression - Actuals, Interpolation, and Residuals[/C][/ROW]
[ROW][C]Time or Index[/C][C]Actuals[/C][C]Interpolation (Forecast)[/C][C]Residuals (Prediction Error)[/C][/ROW]
[ROW][C]1[/C][C]100.49[/C][C]100.285754636971[/C][C]0.204245363029248[/C][/ROW]
[ROW][C]2[/C][C]99.72[/C][C]100.446176166513[/C][C]-0.726176166513105[/C][/ROW]
[ROW][C]3[/C][C]100.14[/C][C]100.429299292084[/C][C]-0.289299292084255[/C][/ROW]
[ROW][C]4[/C][C]98.48[/C][C]98.739514850325[/C][C]-0.259514850324912[/C][/ROW]
[ROW][C]5[/C][C]100.38[/C][C]100.025542986104[/C][C]0.354457013895707[/C][/ROW]
[ROW][C]6[/C][C]101.45[/C][C]100.63511652082[/C][C]0.814883479180028[/C][/ROW]
[ROW][C]7[/C][C]98.42[/C][C]99.4421796960176[/C][C]-1.02217969601759[/C][/ROW]
[ROW][C]8[/C][C]98.6[/C][C]98.752105523321[/C][C]-0.152105523321035[/C][/ROW]
[ROW][C]9[/C][C]100.06[/C][C]99.5859397774459[/C][C]0.474060222554142[/C][/ROW]
[ROW][C]10[/C][C]98.62[/C][C]99.4482112590477[/C][C]-0.828211259047656[/C][/ROW]
[ROW][C]11[/C][C]100.84[/C][C]100.074932774605[/C][C]0.765067225394747[/C][/ROW]
[ROW][C]12[/C][C]100.02[/C][C]99.922992179915[/C][C]0.0970078200848986[/C][/ROW]
[ROW][C]13[/C][C]97.95[/C][C]99.6410887045464[/C][C]-1.69108870454636[/C][/ROW]
[ROW][C]14[/C][C]98.32[/C][C]98.7120642425494[/C][C]-0.392064242549426[/C][/ROW]
[ROW][C]15[/C][C]98.27[/C][C]99.095321229213[/C][C]-0.825321229213055[/C][/ROW]
[ROW][C]16[/C][C]97.22[/C][C]97.8799238818143[/C][C]-0.659923881814293[/C][/ROW]
[ROW][C]17[/C][C]99.28[/C][C]99.4502143147758[/C][C]-0.170214314775844[/C][/ROW]
[ROW][C]18[/C][C]100.38[/C][C]99.8417976034398[/C][C]0.538202396560212[/C][/ROW]
[ROW][C]19[/C][C]99.02[/C][C]99.0817338919404[/C][C]-0.0617338919404416[/C][/ROW]
[ROW][C]20[/C][C]100.32[/C][C]99.3107546326027[/C][C]1.00924536739731[/C][/ROW]
[ROW][C]21[/C][C]99.81[/C][C]100.912291542684[/C][C]-1.10229154268390[/C][/ROW]
[ROW][C]22[/C][C]100.6[/C][C]99.8945108979245[/C][C]0.705489102075445[/C][/ROW]
[ROW][C]23[/C][C]101.19[/C][C]101.262126537675[/C][C]-0.0721265376749542[/C][/ROW]
[ROW][C]24[/C][C]100.47[/C][C]99.9220074775317[/C][C]0.547992522468268[/C][/ROW]
[ROW][C]25[/C][C]101.77[/C][C]100.172761030293[/C][C]1.59723896970728[/C][/ROW]
[ROW][C]26[/C][C]102.32[/C][C]101.057820262569[/C][C]1.26217973743127[/C][/ROW]
[ROW][C]27[/C][C]102.39[/C][C]101.721906255486[/C][C]0.668093744514241[/C][/ROW]
[ROW][C]28[/C][C]101.16[/C][C]100.261105682577[/C][C]0.898894317423284[/C][/ROW]
[ROW][C]29[/C][C]100.63[/C][C]101.457223722848[/C][C]-0.827223722848403[/C][/ROW]
[ROW][C]30[/C][C]101.48[/C][C]100.693446458391[/C][C]0.786553541609285[/C][/ROW]
[ROW][C]31[/C][C]101.44[/C][C]99.62372650962[/C][C]1.81627349038006[/C][/ROW]
[ROW][C]32[/C][C]100.09[/C][C]100.582300878934[/C][C]-0.492300878933925[/C][/ROW]
[ROW][C]33[/C][C]100.7[/C][C]101.031579453221[/C][C]-0.3315794532215[/C][/ROW]
[ROW][C]34[/C][C]100.78[/C][C]100.388843727234[/C][C]0.391156272766464[/C][/ROW]
[ROW][C]35[/C][C]99.81[/C][C]100.666093107296[/C][C]-0.856093107295881[/C][/ROW]
[ROW][C]36[/C][C]98.45[/C][C]98.6050431019502[/C][C]-0.155043101950216[/C][/ROW]
[ROW][C]37[/C][C]98.49[/C][C]98.2834028211015[/C][C]0.206597178898490[/C][/ROW]
[ROW][C]38[/C][C]97.48[/C][C]98.1476696228694[/C][C]-0.667669622869383[/C][/ROW]
[ROW][C]39[/C][C]97.91[/C][C]98.1383847248478[/C][C]-0.228384724847794[/C][/ROW]
[ROW][C]40[/C][C]96.94[/C][C]96.5728320207663[/C][C]0.367167979233672[/C][/ROW]
[ROW][C]41[/C][C]98.53[/C][C]98.2437525249427[/C][C]0.286247475057317[/C][/ROW]
[ROW][C]42[/C][C]96.82[/C][C]98.1597291607963[/C][C]-1.33972916079632[/C][/ROW]
[ROW][C]43[/C][C]95.76[/C][C]95.226624913432[/C][C]0.53337508656794[/C][/ROW]
[ROW][C]44[/C][C]95.27[/C][C]95.5159081441677[/C][C]-0.245908144167734[/C][/ROW]
[ROW][C]45[/C][C]97.32[/C][C]96.3601892266487[/C][C]0.959810773351254[/C][/ROW]
[ROW][C]46[/C][C]96.68[/C][C]96.9484341157943[/C][C]-0.268434115794253[/C][/ROW]
[ROW][C]47[/C][C]97.87[/C][C]97.706847580424[/C][C]0.163152419576088[/C][/ROW]
[ROW][C]48[/C][C]97.42[/C][C]97.909957240603[/C][C]-0.489957240602947[/C][/ROW]
[ROW][C]49[/C][C]97.94[/C][C]98.2569928070887[/C][C]-0.316992807088660[/C][/ROW]
[ROW][C]50[/C][C]99.52[/C][C]98.9962697054993[/C][C]0.523730294500647[/C][/ROW]
[ROW][C]51[/C][C]100.99[/C][C]100.315088498369[/C][C]0.674911501630864[/C][/ROW]
[ROW][C]52[/C][C]99.92[/C][C]100.266623564518[/C][C]-0.346623564517751[/C][/ROW]
[ROW][C]53[/C][C]101.97[/C][C]101.613266451329[/C][C]0.356733548671224[/C][/ROW]
[ROW][C]54[/C][C]101.58[/C][C]102.379910256553[/C][C]-0.799910256553205[/C][/ROW]
[ROW][C]55[/C][C]99.54[/C][C]100.80573498899[/C][C]-1.26573498898997[/C][/ROW]
[ROW][C]56[/C][C]100.83[/C][C]100.948930820975[/C][C]-0.118930820974616[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=58868&T=4

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=58868&T=4
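
Each interpolation (forecast) in the table is the fitted value of the regression, i.e. the actual value minus the residual. A sketch, assuming fit and dft from the earlier sketch:

interp <- fitted(fit)                     # identical to dft$Y - resid(fit)
head(cbind(Actuals       = dft$Y,
           Interpolation = interp,
           Residuals     = resid(fit)))   # compare with the first rows of the table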



Goldfeld-Quandt test for Heteroskedasticity
p-values by Alternative Hypothesis
breakpoint index	greater	2-sided	less
19	0.303963966758769	0.607927933517538	0.696036033241231
20	0.365638215830075	0.73127643166015	0.634361784169925
21	0.517645400948254	0.964709198103492	0.482354599051746
22	0.469888852319322	0.939777704638644	0.530111147680678
23	0.407348377867632	0.814696755735263	0.592651622132368
24	0.372634449459053	0.745268898918107	0.627365550540947
25	0.711305071737905	0.57738985652419	0.288694928262095
26	0.718494030811575	0.56301193837685	0.281505969188425
27	0.624431433943117	0.751137132113766	0.375568566056883
28	0.550201835912548	0.899596328174904	0.449798164087452
29	0.626577771299234	0.746844457401533	0.373422228700766
30	0.566666647168356	0.866666705663287	0.433333352831644
31	0.821574191812326	0.356851616375347	0.178425808187673
32	0.800072654544613	0.399854690910774	0.199927345455387
33	0.80722012553282	0.38555974893436	0.19277987446718
34	0.729837799786123	0.540324400427754	0.270162200213877
35	0.625255334815668	0.749489330368665	0.374744665184333
36	0.524650119011385	0.95069976197723	0.475349880988615
37	0.807635883135567	0.384728233728866	0.192364116864433

\begin{tabular}{lllllllll}
\hline
Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
p-values & Alternative Hypothesis \tabularnewline
breakpoint index & greater & 2-sided & less \tabularnewline
19 & 0.303963966758769 & 0.607927933517538 & 0.696036033241231 \tabularnewline
20 & 0.365638215830075 & 0.73127643166015 & 0.634361784169925 \tabularnewline
21 & 0.517645400948254 & 0.964709198103492 & 0.482354599051746 \tabularnewline
22 & 0.469888852319322 & 0.939777704638644 & 0.530111147680678 \tabularnewline
23 & 0.407348377867632 & 0.814696755735263 & 0.592651622132368 \tabularnewline
24 & 0.372634449459053 & 0.745268898918107 & 0.627365550540947 \tabularnewline
25 & 0.711305071737905 & 0.57738985652419 & 0.288694928262095 \tabularnewline
26 & 0.718494030811575 & 0.56301193837685 & 0.281505969188425 \tabularnewline
27 & 0.624431433943117 & 0.751137132113766 & 0.375568566056883 \tabularnewline
28 & 0.550201835912548 & 0.899596328174904 & 0.449798164087452 \tabularnewline
29 & 0.626577771299234 & 0.746844457401533 & 0.373422228700766 \tabularnewline
30 & 0.566666647168356 & 0.866666705663287 & 0.433333352831644 \tabularnewline
31 & 0.821574191812326 & 0.356851616375347 & 0.178425808187673 \tabularnewline
32 & 0.800072654544613 & 0.399854690910774 & 0.199927345455387 \tabularnewline
33 & 0.80722012553282 & 0.38555974893436 & 0.19277987446718 \tabularnewline
34 & 0.729837799786123 & 0.540324400427754 & 0.270162200213877 \tabularnewline
35 & 0.625255334815668 & 0.749489330368665 & 0.374744665184333 \tabularnewline
36 & 0.524650119011385 & 0.95069976197723 & 0.475349880988615 \tabularnewline
37 & 0.807635883135567 & 0.384728233728866 & 0.192364116864433 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=58868&T=5

[TABLE]
[ROW][C]Goldfeld-Quandt test for Heteroskedasticity[/C][/ROW]
[ROW][C]p-values[/C][C]Alternative Hypothesis[/C][/ROW]
[ROW][C]breakpoint index[/C][C]greater[/C][C]2-sided[/C][C]less[/C][/ROW]
[ROW][C]19[/C][C]0.303963966758769[/C][C]0.607927933517538[/C][C]0.696036033241231[/C][/ROW]
[ROW][C]20[/C][C]0.365638215830075[/C][C]0.73127643166015[/C][C]0.634361784169925[/C][/ROW]
[ROW][C]21[/C][C]0.517645400948254[/C][C]0.964709198103492[/C][C]0.482354599051746[/C][/ROW]
[ROW][C]22[/C][C]0.469888852319322[/C][C]0.939777704638644[/C][C]0.530111147680678[/C][/ROW]
[ROW][C]23[/C][C]0.407348377867632[/C][C]0.814696755735263[/C][C]0.592651622132368[/C][/ROW]
[ROW][C]24[/C][C]0.372634449459053[/C][C]0.745268898918107[/C][C]0.627365550540947[/C][/ROW]
[ROW][C]25[/C][C]0.711305071737905[/C][C]0.57738985652419[/C][C]0.288694928262095[/C][/ROW]
[ROW][C]26[/C][C]0.718494030811575[/C][C]0.56301193837685[/C][C]0.281505969188425[/C][/ROW]
[ROW][C]27[/C][C]0.624431433943117[/C][C]0.751137132113766[/C][C]0.375568566056883[/C][/ROW]
[ROW][C]28[/C][C]0.550201835912548[/C][C]0.899596328174904[/C][C]0.449798164087452[/C][/ROW]
[ROW][C]29[/C][C]0.626577771299234[/C][C]0.746844457401533[/C][C]0.373422228700766[/C][/ROW]
[ROW][C]30[/C][C]0.566666647168356[/C][C]0.866666705663287[/C][C]0.433333352831644[/C][/ROW]
[ROW][C]31[/C][C]0.821574191812326[/C][C]0.356851616375347[/C][C]0.178425808187673[/C][/ROW]
[ROW][C]32[/C][C]0.800072654544613[/C][C]0.399854690910774[/C][C]0.199927345455387[/C][/ROW]
[ROW][C]33[/C][C]0.80722012553282[/C][C]0.38555974893436[/C][C]0.19277987446718[/C][/ROW]
[ROW][C]34[/C][C]0.729837799786123[/C][C]0.540324400427754[/C][C]0.270162200213877[/C][/ROW]
[ROW][C]35[/C][C]0.625255334815668[/C][C]0.749489330368665[/C][C]0.374744665184333[/C][/ROW]
[ROW][C]36[/C][C]0.524650119011385[/C][C]0.95069976197723[/C][C]0.475349880988615[/C][/ROW]
[ROW][C]37[/C][C]0.807635883135567[/C][C]0.384728233728866[/C][C]0.192364116864433[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=58868&T=5

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=58868&T=5
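
The p-values above come from lmtest::gqtest applied at every admissible breakpoint and for each alternative hypothesis. A sketch for the first breakpoint, assuming the fit object from the earlier sketch; it should roughly reproduce the first row of the table:

library(lmtest)
gqtest(fit, point = 19, alternative = 'greater')$p.value    # 'greater' column
gqtest(fit, point = 19, alternative = 'two.sided')$p.value  # '2-sided' column
gqtest(fit, point = 19, alternative = 'less')$p.value       # 'less' column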



Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity
Description	# significant tests	% significant tests	OK/NOK
1% type I error level	0	0	OK
5% type I error level	0	0	OK
10% type I error level	0	0	OK

\begin{tabular}{lllllllll}
\hline
Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
Description & # significant tests & % significant tests & OK/NOK \tabularnewline
1% type I error level & 0 & 0 & OK \tabularnewline
5% type I error level & 0 & 0 & OK \tabularnewline
10% type I error level & 0 & 0 & OK \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=58868&T=6

[TABLE]
[ROW][C]Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity[/C][/ROW]
[ROW][C]Description[/C][C]# significant tests[/C][C]% significant tests[/C][C]OK/NOK[/C][/ROW]
[ROW][C]1% type I error level[/C][C]0[/C][C]0[/C][C]OK[/C][/ROW]
[ROW][C]5% type I error level[/C][C]0[/C][C]0[/C][C]OK[/C][/ROW]
[ROW][C]10% type I error level[/C][C]0[/C][C]0[/C][C]OK[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=58868&T=6

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=58868&T=6
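
The meta analysis simply counts how often the two-sided Goldfeld-Quandt p-value falls below each type I error level. A sketch, assuming gqarr is the p-value matrix built by the R code at the bottom of this page (its second column holds the two-sided p-values):

p2 <- gqarr[, 2]
sapply(c('1%' = 0.01, '5%' = 0.05, '10%' = 0.10),
       function(a) c(count = sum(p2 < a), fraction = sum(p2 < a) / length(p2)))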



Parameters (Session):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
Parameters (R input):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
R code (references can be found in the software module):
library(lattice)
library(lmtest)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
par1 <- as.numeric(par1)
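# transpose the user-supplied series so that rows are observations,
# then move column 'par1' (the dependent variable) to the front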
x <- t(y)
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
if (par3 == 'First Differences'){
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
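# optionally append 11 monthly seasonal dummies M1..M11 (month 12 is the reference category)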
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
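# optionally append a linear trend variable t = 1, 2, ..., n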
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1,])
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
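# Goldfeld-Quandt heteroskedasticity test at every admissible breakpoint
# (only computed when the sample contains more than n25 observations)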
if (n > n25) {
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
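# diagnostic plots: actuals vs. interpolation, residuals, histogram, density,
# normal Q-Q plot, lag plot, ACF, PACF, and standard lm diagnostics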
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
qqline(mysum$resid)
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
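# build the output tables shown above, using the table helpers loaded from 'createtable'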
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, mysum$coefficients[i,1], sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,mysum$coefficients[i,1])
a<-table.element(a, round(mysum$coefficients[i,2],6))
a<-table.element(a, round(mysum$coefficients[i,3],4))
a<-table.element(a, round(mysum$coefficients[i,4],6))
a<-table.element(a, round(mysum$coefficients[i,4]/2,6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, sqrt(mysum$r.squared))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, mysum$r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, mysum$adj.r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, mysum$fstatistic[1])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[2])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[3])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, 1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, mysum$sigma)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, sum(myerror*myerror))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,x[i])
a<-table.element(a,x[i]-mysum$resid[i])
a<-table.element(a,mysum$resid[i])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,gqarr[mypoint-kp3+1,1])
a<-table.element(a,gqarr[mypoint-kp3+1,2])
a<-table.element(a,gqarr[mypoint-kp3+1,3])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,numsignificant1)
a<-table.element(a,numsignificant1/numgqtests)
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,numsignificant5)
a<-table.element(a,numsignificant5/numgqtests)
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,numsignificant10)
a<-table.element(a,numsignificant10/numgqtests)
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}