Author: (the author of this computation has been verified)
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Fri, 20 Nov 2009 05:04:49 -0700
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2009/Nov/20/t12587187798hgeaf8pu1d13bk.htm/, Retrieved Thu, 28 Mar 2024 16:18:00 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=58064, Retrieved Thu, 28 Mar 2024 16:18:00 +0000

Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 139
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-     [Multiple Regression] [Q1 The Seatbeltlaw] [2007-11-14 19:27:43] [8cd6641b921d30ebe00b648d1481bba0]
- RMPD  [Multiple Regression] [Seatbelt] [2009-11-12 14:03:14] [b98453cac15ba1066b407e146608df68]
F    D      [Multiple Regression] [ws777] [2009-11-20 12:04:49] [9a1fef436e1d399a5ecd6808bfbd8489] [Current]
-    D        [Multiple Regression] [w7] [2009-11-22 13:21:07] [0a7d38ad9c7f1a2c46637c75a8a0e083]
Feedback Forum
2009-11-25 19:15:55 [Nick Aerts]
Model 1: http://www.freestatistics.org/blog/index.php?v=date/2009/Nov/25/t1259174661rk8ftst492o9qga.htm/
(For an explanation of this model, see the feedback forum under your previous link.)

Model 2: http://www.freestatistics.org/blog/index.php?v=date/2009/Nov/25/t1259170139ejbxbdrwasqhwp2.htm/
To this model we have added the declining long-term trend. We now see the following:

Analysis of the numerical results:
Economic growth lowers the percentage of unemployed men by 0.03. Adding the trend term (t) produces a further monthly decrease of 0.02. Both decreases are significant because the type I error (1-tail p-value) is nearly 0.
To see how much of the variation this model explains, we look at the adjusted R².
Adjusted R² = 35.7%; this value is significant because the p-value is nearly 0.
The standard error is 0.53.

Analysis of the charts:
It is also visually apparent that our predictions are not yet good. This can be seen in the 'actuals and interpolation' chart: the dots are the predicted values and the line is the time series.
In the residuals the trend has clearly disappeared. We also see that the values do not fluctuate nicely around 0.
In the autocorrelation function the long-term trend has been removed. We do note that the first two lags are very strongly autocorrelated: last month's value is closely related to this month's, and the same holds for this month and the month before that. We can remove this autocorrelation by adding lags. We do this by creating two extra series, Y(t-1) and Y(t-2), in Excel: in the Y(t-1) column the first cell is left empty and Y(t) follows from there; for Y(t-2) the first two cells are left empty. We then copy the series back into the statistical software (take care not to select any empty cells!). The result of these lagged models can be seen under your link 3.
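
The same lag construction can also be done directly in R instead of Excel. A minimal sketch (not part of the archived computation), assuming the series is stored in a numeric vector y; the names Ylag1 and Ylag2 are purely illustrative:

# build Y(t-1) and Y(t-2) as shifted copies of y, padded with NA at the start
lagged <- data.frame(
  Y     = y,
  Ylag1 = c(NA, head(y, -1)),        # value of the previous month
  Ylag2 = c(NA, NA, head(y, -2))     # value of two months earlier
)
lagged <- na.omit(lagged)            # drop the incomplete leading rows before refitting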

2009-11-26 09:39:05 [Angelo Stuer]
In my opinion, adding a seasonal component is not applicable to your series. We clearly see an adverse effect on the autocorrelation, and the trend in your residuals is also still the same.

Dataseries X:
100	0
95.84395716	0
105.5073942	1
118.1540031	1
101.8612953	1
109.8419174	1
105.6348802	1
112.927078	1
133.0698623	1
125.6756757	1
146.736359	1
142.5803162	1
106.1448241	1
126.5170831	1
132.7893932	1
121.2391637	1
114.5079041	1
146.1499235	1
146.1244263	1
128.5058644	1
155.5838858	1
125.0382458	1
136.8944416	1
142.2233554	1
117.7715451	1
120.627231	1
127.7664457	1
135.1096379	1
105.7113717	1
117.9245283	1
120.754717	1
107.572667	1
130.4436512	1
107.2157063	1
105.0739419	1
130.1121877	1
109.6379398	1
116.7261601	1
97.11881693	0
140.8975013	1
108.2865885	1
97.65425803	0
112.0346762	1
123.0494646	1
112.4171341	1
116.4966854	1
104.6914839	1
122.2335543	1
99.79602244	0
96.71086181	0
112.3151453	1
102.5497195	1
104.5385008	1
122.0805711	1
80.64762876	0
91.40744518	0
99.51555329	0
106.527282	1
98.49566548	0
106.7567568	1




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 4 seconds
R Server: 'Gwilym Jenkins' @ 72.249.127.135

Source: https://freestatistics.org/blog/index.php?pk=58064&T=0








Multiple Linear Regression - Estimated Regression Equation
Y[t] = 103.383194533056 + 25.3980395469445 X[t] - 11.9519519732223 M1[t] - 7.33695962722222 M2[t] - 8.60218710461113 M3[t] - 5.19122898000001 M4[t] - 21.800102 M5[t] - 4.97138650461111 M6[t] - 10.6623604786111 M7[t] - 11.0091223346111 M8[t] + 2.50439116738889 M9[t] - 12.59051504 M10[t] - 5.32324779461111 M11[t] + e[t]
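
The equation above is an OLS fit of the first data column on the binary regressor and eleven monthly dummies (M12, December, is the omitted reference month). A minimal R sketch of the same specification, assuming the two data columns are stored in vectors y and x and that observation 1 falls in month 1:

# refit the archived specification (sketch only, not the archived script)
n <- length(y)
m <- factor((seq_len(n) - 1) %% 12 + 1, levels = c(12, 1:11))  # month 12 as reference level
fit <- lm(y ~ x + m)                                           # X plus monthly dummies M1..M11
summary(fit)$coefficients                                      # should reproduce the table below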

Source: https://freestatistics.org/blog/index.php?pk=58064&T=1








Multiple Linear Regression - Ordinary Least Squares
Variable      Parameter           S.D.       T-STAT (H0: parameter = 0)   2-tail p-value   1-tail p-value
(Intercept)   103.383194533056    7.422665   13.928                       0                0
X             25.3980395469445    4.751874   5.3448                       3e-06            1e-06
M1            -11.9519519732223   8.285176   -1.4426                      0.155773         0.077886
M2            -7.33695962722222   8.285176   -0.8856                      0.380367         0.190184
M3            -8.60218710461113   8.120006   -1.0594                      0.29484          0.14742
M4            -5.19122898000001   8.064198   -0.6437                      0.522874         0.261437
M5            -21.800102          8.064198   -2.7033                      0.009525         0.004763
M6            -4.97138650461111   8.120006   -0.6122                      0.54333          0.271665
M7            -10.6623604786111   8.120006   -1.3131                      0.195525         0.097763
M8            -11.0091223346111   8.120006   -1.3558                      0.181641         0.090821
M9            2.50439116738889    8.120006   0.3084                       0.759124         0.379562
M10           -12.59051504        8.064198   -1.5613                      0.125165         0.062583
M11           -5.32324779461111   8.120006   -0.6556                      0.515296         0.257648
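
Note that the 1-tail p-values in this table are simply the 2-tail p-values halved, which is exactly how the archived script below computes them. A short sketch, reusing the hypothetical fit object from the sketch above:

ctab <- summary(fit)$coefficients                            # Estimate, Std. Error, t value, Pr(>|t|)
cbind(round(ctab, 6),
      "1-tail p-value" = round(ctab[, "Pr(>|t|)"] / 2, 6))   # one-sided p-values, as in the table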

Source: https://freestatistics.org/blog/index.php?pk=58064&T=2








Multiple Linear Regression - Regression Statistics
Multiple R                    0.700889678005733
R-squared                     0.49124634073498
Adjusted R-squared            0.361351789433272
F-TEST (value)                3.78188565888308
F-TEST (DF numerator)         12
F-TEST (DF denominator)       47
p-value                       0.000490109125249605
Multiple Linear Regression - Residual Statistics
Residual Standard Deviation   12.7506163904545
Sum Squared Residuals         7641.17626181678
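
For reference, the adjusted R-squared above follows from the reported R-squared with n = 60 observations and p = 12 regressors besides the intercept, and the p-value is the upper tail probability of the F(12, 47) distribution at the reported F value:

\[ \bar{R}^2 = 1 - (1 - R^2)\,\frac{n-1}{n-p-1} = 1 - (1 - 0.491246)\cdot\frac{59}{47} \approx 0.361352 \]

\[ p = \Pr\bigl(F_{12,\,47} > 3.78189\bigr) \approx 0.00049 \]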

Source: https://freestatistics.org/blog/index.php?pk=58064&T=3








Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index   Actuals   Interpolation (Forecast)   Residuals (Prediction Error)
1   100   91.4312425598335   8.5687574401665
2   95.84395716   96.0462349058333   -0.202277745833311
3   105.5073942   120.179046975389   -14.6716527753889
4   118.1540031   123.5900051   -5.436002
5   101.8612953   106.98113208   -5.11983677999997
6   109.8419174   123.809847575389   -13.9679301753889
7   105.6348802   118.118873601389   -12.4839934013889
8   112.927078   117.772111745389   -4.84503374538888
9   133.0698623   131.285625247389   1.78423705261114
10   125.6756757   116.19071904   9.48495666000002
11   146.736359   123.457986285389   23.2783727146111
12   142.5803162   128.78123408   13.7990821200000
13   106.1448241   116.829282106778   -10.6844580067777
14   126.5170831   121.444274452778   5.07280864722221
15   132.7893932   120.179046975389   12.6103462246111
16   121.2391637   123.5900051   -2.35084139999999
17   114.5079041   106.98113208   7.52677202
18   146.1499235   123.809847575389   22.3400759246111
19   146.1244263   118.118873601389   28.0055526986111
20   128.5058644   117.772111745389   10.7337526546111
21   155.5838858   131.285625247389   24.2982605526111
22   125.0382458   116.19071904   8.84752676
23   136.8944416   123.457986285389   13.4364553146111
24   142.2233554   128.78123408   13.44212132
25   117.7715451   116.829282106778   0.942262993222257
26   120.627231   121.444274452778   -0.817043452777788
27   127.7664457   120.179046975389   7.58739872461112
28   135.1096379   123.5900051   11.5196328
29   105.7113717   106.98113208   -1.26976038000000
30   117.9245283   123.809847575389   -5.88531927538889
31   120.754717   118.118873601389   2.6358433986111
32   107.572667   117.772111745389   -10.1994447453889
33   130.4436512   131.285625247389   -0.841974047388886
34   107.2157063   116.19071904   -8.97501274000001
35   105.0739419   123.457986285389   -18.3840443853889
36   130.1121877   128.78123408   1.33095361999999
37   109.6379398   116.829282106778   -7.19134230677774
38   116.7261601   121.444274452778   -4.71811435277778
39   97.11881693   94.7810074284444   2.33780950155557
40   140.8975013   123.5900051   17.3074962
41   108.2865885   106.98113208   1.30545641999999
42   97.65425803   98.4118080284444   -0.757549998444434
43   112.0346762   118.118873601389   -6.08419740138889
44   123.0494646   117.772111745389   5.2773528546111
45   112.4171341   131.285625247389   -18.8684911473889
46   116.4966854   116.19071904   0.305966360000001
47   104.6914839   123.457986285389   -18.7665023853889
48   122.2335543   128.78123408   -6.54767978000001
49   99.79602244   91.4312425598333   8.36477988016673
50   96.71086181   96.0462349058333   0.664626904166679
51   112.3151453   120.179046975389   -7.86390167538889
52   102.5497195   123.5900051   -21.0402856
53   104.5385008   106.98113208   -2.44263128000001
54   122.0805711   123.809847575389   -1.72927647538890
55   80.64762876   92.7208340544444   -12.0732052944444
56   91.40744518   92.3740721984444   -0.966627018444427
57   99.51555329   105.887585700444   -6.37203241044442
58   106.527282   116.19071904   -9.66343704
59   98.49566548   98.0599467384444   0.435718741555576
60   106.7567568   128.78123408   -22.02447728
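
In the archived script the interpolation column is computed as actuals minus residuals; the same values can also be read off the fit directly. A minimal sketch, again assuming the hypothetical fit object from the sketches above:

interp <- fitted(fit)                                   # interpolation / in-sample forecast
res    <- residuals(fit)                                # prediction error
head(cbind(Actuals = y, Interpolation = interp, Residuals = res))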

Source: https://freestatistics.org/blog/index.php?pk=58064&T=4








Goldfeld-Quandt test for Heteroskedasticity
p-values by Alternative Hypothesis
breakpoint index   greater   2-sided   less
16   0.594949849927688   0.810100300144623   0.405050150072312
17   0.492496357067971   0.984992714135943   0.507503642932029
18   0.77666262533138   0.44667474933724   0.22333737466862
19   0.944301637559277   0.111396724881446   0.055698362440723
20   0.927217396528094   0.145565206943812   0.0727826034719059
21   0.967081791740485   0.0658364165190301   0.0329182082595150
22   0.953087572762044   0.093824854475913   0.0469124272379565
23   0.968044488099713   0.0639110238005738   0.0319555119002869
24   0.97379443843920   0.0524111231215985   0.0262055615607992
25   0.955695348943648   0.0886093021127034   0.0443046510563517
26   0.927631238181818   0.144737523636364   0.0723687618181819
27   0.91330213914412   0.173395721711760   0.0866978608558802
28   0.913578570672558   0.172842858654885   0.0864214293274425
29   0.868053996828366   0.263892006343268   0.131946003171634
30   0.821116195183794   0.357767609632412   0.178883804816206
31   0.802244469340347   0.395511061319305   0.197755530659653
32   0.772711125744375   0.454577748511251   0.227288874255625
33   0.779319097926239   0.441361804147522   0.220680902073761
34   0.745566740058642   0.508866519882715   0.254433259941358
35   0.81026436976723   0.379471260465539   0.189735630232769
36   0.808465055523225   0.383069888953551   0.191534944476775
37   0.748063010851904   0.503873978296192   0.251936989148096
38   0.650223267793376   0.699553464413248   0.349776732206624
39   0.5483229644695   0.903354071061   0.4516770355305
40   0.911553620355287   0.176892759289427   0.0884463796447134
41   0.845193106663552   0.309613786672896   0.154806893336448
42   0.746371062184199   0.507257875631601   0.253628937815801
43   0.699043094793905   0.601913810412191   0.300956905206095
44   0.755852097940665   0.48829580411867   0.244147902059335

Source: https://freestatistics.org/blog/index.php?pk=58064&T=5








Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity
Description   # significant tests   % significant tests   OK/NOK
1% type I error level   0   0   OK
5% type I error level   0   0   OK
10% type I error level   5   0.172413793103448   NOK
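
The OK/NOK column applies the rule used in the archived script below: the fraction of breakpoints with a significant 2-sided Goldfeld-Quandt p-value is compared against the nominal type I error level (here 5 of the 29 breakpoints, about 17%, are significant at the 10% level, hence NOK). A compact sketch of that rule, assuming gqarr is the p-value matrix built by the script:

frac <- function(alpha) mean(gqarr[, 2] < alpha)         # share of significant 2-sided tests
sapply(c(0.01, 0.05, 0.10),
       function(a) if (frac(a) < a) "OK" else "NOK")     # "NOK" flags possible heteroskedasticity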

Source: https://freestatistics.org/blog/index.php?pk=58064&T=6




Parameters (Session):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = No Linear Trend ;
Parameters (R input):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = No Linear Trend ;
R code (references can be found in the software module):
library(lattice)
library(lmtest)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
par1 <- as.numeric(par1)
x <- t(y)
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
if (par3 == 'First Differences'){
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) {   # note: the archived script had 1:n-1, which evaluates to 0:(n-1)
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
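# Optional seasonal dummies: add 11 monthly indicator columns M1..M11
# (M12, December, is the omitted reference month).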
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1,])
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
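# Goldfeld-Quandt test at every admissible breakpoint; gqarr collects the
# p-values for the 'greater', 'two.sided', and 'less' alternatives.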
if (n > n25) {
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
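# Diagnostic plots: actuals vs. interpolation, residuals, histogram, density,
# normal Q-Q plot, lag plot, ACF, PACF, and the standard lm diagnostics.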
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
qqline(mysum$resid)
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
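# Build the result tables shown above using the server-side table helpers.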
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, mysum$coefficients[i,1], sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,mysum$coefficients[i,1])
a<-table.element(a, round(mysum$coefficients[i,2],6))
a<-table.element(a, round(mysum$coefficients[i,3],4))
a<-table.element(a, round(mysum$coefficients[i,4],6))
a<-table.element(a, round(mysum$coefficients[i,4]/2,6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, sqrt(mysum$r.squared))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, mysum$r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, mysum$adj.r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, mysum$fstatistic[1])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[2])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[3])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, 1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, mysum$sigma)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, sum(myerror*myerror))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,x[i])
a<-table.element(a,x[i]-mysum$resid[i])
a<-table.element(a,mysum$resid[i])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,gqarr[mypoint-kp3+1,1])
a<-table.element(a,gqarr[mypoint-kp3+1,2])
a<-table.element(a,gqarr[mypoint-kp3+1,3])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,numsignificant1)
a<-table.element(a,numsignificant1/numgqtests)
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,numsignificant5)
a<-table.element(a,numsignificant5/numgqtests)
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,numsignificant10)
a<-table.element(a,numsignificant10/numgqtests)
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}