Free Statistics
Author: *Unverified author*
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Thu, 27 Nov 2008 06:15:57 -0700
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2008/Nov/27/t1227792212c47e5hs3x3dtkt9.htm/, Retrieved Thu, 09 May 2024 15:45:55 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=25803, Retrieved Thu, 09 May 2024 15:45:55 +0000
IsPrivate? No (this computation is public)
Estimated Impact: 151
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
F       [Multiple Regression] [W6Q3 (2)] [2008-11-27 13:15:57] [434228f9e3c7eaa307f0fb12855e2147] [Current]
-         [Multiple Regression] [AH paper 3] [2008-12-15 18:18:03] [74be16979710d4c4e7c6647856088456]
- RM D      [Multiple Regression] [Multiple Regression] [2009-12-20 18:17:03] [3dd791303389e75e672968b227170a72]
Feedback Forum
2008-11-30 13:18:56 [Nicolaj Wuyts]
You could add here that under normal circumstances the residual histogram should show a normal distribution, but that this is not the case here. The autocorrelation function also shows that the model still contains many significant errors.
2008-11-30 17:20:42 [Toon Wouters]
Good computation, but there are still a few remarks about the interpretation and the conclusion. It is better to look at the adjusted R-squared, which is more accurate, although here it makes little difference. For the density plot you should have said more about the distribution: is it normally distributed? No, because there are enormous tails on both sides. The residual lag plot would also have been interesting, to see whether there are relationships between the prediction errors: we can see that there is a high concentration, so some correlation. The autocorrelation function would also have been interesting: many of the bars fall relative to the confidence interval in a way that points to autocorrelation being present, which does not contribute to a good model.
2008-12-01 15:59:56 [Hannes Van Hoof]
Good interpretation, but a few graphs are still missing.
Such as the residual histogram, which in a normal, good model should show a normal distribution.
And the autocorrelation plot: it shows that autocorrelation is present, so there are still errors in the model.
2008-12-01 21:37:35 [Jonas Scheltjens]
The student could also have shown the actuals and interpolation of the series without the dummy variables, so that we could make a better comparison. That would of course require a short explanation of what that representation means.
For the discussion of the data presented with the influence of the dummy variables included, the student could have added an extra table, so that the explanation he/she gives about it can easily be checked.
What is strange is that the student first refers to the year 2003 as the reference year and then writes in the text that the comparison was made against 2000 as the base year.
The choice of the significant event is also odd. The chosen event is the fact that Yves Leterme is prime minister, as opposed to Guy Verhofstadt holding that office. It seems rather strange to me that a price increase in the iron and scrap category would occur merely because someone else is prime minister. It could, however, be explained by, for example, a change in agreements, laws, or regulations concerning iron and scrap prices adopted under a different prime minister (and the rest of the cabinet, of course).
The choice of the two-sided test is good, since the event can have both a positive and a negative influence on the price index. As the student says, in January, February, and March the p-value lies below the type I error, which means that there is a significant difference there. For that reason the student should also have made a remark about seasonality.
In this table we look primarily at the adjusted R-squared, not at the R-squared as the student claims. In practice this number means that almost 93% of the fluctuations (which is very high), caused by the occurrence of an event, can be explained by this model. It is therefore correct to state that this is a good model.
The student is right about the Actuals and Interpolation graph. It clearly shows that the series has an upward trend. We can see very well that a level shift occurs at index 64, which can be explained by the occurrence of the (unknown) event. So we can indeed see that the index rises.
What the student says about the Residuals graph is entirely correct and requires no further comment.
The student is also correct in his statement about the density plot. We can therefore conclude that these data are not normally distributed.
No further graphs were included or discussed, although this would have been useful for the Residual Histogram, the Residual Normal Q-Q plot, the Residual Lag plot, lowess, and regression line, and the Residual Autocorrelation Function. A short discussion of what they show:
The Residual Histogram should normally show a normal distribution. Here we conclude that this is not the case (for example, there are still too many values in both the left and the right tail), so this model is not yet optimal.
The Residual Normal Q-Q plot gives information about whether or not the data are normally distributed. The quantiles show to what extent this is the case. Here we see that they correspond reasonably well with the variable under study. If all points were to lie on the line, one could speak of a perfect normal distribution. Here, however, some extreme values can still be observed in the tails, while the data in the middle agree better with a normal distribution.
The Residual Lag plot, lowess, and regression line relates the prediction error of one month to that of the previous month. From this it is possible to conclude that there is not really a strong correlation between the variables. The conclusion drawn is thus that it is not a good model.
The Residual Autocorrelation Function is characterized by its blue dashed lines and its vertical lines (the latter represent the prediction errors). The dashed lines mark the 95% confidence interval. Whenever the vertical lines are long enough to cross the dashed lines (and thus the confidence interval), their values are significantly different. For the model to be fully adequate, the mean should be constant and equal to 0, no pattern should be visible, and no autocorrelation should be present. Here we can see that some lines at the beginning cross the horizontal dashed lines and are therefore significantly different (these prediction errors are not due to chance).
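
For reference, the dashed 95% bounds in the Residual Autocorrelation Function produced by R's acf() lie at roughly plus or minus 1.96 divided by the square root of n. A minimal sketch of that bound for this model (assuming the n = 70 residuals of the regression below):

n <- 70                              # number of residuals in this regression
ci.bound <- qnorm(0.975) / sqrt(n)   # approximate 95% band drawn by acf()
ci.bound                             # about 0.234: bars beyond +/- 0.234 are significant
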
2008-12-15 18:05:32 [Annemiek Hoofman]
My apologies that the reference year causes confusion. On the website of the National Bank the consumer price indices are calculated relative to the year 2000, which is set equal to 100. For my time series I started in January 2003, so for making a forecast we base ourselves on the year 2003 (the reference year), with the figures expressed relative to the year 2000. I see no other way to handle the time series, unless I let the series start in 2000, or recompute everything and set 2003 equal to 100 (but that last option seems like an incredible amount of work).
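
Rebasing the series so that January 2003 equals 100 is in fact a one-line transformation rather than a large amount of work. A minimal sketch, assuming the index values are stored in a numeric vector x ordered from January 2003 onward:

x <- c(115.6, 120.3, 121.9)     # first values of Dataseries X (January 2003 onward)
x.rebased <- x / x[1] * 100     # divide by the January 2003 value and scale to 100
x.rebased                       # January 2003 becomes 100; the next values become roughly 104.07 and 105.45
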

Dataseries X (columns: series value, event dummy D):
115.6	0
120.3	0
121.9	0
121.7	0
118.9	0
113.4	0
114	0
117.5	0
120.9	0
125.1	0
124.7	0
128.2	0
149.7	0
163.6	0
173.9	0
164.5	0
154.2	0
147.9	0
159.3	0
170.3	0
170	0
174.2	0
190.8	0
179.9	0
240.8	0
241.9	0
241.1	0
239.6	0
220.8	0
209.3	0
209.9	0
228.3	0
242.1	0
226.4	0
231.5	0
229.7	0
257.6	0
260	0
264.4	0
268.8	0
271.4	0
273.8	0
277.4	0
268.2	0
264.6	0
266.6	0
266	0
267.4	0
289.8	0
294	0
310.3	0
311.7	0
302.1	0
298.2	0
299.2	0
296.2	0
299	0
300	0
299.4	0
300.2	0
470.2	0
472.1	0
484.8	0
513.4	1
547.2	1
548.1	1
544.7	1
521.1	1
459	1
413.2	1





\begin{tabular}{lllllllll}
\hline
Summary of computational transaction \tabularnewline
Raw Input & view raw input (R code)  \tabularnewline
Raw Output & view raw output of R engine  \tabularnewline
Computing time & 4 seconds \tabularnewline
R Server & 'Gwilym Jenkins' @ 72.249.127.135 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=25803&T=0



\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Estimated Regression Equation \tabularnewline
y[t] =  +  63.837894736842 +  133.843609022556D[t] +  54.7091812865496M1[t] +  55.0413450292399M2[t] +  58.0901754385965M3[t] +  35.2984043441938M4[t] +  30.0805680868839M5[t] +  21.7293984962406M6[t] +  19.6615622389307M7[t] +  14.8103926482874M8[t] +  2.77588972431076M9[t] -9.94194653299918M10[t] +  5.76783625730993M11[t] +  4.36783625730994t  + e[t] \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=25803&T=1
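
To illustrate how the equation is read: for the first observation (January, so M1[t] = 1, D[t] = 0, all other dummies zero, and t = 1), the interpolated value is the intercept plus the January dummy effect plus one step of the linear trend, which reproduces the first fitted value in the Actuals, Interpolation, and Residuals table further below:

\begin{displaymath}
\hat{y}[1] = 63.8379 + 54.7092 + 4.3678 \cdot 1 \approx 122.91
\end{displaymath}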



\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Ordinary Least Squares \tabularnewline
Variable & Parameter & S.D. & T-STAT (H0: parameter = 0) & 2-tail p-value & 1-tail p-value \tabularnewline
(Intercept) & 63.837894736842 & 15.749137 & 4.0534 & 0.000157 & 7.9e-05 \tabularnewline
D & 133.843609022556 & 14.869626 & 9.0011 & 0 & 0 \tabularnewline
M1 & 54.7091812865496 & 18.621068 & 2.938 & 0.004789 & 0.002394 \tabularnewline
M2 & 55.0413450292399 & 18.609965 & 2.9576 & 0.004534 & 0.002267 \tabularnewline
M3 & 58.0901754385965 & 18.601324 & 3.1229 & 0.002833 & 0.001417 \tabularnewline
M4 & 35.2984043441938 & 18.788928 & 1.8787 & 0.065497 & 0.032748 \tabularnewline
M5 & 30.0805680868839 & 18.770584 & 1.6025 & 0.114663 & 0.057332 \tabularnewline
M6 & 21.7293984962406 & 18.754671 & 1.1586 & 0.251531 & 0.125766 \tabularnewline
M7 & 19.6615622389307 & 18.741196 & 1.0491 & 0.298635 & 0.149317 \tabularnewline
M8 & 14.8103926482874 & 18.730164 & 0.7907 & 0.432441 & 0.216221 \tabularnewline
M9 & 2.77588972431076 & 18.721579 & 0.1483 & 0.882661 & 0.44133 \tabularnewline
M10 & -9.94194653299918 & 18.715444 & -0.5312 & 0.59737 & 0.298685 \tabularnewline
M11 & 5.76783625730993 & 19.418023 & 0.297 & 0.767539 & 0.383769 \tabularnewline
t & 4.36783625730994 & 0.214319 & 20.3801 & 0 & 0 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=25803&T=2



\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Regression Statistics \tabularnewline
Multiple R & 0.97198092869887 \tabularnewline
R-squared & 0.944746925754318 \tabularnewline
Adjusted R-squared & 0.931920319232999 \tabularnewline
F-TEST (value) & 73.6552512298603 \tabularnewline
F-TEST (DF numerator) & 13 \tabularnewline
F-TEST (DF denominator) & 56 \tabularnewline
p-value & 0 \tabularnewline
Multiple Linear Regression - Residual Statistics \tabularnewline
Residual Standard Deviation & 30.7007198242396 \tabularnewline
Sum Squared Residuals & 52781.9150726817 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=25803&T=3
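
As a quick arithmetic check of the adjusted R-squared discussed in the feedback above (n = 70 observations and p = 14 estimated coefficients: intercept, D, 11 monthly dummies, and the trend t, hence the 56 denominator degrees of freedom of the F-test):

\begin{displaymath}
\bar{R}^2 = 1 - (1 - R^2)\,\frac{n-1}{n-p} = 1 - (1 - 0.9447)\cdot\frac{69}{56} \approx 0.9319
\end{displaymath}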



\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Actuals, Interpolation, and Residuals \tabularnewline
Time or Index & Actuals & Interpolation (Forecast) & Residuals (Prediction Error) \tabularnewline
1 & 115.6 & 122.914912280702 & -7.3149122807023 \tabularnewline
2 & 120.3 & 127.614912280702 & -7.31491228070157 \tabularnewline
3 & 121.9 & 135.031578947368 & -13.1315789473684 \tabularnewline
4 & 121.7 & 116.607644110276 & 5.09235588972419 \tabularnewline
5 & 118.9 & 115.757644110276 & 3.14235588972434 \tabularnewline
6 & 113.4 & 111.774310776942 & 1.62568922305774 \tabularnewline
7 & 114 & 114.074310776942 & -0.0743107769423977 \tabularnewline
8 & 117.5 & 113.590977443609 & 3.90902255639095 \tabularnewline
9 & 120.9 & 105.924310776942 & 14.9756892230576 \tabularnewline
10 & 125.1 & 97.5743107769423 & 27.5256892230577 \tabularnewline
11 & 124.7 & 117.651929824561 & 7.04807017543864 \tabularnewline
12 & 128.2 & 116.251929824561 & 11.9480701754386 \tabularnewline
13 & 149.7 & 175.328947368421 & -25.628947368421 \tabularnewline
14 & 163.6 & 180.028947368421 & -16.428947368421 \tabularnewline
15 & 173.9 & 187.445614035088 & -13.5456140350877 \tabularnewline
16 & 164.5 & 169.021679197995 & -4.52167919799496 \tabularnewline
17 & 154.2 & 168.171679197995 & -13.9716791979950 \tabularnewline
18 & 147.9 & 164.188345864662 & -16.2883458646617 \tabularnewline
19 & 159.3 & 166.488345864662 & -7.18834586466163 \tabularnewline
20 & 170.3 & 166.005012531328 & 4.2949874686717 \tabularnewline
21 & 170 & 158.338345864662 & 11.6616541353384 \tabularnewline
22 & 174.2 & 149.988345864662 & 24.2116541353383 \tabularnewline
23 & 190.8 & 170.065964912281 & 20.7340350877193 \tabularnewline
24 & 179.9 & 168.665964912281 & 11.2340350877193 \tabularnewline
25 & 240.8 & 227.74298245614 & 13.0570175438598 \tabularnewline
26 & 241.9 & 232.442982456140 & 9.4570175438596 \tabularnewline
27 & 241.1 & 239.859649122807 & 1.24035087719295 \tabularnewline
28 & 239.6 & 221.435714285714 & 18.1642857142857 \tabularnewline
29 & 220.8 & 220.585714285714 & 0.214285714285712 \tabularnewline
30 & 209.3 & 216.602380952381 & -7.30238095238096 \tabularnewline
31 & 209.9 & 218.902380952381 & -9.00238095238093 \tabularnewline
32 & 228.3 & 218.419047619048 & 9.8809523809524 \tabularnewline
33 & 242.1 & 210.752380952381 & 31.3476190476191 \tabularnewline
34 & 226.4 & 202.402380952381 & 23.9976190476191 \tabularnewline
35 & 231.5 & 222.48 & 9.01999999999997 \tabularnewline
36 & 229.7 & 221.08 & 8.62 \tabularnewline
37 & 257.6 & 280.157017543860 & -22.5570175438595 \tabularnewline
38 & 260 & 284.85701754386 & -24.8570175438597 \tabularnewline
39 & 264.4 & 292.273684210526 & -27.8736842105264 \tabularnewline
40 & 268.8 & 273.849749373434 & -5.04974937343354 \tabularnewline
41 & 271.4 & 272.999749373434 & -1.59974937343361 \tabularnewline
42 & 273.8 & 269.0164160401 & 4.78358395989974 \tabularnewline
43 & 277.4 & 271.3164160401 & 6.08358395989974 \tabularnewline
44 & 268.2 & 270.833082706767 & -2.63308270676692 \tabularnewline
45 & 264.6 & 263.166416040100 & 1.43358395989976 \tabularnewline
46 & 266.6 & 254.8164160401 & 11.7835839598998 \tabularnewline
47 & 266 & 274.894035087719 & -8.8940350877193 \tabularnewline
48 & 267.4 & 273.494035087719 & -6.0940350877193 \tabularnewline
49 & 289.8 & 332.571052631579 & -42.7710526315788 \tabularnewline
50 & 294 & 337.271052631579 & -43.271052631579 \tabularnewline
51 & 310.3 & 344.687719298246 & -34.3877192982456 \tabularnewline
52 & 311.7 & 326.263784461153 & -14.5637844611529 \tabularnewline
53 & 302.1 & 325.413784461153 & -23.3137844611529 \tabularnewline
54 & 298.2 & 321.430451127820 & -23.2304511278196 \tabularnewline
55 & 299.2 & 323.730451127820 & -24.5304511278196 \tabularnewline
56 & 296.2 & 323.247117794486 & -27.0471177944862 \tabularnewline
57 & 299 & 315.580451127820 & -16.5804511278196 \tabularnewline
58 & 300 & 307.230451127820 & -7.2304511278196 \tabularnewline
59 & 299.4 & 327.308070175439 & -27.9080701754387 \tabularnewline
60 & 300.2 & 325.908070175439 & -25.7080701754386 \tabularnewline
61 & 470.2 & 384.985087719298 & 85.2149122807018 \tabularnewline
62 & 472.1 & 389.685087719298 & 82.4149122807017 \tabularnewline
63 & 484.8 & 397.101754385965 & 87.698245614035 \tabularnewline
64 & 513.4 & 512.521428571429 & 0.87857142857141 \tabularnewline
65 & 547.2 & 511.671428571429 & 35.5285714285714 \tabularnewline
66 & 548.1 & 507.688095238095 & 40.4119047619048 \tabularnewline
67 & 544.7 & 509.988095238095 & 34.7119047619048 \tabularnewline
68 & 521.1 & 509.504761904762 & 11.5952380952381 \tabularnewline
69 & 459 & 501.838095238095 & -42.8380952380953 \tabularnewline
70 & 413.2 & 493.488095238095 & -80.2880952380952 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=25803&T=4



\begin{tabular}{lllllllll}
\hline
Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
p-values & Alternative Hypothesis \tabularnewline
breakpoint index & greater & 2-sided & less \tabularnewline
17 & 0.00631004684316712 & 0.0126200936863342 & 0.993689953156833 \tabularnewline
18 & 0.00120858136972457 & 0.00241716273944915 & 0.998791418630275 \tabularnewline
19 & 0.000193290587761962 & 0.000386581175523924 & 0.999806709412238 \tabularnewline
20 & 7.07521739968231e-05 & 0.000141504347993646 & 0.999929247826003 \tabularnewline
21 & 1.34286962969354e-05 & 2.68573925938708e-05 & 0.999986571303703 \tabularnewline
22 & 2.52463853339033e-06 & 5.04927706678066e-06 & 0.999997475361467 \tabularnewline
23 & 6.88001675690013e-06 & 1.37600335138003e-05 & 0.999993119983243 \tabularnewline
24 & 1.43127513325214e-06 & 2.86255026650429e-06 & 0.999998568724867 \tabularnewline
25 & 4.04284829785219e-05 & 8.08569659570439e-05 & 0.999959571517022 \tabularnewline
26 & 3.14033838812095e-05 & 6.2806767762419e-05 & 0.99996859661612 \tabularnewline
27 & 1.13786544257290e-05 & 2.27573088514581e-05 & 0.999988621345574 \tabularnewline
28 & 4.8523842593926e-06 & 9.7047685187852e-06 & 0.99999514761574 \tabularnewline
29 & 1.23811624296263e-06 & 2.47623248592526e-06 & 0.999998761883757 \tabularnewline
30 & 3.29539136680130e-07 & 6.59078273360261e-07 & 0.999999670460863 \tabularnewline
31 & 1.04055441845221e-07 & 2.08110883690442e-07 & 0.999999895944558 \tabularnewline
32 & 2.52367006212459e-08 & 5.04734012424918e-08 & 0.9999999747633 \tabularnewline
33 & 1.61755756509510e-08 & 3.23511513019019e-08 & 0.999999983824424 \tabularnewline
34 & 9.02708554182095e-09 & 1.80541710836419e-08 & 0.999999990972914 \tabularnewline
35 & 4.0655935225616e-09 & 8.1311870451232e-09 & 0.999999995934406 \tabularnewline
36 & 1.79451216468308e-09 & 3.58902432936615e-09 & 0.999999998205488 \tabularnewline
37 & 1.56119140644418e-09 & 3.12238281288837e-09 & 0.999999998438809 \tabularnewline
38 & 1.56355317520187e-09 & 3.12710635040374e-09 & 0.999999998436447 \tabularnewline
39 & 1.1686962723879e-09 & 2.3373925447758e-09 & 0.999999998831304 \tabularnewline
40 & 3.67696615505176e-10 & 7.35393231010352e-10 & 0.999999999632303 \tabularnewline
41 & 8.5688111102323e-11 & 1.71376222204646e-10 & 0.999999999914312 \tabularnewline
42 & 3.20942619589646e-11 & 6.41885239179291e-11 & 0.999999999967906 \tabularnewline
43 & 1.12658185304108e-11 & 2.25316370608217e-11 & 0.999999999988734 \tabularnewline
44 & 3.55253306702863e-12 & 7.10506613405725e-12 & 0.999999999996448 \tabularnewline
45 & 5.90117063298988e-12 & 1.18023412659798e-11 & 0.9999999999941 \tabularnewline
46 & 2.04146826871416e-10 & 4.08293653742832e-10 & 0.999999999795853 \tabularnewline
47 & 2.9668244823087e-09 & 5.9336489646174e-09 & 0.999999997033175 \tabularnewline
48 & 3.08929610370999e-06 & 6.17859220741997e-06 & 0.999996910703896 \tabularnewline
49 & 3.55988329512363e-06 & 7.11976659024726e-06 & 0.999996440116705 \tabularnewline
50 & 3.19264338186226e-06 & 6.38528676372452e-06 & 0.999996807356618 \tabularnewline
51 & 1.09279495923175e-06 & 2.18558991846350e-06 & 0.99999890720504 \tabularnewline
52 & 2.96585682510153e-07 & 5.93171365020307e-07 & 0.999999703414318 \tabularnewline
53 & 2.66726696200074e-07 & 5.33453392400147e-07 & 0.999999733273304 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=25803&T=5



\begin{tabular}{lllllllll}
\hline
Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
Description & # significant tests & % significant tests & OK/NOK \tabularnewline
1% type I error level & 36 & 0.972972972972973 & NOK \tabularnewline
5% type I error level & 37 & 1 & NOK \tabularnewline
10% type I error level & 37 & 1 & NOK \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=25803&T=6




Parameters (Session):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
Parameters (R input):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
R code (references can be found in the software module):
library(lattice)
library(lmtest)
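# Note: y (the submitted data matrix) and par1, par2, par3 are assumed to be
# supplied by the R module from the data series and the parameters listed above.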
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
par1 <- as.numeric(par1)
x <- t(y)
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
if (par3 == 'First Differences'){
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1,])
df <- as.data.frame(x)
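# fit an OLS regression of the first column of the data frame (the dependent variable)
# on all remaining columns (event dummy, monthly dummies, and linear trend)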
(mylm <- lm(df))
(mysum <- summary(mylm))
if (n > n25) {
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
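# run the Goldfeld-Quandt test at every admissible breakpoint and count how often
# the 2-sided p-value falls below the 1%, 5%, and 10% type I error levels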
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
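# diagnostic plots: actuals and interpolation, residuals, histogram, density plot,
# normal Q-Q plot, lag plot with lowess and regression line, ACF, PACF, and lm diagnostics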
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
qqline(mysum$resid)
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
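# the remainder builds the output tables shown above: the estimated regression equation,
# the OLS estimates, the regression and residual statistics, the actuals/interpolation/residuals,
# and the Goldfeld-Quandt test results with their meta analysis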
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, mysum$coefficients[i,1], sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,mysum$coefficients[i,1])
a<-table.element(a, round(mysum$coefficients[i,2],6))
a<-table.element(a, round(mysum$coefficients[i,3],4))
a<-table.element(a, round(mysum$coefficients[i,4],6))
a<-table.element(a, round(mysum$coefficients[i,4]/2,6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, sqrt(mysum$r.squared))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, mysum$r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, mysum$adj.r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, mysum$fstatistic[1])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[2])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[3])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, 1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, mysum$sigma)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, sum(myerror*myerror))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,x[i])
a<-table.element(a,x[i]-mysum$resid[i])
a<-table.element(a,mysum$resid[i])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,gqarr[mypoint-kp3+1,1])
a<-table.element(a,gqarr[mypoint-kp3+1,2])
a<-table.element(a,gqarr[mypoint-kp3+1,3])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,numsignificant1)
a<-table.element(a,numsignificant1/numgqtests)
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,numsignificant5)
a<-table.element(a,numsignificant5/numgqtests)
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,numsignificant10)
a<-table.element(a,numsignificant10/numgqtests)
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}