Author: The author of this computation has been verified
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Tue, 30 Aug 2016 09:44:19 +0100
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2016/Aug/30/t1472547532xlg2kohkyb692vo.htm/, Retrieved Fri, 03 May 2024 16:56:47 +0200

Original text written by user:
IsPrivate? This computation is private
User-defined keywords:
Estimated Impact: 0
Dataseries X:
149 0.5 0 0.67 0.67 0.5 1 0 1 2011
139 1 0.5 0.33 0.83 0.5 0.89 1 1 2011
148 1 0 0.67 1 0.4 0.89 0 1 2011
158 0 0 0 0.83 0.5 0.89 1 1 2011
128 1 1 0 0.67 0.7 0.89 1 1 2011
224 0.5 0.5 0 0 0.3 0.78 1 1 2011
159 0 0.5 0.67 0.83 0.4 0.89 0 1 2011
105 1 1 0.67 0.5 0.4 1 1 1 2011
159 0 0.5 0 0.83 0.7 0.89 1 1 2011
167 0.5 0.5 0.67 0.33 0.6 0.78 1 1 2011
165 0.5 0 1 0.5 0.6 1 1 1 2011
159 0.5 0.5 0 0.67 0.2 0.78 1 1 2011
119 0.5 0.5 0 1 0.4 0.89 1 1 2011
176 1 0 0.67 0.5 0.4 0.89 0 1 2011
54 0 0 0.33 0.67 0.5 0.89 0 1 2011
91 0.5 0 0.67 0.17 0.3 0.89 0 0 2011
163 0.5 0.5 0.33 0.83 0.4 0.89 1 1 2011
124 1 0.5 0.33 0.67 0.7 0.67 0 1 2011
137 1 0 0.33 0.67 0.5 1 1 0 2011
121 1 0 0 0.67 0.2 0.78 0 1 2011
153 0.5 0 0.67 0.5 0.3 0.78 1 1 2011
148 1 0 0.33 1 0.6 0.89 1 1 2011
221 1 0 0.33 0.83 0.6 0.78 0 1 2011
188 1 0 0.33 0.83 0.2 0.89 1 1 2011
149 0 1 0.67 1 0.7 0.89 1 1 2011
244 0 0 0 0.67 0.2 0.33 1 1 2011
148 1 1 0.33 1 1 1 1 0 2011
92 0.5 0 0.67 0.83 0.4 0.89 0 0 2011
150 1 0 1 1 0.4 0.89 1 1 2011
153 0.5 0 0.67 0.83 0.2 0.67 0 1 2011
94 1 0 0.33 0.67 0.4 0.56 0 1 2011
156 1 0.5 0 0.67 0.4 0.89 0 1 2011
132 0.5 0.5 0.67 1 0.7 0.89 1 1 2011
161 0.5 0 0.67 0.67 0.2 1 1 1 2011
105 0.5 0 1 1 0.6 0.78 1 1 2011
97 0.5 0.5 1 1 0.3 0.78 1 1 2011
151 0 0 0.33 0.5 0.3 0.33 0 1 2011
131 0 0.5 0 0.67 0.2 0.78 1 0 2011
166 0.5 0.5 0.67 0.83 0.5 0.89 1 1 2011
157 1 0.5 0.67 1 0.7 0.89 0 1 2011
111 0.5 0.5 0.67 1 0.6 0.78 1 1 2011
145 1 0.5 0.67 1 0.4 0.89 1 1 2011
162 1 0.5 0.33 1 0.6 0.89 1 1 2011
163 1 0 1 1 0.4 1 1 1 2011
59 1 0 0.67 0.83 0.3 0.67 1 0 2011
187 0.5 0.5 0.67 0.83 0.5 1 0 1 2011
109 1 0 0 0.5 0.2 0.89 1 1 2011
90 1 0.5 0 0.83 0.3 0.89 1 0 2011
105 1 0 0 0.17 0.5 0.89 0 1 2011
83 1 0.5 1 0.83 0.7 0.78 1 0 2011
116 0.5 1 0.67 1 0.4 0.89 1 0 2011
42 0.5 0 0 1 0.3 0.78 1 0 2011
148 1 1 0.67 0.67 0.2 0.78 1 1 2011
155 0.5 0 0 1 0.5 1 1 0 2011
125 0 0.5 0 1 0.4 0.78 1 1 2011
116 1 1 0.67 1 0.6 1 1 1 2011
128 1 0 1 0.83 0.4 0.78 0 0 2011
138 0.5 0 0 0.33 0.4 0.67 1 1 2011
49 0 0 0.33 0.33 0.2 0.33 0 0 2011
96 1 0.5 0.67 1 0.9 1 1 0 2011
164 0.5 1 0.67 1 0.8 1 1 1 2011
162 1 0.5 0 0.83 0.8 0.78 0 1 2011
99 1 0.5 1 1 0.3 0.67 0 1 2011
202 0.5 0 0.67 0.83 0.2 1 1 1 2011
186 1 0.5 0 0.67 0.4 0.89 0 1 2011
66 1 0 1 0.83 0.2 0.89 1 0 2011
183 1 0.5 0.67 0.67 0.2 0.78 0 1 2011
214 1 0 0.67 0.83 0.1 1 1 1 2011
188 0 0.5 1 0.67 0.4 0.56 1 1 2011
104 0.5 0.5 0 1 0.5 0.67 0 0 2011
177 1 0.5 0.33 0.83 0.8 0.89 0 1 2011
126 0.5 0 0.67 0.67 0.4 0.89 0 1 2011
76 0.5 0.5 0.33 0.83 0.6 0.89 0 0 2011
99 1 0.5 0.67 0.83 0.5 0.89 1 0 2011
139 0 0 0 0.67 0.3 0.78 0 1 2011
162 0 0.5 0 0.33 0.4 1 0 1 2011
108 0.5 0.5 0.67 0.83 0.6 1 1 0 2011
159 0.5 0 0.33 1 0.4 0.89 0 1 2011
74 0 0 0 0.83 0.3 0.44 0 0 2011
110 1 1 0 0.83 0.8 0.78 1 1 2011
96 1 1 0.33 0.5 0.6 0.89 0 0 2011
116 0 0 0 0.5 0.3 0.67 0 0 2011
87 1 0.5 0.67 0.83 0.5 0.78 0 0 2011
97 1 0 0.33 1 0.4 0.78 1 0 2011
127 0 0 0.67 0.33 0.3 0.33 0 0 2011
106 0.5 0 0.33 1 0.7 0.89 1 0 2011
80 0.5 0.5 0.33 0.67 0.2 0.89 1 0 2011
74 1 0 1 0.83 0.4 0.89 0 0 2011
91 0.5 0.5 0.67 1 0.6 0.89 0 0 2011
133 1 0 0 0.83 0.6 0.56 0 0 2011
74 0.5 0.5 0.67 0.83 0.6 0.67 1 0 2011
114 1 0.5 0.33 1 0.4 0.67 1 0 2011
140 1 0 0 0.83 0.6 0.78 1 0 2011
95 1 0.5 0.33 1 0.5 0.78 0 0 2011
98 1 0 0 0.83 0.5 0.78 1 0 2011
121 1 0 0 0.67 0.6 0.89 0 0 2011
126 1 0.5 0.33 0.83 0.8 1 1 0 2011
98 0.5 1 0.67 0.83 0.5 0.89 1 0 2011
95 1 0.5 0.67 0.83 0.6 0.89 1 0 2011
110 1 0.5 0.67 0.83 0.4 0.78 1 0 2011
70 1 0.5 0.67 0.67 0.3 1 1 0 2011
102 0.5 0 1 0.83 0.3 0.78 0 0 2011
86 0 0 0 0 0.2 0.67 1 0 2011
130 0.5 0 0 0.83 0.4 0.78 1 0 2011
96 0.5 0 0 1 0.5 0.89 1 0 2011
102 0 0.5 0 0.17 0.3 0.67 0 0 2011
100 0 0.5 0 0.17 0.4 0.22 0 0 2011
94 0 0 1 0.5 0.5 0.44 0 0 2011
52 1 0 0.67 0.5 0.3 0.89 0 0 2011
98 0.5 0 0 1 0.5 0.67 0 0 2011
118 0.5 0 0.67 0.67 0.4 0.89 0 0 2011
99 1 0 0.67 0.83 0.4 0.67 1 0 2011
48 1 1 0 1 0.6 0.78 1 1 2012
50 1 1 0.67 1 0.3 0.78 1 1 2012
150 0.5 1 0.33 1 0.4 0.78 1 1 2012
154 1 1 1 1 0.3 1 1 1 2012
109 1 1 1 1 1 0.78 0 0 2012
68 0.5 0 0 1 0.4 0.67 1 0 2012
194 1 0.5 1 0.83 0.8 0.89 1 1 2012
158 1 1 0.67 1 0.3 0.89 0 1 2012
159 1 0 0.67 0.83 0.5 1 1 1 2012
67 0.5 0 0 1 0.4 0.78 0 1 2012
147 1 0 0.67 0.83 0.3 0.67 0 1 2012
39 1 0 1 0.83 0.5 0.89 1 1 2012
100 1 0 0.67 1 0.3 0.67 1 1 2012
111 1 0 0 0.67 0.3 0.67 1 1 2012
138 1 0 0 0.83 0.4 1 1 1 2012
101 0.5 0 0 1 0.3 0.67 1 1 2012
131 0.5 0.5 0.33 1 0.6 1 1 0 2012
101 1 1 0.67 0.83 0.6 0.89 1 1 2012
114 1 1 1 1 0.4 0.89 1 1 2012
165 0 0 0 1 0.4 1 0 1 2012
114 0.5 0 0.67 1 0.4 0.67 1 1 2012
111 1 0.5 0.67 0.67 0.3 0.44 1 1 2012
75 0 1 0.33 1 0.2 0.89 1 1 2012
82 1 0 0.67 0.83 0.5 0.56 1 1 2012
121 1 1 0.67 1 0.4 0.78 1 1 2012
32 0 0 0.67 1 0.4 1 1 1 2012
150 1 0 0.67 0.83 0.4 1 0 1 2012
117 0.5 0.5 0.67 0.67 0.3 0.89 1 1 2012
71 0.5 1 0.67 0.83 0.4 0.67 1 0 2012
165 1 0.5 0.33 1 0.2 0.89 1 1 2012
154 0 0 0 0 0 0.33 1 1 2012
126 1 0.5 0.67 1 0.4 0.89 1 1 2012
149 1 1 0 1 0.6 0.78 0 1 2012
145 0.5 0 0.67 0.67 0.4 1 0 1 2012
120 0.5 0 0 1 0.4 0.44 1 1 2012
109 0 0.5 0 0.83 0.4 0.67 0 1 2012
132 0 0.5 0 0.17 0.2 0.33 0 1 2012
172 1 1 1 0.83 0.4 0.89 1 1 2012
169 0.5 0 0 0.83 0.3 0.89 0 1 2012
114 0 1 0.67 0.83 0.6 1 1 1 2012
156 1 0 1 0.83 0.6 0.89 1 1 2012
172 1 0 0 0.83 0.4 0.89 0 1 2012
68 0.5 1 0.67 1 0.5 1 1 0 2012
89 1 0.5 0 0.83 0.4 0.89 1 0 2012
167 1 1 1 1 0.6 1 1 1 2012
113 1 0.5 0.67 0.83 0.6 0.78 0 1 2012
115 1 0.5 0.67 1 0.9 0.78 0 0 2012
78 0 0.5 0.67 0.83 0.4 0.67 0 0 2012
118 1 0.5 1 1 0.8 0.89 0 0 2012
87 1 0 1 0.83 0.5 0.67 1 0 2012
173 0 0 1 0.83 0.4 0.78 0 1 2012
2 0.5 1 0.67 1 0.4 0.89 1 1 2012
162 0.5 1 1 1 0.7 0.89 0 0 2012
49 1 1 0.33 1 0.4 0.78 1 0 2012
122 1 0.5 0.67 1 0.8 1 0 0 2012
96 0.5 1 1 1 0.4 1 1 0 2012
100 0.5 0 0.67 1 0.3 1 0 0 2012
82 1 0.5 0.67 1 0.5 0.67 0 0 2012
100 1 1 0.67 1 0.8 0.89 1 0 2012
115 0.5 0 0.33 0.83 0.4 1 0 0 2012
141 0 0.5 1 1 1 1 1 0 2012
165 1 1 0.67 1 0.5 0.89 1 1 2012
165 1 1 0.67 1 0.5 0.89 1 1 2012
110 1 0 0.33 1 0.3 0.89 1 0 2012
118 1 0.5 0.33 0.83 0.3 0.89 1 1 2012
158 1 0 0 0.5 0.3 0.89 0 1 2012
146 0.5 0.5 0.33 0.67 0.4 1 1 0 2012
49 1 0 0.33 1 0.5 0.67 0 1 2012
90 1 0.5 0.67 0.67 0.5 1 0 0 2012
121 0 0 0 1 0.4 0.89 0 0 2012
155 0 0.5 1 1 0.7 0.89 1 1 2012
104 0.5 0 0.33 0.5 0.5 0.89 0 0 2012
147 0 1 0.33 0.67 0.4 0.89 1 0 2012
110 1 0 1 0.67 0.7 1 0 0 2012
108 1 0 1 0.67 0.7 1 0 0 2012
113 1 0 1 0.67 0.7 1 0 0 2012
115 1 0 1 0.67 0.7 0.89 0 0 2012
61 0 0 0 0.67 0.7 0.89 1 0 2012
60 1 0.5 0.67 1 0.7 0.89 1 0 2012
109 0 0.5 0.33 0.67 0.1 0.33 1 0 2012
68 1 0.5 0.67 0.67 0.2 0.67 1 0 2012
111 1 0 0.33 0.33 0.3 0.56 0 0 2012
77 0.5 0 0.33 0.83 0.6 0.44 0 0 2012
73 1 1 1 1 0.8 1 1 0 2012
151 0.5 0.5 0.33 1 0.8 0.89 0 1 2012
89 0 0 0 0.17 0 0.33 0 0 2012
78 1 0 0.33 0.67 0.3 0.67 0 0 2012
110 1 0.5 0.33 0.83 0.6 0.67 0 0 2012
220 1 0 0.67 0.83 0.5 1 1 1 2012
65 0.5 0 0.33 1 0.7 0.78 1 0 2012
141 1 0.5 0 0.83 0.3 0.67 0 1 2012
117 0 0 0.67 1 0.3 1 0 0 2012
122 0.5 0 0.67 1 0.4 0.78 1 1 2012
63 1 0 1 0.83 0.4 0.89 0 0 2012
44 1 0 0 0.83 0.1 0.89 1 1 2012
52 1 0 0.67 1 0.5 0.89 1 0 2012
131 0 0 0 0 0 0 0 0 2012
101 0 0.5 0.33 1 0.4 0.67 1 0 2012
42 0.5 1 0.67 0.83 0.6 1 1 0 2012
152 1 0.5 0.33 1 0.4 1 1 1 2012
107 1 0.5 0 0.33 0.1 0.67 0 1 2012
77 1 0 0 0.83 0.3 0.89 0 0 2012
154 1 0 0.67 0.83 0.7 0.89 0 1 2012
103 1 0 0 0.17 0.3 0.56 1 1 2012
96 0 0.5 0.33 0.83 0.5 0.67 1 0 2012
175 1 1 0.67 0.83 0.3 1 1 1 2012
57 1 0.5 0.67 0.67 0.6 1 1 0 2012
112 1 0 1 1 0.9 1 0 0 2012
143 1 0.5 0 0.83 0.4 0.67 0 1 2012
49 0.5 0.5 0 1 0.3 0.44 0 0 2012
110 1 1 0.67 1 0.9 0.89 1 1 2012
131 0 0.5 0 1 0.5 0.44 1 1 2012
167 0.5 0.5 1 1 0.3 0.56 0 1 2012
56 0.5 0 0.67 0.83 0.6 0.89 0 0 2012
137 0.5 0 0.33 1 0.2 0.67 0 1 2012
86 1 0.5 1 0.83 0.4 0.89 1 0 2012
121 0.5 0.5 0.67 0.83 0.5 1 1 1 2012
149 0.5 0 0.67 0.83 0.4 0.78 0 1 2012
168 0 0 0 0 0 0.44 0 1 2012
140 1 0.5 0.33 1 0.2 0.89 0 1 2012
88 1 0.5 0.67 1 0.5 0.89 1 0 2012
168 0.5 0 0.67 1 0.3 0.89 1 1 2012
94 0 0 0 0 0 0.44 1 1 2012
51 1 0 1 0.83 0.5 1 1 1 2012
48 1 0 0.33 0.83 0.6 0.89 0 0 2012
145 0.5 0.5 0 0.83 0.3 0.67 1 1 2012
66 0 0 0 0 0 0.33 1 1 2012
85 0 0.5 0 0.67 0.3 0.78 1 0 2012
109 1 0.5 0.67 1 0.5 0.89 0 1 2012
63 1 0 0 0.67 0.4 0.78 0 0 2012
102 0.5 0 0.67 0.83 0.5 0.78 1 0 2012
162 0.5 1 1 1 0.7 0.89 0 0 2012
86 1 0.5 0.67 1 0.8 0.78 1 0 2012
114 1 0.5 0.33 1 0.6 0.78 1 0 2012
164 0.5 0 0.33 0.83 0.4 0.67 0 1 2012
119 0 0.5 0.33 0.83 0.5 0.89 1 1 2012
126 1 0.5 0 1 0.5 0.89 0 1 2012
132 1 0 0.33 1 0.3 0.78 1 1 2012
142 1 0.5 0 1 0.6 1 1 1 2012
83 0.5 0 0.67 0.67 0.3 1 0 1 2012
94 0.5 0.5 1 0.83 0.6 0.78 1 0 2012
81 1 0 0.33 0.33 0.3 0.78 0 0 2012
166 1 1 0.67 1 0.7 0.89 1 1 2012
110 1 0 1 1 0.7 0.89 0 0 2012
64 1 0.5 1 0.67 0.6 0.67 1 0 2012
93 0 0.5 0.33 1 0.5 1 0 1 2012
104 0.5 0 0.33 0.83 0.5 0.67 0 0 2012
105 1 0 0 0.67 0.4 0.56 1 0 2012
49 1 1 0.33 1 0.4 0.78 1 0 2012
88 1 0 1 1 0.7 1 0 0 2012
95 0 0.5 0 0.17 0.2 0.67 1 0 2012
102 0.5 0 0.67 0.83 0.5 0.78 1 0 2012
99 0 0.5 0.67 0.83 0.4 0.56 0 0 2012
63 1 1 0.67 1 0.2 1 1 0 2012
76 0 0 0.67 0.67 0.5 0.89 0 0 2012
109 1 0 0 0.5 0.4 0.44 0 0 2012
117 1 1 1 0.67 0.7 1 1 0 2012
57 0 1 0.67 0.83 0.6 0.89 1 0 2012
120 0 0 0 0.83 0.4 0.78 0 0 2012
73 1 1 0.67 1 0.5 0.89 1 0 2012
91 0 0 0 0.17 0 0.11 0 0 2012
108 1 0.5 0.67 1 0.7 0.89 0 0 2012
105 1 0 0.67 0.67 0.4 0.89 1 0 2012
117 1 0 1 0.67 0.5 1 0 1 2012
119 0.5 0 0.67 0.83 0.6 0.89 0 0 2012
31 0.5 0.5 0.67 0.5 0.8 1 1 0 2012
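For reference, a minimal sketch of how this block could be loaded into R for local use. The file name and the column names are assumptions based on the variable names appearing in the regression output below; they are not part of the archived computation.

# minimal loading sketch (assumption: the block above was saved as 'dataseries.txt',
# whitespace separated, no header, columns ordered as in the regression equation below)
dat <- read.table('dataseries.txt', header = FALSE)
colnames(dat) <- c('LFM', 'Estimation', 'Probability_and_Sampling',
'Proportionality_and_Ratio', 'Graphical_Interpretation',
'Algebraic_Reasoning', 'Calculation', 'gender', 'group', 'year')
str(dat) # 278 observations of 10 variables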




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 6 seconds
R Server: 'Herman Ole Andreas Wold' @ wold.wessa.net

\begin{tabular}{lllllllll}
\hline
Summary of computational transaction \tabularnewline
Raw Input & view raw input (R code)  \tabularnewline
Raw Output & view raw output of R engine  \tabularnewline
Computing time & 6 seconds \tabularnewline
R Server & 'Herman Ole Andreas Wold' @ wold.wessa.net \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=&T=0

[TABLE]
[ROW][C]Summary of computational transaction[/C][/ROW]
[ROW][C]Raw Input[/C][C]view raw input (R code) [/C][/ROW]
[ROW][C]Raw Output[/C][C]view raw output of R engine [/C][/ROW]
[ROW][C]Computing time[/C][C]6 seconds[/C][/ROW]
[ROW][C]R Server[/C][C]'Herman Ole Andreas Wold' @ wold.wessa.net[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=&T=0

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=&T=0








Multiple Linear Regression - Estimated Regression Equation
LFM[t] = 30015.2 - 4.85314 Estimation[t] - 0.548369 Probability_and_Sampling[t] + 0.40022 Proportionality_and_Ratio[t] - 8.77305 Graphical_Interpretation[t] + 17.7631 Algebraic_Reasoning[t] + 16.8152 Calculation[t] - 7.42246 gender[t] + 42.184 group[t] - 14.8775 year[t] + e[t]
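The equation can be reproduced with lm() outside the server environment; a minimal sketch, assuming the data frame dat from the loading sketch above:

# sketch: refit the model with lm() (assumes the data frame 'dat' from the loading sketch)
mylm <- lm(LFM ~ Estimation + Probability_and_Sampling + Proportionality_and_Ratio +
Graphical_Interpretation + Algebraic_Reasoning + Calculation + gender + group + year,
data = dat)
coef(mylm) # the coefficients shown in the equation above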

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Estimated Regression Equation \tabularnewline
LFM[t] =  +  30015.2 -4.85314Estimation[t] -0.548369Probability_and_Sampling[t] +  0.40022Proportionality_and_Ratio[t] -8.77305Graphical_Interpretation[t] +  17.7631Algebraic_Reasoning[t] +  16.8152Calculation[t] -7.42246gender[t] +  42.184group[t] -14.8775year[t]  + e[t] \tabularnewline
 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=&T=1

[TABLE]
[ROW][C]Multiple Linear Regression - Estimated Regression Equation[/C][/ROW]
[ROW][C]LFM[t] =  +  30015.2 -4.85314Estimation[t] -0.548369Probability_and_Sampling[t] +  0.40022Proportionality_and_Ratio[t] -8.77305Graphical_Interpretation[t] +  17.7631Algebraic_Reasoning[t] +  16.8152Calculation[t] -7.42246gender[t] +  42.184group[t] -14.8775year[t]  + e[t][/C][/ROW]
[ROW][C][/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=&T=1

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=&T=1








Multiple Linear Regression - Ordinary Least Squares
Variable                     Parameter    S.D.    T-STAT (H0: parameter = 0)    2-tail p-value    1-tail p-value
(Intercept)                  +3.002e+04   8383    +3.5810e+00                   0.0004069         0.0002035
Estimation                   -4.853       5.583   -8.6930e-01                   0.3854            0.1927
Probability_and_Sampling     -0.5484      6.032   -9.0910e-02                   0.9276            0.4638
Proportionality_and_Ratio    +0.4002      6.316   +6.3370e-02                   0.9495            0.4748
Graphical_Interpretation     -8.773       10.08   -8.7050e-01                   0.3848            0.1924
Algebraic_Reasoning          +17.76       12.86   +1.3810e+00                   0.1684            0.08418
Calculation                  +16.82       13.73   +1.2250e+00                   0.2216            0.1108
gender                       -7.423       4.321   -1.7180e+00                   0.08698           0.04349
group                        +42.18       4.156   +1.0150e+01                   1.074e-20         5.372e-21
year                         -14.88       4.167   -3.5700e+00                   0.0004227         0.0002114
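In this table, Parameter is the OLS estimate, S.D. its standard error, T-STAT the estimate divided by the standard error, and the 1-tail p-value is half the 2-tail p-value. A sketch of how the table can be recovered from summary(), assuming the mylm fit sketched above:

# sketch: rebuild the coefficient table from summary() (assumes 'mylm' from the sketch above)
mysum <- summary(mylm)
coefs <- mysum$coefficients
tab <- cbind(Parameter = coefs[, 'Estimate'],
S.D. = coefs[, 'Std. Error'],
`T-STAT` = coefs[, 't value'],
`2-tail p-value` = coefs[, 'Pr(>|t|)'],
`1-tail p-value` = coefs[, 'Pr(>|t|)'] / 2)
signif(tab, 4)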

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Ordinary Least Squares \tabularnewline
Variable & Parameter & S.D. & T-STAT (H0: parameter = 0) & 2-tail p-value & 1-tail p-value \tabularnewline
(Intercept) & +3.002e+04 &  8383 & +3.5810e+00 &  0.0004069 &  0.0002035 \tabularnewline
Estimation & -4.853 &  5.583 & -8.6930e-01 &  0.3854 &  0.1927 \tabularnewline
Probability_and_Sampling & -0.5484 &  6.032 & -9.0910e-02 &  0.9276 &  0.4638 \tabularnewline
Proportionality_and_Ratio & +0.4002 &  6.316 & +6.3370e-02 &  0.9495 &  0.4748 \tabularnewline
Graphical_Interpretation & -8.773 &  10.08 & -8.7050e-01 &  0.3848 &  0.1924 \tabularnewline
Algebraic_Reasoning & +17.76 &  12.86 & +1.3810e+00 &  0.1684 &  0.08418 \tabularnewline
Calculation & +16.82 &  13.73 & +1.2250e+00 &  0.2216 &  0.1108 \tabularnewline
gender & -7.423 &  4.321 & -1.7180e+00 &  0.08698 &  0.04349 \tabularnewline
group & +42.18 &  4.156 & +1.0150e+01 &  1.074e-20 &  5.372e-21 \tabularnewline
year & -14.88 &  4.167 & -3.5700e+00 &  0.0004227 &  0.0002114 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=&T=2

[TABLE]
[ROW][C]Multiple Linear Regression - Ordinary Least Squares[/C][/ROW]
[ROW][C]Variable[/C][C]Parameter[/C][C]S.D.[/C][C]T-STAT (H0: parameter = 0)[/C][C]2-tail p-value[/C][C]1-tail p-value[/C][/ROW]
[ROW][C](Intercept)[/C][C]+3.002e+04[/C][C] 8383[/C][C]+3.5810e+00[/C][C] 0.0004069[/C][C] 0.0002035[/C][/ROW]
[ROW][C]Estimation[/C][C]-4.853[/C][C] 5.583[/C][C]-8.6930e-01[/C][C] 0.3854[/C][C] 0.1927[/C][/ROW]
[ROW][C]Probability_and_Sampling[/C][C]-0.5484[/C][C] 6.032[/C][C]-9.0910e-02[/C][C] 0.9276[/C][C] 0.4638[/C][/ROW]
[ROW][C]Proportionality_and_Ratio[/C][C]+0.4002[/C][C] 6.316[/C][C]+6.3370e-02[/C][C] 0.9495[/C][C] 0.4748[/C][/ROW]
[ROW][C]Graphical_Interpretation[/C][C]-8.773[/C][C] 10.08[/C][C]-8.7050e-01[/C][C] 0.3848[/C][C] 0.1924[/C][/ROW]
[ROW][C]Algebraic_Reasoning[/C][C]+17.76[/C][C] 12.86[/C][C]+1.3810e+00[/C][C] 0.1684[/C][C] 0.08418[/C][/ROW]
[ROW][C]Calculation[/C][C]+16.82[/C][C] 13.73[/C][C]+1.2250e+00[/C][C] 0.2216[/C][C] 0.1108[/C][/ROW]
[ROW][C]gender[/C][C]-7.423[/C][C] 4.321[/C][C]-1.7180e+00[/C][C] 0.08698[/C][C] 0.04349[/C][/ROW]
[ROW][C]group[/C][C]+42.18[/C][C] 4.156[/C][C]+1.0150e+01[/C][C] 1.074e-20[/C][C] 5.372e-21[/C][/ROW]
[ROW][C]year[/C][C]-14.88[/C][C] 4.167[/C][C]-3.5700e+00[/C][C] 0.0004227[/C][C] 0.0002114[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=&T=2

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=&T=2








Multiple Linear Regression - Regression Statistics
Multiple R: 0.5689
R-squared: 0.3237
Adjusted R-squared: 0.301
F-TEST (value): 14.25
F-TEST (DF numerator): 9
F-TEST (DF denominator): 268
p-value: 0
Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 33.3
Sum Squared Residuals: 2.973e+05
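These statistics all derive from the same summary object: Multiple R is the square root of R-squared, the F-test compares the full model with an intercept-only model on 9 and 268 degrees of freedom, and the residual standard deviation is the sigma reported by summary(). A minimal sketch, again assuming mylm and mysum from the sketches above:

# sketch: regression and residual statistics (assumes 'mylm' and 'mysum' from the sketches above)
sqrt(mysum$r.squared) # Multiple R
mysum$r.squared # R-squared
mysum$adj.r.squared # Adjusted R-squared
mysum$fstatistic # F value with numerator and denominator degrees of freedom
1 - pf(mysum$fstatistic[1], mysum$fstatistic[2], mysum$fstatistic[3]) # F-test p-value (rounds to 0 here)
mysum$sigma # Residual Standard Deviation
sum(residuals(mylm)^2) # Sum Squared Residuals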

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Regression Statistics \tabularnewline
Multiple R &  0.5689 \tabularnewline
R-squared &  0.3237 \tabularnewline
Adjusted R-squared &  0.301 \tabularnewline
F-TEST (value) &  14.25 \tabularnewline
F-TEST (DF numerator) & 9 \tabularnewline
F-TEST (DF denominator) & 268 \tabularnewline
p-value &  0 \tabularnewline
Multiple Linear Regression - Residual Statistics \tabularnewline
Residual Standard Deviation &  33.3 \tabularnewline
Sum Squared Residuals &  2.973e+05 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=&T=3

[TABLE]
[ROW][C]Multiple Linear Regression - Regression Statistics[/C][/ROW]
[ROW][C]Multiple R[/C][C] 0.5689[/C][/ROW]
[ROW][C]R-squared[/C][C] 0.3237[/C][/ROW]
[ROW][C]Adjusted R-squared[/C][C] 0.301[/C][/ROW]
[ROW][C]F-TEST (value)[/C][C] 14.25[/C][/ROW]
[ROW][C]F-TEST (DF numerator)[/C][C]9[/C][/ROW]
[ROW][C]F-TEST (DF denominator)[/C][C]268[/C][/ROW]
[ROW][C]p-value[/C][C] 0[/C][/ROW]
[ROW][C]Multiple Linear Regression - Residual Statistics[/C][/ROW]
[ROW][C]Residual Standard Deviation[/C][C] 33.3[/C][/ROW]
[ROW][C]Sum Squared Residuals[/C][C] 2.973e+05[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=&T=3

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=&T=3




Parameters (Session):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ; par4 = 0 ; par5 = 0 ;
Parameters (R input):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ; par4 = 0 ; par5 = 0 ;
R code (references can be found in the software module):
library(lattice)
library(lmtest)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
mywarning <- ''
par1 <- as.numeric(par1)
if(is.na(par1)) {
par1 <- 1
mywarning = 'Warning: you did not specify the column number of the endogenous series! The first column was selected by default.'
}
if (par4=='') par4 <- 0
par4 <- as.numeric(par4)
if (par5=='') par5 <- 0
par5 <- as.numeric(par5)
# transpose the uploaded data block so that columns correspond to variables, and drop incomplete cases
x <- na.omit(t(y))
k <- length(x[1,])
n <- length(x[,1])
# put the endogenous series (column par1) first, followed by all remaining columns
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
# optional differencing of all series, depending on par3
if (par3 == 'First Differences'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'Seasonal Differences (s=12)'){
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'First and Seasonal Differences (s=12)'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
# add par4 lagged values of column par1 as extra regressors
if(par4 > 0) {
x2 <- array(0, dim=c(n-par4,par4), dimnames=list(1:(n-par4), paste(colnames(x)[par1],'(t-',1:par4,')',sep='')))
for (i in 1:(n-par4)) {
for (j in 1:par4) {
x2[i,j] <- x[i+par4-j,par1]
}
}
x <- cbind(x[(par4+1):n,], x2)
n <- n - par4
}
# add par5 seasonal (s=12) lags of column par1 as extra regressors
if(par5 > 0) {
x2 <- array(0, dim=c(n-par5*12,par5), dimnames=list(1:(n-par5*12), paste(colnames(x)[par1],'(t-',1:par5,'s)',sep='')))
for (i in 1:(n-par5*12)) {
for (j in 1:par5) {
x2[i,j] <- x[i+par5*12-j*12,par1]
}
}
x <- cbind(x[(par5*12+1):n,], x2)
n <- n - par5*12
}
# optional seasonal dummy variables, depending on par2
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
(k <- length(x[n,]))
# optional linear trend term
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
(k <- length(x[n,]))
head(x)
df <- as.data.frame(x)
# fit OLS: the first column of df is the response, all remaining columns are regressors
(mylm <- lm(df))
(mysum <- summary(mylm))
# Goldfeld-Quandt heteroskedasticity tests at every feasible breakpoint (three alternative hypotheses)
if (n > n25) {
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
# diagnostic plots written as bitmaps (actuals/interpolation, residual plots, ACF/PACF, lm diagnostics, GQ p-values)
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
qqline(mysum$resid)
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
# load the server-side table helpers and build the output tables
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, signif(mysum$coefficients[i,1],6), sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, mywarning)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,formatC(signif(mysum$coefficients[i,1],5),format='g',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,2],5),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,3],4),format='e',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4],4),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4]/2,4),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a,formatC(signif(sqrt(mysum$r.squared),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$adj.r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a,formatC(signif(mysum$fstatistic[1],6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[2],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[3],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a,formatC(signif(1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a,formatC(signif(mysum$sigma,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a,formatC(signif(sum(myerror*myerror),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
if(n < 200) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,formatC(signif(x[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(x[i]-mysum$resid[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$resid[i],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,1],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,2],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,3],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant1,6))
a<-table.element(a,formatC(signif(numsignificant1/numgqtests,6),format='g',flag=' '))
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant5,6))
a<-table.element(a,signif(numsignificant5/numgqtests,6))
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant10,6))
a<-table.element(a,signif(numsignificant10/numgqtests,6))
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}
}
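Note that the script depends on objects provided by the server environment (the data matrix y, the parameters par1 through par5, and the table helpers loaded from 'createtable'), so it does not run unmodified elsewhere. The Goldfeld-Quandt breakpoint scan at its core can be reproduced on its own; a minimal sketch, assuming the dat and mylm objects from the sketches above:

# sketch: Goldfeld-Quandt breakpoint scan as performed by the module (assumes 'dat' and 'mylm' above)
library(lmtest)
n <- nrow(dat)
k <- length(coef(mylm))
breaks <- (k + 3):(n - k - 3) # comparable breakpoint range to the one used in the script
pvals <- sapply(breaks, function(b) gqtest(mylm, point = b, alternative = 'two.sided')$p.value)
mean(pvals < 0.05) # share of breakpoints with a significant 2-sided test at the 5% level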