April \(30^{th}\) 2020
In a census, the sample size is the whole population!
Generally speaking, the larger the sample size, the more accurate your estimates and statistics are.
However, we face many limitations: availability of participants, time constraints, money constraints, etc…
Therefore, the optimal sample size is the minimum number of participants needed to obtain statistically representative data.
In frequentist statistics (NHST), some fundamental concepts are:
The power of a statistical test is given by \(1-\beta\).
It is the probability of rejecting the null hypothesis when it is false.
Commonly accepted values of power are 80%, 90%, 95% and 99%, which correspond to \(\beta = 20\%, 10\%, 5\%, 1\%\).
The sample size is the number of observations required to obtain a statistically significant result in power% of cases, given a specific effect size.
First introduced by Cohen (1988), effect sizes demonstrated their importance in a short time.
The effect size of a test is a (usually) standardized measure of the magnitude of an effect (difference between groups, covariation, etc…) and should always be reported together with a (significant) p-value.
If the p-value tells us that there is an effect, the effect size tells us whether the effect is worth mentioning.
The effect size indicates what proportion of the control observations is lower (or higher) than the experimental observations.
Nominal size | Effect size | % control < experimental observations |
---|---|---|
 | 0.0 | 50% |
Small | 0.2 | 58% |
Medium | 0.5 | 69% |
Large | 0.8 | 79% |
 | 1.4 | 92% |
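The right-hand column can be reproduced directly: for normally distributed groups, the share of control observations below the experimental mean is \(\Phi(d)\) (Cohen's U3). A one-line check in base R:

```r
# Percentage of control observations below the experimental mean,
# i.e. Cohen's U3 = pnorm(d), for the effect sizes in the table:
round(pnorm(c(0, 0.2, 0.5, 0.8, 1.4)) * 100)
# 50 58 69 79 92
```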
https://rpsychologist.com/d3/cohend/ \(\leftarrow\) interesting visualization
The sample size is computed as a function of the (desired) power (\(1-\beta\)) and effect size (\(ES\)).
It is the optimum sample size needed to have the desired power, specifying a certain effect size.
Taking for example the case of a one-sample, one-tailed t-test, with Cohen's \(d\) as the effect size, it solves the following system for \(n\):
\[ \begin{cases} Pr_t(q,\nu,c) = 1-\beta\\ \nu = (n-1)\\q = Q_t(\alpha , \nu)\\c = \sqrt{n}\times d \end{cases} \]
\(\nu = n-1\) are the degrees of freedom of the Student's t distribution
\(q = Q_t(\alpha,\nu)\) is the upper \(\alpha\) quantile (usually \(\alpha = 0.05\)) of the central t distribution with these d.o.f.
\(c = \sqrt{n}\times d\) is the non-centrality parameter of the t distribution. The larger the sample, the more shifted the distribution.
\(Pr_t(q, \nu, c)\) is the probability that the non-central t distribution exceeds the quantile \(q\), i.e. the power.
EXAMPLE
\(d = 0.3\); \(1-\beta = 0.8\); \(\alpha = 0.05\)
\(n = 70\)
\(\nu = 69\)
\(q = Q_t(\alpha,\nu) = Q_t(0.05,69) \simeq 1.67\)
\(c = \sqrt{n}\times d = \sqrt{70}\times 0.3 \simeq 8.37\times0.3 \simeq 2.51\)
\(Pr_t(q, \nu, c) = Pr_t(1.67, 69, 2.51) \simeq 0.80\)
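The system above can be solved numerically with base R alone (qt, pt with a non-centrality parameter, and uniroot); the function name power1 and the search interval are my own illustrative choices:

```r
# Power of a one-sample, one-tailed t-test as a function of n,
# following the system above: q = Qt(1 - alpha, nu), c = sqrt(n) * d.
power1 <- function(n, d, alpha = 0.05) {
  nu <- n - 1
  1 - pt(qt(1 - alpha, nu), nu, ncp = sqrt(n) * d)
}
power1(70, d = 0.3)  # just under 0.8
# Solve power1(n) = 0.8 for n:
n <- uniroot(function(n) power1(n, d = 0.3) - 0.8, c(2, 1000))$root
n  # ~70.07, in line with the pwr.t.test output
```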
Recall that the sample size is the number of observations required to obtain a statistically significant result in power% of cases, given a specific effect size.
Do you think that if I reach a statistically significant result with less observations than those required by the sample size, it is OK?
Why?
No, it is not OK.
The reason is that the sample size is the minimum number of observations (subjects) required to have a representative sample of the statistical population.
Smaller samples are more prone to (less obvious) outliers; therefore there is not only a greater risk of a Type II error, but also of a Type I error.
Here I present three main ways:
When we compute the sample size, we need to think about our hypotheses and about the specific contrasts of interest.
We also need to think about the plausible effect size.
How to determine it?
In order to use Cohen's formulas, there is the pwr package in R.
In this package there are several functions that allow us to compute the sample size in different cases.
function | power calculations for | ES standard values |
---|---|---|
pwr.2p.test | two proportions (equal n) | h - S: 0.2; M: 0.5; L: 0.8 |
pwr.2p2n.test | two proportions (unequal n) | h - S: 0.2; M: 0.5; L: 0.8 |
pwr.anova.test | balanced one-way ANOVA | f - S: 0.1; M: 0.25; L: 0.4 |
pwr.chisq.test | chi-square test | w - S: 0.1; M: 0.3; L: 0.5 |
pwr.f2.test | general linear model | f2 - S: 0.02; M: 0.15; L: 0.35 |
function | power calculations for | ES standard values |
---|---|---|
pwr.p.test | proportion (one sample) | h - S: 0.2; M: 0.5; L: 0.8 |
pwr.r.test | correlation | r - S: 0.1; M: 0.3; L: 0.5 |
pwr.t.test | t-tests (one sample, 2 samples, paired) | d - S: 0.2; M: 0.5; L: 0.8 |
pwr.t2n.test | t-test (two samples with unequal n) | d - S: 0.2; M: 0.5; L: 0.8 |
With the function cohen.ES you can retrieve all the standard effect sizes.
More or less, all the pwr.?.test functions have the same parameters to be specified:
Let's compute the sample size previously seen
library(pwr)
pwr.t.test(d = 0.3, power = 0.8, sig.level = 0.05, type = "one.sample", alternative = "greater")

##
##      One-sample t test power calculation
##
##               n = 70.06791
##               d = 0.3
##       sig.level = 0.05
##           power = 0.8
##     alternative = greater
The parameters are:
EXAMPLE
We want to train black and white cats in jumping at a command.
If our hypothesis is that black cats will jump more, how many cats do we have to train?
Let's use a power of 80%, and a medium effect size.
EXAMPLE
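The call producing the output below is not shown in the source; judging from the printed parameters it would presumably be pwr.t.test(d = 0.5, power = 0.8, sig.level = 0.05, type = "two.sample", alternative = "greater"). As a package-free sketch, the same number can be obtained from the non-central t distribution in base R:

```r
# Two-sample, one-sided t-test power in base R:
# df = 2(n - 1), non-centrality c = d * sqrt(n / 2), n = size of each group.
power2 <- function(n, d, alpha = 0.05) {
  df <- 2 * (n - 1)
  1 - pt(qt(1 - alpha, df), df, ncp = d * sqrt(n / 2))
}
n <- uniroot(function(n) power2(n, d = 0.5) - 0.8, c(2, 1000))$root
n  # ~50.15 cats of each colour
```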
##
##      Two-sample t test power calculation
##
##               n = 50.1508
##               d = 0.5
##       sig.level = 0.05
##           power = 0.8
##     alternative = greater
##
## NOTE: n is number in *each* group
The function is pwr.anova.test, and it works only with one-way balanced designs.
This means that if we have a multifactorial design, we have to recast it as a one-way design.
The null hypothesis is that the means are equal across all levels of the factor; the alternative hypothesis is that at least two levels differ statistically from each other.
EXAMPLE
We have an "Empathy for pain" experiment, in which we collect physiological data during the viewing of 3 types of videos:
- a control video
- a video of a syringe penetrating a hand
- a video of a Q-tip touching a hand
in two perspectives:
- 1PP, 3PP
and the colour of the Q-tip or syringe can change:
- blue, green, pink
Therefore, the design is a \(3\times2\times3\). If we translate it into a one-way design, the total number of "groups" (k) is \(18\).
However, our hypothesis concerns a difference between the syringe videos in 1PP and 3PP, thus involving two factors (\(3\times2\)): the total number of groups to take into account is \(6\).
EXAMPLE
Let's use a medium effect size \(f = 0.25\)
pwr.anova.test(k = 6, f = 0.25, power = 0.8, sig.level = 0.05)
##
##      Balanced one-way analysis of variance power calculation
##
##               k = 6
##               n = 35.14095
##               f = 0.25
##       sig.level = 0.05
##           power = 0.8
##
## NOTE: n is number in each group
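A sketch of what happens under the hood, assuming the usual non-centrality parameterisation \(\lambda = k\,n\,f^2\) for the balanced one-way ANOVA, using only base R's non-central F distribution:

```r
# Balanced one-way ANOVA power via the non-central F distribution:
# df1 = k - 1, df2 = (n - 1) * k, non-centrality lambda = k * n * f^2.
power_anova <- function(n, k, f, alpha = 0.05) {
  df1 <- k - 1
  df2 <- (n - 1) * k
  1 - pf(qf(1 - alpha, df1, df2), df1, df2, ncp = k * n * f^2)
}
n <- uniroot(function(n) power_anova(n, k = 6, f = 0.25) - 0.8, c(2, 1000))$root
n  # ~35.14 per group, hence ceiling(n) * 6 = 216 subjects in total
```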
The function is pwr.f2.test.
Formally, it is for multiple regression, but it can do the same things seen in pwr.anova.test with much more flexibility.
You do not need to force your design into a "one-way" study, and you can take into account covariates.
The parameters that this function takes are:
You need to give the function the number of degrees of freedom of the numerator. The function will then return the number of degrees of freedom of the denominator, from which we can estimate the sample size.
\(u\) is equal to the number of levels of the factor/interaction minus 1. \(v\) is the degrees of freedom of the residuals.
The sample size is \(u + v + 1\).
EXAMPLE
To the previous study, we add an evaluation of the embodiment of the hand seen in the videos on a Likert scale.
The d.o.f. of the numerator are now: \(2\times3\times1 - 1 = 5\)
Let's use a medium effect size \(f2 = 0.15\)
pwr.f2.test(u = 5, f2 = 0.15, power = 0.8, sig.level = 0.05)
##
##      Multiple regression power calculation
##
##               u = 5
##               v = 85.21369
##              f2 = 0.15
##       sig.level = 0.05
##           power = 0.8
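A base-R sketch of the same computation, assuming the non-centrality parameter \(\lambda = f^2(u+v+1)\) for the general linear model; it also carries out the final step \(n = u + v + 1\):

```r
# General linear model power via the non-central F distribution:
# numerator d.o.f. u, denominator d.o.f. v, lambda = f2 * (u + v + 1).
power_f2 <- function(v, u, f2, alpha = 0.05) {
  1 - pf(qf(1 - alpha, u, v), u, v, ncp = f2 * (u + v + 1))
}
v <- uniroot(function(v) power_f2(v, u = 5, f2 = 0.15) - 0.8, c(2, 10000))$root
v                        # ~85.21
n <- ceiling(5 + v + 1)  # sample size = u + v + 1 -> 92
```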
Compute the required sample size to understand whether females are more creative than males, using a small effect size and a power of 80%.
Estimate the sample size necessary to understand whether people feel more energetic after a hot shower, with a medium effect size and a power of 90%.
Repeat with a small effect size and a power of 80%.
We can use the TrialSize package.
It has 80 different functions to compute the sample size.
Some of interest are:
ANOVA.Repeat.Measure
CrossOver.ISV.Equality
CrossOver.ISV.NIS
\(H_0: \mu_1 = \mu_2 = \dots = \mu_k\); \(H_1: \exists\, i \neq j : \mu_i \neq \mu_j\)
This function needs:
There is no room for standard effect sizes here!
Suggestion: in terms of Cohen's d, delta = ES \(\times \sqrt{sigma}\)
EXAMPLE
We use the same example as before.
Our physiological data have been transformed into z-scores, therefore a meaningful difference may be 1.5.
We set the sum of the variances to \(\sqrt{delta\div ES^2} = \sqrt{1.5\div0.5^2} \simeq 2.45\)
s.size <- ANOVA.Repeat.Measure(alpha = 0.05, beta = 0.2, sigma = 2.45, delta = 1.5, m = 6)
s.size
## [1] 64.6112
\(\sigma^2_{T}\) is the within-subject variance under treatment \(T\)
\(H_0: \sigma^2_{T_1} = \sigma^2_{T_2}\); \(H_1: \sigma^2_{T_1} \neq \sigma^2_{T_2}\)
This function needs:
Suggestion: think of the sigmas in terms of percentages or z-scores
EXAMPLE
Cross-over design with a treatment and a placebo condition, 5 training sessions per week, done for two weeks.
Data are in z-scores
CrossOver.ISV.Equality(alpha = 0.05, beta = 0.2, sigma1 = 1, sigma2 = 2, m = 10)
## [1] 2.0000000 0.2573179
## [1] 3.0000000 0.3880039
## [1] 4.0000000 0.4632526
## ...
## [1] 37.0000000 0.8022938
## ...
(long printout truncated: each line pairs a candidate sample size with the achieved power; the desired power of 0.8 is first exceeded at sample size 37)
0.9585644 ## [1] 976.0000000 0.9585852 ## [1] 977.000000 0.958606 ## [1] 978.0000000 0.9586268 ## [1] 979.0000000 0.9586475 ## [1] 980.0000000 0.9586682 ## [1] 981.0000000 0.9586888 ## [1] 982.0000000 0.9587094 ## [1] 983.00000 0.95873 ## [1] 984.0000000 0.9587506 ## [1] 985.0000000 0.9587711 ## [1] 986.0000000 0.9587916 ## [1] 987.0000000 0.9588121 ## [1] 988.0000000 0.9588325 ## [1] 989.0000000 0.9588529 ## [1] 990.0000000 0.9588733 ## [1] 991.0000000 0.9588936 ## [1] 992.0000000 0.9589139 ## [1] 993.0000000 0.9589342 ## [1] 994.0000000 0.9589545 ## [1] 995.0000000 0.9589747 ## [1] 996.0000000 0.9589949 ## [1] 997.0000000 0.9590151 ## [1] 998.0000000 0.9590352 ## [1] 999.0000000 0.9590553 ## [1] 1000.0000000 0.9590754
EXAMPLE
[1] 30.0000000 0.7823326
[1] 31.0000000 0.7855731
[1] 32.0000000 0.7886677
[1] 33.0000000 0.7916271
[1] 34.0000000 0.7944611
[1] 35.0000000 0.7971784
[1] 36.0000000 0.7997869
[1] 37.0000000 0.8022938 \(\leftarrow\) we reached the desired power!
NIS stands for Non-Inferiority/Superiority of Intra-Subject Variability in Crossover Design
\(\rho = \sigma_{T_1} \div \sigma_{T_2}\)
\(H_0: \rho \geq \delta\); \(H_1: \rho < \delta\)
This function needs:
Suggestion: think of the sigmas in terms of percentages or z-scores.
EXAMPLE
Same design as seen before. When will the experimental treatment give a better outcome than placebo? Data are in z-scores.
CrossOver.ISV.NIS(alpha = 0.05, beta = 0.2, sigma1 = 1, sigma2 = 2, m = 10, margin = 2)
## [1] "ratio" "0.125"
## [1] 2.0000000 0.3012328
## [1] 3.0000000 0.4323352
## (output truncated)
## [1] 29.000000 0.801158
## (output truncated)
## [1] 100.0000000 0.8888456
EXAMPLE
[1] 24.0000000 0.7829748
[1] 25.0000000 0.7870264
[1] 26.0000000 0.7908511
[1] 27.0000000 0.7944695
[1] 28.0000000 0.7978998
[1] 29.000000 0.801158 \(\leftarrow\) we reached the desired power!
Estimate the required sample size for a repeated-measures ANOVA with a \(3 \times 2 \times 3\) within-subjects design. You decided that the difference should be half of the overall variance. The power is 90%.
Repeat the same exercise with a \(2 \times 2\) design.
"By data simulation, we simply mean the generation of random numbers from a stochastic process that is described by a series of distributional statements, such as \(\alpha_i \sim Normal(\mu, \sigma^2_{\alpha})\)" (Kéry and Royle, 2016)
In R, this simple simulation can be done as follows:
rnorm(n = 5, mean = 2, sd = 1)
## [1] 1.556414 2.504723 2.736112 3.511451 1.625105
The larger the simulation, the more closely the data resemble the distribution.
1000 random samplings with mean 0 and standard deviation 1.
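As a quick illustration (a sketch, not part of the original notes), we can compare how far the sample mean of a small and of a large set of rnorm() draws falls from the true mean of 0:

```r
## larger samples track the true parameters more closely
set.seed(1)              # any seed; only for reproducibility
x_small <- rnorm(10)     # 10 draws from N(0, 1)
x_large <- rnorm(100000) # 100000 draws from N(0, 1)
abs(mean(x_small))       # typically noticeably far from 0
abs(mean(x_large))       # typically much closer to 0
```

The same shrinking of the error applies to the standard deviation and to the shape of the histogram.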
For complex models, it is difficult to provide a good estimate of power without the use of simulation.
Simulations repeatedly generate random data (at least 1000 times) based on one’s predefined model, then analyze each data set and count the proportion of results that are significant.
That proportion is the estimated power for the model.
We increase the number of "observations" until we reach the desired power.
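This simulate-analyze-count recipe can be packaged in a small helper function. This is only a sketch: the name sim_power and its arguments are our own, not from any package.

```r
## estimate power by simulation: gen_data() simulates one data set for a
## given n, get_p() returns the p-value of the analysis of that data set
sim_power <- function(n, gen_data, get_p, n_sims = 1000, alpha = 0.05){
  p_values <- replicate(n_sims, get_p(gen_data(n)))
  mean(p_values < alpha) # proportion of significant results = estimated power
}

## example: two-sample t-test with a true difference of 0.8 SD
set.seed(1)
pw <- sim_power(n = 26,
                gen_data = function(n) list(y1 = rnorm(n, mean = 0.8),
                                            y2 = rnorm(n)),
                get_p = function(d) t.test(d$y1, d$y2)$p.value)
pw
```

With n = 26 per group, the estimate should land near the analytical 80% power for d = 0.8 that we compute below.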
Let's use simulation to determine the sample size for a t-test, with effect size Cohen's d = 0.8 and power = 80%.
If we use the analytical functions:
library( pwr ) # provides the analytical power functions used here
pwr.t.test( d = 0.8 , power = 0.8 )
## 
##      Two-sample t test power calculation 
## 
##               n = 25.52458
##               d = 0.8
##       sig.level = 0.05
##           power = 0.8
##     alternative = two.sided
## 
## NOTE: n is number in *each* group
\(d = (\mu_1 - \mu_2) \div \sigma_{12} \rightarrow 0.8 = (\mu_1 - 0) \div \sigma_{12} \rightarrow\) \(0.8 = (\mu_1 - 0) \div 1 \rightarrow \mu_1 = 0.8\), where \(\sigma_{12}\) is the pooled standard deviation.
out <- rep(NA , times = 10000)
for ( i in 1:10000){
  y1 <- rnorm( 25 , mean = 0.8 )
  y2 <- rnorm( 25 )
  out[i] <- t.test( y1 , y2 )$p.value
}
paste( "The power for 25 participants is:" , mean( out < 0.05 ))
## [1] "The power for 25 participants is: 0.7867"
out <- matrix(NA , nrow = 1000 , ncol = 10)
n <- c(10 , 15 , 20 , 25 , 30 , 35 , 40 , 45 , 50 , 55) # seq( from = 10 , to = 55 , by = 5 )
colnames( out ) <- n
i_column <- 1 # index variable
repeat{
  for ( i in 1:1000){
    y1 <- rnorm( n[ i_column ] , mean = 0.8 )
    y2 <- rnorm( n[ i_column ] )
    out[ i , i_column ] <- t.test( y1 , y2 )$p.value
  }
  if( mean( out[ , i_column] < 0.05) >= 0.8) break;
  i_column <- i_column + 1
}
apply(out[ , 1:i_column]<0.05 , 2 , mean)
## 10 15 20 25 ## 0.392 0.565 0.691 0.803
\(y \sim Intercept + condition \times perspective\), a \(3 \times 2\) design
## we choose our effects of the IV on the DV
intercept <- 0
condition <- c( 0 , 0 )
perspect <- 0
cond_per <- c( 0 , 0.8 )
n_subj <- 30
## we create our dataframe with the IV
data.tmp <- as.data.frame(
  expand.grid( ID = 1:n_subj,
               conditions = factor(1:3),
               perspective = factor(1:2) ) )
mm <- model.matrix( ~ conditions * perspective , data = data.tmp )
## we start computing our DV
data.tmp$y <- intercept +
  mm[,2] * condition[1] + mm[,3] * condition[2] +
  mm[,4] * perspect[1] +
  mm[,5] * cond_per[1] + mm[,6] * cond_per[2]
## this is a shorter way of doing the same thing
data.tmp$y <- rowSums( model.matrix( ~ conditions * perspective , data = data.tmp ) %*%
                         c( intercept , condition , perspect , cond_per ) )
## we add some stochastic noise to the DV
data.tmp$y <- data.tmp$y + rnorm( nrow( data.tmp ) )
data.tmp
##     ID conditions perspective           y
## 1    1          1           1  0.76161471
## 2    2          1           1 -1.18688823
## 3    3          1           1  2.48858967
## 4    4          1           1 -0.52312161
## 5    5          1           1  0.37514819
## 6    6          1           1 -0.04020194
## (output truncated: 180 rows in total)
## 180 30          3           2  2.29001958
summary( aov( y ~ conditions*perspective + Error(ID/conditions*perspective) ,
              data = data.tmp ) )
## 
## Error: ID
##           Df   Sum Sq  Mean Sq F value Pr(>F)
## Residuals  1 0.001572 0.001572
## 
## Error: perspective
##             Df Sum Sq Mean Sq
## perspective  1  8.392   8.392
## 
## Error: ID:conditions
##            Df Sum Sq Mean Sq
## conditions  2  3.081    1.54
## 
## Error: ID:perspective
##           Df Sum Sq Mean Sq F value Pr(>F)
## Residuals  1    4.6     4.6
## 
## Error: ID:conditions:perspective
##                        Df Sum Sq Mean Sq
## conditions:perspective  2   63.2    31.6
## 
## Error: Within
##                         Df Sum Sq Mean Sq F value  Pr(>F)   
## conditions               2   0.25   0.124   0.103 0.90235   
## conditions:perspective   2  12.40   6.200   5.146 0.00678 **
## Residuals              168 202.42   1.205                   
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
out <- matrix(NA , nrow = 1000 , ncol = 39)
n <- seq(from = 10, to = 200, by = 5)
colnames(out) <- n
i_column <- 1 # index variable
## we choose our effects of the IV on the DV
intercept <- 0
condition <- c( 0 , 0 )
perspect <- 0
cond_per <- c( 0 , 0.8 )
repeat{
  for(i in 1:1000){
    data.tmp <- as.data.frame(
      expand.grid( ID = 1:n[ i_column ] ,
                   conditions = factor(1:3),
                   perspective = factor(1:2) ) )
    data.tmp$y <- rowSums( model.matrix( ~ conditions * perspective , data = data.tmp ) %*%
                             c( intercept , condition , perspect , cond_per ) ) +
      rnorm( nrow(data.tmp) )
    out[ i , i_column ] <- summary(
      aov( y ~ conditions*perspective + Error(ID/conditions*perspective) ,
           data = data.tmp ) )$`Error: Within`[[1]][2,5]
  }
  if( mean( out[ , i_column] < 0.05 ) >= 0.9 ) break;
  i_column <- i_column + 1
}
apply(out[ , 1:i_column]<0.05 , 2 , mean)
## 10 15 20 25 ## 0.524 0.718 0.883 0.933
One typical problem is: "how do I know in which coefficient I have to put my effect size?"
That depends on your hypothesis.
Does your main hypothesis involve the highest-order interaction, or a lower-order coefficient?
Among all coefficients, you should think about which contrast is really important for your sample size.
For example, if you have a control and an experimental group, and your hypothesis is that there will be a difference between the two groups in condition A compared to a baseline, you should use the coefficient that represents that interaction.
In this case we should have:
For both factors we can use the treatment coding for the contrast matrix.
In this matrix, all coefficients are differences from a "baseline" level.
Thus, when we simulate our data set, we should declare these aspects:
data.tmp <- as.data.frame(
  expand.grid( ID = 1:n[ i_column ] ,
               conditions = factor(c("baseline","A","B")),
               group = factor(c("experimental","control")) ) )
We check the contrast matrices:
contrasts( data.tmp$conditions )
##          [,1] [,2]
## A           1    0
## B           0    1
## baseline   -1   -1
contrasts( data.tmp$group )
##              [,1]
## control         1
## experimental   -1
We change the contrasts
contrasts( data.tmp$conditions ) <- contr.treatment( n = 3 , base = 3 )
contrasts( data.tmp$group ) <- contr.treatment( n = 2 , base = 1 )
contrasts( data.tmp$conditions )
##          1 2
## A        1 0
## B        0 1
## baseline 0 0
contrasts( data.tmp$group )
##              2
## control      0
## experimental 1
The coefficients for interactions are simply given by the product of the contrast matrices of the main effects (and lower-order interactions) involved in the interaction of interest.
A simple way to identify this coefficient is to look at the model matrix of the whole design.
model.matrix( ~ conditions * group , data = data.tmp )
##     (Intercept) conditions1 conditions2 group2 conditions1:group2 conditions2:group2
## 1             1           0           0      1                  0                  0
## 26            1           1           0      1                  1                  0
## 51            1           0           1      1                  0                  1
## 76            1           0           0      0                  0                  0
## 101           1           1           0      0                  0                  0
## 126           1           0           1      0                  0                  0
## (output truncated: one representative row per design cell, 150 rows in total)
## attr(,"assign")
## [1] 0 1 1 2 3 3
## attr(,"contrasts")$conditions
##          1 2
## A        1 0
## B        0 1
## baseline 0 0
## 
## attr(,"contrasts")$group
##              2
## control      0
## experimental 1
(Intercept) is the first one, and it is always = 1
conditions1 is the second one, and it is A - baseline
conditions2 is the third one, and it is B - baseline
group2 is experimental - control
conditions1:group2 is the "A - baseline" difference for the experimental group, minus the same difference for the control group
conditions2:group2 is the "B - baseline" difference for the experimental group, minus the same difference for the control group
Question: which of these coefficients should we use?
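We can check the "product of contrast matrices" rule directly: in the model matrix, each interaction column is the element-wise product of the columns of the terms it involves. A small check, not part of the original notes:

```r
## rebuild the design with the treatment-coded contrasts used above
data.tmp <- as.data.frame(
  expand.grid( ID = 1:25 ,
               conditions = factor(c("baseline","A","B")),
               group = factor(c("experimental","control")) ) )
contrasts( data.tmp$conditions ) <- contr.treatment( n = 3 , base = 3 )
contrasts( data.tmp$group ) <- contr.treatment( n = 2 , base = 1 )
mm <- model.matrix( ~ conditions * group , data = data.tmp )
## the interaction column equals the product of its parent columns
ok <- all( mm[ , "conditions1:group2"] == mm[ , "conditions1"] * mm[ , "group2"] )
ok
```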
Therefore, when we create our coefficients, we should put the required effect size in that coefficient:
## we choose our effects of the IV on the DV
intercept <- 0
condition <- c( 0 , 0 )
group     <- 0
cond_grou <- c( 0.8 , 0 ) ## <- the effect size goes in the conditions1:group2 coefficient
The repeat loop, without the break instruction, can run forever.
Until now we put the break only if we reach the desired power.
But what if we never reach the desired power?
And what if we have constraints in collecting samples from a particular population?
We just need to put an additional condition in the if ... break; part of the code.
We need to write: "if the sample size is greater than or equal to X, then break the loop".
repeat{
  for(i in 1:1000){
    ...
  }
  if( mean( out[ , i_column] < 0.05) >= 0.9 ) break;
  if( n[ i_column ] >= 40 ) break; ## sample size constraint
  i_column <- i_column + 1
}
Better written:
repeat{
  for(i in 1:1000){
    ...
  }
  ## if we reach the power or...
  if( ( mean( out[ , i_column] < 0.05) >= 0.9 ) ||
      ## we reach the sample size constraint...
      ( n[ i_column ] >= 40 ) ) break;
  i_column <- i_column + 1
}
With this methodology, you can compute the power of virtually any study design.
You can apply this to multilevel models, Bayesian statistics, and so on…
In the following example, change the power to 90%.
out <- matrix(NA , nrow = 1000 , ncol = 10)
n <- c(10 , 15 , 20 , 25 , 30 , 35 , 40 , 45 , 50 , 55)
colnames( out ) <- n
i_column <- 1 # index variable
repeat{
  for ( i in 1:1000){
    y1 <- rnorm( n[ i_column ] , mean = 0.8 )
    y2 <- rnorm( n[ i_column ] )
    out[ i , i_column ] <- t.test( y1 , y2 )$p.value
  }
  if( mean( out[ , i_column] < 0.05) >= 0.8) break;
  i_column <- i_column + 1
}
apply(out[ , 1:i_column]<0.05 , 2 , mean)
Change the effect size from 0.8 to 0.5:
out <- matrix(NA , nrow = 1000 , ncol = 10)
n <- c(10 , 15 , 20 , 25 , 30 , 35 , 40 , 45 , 50 , 55)
colnames( out ) <- n
i_column <- 1 # index variable
repeat{
  for ( i in 1:1000){
    y1 <- rnorm( n[ i_column ] , mean = 0.8 )
    y2 <- rnorm( n[ i_column ] )
    out[ i , i_column ] <- t.test( y1 , y2 )$p.value
  }
  if( mean( out[ , i_column] < 0.05) >= 0.8) break;
  i_column <- i_column + 1
}
apply(out[ , 1:i_column]<0.05 , 2 , mean)
Add to the following example a break at a sample size of 55:
out <- matrix(NA , nrow = 1000 , ncol = 10)
n <- c(10 , 15 , 20 , 25 , 30 , 35 , 40 , 45 , 50 , 55)
colnames( out ) <- n
i_column <- 1 # index variable
repeat{
  for ( i in 1:1000){
    y1 <- rnorm( n[ i_column ] , mean = 0.2 )
    y2 <- rnorm( n[ i_column ] )
    out[ i , i_column ] <- t.test( y1 , y2 )$p.value
  }
  if( mean( out[ , i_column] < 0.05) >= 0.8) break;
  i_column <- i_column + 1
}
apply(out[ , 1:i_column]<0.05 , 2 , mean)
Increase the number of subjects by steps of 1, instead of steps of 5.
out <- matrix(NA , nrow = 1000 , ncol = 10)
n <- c(10 , 15 , 20 , 25 , 30 , 35 , 40 , 45 , 50 , 55)
colnames( out ) <- n
i_column <- 1 # index variable
repeat{
  for ( i in 1:1000){
    y1 <- rnorm( n[ i_column ] , mean = 0.8 )
    y2 <- rnorm( n[ i_column ] )
    out[ i , i_column ] <- t.test( y1 , y2 )$p.value
  }
  if( mean( out[ , i_column] < 0.05) >= 0.8) break;
  i_column <- i_column + 1
}
apply(out[ , 1:i_column]<0.05 , 2 , mean)