
ONE-WAY EXPERIMENTAL DESIGN
Research Methods in Psychology

10 EXPERIMENTAL RESEARCH: ONE-WAY DESIGNS
•What types of evidence allow us to conclude that one variable
causes another variable?
•How do experimental research designs allow the demonstration
of causal relationships between independent and dependent variables?
•How is equivalence among the levels of the independent variable
created in experiments?
•How does the ANOVA test hypotheses about differences between the
experimental conditions?
•What are repeated-measures experimental designs?
•How are the results of experimental research designs presented in the
research report?
•What are the advantages and disadvantages of experimental designs
versus correlational research?

DEMONSTRATION OF CAUSALITY
Association: If there is a causal relationship between the IV and the DV, there must be
a strong correlation (association) between them.
Temporal Priority: The IV must be a precursor of the DV.
Control of Common-Causal Variables: The influence of common-causal variables that may
have produced spurious (false) relationships between the IV and the DV must be ruled out.

ONE-WAY EXPERIMENTAL DESIGN
Example: Violent cartoons increase children's aggressive behaviors.
1) Define an independent variable and its levels (the experimental conditions).
2) Create equivalence, either by using different participants
(between-participants designs) or by using the same participants in each of the
experimental conditions (repeated-measures designs = within-participants designs).
3) Assign participants to each level randomly (see the sketch after this list).
4) Select a dependent variable.
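
A minimal sketch of step 3, assuming a simple between-participants setup in Python; the participant IDs are hypothetical and the condition names follow the cartoon example:

    import random

    # Hypothetical participant IDs (18 children) and the three cartoon conditions.
    participants = list(range(1, 19))
    conditions = ["Violent", "Nonviolent", "Control"]

    random.shuffle(participants)  # put participants in a random order
    # Deal the shuffled participants into the conditions round-robin, so each
    # level of the IV receives an equal-sized, randomly composed group.
    assignment = {c: participants[i::len(conditions)] for i, c in enumerate(conditions)}
    print(assignment)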

ANALYSIS OF VARIANCE (ANOVA)
A statistical procedure specially designed to compare the means of the
dependent variable across the levels of an experimental research design
(the levels of the independent variable).
Example: Violent cartoons increase children’s aggressive
behaviors.
IV: Violent Cartoons vs. Nonviolent Cartoons vs. Control
DV: Children’s aggressive behaviors

ANOVA
The analysis of variance (ANOVA) is a technique for decomposing the total
variability of a response variable into:
variability due to the experimental factor(s), and
variability due to error (i.e., factors that are not accounted for in the
experimental design).
The basic purpose of ANOVA is to test the equality of several
means.
A fixed effect model includes only fixed factors in the model.
A random effect model includes only random factors in the
model.
A mixed effect model includes both fixed and random factors
in the model.

ONE-WAY ANALYSIS OF VARIANCE
One factor with k levels or groups, e.g., the 3 treatment groups in the violent
cartoon study.
The main objective is to examine the equality of the means of the different groups.
The total variation of the observations (SST) can be split into two components:
variation between groups (SSG) and variation within groups (SSE); a sketch of this
decomposition follows below.
Variation between groups is due to differences among the groups, e.g., the
different cartoon conditions or different levels of violence witnessed.
Variation within groups is the inherent variation among the observations
within each group.
The completely randomized design (CRD) is an example of one-way analysis
of variance.
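
A minimal sketch of the SST = SSG + SSE decomposition in plain Python, using the example aggression scores from the data slide later in these notes:

    # One-way ANOVA sum-of-squares decomposition: SST = SSG + SSE.
    groups = {
        "Violent":    [5, 3, 4, 3, 4, 5],
        "Nonviolent": [2, 1, 3, 2, 1, 3],
        "Control":    [2, 1, 1, 1, 2, 1],
    }
    scores = [x for g in groups.values() for x in g]
    grand_mean = sum(scores) / len(scores)

    # SSG: variation of the group means around the grand mean (between groups).
    ssg = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups.values())
    # SSE: variation of each score around its own group mean (within groups).
    sse = sum((x - sum(g) / len(g)) ** 2 for g in groups.values() for x in g)
    # SST: total variation of all scores around the grand mean.
    sst = sum((x - grand_mean) ** 2 for x in scores)

    print(ssg, sse, sst)  # ssg + sse equals sst (up to rounding)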

HYPOTHESIS TESTING IN EXPERIMENTAL DESIGN
Null hypothesis H0:
Mean(Violent Cartoons) = Mean(Nonviolent Cartoons) = Mean(Control)
Research/alternative hypothesis H1:
Mean(Violent Cartoons) > Mean(Nonviolent Cartoons) > Mean(Control)
There is a specific difference among the conditions such that Mean(Violent)
is greater than Mean(Nonviolent).

BETWEEN-GROUPS AND WITHIN-GROUPS VARIANCE ESTIMATES
Variance: a measure of the dispersion of the scores on a variable.
The ANOVA compares the variance of the means of the dependent variable
between the different levels to the variance of individuals on the dependent
variable within each of the conditions.
Between-group variance: the variance among the condition means.
Within-group variance: the variance within the conditions.

Data
Aggressive play scores by condition:

Violent   Nonviolent   Control
  5           2           2
  3           1           1
  4           3           1
  3           2           1
  4           1           2
  5           3           1

Mean(Violent) = 4   Mean(Nonviolent) = 2   Mean(Control) = 1.33
Total mean = 2.44
Within-groups variance: the spread of the scores within each column.
Between-groups variance: the spread of the three condition means.
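
Worked check of the means in the table:
Mean(Violent) = (5 + 3 + 4 + 3 + 4 + 5) / 6 = 24 / 6 = 4.00
Mean(Nonviolent) = (2 + 1 + 3 + 2 + 1 + 3) / 6 = 12 / 6 = 2.00
Mean(Control) = (2 + 1 + 1 + 1 + 2 + 1) / 6 = 8 / 6 ≈ 1.33
Total mean = (24 + 12 + 8) / 18 = 44 / 18 ≈ 2.44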

F VALUE
F = Between-groups variance / Within-groups variance
As the between-groups variance increases relative to the within-groups variance,
F increases and the p value decreases; when p falls below alpha, the null
hypothesis is rejected.
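
A minimal sketch of computing the F value with SciPy (assuming SciPy is installed), using the example scores above; the result refers to this small illustrative data set, not to the larger study reported in the summary table below:

    from scipy import stats

    violent    = [5, 3, 4, 3, 4, 5]
    nonviolent = [2, 1, 3, 2, 1, 3]
    control    = [2, 1, 1, 1, 2, 1]

    # One-way ANOVA: F = between-groups variance / within-groups variance.
    # A p value below alpha lets us reject H0 (equal condition means).
    f_value, p_value = stats.f_oneway(violent, nonviolent, control)
    print(f"F = {f_value:.2f}, p = {p_value:.4f}")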

ANOVA SUMMARY TABLE
DV: Aggressive Play

Source    Sum of Squares   df   Mean Square   F       p-value
Between   14.40            2    14.40         10.98   .002
Within    49.78            38   1.31
Total     64.18

MEAN SQUARES AND GROUP DIFFERENCES
MS_between > MS_within:
MS_between < MS_within:

MEAN SQUARES AND GROUP DIFFERENCES
Question: Which suggests that the group means are quite different:
MS_between > MS_within or MS_between < MS_within?
Answer: If the between-groups variance is greater than the within-groups
variance, the groups are quite distinct; it is unlikely that they came from a
population with the same mean.
But if the within-groups variance is greater than the between-groups variance,
the groups aren't very different (they overlap a lot); it is plausible that
m1 = m2 = m3 = m4.

THE F RATIO
The ratio of MS_between to MS_within is referred to as the F ratio:
•If MS_between > MS_within, then F > 1
•If MS_between < MS_within, then F < 1
•A higher F indicates that the groups are more separated.
F(J − 1, N − J) = MS_between / MS_within, where J is the number of groups and
N is the total number of observations.

PRESENTATION OF EXPERIMENT RESULTS
There were significant differences in rated aggression across the levels of the
cartoon condition, F(2, 38) = 10.98, p < .01. Children who viewed the violent
cartoons (M = 2.89) were rated as playing more aggressively than children who
had viewed the nonviolent cartoons (M = 1.52).
Annotations: 10.98 is the F value; .01 bounds the p value; the degrees of
freedom are 2 (between groups, the number of conditions − 1) and 38 (within
groups); M gives the mean aggression rating in the violent and nonviolent
conditions.
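
A minimal sketch of assembling the statistics for the report sentence; the helper name is illustrative, and the numbers are the ones reported above:

    def report_f(f, df_between, df_within, p, alpha=0.01):
        # Format an F test in the style used in the write-up above.
        p_text = f"p < {alpha}" if p < alpha else f"p = {p:.3f}"
        return f"F({df_between}, {df_within}) = {f:.2f}, {p_text}"

    print(report_f(10.98, 2, 38, 0.002))  # F(2, 38) = 10.98, p < 0.01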

BETWEEN-PARTICIPANTS DESIGNS AND REPEATED-MEASURES DESIGNS
B-P Design: different participants (Ps) are assigned to each condition; one
group views the Violent Cartoons and another group views the Nonviolent
Cartoons, and Aggressive Play is measured in each group.
R-M Design: the same participants (Ps) experience every condition; Aggressive
Play is measured after the Violent Cartoons and again after the Nonviolent
Cartoons.

ADVANTAGES AND DISADVANTAGES OF REPEATED-MEASURES DESIGNS
Advantages:
1) Increased statistical power.
2) Economizes on participants.
Disadvantages:
1) The first measure can influence the second measure (carryover).
2) Participants might become fatigued by the second measure.
3) Participants' performance might improve on the task over time through
practice.

Counterbalancing
Arranging the order in which the conditions of a repeated-measures design are
experienced.
Latin Square Designs
A method of counterbalancing the order of conditions so that each condition
appears in each order but also follows equally often after each of the other
conditions.

     P1   P2   P3
S1   A    B    C
S2   B    C    A
S3   C    A    B
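
A minimal sketch that reproduces the simple rotation shown in the table above for any number of conditions (the A/B/C labels are the slide's placeholders); note that a fully balanced square, in which each condition also follows every other condition equally often, requires a more elaborate construction:

    def latin_square(conditions):
        # Row i starts at condition i and cycles, so every condition appears
        # exactly once in every ordinal position.
        n = len(conditions)
        return [[conditions[(row + pos) % n] for pos in range(n)] for row in range(n)]

    for order in latin_square(["A", "B", "C"]):
        print(order)  # ['A', 'B', 'C'], ['B', 'C', 'A'], ['C', 'A', 'B']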

ADVANTAGES AND DISADVANTAGES OF EXPERIMENTS
Advantage: experimental designs allow us to draw conclusions about causal
relationships between the independent and dependent variables.
Disadvantages:
1) Experiments cannot manipulate a person's sex, race, intelligence, family
variables, or religious background.
2) Participants may not react exactly as they would behave if observed outside
of the lab.
3) Experimental designs necessarily oversimplify things.
