Chapter 4
Games, Learning, and Assessment
Valerie J. Shute and Fengfeng Ke
Florida State University, 3205C Stone Building, 1114 West Call Street, Tallahassee, FL 32306-4453, USA
e-mail: vshute@fsu.edu; fke@fsu.edu
Learning in games has historically been assessed indirectly and/or in a post hoc manner. What's needed instead is real-time assessment and support of learning based on the dynamic needs of players. We need to be able to experimentally determine the degree to which games can support learning, and how and why they achieve this objective. In this chapter we describe an approach to designing and developing evidence-based diagnostic assessments that may be embedded in a game environment. When embedded assessments are so seamlessly woven into the game that they're virtually invisible, we call this "stealth assessment." Embedding assessments within games provides a way to monitor a player's current level on valued competencies, and then use that information as the basis for support, such as adjusting the difficulty level of challenges or providing a report for the teacher.
D. Ifenthaler et al. (eds.), Assessment in Game-Based Learning: Foundations,
Innovations, and Perspectives, DOI 10.1007/978-1-4614-3546-4_4,
© Springer Science+Business Media New York 2012
4.1 Introduction
Scholars from various disciplines have recently shown increasing interest in using
well-designed digital games to support learning (e.g., Gee, 2003 ; Prensky, 2006 ;
Shaffer, Squire, Halverson, & Gee, 2005 ; Shute, Rieber, & Van Eck, 2011 ) . A com-
mon motivation for studying games as vehicles to support learning is frustration with
the current education system and a desire for alternative ways of teaching—ways that
increase student engagement and yield a rich, authentic picture of the learner(s).
Frustration stems from the fact that most schools in the U.S. are not adequately
preparing kids for success in the twenty-first century (e.g., Partnership for 21st Century
Skills, 2006 ) . Learning in school is still heavily geared toward the acquisition of con-
tent within a teacher-centered model, with instruction too often abstract and decontex-
tualized and thus not suitable for this age of complexity and interconnectedness
(Shute, 2007 ) . One downside of this outdated pedagogy is that other developed coun-
tries of the world are surpassing the U.S. on measures of important competencies
(e.g., mathematics problem solving) as assessed by international tests such as the
PISA and TIMSS (Gonzales et al., 2008 ; Howard, Paul, Marisa, & Brooke, 2010 ) .
To make the problem with today's schools clearer, consider the following sce-
nario involving a prototypical student. Maya (13 years old) is sitting in her bedroom
with two of her friends. They are playing Little Big Planet —a digital game involv-
ing sack-person characters, clever and complex problems to solve, and compelling
music and graphics. The game can not only be played (for countless hours), but it
also provides tools to develop one's own levels and worlds which can then be shared
and played with the rest of the Internet community. Fully engaging in the game
requires problem solving skills, persistence, and creativity—i.e., competencies
which are increasingly critical to success in the twenty-first century but are not
supported by our current educational system.
Like so many young people today, Maya and her friends are bored with school,
and their mediocre grades reflect that attitude. But if Maya's teachers could see what
she was doing in Little Big Planet, their views of her as a "slacker" would be quite
different. For instance, Maya created and uploaded a new level in the game and is
showing it to her friends—both in her bedroom and all over the world via the Internet.
Several weeks ago, she began by writing a creative storyline, and used the in-game
toolbox to create a visually-stunning environment complete with actions and reac-
tions in the environment that reflect highly sophisticated physics understanding
(as well as a good command of AI programming skills that goes beyond what most
of her teachers are capable of doing). She regularly contributes detailed descriptions
of how she solved her various coding problems to the Little Big Planet discussion
forum, crafting her messages so they communicate clearly to all of the Little Big
Planet players. Is Maya completely wasting her time with this game when she could
be studying for her science test (e.g., memorizing the parts of a cell) or writing an
expository essay for English class (e.g., on "why someone you care about is impor-
tant to you")?
To answer the question above and to be able to make the claim that Maya is
indeed developing valuable skills like problem solving, creativity, and writing, we
need to employ some type of valid assessment to understand what Maya is learning
from playing the game, to what degree, and in which contexts. The main challenge in creating such an assessment is that it must suit the dynamic nature of digital games and remain unobtrusive to the player, without sacrificing reliability or validity in the process.
The purpose of this chapter is to take a closer look at issues relating to game-
based assessment and learning. What are the core elements of a good game? Can
good games be used to support learning, based on the cumulative findings of the
literature? How can game-based learning be assessed without interrupting the
engagement? To address these questions, we begin by defining games and learning, provide some examples of learning from games, and then present a new approach to dynamically and validly assessing learning within game environments (i.e., evidence-based stealth assessment).
4.2 Games
According to Klopfer, Osterweil, and Salen ( 2009 ) , games refer to structured or
organized play. Play is voluntary, intrinsically motivating, and involves active
cognitive and/or physical engagement that allows for the freedom to fail (and
recover), to experiment, to fashion identities, and freedom of effort and interpreta-
tion (Klopfer et al., 2009; Pellegrini, 1995; Rieber, 1996). Different from "free play," a game is usually a contest of physical or mental skills and strengths, requiring the player to follow a specific set of rules to attain a goal (Hogle, 1996).
A more succinct definition of "games" comes from Suits (1978), who describes
games as, "unnecessary obstacles we volunteer to tackle." To illustrate this idea, he
used the game of golf where the objective is to get the ball into the hole. The most
obvious (and easiest) way to accomplish that goal is to just pick up the ball and put
it in the hole. But when you include the rules of the game (e.g., you must hit the
ball with a stick that has a small piece of metal on the end, while standing 200 yards
or so away from the hole) and other challenges (e.g., sand traps), this makes the
game much more difficult and thus all the more compelling. In games, these unnec-
essary obstacles become something that we want to overcome because reaching for
goals and ultimately succeeding is highly rewarding. Games and their associated
obstacles also create a positive kind of stress, called eustress, which is actually
good for us, providing us with a sense of motivation and desire to succeed
(McGonigal, 2011).
Taking a more componential tack, Prensky ( 2001 ) has argued that a game con-
sists of a number of key elements: rules, goals and objectives, outcomes and feed-
back, conflict (or competition, challenge, opposition), interaction, and representation
or story. Using Prensky's definition, a game differs from a simulation in that a game
is intrinsically motivating and involves competition. A competitive format does not,
however, require two or more participants (Dempsey, Haynes, Lucassen, & Casey,
2002 ) . That is, if a simulation enables a learner to compete against him/herself by
comparing scores over successive attempts at the simulation, or has a game struc-
ture imposed on the system, it is regarded as a type of game. If the focus of a simula-
tion involves the completion of an event only, the simulation is not a game. In
addition, a simulation generally requires representing certain key characteristics or
behaviors of a selected real-world phenomenon or system. But not all games are
created to simulate dynamic systems in reality. For instance, fantasy may be part of
the game design.
4.2.1 Core Elements of Good Games
Diverse perspectives exist in the literature on what a good game should be. Gee
(2009) recently defined six key properties for good digital games to promote deep learning: (a) an underlying rule system and game goal to which the player is emotionally attached; (b) micro-control that creates a sense of intimacy or a feeling of power; (c) experiences that offer good learning opportunities; (d) a match between affordance (allowing for a certain action to occur) and effectivity (the ability of a player to carry out such an action); (e) modeling to make learning from experience more general and abstract; and (f) encouragement to players to enact their own unique trajectory through the game (p. 78).
Other gaming scholars have focused on the playability of the game and player
motivation in describing a good game (e.g., Fabricatore, Nussbaum, & Rosas, 2002 ;
Kirkpatrick, 2007; Yee, 2006). For example, Sweetser and Wyeth (2005) developed and validated an analytic model of game engagement called the GameFlow model. This model captures and evaluates a game's enjoyment or engagement quality through eight game flow elements, including concentration, challenge, player skills,
control, clear goals, feedback, immersion, and social interaction. Each element
encompasses a list of design criteria.
Concentration prescribes that games should provide stimuli from different
sources to grab and maintain players' attention, but not burden players with trivial
tasks or overload them beyond their cognitive, perceptual, and memory limits.
Challenge in a game should match the player's skill level, be increased as the player
progresses through the game, and allow for player-centered pacing. The element of
player skills suggests that games should have an easy and user-friendly interface,
provide a tutorial or online help that enables players' skill development as they
progress through the game, and reward players for skill development. The element
of control indicates that players should have a sense of control over the characters
and movements in the game world, the game interface, and gameplay (i.e., actions
and strategies players take or use when playing the game). Games should also pres-
ent clear overall and intermediate goals, as well as provide immediate feedback and
score status during the gaming process. As a result, games should support players
becoming fully immersed in the game, losing a sense of time and environment in the
process. Finally, games should support social interactions (including competition
and cooperation) between players, and support social communities inside and
outside the game.
By synthesizing the aforementioned findings from the literature and other
discussions on good games, we have derived seven core elements of well-designed
games that are presented below.
• Interactive problem solving: Games require ongoing interaction between the player and the game, which usually involves the requirement to solve a series of problems or quests.
• Specific goals/rules: Games have rules to follow and goals to attain which help the player focus on what to do and when. Goals in games may be implicit or explicit.
• Adaptive challenges: Good games balance difficulty levels to match players' abilities. The best games and instruction hover at the boundary of a student's ability.
• Control: A good game should allow or encourage a player's influence over gameplay, the game environment, and the learning experience.
• Ongoing feedback: Good games should provide timely information to players about their performance. Feedback can be explicit or implicit, and as research has indicated, has positive effects on learning.
• Uncertainty: Uncertainty evokes suspense and player engagement. If a game "telegraphs" its outcome, or can be seen as predictable, it will lose its appeal.
• Sensory stimuli: The combination of graphics, sounds, and/or storyline used to excite the senses; these do not require "professional" graphics or sound to be compelling.
4.2.2 Good Games as Transformative Learning Tools
As many researchers have argued, good games can act as transformative digital
learning tools to support deep and meaningful learning. Based on the situated learn-
ing theory (Brown, Collins, & Duguid, 1989), learning in a mindful way results in
knowledge that is considered meaningful and useful, as compared to the inert
knowledge that results from decontextualized learning strategies.
Learning is at its best when it is active, goal-oriented, contextualized, and inter-
esting (e.g., Bransford, Brown, & Cocking, 2000 ; Bruner, 1961 ; Quinn, 2005 ;
Vygotsky, 1978 ) . Instructional environments should thus be interactive, provide
ongoing feedback, grab and sustain attention, and have appropriate and adaptive
levels of challenge—i.e., the features of good games. With simulated visualization
and authentic problem solving with instant feedback, computer games can afford a
realistic framework for experimentation and situated understanding, hence can act
as rich primers for active learning (Gee, 2003 ; Laurel, 1991 ) .
In this chapter, learning is defined as a lifelong process of accessing, interpreting,
and evaluating information and experiences, then translating the information/
experiences into knowledge, skills, values, and dispositions. It also involves
change—from one point in time to another—in terms of knowing, doing, believing,
and feeling. Prior research on games for learning usually focused on content learn-
ing in schools, such as learning the subjects of reading, writing, and mathematics.
For example, major literature reviews on educational gaming research (Dempsey,
Rasmussen, & Lucassen, 1996 ; Emes, 1997 ; Hays, 2005 ; Ke, 2008 ; Randel, Morris,
Wetzel, & Whitehill, 1992 ; Vogel et al., 2006 ; Wolfe, 1997 ) have indicated that the
majority of gaming studies have focused on content-specific learning. Learning in
game studies encompasses the following subject areas: science education, mathe-
matics, language arts, reading, physics, and health, among others (Ke, 2008 ) .
Substantially fewer studies to date have examined the development of cognitive
processes in games (e.g., Alkan & Cagiltay, 2007 ; Pillay, 2002 ; Pillay, Brownlee, &
Wilss, 1999 ) .
While games can support content learning, we believe that games are actually better
suited to support more complex competencies. As many researchers have pointed out
(e.g., Gee, 2003 ; Malone & Lepper, 1987 ; Rieber, 1996 ) , games, as a vehicle for play,
can be viewed as a natural cognitive tool or toy for both children and adults (Hogle, 1996). And rather than being used as a means to achieve an external goal (e.g., learning
mathematics), games are often made to align with players' intrinsic interests and chal-
lenge learners to use skills they would not otherwise tend to use (Malone & Lepper,
1987 ) , thus enabling the design of intrinsically motivating environments, with knowl-
edge and skill acquisition as a positive by-product of gameplay.
Besides providing opportunities for play, games enable extensive and multiple
types of cognitive learning strategies. For example, games can be used as an anchor
for learning-by-design to reinforce creativity of learners (Kafai, 2005 ) . Games can
involve players in forming, experimenting with, interpreting, and adapting playing
strategy in order to solve problems, thus enabling players to practice persistent
problem solving (Kiili, 2007 ) . Games can also be developed as dynamic systems
with which players can observe and play out key principles inherent in the systems,
and hence develop organizational and systemic thinking skills (Klopfer et al., 2009 ) .
Finally, games can express and inspire certain underlying epistemic frames, values,
beliefs, and identities (Shaffer, 2005).
There is a convergence between the core elements of a good game and the char-
acteristics of productive learning. Constructivist problem-based and inquiry learning methods have demonstrated the success of learning in the context of challenging, open-ended problems (Hmelo-Silver, 2004). Goal-based scenarios have long been
viewed as an active primer for situated learning (Bransford et al., 2000 ).
Correspondingly, in a good game a player is involved in an iterative cycle of goal-
based, interactive problem solving. Psychologists (e.g., Falmagne, Cosyn, Doignon,
& Thiery, 2003 ; Vygotsky, 1987 ) have long argued that the best instruction hovers
at the boundary of a student's competence. Along the same line, Gee ( 2003 ) has
argued that the secret of a good game is not its 3D graphics and other bells and
whistles, but its underlying architecture where each level dances around the outer
limits of the player's abilities, seeking at every point to be hard enough to be just
doable. Moreover, a good game reinforces a sense of control—a critical metacogni-
tive component for self-regulated learning (Zimmerman & Schunk, 2001 ) . Similarly,
both well-designed games and productive learning processes employ ongoing feed-
back as a major mechanism of play/learning support. Finally, the literature on the
contribution of curiosity for learning motivation (Krapp, 1999 ) and the critical role
of sensory memory in information processing (Anderson, 1995 ) is closely con-
nected with the discussion of uncertainty and sensory stimuli in good games.
The problem with offering a game as a transformative learning tool to support
complex competencies is that its effectiveness often cannot be directly or easily
measured by traditional assessment instruments (e.g., multiple-choice tests). Moreover, much of this learning is implicit; it occurs when players are not consciously intending to learn some content. Therefore, focusing solely on knowledge test scores as outcomes is too limited, since games' strength lies in supporting emergent, complex skills.
4.3 Evidence of Learning from Games
Following are four examples of learning from digital games that represent commercial
as well as educational games. Preliminary evidence suggests that students can learn
deeply from such games, and acquire important twenty-first century competencies.
4.3.1 Deep Learning in Civilization
Our first example illustrates how a commercial digital game can be used to support
deep learning of history. Kurt Squire, at the University of Wisconsin, used a strategy
game called Civilization in a high school world history class (Squire, 2004 ) . The
goal of this game is to build, advance, and protect a civilization. This game starts
with kids picking a civilization that they want to build (e.g., ancient Mesopotamia).
Kids make many decisions about how to build and grow their civilization. Sometimes
their decisions can be as simple as deciding where to put a new bridge, but they can
be as complex as deciding whether to start a nuclear war. To make successful deci-
sions, a player needs to consider important elements of human history, including
economy, geography, culture, technology advancement, and war.
So what do kids learn from playing this game? Squire reported that players mas-
tered many historical facts (e.g., where Rome was located), but more importantly, at
the end of the game, they took away a deep understanding about the intricate rela-
tionships involving geographical, historical, and economic systems within and
across civilizations.
4.3.2 Gamestar Mechanic and Systems Thinking
Our next example illustrates how digital games can be used to support systems
thinking skill. Systems thinking skill refers to a particular way of looking at the
world which involves seeing the "big picture" and the underlying interrelationships
among the constituent elements rather than just as isolated bits. Gamestar Mechanic
is an online game that is intended to teach kids basic game design skills and also
allows them to actually build their own games for themselves, friends, and family to
play. To design a functioning and challenging game in Gamestar Mechanic, players
need to think hard about various game elements, parameters, and their interrelation-
ships. If they think too simply, and just change a few elements of the game without
considering the whole system, the game will not work.
For example, consider a player who included too many enemies in her game
(each one with full strength). The consequence of this decision would be that other
players would not be able to beat the game, so it would not be any fun. With a little
reflection, she would realize the impact that the number/strength of enemies feature
of the game would have on other elements of the game, and revise accordingly.
Torres ( 2009 ) recently reported on his research using Gamestar Mechanic. He found
that kids who played the game did, in fact, develop systems thinking skills along
with other important skills such as innovative design.
4.3.3 Epistemic Games
Another example of a type of digital game that supports learning is the epistemic
game. An epistemic game is a unique game genre where players virtually experi-
ence the same things that professional practitioners do (e.g., urban planner, journal-
ist, and engineer). Epistemic games are being developed by Shaffer and his research
team at the University of Wisconsin-Madison (Shaffer, 2007 ) . These games are
based on the idea that learning means acquiring and adopting knowledge, skills,
values, and identities that are embedded within a particular discipline or profes-
sional community. For example, to really learn engineering means being able to
think, talk, and act like an engineer.
One example of an epistemic game is Urban Science. In Urban Science, players
work as interns for an urban and regional planning center. Players as a group develop
landscape planning proposals for the mayor of the city where they live. As part of
the game play process, they first conduct a site visit, interviewing virtual stakehold-
ers in the area to identify different interests. For instance, some stakeholders may
want a parking garage while others want affordable housing. Players need to con-
sider various social and economic impacts of their decisions. They also use a special
mapping tool called iplan (which is a tool similar to an actual Geographic Information
System) to come up with their final plan. Towards the end of the game, they write their final proposal to the mayor, discussing the strengths and weaknesses of their final planning ideas.
4.3.4 Taiga Park and Science Content Learning
Our last example illustrates how kids learn science content and inquiry skills within
an online game called Quest Atlantis: Taiga Park. Taiga Park is an immersive digi-
tal game developed by Barab et al. at Indiana University (Barab, Gresalfi, & Ingram-Goble, 2010; Barab et al., 2007). Taiga Park is a beautiful national park where many groups co-exist, such as the fly-fishing company, the Mulu farmers,
the lumber company, and park visitors. In this game, Ranger Bartle calls on the
player to investigate why the fish are dying in the Taiga River. To solve this problem, players are engaged in scientific inquiry activities. They interview virtual
characters to gather information, and collect water samples at several locations
along the river to measure water quality. Based on the collected information, play-
ers make a hypothesis and suggest a solution to the park ranger.
To move successfully through the game, players need to understand how certain
science concepts are related to each other (e.g., sediment in the water from the loggers' activities causes an increase in water temperature, which decreases the amount of dissolved oxygen in the water, which causes the fish to die). Also, players need to think systemically about how different social, ecological, and economic
interests are intertwined in this park. In a controlled experiment, Barab et al. ( 2010 )
found that the middle school students learning with Taiga Park scored significantly
higher on the posttest (assessing knowledge of core concepts such as erosion and
eutrophication) compared to the classroom condition. The same teacher taught both
treatment and control conditions. The Taiga Park group also scored significantly
higher than the control condition on a delayed posttest, thus demonstrating retention
of the content relating to water quality.
As these examples show, digital games appear to support learning. But how can
we more accurately measure learning, especially as it happens (rather than after the
fact)? The answer is not likely to be via multiple choice tests or self-report surveys
as those kinds of assessments cannot capture and analyze the dynamic and complex
performances that inform twenty-first century competencies. A new approach to
assessment is needed.
4.4 Assessment in Games
In a typical digital game, as players interact with the environment, the values of
different game-specific variables change. For instance, getting injured in a battle reduces health, and finding a treasure or another object increases one's inventory of goods. In addition, solving major problems in games permits players to gain rank
or "level up." One could argue that these are all "assessments" in games—of health,
personal goods, and rank. But now consider monitoring educationally-relevant
variables at different levels of granularity in games. In addition to checking health
status, players could check their current levels of systems thinking skill, creativity,
and teamwork, where each of these competencies is further broken down into con-
stituent knowledge and skill elements (e.g., teamwork may be broken down into
cooperating, negotiating, and influencing skills). If the estimated values of those
competencies got too low, the player would likely feel compelled to take action to
boost them.
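To make this idea concrete, the following sketch (ours, not drawn from any particular game; all variable names and numbers are invented for illustration) shows how a hierarchical competency structure might be tracked alongside ordinary game-state variables, with constituent-skill estimates rolled up into a competency-level estimate:

# Minimal sketch: educationally relevant competencies tracked alongside
# ordinary game-state variables, at two levels of granularity.
game_state = {"health": 72, "inventory": ["rope", "map"], "rank": 3}

# Each competency is broken into constituent skills, each with a current
# estimate in [0, 1] that the assessment machinery would keep updating.
competencies = {
    "teamwork": {
        "cooperating": 0.61,
        "negotiating": 0.44,
        "influencing": 0.52,
    },
    "systems_thinking": {
        "identifying_components": 0.70,
        "reasoning_about_interrelationships": 0.38,
    },
}

def competency_level(name: str) -> float:
    """Roll constituent-skill estimates up into a single competency estimate."""
    skills = competencies[name]
    return sum(skills.values()) / len(skills)

if __name__ == "__main__":
    for name in competencies:
        print(f"{name}: {competency_level(name):.2f}")

In practice, the rolled-up estimates would be produced by the measurement machinery described next, rather than by a simple average.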
4.4.1 Evidence-Centered Design
One main challenge for educators who want to employ or design games to support
learning involves making valid inferences—about what the student knows, believes,
and can do—at any point in time, at various levels, and without disrupting the flow
of the game (and hence engagement and learning). One way to increase the quality
and utility of an assessment is to use evidence-centered design (ECD), which
informs the design of valid assessments and can yield real-time estimates of stu-
dents' competency levels across a range of knowledge and skills (Mislevy, Steinberg,
& Almond, 2003 ).
ECD is a conceptual framework that can be used to develop assessment models,
which in turn support the design of valid assessments. The goal is to help assess-
ment designers coherently align (a) the claims that they want to make about learn-
ers, and (b) the things that learners say or do in relation to the contexts and tasks of
interest (for an overview, see Mislevy & Haertel, 2006; Mislevy et al., 2003). There
are three main theoretical models in the ECD framework: competency, evidence,
and task models.
The competency model consists of student-related variables (e.g., knowledge,
skills, and other attributes) on which we want to make claims. For example, sup-
pose that you wanted to make claims about a student's ability to "design excellent
presentation slides" using MS PowerPoint. The competency model variables
(or nodes) would include technical as well as visual design skills. The evidence
model would show how, and to what degree, specific observations and artifacts can
be used as evidence to inform inferences about the levels or states of competency
model variables. For instance, if you observed that a learner demonstrated a high
level of technical skill but a low level of visual design skill, you may estimate her
overall ability to design excellent slides to be approximately "medium"—if both
the technical and aesthetic skills were weighted equally.
The task model in the ECD framework specifies the activities or conditions under which data are collected. In our current PowerPoint example, the task model would define the actions and products (and their associated indicators) that the student would generate, which comprise evidence for the various competencies.
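A minimal sketch (our illustration, not a prescribed implementation) of how the three ECD models could be wired together for the PowerPoint example follows; the observable indicators, scoring rules, and equal weights are assumptions chosen only to mirror the example above:

# Competency model: the claims we want to make about the learner.
competency_model = {"technical_skill": None, "visual_design_skill": None}

# Task model: activities/conditions under which evidence is collected,
# and the observable indicators each activity can produce.
task_model = {
    "build_slide_deck": ["used_master_slides", "consistent_color_palette"],
}

# Evidence model: how observations update competency estimates, and how
# competencies combine into an overall claim (equal weights, per the text).
def update_competencies(observations: dict) -> dict:
    return {
        "technical_skill": 1.0 if observations.get("used_master_slides") else 0.2,
        "visual_design_skill": 1.0 if observations.get("consistent_color_palette") else 0.2,
    }

def overall_slide_design(estimates: dict) -> float:
    weights = {"technical_skill": 0.5, "visual_design_skill": 0.5}
    return sum(weights[k] * v for k, v in estimates.items())

if __name__ == "__main__":
    obs = {"used_master_slides": True, "consistent_color_palette": False}
    est = update_competencies(obs)
    # High technical skill plus low visual design skill yields roughly "medium" overall.
    print(est, overall_slide_design(est))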
There are two main reasons why we believe that the ECD framework fits well with the assessment of learning in digital games. First, in digital games, people learn in action (Gee, 2003; Salen & Zimmerman, 2005). That is, learning involves
continuous interactions between the learner and the game, so learning is inherently
situated in context. Therefore, the interpretation of knowledge and skills as the
products of learning cannot be isolated from the context, and neither should assess-
ment. The ECD framework helps us to link what we want to assess and what learn-
ers do in complex contexts. Consequently, an assessment can be clearly tied to
learners' actions within digital games, and can operate without interrupting what
learners are doing or thinking (Shute, 2011 ) .
The second reason that ECD is believed to work well with digital games is
because the ECD framework is based on the assumption that assessment is, at its
core, an evidentiary argument. Its strength resides in the development of perfor-
mance-based assessments where what is being assessed is latent or not apparent
(Rupp, Gushta, Mislevy, & Shaffer, 2010 ) . In many cases, it is not clear what people
learn in digital games. However, in ECD, assessment begins by figuring out just
what we want to assess (i.e., the claims we want to make about learners), and clari-
fying the intended goals, processes, and outcomes of learning.
Accurate information about the student can be used as the basis for (a) deliv-
ering timely and targeted feedback, as well as (b) presenting a new task or quest
that is right at the cusp of the student's skill level, in line with flow theory (e.g., Csikszentmihalyi, 1990) and Vygotsky's zone of proximal development (Vygotsky, 1978).
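As a simple illustration of this adaptive principle (a sketch only; the quest names and difficulty values are invented, and real systems would use richer models), a game could select the next task to sit just above the current competency estimate:

# Pick the next quest at the cusp of the player's current skill estimate.
QUESTS = {"tutorial_cave": 0.2, "river_crossing": 0.5, "clockwork_tower": 0.7, "dragon_siege": 0.9}

def next_quest(skill_estimate: float, stretch: float = 0.1) -> str:
    """Choose a quest slightly harder than the current estimate: hard enough to be just doable."""
    candidates = {q: d for q, d in QUESTS.items() if d > skill_estimate}
    if not candidates:
        return max(QUESTS, key=QUESTS.get)  # already at the top; offer the hardest quest
    return min(candidates, key=lambda q: abs(candidates[q] - (skill_estimate + stretch)))

print(next_quest(0.55))  # -> "clockwork_tower"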
4.4.2 Stealth Assessment
Given the goal of using educational games to support learning in school settings
(and elsewhere), we need to ensure that the assessments are valid, reliable, and
also pretty much invisible (to keep engagement intact). That is where "stealth
assessment" comes in (Shute, 2011 ; Shute, Ventura, Bauer, & Zapata-Rivera,
2009 ) . Very simply, stealth assessment refers to ECD-based assessments that are
woven directly and invisibly into the fabric of the learning environment. During
game play, students naturally produce rich sequences of actions while performing
complex tasks, drawing on the very skills or competencies that we want to assess
(e.g., scientific inquiry skills, creative problem solving). Evidence needed to
assess the skills is thus provided by the players' interactions with the game itself
(i.e., the processes of play), which can be contrasted with the product(s) of an
activity—the norm in educational environments.
Making use of this stream of evidence to assess students' knowledge, skills, and
understanding (as well as beliefs, feelings, and other learner states and traits) pres-
ents problems for traditional measurement models used in assessment. First, in tra-
ditional tests the answer to each question is seen as an independent data point. In
contrast, the individual actions within a sequence of interactions in a game are often
highly dependent on one another. For example, what one does in a particular game
at one point in time affects the subsequent actions later on. Second, in traditional
tests, questions are often designed to measure particular, individual pieces of knowl-
edge or skill. Answering the question correctly is evidence that one may know a
certain fact: one question—one fact. But by analyzing a sequence of actions within
a quest (where each response or action provides incremental evidence about the cur-
rent mastery of a specific fact, concept, or skill), stealth assessments within game
environments can infer what learners know and do not know at any point in time.
Now, because we typically want to assess a whole cluster of skills and abilities from
evidence coming from learners' interactions within a game, methods for analyzing
the sequence of behaviors to infer these abilities are not as obvious. As suggested
above, evidence-based stealth assessments can address these problems.
As a brief example of stealth assessment, Shute et al. (2009) used a commercial video game called Oblivion (i.e., The Elder Scrolls® IV: Oblivion©, 2006, by
Bethesda Softworks) and demonstrated how assessment can be situated within a
game environment and the dynamic student data can be used as the basis for diag-
nosis and formative feedback. A competency model for creative problem solving
was created, which was divided into two parts—creativity and problem solving.
These, in turn, were divided into novelty and efficiency indicators which were tied
to particular actions one could take in the game. Different actions would have
different impacts on relevant variables in the competency model. For instance, if a
player came to a river in the game and dove in to swim across it, the system would
recognize this as a common (not novel) action and automatically score it accord-
ingly (e.g., low on novelty). Another person who came to the same river but chose
to use a spell to freeze the river and slide across would be evidencing more novel
(and efficient) actions, and the score for the creativity variable in the competency
model would be updated accordingly.
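A hypothetical evidence rule for the river-crossing situation might look like the following sketch; the actions and numeric scores are assumptions made for illustration and are not the scoring rules used in the Shute et al. (2009) study:

# Map observed in-game actions for "cross the river" to indicator scores.
ACTION_SCORES = {
    # action: (novelty, efficiency), each on a 0-1 scale (assumed values)
    "swim_across": (0.1, 0.4),             # common, moderately efficient
    "freeze_river_and_slide": (0.9, 0.8),  # rare and efficient
    "walk_to_distant_bridge": (0.3, 0.2),
}

def score_action(action: str) -> dict:
    """Turn a logged action into indicator scores feeding the competency model."""
    novelty, efficiency = ACTION_SCORES.get(action, (0.5, 0.5))
    return {"novelty": novelty, "efficiency": efficiency}

print(score_action("freeze_river_and_slide"))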
The models are updated via Bayesian inference networks (or Bayes nets). That
is, the model of a student's game-play performance (i.e., the "student model") accu-
mulates and represents probabilistic belief about the targeted aspects of skill,
expressed as probability distributions for competency-model variables (Almond & Mislevy, 1999). Evidence models identify what the student says or does that can provide evidence about those skills (Steinberg & Gitomer, 1996) and express in a
psychometric model how the evidence depends on the competency-model variables
(Mislevy, 1994 ) . Task models express situations that can evoke required evidence.
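The following sketch illustrates the kind of belief updating such a network performs for a single competency variable, using Bayes' rule over a discrete distribution; the prior and conditional probabilities are invented for illustration, and a real student model would involve a full network of interrelated variables:

# Update belief about "creativity" (low/medium/high) after observing a novel action.
prior = {"low": 1 / 3, "medium": 1 / 3, "high": 1 / 3}

# P(novel action observed | creativity level) -- assumed values.
likelihood_novel_action = {"low": 0.1, "medium": 0.3, "high": 0.7}

def posterior_after_novel_action(prior: dict, likelihood: dict) -> dict:
    """Apply Bayes' rule: posterior is proportional to prior times likelihood."""
    unnormalized = {k: prior[k] * likelihood[k] for k in prior}
    total = sum(unnormalized.values())
    return {k: v / total for k, v in unnormalized.items()}

if __name__ == "__main__":
    post = posterior_after_novel_action(prior, likelihood_novel_action)
    print(post)  # belief shifts toward "high" after observing a novel action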
One upside of the evidence-based stealth assessment approach relates to its
ability to assess general and content-specific learning in games. That is, stealth
assessment is able to assess a range of attributes—from general abilities or disposi-
tions (e.g., problem solving, creativity, and persistence) to content-specific learning
(e.g., water quality, physics concepts), or even current beliefs.
4.5 Conclusion
At the beginning of this chapter we listed several questions and attempted to answer
them throughout. That is, we (a) described a set of core elements of a well-designed
game distilled from the literature, (b) presented examples of research studies where
games were shown to support learning, and (c) discussed an approach to game-
based learning using stealth assessment techniques. Our stealth assessment approach
involves the use of ECD which enables the estimation of students' competency
levels and further provides the evidence supporting claims about competencies.
Consequently, ECD has built-in diagnostic capabilities that permit a stakeholder
(i.e., the teacher, student, parent, and others) to examine the evidence and view the
current estimated competency levels. This in turn can inform instructional support
or provide valuable feedback to the learner.
While there seems to be a lot of promise in relation to the evidence-based stealth
assessment idea, what are some of the downsides or possible limitations of this
approach? First, Rupp et al. ( 2010 ) noted that when developing games that employ
ECD for assessment design, the competency model must be developed at an appro-
priate level of granularity to be implemented in the assessment. Too large a grain
size means less specific evidence is available to determine student competency, while too fine a grain size means a high level of complexity and increased resources
to be devoted to the assessment. Second, the development costs of ECD-based
assessments can be relatively high for complex competencies. To counter this obsta-
cle, we are currently exploring ways to create stealth assessment models that can be
used in related but different games (i.e., in a plug-and-play manner). Creating such
cross-platform models for digital games would be useful and cost effective for edu-
cators interested in using games for assessment and support of learning. Finally,
some people may not be "into games," and thus there may be individual (or cultural) differences relating to prior game experience or differential interests that affect learning. That is, certain personal or cultural variables may be identified that interact with,
mediate, or moderate the effects of gameplay on learning. This is all valuable future
research to pursue.
In conclusion, the world is changing rapidly but education is not. Preparing our
kids to succeed in the twenty-first century requires fresh thinking on how to foster
new competencies. There's an associated need to design and develop valid and reli-
able assessments of these new skills. We have suggested that ECD should be used
as the framework for developing new assessments that can yield valid measures;
provide accurate estimates of complex competencies embedded in dynamic perfor-
mances; and aggregate information from a variety of sources. We also believe that
well-designed games can serve as one excellent type of learning environment
because games are intrinsically motivating and can facilitate learning of academic
content and twenty-first century competencies within complex and meaningful
environments. Such games can also promote social skills (like communication,
collaboration, negotiation, and perspective taking), higher-order thinking skills (like
problem solving and critical reasoning), and ownership of learning.
Designing evidence-based stealth assessments and weaving them directly within
digital games will allow all kids to become fully engaged, to the point where they
want (perhaps even demand) to play/learn, even outside of school. That is a lovely
vision, especially in contrast with the frequent struggles to get kids to do their
homework.
Acknowledgments We'd like to offer special thanks to Matthew Ventura and Yoon Jeon Kim for
their help on conceptualizing various parts of this paper, regarding the categorization of the seven
core elements of games and game-based assessment issues.
References
Alkan, S., & Cagiltay, K. (2007). Studying computer game learning experience through eye track-
ing. British Journal of Educational Technology, 38 (3), 538–542.
Almond, R. G., & Mislevy, R. J. (1999). Graphical models and computerized adaptive testing.
Applied Psychological Measurement, 23 (3), 223–237.
Anderson, J. R. (1995). Learning and memory: An integrated approach . New York: Wiley.
Barab, S. A., Gresalfi, M., & Ingram-Goble, A. (2010). Transformational play. Educational
Researcher, 39 (7), 525–536.
Barab, S. A., Zuiker, S., Warren, S., Hickey, D., Ingram-Goble, A., Kwon, E.-J., et al. (2007).
Situationally embodied curriculum: Relating formalisms and contexts. Science Education,
91 (5), 750–782.
Bethesda Softworks (2006). The Elder Scrolls IV: Oblivion. Retrieved April 9, 2012, from http://www.bethsoft.com/games/games_oblivion.html.
Bransford, J., Brown, A., & Cocking, R. (2000). How people learn: Brain, mind, experience, and school. Washington, DC: National Academy Press.
Brown, J. S., Collins, A., & Duguid, P. (1989). Situated cognition and the culture of learning.
Educational Researcher, 18 (1), 32–42.
Bruner, J. S. (1961). The act of discovery. Harvard Educational Review, 31 (1), 21–32.
Csikszentmihalyi, M. (1990). Flow: The psychology of optimal experience. New York: Harper Perennial.
Dempsey, J. V., Haynes, L. L., Lucassen, B. A., & Casey, M. S. (2002). Forty simple computer
games and what they could mean to educators. Simulation & Gaming, 33 (2), 157–168.
Dempsey, J. V., Rasmussen, K., & Lucassen, B. (1996). Instructional gaming: Implications for
instructional technology . Paper presented at the annual meeting of the Association for
Educational Communications and Technology, Nashville, TN.
Emes, C. E. (1997). Is Mr Pac Man eating our children? A review of the effect of digital games on
children. Canadian Journal of Psychiatry, 42 (4), 409–414.
Fabricatore, C., Nussbaum, M., & Rosas, R. (2002). Playability in action videogames: A qualitative
design model. Human Computer Interaction, 17 (4), 311–368.
Falmagne, J.-C., Cosyn, E., Doignon, J.-P., & Thiery, N. (2003). The assessment of knowledge, in
theory and in practice. In R. Missaoui & J. Schmidt (Eds.), Fourth international conference on
formal concept analysis (Lecture notes in computer science, Vol. 3874, pp. 61–79). New York:
Springer.
Gee, J. P. (2003). What video games have to teach us about learning and literacy. New York: Palgrave Macmillan.
Gee, J. P. (2009). Deep learning properties of good digital games: How far can they go? In
U. Ritterfeld, M. Cody, & P. Vorderer (Eds.), Serious games: Mechanisms and effects (pp.
65–80). New York: Routledge.
Gonzales, P., Williams, T., Jocelyn, L., Roey, S., Kastberg, D., & Brenwald, S. (2008). Highlights
from TIMSS 2007: Mathematics and science achievement of U.S. fourth- and eighth-grade
students in an international context (NCES 2009–001) . Washington, DC: National Center for
Education Statistics, Institute of Education Sciences, U.S. Department of Education.
Hays, R. T. (2005). The effectiveness of instructional games: A literature review and discussion .
Retrieved May 10, 2006, from http://adlcommunity.net/file.php/23/GrooveFiles/Instr_Game_Review_Tr_2005.pdf.
Hmelo-Silver, C. E. (2004). Problem-based learning: What and how do students learn? Educational
Psychology Review, 16 (3), 235–266.
Hogle, J. G. (1996). Considering games as cognitive tools: In search of effective "Edutainment" .
Retrieved January 12, 2005, from ERIC, ED 425737.
Howard, L. F., Paul, J. H., Marisa, P. P., & Brooke, E. S. (2010). Highlights from PISA 2009:
Performance of U.S. 15-year-old students in reading, mathematics, and science literacy in an
international context (NCES 2011–004) . Washington, DC: National Center for Education
Statistics, Institute of Education Sciences, U.S. Department of Education.
Kafai, Y. B. (2005). The classroom as "living laboratory": Design-based research for understand-
ing, comparing, and evaluating learning science through design. Educational Technology,
65 (1), 28–34.
Ke, F. (2008). A qualitative meta-analysis of computer games as learning tools. In R. E. Ferdig
(Ed.), Handbook of research on effective electronic gaming in education (pp. 1–32). New York:
IGI Global.
Kiili, K. (2007). Foundation for problem-based gaming. British Journal of Educational Technology,
38 (3), 394–404.
Kirkpatrick, G. (2007). Between art and gameness: Critical theory and computer game aesthetics.
Thesis Eleven, 89 , 74–93.
Klopfer, E., Osterweil, S., & Salen, K. (2009). Moving learning games forward: Obstacles, oppor-
tunities & openness . Cambridge, MA: The Education Arcade.
Krapp, A. (1999). Interest, motivation and learning: An educational-psychological perspective.
European Journal of Psychology of Education, 14 (1), 23–40.
Laurel, B. (1991). Computers as theatre . Reading, MA: Addison-Wesley.
Malone, T. W., & Lepper, M. R. (1987). Making learning fun: A taxonomy of intrinsic motivations
for learning. In R. E. Snow & M. J. Farr (Eds.), Aptitude, learning and instruction: III. Cognitive
and affective process analyses (pp. 223–253). Hillsdale, NJ: Erlbaum.
McGonigal, J. (2011). Reality is broken: Why games make us better and how they can change the
world . New York: Penguin Press.
Mislevy, R. J. (1994). Evidence and inference in educational assessment. Psychometrika, 59 ,
439–483.
Mislevy, R. J., & Haertel, G. D. (2006). Implications of evidence-centered design for educational
testing. Educational Measurement: Issues and Practice, 25 (4), 6–20.
Mislevy, R. J., Steinberg, L. S., & Almond, R. G. (2003). On the structure of educational assess-
ments. Measurement: Interdisciplinary Research and Perspectives, 1 , 3–62.
Partnership for 21st Century Skills. (2006). Results that matter: 21st century skills and high school
reform. Retrieved April 28, 2012, from http://www.p21.org/documents/RTM2006.pdf.
Pellegrini, A. D. (1995). The future of play theory: A multidisciplinary inquiry into the contribu-
tions of Brian Sutton-Smith . Albany, NY: State University of New York Press.
Pillay, H. (2002). An investigation of cognitive processes engaged in by recreational computer
game players: Implications for skills of the future. Journal of Research on Technology in
Education, 34 (3), 336–350.
Pillay, H., Brownlee, J., & Wilss, L. (1999). Cognition and recreational computer games: Implications
for educational technology. Journal of Research on Computing in Education, 32 (1), 203.
Prensky, M. (2001). Digital game-based learning . New York: McGraw-Hill.
Prensky, M. (2006). Don't bother me mom, I'm learning!: How computer and digital games are
preparing your kids for 21st century success and how you can help! St. Paul, MN: Paragon
House.
Quinn, C. (2005). Engaging learning: Designing e-learning simulation games . San Francisco:
Pfeiffer.
Randel, J. M., Morris, B. A., Wetzel, C. D., & Whitehill, B. V. (1992). The effectiveness of games
for educational purposes: A review of recent research. Simulation & Gaming, 23 (3), 261–276.
Rieber, L. P. (1996). Seriously considering play: Designing interactive learning environments
based on the blending of microworlds, simulations, and games. Educational Technology
Research and Development, 44 (1), 43–58.
Rupp, A. A., Gushta, M., Mislevy, R. J., & Shaffer, D. W. (2010). Evidence-centered design of
epistemic games: Measurement principles for complex learning environments. The Journal of
Technology, Learning, and Assessment, 8(4). Retrieved April 9, 2012, from http://escholarship.bc.edu/jtla/vol8/4.
Salen, K., & Zimmerman, E. (2005). Game design and meaningful play. In J. Raessens & J. Goldstein
(Eds.), Handbook of computer game studies (pp. 59–80). Cambridge, MA: MIT Press.
Shaffer, D. W. (2005). Studio mathematics: The epistemology and practice of design pedagogy as
a model for mathematics learning . Wisconsin Center for Education Research Working paper,
No. 2005-3.
Shaffer, D. W. (2007). How computer games help children learn . New York: Palgrave.
Shaffer, D. W., Squire, K. A., Halverson, R., & Gee, J. P. (2005). Digital games and the future of
learning. Phi Delta Kappan, 87 (2), 104–111.
Shute, V. J. (2007). Tensions, trends, tools, and technologies: Time for an educational sea change.
In C. A. Dwyer (Ed.), The future of assessment: Shaping teaching and learning (pp. 139–187).
New York: Lawrence Erlbaum/Taylor & Francis.
Shute, V. J. (2011). Stealth assessment in computer-based games to support learning. In S. Tobias
& J. D. Fletcher (Eds.), Computer games and instruction (pp. 503–524). Charlotte, NC:
Information Age.
Shute, V. J., Rieber, L., & Van Eck, R. (2011). Games … and … learning. In R. Reiser &
J. Dempsey (Eds.), Trends and issues in instructional design and technology (3rd ed.,
pp. 321–332). Upper Saddle River, NJ: Pearson Education.
Shute, V. J., Ventura, M., Bauer, M. I., & Zapata-Rivera, D. (2009). Melding the power of serious
games and embedded assessment to monitor and foster learning: Flow and grow. In
U. Ritterfeld, M. Cody, & P. Vorderer (Eds.), Serious games: Mechanisms and effects
(pp. 295–321). Mahwah, NJ: Routledge, Taylor and Francis.
Squire, K. (2004). Replaying history: Learning world history through playing Civilization III.
ProQuest Dissertations, Indiana University.
Steinberg, L. S., & Gitomer, D. G. (1996). Intelligent tutoring and assessment built on an under-
standing of a technical problem-solving task. Instructional Science, 24 , 223–258.
Suits, B. H. (1978). The grasshopper: Games, life and utopia . Toronto, ON: University of Toronto Press.
Sweetser, P., & Wyeth, P. (2005). GameFlow: A model for evaluating player enjoyment in games.
ACM Computers in Entertainment, 3 (3), 1–24.
Torres, R. J. (2009). Using Gamestar Mechanic within a nodal learning ecology to learn systems
thinking: A worked example. International Journal of Learning and Media, 1 (2), 1–11.
Vogel, J. F., Vogel, D. S., Cannon-Bowers, J., Bowers, C. A., Muse, K., & Wright, M. (2006).
Computer gaming and interactive simulations for learning: A meta-analysis. Journal of
Educational Computing Research, 34 (3), 229–243.
Vygotsky, L. S. (1978). Mind in society: The development of higher mental processes . Cambridge,
MA: Harvard University Press.
Vygotsky, L. S. (1987). The collected works of L. S. Vygotsky . New York: Plenum.
Wolfe, J. (1997). The effectiveness of business games in strategic management course work.
Simulation & Gaming, 28 (4), 360–376.
Yee, N. (2006). The demographics, motivations, and derived experiences of users of massively
multi-user online graphical environments. Presence: Teleoperators and Virtual Environments,
15 (3), 309–329.
Zimmerman, B. J., & Schunk, D. H. (2001). Self-regulated learning and academic achievement:
Theoretical perspectives . Mahwah, NJ: Lawrence Erlbaum.
... Some of the skills that can be assessed based on gameplay data are teamwork ability [19,21,54], language proficiency [19,54,56], financial investment skills [19], math fluency [24,51,54,56], ICT skills [54,57], creative problem solving [45,51], spatial navigation [58], fine motor skills [51], metacognition and systems thinking [51], memory retention [10,45,59], cultural knowledge [41], and the understanding of specific science concepts, such as Newtonian mechanics [11,60]. Approaches for game-based assessment can also allow to track a user's cognitive development and learning trajectories over time [51,61,62,63] and to examine specific gaps in knowledge [46,63] or learning difficulties, such as reading problems and dyscalculia [61,64]. ...
... As "transformative learning tools" which often cover aspects of human history, economy, geography, culture, technology, and war[11], video games can be intentionally designed to propagandize populations and influence users' political leanings, functioning as an "interactive influence medium"[12] or "radicalising medium"[13]. ...
With many million users across all age groups and income levels, video games have become the world's leading entertainment industry. Behind the fun experience they provide, it goes largely unnoticed that modern game devices pose a serious threat to consumer privacy. To illustrate the industry's potential for illegitimate surveillance and user profiling, this paper offers a classification of data types commonly gathered by video games. Drawing from patents and literature of diverse disciplines, we also discuss how patterns and correlations in collected gameplay data may leak additional information in ways not easily understood or anticipated by the user. This includes inferences about a user's biometric identity, age and gender, emotions, skills, interests, consumption habits, and personality traits. Based on these findings, we argue that video games need to be brought into the focus of privacy research and discourse. Considering the granularity and enormous scale of the data collection taking place, this industry deserves the same level of scrutiny as other digital services, such as search engines , dating apps, or social media platforms. The knowledge compiled in this paper can serve as a basis for privacy impact assessments, consumer education, and further research into the societal impact of video games.
... Xu found that vocabulary was the most prevalent language skill studied, and gameful elements lacked consistency amongst the digital games used. The design categorization used in this review was adapted from Shute [28], who well-categorized what the user should experience but did not identify the specific elements needed to accomplish these goals. ...
... To reduce redundancy with previously conducted research, the data extraction and coding focused on the game design elements using a framework that could identify specific features. Other game-based learning reviews used more general design models such as Shute [28] and ARCS [139]. For example, ARCS, which stands for attention, relevance, confidence, and satisfaction, is a detailed model and contains several design elements needed to create motivational environments. ...
Considerable changes have occurred in language learning with the introduction of gameful approaches in the classroom and the increase in the popularity of language applications like Duolingo. A review of existing studies on such approaches to language learning shows that gamification tends to be the most popular approach. However, this popularity has been achieved at the expense of other gameful approaches, such as the use of digital games. To gain a clearer picture of the developments and gaps in the digital game-based learning research, this paper examines and categorizes observations about game elements used in published papers (n = 114) where serious and digital games were tested in language education settings. Game element analysis reveals that (1) the most frequently occurring elements in digital game-based language learning (DGBLL) are feedback, theme, points, narrative, and levels; (2) even though there was significant variance in the number of elements observed in DGBLL, both the bespoke and off-the-shelf games show similar high-frequency elements; (3) DGBLL has been applied to vocabulary acquisition and retention in many cases, but lacks implementation and testing in input and output language skills; (4) although there is some consensus on the most frequent elements, the design patterns of common elements according to age group and target language skill show considerable variance; (5) more research is needed on less common design elements that have shown promise in encouraging language acquisition. The synthesis of information from the collected papers contributes to knowledge regarding DGBLL application design and will help formulate guidelines and detect efficacy patterns as the field continues to grow.
... Gamification is the use of distinct game building blocks embedded in real world contexts (Dreimane, 2018, p.454). It is useful as a pedagogical strategy because it meets the requirements of situated learning, which states that learning must be active, goal-oriented, contextualized, and interesting (Brown, Collins & Duguid 1989;Shute & Ke, 2012). These requirements can be met by designing a game which has an interactive environment, provides some feedback, catches the students' attention, ...
-
- M. Paz Trillo Miravalles
Antecedentes: La familia es un sistema social que cumple funciones trascendentales en el desarrollo de sus miembros. Los progenitores son componentes relevantes del mismo y se preguntan qué hacer y cómo actuar ante las situaciones surgidas con sus hijos. El objetivo de esta investigación ha sido explorar las temáticas de contenidos relacionadas con las preocupaciones y necesidades formativas de los progenitores en relación a la crianza y educación de sus hijos. Método: Se ha seguido el procedimiento del muestreo no probabilístico tipo bola de nieve. Participaron 58 sujetos. El diseño de investigación ha sido cuantitativo, no experimental, transeccional y descriptivo. Se han realizado análisis cualitativos de contenido. Resultados: Las principales categorías emergidas fueron: emociones, centro escolar, futuro de los hijos, relación padres-hijos, problemas de comportamiento, riesgo para la salud, TIC y valores. Conclusiones: Es pertinente aplicar el instrumento a una muestra más amplia. Para guiar el diseño de futuras intervenciones de educación parental es oportuno conocer qué necesitan aprender actualmente los padres y las madres.
... Also, most of these studies used ex situ data such as questionnaires and test scores (Kang et al., 2017) for performance assessment, as well as non-established questionnaires for perception evaluation, raising concerns about the validity of the outcomes. The current study therefore aims to bridge this knowledge gap and go a step further by evaluating the perceptions of higher education students using an established technology acceptance model and gameplay log data, which are considered more valid and reliable for performance assessment (Shute & Ke, 2012). ...
The growing interest in the use of digital games for education has resulted in the expansion of the field of game-based learning. There has been considerable research on the perceptions and attitudes of students towards the use of games for learning. These studies have tried to understand what students make of the use of digital games for learning, as it is believed that the views of users and their acceptance of new technologies play a crucial role in ensuring successful outcomes. However, it is unclear whether there is any relationship between experiences, perceptions towards games, and gameplay performance in a learning game. Understanding this relationship is important for game developers to design and develop games effectively, and for educators to determine how best to deploy games for educational purposes. This study examines how the experiences and perceptions of engineering students towards digital games for engineering education influence their use of, and performance in, a serious game called CosmiClean. Findings suggest that while students are enthusiastic about digital learning games, there was no relationship between their perceptions of games for learning and their gameplay performance. However, a relationship was found between students' game experiences and their gameplay performance.
... Nonetheless, the use of external assessment in DGBL has been criticised as being isolated from the learning context. It is believed that these forms of assessment miss the opportunity for the performance-based assessment afforded by the game, and hence fail to measure more complex skills and competencies that are otherwise difficult to measure (Groff, 2018; Shute & Ke, 2012). As an alternative, in-game assessment methods such as game scoring (Bellotti et al., 2013; Moseley, 2013), log data analysis (Kerr & Chung, 2012; Loh, Sheng, & Li, 2015; Westera, Nadolski, & Hummel, 2014), and integrated or stealth assessment (Almond, 2015; Kim, Almond, & Shute, 2016; Shute, Wang, Greiff, Zhao, & Moore, 2016) have been proposed for game-based learning. ...
Digital games are interactive, which makes them highly engaging for players. The adoption and use of digital games in higher education are on the rise, with many researchers and educators developing and deploying them in classrooms. As games are a relatively new pedagogical tool, some aspects of their use for learning, such as the measurement and assessment of learning, are still under research. Although assessment of performance and learning in digital games is commonly done with pre- and post-game tests, interest is growing in the use of gameplay log data as an alternative and valid means of measuring student performance in digital games. A few studies have utilized log data to measure students' general knowledge and skills, but limited studies exist in which game log data were used to measure domain-specific competencies. This empirical study describes the use of game log data for measuring the behaviours and performance of engineering students in the CosmiClean game, a serious game designed to teach the principles of separation and recycling operations. Using data from first-year engineering students at two European institutions, sequential behaviour pattern analysis and performance assessment of students' solutions in the game are presented. The findings of this study highlight the behaviours and gameplay strategies of students in the game environment, which would be particularly useful to game designers, educators, and researchers in the field of game-based learning.
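As a rough illustration of the gameplay log analysis these two studies describe, the following sketch turns time-stamped events into per-student action sequences and counts first-order transitions between actions. It is a sketch under assumed data only: the field layout and action names are hypothetical and do not reflect the actual CosmiClean log format or the authors' analysis pipeline.

```python
# A minimal sketch of sequential behaviour pattern analysis on gameplay logs.
# The log format and action labels are hypothetical, not the CosmiClean schema.
from collections import Counter, defaultdict

log = [  # (timestamp, student_id, action)
    (1, "s1", "select_magnet"), (2, "s1", "run_separation"), (3, "s1", "inspect_output"),
    (4, "s2", "select_sieve"),  (5, "s2", "run_separation"), (6, "s2", "restart_level"),
    (7, "s1", "select_sieve"),  (8, "s1", "run_separation"),
]

# Group events into an ordered action sequence per student
sequences = defaultdict(list)
for _, student, action in sorted(log):   # sort by timestamp
    sequences[student].append(action)

# Count first-order transitions (action -> next action) across all students
transitions = Counter()
for actions in sequences.values():
    transitions.update(zip(actions, actions[1:]))

for (a, b), n in transitions.most_common():
    print(f"{a} -> {b}: {n}")
```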
The COVID-19 pandemic has changed how millions around the globe are educated. Subsequent waves of the disease have again made learning in classrooms unsafe, and many schools have been forced to send their students home to take online classes under government lockdown protocols. For many young learners, engaging with school is a significant part of their well-being, which has been compromised by the extended period of remote learning and low levels of social interaction during the pandemic. New and innovative solutions to address learners' needs have been called for during this pandemic. The Presentria GO system is an innovative solution that enables students from K-12 to higher education to learn experientially from their cars during a city excursion. Through a survey of 74 educators and a series of expert interviews and focus group discussions, insights into the feasibility of this active learning mode are explored. This paper proposes the concept of 'In-Car Location-Based Experiential Learning' as one method of engaging students during the pandemic and beyond.
Understanding the global water cycle is fundamental to Earth systems literacy and to fostering an informed citizenry; however, students often struggle with terminology, the role of key processes, and estimating physical and temporal scales, leaving knowledge gaps that impair comprehension. The Hydrologic Cycle Game is a pedagogical tool for teaching students about the global water cycle through game-based learning. It familiarizes students with terminology related to transport, fluxes, and storage; uses box models to help them understand complex cycles and visualize invisible processes; and introduces the concept of residence time. It was developed for university undergraduates but could be used in other educational settings. When deploying this activity in class, it is helpful to introduce students to the vocabulary and the concept of box models prior to gameplay; a short lecture or prerecorded video was sufficient. One game typically takes 5-10 min to play, and, in the author's experience, engagement increases when students have the opportunity to play multiple times. Student comprehension of terminology, connections, and directions of flow was assessed using pre- and posttests. Scores increased significantly (p < 0.05) after gameplay with a large effect size (d > 0.8), and learning gains persisted through mid-semester evaluations. The data collected indicate that the Hydrologic Cycle Game is an effective tool for teaching the global water cycle. Supplemental data for this article is available online at https://doi.org/10.1080/10899995.2021.1977030
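For readers who want to reproduce this kind of pre/post comparison, the sketch below computes a paired t-test and Cohen's d for paired designs (often written d_z). The scores are made up for illustration and are not the study's data.

```python
# A minimal sketch of the pre/post analysis: paired t-test plus Cohen's d
# for paired samples. The scores below are illustrative, not the study's data.
from statistics import mean, stdev
from scipy import stats

pre  = [4, 5, 3, 6, 5, 4, 7, 5]   # pre-test scores (hypothetical)
post = [7, 8, 6, 8, 7, 7, 9, 8]   # post-test scores for the same students

result = stats.ttest_rel(post, pre)          # paired (related-samples) t-test
gains = [b - a for a, b in zip(pre, post)]
d = mean(gains) / stdev(gains)               # Cohen's d for paired designs (d_z)

print(f"t = {result.statistic:.2f}, p = {result.pvalue:.4f}, d = {d:.2f}")
```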
- Andrea Maria Pfändler
Surveys and tests have shown deficits in the financial literacy (or financial competence) of young adults; this cohort often lacks financial knowledge and skills. While interventions can increase financial knowledge, they frequently do not produce sustainable long-term improvements in financial competence. For this reason, a financial literacy board game has been developed that links emotional, motivational, and cognitive processes to encourage the sustainable development of financial competence. The developmental objective was a holistic tool in which financial competence, as well as the corresponding emotional-motivational facets of financial issues, plays a central role. The game is based on a competency model of financial literacy that takes decision-making into account. As a result, the game incorporates the relevant aspects of personal finance and their interrelationships, as well as the necessary mathematical, personal, and social competencies that influence financial decision-making. The game also considers personal incentives for financial decisions based on the latest findings in the field of happiness research. This paper presents the theoretical foundations of serious games, how they have been implemented in this game (with a special focus on the game mechanics), and the results from the game's pilot testing phase. In brief, usability testing has shown that the game is perceived as fun, creating excitement and flow, while also generating interest in, and leading to discussion about, financial literacy topics.
This publication is part of the Designing Future Innovative Learning Spaces project (Design FILS), funded by the European Union's Erasmus+ KA2 Cooperation for Innovation and the Exchange of Good Practices under grant agreement number 2019-1-TR01-KA201-076567. - Often stereotyped, traditional teaching is characterized by a pedagogical delivery model taking place in a standardized, fixed classroom. Current teaching practices show that many teachers want to shift to a different paradigm with less pedagogical sameness, facilitating personalized, student-centered, and active learning while aiming at building future skills. In this study, the parameters needed to bring active learning into practice are explored. The physical design of the space and the use of educational technology are critical components that support active learning pedagogy. The academic literature on the three pillars of active learning (pedagogy, space design, and technology) forms the theoretical and methodological basis for defining strategies and recommendations on the key aspects of teaching in future innovative learning spaces.
This publication is part of the Designing Future Innovative Learning Spaces (Design FILS) project, funded by the European Union's Erasmus+ KA2 Cooperation for Innovation and the Exchange of Good Practices under grant agreement number 2019-1-TR01-KA201-076567. The document is the result of joint work by the Republic of Turkey Ministry of National Education, European Schoolnet, Universidade de Lisboa, Future Learning Lab Wien, Hacettepe Üniversitesi, Centro Autonómico de Formación e Innovación, and Zakladni Skola Dr. Edvarda Benese. Further information about the Design FILS project and its partners is available at http://designfils.eba.gov.tr. The content of the publication is entirely the responsibility of the authors/project consortium, and the European Commission cannot be held responsible for any use of the information contained herein. The publication is made available under the terms of the Creative Commons Attribution-NonCommercial (CC-BY-NC) license. Abstract: Traditional teaching is defined as a pedagogical delivery model that takes place in a standardized, fixed classroom. A look at today's classrooms shows that many teachers now aim to change their teaching practices, using different pedagogies to enable personalized, student-centered, and active learning while also developing students' future-oriented skills. In this study, different parameters for putting active learning into practice are explored. Learning space design and the use of educational technology are critical components that support active learning pedagogy. Academic work on the three core components of active learning (pedagogy, learning space design, and technology) forms the theoretical and methodological basis for defining strategies and recommendations on the key dimensions of teaching in future innovative learning spaces. Keywords: active learning pedagogy, learning space design, educational technology
This book discusses why, and how, to design 'serious games'.
Contemporary research has indicated that students enjoy playing computer games. As a consequence, recreational computer games are becoming an increasingly significant part of students' lives. At the same time, the use of educational software in schools is increasing. It has been proposed that playing recreational computer games may facilitate cognitive processes such as forming complex mental representations and making inferences. In this study, a qualitative approach was adopted to determine the cognitive processes students engaged in while playing recreational computer games, with a view to determining the validity of incorporating computer game features into educational software. Twenty-one high school students participated. Results indicated that players practiced complex cognitive processes such as interpreting explicit and implicit information, inductive reasoning, metacognitive analysis, and problem solving.
An Education Arcade paper (research report).
Source: https://www.researchgate.net/publication/283395597_Games_Learning_and_Assessment