STRONG Project Proposal

Statement of Problem


Hands-on inquiry learning without domain knowledge merely entertains students and may result in inadequate conceptual understanding. Many resource-deprived students arrive at school with limited cognitive skills and are consequently less motivated to learn (Balasubramanian et al., 2005). At the same time, direct instruction that imparts domain knowledge in sterile learning environments leaves learners unenlightened and unable to see its real-world relevance (Wilson, 1997).


Using developmentally appropriate STEM concepts and standards outlined in the Benchmarks for Science Literacy (AAAS, 1993), we will identify student misconceptions about one concept and embed them in STRONG to foster a deliberate STOP -> REFLECT -> THINK -> ACT approach. This approach rekindles players' intentionality and their inherent preference for goal-oriented actions before launching them into active, hands-on inquiry learning.


Rationale


Learning is affected by both motivational and cognitive factors. An expectation failure, or the cognitive dissonance triggered by a discrepant event, can heighten learners' motivation. With increased motivation to learn and heightened domain knowledge, more resource-deprived learners will be able to access, use, and learn from meaningful hands-on inquiry activities.

Outline and Justification of Technical Approach


Although STRONG may only succeed in eliciting students' rudimentary and incomplete conceptual understanding, it is designed to rekindle players' intentionality, build their domain knowledge, and thereby launch them into active inquiry learning (Balasubramanian, 2003).


As the first task for all students, STRONG is designed to increase learners' domain knowledge and motivation by stimulating thoughtful conversation in non-threatening, low-stress, high-challenge, small-group settings. Players' pre-reflections typed into the online chat or reflection spaces will not be assessed, but their selections from the drop-down menus in the text box slots will be assessed as described below.




Implementation Plan


The STRONG interface that we will implement is composed of three distinct parts: a gaming space, a reflection space, and a question/answer assessment area. The gaming space is where the scenario unfolds as the players explore STRONG. The reflection space allows students to record thoughts, ideas, and notes. Lastly, the question/answer assessment area poses reflection questions to the players as ranking tasks. The options in each ranking question (1, 2, 3, ...) include correct and common answers along with known student misconceptions, presented in drop-down menus to assess players' knowledge of the concept after they have explored and played STRONG.
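
A minimal sketch of how such a ranking task might be represented internally (Python, purely illustrative; the class and field names, and the bridge scenario in the example, are our own assumptions rather than part of STRONG):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class RankingOption:
    """One entry offered in a drop-down slot of a ranking question."""
    text: str
    is_misconception: bool = False  # flags options drawn from known student misconceptions

@dataclass
class RankingQuestion:
    """A reflection question posed to players as a ranking task."""
    prompt: str
    options: List[RankingOption]
    expert_ranking: List[int] = field(default_factory=list)  # indices into options, best first

# A hypothetical question mixing correct ideas with a common misconception.
question = RankingQuestion(
    prompt="Rank these explanations of why the bridge held the load, best first.",
    options=[
        RankingOption("The triangular truss spreads the load across members"),
        RankingOption("Heavier materials are always stronger", is_misconception=True),
        RankingOption("A wider base lowers the centre of gravity"),
    ],
    expert_ranking=[0, 2, 1],
)
```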

STRONG requires minimal teacher intervention during play because students' selected responses from the text box fields are recorded, processed, and assessed continuously, as an aggregate of each student's responses, during the 15-20 minutes of play.
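
How this continuous, teacher-free recording of selections could look in code (again a sketch; `SelectionLog` and its methods are hypothetical names, not STRONG's actual implementation):

```python
import time
from collections import defaultdict
from typing import Dict, List, Optional, Tuple

class SelectionLog:
    """Records every drop-down selection a player commits during a session."""

    def __init__(self) -> None:
        # player_id -> question_id -> list of (timestamp, submitted ranking)
        self._log: Dict[str, Dict[str, List[Tuple[float, List[int]]]]] = \
            defaultdict(lambda: defaultdict(list))

    def record(self, player_id: str, question_id: str, ranking: List[int]) -> None:
        self._log[player_id][question_id].append((time.time(), ranking))

    def latest(self, player_id: str, question_id: str) -> Optional[List[int]]:
        entries = self._log[player_id][question_id]
        return entries[-1][1] if entries else None

# The game client would call record() each time a player commits a selection
# during the 15-20 minutes of play; no teacher intervention is needed.
log = SelectionLog()
log.record("player_1", "q1", [1, 0, 2])
log.record("player_1", "q1", [0, 2, 1])   # the player revises the ranking
print(log.latest("player_1", "q1"))        # -> [0, 2, 1]
```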


When both players have submitted their responses, the embedded critics in STRONG will offer suggestions to each player in the reflection spaces based on their initial responses, and will also display expert solutions (rankings) to the problems.
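
The proposal does not specify how the critics compare a player's ranking with the expert ranking; one plausible sketch uses a simple pairwise-agreement score (the function names and feedback wording below are illustrative assumptions):

```python
from itertools import combinations
from typing import List

def pairwise_agreement(player: List[int], expert: List[int]) -> float:
    """Fraction of option pairs the player ordered the same way as the expert."""
    pos_p = {opt: i for i, opt in enumerate(player)}
    pos_e = {opt: i for i, opt in enumerate(expert)}
    pairs = list(combinations(expert, 2))
    agree = sum((pos_p[a] < pos_p[b]) == (pos_e[a] < pos_e[b]) for a, b in pairs)
    return agree / len(pairs)

def critic_feedback(player: List[int], expert: List[int]) -> str:
    """Turn the agreement score into a short suggestion for the reflection space."""
    score = pairwise_agreement(player, expert)
    if score == 1.0:
        return "Your ranking matches the expert solution."
    if score >= 0.5:
        return "You are close; compare your top choice with the expert ranking."
    return "Re-examine the scenario; several of your rankings differ from the expert solution."

print(critic_feedback(player=[1, 0, 2], expert=[0, 2, 1]))
```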


The underlying STRONG system architecture follows a six-step data mining and knowledge discovery process, using embedded fuzzy logic and machine learning techniques to provide the necessary feedback to learners in the STRONG reflection space based on their text field choices. Such dynamic feedback promotes players' active learning (Cios et al., 1998).
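
We do not spell out the fuzzy rules here, but the basic idea of mapping a crisp agreement score onto graded feedback might look like the sketch below (the triangular membership functions and feedback templates are invented for illustration; this is not the Cios et al. methodology itself):

```python
from typing import Dict

def fuzzy_memberships(score: float) -> Dict[str, float]:
    """Membership of an agreement score in [0, 1] in three fuzzy sets."""
    low    = max(0.0, 1.0 - score / 0.5)             # peaks at 0.0, reaches 0 by 0.5
    medium = max(0.0, 1.0 - abs(score - 0.5) / 0.5)  # peaks at 0.5
    high   = max(0.0, (score - 0.5) / 0.5)           # rises after 0.5, peaks at 1.0
    return {"low": low, "medium": medium, "high": high}

def feedback(memberships: Dict[str, float]) -> str:
    """Choose the feedback template for the strongest fuzzy set."""
    templates = {
        "low":    "Revisit the scenario in the gaming space before answering again.",
        "medium": "Some of your choices conflict; discuss them with your partner.",
        "high":   "Your understanding looks solid; try the next challenge.",
    }
    return templates[max(memberships, key=memberships.get)]

print(feedback(fuzzy_memberships(0.33)))  # -> the "medium" suggestion
```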

Drawing on the experience of our initial prototyping phase, we have learned the importance of user interface design and of databases, and gained a sense of how tractable their implementation will be. We have drawn a rapid prototype of the user interface; our challenge now is to adapt it to practical and aesthetic guidelines. During its construction we will follow several constraints and design principles, including but not limited to the following (Thorsen, 2003):
  • We will continually orient users, explaining the features embedded in STRONG
  • We will enhance text with graphics and interactivity
  • We will use a minimalist style and eliminate superfluous items
  • We will provide emphasis where needed

As for the implementation of the database, we have opted to keep our questions as ranking tasks for the time being to simplify implementation. We may eventually find a good text-parsing approach capable of recognizing a variety of free-text answers and labeling them as correct or incorrect, but for now we are keeping things simple.
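
Because every answer is a selection from a drop-down menu rather than free text, the database can stay small and scoring reduces to comparing the chosen rank with the expert rank. A minimal relational sketch (SQLite through Python, with table and column names of our own choosing):

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # in-memory database, for illustration only
conn.executescript("""
    CREATE TABLE questions (
        question_id INTEGER PRIMARY KEY,
        prompt      TEXT NOT NULL
    );
    CREATE TABLE options (
        option_id        INTEGER PRIMARY KEY,
        question_id      INTEGER NOT NULL REFERENCES questions(question_id),
        text             TEXT NOT NULL,
        is_misconception INTEGER NOT NULL DEFAULT 0,
        expert_rank      INTEGER NOT NULL           -- 1 = best, per the expert solution
    );
    CREATE TABLE responses (
        response_id  INTEGER PRIMARY KEY,
        player_id    TEXT NOT NULL,
        option_id    INTEGER NOT NULL REFERENCES options(option_id),
        chosen_rank  INTEGER NOT NULL,              -- the rank the player assigned
        submitted_at TEXT DEFAULT CURRENT_TIMESTAMP
    );
""")
```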

From our teaching and learning experience, we know students are more motivated and interested in learning when they encounter problems with counterintuitive solutions. We deliberately select such problems and scenarios to bring students' misconceptions to the surface in STRONG. Because individuals remember unfinished tasks better (the Zeigarnik effect), STRONG presents challenging questions before players interact with the action space, stimulating their thinking about the problems and possible solutions and helping to dispel their misconceptions.


To examine the effectiveness of STRONG, we will administer pre- and post-tests to check student understanding of the concept.
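
As a simple illustration of how such a comparison could be reported, the sketch below computes the mean score gain over students who took both tests (the scores and the `mean_gain` helper are hypothetical, not part of the proposal's evaluation design):

```python
from typing import Dict

def mean_gain(pre: Dict[str, float], post: Dict[str, float]) -> float:
    """Average post-minus-pre score over students present in both tests."""
    common = pre.keys() & post.keys()
    return sum(post[s] - pre[s] for s in common) / len(common)

# Hypothetical percent-correct scores before and after playing STRONG.
pre_scores  = {"s1": 40.0, "s2": 55.0, "s3": 35.0}
post_scores = {"s1": 70.0, "s2": 65.0, "s3": 60.0}
print(f"Mean gain: {mean_gain(pre_scores, post_scores):.1f} points")
```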


References


American Association for the Advancement of Science (AAAS). (1993). Benchmarks for science literacy. New York: Oxford University Press.


Balasubramanian, N. (2003, June 2). Smart education: Blending subject expertise with the concept of career development for effective classroom management. Retrieved October 10, 2004, from University of Georgia, Instructional Technology Forum (ITFORUM) Web site: http://it.coe.uga.edu/itforum/paper73/paper73.html



Balasubramanian, N., Wilson, B. G., & Cios, K. J. (2005, July). Innovative methods of teaching and learning science and engineering in middle schools. Paper accepted for presentation at the joint meeting of the 3rd International Conference on Education and Information Systems, Technologies and Applications (EISTA 2005) and the International Conference on Cybernetics and Information Technologies, Systems and Applications (CITSA 2005), to be held in Orlando, USA.


Bransford, J. D., Brown, A. L., Cocking, R. R., Donovan, M. S., Bransford, J. D., & Pellegrino, J. W. (2000). How people learn: Brain, mind, experience, and school (Expanded ed.). Washington, D.C.: National Academy Press.


Cios, K. J., Pedrycz, W., & Swiniarski, R. W. (1998). Data mining methods for knowledge discovery. Boston: Kluwer Academic Publishers.



Gee, J. P. (2003). What video games have to teach us about learning and literacy. New York: Palgrave Macmillan.


McDonald, K. K., & Hannafin, R. D. (2003). Using web-based computer games to meet the demands of today's high-stakes testing: A mixed-methods inquiry. Journal of Research on Technology in Education, 35(4), 459-472.



Mitchell, A., & Savill-Smith, C. (2004). The use of computer and video games for learning: A review of literature. London: Learning and Skills Development Agency.


Randel, J. M., Morris, B. A., Wetzel, C. D., & Whitehill, B. V. (1992). The effectiveness of games for educational purposes: A review of recent research. Simulation & Gaming, 23(3), 261-276.


Thorsen, C. (2003). TechTactics: Instructional models for educational computing. Boston: Allyn & Bacon.


Wilson, B. G. (1997). The postmodern paradigm. In C. R. Dills & A. J. Romiszowski (Eds.), Instructional development paradigms (pp. 297-309). Englewood Cliffs, NJ: Educational Technology Publications.

