Developing Professional Development as Part of the Community Engagement Project.

The community engagement project challenges students to create a professional development session to be presented at a conference of the student's choosing. As part of building effective digital age environments, as prescribed by ISTE Standard for Coaches #3, I chose to create an interactive session focused on active learning and digital collaboration tools to improve current practices in nutrition education. Technology in nutrition education currently has limited uses but impactful potential. Although nutrition information is plentiful in the digital world, the approach of dietitians and nutritionists has been to increase their presence through blogs, social media, and videos (such as those on YouTube), while the Academy of Nutrition and Dietetics (AND), the representative organization for all dietitians, has focused its efforts on instilling a code of ethics and providing information on privacy in the digital workplace. These efforts may help mitigate nutrition misinformation but are often one-sided or engage only limited populations. For example, blogs may allow comments but do not allow for active engagement with the blog topics, nor do they take into account implementation at a local level. Social media platforms such as Facebook, Pinterest, and Twitter allow nutritionists' voices to be heard but rarely offer collaborative engagement with other experts or communities. The solution is relatively simple, as the digital tools mentioned offer plenty of room for continued collaboration among participants at any level (local or global).

The Academy itself recognizes the potential of technology in nutrition and has published a practice paper on nutrition informatics. Nutrition informatics is a relatively new field in dietetics that addresses technology's role in health practices. The Academy discusses the potential pros and cons for each of the various practice fields in dietetics (clinical, food services, education/research, community, consultation/business) and technology's potential for growth in each of those areas. In education specifically, the Academy recognizes uses in distance learning, student progress tracking, specialty testing for licensing and certification, and professional course development. However, it does not mention the need for collaboration or for engaging the various audiences requiring nutrition education.

In order to bridge this gap and address the ISTE Coaching Standard, the topic for this professional development proposal focuses on building better nutrition education through digital collaboration tools. The goal of this session is to explore the benefits of active learning through technology aids (EdTech) and to implement tools into existing lesson plans with the following objectives in mind:

  • a) Understand and/or review the importance of active learning (evidence-based practice)
  • b) Become familiar with collaborative EdTech tools
  • c) Engage with EdTech tool selection criteria and best practices
  • d) Explore ways to incorporate digital tools into lesson plan scenarios

Professional Development Session Elements

In this one-hour session, participants will be invited to explore the main topic through both face-to-face and online collaboration as the entire group navigates a website developed specifically for the presentation. Since all of the major content is available to them online, there is no need for note-taking, allowing participants to remain engaged throughout the session. Elements of the session include a pre-session technology self-assessment, an online group discussion via Padlet, think-pair-share activities, and self-reflection pieces submitted during and after the session. More details on these elements are provided below.

Length. The Academy hosts local sub-organizations in each state. I chose to develop this professional development session for local dietitians and nutrition educators, with the opportunity to present at the local education conference held annually. The requirements of this local organization state that all educational sessions must be a minimum of one hour long in order to meet the CEU (continuing education unit) minimum for registered dietitians. Considering that the DEL program devotes entire classes to active learning and digital tools, the length will limit the depth of information presented. However, the ability to continually collaborate with both participants and presenter will allow for continued resource sharing after the session has ended.

Active, engaged learning with collaborative participation. Participants will be encouraged to participate and collaborate before, during, and after the session for a full engagement experience. The audience will be asked to intermittently review certain elements of the presentation website (available here) as they discuss key points with the participants next to them. See figure 1.1 for lesson plan details.

Building Better Nutrition Education Through Digital Collaboration Tools
Objectives

Session Goal: Introduce ways to incorporate digital collaboration tools into existing nutrition education lesson plans.

Learning Objectives: At the end of the session participants will:

  • a) Understand and/or review the importance of active learning (evidence-based practice)
  • b) Become familiar with collaborative EdTech tools
  • c) Engage with EdTech tool selection criteria and best practices
  • d) Explore ways to incorporate digital tools into lesson plan scenarios
Performance Tasks

  • Participants will complete a self-assessment prior to the session
  • Participants will demonstrate understanding of active learning by submitting an informal Google Form quiz during the session
  • Participants will engage with collaborative EdTech tools by submitting responses during the session
  • Participants will identify their own digital tool needs by completing a case scenario
  • Participants will submit a self-reflection via Flipgrid after the session
Plan Outline

  • Session Introduction (5 mins)
    • Prompt and Participation: Padlet Q&A: Describe a time you attended a great education session. What made that session great?
    • Review of self-assessment (completed prior to session)
  • Importance of active learning: evidence-based practice (5-10 mins)
    • Review of evidence: Google Form quiz (embedded in site)
  • How can digital tools help? (5-10 mins)
  • Choosing the right digital tool (10 mins)
    • Triple E Framework rubric
    • Criteria for choosing the right digital tool
  • Tips on incorporating tools into existing lesson plan (10 mins)
    • Video Tutorial (take home message/resource)
  • Active practice (10 mins)
    • Case scenarios: Flipgrid response
    • Flipgrid self-reflection
  • Questions (5 mins)

Total session length: 60 mins.

Figure 1.1 “Building Better Nutrition Education through Digital Tools” Session Lesson Plan.

Before the presentation, participants will be invited to complete a Google Forms self-assessment poll addressing their comfort and knowledge with technology tools as well as their current use of technology tools in practice. During the presentation, the audience will be prompted to participate in think-pair-share activities as well as respond to collaboration prompts on Padlet, Google Forms, and embedded websites. After the presentation, participants will be encouraged to summarize their learning by submitting a Flipgrid video.

Content knowledge needs. The session content begins by establishing the importance of active learning as evidence-based practice to meet objectives a) and b). Just as motivational interviewing and patient-centered practice are desirable in nutrition, active learning that invokes 21st century skills is evidence-based and an education standard. The content will then shift into teacher-focused how-tos for digital tools, including how digital tools can help, how to select the right digital tool, and how to incorporate that tool into an existing lesson plan, to address objectives c) and d). My assumption is that participants who are not comfortable with technology may be fearful or lack the motivation to explore various tools. Group collaboration, modeling, and gentle encouragement through case studies may help mitigate these fears.

Teachers' needs. While the majority of the session focuses on introductory content about active learning and digital tools, teachers' needs in digital tool management can be addressed through coach/presenter modeling. Simple statements such as, "I created this Flipgrid video to serve as a model for students," or "This Google Form was hyperlinked to gauge students' understanding so far," can serve as a basis to explore class management and digital tool management within the limited time. The website itself offers an FAQ section exploring questions and misconceptions about active learning and digital tools. Beyond these resources, the audience will be introduced to technology coaching and may choose to consult a coach at their current institution.

In addition to modeling, three tutorial videos are available on the website to help teachers begin creating their own active learning lesson plans using the backwards design model. Each of the tutorials features closed captions created through TechSmith Relay for accessibility. Google Sites was also chosen because content is made automatically accessible to viewers; the website creator only has to apply appropriate heading styles and add alt text for pictures, figures, and graphs.

Lessons Learned through the Development Process.

One of the major challenges in developing this project was understanding the needs of the target audience. Because nutrition informatics is relatively new, technology use has not been standardized in the profession, so estimating the audience's previous knowledge and use of digital tools was difficult. My assumption is that technology use and attitudes about technology will vary. The website attempts to break down information to a fairly basic level. The one assumption I did make was that the audience has a good background in standard nutrition education practices. I also chose to develop the Technology Self-Assessment for the audience to complete prior to the session as a way to gain some insight into current technology use and comfort so that I may better tailor the session to that particular audience's needs.

I realized as I was developing the lesson plan for this session that I only have time for a brief introduction to these very important topics. If I were to create a more comprehensive professional development series, I could expand the content into three one-hour sessions: 1) an introduction to the theory of collaborative learning, which would address the importance of digital tools in nutrition education and establish the need for active learning; 2) selecting, evaluating, and curating tech tools, allowing educators to become familiar with available tools based on individual need; and 3) lesson plan development integrating collaboration tools, a "how-to" session where participants create their own plan to implement. I had not anticipated that length was going to be a barrier; however, if the audience truly has limited digital familiarity and comfort, perhaps beginning with an introduction to these topics is sufficient.

One positive lesson I've learned is that trying new things, such as creating a Google Site, can be very rewarding. I had never experimented with Google Sites prior to this project, and I am quite happy with the final website, though the perfectionist in me wants to continue tweaking and editing content. I originally aimed to create slides for this presentation but realized that I am attempting to convince a possibly skeptical audience of the benefits of digital tools, so using the same old tool would not allow for the scope of modeling I desire.

I must admit that before this project, I had a hard time placing myself in the role of a "tech coach" because I would continually see each concept through the lens of an educator and how to apply the concepts to my own teaching. It has been difficult for me to take a step back and realize that I am still teaching, just in a different context. Creating the step-by-step tutorials was the turning point where I envisioned the audience modeling their lesson plans on the example I had given. I hope I have the opportunity to present this session at the educational conference and bring the ideals of active learning and digital tools to professionals working in various education settings.

The Connection between Digital Competence and Problem-Solving

The word "troubleshooting" most often invokes images of a conversation with the IT department: a progression of actions guided by the technician and performed by the user, ending with a resolution in which the user's original knowledge of technology has not been augmented. Unfortunately, this is an all-too-common scenario. The user defaults all troubleshooting responsibility to a third party because of unfamiliarity with, or a knowledge deficit about, technology. This is not limited to consumers and companies; there is a concern that students also do not troubleshoot well. According to the ISTE coaching standard, coaches should help teachers and students "troubleshoot basic software, hardware, and connectivity problems common in digital learning environments" (ISTE, 2017). While calling IT or passing responsibility onto another party, like a teacher for example, is common practice, learning to troubleshoot is a beneficial 21st century skill because it helps develop digital competence.

Why is digital competence important?

Like all 21st century skills, digital competence is a highly sought skill in the ever-evolving workforce. An e-magazine, Training Industry, published an industry-perspective article on digital competence that highlights the need for competence in the workforce from the top of the organization chart down. The author believes that the tech world today encompasses "VUCA," or volatility, uncertainty, complexity, and ambiguity. The role of those working in tech today should be to navigate this VUCA world seamlessly, and one of the ways to do this is to reinforce digital competence (Newhouse, 2017). The industry definition of digital competence expands to include not only knowledge of technology but also understanding digital environments, effectively creating and consuming digital information, communicating and collaborating with diverse stakeholders, innovating rapidly, thinking critically and solving problems, and maintaining security (Newhouse, 2017). This definition was devised from new European Union definitions and involves five major facets, summarized in figure 1.1 below.

Infographic on the 5 major facets of digital competence
Figure 1.1 Facets of Digital Competence

What role does “digital competence” play in helping students problem-solve and troubleshoot online/technology issues?

One issue that arises is the general assumption that since students grew up with technology, or are considered digital natives, they automatically build digital knowledge or know how to use technology well (Hatlevik et al., 2015). However, in order to use technology well, students need to build digital competence and literacy. According to researchers Hatlevik, Gudmundsdottir, and Loi, building digital competence is complex and involves various factors, as summarized in figure 1.2 below.

Infographic on the key elements for developing digital competence
Figure 1.2 Developing Digital Competence

The researchers recognize that these facets are essential to cultivating a deep understanding of technology while promoting critical reflection on and creative use of digital skills. These qualities in turn develop problem-solving skills in both independent and collaborative settings (Hatlevik et al., 2015).

Beyond knowledge deficits involving how to perform troubleshooting tasks, researchers suggest that when demanding conditions, such as completing an assignment, become difficult, they may hurt self-regulation and autonomy (Koole et al., 2012). These difficulties can be cognitive, motivational, implementational, or a combination of these factors. While this theory is debated, meta-analyses indicate that low intrinsic value activities (such as homework) may lower complex problem-solving abilities such as those required by troubleshooting (Koole et al., 2012). Along with motivational issues, students may resign themselves to believing that there is only one correct path or resolution to a specific problem, with the educator as the gatekeeper of the solution. Rather than seeking the solution for themselves, students prefer to go straight to the source, which develops learned helplessness (Miller, 2015).

How can students develop digital competence?

Digital competence is a complex concept that spans several social, motivational, personal, cultural, and technical understandings; therefore, there is no straightforward way to develop it. However, educators play a big role in establishing foundations for competence that may lead to better problem-solving and troubleshooting in two major ways:

  1. Allowing for self-directed learning. A consensus exists that students need to reflect on their own learning (Miller, 2015; Plaza de la Hoz et al., 2015). The role of the educator then shifts to providing resources, including digital tools, that allow students to experiment through active participation and engagement.
  2. Changing class culture. The attitudes and beliefs of the educator also reflect the importance of digital competence to students. If the educator places low importance on digital competence, the students learn not to value or develop these important skills. The educator can establish new beliefs, resources, and structures to promote a culture of answer-seeking through appropriate digital tools and tool use. Lastly, students must build self-efficacy through trial and error in a safe environment.

While researchers are still investigating efficient methods for developing these competencies, all sources agree that in order for students to be successful in the 21st century, educators must open up the path to new technologies, new pedagogies, and new attitudes that help build digital competence (Miller, 2015; Plaza de la Hoz et al., 2015).

Resources

Hatlevik, O.E., Gudmundsdottir, G.B., & Loi, M. (2015). Digital diversity among upper secondary students: A multilevel analysis of the relationship between cultural capital, self-efficacy, strategic use of information, and digital competence. Computers & Education, 81, 345-353. Available from: https://drive.google.com/file/d/0B5W5P9bQJ6q0RFNib3A5Vm9wWWM/view

ISTE. (2017). ISTE standards for coaches. Available from: https://www.iste.org/standards/for-coaches

Koole, S.L., Jostmann, N.B., & Baumann, N. (2012). Do demanding conditions help or hurt self-regulation? Available from: https://drive.google.com/file/d/0B5W5P9bQJ6q0M0QzalRBa0FfTXM/view

Miller, A. (2015, May 11). Avoiding learned helplessness. Available from: https://www.edutopia.org/blog/avoiding-learned-helplessness-andrew-miller

Newhouse, B. (2017). Closing the digital competence gap. Available from: https://trainingindustry.com/magazine/issue/closing-the-digital-competence-gap/

Plaza de la Hoz, J., Mediavilla, D.M., & Garcia-Gutierrez, J. (2015). How do teachers develop digital competence in their students? Appropriations, problematics, and perspectives. Available from: https://www.researchgate.net/publication/301914474_How_do_teachers_develop_Digital_Competence_in_their_students_Appropriations_problematics_and_perspectives

Implementing Student-Centered Activities in Content-Intensive Courses

If you’ve ever taught a content-intensive course, you’ll know it’s like trying to finish a marathon in a sprint. In my experience, you get to the finish line, but you hardly remember the journey there. The content-intensive courses I teach are the foundational nutrition classes. Each contains at least six major learning objectives, each with about two sub-objectives, and is designed to cover upwards of fifteen chapters of material in a ten-week quarter system. The predominant approach of faculty to these types of classes is to go broad, not deep, in learning and understanding. I must admit this has been my approach as well, for fear that I will miss covering one of the learning objectives or sub-objectives. While my students tell me that the courses are interesting and engaging, I can’t help but wonder whether they will actually remember any content from the course or whether they feel as if their brain has been put through a blender by spring break. Is the learning authentic, or are they just memorizing to pass the final exam?

The ISTE Standards for Educators charge instructors with "design[ing] authentic, learner-driven activities and environments that recognize and accommodate learner variability" (ISTE, 2017). If instructors truly wish to design their courses using evidence-based practices, the focus needs to shift from covering material to student learning, without compromising the learning objectives. ISTE educator standard 5b implies that technology can help marry the two concerns: "design authentic learning activities that align with content area standards and use digital tools and resources to maximize active, deep learning" (ISTE, 2017). This standard can best be illustrated by the "genius hour" concept developed by Nichole Carter in pursuit of a personalized learning environment for her students. The idea is brilliant: allow students one opportunity a week (or as time allows) to dive deep into a topic they are interested in and demonstrate their learning through an artifact or digital presentation. The implementation of genius hour follows a six-component design model that highlights new roles and responsibilities for teachers and students alike (Carter, 2014). See figure 1.1 for more information on the six-component personalized learning design.

Infographic highlighting 6 essentials for personalized learning.
Figure 1.1 Nichole Carter’s Personalized Learning Essentials.

When implemented well, intrinsic motivation for learning soars, students are engaged in the material, and teachers can meet those ever-important learning objectives without feeling like they are just shoveling material into students’ brains (Carter, 2014). It seems like a win-win. However, I started thinking back on my content-intensive courses and wondered: how can student-centered activities (like genius hour) be implemented in these types of courses?

As a starting place for answering my question, I revisited Kathleen McClaskey’s continuum of choice. I find it interesting that, in developing student-centered learning and activities, it ultimately comes down to how much control the teacher wants to let go of and how much "choice" is open to the students. In traditional content-intensive courses, the teacher has all of the control, or what McClaskey would classify as teacher-centered (McClaskey, 2005). He or she creates lectures that revolve around a specific chapter in a textbook, then lectures to ensure the material is covered. Students, in this model, sit and observe the lecturer in hopes of absorbing some of the material (or, in most cases, cramming the information into their brains the night before the exam) while never actually deeply engaging with the information. McClaskey’s continuum of choice suggests that some activities can still be controlled while giving students some freedom to explore topics of their own choosing, i.e., the participant and co-designer models (McClaskey, 2005).

Diagram of the Continuum of Choice.
Figure 1.2 McClaskey’s Continuum of Choice. (Continuum of Choice TM by Barbara Bray and Kathleen McClaskey is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.)

The challenge with the more student-centered models on McClaskey’s continuum, such as the designer or advocate, is that they require time, a luxury oftentimes not afforded in content-intensive courses, and they do not address how to implement each model. However, despite these concerns, I am beginning to realize that in order to allow for more intrinsic and authentic learning, I need to let go of the desire to control all aspects of the content-intensive courses and shift my focus to what is really important: student learning.

Many of the resources similar to McClaskey’s mention explicit instruction as part of a student-centered classroom. Explicit instruction provides "effective, meaningful, direct teaching…where students are active participants in the learning process" (Shasta County, 2009). Creating an explicit learning lesson involves six guiding principles (Shasta County, 2009):

  1. The instructor begins the class by setting the stage for learning; the learning objectives are clear, and students understand their responsibility for their learning.
  2. This is followed by a clear, simple, and direct explanation of what the daily task is, why it is important, and how to best complete it. Students appreciate when tasks are broken down into smaller, logical steps.
  3. The instructor models the process, including their thought process, using visuals. This is important because simply explaining a concept doesn’t mean that students will understand it or know what to do.
  4. Before diving into the assignment on their own, students are given a guided activity where the instructor assesses the readiness of the class.
  5. Once the concept has been mastered, the students take on the task independently.
  6. After the task has been completed, the students are given an option for informal or formal reflection; the artifact is collected and compared to the learning objectives.

Figure 1.3 provides a reference guide for these steps.

Infographic on explicit learning
Figure 1.3 Explicit Learning Reference Guide

According to the Shasta County Curriculum Lead, explicit learning is best used when there is a "well-defined body of information or skills students must master," especially when other models such as inquiry-based or project-based learning cannot be successfully implemented (Shasta County, 2009). The role of the teacher is more directed and specific, and students gain more insight into and practice with the skills they are learning. What I like about explicit learning is that classroom activities do not have to be modified completely; the modification occurs in how the material is presented and practiced. Students can appreciate this model because they engage in active learning but still have guidance and support from the teacher via modeling.

Through explicit learning, even content-intensive courses can have a deeper and more meaningful impact on learning. I had one class in particular in mind when considering the explicit learning/personalized learning approach. I teach a not-so-introductory nutrition class designed to meet the needs of allied health students. All allied health students are required to take at least one nutrition class as part of their career training, and for many, this class will be the only nutrition class they ever take. The pressure is high in terms of delivering content, as it is very likely that they will not revisit this material anywhere else. While I can’t change the fact that they need to explore the chemical composition and processing of nutrients in the body, I can influence how they engage with the health effects of and recommendations for these nutrients, which are ever-changing anyway. Using the personalized learning and explicit learning models, I could allot one class period a week to exploring the health effects and recommendations for whatever condition, trend, or issue they wished to investigate. Like the genius hour, the students could work together to investigate and create a digital artifact of their choosing that would best present their topic, and, to further promote collaboration, they could provide feedback to one another on their topics. The students would be learning through co-learning, gaining a stronger and deeper interest in the subject matter, proving that content-intensive courses can also be student-centered.

Resources

Carter, N. (2014, August 4). Genius hour and the 6 essentials of personalized education. Retrieved from http://www.edutopia.org/blog/genius-hour-essentials-personalized-education-nichole-carter

International Society for Technology in Education, (2017).  The ISTE standards for educators. Retrieved from: https://www.iste.org/standards/for-educators.

McClaskey, K. (2005, November 5). Continuum of choice- More than a menu of options. Retrieved from http://kathleenmcclaskey.com/choice/

Shasta County Curriculum Lead, (2009).  What is direct/explicit learning [Word doc]. Retrieved from http://www.shastacoe.org/uploaded/dept/is/district_support/explicit_instruction_may_2009.doc

Building Computational Thinking through a Gamified Classroom

Who says playing video games doesn’t teach you anything? Playing and creating games could actually help students develop another 21st century skill: computational thinking (CT). Computational thinking is a form of problem solving that takes large, complex problems, breaks them down into smaller problems, and uses technology to help derive solutions. In deriving solutions, students engage in a systematic form of problem solving that involves four steps: 1) "decomposition," where a complex problem is broken down into smaller, more manageable problems; 2) "pattern recognition," or making predictions by finding similarities and differences between the broken-down components; 3) "abstraction," developing general principles for the patterns that emerge; and 4) "algorithm design," creating step-by-step instructions to solve not only this problem but other similar problems in the future (Google School, 2016). By engaging in computational thinking, "students develop and employ strategies for understanding and solving problems in ways that leverage the power of technological methods to develop and test solutions" (ISTE, 2017). In other words, the key to successfully following this process is that students develop their own models rather than simply applying existing models (Google School, 2016).

Figure 1.1 Components of Computational Thinking
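
To make these four steps concrete, here is a minimal Python sketch of my own (a hypothetical illustration, not drawn from any of the cited articles); the menus, sodium values, and the 2,300 mg default limit are invented stand-ins.

```python
# A minimal, hypothetical sketch of the four computational thinking steps
# applied to a nutrition-style question: which menus stay under a sodium limit?

# Decomposition: break each menu into individual items,
# each with an invented sodium value in milligrams.
menus = {
    "Menu A": {"oatmeal": 140, "chicken wrap": 1480, "canned soup": 990},
    "Menu B": {"yogurt": 65, "garden salad": 310, "grilled fish": 380},
}

# Pattern recognition: the same "add up the items" step repeats
# for every menu, so it is captured once.
def total_sodium(items):
    """Sum the sodium of all items in one menu."""
    return sum(items.values())

# Abstraction: a general rule that works for any nutrient limit, not just sodium.
def within_limit(items, limit_mg):
    """Return True when a menu's total stays at or under the limit."""
    return total_sodium(items) <= limit_mg

# Algorithm design: step-by-step instructions that solve this problem
# and any similar "compare totals to a limit" problem in the future.
def menus_within_limit(all_menus, limit_mg=2300):
    return [name for name, items in all_menus.items()
            if within_limit(items, limit_mg)]

print(menus_within_limit(menus))  # prints ['Menu B']
```

Even in a non-programming classroom the same sequence could be carried out on paper; the point is the systematic movement from decomposition to a reusable procedure.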

In researching ways to apply computational thinking in the classroom, I ran across scholarly articles discussing the gamified classroom. I have always been intrigued by this concept; from my own experience, students are much more engaged during class time when the required content is converted into a game. During these game sessions, my role changes from the person delivering the content to the person delivering the game (i.e., asking the questions). The students are responsible for providing the content by offering solutions to the posed questions, thereby evoking problem-solving skills and, in some cases, critical thinking skills. This thread of ideas then led me to ask: what are some ways that a gamified classroom can help develop computational thinking?

To help answer my question, I explored two articles that pinpoint game-design models for building computational thinking:

Article 1: Yang & Chang, 2013. Empowering students through digital game authorship: Enhancing concentration, critical thinking, and academic achievement.

Yang and Chang explore how students can increase their motivation for learning when they are allowed to design their own game on a specific topic. During the game design process, significant problem-solving occurs because of the interaction and immediate feedback the process entails. In addition, students gain higher-order thinking skills such as creativity and critical thinking. The authors mention three game-building software packages that do not require extensive coding skills: RPG Maker, Game Maker, and Scratch. In their study, the researchers investigated the effects of the game design process on seventh-grade biology students who used either Flash animation (digital flash cards) or RPG Maker. The investigated effects included concentration, critical thinking, and academic performance. Their results demonstrated that the group using RPG Maker had significant improvements in critical thinking and academic performance, while no significant difference in concentration was noted between the groups.

Article 2: Kazimoglu et al., 2012. A serious game for developing computational thinking and learning introductory computer programming.

Kazimoglu et al. begin their inquiry by providing a few definitions. It is important to understand the terminology they use, mainly defining any game used for educational purposes as a "serious" game. They acknowledge that several definitions of computational thinking exist, so they create their own definition that requires the following elements: 1) conditional logic (true vs. false conditions); 2) building algorithms (step-by-step instructions); 3) debugging (resolving issues with the instructions); 4) simulation (modeling); and 5) distributed computation (social sharing). The authors set out to create a non-threatening introduction-to-programming unit to combat the common student perception that programming is "difficult." Kazimoglu et al. believe that when students are allowed to engage in game design, they are motivated to learn, which provokes problem solving. They take this approach in their introductory programming class, where they challenge students through a series of exercises using the Robocode platform. At the end of the study, all students successfully completed the exercises, engaging problem-solving skills along the way.

Conclusions. Interestingly, both of these articles struggle to define "computational thinking" exactly, and both mention that specific research investigating the extent to which games can develop CT is lacking. However, both agree that CT is best developed when students are the game designers. To accomplish this, both studies included elements of programming instruction to help students successfully build their games.

While these articles offer models for successfully implementing computational thinking through game design and creation, it was a little disheartening to discover that programming instruction was a necessary component. My inclination was to ask how these processes could be implemented or adapted in other classroom scenarios, particularly when programming instruction may not be feasible. Interestingly, not all researchers agree that programming must be involved for successful CT implementation. Voogt et al. argue that although most research on CT involves programming, CT is a thinking skill and therefore does not require programming in order to be successfully implemented (Voogt et al., 2015). In fact, a literature review conducted by Voogt demonstrated that students do not automatically transfer CT skills to a non-programming context when instruction focuses on programming alone. The strongest indicator of CT mastery was actually heavily dependent on instructional practices that focus on application (Voogt et al., 2015).

The lack of a standard definition of computational thinking also needs to be addressed. The two articles above and the Voogt researchers agree that discrepancies exist among current definitions of computational thinking. To avoid confusion regarding the role of programming and other such technologies, computational thinking can be simply defined as a way of processing information and tasks to solve complex problems (Voogt et al., 2015). It is a way to look at similarities and relationships within a problem and follow a systematic process to reach a solution. Figure 1.2 summarizes this simplified process.

Figure 1.2 Simplified Computational Thinking Components

Within this framing, it is not necessary to program games in order for students to build computational thinking. Allowing students to participate in systematic artifact creation will do the trick. Examples of artifact creation include remixing music, generating animations, developing websites, and, for those with coding experience, writing programs. The main idea of this artifact creation process is that students follow procedures that can be applied to similar problems. Figure 1.3 highlights this artifact creation process.

Figure 1.3 Artifact Creation Process for Computational Thinking

How can this artifact creation process be used to create a gamified classroom? To help me explore this issue, one of my colleagues suggested allowing students to develop and design their own board game. While the solution seems low-tech, others agree with this strategy. Michele Haiken, an education leader writing for ISTE, describes adapting "old school" games for the classroom to help develop critical thinking and problem-solving skills (Haiken, 2017). Students can even create an online "quest," a scavenger hunt, or a "boss event" to problem-solve computationally (Haiken, 2017). For more tech-heavy solutions, existing platforms and games such as GradeCraft and 3DGameLab can be used to apply computational thinking in a gamified classroom (Kolb, 2015). Regardless of the method used, whether low-tech board games or high-tech game creation through programming, allowing students to participate in the artifact creation process helps build computational skills that they can then apply to other complex problems to create their own models.

References

Google School. (2016). What is computational thinking? [YouTube video]. Retrieved from: https://www.youtube.com/watch?v=GJKzkVZcozc&feature=youtu.be.

Haiken, M., (2017).  5 ways to gamify your classroom. Retrieved from: https://www.iste.org/explore/articledetail?articleid=884.

International Society for Technology in Education, (2017).  The ISTE standards for students. Retrieved from: https://www.iste.org/standards/for-students.

Kazimoglu, C., et al. (2012). A serious game for developing computational thinking and learning introductory computer programming. Procedia-Social and Behavioral Sciences, 47, 1991-1999.

Kolb, L. (2015). Epic fail or win? Gamifying learning in my classroom. Retrieved from: https://www.edutopia.org/blog/epic-fail-win-gamifying-learning-liz-kolb.

Voogt, J., et al. (2015). Computational thinking in compulsory education: Toward an agenda for research and practice. Education and Information Technologies, 20(4), 715-728.

Yang, Y. C., & Chang, C. (2013). Empowering students through digital game authorship: Enhancing concentration, critical thinking, and academic achievement. Computers & Education, 68(c), 334–344.

Innovation Through Using Problem-Based Learning

Whenever I think of the word “innovation,” I am reminded of the bear, honey, and powerline story. If you are not familiar with this story, I’ll offer a brief synopsis here, though there are other detailed versions available.

Employees of a powerline company met to brainstorm the issue of snow and ice accumulation on power lines, which would down the lines in the winter months. Despite formal, morning-long brainstorming, the session yielded few results. Frustrated, the team decided to take a short break. While on break, a few of the team members began to talk over coffee, and one team member reminisced about how he got chased by a bear while out servicing the lines. After a good laugh, other team members jokingly suggested that they get bears to remove the snow and ice by placing honey pots on top of the powerlines. Continuing the joke, one team member suggested that they use helicopters to place the pots. This idea was put to rest as another team member mentioned that the vibrations from the helicopters would scare the bears. Suddenly they realized they had a great solution on their hands: the company could use helicopters to remove the snow and ice through the force and vibrations caused by the helicopter blades. Because of this impromptu brainstorming session, using helicopters to remove snow and ice from powerlines is a common practice today.

diagram of a bear, honey, and a helicopter facilitating innovation.
Figure 1.1 A bear, honey, and a helicopter for innovation.

I like this story because it dispels the misconception that to be innovative you must create something new, like a product or a service. Instead, innovation can be a way to problem solve. Much like the process that unfolded in the bear story, students should be encouraged to problem solve in creative ways. By offering students the opportunity to seek, identify, and apply information, we help them build cognitive flexibility, a 21st century skill (Kuo et al., 2014). Cognitive flexibility encourages the development of the creativity needed for innovation, a concept reflected in the ISTE innovative designer standard, where "students use a variety of technologies within a design process to identify and solve problems by creating new, useful or imaginative solutions" (ISTE, 2017).

So then, how do you get students to begin thinking less about the "correct answer" and more about "bears, honey, and helicopters" for innovation? This can be particularly difficult when students historically have been offered a "right" and "wrong" depiction of problems. Students can be eased into creativity through scaffolding using the systematic thinking concept of the creative problem solving model (Kuo et al., 2014). A summary of the model can be found in figure 1.2 below.

Diagram of the Creative Problem Solving Model
Figure 1.2 The Creative Problem Solving Model

The creative problem solving model transitions students between understanding a problem, generating ideas about the problem, and finding solutions to that problem (Kuo et al., 2014). Students' thinking evolves from simple identification to more complex reasoning, ultimately evoking creativity and innovation.

While the creative problem solving model can be used to build cognition through various problem-solving steps, problem-based learning (PBL) can help structure the classroom to achieve self-directed learning. An instructor can start with any question type from the creative problem solving model and allow students to work through that question with PBL. The general process for designing a problem-based classroom is demonstrated in figure 1.3 below.

Diagram depicting the Problem-Based Learning Process
Figure 1.3 The Problem-Based Learning Process

According to the National Academies Press, a PBL activity focuses on student-centered learning where the instructor is a facilitator or guide and the students work together to gather information and then generate ideas to solve the problem. The problem itself becomes the tool to obtain knowledge and develop problem-solving skills (National Academies Press, 2011). PBL is not without its faults: students using PBL have slightly lower content knowledge than those in traditional classrooms, and students in a group may not share the same level of cognition (National Academies Press, 2011). Despite this, students engaging in PBL have higher retention of content than in traditional classrooms, are better able to apply their knowledge, and have a deeper understanding of the content (National Academies Press, 2011).

Putting the Theory into Practice: The Investigation

Several of the classes that I teach are content-based, coverage-based classes. These classes are designed to be foundational, meant to prepare students for higher-level or more in-depth, application-based classes later on. As I was thinking about problem-based learning, I started wondering: how can we fully expect students to become problem-solvers and apply content in more advanced classes when all they are expected to do is identify concepts in these foundational classes? Students don't really understand the importance of a particular topic because the idea of application and innovation isn't introduced until they are in another class. To give these coverage-based classes more meaning to students now, I am considering applying more PBL-based activities to directly replace coverage-based activities. My investigation led me to develop the two guiding questions below, which will help me gather ideas on how to solve this problem. I realize that I am essentially engaging in my own PBL.

Question 1: What are some examples of problem-based, or "idea-finding," class activities that better support student learning in coverage-based classes? One resource that addresses this question is from the National Academies Press, which published a summary of two workshops conducted in 2011 on "Promising Practices in Undergraduate Science." The selected chapter (Chapter 4) summarizes the benefits of problem-based learning and describes three methods that show promise in content-heavy classrooms. Additionally, the chapter provides templates and guiding principles for problem-based activities, case scenarios, and complex problems that are clear, concise, and general enough to be applied to various assignments or learning activities. However, the chapter does not provide specific examples to use as a model. Despite this, it is supportive in building theory and gathering initial ideas for PBL in the classroom. Another resource that may help address this question comes from the Creative Classroom Project, a website created by an Erasmus project led by university lecturers in Estonia specializing in digitally-enhanced learning scenarios. The website/blog offers not only theory-based ideas but also actual examples of the various methods that use PBL. The professors call the various PBL methods "learning scenarios" and base their work on a "trialogical learning design." Though most of the examples are for primary and secondary education, the formatting is helpful for brainstorming similar scenarios for higher education.

Question 2: How can ICT be used to enhance learning in the examples above? To be honest, I was not sure I would find many examples of how to apply technology in PBL. I was quite mistaken. Depending on the goal or scope of the learning activity, a multitude of tech apps and websites can be applied to the various PBL methods. Here are just a few examples of tech resources that can be used with PBL:

  • LePlanner lesson plan templates from the Creative Classroom Project. This resource provides several examples of specific tech tools, such as Padlet, Pearltrees, and MindMeister, that can be used to enhance classroom activities. The templates also provide lesson plans (via the LePlanner software) that include descriptions of objectives, class activities that meet the objectives, and even timelines for each activity.
  • Digital storytelling, which corresponds with the case-study (case scenario) PBL method. According to the National Academies Press chapter, one of the justifications for using case studies is that they are a form of storytelling. Storytelling helps students learn by integrating knowledge, reflecting on ideas, and later articulating them while considering various perspectives (National Academies Press, 2011). Digital storytelling is a way to introduce technology as a problem-solving tool and helps students express their various perspectives. This digital storytelling resource offers background information about digital storytelling, the seven elements of storytelling, and resources (tech solutions) that can be explored. I had never considered using blogs, Pinterest, and other such social media resources for the purposes of digital storytelling.

The Next Steps.

This investigation has been a great first step in generating ideas for implementing more PBL activities in my content-intensive courses. There seems to be an endless world of possibilities for integrating technology to develop creative solutions and innovation in the classroom. What I find interesting is that my findings mirror the bear, honey, and helicopter story. I discovered that coming up with a solution to my questions doesn't involve reinventing the wheel, but rather considering ideas and products that already exist and using them in creative ways. For example, I would never have considered using the Pinterest app or even Google Docs as a creative solution for digital storytelling. Nor would I have considered that developing good problem-solving skills in students simply involves asking the right questions.

My process doesn't end here. If I choose to implement PBL, the next steps will involve the six-step process highlighted in this article to successfully design, implement, and evaluate problem-based learning. I need to carefully consider the major objectives of my course(s) and the amount of time needed for this process. As suggested by the National Academies Press, successfully implementing any of the PBL methods takes time, which may not always be a luxury in coverage-based classes. Before moving forward, I need to accept that I will not be able to implement PBL with every topic but must carefully select activities that help solidify the major objectives of the course.

My colleagues and professors have also suggested using alternative models such as human-centered design or Kathleen McClaskey's Continuum of Choice (see figure 1.4 below).

Diagram of the Continuum of Choice.
Figure 1.4 McClaskey’s Continuum of Choice. (Continuum of Choice TM by Barbara Bray and Kathleen McClaskey is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License available at: https://creativecommons.org/licenses/by-nc-nd/4.0/.)

I would need to investigate which design model best fits specific course needs as well as brainstorm what questions need to be asked in order for problem-solving to be effective. Perhaps the answers to these questions will be course-specific and may require the use of different models for different activities to further promote cognitive flexibility.

References

International Society for Technology in Education, (2017).  The ISTE standards for students. Retrieved from: https://www.iste.org/standards/for-students.

Kuo, F.-R., Chen, N.-S., & Hwang, G.-J. (2014). A creative thinking approach to enhancing the web-based problem solving performance of university students. Computers & Education, 72(c), 220–230.

National Academies Press. (2011). Chapter 4: Scenario-, problem-, and case-based teaching and learning. In Promising practices in undergraduate science, technology, engineering, and mathematics: Summary of two workshops (pp. 26-34). Washington, DC. DOI: https://doi.org/10.17226/13099.

Pata, K. (2016). Problem-based learning in task-based and inquiry-based scenarios. [Blog] Retrieved from: https://creativeclassroomproject.wordpress.com/creative-classroom-collection/problem-based-learning/

Incorporating Feedback Loops to Develop An Empowered Student

Being a successful professor means preparing students to be successful. Delivering knowledge-centered classes on a particular topic is no longer the primary task of professors. Gone are the days of the large lecture hall, professor front and center, exhibiting knowledge for students to somehow absorb. Scholars are now calling for students and professors to engage in a new learning paradigm that provokes the development of specific skills for the 21st century. This paradigm includes teaching five major career skills that are highly sought after by employers today. Mastering these five essential skills means that students: 1) thrive on change by being receptive to feedback; 2) are able to get things done independently and without direction; 3) are open-minded, understand their own biases, and appreciate differences in others; 4) know how to prioritize tasks and are good at influencing the behavior of others; and 5) facilitate activities and relationships within an organization (Kivunja, 2014). This is not an easy feat, as skills need time and practice to be cultivated. The first ISTE standard for students calls for the empowered learner as a mechanism to help build 21st century skills. The empowered learner is one who "…leverages technology to take an active role in choosing, achieving, and demonstrating competence in their learning goals" (ISTE, 2017). An empowered student is at the forefront of their learning, thinking beyond the lecture, and is autonomous because they have intrinsic motivation (Stefanou et al., 2004).

Figure 1.1 Empowered Student Flowchart

So if students need to develop self-determination and become autonomous in order to thrive in the current workforce, are we, as educators, doing our part in preparing them to do so? This question can only be answered positively if we adopt a student-centered approach to teaching. The authors of the book Understanding by Design challenge educators to consider the backward design approach. In this design approach, the educator starts the plan with the desired results, determines which indicators are appropriate for measuring the outcomes, and then plans the experiences and/or instruction required to achieve those outcomes (Wiggins & McTighe, 2005). When students are informed of the desired results and are allowed to take part in the creation process, that is when self-determination and autonomy develop (Stefanou et al., 2004).

It is also important to remember that students are still developing these skills, so simply stating the purpose or goal of an assignment and leaving them to their own devices will not help them develop autonomy. Coupled with the student-centered approach, formative feedback must be included to help guide and remind students of the big-picture results. Formative assessment, when conducted as a feedback loop, helps to "enhance performance and achievement" (Wiggins, 2012). Essentially, this means that students are given consistent, ongoing, and immediate feedback as a way to encourage continual practice of skills. Formative feedback is not evaluated formally (i.e., no grades are assigned to the feedback) and does not offer extensive evaluation or advice, nor is it purely praise. Instead, formative feedback offers the student a "gauge of their efforts to reach a goal" (Wiggins, 2012). In order to provide good feedback, the assessor must first observe, then comment or ask questions about those observations (Wiggins, 2012). Figure 1.2 summarizes Wiggins's strategy on formative feedback.

Figure 1.2 Wiggins's Formative Feedback Strategy

Putting the Theory Into Practice: The Investigation.

In our digital education leadership program, we were asked to create a question (or questions) related to the classes we teach and investigate a resource (or resources) that would aid in addressing the first ISTE standard for students. I teach a nutrition research class whose main purpose is not only to develop students' research skills but also to build their autonomy as researchers. Students must investigate a food-related issue, design and implement an experiment, and later report their findings in a final research paper. The class explores the research process, including hypothesis creation, experiment building and testing, and scientific writing. The current challenge is to allow enough freedom for autonomy to develop while providing direction to ensure correct research protocol is established.

I began my brainstorming process for a student-centered approach to the issue by first identifying the important design outcomes. I started with a goal: allow students to take their research project into their own hands while working toward a common goal and following the research protocol. Though students will be developing autonomy and need to be self-driven, they will also need appropriate feedback in order to gauge their work at critical points in the quarter. With this goal in mind, two main questions developed: 1) What feedback timeline would be most effective in designing a researcher-centered approach to teaching nutrition research classes? and 2) What computer-driven tools would effectively provide timely and ongoing feedback? The findings of my investigation and potential resources are explored below.

Question 1: What feedback timeline would be most effective in designing a researcher-centered approach to teaching nutrition research classes? Upon further investigation, this question can't be answered directly. Each assessment will vary in scope and length, so a prescribed timeline is not feasible. However, education leaders Hicks and Wiggins both agree that formative feedback best supports a student-centered, or researcher-centered, approach. As a reminder, formative feedback is not formally assessed but rather gives the student/researcher an opportunity to step back and evaluate and reflect upon their own work in relation to their research goals. The timing of feedback should be immediate, ongoing, and consistent (Hicks, 2014; Wiggins, 2012). Feedback should follow a specific format that neither passes judgment on the work nor formally evaluates it. Hicks references the RISE model (see figure 1.3) to format formative feedback in a meaningful way, which is why I've chosen it as the resource of choice for this question.

Figure 1.3 The RISE Model (Wray, 2018)

The RISE model can be used for self-assessment, peer review, or evaluator review in formative feedback. The process begins by assessing the degree to which the current work meets the goals and objectives of the assignment. The subsequent steps allow for specific, tangible, and actionable suggestions to the author for improving the current and future versions of the work. The benefit of using this model is that as the feedback advances through the higher steps, it also involves higher levels of thinking. RISE gets at the heart of student-centered learning by allowing students to evaluate and create works. I have not used this model in action, but I predict drawbacks may arise in peer feedback when students skip a level or pass judgment without fully understanding the model itself. These concerns could be combated with scaffolding and more detailed instruction on the feedback process.

Question 2: What computer-driven tools would effectively provide timely and ongoing feedback? For an assessment item such as a research paper, using a collaboration tool such as G Suite or the Google Docs collaboration feature in CANVAS is ideal. Google Docs is available to anyone who holds a Gmail sign-in, along with several other features of G Suite, including to-do lists, a calendar, Google Hangouts, and chat, just to name a few. The Google Docs collaboration feature in CANVAS allows students to access a Google Doc on one Google Drive (usually belonging to the instructor). The owner of that Google Drive then has access to all of the collaboration pages for the class. The use of these collaboration tools is appealing because the docs are easily accessible by students, the professor, or the individual providing the feedback. Formative feedback is simple to provide using the comment feature. Google Docs also tracks changes throughout the life of the document and provides comment notifications in Gmail. Using Google Docs would also help address issues related to equality of work among team members (i.e., members doing their fair share of the collaboration). To further justify this technology, it would help me improve my current assignment by achieving the modification (M) and redefinition (R) levels of the SAMR model. Google Docs collaboration also scores roughly a 14 on the Triple E rubric (according to my assessment of its intended use).
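
As a side note for the technically inclined, the comments on a shared Google Doc can also be pulled programmatically, which could make it easier to review the formative feedback left across many group documents. The sketch below is my own illustration, assuming the google-api-python-client library and an authorized credentials object with read access to the file; the file ID and function names are hypothetical.

```python
# A hedged sketch (not part of the original assignment design): pulling the
# comment thread from a shared Google Doc with the Drive API, so an instructor
# could review the formative feedback left on each group's document.
from googleapiclient.discovery import build

def list_feedback_comments(creds, file_id):
    """Return (author, text, resolved) tuples for every comment on one doc.

    Only the first page of comments is fetched; pagination is omitted for brevity.
    """
    drive = build("drive", "v3", credentials=creds)
    response = drive.comments().list(
        fileId=file_id,
        fields="comments(author/displayName,content,resolved)",
    ).execute()
    return [
        (c["author"]["displayName"], c["content"], c.get("resolved", False))
        for c in response.get("comments", [])
    ]

# Hypothetical usage: FILE_ID would be the Doc shared through the CANVAS
# collaboration, and creds an authorized Google credentials object.
# for author, text, resolved in list_feedback_comments(creds, FILE_ID):
#     print(f"{author}: {text} (resolved: {resolved})")
```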

The only downside relates to the collaboration tool feature in CANVAS. The feature is not intuitive and is somewhat difficult for students to access. It is also not well integrated with Google Docs; for example, simply placing students into groups in CANVAS and assigning these groups to a Google Docs collaboration does not automatically give students access to their group's Google Doc in the drive. The instructor has to manually give permission to each student. The collaboration feature also does not link directly to the gradebook or back to CANVAS, where other course materials and resources are kept.

The Next Steps.

The RISE model and the Google Docs tool were well received by my colleagues when they evaluated them as resources that resolve my two questions on formative feedback. Not surprisingly, others also shared similar concerns about using Google Docs as a collaboration feature in CANVAS. Since Google Docs can be used independently of CANVAS, this is not a big issue, particularly because formative feedback is not associated with a formal grade; therefore, a link to CANVAS materials or the gradebook is not necessary.

Interestingly, most of their feedback on these two resources related to implementation, namely what assessment tools could be used to implement the RISE model and whether Google Apps for Education could help facilitate this assessment function. My initial idea for an assessment tool to implement the RISE model was to create "guiding questions" students would answer as part of their feedback comments. By answering the questions fully, the students would effectively go through the entire model without skipping steps. I have yet to investigate other Google Apps for Education for feedback features. Though I do not have complete answers to these great questions, I do have the beginning of my next investigation: feedback implementation.

References

Hicks, T. (2014, October 14). Make it count: Providing feedback as formative assessment. Edutopia. Retrieved from: https://www.edutopia.org/blog/providing-feedback-as-formative-assessment-troy-hicks

International Society for Technology in Education, (2017).  The ISTE standards for students. Retrieved from: https://www.iste.org/standards/for-students.

Kivunja, C. (2014). Teaching students to learn and to work well with 21st century skills: Unpacking the career and life skills domain of the new learning paradigm. International Journal of Higher Education, 4(1), 1. Retrieved from http://files.eric.ed.gov/fulltext/EJ1060566.pdf

Stefanou, C.R., Perencevich, K.C., DiCintio, M., & Turner, J.C. (2004). Supporting autonomy in the classroom: Ways teachers encourage student decision making and ownership. Educational Psychologist, 39(2), 97-110.

Wiggins, G., & McTighe, Jay. (2005). Understanding by design (Expanded 2nd ed., Gale virtual reference library). Alexandria, VA: Association for Supervision and Curriculum Development.

Wiggins, G. (2012, September). 7 keys to effective feedback. Educational Leadership, 70(1).

Wray, E. (2018). RISE Model. Retrieved from: http://www.emilywray.com/rise-model.
