Developing Professional Development as Part of the Community Engagement Project

The community engagement project challenges students to create a professional development session to be presented at a conference of the student’s choosing. As part of building effective digital age environments, as prescribed by the ISTE Standards for Coaches #3, I chose to create an interactive session focused on active learning and digital collaboration tools to improve current practices in nutrition education. Technology currently has limited uses in nutrition education but impactful potential. While nutrition information is plentiful in the digital world, the approach of dietitians and nutritionists has been to increase their presence through blogs, social media, and videos (such as those on YouTube), while the Academy of Nutrition and Dietetics (AND), the representative organization for all dietitians, has focused its efforts on instilling a code of ethics and providing information on privacy in the digital workplace. These efforts may help mitigate nutrition misinformation but are often one-sided or engage only limited populations. For example, blogs may allow comments but do not allow for active engagement with the blog topics, nor do they take into account implementation on a local level. Social media platforms such as Facebook, Pinterest, and Twitter allow nutritionists’ voices to be heard but rarely offer collaborative engagement with other experts or communities. The solution is relatively simple, as the digital tools mentioned offer plenty of room for continued collaboration among participants at any level, local or global.

The Academy itself recognizes the potential of technology in nutrition and has published a practice paper on nutrition informatics. Nutrition informatics is a relatively new field in dietetics that addresses technology’s role in health practices. The Academy discusses the potential pros and cons for each of the various practice fields in dietetics (clinical, food services, education/research, community, consultation/business) and technology’s potential for growth in each of those areas. In education specifically, the Academy recognizes uses in distance learning, student progress tracking, specialty testing for licensing and certification, and professional course development. However, it does not mention the need for collaboration or for engaging the various audiences that require nutrition education.

In order to bridge this gap and address the ISTE coaching standard, the topic for this professional development proposal focuses on building better nutrition education through digital collaboration tools. The goal of this session is to explore the benefits of active learning through technology aids (edtech) and to implement tools into existing lesson plans with the following objectives in mind:

  • a) Understand and/or review the importance of active learning (evidence-based practice)
  • b) Become familiarized with collaborative edtech tools
  • c) Engage with edtech tool selection criteria and best practices
  • d) Explore ways to incorporate digital tools in lesson plan scenarios

Professional Development Session Elements

In this one-hour session, participants will be invited to explore the main topic through both face-to-face and online collaboration as the entire group navigates a website developed specifically for the presentation. Since all of the major content is available to them online, there is no need for note-taking, allowing participants to remain engaged throughout the session. Elements of the session include: a pre-session technology self-assessment, an online group discussion via Padlet, think-pair-share activities, and lastly self-reflection elements submitted during and after the session. More details on these elements are provided below.

Length. The Academy hosts local sub-organizations in each state. I chose to develop this professional development session for local dietitians and nutrition educators, with the opportunity to present at the local education conference held annually. The requirements of this local organization state that all educational sessions must be a minimum of one hour long. This is to meet the CEU (continuing education unit) minimum for registered dietitians. Considering that the DEL program devotes entire classes to active learning and digital tools, the one-hour length will limit the depth of information presented. However, the ability to continually collaborate with both participants and presenter will allow for continued resource sharing after the session has ended.

Active, engaged learning with collaborative participation. Participants will be encouraged to participate and collaborate before, during, and after the session for a full engagement experience. The audience will be asked to intermittently review certain elements of the presentation website (available here) as they discuss key elements with the participants next to them. See figure 1.1 for lesson plan details.

Building Better Nutrition Education Through Digital Collaboration Tools
Objectives

Session Goal: Introduce ways to incorporate digital collaboration tools into existing nutrition education lesson plans.

Learning Objectives: At the end of the session participants will:

  • a) Understand and/or review the importance of active learning (evidence-based practice)
  • b) Become familiarized with collaborative edtech tools
  • c) Engage with edtech tool selection criteria and best practices
  • d) Explore ways to incorporate digital tools in lesson plan scenarios
Performance Tasks

  • Participants will complete a self-assessment prior to the session
  • Participants will demonstrate understanding of active learning by submitting an informal Google Forms quiz during the session
  • Participants will engage with collaborative edtech tools by submitting responses during the session
  • Participants will identify their own digital tool needs by completing a case scenario
  • Participants will submit a self-reflection via Flipgrid after the session
Plan Outline

  • Session Introduction (5 mins)
    • Prompt and Participation: Padlet Q&A: Describe a time you attended a great education session. What made that session great?
    • Review of self-assessment (completed prior to session)
  • Importance of active learning: evidence-based practice (5-10 mins)
    • Review of evidence: Google Forms quiz (embedded in site)
  • How can digital tools help? (5-10 mins)
  • Choosing the right digital tool (10 mins)
    • Triple E Framework rubric
    • Criteria for choosing the right digital tool
  • Tips on incorporating tools into existing lesson plan (10 mins)
    • Video Tutorial (take home message/resource)
  • Active practice (10 mins)
    • Case scenarios: Flipgrid response
    • Flipgrid self-reflection
  • Questions (5 mins)

Total session length: 60 mins.

Figure 1.1 “Building Better Nutrition Education through Digital Tools” Session Lesson Plan.

Before the presentation, participants will be invited to complete a Google Forms self-assessment poll addressing their comfort and knowledge with technology tools as well as their current use of technology tools in practice. During the presentation, the audience will be prompted to participate in think-pair-share activities as well as respond to collaboration prompts on Padlet, Google Forms, and embedded websites. After the presentation, participants will be encouraged to summarize their learning by submitting a Flipgrid video.

Content knowledge needs. The session content begins by establishing the importance of active learning as evidence-based practice to meet objectives a) and b). Just as motivational interviewing and patient-centered practice are desirable in nutrition, active learning that invokes 21st century skills is evidence-based and an education standard. The content will then shift into teacher-focused how-tos for digital tools, including how digital tools can help, how to select the right digital tool, and how to incorporate that tool into an existing lesson plan, to address objectives c) and d). My assumption is that participants who are not comfortable with technology may be fearful or lack motivation to explore various tools. Group collaboration, modeling, and gentle encouragement through case studies may help mitigate these fears.

Teachers’ needs. While the majority of the session focuses on introductory content on active learning and digital tools, teachers’ needs in digital tool management can be addressed through coach/presenter modeling. Simple statements such as, “I created this Flipgrid video to serve as a model for students,” or “This Google Form was hyperlinked to gauge students’ understanding so far,” can serve as a basis for exploring class management and digital tool management within the limited time. The website itself offers an FAQ section exploring questions and misconceptions about active learning and digital tools. Beyond these resources, the audience will be introduced to technology coaching and may choose to consult a coach at their current institution.

In addition to modeling, three tutorial videos are available on the website to help teachers begin creating their own active learning lesson plans using the backwards design model. Each of the tutorials features closed captions created through TechSmith Relay for accessibility. Google Sites was also chosen because content is made automatically accessible to viewers; all the website creator has to do is include the appropriate heading styles and use alt text for pictures, figures, and graphs.

Lessons Learned through the Development Process

One of the major challenges in developing this project was understanding the needs of the target audience. Because nutrition informatics is relatively new, technology use has not been standardized in the profession, so estimating the audience’s prior knowledge and use of digital tools was difficult. My assumption is that technology use and attitudes about technology will vary, so the website attempts to break down information to a fairly basic level. The one assumption I did make was that the audience has a good background in standard nutrition education practices. I also chose to develop the technology self-assessment for the audience to complete prior to the session as a way to gain some insight into current technology use and comfort, so that I may better tailor the session to that particular audience’s needs.

I realized as I was developing the lesson plan for this session that I only have time for a brief introduction to these very important topics. If I were to create a more comprehensive professional development offering, I could expand the content into three one-hour sessions: 1) an introduction to the theory of collaborative learning, which would address the importance of digital tools in nutrition education and establish the need for active learning; 2) selecting, evaluating, and curating tech tools, allowing educators to become familiar with available tools based on individual need; and 3) lesson plan development integrating collaboration tools, a “how-to” session where participants create their own plan to implement. I had not anticipated that length would be a barrier; however, if the audience truly has limited digital familiarity and comfort, perhaps beginning with an introduction to these topics is sufficient.

One positive lesson I’ve learned is that trying new things, such as creating a Google Site, can be very rewarding. I had never experimented with Google Sites prior to this project, and I am quite happy with the final website, though the perfectionist in me wants to continue tweaking and editing content. I originally aimed to create slides for this presentation but realized that I am attempting to convince a possibly skeptical audience of the benefits of digital tools, so using the same old tool would not allow me the scope of modeling I desire.

I must admit that before this project, I had a hard time placing myself in the role of a “tech coach” because I would continually see each concept through the lens of an educator and how to apply the concepts to my own teaching. It has been difficult for me to take a step back and realize that I am still teaching, just in a different context. Creating the step-by-step tutorials was the turning point where I envisioned the audience modeling their lesson plans on the example I had given. I hope I have the opportunity to present this session at the educational conference and bring the ideals of active learning and digital tools to professionals working in various education settings.

The Connection between Digital Competence and Problem-Solving

The word “troubleshooting” most often invokes images of a conversation with the IT department: a progression of actions guided by the technician and performed by the user, ending with a resolution in which the user’s original knowledge of technology has not been augmented. Unfortunately, this is an all-too-common scenario. The user defaults all troubleshooting responsibility to a third party because of unfamiliarity with technology or a knowledge deficit. This is not limited to consumers and companies; there is concern that students also do not troubleshoot well. According to the ISTE coaching standard, coaches should help teachers and students “troubleshoot basic software, hardware, and connectivity problems common in digital learning environments” (ISTE, 2017). While calling IT or passing responsibility to another party, like a teacher, is common practice, learning to troubleshoot is a beneficial 21st century skill because it helps develop digital competence.

Why is digital competence important?

Like all 21st century skills, digital competence is a highly sought skill in the ever-evolving workforce. The e-magazine Training Industry published an industry-perspective article on digital competence that highlights the need for competence in the workforce from the top of the organization chart down. The author believes that the tech world today encompasses “VUCA”: volatility, uncertainty, complexity, and ambiguity. The role of those working in tech today should be to navigate this VUCA world seamlessly, and one way to do this is to reinforce digital competence (Newhouse, 2017). The industry definition of digital competence expands to include not only knowledge of technology but also understanding digital environments, effectively creating and consuming digital information, communicating and collaborating with diverse stakeholders, innovating rapidly, thinking critically and solving problems, and maintaining security (Newhouse, 2017). This definition was devised from new European Union definitions and involves five major facets summarized in figure 1.1 below.

Infographic on the 5 major facets of digital competence
Figure 1.1 Facets of Digital Competence

What role does “digital competence” play in helping students problem-solve and troubleshoot online/technology issues?

One issue that arises is the general assumption that because students grew up with technology, or are considered digital natives, they automatically build digital knowledge and know how to use technology well (Hatlevik et al., 2015). However, in order to use technology well, students need to build digital competence and literacy. According to researchers Hatlevik, Gudmundsdottir, and Loi, building digital competence is complex and involves various factors, as summarized in figure 1.2 below.

Infographic on the key elements for developing digital competence
Figure 1.2 Developing Digital Competence

The researchers recognize that these facets are essential to cultivating a deep understanding of technology while promoting critical reflection on, and creative use of, digital skills. These qualities in turn develop problem-solving skills in both independent and collaborative settings (Hatlevik et al., 2015).

Beyond knowledge deficits in how to perform troubleshooting tasks, researchers suggest that when demanding conditions, such as completing an assignment, become difficult, self-regulation and autonomy may suffer (Koole et al., 2012). These difficulties can be cognitive, motivational, or implemental, or a combination of these factors. While this theory is debated, meta-analyses indicate that low intrinsic value activities (such as homework) may lower complex problem-solving abilities such as those required by troubleshooting (Koole et al., 2012). Along with motivational issues, students may resign themselves to believing that there is only one correct path or resolution to a specific problem, with the educator as the gatekeeper of the solution. Rather than seeking the solution for themselves, students prefer to go straight to the source, which develops learned helplessness (Miller, 2015).

How can students develop digital competence?

Digital competence is a very complex concept that spans several social, motivational, personal, cultural, and technical understandings; therefore, there is no straightforward way to develop it. However, educators play a big role in establishing foundations for competence that may lead to better problem-solving and troubleshooting, in two major ways:

  1. Allowing for self-directed learning. A consensus exists that students need to reflect on their own learning (Miller, 2015; Plaza de la Hoz et al., 2015). The role of the educator then shifts to providing resources, including digital tools, that allow students to experiment through active participation and engagement.
  2. Changing class culture. The attitudes and beliefs of the educator signal the importance of digital competence to students. If the educator places low importance on digital competence, students learn not to value or develop these important skills. The educator can establish new beliefs, resources, and structures to promote a culture of answer-seeking through appropriate digital tools and tool use. Lastly, students must build self-efficacy through trial and error in a safe environment.

While researchers are still investigating efficient methods for developing these competences, all sources agree that in order for students to be successful in the 21st century, educators must open up the path to new technologies, new pedagogies, and new attitudes that help build digital competence (Miller, 2015; Plaza de la Hoz et al., 2015).

Resources

Hatlevik, O.E., Gudmundsdottir, G.B., Loi, M. (2015). Digital diversity among upper secondary students: A multilevel analysis of the relationship between cultural capital, self-efficacy, strategic use of information, and digital competence. Computers & Education. 81: 345-353. Available from: https://drive.google.com/file/d/0B5W5P9bQJ6q0RFNib3A5Vm9wWWM/view

ISTE, (2017). ISTE standards for coaches. Available from: https://www.iste.org/standards/for-coaches

Koole, S.L., Jostmann, N.B., Baumann, N. (2012). Do demanding conditions help or hurt self-regulation? Available from: https://drive.google.com/file/d/0B5W5P9bQJ6q0M0QzalRBa0FfTXM/view

Miller, A. (2015, May 11). Avoiding learned helplessness. Available from: https://www.edutopia.org/blog/avoiding-learned-helplessness-andrew-miller

Newhouse, B. (2017). Closing the digital competence gap. Available from: https://trainingindustry.com/magazine/issue/closing-the-digital-competence-gap/

Plaza de la Hoz, J., Mediavilla, D.M., Garcia-Gutierrez, J. (2015). How do teachers develop digital competence in their students? Appropriations, problematics, and perspectives. Available from: https://www.researchgate.net/publication/301914474_How_do_teachers_develop_Digital_Competence_in_their_students_Appropriations_problematics_and_perspectives

Developing Evaluation Criteria for EdTech Tools

Digital tools in the classroom are an asset to learning. According to the U.S. Department of Education, technology in the classroom ushers in a new wave of teaching and learning that can enhance productivity, accelerate learning, increase student engagement and motivation, and build 21st century skills (U.S. Department of Education, n.d.). The offerings of technology tools for the classroom are plentiful as priorities shift to support a more integrated education. Educators now have several options for cultivating digital tools to better engage students, promote active learning, and personalize instruction. But choosing the right tools can be challenging, especially considering that educators face a seemingly overwhelming array of options. How can educators filter through all of the options to select the best tool(s) for their classroom?

Enlisting the help of a technology coach who can systematically break down the selection process to ensure that the most appropriate tools are used is part of the solution. In keeping with best practices, the third ISTE standard for coaching (3b) states that in order for tech coaches to support effective digital learning environments, coaches should manage and maintain a wide array of tools and resources for teachers (ISTE, 2017). In order to cultivate those resources, coaches themselves need a reliable way to select, evaluate, and curate successful options. Much like an educator may use a rubric or standards to assess an assignment’s quality, coaches can develop specific criteria (even a rubric) to assess the quality of technology tools.

Tanner Higgin of Common Sense Education understands the barrage of edtech tools and the need for reliable tech resources, which is why he published an article describing what makes a good edtech tool great. The article seems to be written more from a developer’s point of view on app “must-haves”; however, Higgin also references a rubric used by Common Sense Education to evaluate education technology. He mentions that very few tech tools reviewed receive a 5 out of 5 rating, which makes me assume that Common Sense Education has a rigorous review system in place. I was curious to learn what criteria they use to rate and review each tool, so I investigated their rating process. In the about section of their website, Common Sense Education mentions a 15-point rubric, which they do not share. They do share, however, the key elements included in their rubric: engagement, pedagogy, and support (Common Sense Education, n.d.). They also share information about the reviewers and how they decide which tools to review. This information serves as a great jumping-off point for developing criteria for selecting, evaluating, and curating digital tools. Understanding the thought process of an organization that dedicates its time and resources to this exact purpose is useful for tech coaches developing their own criteria.

Continuing the search for technology tool evaluation criteria led me to several education leaders who share their process through various blog posts and articles. Reading through their criteria suggestions, a common theme started to emerge. Most of the suggested criteria fit under the umbrella terms defined by Common Sense, with a few modifications, which are synthesized in figure 1.1 below.

Infographic with suggestions on evaluation criteria
Figure 1.1 Digital Tool Evaluation Criteria Suggestions

There is consensus among the educational leaders, who placed emphasis on a tool’s engagement and collaboration features. Tod Johnston of Clarity Innovations noted that a good tech tool should allow for personalization or differentiation of the learning process while also allowing the instructor to modify the content as needed for each class (Johnston, 2015). ISTE author Liz Kolb added that tools that allow for scaffolding better support differentiation (Kolb, 2016). Both the Edutopia and ISTE authors agreed that the sociability and shareability of a platform are important for engaging students with wider audiences (Hertz, 2010; Kolb, 2016).

While engagement was a key element in selecting a tech tool for the classroom, even more important was how the tool fared in the realm of pedagogy: first and foremost, the technology needs to play a role in meeting learning goals and objectives (Hertz, 2010). Second, the tool should allow for instructional best practices, including appropriate methods for modeling and instruction with the device, and functionality for providing student feedback (Hertz, 2010; Johnston, 2015). Another pedagogical consideration is the ability of the platform to instill higher-level thinking rather than “skill and drill” learning (Kolb, 2016). Specific pedagogy rubrics such as the SAMR and Triple E Framework models have been created and can be used in conjunction with these principles.

Support and usability rounded out the top concerns for evaluating these tools. Cost, and which desired features sit behind a price premium, was among these concerns, particularly when students needed to create an account or provide an email address (Hertz, 2010). Hertz called this issue free vs. “freemium,” meaning that some apps only allow access to limited functionality of the platform, while full functionality can only be unlocked by purchasing premium packages. If the platform is free, the presence of ads needs to be assessed (Hertz, 2010). In terms of usability, features such as an easy interface, instructor management of student engagement, and separate teacher/student accounts were desirable (Johnston, 2015). Along with cost and usability, app reliability and compatibility with existing technology were also listed as important features (Johnston, 2015).

The evaluation process itself varied from curated lists of the top tech tools, to criteria suggestions, to completed rubrics. If those don’t quite fit a specific evaluation process, a unique approach would be to convert the rubric into a schematic like the one shared by Denver Public Schools, where each key evaluation element is presented as a “yes” or “no” question with a “yes, then” or “no, then” response, following a clear, decisive trajectory toward approval or rejection.
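To make the schematic idea concrete, here is a minimal sketch of how such a decision chain behaves: each criterion is a yes/no question, and the first “no” ends the evaluation. The criteria below are hypothetical stand-ins drawn from the themes discussed above, not the actual Denver Public Schools questions.

```python
# A minimal sketch of a "yes/no, then" evaluation schematic.
# The criteria are hypothetical examples, not Denver Public Schools' own.

CRITERIA = [
    "Does the tool support the lesson's learning objectives?",
    "Does it meet student data privacy requirements?",
    "Is it compatible with existing classroom technology?",
    "Is the cost (or 'freemium' limitation) acceptable?",
]

def evaluate_tool(tool_name: str, answers: list[bool]) -> str:
    """Walk the checklist in order; the first 'no' ends in rejection."""
    for question, answer in zip(CRITERIA, answers):
        if not answer:
            return f"{tool_name}: rejected ('{question}' -> no)"
    return f"{tool_name}: approved for classroom use"

print(evaluate_tool("Tool A", [True, True, True, True]))
print(evaluate_tool("Tool B", [True, False, True, True]))
```

The early-exit structure mirrors the schematic’s decisive trajectory: a single failed criterion ends the evaluation rather than averaging into an overall score.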

What I’ve learned through the exploratory process of developing evaluation criteria for tech tools is that it is not necessary for a tool to meet every single criterion. Even the educational and tech experts reviewed in this blog emphasized different things in their criteria. In his blog, Tod Johnston suggests that there is no right or wrong way to evaluate technology tools because this isn’t a cookie-cutter process. Just as all teachers have a different style and approach to teaching, so too will their style and approach to using tech tools differ. The key to evaluating tools is to find the one that best fits the teacher’s needs (Johnston, 2015).

Resources

Common Sense Education, (n.d.). How we rate and review. Available from: https://www.commonsense.org/education/how-we-rate-and-review

Hertz, M.B., (2010). Which technology tool do I choose? Available from: https://www.edutopia.org/blog/best-tech-tools

ISTE, (2017). ISTE standards for coaches. Available from: https://www.iste.org/standards/for-coaches

Kolb, L., (2016, December 20). 4 tips for choosing the right edtech tools for learning. Available from: https://www.iste.org/explore/articleDetail?articleid=870&category=Toolbox

Johnston, T. (2015). Choosing the right classroom tools. Available from: https://www.clarity-innovations.com/blog/tjohnston/choosing-right-classroom-tools

Vincent, T. (2012). Ways to evaluate educational apps. Available from: https://learninginhand.com/blog/ways-to-evaluate-educational-apps.html

U.S. Department of Education, (n.d.). Use of technology in teaching and learning. Available from: https://www.ed.gov/oii-news/use-technology-teaching-and-learning

Instructional Coaching: Using Rubrics to Quantify Qualitative Data for Improved Teaching Outcomes

Feedback can be a powerful tool to improve teaching and learning. Through feedback, new perspectives can be gained as teachers begin to discern what is working and what isn’t in their current instructional methods. Feedback also offers suggestions on achieving the goals and standards that drive an educator’s work. There are four different types of feedback: formative, summative, confirmative, and predictive. Formative feedback occurs before an intervention takes place, such as giving students feedback on an assignment where the feedback does not impact the final grade. I explore the benefits of formative feedback in this post. Summative feedback occurs after an intervention, such as when students turn in an assessment and the feedback provided relates to the grade outcome (Becker, 2016). Predictive feedback occurs before any instruction has taken place, to ensure that the method will be effective, while confirmative feedback occurs well after summative feedback, to ensure that the methods are still effective (Becker, 2016). Of the four types, formative and summative feedback are the most widely used evaluations in educational institutions.

At the end of each quarter, two types of summative evaluation are collected for each of the classes I’ve taught: quantitative and qualitative data assessing my performance as a professor and the course outcomes. The quantitative portion uses a Likert scale ranging from 1 = strongly disagree to 5 = strongly agree, while at the bottom of the evaluation form there is a section where students can provide comments, intended to give constructive feedback for classroom improvement. While the comments are not always written constructively (I am addressing this through a mini-module students are required to complete in all of my classes), it is mainly the common themes that present themselves across evaluations that are powerful influencers for improving my classes. However, what I’ve learned is that most of the time the summative feedback simply comes too late to improve the current students’ experience, because an issue can’t be addressed until the next time the course is offered. As a technology and instructional coach, helping other educators improve their teaching outcomes would require more timely feedback that utilizes both quantitative and qualitative assessment measures. While most learning management system (LMS) platforms offer a multitude of analytics quantifying data such as exam scores, class averages for assignments, and average engagement time on the platform, there isn’t an explicit way to collect or quantify qualitative data.
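The asymmetry is easy to see in a toy example. In the sketch below (hypothetical data, not an actual LMS export), the Likert items summarize themselves with basic arithmetic, while the comments have no built-in numeric summary:

```python
# Hypothetical course-evaluation data: Likert items are trivially
# quantifiable; free-text comments are not.
from statistics import mean

# 1 = strongly disagree ... 5 = strongly agree
likert_items = {
    "The instructor communicated clearly":  [5, 4, 4, 3, 5],
    "Course activities supported learning": [4, 4, 2, 3, 4],
}
comments = [
    "More practice cases, please.",
    "Loved the Padlet discussions!",
]

for item, scores in likert_items.items():
    print(f"{item}: mean {mean(scores):.2f} (n={len(scores)})")

# No equivalent one-liner exists for the comments; quantifying them
# requires a coding scheme or rubric, as discussed below.
print(f"Comments collected: {len(comments)} (unquantified)")
```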

The ISTE standard for coaching states that coaches should “coach teachers in and model effective use of tools and resources to systematically collect and analyze student achievement data, interpret results, and communicate findings to improve instructional practice and maximize student learning” (ISTE, 2017). If an LMS can collect quantitative data that can be assessed throughout the quarter (through summative feedback), could it also be used to quantify qualitative data (i.e., comments) for improved teaching outcomes? To answer this question, I’d like to address it in two ways: 1) establish an understanding of the value and importance of self-reflection in assessment, and 2) address how rubrics can help quantify qualitative data.

Importance of self-reflection. Self-reflection can give several insights into the effectiveness of teaching. According to the Virginia Journal of Education, self-reflection is a method to support current strengths and identify areas of improvement, including continuing education or professional development needs. Educators may seek out self-reflection in order to review past activities, define issues that arise throughout the quarter or semester, understand how students are learning, modify a class due to unexpected circumstances, or address whether or not the teacher’s expectations have been met. Overall, self-reflection improves teacher quality (Hindman & Stronge, n.d.).

Educators may sometimes rely on emotions when deciding whether or not an element worked well in the classroom. However, without context to justify that decision, emotions are not a clear indicator of outcomes. Self-reflection puts a process in place through which educators can collect, analyze, and interpret specific classroom outcomes (Cox, n.d.). Though there are various ways to perform self-reflection (see Figure 1.1), the most effective outcomes come from ensuring the process has been completed thoroughly.

Figure on Cox's Types of Self-Reflection
Figure 1.1 Cox’s Types of Self-Reflection.

For an instructional coach, following the proper self-reflection steps would be a great way to begin the discussion with someone wanting to improve their teaching. An instructional coach would help the educator:

  • Understand their outcome goals,
  • Choose the data collection/reflection method best suited to meet these goals,
  • Analyze the data together to identify needs,
  • Develop implementation strategies to address needs.

Because the process is general, it can be modified and applied to various learning institutions. Drawing on my coaching background as a dietitian, and similar to assessing my clients’ needs for change, I would also include questions about perceived barriers to implementing change. These questions would include a discussion of any materials or equipment the educator deems necessary but that may be difficult to obtain or may require new skill sets to use fully.

Using rubrics to quantify qualitative data. Part of self-assessment includes using rubrics, in addition to analyzing data, goal setting, and reflection. According to the Utah Education Association (UEA), using a rubric helps address the question “What do I need to reach my goals?” (UEA, n.d.). Rubrics present expected outcomes and expected performance, both qualitative qualities, in quantifiable terms. Good rubrics should include appropriate criteria that are definable, observable, and complete, and should span a continuum of quality (UEA, n.d.).

If rubrics help quantify qualitative data, then how can rubrics assess reflection? DePaul University tackled that very question, and its response asked still more questions: What is the purpose of the reflection? Will the assessment process promote reflection? How will reflection be judged or assessed? (DePaul, n.d.). Educational leader Lana Danielson remarks on the importance of reflective thinking and how technological, situational, deliberate, or dialectical thinking can influence teaching outcomes. Poor reflective outcomes, according to Danielson, result from teachers not understanding why they do the things they do; great teachers are those who know what needs to change and can identify the reasons why (Danielson, 2009). Figure 1.2 describes the four types of reflective thinking in more detail.

Infographic on the four modes of reflective thinking
Figure 1.2 Grimmett’s Model of the Four Modes of Reflective Thinking

Developing rubrics based on the various types of reflective thinking will help quantify expectations and performances to frame improvement. The only issue with this model is that it is more diagnostic than quantifiable. A more specific rubric model, developed by Ash and Clayton in 2004, involves an eight-step prescriptive process including:

  • Identifying and analyzing the experience,
  • Identifying, articulating, and analyzing learning,
  • Undertaking new learning experiences based on reflection outcomes (DePaul, n.d.).

The Ash/Clayton model involves developing and refining a rubric based on learning categories related to goals.  All of the qualities related to the learning categories are defined and refined at each stage of the reflection process. More information on the eight-step process can be found here.

Regardless of the reflection assessment model used, coaches can capture enough criteria to create and use rubrics as part of a self-reflection process that can help improve teaching outcomes through new awareness and through identifying learning needs that may block improvement. Most LMS platforms support rubrics as part of assessment in various capacities (some only support rubrics on designated “assignments” but not on features like “discussions,” for example). Each criterion includes quality indicators, each associated with a number, making the qualitative data quantifiable, much the way “coding” in qualitative research allows for quantifiable results. New rubric features allow for a range of quality points on common criteria and for freeform responses, opening the possibility of adapting rubrics to the various reflection types. Because of these new functionalities and the myriad uses of rubrics in LMSs today, creating a good-quality rubric is now the only obstacle to implementing rubrics for self-reflection.
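As a minimal sketch of that quantification step (the criteria, labels, and point values below are hypothetical, loosely echoing the Ash/Clayton phases rather than drawn from any particular LMS), a rubric can be modeled as criteria whose quality indicators map to numbers, so a set of qualitative ratings rolls up into a score:

```python
# Hypothetical reflection rubric: each criterion maps quality
# indicators (qualitative labels) to points, mirroring how "coding"
# in qualitative research makes themes countable.

RUBRIC = {
    "Identifies and analyzes the experience": {"missing": 0, "partial": 1, "thorough": 2},
    "Articulates what was learned":           {"missing": 0, "partial": 1, "thorough": 2},
    "Plans new learning experiences":         {"missing": 0, "partial": 1, "thorough": 2},
}

def score_reflection(ratings: dict[str, str]) -> tuple[int, int]:
    """Convert qualitative ratings into points earned out of the maximum."""
    earned = sum(RUBRIC[criterion][label] for criterion, label in ratings.items())
    maximum = sum(max(levels.values()) for levels in RUBRIC.values())
    return earned, maximum

earned, maximum = score_reflection({
    "Identifies and analyzes the experience": "thorough",
    "Articulates what was learned":           "partial",
    "Plans new learning experiences":         "missing",
})
print(f"Reflection score: {earned}/{maximum}")  # -> Reflection score: 3/6
```

Once the labels carry numbers, the LMS can aggregate reflections the same way it already aggregates exam scores, which is the timeliness gain described above.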

References

Becker, K. (2016, August 29). Formative vs. summative vs. confirmative vs. predictive evaluation. Retrieved from: http://minkhollow.ca/beckerblog/2016/08/29/formative-vs-summative-vs-confirmative-vs-predictive-evaluation/

Cox, J. (n.d.). Teaching strategies: The value of self-reflection. Retrieved from: http://www.teachhub.com/teaching-strategies-value-self-reflection

Danielson, L. (2009). Fostering reflection. Educational Leadership. 66 (5)  [electronic copy]. Retrieved from: http://www.ascd.org/publications/educational-leadership/feb09/vol66/num05/Fostering-Reflection.aspx

DePaul University, (n.d.). Assessing reflection. Retrieved from: https://resources.depaul.edu/teaching-commons/teaching-guides/feedback-grading/Pages/assessing-reflection.aspx

Hindman, J.L., Stronge, J.H. (n.d.). Reflecting on teaching: Examining your practice is one of the best ways to improve it. Retrieved from: http://www.veanea.org/home/1327.htm

ISTE, (2017). ISTE standards for coaches. Retrieved from: https://www.iste.org/standards/for-coaches

Utah Education Association, (n.d.). Self-assessment: Rubrics, goal setting, and reflection. [Presenter’s notes]. Retrieved from: http://myuea.org/sites/utahedu/Uploads/files/Teaching%20and%20Learning/Assessment_Literacy/SelfAssessment/Presenter%20Notes_Self-Assessment_Rubrics_Goal_Setting.pdf

Implementing Student-Centered Activities in Content-Intensive Courses

If you’ve ever taught a content-intensive course, you’ll know it’s like trying to finish a marathon at a sprint. In my experience, you get to the finish line, but you hardly remember the journey there. The content-intensive courses I teach are the foundational nutrition classes. Each contains at least six major learning objectives, each with about two sub-objectives, and is designed to cover upwards of fifteen chapters of material in a ten-week quarter system. The predominant approach of faculty to these types of classes is to go broad, not deep, in learning and understanding. I must admit this has been my approach as well, for fear of missing one of the learning objectives or sub-objectives. While my students tell me that the courses are interesting and engaging, I can’t help wondering whether they will actually remember any content from the course, or whether they feel as if their brains have been put through a blender by spring break. Is the learning authentic, or are they just learning for the sake of memorization, to pass the final exam?

The ISTE Standards for Educators charge instructors with “design[ing] authentic, learner-driven activities and environments that recognize and accommodate learner variability” (ISTE, 2017). If instructors truly wish to design their courses using evidence-based practices, the focus needs to shift from covering material to student learning, without compromising the learning objectives. ISTE educator standard 5b implies that technology can help marry the two concerns: “design authentic learning activities that align with content area standards and use digital tools and resources to maximize active, deep learning” (ISTE, 2017). This standard is best illustrated by the “genius hour” concept developed by Nichole Carter in pursuit of a personalized learning environment for her students. The idea is brilliant: allow students one opportunity a week (or as time allows) to dive deep into a topic they are interested in and demonstrate their learning through an artifact or digital presentation. The implementation of genius hour follows a six-component design model that highlights new roles and responsibilities for teachers and students alike (Carter, 2014). See figure 1.1 for more information on the six-component personalized learning design.

Infographic highlighting 6 essentials for personalized learning.
Figure 1.1 Nichole Carter’s Personalized Learning Essentials.

When implemented well, intrinsic motivation for learning soars, students engage with the material, and teachers can meet those ever-important learning objectives without feeling as if they are just shoveling material into students’ brains (Carter, 2014). It seems like a win-win. However, thinking back on my content-intensive courses, I wondered: how can student-centered activities (like genius hour) be implemented in these types of courses?

As a starting place for answering my question, I revisited Kathleen McClaskey’s continuum of choice. I find it interesting that, in developing student-centered learning and activities, it ultimately comes down to how much control the teacher is willing to let go of and how much “choice” is open to the students. In traditional content-intensive courses, the teacher has all of the control, in what McClaskey would classify as teacher-centered (McClaskey, 2005). He or she creates lectures that revolve around a specific chapter in a textbook, then lectures to ensure the material is covered. Students, in this model, sit and observe the lecturer in hopes of absorbing some of the material (or, in most cases, cramming the information into their brains the night before the exam) while never actually deeply engaging with it. McClaskey’s continuum of choice suggests that some activities can still be controlled while giving students some freedom to explore topics of their own choosing, i.e., the participant and co-designer models (McClaskey, 2005).

Diagram of the Continuum of Choice.
Figure 1.2 McClaskey’s Continuum of Choice. (Continuum of Choice TM by Barbara Bray and Kathleen McClaskey is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.)

The challenge with the more student-centered models on McClaskey’s continuum, such as the designer or advocate, is that they require time, a luxury often not afforded in content-intensive courses, and they do not address how to implement each model topic by topic. Despite these concerns, however, I am beginning to realize that in order to allow for more intrinsic and authentic learning, I need to let go of the desire to control all aspects of content-intensive courses and shift my focus to what is really important: student learning.

Many of the resources, similar to McClaskey’s, mention explicit instruction as part of a student-centered classroom. Explicit instruction provides “effective, meaningful, direct teaching…where students are active participants in the learning process” (Shasta County, 2009). Creating an explicit learning lesson involves six guiding principles. 1) The instructor begins the class by setting the stage for learning; the learning objectives are clear, and students understand their responsibility for their own learning. 2) This is followed by a clear, simple, and direct explanation of what the daily task is, why it is important, and how best to complete it. Students appreciate when tasks are broken down into smaller, logical steps. 3) The instructor models the process, including their thought process, using visuals. This is important because simply explaining a concept does not mean students will understand it or know what to do. 4) Before diving into the assignment on their own, students are given a guided activity through which the instructor assesses the readiness of the class. 5) Once the concept has been mastered, students take on the task independently. 6) After the task has been completed, students are given an option for informal or formal reflection, and the artifact is collected and compared against the learning objectives (Shasta County, 2009). Figure 1.3 provides a reference guide for these steps.

Infographic on explicit learning
Figure 1.3 Explicit Learning Reference Guide

According to the Shasta County Curriculum Lead, explicit learning is best used when there is a “well-defined body of information or skills students must master,” especially when other models, such as inquiry-based or project-based learning, cannot be successfully implemented (Shasta County, 2009). The role of the teacher is more directed and specific, allowing students more insight into, and practice of, the skills they are learning. What I like about explicit learning is that classroom activities do not have to be modified completely; the modification occurs in how the material is presented and practiced. Students can appreciate this model because they engage in active learning while still having guidance and support from the teacher via modeling.

Through explicit learning, even content-intensive courses can have a deeper and more meaningful impact on learning. I had one class in particular in mind when considering the explicit/personalized learning approach. I teach a not-so-introductory nutrition class designed to meet the needs of allied health students. All allied health students are required to take at least one nutrition class as part of their career training, and for many, this will be the only nutrition class they ever take. The pressure is high in terms of delivering content, as it is very likely they will not revisit this material anywhere else. While I can’t change the fact that they need to explore the chemical composition and processing of nutrients in the body, I can influence how they engage with the health effects of, and recommendations for, these nutrients, which are ever-changing anyway. Using the personalized learning and explicit learning models, I could allot one class period a week for exploring the health effects and recommendations around whatever condition, trend, or issue they wished to investigate. Like genius hour, the students could work together to investigate and create a digital artifact of their choosing that best presents their topic; lastly, to further promote collaboration, they could provide feedback to one another on their topics. The students would be learning through co-learning, gaining a stronger and deeper interest in the subject matter, proving that content-intensive courses can also be student-centered.

Resources

Carter, N. (2014, August 4). Genius hour and the 6 essentials of personalized education. Retrieved from http://www.edutopia.org/blog/genius-hour-essentials-personalized-education-nichole-carter

International Society for Technology in Education, (2017).  The ISTE standards for educators. Retrieved from: https://www.iste.org/standards/for-educators.

McClaskey, K. (2005, November 5). Continuum of choice- More than a menu of options. Retrieved from http://kathleenmcclaskey.com/choice/

Shasta County Curriculum Lead, (2009).  What is direct/explicit learning [Word doc]. Retrieved from http://www.shastacoe.org/uploaded/dept/is/district_support/explicit_instruction_may_2009.doc

Professional Development: Improving Digital Literacy through Peer Modeling

It shouldn’t be a surprise that experts support incorporating technology into new and existing learning models to facilitate deeper and different skill sets than those taught by conventional methods today. The biggest push for more technology adoption in education is to move the educational system away from antiquated models developed during the industrial revolution to a system that reflects today’s society and workplace. I particularly enjoy Sir Ken Robinson’s argument for changing the education system: we are living in an era where we are trying to meet the needs of the future with old methods designed for a different society than the one we live in now (RSA, 2010). Robinson stresses that we need to adopt new models that redefine the idea of “academic” versus “non-academic” and accept differences in thinking about what it means to be “educated.” Part of the reason for this push is that today’s children are exposed to information stimuli that capture attention and change learning needs (RSA, 2010).

Incorporating 21st century skills requires the introduction, implementation, and use of technology at all levels of education. Considering the importance of developing these skills, it is also important to understand the reasons behind creating a paradigm shift, particularly as we prepare students for the real world in higher education. The New Media Consortium (NMC) published a report looking into the key trends that would promote and accelerate technology adoption in higher education. NMC identified these trends and classified them by the length of time needed for implementation as well as by difficulty (NMC, 2017). Figure 1.1 summarizes the six key trends for technology adoption.

Infographic summarizing the key trends for accelerating technology adoption from NMC
Figure 1.1 NMC’s Key Trends for Accelerating Technology Adoption

What’s interesting to note about the trends above is that they focus not only on types of technology, or ways that technology is used in the classroom, but also on important skill sets and new ways of thinking that elevate technology use to a different, more meaningful level.

Because the primary responsibility of a higher education institution should be to prepare students for the real world, understanding the technology implications behind each of these trends calls on us, the professors, to reevaluate our technology use in the classroom. Despite these conversations about the need for technology adoption in higher education, several challenges continue to slow the rate of adoption. NMC summarized six key challenges that significantly impede the progress of the aforementioned trends. The challenges range from “solvable,” meaning the problem is well understood and solutions exist, to “wicked,” where the challenges involve societal change or a dramatic restructuring of thinking or existing models, and solutions cannot be identified in the near future (NMC, 2017). Figure 1.2 describes these challenges in more depth.

Infographic on the six challenges to technology adoption by NMC
Figure 1.2 Summary of the Six Challenges to Technology Adoption.

While experts look into the challenges that require more investigation and impact assessment, I’d like to focus on one of the solvable challenges: digital literacy. Digital literacy has a broad definition that encompasses the set of skills that “…fit the individual for living, learning, and working in a digital society” (JISC, 2014). While mostly thought of as the ability to use different types of technology, the definition expands to include a deeper understanding of the digital environment (NMC, 2017). Successful components of digital literacy include accessing, managing, evaluating, integrating, creating, and communicating information in all aspects of life (UNESCO, 2011). The UNESCO Institute for Information Technology in Education argues that digital literacy is a basic skill that is equally as important as learning to read, write, and do math (UNESCO, 2011). Interestingly, when students are taught digital literacy and are allowed to use technology in learning, they grasp math and science more readily and easily than students without this skill (UNESCO, 2011).

While it is clear that digital literacy is an important skill, a departmental assessment conducted for another class identified digital literacy as one of the biggest impediments to adopting technology. Faculty were adopting technology only in response to industry need. Many professors were eager to learn but unsure how to start using new technology, while others simply did not see value in spending the time and energy to implement new learning methods. Among the biggest barriers explored were time, knowledge deficits, and a lack of professional development on digital literacy. Improving digital literacy will therefore be crucial to promoting more tech adoption in the classroom. Professional development would need to include a conversation about what literacy looks like for each discipline, and should cover not only online etiquette, digital rights and responsibilities, and curriculum design built around student-facing services, but also the incorporation of the right technology for each context (NMC, 2017).

The ISTE standard for educators (2c) states that modeling is the “identification, exploration, evaluation, curation and adoption of new digital resources and tools for learning” that can be used in professional development (ISTE, 2017). So what are effective methods for modeling and facilitating good digital literacy as part of formal or informal faculty development?

Peer modeling has been suggested as an alternative to traditional professional development or in-service training. Among the reasons for peer modeling’s success is the fact that it is personalizable and actionable. Faculty can choose the digital literacy topics they are personally interested in and receive one-on-one training related to their knowledge gaps and needs while getting hands-on application (Samek et al., 2016). George Fox University piloted a peer modeling project after reviewing key data from a digital fluency mentorship program that utilized tech solutions and the pedagogy to support tech use. The program was initially developed to address faculty desire for one-on-one training. From faculty feedback surveys, the program developers learned that faculty are more likely to adopt a tech solution if they see it in action (actionable examples) and are given evidence of positive student learning outcomes. Due to the success of the program, the university has expanded its efforts to other collaborative development (Samek et al., 2016).

Learning from George Fox’s example, universities could build resources to offer similar professional development on digital literacy to improve technology adoption. What I particularly like about this idea is that it offers a different way to look at professional development: the mentor can begin as the expert, but the relationship can later transition into a co-learning model to increase ownership of, and interest in, technology adoption. This model goes beyond typical professional development to focus on the real-time needs of each faculty member and to work on existing classroom components. Above all, peer modeling improves digital literacy, increasing technology adoption and further developing the 21st century skills of students and teachers alike.

References

JISC, (2014, Dec. 16). Developing digital literacies. [website]. Available from: https://www.jisc.ac.uk/guides/developing-digital-literacies

New Media Consortium, (2017). Horizon report: 2017 Higher Education. [pdf].  Available from: http://cdn.nmc.org/media/2017-nmc-horizon-report-he-EN.pdf

RSA, (2010, Oct 14). Changing educational paradigms [YouTube video]. Available from: https://www.youtube.com/watch?v=zDZFcDGpL4U

Samek, L., Ashford, R.M., Doherty, G., Espinor, D., Barardi, A.A., (2016). A peer training model to promote digital fluency among university faculty: Program component and initial efficacy data. Faculty Publications, School of Education. Paper 144.  Available from: http://digitalcommons.georgefox.edu/cgi/viewcontent.cgi?article=1143&context=soe_faculty

UNESCO Institute for Information Technology in Education, (2011, May). Policy brief. [pdf]. Available from: http://unesdoc.unesco.org/images/0021/002144/214485e.pdf

Co-learning, Co-teaching, and Cogenerative Dialogues to Improve Learning and Teaching Outcomes

What happens when you allow two people with seemingly different backgrounds to work together? Great collaboration! This was true of a program co-sponsored by the Center for Educational Equity and Big Brothers/Big Sisters that paired 9- to 14-year-old girls with adult women to learn about computers. The little and big sisters would meet to solve computer problems through a software program called SISCOM (Wolman, 1986). Together they would dive deep into discussion, take turns leading and learning, and help each other problem-solve through a process that provided 20 hours of computer basics instruction (Wolman, 1986). Not only did the pairs work together to solve their shared problems, but the institutions worked together to provide the necessary resources. This story highlights the successes of co-learning.

Traditional learning environments are generally set up to rely on one “expert” or teacher to lead, with the remaining participants as the learners. The teacher chooses what material to cover and to what extent the participants engage with the material. While this system works on a surface level, one of its major problems emerges when teacher and students fail to connect: “…when teachers and students do not interact successfully, contradictions occur” (Tobin & Roth, 2005). This leads to negative emotions that can manifest as disinterest, disappointment, and frustration for the students, and job dissatisfaction for the teachers (Tobin & Roth, 2005). According to Rheingold, one of the appeals of co-learning is that it levels out the hierarchy of the classroom. When Rheingold engages in co-learning, he has everyone sit in a circle, because then everyone is visible and everyone has an equal voice (Rheingold, 2018). Co-learning assumes that the teacher is neither the gatekeeper nor the expert in all subjects, and that all participants have something valuable to share and teach about a given concept. Just as in the Big Brother/Big Sister example above, neither the little nor the big sister had an advantage in the learning and teaching of the SISCOM program. Both partners took equal interest in, and placed equal value on, what the other knew, shared, and did. This flattened hierarchy increased motivation, engagement, and excitement about learning and teaching, thereby improving learning outcomes and attitudes toward learning (Tobin, 2014).

A natural extension of co-learning is co-teaching. While co-learning gives all participants an equal voice in learning together, co-teaching takes this a step further by inviting participants to engage in all phases of the teaching process (Tobin & Roth, 2005). When implemented, co-teaching occurs between two or more teachers, where one teacher may take on a mentor role. The most important factor of co-teaching is that it is not a mere division of tasks; rather, the teachers participate in the creation of all tasks. Because some of the learning that occurs is subconscious, following through on the full process of co-teaching is important (Tobin & Roth, 2005).

Figure 1.1 Co-Teaching Summary

I’d also like to make brief mention of cogenerative dialogues. Tobin defines cogenerative dialogues as a side component of co-teaching, though they may also be used separately. Cogenerative dialogues involve small groups of about five individuals representing different stakeholders (or demographics) who discuss specific incidents in class, including reflections on lessons (Tobin, 2014). Initially, these discussions can explore what works and what doesn’t in class lessons, but they can also be expanded to the roles of students and teachers, classroom rules, and how to use resources (Tobin, 2014). The benefit of these independent discussions is that all views and understandings are valued and all explanations are co-generated. They help ease communication across cultural and socioeconomic boundaries by identifying (and acting upon) contradictions, later improving the quality of teaching and learning (Tobin & Roth, 2005).

Figure 1.2 Summary of Cogenerative Dialogue Theory

Despite the benefits of co-learning, several barriers should be addressed. Rheingold hypothesizes that teachers may be averse to adopting co-learning because of the high level of trial and error that goes along with it (Rheingold, 2018). Teachers must give up a certain level of control and understand that outcomes will vary from classroom to classroom. While Rheingold is sympathetic to these barriers, he argues that trial and error also offers real-time modeling of problem solving and troubleshooting. The key is to show students how to reflect upon a problem, re-examine it, and adjust to the situation as necessary (Rheingold, 2018).

Co-learning with a tech twist. The ISTE standard for educators (4b in particular) indicates that teachers “collaborate and co-learn with students to discover and use new digital resources and diagnose and troubleshoot technology issues” (ISTE, 2017). In short, the standard places importance on the principles of co-learning addressed by Tobin and Roth, in addition to the modeling Rheingold stresses as a key factor, by focusing on how technology can foster collaboration while improving troubleshooting skills. I had a particular problem in mind when I chose to explore this component of ISTE standard 4. In my human nutrition class, students conduct a dietary analysis of their own diet. The main features of this assignment are that students must accurately track their intake over the course of three days, input the data into an analysis program, and then analyze the findings in comparison to the Dietary Guidelines for Americans. The analysis program I had selected for this assignment, SuperTracker (https://www.supertracker.usda.gov/), will be discontinued at the end of this academic year for undisclosed reasons. While the program was not without its faults, I supported the use of SuperTracker because it was free, easily accessible to anyone with internet access, and built on the USDA database, an accurate and reliable set of nutrition data. I am now facing the challenge of reviewing apps and websites for SuperTracker’s replacement. However, the assignment would take on a whole new meaning for students if they were allowed to co-learn from the start to the finish of this project. For this project idea to be successful, it is important to consider how nutrition-related apps can be leveraged to facilitate co-learning among students and professors regarding modes of nutrition education.

Addressing the ISTE Standard. As I started my search of nutrition-related apps and their feasibility for co-learning, I determined that the credibility of app information should be a top priority. One of the challenges my students face is finding credible information to further their understanding. For as long as I’ve been a professor, we’ve looked at articles and websites and discussed the importance of reviewing them for credibility. However, information is now found in a variety of mediums not limited to digital articles. Students are now using apps, videos, and other multimedia to gather information. Understanding where each medium sourced its information is key to determining credibility. By examining and evaluating the credibility of each app, all members involved in its use would participate in troubleshooting and problem solving, a key component of the ISTE standard.

The sheer number of nutrition apps is staggering, so I decided to narrow my search by starting with a credible source that provided a curated list: the Apps Review section of Food and Nutrition Magazine. Food and Nutrition Magazine is a publication of the Academy of Nutrition and Dietetics (AND). Where AND publishes research through its academic journal, the magazine is often viewed as the “lighter,” more “practical” side of the dietetics world. Food and Nutrition Magazine features new products, recipes, and research highlights; in short, ways to stay updated in the food and nutrition world. The curated list of apps (https://foodandnutrition.org/tag/apps/) contains reviews of new and upcoming apps by the editors. Those deemed reliable, credible, and useful make the list. The featured apps explore a variety of topics with a nutrition education focus, including food safety, physical activity, dining out, and meal planning, in addition to apps that may be used by professionals in other capacities, such as video recording.

The list could serve as a good starting point for facilitating co-learning in the human nutrition dietary analysis project. Having students further explore these apps in pairs (or small groups of three) in relation to the assignment parameters can help facilitate collaboration and co-learning. Adding a presentation element, where these pairs teach the class about the usability of their chosen app, may invoke the principles of co-learning. Finally, placing students in small, diverse groups and allowing them to reflect on the assignment makes their viewpoints heard as they embark on cogenerative dialogues.

While I initially had my sights set on this curated list for my human nutrition class, some of these apps may help facilitate student-professor collaboration, while others may foster practitioner-patient collaboration, making it very feasible to implement this list in other co-learning scenarios. When both parties are able to contribute to how and why an app is used for various purposes, the co-learning is maximized.

References

ISTE. (2017). ISTE standards for educators. Available at: https://www.iste.org/standards/for-educators

Rheingold, H. (2018). Co-learning: Modeling cooperative-collaborative learning [blog]. Available at: https://dmlcentral.net/co-learning-modeling-cooperative-collaborative-learning/

Tobin, K. (2014). Twenty questions about cogenerative dialogues. In K. Tobin & A. Shady (Eds.), Transforming urban education: Collaborating to produce success in science, mathematics and technology education (pp. 181-190). Rotterdam, Netherlands: Sense Publishers. DOI: 10.1007/978-94-6209-563-2_11

Tobin, K., & Roth, W. M. (2005). Implementing coteaching and cogenerative dialoguing in urban science education. School Science and Mathematics, 105(5), 313-321.

Wolman, J. (1986). Co-learning about computers. Educational Leadership, 43(6), p. 42.

Digital Storytelling and Creative Communication: Does One Help Develop the Other?

Alan Alda, of M*A*S*H fame, knows how to tell a story. In one of his presentations, he asks a young woman to come to the stage. Alda asks her to carry an empty glass across the stage. She stares at him awkwardly and does it without much fanfare. Alda then walks to her with a pitcher of water. He pours water into the empty glass, filling it to the brim, and asks her to carry the glass to the other side of the stage. “Don’t spill a drop of water or your entire village will die,” he says. The young woman slowly, deliberately walks across the stage, carefully gauging the level of water in the glass as she takes each step. The audience is silent, enraptured by the backstory of the overfilled glass. They are interested and invested in the story. (Watch Alan Alda explain the importance of storytelling in his video: “Knowing How to Tell a Good Story is Like Having Mind Control.”)

Stories are powerful. Storytelling is one of the oldest forms of communication we have. We are attracted to stories because they are human (Alda, 2017). Stories relay information about human nature, accomplishments, challenges, and discoveries. They make us feel part of a community and help evoke empathy (Dillon, 2014). According to Alan Alda, we like stories because we think in stories, particularly when the story has an obstacle. As in the example above, we are interested in listening to the attempts at overcoming the obstacle (Alda, 2017).

Stories can also be powerful in the classroom. A good story helps shape mental models, motivates and persuades others, and teaches lessons (Dillon, 2014). There are many ways to deliver a story, but I have been gaining significant interest in digital storytelling. Technology is not static but highly personalizable, as people discover unique ways to learn, entertain, network, and build relationships using it (Robin, 2008). It is not surprising, then, that people are also using technology to share their stories. Digital storytelling is a technique I discovered while exploring problem-based learning (PBL) to develop innovation skills. In that blog post, I explained that digital storytelling was one mode students could employ to “solve” a problem in PBL by creating an artifact. I realize this wasn’t directly related to my inquiry at the time, because problem-based learning is more focused on the process of problem-solving than on the artifact itself. Despite this, I found the idea of digital storytelling interesting and wanted to revisit it. “Storytelling,” in particular, is a buzzword that circles back in unexpected mediums. For example, my husband attended a conference that explored storytelling through data; in other words, how to design graphs, charts, and other visual representations of data that share a story without any significant description or explanation, yet still communicate important information. That got me pondering how digital storytelling can be used to teach students to creatively communicate information, either about themselves or about a topic, using technology.

So how can students use digital storytelling for the purposes of creative communication? This question relates to ISTE Student Standard 6: Creative Communicator, in which “students communicate clearly and express themselves creatively for a variety of purposes using the platforms, tools, styles, formats and digital media appropriate to their goals.” Digital storytelling is one vehicle students can use to express themselves and communicate clearly. Interestingly, the idea of digital storytelling isn’t new; it was originally developed in the 1980s and is experiencing a renaissance in the 2000s (Robin, 2008). Not only can digital storytelling be a medium for learning, but different types of information can be relayed using this technique, including personal narratives (what most non-ed professionals use), stories that inform or instruct, and stories that examine historical events (Robin, 2008).

Stories must be well crafted in order to be effective and memorable. Students can deliver a story by investigating a topic, writing a script, developing their story, and tying it all together using multimedia (Robin, 2008). Blogs, podcasts, wikis, and other mediums like Pinterest can be used to convey a story simply (University of Houston, 2018). To help students get started, the University of Houston’s Educational Uses of Digital Storytelling webpage offers great information on timing, platforms, and examples of artifacts.

Figure 1.1 The Digital Storytelling Process

Before diving into a story, the most important elements to explore are those of its theoretical framework. This framework includes the seven elements needed for a story to be impactful. Figure 1.2 below summarizes these seven key elements.

Figure 1.2 The 7 Elements of Digital Storytelling

Just as Alan Alda explores in his video, the seven elements emphasize that good stories must capture the audience’s attention, explore obstacles or serious issues the audience can connect with, and be personal in order to enhance and accelerate comprehension (Robin, 2008). By engaging in digital storytelling, students also develop crucial 21st century skills: digital, global, technology, visual, and information literacy.

Tying it all together: How does digital storytelling fulfill the requirements for the ISTE student standard on creative communicator?

As Robin alludes to, it can be challenging to distinguish the various types of stories because they often overlap, particularly with the personal narrative (Robin, 2008). A good story is relatable; we can put ourselves in the shoes of the protagonist. Technology is simply another medium we can use to communicate our stories. Implementing digital storytelling in the classroom would allow for transformation (in the SAMR sense) of existing assignments and lectures. Here are some additional thoughts on how this technique can help students become creative communicators:

  • ISTE 6A: “Students choose the appropriate platforms and tools for meeting the desired objectives of their creation or communication.” Platforms such as blogs and podcasts, along with tools such as cameras and editing software, are all components of digital storytelling. By evaluating the various platforms and tools in relation to their desired outcome, students develop digital, technology, and visual literacy.
  • ISTE 6B: “Students create original works or responsibly repurpose or remix digital resources into new creations.” Though the most common application of digital storytelling is creating an original artifact, Robin provides an example of remixing: recreating historical events using photos or old headlines to add depth and meaning to the facts students are learning in class (Robin, 2008). By curating and remixing existing artifacts, students develop global, digital, visual, and information literacy.
  • ISTE 6C: “Students communicate complex ideas clearly and effectively by creating or using a variety of digital objects such as visualizations, models or simulations.” This idea goes back to the example I shared of storytelling using data (graphs, charts, figures), but it can also include infographics. Depicting complex data through an interesting visual medium engages digital, global, technology, visual, and information literacy.
  • ISTE 6D: “Students publish or present content that customizes the message and medium for their intended audiences.” The basis of storytelling is that it is meant to be shared with others. If the story doesn’t match the audience, it will not be impactful or important; this is a point the seven elements of digital storytelling stress. Understanding and crafting stories for a specific audience demonstrates digital and global literacy.

Good digital storytelling can help students become creative communicators. Technology can reach audiences in ways never before thought possible while still sharing the human experience. As Robin puts it, in a world where we receive thousands of messages a day across many different platforms, stories become an engaging and powerful way to share a message in a short period of time (Robin, 2008).

Resources

Big Think. (2017, July 18). Knowing how to tell a good story is like having mind control: Alan Alda [Video file]. Retrieved from https://www.youtube.com/watch?v=r4k6Gm4tlXw

Dillon, B. (2014). The power of digital story. Edutopia. Retrieved from http://www.edutopia.org/blog/the-power-of-digital-story-bob-dillon

International Society for Technology in Education. (2017). The ISTE standards for students. Retrieved from: https://www.iste.org/standards/for-students

Robin, B. R. (2008). Digital storytelling: A powerful technology tool for the 21st century classroom. Theory Into Practice, 47, 220-228. DOI: 10.1080/00405840802153916

University of Houston. (2018). Educational uses of digital storytelling. Retrieved from: http://digitalstorytelling.coe.uh.edu/page.cfm?id=27&cid=27&sublinkid=75

Lessons from the Six Facets of Understanding and Backward Design Process

For the past ten weeks, my cohort and I have been exploring techniques to get more out of the classes we teach. I have been personally exploring teaching methods that truly achieve student understanding. Interestingly, the authors of the book Understanding by Design argue that our interpretation of the word “understanding” is narrow and doesn’t encompass the word’s full meaning. In my field of higher education, the academic application of “understanding” typically means the “ability to explain.” Students who can explain demonstrate their understanding through academic performance, such as achieving high test scores, or through products such as essays, where they explain how things work, what they imply, and how the concepts are connected (Wiggins & McTighe, 2005). While this skill is important, we shouldn’t rely solely on explanation to determine whether students understand, as we could potentially deemphasize the other meanings that hold equal value (Wiggins & McTighe, 2005). In fact, there are six facets of understanding, which are highlighted in Figure 1.1 below.

Figure 1.1 The Six Facets of Understanding from Understanding by Design.

One of the best practices for accomplishing student understanding (in one or multiple facets) is to lesson plan using the “backward design” approach. In this approach, educators are encouraged to look at their objectives, identify what they want students to learn and accomplish, and then design a lesson plan that achieves those goals. Lesson planning begins by reviewing and refining objectives and/or learning outcomes. Establishing the lesson plan objectives early ensures that the ultimate mission of the class is clearly defined; in other words, the objectives set the destination of the lesson. This step is followed by developing how these objectives/outcomes will be evaluated, setting the road map for the learning journey. Lastly, the actual plan with the learning activities is designed, ensuring that the objectives are appropriately met; this is where the journey begins. Figure 1.2 explores the backward design process from Understanding by Design in more depth, and a small sketch after the figure makes the staging concrete.

Figure 1.2 Understanding by Design’s Backward Design Process.
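To make the ordering of the three stages explicit, here is a minimal sketch, assuming Python and my own hypothetical field names, that represents a backward-designed unit as plain data; nothing in it is prescribed by Wiggins and McTighe.

```python
# Illustrative sketch of backward design as data (hypothetical field names):
# desired results come first, evidence second, learning activities last.

lesson_plan = {
    # Stage 1: identify desired results -- the destination.
    "objectives": [
        "Explain how an experiment contributes to existing literature",
        "Write a procedure another researcher could reproduce",
    ],
    # Stage 2: determine acceptable evidence -- the road map.
    "assessments": [
        "Final draft of key paper sections",
        "Annotated bibliography",
    ],
    # Stage 3: plan learning experiences -- where the journey begins.
    "activities": [
        "Analyze example papers for required components",
        "Formative peer and instructor feedback on early drafts",
    ],
}

# Designing in this order keeps every activity traceable to an objective.
for stage, items in lesson_plan.items():
    print(stage.upper())
    for item in items:
        print("  -", item)
```

The point of the ordering is that nothing in the third list earns its place unless it serves something in the first.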

Implementing Backward Design

In our case, it wasn’t enough to understand what backward design is through explanation alone; our cohort was challenged to interpret and apply this design method. We were given the option of designing a new lesson to use in the future or improving an existing lesson. I chose to focus on a unit from a project-based class I teach, whose main focus is mastering scientific writing while also developing research skills. The ultimate assessment item of this unit is a final draft of the “Introduction” and “Methodology” sections of the research paper. This assessment focuses on appropriately and expertly incorporating the components necessary to set out the purpose and procedure of the research project.

Lesson Background. Before reaching this assessment, there are several steps the students must accomplish. By the time they turn in the final intro and methods draft, the students have already picked their research food (the topic of the research project and paper), created their hypothesis(es), designed their experiment, and are conducting several experiments a week. In order to successfully craft their experiment, they should have prepared a good annotated bibliography, which is the basis for the introductory section of the paper.

In this introductory section, students develop a mini literature review exploring the properties and potential outcomes of their foods. Students understand that they are showcasing the work and results of other researchers, what literature is missing, and how their experiment contributes to the body of literature. The final paragraph introduces their experiment along with their hypothesis(es).

The methodology section of the paper is a brief, yet descriptive, account of the procedure for producing the research food, its variations (typically students choose two variations), and other relevant how-to details of the experiment. The idea behind these few paragraphs is that anyone should be able to pick up the paper and clearly understand how to reproduce the experiment.

The Challenge. Historically, students struggle with the concept of a “final” draft, submitting for formal evaluation something closer to a first rough draft. Students are then disappointed by their low assessment scores.

From the professor’s perspective, this assignment is frustrating to grade, and it is disappointing to see low-quality effort from students. Despite the fact that students take an entire class dedicated to research writing prior to this one, it is evident they have not mastered it. In particular, they struggle with the content of these two sections. The two most common comments made on their writing are that some sections have far too much “fluff” or unnecessary explanation, while other sections are too vague or lack clarity. They have a hard time writing concisely yet descriptively.

From the students’ perspective (based on course evaluations and face-to-face feedback), the assignment is hard, they need more instruction on the writing process, and they misunderstand what the term “final draft” means. Students consistently comment that the writing portion is the most frustrating component of the course.

Students are not motivated to practice writing skills on their own, though they are encouraged to write several drafts prior to the final draft due date. To help them understand what content should be included, students examine examples of scientific writing by identifying the necessary components of the intro and methods sections. Students become very good at identifying these pieces yet still struggle to apply them to their own work. This is likely because most students wait until the night before the final draft is due to write their first rough draft, are unfamiliar with the proper draft writing process, or underestimate the difficulty of scientific writing and do not seek outside assistance.

Revising the lesson. In an effort to resolve frustration from both the professor’s and the students’ perspectives, my mission is to find simple, actionable solutions to the issues presented above. I would like to see students move away from frustration toward feeling challenged and intrinsically motivated to practice becoming great scientific writers. One possible solution is making the draft process more collaborative. Since students become very good at identifying necessary components in the works of others, providing more peer and instructor formative feedback would surface clarity issues and missing content earlier. Students would also be encouraged to review their own work more frequently using the RISE model, addressing the issue of last-minute drafts.

Incorporating more collaboration also provides an opportunity to focus on building digital citizenship. In particular, I wish to address the component of the ISTE student standard on digital citizenship that “develops safe, legal, and ethical behavior” when using technology by having students write their drafts collaboratively in a Google Doc (ISTE, 2017). Another way to implement this standard is through the curation process leading to the annotated bibliography, using the web app Diigo. A second aspect of the digital citizenship standard I wish to address is “responsibly using and sharing intellectual property” (ISTE, 2017). Students will encounter this at various points in the class, as they rely heavily on the works of others.

By working backwards to design a solution, I realized that the challenges students face in writing the final draft were actually fairly easy to overcome once I had the right tools and techniques. My solution did involve significant rearranging of existing helpful class topics, removal of unhelpful topics, and implementation of topics that previous students had identified as missing. Figure 1.3 summarizes the unit lesson planning, with the new topics highlighted in bolded, yellow font.

Figure 1.3. Summary of the Intro and Methods Unit Learning and Teaching Activities.

As depicted in Figure 1.3 above, the concept of digital citizenship is introduced through an online literature curation process in which the students collect, organize, and annotate relevant research articles. This new assignment is a spin-off of an existing assessment, the annotated bibliography, that not only allows students to cultivate new skills but also provides a helpful tool to better capture information from the articles they read. Students are still required to submit an annotated bibliography, but the artifact has been changed to include self-reflection.

The biggest change in this unit is the introduction of a three-step formative feedback process using the RISE model, in which students undergo peer, self, and instructor feedback. This new process will help students write multiple drafts prior to the submission of the final draft. Sharing their work and thoughts is made simpler through the use of Google Docs. This new collaborative effort allows students to work together and share their expertise to gain a better understanding of the draft writing process.

Final Thoughts on the Backward Design Process.

Wiggins and McTighe admit that it is difficult to follow this design process step by step without fighting the desire to skip ahead or to write one area with another in mind (Wiggins & McTighe, 2005). This was the case for me. The objectives and the evaluation criteria were clear, as they were based on accreditation standards and featured elements of scientific writing. The challenge lay in the preparation steps necessary to help students achieve those objectives. The most illuminating moment, however, was the emphasis on the evaluation process. By taking a closer look at my unit planning and through considerable reflection, I realized that missing components were keeping my students from achieving the desired outcomes. It was as if I had the destination in mind and knew the road I needed to take, but had forgotten which vehicle would get me there most efficiently. Though I had to fight the urge to jump straight into lesson planning, the backward design process reminded me of what was important for this unit and better equipped me to address existing problems I was previously unsure how to solve.

What I’ve also learned to appreciate is that, as an educator, you are never quite done with this process. One advantage I had as I revised my unit planning was the previous feedback I received from my students. If they hadn’t voiced their frustrations in a constructive way, I wouldn’t have been able to address these issues so specifically. I didn’t need to reinvent the wheel, but rather just fix the small area that was not working. Thanks to their feedback, my design process was streamlined and pointed. As I gear up to implement these changes in the upcoming quarters, I look forward to the improved successes of my students while remaining cognizant of the fact that I will, at some point, need to revisit the backward design process and make small yet significant changes again.

References

International Society for Technology in Education. (2017). The ISTE standards for students. Retrieved from: https://www.iste.org/standards/for-students

Wiggins, G., & McTighe, J. (2005). Understanding by design (Expanded 2nd ed.). Alexandria, VA: Association for Supervision and Curriculum Development.

Building Computational Thinking through a Gamified Classroom

Who says playing video games doesn’t teach you anything? Playing and creating games could actually help students develop another 21st century skill: computational thinking (CT). Computational thinking is a form of problem solving that takes large, complex problems, breaks them down into smaller problems, and uses technology to help derive solutions. In deriving solutions, students engage in a systematic form of problem solving that involves four steps: 1) “decomposition,” where a complex problem is broken down into smaller, more manageable problems; 2) “pattern recognition,” or making predictions by finding similarities and differences between the broken-down components; 3) “abstraction,” developing general principles for the patterns that emerge; and 4) “algorithm design,” creating step-by-step instructions to solve not only this problem but other similar problems in the future (Google School, 2016). By engaging in computational thinking, “students develop and employ strategies for understanding and solving problems in ways that leverage the power of technological methods to develop and test solutions” (ISTE, 2017). In other words, the key to successfully following this process is that students develop their own models rather than simply apply existing models (Google School, 2016). A small code sketch following Figure 1.1 illustrates the four steps.

Figure 1.1 Components of Computational Thinking
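To ground the four steps, here is a minimal Python sketch applying them to a small, nutrition-flavored problem. The scenario, data values, and names are my own hypothetical illustration, not part of the Google School material.

```python
# A toy illustration of the four computational thinking steps, applied to
# a hypothetical problem: estimating calories for simple recipes.

# 1) Decomposition: break "analyze a recipe" into smaller sub-problems --
#    look up each ingredient, scale it by amount, and total the results.
CALORIES_PER_100G = {"flour": 364, "sugar": 387, "butter": 717}  # sample values

def ingredient_calories(name, grams):
    """Calories contributed by one ingredient (one small sub-problem)."""
    return CALORIES_PER_100G[name] * grams / 100

# 2) Pattern recognition: every recipe, whatever its contents, follows
#    the same look-up-and-sum pattern.
# 3) Abstraction: capture that general principle once, as a function that
#    works for *any* recipe rather than one specific recipe.
def recipe_calories(recipe):
    return sum(ingredient_calories(name, grams) for name, grams in recipe.items())

# 4) Algorithm design: step-by-step instructions that solve this problem
#    and any future problem of the same shape.
shortbread = {"flour": 300, "sugar": 100, "butter": 200}
print(f"Estimated calories: {recipe_calories(shortbread):.0f}")
```

The arithmetic is trivial on purpose; the point is that students who build the abstraction themselves, rather than receiving it, are creating their own model.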

In researching ways to apply computational thinking in the classroom, I ran across scholarly articles discussing the gamified classroom. I have always been intrigued by this concept; from my own experience, students are much more engaged during class time when the required content is converted into a game. During these game sessions, my role changes from the person delivering the content to the person delivering the game (i.e., asking the questions). The students are responsible for providing the content by offering solutions to the posed questions, thereby evoking problem-solving and, in some cases, critical thinking skills. This thread of ideas led me to ask: what are some ways a “gamified” classroom can help develop computational thinking?

To help answer my question, I found two articles that pinpoint models in game design for building computational thinking:

Article 1: Yang & Chang, 2013. Empowering students through digital game authorship: Enhancing concentration, critical thinking, and academic achievement.

Yang and Chang explore how students’ motivation for learning can increase when they are allowed to design their own game on a specific topic. Significant problem-solving occurs during the game design process because of the interaction and immediate feedback the process entails. In addition, students gain higher-order thinking skills such as creativity and critical thinking. The authors mention three game-building software packages that do not require extensive coding skills: RPG Maker, Game Maker, and Scratch. In their study, the researchers investigated the effects of the game design process on seventh-grade biology students using either Flash animation (digital flash cards) or RPG Maker. The investigated effects included concentration, critical thinking, and academic performance. Their results demonstrated that the group using RPG Maker showed significant improvements in critical thinking and academic performance, while no significant difference in concentration was noted between the groups.

Article 2: Kazimoglu et al., 2012. A serious game for developing computational thinking and learning introductory computer programming.

Kazimoglu et al. begin their inquiry by providing a few definitions. It is important to understand their terminology, chiefly that any game used for educational purposes is a “serious” game. They acknowledge that several definitions of computational thinking exist, so they create their own definition requiring the following elements: 1) conditional logic (true vs. false conditions); 2) building algorithms (step-by-step instructions); 3) debugging (resolving issues with the instructions); 4) simulation (modeling); and 5) distributed computation (social sharing). The authors set out to create a non-threatening introductory programming unit to combat the common student perception that programming is “difficult.” Kazimoglu et al. believe that when students are allowed to engage in game design, they are motivated to learn, which provokes problem solving. They take this approach in their introductory programming class, where they challenge students through a series of exercises using the Robocode platform. At the end of the study, all students successfully completed the exercises, engaging their problem-solving skills. A minimal sketch of how such game mechanics map onto these CT elements follows below.
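The sketch below is my own made-up command-list game, not Robocode’s actual interface: the player writes an algorithm as a list of commands, the game applies conditional logic while simulating it, and a failed run invites debugging.

```python
# A minimal "serious game" sketch (hypothetical, not Robocode's API):
# players build an algorithm, rely on conditional logic, and debug by
# inspecting the result of a simulated run.

GRID_WIDTH = 5  # positions 0..4 on a one-dimensional grid
GOAL = 4

def run_program(commands, start=0):
    """Simulate a player-built command list; return the final position."""
    position = start
    for command in commands:
        # Conditional logic: the robot moves only if it stays on the grid.
        if command == "forward" and position + 1 < GRID_WIDTH:
            position += 1
        elif command == "back" and position - 1 >= 0:
            position -= 1
    return position

# Building an algorithm: the player's step-by-step instructions.
program = ["forward", "forward", "forward", "forward"]

# Simulation plus a debugging hint: run, inspect, revise, re-run.
final = run_program(program)
print("Reached the goal!" if final == GOAL
      else f"Stopped at {final}; revise the program and re-run.")
```

Sharing programs and comparing solutions would then touch the fifth element, distributed computation.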

Conclusions. Interestingly, both of these articles struggle to define “computational thinking” exactly, and both mention that research investigating the extent to which games can develop CT is lacking. What both agree on, however, is that CT is best developed when students are the game designers. To support this, both studies involved elements of programming instruction to help students successfully build their games.

While these articles offer models for successfully implementing computational thinking through game design and creation, it was a little disheartening to discover that programming instruction was a necessary component. My inclination was to ask how these processes could be implemented or adapted in other classroom scenarios, particularly when programming instruction may not be feasible. Interestingly, not all researchers agree that programming must be involved in successful CT implementation. Voogt et al. argue that although most research on CT involves programming, CT is a thinking skill and therefore does not require programming in order to be successfully implemented (Voogt et al., 2015). In fact, a literature review conducted by Voogt demonstrated that students do not automatically transfer CT skills to non-programming contexts when instruction focuses on programming alone. The strongest indicator of CT mastery was heavily dependent on instructional practices that focus on application (Voogt et al., 2015).

The lack of a standard definition of computational thinking also needs to be addressed. The two articles above and the Voogt researchers agree that discrepancies exist among current definitions of computational thinking. To avoid confusion regarding the role of programming and other such technologies, computational thinking can be simply defined as a way of processing information and tasks to solve complex problems (Voogt et al., 2015). It is a way to look at similarities and relationships between problems and to follow a systematic process to reach a solution. Figure 1.2 summarizes this simplified process.

Figure 1.2 Simplified Computational Thinking Components

In this broader context, it is not necessary to program games in order for students to build computational thinking. Allowing students to participate in systematic artifact creation will do the trick. Examples of artifact creation, not all of which require programming, include remixing music, generating animations, developing websites, and writing programs. The main idea of this artifact creation process is that students follow procedures that can be applied to similar problems. Figure 1.3 highlights this artifact creation process, and a short sketch after the figure makes the idea concrete.

Figure 1.3 Artifact Creation Process for Computational Thinking
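As one concrete, hypothetical reading of that process, the sketch below captures a repeatable procedure once and reapplies it to a similar problem; the study-guide scenario and function names are my own illustration.

```python
# Illustrative sketch: the "artifact" is a simple HTML study guide, and
# the procedure that builds it can be reapplied to any similar outline.

def make_study_guide(title, sections):
    """Turn a topic outline (title + (heading, notes) pairs) into an HTML page."""
    body = "\n".join(
        f"  <h2>{heading}</h2>\n  <p>{notes}</p>" for heading, notes in sections
    )
    return f"<html>\n<h1>{title}</h1>\n{body}\n</html>"

# The same step-by-step procedure solves every problem of this shape --
# the essence of pattern recognition and abstraction in artifact creation.
print(make_study_guide(
    "Macronutrients",
    [("Carbohydrates", "Primary energy source."),
     ("Proteins", "Build and repair tissue.")],
))
print(make_study_guide(
    "Food Safety",
    [("Storage", "Keep cold foods properly chilled."),
     ("Cross-contamination", "Separate raw and cooked foods.")],
))
```

Whether the artifact is a web page, a remix, or a board game, the point is the reusable procedure, not the particular output.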

How can this artifact creation process be used to create a gamified classroom? To help me explore this question, one of my colleagues suggested allowing students to develop and design their own board game. While the solution seems low-tech, others agree with this strategy. Michele Haiken, writing on educational leadership for ISTE, describes adapting “old school” games for the classroom to help develop critical thinking and problem-solving skills (Haiken, 2017). Students can even create an online “quest,” a scavenger hunt, or a “boss event” to problem-solve computationally (Haiken, 2017). For more tech-y solutions, existing platforms and games such as GradeCraft and 3DGameLab can be used to apply computational thinking in a gamified classroom (Kolb, 2015). Regardless of the method used, whether low-tech board games or high-tech game creation through programming, allowing students to participate in the artifact creation process helps build computational skills that they can then apply to other complex problems to create their own models.

References

Google School. (2016). What is computational thinking? [YouTube video]. Retrieved from: https://www.youtube.com/watch?v=GJKzkVZcozc&feature=youtu.be

Haiken, M. (2017). 5 ways to gamify your classroom. Retrieved from: https://www.iste.org/explore/articledetail?articleid=884

International Society for Technology in Education. (2017). The ISTE standards for students. Retrieved from: https://www.iste.org/standards/for-students

Kazimoglu, C., et al. (2012). A serious game for developing computational thinking and learning introductory computer programming. Procedia - Social and Behavioral Sciences, 47, 1991-1999.

Kolb, L. (2015). Epic fail or win? Gamifying learning in my classroom. Retrieved from: https://www.edutopia.org/blog/epic-fail-win-gamifying-learning-liz-kolb

Voogt, J., et al. (2015). Computational thinking in compulsory education: Towards an agenda for research and practice. Education and Information Technologies, 20(4), 715-728.

Yang, Y. C., & Chang, C. (2013). Empowering students through digital game authorship: Enhancing concentration, critical thinking, and academic achievement. Computers & Education, 68, 334-344.
