Professional Development for Mixed Audiences

In principle, professional development offers individuals an opportunity to learn about new advancements in their respective fields, including industry best practices. In practice, however, professional development (PD) is often criticized for failing to offer content, format, or context that is relevant. In the DEL program, we were asked for our opinions on what makes for good PD. Reflecting upon my experiences, I noted that good professional development should be actionable, timely, and applicable; PD should focus less on the “what” and more on the “how.” My colleagues commented that good PD is interactive, relevant, purposeful, and focused on the learner, whereas bad PD is one-directional, static, and passive. Looking back on my own experiences, I remember one PD training I took that was a five-hour-long video of a therapist droning on about the physiology of stress. While the topic was interesting (for about half an hour), without any engagement or application the training quickly felt like an endless lecture. What made it worse is that the PD worked on the premise that a bombardment of facts equates to deep knowledge; however, “having knowledge in and of itself is not sufficient to constitute as expertise” (Gess-Newsome et al., n.d.). 

Criteria for Good Professional Development. 

Because my colleagues and I all work in education and have experienced our fair share of PD, both good and bad, we were able to draw on personal experience to determine the above criteria. Research on how we (humans) learn demonstrates that my classmates and I were not wrong. The goal of any professional development should be to impact student learning by augmenting pedagogical content knowledge (Gess-Newsome et al., n.d.). In other words, the main idea behind PD is to help individuals become experts. According to Gess-Newsome et al., expert knowledge is deep, developed over time, contextually bound, organized, and connected to big ideas (Gess-Newsome et al., n.d.). This is striking considering that most PD is offered in a single session of about an hour, hardly enough time to begin the application and reflection necessary for that content to become “expert knowledge.” 

What most PD, including my example of bad PD, lacks is the opportunity to apply and reflect. Research on how we learn notes that learning needs two elements: 1) a social context, which helps us maintain high levels of motivation (because learning takes incredible amounts of effort), and 2) an active component that allows the learner to engage with ideas that can create new experiences, build opportunities to acquire knowledge, or directly challenge what we already know (Gess-Newsome et al., n.d.). Engaging the learner also takes into consideration that learners come into the session with their own conceptions and preconceived notions based on their current learning needs. To include all of these factors, the researchers from Northern Arizona University strongly recommend the five principles of effective professional development summarized in Figure 1.1 below. 

Infographic on principles of effective professional development
Figure 1.1 Principles of Effective Professional Development

ISTE Standard 4b explores the properties of good professional development by defining the coach’s role as “design[ing], develop[ing], and implement[ing] technology rich professional learning programs that model principles of adult learning and promote digital age best practices in teaching, learning, and assessment” (ISTE, 2017). The standard highlights all of the principles of effective PD: coaches should be able to deliver PD that meets the needs of the learner within a context that is relevant to the learner. 

While understanding the theory behind effective PD is important, applying these theories will prove crucial on a personal level in the upcoming months, as I have been asked to facilitate a professional development session at a conference. My audience will be a mixed group of registered dietitians with various levels of expertise in both nutrition education and technology. Understanding the need to develop effective PD, I realized it will also be important to understand which professional development model works best for audiences of mixed technology skill if I am to meet learners’ needs. 

After some investigation and feedback, it appears the best approach to this inquiry is twofold: 1) understanding models for technology-infused PD, and 2) understanding the principles of learning differentiation. 

Technology-Infused Professional Development. 

In line with the education best practices noted in the ISTE standard above, and with the evidence-based practice required for all dietetic professional development, the PD should use technology in a way that models adult learning and exposes learners to using technology well in a professional setting. Northern Arizona University researchers offer four PD models that utilize technology in different ways, summarized in Figure 1.2 below. 


Figure 1.2 Technology-Infused PD Models

Reflecting upon these four models, I realized professional development does not have to be limited to just one; all could be used as part of an ongoing development process. However, the one that struck me as most useful for the PD session I am planning is the face-to-face model with technology support. I like the idea that the face-to-face portion isn’t a means to an end but rather the beginning of a longer-term conversation. The researchers stressed that audience engagement shapes the direction of the PD through the development of shared learning goals (Gess-Newsome et al., n.d.). This is a unique way to view the face-to-face model that has traditionally dominated PD. 

Learning Differentiation.  

Differentiated learning implies that educators take into consideration individual learning styles and levels of readiness prior to designing the lesson plan (Weselby, 2014). According to Concordia University, there are four ways to incorporate differentiated learning: 

1) Content– Though the role of any educator is to ensure that learning outcomes are met, differentiating content means varying what learners do with that content, applying Bloom’s taxonomy of thinking skills. Depending on the level of the learner, one learner might be content with simply defining a particular concept while another will strive to create a solution with that same content. Allowing learners to select their level of readiness through content differentiation allows for a smoother introduction of the material. 

2) Process– In process differentiation, the learners engage with the same content but are allowed a choice in the way in which they learn it. Not all learners require the same level of instructor assistance, or the same materials. Process differentiation also recognizes that some learners prefer to learn in groups while others may prefer to learn alone. 

3) Product– In this model, the learning outcome is the same but the final product is different. Through product differentiation, learners choose how they demonstrate mastery in a particular area. 

4) Learning Environment– A learning environment that accommodates different learning needs can be crucial to optimal learning. Flexibility is key for this type of differentiation, as learners may want various physical or emotional learning arrangements (Weselby, 2014). 

One of my colleagues suggested that I consider differentiated instruction as a strategy to address the various technology skill levels of my target audience. I must admit that at first I wasn’t sure how this could be applied to a conference setting. However, considering the face-to-face technology-infused PD model above, differentiated instruction suddenly became not only plausible but also the more effective method. Differentiated learning aligns with the principles of effective PD by allowing the session to be as learner-centered as possible. Because the learners take more responsibility for their own learning, they become better engaged in the process. 

In searching for professional development models that incorporate technology for mixed audiences, I learned that understanding the pillars of good professional development is just as important as applying technology in a way that is relevant and accessible to everyone. Taking the two factors above into consideration, effective PD for my conference will need both a technology-infused model and the opportunity for differentiated learning. 

Resources 

Gess-Newsome, J., Blocher, M.J., Clark, J., Menasco, J., Willis, E.M. (n.d.) Technology infused professional development: A framework for development and analysis. Available from: https://www.citejournal.org/volume-3/issue-3-03/general/technology-infused-professional-development-a-framework-for-development-and-analysis/ 

ISTE, (2017). ISTE standards for coaches. Available from: https://www.iste.org/standards/for-coaches 

Weselby, C. (2014). What is differentiated instruction? Examples on how to differentiate instruction in the classroom. Available from: https://education.cu-portland.edu/blog/classroom-resources/examples-of-differentiated-instruction/ 

Instructional Coaching: Using Rubrics to Quantify Qualitative Data for Improved Teaching Outcomes

Feedback can be a powerful tool to improve teaching and learning. Through feedback, new perspectives can be gained as teachers begin to discern what is and isn’t working in current instructional methods. Feedback also offers suggestions on achieving the goals and standards that drive an educator’s work. There are four different types of feedback: formative, summative, confirmative, and predictive. Formative feedback occurs before an intervention takes place, such as giving students feedback on an assignment where the feedback does not impact the final grade; I explore the benefits of formative feedback in this post. Summative feedback occurs after an intervention, such as when students turn in an assessment and the feedback provided relates to the grade outcome (Becker, 2016). Predictive feedback occurs before any instruction has taken place, to ensure that the method will be effective, while confirmative feedback occurs well after summative feedback, to ensure that the methods are still effective (Becker, 2016). Of the four types, formative and summative feedback are the most widely used evaluations in educational institutions.

At the end of each quarter, two types of summative evaluation are collected for each of the classes I’ve taught: quantitative and qualitative data to assess both my performance as a professor and the course outcomes. The quantitative portion uses a Likert scale ranging from 1 = strongly disagree to 5 = strongly agree, while at the bottom of the evaluation form there is a section where students can provide comments, intended as constructive feedback for classroom improvement. While the comments are not always written constructively (I am addressing this through a mini-module students are required to complete for all of my classes), it is mainly the common themes that emerge across the evaluations that are powerful influencers for improving my classes. However, what I’ve learned is that most of the time, the summative feedback simply arrives too late to improve the current student experience, because the issue can’t be addressed until the next time the course is offered. As a technology and instructional coach, in order to help other educators improve their teaching outcomes, I would need more timely feedback that utilized both quantitative and qualitative assessment measures. While most learning management system (LMS) platforms offer a multitude of analytics that quantify data such as exam scores, class averages for assignments, and average engagement time on the platform, there isn’t an explicit way to either collect or quantify qualitative data.
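To make that gap concrete, here is a minimal Python sketch (the ratings, comments, and theme names are all hypothetical, not drawn from an actual LMS): the Likert side reduces to a mean with one line of arithmetic, while the free-text side first needs some form of theme coding before anything can be counted.

```python
# Quantitative half: Likert responses (1 = strongly disagree ... 5 = strongly agree)
# reduce to a mean with simple arithmetic.
likert_responses = [5, 4, 4, 3, 5, 2, 4]  # hypothetical student ratings
mean_rating = sum(likert_responses) / len(likert_responses)
print(round(mean_rating, 2))

# Qualitative half: comments do not reduce the same way. A naive keyword tally
# only hints at common themes; it is no substitute for the deliberate "coding"
# step used in qualitative research.
comments = [
    "The weekly case studies really helped.",
    "Too much reading each week.",
    "Case studies were the best part of the class.",
]
theme_counts = {"case studies": 0, "reading load": 0}
for c in comments:
    if "case stud" in c.lower():
        theme_counts["case studies"] += 1
    if "reading" in c.lower():
        theme_counts["reading load"] += 1
print(theme_counts)
```

Even this toy example shows why LMS analytics stop at the numeric columns: the mapping from comment to theme is a judgment call, which is exactly what a rubric can make explicit.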

The ISTE standard for coaching states that coaches should “coach teachers in and model effective use of tools and resources to systematically collect and analyze student achievement data, interpret results, and communicate findings to improve instructional practice and maximize student learning” (ISTE, 2017). If an LMS can collect quantitative data that can be assessed throughout the quarter (through summative feedback), could it also be used to quantify qualitative data (i.e., comments) for improved teaching outcomes? To answer this question, I’d like to address it in two ways: 1) establish an understanding of the value and importance of self-reflection on assessments, and 2) address how rubrics can help quantify qualitative data.

Importance of self-reflection. Self-reflection can give several insights into the effectiveness of teaching. According to the Virginia Journal of Education, self-reflection is a method to support current strengths and identify areas of improvement, including continuing education or professional development needs. Educators may seek out self-reflection in order to review past activities, define issues that arise throughout the quarter/semester, understand how students are learning, modify a class due to unexpected circumstances, or address whether or not the teacher’s expectations have been met. Overall, self-reflection improves teacher quality (Hindman & Stronge, n.d.).

Educators may sometimes rely on emotions when deciding whether or not an element worked well in the classroom. However, without context to justify that decision, emotions are not a clear indicator of outcomes. Self-reflection puts a process in place by which educators can collect, analyze, and interpret specific classroom outcomes (Cox, n.d.). Though there are various ways to perform self-reflection (see Figure 1.1), what matters most is that the process is thoroughly completed.

Figure on Cox's Types of Self-Reflection
Figure 1.1 Cox’s Types of Self-Reflection.

For an instructional coach, following the proper self-reflection steps would be a great way to begin the discussion with someone wanting to improve their teaching. An instructional coach would help the educator:

  • Understand their outcome goals,
  • Choose the data collection/reflection method best suited to meet these goals,
  • Analyze the data together to identify needs,
  • Develop implementation strategies to address needs.

Because the process is general, it can be modified and applied to various learning institutions. With my coaching background as a dietitian, and similar to my clients’ needs for change, I would also include questions about perceived barriers to implementing change. These questions would include a discussion of any materials or equipment the educator would deem necessary but that may be difficult to obtain or that may require new skill sets to use fully.

Using rubrics to quantify qualitative data. Part of self-assessment includes using rubrics, in addition to analyzing data, goal setting, and reflection. According to the Utah Education Association (UEA), using a rubric helps to address the question “What do I need to reach my goals?” (UEA, n.d.). Rubrics present expected outcomes and expected performance, both qualitative qualities, in quantifiable terms. Good rubrics should include appropriate criteria that are definable, observable, and complete, and should include a continuum of quality (UEA, n.d.). 

If rubrics help quantify qualitative data, then how can rubrics assess reflection? DePaul University tackled that very question, and its response asked still more questions: what is the purpose of the reflection, will the assessment process promote reflection, and how will reflection be judged or assessed? (DePaul, n.d.). Educational leader Lana Danielson remarks on the importance of reflective thinking and how technological, situational, deliberate, or dialectical thinking can influence teaching outcomes. Poor reflective outcomes, according to Danielson, are the result of not understanding why teachers do the things they do; great teachers are those who know what needs to change and can identify the reasons why (Danielson, 2009). Figure 1.2 describes the four types of reflective thinking in more detail.

Infographic on the four modes of reflective thinking
Figure 1.2 Grimmett’s Model of the Four Modes of Reflective Thinking

Developing rubrics based on the various types of reflective thinking will help quantify expectations and performances to frame improvement. The only issue with this model is that it is more diagnostic than quantifiable. A more specific rubric model, developed by Ash and Clayton in 2004, involves an eight-step prescriptive process including:

  • Identifying and analyzing the experience,
  • Identifying, articulating, and analyzing learning,
  • Undertaking new learning experiences based on reflection outcomes (DePaul, n.d.)

The Ash/Clayton model involves developing and refining a rubric based on learning categories related to goals.  All of the qualities related to the learning categories are defined and refined at each stage of the reflection process. More information on the eight-step process can be found here.

Regardless of the reflection assessment model used, coaches can capture enough criteria to create and use rubrics as part of a self-reflection process that can improve teaching outcomes through new awareness and by identifying learning needs that may block improvement. Most LMS platforms support rubrics as part of assessment in various capacities (some only support rubrics on designated “assignments” but not features like “discussions,” for example). Each criterion includes quality indicators that are associated with a number, making the qualitative data quantifiable, similar to the way “coding” in qualitative research allows for quantifiable results. New rubric features allow for a range of quality points on common criteria and freeform responses, allowing for the possibility of modifications to the various reflection types. Because of these new functionalities and the myriad rubric uses in LMS platforms today, creating a good-quality rubric is now the only obstacle to implementing rubrics for self-reflection.
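To make the coding analogy concrete, here is a minimal Python sketch of the mechanism (the criteria and level names are hypothetical, loosely inspired by the Ash/Clayton categories, not taken from any specific LMS): each criterion’s quality indicators map to points, so a completed rubric reduces a set of qualitative judgments to a comparable number.

```python
# Hypothetical reflection rubric: each criterion's qualitative quality
# indicators are mapped to points, like codes in qualitative research.
rubric = {
    "identifies the experience": {"absent": 0, "partial": 1, "thorough": 2},
    "articulates learning":      {"absent": 0, "partial": 1, "thorough": 2},
    "plans new learning":        {"absent": 0, "partial": 1, "thorough": 2},
}

def score_reflection(ratings):
    """Convert per-criterion qualitative ratings into a numeric total."""
    return sum(rubric[criterion][level] for criterion, level in ratings.items())

# One educator's reflection, judged qualitatively on each criterion:
ratings = {
    "identifies the experience": "thorough",
    "articulates learning": "partial",
    "plans new learning": "absent",
}
total = score_reflection(ratings)
maximum = sum(max(levels.values()) for levels in rubric.values())
print(f"{total}/{maximum}")  # a score that can be tracked across terms
```

The design choice that matters is the explicit level-to-points mapping: once the indicators are defined and numbered, the same rubric can be reused each quarter, so changes in the score reflect changes in the reflection rather than changes in the grader’s mood.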

References

Becker, K. (2016, August 29). Formative vs. summative vs. confirmative vs. predictive evaluation. Retrieved from: http://minkhollow.ca/beckerblog/2016/08/29/formative-vs-summative-vs-confirmative-vs-predictive-evaluation/

Cox, J. (n.d). Teaching strategies: The value of self-reflection. Retrieved from: http://www.teachhub.com/teaching-strategies-value-self-reflection.

Danielson, L. (2009). Fostering reflection. Educational Leadership. 66 (5)  [electronic copy]. Retrieved from: http://www.ascd.org/publications/educational-leadership/feb09/vol66/num05/Fostering-Reflection.aspx

DePaul University. (n.d.) Assessing reflection. Retrieved from: https://resources.depaul.edu/teaching-commons/teaching-guides/feedback-grading/Pages/assessing-reflection.aspx

Hindman, J.L., Stronge, J.H. (n.d). Reflecting on teaching: Examining your practice is one of the best ways to improve it. Retrieved from: http://www.veanea.org/home/1327.htm

ISTE, (2017). ISTE standards for coaching. Retrieved from: https://www.iste.org/standards/for-coaches.

Utah Education Association. (n.d.) Self-assessment: Rubrics, goal setting, and reflection. [Presenter’s notes]. Retrieved from: http://myuea.org/sites/utahedu/Uploads/files/Teaching%20and%20Learning/Assessment_Literacy/SelfAssessment/Presenter%20Notes_Self-Assessment_Rubrics_Goal_Setting.pdf

Implementing Student-Centered Activities in Content-Intensive Courses

If you’ve ever taught a content-intensive course, you know it’s like trying to finish a marathon at a sprint. In my experience, you get to the finish line, but you hardly remember the journey there. The content-intensive courses I teach are the foundational nutrition classes. Each contains at least six major learning objectives with about two sub-objectives apiece, and each is designed to cover upwards of fifteen chapters of material in a ten-week quarter. The predominant approach to these types of classes by faculty is to go broad, not deep, in learning and understanding. I must admit this has been my approach as well, for fear that I will miss covering one of the learning objectives or sub-objectives. While my students tell me that the courses are interesting and engaging, I can’t help but wonder whether they will actually remember any content from the course, or whether they feel as if their brain has been put through a blender by spring break. Is the learning authentic, or are they just memorizing for the sake of passing the final exam?

The ISTE Standards for Educators charge instructors with “design[ing] authentic, learner-driven activities and environments that recognize and accommodate learner variability” (ISTE, 2017). If instructors truly wish to design their courses using evidence-based practices, the focus needs to shift from covering material to student learning, without compromising the learning objectives. ISTE educator standard 5b implies that technology can help marry the two concerns: “design authentic learning activities that align with content area standards and use digital tools and resources to maximize active, deep learning” (ISTE, 2017). This standard is best illustrated by the “genius hour” concept developed by Nichole Carter in pursuit of a personalized learning environment for her students. The idea is brilliant: allow students one opportunity a week (or as time allows) to dive deep into a topic they are interested in and demonstrate their learning through an artifact or digital presentation. The implementation of genius hour follows a six-component design model that highlights new roles and responsibilities for teachers and students alike (Carter, 2014). See Figure 1.1 for more information on the six-component personalized learning design.

Infographic highlighting 6 essentials for personalized learning.
Figure 1.1 Nichole Carter’s Personalized Learning Essentials.

When implemented well, intrinsic motivation for learning soars, students are engaged in the material, and teachers can meet those ever-important learning objectives without feeling like they are just shoveling material into students’ brains (Carter, 2014). It seems like a win-win. However, thinking back on my content-intensive courses, I wondered how student-centered activities (like genius hour) can be implemented in these types of courses.

As a starting place for answering my question, I revisited Kathleen McClaskey’s continuum of choice. I find it interesting that developing student-centered learning activities ultimately comes down to how much control the teacher is willing to let go of and how much “choice” is open to the students. In traditional content-intensive courses, the teacher has all of the control, in what McClaskey would classify as teacher-centered (McClaskey, 2005). He or she creates lectures that revolve around a specific chapter in a textbook, then lectures to ensure the material is covered. Students, in this model, sit and observe the lecturer in hopes of absorbing some of the material (or, in most cases, cram the information into their brains the night before the exam) while never deeply engaging with it. McClaskey’s continuum of choice suggests that some activities can still be controlled while giving students some freedom to explore topics of their own choosing; consider the participant and co-designer models (McClaskey, 2005).

Diagram of the Continuum of Choice.
Figure 1.2 McClaskey’s Continuum of Choice. (Continuum of Choice TM by Barbara Bray and Kathleen McClaskey is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.)

The challenge with the more student-centered models, such as the designer or advocate roles on McClaskey’s continuum, is that they require time, a luxury oftentimes not afforded in content-intensive courses, and the continuum does not address how to implement each model. However, despite these concerns, I am beginning to realize that in order to allow for more intrinsic and authentic learning, I need to let go of the desire to control all aspects of my content-intensive courses and shift my focus to what is really important: student learning.

Many of the resources similar to McClaskey mention explicit instruction as part of a student-centered classroom. Explicit instruction provides “effective, meaningful, direct teaching…where students are active participants in the learning process” (Shasta County, 2009). Creating an explicit learning lesson involves six guiding principles: 1) The instructor begins the class by setting the stage for learning; the learning objectives are clear, and students understand their responsibility for their own learning. 2) This is followed by a clear, simple, and direct explanation of what the daily task is, why it is important, and how best to complete it; students appreciate when tasks are broken down into smaller, logical steps. 3) The instructor models the process, including their thought process, using visuals; this is important because simply explaining a concept doesn’t mean that students will understand it or know what to do. 4) Before diving into the assignment on their own, students are given a guided activity in which the instructor assesses the readiness of the class. 5) Once the concept has been mastered, students take to the task independently. 6) After the task has been completed, students are given an option for informal or formal reflection, and the artifact is collected and compared to the learning objectives (Shasta County, 2009). Figure 1.3 provides a reference guide for these steps.

Infographic on explicit learning
Figure 1.3 Explicit Learning Reference Guide

According to the Shasta County Curriculum Lead, explicit learning is best used when there is a “well-defined body of information or skills students must master,” especially when other models such as inquiry-based or project-based learning cannot be successfully implemented (Shasta County, 2009). The role of the teacher is more directed and specific, allowing students more insight into, and practice with, the skills they are learning. What I like about explicit learning is that classroom activities do not have to be modified completely; the modification occurs in how the material is presented and practiced. Students can appreciate this model because they engage in active learning but still have guidance and support from the teacher via modelling.

Through explicit learning, even content-intensive courses can have a deeper and more meaningful impact on learning. I had one class in particular in mind when considering the explicit learning/personalized learning approach. I teach a not-so-introductory nutrition class designed to meet the needs of allied health students. All allied health students are required to take at least one nutrition class as part of their career training, and for many, this class will be the only nutrition class they ever take. The pressure is high in terms of delivering content, as it is very likely that they will not revisit this material anywhere else. While I can’t change the fact that they need to explore the chemical composition and processing of nutrients in the body, I can influence how they engage with the health effects of and recommendations for these nutrients, which are ever-changing anyway. Using the personalized learning and explicit learning models, I could allot one class period a week for exploring the health effects and recommendations for whatever condition, trend, or issue students wished to investigate. As with genius hour, the students could work together to research and create a digital artifact of their choosing that would best present their topic; lastly, to further promote collaboration, they could provide feedback to one another on their topics. The students would be learning through co-learning, gaining a stronger and deeper interest in the subject matter, proving that content-intensive courses can also be student-centered.

Resources

Carter, N. (2014, August 4). Genius hour and the 6 essentials of personalized education. Retrieved from http://www.edutopia.org/blog/genius-hour-essentials-personalized-education-nichole-carter

International Society for Technology in Education, (2017).  The ISTE standards for educators. Retrieved from: https://www.iste.org/standards/for-educators.

McClaskey, K. (2005, November 5). Continuum of choice- More than a menu of options. Retrieved from http://kathleenmcclaskey.com/choice/

Shasta County Curriculum Lead, (2009).  What is direct/explicit learning [Word doc]. Retrieved from http://www.shastacoe.org/uploaded/dept/is/district_support/explicit_instruction_may_2009.doc

Professional Development-Improving Digital Literacy through Peer Modeling

It shouldn’t be a surprise that experts support incorporating technology into new and existing learning models to facilitate deeper and different skill sets than those taught by conventional methods today. The biggest push for broader technology adoption in education is to move the educational system away from antiquated models developed during the industrial revolution toward a system that reflects today’s society and workplace. I particularly enjoy Sir Ken Robinson’s argument for changing the education system: we are living in an era where we are trying to meet the needs of the future with old methods designed for a different society than the one we live in now (RSA, 2010). Robinson stresses that we need to adopt new models that redefine the idea of “academic” versus “non-academic” and accept differences in thinking about what it means to be “educated.” Part of the reason for this push is that today’s children are exposed to information stimuli that capture attention and change learning needs (RSA, 2010). 

Incorporating 21st-century skills requires the introduction, implementation, and use of technology at all levels of education. Considering the importance of developing these skills, it is also important to understand the reasons behind creating a paradigm shift, particularly as we prepare students for the real world in higher education. The New Media Consortium (NMC) published a report looking into the key trends that would promote and accelerate technology adoption in higher education. NMC identified and classified these trends in terms of the length of time needed for implementation as well as difficulty (NMC, 2017). Figure 1.1 summarizes the six key trends for technology adoption.

Infographic summarizing the key trends for accelerating technology adoption from NMC
Figure 1.1 NMC’s Key Trends for Accelerating Technology Adoption

What’s interesting to note about the trends above is that they focus not only on types of technology, or ways that technology is used in the classroom, but also on important skill sets and new ways of thinking that elevate technology use to a different, more meaningful level. 

Because the primary responsibility of a higher education institution should be to prepare students for the real world, understanding the technology implications behind each of these trends calls us, the professors, to reevaluate our technology use in the classroom. Despite these conversations on the need for technology adoption in higher education, several challenges continue to slow the rate of adoption. NMC summarized six key challenges that significantly impede the progress of the aforementioned trends. The challenges were classified from “solvable,” meaning the problem is well understood and solutions exist, to “wicked,” where the challenges involve societal change or a dramatic restructuring of thinking or existing models, and solutions can’t be identified in the near future (NMC, 2017). Figure 1.2 describes these challenges in more depth.

Infographic on the six challenges to technology adoption by NMC
Figure 1.2 Summary of the Six Challenges to Technology Adoption.

While experts look into the challenges that require more investigation and assessment of impact, I’d like to focus on one of the solvable challenges: digital literacy. Digital literacy has a broad definition that includes a set of skills that “…fit the individual for living, learning, and working in a digital society,” (JISC, 2014). While mostly thought of as the ability to use different types of technology, the definition expands to include a deeper understanding of the digital environment, (NMC, 2017). Successful components of digital literacy include accessing, managing, evaluating, integrating, creating, and communicating information in all aspects of life, (UNESCO, 2011). The UNESCO Institute for Information Technology in Education argues that digital literacy is a basic skill equally as important as learning to read, write, and do math, (UNESCO, 2011). Interestingly, when students are taught digital literacy and are allowed to use technology in learning, they grasp math and science more readily and easily than students without this skill, (UNESCO, 2011).

While it is clear that digital literacy is an important skill, a departmental assessment conducted for another class identified digital literacy as one of the biggest impediments to adopting technology. Faculty were adopting technology only in response to industry need. Many professors were eager to learn but unsure how to start using new technology, while others simply did not see value in spending time and energy implementing new learning methods. Among the biggest barriers explored were time, knowledge deficit, and lack of professional development on digital literacy. Therefore, improving digital literacy will prove crucial to promoting more tech adoption in the classroom. Professional development would need to include a conversation on what literacy looks like for each discipline and should include not only online etiquette, digital rights and responsibilities, and curriculum design built around student-facing services, but also the incorporation of the right technology for each context, (NMC, 2017).

The ISTE standard for educators (2c) states that modeling is the “identification, exploration, evaluation, curation and adoption of new digital resources and tools for learning” that can be used in professional development, (ISTE, 2017). So what are effective methods for modeling and facilitating good digital literacy as part of (formal or informal) faculty development?

Peer modeling has been suggested as an alternative to traditional professional development or in-service training. Among the reasons for peer modeling’s success is the fact that it is personalizable and actionable. Faculty can choose the digital literacy topics they are personally interested in and receive one-on-one training tailored to their knowledge gaps and needs, along with hands-on application, (Samek, et al., 2016). George Fox University piloted a peer modeling project after reviewing key data from a digital fluency mentorship program that utilized tech solutions and the pedagogy to support tech use. The program was initially developed to address faculty desire for one-on-one training. From the faculty feedback survey, the program developers learned that faculty are more likely to adopt a tech solution if they see it in action (actionable examples) and are given evidence of positive student learning outcomes. Due to the success of the program, the university has expanded its efforts to other collaborative development, (Samek, et al., 2016).

Learning from George Fox’s example, universities could build resources to offer similar professional development on digital literacy to improve technology adoption. What I particularly like about this idea is that it offers a different way to look at professional development: the mentor can begin as the expert, but the arrangement could later transition into a co-learning model to increase ownership of and interest in technology adoption. This model goes beyond traditional professional development to focus on the real-time needs of each faculty member and to work on existing classroom components. Above all, peer modeling improves digital literacy to increase technology adoption, further developing the 21st century skills of students and teachers alike.

References

JISC, (2014, Dec. 16). Developing digital literacy. [website]. Available from: https://www.jisc.ac.uk/guides/developing-digital-literacies.

New Media Consortium, (2017). Horizon report: 2017 Higher Education. [pdf].  Available from: http://cdn.nmc.org/media/2017-nmc-horizon-report-he-EN.pdf

RSA, (2010, Oct 14).  Changing educational paradigms [Youtube Video]. Available from: https://www.youtube.com/watch?v=zDZFcDGpL4U.

Samek, L., Ashford, R.M., Doherty, G., Espinor, D., Barardi, A.A., (2016). A peer training model to promote digital fluency among university faculty: Program component and initial efficacy data. Faculty Publications, School of Education. Paper 144.  Available from: http://digitalcommons.georgefox.edu/cgi/viewcontent.cgi?article=1143&context=soe_faculty

UNESCO Institute for Information Technology in Education, (2011, May). Policy brief. [pdf]. Available from: http://unesdoc.unesco.org/images/0021/002144/214485e.pdf

Co-learning, Co-teaching, and Cogenerative Dialogues to Improve Learning and Teaching Outcomes

What happens when you allow two people with seemingly different backgrounds to work together? Great collaboration! This was true of a program co-sponsored by the Center for Educational Equity and Big Brother/Big Sister that paired 9- to 14-year-old girls with adult women to learn about computers. The little and big sisters would meet to solve computer problems through a software program called SISCOM, (Wolman, 1986). Together they would dive deep into discussion, take turns leading and learning, and help each other problem solve through a process that provided 20 hours of computer basics instruction, (Wolman, 1986). Not only did the pairs work together to solve their shared problems, but the institutions worked together to provide the necessary resources. This story highlights the successes of co-learning.

Traditional learning environments are generally set up to rely on one “expert” or teacher to lead, with the remaining participants as the learners. The teacher chooses what material to cover and to what extent the participants engage with the material. While this system works on the surface, one of its major problems is that the teacher and students do not interact: “…when teachers and students do not interact successfully, contradictions occur,” (Tobin & Roth, 2005). This leads to the development of negative emotions that can manifest as disinterest, disappointment, and frustration for the students, and job dissatisfaction for the teachers, (Tobin & Roth, 2005). According to Rheingold, one of the appeals of co-learning is that it levels out the hierarchy of the classroom. When Rheingold engages in co-learning, he has everyone sit in a circle because then everyone is visible and everyone has an equal voice, (Rheingold, 2018). Co-learning assumes that the teacher is neither the gatekeeper nor the expert in all subjects and that all participants have something valuable to share and teach about a given concept. Just as in the Big Brother/Big Sister example above, neither the little nor the big sister had an advantage in the learning and teaching of the SISCOM program. Both partners took equal interest in, and found equal value in, what the other knew, shared, and did. This flattened hierarchy increased motivation, engagement, and excitement about learning and teaching, thereby improving learning outcomes and attitudes toward learning, (Tobin, 2014).

A natural extension of co-learning is co-teaching. While co-learning gives all participants an equal voice in learning together, co-teaching takes this a step further by inviting participants to also engage in all phases of the teaching process, (Tobin & Roth, 2005). When implemented, co-teaching occurs between two or more teachers, where one teacher may take on a mentor role. The most important factor of co-teaching is that it is not a mere division of tasks, but rather that teachers participate in the creation of all tasks. Because some of the learning that occurs is subconscious, following through on the process of co-teaching is important, (Tobin & Roth, 2005).

Diagram of the Co-teaching summary
Figure 1.1 Co-Teaching Summary

I’d also like to make a small mention of cogenerative dialogues. Tobin defines cogenerative dialogues as a side component of co-teaching, though they may also be used separately. Cogenerative dialogues involve small groups of about five individuals representing stakeholders (or demographics) who discuss specific incidents in class, including reflection on lessons, (Tobin, 2014). Initially, these discussions can explore what works and what doesn’t in class lessons, but they can also be expanded to the roles of students and teachers, classroom rules, and how to use resources, (Tobin, 2014). The benefit of these independent discussions is that all views and understandings are valued and all explanations are co-generated. This helps ease communication across cultural and socioeconomic boundaries by identifying (and acting upon) contradictions, later improving the quality of teaching and learning, (Tobin & Roth, 2005).

Diagram of summary of cogenerative dialogue theory
Figure 1.2 Summary of Cogenerative Dialogue Theory

Despite the benefits of co-learning, several barriers should be addressed. Rheingold hypothesizes that teachers may be averse to adopting co-learning because of the high level of trial and error that goes along with it, (Rheingold, 2018). Teachers must give up a certain level of control and understand that outcomes will vary from classroom to classroom. While Rheingold is sympathetic to these barriers, he argues that trial and error also offers real-time modeling of problem solving and troubleshooting. The key is to show students how to reflect upon a problem, re-examine it, and adjust to the situation as necessary, (Rheingold, 2018).

Co-learning with a tech twist. The ISTE standard for educators (4b in particular) indicates that teachers “collaborate and co-learn with students to discover and use new digital resources and diagnose and troubleshoot technology issues”, (ISTE, 2017). In short, the standard places importance on the principles of co-learning addressed by Tobin and Roth, in addition to the modeling Rheingold stresses as a key factor of co-learning, by focusing on how technology can foster collaboration while improving troubleshooting skills. I had a particular problem in mind when I chose to explore this component of ISTE standard 4. In my human nutrition class, students conduct a dietary analysis of their own diet. The main feature of this assignment is that students must accurately track their intake over the course of three days, input the data into an analysis program, and then analyze the findings in comparison to the Dietary Guidelines for Americans. The analysis program I had selected for this assignment, SuperTracker (https://www.supertracker.usda.gov/), will be discontinued at the end of this academic year for undisclosed reasons. While the program was not without its faults, I supported the use of SuperTracker because it was a free program easily accessible to anyone with internet access, and it relied on the USDA database, an accurate and reliable set of nutrition data. I am now facing the challenge of reviewing apps and websites for SuperTracker’s replacement. However, the assignment would take on a whole new meaning for students if they were allowed to co-learn from start to finish of this project. For this project idea to be successful, it is important to consider how nutrition-related apps can be leveraged to facilitate co-learning among students and professors regarding modes of nutrition education.

Addressing the ISTE Standard. As I started my search of nutrition-related apps and their feasibility for co-learning, I determined that the credibility of app information should be a top priority. One of the challenges my students face is finding credible information to further their understanding. For as long as I’ve been a professor, we’ve looked at articles and websites and discussed the importance of reviewing them for credibility. However, information is now found in a variety of mediums not limited to digital articles. Students are now using apps, videos, and other multimedia to gather information. Understanding where each medium sourced its information is key to determining credibility. By examining and evaluating the credibility of each app, all members involved in its use would participate in troubleshooting and problem solving, a key component of the ISTE standard.

The sheer number of nutrition apps is staggering, so I decided to narrow my search by starting with a credible source that provides a curated list: the Apps Review section of Food and Nutrition Magazine. Food and Nutrition Magazine is a publication of the Academy of Nutrition and Dietetics (AND). Where AND publishes research through the Journal of Nutrition and Dietetics, the magazine is often viewed as the “lighter” or “practical” side of the dietetics world. Food and Nutrition Magazine features new products, recipes, and research highlights; in short, ways to stay updated in the food and nutrition world. The curated list of apps (https://foodandnutrition.org/tag/apps/) contains the editors’ reviews of new and upcoming apps. Those deemed reliable, credible, and useful make the app list. The apps featured on the list explore a variety of nutrition topics with a potential nutrition education focus, including food safety, physical activity, dining out, and meal planning, in addition to apps that may be used by professionals in a variety of capacities, such as video recording.

The list could serve as a good starting point for facilitating co-learning in the human nutrition dietary analysis project. Having students further explore these apps in pairs (or small groups of three) in relation to the assignment parameters can help facilitate collaboration and co-learning. Adding a presentation element where these pairs teach the class about the usability of their chosen app may invoke the principles of co-learning. Finally, placing students in small, diverse groups and allowing them to reflect on the assignment makes their viewpoints heard as they embark on cogenerative dialogues.

While I initially had my sights set on this curated list for my human nutrition class, some of these apps may help facilitate student-professor collaboration, while others may foster practitioner-patient collaboration, making it very feasible to implement this list in other co-learning scenarios. When both parties are able to contribute to how and why an app is used for various purposes, co-learning is maximized.

References

ISTE. (2017).  ISTE standards for educators. Available at: https://www.iste.org/standards/for-educators

Rheingold, H. (2018). Co-learning: Modeling cooperative-collaborative learning [blog]. Available at: https://dmlcentral.net/co-learning-modeling-cooperative-collaborative-learning/

Tobin, K. (2014). Twenty questions about cogenerative dialogues. In book: Transforming urban education: Collaborating to produce success in science, mathematics and technology education, Chapter 11, Publisher: Sense Netherlands, Editors: Kenneth Tobin, Ashraf Shady, pgs.181-190 DOI: 10.1007/978-94-6209-563-2_11

Tobin, K., Roth, W.M. (2005). Implementing coteaching and cogenerative dialoguing in urban science education. School of Science and Mathematics, 105 (5): 313-21.

Wolman, J. (1986). Co-learning about computers. Educational Leadership, 43 (6), pg. 42. 

Digital Storytelling and Creative Communication: Does One Help Develop the Other?

Alan Alda, from M*A*S*H, knows how to tell a story. In one of his presentations, he asks a young woman to the stage. Alda asks her to carry an empty glass across the stage. She stares at him awkwardly and does it without much fanfare. Alda then walks to her with a pitcher of water. He pours water into the empty glass, filling it to the brim, and asks her to carry the glass to the other side of the stage. “Don’t spill a drop of water or your entire village will die,” he says. The young woman slowly, deliberately walks across the stage. She carefully gauges the level of water in the glass as she takes each step. The audience is silent, enraptured in the backstory of the overfilled glass. They are interested and invested in the story. (Watch Alan Alda explain the importance of storytelling in his video: “Knowing How to Tell a Good Story is Like Having Mind Control.”)

Stories are powerful. Storytelling is one of the oldest forms of communication we have. We are attracted to stories because they are human, (Alda, 2017). Stories relay information about human nature, accomplishments, challenges, and discoveries. They make us feel part of a community and help evoke empathy, (Dillon, 2014). According to Alan Alda, we like stories because we think in stories, particularly if the story has an obstacle. As in the example above, we are interested in listening to the attempts at overcoming the obstacle, (Alda, 2017).

Stories can also be powerful in the classroom. A good story helps shape mental models, motivates and persuades others, and teaches lessons, (Dillon, 2014). There are many ways to deliver a story, but I have been gaining significant interest in digital storytelling. Technology is not static but rather highly personalizable, as people are discovering unique ways to learn, entertain, network, and build relationships using technology, (Robin, 2008). It is not surprising, then, that people are using technology to also share their stories. Digital storytelling is a technique I discovered as I was exploring problem-based learning (PBL) to develop innovation skills. In that blog post, I explained that digital storytelling was one mode students could employ to “solve” a problem in PBL by creating an artifact. I realize that this wasn’t directly related to my inquiry at the time, because problem-based learning is more focused on the process of problem solving than on the artifact itself. Despite this, I found the idea of digital storytelling interesting and wanted to revisit it. “Storytelling” in particular is a buzzword that circles back in unexpected mediums. For example, my husband attended a conference that explored storytelling through data; in other words, how to design graphs, charts, and other visual representations of data that share a story without any significant description or explanation, yet still communicate important information. That got me pondering how digital storytelling can be used to teach students to creatively communicate information, either about themselves or about a topic, using technology.

So how can students use digital storytelling for the purposes of creative communication? This question relates to ISTE Student Standard 6: Creative Communicator, in which “students communicate clearly and express themselves creatively for a variety of purposes using the platforms, tools, styles, formats and digital media appropriate to their goals.” Digital storytelling is one vehicle students can use to express themselves and communicate clearly. Interestingly, the idea of digital storytelling isn’t new; it was originally developed in the 1980s but is experiencing a renaissance in the 2000s, (Robin, 2008). Not only can digital storytelling be a medium for learning, but different types of information can be relayed using this technique, including personal narratives (what most non-ed professionals use), stories that inform or instruct, and stories that examine historical events, (Robin, 2008).

Stories must be well crafted in order to be effective and memorable. Students can deliver a story by investigating a topic, writing a script, developing their story, and tying it all together using multimedia, (Robin, 2008). Blogs, podcasts, wikis, and other mediums like Pinterest can be used to convey a story simply, (University of Houston, 2018). To help students get started, the University of Houston’s Educational Uses of Digital Storytelling webpage offers great information such as timing, platforms, and examples of artifacts.

Figure depicting the digital storytelling process.
Figure 1.1 The Digital Storytelling Process

Before diving into a story, the most important elements are explored in its theoretical framework. This framework includes the seven elements needed for each story to be impactful. Figure 1.2 below summarizes the seven key elements.

Infographic describing the 7 elements of digital storytelling
Figure 1.2 The 7 Elements of Digital Storytelling

Just as Alan Alda explores in his video, the seven elements emphasize that good stories must capture the audience’s attention, explore obstacles or serious issues that the audience can connect with, and be personal in order to enhance and accelerate comprehension, (Robin, 2008). By engaging in digital storytelling, students are also developing crucial 21st century skills: digital, global, technology, visual, and information literacy.

Tying it all together: How does digital storytelling fulfill the requirements for the ISTE student standard on creative communicator?

As Robin alludes to, it can be challenging to distinguish the various types of stories because they often overlap, particularly with the personal narrative, (Robin, 2008). A good story is relatable; we can put ourselves in the shoes of the protagonist. The use of technology is just another medium we can use to communicate our stories. Implementing digital storytelling in the classroom would allow for transformation (SAMR) of existing assignments and lectures. Here are some additional thoughts on how this technique can help students become creative communicators:

  • ISTE 6A: “Students choose the appropriate platforms and tools for meeting the desired objectives of their creation or communication”. Platforms such as blogs and podcasts, in addition to tools such as cameras and editing software, are all components of digital storytelling. By evaluating the various platforms and tools in relation to their desired outcome, students would develop digital, technology, and visual literacy.
  • ISTE 6B: “Students create original works or responsibly repurpose or remix digital resources into new creations”. Though the most common application of digital storytelling would be to create an original artifact, Robin provides an example of remixing: recreating historical events using photos or old headlines to provide depth and meaning to the facts students are learning in class, (Robin, 2008). By curating and remixing existing artifacts, students would develop global, digital, visual, and information literacy.
  • ISTE 6C: “Students communicate complex ideas clearly and effectively by creating or using a variety of digital objects such as visualizations, models or simulations”. This idea goes back to the example I shared of storytelling using data (graphs/charts/figures) but it can also include infographics. Depicting complex data through an interesting visual medium engages digital, global, technology, visual, and information literacy.
  • ISTE 6D: “Students publish or present content that customizes the message and medium for their intended audiences”. The basis of storytelling is that it is meant to be shared with others. If the story doesn’t match the audience, it will not be impactful or important; this is a point the seven elements of digital storytelling stress. Understanding and crafting stories for a specific audience demonstrates digital and global literacy.

Good digital storytelling can allow students to become creative communicators. Technology can reach audiences in ways never before possible while still sharing the human experience. As Robin puts it, in a world where we receive thousands of messages a day across many different platforms, stories become an engaging, driving, and powerful way to share a message in a short period of time, (Robin, 2008).

Resources

[big think channel]. (2017, July 18). Knowing how to tell a good story is like having mind control: Alan Alda. [Video File]. Retrieved from https://www.youtube.com/watch?v=r4k6Gm4tlXw

Dillon, B. (2014). The power of digital story. Edutopia. Retrieved from http://www.edutopia.org/blog/the-power-of-digital-story-bob-dillon

International Society for Technology in Education, (2017).  The ISTE standards for students. Retrieved from: https://www.iste.org/standards/for-students.

Robin, B.R., (2008). Digital storytelling: A powerful technology tool for the 21st century classroom. Theory into Practice, 47: 220-228. DOI: 10.1080/00405840802153916

University of Houston, (2018). Educational use of digital storytelling. Retrieved from: http://digitalstorytelling.coe.uh.edu/page.cfm?id=27&cid=27&sublinkid=75

Lessons from the Six Facets of Understanding and Backward Design Process

For the past ten weeks, my cohort and I have been exploring techniques to get more out of the classes we teach. I have been personally exploring teaching methods that truly achieve student understanding. Interestingly, the authors of the book Understanding by Design argue that our interpretation of the word “understanding” is narrow and doesn’t encompass the word’s full meaning. In my field of higher education, the academic application of “understanding” typically means the “ability to explain”. Students who can explain demonstrate their understanding through academic performance, such as achieving high test scores, or through products such as essays, where they explain how things work, what they imply, and how the concepts are connected, (Wiggins & McTighe, 2005). While this skill is important, we shouldn’t rely solely on explanation to determine whether students understand, as we could potentially deemphasize the other meanings, which hold equal value, (Wiggins & McTighe, 2005). In fact, there are six facets of understanding, which are highlighted in Figure 1.1 below.

Infographic of Understanding by Design's six facets of understanding.
Figure 1.1 The Six Facets of Understanding from Understanding by Design.

One of the best practices for accomplishing student understanding (in one or multiple facets) is to lesson plan using the “backward design” approach. In this approach, educators are encouraged to look at their objectives, identify what they want students to learn and accomplish, then design a lesson plan that achieves those goals. This lesson planning begins by first reviewing and refining objectives and/or learning outcomes. Establishing the lesson plan objectives early on ensures that the ultimate mission of the class is clearly defined; in other words, the objectives help set the destination of the lesson. This step is followed by developing how these objectives/outcomes will be evaluated, setting the road map for the learning journey. Lastly, the actual plan with the learning activities is designed, ensuring that the objectives are appropriately met; this is where the journey begins. Figure 1.2 explores the backward design process from Understanding by Design more in depth.

Figure describing the backward design process.
Figure 1.2 Understanding by Design’s Backward Design Process.

Implementing Backward Design

In our case, it wasn’t enough to understand what backward design is through explanation alone; our cohort was challenged to interpret and apply this design method. We were given the option of designing a new lesson that we would use in the future or choosing an existing lesson to improve. I chose to focus on a unit from a project-based class I teach, whose main focus is mastering scientific writing while also developing research skills. The ultimate assessment item of this unit is a final draft of the “Introduction” and “Methodology” sections of the research paper. This assessment focuses on appropriately and expertly incorporating the components necessary to set the purpose and procedure of the research project.

Lesson Background. Before reaching this assessment, there are several steps that the students must accomplish.  By the time they turn in the final intro and methods draft, the students have already picked their research food (the topic of the research project and paper), created their hypothesis(es), designed their experiment, and are conducting several experiments a week. In order to successfully craft their experiment, they should have prepared a good annotated bibliography, which is the basis for the introductory section of the paper.  

In this introductory section, students develop a mini literature review exploring the properties and potential outcomes of their foods. Students understand that they are showcasing the work and results of other researchers, what literature is missing, and how their experiment contributes to the body of literature. The final paragraph introduces their experiment along with their hypothesis(es).

The methodology section of the paper is a brief, yet descriptive, mention of the procedure for producing the research food, its variations (typically students choose 2 variations), and other relevant how-to details of their experiment. The idea behind these few paragraphs is that anyone should be able to pick up their paper and clearly understand how to reproduce their experiment.

The Challenge. Historically, students struggle with the concept of a “final” draft, submitting for formal evaluation a paper that more closely resembles a first rough draft. Students are then disappointed by their low assessment scores.

From the professor’s perspective, this assignment is frustrating to grade, and it is disappointing to see the low-quality effort from students. Despite the fact that students take an entire class dedicated to research writing prior to this class, it is evident that they have not mastered it. In particular, they struggle with the content of these two sections. The two most common comments made on their writing are that some sections have far too much “fluff” or unnecessary explanation while other sections are too vague or lack clarity. They have a hard time writing concisely but descriptively.

From the student’s perspective (based on course evaluations and face-to-face feedback) the assignment is hard, they need more instruction on the writing process, and they have a misunderstanding of what the term “final draft” means. Students always comment that the writing portion is the most frustrating component of the course.

Students are not motivated to practice writing skills on their own, though they are encouraged to write several drafts prior to the final draft due date. To help understand what content should be included, students examine examples of scientific writing by identifying the necessary components of the intro and methods sections. Students become very good at identifying these pieces yet still struggle to apply them to their own work. This is likely because most students wait to write their first rough draft until the night before the final draft is due, are not familiar with the proper draft writing process, or underestimate the difficulty of scientific writing and do not seek outside assistance.

Revising the lesson. In an effort to resolve frustration from both the professor’s and the students’ perspectives, my mission is to find simple, actionable solutions to address the issues presented above. I would like to see students move away from frustration toward feeling challenged and having the intrinsic motivation to practice becoming great scientific writers. One possible solution is making the draft process more collaborative. Since students become very good at identifying necessary components in the works of others, providing more peer and instructor formative feedback would identify clarity issues and missing content earlier. Students would also be encouraged to review their own work more frequently using the RISE model, addressing the issue of last-minute drafts.

Incorporating more collaboration also provides an opportunity to build digital citizenship. In particular, I wish to address the part of the ISTE digital citizenship standard in which students “develop safe, legal, and ethical behavior” when using technology by having students write their drafts collaboratively in a Google Doc (ISTE, 2017). Another way to implement this standard is through the curation process leading to the annotated bibliography, using the web app Diigo. A second aspect of the digital citizenship standard I wish to address is “responsibly using and sharing intellectual property” (ISTE, 2017). Students will encounter this throughout the class, as they rely heavily on the works of others.

By working backwards to design a solution, I realized that all of the challenges students face in writing the final draft were actually straightforward to overcome once I had the right tools and techniques. My solution did involve significantly rearranging existing helpful class topics, removing unhelpful topics, and adding topics that previous students had identified as missing. Figure 1.3 summarizes the unit lesson planning, with the new topics highlighted in bolded, yellow font.

Chart depicting a summary of the intro and methods unit learning and teaching activities.
Figure 1.3. Summary of the Intro and Methods Unit Learning and Teaching Activities.

As depicted in Figure 1.3 above, the concept of digital citizenship is introduced through an online literature curation process in which students collect, organize, and annotate relevant research articles. This new assignment is a spin-off of an existing assessment, the annotated bibliography; it allows students not only to cultivate new skills but also to better capture information from the articles they read. Students are still required to submit an annotated bibliography, but the artifact has been changed to include self-reflection.

The biggest change in this unit is the introduction of a three-step formative feedback process using the RISE model, in which students receive peer, self, and instructor feedback. This new process will help students write multiple drafts before submitting the final draft. Sharing their work and thoughts is made simpler through Google Docs. This collaborative effort allows students to work together and share their expertise to gain a better understanding of the drafting process.

Final Thoughts on the Backward Design Process.

Wiggins and McTighe admit that it is difficult to follow this design process step by step without fighting the urge to skip ahead or to write one area with another in mind (Wiggins & McTighe, 2005). This was the case for me. The objectives and evaluation criteria were clear, as they were based on accredited standards and the featured elements of scientific writing. The challenge lay in the preparation steps necessary to help students achieve those objectives. The most illuminating moment, however, was the emphasis on the evaluation process. By taking a closer look at my unit planning and through considerable reflection, I realized that missing components were keeping my students from being set up to achieve the desired outcomes. It was as if I had the destination in mind and knew the road I needed to take, but had forgotten which vehicle would get me there most efficiently. Though I fought the urge to jump straight into lesson planning, the backward design process reminded me of what was important for this unit and better equipped me to address existing problems that I was previously unsure how to solve.

What I’ve also learned to appreciate is that, as an educator, you are never quite done with this process. One advantage I had while revising my unit planning was the feedback I had previously received from students. If they hadn’t voiced their frustrations constructively, I wouldn’t have been able to address these issues so specifically. I didn’t need to reinvent the wheel, just fix the small area that was not working. Thanks to their feedback, my design process was streamlined and focused. As I gear up to implement these changes in the upcoming quarters, I look forward to my students’ improved success while remaining cognizant that I will, at some point, need to revisit the backward design process and make small yet significant changes again.

References

International Society for Technology in Education. (2017). The ISTE standards for students. Retrieved from https://www.iste.org/standards/for-students

Wiggins, G., & McTighe, J. (2005). Understanding by design (Expanded 2nd ed., Gale Virtual Reference Library). Alexandria, VA: Association for Supervision and Curriculum Development.

Building Computational Thinking through a Gamified Classroom

Who says playing video games doesn’t teach you anything? Playing and creating games can actually help students develop another 21st-century skill: computational thinking (CT). Computational thinking is a form of problem solving that takes large, complex problems, breaks them down into smaller ones, and uses technology to help derive solutions. In deriving solutions, students engage in a systematic, four-step process: 1) “decomposition,” breaking a complex problem into smaller, more manageable problems; 2) “pattern recognition,” making predictions by finding similarities and differences among the decomposed components; 3) “abstraction,” developing general principles from the patterns that emerge; and 4) “algorithm design,” creating step-by-step instructions that solve not only this problem but similar problems in the future (Google School, 2016). By engaging in computational thinking, “students develop and employ strategies for understanding and solving problems in ways that leverage the power of technological methods to develop and test solutions” (ISTE, 2017). In other words, the key to successfully following this process is that students develop their own models rather than simply applying existing ones (Google School, 2016).
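To make the four steps concrete, here is a minimal, hypothetical sketch (my own illustration, not from the cited sources) that applies them to one small problem: finding the most frequent word in a text.

```python
# 1) Decomposition: break "find the most frequent word" into smaller tasks:
#    clean the text, split it into words, count each word, pick the maximum.
def clean(text):
    """Lowercase the text and replace punctuation with spaces."""
    return "".join(ch.lower() if ch.isalnum() or ch.isspace() else " "
                   for ch in text)

# 2) Pattern recognition: counting occurrences is the same sub-task whether
#    the items are words, letters, or moves in a game.
# 3) Abstraction: capture that shared pattern once, as a general function.
def most_frequent(items):
    counts = {}
    for item in items:
        counts[item] = counts.get(item, 0) + 1
    return max(counts, key=counts.get)

# 4) Algorithm design: combine the pieces into a repeatable procedure that
#    solves this problem and any like it.
def most_frequent_word(text):
    return most_frequent(clean(text).split())

print(most_frequent_word("To be, or not to be"))  # prints "to"
```

The point of the sketch is the shape of the reasoning, not the code itself: the same decompose-recognize-abstract-design sequence applies whether students are programming or designing a board game.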

Figure 1.1 Components of Computational Thinking

In researching ways to apply computational thinking in the classroom, I came across scholarly articles discussing the gamified classroom. I have always been intrigued by this concept; in my own experience, students are much more engaged during class time when the required content is converted into a game. During these game sessions, my role changes from the person delivering the content to the person delivering the game (i.e., asking the questions). The students are responsible for providing the content by proposing solutions to the posed questions, thereby evoking problem-solving and, in some cases, critical-thinking skills. This thread of ideas led me to ask: what are some ways a “gamified” classroom can help develop computational thinking?

To help answer my question, I found two articles that pinpoint game-design models for building computational thinking:

Article 1: Yang & Chang, 2013. Empowering students through digital game authorship: Enhancing concentration, critical thinking, and academic achievement.

Yang and Chang explore how students’ motivation for learning increases when they are allowed to design their own game on a given topic. Significant problem solving occurs during the game design process because of the interaction and immediate feedback the process entails. In addition, students gain higher-order thinking skills such as creativity and critical thinking. The authors mention three game-building software packages that do not require extensive coding skills: RPG Maker, Game Maker, and Scratch. In their study, the researchers investigated the effects of the game design process on seventh-grade biology students using either Flash animation (digital flash cards) or RPG Maker. The investigated effects included concentration, critical thinking, and academic performance. Their results demonstrated that the group using RPG Maker showed significant improvements in critical thinking and academic performance, while no significant difference in concentration was noted for either group.

Article 2: Kazimoglu, et. al., 2012.  A serious game for developing computational thinking and learning introductory computer programming.

Kazimoglu et al. begin their inquiry by providing a few definitions. It is important to understand their terminology, chiefly that any game used for educational purposes is a “serious” game. They acknowledge that several definitions of computational thinking exist, so they create their own, requiring the following elements: 1) conditional logic (true vs. false conditions); 2) building algorithms (step-by-step instructions); 3) debugging (resolving issues with the instructions); 4) simulation (modeling); and 5) distributed computation (social sharing). The authors set out to create a non-threatening introductory programming unit to combat the common student perception that programming is “difficult.” Kazimoglu et al. believe that when students are allowed to engage in game design, they are motivated to learn, which provokes problem solving. They take this approach in their introductory programming class, where they challenge students through a series of exercises using the Robocode platform. At the end of the study, all students successfully completed the exercises, engaging their problem-solving skills.
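Several of these elements can be seen in even a tiny game. The following toy example (my own hypothetical illustration, not taken from the Kazimoglu study or Robocode) shows conditional logic, algorithm building, debugging, and simulation in a turn-based “robot on a grid” game:

```python
def run_program(program, start=(0, 0)):
    """Simulation: execute a list of step-by-step instructions on a grid."""
    x, y = start
    for instruction in program:
        # Conditional logic: each instruction is checked against known moves.
        if instruction == "up":
            y += 1
        elif instruction == "down":
            y -= 1
        elif instruction == "right":
            x += 1
        elif instruction == "left":
            x -= 1
        else:
            # Debugging: an unknown instruction fails loudly instead of
            # silently corrupting the simulation.
            raise ValueError(f"unknown instruction: {instruction}")
    return (x, y)

# Building algorithms: a student-written "program" is an ordered list of steps.
print(run_program(["up", "up", "right"]))  # prints (1, 2)
```

A student debugging why their robot ends up in the wrong square is practicing exactly the CT elements the authors define, without any formal programming syntax beyond a list of moves.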

Conclusions. Interestingly, both of these articles struggle to define “computational thinking” precisely, and both mention that research investigating the extent to which games can develop CT is lacking. However, both agree that CT is best developed when students are the game designers. To that end, both studies included elements of programming instruction to help students successfully build their games.

While these articles offer models for successfully implementing computational thinking through game design and creation, it was a little disheartening to discover that programming instruction was a necessary component. My inclination was to ask how these processes could be implemented or adapted in other classroom scenarios, particularly when programming instruction may not be feasible. Interestingly, not all researchers agree that programming must be involved for CT to be implemented successfully. Voogt et al. argue that although most research on CT involves programming, CT is a thinking skill and therefore does not require programming to be implemented successfully (Voogt et al., 2015). In fact, a literature review conducted by Voogt demonstrated that students do not automatically transfer CT skills to a non-programming context when instruction focuses on programming alone. The strongest indicator of CT mastery was instructional practice that focuses on application (Voogt et al., 2015).

The lack of a standard definition of computational thinking also needs to be addressed. The two articles above and the Voogt researchers agree that discrepancies exist among current definitions of computational thinking. To avoid confusion about the role of programming and related technologies, computational thinking can be defined simply as a way of processing information and tasks to solve complex problems (Voogt et al., 2015). It is a way of examining the similarities and relationships within a problem and following a systematic process to reach a solution. Figure 1.2 summarizes this simplified process.

Figure 1.2 Simplified Computational Thinking Components

In this new context, it is not necessary to program games for students to build computational thinking; allowing students to participate in systematic artifact creation will do the trick. Examples of artifact creation, with or without programming, include remixing music, generating animations, developing websites, and writing programs. The main idea behind the artifact creation process is that students follow procedures that can be applied to similar problems. Figure 1.3 highlights this process.

Figure 1.3 Artifact Creation Process for Computational Thinking

How can this artifact creation process be used to create a gamified classroom? To explore this question, one of my colleagues suggested letting students develop and design their own board game. While the solution seems low-tech, others agree with this strategy. Michele Haiken, an educational leader for ISTE, writes about adapting “old school” games for the classroom to help develop critical-thinking and problem-solving skills (Haiken, 2017). Students can even complete an online “quest” or scavenger hunt, or create a “boss event,” to problem-solve computationally (Haiken, 2017). For more tech-forward solutions, existing platforms and games such as GradeCraft and 3DGameLab can be used to apply computational thinking in a gamified classroom (Kolb, 2015). Regardless of the method, low-tech board games or high-tech game creation through programming, letting students participate in the artifact creation process helps build computational skills that they can then apply to other complex problems to create their own models.

References

Google School. (2016). What is computational thinking? [YouTube video]. Retrieved from https://www.youtube.com/watch?v=GJKzkVZcozc&feature=youtu.be

Haiken, M. (2017). 5 ways to gamify your classroom. Retrieved from https://www.iste.org/explore/articledetail?articleid=884

International Society for Technology in Education. (2017). The ISTE standards for students. Retrieved from https://www.iste.org/standards/for-students

Kazimoglu, C., et al. (2012). A serious game for developing computational thinking and learning introductory computer programming. Procedia - Social and Behavioral Sciences, 47, 1991-1999.

Kolb, L. (2015). Epic fail or win? Gamifying learning in my classroom. Retrieved from https://www.edutopia.org/blog/epic-fail-win-gamifying-learning-liz-kolb

Voogt, J., et al. (2015). Computational thinking in compulsory education: Toward an agenda for research and practice. Education and Information Technologies, 20(4), 715-728.

Yang, Y. C., & Chang, C. (2013). Empowering students through digital game authorship: Enhancing concentration, critical thinking, and academic achievement. Computers & Education, 68(c), 334–344.

Innovation Through Using Problem-Based Learning

Whenever I think of the word “innovation,” I am reminded of the bear, honey, and powerline story. If you are not familiar with this story, I’ll offer a brief synopsis here, though there are other detailed versions available.

Employees of a powerline company met to brainstorm about snow and ice accumulating on power lines and downing them in the winter months. Despite a formal, morning-long session, the brainstorming yielded few results. Frustrated, the team decided to take a short break. Over coffee, one team member reminisced about being chased by a bear while out servicing the lines. After a good laugh, other team members jokingly suggested getting bears to remove the snow and ice by placing honey pots on top of the powerlines. Continuing the joke, one member suggested using helicopters to place the pots. That idea was put to rest when another member pointed out that the vibrations from the helicopters would scare the bears away. Suddenly they realized they had a great solution on their hands: the company could use helicopters to remove the snow and ice through the force and vibration of the helicopter blades. Because of this impromptu brainstorming session, using helicopters to remove snow and ice from powerlines is common practice today.

diagram of a bear, honey, and a helicopter facilitating innovation.
Figure 1.1 A bear, honey, and a helicopter for innovation.

I like this story because it dispels the misconception that to be innovative you must create something new, like a product or a service. Instead, innovation can be a way to problem solve. Much like the process that unfolded in the bear story, students should be encouraged to problem solve in creative ways. Offering students the opportunity to seek, identify, and apply information builds cognitive flexibility, a 21st-century skill (Kuo et al., 2014). Cognitive flexibility encourages the development of the creativity needed for innovation, a concept reflected in the ISTE innovative designer standard, in which “students use a variety of technologies within a design process to identify and solve problems by creating new, useful or imaginative solutions” (ISTE, 2017).

So then, how do you get students to think less about the “correct answer” and more about “bears, honey, and helicopters” for innovation? This can be particularly difficult when students have historically been offered a “right” and “wrong” depiction of problems. Students can be eased into creativity through scaffolding, using the systematic-thinking concept of the creative problem solving model (Kuo et al., 2014). A summary of the model appears in Figure 1.2 below.

Diagram of the Creative Problem Solving Model
Figure 1.2 The Creative Problem Solving Model

The creative problem solving model transitions students through understanding a problem, generating ideas about the problem, and finding solutions to that problem (Kuo et al., 2014). Students evolve their thinking from simple identification to more complex reasoning, ultimately evoking creativity and innovation.

While the creative problem solving model can be used to build cognition through various problem-solving steps, problem-based learning (PBL) can help structure the classroom to achieve self-directed learning. An instructor can start with any question type from the creative problem solving model and allow students to work through that question with PBL. The general process for designing a problem-based classroom is demonstrated in Figure 1.3 below.

Diagram depicting the Problem-Based Learning Process
Figure 1.3 The Problem-Based Learning Process

According to the National Academies Press, a PBL activity focuses on student-centered learning: the instructor acts as a facilitator or guide, and the students work together to gather information and then generate ideas to solve the problem. The problem itself becomes the tool for obtaining knowledge and developing problem-solving skills (National Academies Press, 2011). PBL is not without its faults: students using PBL show slightly lower content knowledge than in traditional classrooms, and students in a group may not share the same level of cognition (National Academies Press, 2011). Despite this, students engaging in PBL retain content longer than in traditional classrooms, are better able to apply their knowledge, and develop a deeper understanding of the content (National Academies Press, 2011).

Putting the Theory into Practice: The Investigation

Several of the classes that I teach are content- or coverage-based. These classes are designed to be foundational, preparing students for higher-level, more in-depth, application-based classes later on. As I was thinking about problem-based learning, I started wondering: how can we expect students to become problem solvers and apply content in more advanced classes when all they are expected to do in these foundational classes is identify a concept? Students don’t really understand the importance of a particular topic because the idea of application and innovation isn’t introduced until they are in another class. To give these coverage-based classes more meaning to students now, I am considering replacing coverage-based activities directly with PBL-based activities. My investigation led me to develop the two guiding questions below, which will help me gather ideas on how to solve this problem. I realize that I am essentially engaging in my own PBL.

Question 1: What are some examples of problem-based, or “idea-finding,” class activities that better support student learning in coverage-based classes? One resource that addresses this question comes from the National Academies Press, which published a summary of two workshops conducted in 2011 on “Promising Practices in Undergraduate Science.” The selected chapter (Chapter 4) summarizes the benefits of problem-based learning and describes three methods that show promise in content-heavy classrooms. Additionally, the chapter provides templates and guiding principles for problem-based activities, case scenarios, and complex problems that are clear, concise, and general enough to be applied to various assignments or learning activities. However, the chapter does not offer specific examples to use as models. Despite this, it is helpful for building theory and gathering initial ideas for PBL in the classroom. Another resource that may help address this question comes from The Creative Classroom Project, a website created by an Erasmus project led by university lecturers in Estonia specializing in digitally enhanced learning scenarios. The website/blog offers not only theory-based ideas but also actual examples of the various methods that use PBL. The professors call these PBL methods “learning scenarios” and base their work on a “trialogical learning design.” Though most of the examples are for primary and secondary education, the formatting is helpful for brainstorming similar scenarios for higher education.

Question 2: How can ICT be used to enhance learning in the examples above? To be honest, I was not sure I would find many examples of how to apply technology in PBL. I was quite mistaken. Depending on the goal or scope of the learning activity, a multitude of tech apps and websites can be applied to the various PBL methods. Here are just a few examples of tech resources that can be used with PBL:

  • LePlanner lesson plan templates from the Creative Classroom Project. This resource provides several examples of specific tools, such as Padlet, Pearltrees, and MindMeister, that can be used to enhance classroom activities. The templates also provide lesson plans (via the LePlanner software) that include descriptions of objectives, class activities that meet those objectives, and even timelines for each activity.
  • Digital storytelling corresponds with the case-study (case scenario) PBL method. According to the National Academies Press chapter, one justification for using case studies is that they are a form of storytelling. Storytelling helps students learn by integrating knowledge, reflecting on ideas, and later articulating them while considering various perspectives (National Academies Press, 2011). Digital storytelling is a way to introduce technology as a problem-solving tool and helps students express their various perspectives. This digital storytelling resource offers background information about digital storytelling, the seven elements of storytelling, and resources (tech solutions) to explore. I had never considered using blogs, Pinterest, and other social media resources for digital storytelling.

The Next Steps.

This investigation has been a great first step in generating ideas for implementing more PBL activities in my content-intensive courses. There seems to be an endless world of possibilities for integrating technology to develop creative solutions and innovation in the classroom. What I find interesting is that my findings mirror the bear, honey, and helicopter story. I discovered that solving my questions doesn’t involve reinventing the wheel, but rather taking ideas and products that already exist and using them in creative ways. For example, I would never have considered using the Pinterest app or even Google Docs as a creative solution for digital storytelling. Nor would I have considered that developing good problem-solving skills in students simply involves asking the right questions.

My process doesn’t end here. If I choose to implement PBL, the next steps will involve the six-step process highlighted in this article to successfully design, implement, and evaluate problem-based learning. I need to carefully consider the major objectives of my course(s) and the amount of time needed for this process. As the National Academies Press suggests, successfully implementing any of the PBL methods takes time, which is not always a luxury in coverage-based classes. Before moving forward, I need to accept that I will not be able to implement PBL with every topic but must carefully select activities that will help solidify the major objectives of the course.

My colleagues and professors have also suggested using alternative models such as human-centered design or Kathleen McClaskey’s Continuum of Choice (see Figure 1.4 below).

Diagram of the Continuum of Choice.
Figure 1.4 McClaskey’s Continuum of Choice. (Continuum of Choice TM by Barbara Bray and Kathleen McClaskey is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License available at: https://creativecommons.org/licenses/by-nc-nd/4.0/.)

I would need to investigate which design model best fits specific course needs, as well as brainstorm what questions need to be asked for problem solving to be effective. Perhaps the answers will be course-specific and may require using different models for different activities to further promote cognitive flexibility.

References

International Society for Technology in Education. (2017). The ISTE standards for students. Retrieved from https://www.iste.org/standards/for-students

Kuo, F.-R., Chen, N.-S., & Hwang, G.-J. (2014). A creative thinking approach to enhancing the web-based problem solving performance of university students. Computers & Education, 72(c), 220–230.

National Academies Press. (2011). Chapter 4: Scenario-, problem-, and case-based teaching and learning. In Promising practices in undergraduate science, technology, engineering, and mathematics: Summary of two workshops (pp. 26-34). Washington, DC. DOI: https://doi.org/10.17226/13099

Pata, K. (2016). Problem-based learning in task-based and inquiry-based scenarios. [Blog post]. Retrieved from https://creativeclassroomproject.wordpress.com/creative-classroom-collection/problem-based-learning/

Incorporating Feedback Loops to Develop An Empowered Student

Being a successful professor means preparing students to be successful. Delivering knowledge-centered classes on a particular topic is no longer the primary task of professors. Gone are the days of large lecture halls, with the professor front and center, exhibiting knowledge for students to somehow absorb. Scholars are now calling for students and professors to engage in a new learning paradigm that provokes the development of specific skills for the 21st century. This paradigm includes teaching five major career skills that are highly sought after by employers today. Mastering these five essential skills means that students: 1) thrive on change by being receptive to feedback; 2) are able to get things done independently and without direction; 3) are open-minded, understand their own biases, and appreciate differences in others; 4) know how to prioritize tasks and are good at influencing the behavior of others; and 5) facilitate activities and relationships within an organization (Kivunja, 2014). This is no easy feat, as skills need time and practice to be cultivated. The first ISTE standard for students calls for the empowered learner as a mechanism to help build 21st-century skills. The empowered learner is one who “…leverages technology to take an active role in choosing, achieving, and demonstrating competence in their learning goals” (ISTE, 2017). An empowered student is at the forefront of their learning, thinking beyond the lecture, and is autonomous because they have intrinsic motivation (Stefanou et al., 2004).

Figure 1.1 Empowered Student Flowchart

So if students need to develop self-determination and become autonomous in order to thrive in the current workforce, are we, as educators, doing our part to prepare them? This question can only be answered positively if we adopt a student-centered approach to teaching. The authors of the book Understanding by Design challenge educators to consider the backward design approach. In this approach, the educator starts with the desired results, determines which indicators are appropriate for measuring those results, and then plans the experiences and/or instruction required to achieve them (Wiggins & McTighe, 2005). When students are informed of the desired results and allowed to take part in the creation process, self-determination and autonomy develop (Stefanou et al., 2004).

It is also important to remember that students are still developing these skills, so simply stating the purpose or goal of an assignment and leaving them to their own devices will not help them develop autonomy. Coupled with the student-centered approach, formative feedback must be included to guide students and remind them of the big-picture results. Formative assessment, when conducted as a feedback loop, helps to “enhance performance and achievement” (Wiggins, 2012). Essentially, this means students are given consistent, ongoing, and immediate feedback as a way to encourage continual practice of skills. Formative feedback is not evaluated formally (i.e., no grades are assigned to it); it does not offer extensive evaluation or advice, nor is it purely praise. Instead, formative feedback offers the student a “gauge of their efforts to reach a goal” (Wiggins, 2012). To provide good feedback, the assessor must first observe, then comment or ask questions about those observations (Wiggins, 2012). Figure 1.2 summarizes Wiggins’s strategy on formative feedback.

Figure 1.2

Putting the Theory Into Practice: The Investigation.

In our digital education leadership program, we were asked to create questions related to the classes we teach and to investigate resources that would aid in addressing the first ISTE standard for students. I teach a nutrition research class whose main purpose is to develop not only students’ research skills but also their autonomy as researchers. Students must investigate a food-related issue, design and implement an experiment, and then report their findings in a final research paper. The class explores the research process, including hypothesis creation, experiment building and testing, and scientific writing. The current challenge is to allow enough freedom for autonomy to develop while providing enough direction to ensure correct research protocol is established.

I began my brainstorming process for a student-centered approach to the issue by first identifying the important design outcomes. I started with a goal: allow students to take their research project into their own hands while working toward a common goal and following the research protocol. Though students will be developing autonomy and need to be self-driven, they will also need appropriate feedback in order to gauge their work at critical points in the quarter. With this goal in mind, two main questions developed: 1) What feedback timeline would be most effective in designing a researcher-centered approach to teaching nutrition research classes? and 2) What computer-driven tools would effectively provide timely and ongoing feedback? The findings of my investigation and potential resources are explored below.

Question 1: What feedback timeline would be most effective in designing a researcher-centered approach to teaching nutrition research classes? Upon further investigation, this question cannot be answered directly: each assessment will vary in scope and length, so a prescribed timeline is not feasible. However, education leaders Hicks and Wiggins both agree that formative feedback is the best fit for a student-centered (or researcher-centered) approach.  As a reminder, formative feedback is not formally assessed; rather, it gives the student/researcher an opportunity to step back and evaluate and reflect upon their own work in relation to their research goals. The timing of feedback should be immediate, ongoing, and consistent (Hicks, 2014; Wiggins, 2012).  Feedback should follow a specific format that neither judges nor evaluates the work.  Hicks references the RISE model (see Figure 1.3) as a way to format formative feedback meaningfully, which is why I have chosen that model as the resource of choice for this question.

Figure 1.3

The RISE model can be used for self-assessment, peer review, or evaluator review in formative feedback.  The process begins by assessing the degree to which the current work meets the goals and objectives of the assignment.  The subsequent steps allow for specific, tangible, and actionable suggestions to the author for improving both the current and future versions of the work. The benefit of using this model is that as the feedback advances toward higher steps, it also involves higher levels of thinking; RISE gets at the heart of student-centered learning by asking students to evaluate and create. I have not yet used this model in action, but I predict potential drawbacks in peer feedback, where students might skip a level or offer judgments without fully understanding the model itself.  These concerns could be addressed with scaffolding and more detailed instruction on the feedback process.

Question 2: What computer-driven tools would effectively provide timely and ongoing feedback? For an assessment item such as a research paper, a collaboration tool such as G Suite or the Google Docs collaboration feature in CANVAS is ideal.  Google Docs is available to anyone with a Gmail sign-in, along with several other features of G Suite, including to-do lists, a calendar, Google Hangouts, and chat, just to name a few.  The Google Docs collaboration feature in CANVAS allows students to access a Google Doc stored in one Google Drive (usually the instructor’s); the owner of that drive then has access to all of the collaboration pages for the class. These collaboration tools are appealing because the docs are easily accessible to students, the professor, or whoever provides the feedback.  Formative feedback is simple to provide using the “comment” feature, and Google Docs tracks changes throughout the life of the document and sends comment notifications to Gmail. Using Google Docs would also help address issues related to equality of work among team members (i.e., whether members are doing their fair share of the collaboration).  To further justify this technology: it would help me improve my current assignment by reaching the Modification and Redefinition levels of the SAMR model, and Google Docs collaboration scores roughly a 14 on the Triple E rubric (according to my assessment of its intended use).

The only downside relates to the collaboration feature in CANVAS. The feature is not intuitive and somewhat difficult for students to access. It is also not well integrated with Google Docs: for example, simply placing students into groups in CANVAS and assigning those groups to a Google Docs collaboration does not automatically give students access to their group’s document in the drive; the instructor has to grant permission to each student manually. The collaboration feature also does not link directly to the gradebook or back to CANVAS, where other course materials and resources are kept.
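Because CANVAS leaves the instructor granting access student by student, the sharing step could in principle be scripted. Below is a minimal sketch, assuming the Google Drive API v3; the function name, roster, and document ID are illustrative assumptions rather than part of any existing workflow:

```python
def build_permissions(roster):
    """Build one 'writer' permission body per student email address.

    Each body follows the shape the Google Drive API v3 expects for
    permissions.create; the emails here are purely illustrative.
    """
    return [
        {"type": "user", "role": "writer", "emailAddress": email}
        for email in roster
    ]


# With API credentials configured, each body could then be applied to the
# group's shared document (doc_id is hypothetical):
#
# from googleapiclient.discovery import build
# service = build("drive", "v3")
# for body in build_permissions(["student1@school.edu", "student2@school.edu"]):
#     service.permissions().create(fileId=doc_id, body=body).execute()
```

This is only a sketch of how the repetitive permission-granting might be automated; it does not address the missing gradebook link, which would require CANVAS itself.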

The Next Steps.

The RISE model and the Google Docs tool were well received by my colleagues when I presented them as resources for resolving my two questions on formative feedback. Not surprisingly, others shared similar concerns about using Google Docs as a collaboration feature in CANVAS. Since Google Docs can be used independently of CANVAS, however, this is not a major issue, particularly because formative feedback is not tied to a formal grade, so a link to CANVAS materials or the gradebook is not necessary.

Interestingly, most of their feedback on these two resources related to implementation: namely, what assessment tools could be used to implement the RISE model, and could Google Apps for Education help facilitate this assessment function? My initial reaction was to create “guiding questions” that students would answer as part of their feedback comments; by answering the questions fully, students would effectively move through the entire model without skipping steps. I have yet to investigate other Google Apps for Education feedback features.  Though I do not have complete answers to these great questions, I do have the beginning of my next investigation: Feedback Implementation.

References

Hicks, T. (2014, October 14). Make it count: Providing feedback as formative assessment. Edutopia. Retrieved from: https://www.edutopia.org/blog/providing-feedback-as-formative-assessment-troy-hicks

International Society for Technology in Education, (2017).  The ISTE standards for students. Retrieved from: https://www.iste.org/standards/for-students.

Kivunja, C. (2014). Teaching students to learn and to work well with 21st century skills: Unpacking the career and life skills domain of the new learning paradigm. International Journal of Higher Education, 4(1), 1. Retrieved from http://files.eric.ed.gov/fulltext/EJ1060566.pdf

Stefanou, Candice R., Perencevich, Kathleen C., DiCintio, Matthew, & Turner, Julianne C. (2004). Supporting Autonomy in the Classroom: Ways Teachers Encourage Student Decision Making and Ownership. Educational Psychologist, 39(2), 97-110.

Wiggins, G., & McTighe, J. (2005). Understanding by design (Expanded 2nd ed., Gale virtual reference library). Alexandria, VA: Association for Supervision and Curriculum Development.

Wiggins, G. (2012, September). 7 keys to effective feedback. Educational Leadership, 70(1).

Wray, E. (2018). RISE Model. Retrieved from: http://www.emilywray.com/rise-model.
