Creating a Peer Coaching Culture

Coaching culture is prevalent in the business world. A simple internet search on the topic yields many articles and resources with suggestions for building a stronger coaching culture at the corporate scale. In these articles, managers are called to encourage coaching through shared experiences and to incentivize employees to participate successfully. In education, peer coaching has established itself as a useful and resourceful form of professional development in K-12 schools. Peer coaching is so highly esteemed that the department overseeing K-12 public schools in Washington State offers educational grants to support it. Yet in higher education, peer coaching is still in its nascent stages. University professors Victoria Scott and Craig Miner explored peer coaching use in higher education and found that only 25% of institutions use it as a way to stimulate innovation and improvement (Scott & Miner, 2008). Even among institutions that employ peer coaching, peer observation is the most widely used method. This is true for my higher education institution. Every year, faculty complete a Professional Development Plan (PDP) in which they address needs for improvement, innovation, and training. One of the required components of the PDP process is to participate in a classroom observation by a peer and receive feedback. While observation can be a form of peer coaching, the observers are often not trained as coaches, and these conversations tend to explore only course content and audience engagement. Learning outcomes, active learning, and 21st-century skills are largely ignored. Scott and Miner acknowledge that this form of peer evaluation isn't new in higher education but can be limited because it is one-sided and short-term (Scott & Miner, 2008).

As I reflect back on my experience in peer coach training, I realize that there currently isn't a system in place for me to continue my work as a peer coach outside of isolated events. Beyond my personal desire to continue using my peer coaching skills, ISTE also encourages this through its sixth coaching standard, which highlights the importance of continuous learning to improve professional practice. The standard states that coaches "engage in continuous learning to deepen professional knowledge, skills, and dispositions in organizational change and leadership, project management, and adult learning to improve professional practice" (ISTE, 2017). Reflecting on this call to action, I began wondering how to embark on organizational change to develop a peer coaching culture in higher education.

Barriers to establishing peer coaching culture.

Coaching creates an innovation culture in which the team is responsible for solving complex problems, and it supports accountability (Brook, n.d.). In higher education, it allows a department to improve collegiality and provides the moments of reflection necessary for critical discourse to occur (Scott & Miner, 2008). Despite the benefits of promoting a peer coaching culture, no sooner did I start investigating culture implementation than I ran into potential barriers to culture change. Of the various reasons why an institution may not be receptive to peer coaching, lack of vision, isolation, and lack of confidence in collaboration efforts were among the top barriers (Slater & Simmons, 2001).

Lack of vision.  Current institutional culture dictates the success of peer coaching initiatives. Institutions with short-term goals and/or top-down management styles do not foster the qualities necessary for a good peer coaching culture, as it requires support from administrators (Brook, n.d.). Administrators not only give approval for peer coaching to take place but are also vocal in its promotion, participation, and long-term viability (Brook, n.d.). Misconceptions and a lack of understanding of the value of peer coaching lead to poor administrator buy-in. When administrators view coaching as "time-wasting" or merely remedial, its potential is diminished. This ignores large benefits such as attracting and retaining top talent, promoting constant innovation, and maintaining intrinsic motivation and workplace satisfaction for all (Brook, n.d.). The lack of vision may stem not from misconception but rather from a lack of awareness or a knowledge deficit about peer coaching. Administrators without previous exposure to coaching may have trouble envisioning the process, may have logistical questions, or may worry about potential negative outcomes of peers observing one another for the purpose of growth and development (Barnett, 1990).

Professors tend to work in isolation. Professors Victoria Scott and Craig Miner recognize that peer coaching has not been more readily implemented in higher education because professors work autonomously, independently pursuing improvement and innovation through the scholarship of teaching (Scott & Miner, 2008). There is fear that collaboration may remove the academic freedom that professors are afforded, leading to strict and rigid changes in teaching (Scott & Miner, 2008). Another significant barrier for professors is the perceived lack of time. For peer coaching to be successful, the assumption is that efforts are long-term and ongoing. Given other commitments and required scholarly activities, even when a professor intends to participate, actual follow-through is lacking (Scott & Miner, 2008). Professors also fear that their peer coaching efforts will not be rewarded or recognized by their institution, particularly if current policies on promotion do not support such efforts. Scott and Miner argue, however, that peer coaching has been linked to improved course evaluations, which are used for tenure and other promotion efforts (Scott & Miner, 2008).

Lack of confidence in collaboration. Confidence needs to be instilled through a better understanding of the peer coaching process. Scott and Miner define peer coaching as a "confidential" process in which both parties hold no judgment but rather build a relationship on collaborative and reflective dialogue (Scott & Miner, 2008). "No one grows as a leader without the support of others" (Friedman, 2015). Peer coaching works because building trust and rapport is an essential component of the process. Innovation and change happen quickly because peer coaching makes partners honest about goals, holds them accountable to each other, and creates actionable tasks leading to better and more effective outcomes (Friedman, 2015).

The lack of confidence can also stem from inadequate peer coaching training, which results largely from institutional resource allocation. However, continued peer coaching training does not have to rely on monetary resources alone; outside sources such as social networks and the establishment of Professional Learning Cohorts (PLCs) can also be used to support additional training (Brook, n.d.).

Institutional implementation of peer coaching culture.

“When good coaching is practiced, the whole organization will learn new things more quickly and…adapt to changes more effectively” (Mansor et al., 2012). Coaching can serve as a catalyst for change on multiple levels of an institution. Department chair, professor, and educational leader Barbara Gottesman has been working to establish peer coaching in university settings since the nineties. Her book, “Peer Coaching in Higher Education,” highlights numerous case studies in which peer coaching cultures have not only enriched the learning environment but also helped address several of the barriers listed above. Dr. Gottesman argues that a successful coaching culture only functions when specific rules and concepts are in place and all stakeholders adhere to the process (Gottesman, 2009). Figure 1.1 provides a summary of Dr. Gottesman’s peer coaching process.

Infographic summarizing the peer coaching process by Dr. Gottesman
Figure 1.1 Summary of Dr. Gottesman’s Peer Coaching Process

Drawing on the recommendations of Dr. Gottesman and additional business and coaching leaders, the following is a summary of the determinants of a successful peer coaching culture:

  1. A strong link between organizational strategy and developmental focus. The alignment of professional development with tangible organizational goals is the strongest indicator of peer coaching culture adoption; for an organization, each supports the other (Mansor et al., 2012). To establish this link, coaching leaders recommend performing a culture assessment. The assessment should focus on attitudes toward and understanding of peer coaching, along with the institution’s mission, value statements, and vision, and include a review of current policies that may support or inhibit peer coaching practice (Leadership That Works, n.d.).
  2. Administrative commitment. Strong administrative commitment supports proper implementation and addresses resistance to change. To overcome barriers, administrators hold the responsibility for promoting and encouraging the culture. Resistance to change should be managed in a manner that normalizes the emotional impact, meaning that employees’ concerns and voices are heard (Leadership That Works, n.d.). In addition to normalizing the fear, the coaching consulting firm Leadership That Works recommends identifying early adopters who slowly begin incorporating others into peer coaching projects. Early peer coaching successes help build excitement and alleviate the fear of the unknown (Leadership That Works, n.d.).
  3. Sufficient and appropriate peer coaching training. All experts agree that a successful peer coaching culture takes time to establish because good peer coaches need to build skills. The initial need for skilled peer coaches can be met through external coaches, who can provide an outside perspective, training, and fresh ideas (Leadership That Works, n.d.). Once a successful initiation has taken hold, internal coaches can be deployed to further the work. In fact, internal coaches are often more impactful because of their intimate knowledge of the systems and procedures being improved (Leadership That Works, n.d.). When training internal coaches, Dr. Gottesman and coaching leader John Brook argue that complicated peer coaching theories should be reserved for more advanced and skilled coaches; even basic coaching models can be successful (Gottesman, 2009; Brook, n.d.).
  4. Develop a culture of recognition and rewards. Professors Scott and Miner recognize that some reward and recognition should be given to professors who embark on peer coaching projects. However, the rewards must go beyond promotion and tenure. The reasoning, Scott and Miner warn, is that without broader recognition there would be little motivation for senior faculty to participate in projects. Since senior faculty can provide a wealth of experience, their buy-in is imperative for peer coaching success (Scott & Miner, 2008).
  5. Continual learning and development opportunities. The primary purpose of peer coaching is to serve as professional development, with the assumption that the process is ongoing. To support continual learning and development opportunities, constant program evaluation will be important (Leadership That Works, n.d.).

While full-scale institutional change may take time and effort, small changes at the program or department level may help pave the way for larger changes and benefits. Conversations around culture should involve all key stakeholders to gain their perspectives and to address resistance to change and the barriers it creates. Promoting a coaching culture works for business, it works for K-12 education, and it can certainly work for higher education as well.

Resources

Abu Mansor, N.N., Syafiquah, A.R., Mohamed, A., & Idris, N. (2012). Determinants of coaching culture development: A case study. Procedia – Social and Behavioral Sciences, 40, 485-489. Retrieved from https://www.sciencedirect.com/science/article/pii/S1877042812006878

Barnett, B. (1990). Overcoming obstacles to peer coaching for principals. Educational Leadership [pdf]. Available from: http://www.ascd.org/ASCD/pdf/journals/ed_lead/el_199005_barnett.pdf

Brook, J. (n.d.) Common barriers to a coaching culture and how to overcome them. StrengthScope website.  Available from: https://www.strengthscope.com/common-barriers-coaching-culture-overcome/

Gottesman, B.L. (2009). Peer Coaching in Higher Education. Lanham, Maryland: Rowman & Littlefield Education.

Friedman, S.D. (2015). How to get your team to coach each other. Harvard Business Review website. Available from: https://hbr.org/2015/03/how-to-get-your-team-to-coach-each-other

ISTE. (2017) ISTE standards for coaches. Available from: http://www.iste.org/standards/for-coaches

Leadership That Works. (n.d.). 7 steps for developing a coaching culture. Available from: http://www.leadershipthatworks.com/article/5037/index.cfm

Scott, V., & Miner, C. (2008). Peer coaching: Implication for teaching and program improvement [pdf]. Available from: http://www.kpu.ca/sites/default/files/Teaching%20and%20Learning/TD.1.3_Scott%26Miner_Peer_Coaching.pdf

Slater, C., & Simmons, D. (2001). The Design and Implementation of a Peer Coaching Program. American Secondary Education, 29(3), 67-76. Retrieved from http://www.jstor.org.ezproxy.spu.edu/stable/41064432

 

Building Checklists for Effective Engagement Resources

Good peer coaching supports successful lesson plan outcomes. In my last blog post, I explored the value of teacher versus student focus during peer coaching sessions and concluded that when the teacher is focused on improved learning, both the students and the teacher benefit greatly (Vlad-Ortiz, 2018). In peer coaching, the main tasks involve co-planning a lesson and improving upon that lesson to ensure that the activities described facilitate learning in a purposeful manner. For improvement to occur, according to coaching leader Les Foltos, there must be an explicit agreement between the peer and the coach on the definition of “improvement” (Foltos, 2013). This dialogue between the coach and the peer should be specific, as it will drive the focus of the work. Foltos suggests using an effective learning checklist to guide this work and offers a framework of four main improvement focus areas: standards-based, engagement-based, problem-based, and technology-enhanced learning (Foltos, 2013). Once the definition and checklist have been created, the improvement process can begin. Figure 1.1 below summarizes the evidence-based design process described by Foltos.

infographic on the lesson plan improvement process.
Figure 1.1 Foltos’ Lesson Plan Improvement Process

The coach’s responsibility is to stimulate innovation by taking an outside perspective and offering suggestions and resources. Without this distinct perspective, Foltos argues, teachers can’t innovate (Foltos, 2013). By participating in the innovation process, the coach meets the ISTE standard of “contribut[ing] to the planning, development, communication, implementation, and evaluation of technology-infused strategic plans at the district and school levels” (ISTE, 2017).

In my peer coaching relationship, my peer and I have stepped into the improvement process after designing a unit for a blended course. My peer developed several learning activities, along with associated deliverables, that achieved the desired learning outcomes. However, my peer was concerned that the unit was too dry and might be isolating, as students work independently on many activities. Because of these concerns, my peer would like to focus our improvement efforts on engagement. In addition, my peer would like me to offer some good technology options to help enhance engagement. My peer’s request requires an actionable and tangible outcome from me. This got me thinking: how would a peer coach begin exploring technology resources for increasing classroom engagement with their peer?

Creating a criteria checklist.  “Even highly effective collaboration isn’t enough to improve learning” (Foltos, 2013). Effective resources that stimulate engagement must contribute to the learning outcome in some way. Gathering technology resources for collaboration must therefore involve a discernment process that narrows the options down to the best fit with a learning outcome. One way to start building a resource list for my peer could be to create a criteria checklist. I originally got this idea from a YouTube video created by Dr. Dykema-Vanderark, an English professor at Grand Rapids Community College. In his video, Dr. Dykema-Vanderark explores various technology tools that increase classroom engagement. He starts the session by communicating his criteria for a “good” engagement tool that meets his needs: free (or low cost), easy to use, well-designed, and flexible, multi-use. He subsequently presents nine tools he felt best met these pre-established criteria, highlighting their main features and offering suggestions on how to use them. Though the video itself does not address my main question, the ideas behind Dr. Dykema-Vanderark’s process clarify how to begin and how my peer and I could use criteria to explore various tools together. The idea of a checklist is not new, nor is it limited to process improvement. Education consultant Patricia Vitale-Rilley suggests using checklists as a way to manage active learning by students, arguing that checklists are a good tool for facilitating student engagement (Vitale-Rilley, 2015).

Criteria considerations that include engagement characteristics. Edutopia describes engagement as activities that allow students to do something with the material they are taught: students are talking, practicing, and acting on the content rather than passively absorbing it through lecture (Johnson, 2012). Given the participatory nature of this definition, criteria for the resource checklist should include technology that allows for sharing, commenting, and other collaboration features, while excluding any technology resource that simply curates information without a basis for interaction among students. Adding to this, Foltos describes characteristics of engagement-based tasks in his own checklist on effective learning. Some of those characteristics include tasks that are challenging (in a good way), hold intrinsic interest, offer choices, allow students to draw upon existing knowledge and skills, facilitate creation of a product or artifact, and allow students to apply their skills to new situations (Foltos, 2013). While these criteria focus more on classroom activities, they may be used to evaluate technology resources, particularly technology that offers students choices or multi-functionality and allows them to create a tangible product. Resources that let students apply their skills to new technology could also be a consideration.

Next Steps.

Building a resource checklist will help narrow down the list of potential technology tools used in the classroom and aid in selecting the tools that best fit the intended outcome of each learning activity. In moving forward with lesson plan improvement with my peer, we would need to complete the following steps to build a successful lesson:

  • Identify which learning activities would benefit from revision to improve engagement.
  • Identify characteristics of engagement for those selected activities.
  • Establish a tech resource checklist highlighting key features needed to fulfil learning outcomes.
  • Curate technology resources.
  • Compare and contrast technology resources against the checklist.
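For coaches comfortable with a little scripting, the last two steps, curating resources and comparing them against the checklist, could even be sketched in code. The following is a minimal, hypothetical illustration in Python; the tool names and criteria are placeholders chosen for the example, not recommendations from the literature:

```python
# Hypothetical sketch of a tech-resource checklist comparison.
# Criteria and tool names below are illustrative placeholders only.

ENGAGEMENT_CRITERIA = [
    "supports sharing and commenting",
    "offers students choices",
    "facilitates creation of a product or artifact",
    "free or low cost",
    "easy to use",
]

# Each candidate tool is mapped to the criteria it satisfies, as
# judged by the coach and peer together during their review.
candidate_tools = {
    "Tool A": {"supports sharing and commenting", "free or low cost",
               "easy to use"},
    "Tool B": {"offers students choices", "supports sharing and commenting",
               "facilitates creation of a product or artifact"},
    "Tool C": {"easy to use"},
}

def rank_tools(tools, criteria):
    """Return tool names sorted by how many checklist criteria each meets."""
    scored = [(len(met & set(criteria)), name) for name, met in tools.items()]
    return [name for score, name in sorted(scored, reverse=True)]

for name in rank_tools(candidate_tools, ENGAGEMENT_CRITERIA):
    met = candidate_tools[name] & set(ENGAGEMENT_CRITERIA)
    print(f"{name}: meets {len(met)} of {len(ENGAGEMENT_CRITERIA)} criteria")
```

A spreadsheet with one row per tool and one column per criterion accomplishes the same comparison; the point is simply that the checklist turns a subjective browsing session into a documented, repeatable decision the coach and peer can revisit.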

“Many educators need a research-based process for lesson design…to help them create…learning activities” (Foltos, 2013). Using the above process ensures that my peer and I follow an evidence-based practice, keeping our focus on student learning outcomes while increasing active learning for more impactful lesson plans.

Resources

Dykema-Vanderark, T. (2017). Beyond the discussion board: Using online tools to increase student engagement [YouTube video]. Available from: https://www.youtube.com/watch?v=9INVsMsFyH0

Foltos, L. (2013). Peer Coaching: Unlocking the Power of Collaboration. Chapter 7: Lesson improvement process. Thousand Oaks, CA: Corwin Publishing.

ISTE. (2017). ISTE standards for coaches. Available from: https://www.iste.org/standards/for-coaches

Johnson, B. (2012, March 02). How do we know when students are engaged? Available from: https://www.edutopia.org/blog/student-engagement-definition-ben-johnson

Vitale-Rilley, P. (2015). Your classroom environment checklist for student engagement [blog]. Available from: https://blog.heinemann.com/classroom-environs-checklist

Vlad-Ortiz, C. (2018). Peer coaching focus- For teacher or student outcomes? [blog]. Available from: http://digitaleducationblog-cvo.org/peer-coaching-focus-for-teacher-or-student-outcomes/

Peer Coaching Focus- For Teacher or Student Outcomes?

Educators are facing an ever-changing professional landscape. As society evolves into the 21st century, the needs of various industries change, requiring different skills. Teachers are challenged to improve and update their skills, knowledge, and actions to match those needs (Ma, Xin, & Du, 2018). Teachers can’t keep up on their own. “New curriculum, standards, resources/materials, assessments, methodologies, technology, and reforms will not and do not have much impact unless teachers have appropriate access, knowledge, skills and continuous learning opportunities. Teachers require time for reflection, mentoring relationships, collegial interaction, expert role models, and ongoing professional development for any of these changes to be effective” (Becker, 2014). As Becker alludes, the format of professional development is important in providing educators the tools they need to make the changes necessary for successful student impact. To maximize success, professional development is moving away from theory-only, lecture-based models toward more effective personalized learning models such as peer coaching. Studies show that educators participating in peer coaching practice and adopt new strategies more readily, retain and increase skills over time, and are better able to explain teaching/learning models than un-coached educators (Joyce & Showers, 2002). Statistics back these findings: five percent of educators will transfer new skills into practice as a result of theory alone, whereas ninety percent will transfer new skills into practice when theory is combined with demonstration, practice within training, feedback, and coaching (Becker, 2014).

The sixth ISTE standard for coaches encourages this peer coaching model by recommending that coaches “engage in continual learning to deepen content and pedagogical knowledge in technology integration and current and emerging technologies necessary to effectively implement the ISTE Student and Educator standards” (ISTE, 2017). If peer coaching is to be done correctly, should the coaching focus on teacher outcomes or student outcomes? This inquiry comes from my reflections on the skills and strategies used in successful coaching, which are mainly teacher-focused. Given that the learner audience is a peer, coaching efforts logically should be focused on meeting their needs. My hypothesis is that meeting these needs would automatically translate into increased learning outcomes for the students through improved instructional methods. However, in a current peer coaching relationship, we are heavily focused on student learning outcomes rather than on the peer’s needs. Are my peer’s needs being met through meeting the student learning outcomes, or should one be given priority over the other? Below are the results of my investigation, offering both sides of the argument, from which I draw my conclusions at the end.

Evidence for Teacher-Focused Peer Coaching.

There is evidence that peer coaching has a marked effect on professional improvement and classroom implementation. A research study conducted in China looked at the impact of peer coaching on professional development, learning, and the application of that learning in instructional design, attempting to investigate the problem that teachers who had knowledge of certain pedagogies were unable to apply them in the classroom. Twenty peers were coached and evaluated through performance rubrics and teaching videos. The results suggest that personalized approaches such as peer coaching increased learning participation, which improved in-depth learning. In addition, participants were more effective in applying content than those taught through traditional methods (Ma, Xin, & Du, 2018). This study makes the case for keeping the peer coaching focus on instructors for improved teaching outcomes.

Several studies have found peer coaching effective not only in improving teaching modalities but also in personal development. Undergoing the peer coaching process can help teachers become more reflective of their work and therefore better able to identify their own professional development needs (Soisangwarn & Wongwanich, 2014). Ma, Xin, and Du found similar results in their study: by sharing and offering suggestions to other teachers, the peers became more reflective of their own work (Ma, Xin, & Du, 2018). By becoming more reflective, they build emotional intelligence and self-awareness.

Lastly, effective peer coaching can also increase the self-efficacy of teachers. Researchers investigated the effect of peer coaching on the instructional knowledge and self-efficacy of student teachers in a TEFL (teaching English as a foreign language) program. The results indicated increased self-confidence, as the student teachers expressed freedom to ask questions and voice their opinions. Undergoing the process of peer coaching also allowed the student teachers to become self-directed learners, which built self-efficacy (Goker, 2006). The above evidence supports teacher-focused peer coaching because the intent of coaching is to serve as professional development, helping the peer, not the students, improve in both personal and professional skill development.

Evidence for Student-Outcome Focused Peer Coaching.

The evidence for student-outcome focused peer coaching is driven by results. Researchers Joyce and Showers argue that learning how to learn is equally as important as acquiring skills and knowledge for classroom application (Joyce & Showers, 2002). Interestingly, Joyce and Showers make the case that teachers should be treated like students when approaching professional development through peer coaching. They state that for peer coaching to be successful, the pair needs to identify the learning outcomes and select the training components most adequate for successfully achieving those outcomes (Joyce & Showers, 2002). This approach puts the student outcome first by treating the peer as a student and following a similar approach to learning outcomes.

Researchers Scott and Miner explore peer coaching solely for the purpose of improving student outcomes in higher education. They argue that peer coaching is rarely used in higher education due to environmental and cultural factors, including the fact that professors are mostly autonomous, peer coaching can be time-consuming, and outcomes are not tied to tenure or other evaluation efforts (Scott & Miner, 2008). However, when peer coaching focused on improved student outcomes, other evaluation measures, such as course evaluations, also improved (Scott & Miner, 2008). This makes the case for incorporating more peer coaching and feedback, since the predominant feedback mechanism in higher education, course evaluations, typically lacks enough information for true improvement to occur.

Infographic on benefits of teacher vs. student outcome focused peer coaching.
Figure 1.1 Summary of Teacher vs. Student-Outcome Focused Peer Coaching.

Conclusion

The matter of teacher- versus student-outcome driven peer coaching is not an easy debate to settle. Most authors evaluated in this review provided a two-pronged view of coaching, looking at the benefits on both sides. Joyce and Showers concluded their study by explaining that when teachers learn how to learn and consistently use newly acquired skills and strategies well in the classroom, a critical point is reached that impacts students’ development (Joyce & Showers, 2002). Becker agrees: peer coaching can accomplish improved outcomes for both the teacher and the student when implemented in the right capacity, including organizational support (Becker, 2014). These sentiments are mirrored by several other authors and researchers as well. Pam Robbins, author of “Peer Coaching to Enrich Professional Practice, School Culture, and Student Learning,” explains that there are many uses and purposes for peer coaching, from understanding diversity in the classroom to implementing new technologies or improving learning outcomes. Peer coaching is poised to help teachers face many challenges in the classroom and promotes new opportunities (Robbins, 2015). Given all of the above evidence, it can be concluded that peer coaching should focus on both teacher and student outcomes. When done well, both teachers and students benefit.

References

Becker, J.M. (2014). Peer coaching for improvement of teaching and learning [pdf]. Available from: http://radforward.com/blog/wp-content/uploads/2014/01/peer_coach_article.pdf.

Goker, S.D. (2006). Impact of peer coaching on self-efficacy and instructional skills in TEFL teacher education. System, 34, 239-254.

ISTE. (2017) ISTE standards for coaches. Available from: http://www.iste.org/standards/for-coaches

Joyce, B., Showers, B. (2002). Student achievement through staff development [pdf]. Available from: https://www.unrwa.org/sites/default/files/joyce_and_showers_coaching_as_cpd.pdf

Ma, N., Xin, S., & Du, J. Y. (2018). A peer coaching-based professional development approach to improving the learning participation and learning design skills of in-service teachers. Educational Technology & Society, 21 (2), 291–304.

Robbins, P. (2015). Chapter 1: Establishing the need for peer coaching. In: Peer Coaching to Enrich Professional Practice, School Culture, and Student Learning [e-book]. Available from: http://www.ascd.org/publications/books/115014/chapters/Establishing-the-Need-for-Peer-Coaching.aspx

Scott, V., & Miner, C. (2008). Peer coaching: Implication for teaching and program improvement [pdf]. Available from: http://www.kpu.ca/sites/default/files/Teaching%20and%20Learning/TD.1.3_Scott%26Miner_Peer_Coaching.pdf

Soisangwarn, A., Wongwanich, S. (2014). Promoting the reflective teacher through peer coaching to improve teaching skills. Procedia – Social and Behavioral Sciences. 116: 2504 – 2511. Available from:  https://ac.els-cdn.com/S1877042814006181/1-s2.0-S1877042814006181-main.pdf?_tid=aa5bc8ae-6473-42f0-a7e3-a561b25b9b8a&acdnat=1541369407_8987477626b3f7a71d8baf9789f13d8f

 

Managing Common Coaching Miscommunication

If the foundation of effective peer coaching is collaboration, good communication is one of its pillars. Mark Ladin, CMO of Tiger Connect, an IT company, shares this mindset by defining communication and collaboration as one and the same. He argues that both function on the exchange of information; however, without good communication, you can’t have a functioning collaborative relationship that yields productive results (Ladin, 2015). Therefore, eliminating miscommunication in partnerships promotes good collaboration (Lohrey, n.d.). Collaborative communication offers many benefits, including creating flexible work environments that promote trust and familiarity, enhancing decision-making by tackling problems from various angles, and increasing overall satisfaction with the collaboration process (Lohrey, n.d.).

The ISTE Coaching Standard (1D) calls for coaches to implement strategies for initiating and sustaining technology innovations and to manage the change process in schools and classrooms (ISTE, 2017). A peer can feel comfortable enough to implement suggested strategies when good communication between the collaborating peers is established. If good communication is central to collaboration, what miscommunication is common during peer coaching, and what are some strategies to avoid it? This question does not readily yield concrete results on peer coaching alone; rather, the literature offers several explanations for miscommunication, including modes of communication, a variety of communication barriers, and the type of information given.

Modes of communication.

While mode of communication may not be the first thing that comes to mind when considering miscommunication, the impact communication delivery has on comprehension is compelling. According to Willy Steiner, an executive career coach, the balance between communication effectiveness and efficiency differs across face-to-face, telephone, and email communication (Steiner, 2014). He argues that face-to-face communication offers the greatest effectiveness (i.e., it is best understood) while email is the most efficient (i.e., quick). This is further compounded by the three types of communication: visual, verbal, and non-verbal. Face-to-face communication allows for better understanding across all three types, though it is the slowest mode. Email is the quickest mode but tends to promote higher levels of misunderstanding in verbal and visual communication and does not allow for any interpretation of non-verbal cues (Steiner, 2014). A research study on adult learners using information communication technology found similar results. The aim of the study was to determine what type of information communication technology would best support virtual coaching. The results showed that email was useful for the exchange of information but lacked the ability to create authentic communication experiences or relationships, and often led to more miscommunication (Ladyshewsky & Pettapiece, n.d.). Telephone was more effective than email because phone calls offered more verbal cues, while video-conferencing (mimicking face-to-face communication) was just as effective as face-to-face conversation, provided technical issues did not arise (Ladyshewsky & Pettapiece, n.d.). Communication comprehension is therefore a major consideration for avoiding miscommunication. When possible, face-to-face or similar communication modes should be used to build relationships and maximize understanding, while email is limited to information transfer only.

Communication barriers.

Research shows that face-to-face communication best maximizes understanding and relationship building in collaborative partnerships. However, even in face-to-face environments, several barriers can create inadvertent miscommunication. According to The Coaching Room, there are seven potential barriers that may lead to ineffective coaching, summarized in Figure 1.1 below.

[Image: infographic highlighting seven barriers to good communication.]
Figure 1.1 Seven Barriers to Good Communication.

Considering that many of these barriers involve understanding and respecting the coaching peer, developing a good collaborative relationship before working on the mutual project is essential for avoiding miscommunication.

Information miscommunication.

Peer coaching invites the coach to step into a leadership position in which the goal is to collaborate with and facilitate the work of a peer toward a mutual goal. Another area of potential miscommunication stems from how the coach presents information to the peer. Figure 1.2 below lists the information communication errors that may arise in leadership.

[Image: infographic on common communication mistakes.]
Figure 1.2 Common Communication Mistakes

It is important to consider not only how communication is performed but also what is being communicated. The Forbes Coaching Council expands on the communication errors in Figure 1.2 with a focus on information clarity. Miscommunication can occur when the message is not individualized or personal (Forbes, 2018). Using the same strategies, communication techniques, and information with every coaching peer can harm the coaching relationship. Another common miscommunication is the use of vague, generic language or messages, leading to a lack of clarity in direction. The peer is left feeling that they are missing important information or that the information they received was not delivered effectively (Forbes, 2018). To eliminate this lack of direction, clear expectations developed by both parties can promote the shared vision that contributes to better collaboration. The coach should also avoid communicating only negative outcomes; including positive outcomes avoids creating the impression that the shared work is unsuccessful (Forbes, 2018). Lastly, it is crucial that coaches recognize their biases and remember that the process is not about their wants but about the needs of the peer being coached. Business coach Tony Alessandra said it best: “You can choose to connect with others from their perspective, the way they want to be communicated with by modifying your own presentation style; or you can choose to meet only your own needs – facing the consequence of misconnecting with others…” (Alessandra, 2015).

Promoting good communication. Several of the communication barriers addressed above stem from how communication is delivered, what information is delivered, and how each party perceives that information. Good communication is established when both parties feel safe and comfortable and trust one another in their collaborative environment. Both hold responsibility for keeping an open mind about the process and committing to relationship building. Only after good communication occurs between coaching peers can good collaboration exist.

Resources.

Alessandra, T. (2015). Expert advice: How you can prevent miscommunication. Available from: https://www.fripp.com/expert-advice-how-you-can-prevent-miscommunication/

Forbes Coaching Council. (2018). Common communication mistakes to avoid as board directors. Available from: https://www.forbes.com/sites/forbescoachescouncil/2018/01/18/common-communication-mistakes-to-avoid-as-a-board-of-directors/#6f86f4332b44

ISTE, (2017). ISTE standards for coaches. Available from: https://www.iste.org/standards/for-coaches

Ladin, M. (2015). Communication and collaboration: Why they’re one and the same. Available from: https://www.tigerconnect.com/blog/communication-collaboration-theyre-one/

Ladyshewsky, R., Pettapiece, R.G. (n.d.). Exploring adult learners’ usage of information communication technology during a virtual peer coaching experience. Available from: https://espace.curtin.edu.au/bitstream/handle/20.500.11937/32326/227280_153211_Jnl_online_learning_full_paper.pdf?sequence=2&isAllowed=y

Lohrey, J. (n.d.) Importance of promoting collaborative communication in the healthcare environment. Available from: https://smallbusiness.chron.com/importance-promoting-collaborative-communications-health-care-environment-79568.html.

Ramsey, P.G.S. (2008). The twenty biggest communication mistakes school leaders make and how to avoid them. Available from: https://www.corwin.com/sites/default/files/upm-binaries/25868_081218_Ramsey_ch1.pdf.

Steiner, W. (2014). Avoiding communication breakdowns. Available from: https://executivecoachingconcepts.com/avoiding-communication-breakdowns/

The Coaching Room. (2016). 7 barriers to effective communication killing your relationships. Available from: https://www.thecoachingroom.com.au/blog/7-barriers-to-effective-communication-killing-your-relationships

Length of Peer Coaching Session for Successful Planning and Implementation

Building 21st century skills is an imperative focus for most educational institutions. Many education articles and blog posts center on techniques and concepts educators can use to develop these skills in their students. Yet what about educators’ need to build these skills themselves? How can educators learn and gain 21st century skills before teaching and modeling them in the classroom? One proposed method is to provide more professional development, but many researchers argue that traditional presentation-only professional development sessions leave little room for implementation. An early study by Showers and Joyce found that only ten percent of professional development participants implemented what they learned in the classroom. When educators were allowed to practice what they had learned, implementation increased drastically (Showers & Joyce, 1996).

What Showers and Joyce were researching was the concept of “peer coaching.” Peer coaching is a professional development strategy in which colleagues spend time in a collaborative environment working to improve standards-based instruction and to support efforts to build 21st century skills (Foltos, 2013). Peer coaching may take many forms but usually includes a collaborative process in which the teacher leader co-plans activities, models strategies and techniques, and provides observation of teaching and reflection, while avoiding formal evaluation of the peer (Foltos, 2013). Through peer coaching, the collaborating pair begin to build a culture of standards and expectations, increase instructional capacity, support ongoing evaluation, and create a platform for connecting teaching practices to school policies (NSW Department of Education, 2018). Student learning benefits when teachers learn, grow, and change through peer coaching (Showers & Joyce, 1996).

The ISTE standards for coaches define a peer coach’s role as “contribut[ing] to the planning, development, communication, implementation, and evaluation of technology-infused strategic plans at the district and school levels” (ISTE, 2017). Understanding peer coaching best practices is therefore important to effective coaching. Since the coach’s role spans the planning, implementation, and evaluation cycle, I began wondering about effective time spent in coaching sessions with a peer. This question stems from my past role as a nutrition counselor. One of the biggest issues that would come up concerned the appropriate length of a counseling session. Medical insurance allowed for billing in fifteen-minute increments, though fifteen minutes was hardly enough time for any successful progress. There was dissension among professionals about whether 30 minutes or one hour was more effective. My former employer insisted that every session be a minimum of one hour, which felt appropriate for the first, second, and sometimes even third session, yet unnecessarily long after about the fourth. I often wondered: at what point is too much information given, compared to too little, for a coaching session to be effective? Now, as I step into the role of a technology coach, the same question arises: what is a reasonable timeframe for peer coaches to fulfill their roles (i.e., how long should a coaching cycle take)?

My questions, it appears, do not have a straightforward answer. A program called “Incredible Years” offers some guidelines on session number and length, citing that one-hour coaching sessions should occur after every two or three teaching lessons, particularly if the educator is new to the program. More experienced educators may meet less often. Despite these very specific guidelines, the program designers state that they serve as recommendations at best (Incredible Years, n.d.).

Researchers and educational leaders agree that coaching, regardless of its medium, is an individualized process. According to educational leader Les Foltos, peer coaching needs to be personalized to be effective. One of the hallmarks of a good peer coach is making the process manageable for the coaching partner (Foltos, 2013). Time spent on improvement will depend on other time obligations, such as current workload. Rather than focusing on a fixed time minimum, Foltos recommends that the time set aside for coaching be based on the peer’s capacity and readiness for improvement (Foltos, 2013). In fact, peer coaching may never have a clear resolution time; rather, it may be a cyclical process. The key to understanding the process length lies in continual reflection and evaluation of the coaching goal(s) (Foltos, 2013).

Foltos isn’t the only educational leader to suggest the long-term nature of peer coaching; the NSW Department of Education defines peer coaching as a “long term professional development strategy” (NSW Department of Education, 2018). Like Foltos, the NSW suggests a cyclical structure to peer coaching, as outlined in Figure 1.1 below.

[Image: infographic describing the four steps of peer coaching facilitation.]
Figure 1.1 Peer Coaching Facilitation

The peer coaching cycle depends on relationship development and trust building that support open, honest communication and comfort with risk-taking. Once these relationships have formed, the coaching process can be ongoing because professional development needs and goals change. The length may also be naturally determined, as many teachers choose to continue the collaboration even after the initial goal has been met (Showers & Joyce, 1996). Researchers agree that the length of a peer coaching session is less important than the process that is followed. Initial peer coaching sessions should focus on relationship building in which both parties share goals, agree on the coaching process, and establish agendas with topics to explore. A good peer coach helps their collaborative partner establish SMART (specific, measurable, attainable, realistic, timely) goals that build a personalized timeline for meeting their joint objectives (NSW, 2018). Once this process has been followed, subsequent sessions should allow for flexibility and reflection, ensuring the process’s ongoing nature (NSW, 2018).

Though there are many similarities between nutritional and technology coaching, the timeline needs are vastly different. In both instances, the relationship between a coach and their partner is crucial for success. Open, honest communication and risk-taking do not readily occur without a safe, established relationship. However, in technology coaching, the idea is to work with a peer, not a client, to build a collaborative partnership that is long-lasting and transcends any initial short-term goal.

Resources 

Foltos, L. (2013). Peer coaching: Unlocking the power of collaboration. Chapter 1: Coaching roles and responsibilities. Thousand Oaks, CA: Corwin.

Incredible Years, (n.d.) IY peer coaching expectations. Available from: file:///C:/Users/Catalina/Downloads/Peer-Coaching-Dosage-8-16%20(1).pdf 

ISTE, (2017). ISTE standards for coaches. Available from: https://www.iste.org/standards/for-coaches  

NSW Department of Education, (2018). Peer Coaching [website]. Available from:  https://education.nsw.gov.au/teaching-and-learning/curriculum/learning-for-the-future/Future-focused-resources/peer-coaching 

Showers, B., Joyce, B., (1996). The evolution of peer coaching. Available from: http://educationalleader.com/subtopicintro/read/ASCD/ASCD_351_1.pdf

Developing Professional Development as Part of the Community Engagement Project

The community engagement project challenges students to create a professional development session to be presented at a conference of the student’s choosing. As part of building effective digital age environments, as prescribed by ISTE Standard for Coaches 3, I chose to create an interactive session focused on active learning and digital collaboration tools to improve current practices in nutrition education. Technology in nutrition education currently has limited uses but impactful potential. Although nutrition information is plentiful in the digital world, the approach of dietitians and nutritionists has been to increase their presence through blogs, social media, and videos (such as those on YouTube), while the Academy of Nutrition and Dietetics (AND), the representative organization for all dietitians, has set its efforts on instilling a code of ethics and providing information on privacy in the digital workplace. These efforts may help mitigate nutrition misinformation but are often one-sided or engage only limited populations. For example, blogs may allow comments but do not allow for active engagement with the blog topics, nor do they take implementation at the local level into account. Social media platforms such as Facebook, Pinterest, and Twitter allow nutritionists’ voices to be heard but rarely offer collaborative engagement with other experts or communities. The solution is relatively simple, as the digital tools mentioned offer plenty of room for continued collaboration among participants at any level, local or global.

The Academy itself recognizes the potential of technology in nutrition and has published a practice paper on nutrition informatics. Nutrition informatics is a relatively new field in dietetics that addresses technology’s role in health practices. The Academy discusses the potential pros and cons for each of the practice fields in dietetics (clinical, food services, education/research, community, consultation/business) and technology’s potential for growth in each area. In education specifically, the Academy recognizes uses in distance learning, student progress tracking, specialty testing for licensing and certification, and professional course development. However, it does not mention the need for collaboration or for engaging the various audiences that require nutrition education.

In order to bridge this gap and address the ISTE coaching standard, the topic for this professional development proposal is building better nutrition education through digital collaboration tools. The goal of the session is to explore the benefits of active learning through technology aids (edtech) and to implement tools into existing lesson plans, with the following objectives in mind:

  • a) Understand and/or review the importance of active learning (evidence-based practice)
  • b) Become familiar with collaborative edtech tools
  • c) Engage with edtech tool selection criteria and best practices
  • d) Explore ways to incorporate digital tools in lesson plan scenarios.

Professional Development Session Elements

In this one-hour session, participants will be invited to explore the main topic through both face-to-face and online collaboration as the group navigates a website developed specifically for the presentation. Since all of the major content is available online, there is no need for note-taking, allowing participants to remain engaged throughout the session. Elements of the session include a pre-session technology self-assessment, an online group discussion via Padlet, think-pair-share activities, and self-reflection elements submitted during and after the session. More details on these elements are provided below.

Length. The Academy hosts local sub-organizations in each state. I chose to develop this professional development session for local dietitians and nutrition educators, with the opportunity to present at the local education conference held annually. The requirements of this local organization state that all educational sessions must be a minimum of one hour, to meet the CEU (continuing education unit) minimum for registered dietitians. Considering that the DEL program devotes entire classes to active learning and digital tools, this length will limit the depth of information presented. However, the ability to continue collaborating with both participants and presenter will allow for resource sharing after the session has ended.

Active, engaged learning with collaborative participation. Participants will be encouraged to participate and collaborate before, during, and after the session for a fully engaging experience. The audience will be asked to intermittently review elements of the presentation website, available here, as they discuss key points with the participants next to them. See Figure 1.1 for lesson plan details.

Building Better Nutrition Education Through Digital Collaboration Tools
Objectives

Session Goal: Introduce ways to incorporate digital collaboration tools into existing nutrition education lesson plans.

Learning Objectives: At the end of the session participants will:

  • a) Understand and/or review the importance of active learning (evidence-based practice)
  • b) Become familiar with collaborative edtech tools
  • c) Engage with edtech tool selection criteria and best practices
  • d) Explore ways to incorporate digital tools in lesson plan scenarios
Performance Tasks

  • Participants will complete a self-assessment prior to the session
  • Participants will demonstrate understanding of active learning by submitting an informal Google Forms quiz in session
  • Participants will engage with collaborative edtech tools by submitting responses during the session
  • Participants will identify their own digital tool needs by completing a case scenario
  • Participants will submit a self-reflection via Flipgrid post-session
Plan Outline

  • Session Introduction (5 mins)
    • Prompt and participation: Padlet Q&A. Describe a time you attended a great education session; what made that session great?
    • Review of self-assessment (completed prior to session)
  • Importance of active learning- evidence-based practice (5-10 mins)
    • Review of evidence: Google form quiz (embedded in site)
    • How can digital tools help? (5-10 mins)
  • Choosing the right digital tool (10 mins)
    • Triple E Framework rubric
    • Criteria for choosing the right digital tool
  • Tips on incorporating tools into existing lesson plan (10 mins)
    • Video Tutorial (take home message/resource)
  • Active practice (10 mins)
    • Case scenarios: Flipgrid response
    • Flipgrid self-reflection
  • Questions (5 mins)

Total session length: 60 mins.

Figure 1.1 “Building Better Nutrition Education through Digital Tools” Session Lesson Plan.

Before the presentation, participants will be invited to complete a Google Forms self-assessment poll addressing their comfort and knowledge with technology tools as well as their current use of technology tools in practice. During the presentation, the audience will be prompted to participate in think-pair-share activities and to respond to collaboration-tool prompts on Padlet, Google Forms, and embedded websites. After the presentation, participants will be encouraged to summarize their learning by submitting a Flipgrid video.

Content knowledge needs. The session content begins by establishing the importance of active learning as evidence-based practice, to meet objectives a) and b). Just as motivational interviewing and patient-centered practice are desirable in nutrition, active learning that invokes 21st century skills is evidence-based and an education standard. The content then shifts to teacher-focused how-tos for digital tools, including how digital tools can help, how to select the right digital tool, and how to incorporate that tool into an existing lesson plan, addressing objectives c) and d). My assumption is that participants who are not comfortable with technology may be fearful or lack the motivation to explore various tools. Group collaboration, modeling, and gentle encouragement through case studies may help mitigate these fears.

Teachers’ needs. While the majority of the session focuses on introductory content on active learning and digital tools, teachers’ needs in digital tool management can be addressed through coach/presenter modeling. Simple statements such as “I created this Flipgrid video to serve as a model for students” or “This Google Form was hyperlinked to gauge students’ understanding so far” can serve as a basis for exploring class management and digital tool management within the limited time. The website itself offers a section of FAQs exploring questions and misconceptions about active learning and digital tools. Beyond these resources, the audience will be introduced to technology coaching and may choose to consult a coach at their own institution.

In addition to modeling, three tutorial videos are available on the website to help teachers begin creating their own active learning lesson plans using the backwards design model. Each tutorial features closed captions created through TechSmith Relay for accessibility. Google Sites was also chosen because content is made automatically accessible to viewers; all the website creator has to do is use the appropriate heading styles and include alt text for pictures, figures, and graphs.
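The accessibility pattern described above, a clean heading hierarchy plus alt text on every image, can also be checked programmatically. As a rough illustration only, the Python sketch below uses the standard library to flag two common problems in an HTML fragment; the fragment and file names are hypothetical and this is not part of the Google Sites workflow itself, which handles most accessibility markup automatically.

```python
from html.parser import HTMLParser

class AccessibilityChecker(HTMLParser):
    """Flags images without alt text and skipped heading levels
    (e.g., an <h3> appearing directly after an <h1>)."""

    def __init__(self):
        super().__init__()
        self.issues = []
        self.last_heading = 0

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        # An <img> with no alt attribute is invisible to screen readers.
        if tag == "img" and "alt" not in attrs:
            self.issues.append(f"img missing alt text: {attrs.get('src', '?')}")
        # Heading levels should descend one step at a time.
        if tag in ("h1", "h2", "h3", "h4", "h5", "h6"):
            level = int(tag[1])
            if self.last_heading and level > self.last_heading + 1:
                self.issues.append(f"heading level skipped: {tag}")
            self.last_heading = level

# Hypothetical fragment resembling the session website's content.
fragment = """
<h1>Building Better Nutrition Education</h1>
<h3>Objectives</h3>
<img src="triple-e-rubric.png">
<img src="lesson-plan.png" alt="Sample lesson plan outline">
"""

checker = AccessibilityChecker()
checker.feed(fragment)
for issue in checker.issues:
    print(issue)
```

A checker like this is mainly useful when content is exported from Google Sites or embedded elsewhere, where the automatic accessibility support no longer applies.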

Lessons Learned through the Development Process

One of the major challenges of developing this project was understanding the needs of the target audience. Because nutrition informatics is relatively new, technology use has not been standardized in the profession, so estimating the audience’s prior knowledge and use of digital tools was difficult. My assumption is that technology use and attitudes about technology will vary. The website attempts to break information down to a semi-basic level. The only assumption I made was that the audience has a good background in standard nutrition education practices. I also developed the Technology Self-Assessment for the audience to complete prior to the session as a way to gain insight into current technology use and comfort, so that I may better tailor the session to that particular audience’s needs.
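Because the self-assessment is collected before the session, its responses can be tallied quickly to decide how much introductory content the audience needs. The sketch below assumes a hypothetical CSV export of the Google Form; the column names and response categories are illustrative, not the actual form fields.

```python
from collections import Counter
import csv
import io

# Hypothetical export of the pre-session Technology Self-Assessment.
export = io.StringIO(
    "participant,comfort_with_tech,uses_digital_tools\n"
    "1,Low,No\n"
    "2,High,Yes\n"
    "3,Medium,No\n"
    "4,Low,No\n"
)

rows = list(csv.DictReader(export))
comfort = Counter(row["comfort_with_tech"] for row in rows)
tool_use = Counter(row["uses_digital_tools"] for row in rows)

print("Comfort levels:", dict(comfort))
print("Current tool use:", dict(tool_use))

# A simple tailoring rule: if most respondents are not highly comfortable,
# lead with introductory content rather than the case scenarios.
emphasize_intro = comfort["Low"] + comfort["Medium"] > comfort["High"]
print("Emphasize introductory content:", emphasize_intro)
```

In practice the same tallies can be read from the Google Forms summary view; the script simply makes the tailoring decision explicit.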

I realized while developing the lesson plan that I only have time for a brief introduction to these very important topics. If I were to create a more comprehensive professional development series, I could expand the content into three one-hour sessions: 1) an introduction to the theory of collaborative learning, addressing the importance of digital tools in nutrition education and establishing the need for active learning; 2) selecting, evaluating, and curating tech tools, allowing educators to become familiar with available tools based on individual need; and 3) lesson plan development integrating collaboration tools, a “how-to” session in which participants create their own plan to implement. I had not anticipated that length would be a barrier; however, if the audience truly has limited digital familiarity and comfort, perhaps beginning with an introduction to these topics is sufficient.

One positive lesson I have learned is that trying new things, such as creating a Google Site, can be very rewarding. I had never experimented with Google Sites before this project, and I am quite happy with the final website, though the perfectionist in me wants to continue tweaking and editing content. I originally aimed to create slides for this presentation but realized that I am attempting to convince a possibly skeptical audience of the benefits of digital tools, so using the same old tool would not allow the scope of modeling I desire.

I must admit that before this project, I had a hard time placing myself in the role of a “tech coach” because I continually saw each concept through the lens of an educator and how to apply it to my own teaching. It has been difficult to take a step back and realize that I am still teaching, just in a different context. Creating the step-by-step tutorials was the turning point, where I envisioned the audience modeling their lesson plans on the example I had given. I hope I have the opportunity to present this session at the educational conference and bring the ideals of active learning and digital tools to professionals working in various education settings.

The Connection between Digital Competence and Problem-Solving

The word “troubleshooting” most often invokes images of a conversation with the IT department: a progression of actions guided by the technician and performed by the user, ending with a resolution in which the user’s original knowledge of technology has not been augmented. Unfortunately, this is an all too common scenario. The user defers all troubleshooting responsibility to a third party because of unfamiliarity with, or a knowledge deficit in, technology. This is not limited to consumers and companies; there is concern that students also do not troubleshoot well. According to the ISTE coaching standards, coaches should help teachers and students “troubleshoot basic software, hardware, and connectivity problems common in digital learning environments” (ISTE, 2017). While calling IT or passing responsibility to another party, such as a teacher, is common practice, learning to troubleshoot is a beneficial 21st century skill because it helps develop digital competence.

Why is digital competence important?

Like all 21st century skills, digital competence is highly sought in the ever-evolving workforce. Training Industry, an e-magazine, published an industry-perspective article on digital competence highlighting the need for competence from the top of the organization chart down. The author believes that the tech world today encompasses “VUCA”: volatility, uncertainty, complexity, and ambiguity. The role of those working in tech today is to navigate this VUCA world seamlessly, and one way to do so is to reinforce digital competence (Newhouse, 2017). The industry definition of digital competence includes not only knowledge of technology but also understanding digital environments, effectively creating and consuming digital information, communicating and collaborating with diverse stakeholders, innovating rapidly, thinking critically and solving problems, and maintaining security (Newhouse, 2017). This definition was derived from new European Union definitions and involves five major facets, summarized in Figure 1.1 below.

[Image: infographic on the five major facets of digital competence.]
Figure 1.1 Facets of Digital Competence

What role does “digital competence” play in helping students problem-solve and troubleshoot online/technology issues?

One issue is the general assumption that because students grew up with technology, or are considered digital natives, they automatically build digital knowledge or know how to use technology well (Hatlevik et al., 2015). However, in order to use technology well, students need to build digital competence and literacy. According to researchers Hatlevik, Gudmundsdottir, and Loi, building digital competence is complex and involves various factors, as summarized in Figure 1.2 below.

[Image: infographic on the key elements for developing digital competence.]
Figure 1.2 Developing Digital Competence

The researchers recognize that these facets are essential to cultivating a deep understanding of technology while promoting critical reflection and creative use of digital skills. These qualities in turn develop problem-solving skills in both independent and collaborative settings (Hatlevik et al., 2015).

Beyond knowledge deficits about how to perform troubleshooting tasks, researchers suggest that when demanding conditions, such as completing an assignment, become difficult, self-regulation and autonomy may suffer (Koole et al., 2012). These difficulties can be cognitive, motivational, implementational, or a combination of these factors. While this theory is debated, meta-analyses indicate that low intrinsic-value activities (such as homework) may lower complex problem-solving abilities like those required for troubleshooting (Koole et al., 2012). Along with motivational issues, students may resign themselves to believing that there is only one correct path or resolution to a specific problem and that the educator is the gatekeeper of the solution. Rather than seeking the solution for themselves, students prefer to go straight to the source, which develops learned helplessness (Miller, 2015).

How can students develop digital competence?

Digital competence is a complex concept that spans social, motivational, personal, cultural, and technical understandings; therefore, there is no straightforward way to develop it.  However, educators play a big role in establishing foundations for competence that may lead to better problem-solving and troubleshooting in two major ways:

  1. Allowing for self-directed learning. A consensus exists that students need to be reflective about their own learning (Miller, 2015; Plaza de la Hoz et al., 2015).  The role of the educator then shifts to providing resources, including digital tools, that allow students to experiment through active participation and engagement.
  2. Changing the class culture. The attitudes and beliefs of the educator also signal the importance of digital competence to students. If the educator places low importance on digital competence, students learn not to value or develop these important skills.  The educator can establish new beliefs, resources, and structures to promote a culture of answer-seeking through appropriate digital tools and tool use. Lastly, students must build self-efficacy through trial and error in a safe environment.

While researchers are still investigating efficient methods for developing these competencies, all sources agree that in order for students to be successful in the 21st century, educators must open up the path to new technologies, new pedagogies, and new attitudes that help build digital competency (Miller, 2015; Plaza de la Hoz et al., 2015).

Resources

Hatlevik, O.E., Gudmundsdottik, G.B., Loi, M. (2015). Digital diversity among upper secondary students: A multilevel analysis of the relationship between cultural capital, self-efficacy, strategic use of information, and digital competence. Computers & Education. 81: 245-353. Available from: https://drive.google.com/file/d/0B5W5P9bQJ6q0RFNib3A5Vm9wWWM/view

ISTE, (2017). ISTE standards for coaches. Available from: https://www.iste.org/standards/for-coaches

Koole, S.L., Jostmann, N.B., Baumann, N. (2012). Do demanding conditions help or hurt regulation? Available from: https://drive.google.com/file/d/0B5W5P9bQJ6q0M0QzalRBa0FfTXM/view

Miller, A. (2015, May 11). Avoiding learned helplessness. Available from: https://www.edutopia.org/blog/avoiding-learned-helplessness-andrew-miller

Newhouse, B. (2017). Closing the digital competence gap. Available from: https://trainingindustry.com/magazine/issue/closing-the-digital-competence-gap/

Plaza de la Hoz, J., Mediavilla, D.M., Garcia-Gutierrez, J. (2015). How do teachers develop digital competence in their students? Appropriations, problematics, and perspectives. Available from: https://www.researchgate.net/publication/301914474_How_do_teachers_develop_Digital_Competence_in_their_students_Appropriations_problematics_and_perspectives

Developing Evaluation Criteria for EdTech Tools

Digital tools in the classroom are an asset to learning. According to the U.S. Department of Education, technology in the classroom ushers in a new wave of teaching and learning that can enhance productivity, accelerate learning, increase student engagement and motivation, and build 21st century skills (U.S. Department of Education, n.d.).  The offerings of technology tools for the classroom are plentiful as priorities shift to support a more integrated education. Educators now have several options for cultivating digital tools to better engage students, promote active learning, and personalize instruction. But choosing the right tools can be challenging, especially considering that educators face a seemingly overwhelming array of options. How can educators filter through all of the options to select the best tool(s) for their classroom?

Enlisting the help of a technology coach who can systematically break down the selection process to ensure that the most appropriate tools are used is part of the solution.  In keeping with best practices, the third ISTE standard for coaching (3b) states that in order for tech coaches to support effective digital learning environments, coaches should manage and maintain a wide array of tools and resources for teachers (ISTE, 2017).  In order to cultivate those resources, coaches themselves need a reliable way to select, evaluate, and curate successful options. Much like an educator may use a rubric or standards to assess an assignment’s quality, coaches can develop specific criteria (even a rubric) to assess the quality of technology tools.

Tanner Higgin of Common Sense Education understands the barrage of edtech tools and the need for reliable tech resources, which is why he published an article describing what makes a good edtech tool great.  The article seems to be written more from a developer’s point of view on app “must-haves”; however, Higgin also references a rubric used by Common Sense Education to evaluate education technology. He mentions that very few of the tech tools reviewed receive a 5 out of 5 rating, which makes me assume that Common Sense Education has a rigorous review system in place. I was curious to learn what criteria they use to rate and review each tool, so I investigated their rating process.  In the about section of their website, Common Sense Education mentions a 15-point rubric, which they do not share. They do share, however, the key elements included in the rubric: engagement, pedagogy, and support (Common Sense Education, n.d.). They also share information about the reviewers and how they decide which tools to review. This information serves as a great jumping-off point for developing criteria to select, evaluate, and curate digital tools. Understanding the thought process of an organization that dedicates its time and resources to this exact purpose is useful for tech coaches in developing their own criteria.

Continuing the search for technology tool evaluation criteria led me to several education leaders who share their process through various blog posts and articles.  Reading through the criteria suggestions, a common theme started to emerge. Most of the suggested criteria fit under the umbrella terms defined by Common Sense, with a few modifications, which are synthesized in Figure 1.1 below.

Infographic with suggestions on evaluation criteria
Figure 1.1 Digital Tool Evaluation Criteria Suggestions

There is consensus among the educational leaders, who placed emphasis on the engagement and collaboration features of a tool. Tod Johnston of Clarity Innovations noted that a good tech tool should allow for personalization or differentiation of the learning process and let the instructor modify the content as needed for each class (Johnston, 2015).  ISTE author Liz Kolb added that tools that allow for scaffolding better support differentiation (Kolb, 2016). Both the Edutopia and ISTE authors agreed that the sociability and shareability of a platform are important for engaging students with wider audiences (Hertz, 2010; Kolb, 2016).

While engagement was a key element of selecting a tech tool for the classroom, even more important was how the tool fared in the realm of pedagogy: first and foremost, the technology needs to play a role in meeting learning goals and objectives (Hertz, 2010).  Secondly, the tool should allow for instructional best practices, including appropriate methods for modeling and instruction with the device, and functionality for providing student feedback (Hertz, 2010; Johnston, 2015). Another pedagogical consideration is the ability of the platform to instill higher-level thinking rather than “skill and drill” learning (Kolb, 2016). Specific pedagogy frameworks such as the SAMR and Triple E models have been created and can be used in conjunction with these principles.

Support and usability were also among the top concerns for evaluating these tools.  Cost, and which desired features were accessible at each price tier, was one such concern, particularly when students needed to create an account or provide an email address (Hertz, 2010). Hertz called this issue free vs. “freemium,” meaning that some apps allow access to only limited functionality of the platform, while full functionality can be accessed only through purchase of premium packages. If a platform was free, the presence of ads would need to be assessed (Hertz, 2010). In terms of usability, features such as an easy interface, instructor management of student engagement, and separate teacher/student accounts were desirable (Johnston, 2015). Along with cost and usability, app reliability and compatibility with existing technology were also listed as important features (Johnston, 2015).

The evaluation processes themselves varied from curated lists of the top tech tools to criteria suggestions to completed rubrics.  If those don’t quite apply to a specific evaluation process, a unique approach would be to convert the rubric into a schematic like the one shared by Denver Public Schools, where each key evaluation element is presented as a “yes” or “no” question with a “yes, then” or “no, then” response, following a clear, decisive trajectory toward approval or rejection.
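That kind of yes/no schematic lends itself to a simple sketch. The questions below, their ordering, and the function name are my own invented illustration, not the actual Denver Public Schools criteria:

```python
# Hypothetical sketch of a yes/no decision flow for tool approval.
# Each question is a gate: the first "no" ends the trajectory in rejection.
QUESTIONS = [
    "Does the tool support a stated learning objective?",
    "Does it meet student-privacy requirements?",
    "Is it compatible with existing classroom technology?",
    "Is the cost (or freemium tier) acceptable?",
]

def evaluate_tool(answers):
    """Walk the questions in order; reject at the first 'no' answer."""
    for question, answer in zip(QUESTIONS, answers):
        if not answer:
            return f"Rejected: {question}"
    return "Approved"

# A tool that fails the compatibility gate never reaches the cost question.
print(evaluate_tool([True, True, False, True]))
```

A real schematic would branch into "yes, then" follow-ups rather than a flat list, but the gate-by-gate logic is the same.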

What I’ve learned through the exploratory process of developing evaluation criteria for tech tools is that it is not necessary for a tool to meet every single criterion. Even the educational and tech experts reviewed in this blog emphasized different things in their criteria. In his blog, Tod Johnston suggests that there is no right or wrong way to evaluate technology tools because this isn’t a cookie-cutter process.  Just as all teachers have a different style and approach to teaching, so will their style and approach to using tech tools differ. The key to evaluating tools is to find the one that best fits the teacher’s needs (Johnston, 2015).

Resources

Common Sense Education., (n.d.). How we rate and review. Available from: https://www.commonsense.org/education/how-we-rate-and-review

Hertz, M.B., (2010). Which technology tool do I choose? Available from: https://www.edutopia.org/blog/best-tech-tools

ISTE, (2017). ISTE standards for coaches. Available from: https://www.iste.org/standards/for-coaches

Kolb, L., (2016, December 20). 4 tips for choosing the right edtech tools for learning. Available from: https://www.iste.org/explore/articleDetail?articleid=870&category=Toolbox

Johnston, T. (2015). Choosing the right classroom tools. Available from: https://www.clarity-innovations.com/blog/tjohnston/choosing-right-classroom-tools

Vincent, T. (2012). Ways to evaluate educational apps. Available from: https://learninginhand.com/blog/ways-to-evaluate-educational-apps.html

U.S. Department of Education, (n.d.). Use of technology in teaching and learning. Available from: https://www.ed.gov/oii-news/use-technology-teaching-and-learning

Culturally Relevant Learning Environments: Examples in Nutrition

How you learn is built into the larger part of who you are; it embodies your collective experiences, norms, beliefs, and values, and it is a part of your culture. Building community in the learning environment, whether on- or offline, establishes safety, facilitates collaboration, and can help cultivate a sense of self and one’s role in the community. The ISTE standard for coaches calls coaches to “create and support effective digital age learning environments to maximize the learning of all students… by model[ing] effective classroom management and collaborative learning strategies to maximize teacher and student use of digital tools and resources and access to technology-rich learning environments” (ISTE, 2017). In order to maximize these resources for learning, we need to establish a technology environment that engages students’ cultural backgrounds and understandings.

Building community can be particularly difficult in an online environment, where social cues, particularly non-verbal ones, may be more challenging to interpret and are often misinterpreted.  This becomes compounded when factoring in cultural languages and exchanges. These exchanges are not limited to ethnic cultures but also include generational cultures, where task interpretations may take on different meanings.  For example, the task of investigating three community food resources may be interpreted and approached differently by students who are very familiar with technology than by non-traditional students or students who have limited access to technology.  Coaches can help instructors build understanding of the cultures present in a classroom and implement successful learning strategies through culturally relevant pedagogy (CRP).

What is CRP and why is it important?

McCarther defines culture as an “amalgamation of human activity, production, thought, and belief systems” (McCarther, 2017). “Culture is fundamental to learning” (Pitsoe, 2014). Each student brings to the classroom a “fund of knowledge” shaped by their culture that influences who they are, what they believe, and how they think (Cavalli, 2014). It is easy to understand that students bring all of themselves, represented through culture, to their learning, but does how they are taught represent them and their culture?  In 1995, researcher Gloria Ladson-Billings coined the term “culturally relevant pedagogy” (CRP) in response to the fact that students learn best when their ideas and voice are shared and appreciated by the world (McCarther, 2017). CRP invites educators to create socially just spaces and structures for students to share their voice by using teaching strategies that support the use of cultural knowledge, previous experiences, and unique performance styles familiar to the diverse students in the classroom (Cavalli, 2014; McCarther, 2017).  According to Ladson-Billings, student learning success encompasses academic success, cultural competence, and sociopolitical consciousness. CRP is not prescriptive but rather flexible and ever-changing in response to the cultures unique to a particular classroom (McCarther, 2017). Good implementation of CRP in the classroom involves four key components, as described by Pitsoe and summarized in Figure 1.1 below.

Infographic of CRP Components
Figure 1.1 Components of Culturally Relevant Pedagogy

Understanding how students learn, the reality of their world today, and what skills they need to challenge the existing systems is crucial to the implementation of CRP.

Need for CRP in Nutrition

The need for CRP in nutrition education is great. Nutrition is incredibly personal as we all eat certain foods for a variety of different reasons. Most reasons for eating are linked to social and cultural norms rather than a strong connection to health (though cultural eating is linked to maintenance of health).  Nutrition practitioners and educators need to be aware of the delicate interplay between culture and health as new foods and traditions are introduced to the diet. Presenting nutrition information in a culturally relevant manner helps engage individuals by giving them the appropriate context and tools to facilitate change. Below are two examples that help illustrate the need for CRP in nutrition counseling:

In the article “Culturally tailored post secondary nutrition and health education curricula for indigenous populations”, the authors investigate the types and number of culturally relevant nutrition and health programs offered to students studying for an allied health degree and seeking to work with Alaska Natives.  There is a need for such training, as Alaska Natives currently face a disproportionate rate of chronic disease development, particularly when Western diets substitute for the traditional diet (McConnell, 2013). After a brief review, the authors found very limited curriculum related to culturally appropriate/relevant nutrition counseling that included spirituality, respect for elders, and personal relationships with the land, waterways, and animals (McConnell, 2013).  The information they found was limited to stand-alone culturally tailored courses that the authors considered “dead-end” trainings: short term and offering only non-transferable skill-building (McConnell, 2013). After a more comprehensive search, the authors found limited offerings of post-secondary training that resulted in a mainstream credential. Reasons for the limited availability were hypothesized to be related to funding, oral culture, the researchers available for study, or a mix of these (McConnell, 2013).

The authors’ rationale for a culturally tailored curriculum is very interesting: they argue that the more effective nutrition counseling approach was not to create courses for the indigenous patients themselves, but rather to train future nutritionists/dietitians, through additional credentials, to tailor teachings that align with the food norms and beliefs of the target population. This aligns with CRP principles, which hold that it is the role of the instructor to understand the culture of the class or client, not the reverse, as it is more effective to receive education in a context that is culturally familiar and resonates with clients (Pitsoe, 2014).

When considering my own continuing education options, to my knowledge there is no post-secondary continuing education ending in a credential available to nutritionists/dietitians on culturally appropriate/relevant counseling. However, when implemented well, CRP can deliver results.  Another article, “Adaptation of a Culturally Relevant Nutrition and Physical Activity Program for Low-Income, Mexican-Origin Parents With Young Children”, described a community intervention nutrition program designed around social learning theory to help low-income Hispanic families decrease rates of childhood obesity.  This five-year program gave individuals in the intervention group $25 a month to spend on fresh fruit and vegetables while participating in family nutrition and physical activity nights.  As part of the model, the researchers used the “Anchor, Add, Apply, and Away” approach, where participants would share food memories from childhood, share stories of life as an immigrant, problem-solve by learning to make a new recipe with local foods, and share what was learned at the end of the process (Kaiser et al., 2015). Parents were also asked to provide examples of what they did to promote nutrition and physical activity in their family; this served to give ideas to and motivate others in the group.  At the end of the program, parents reported that children spent less time watching TV or playing video games, did more physical activity, and either maintained or lost weight (Kaiser et al., 2015). This article explores a patient-centered approach to culturally relevant nutrition education where success was gained not only through cultural food norms and values but also by encouraging the exploration of new foods through social learning theory.

Implementation of CRP in Nutrition Classes

There is a demonstrated need for more culturally relevant pedagogy in nutrition education, particularly considering that using the same teaching techniques on all students does not set them up for sustainable success when the cultural aspects of nutrition are not fully incorporated.  This raises the question: What are some approaches and examples of using culturally relevant pedagogy in nutrition classes?

According to Pitsoe, in order to maximize learning, teachers must first understand the cultures represented in their classrooms and incorporate that understanding into their lessons (Pitsoe, 2014).  To help with this, the Milwaukee Public Schools offers a list of questions to help teachers gain a better understanding of their students. Figure 1.2 examines these questions.

Infographic of questions building CRP
Figure 1.2 Questions for Building Culturally Relevant Practices from Milwaukee Public Schools

Once the class culture is understood, the next step is to select instructional strategies that effectively engage that culture. Some ways teachers have successfully implemented this include using cultural mythology to open discussions about a topic, conducting an environmental study of pollution in the local community, or investigating the nutrition status of the local community (Cavalli, 2014). These strategies could also be expanded to include discussions on the impacts of technology on food culture and generational culture.

A master’s thesis by A.C. Cavalli provides a fuller example of CRP as implemented in an urban science class setting. Her approach to CRP involved taking an eleven-lesson unit and blending strategies to incorporate not only direct teaching but also guided inquiry and community investigation.  A summary of her approach can be found in Figure 1.3 below.

Figure depicts CRP Lesson Planning
Figure 1.3 A. Cavalli’s CRP Lesson Planning Example

By modeling and providing examples for instructors on building culturally relevant lessons, coaches can help teachers better develop online strategies that incorporate cultural relevance to enhance learning and build better online communities.

References

Cavalli, A. C., (2014). Teaching nutrition and health in the urban science classroom- A blended approach to culturally relevant and problem based learning. Education and Human Development Theses, The College at Brockport [website].  Available at: https://digitalcommons.brockport.edu/cgi/viewcontent.cgi?article=1547&context=ehd_theses

ISTE, (2017). ISTE standards for coaches. Available at: https://www.iste.org/standards/for-coaches

Kaiser, L., Martinez, J., Horowitz, M., Lamp, C., Johns, M., et al. (2015).  Adaptation of a culturally relevant nutrition and physical activity program for low-income, Mexican-origin parents with young children. Centers for Disease Control and Prevention [webpage]. Available at: https://www.cdc.gov/pcd/issues/2015/14_0591.htm

McConnell, S., (2013). Culturally tailored post secondary nutrition and health education curricula for indigenous populations. Int J Circumpolar Health. Available online at: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3748461/

Milwaukee Public Schools, (n.d.). Culturally responsive practices. Available at: http://mps.milwaukee.k12.wi.us/en/Families/Family-Services/Intervention—PBIS/Culturally-Responsive-Practices.htm

Instructional Coaching: Using Rubrics to Quantify Qualitative Data for Improved Teaching Outcomes

Feedback can be a powerful tool to improve teaching and learning. Through feedback, new perspectives can be gained as teachers begin to discern what is and isn’t working in their current instructional methods. Feedback also offers suggestions for achieving the goals and standards that drive an educator’s work. There are four different types of feedback: formative, summative, confirmative, and predictive. Formative feedback occurs before an intervention takes place, such as giving students feedback on an assignment where the feedback does not impact the final grade.  I explore the benefits of formative feedback in this post. Summative feedback occurs after an intervention, such as when students turn in an assessment and the feedback provided relates to the grade outcome (Becker, 2016). Predictive feedback occurs before any instruction has taken place, to ensure that the method will be effective, while confirmative feedback occurs well after summative feedback, to ensure that the methods are still effective (Becker, 2016).  Of the four types, formative and summative feedback are the most widely used evaluations in educational institutions.

At the end of each quarter, two types of summative evaluation are collected for each of the classes I’ve taught: quantitative and qualitative data to assess my performance as a professor and the course outcomes.   The quantitative portion uses a Likert scale ranging from 1 = strongly disagree to 5 = strongly agree, and at the bottom of the evaluation form there is a section where students can provide comments, intended to give constructive feedback for classroom improvement.  While the comments are not always written constructively (I am addressing this through a mini-module students are required to complete for all of my classes), it’s mainly the common themes that present themselves in the evaluations that are powerful influences on improving my classes.  However, what I’ve learned is that most of the time, the summative feedback is simply too late to improve the current student experience, because the issue can’t be addressed until the next time the course is offered. As a technology and instructional coach, in order to help other educators improve their teaching outcomes, I would require more timely feedback that utilized both quantitative and qualitative assessment measures. While most learning management system (LMS) platforms can offer a multitude of analytics quantifying data such as exam scores, class averages for assignments, and average engagement time on the platform, there isn’t an explicit way to collect or quantify qualitative data.
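To make the quantitative side concrete, here is a minimal sketch of how Likert-scale responses could be summarized for timely review between quarters. The survey items, response values, and flag threshold are all hypothetical, invented for illustration:

```python
# Hypothetical sketch: summarizing 1-5 Likert-scale course evaluations and
# flagging items whose average falls below a chosen threshold.
from statistics import mean

# Invented example items and student responses (1 = strongly disagree ... 5 = strongly agree)
responses = {
    "The instructor communicated clearly": [5, 4, 4, 5, 3],
    "Assignments supported the learning outcomes": [3, 2, 4, 3, 2],
    "Feedback on my work was timely": [4, 5, 4, 4, 5],
}

FLAG_THRESHOLD = 3.5  # items averaging below this merit attention

for item, scores in responses.items():
    avg = mean(scores)
    flag = "  <-- review" if avg < FLAG_THRESHOLD else ""
    print(f"{item}: {avg:.2f}{flag}")
```

The same loop could run on mid-quarter check-in surveys, which is exactly the timeliness that end-of-quarter summative evaluations lack.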

The ISTE standard for coaching states that coaches should “coach teachers in and model effective use of tools and resources to systematically collect and analyze student achievement data, interpret results, and communicate findings to improve instructional practice and maximize student learning” (ISTE, 2017). If an LMS can collect quantitative data that can be assessed throughout the quarter (through summative feedback), could it also be used to quantify qualitative data (i.e., comments) for improved teaching outcomes?  To answer this question, I’d like to address it in two ways: 1) establish an understanding of the value and importance of self-reflection on assessments, and 2) address how rubrics can help quantify qualitative data.

Importance of self-reflection.  Self-reflection can give several insights into the effectiveness of teaching.  According to the Virginia Journal of Education, self-reflection is a method to support current strengths and identify areas of improvement, including continuing education or professional development needs. Educators may seek out self-reflection in order to review past activities, define issues that arise throughout the quarter/semester, understand how students are learning, modify a class due to unexpected circumstances, or address whether or not the teacher’s expectations have been met. Overall, self-reflection improves teacher quality (Hindman & Stronge, n.d.).

Educators may sometimes make decisions based on emotions when deciding whether or not an element worked well in the classroom. However, without context to justify that decision, emotions are not a clear indicator of outcomes. Self-reflection puts a process in place through which educators can collect, analyze, and interpret specific classroom outcomes (Cox, n.d.).  Though there are various ways to perform self-reflection (see Figure 1.1), what matters most is that the process is thoroughly completed.

Figure on Cox's Types of Self-Reflection
Figure 1.1 Cox’s Types of Self-Reflection.

For an instructional coach, following the proper self-reflection steps would be a great way to begin the discussion with someone wanting to improve their teaching. An instructional coach would help the educator:

  • Understand their outcome goals,
  • Choose the data collection/reflection method best suited to meet these goals,
  • Analyze the data together to identify needs,
  • Develop implementation strategies to address needs.

Because the process is general, it can be modified and applied to various learning institutions. Given my coaching background as a dietitian, and similar to my clients’ needs for change, I would also include questions about perceived barriers to implementing change.  These questions would include a discussion of any materials or equipment the educator would deem necessary but that may be difficult to obtain or that may require new skill sets to use fully.

Using rubrics to quantify qualitative data. Part of self-assessment includes using rubrics, in addition to analyzing data, goal setting, and reflection. According to the Utah Education Association (UEA), using a rubric helps to address the question “What do I need to reach my goals?” (UEA, n.d.). Rubrics present expected outcomes and expected performance, both qualitative qualities, in quantifiable terms. Good rubrics should include appropriate criteria that are definable, observable, and complete, and that include a continuum of quality (UEA, n.d.).

If rubrics help quantify qualitative data, then how can rubrics assess reflection?  DePaul University tackled that very question; its response raised more questions, including: What is the purpose of the reflection? Will the assessment process promote reflection? How will reflection be judged or assessed? (DePaul, n.d.).  Educational leader Lana Danielson remarks on the importance of reflective thinking and how technological, situational, deliberate, or dialectical thinking can influence teaching outcomes. Poor reflective outcomes, according to Danielson, are a result of teachers not understanding why they do the things they do; great teachers are those who know what needs to change and can identify the reasons why (Danielson, 2009).   Figure 1.2 describes the four types of reflective thinking in more detail.

Infographic on the four modes of reflective thinking
Figure 1.2 Grimmett’s Model of the Four Modes of Reflective Thinking

Developing rubrics based on the various types of reflective thinking will help quantify expectations and performances to frame improvement. The only issue with this model is that it is more diagnostic than quantifiable.  A more specific rubric model, developed by Ash and Clayton in 2004, involves an eight-step prescriptive process including:

  • Identifying and analyzing the experience,
  • Identifying, articulating, and analyzing learning,
  • Undertaking new learning experiences based on reflection outcomes (DePaul, n.d.)

The Ash/Clayton model involves developing and refining a rubric based on learning categories related to goals.  All of the qualities related to the learning categories are defined and refined at each stage of the reflection process. More information on the eight-step process can be found in the DePaul resource (DePaul, n.d.).

Regardless of the reflection assessment model used, coaches can capture enough criteria to create and use rubrics as part of a self-reflection process that can help improve teaching outcomes through new awareness and identified learning needs that may block improvement. Most LMS platforms support rubrics as part of assessment in various capacities (some only support rubrics on designated “assignments” but not on features like “discussions,” for example).  Each criterion includes quality indicators that are also associated with a number, making the qualitative data quantifiable, similar to the way “coding” in qualitative research allows for quantifiable results. New rubric features allow for a range of quality points on common criteria and freeform responses, allowing for the possibility of modifications to the various reflection types. Because of these new functionalities and the myriad rubric uses in LMSs today, creating a good-quality rubric is now the only obstacle to implementing rubrics for self-reflection.
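A minimal sketch can show how this “coding” of rubric levels into numbers works. The criteria names and quality levels below are invented for illustration; any real reflection rubric would substitute its own:

```python
# Hypothetical sketch: converting qualitative rubric ratings into numbers,
# the same way "coding" in qualitative research yields quantifiable results.
RUBRIC_LEVELS = {"beginning": 1, "developing": 2, "proficient": 3, "exemplary": 4}

def score_reflection(ratings):
    """Map each criterion's quality level to points; return totals for comparison."""
    points = {criterion: RUBRIC_LEVELS[level] for criterion, level in ratings.items()}
    total = sum(points.values())
    return {"points": points, "total": total, "average": total / len(points)}

# Invented example ratings for one reflection
ratings = {
    "identifies and analyzes the experience": "proficient",
    "articulates learning": "developing",
    "plans new learning experiences": "exemplary",
}
result = score_reflection(ratings)
print(result["total"], result["average"])  # prints: 9 3.0
```

Once scored this way, reflections from different quarters (or different instructors) can be compared on the same numeric scale, which is what makes the qualitative data usable alongside LMS analytics.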

References

Becker, K. (2016, August 29.) Formative vs. summative vs. confirmative vs. predictive evaluation. Retrieved from: http://minkhollow.ca/beckerblog/2016/08/29/formative-vs-summative-vs-confirmative-vs-predictive-evaluation/

Cox, J. (n.d). Teaching strategies: The value of self-reflection. Retrieved from: http://www.teachhub.com/teaching-strategies-value-self-reflection.

Danielson, L. (2009). Fostering reflection. Educational Leadership. 66 (5)  [electronic copy]. Retrieved from: http://www.ascd.org/publications/educational-leadership/feb09/vol66/num05/Fostering-Reflection.aspx

DePaul University, (n.d.) Assessing reflection. Retrieved from: https://resources.depaul.edu/teaching-commons/teaching-guides/feedback-grading/Pages/assessing-reflection.aspx

Hindman, J.L., Stronge, J.H. (n.d). Reflecting on teaching: Examining your practice is one of the best ways to improve it. Retrieved from: http://www.veanea.org/home/1327.htm

ISTE, (2017). ISTE standards for coaching. Retrieved from: https://www.iste.org/standards/for-coaches.

Utah Education Association., (n.d.) Self-Assessment: Rubrics, goal setting, and reflection. [Presenter’s notes]. Retrieved from: http://myuea.org/sites/utahedu/Uploads/files/Teaching%20and%20Learning/Assessment_Literacy/SelfAssessment/Presenter%20Notes_Self-Assessment_Rubrics_Goal_Setting.pdf
