"The means is dialogue, the end is learning, the purpose is peace." ~ Founder Dr. Jane Vella

Posts tagged with "Evaluation"

Engaging Graduate Students to Deepen Learning

I was first introduced to Jane Vella’s steps of design and the world of Dialogue Education™ during my graduate studies at Fuller Theological Seminary. To say that my world was flipped upside-down would be an understatement. I found it extremely encouraging to know tools were available for teaching in an academic setting that helped to engage learners and create a strong learning environment.           

Before that moment in time, Dialogue Education was as foreign to me as the countries I had visited. The adage, “We teach the way we were taught,” was a living reality as I lectured to students in a variety of settings and languages. With each lecture, I increased my knowledge of the subject, but something was missing. Apart from an exam at the end of the course, how could I measure the level of learning for each student? I desired greater engagement, yet I feared open discussion because I might be unable to answer the unknown question or, worse, the uncomfortable one.

As an educator, I realize I have much to learn to develop what I refer to as the Optimal Learning Environment (OLE). Basically, the OLE exists at the intersection of methods, objectives, and evaluation within the learners’ cultural context. Incorporating formative evaluation throughout the Eight Steps of Design makes the optimal learning environment possible. Because the cultural context is dynamic, formative evaluation is essential as each step of the design process is formed and implemented. The result is both engagement and learning for every participant.

I recently taught a graduate course in Advanced Homiletics at the Bear Valley Bible Institute International in Denver, Colorado. The academic dean asked if I would focus on expository preaching, but he also wanted a larger portion of the course to address teaching. As a rookie in the arena of Dialogue Education, this was my opportunity to implement what I had learned as well as deepen my own learning. Let me share a few takeaways from this first-time experience.

  1. The Learning Needs and Resources Assessment (LNRA) is critical. The LNRA provided essential information to initially structure the course. I learned personal information about each graduate student, gained an understanding of their strengths and weaknesses, listened to what each learner desired to achieve, and captured a glimpse of their plans for the future. Based on this invaluable information, I determined achievement-based objectives (ABOs) that guided the lesson plans for the week. My classes will not be taught without this information.
  2. Evaluate every step. At the end of each day, I processed what was experienced in the learning environment. This time of formative evaluation enabled me to adjust direction as needed for the next day. While this might be considered education “on-the-fly,” I assure you it was not. I became less concerned about covering an amount of content and focused more on adjusting the content to achieve what these graduate students desired to learn. I will admit that I am far from perfecting the formative evaluation process, but I learned that even a small tweak here and there makes a major difference in the result.
  3. Model the method. In other words, “practice what you preach.” Why say it, when you can show it? I knew that Dialogue Education was as foreign to these graduate students as it was to me years ago. Therefore, if they were going to transfer these concepts into their context, then I needed to model the concepts, design learning tasks that enabled learners to put these concepts into practice, and discuss how the whole process might impact their ministries. By the end of the week, I am positive I learned more than anyone else, but their enthusiasm was clear as they implemented the process and applied the principles and practices of this learning-centered approach.
  4. Feedback is vital for future growth. For the purpose of my own personal development, I followed up the course with an evaluation sent to each graduate student. The design of the evaluation form offered participants an opportunity to share honest feedback and ways to improve the course. The value of the information provided cannot be measured. I have already implemented changes for the future and am confident this iterative approach will continue to strengthen my courses, planning, and teaching.

As I continue to process the experience of the week, additional lessons have surfaced that highlight the value of Dialogue Education. Let me sum up my approach to Dialogue Education in this way:

  • Become a learner, not a teacher
  • Draw upon the experience of others
  • Invite dialogue by posture, not position
  • Equip by providing more learning tasks, less lecturing
  • Grow in application, not information
  • Introduce more strategy to learning, less content
  • Bring passion, not power, to the learning environment

I want to thank my friends at Global Learning Partners for the opportunity to share my experience. When these graduate students engaged in learning through Dialogue Education, the whole process made sense and learning was deepened.

How do you deepen learning in your university or college classes?

*****

Bob Turner (bturner@wetrainpreachers.com) earned his Doctorate in Intercultural Studies from Fuller Theological Seminary, serves as an adjunct instructor for the Bear Valley Bible Institute International, and ministers for the Bastrop Church of Christ in Bastrop, Louisiana. He is married to the love of his life, Sheryl. They have three children and ten grandchildren.

Here are a few additional GLP resources connected to the topic of teaching in academia:

  1. Dialogue Education in the University: First and Last Day
  2. Dialogue Education in the University: From Monologue to Dialogue
  3. Dialogue Education in the University: Creating a Learning Environment
  4. Dialogue Education in the University: Using a Learning Needs and Resources Assessment
  5. Dialogue Education in the University: Starting with the Syllabus

How to Evaluate a Training

Evaluation Checklist

Do you ever get emails asking you to spend a few minutes sharing your thoughts about evaluating a training? I do. And I'm always a little torn because I know whoever asks is eager for a few crisp tips. Instead, I grill them with questions!

The rest of this short blog post is about questions: questions to ask when somebody asks you how to evaluate a training.

Question #1 - When you say evaluate, are you looking for feedback (i.e. how people perceived their experience) or learning (i.e. how well people grasped the skills/knowledge/attitudes being "taught") or something more (see Q4 below on the topic of something more)?

Question #2 - If you are looking for feedback, is it primarily on the design of the training (i.e. course content, structure, sequence, relevance) or on the facilitation (i.e. the way the facilitator listened, guided the dialogue, posed questions, etc)? What kind of feedback will you be able to make use of in the future?

Question #3 - If you are looking to evaluate learning, have you set clear objectives against which to evaluate? Are those objectives written in such a way that you - and the learner - will KNOW when they've been achieved?

Sidebar: I just watched a presentation by Dr. James Zull, author of "The Art of Changing the Brain." His words echoed those of our very own Dr. Jane Vella when he said, "The way we know we know is if our back cortex (area of sensory input) senses an action we initiate with our front cortex (area of motor skills)." I loved hearing that because it made the biology of learning evaluation crystal clear: we know something was learned when we do something with it. That's what achievement-based objectives set us up for!

Question #4 - (With this question I draw a little diagram showing how learning leads to transfer, and transfer leads to impact.) If you are looking for something more, then you probably want to evaluate "transfer" (i.e. how learners use what they got in the learning) or "impact" (i.e. what difference it made to them or those around them). If so, have you set up a plan to capture evidence of learning and then track what happens after the training ends?

By the time I hit question #4, the caller usually pauses their note-taking and says something like "Hmmm. I guess I have to think this through a bit more." And that's when I feel like I've done my work for the day.

What do you say when a colleague asks you to spend a few minutes talking to them about training evaluation? Share it with us in the comments section.