Thursday, December 9, 2010

Congratulations!

Several OPE research associates had successful comps last month and will be commencing their dissertation research.  Min Zhu's topic is the impact of rater effects on inter-rater reliability estimates, and Grant Morgan is studying the performance of latent class cluster models in real-world conditions.

Monday, December 6, 2010

Leigh's AEA Reflections

The AEA conference offered an opportunity to see a new city (I was born only 250 miles away in Dallas, but had never been to San Antonio), reunite with colleagues from across the country, spend quality time with my officemates, and hear about a variety of exciting evaluation techniques. While I was only in San Antonio for two days, I made the most of the time.

I attended one session that featured Michael Quinn Patton discussing a systems/strategy approach to evaluation based on the work of Henry Mintzberg, who published Tracking Strategies: Toward a General Theory of Strategy Formation (2008). Rather than evaluating programs or activities, this method examines strategies and ideas. One example highlighted evaluating end-of-life care for terminally ill patients: the evaluators used a historical approach to review the type of care received in an effort to determine the organization's systems approach. The upcoming issue of New Directions for Evaluation will highlight this type of evaluation.

In another session, we learned about attempts in New Zealand to develop a set of evaluator competencies. The competencies center on cultural awareness and respect, as emphasized in the Treaty of Waitangi (I learned some history, too). Two discussants furthered the conversation by highlighting issues with centering evaluations on cultural competencies (the potential of becoming one-dimensional or merely including rhetoric about inclusion) and the importance of usable evaluation reports that provide recommendations for improvement.

Min, Tara, Ching Ching, and I also met unexpectedly at a quite entertaining session on evaluating higher education coursework. The presenter highlighted real evaluation attempts and efforts to improve from within his institution that were often unflattering (if you give too many As, give fewer As).

Grant and I presented during the final session of the conference on Saturday afternoon. While we had a small group, we had a spirited discussion, and we have even had requests for more information about our work. An enjoyable two days!

Grant's AEA Experience

At AEA’s Evaluation 2010 conference in San Antonio, TX, I presented or co-presented three studies: 1) Coding Open-Ended Survey Items: A Discussion of Codebook Development and Coding Procedures, 2) Estimating Rater Consistency: Which Method Is Appropriate, and 3) Understanding Student Mastery of Higher Education Curriculum Standards. Each presentation was accompanied by a series of thought-provoking questions that resulted in rich dialogue between the audience and me (and my co-presenters, where applicable).

In addition, I attended sessions on the use of multiple comparisons, Rasch modeling, and latent class analysis. The multiple comparison session was presented by Roger Kirk, who authored Experimental Design: Procedures for the Behavioral Sciences (http://www.amazon.com/Experimental-Design-Procedures-Behavioral-Psychology/dp/0534250920/ref=sr_1_6?s=books&ie=UTF8&qid=1291062847&sr=1-6), and the Rasch session was presented by Christine Fox, who co-authored Applying the Rasch Model: Fundamental Measurement in the Human Sciences (http://www.amazon.com/Applying-Rasch-Model-Fundamental-Measurement/dp/0805854622/ref=sr_1_1?s=books&ie=UTF8&qid=1291063163&sr=1-1). I have used both texts and was therefore pleased to have the opportunity to meet them. My interactions with other audience members following these sessions resulted in ideas for future study regarding reliability estimation and latent class clustering. In fact, Dr. Robert Johnson, Min Zhu, and I are currently preparing a manuscript based on the rater reliability study that was presented at AEA. I was also able to touch base with my editor to provide her with a brief update on the progress of the book I am currently co-authoring. Overall, I feel the Evaluation 2010 conference was a success.

Friday, December 3, 2010

Posters!

Several posters now hang in the hallway outside the OPE office here on the Garden Level of Wardlaw College.  Come down and take a look!


Thursday, December 2, 2010

Ashlee's AEA Impressions

I always find AEA to be a conference that both nourishes and energizes me professionally. This year, I kicked off my conference experience in San Antonio with a one-day professional development workshop on Utilization-Focused Evaluation with the revered evaluation thinker Michael Quinn Patton. Over the course of the day, he provided much direction about designing evaluations with results that get used rather than producing reports that “gather dust on a shelf.” In the workshop, Dr. Patton provided convincing support for the utilization practices he suggests by giving examples from his own evaluation work. I left the workshop with a developing utilization toolbox and with the feeling that such evaluations “can be done” with thoughtful planning and diligent work.

During the conference, I had the fabulous opportunity to engage in conversations with Dr. Rodney Hopson and Dr. Henry Frierson, two leaders in the field of culturally responsive evaluation. Each of them gave me many things to ponder, and they also provided highly valuable advice for developing a career in evaluation. I also attended many sessions aligned with my interests in qualitative methodologies, many of which addressed the conference theme of quality as it pertained to qualitative methods. I was thrilled with the methods of data visualization used to demonstrate group collaboration over time in a presentation by a group of design students. Their work revealed some of the ways in which data might be more creatively visualized to effectively communicate evaluation findings to clients and even to better generate discussion about evaluation findings.

Finally, Min Zhu and I presented our study of how providing rubrics to teachers impacts performance assessment scores. We received positive feedback about our work from the audience as well as some thoughtful questions and suggestions that provided direction for further examining the relationships among rubric use, instruction, and student performance assessment scores.

Tuesday, November 30, 2010

Joanna's reflections from American Evaluation Association 2010

I was the primary presenter for one demonstration session at the American Evaluation Association conference, and I also assisted in preparing two additional presentations. My primary presentation focused on cognitive lab methodology, in which you ask an individual to "think aloud" as they engage in a decision-making process such as completing a survey item or solving a multiple-choice test item. This method provides information about the cognitive and affective processes that an individual engages in while making a decision and is often used to validate or improve research instruments. The audience at my demonstration included eight attendees who were interested in conducting cognitive labs, so they posed several logistical and methodological questions. The audience members also provided helpful suggestions for improving cognitive labs in future research. The presentation was well received.

The most valuable presentation that I attended was think-tank session #605, led by Dr. Nathan Balasubramanian. This session focused on defining and measuring teacher quality and effectiveness, which is one of my primary research interests. Balasubramanian shared his work, which has been aimed at creating a system for providing teachers with timely assessment data in his state. In many states, assessment data is obtained too late for teachers to use it to guide instructional decision-making. He has also found a way to report the data so that teachers can better evaluate growth over time by standardizing scale scores across grade levels. Balasubramanian invited representatives from several schools in his state to discuss how they were using this information and how it has impacted instruction and achievement in their schools. The audience also encouraged Balasubramanian to consider other measures of teaching quality beyond standardized tests, and there was considerable debate about the most important indicators of quality teaching and the most appropriate goals for K-12 instruction. The PowerPoint slides for this presentation can be seen at http://comm.eval.org/EVAL/EVAL/Resources/ViewDocument/Default.aspx?DocumentKey=11a532a4-d321-417b-bffa-ac625fb1df56

Tuesday, November 2, 2010

The Power of Photo Voice (PV)

Have you ever looked through old albums, thinking of your life, of how things were or how they could actually be? Have you ever relived happy moments while wandering through those photo albums? Have you ever thought about the power that brings back those memories?

That’s the power of photography: it talks to you and brings back the story of your own life, the story of the people around you, stories that could have been forgotten were it not for the invention of photography.

In this short piece I would like to talk a bit about the photo voice research method, which is one of the most powerful tools for speaking up and telling stories about the different aspects, problems, and issues we face in our lives.

Why the photo? Because it is visual: you can see, feel, and understand it better than black text written on white paper.

And why voice? Because it actually talks to you and explains many things that may not be so obvious to the naked eye.

Photo voice is a participatory action research method in which individuals photograph their everyday realities. Photo Voice (PV) aims to enable people to record and reflect on their communities’ and their own strengths and concerns; to promote critical dialogue; and to reach policy makers (Wang & Burris, 1997). Photo voice puts the power of image creation into the hands of participants by giving them cameras and an opportunity to record, name, and reflect on their lives and their issues. The research component comes through analysis of the images, critical dialogue, and self-reflection after the photos have been taken.

Photo Voice is a research method that falls within participatory action research approaches; thus, PV shares similar principles and research ethics with PAR/CBR.

PV was developed by Wang, Burris, and colleagues in research with village women in rural China. It draws on the community photography tradition of photojournalism and on diverse theoretical traditions centered on bringing marginalized groups into voice and power by naming their own realities based on their own experiences. Some of the main influences include Paulo Freire’s problem-posing approach to critical literacy education, feminist theory, and documentary photography.

A basic PV project comprises the following stages:

A. A planning stage: work is conducted to develop goals and objectives.
B. A recruitment stage: photo voice participants are recruited, selected, and screened.
C. A training stage: participants are trained in the use of cameras and the ethics of photo voice.
D. A discussion and brainstorming stage: participants brainstorm and select themes or issues that will be the basis for photographs.
E. A photo shoot assignment stage (may be conducted several times over a period of weeks).

Photo voice places high ethical demands on researchers. The approaches and methodologies should be used with care and sensitivity. Researchers should have a clear sense of purpose, respect principles underlying photo voice, and have a commitment to stay involved in all stages through to action.

I have been involved in photo voice action research for seven years now. I have to tell you, it helps to make change, it helps to deliver the message, it helps to penetrate minds, and most importantly, it shows the reality.

Monday, November 1, 2010

Congrats to new Ph.D.s!

First, Joanna Gilmore, and now, Farzana Sultana!  Congratulations on your successful defenses! Farzana, it was great to see you today--it's been a while.

Monday, September 20, 2010

Evaluation References

Use the label "References" to share your favorite references and resources for evaluation. If you were just starting out in evaluation, what books and other resources would you want to know about?