
02 AUG 2015

Reflections on a workshop: case study methodology and qualitative analysis in evaluation

Tags : analysis, case study, evaluacija, evaluation, Judita Jankovic, method, methodology, qualitative, quantitative, research, rigour, workshop
Posted By : Evaluacija
Comments : 0

Guest post by Judita Janković*

I was trained in the social sciences to regard quantitative methodology as sacrosanct in the world of research and evaluation. Moreover, as a social psychologist I was taught that the scientific method was the only right and proper way to do research, through the use of experiments, control groups and statistical testing. Interviews, if they needed to be done at all, followed strict protocols so as not to bring experimenter error into the process.

Although I have regularly used qualitative methodology in my evaluation practice, I have only recently started to fully appreciate it. I have begun to shift away from the ideology I was conditioned into believing was the only way to produce meaningful research and evaluation. I think this started with attending Michael Patton’s workshop at IPDET in Ottawa last year – some participants, I suspect, had ideological heart attacks. This year, I had the opportunity to attend Delwyn Goodrick’s workshop on case study methodology and qualitative analysis at the Evaluators’ Institute in Washington. They were all excellent workshops.

I believe that challenging our own beliefs and practices is an important part of becoming a better and more effective evaluator. It is important not to be caged into thinking that there is only one best way of doing evaluations. After all, we are only talking about different tools in service of a common aim. If one tool is a better fit for purpose, then why not use it? We need to be flexible as evaluators: the world is increasingly complex, and uncovering patterns, outcomes and impact requires effective and adaptable approaches.

So here are my reflections from the workshops at the Evaluators’ Institute this summer (July 2015). Some of these are certainly not new revelations, but they are worth reaffirming:

1. A case study is more than a method; it is a type of evaluation design.

2. A case study is not a case profile. Instead, it is an in-depth analysis that can use both qualitative and quantitative data and information.

3. A case study is not only exploratory (descriptive process evaluations), it can be explanatory (contribution analysis). That is, it can be used to uncover outcomes and impact.

4. A priori definitions of evaluation questions and criteria of success or effectiveness should remain malleable during the evaluation process – evaluators can get these wrong even when basing them on sound theory, experience or pilot assessments. The subjects of our evaluation can reveal additional or different meanings of what success is or means to them, and these emerging criteria should be integrated into the evaluation as it progresses.

5. Rigour is paramount in any evaluation, whether quantitative, qualitative or mixed-methods. The evaluator needs to be able to justify why a case study design has been selected over any other, and to explain the selection criteria for the type of case study chosen (event, programme, school, strategy or individual).

6. Journaling experiential self-reflection on the evaluation process is an important practice in qualitative research, and it is worth doing whenever feasible.

7. Regardless of the method we use, the brain remains our most important interpretative tool.

8. Evaluators have a huge responsibility: they have the power to extend knowledge or perpetuate ignorance (Tuhiwai Smith, 1999, p. 176).

What are your experiences in using case study methodology? Have you used it in outcome and impact evaluations? How have you ensured evaluation rigour in this case?


*Judita Jankovic has been conducting evaluations since the start of her career in market research in New Zealand. She joined the United Nations in 2005 (Food and Agriculture Organisation in Rome) and is currently the Evaluation Officer at the International Civil Aviation Organization (ICAO), based in Montréal, Quebec, Canada. At ICAO, she instituted the first-ever Evaluation Policy to strengthen the evaluation function in the organisation. In addition, she has designed, conducted, managed and successfully completed a number of corporate, programme and project evaluations at ICAO. She has a doctorate in Social Psychology (University of Sussex, UK) and over 15 years of international work experience in New Zealand, Croatia, the UK, Italy and Canada.

Blog "Triput meri jednom seci" by Ivan Tasić is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.
