The following table provides an overview of the basic methods to collect data.
| Method | Overall Purpose | Advantages | Challenges |
| --- | --- | --- | --- |
| questionnaires, surveys | when you need to quickly and/or easily get lots of information from people in a non-threatening way | can complete anonymously | might not get careful feedback |
| interviews | when you want to fully understand someone's impressions or experiences, or learn more about their answers to questionnaires | get full range and depth of information | can take much time |
| documentation review | when you want an impression of how the program operates without interrupting it; drawn from a review of applications, finances, memos, minutes, etc. | get comprehensive and historical information | often takes much time |
| observation | to gather accurate information about how a program operates, particularly about its processes | view the operations of a program as they are occurring | observed behaviours can be difficult to interpret |
| focus groups | to explore a topic in depth through group discussion, e.g. reactions to an experience or suggestion, understanding common complaints; useful in evaluation and marketing | quickly and reliably get common impressions | can be hard to analyse responses |
| case studies | to fully understand or depict clients' experiences in a program, and to conduct a comprehensive examination through cross-comparison of cases | fully depicts the client's experience of program input, process, and results | usually quite time-consuming to collect, organize, and describe |
Case studies are particularly useful in depicting a holistic portrayal of a client's experiences and results regarding a program. For example, to evaluate the effectiveness of a program's processes, including its strengths and weaknesses, evaluators might develop case studies of the program's successes and failures.
All data about the case is gathered. For example, if the study is to highlight a program's failure with a client, data would be collected about the program, its processes, and the client. Data could result from a combination of methods, including documentation (applications, histories, records, etc.), questionnaires, interviews, and observation.
Data is organized in a way that highlights the focus of the study. In our example, the data in the case would be organized chronologically to portray how the client got into the program, went through it, and did not receive effective services.
A case study narrative is developed. The narrative is a highly readable story that integrates and summarizes key information around the focus of the case study. The narrative should be complete enough to serve as the eyes and ears of an outside reader trying to understand what happened in the case. In our example, the narrative might include key demographic information about the client, the phases of the program's process through which the client passed and any major differences noticed about that client during the process, early indicators of failure, and key quotes from the client.
The narrative might be validated by review from program participants. For example, the client for whom the program failed would read the narrative to ensure it fully depicted his or her experience and results.
Case studies might be cross-compared to isolate any themes or patterns. For example, various case studies about program failures might be compared to notice commonalities in these clients' experiences and how they went through the program. These commonalities might highlight where in the program the process needs to be strengthened.
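The cross-comparison step above can be sketched as a simple tally of themes coded across case studies; the case names and theme labels below are invented purely for illustration.

```python
from collections import Counter

# Hypothetical case-study notes: each case lists the themes an evaluator
# coded while reading the narrative (all names here are invented examples).
cases = {
    "case_A": ["late intake", "unclear eligibility", "staff turnover"],
    "case_B": ["late intake", "missed follow-up"],
    "case_C": ["unclear eligibility", "late intake"],
}

# Count how often each theme appears across the failure cases.
theme_counts = Counter(theme for themes in cases.values() for theme in themes)

# Themes shared by more than one case point to where the program's
# process may need strengthening.
common_themes = [t for t, n in theme_counts.items() if n > 1]
print(sorted(common_themes))  # ['late intake', 'unclear eligibility']
```

In practice the themes would come from a careful qualitative coding of each narrative; the tally only surfaces commonalities once that coding is done.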
Interviews are particularly useful for getting the story behind a participant's experiences. The interviewer can pursue in-depth information around a topic. Interviews may be useful as follow-up to certain respondents to questionnaires, e.g. to further investigate their responses. Usually, open-ended questions are asked during interviews.
Before you start to design your interview questions and process, clearly articulate to yourself what problem or need is to be addressed using the information to be gathered by the interviews. This helps you keep a clear focus on the intent of each question.
Choose a setting with little distraction.
Explain the purpose of the interview.
Address terms of confidentiality.
Explain the format of the interview.
Indicate how long the interview usually takes.
Tell them how to get in touch with you later if they want to.
Ask them if they have any questions before you both get started with the interview.
Don't count on your memory to recall their answers.
Informal, conversational interview - no predetermined questions are asked, to remain as open and adaptable as possible to the interviewee's nature and priorities; during the interview, the interviewer "goes with the flow".
General interview guide approach - the guide approach is intended to ensure that the same general areas of information are collected from each interviewee; this provides more focus than the conversational approach, but still allows a degree of freedom and adaptability in getting information from the interviewee.
Standardized, open-ended interview - here, the same open-ended questions are asked of all interviewees (an open-ended question is one where respondents are free to choose how to answer, i.e., they don't select "yes" or "no" or provide a numeric rating, etc.); this approach facilitates faster interviews that can be more easily analyzed and compared.
Closed, fixed-response interview - where all interviewees are asked the same questions and asked to choose answers from among the same set of alternatives. This format is useful for those not practised in interviewing.
One can ask questions about:
Behaviours - about what a person has done or is doing
Opinions/values - about what a person thinks about a topic
Feelings - note that respondents sometimes respond with "I think..." so be careful to note that you're looking for feelings
Knowledge - to get facts about a topic
Sensory - about what people have seen, touched, heard, tasted, or smelled
Background/demographics - standard background questions, such as age, education, etc.
Note that the above questions can be asked in terms of past, present, or future.
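A minimal sketch of the point above: crossing the six question categories with the three time frames yields eighteen possible question framings (the labels are taken from the list above).

```python
from itertools import product

# The six question categories listed above.
categories = ["behaviours", "opinions/values", "feelings",
              "knowledge", "sensory", "background/demographics"]

# Each category can be framed in terms of past, present, or future.
tenses = ["past", "present", "future"]

# Every (category, tense) pair is a possible question framing.
framings = list(product(categories, tenses))
print(len(framings))  # 18
```

Such a matrix can be a quick checklist when drafting an interview guide, to see which framings you have covered and which you have deliberately left out.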
Get the respondents involved in the interview as soon as possible.
Before asking about controversial matters (such as feelings and conclusions), first, ask about some facts. With this approach, respondents can more easily engage in the interview before warming up to more personal matters.
Intersperse fact-based questions throughout the interview; long lists of them in a row tend to leave respondents disengaged.
Ask questions about the present before questions about the past or future. It's usually easier for respondents to talk about the present and then work into the past or future.
The last questions might be to allow respondents to provide any other information they prefer to add and their impressions of the interview.
The wording should be open-ended. Respondents should be able to choose their own terms when answering questions.
Questions should be as neutral as possible. Avoid wording that might influence answers, e.g. evocative, judgmental wording.
Questions should be asked one at a time.
Questions should be worded clearly. This includes knowing any terms of the program or the respondents' culture.
Be careful asking "why" questions. This type of question implies a cause-and-effect relationship that may not truly exist. These questions may also cause respondents to feel defensive, e.g. that they have to justify their response, which may inhibit their responses to this and future questions.
Occasionally verify the tape recorder (if used) is working.
Ask one question at a time.
Attempt to remain as neutral as possible. That is, don't show strong emotional reactions to their responses. Patton suggests acting as if "you've heard it all before."
Encourage responses with occasional nods of the head, uh-huh's, etc.
Be careful about appearances when taking notes. That is, if you jump to take a note, it may appear as if you're surprised or very pleased by an answer, which may influence answers to future questions.
Provide transition between major topics, e.g. "we've been talking about (some topic) and now I'd like to move on to (another topic)."
Don't lose control of the interview. This can occur when respondents stray to another topic, take so long to answer a question that time begins to run out, or even begin asking questions of the interviewer.
Verify if the tape recorder, if used, worked throughout the interview.
Make any notes on your written notes, e.g. to clarify any scratchings, ensure pages are numbered, fill out any notes that don't make sense, etc.
Write down any observations made during the interview. For example, where did the interview occur and when, was the respondent particularly nervous at any time? Were there any surprises during the interview? Did the tape recorder break?
Questionnaires are useful when you need information from a group of people and it is not possible to interview all of them. People are often not keen to fill out questionnaires, however, so the responses may be incomplete, and it might be necessary to verify some of the information on them.
Before you start to design your questions, clearly articulate what problem or need is to be addressed using the information to be gathered by the questions. Review why you're doing the evaluation and what you hope to accomplish by it. This provides focus on what information you need and, ultimately, on what questions should be used.
Include a brief explanation of the purpose of the questionnaire.
Include a clear explanation of how to complete the questionnaire.
Include directions about where to provide the completed questionnaire.
Note conditions of confidentiality, e.g. who will have access to the information and whether answers will be kept private, accessible only to yourself and/or someone who will collate them.
Ask about what you need to know, i.e. get information regarding the goals or ultimate questions you want to address by the evaluation.
Will the respondent be able to answer your question, i.e. do they know the answer?
Will respondents want to answer the question, i.e. is it too private or silly?
Will the respondent understand the wording? Avoid any slang, culture-specific, or unclear technical words.
Are any words so strong that they might influence the respondent to answer a certain way? Attempt to avoid the use of strong adjectives with nouns in the questions, e.g. "highly effective government", "prompt and reliable", etc.
To ensure you're asking one question at a time, avoid the use of the word "and" in your question.
Avoid using "not" in your questions if you're having respondents answer "yes" or "no" to a question. The use of "not" can lead to double negatives, and cause confusion.
If you use multiple-choice questions, be sure your choices are mutually exclusive and encompass the total range of answers. Respondents should not be confused about whether two or more alternatives appear to mean the same thing. Respondents also should not have a preferred answer that is not among the choices of an answer to the question.
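One way to check range-style choices for overlaps and gaps is to express them as numeric bands. This sketch assumes hypothetical age-band choices and an agreed overall range; it is an illustration of the "mutually exclusive and exhaustive" rule, not a general-purpose validator.

```python
# Hypothetical age-band choices expressed as (low, high) integer ranges.
bands = [(0, 17), (18, 34), (35, 54), (55, 120)]

def mutually_exclusive_and_exhaustive(bands, low=0, high=120):
    """Check that integer bands cover [low, high] with no overlaps or gaps."""
    bands = sorted(bands)
    # Exhaustive: the bands must span the full agreed range.
    if bands[0][0] != low or bands[-1][1] != high:
        return False
    # Exclusive: each band must end exactly one below where the next begins.
    return all(a[1] + 1 == b[0] for a, b in zip(bands, bands[1:]))

print(mutually_exclusive_and_exhaustive(bands))                  # True
print(mutually_exclusive_and_exhaustive([(0, 20), (18, 120)]))   # False (overlap)
```

For non-numeric choices the same rule still applies, but it has to be checked by reading the options carefully rather than by computation.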
Be careful not to include so many questions that potential respondents are dissuaded from responding.
Attempt to motivate respondents to complete the questionnaire. Start with fact-based questions and then move on to opinion-based questions, e.g. ask people for demographic information about themselves before asking about their opinions and perspectives. This gets respondents engaged in the questionnaire and warmed up before the more challenging and reflective questions about their opinions. (Consider whether they can complete the questionnaire anonymously; if so, indicate this on the form where you ask for their name.)
Attempt to get respondents' commentary in addition to their ratings, e.g., if the questionnaire asks respondents to choose an answer by circling an answer or provide a rating, ask them to provide commentary that explains their choices.
Include a question to get respondents' impressions of the questionnaire itself. For example, ask them if the questionnaire was straightforward to complete ("yes" or "no"), and if not, to provide suggestions about how to improve the questionnaire.
Pilot or test your questionnaire on a small group of clients or fellow staff. Ask them if the form and questions seemed straightforward. Carefully review the answers on the questionnaires. Does the information answer the evaluation questions or provide what you want to know about the program or its specific services? What else would you like to know?
Finalize the questionnaire. Finalize the questionnaire according to the results of the pilot. Put a date on the form so you can keep track of all future versions.
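Collating the pilot results described above can be sketched as follows; the form fields (a 1-5 rating, free-text commentary, and a yes/no answer on whether the form was straightforward) and the sample answers are invented for illustration.

```python
# Hypothetical pilot responses: a 1-5 rating, free-text commentary, and a
# yes/no answer on whether the questionnaire was straightforward to complete.
responses = [
    {"rating": 4, "comment": "clear questions", "straightforward": "yes"},
    {"rating": 2, "comment": "question 3 was confusing", "straightforward": "no"},
    {"rating": 5, "comment": "", "straightforward": "yes"},
]

# Average the ratings and pull out the comments from anyone who found
# the form hard to complete -- these drive the revisions before finalizing.
avg_rating = sum(r["rating"] for r in responses) / len(responses)
needs_work = [r["comment"] for r in responses if r["straightforward"] == "no"]

print(round(avg_rating, 2))  # 3.67
print(needs_work)            # ['question 3 was confusing']
```

The commentary matters more than the numbers at the pilot stage: a single "question 3 was confusing" tells you exactly what to revise before dating and finalizing the form.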