Sunday, 31 March 2013
"Open-Hearted" Lit Reviews
In chapter 7 of Salsa Dancing Into the Social Sciences, Luker (2008) states that we should read literature surrounding our research interest in an "open-hearted way" (p. 133). This got me thinking about my own reading patterns while investigating a certain area of research. Typically, when I'm looking into a subject, I have a preconceived stance on what I am reading. I read looking for what I want to argue, while either disregarding what is irrelevant or noting it so that I can argue against it in a paper. However, in light of what Luker said in chapter 7, I now think this is the wrong way to approach literature. If I am set in my thinking on a topic, in a way, I am not allowing my brain to work at its full capacity. I am not allowing different ideas to mold, shape, and change what I think. This is a natural evolutionary process of thinking that is healthy and, as I see now, vital in research. In my prior way of reading, I was allowing room for ignorance and lazy thinking. I want to apply this, firstly, to my research proposal. As I'm working on it and finishing up my lit review, I want to make sure that I'm addressing all sides, thinking through all perspectives, and being as well informed on the topic as possible, rather than being selective in the information I discuss for convenience's sake.
Big Data and Privacy
In this past Wednesday's class, we discussed the implications of revealing the identities of online trolls. The conversation reminded me of some of the material concerning the hopes and fears surrounding the Big Data phenomenon. Many of those concerns center on the public's right to privacy. A recent New York Times article addressed this issue:
"In the 1960s, mainframe computers posed a significant technological challenge to common notions of privacy. That's when the federal government started putting tax returns into those giant machines, and consumer credit bureaus began building databases containing the personal financial information of millions of Americans. Many people feared that the new computerized databanks would be put in the service of an intrusive corporate or government Big Brother. 'It really freaked people out,' says Daniel J. Weitzner, a former senior Internet policy official in the Obama administration. 'The people who cared about privacy were every bit as worried as we are now'" (Lohr, 2013).
Sound familiar? I find it comforting that humanity has been through a similar change before, without the dire consequences predicted at the time having come to pass.
For anyone interested in the size of Big Data, see this infographic: http://visual.ly/how-big-big-data
References
Lohr, S. (2013, March 24). Big data is opening new doors, but maybe too many. The New York Times, p. BU3. Retrieved from http://www.nytimes.com/2013/03/24/technology/big-data-and-a-renewed-debate-over-privacy.html?smid=pl-share
How big is big data? [Infographic]. Retrieved from http://visual.ly/how-big-big-data
Generalizability Question Answered
In my post from February 26, I discussed the implications of the inclusion of the word "generalizable" in research legislation or guidelines (see: "Research Ethics and Generalizability" - http://researchmethodstotheextreme.blogspot.ca/2013/02/research-ethics-and-generalizability.html). In Wednesday's workshop on research ethics, I asked Dr. Dean Sharpe to clarify what implications the inclusion of this word in U.S. research ethics legislation has for research in the United States. Dr. Sharpe said that research ethics review boards (known as "institutional review boards" (IRBs) in the U.S.) spend valuable time debating whether a particular proposal's research is generalizable, at the expense of discussing the ethical issues in the same proposal. I did further research on the topic and discovered this article: "Institutional Review Board mission creep: The common rule, social science, and the nanny state" by Ronald F. White. The following passage illustrates the problem:
"In the narrow sense, the term generalizable might be interpreted reasonably as synonymous with quantifiable. This category would seemingly include any research that employs statistical analysis of collected data. It would certainly include all surveys, questionnaires, and so forth. It would seemingly exclude all journalistic or historical research that involves interviewing a single person. However, if researchers interview two persons and compare their answers, are they not, in a sense, generalizing? So, if we construe generalizable in the broadest sense, any research that makes generalizations apparently falls into this category. Consequently, the malleability of the concept "generalizable" has made it difficult to decide whether all, some, or none of the research in journalism, communication, ethnology, and history come under the jurisdiction of the Common Rule." (White, 2007, p. 552).
Does anyone have any further thoughts on this matter?
Reference
White, R.F. (2007). Institutional review board mission creep: The common rule, social science, and the nanny state. The Independent Review, 11(4), 547-564.
Saturday, 30 March 2013
Case studies and storytelling
Our readings for week 11 really helped focus case studies for me. Yin (1981) does a great job by explicitly distinguishing between evidence type, data collection method, and research strategy (case studies are a research strategy, by the way). Yin also helpfully points out when case studies make a good research strategy: when the phenomenon being studied and its context are intertwined. Beaulieu, Scharnhorst, and Wouters (2007) say that case studies are good for deconstructing the scientific method and the claims of universality produced as a result. This is backed up by Knight (2002), who says that they can powerfully counteract over-generalization.
What really struck me, though, was the quote from Miles, cited in Yin (1981), asserting that the qualitative research done in his area of study (organizations) "cannot be expected to transcend story-telling" (p. 58). I thought to myself, what's wrong with storytelling? Stories are powerful tools for conveying information, helping the reader engage with the material like nothing else. Just because a narrative is being told does not mean that it is the result of shoddy research or lazy scholarship.
Works Cited
Beaulieu, A., Scharnhorst, A., & Wouters, P. (2007). Not another case study: A middle-range interrogation of ethnographic case studies in the exploration of e-science. Science, Technology, & Human Values, 32(6), 672-692.
Knight, P.T. (2002). Small-scale research: Pragmatic inquiry in social science and the caring professions. London: SAGE Publications.
Yin, R.K. (1981). The case study crisis: Some answers. Administrative Science Quarterly, 26(1), 58-65.
Wednesday, 27 March 2013
Hubris, academic research, and ethics
I was in continuous disbelief as I read Zimmer's (2010) painstakingly precise smackdown of the "Tastes, Ties, and Time" (T3) project's approach to research ethics. Wow. I just couldn't believe the hubris of the researchers. A point that really stuck out to me was how the T3 researchers trumpeted that they had gotten permission from the school and Facebook to access the data. Um, gee, don't worry about the actual students whose personal data you're snooping on. My god! I suppose when people are reduced to numbers, or thought of simply as data, there is a greater likelihood that their concerns will not be considered.
I think the T3 case underscores how important it is to have outside perspectives check out your research. One of the researchers is quoted in Zimmer (2010) saying "we're sociologists, not technologists, so a lot of this is new to us" (p. 316). That realization should have instigated an attempt to get expert opinions on whether their privacy safeguards were sufficient. Instead, it is revealed that they did not consult any experts in privacy. Again--wow.
A lot of lessons to be learned from this. The T3 researchers would have been better off trying to learn from their mistakes, as opposed to arrogantly firing back at critics, saying that they did enough.
Works Cited
Zimmer, M. (2010). "But the data is already public": On the ethics of research in Facebook. Ethics and Information Technology, 12, 313-325.
Research Ethics and Contributions to Knowledge
Last week, I completed the TCPS 2: CORE ethics tutorial for another class: http://tcps2core.ca/ It tied in very well with the readings for this week – the tutorial comprises eight modules that discuss various aspects of research ethics, with a quiz at the end of each, so in a way it was like another reading for this class. Many of the same issues were covered in the ethics tutorial as in our readings and in class today, but I especially appreciated how this tutorial clearly broke down the main components of research ethics: risks and benefits, consent, privacy and confidentiality, fairness and equity, and avoiding conflict of interest.
I've never really had to prepare for an ethics review, but as we write our proposals and read about ethics this week, I've been thinking that being forced to spell out the ethical implications of your research in each of these areas is not only important from a moral standpoint but also helpful for thinking through the research itself. It especially forces you to really consider your research's contribution to knowledge. As I'm writing the contribution-to-knowledge section of my proposal, I'm thinking about exactly what the risks and benefits are, and having a bit more background knowledge about research ethics is helping me articulate them more clearly.
Looking Back at Luker and the "Hook"
In chapter 8 of Salsa Dancing Into the Social Sciences, Luker discusses interviewing as a form of research. At first glance, interviews may seem straightforward, a simple question-and-answer process; however, the author shows otherwise. Specifically, Luker's discussion of "hooks" in interviews shows the complexity of interviewing and the need for researchers to approach interviews in a methodical and strategic way.
In our day and age, time is money. People are very selective in how they spend their time and what they participate in. It seems that every other receipt, email, or website attempts to engage people in some sort of survey or research participation, whether business, legal, or academic. Because there is such an overload of research participation requests, researchers must be smart about the way they pitch these requests to the public.
I also need to take this into consideration in the research I am proposing for this course. I need to think about what methodology is best suited to the research I am conducting. As the semester progresses, I am seeing more and more how much thought and strategy is required in conducting research.
Accurate Data Collection
The topic of cross-case study analysis, discussed in the article "The Case Study Crisis: Some Answers," is one that, I feel, deserves some attention and consideration. Through the readings of this course, collecting and analyzing data has proved to be a complex and multilayered component of research. We have learned about quantitative, qualitative, and mixed research and each of their advantages and disadvantages; we discussed methodology and the importance of being informed in the particular area of research you are pursuing; and so on. Among everything that we have learned thus far, I think that how you choose to use the data you have collected is one of the most important considerations. In our own research, it is important to analyze whether the data sets we collect (1) can be related to each other, (2) can be generalized to a larger population, and (3) are accurate and truthful. Researchers who are highly invested in their work may be tempted to alter or manipulate data in order to come up with the conclusions they desire. However, we, as researchers, must hold ourselves to the highest standard in order to truthfully present information with as little bias as possible. To do this, it is evident that citation and providing evidence are of utmost importance. Yin notes that "the case study researcher must preserve a chain of evidence as each analytic step is conducted" (p. 63). This has challenged me to be careful about the citations and references I use in my research proposal and in future research endeavors. In creating my final research proposal, I will pay close attention to how I set up my research, including how many subjects I propose to collect data from. Yin's article demonstrates how important it is to construct and execute solid research.
References
Yin, R.K. (1981). The case study crisis: Some answers. Administrative Science Quarterly, 26(1), 58-65.
Tuesday, 26 March 2013
Cookbooks
I just got a new cookbook that I’m really excited about, and as I was browsing through it I realized that our research proposals are not that different from a recipe. For those of you who like to cook, you know that when you are experimenting with a new dish, you will probably get it wrong the first time, or even the first couple of times. Our research proposals are the same; we keep tweaking and revising our original idea until we (hopefully) have an outline for a study that can be done consistently, and hopefully achieve the desired results. However, like a recipe, there is always a chance that our research design does not go as planned. For example, you could realize halfway through that the questions you sent out in your questionnaires are all wrong. Hopefully, we can create a proposal that is like a good recipe, and produces good results!
Sunday, 24 March 2013
Assignment 4 and Thoughts About "Doing It"
Reading Chapter 7 of Knight, "Doing It," regarding the practical implications of carrying out your research design and its various stages in the real world, was almost reassuring in its claim that problems and issues will definitely come up. Throughout this course I have struggled with understanding various concepts, translating what we have been learning into assignments, and making it make sense for me. In my experience, part of completing any project involves problem solving, and it was an important reminder to anticipate issues and have back-up plans in place so that when you are in the field you are not disabled by issues that come up.
The last section, "Disclosure and Harm" (pp. 169-172), brought up the issue of dealing with strong emotional reactions from participants, being emotionally impacted as the researcher, and how to handle these consequences during the process (pp. 171-172). This made me think of an anticipated problem I foresee in writing Assignment 4, due to my close emotional involvement with my field of research. I am emotionally invested in the issue and this social problem because of the impact it is having on the life of someone I love. I understand that this may create biases I could project onto the research if it is carried out. As I was writing Assignment 3, I became aware of the potential problem this could cause, and I had to remind myself not to take sides or push for a certain outcome based on my experiences, but to be open to unexpected results and be mindful of including every possibility, so as not to narrow the point of view and not miss, ignore, or exclude important information I may find surprising.
Initially, I thought that due to my personal connection it would be easier for me to invest in this assignment. But the more I learn about the role of the researcher and the importance of being aware of your personal biases and how they may affect your work, the more I realize that my connection could make my job as a researcher more difficult and may act as a burden. Perhaps I should have chosen something I was not as invested in emotionally (note to self for next time).
What I realized from Chapt. 7 is the importance of separating myself and my personal connection to my research question from my role as researcher while working on this assignment so as to not undermine the quality and inclusiveness of my approach and the potential data analysis and conclusions.
Knight, P.T. (2002). Small-scale research: Pragmatic inquiry in social science and the caring professions. London: SAGE Publications.
Tomatoes as technologies
In class last week the professor asked us to think about the technologies presented in the film, Isle of Flowers. I could not help but think that everything depicted in the film, by nature of its positioning, reflected an end product that has been molded by technology, from the woman buying tomatoes and selling perfumes down to the film's protagonist, the tomato. Would it be fair to call the tomato a technology? I don't know, but it remains true that, if left solely to nature, the tomato would not have been produced on that farm, to be harvested and sold in the supermarket to the woman, who eventually tossed it out, where it was picked up by the garbage truck, sent to a landfill, found unsuitable for pigs, and eventually left for children in groups of ten to find. Each 'cog' in that technological infrastructure has to do its piece for the system to remain viable, and so perhaps an argument can be made that what's driving the process isn't so much the humans with the highly developed brains and opposable thumbs, but the technological ecosystem itself.
In my thinking of "tomatoes as technologies", I googled the term and came up with the following:
"In the Ilocos Region, tomato is one of the major cash crops normally cultivated by farmers during the dry season after rice is harvested. Planting season is from November to December and the bulk of the produce is harvested in February and March. Thus, the local market during this period is generally flooded with locally produced fresh market tomato. Naturally, the price becomes very low, averaging less than P 5 kg (2001). Worse, the supply is much higher than the demand for the product, resulting in a host of marketing problems." (http://www.mixph.com/2010/06/the-technology-of-growing-tomato-during-off-season.html) The article continues to address this issue and, in relation to the film, perhaps also sheds some insight on why, in that particular technologically and economically driven food chain, pigs come before children.
Wednesday, 20 March 2013
The Zimbabwe Bush Pump
This article talks about the Zimbabwe Bush Pump. However, the pump is not discussed as merely an object, but as an entire concept which serves many purposes: providing affordable clean water which prevents disease, building communities by providing a community project, and empowering poor communities to be in charge of their own water supply. The authors do not talk about the bush pump as a thing. In fact, they state at the beginning of the article that they "…love the Zimbabwe Bush Pump."
Much like the short film we watched in class today, this article gets us thinking about the deeper meaning of things. The Zimbabwe Bush Pump is not just a water pump, just as the results of our research will not consist of one simple interpretation. Likewise, when we are writing our research proposals, it is important to consider all of the implications of our research. A reviewer who is not familiar with the subject may not be aware of how the outcome will contribute to research. We might not even be aware of all of the implications of our research at first. Perhaps if we try looking at our projects through a few different lenses, we will be surprised at what we find.
De Laet, M., & Mol, A. (2000). The Zimbabwe Bush Pump: Mechanics of a fluid technology. Social Studies of Science, 30(2), 225-263. [http://go.utlib.ca/cat/7755570]
"The Social Life of Tomatoes"
The short film we watched in class, "The Social Life of Tomatoes", uses the journey of one tomato to identify the similarities and differences between human beings and some of our social constructions, as well as to place major class divisions into context. The film follows one tomato from a field, where it is tended by a Japanese man, Mr. Suzuki, to a supermarket, where it is purchased by a woman who throws it in the garbage. It is then taken to a trash heap on the Island of Flowers, where it is meant to be fed to pigs. The pigs' owner decides that it is unfit for pigs, and it is then thrown in a pile where poor women and children are free to scavenge it to eat themselves.
The beginning of the film has a light tone, and finds connections between such topics as opposable thumbs, whales, and money. There is also a stress on religion, identifying Jews and Roman Catholics. However, by the end, it is obvious that this film has a more serious message. It uses satire to stress the major class divisions, such as the woman who sells perfume to buy food from the supermarket for her family, and the women and children who line up to gather food which was rejected as pig food. The film also uses serious images, such as footage from concentration camps.
"The Social Life of Tomatoes" shows how an everyday item or concept can be connected to countless other items and concepts. This can be used as an exercise to open our eyes when considering our research projects. Sometimes, important connections can be made where we least expect them.
Tuesday, 19 March 2013
It surprised me to read in Yin's article that Matthew Miles's piece "Qualitative data as an attractive nuisance" was based on a four-year study, and that it critiqued the shortcomings of qualitative analysis and case study research without offering suggestions for overcoming the problems he identified. If, as a researcher, you are going to devote four years of your life to something and identify its problems, not investing significant effort in thinking of solutions shows a lack of commitment. Social science research has the purpose of contributing to social thought and theory and improving understanding. Pointing out problems contributes very little to growth and improvement; it is only the first step.
Yin calls Miles' discussion of the advantages and disadvantages of qualitative data and case studies a "frequent confusion regarding types of evidence,...types of data collection methods,...and research strategies" (p. 58). It is easy to confuse these interrelated methods and strategies when they are not clearly defined. Throughout the course I have referred to different books outside the course texts for definitions of various methods and strategies that have come up, because I did not always find Knight and Luker to provide clear definitions. What I have found is that the definitions of methods and research strategies differ from text to text, depending on who is doing the defining. So I appreciate Yin's efforts to untangle this mess for me and distinguish the similarities and differences among various types of evidence, data collection methods, and research strategies.
Reference:
Yin, R.K. (1981). The case study crisis: Some answers. Administrative Science Quarterly, 26(1), 58-65.
What's the critical discourse behind your content analysis Mr. Miles?
Last week's lecture on Analyzing Texts and Artifacts: Content Analysis and Critical Discourse Analysis tried to help us distinguish between quantitative and qualitative, positivism and post-structuralism, and Rumsfeld and Žižek. I found the point about content analysis vs. critical discourse to be very insightful: that of what's presented on the surface of a text versus what's actually intended. The intention behind the content. How else would intention be measured if not through the lived experiences, stories, and qualitative data that come out of those stated objectives and polished arguments?
Now to tie that in with this week's reading by Robert K. Yin, in which Yin responds to Matthew Miles's attack on qualitative analysis and its companion, the case study, by putting forward the following points:
1. "the case study does not imply the use of a particular type of evidence. Case studies can be done by using either qualitative or quantitative evidence. The evidence may come from fieldwork, archival records, verbal reports, observations, or any combination of these."
2. the case study does not "imply the use of a particular data collection method. A common misconception is that case studies are solely the result of ethnographies or of participant-observation, yet it should be quickly evident that numerous case studies have been done without using these methods."
3. That case studies represent a 'research strategy', and as "a research strategy, the distinguishing characteristic of the case study is that it attempts to examine: (a) a contemporary phenomenon in its real-life context, especially when (b) the boundaries between phenomenon and context are not clearly evident."
Now expanding on that last point, that the case study represents a 'research strategy' by attempting to examine a contemporary phenomenon in its real-life context. It seems to me that if the interest in and intent of the research are ethically motivated, and the analysis potentially helpful and insightful to curious minds and those living that contemporary phenomenon, what's the point of discrediting the methods used? I guess my point is, if a quantitative study could produce better and more accurate results, or even build on the qualitative data gleaned from the case study, then all the better; but if supporters of the 'scientific' method are simply interested in discrediting other methods, without offering alternatives or suggestions for how to improve, then it's simply an extension of us-versus-them thinking, and a way of leveraging one's contributions by diminishing the contributions of another.
Reference:
Yin, R.K. (1981). The case study crisis: Some answers. Administrative Science Quarterly, 26(1), 58-65.
-Mandissa Arlain
Qualitative data & constructing narratives
Yin (1981) makes an interesting point regarding some of the objections participants of case studies have to the interpretation of their data. Miles states that participants objected more often to research findings after seeing qualitative data, like case studies, on the basis that they disagreed with the researchers' interpretations (Miles, 1979, cited in Yin, 1981, p. 58). Ultimately, the researcher is in a position of total power, as they have the final say in terms of interpretation when writing the final report. Miles, as Jessica points out, circumscribes qualitative researchers in saying that they will never be able to fully "transcend storytelling" (cited in Yin, 1981, p. 58). Yin (1981) counters Miles's argument by describing a number of situations in which participants might object to the interpretations of researchers using quantitative methods like survey research (p. 64).
Yin (1981) argues that participants feel more comfortable with being talked about in aggregate data, regardless of the research method being used (p. 64). I wonder how this tension plays out in open data initiatives? Very often, social science researchers are reluctant to turn over their qualitative data to institutional repositories (Kuula, 2010). It could be that they are predominantly concerned with confidentiality. I wonder how much of the reluctance also has to do with worries that the data might be subject to reinterpretation? Putting that data out there puts the researcher in the position of no longer fully controlling their narrative.
References:
Kuula, A. (2010). Methodological and ethical dilemmas of archiving qualitative data. IASSIST Quarterly, 34(3), 12-17.
Yin, R.K. (1981). The case study crisis: Some answers. Administrative Science Quarterly, 26(1), 58-65.