The Research Paradigms of Constructiveness, Innovation, and Objectivity: Annotated Bibliography
Bitektine, A., & Miller, D. (2015). Methods, theories, data, and the social dynamics of organizational research. Journal of Management Inquiry, 24(2), 115-130.
This source touches on all three of the major paradigms in organizational research. Objectivity is difficult to maintain in research on social and related issues that are largely qualitative in nature. Quantitative data is often the best route because it yields numbers that, at least in principle, speak for themselves. However, that data sometimes has to be supplemented or even replaced with instruments such as surveys, which capture factors that are more malleable, vary from person to person, and are otherwise harder to verify. This is where the innovation paradigm comes in: new methods and tactics must be devised and refined to meet these research challenges. Finally, the research has to be constructive, because new and revised methods can lead to disagreement about the choices being made and the conclusions being drawn. Constructive engagement matters here because it allows for the testing and development of research methods. There should be agreement that new methods are needed, but no new method should be ordained as valid until it proves itself in practice; good intentions alone are not a failsafe. Indeed, there can sometimes be a scarcity of data and of opportunities for good research, which inhibits the scholarly sphere and the progression of its research. Left unaddressed, this can lead to a decline in the research paradigm itself, which is undesirable for many reasons.
Conway, J. M., & Lance, C. E. (2010). What reviewers should expect from authors regarding common method bias in organizational research. Journal of Business & Psychology, 25(3), 325-334. doi:10.1007/s10869-010-9181-6
This article was selected for its focus on objectivity (or the lack thereof) that arises from common method bias in organizational research. Specifically, the article is directed at the people who review journal articles as part of the peer-review process. Conway and Lance (2010) raise the topic because they feel there are misunderstandings and misconceptions about common method bias, self-report measures, and so forth. Three main maladies the authors point to are the beliefs that relationships between self-reported variables are "necessarily and routinely upwardly biased," that other reports or methods are superior to self-reports, and that rating sources are sometimes less than optimal or scholarly. Accordingly, the authors expect submitting authors to argue why self-reports are appropriate in some situations, to make assertions and arguments about construct validity evidence, and to demonstrate a lack of overlap in items for other constructs. One cardinal sin that Conway and Lance point to is post-study adjustment. Specifically, they state, "we specifically do not recommend post-hoc statistical control strategies; while some statistical strategies are promising, all have significant drawbacks and have shown poor empirical results" (Conway & Lance, 2010). As with the Ware and Munafo piece covered later in this report, some people who engage in these behaviors truly believe they are not polluting their results. However, that is often not the case, and such adjustments should be avoided. Everything that can be planned for in advance should be, and deviation from the plan should be rare to never.
Garner, J. T. (2015). Different ways to disagree: A study of organizational dissent to explore connections between mixed methods research and engaged scholarship. Journal of Mixed Methods Research, 9(2), 178-195.
Of the three major paradigms covered in this literature review, this source addresses the constructive side. Many research-related disagreements can be acrimonious, and the research involved can be biased. Bias comes from things like having a desired (or required) outcome before the research starts, sloppy research methods, and so forth. This source seeks to ferret out the "different ways to disagree" and coalesce them in a way that is constructive and coherent. Often, researchers are not actively trying to deceive or do things wrong. Rather, there are times when a researcher fails simply because he or she relies on a single method to prove a point when mixed methods research should be used instead, so that all of the relevant results point to the same overall conclusion. Using the right combination of sources and research methods leads to what can be called engaged scholarship: a way of making sure that all of the relevant data does indeed point to the same answers to the research questions. If there is some divergence between what one method says and what another says, the reason for that variance must be found, and it would be too early to draw any major conclusions as part of that study. Regardless, organizational research will often involve people who express disagreement and reservations. The key is to work constructively together so that the reason for seemingly disparate results can be figured out; it could be a matter of method, timing, wording, or something else. In any case, there should be no "square peg, round hole" scenarios where sloppy or unproven results are put forward.
Keeton, K. E., Richard, E. E., & Davis, J. R. (2014). Solution mechanism guide: Implementing innovation within a research & development organization. Aviation, Space, and Environmental Medicine, 85(10), 1061-1062. doi:10.3357/ASEM.4050.2014
While sticking to tried-and-true methods may sound like the optimal way to conduct research, there is much to be said for innovation and transformation in the research sphere. In other words, just because a superior method is not yet known or proven does not mean it does not exist. This brings us to the innovation paradigm of organizational research and development. Of course, the usual research staples of validity and reliability must be maintained in any results. However, it is entirely possible that the ways of achieving such research performance can be perfected, updated, or at least streamlined, and many organizations are already doing this. The source covered in this entry discusses one such organization, NASA's Human Health and Performance Directorate, or HH&P. That group developed a problem-solving technique that has come to be known as the solution mechanism guide, or SMG. As Keeton, Richard, and Davis (2014) explain, the "SMG seamlessly introduces open innovation and collaboration concepts within HH&P to more effectively address human health and performance risks" (Keeton, Richard & Davis, 2014). The SMG framework is web-based and interactive, and it leverages both existing and newly developed problem-solving techniques. In other words, it uses the measures and methods that have been in place for years but also integrates new methods that have proven equally or more effective yet are still in their nascent stages compared to traditional research methodology. Whether new or old, solid and innovative methods are key for strategic management decisions.
Ware, J. J., & Munafo, M. R. (2015). Significance chasing in research practice: Causes, consequences and possible solutions. Addiction, 110(1), 4-8. doi:10.1111/add.12673
Remaining unbiased and grounded in doing research the right way may seem like common sense and a presumed cornerstone of any research project. However, this is not always the case, and bad methods are not always manifested as outright bias. This entry covers a particular part of the objectivity paradigm of research and how to do it right. Ware and Munafo (2015) discuss what is known as "significance chasing." Before delving into significance chasing in particular, they assert that pressures brought to bear on researchers, such as the pressure to publish, can lead to both intentional and unintentional mistakes. An intentional mistake mentioned by the authors is the fabrication or alteration of data; a more innocent or minor manifestation would be the use of unplanned post-hoc testing in a study. The two authors framed the study in question by pointing out that the prior literature was littered with findings that could not be reproduced in an easy manner. In other words, the reliability and/or validity of the research was lacking. One of the main causes (if not the main one) of this was deemed to be research that yielded supposedly "significant" results that were not really significant at all. Such "false positives" can lead to research that is misdirected on conclusions that…