"Mathematics / Statistics" Essays

Decision to Become a Math Research Paper

Research Paper  |  2 pages (625 words)
Bibliography Sources: 2

SAMPLE TEXT:

Math majors seeking a career in this field will be rewarded with a high median salary ($42.14 per hour), faster-than-average job growth (27%), and a relatively non-competitive job market ("Actuaries: Summary" 2012). In addition, a bachelor's degree is usually sufficient for obtaining an actuary position, although certification is generally required.

For math majors interested in research, but uncertain about pursuing a graduate degree, there are a few opportunities available. Biostatistics is a discipline barely more than a decade old, but the need is great for mathematicians interested in applying their talents and skills to problems in basic biology research, medicine, and public health (epidemiology). Biologists and biomedical researchers have advanced to the point where they have too much data and too little knowledge of statistics and mathematics (Kling 2004). Although math majors have been taking biology courses and entering the biostatistics profession, the need remains great. The ideal biostatistician would be well-versed in biology, applied mathematics, statistics, bioinformatics, and computer programming. While a bachelor's degree was at one time sufficient, the field is growing and maturing so fast that graduate degrees are becoming more common. And while most job opportunities for biostatisticians currently exist in academia, biostatisticians are expected to be the fastest-growing segment of the pharmaceutical industry's workforce.

References

"Actuaries: Summary." Bureau of Labor Statistics, U.S. Department of Labor. Last modified April 5, 2012. http://www.bls.gov/ooh/math/actuaries.htm.

Kling, Jim. "The mathematical biology job market." Science Careers. Published 27 Feb. 2004. http://sciencecareers.sciencemag.org/career_magazine/previous_issues/articles/2004_02_27/noDOI.6305720559640560046.

"Mathematicians: How to become a mathematician." Bureau of Labor Statistics, U.S. Department of Labor. Last modified March 29, 2012. http://www.bls.gov/ooh/math/mathematicians.htm#tab-4.

"Mathematicians: Summary." Bureau of Labor Statistics, U.S. Department of Labor. Last modified March 29, 2012. http://www.bls.gov/ooh/math/mathematicians.htm.

"Statisticians: Summary." Bureau of Labor Statistics, U.S. Department of Labor. Last modified June 26, 2012. http://www.bls.gov/ooh/math/statisticians.htm.… [read more]


Mathematics in Digital Photography Research Paper

Research Paper  |  3 pages (925 words)
Bibliography Sources: 3

SAMPLE TEXT:

Compression technology is the primary driver behind the vast expansion of digital photography. Compression allows large amounts of image data to be stored in a relatively small space while still allowing the intricate details of the photo to be accurately reconstructed.

Compression is a complicated process that involves several steps. The first step is to convert the red, green, and blue color channels to Y, Cb, and Cr channels while partitioning the image into 8x8-pixel blocks. For a JPEG image, the compression relies on the Discrete Cosine Transformation, which is exemplified by the equation C = UBU^T, where B is one of the 8x8 blocks and U is a special 8x8 matrix. Applying certain methods of encoding enables compression of about 70% in most instances, or even more in some circumstances. The process can then be inverted to expand the image and view it in full, by recovering B from the previous equation:

B' = U^T C' U for each block ("Image Compression," 2011).
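A minimal numerical sketch of the block transform just described, assuming the standard orthonormal 8x8 DCT-II matrix for U and using simple rounding as a crude stand-in for JPEG's quantization and encoding step:

```python
import numpy as np

def dct_matrix(n=8):
    """Build the n x n orthonormal DCT-II matrix U used in C = U B U^T."""
    U = np.zeros((n, n))
    for k in range(n):
        scale = np.sqrt(1.0 / n) if k == 0 else np.sqrt(2.0 / n)
        for j in range(n):
            U[k, j] = scale * np.cos(np.pi * (2 * j + 1) * k / (2 * n))
    return U

U = dct_matrix(8)

# One hypothetical 8x8 block of luminance (Y) values: a smooth horizontal gradient.
B = np.tile(np.arange(0.0, 256.0, 32.0), (8, 1))

C = U @ B @ U.T                      # forward transform: energy concentrates in a few coefficients
C_quantized = np.round(C)            # crude stand-in for the lossy quantization/encoding step
B_restored = U.T @ C_quantized @ U   # inverse transform: B' = U^T C' U

print(np.max(np.abs(B - B_restored)))  # the reconstruction error is small
```

Because U is orthogonal (U^T U = I), the inverse step recovers each block exactly up to whatever information the rounding discarded.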

One downside of this compression technique for JPEG images is that, because each 8x8 block is transformed independently, decoupling can occur, causing some parts of the image to appear blocky. While this loss of resolution is often within the range of acceptability, sometimes it is not good enough for certain publications, such as magazines or other media ("Image Compression," 2011). The compression algorithms will likely continue to improve as both the underlying mathematical concepts and the computing power used to apply them continue to improve.

Conclusion

There are many different applications of mathematics in digital photography. Equations are used for everything from the mathematical rendering of color to the compression of files down to a size that is easily stored and later converted back into a much more intricate image. Advances in both cameras and the software used to edit and alter images now allow many people to take pictures with their cell phones that would have required much more expensive cameras only a few years ago. The realm of the professional photographer is now open to anyone sitting at home with a digital camera and a suite of photo-editing programs, such as Adobe's. Furthermore, compression has made photos easy to share over the Internet and between friends on their cell phones or Facebook pages. This marks a huge turning point in modern photography, and one that will doubtless continue to push the boundaries of what is possible with digital cameras.

References

Higham, N. (2007). The mathematics of digital photography. Retrieved from http://www.maths.manchester.ac.uk/~higham/talks/digphot.pdf.

Hoggar, S.G. (2006). Mathematics of digital images. Cambridge, UK: Cambridge University Press.

Image Compression: How Math Led to the JPEG2000 Standard. (2011). Society for Industrial and Applied Mathematics. Retrieved from http://www.whydomath.org/node/wavlets/basicjpg.html.… [read more]


Statistics in Research and Analysis the Experiments Research Paper

Research Paper  |  10 pages (3,309 words)
Bibliography Sources: 10

SAMPLE TEXT:

¶ … Statistics in Research and Analysis

The experiments, analysis and statistics-5

Uses of statistics in experiments and research-5

Tools of Analysis-7

Experimental Design-9

Common uses in everyday life-12

Use of Statistics in Research and Analysis

This paper concerns itself with the use of statistics as a means and an important tool in research and analysis -- both in… [read more]


Art and Mathematics Are Related Research Paper

Research Paper  |  10 pages (2,688 words)
Bibliography Sources: 1+

SAMPLE TEXT:

¶ … art and mathematics are related and that this relation could be used to the advantage of educators to overcome student anxiety regarding mathematics and, in particular, difficult geometry concepts

Outline the basic topics to be covered in the study

What is hyperbolic geometry?

Who is MC Escher?

How does Escher's work relate to hyperbolic geometry?

How to design… [read more]


"Basic Statistics for the Behavioral Discussion and Results Chapter

Discussion and Results Chapter  |  2 pages (524 words)
Bibliography Sources: 1

SAMPLE TEXT:

Chapter 2

The second chapter of the book continues the reader's introduction to the world of statistics while presenting more intricate applications that can be addressed through a variety of calculations. The chapter emphasizes that experience is one of the most significant teachers in life as a whole: given the topic under discussion, statistics is best understood by working through numerous calculations and by using those calculations to solve problems that emerge from rather simple activities.

Using statistics, and numbers in general, when discussing people can be useful, as research can then effectively support a conclusion regarding an event happening in society or in the natural world. Statistics makes it possible for people to be more specific about their theories and, eventually, to verify whether or not those theories have any basis.

Variables are introduced as factors that can influence the result of a research process. "A variable is anything that, when measured, can produce two or more different scores" (Heiman 2013, p. 16). By becoming familiar with concepts like variables, with numbers used to describe things that apparently have nothing to do with them, and with particularly complex mathematical calculations, readers gradually come to acknowledge that statistics is an active part of the social order.

Works cited:

Heiman, G. (2013). Basic Statistics for the…… [read more]


Inferential Statistics to Evaluate Sample Research Paper

Research Paper  |  4 pages (1,017 words)
Bibliography Sources: 5

SAMPLE TEXT:

Note that the two hypotheses we propose to test must be mutually exclusive; i.e., when one is true the other must be false. They must also be exhaustive; that is, they must include all possible occurrences. Lastly, the researcher must translate the research hypothesis into operational terms. The researcher goes on to operationally define fast-tempo music as music at a tempo of 120 bpm (beats per minute) and slow-tempo music as music at a tempo of 60 bpm. In addition, the researcher has to specify how participants are going to rate the music for happiness (Hays, 1973).

8. Discuss probability in statistical reference, as well as the meaning of significance.

Probability is the likelihood of the occurrence of some event or outcome. A significant result is one that has a very low probability of occurring if the population means are equal. The probability required for significance is called the alpha level and is often .05. All results obtained by statistical methods suffer from the disadvantage that they might have been produced by pure statistical accident. The level of statistical significance is determined by the probability that this has not, in fact, happened. P is an estimate of the probability that the result has occurred by accident. Therefore a large value of P represents a low level of significance (Moses, 1986).

In experiments one needs to define a level of significance at which a correlation will be deemed to have been established, though the choice is often actually made after the event. It is important to realize that, however small the value of P, there is always a finite chance that the result is a pure accident. A typical level at which the threshold of P is set would be 0.01, which means there is a one percent chance that the result was accidental. The significance of such a result would then be indicated by the statement P <… [read more]
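To make these ideas concrete, the hypothetical tempo experiment described earlier can be analyzed with a two-sample t-test. The sketch below uses invented happiness ratings (no actual study data are given here) and scipy's ttest_ind as one of many possible tests of significance:

```python
import numpy as np
from scipy import stats

# Hypothetical happiness ratings on a 1-10 scale -- invented data for illustration only.
fast_tempo = np.array([7, 8, 6, 9, 7, 8, 7, 9, 6, 8])   # music at 120 bpm
slow_tempo = np.array([5, 6, 4, 6, 5, 7, 5, 6, 4, 5])   # music at 60 bpm

# H0: the population means are equal; Ha: they differ (two-sided test).
t_stat, p_value = stats.ttest_ind(fast_tempo, slow_tempo)

alpha = 0.05  # conventional significance threshold
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
if p_value < alpha:
    print("Reject H0: the difference is significant at the .05 level.")
else:
    print("Fail to reject H0: the difference could plausibly be due to chance.")
```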


How Statistics Apply to Entrepreneurship Essay

Essay  |  2 pages (620 words)
Bibliography Sources: 0

SAMPLE TEXT:

¶ … hear the word "statistics," the daily infographic on the cover of USA Today comes to mind for many people. This reflects an untrained conception of a tool whose usefulness and predictive power are difficult to overstate. In school, in public life at large, and as an entrepreneur, descriptive and inferential statistics -- and the ability to express and interpret their results -- will become more rather than less important in innumerable ways. Essentially, statistics allows us to make accurate predictions, and thus to identify and prevent wasted time and materials, and to prevent potential harm to very real individuals every day.

Descriptive statistics literacy is invaluable for everyday life, when the media is filled with claims about population or environmental change, or advertising trying to convince us products are safe, provide certain nutrition, or what have you. For example, many people go through life thinking that the more times an experiment is performed, the greater the chance of a particular outcome becomes -- I actually have a friend who thinks that if he buys more lottery tickets with the same number on them, that will increase the likelihood of the number coming up! Mandating the learning of one simple but fundamental law of probability -- that a tossed coin can keep coming up heads regardless of what the prior outcomes were -- could prevent very real waste. Understanding this as an entrepreneur will prevent needless experimentation trying to find production-process or material flaws where chance alone, rather than any real flaw, explains the variation.
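A small simulation with arbitrary parameters illustrates the independence point above: the chance of heads on the next toss stays near one half no matter how many heads came before.

```python
import random

random.seed(1)
trials, streak_len = 100_000, 3
outcomes_after_streak = []

flips = [random.random() < 0.5 for _ in range(trials)]   # True = heads, fair coin
for i in range(streak_len, trials):
    if all(flips[i - streak_len:i]):                     # the previous three flips were all heads
        outcomes_after_streak.append(flips[i])           # record what happens next

# The conditional frequency of heads stays near 0.5: past flips do not change the odds.
print(sum(outcomes_after_streak) / len(outcomes_after_streak))
```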

The same goes for the display power of descriptive statistics: production processes and experimental results can be expressed in complex but intuitive and easily understood scatter plots, box-and-whisker plots, and best-fit lines, or in histograms and bell curves of varying normality. Of course, understanding this information depends on the basic concepts of dispersion and central tendency, and the general literature is full of claims based on mistaken use…… [read more]


Lie With Statistics Huff, Darrell Book Report

Book Report  |  3 pages (949 words)
Bibliography Sources: 1

SAMPLE TEXT:

Perhaps the easiest way to distort survey results is simply to keep taking surveys of populations until the desired result is reached. The reason this result is produced is chance, however, rather than scientifically legitimate findings. The 'well-chosen' average is another example of this, whereby the survey sample is carefully selected to yield a figure that 'proves' the contention of the reporter. Including or excluding persons who would distort the average is another statistical lie. The presenter can also select the specific statistic that best proves his thesis -- the mean, median, or mode (the mean is the sum divided by the number of values, the mode is the value that occurs most often in the sample, and the median is the 'middle' value of all listed numbers). For example, finding the 'average' American salary can produce wildly different results, given the discrepancy that can arise between the median and the mean because of the high salaries of the small number of persons at the top and the very low salaries of many others.
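A few invented figures make the point concrete: in a skewed income distribution, the mean, median, and mode of the very same data can tell three quite different stories.

```python
import statistics

# Invented annual salaries (in dollars); a handful of very high earners skew the mean upward.
salaries = [25_000, 28_000, 30_000, 30_000, 32_000,
            35_000, 40_000, 45_000, 250_000, 1_200_000]

print("mean:  ", statistics.mean(salaries))    # 171,500 -- pulled far upward by the top earners
print("median:", statistics.median(salaries))  # 33,500  -- the 'middle' value
print("mode:  ", statistics.mode(salaries))    # 30,000  -- the most frequent value
```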

Words are powerful: calling something 'flimsy and cheap' sounds much worse than calling something 'light and economical,' and even the words 'practicing celibacy' can sound ominous, because of the association of the word 'practicing' with something nefarious (Huff 102-103). The language with which statistics are presented can also cause an unwitting reader to believe in them: for example, saying 'it is obvious that the pollution is killing all of the birds, because 100% of persons surveyed said they have not seen a single bird flying this year.' (The persons surveyed may simply not have been paying attention to the birds, for example.) More seriously, Huff gives the example of a manager who wants to construct an anti-union survey. The manager collects any and all of the complaints that have arisen about the union and uses these complaints to 'prove' that no one wants the union on the premises. However, it is very difficult to find an entity with no complaints about it at all, so the conclusion that is arrived at is fundamentally self-serving and misguided, because the survey population never actually said that it disliked the union (Huff 82).

Huff even coins a word to describe deliberately manipulating the hearts and minds of people with statistics: 'statisticulation' (Huff 102). Ultimately, the book's purpose is to encourage readers to 'talk back' to statistics so they can make rational, rather than irrational, decisions. Stopping before buying or believing an advertisement is essential, so you can ask yourself 'is this believable?' and 'what bias might the writer have?' Numbers, by virtue of being numbers, are not inherently truthful and relevant. It depends on how they are used, and a vague survey with little information about how it was conducted or how the 'average' was arrived at is no more accurate than a work of…… [read more]


Statistic Project Term Paper

Term Paper  |  2 pages (496 words)
Bibliography Sources: 1

SAMPLE TEXT:

Stat Abuse

The Precautionary Principle

Peter T. Saunders of the Mathematics Department of King's College, London, published an article titled "Use and Abuse of the Precautionary Principle" that deals with a highly specific and unique problem in the use of statistical information. Many statistical abuses occur when conclusions not fully supported by statistics are asserted, or when differences that are not statistically significant are made to appear greater than they really are. According to Saunders, there are certain situations where it is actually good to use statistics in this way -- specifically, in cases where there is a potential for harm if a correlation exists. That is, things like cigarettes and possible carcinogens should be assumed to be unsafe as soon as any evidence suggests that they might be, unless there is a compelling reason that a potentially harmful substance should be used. Saunders advocates an over-reaction to statistical data in such cases as a means of offering the greatest protection. This is called the "precautionary principle," and it is common sense according to Saunders' explanation. In situations where the precautionary principle applies, any observed change in a population should be taken as a sign that the substance/action/etc. is correlated with that change until it can be positively demonstrated that this is not the case.

After explaining the precautionary principle in great detail and making the foundational logic and ethics behind this principle quite clear, Saunders turns to how statistics can be abused when…… [read more]


Statistics Being Studied Essay

Essay  |  3 pages (869 words)
Bibliography Sources: 3

SAMPLE TEXT:

For example, the age of the different immigrant groups cannot be interpreted, as this is not published. In addition, no conclusions can be definitively drawn about the timing of immigration flows -- those are only guessed at. There are also no statistics provided in this report about the economic condition of immigrants or their settlement patterns. No conclusion can be drawn about the age of native speakers with respect to determining the risk those languages face of extinction. The age of native speakers can be reasonably guessed, but not on the basis of the data provided in this report.

4. These statistics could impact our perception of certain topics by delivering facts about the subject. By providing accurate information, the basis is formed for the reader of the statistics to understand the facts surrounding an issue. Many topics become either politicized or subject to erroneous assumptions. Both can be countered with the use of facts. There is often a gap between perception and reality, but by understanding the reality a new and more accurate perception can be created. This will benefit anybody studying these issues, as they are able to separate out the facts from the perceptions more easily.

5. There are a number of predictions for the future that can be made using these statistics. Such conclusions can be drawn in particular if the figures from the previous census are also made available. With the 2001 figures, trends can be determined in the populations of different ethnic groups. This can assist with a number of public functions in particular, such as the provision of English or French as a second language and other public services. If trends in ethnic diversity and language use are known, then stakeholders can better understand the ethnic makeup of Canada going forward, allowing for better decisions both in terms of public policy and commerce.

The figures also include statistics on age, which is critical for both government and commerce. The degree to which the Canadian population is aging is worth understanding because of the public policy implications as well. On a general level, any line of information contained within the demographic report can be extrapolated into the future to understand the demographic trends within the country.

Works Cited:

StatsCan. (2006). Selected demographic, cultural, educational, labor force and income characteristics. Statistics Canada. Retrieved May 23, 2011 from http://www12.statcan.gc.ca/census-recensement/2006/dp-pd/tbt/Rp-eng.cfm?LANG=E&APATH=3&DETAIL=0&DIM=0&FL=A&FREE=0&GC=0&GID=0&GK=0&GRP=1&PID=99016&PRID=0&PTYPE=88971,97154&S=0&SHOWALL=0&SUB=0&Temporal=2006&THEME=70&VID=0&VNAMEE=&VNAMEF=… [read more]


Statistics Are Integral to Research Term Paper

Term Paper  |  2 pages (713 words)
Bibliography Sources: 0

SAMPLE TEXT:

Statistics are integral to research but it is important to know how to read, interpret, and use statistics so that one can best comprehend what one is reading and not be duped by those who may distort statistical data for subjective purposes.

Statistics are powerful, but used incorrectly or erroneously they can also distort information and lead to negative results. Just as words are ambiguous, numbers and images (such as tables, flowcharts, graphs etc.) can be misleading too.

Reading articles that contain statistical data requires a critical mindset throughout and involves practicing critical techniques.

Firstly, one has to constantly question the source of the data. Even when the data are extracted from a scientific article, the journal needs to be checked to see whether it is a peer-reviewed, credible source. This applies all the more to popular books and articles. All too often, people adopt diets and do-it-yourself treatments that can be potentially destructive at their worst, purely out of respect for statistics and the fact that the data were extracted from a journal or book that contained 'psychology' or 'science' in its title. The background of the author has to be carefully reviewed, as well as the publisher and the context of the statistics.

The presented data may indicate only a part of the picture. Questions therefore have to be asked about, for instance, the purpose of the data, the subjectivity of the researcher, the intention of his research, the policy or procedure that hinges on the statistics, and so forth. Illustrative of this are statistics that describe some political process, for instance the current Iraqi conflict. A good number of the sources are partisan, and the statistical data, authoritative and impressive as they seem, too readily reflect the interest of one side or the other.

Related to this is investigating whether all the data have been included. To return to the Iraqi conflict scenario, some data may have been intentionally excluded or presented in an incongruous manner. The other side of the picture -- and the entire picture -- has to be seen for an accurate perspective to be garnered.

Statistics…… [read more]


NCTM's Agenda for Action and Standards Term Paper

Term Paper  |  2 pages (517 words)
Bibliography Sources: 1

SAMPLE TEXT:

Mathematics

Over time, new generations of students come equipped with unique and different background knowledge. In the 1980s, NCTM, the National Council of Teachers of Mathematics, launched a new Agenda for Action. American students had moved from a largely agricultural society to one focused on science, technology, and information. NCTM provides the blueprints from which mathematics curricula are built across the country. In order to meet the needs of a changing society, the council felt an urgent need to update the mathematics standards to fit the students of the 1980s, and it continues to update the standards for today's students (Krulic, 2003, p. 21).

Many updates were made to the standards in 2000. In the 1989 version, four standards, called process standards, were presented and reached across all grade levels, K-12: problem solving, communication, reasoning, and connections. When the standards were updated in 2000, a fifth process standard, representation, was added. The process standards suggested that students now learn reasoning skills, strategies for solving problems, and an understanding of the relationships between different types of mathematics, as well as the relationships between mathematics and other disciplines (Krulic, 2003, p. 22). Several changes were made. Communication skills, which had long been overlooked when teaching mathematics, were now being emphasized through writing, listening, and other communication about math (Krulic, 2003, p. 22-23). Connections were also a focus, treating the mathematical discipline as a single unit rather than numerous smaller individual parts (Krulic, 2003, p. 23). The process standards suggested a large shift in grade placement and content levels…… [read more]


SPSS Statistics: Social Science Research Instructions: Reading Essay

Essay  |  2 pages (473 words)
Bibliography Sources: 1

SAMPLE TEXT:

SPSS Statistics: Social Science Research

Instructions:

Reading: Chapters 11-14 -- SPSS Statistics 17.0 Guide to Data Analysis by Marija J. Norusis

Other attachments to follow.

Use Assignment 7a -- Tutorial

Problem 7 -- Chapter 12:

Use the select cases facility to select only men with coronary heart disease (variable chd equals 1). Test the hypothesis that they come from a population in which the average serum cholesterol is 205 mg/dl (variable chol58).

a. State the null and alternative hypotheses.

Ho: The population mean is equal to 205.

Ha: The population mean is not equal to 205.

b. What do you conclude about the null hypothesis based on the t test?

The null hypothesis is rejected. The sample mean is significantly different from the hypothesized mean of 205.

c. What is the difference between your sample mean and the hypothetical population value?

63.317

d. How often would you expect to see a sample difference at least this large in absolute value if the null hypothesis is true?

e. Give the range of values that you are 95% confident includes the population value for the mean cholesterol of men with coronary heart disease. Does the interval include your test value of 205?

257.66 to 278.98. This does not include the test value of 205.
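The same computation can be reproduced outside SPSS. The sketch below uses synthetic cholesterol values (the actual chol58 data are not available here) to show how the t statistic, p-value, and 95% confidence interval are obtained:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic serum-cholesterol values (mg/dl) standing in for the chol58 variable.
chol = rng.normal(loc=268, scale=55, size=80)

test_value = 205.0
t_stat, p_value = stats.ttest_1samp(chol, popmean=test_value)

mean = chol.mean()
sem = stats.sem(chol)  # standard error of the mean
ci_low, ci_high = stats.t.interval(0.95, len(chol) - 1, loc=mean, scale=sem)

print(f"sample mean - test value = {mean - test_value:.3f}")
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
print(f"95% CI: ({ci_low:.2f}, {ci_high:.2f})")  # if 205 falls outside, reject the null hypothesis
```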

Assignment #7b:

Use Assignment 7b -- Tutorial

Problem 9 -- Chapter 12:

The leader of the Chicago schools claims that dramatic improvements have occurred between 1993 and…… [read more]


Actuaries the Jobs Rated Almanac Term Paper

Term Paper  |  2 pages (741 words)
Style: APA  |  Bibliography Sources: 4

SAMPLE TEXT:

The career itself requires a solid understanding of mathematics in order to analyze statistics, make recommendations and generalizations based on those statistics, monitor the financial situation of the companies they work for, and provide consultation on investment strategies (Society of Actuaries, n.d.). Actuaries also use statistical analysis to infer the probability of an undesirable event occurring, and the likely cost of such an event. Furthermore, they address many financial questions, such as how much money one should contribute to a pension plan, and how often, to produce a certain retirement income level by a specified time. Finally, actuaries use their statistical, financial, and business knowledge to help design savings plans, pension plans, insurance policies, and other financial programs to help protect people and their assets from potential risks.

Actuaries are highly valued individuals (Society of Actuaries, n.d.). Their expertise is needed by society to ensure that we are protected from many of life's undesirable events. Their creativity and knowledge create strategies to prevent such events from occurring, which relieves us of emotional pain and financial burden. Undesirable events that do occur do not have as strong an impact on us because of the work actuaries do. "Actuaries…are the brains behind the financial safeguards we have implemented in our personal lives, so we can go about our daily lives without worrying too much about what the future may hold for us" (Society of Actuaries, n.d.). Furthermore, it is the knowledge that actuaries possess regarding risks and risk reduction which has informed many of the savings programs we invest in. These programs allow us to protect ourselves and enjoy many of life's pleasures. Thus, we all benefit from the work of actuaries.

References

Braverman, B. & Jeffries, A. (2009, December 1). Top-paying jobs. CNNMoney.com. Retrieved from http://finance.yahoo.com/personal-finance/article/108264/top-paying-jobs.

Department of Mathematics. (n.d.). Actuarial studies. University of Texas at Austin. Retrieved from http://www.ma.utexas.edu/dev/actuarial.

Kouba, D. (n.d.). Why choose a mathematics-related profession? University of California. Retrieved from http://www.math.ucdavis.edu/~kouba/MathJobs.html.

Society of Actuaries & Casualty Actuarial Society. (n.d.). Be an Actuary. Retrieved from www.beanactuary.org.… [read more]


Statistics: Marketing the Practice Applying One's Knowledge Essay

Essay  |  2 pages (534 words)
Style: APA  |  Bibliography Sources: 2

SAMPLE TEXT:

Statistics: Marketing the Practice

Applying one's knowledge and skills in statistics and statistical applications is not too difficult, especially when the client or end user is concerned about the validity or reliability (or both) of the data. However, as with the practice of experts in any other field or area of expertise, the challenge in promoting the use of statistics lies in the manner in which practitioners (i.e., statisticians) "market" the discipline and their expertise.

The science of statistics makes this field an especially exclusive niche for academicians and, at most, for practitioners working as "specialists" in statistics-dependent industries, such as the market/business research and management consulting industries. Statisticians working in academia and in specialist industries have different approaches to implementing statistics in their respective fields. Statisticians working in academia implement statistical principles, techniques, and applications with great rigor, and they usually work on projects that look at issues or problems from a generalist's approach. That is, statistics as applied to checking data quality and to analysis in academia caters specifically to the project itself, with a broader view of how the project's findings will be used as a benchmark or standard for similar kinds of studies.

Statisticians working as specialists for a specific industry, meanwhile, have a more targeted approach to applying statistics in their chosen field of expertise. Statisticians working for market research agencies or consulting firms apply statistical techniques and principles to answer a client's business needs and issues, and each project's findings will be treated as confidential and will not be integrated for public use. Instead, this compilation of studies would be…… [read more]


Statistics in Management Essay

Essay  |  3 pages (781 words)
Bibliography Sources: 3

SAMPLE TEXT:

Statistics in Management: Descriptive vs. Inferential Statistics

The use of descriptive vs. inferential statistics in organizations provides decision makers, managers, and leaders with the necessary insights to compete more effectively in an increasingly challenging global economic climate. The intent of this essay is to define the conditions under which each is optimal. Descriptive statistics, by definition, are more adept at consolidating and summarizing data (Spatz, 2008). Inferential statistics, however, are meant to be representative of a broader population and are developed to be statistically sound (van den Besselaar, 2003). The use of each of these types of statistics varies significantly within organizations, and each has a completely different interpretation when used. This essay examines how each is used to its optimal value.

Best Practices in Descriptive Statistics

There are several functional areas within organizations that rely heavily on descriptive statistics. These include accounting, business planning and analysis, financial planning, marketing, sales, product management, quality, and production. Each of these functional areas is often evaluated on scorecards and benchmarks based entirely on descriptive statistics of its activity over time (Ainslie, Leyland, 1992). Best practice in descriptive statistics in marketing, for example, centers on the need to accurately and succinctly summarize customer feedback about existing marketing strategies, experiences with customer service centers, and the prices paid for products as well. Descriptive statistics are an indispensable tool for evaluating which strategies are best for retaining and growing customer loyalty (Ainslie, Leyland, 1992). In the area of production, descriptive statistics are very useful for evaluating the effectiveness of production techniques, systems, and the routing of specific products across the shop floor. This is exceptionally valuable for getting greater performance and production from less space, as lean manufacturing techniques rely on descriptive statistics for insights into how to continually improve. As these examples indicate, descriptive statistics are best for creating a synopsis or summary of a given set of variables that have a major impact on the organization. From customers to suppliers and production processes, descriptive statistics are invaluable for gaining insights into how to improve an organization over time.
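A minimal sketch of the kind of summary this paragraph describes, using invented customer-satisfaction scores and pandas to produce the descriptive measures a scorecard might report:

```python
import pandas as pd

# Invented customer-satisfaction scores (1-10) from three regions -- illustrative only.
scores = pd.DataFrame({
    "region": ["North"] * 5 + ["South"] * 5 + ["West"] * 5,
    "satisfaction": [8, 7, 9, 6, 8, 5, 6, 7, 5, 6, 9, 8, 9, 7, 8],
})

# Overall descriptive summary: count, mean, standard deviation, min, quartiles, max.
print(scores["satisfaction"].describe())

# The same measures broken out by region, as a scorecard or benchmark might present them.
print(scores.groupby("region")["satisfaction"].agg(["mean", "std", "median"]))
```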

Inferential Statistics and the Defining of Strategies in Organizations

The conditions for applying inferential statistics in an organization are met when the data have been statistically and reliably collected so as to reflect a broader population of users. Inferential statistics are best…… [read more]


Impact of Mathematics on Economics From the Medieval Time Thesis

Thesis  |  1 pages (377 words)
Style: APA  |  Bibliography Sources: 2

SAMPLE TEXT:

¶ … mathematics on economics: Medieval era

A number of new developments occurred during the early Middle Ages in the Arab world that made current methods of calculating economic principles possible: the first was the development of so-called 'Arabic' numerals, which enabled easier methods of calculation than the Roman numeral system, and the second major influence was the development of algebra. However, in Europe, in stark contrast to the ancient Romans and Greeks as well as their Arab contemporaries, medieval Europeans of the feudal era seemed to have less of a fascination with exact calculation and geometric theory. "The Church's education program consisted of schools which taught what was dictated by the Bible and the Pope, they were attached to churches, operated by monks and taught from the geometric, musical, and arithmetic compilations of Anicius Manlius Severinus Boethius" (Dickerson 1996). Theology, rather than mathematics, was the most celebrated of the intellectual disciplines.

Only with the expansion of capitalism did things begin to change. The increased use of money as a placeholder of value, the evolution of more elaborate government bureaucracies and national taxation systems…… [read more]


Damned Lies and Statistics by Joel Best Research Proposal

Research Proposal  |  1 pages (409 words)
Style: MLA  |  Bibliography Sources: 1

SAMPLE TEXT:

¶ … Damned Lies and Statistics by Joel Best discusses both the uses and misuses of statistics, particularly in relation to social issues, problems, changes, and policies. Best focuses especially on the use of social statistics to frame issues and problems, because the information received, and the beliefs and perceptions developed, from social statistics have both beneficial and detrimental effects on the lives of people in a society.

In his book, Best surveys current literature, both popular and scientific/technical, that uses social statistics as the basis for its claims and arguments. He notes that, more often than not, this growing body of empirically driven and empirically generated literature has erroneously interpreted and/or reported statistical results and findings. The use of "authoritative statistics" and "missing numbers" is especially salient in Best's discussions in the book. Statistics and statistical values are used as 'tools' by individuals, groups, or institutions to lend validity to their arguments and claims. Best especially calls the reader's attention to how these numbers and statistical values can "confuse" the general public, which tends to accept a number or statistic mainly because it seems to come from a credible source and, secondarily, because readers generally do…… [read more]


Aryabhata's Contributions to Mathematics & Algebra Aryabhata Research Proposal

Research Proposal  |  2 pages (518 words)
Style: MLA  |  Bibliography Sources: 4

SAMPLE TEXT:

Aryabhata's Contributions to Mathematics & Algebra

Aryabhata was born in 476 AD and was known as Aryabhata I, or Aryabhata the Elder. Aryabhata was a member of the Kusuma Pura school and a native of Kerala, which is located at the extreme south of India. Aryabhata is one of the greatest mathematicians of all time and is considered to be the father of the renaissance of mathematics in ancient India. (Hooda and Kapur, 2001, paraphrased) Indian mathematics could historically claim great achievements before Aryabhata's time, but it was Aryabhata who first had the courage to break with tradition, to find gaps in knowledge, and to fill these gaps with his own research and knowledge. (Hooda and Kapur, 2001, paraphrased)

Important Contributions and Achievements in Mathematics and Algebra

Dutta (2005) in the work entitled: "Mathematics in Ancient India" states that in its earlier stages, mathematics "developed mainly along two broad overlapping traditions." (Dutta, 2005) According to Dutta (2005) these two traditions are those of:

(1) the arithmetical and algebraic; and (2) the geometric. (Dutta, 2005)

Included in Aryabhata's work on Mathematics are the following:

Arithmetic - Method of inversion, various arithmetical operations (cube, cube root)

Algebra - Formulas for finding the sum of several types of series; rules for finding the number of terms of an arithmetical progression; Rule of three - an improvement on the Bakshali Manuscript;

rules for solving examples on interest - which led to the quadratic equation. (Indian Mathematics, 2009)

II. Most Notable Contribution in Algebra

It is stated in the work entitled: "Indian Mathematics" that of all…… [read more]


Inferential Statistics? What Are the Differences? Essay

Essay  |  4 pages (1,399 words)
Bibliography Sources: 2

SAMPLE TEXT:

¶ … inferential statistics? What are the differences? When should descriptive and inferential statistics be used?

Descriptive and inferential statistics: Summary explains the similarities and differences between descriptive and inferential statistics and when each method should be used. Descriptive statistics comprises the kinds of analyses used to describe a study population that is small enough to include every case. Descriptive statistics can also describe the actual sample under study, but they do not allow a researcher to extend conclusions to a broader population.

With descriptive statistics, a researcher can describe how issues affect the study groups and how variables are related within those groups. However, the researcher cannot describe how those issues affect members of groups outside the study or how those variables are related in other groups. Furthermore, the researcher would not be able to conclude that the results generalize to all groups and would not know whether the groups in the study were representative of all groups.

These shortcomings of descriptive statistics are where inferential statistics come into play.

Inferential statistics extend conclusions to a broader population by making sure the study sample is representative of the group the researcher wishes to generalize to. This is accomplished by choosing a sample that is representative of the group to which the researcher plans to generalize. Tests of significance confirm the generalization. A Chi-Square test or a t-test tells the researcher the probability that the results found in the study group are representative of the population that group was chosen to represent. Equivalently, such a test informs the researcher of the probability that the results could have occurred by chance when there is really no relationship at all between the studied variables in the population.
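As a concrete illustration of such a test of significance, the sketch below runs a chi-square test of independence on an invented 2x2 contingency table (the counts are hypothetical, chosen only for demonstration):

```python
from scipy import stats

# Invented 2x2 contingency table of counts: group membership vs. outcome.
#            improved  not improved
observed = [[30, 10],   # treatment group
            [18, 22]]   # control group

chi2, p_value, dof, expected = stats.chi2_contingency(observed)

print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p_value:.4f}")
# A p-value below the chosen alpha (commonly .05) suggests the association
# between group and outcome is unlikely to be due to chance alone.
```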

What are the similarities between single-case and small-N research designs? What are the differences? When should single-case and small-N research designs be used?

Cooper, Heron, and Heward (2007) explain single-case and small-n research designs. These are most often used in applied fields of psychology, education, and human behavior in which the subject serves as his/her own control, rather than utilizing another individual/group. Researchers utilize single-case and small-n designs because they are sensitive to individual organism differences vs. group designs which are sensitive to averages of groups. Small-n research includes more than one subject in a research study, but the subject still serves as his/her own control just like in the single-case design.

Single-case and small-n research have three major requirements (Kazdin):

Continuous Assessment: The research repeatedly observes the behavior of the individual over the course of the intervention. Thus, any treatment effects are observed long enough to convince the researcher that the treatment produces a lasting effect.

Baseline Assessment: Before the treatment is implemented, a researcher looks for behavioral trends. If a treatment reverses a baseline trend (e.g., things were getting worse as time went on in baseline, but the treatment reversed this trend) this is considered powerful evidence suggesting (though not proving) a treatment effect.

Variability in Data: Because behavior is assessed repeatedly, the single-subject/small-n… [read more]


Mathematics Education Term Paper

Term Paper  |  2 pages (677 words)
Style: APA  |  Bibliography Sources: 2

SAMPLE TEXT:

Mathematics Education

The objective of this work is to describe five specific methods of questions and strategies that encourage students to discuss their ideas, procedures, rules and definitions that they used to solve a problem and to discuss at least four ways in which justification of solutions to improve students' relational understanding of mathematics.

The work of Jones (2000), entitled "Instructional Approaches to Teaching Problem Solving in Mathematics: Integrating Theories of Learning and Technology," states that "Problem solving is defined by Kantowski as 'a situation for which the individual confronting it has no readily accessible algorithm that will guarantee a solution'" (2000). NCTM standards define problem solving as "the process by which students experience the power and usefulness of mathematics in the world around them" (Jones, 2000). The stages of problem solving are stated to be:

1) Understanding the problem;

2) Making a plan;

3) Carrying out the plan;

4) Looking back. (Jones, 2000)

There are five strands of mathematical proficiency, which are stated to be as follows:

1) conceptual understanding;

2) procedural fluency;

3) strategic competence;

4) adaptive reasoning; and

5) productive disposition. (Taplin, n.d.)

Conceptual understanding of mathematics involves comprehension of mathematical concepts, operations, and relations. Procedural fluency involves skill in carrying out procedures in a flexible, accurate, efficient, and appropriate manner. Strategic competence involves the ability to formulate, represent, and solve mathematical problems. Adaptive reasoning involves a capacity for logical thought, reflection, explanation, and justification. Finally, productive disposition involves the habitual inclination to see mathematics as sensible, useful, and worthwhile, coupled with a belief in diligence and in one's own efficacy.

Five specific strategies that teachers may use for encouraging students to discuss their ideas, procedures, rules and definitions that they used to solve a problem include those as follows:

1) the teacher provides just enough information to establish the intent of the problem;

2) the teacher accepts right or wrong answers in a non-evaluative manner;

3) the teacher guides, coaches and asks insightful questions;

4) the teacher intervenes when appropriate and when not appropriate the teacher allows the students to make their own way;…… [read more]


Mathematics as a Creative Art Term Paper

Term Paper  |  1 pages (340 words)
Bibliography Sources: 0

SAMPLE TEXT:

Mathematics as Creative Art

P.R. Halmos waxes poetic about mathematics, claiming not only that mathematics presents practical value but also that "mathematics is an art" (p. 379). Envisioning mathematics as art affirms the creative potential of math and acknowledges the myriad ways math becomes manifest in everyday life. What Halmos refers to as "mathophysics" includes the applied principles of "mathology." Moreover, Halmos claims that "mathematics is very much alive today," a statement as true in 1968, when Halmos wrote "Mathematics as a Creative Art," as it is in 2006 (p. 380).

As a math teacher married to a painter, I especially relate to Halmos' comparison of the role of the mathematician to the role of the visual artist. The mathematician's role, like that of the painter, is varied and flexible. Both the mathematician and the painter interpret the world, but just as the painter is not "a camera," neither is the mathematician "an engineer" (p. 388). At the same time, mathematics and painting both serve concrete functions, and just…… [read more]


Growth of Mathematics Term Paper

Term Paper  |  2 pages (615 words)
Bibliography Sources: 0

SAMPLE TEXT:

Growth of Mathematics

Mathematics Hard and Soft

"Mathematical truth is time-dependent, although it does not depend on the consciousness of any particular live mathematician" (p. 415). In other words, mathematics grows as the body of human knowledge grows; each generation gleans new wisdom from the environment, experimentation, or personal experience and transmits that knowledge to contemporary and future generations either orally or in writing. Noted mathematicians may get their names printed in textbooks or permanently attached to their theorems, but the greater body of mathematics grows whether or not momentous discoveries bring an individual mathematician fame. One of the primary ways mathematics changes over time is through the transformation of soft sources of information, such as common knowledge, intuition, or hunch, into hard information in the form of proof.

In fact, mathematicians have accepted hunches and other soft sources of information as "true" even before formal proof has been established. Number theory is especially full of instances in which mathematicians can rely fairly well on assumptions without demanding full proof: "in number theory, there may be heuristic evidence so strong that it carries conviction even without rigorous proof" (p. 411). For example, mathematicians do not know for sure whether or not an infinite number of twin prime pairs exists, and yet we still act as if there are infinitely many twin prime pairs. Mathematicians take some ideas for granted, unless of course the ideas are proven wrong. In any case, proofs often take generations or even centuries to arrive. The prime number theorem was first conjectured in 1792 by a fifteen-year-old Gauss, but it remained unproven until 1896. Mathematicians rely on soft information that can best be described as working knowledge until hard information becomes available.
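The prime number theorem makes this heuristic-versus-proof point easy to see numerically: the prime-counting function pi(x) is approximated by x/ln x, and even a naive computation (a simple sieve, written for clarity rather than efficiency) shows the ratio drifting slowly toward 1.

```python
import math

def prime_count(limit):
    """Count primes up to limit with a simple sieve of Eratosthenes."""
    sieve = [True] * (limit + 1)
    sieve[0:2] = [False, False]
    for n in range(2, int(limit ** 0.5) + 1):
        if sieve[n]:
            sieve[n * n:limit + 1:n] = [False] * len(range(n * n, limit + 1, n))
    return sum(sieve)

for x in (10**3, 10**4, 10**5, 10**6):
    pi_x = prime_count(x)
    approx = x / math.log(x)
    print(f"x = {x:>8}   pi(x) = {pi_x:>6}   x/ln x = {approx:10.1f}   ratio = {pi_x / approx:.3f}")
```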

Similarly, mathematicians permit the existence of underlying beliefs, biases, and ideology that may influence…… [read more]


Aesthetic Appeal of Mathematics Term Paper

Term Paper  |  2 pages (659 words)
Bibliography Sources: 1+

SAMPLE TEXT:

Mathematics

From conch shells to chrysanthemums: nature abounds with spectacular arrays of geometrical forms. Their visual forms can be translated into mathematical equations, enabling an intellectual understanding of the ways such geometric forms are created and replicated throughout the visual world. Translating visual forms into equations does more than satisfy thirst for computation, though. As the "science of total intellectual order," mathematics enables human beings to perceive order in the universe, to see neither a random collection of petals nor a smelly set of sea creatures ("Patterns, Order, and Chaos," p. 189). In addition to helping human beings perceive natural order in a frequently chaotic universe, mathematics also encourages several key functions including generalization, idealization, and abstraction. The equations the mathematician conceives can be applied to all similar conch shells, not just one or two; in fact, any spiral will follow certain trajectories and will be represented through similar mathematical symbols. Equations also stimulate the innate fascination with the ideal and the absolute. Circles, pyramids, spirals, and parabolas are transformed into ideal, nearly spiritual absolutes: like Plato's forms they are archetypal representations. Represented in the mundane world, circles, pyramids, spirals and parabolas are rarely as perfect as they are in the human mind. Finally, mathematics encourages abstraction, which liberates the mind to pursue open-minded and free thinking. Mathematics reveals the beauty of the natural world and by using mathematical equations human beings can create works of majesty and art.

S. Jan Abas notes that geometric forms predominate in medieval Islamic art not only because of the prohibition against anthropomorphized depictions of deity but also because of the intrinsic aesthetic value of mathematics. The stars and rosettes that pepper Islamic art and architecture serve several key symbolic and practical functions: they symbolize divine presence and intervention; they represent divine light and spiritual illumination; and they permit actual light to flow through physical spaces such as mosques or palaces. The aesthetic value of the star patterns in Islamic art and…… [read more]


Statistics Anxiety and Graduate Students Term Paper

Term Paper  |  4 pages (1,160 words)
Bibliography Sources: 0

SAMPLE TEXT:

Statistics Anxiety and Graduate Students in the Social Sciences

Many graduate students in the social sciences need to take statistics as part of their academic training, but these students often do not have backgrounds in statistics or mathematics from their undergraduate degrees or other graduate training. In the classroom, statistics anxiety is noticeably prevalent among graduate students whose academic background includes little statistical training. According to Onwuegbuzie, Slate, Paterson, Watson, and Schwartz (2000), 75% to 80% of graduate students appear to experience uncomfortable levels of statistics anxiety. As a result, conducting statistics is often rated as the lowest skill in terms of academic competence (Huntley, Schneider, and Aronson, 2000).

Statistics anxiety has been defined simply as anxiety that occurs as a result of encountering statistics in any form and at any level (Onwuegbuzie, DaRos, & Ryan, 1997), and has been found to negatively affect learning (Onwuegbuzie & Seaman, 1995). Many researchers (Lazar, 1990; Lalonde & Gardner, 1993; Onwuegbuzie, 2000b) suggested that learning statistics is as difficult as learning a foreign language. On the other hand, statistics anxiety sometimes is not necessarily due to the lack of training or insufficient skills, but due to the misperception about statistics and negative experiences in a statistical class. For instance, students often think they do not have enough mathematics training so that they cannot do well in statistical classes. With fear of failing the course, they delay enrolling in statistics courses as long as possible, which often leads to failure to complete their degree programs (Onwuegbuzie, 1997). The lack of self-efficacy and higher anxiety in statistics keep many students away from engaging in research work or further to pursue an academic career. Therefore, statistics becomes one of the most anxiety-inducing courses in their programs of study (Blalock, 1987; Caine, Centa, Doroff, Horowitz, & Wisenbaker, 1978; Schacht & Stewart, 1990; Zeidner, 1991).

A growing body of research has documented a consistent negative relationship between statistics anxiety and course performance (Zeidner, 1991; Elmore et al., 1993; Lalonde & Gardner, 1993; Onwuegbuzie & Seaman, 1995; Zanakis & Valenza, 1997). In fact, statistics anxiety has been found to be the best predictor of achievement in research methodology (Onwuegbuzie et al., 2000) and statistics courses (Fitzgerald et al., 1996). Most recently, Onwuegbuzie (in press b), using path-analytic techniques, found that statistics anxiety and expectation play a central role in his Anxiety-Expectation Mediation (AEM) model, being related bi-directionally to statistics achievement and, at the same time, moderating the relationship between statistics achievement and research anxiety, study habits, course load, and the number of statistics courses taken. The AEM model is presented in Figure 1. Onwuegbuzie (in press b) posited that the pivotal role of statistics anxiety in the AEM model suggests that Wine's (1980) Cognitive-Attentional-Interference theory can be applied to the field of statistics, as it can be to the foreign language learning context. According to Onwuegbuzie, Wine's theory predicts that anxiety interferes with performance by impeding students' ability to receive, to concentrate on, and to encode statistical terminology, language, formulae and concepts. Moreover, Onwuegbuzie theorised that anxiety reduces the… [read more]


Statistics: A Question Unanswered Term Paper

Term Paper  |  5 pages (1,450 words)
Bibliography Sources: 0

SAMPLE TEXT:

Statistics: A Question Unanswered

Before any intelligent discussion of statistics can take place, one must understand that a statistical process neither stands alone nor functions without being part of a much larger plan, namely, the research investigation as a whole. Statistics and their accompanying processes are only one such part of the… [read more]


Chinese Mathematics in Ancient China Term Paper

Term Paper  |  6 pages (1,633 words)
Bibliography Sources: 1+

SAMPLE TEXT:

As a result, there is scant trace of the advanced knowledge that characterized ancient Chinese mathematics.

Influence

Much of modern mathematics today emerges as re-discoveries of principles and techniques already applied by the ancient Chinese. Pascal's Triangle, for example, was already in use as early as the 13th century in China. The Chinese had also unknowingly employed the mathematical principles of the ancient Greeks before these works were rediscovered by European mathematicians like Carl Friedrich Gauss.

Despite their early advancement, however, there is little evidence of any ancient Chinese principles on mathematics today. In contrast, ancient Arabic and Hindu principles can be discerned in the techniques and number notation system employed today.

In addition to the destruction of ancient Chinese mathematical texts, the decline of traditional Chinese methods can also be traced to Matteo Ricci, a Jesuit missionary who lived in China from the late 16th to the early 17th century. Ricci is widely credited with introducing Western mathematics to China. Ricci became proficient in Chinese language and culture. As a sign of the Chinese people's esteem for the European scholar, Ricci was allowed to visit and live in Peking, which until then had been closed to foreigners (Spence 5-9).

In addition to studying, Ricci shared with Chinese scholars the mathematical knowledge he had learned from the renowned Roman scholar Clavius. The logical construction of Euclid's Elements quickly superseded traditional Chinese notation. The practical orientation of Chinese mathematics further disguised its theoretical achievements.

However, the lack of any discernible influence today should not detract from the great achievements of ancient Chinese mathematics. After all, mathematical principles also underlay the development of more popular Chinese scientific achievements, such as gunpowder, the principles of paper money, and seismographs, which were used to measure earthquakes as early as 1000 AD. It is in these scientific and technological developments that Chinese mathematical principles continue to live.

Works Cited

Martzloff, Jean-Claude. A History of Chinese Mathematics. New York: Springer Verlag, 1997.

Needham, Joseph. Science and…… [read more]


Course Analysis: Math and Statistics Term Paper

Term Paper  |  3 pages (1,237 words)
Bibliography Sources: 3

SAMPLE TEXT:

Statistics can be described as the study of the collection, interpretation, presentation, description and analysis of data. In particular, statistics in business taught us to apply a variety of statistical methods in a business context to facilitate evidence-based decision making and to provide answers to business related questions. The overarching objective of the course was to analyze each of the course elements in detail and at the same time learn how to use this knowledge to describe data and make informed decisions based on well reasoned statistical arguments. The course elements learnt include: descriptive statistics, inferential statistics, hypothesis development and testing, selections of appropriate statistical tests, and evaluating statistical results. All the elements enabled us to make inferences about a given population from sample data, to perform statistical analyses and to interpret different results. Thus, all the knowledge gained from the statistics course is applicable in real life situations and will also be useful in solving a variety of analytical problems in our future careers. This text takes a look at the five elements of the business statistical course in detail and evaluates their applicability in day-to-day operations and different careers.

Descriptive statistics

Descriptive statistics are used to describe or summarize data in a meaningful way so that patterns can emerge from the data. Numeric values such as the range, standard deviation, mean, mode, and median are used to describe the main characteristics of the data in a study. They do not, however, support conclusions beyond the data analyzed or about any hypothesis that has been made about the data (Wasserman, 2004). Descriptive statistics can be applied in a variety of situations. For instance, they can be used to describe findings in business-related research, to provide insight into business trends, and to explain deviations from expected levels of performance. Descriptive methods can also be used to explain the trends followed by stocks traded on a financial market and to explain fluctuations in currency in relation to international trade. The knowledge gained can support a career at the U.S. Census Bureau, where descriptive characteristics are used to indicate average household sizes, employment rates, per-capita income, and gender and ethnic breakdowns. The methods are also useful for people who conduct market research and offer financial services.
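As a minimal sketch of the descriptive measures named above, the following Python snippet computes them for a small set of hypothetical values (the sales figures are invented purely for illustration):

```python
import numpy as np
from statistics import multimode

# Hypothetical monthly sales figures (in units) for one product line
sales = np.array([42, 37, 55, 48, 42, 61, 39, 44, 52, 42])

print("mean:", sales.mean())                # arithmetic average
print("median:", np.median(sales))          # middle value
print("mode:", multimode(sales.tolist()))   # most frequent value(s)
print("range:", sales.max() - sales.min())  # spread between extremes
print("std dev:", sales.std(ddof=1))        # sample standard deviation
```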

Inferential statistics

Inferential statistics makes use of probabilistic techniques to analyze sample information from a known part of a population in order to improve knowledge about the unknown whole. More specifically, the techniques learned in inferential statistics allowed us to use samples to make informed generalizations about the populations from which the samples were drawn. Downing and Clark (2010) state that the two methods applied in inferential statistics are the testing of statistical hypotheses and the estimation of parameters. These methods account for the sampling error that arises when a sample fails to represent the population perfectly. Inferential statistics are applied in daily managerial decision making, product promotion surveys, marketing research, and competitor analysis. They can also be used in the calculation of the consumer price index… [read more]
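As an illustration of the two inferential methods just named, parameter estimation and hypothesis testing, the sketch below uses made-up sample data to estimate a population mean with a confidence interval and to test a hypothesis about it:

```python
import numpy as np
from scipy import stats

# Hypothetical sample of 25 customer-satisfaction scores drawn from a larger population
sample = np.array([7.2, 6.8, 8.1, 7.5, 6.9, 7.7, 8.0, 7.1, 6.5, 7.9,
                   7.4, 7.6, 6.7, 7.3, 8.2, 7.0, 7.8, 6.6, 7.5, 7.2,
                   7.9, 7.1, 6.8, 7.4, 7.6])

# Parameter estimation: 95% confidence interval for the population mean
ci = stats.t.interval(0.95, df=len(sample) - 1,
                      loc=sample.mean(), scale=stats.sem(sample))
print("95% CI for the mean:", ci)

# Hypothesis test: is the population mean different from a benchmark of 7.0?
t_stat, p_value = stats.ttest_1samp(sample, popmean=7.0)
print("t =", t_stat, "p =", p_value)
```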


Determining Appropriate Statistics Methodology Chapter

Methodology Chapter  |  2 pages (860 words)
Bibliography Sources: 2

SAMPLE TEXT:

¶ … Kolmogorov-Smirnov test

Factor analysis

Linear regression

Goldfeld-Quandt test

Kaiser-Meyer-Olkin (KMO)

Multivariate regression

Correlation -- Pearson's r

Cronbach's α

Durbin-Watson statistic

See descriptions and justifications below.

Correlation importance and justifications

Correlation measures the strength of association between two variables and, as such, enables bivariate analysis ("Statistics Solutions," 2012). The correlation coefficient ranges between +1 and -1, and a coefficient of ±1 indicates a perfect degree of association between the two variables. Assumptions for the Pearson r correlation include normal distribution, linearity, and homoscedasticity ("Statistics Solutions," 2012). Linearity assumes a straight-line relationship between the variables in the analysis, and homoscedasticity assumes that the data are evenly spread (constant variance) about the regression line ("Statistics Solutions," 2012).
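A minimal sketch of computing Pearson's r on two hypothetical variables (the data and variable names are invented for illustration):

```python
import numpy as np
from scipy import stats

# Hypothetical paired observations: weekly advertising spend and weekly sales
ad_spend = np.array([1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0, 4.5])
sales    = np.array([10., 12., 15., 14., 18., 21., 20., 24.])

r, p_value = stats.pearsonr(ad_spend, sales)
print(f"Pearson r = {r:.3f} (p = {p_value:.4f})")
# r close to +1 indicates a strong positive association,
# r close to -1 a strong negative one, and r near 0 little linear association.
```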

3. Reasoning (justifications) to use parametric or non-parametric statistics

Parametric statistics are used when the data are expected to follow a type of probability distribution, so that inferences can be drawn based on the parameters of that distribution (Geisser & Johnson, 2006). Parametric methods make more assumptions than non-parametric methods; when those assumptions are correct, they can generate more precise and accurate estimates, which is to say they have greater statistical power (Geisser & Johnson, 2006).
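As an illustration of that trade-off, the sketch below, with simulated data, runs a parametric two-sample t-test alongside a common non-parametric counterpart, the Mann-Whitney U test:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Simulated scores for two independent groups
group_a = rng.normal(loc=50, scale=10, size=30)
group_b = rng.normal(loc=55, scale=10, size=30)

# Parametric: assumes approximately normal data with similar variances
t_stat, t_p = stats.ttest_ind(group_a, group_b)

# Non-parametric: rank-based, makes weaker distributional assumptions
u_stat, u_p = stats.mannwhitneyu(group_a, group_b, alternative="two-sided")

print(f"t-test:        t = {t_stat:.3f}, p = {t_p:.4f}")
print(f"Mann-Whitney:  U = {u_stat:.1f}, p = {u_p:.4f}")
```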

4. Selection of statistical method suitable for the selected model(s) And 5. Justification of the selected statistical method

Multivariate analysis is used because the model involves a large number of independent variables. This is already discussed in the draft of the paper.

6. Assumption of the selected statistical method(s) AND 7.Discuss the need of Normality assumption

With parametric statistics, there is an assumption that the data are drawn from normal probability distributions that have the same shape and are characterized (parameterized) by a mean and a standard deviation ("Statistics Solutions," 2012). That is to say, if the researcher knows the mean and standard deviation -- and if the distribution is, in fact, normal -- then the probability of any future observation can be calculated ("Statistics Solutions," 2012). To verify data normality, a goodness-of-fit test may be used; in this study, the Kolmogorov-Smirnov test will be used ("Statistics Solutions," 2012).
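A minimal sketch of such a goodness-of-fit check (the data are simulated; in practice the study's own variables would be tested):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
data = rng.normal(loc=100, scale=15, size=200)   # simulated measurements

# One-sample Kolmogorov-Smirnov test against a normal distribution whose
# mean and standard deviation are estimated from the sample itself
# (strictly, estimating the parameters this way makes the test approximate).
stat, p_value = stats.kstest(data, "norm", args=(data.mean(), data.std(ddof=1)))
print(f"KS statistic = {stat:.4f}, p = {p_value:.4f}")
# A large p-value gives no evidence against normality;
# a small one suggests the normality assumption is doubtful.
```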

7. Multicollinearity assumption & implication to student work

Multiple linear regression assumes little to no multicollinearity in the data. Multicollinearity exists when the independent variables are not independent of one another ("Statistics Solutions," 2012). There is also an assumption of independence of errors: the error in predicting the dependent variable is assumed to be independent of the values of the independent variables ("Statistics Solutions," 2012).
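One common screen for multicollinearity, offered here as a hedged sketch rather than as part of the study's stated procedure, is the variance inflation factor (VIF); the example below uses the statsmodels library and invented predictors:

```python
import numpy as np
import pandas as pd
from statsmodels.stats.outliers_influence import variance_inflation_factor
from statsmodels.tools.tools import add_constant

rng = np.random.default_rng(1)
n = 200
x1 = rng.normal(size=n)
x2 = 0.9 * x1 + rng.normal(scale=0.3, size=n)   # deliberately correlated with x1
x3 = rng.normal(size=n)                          # roughly independent predictor

X = add_constant(pd.DataFrame({"x1": x1, "x2": x2, "x3": x3}))
for i in range(1, X.shape[1]):                   # skip the constant column
    print(f"VIF({X.columns[i]}) = {variance_inflation_factor(X.values, i):.2f}")
# Rule of thumb: VIF values well above 5-10 flag problematic multicollinearity.
```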

8. Discuss ways to overcome the Multicollinearity

When multicollinearity occurs in…… [read more]


Prediction Essay

Essay  |  6 pages (1,807 words)
Style: APA  |  Bibliography Sources: 0

SAMPLE TEXT:

38; median = 3.3; mode = N/A (all values at the same frequency); variance = .692889; standard deviation = .263228; kurtosis = -1.2189; skew = .182109; range = 2.5; sum = 38.3.

Higher reaction time memory score group: mean = 9.24; median = 8.85; mode = 9.5; variance = 5.004889; standard deviation = 2.237161; kurtosis = 6.86883; skew = 2.443916; range = 7.9; sum = 94.2.

Aside from the obvious differences one would expect when separating groups into low and high scores (e.g., higher or lower mean and median compared to the overall group results, different sums, etc.), there are a couple of interesting differences here. First, the lower reaction time group does not have a single specific mode (all scores in that distribution occur at the same frequency, so there are multiple modes), whereas the higher reaction time group has a specific mode. The presence of an outlier (X = 15.2) in the higher group inflates the mean, the range, and the variance in that group (thus any descriptive statistic using the sum of the scores would also be inflated by an outlier in the high range). For instance, removing the high score reduces the mean, standard deviation, range, and sum, and of course the shape of the distribution would also be affected.

The measures of central tendency are unaffected when you double the data set by repeating the same scores. This is because you are simply adding more of the same scores and are not adding variability to the sample. Of course, the sum changes significantly because you are doubling it. There are slight changes in the measures of dispersion, because the sample variance decreases slightly when duplicate scores are added to a distribution. The shape of the distribution will also change slightly as…… [read more]
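A quick numerical check of these claims, using an arbitrary made-up data set rather than the reaction-time scores themselves, shows what duplicating every score does and does not change:

```python
import numpy as np

scores = np.array([3.1, 3.3, 3.6, 3.8, 4.0, 4.2])
doubled = np.concatenate([scores, scores])   # every score appears twice

for label, x in [("original", scores), ("doubled", doubled)]:
    print(f"{label}: mean={x.mean():.3f}, median={np.median(x):.3f}, "
          f"sum={x.sum():.1f}, sample var={x.var(ddof=1):.4f}, "
          f"range={x.max() - x.min():.1f}")
# Mean, median, and range are unchanged; the sum doubles; the sample
# variance shrinks slightly because the denominator grows from n-1 to 2n-1.
```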


Artist in Cultural Phenomenon Term Paper

Term Paper  |  3 pages (810 words)
Bibliography Sources: 1

SAMPLE TEXT:

As strange as it might seem, Aronofsky's film is meant to explore the idea of abstract math. Film and abstract mathematics are two concepts that come together here and form a paradox, as together they practically enable viewers to understand that there is much more beyond calculations and simple mathematical formulae.

This film is not necessarily meant to discuss typical mathematical ideas. Instead, its producers wanted to address an intriguing idea concerning mathematics and science in general: chaos theory. The motion picture's protagonist discovers a link between chaos theory and the number Pi, and this leads a great number of individuals to express interest in his work. These people practically acknowledge the important role that this discovery could play for humanity as a whole and concentrate on understanding it themselves in order to accomplish goals that would otherwise be unachievable.

Aronofsky probably wanted his viewers to realize that there is a strong connection between the world and mathematics. Mathematics can practically function as a language for explaining events happening throughout the universe, and a person who is well-acquainted with this language is likely to gain a more complex understanding of things that otherwise seem unexplainable.

Pi is one of the most powerful concepts in the film, and by looking at how the circle is a perfect shape, one can easily acknowledge that Pi has a special place in the world. By dividing a circle's circumference by its diameter, one gets the number Pi. This particular number is impressive because it cannot be written out exactly in any conventional form: its decimal expansion neither terminates nor repeats. Instead, people relate to it by using a symbol, which makes it possible for humanity as a whole to appreciate the limits of how much the world actually knows about mathematics.
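The relationship described here can be written in one line; the digits shown are only the familiar opening of an expansion that never terminates or repeats (a standard mathematical fact rather than something drawn from the film):

```latex
\pi = \frac{C}{d} = 3.14159\,26535\,89793\ldots
```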

Aronofsky emphasized Pi as a number that goes on forever and that people therefore find impossible to grasp fully. The very concept of infinity is an idea that people can name but find impossible to truly think about. Like the universe, the number Pi is infinite, and humanity is unable ever to understand it completely. The film practically treats this number as possibly being the answer to existential questions -- questions that people have always considered, but that they have also acknowledged as being impossible to answer.

Works cited:

Dir. Darren Aronofsky. Pi. Artisan…… [read more]


Counselling Master Questionnaire Questionnaire

Questionnaire  |  15 pages (4,070 words)
Bibliography Sources: 1+

SAMPLE TEXT:

26)

The primary objective of statistics is to make inferences concerning a population. In doing so, there is a need to explain or offer information concerning the sample that will feature in a given study. Descriptive statistics come in and assist in describing the sample on which the study will take place. In addition, descriptive statistics first offer substantial information… [read more]


Numerical Research That Can Be Analyzed Essay

Essay  |  4 pages (1,201 words)
Bibliography Sources: 4

SAMPLE TEXT:

¶ … numerical research that can be analyzed in a statistical fashion. Quantitative research frequently -- although not exclusively -- deploys the scientific method, whereby a hypothesis is tested in a controlled fashion. One group, the experimental group, is subjected to an intervention known as the independent variable, while another, otherwise similar group is designated the control group and is not subjected to that variable. The dependent variable is the change or lack of change that results from the intervention, and the results support or undermine the initial hypothesis. Quantitative research can also take the form of a survey or other instrument designed to collect raw data about a particular population.

In contrast, qualitative research is designed to explore the evolution of a particular phenomenon in narrative form. Responses from test subjects may be coded and subjected to data analysis, but ultimately the goal of this type of research is to record the particular experience of a population in a holistic fashion, not to test a theory within limited parameters. This contrast means that qualitative research is often seen as subjective, versus the supposedly superior objective claims of quantitative methodologies. However, there are many persistent problems with quantitative research that complicate this schematic notion. It has been observed that "poor statistics" make for "poor science," and this is true of all disciplines: indeed, in the social sciences, where variables are more difficult to isolate within populations, rigorous statistical methodology to eliminate error is even more significant (Gardenier & Resnik 2002: 70). Also, in quantitative research, using effective statistical testing is vital, regardless of the experiment, given the ethical implications of having human subjects take the risk of participating in a study of questionable utility and value (Gardenier & Resnik 2002: 66).

In an experiment involving statistical analysis of a population, the formal 'null hypothesis' is tested (the hypothesis that nothing will happen). The null is actually a statement contrary to what researchers want to prove. In general, it is assumed that false rejection of the null hypothesis is less damaging than false acceptance -- i.e., that overestimating the potential impact of a variable is less troubling than failing to recognize its impact (Baroudi & Orlikowski 1989: 88). "The embedded null approach involves embedding a hypothesis of no effect within an interaction framework. The framework is then used to show that, under certain conditions, the manipulation/predictor variable in question does produce an effect or relationship, while under the conditions of primary interest, the effect or relationship does not appear" (Cortina 2002: 342).
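A minimal sketch of testing such a null hypothesis of 'no effect' against experimental and control group data (the data are simulated purely for illustration):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
control      = rng.normal(loc=100, scale=15, size=40)   # no intervention
experimental = rng.normal(loc=108, scale=15, size=40)   # received the intervention

# Null hypothesis: the intervention has no effect (equal population means).
t_stat, p_value = stats.ttest_ind(experimental, control)
alpha = 0.05
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")
print("Reject H0" if p_value < alpha else "Fail to reject H0")
```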

The cautious approach to tracking change makes sense given that the selection of the test population may be imperfect and contain too many outliers. That is why a 'statistically significant' alteration must be in evidence, not simply any change at all. "The reason that we avoid concluding a lack of effect from studies that show minimal or non-significant results is that there are many alternative explanations for this finding" (Cortina 2002: 343). Particularly in the social sciences, it… [read more]


Quantitative Analysis for Business Essay

Essay  |  2 pages (637 words)
Bibliography Sources: 2

SAMPLE TEXT:

¶ … Business

Statistics is the study of data collection, organization, analysis, interpretation, and, eventually, presentation. It is the science of collecting, summarizing, and analyzing data in numerical form. It entails planning data collection in terms of the design of the surveys and experiments to be used (Calkins, 2005).

There are two types of statistics: descriptive statistics and inferential statistics. Descriptive statistics entails methods of organizing, displaying, and describing data with the use of tables, graphs, and summaries. Inferential statistics is a process used to describe a population on the basis of sample results. It entails estimating unknown population parameters from sample results and hypothesis testing, which is used to reject or fail to reject a hypothesis stated beforehand.

There are four main levels of measurement used in statistics: nominal, ordinal, interval, and ratio. Each has a different degree of usefulness in statistics. Nominal measurements have no meaningful rank order among values; the numbers serve only as labels. They can therefore be used in cross-tabulations; for example, the chi-square test is performed on a cross-tabulation of nominal-scale data, as sketched below.
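A minimal sketch of that chi-square test on a hypothetical 2x2 cross-tabulation of two nominal variables (the categories and counts are invented for illustration):

```python
from scipy.stats import chi2_contingency

# Rows: gender (female, male); columns: preferred product (A, B)
table = [[30, 20],
         [18, 32]]

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.3f}, dof = {dof}, p = {p_value:.4f}")
# A small p-value suggests the two nominal variables are not independent.
```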

Ordinal measurements have imprecise differences between consecutive values, but the order of those values is meaningful.

Interval measurements have meaningful, well-defined distances between values, but the zero point is arbitrary. They can be used to compute most commonly used statistical measures.

Ratio measurements have both a meaningful zero value and clearly defined distances between measurements, and hence provide the greatest flexibility in the statistical methods that can be used for data analysis (Calkins, 2005).

Statistics enables prediction of events and hence plays a crucial role in business decision making. It can make the difference between the continued success of a business and its eventual failure. Statistical research therefore…… [read more]


Division by Zero Mathematics Term Paper

Term Paper  |  4 pages (1,382 words)
Bibliography Sources: 4

SAMPLE TEXT:

Also, since any number times zero equals zero, there is no unique solution for the answer. Example problems such as 2 x 0 = 0 and 1450 x 0 = 0 show that any number times zero equals zero; but precisely because so many different numbers yield the same product, the original number cannot be recovered when the reverse operation is performed (Knifong 1980, 179). Even 0 x 0 = 0 is problematic, in that 0/0 could be assigned infinitely many different values.

In higher mathematics, such as calculus, the question of division by zero becomes even more complicated. Asymptotes, for example, are lines that correspond to the zeroes of the denominator of a rational function (Kuptsov 2001). This matters when graphing functions because, since zero can never appear in the denominator, the person solving the equation knows that the graph cannot include the values of x that would make the denominator zero. For example, if the graph of a curve were y = 2/x, x could never be zero, because 2/0 is undefined. The exclusion of x = 0 appears on the graph as an asymptote, but the value of a number divided by zero never does.

Limits are another component of calculus that complicates division by zero, but they still do not change the fact that it is an impossibility. Even in calculus, the actual arithmetic value of zero divided by itself cannot be determined. Instead, the function of the limit is to track the pattern of quotients and so make a best estimate of what such an answer might be if it existed (Weisstein 2012). The established rule in calculus is that a limit involving division by zero can be plus or minus infinity, written +∞ or -∞, or can fail to exist. With limits, the mathematician can come arbitrarily close to describing the behavior near division by zero; however, this is only ever an approximation of a trend and never fully solves the problem.
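For instance, the one-sided limits of 1/x near zero illustrate the plus-or-minus-infinity rule just described, while the familiar 0/0 form below shows why a limit only describes a trend rather than assigning an arithmetic value to division by zero (a standard calculus example, offered as illustration rather than taken from the cited sources):

```latex
\lim_{x \to 0^{+}} \frac{1}{x} = +\infty, \qquad
\lim_{x \to 0^{-}} \frac{1}{x} = -\infty, \qquad
\lim_{x \to 0} \frac{\sin x}{x} = 1 \quad \left(\text{a } \tfrac{0}{0} \text{ form}\right).
```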

There have been advances in hypothetical and theoretical mathematics, such as fractal Cantorian spacetime. It is an "operational extension to algebraic groups" which poses that there is a place within quantum physics that would allow for a division by zero (Czajko 2004, 261). Researchers in this field have postulated that such a division would allow for better understanding and use of "mutually dual line vector spaces" (Czajko 2004, 262). Scientific inquiry also poses potential situations in which there may in fact be ways to divide by zero. However, it must be noted that all these propositions are theoretical and none has been empirically proven.

In common arithmetic, zero is often treated not as a quantity in its own right but as the placeholder for the position between the positive and the negative numbers. It is, in reality, the lack… [read more]


Grade 1 Math Standards Essay

Essay  |  3 pages (748 words)
Bibliography Sources: 3

SAMPLE TEXT:

Through performance standards design, the two systems have developed a scheme of conceptual nesting rather than simply relying on the more conventional system of learning tiers. Too often, in previously employed standards for mathematics performance, students advanced through their demonstrations of mathematical reasoning by moving from one tier to the next, presumably more advanced, tier. For instance, first graders needed to demonstrate that they could count to 100 before they began addition, or the like. The point is that the performance standard categories were treated as discrete teaching and learning units. The approach taken by the North Carolina Teachers of Mathematics (NCTM) standards and the North Carolina Common Core state standards is integrative -- the standards do not assume that children will arrive on their own at an understanding of the numerical relationships in their instructional units. Rather, the performance standards are designed to deliberately draw out and teach those relationships by coming at the constructs from many different perspectives and practical exercises.

The mathematics performance standards are part of a larger whole designed to encompass the learning requirements for students across their K-12 educational experience. These new, clear, and consistent state standards have been thoughtfully aligned with the expectations of higher education and the workplace. The current standards are also consistent with earlier versions of the state standards, an important consideration for efficacious institutional effort in teaching and for the motivation of students. In other words, no lost time and no wasted effort result from the adoption of the new state standards.

Moreover, the state standards have been rigorously vetted through empirical research and are informed by global standards for mathematics. In addition, the North Carolina Teachers of Mathematics (NCTM) standards and the North Carolina Common Core state standards are referenced to rigorous content that focuses on the application of higher-order skills.

Conclusion

On 3 June 2010, North Carolina adopted the Common Core State Standards, joining the first group of states to do so. The adoption is based on the understanding that if students are to develop deep mathematical understanding, they must move well past a follow-the-rules posture and make sense of what they are doing in math.

References

Common Core State Standards Initiative. [Webpage]. Retrieved October 19, 2012 from http://www.corestandards.org/

Curriculum and Focal…… [read more]


Psychology What Are the Similarities Research Paper

Research Paper  |  3 pages (947 words)
Bibliography Sources: 0

SAMPLE TEXT:

What are true experiments?

True experiments consist of more than a single purposively designed group, with random assignment and commonly measured outcomes. Ethnicity and sex cannot satisfy such requirements because it is impossible to manipulate them purposively. These designs occur only when a sample is chosen at random and assigned to program and comparison groups. If the researchers can perform the experiment using random assignment, the program is a true experimental design.

How are threats to internal validity controlled by true experiments?

Bias is a threat to internal validity. It is a primary source of error in results and measurements. Bias occurs when experimental items that favor one age, ethnic group, or gender are used. Bias is a serious threat to internal validity because it creates an alternative explanation for the results of the research conducted. True experiments can be used to control threats such as bias, largely by ensuring that the treatment groups are equivalent before the study begins. This helps the researcher control factors such as self-selection and regression toward the mean. In addition, true experiments can measure the variables that could be potential threats and thus control them statistically. As a result, threats to internal validity are minimized.

How are they different from experimental designs?

True experiments differ from other experimental designs in the way ethnicity, the population, and sex are handled in the design. Internal validity is threatened when the researcher attempts to influence the results; this implies that the researcher's judgment is partial and that he or she changes the variables in order to attain the desired results.

What are quasi-experimental designs?

These are research designs commonly used to evaluate educational programs when random assignment is not practical or possible.

Why are they important?

Researchers use quasi-experimental designs when they are unable to control the assignment of participants to conditions or when it is impossible to manipulate the variables. Instead, researchers make comparisons between variables in existing groups, or observe a group of participants that already exists before and after the occurrence of a quasi-independent variable.

How are they different from experimental designs?

Quasi-experimental research designs are mostly used to evaluate problems in education when random assignment is not practical or possible. These designs are prone to numerous interpretation errors even though they are commonly used. Experimental research designs are highly effective in addressing questions of evaluation about the usefulness and impact of a program. These designs emphasize the importance of comparative data as the basis for interpreting research findings. They increase researchers' confidence that the findings are the results of an innovation or program and not of an…… [read more]


Organizational Health Educational Institutions Essay

Essay  |  8 pages (2,709 words)
Bibliography Sources: 8

SAMPLE TEXT:

The refrain is often heard that not enough students graduate with degrees in STEM majors, and that students in America are not able to compete with students in Canada, Finland, and South Korea who score higher on their mathematics tests than do students in the U.S. ("Business Coalition," 1998; Hacker, 2012).

Section IV: A Learning Solution Proposal

The expectation that… [read more]


Descriptive Statistics Data Analysis Chapter

Data Analysis Chapter  |  2 pages (687 words)
Bibliography Sources: 2

SAMPLE TEXT:

¶ … Attitude Nature

Nature Stats Attitudes

Table 1 (Appendix I) displays the results of a survey in which students were asked to rank their attitude toward nature on an integer scale of 1-100. The results present a clear picture of these students' attitudes and also inform further questions about their outdoor activities in a number of ways. The results in Table 1 support future hypothesis testing because skewness and kurtosis are close to normal, which indicates that both parametric and non-parametric inferential statistics will likely be appropriate for describing and predicting correlations with other variables of interest; this is often not the case for samples this small (n=30).

Table 2, "Attitude About Nature" (Appendix II), lists the frequency of students' rankings of their attitude toward nature, against which other variables can be compared to see whether the resulting distributions are statistically significant -- that is, whether they occur more or less often than would be expected by chance at whatever significance level (alpha, the accepted risk of a Type I rather than Type II error) the experimenters choose, usually 0.05 or 0.01. No student answered below a value of 5, and since the top of the scale is 100, no student could answer above it; this bodes well for the normality of the data, since it limits the effect of outliers, and it shows up in the low skewness and near-normal kurtosis reported below. While the distance between the single "100" answer and the next-highest group of two "80" answers suggests the "100" could be an outlier introducing enough skew to require non-parametric inferential tests, a number of tests easily performed in SPSS will show how strongly that sole "100" result disturbs the normality of the rest of the data.
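A sketch of that kind of normality check done in Python rather than SPSS; the scores below are invented stand-ins, since the survey's raw data are not reproduced here:

```python
import numpy as np
from scipy import stats

# Invented attitude scores on a 1-100 scale, including one high value of 100
scores = np.array([5, 20, 25, 30, 35, 40, 40, 45, 45, 50, 50, 50, 55, 55,
                   60, 60, 60, 65, 65, 70, 70, 70, 75, 75, 75, 80, 80, 100])

for label, x in [("with 100", scores), ("without 100", scores[scores < 100])]:
    print(f"{label}: skew={stats.skew(x):.3f}, "
          f"excess kurtosis={stats.kurtosis(x):.3f}, "
          f"Shapiro-Wilk p={stats.shapiro(x).pvalue:.3f}")
# If skewness and kurtosis stay near zero and the Shapiro-Wilk p-value stays
# large either way, the lone high score is not distorting normality much.
```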

Figure 1 (Appendix III) shows these results displayed in a frequency histogram with a normal curve superimposed. The histogram reveals that the results were close to normal, with the highest frequency occurring around the median, so that the mean and median were very close -- a difference, as Table 1 reveals, of only one percent (Appendix…… [read more]


Science if Conducting an Experiment Term Paper

Term Paper  |  4 pages (1,339 words)
Bibliography Sources: 3

SAMPLE TEXT:

¶ … science if conducting an experiment that can allow the experimenter to make reasonable inferences about the material described. This paper describes different aspects of the experimental process. It discusses descriptive and inferential statistics; single-case and small-N research designs; true experiments and experimental designs; and quasi-experiments. It discusses the relative strengths and weaknesses of each experimental approach.

What are the similarities between descriptive and inferential statistics? What are the differences? When should you use descriptive and inferential statistics?

Descriptive statistics refers to methods that describe, show, or summarize data in a meaningful way (Lund Research Ltd., 2012). Descriptive statistics present the data, but they do not allow one to draw conclusions beyond it. In other words, descriptive statistics can be described as a way to organize raw data. There are two main types of descriptive statistics that are most relevant: measures of central tendency and measures of spread (Lund Research Ltd., 2012). Descriptive statistics are frequently summarized in tables, charts, and graphs, which make it easy to see the general results of a study. "Descriptive statistics are applied to populations and the properties of populations, like the mean or standard deviation, are called parameters as they represent the whole population (i.e. everybody you are interested in)" (Lund Research Ltd., 2012).

Inferential statistics is a means of translating descriptive statistics and trying to apply it to a large group, when one does not have access to an entire population. "Inferential statistics are techniques that allow us to use these samples to make generalizations about the populations from which the samples were drawn. It is, therefore, important the sample accurately represents the population. The process of achieving this is called sampling. Inferential statistics arise out of the fact that sampling naturally incurs sampling error and thus a sample is not expected to perfectly represent the population. The methods of inferential statistics are (1) the estimation of parameter(s) and (2) testing of statistical hypotheses" (Lund Research Ltd., 2012).

One would use descriptive statistics to present the information received from a specific population. Descriptive statistics are clear, but they only allow one to present information about those things that were actually measured. Inferential statistics have a margin of error, but allow the researcher to make conclusions about a broader group than was actually measured.

2. What are the similarities between single-case and small-N research designs? What are the differences? When should you use single-case and small-N research designs?

Single-case research design is a design frequently used in applied psychology, in which a subject serves as his or her own control. The goal of the single-case research design is to examine the impact of a variable on the subject. "Single-case research is idiographic rather than nomothetic" (Brogan, Unk.). There are several features of a single-subject design, including baseline assessment to determine the status quo before an intervention is applied, and continuous assessment to determine the impact of the intervention.

Many people use the term small-N research design interchangeably with… [read more]


Psychological Research Descriptive and Inferential Research Paper

Research Paper  |  4 pages (1,173 words)
Bibliography Sources: 3

SAMPLE TEXT:

A variable is then tested on the test group, while the control group remains unaffected. The results from the test group are then compared to the control group, which has not been involved in the actual testing, and the results are observed.

By using a "true experiment" design, the results are difficult to refute, but there are things that can invalidate a study. These are known as threats to internal validity, or "confounds that serve as plausible alternative explanations for a research finding." ("Threats to Internal Validity") There are a number of different threats that can serve as alternative explanations, including what is referred to as a subject's history, maturation, the effects of testing on a subject, the instrumentation used, chance, attrition, and even the selection of subjects for the test and control groups. Some of these threats can be minimized because a true experiment must be set up in a way that will do so -- for instance, making certain to randomize the individuals in both the test and the control groups, isolating the subjects from each other to prevent interaction, keeping the testing short to minimize the chance of boredom influencing the subjects' reactions, or testing only one simple variable. But despite these efforts, not all internal validity threats can be eliminated. There will always be the chance of contamination of the subjects, or interaction, rivalry, or competition between subjects. Subjects may lose interest in the study, become bored, or even grow resentful toward the researchers. There are also so-called expectancy errors, which are errors made by the experimenters in interpreting or analyzing data. And sometimes subjects may simply alter their view of things during the study, which may affect their responses.

4. Quasi-experiments

Quasi-experiments are an important alternative when true experiments are not possible. These types of experiments are similar to true experiments but lack the degree of control found in true experiments. Quasi-experiments usually lack a sense of randomness and are usually conducted in external environments, such as a work environment. There are usually at least two variables in quasi-experiments, the "quasi-independent variable," or the variable that is being tested, and the "grouping variable," which is similar to a control group but with a variable that has a predictable result. This predictable result will serve as the base to compare the results of the quasi-independent variable against.

These types of experiments are important when a true experiment is not possible, or when certain variables of the experiment cannot be controlled. It is also important when experimenting necessitates that the experiment be conducted outside a controlled laboratory environment. And since quasi-experiments are natural experiments, their conclusions may be applied to other subjects and settings, and used to make generalizations about an entire population. While the results of true experiments may be limited to the population studied, quasi-experimental results can be expanded to cover more than just those involved in the study. This is their real value, being used to draw larger conclusions about larger populations.… [read more]


Correlation and Regression Data Analysis Chapter

Data Analysis Chapter  |  3 pages (884 words)
Bibliography Sources: 1

SAMPLE TEXT:

SPSS Statistics: Correlation & Regression

Correlation & Regression

Is there a relationship between defect rate and volume? If so, is it positive or negative?

Yes, there is a relationship between defect rate and volume. The relationship is positive, such that as volume increases, so does the defect rate (.740).

Which variable is the independent and which is the dependent variable?

The independent variable (predictor) is the volume of production, and the dependent variable is the defect rate (outcome).

Write out the regression equation and sketch it on the plot.

Predicted score = Bslope X + Bconstant

Predicted score = 0.027(X) + (-97.073)

Based on a review of the plot provided, and examining two points -- volumes of 4000 and 4400, which appear to correspond to defect rates of roughly 10% and 20.7%, respectively -- the slope can be calculated as

slope = (20.7 - 10) / (4400 - 4000) = 10.7 / 400 ≈ 0.027

Thus, the regression equation would be:

Predicted Score = 0.027(X) - 97.073

54% of the variability in defect rate can be explained by differences in volume (r = .740, so R² ≈ .548).

5. What defect rate would you predict for a shift with a volume of 4000 units?

Defect Rate = 0.027(4000) - 97.073

= 10.927

6. What defect rate would you predict for a shift with a volume of 9000 units?

Defect Rate = 0.027(9000) - 97.073

= 145.927

7. Would you expect all shifts that produced 4000 items to have the same defect rate?

No. There can still be variance.

8. What would you estimate the standard deviation of the distribution of the defect rate to be for a volume of 4000 units?

The standard deviation of the intercept is 7.819

The standard deviation of the slope is .002

The Standard Error of the Estimate is 4.92, which serves as the best single estimate of the standard deviation of defect rates among shifts producing 4000 units.

9. If a particular shift produced 4000 items and had a defect rate of 10%, based on the regression model what would be the residual for the shift?

-.927, as the actual defect rate is .927 below the predicted defect rate based on this model.
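The arithmetic behind these answers can be sketched directly from the reported coefficients (slope 0.027, intercept -97.073); the snippet below simply re-uses the fitted model rather than re-estimating it from the raw SPSS data, which are not reproduced here:

```python
# Fitted model reported above: defect_rate = 0.027 * volume - 97.073
SLOPE, INTERCEPT = 0.027, -97.073

def predicted_defect_rate(volume):
    """Predicted defect rate (%) for a given production volume."""
    return SLOPE * volume + INTERCEPT

for volume in (4000, 9000):
    print(f"volume {volume}: predicted defect rate = {predicted_defect_rate(volume):.3f}%")

# Residual for a shift that produced 4000 items with an actual defect rate of 10%
actual = 10.0
residual = actual - predicted_defect_rate(4000)
print(f"residual = {residual:.3f}")   # about -0.927, matching the answer above
```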

Question 11B

1. Yes there appears to be a linear relationship between husband and wife's education.

2. The relationship between husband and wife's education appears to be positive, such that as one increases, so does the other.

3. The slope is .620 -- such that for every unit of increase in husband's education, the wife's education increases by .62.

4. The correlation coefficient (beta) is .561.

5. There are a few outliers on the scatterplot. In most of these cases they represent husbands who have higher educations than their wives.

Question 11C

1. Husband's education = .620(X)+5.341

31.4% of the variability in husband's education can be explained by wife's education.

2. Husband's education = .620(13)+5.341…… [read more]


Mathematics for Elementary Educators Essay

Essay  |  3 pages (1,054 words)
Style: APA  |  Bibliography Sources: 1

SAMPLE TEXT:

¶ … globalization and the structures of testing in the "No Child Left Behind Initiative," it is becoming even more important that K-8 teachers be prepared to teach basic concepts of mathematics that adhere not only to their individual state standards but also to a rigorous, diverse, and multicultural community. In the same way that a basic level of literacy is required before pursuing upper levels of schooling, certain mathematical constructs are vital in today's complex world of computerization, science, and the synergistic approach to many core courses. For too many people, mathematics stopped making sense somewhere along the way. Either slowly or dramatically, they gave up on the field as hopelessly baffling and difficult, and they grew up to be adults who -- confident that others share their experience -- nonchalantly announce, "Math was just not for me" or "I was never good at it" (Askey, 1999, 4).

There are four basic concepts covered in the course that particularly address the issue of relevancy in mathematical pedagogy: Mathematical Standards and Practices, Algebraic Thinking and Problem Solving, Numeration Systems and Number Theory, and Rational Numbers and Applications.

Mathematical Standards and Processes -- The National Council of Teachers of Mathematics, an international organization of teachers who are focused on improving the math curriculum globally, presented new standards in 2000 designed to improve curricula, teaching and assessment. Within their rubric, six principles were established to address themes that were valid regardless of the school culture:

Equity -- There must be high expectations and support for excellence in math education from all levels; teachers, administrators, school boards, and parents.

Curriculum -- More than a collection of problems or activities, a math curriculum should be focused, well-articulated, and flow from grade to grade.

Teaching -- Appropriate and effective math teaching requires not only an understanding of math principles but of what students need to understand, and how that should be effectively communicated to them.

Learning -- Students must learn math in a synergistic, step-by-step process -- each previous module must give them the tools needed to move forward and actively build a knowledge base.

Assessment -- Assessment should support the learning aspect of math and be appropriate as a tool for understanding student needs; not simply as something easy to grade.

Technology -- Adapting technology is absolutely essential in learning mathematics (NCTM, 2009).

In addition to these overall principles, five more detailed standards and expectations were identified:

Problem Solving -- Building new knowledge through problem solving in math and other disciplines that involve mathematic calculations. Be able to apply and adapt problem solving skills.

Reasoning and Proof -- Establish an initial understanding of, and a rubric for, reasoning out a problem; make and investigate mathematical conjectures; develop and evaluate mathematical arguments; and use appropriate levels of reasoning for different problems.

Communication -- Be able to communicate mathematical principles, equations, and solutions clearly, both verbally and in writing. Analyze the mathematical thinking of peers and others and use the language of math to express computational ideas.

Connections -- Understand the relevancy and… [read more]


Carl Friedrich Gauss Research Proposal

Research Proposal  |  2 pages (598 words)
Style: APA  |  Bibliography Sources: 3

SAMPLE TEXT:

Carl Friedrich Gauss

This is a template and guideline. Please do not use as a final turn-in paper.

Biography

Gauss, a German mathematician and scientist, was born in 1777. His contributions range over many fields, including geophysics, electrostatics, optics, astronomy, statistics, the theory of numbers, differential geometry, and more. His nickname was the "Prince of Mathematicians" because of his outstanding impact on so many fields of math and science, and he is regarded as one of the most influential mathematicians in history. At the age of 21, he wrote Disquisitiones Arithmeticae, a work that became fundamental in making the theory of numbers a discipline; it is still used today. While still in college, at the age of 19, he rediscovered a number of quite significant mathematical theorems and invented modular arithmetic. In 1801, astronomers discovered a small planet, Ceres, but lost it in the heavens; using mathematics, Gauss correctly predicted where it could be relocated, and it was rediscovered. This began his path toward becoming director of the astronomical observatory in Gottingen, a position he held and cherished for the rest of his life (O'Connor & Robertson, 1996, para. 7). He invented the heliotrope and discovered the potential of non-Euclidean geometry, which eventually led to the research that allowed Einstein to create his theory of general relativity. In 1831, he worked with physics professor Wilhelm Weber to study magnetism and constructed the first electromagnetic telegraph (Bell, 1986, p. 255). Gauss also developed a method of measuring the intensity of the earth's magnetic field. Gauss died in 1855.

Main Contribution

Because it is impossible to single out one contribution as Gauss's foremost, we can identify four areas of contribution and focus for Gauss (Encyclopedia of World Biography, 2005):

In his Disquisitiones arithmeticae he addressed the area of quadratic residues and his own discovery of…… [read more]


Mathematic v. Conceptual Modeling Limitations of Models Thesis

Thesis  |  1 pages (342 words)
Style: APA  |  Bibliography Sources: 2

SAMPLE TEXT:

Mathematic v. conceptual modeling

Limitations of Models

Mathematical models are often the most straightforward and simple forecasters of future outcomes, but they have severe limitations as well. Not only do most mathematical models contain a certain degree of uncertainty or risk, but there is also the risk of the model itself failing (Kay 2006). Mathematical models are unable to cope with non-quantifiable input, and thus are limited both in their use and by the increased risk that a key factor has been overlooked within the model itself (Kay 2006). Conceptual models are inherently adaptable: they are better able to account for the complexities of the real world and less fixed in their operations (Aspinall 2007). A conceptual model can often serve as a starting point for interaction between the model's user and the available information, allowing the model to be adjusted and to remain effective when situations change, whereas mathematical models often have to be scrapped in their entirety when information or situations change (Aspinall 2007).

It has been said that…… [read more]


What's Math Got to Do With it by Jo Boaler Research Paper

Research Paper  |  5 pages (1,598 words)
Bibliography Sources: 1+

SAMPLE TEXT:

¶ … Math Got to Do With it? By Jo Boaler

Boaler, Jo. What's Math Got to Do With It? Helping Children Learn to Love Their Least

Favorite Subject -- and Why It's Important for America. New York: Viking, 2008.

Very often, students will whine in math class: 'when will we ever use this in real life?' This explains the… [read more]


Professional Mathematical Societies Thesis

Thesis  |  3 pages (924 words)
Style: APA  |  Bibliography Sources: 4

SAMPLE TEXT:

Mathematics

Professional Mathematical Societies

The American Mathematical Society, which was founded in 1888 to further mathematical research and scholarship, today fulfills its mission through programs and services that promote mathematical research and its uses and that strengthen mathematical education. It fosters awareness and appreciation of mathematics and its connections to other disciplines and to everyday life. The Society currently has over 32,000 individual members and 550 institutional members in the United States and around the world. Its programs and services for members and the mathematical community include professional programs such as meetings and conferences, surveys, and employment services. Its publications include Mathematical Reviews, journals, and over 3,000 books in print (About the AMS, 2009).

The Mathematical Association of America is the largest professional society that focuses on mathematics at the undergraduate level. It began as a publication, the American Mathematical Monthly, founded in 1894 by Benjamin Finkel. When it became more than just a monthly publication, its structure was more that of a club, and its main purpose remained the publication of the Monthly. There was one standing committee, the Committee on Sections, which is still the only committee mandated by the bylaws today. The MAA has grown tremendously over the last hundred years into a complex organization with 27,000 members. It is governed by a 50-person Board of Governors with a nationally elected president and two vice presidents. It currently has three peer-reviewed journals, a student magazine and a newsletter, an online digital library, and a highly regarded book publication program (Straley, 2009).

The National Council of Teachers of Mathematics is an organization that strives to be the public voice of mathematics education. It offers vision, leadership, and professional development to support teachers in ensuring equitable mathematics learning for all students. The NCTM Board has adopted the following priorities on which the organization is run. 1. It provides guidance and resources for establishing and carrying out a mathematics curriculum that is coherent, focused, well articulated, and consistent with Principles and Standards for School Mathematics. 2. It develops and actively promotes a culture of equity in every aspect of mathematics education. 3. It engages in political and public advocacy to focus decision makers on improving the learning and teaching of mathematics. 4. It seeks to advance professional development by creating a coherent framework of audience-specific products and services. 5. It strives to bring existing research into the classroom and to identify and encourage research that addresses the needs of classroom practice (Mission and Goals, 2007).

The Society for Industrial and Applied Mathematics is an international organization of professionals, incorporated in 1952, with an interest in mathematics…… [read more]


Statistics Allowable With Nominal, Ordinal and Interval Thesis

Thesis  |  4 pages (1,160 words)
Style: APA  |  Bibliography Sources: 3

SAMPLE TEXT:

¶ … statistics allowable with nominal, ordinal and interval scales.

Nominal is a counting operation, and its descriptive statistics are "frequency in each category, percentage in each category, mode." Ordinal is a rank ordering, and its descriptive statistics are "median, range, percentile ranking." Interval involves arithmetic operations on the intervals between numbers, and its descriptive statistics are "mean, standard deviation, and variance." Understanding descriptive statistics requires looking specifically at the type of data being described. Nominal scales only place numeric labels on non-quantitative concepts; for example, dogs have the value "1" and cats the value "2." Many categories or groups, such as racial group and gender, are nominal. In some research, for instance, the study counts the number of individuals who are in a specific category, such as living in a designated city. Ordinal scales rank values in a way that compares one to another, with a highest and a lowest; an example is the tallest and shortest children in a school. It is not possible to perform meaningful arithmetic on ordinal data. Interval scales assign specific values so that the intervals are equal, such as a six-point attitudinal scale; in this case, mathematical operations can be performed. Ratio scales are interval scales with a true zero point, such as weight or the number of items in a room, so ratios can be determined.

Difference between validity and reliability. The purpose of conducting a study is to come up with accurate measurement results. This is why research must be both reliable and valid; the two are interrelated. Reliability is the consistency of the measurements, or how well the study can be repeated: does the same measurement yield the same results when repeated? Reliability cannot be calculated, only estimated. Validity is whether the test is measuring what it is intended to measure. If, for example, researchers are measuring a table that is six feet wide, they measure the table with a measuring tape and find it is six feet. They measure it again and again and consistently get six feet. The tape measure is yielding reliable results. The tape measure is marked in inches and feet, so it should also yield valid results. If the researchers measure the table with the "right" tape measure, it should yield a correct measurement of the table's width. In other words, when conducting research, it is necessary to use measurement tools that yield consistent responses time after time and that elicit accurate responses from the participants.

Difference between conceptual and operational definition. Conceptual definitions define a concept with the use of other concepts, which makes measurement difficult. An operational definition specifically identifies at least one observable condition or event, so the researcher knows how to measure that condition or event. The operational definition must be reliable and valid. For instance, if a researcher wanted to know about a person's enjoyment of his or her job, the conceptual definition would reflect interest in enjoyment and satisfaction… [read more]


Blaise Pascal Biography Thesis

Thesis  |  8 pages (2,266 words)
Style: MLA  |  Bibliography Sources: 6

SAMPLE TEXT:

Blaise Pascal Bio

Blaise Pascal's Biography

Blaise Pascal was a French mathematician, physicist, and religious philosopher. As a person, Pascal combined different qualities in a nearly inconsistent manner. He held a position of basic skepticism, unlike Descartes, who employed radical philosophical doubt only to secure a firm foundation for his philosophy.… [read more]


Statistics for Social Sciences Correlation Term Paper

Term Paper  |  1 pages (346 words)
Style: APA  |  Bibliography Sources: 1

SAMPLE TEXT:

Statistics for Social Sciences

Correlation

This assignment was designed to help students 1) develop a deeper understanding of the purposes of correlational techniques and 2) become more familiar with hand-computed and computer-based correlational analyses.

Use the smoking data provided to complete the following steps. NOTE: you can copy and paste the data into SPSS if you use the electronic word file of this assignment I have sent. Show all your computational work.

Hand compute the correlation between the number of years smoking (YR.SMOKE) and the number of cigarettes smoked per day (CIG.DAY).

Hand calculate the correlation between the number of cigarettes smoked per day and the level of carbon monoxide expired (CO.LEVEL).

Report both correlation coefficients and describe the strength and direction of each in words (based on Tables 5.1 and 5.2 in text).

d. Calculate the coefficients of determination and alienation for both correlations and explain what the numbers mean.

e. Draw two Venn-like diagrams similar to those in Figure 5.5 (p. 90) to demonstrate the coefficient of determination for both correlations.…… [read more]


Neuman ), Researchers Frequently Need to Summarize Term Paper

Term Paper  |  5 pages (1,578 words)
Style: APA  |  Bibliography Sources: 5

SAMPLE TEXT:

¶ … Neuman (2003), researchers frequently need to summarize information concerning one variable into a single number for which they use a measure of central tendency. Measures of central tendency are those descriptive statistics that describe the point or points about which a distribution centers. This paper provides a description of the three measures which are used to describe central… [read more]


Aristotle and His Contribution to Mathematics Term Paper

Term Paper  |  2 pages (685 words)
Style: MLA  |  Bibliography Sources: 3

SAMPLE TEXT:

¶ … Aristotle and his contribution to mathematics and mathematical concepts. Specifically it will discuss his life and contributions, including other mathematicians he worked with or influenced. Aristotle, one of the greatest philosophers and mathematicians of all time, lived from 384 B.C. to 322 B.C. He was born in Macedonia and spent most of his adult life in Greece, first as a student of Plato and then as a teacher and philosopher. He also lived on the island of Lesbos for a time and served for a period as tutor to Alexander the Great. He also tutored Eudemus of Rhodes, who wrote a history of geometry, and Theophrastus of Lesbos (Lane). He died at the age of sixty-three in Chalcis, after being exiled from Greece for being "anti-Greek."

Aristotle is not thought of primarily as a mathematician, but rather a philosopher and biologist or scientist. In fact, many historians believe he actually left the Academy of Plato because he placed too much of an emphasis on mathematics in his instruction. Plato did influence many of his philosophies, however, which means he at least indirectly influenced his theories on logic.

However, Aristotle did contribute greatly to mathematics, particularly in the areas of deductive logic and geometry. One of the most famous theories he offered to geometry is that of triangles in circles: a triangle drawn in a semicircle is a right triangle, and this is always the case. It is one of his best-known geometric theories, and one that many people consider the most valuable, because it helps establish the "logical" rules that define this area of mathematics. Logic was perhaps his greatest contribution to mathematics, because it made the science of mathematics more effective and easier to understand.

Aristotle wrote heavily on logic and on how to apply logic to the sciences, such as mathematics. He set out his theories in the "Organon," which contained six different treatises on logic. One writer notes, "'Organon' is the Greek word for 'tool,' and this title expresses the idea that these six…… [read more]


George Polya Term Paper

Term Paper  |  4 pages (1,317 words)
Style: MLA  |  Bibliography Sources: 3

SAMPLE TEXT:

George Polya

The Hungarian mathematician, George Polya, is hailed by many as not only one of the greatest mathematicians, but also a great teacher of his time. It is interesting that his early school career did not mark a very high interest in the field, however. Later, when faced with choice, his mother encouraged him to take a career in law like his late father. When examining his biography, the reader becomes aware of Polya's extraordinary ability to face and overcome difficulty in order to attain his dreams. This trait, as will be seen, was something his father also possessed.

Polya's parents, Anna and Jakab, were both Jewish. Jakab's original surname was in fact Pollak, but he changed it for the sake of his professional goals. After his law firm failed, he worked for an international insurance company. However, Jakab's dream was to obtain a research post at a university and pursue his true interests, economics and statistics. It appears therefore that George inherited not only his father's tenacity but also his interest in numbers. In 1882 Jakab Polya was finally appointed as Privatdozent at the University of Budapest.

George's parents converted to the Roman Catholic faith in 1886, a year before his birth, and he was subsequently baptized in the Roman Catholic Church. George grew up in a home with four other children, three of whom were older than himself and one younger. Jeno, who was the eldest, loved mathematics, but pursued medicine, distinguishing himself in this field as prominently as George did in mathematics. Laslo, the youngest, was considered the brightest of the children, but was killed in World War I before having the opportunity to distinguish himself.

During his schooling at the Daniel Berzsenyi Gymnasium, George studied languages, biology, mathematics, geography, and other required subjects for young children. His favorites were biology and literature, where he received "outstanding grades."

As mentioned above, Polya was not greatly interested in mathematics during his early school career. Many critics ascribe this to the quality of teaching he received in this field. Indeed, he described two of the three mathematics teachers at the Gymnasium as "despicable." His grades were also not particularly high, although he did well in arithmetic.

By the time Polya enrolled at the University of Budapest in 1905, his brother Jeno was a surgeon and could support his study efforts financially. Although at first pursuing the study of law as his mother wished, George found it extremely boring and gave up after only one semester. After this, he changed direction to languages and literature for two years, gaining a certificate for his trouble. Polya was then interested in pursuing philosophy, but was advised to take physics and mathematics before attempting so complicated a subject.

This finally put him on the path to what would become a distinguished career.

Polya studied at the University of Vienna during 1910-11, and attended mathematics lectures by Wirtinger and Mertens. During this time, he also continued pursuing his interest in physics by… [read more]


Reaction to Proof Things Term Paper

Term Paper  |  1 pages (386 words)
Bibliography Sources: 1

SAMPLE TEXT:

Mathematical Proofs: A middle school mathematics teacher seems at first to gain little from absorbing an article like Kleiner & Movshovitz-Hadar's "Proof: A Many-Splendored Thing." However, the authors' explication of the origin of mathematical truth-finding and the changing role of the proof in mathematics reveals several key points that can be incorporated into general math classrooms. For example, Kleiner & Movshovitz-Hadar discuss the confluence of philosophy and math throughout history, pointing out especially their shared use of logic and dependence on successive logical proof in explaining their discoveries to colleagues. Similarly, junior high students may be able to appreciate, if not the details of the Enormous Theorem, at least the process of thinking that underlies mathematical proof. The gap between the seemingly abstract world of theoretical math and everyday reality may not be as great as we all think. Students of math can especially benefit from a deeper understanding of the proof for the satisfaction it can bring. Mathematics is not only about measurements, calculations, and counting. Rather, mathematics forms the building blocks of rational thought: our work is about process and proof.

With a firm foundation in logic, mathematics cannot be…… [read more]


Individual and the Culture Term Paper

Term Paper  |  2 pages (651 words)
Bibliography Sources: 1+

SAMPLE TEXT:

Mathematics

How Mathematics Grows: The Role of the Individual and the Culture

Influences the Course of Mathematical Discovery

There are many influences on the course of mathematical discovery. The dominant forces influencing mathematical discovery are the individual and the culture, as pointed out by the reading. There is as much subjectivity in mathematical work as there is science, which is part of the reason so much of mathematics is ill understood and often misinterpreted. As the reading points out, those not well versed in mathematics or those with little training often have an imperfect notion of how it operates and how solutions or mathematical proofs must be derived. Often what seems correct is later found incorrect, and vice versa. This is evidenced by the example in the reading, whereby Professor Hans Rademacher of the University of Pennsylvania, a leading theoretician at the time, mistakenly believed he had solved the Riemann Hypothesis (p. 60), only to be disproved later.

The individual and the culture directly influence mathematical discovery. There isn't really a dichotomy between the individual and culture; they simply influence mathematics differently. For example, an individual who is brilliant and capable of great mathematical feats may at the same time think outside the scope of what the culture he or she lives within may consider "ordinary." Thus this person's brilliance or folly will either be embraced or rejected depending on its fit with the culture at the time the solution or hypothesis is presented. The reading, for example, points out the case of Hermann Grassmann, whose work is today considered genius, but who during his time was considered obscure and mystical because the work was not in line with the culture Grassmann grew up in.

The creation and practice of mathematics are not the same. The practice of mathematics is more aligned with cultural norms and with what is considered acceptable practice at the time the mathematics is done. Creation, however, may involve the extremes of an individual, or an individual's ability…… [read more]


Bio-Statistics Research Activities, Whether Clinical Term Paper

Term Paper  |  5 pages (2,419 words)
Bibliography Sources: 1+

SAMPLE TEXT:

In other words, the authors did not build a medical or healthcare-based paradigm for the study. Following a well-defined research question, the research investigators' task is to follow up with a statement of a testable null hypothesis or hypotheses. The null form of the hypothesis is required in order for the proper application of a statistical data analysis tool to be… [read more]


Mathematics George Cantor Term Paper

Term Paper  |  2 pages (673 words)
Bibliography Sources: 1+

SAMPLE TEXT:

When people began to understand that mathematics could influence everything from photography to art and science, they took a greater interest in mathematics and philosophical thought, which in turn led to even more innovation and scientific thinking. In fact, today, many scientists and mathematicians feel Cantor's work represented a real paradigm shift, or a radical change in mathematical and philosophical thought ("Cantor"). Two of his lasting models that showed his theories were "Cantor's Comb," which showed all points were disconnected from each other, and "Cantor's Dust," which calls these disconnected sets "fractal dust" (Breen). Cantor's work really revolutionized mathematics, and encouraged people to think philosophically about just what numbers really were.

Unfortunately, Cantor's mental health deteriorated as he aged, and he had several nervous breakdowns in reaction to criticism of his work. Today, many believe he suffered from bipolar disorder ("Cantor"). It is sad to think that Cantor may have contributed even more to the mathematical world had he not suffered from mental disorders, as he often stopped working during his bouts with depression.

Another German mathematician, David Hilbert, described Cantor's work as "the finest product of mathematical genius and one of the supreme achievements of purely intellectual human activity" (O'Connor and Robertson). Cantor's theories changed the way mathematicians thought about infinity and sets, and translated into many areas of society. Cantor's work is still questioned and studied today, but the importance of his theories is not questioned.

References

Author not Available. "Georg Cantor." Fact-Index.com. 2004. 13 April 2004. http://www.fact-index.com/g/ge/georg_cantor.html

Breen, Craig. "Georg Cantor Page." Personal Web Page. 2004. 13 April 2004. http://www.geocities.com/CollegePark/Union/3461/cantor.htm

Everdell, William R. The First Moderns: Profiles in the Origins of Twentieth-Century Thought. Chicago: University of Chicago Press, 1997.

O'Connor, J.J. and Robertson, E.F. "Georg Cantor." University of St. Andrews. 1998. 13 April 2004. http://www-gap.dcs.st-and.ac.uk/~history/Mathematicians/Cantor.html

"Transfinite Number." Van Nostrand Company, Inc. Van Nostrand's Scientific Encyclopedia. Princeton: Van Nostrand, 1968.… [read more]


Germane Quality of Mathematics Research Paper

Research Paper  |  2 pages (643 words)
Bibliography Sources: 1+

SAMPLE TEXT:

Mathematical puzzles are a longstanding facet of mathematics that have numerous applications in the world today. In this respect, there is a significant amount of fascinating information regarding this element of mathematics. This document will concentrate on several different facets of mathematical puzzles, beginning with their history, which extends back nearly as far as the history of mathematics itself. It will also detail some of the actual mathematical principles at work in examples of some mathematical puzzles. Additionally, the paper will provide real-world examples of how mathematical puzzles have shaped society at various points in time. Cumulatively, these three points will attest to the immense importance ascribed to mathematical puzzles in the past and present.

In researching the history of mathematical puzzles, it is nearly impossible to distinguish that history from the history of mathematics itself. Some sources date the history of these puzzles to at least 1800 BCE and their deployment by the Egyptians (Kent, N.D.). Interestingly enough, there are numerous principles of mathematics that are directly descended from the Egyptians themselves, which helps to buttress the viewpoint that math-based puzzles coincided with the history of mathematics in general. Moreover, there is evidence of an Egyptian mathematics puzzle dating back roughly 3,600 years (to about 1650 BCE) that is strikingly similar to the riddle about the man going to St. Ives with seven wives (NY Times). This Egyptian text was preserved on papyrus, which presages the notion of using textbooks for math puzzles. These puzzles have descended into modernity orally (such as in chants and riddles) and through the formal implementation of textbooks. Midway through the 20th century, non-cooperative math games provided the basis of John Nash's game theory.
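The sevens-style riddle mentioned above is, at bottom, a geometric series. As a rough illustration, the five levels and the factor of seven below follow the familiar St. Ives pattern and are assumptions for illustration, not figures taken from the Egyptian text itself.

```python
# Rough sketch of the sevens-style puzzle as a geometric series.
# Assumption: five levels (e.g., houses, cats, mice, ears of wheat,
# hekats of grain), each seven times as numerous as the previous level.
levels = [7 ** k for k in range(1, 6)]   # 7, 49, 343, 2401, 16807
total = sum(levels)

print(levels)   # [7, 49, 343, 2401, 16807]
print(total)    # 19607, the same as (7**6 - 7) // (7 - 1)
```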

The mathematics of math games is actually fairly diverse, and largely hinges upon which particular math game one is playing. Still, there are some general principles that apply to most of these games. For instance, cardinality is typically important in math games. Cardinality is…… [read more]


Different Components of Statistical Testing Essay

Essay  |  2 pages (830 words)
Bibliography Sources: 2

SAMPLE TEXT:

Statistics in Research: Different Factors to Consider

Statistics in research take two primary forms: descriptive and inferential. Descriptive statistics, as the name suggests, merely seek to describe a particular phenomenon as it exists numerically. Examples of descriptive statistics include determining the mean, median, mode or midrange of a particular set of figures or establishing a correlation between two sets of data (Taylor 2015). Presenting statistics in a graph is also considered descriptive in nature (Taylor 2015). Inferential statistics, in contrast, are used when it is impossible to assess data about an entire population group. "It is typically impossible or infeasible to examine each member of the population individually. So we choose a representative subset of the population, called a sample" (Taylor 2015). A good example of this is polling after an election: since it is impossible to accumulate data about all of the voters, a demographically representative group of voters may be polled after they vote. Multiple measurements are often taken in the case of inferential statistics to ensure greater accuracy.

Another distinction with regard to statistical findings is the question of statistical significance. All statistics contain some margin of error. Statistical significance means that, given the sample size and the probability of error, the computed difference is unlikely to be due to chance alone. For example, "a difference of 3% (58% for women minus 55% for men) can be statistically significant if the sample size is big enough" ("Statistical vs. practical significance," 2015). However, merely because a finding is statistically significant does not necessarily mean it is practically significant. Practical significance means that the finding is notable enough that it will have a material impact upon decision-making in the real world. Factors may include cost, feasibility, and the extent to which the intervention would have a meaningful and demonstrable effect on the quality of participants' lives, given the size of the effect ("Statistical vs. practical significance," 2015).
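The interplay between sample size and significance described above can be sketched with a simple pooled two-proportion z-test; the group sizes below are hypothetical, chosen only to show how the same 3-point gap moves from non-significant to significant.

```python
import math

def two_prop_z(p1, p2, n1, n2):
    """Two-sided, pooled two-proportion z-test; returns (z, p-value)."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# The same 3-point gap (58% vs. 55%) at two hypothetical sample sizes:
print(two_prop_z(0.58, 0.55, 100, 100))    # p ~ 0.67: not significant
print(two_prop_z(0.58, 0.55, 5000, 5000))  # p ~ 0.0025: significant
```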

There are two major types of errors in statistical analysis: Type I and Type II (Hopkins 2013). A Type I error occurs when a true null hypothesis is rejected -- for example, when an overly sensitive study over-estimates the magnitude of an effect, which often happens without appropriate use of a control; a Type II error occurs when a real effect of the intervention goes undetected or is underestimated (Hopkins 2013). A Type II error often occurs when too small a sample size is selected (Hopkins 2013). Another type of error is that of bias, whether unintentional or intentional on the part of the study design (Hopkins 2013).…… [read more]


SPSS Data Analysis Research Paper

Research Paper  |  3 pages (827 words)
Bibliography Sources: 3

SAMPLE TEXT:

However, to determine the strength of this relationship a Pearson's product-moment correlation coefficient (r) can be calculated for these two variables. Based on the SPSS results, there is a very strong, statistically significant correlation between hours and scores [r(18) = .967, p < .01, two-tailed]. The percentage of the variation in the dependent variable associated with the independent variable is also very high (r2 = .934), which suggests that the average number of hours studied per week may account for 93.4% of the variation in final exam scores; however, a correlation cannot determine causality, only that there is a strong association between the two variables.
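For readers who want to reproduce this kind of analysis outside SPSS, a minimal sketch using hypothetical hours-and-scores data (not the observations from the study described above) is:

```python
from scipy import stats

# Hypothetical hours-studied and final-exam data, for illustration only;
# these are not the observations from the study described above.
hours  = [2, 4, 5, 6, 8, 9, 10, 12, 14, 15]
scores = [52, 60, 61, 65, 70, 72, 74, 79, 85, 88]

r, p = stats.pearsonr(hours, scores)   # correlation and two-tailed p-value
print(round(r, 3), round(p, 4))
print(round(r ** 2, 3))                # r-squared: shared variation
```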

There are a number of potential ethical considerations concerning how the data were collected in this study. Of primary concern was the possibility that knowledge of the average hours studied could influence final grades for the course, since the professor collected these data at the end of the semester during the final examination. While there is still a potential ethical concern, the assumption is that the final exams were graded before the data were viewed and analyzed. Even so, most students would expect the hours data and final exam scores to remain confidential during the grading period.

Regression

The criterion variable (Y) is the dependent variable, which in this study was the final exam score. The predictor variable (X) is the independent variable, or in this case the average hours of study per week. The form of the basic linear regression equation is Y = β0 + β1X1 + β2X2 + … + βnXn + ε, with β0 representing the Y-intercept, β1 through βn representing the slopes, and ε representing the error of prediction. The values of the dependent variable can therefore be estimated by the regression equation: Y = 47.918 + 2.619*X, with β1 = 2.619, t(18) = 16.014, p < .0001. For example, if a student wanted to know what grade they might get if they studied 15 hours a week, then he or she would substitute 15 hours for X in this equation and get a predicted final exam score of about 87. The accuracy of this prediction depends on the amount of variation in the data, which is the difference between the best-fit line and the observed values (Y − Y' = ε) and is summarized by the standard error of the estimate (SEE = 3.842). These calculations allow students and professors to predict how much studying must occur on…… [read more]
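As a quick check of the worked example above, the fitted equation can be evaluated directly; the coefficients are those reported in the text, and the function name is merely illustrative.

```python
# Evaluating the fitted line reported above: Y = 47.918 + 2.619 * X.
# The function name is illustrative; the coefficients come from the text.
def predict_score(hours_per_week):
    return 47.918 + 2.619 * hours_per_week

print(round(predict_score(15), 1))   # ~87.2, matching the worked example
```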


Statistics in Criminal Justice Discussion and Results Chapter

Discussion and Results Chapter  |  2 pages (517 words)
Bibliography Sources: 2

SAMPLE TEXT:

M8D1

Questions like this one make me wonder whether I should give an honest response or give the response that I believe is being sought by the question. In all honesty, my Minimal Statistics Baseline (MSB) is zero. I am honestly not committing to any MSB from this point forward; if I happen to be able to live without ever using statistical analysis again, then I will not be taking any action to attempt to incorporate statistics into my life. I do not intend to read newspaper articles with the purpose of understanding the statistics or look up research articles at a library for the purposes of understanding the statistical analysis.

However, while I have no intention of taking steps to have any type of MSB because of an intentional focus on statistics, I am well aware that I need to use and understand statistics to be able to function as an informed adult. Thinking about election season and the apparently at-odds poll results that are always being touted to support different issues, I realize that understanding how that data was obtained and analyzed is critical to being able to understand that information. As a parent, my child will take standardized tests, and I will need to understand basic statistics in order to understand what test scores mean. If I am ill and looking at potential therapies, I will need to understand the relative advantages and disadvantages of different treatment modalities, and understanding those requires understanding statistics. Depending on where my career takes me, I may find myself…… [read more]


Normal Distribution Central Limit Theorem and Point Estimate and an Interval Term Paper

Term Paper  |  3 pages (918 words)
Style: APA  |  Bibliography Sources: 3

SAMPLE TEXT:

Normal distribution is very much what it sounds like. This distribution is symmetrical and is shaped like a bell when graphed on the Cartesian plane. In the normal distribution, the mean, the median, and the mode are all located at basically the same place on the distribution. This occurs at the peak, and the frequencies gradually decrease at both ends of this bell-shaped curve.

Unfortunately this is simply a model for looking at a problem, and no definite predictions can be made with this or any other statistical tool; however, the model does have real practical value. Many things in life follow this model and are normally distributed, offering at least a guide to how to best understand and predict behavior mathematically using statistics.

Suppose X is normal with mean μ and variance σ². Any probability involving X can be computed by converting to the z-score, where Z = (X − μ)/σ. E.g.: If the mean IQ score for all test-takers is 100 and the standard deviation is 10, what is the z-score of someone with a raw IQ score of 127? The z-score defined above measures how many standard deviations X is from its mean. The z-score is the most appropriate way to express distances from the mean. For example, being 27 points above the mean is impressive if the standard deviation is 10, but not so impressive if the standard deviation is 20 (z = 2.7 vs. z = 1.35).
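The z-score conversion is easily expressed as a small function; the numbers below reproduce the IQ example from the text.

```python
def z_score(x, mu, sigma):
    """How many standard deviations the raw score x lies from the mean."""
    return (x - mu) / sigma

# The IQ example from the text: mean 100, standard deviation 10, raw 127.
print(z_score(127, 100, 10))   # 2.7
# The same 27-point gap is less unusual when the spread is larger:
print(z_score(127, 100, 20))   # 1.35
```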

Question 2

The central limit theorem states that the distribution of the sum of a large number of independent, identically distributed variables will be approximately normal, regardless of the underlying distribution. The central limit theorem is important in large part because it is the reason that many statistical procedures work. Regardless of the population distribution model, as the sample size increases, the sample mean tends to be normally distributed around the population mean, and its standard deviation shrinks as n increases.

To use the central limit theorem, the observations must be independent and the sample size large enough that a decent amount of data is available for this statistical tool. When taking samples, each one should represent a random sample from the population or follow the population distribution. The sample size should also be less than ten percent of the entire population.
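A short simulation can make the theorem concrete. This sketch uses an exponential population and arbitrary sample sizes, chosen purely for illustration.

```python
import random
import statistics

# Sketch of the central limit theorem: means of samples drawn from a
# skewed (exponential) population pile up around the population mean
# (1.0 here), and their spread shrinks roughly like 1/sqrt(n).
# The sample sizes and number of repetitions are arbitrary.
random.seed(0)

def sample_mean(n):
    return statistics.fmean(random.expovariate(1.0) for _ in range(n))

for n in (2, 30, 200):
    means = [sample_mean(n) for _ in range(2000)]
    print(n, round(statistics.fmean(means), 3), round(statistics.stdev(means), 3))
```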

Simple random sampling refers to any sampling method that consists of a population with N. objects, the sample consists of n objects and if all the possible samples of n objects are equally likely to occur, the sampling method is called simple random sampling. This method allows researchers to use methods to analyze sample results. Confidence intervals are created that deviate from the sample mean to help model their situation.…… [read more]


Stat Notes Sampling Error and Standard Research Paper

Research Paper  |  2 pages (442 words)
Bibliography Sources: 0

SAMPLE TEXT:

Stat Notes

Sampling error and standard error of the mean (SEM) measure the error in assuming the sample accurately represents the population; the smaller the error, the more closely the sample can be assumed to match the population.

Confidence intervals (CIs) are ranges in which a parameter value is likely to fall and are determined by the confidence level -- the higher the confidence level (the degree of certainty desired), the wider the confidence interval must be to ensure the parameter falls within it.
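A minimal sketch of both ideas, using made-up sample values and the normal-based 95% multiplier of 1.96 (a t-multiplier would arguably be more appropriate for a sample this small):

```python
import math
import statistics

# Sketch: standard error of the mean and a 95% confidence interval for
# the mean, using made-up sample values and the multiplier 1.96.
sample = [12.1, 11.4, 13.0, 12.7, 11.9, 12.4, 12.2, 13.1, 11.7, 12.6]
mean = statistics.fmean(sample)
sem = statistics.stdev(sample) / math.sqrt(len(sample))

lower, upper = mean - 1.96 * sem, mean + 1.96 * sem
print(round(mean, 2), round(sem, 3), (round(lower, 2), round(upper, 2)))
# A higher confidence level (e.g., multiplier 2.58 for 99%) widens the interval.
```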

Null hypothesis is always that there is no effect of an intervention/variable -- that measured groups do not differ significantly on the measured area(s).

Alternative hypothesis states that there is an effect of the measured intervention(s)/variable(s).

Probability sampling is used to obtain a study sample that is representative of the population. A perfectly representative sample rarely occurs, even with fully randomized sampling, so reducing sampling error is key in yielding reliable and valid results. Because of the relationship between sample size and the standard error of the mean (SEM), a larger sample size means a smaller error.

Statistical inference refers to both an estimation of parameters such as the mean and other basic summary statistics of a data set/population, and to hypothesis testing.

Interval estimation is an estimated confidence interval.

Hypothesis testing includes the objective means of determining if a null hypothesis should or…… [read more]


Math Anxiety Term Paper

Term Paper  |  3 pages (1,080 words)
Bibliography Sources: 1

SAMPLE TEXT:

Given that this is the case, performance in the math class necessarily involves greater pressure on the student to be correct than there is in other subjects of academic discourse.

Extensive research has been conducted into the topic of math anxiety along both psychological and physiological avenues. Researchers assert that "Math anxiety can bring about widespread, intergenerational discomfort with the subject, which could lead to anything from fewer students pursuing math and science careers to less public interest in financial markets" (Sparks 2011, p. 1). This is a very interesting perspective. If these findings are accurate, then the anxiety an individual feels might be impacted not only by their own history with math, but also by the experience that their parents or guardians had as well. Thinking about the issue, this actually makes a lot of sense. When a child does not understand his or her homework, then the child will go to an adult who they are close to for help with the material. If that adult also does not understand the material or if they react negatively to the topic, then that will influence that child, providing the youngster with another example of a person who responds to mathematics in the same way. This can be damaging to the relationship between the child and math at a potentially exponential rate.

During the interview, the math instructor I talked with gave me their opinion about what might be the basis for math anxiety. They believe that math anxiety is largely caused by lack of confidence. If a person has been unsuccessful with math throughout their childhood, then they will more than likely have negative opinions about their abilities in the subject once they have reached adulthood. Building of self-confidence, the instructor asserts, will help with the anxiety we feel when we are dealing with mathematics. There are ways in which this confidence can be rebuilt, such as reviewing of knowledge that a person already has, building confidence that they do in fact have mathematical knowledge. Another way is by seeking out help from teachers and classmates. Admitting that you are struggling in math is the first step to gaining the knowledge you need to be successful in the subject.

Many people experience math anxiety and it seems to be a symptom of a greater truth. Anxiety will ultimately beget further examples of anxiety. When a person struggles with something and continues to deal with the issue without overcoming those early struggles, then the problem becomes exacerbated. Not liking math or not succeeding in math as a young person becomes something of a self-fulfilling prophecy. If a child fails at math, then he or she goes into the next examination or the next math class fully expecting that they will fail again. They become so consumed with this idea that it winds up coming to pass. I did not understand the all consuming nature of math anxiety, but it seems that it could strike anyone… [read more]


Create and Analyze a Self-Designed Fictitious Act vs. SAT Scores of Low Income Students Research Paper

Research Paper  |  3 pages (840 words)
Bibliography Sources: 3

SAMPLE TEXT:

Score Stats

A Statistical Analysis of ACT vs. SAT Scores of Low Income Students

Study Description

Apparent income disparities in standardized test scores have been noted in many previous studies, with the determination that the income level of a student's family -- along with other sociocultural factors -- has a major effect on their ability to achieve on standardized tests (Kohn, 2002). For tests like the ACT and the SAT, which are commonly (almost universally) used by colleges and universities in the United States as part of their admissions criteria and decision-making process, a gap in performance caused by income levels puts low-income students at a significant disadvantage for entry into four-year degree programs, which in turn limits earning potential and thus could in fact perpetuate lower income levels (Kohn, 2002). This study set out to determine if there is a significant difference in the ACT test scores of low-income students when compared to the SAT scores of the same student population, as a means of determining if test composition can mediate or make more pronounced any impacts of income standing on student performance. Thirty students who completed both the ACT and the SAT tests and who matched income criteria of living at or below 150% of the poverty level were included in the study, with their total scores on both tests compared in order to determine if a significant difference exists. This would indicate that something in the test structure(s) worked to influence the impact that a low-income background has been observed to have on standardized test scores.

Statement of Hypothesis

The null hypothesis is that there will be no difference between the means. The alternative hypothesis, which is the hypothesis this study is investigating, is that there is a significant difference between the mean scores on the ACT and the SAT, indicating that test structure can determine the degree to which income impacts test scores.
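A sketch of how such a test might be run; the paired scores below are invented and are assumed to have already been placed on a common scale, since raw ACT and SAT totals are not directly comparable.

```python
from scipy import stats

# Sketch of the hypothesis test described above. The paired scores are
# invented and assumed to already be on a common scale.
act_scaled = [54, 61, 58, 49, 65, 57, 60, 52, 63, 59]
sat_scaled = [50, 59, 55, 47, 60, 55, 57, 50, 61, 56]

result = stats.ttest_rel(act_scaled, sat_scaled)
print(round(result.statistic, 2), round(result.pvalue, 4))
# Reject the null hypothesis of equal means if the p-value < alpha (e.g., 0.05).
```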

Variable Description

Income level was one independent variable used to determine eligibility/inclusion for the study, with a family income of 150% of the defined poverty level the upper income limit. Subjects were randomly selected and were also polled for age and gender, though these variables were not analyzed further. Income level was not recorded past the point of inclusion, therefore figures for this data are not given; gender and age are given and a descriptive analysis was performed. Dependent variables of interest were test scores on the ACT and test scores on the SAT. These two sets of variables constitute the data points…… [read more]


Behavior Science Research a Researcher Research Paper

Research Paper  |  2 pages (860 words)
Bibliography Sources: 1

SAMPLE TEXT:

An ordinal measure would ask the 130 individuals if they bought:

One to three vegetables each week

Three to five vegetables each week

More than five vegetables each week

A scale measure would ask the 130 individuals to rate, on a scale of one to five (with 1 representing no vegetables and 5 representing many vegetables), how many vegetables they purchased each week.

7. In the fall of 2008, the U.S. stock market plummeted several times, which meant grave consequences for the world economy. A researcher might assess the economic effects this situation had by seeing how much money people saved in 2008. Those amounts could be compared to how much money people saved in more economically stable years. How might you calculate (or operationalize) economic implications at a national level?

The researcher would examine the amount of money people saved in 2008 and compare it to the amount saved in other years that were more economically stable and provide as a result a national average for the amount saved each year to be compared.

8. A researcher might be interested in evaluating how the physical and emotional distance a person had from Manhattan at the time of the 9/11 terrorist attacks relates to the accuracy of memory for the event. Identify the independent variables and the dependent variable.

Physical distance and emotional distance from Manhattan at the time of the 9/11 terrorist attacks are the independent variables, and the accuracy of memory for the event is the dependent variable.

9. Referencing Exercise 8, imagine that physical distance is assessed as within 100 miles, or 100 miles or farther; also, imagine that emotional distance is assessed as knowing no one who was affected, knowing people who were affected but lived, and knowing someone who died in the events. How many levels do the independent variables have? Physical distance has two levels, and emotional distance has three levels.

10. A study of effects of skin tone ( light, medium, and dark) on the severity of facial wrinkles in middle age might be of interest to cosmetic surgeons.

a. What is the independent variable in the study?

The independent variable in this study is skin tone.

b. What is the dependent variable in the study?

The dependent variable in this study is the severity of facial wrinkles.

c. How many levels does the independent variable have?

The independent variable has three levels: light, medium, and dark skin tone.

11. Referring to Exercise 10, what might be the purpose of an outlier analysis in this case? What…… [read more]


Frequency Distribution Below Shows Research Paper

Research Paper  |  3 pages (870 words)
Bibliography Sources: 3

SAMPLE TEXT:

frequency distribution below shows the distribution of suspended solid concentration (in ppm) for 50 different river water samples collected in September 2011.

Concentration (ppm)

Frequency

What percentage of the rivers had suspended solid concentration greater than or equal to 70?

Total samples (N) = 50. (7 + 2 + 2)/50 = 0.22, so 22% have a concentration of 70 or greater.

Calculate the mean of this frequency distribution.

Midpoint for each concentration group is determined and multiplied by frequency; results summed and divided by N (50). Mean = 57.1
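To make the midpoint method concrete: the decade-wide classes below follow the text, but the frequencies are assumed for illustration only, chosen so that they are consistent with the totals quoted (N = 50, mean = 57.1).

```python
# Midpoint method for a grouped mean. The decade-wide classes follow the
# text; the frequencies are assumptions made for illustration, chosen to
# be consistent with the figures quoted (N = 50, mean = 57.1).
midpoints   = [24.5, 34.5, 44.5, 54.5, 64.5, 74.5, 84.5, 94.5]
frequencies = [1, 8, 8, 10, 12, 7, 2, 2]

n = sum(frequencies)
grouped_mean = sum(m * f for m, f in zip(midpoints, frequencies)) / n
print(n, grouped_mean)   # 50 57.1
```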

In what class interval must the median lie? Explain your answer. (You don't have to find the median)

The median must lie in the 50-59 interval, as this is where the middle data points (the 25th and 26th values) fall in this data set of 50 points (there are 17 points before this group and 23 after it, so even though it sits toward the lower end, this group contains the median data points).

Assume that the smallest observation in this dataset is 20. Suppose this observation were incorrectly recorded as 2 instead of 20. Will the mean increase, decrease, or remain the same? Will the median increase, decrease or remain the same? Explain.

The mean of the raw data set -- that is, not of the frequency distribution -- would change somewhat significantly based on this incorrect record. The mean of the frequency distribution would change only slightly if the one observation in the 20-29 category simply disappeared, and even more slightly if the lowest class were altered to include this point (the midpoint used to estimate the mean would drop to 14.5). The median would remain the same, however, as moving the lowest data point would not change the order of the data or the position/identity of the central data point(s).

Refer to the following information for Questions 5 and 6.

A coin is tossed 4 times. Let A be the event that the first toss is heads. Let B be the event that the third toss is heads.

5. What is the probability that the third toss is heads, given that the first toss is heads?

If it is already given that the first toss is heads, there is a 0.5 probability that the third toss will be heads -- the same as for any standard coin toss. Though the overall probability of both A and B occurring is 0.25 (0.5*0.5), knowing A has already occurred gives B its natural and independent probability.
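A short simulation (with an arbitrary number of trials) agrees with this reasoning:

```python
import random

# Simulation of P(third toss is heads | first toss is heads) over four
# fair coin tosses; the number of trials is arbitrary.
random.seed(1)
first_heads = both = 0

for _ in range(100_000):
    tosses = [random.choice("HT") for _ in range(4)]
    if tosses[0] == "H":
        first_heads += 1
        if tosses[2] == "H":
            both += 1

print(both / first_heads)   # close to 0.5, as independence predicts
```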

6. Are A and B independent? Why or why not? Each coin toss is independent as it is not influenced by previous tosses -- a previous heads does not actually change the coin…… [read more]


Inferential Because it Makes Claims A-Level Coursework

A-Level Coursework  |  3 pages (946 words)
Bibliography Sources: 4

SAMPLE TEXT:

¶ … inferential because it makes claims about the population of adult Americans based on a sample of 9000 persons. The use of a sample proportion to estimate the population proportion makes this study inferential (Gravetter & Wallnau, 2008).

The research question in the study was; does the lack of health care increase the risk of death?

The data were obtained using a survey of 9000 persons tracked by the U.S. Centers for Disease Control and Prevention. While it is not explicitly stated in the article that the data were collected using a survey, the tracking of individuals could not have been done using an experimental design. Additionally, the purpose of the study suggests that it would take a correlational approach to explicating the problem.

The exclusion of persons aged 65 and over is an attempt to eliminate bias, as the inclusion of these persons would create a systematized form of error within the study. These older Americans receive health care through Medicare.

5. The conclusions drawn from the article are warranted because they are logical. Firstly, persons who do not access medical attention will die from ailments that require medical attention. Secondly, the design of the study and the sample used are representative of the country, and it would be legitimate to use such a sample. Finally, the design of the study followed a similar study done in 1993; therefore there is methodological support for the approach employed.

6. A large trial is necessary to ensure that the sample will be representative of the population. This representativeness means that the sample is similar to the population in key characteristics and there is little difference between the sample proportions and the population proportions. The error in the sample is therefore small.

7. A control group is needed to ensure that non-spuriousness is addressed adequately in the study. Using a control group means that the researcher can be more confident that the independent variable has the stated effect on the dependent variable (Lenth, 2001).

8. The double-blind feature takes care of the propensity of human error to seep into the study. When both the participants and the researcher are unaware of which group is the treatment group, other variables that can have an effect on the study are controlled for.

9. The use of volunteers would have biased the results as that would have introduced systematic error into the study. The generalizability and validity of the study would be called into question (Creswell 1994). It is only through randomization that random error can be statistically determined.

Chapter 2

10-B

11-B

12-B

13-F

14-F

15-T

16-T

17

A random sample is similar to a convenience sample and a systematic sample only in that they all select members of a population for investigation. All methods of sampling will contain error. It is different because with a random sample there is…… [read more]


Z Test in Psychology Term Paper

Term Paper  |  2 pages (432 words)
Bibliography Sources: 0

SAMPLE TEXT:

Entering the provided values gives: (75 − 70)/√[(12/√36) + (12/√36)] = 5/2 = 2.5 = z.

Step 4: Probability Calculation

What is being tested is whether the students' attitudes towards the mentally ill change as a result of viewing the film; thus it does not matter whether the students' attitudes are better or worse, only whether they changed significantly relative to students who did not view the film. Since the attitudes could be worse or better, this is a two-tailed test. If the Z score falls within 95% of control scores, then we have to retain the null hypothesis and reject the alternative hypothesis.

If the scores fall outside of 95% of control scores, within the 2.5% of the extreme tail of the normal distribution at either end (two-tailed), then we have to reject the null hypothesis and retain the alternative hypothesis. Based on the Z score table, a Z score of 1.96 or greater would be needed to obtain a probability value below the alpha of 0.05, two-tailed.
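The decision rule can also be checked directly by converting a z statistic to its two-tailed p-value; this is only a sketch of the standard normal calculation.

```python
import math

def two_tailed_p(z):
    """Two-tailed p-value for a z statistic under the standard normal."""
    return math.erfc(abs(z) / math.sqrt(2))

print(round(two_tailed_p(2.5), 4))    # ~0.0124, below alpha = 0.05
print(round(two_tailed_p(1.96), 4))   # ~0.05, the two-tailed cutoff
```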

Step 5: Conclusions

The Z score obtained by comparing the two means was 2.5. For these two means to be significantly different, using an alpha of 0.05, the Z score would have had to be 1.96 or greater. Therefore, since 2.5 >…… [read more]


Five Process Standards Term Paper

Term Paper  |  3 pages (1,149 words)
Bibliography Sources: 1

SAMPLE TEXT:

¶ … Standards

Five process standards

Describe the mathematical process standards

Problem solving

Engaging in a task without knowing the solution method in advance is what is referred to as problem solving. Drawing from their knowledge, the students are better equipped to find a solution for the problem, and while doing this the students will develop a new understanding of mathematics. The students are also able to solve any other problems they encounter, both in mathematics and in other life situations, using their problem-solving skills -- for example, "I have pennies, dimes, and nickels in my pocket. If I take three coins out of my pocket, how much money could I have taken?" (Mathematics, 2000)

Problem solving involves the application and adaptation of various strategies to assist the student in solving problems.

Reasoning and proof

To gain a better understanding of a wide range of phenomena, a student will need strong skills in mathematical reasoning and proof. Thinking and reasoning analytically allows a person to identify structures, patterns, and regularities in symbolic objects and real-world situations. To better understand mathematics, a student needs to be able to reason. A good example is "Write down your age. Add 5. Multiply the number you just got by 2. Add 10 to this number. Multiply this number by 5. Tell me the result. I can tell you your age." (Mathematics, 2000)

Students are able to better evaluate and develop their own mathematical arguments by employing reasoning and proof.
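The age trick quoted above is a small example of proof by algebra: ((age + 5) × 2 + 10) × 5 always equals 10 × age + 100, so subtracting 100 and dividing by 10 recovers the age. A quick check, with arbitrary test ages:

```python
# The "age trick" quoted above: ((age + 5) * 2 + 10) * 5 = 10 * age + 100,
# so subtracting 100 and dividing by 10 always recovers the age.
def trick(age):
    return ((age + 5) * 2 + 10) * 5

for age in (9, 23, 47):                        # arbitrary test ages
    result = trick(age)
    print(age, result, (result - 100) // 10)   # the recovered age matches
```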

Communication

For the teaching of mathematics, communication is an integral part. It provides an avenue for students and lecturers to share ideas and make clarifications where necessary. Challenging students to communicate their mathematical results and reasoning will help them learn to justify themselves in front of others, which leads to better mathematical understanding. Working on mathematical problems with others and having discussions will allow students to gain more perspectives when solving mathematical problems, e.g. "There are some rabbits and some hutches. If one rabbit is put in each hutch, one rabbit will be left without a place. If two rabbits are put in each hutch, one hutch will remain empty. How many rabbits and how many hutches are there?" (Mathematics, 2000)

Connections

A student's understanding is deepened when they are able to connect mathematical ideas. By continuously teaching students new mathematics that is connected to what they have learned previously, teachers enable the students to make connections. Learning mathematics by working on problems that arise from outside mathematics should also be incorporated into the curriculum. These connections give students an opportunity to relate what they learn to other subjects or disciplines. Mathematics is connected to many other subjects, and it is very important that students get to experience mathematics in context.

Representation

Proper and easy representation of mathematical ideas helps people to better understand and use these ideas. For example, it is far more difficult to do multiplication using Roman numerals than it is using Arabic base-ten (Mathematics,… [read more]


Operations Essay

Essay  |  4 pages (1,335 words)
Bibliography Sources: 3

SAMPLE TEXT:


As the first step, solving an equation requires combining like terms in the two expressions within the equation. In this case, like terms are those containing the same variable or group of variables raised to the same exponent, regardless of their numerical coefficient. The second step is to isolate the terms that contain the variable, which means getting the terms containing that variable on one side of the equation while the other variables and constants are moved to the opposite side of the equation.

This is followed by isolating the variable to solve for, which results in a numerical coefficient of one; once a numerical coefficient of one is obtained after isolating the terms containing that variable, the variable has been isolated. The fourth step for solving an equation is substituting the answer into the original equation in order to ensure that the answer is correct. In this case, substitution is the process of swapping variables with expressions or numbers as part of checking the answer. When solving an equation and explaining how to solve an equation, the most important factor to consider is the variables in the equation. This is primarily because the variables in the equation play an important role in determining the accuracy of the process. The variables should also be considered carefully because they help to determine whether a correct or incorrect answer will be obtained.
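As a sketch of these four steps, the following uses a hypothetical equation (not one from the original text) and a symbolic algebra library to combine like terms, isolate the variable, and substitute the answer back in.

```python
import sympy as sp

# Sketch of the four steps, using the hypothetical equation 3x + 5 = 2x + 9
# (not an equation taken from the original text).
x = sp.symbols('x')
equation = sp.Eq(3 * x + 5, 2 * x + 9)

solution = sp.solve(equation, x)[0]   # combine like terms and isolate x
print(solution)                       # 4

# Step four: substitute the answer back into the original equation.
print(equation.subs(x, solution))     # True
```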

Four Steps for Solving a Problem:

In most cases, mathematical problems require established procedures as well as knowledge of which procedures to apply and when. Moreover, learning to solve a mathematical problem is largely a matter of knowing what to look for. In order to identify the necessary procedures for solving a problem, an individual needs to become familiar with the problem situation, gather the appropriate information, and identify and use an appropriate strategy. While there are various steps for solving a problem in mathematics, effective problem solving above all requires practice (Russell, n.d.).

The first step for solving a problem is looking for the clues, by reading the problem carefully and underlining the clue words or phrases. When looking for clues, it may be important to consider whether the person has encountered a similar problem in the past and what was done in that situation. The second step is defining the game plan, which involves developing strategies for solving the problem. During this process, the various strategies developed can be tried out in order to identify the effective one.

The third step in the process is to solve the problem using the strategy identified in the second step. The appropriate strategy for solving the problem is normally identified by trying out various strategies. The fourth step is reflecting on the solution to examine whether it is plausible, whether it solved the problem appropriately, and whether it answered the question in the language of the problem.

When solving a problem or explaining how to solve a problem,… [read more]


Guess and Check Essay

Essay  |  2 pages (615 words)
Bibliography Sources: 1

SAMPLE TEXT:

A popular problem solving strategy that an increasing number of students encounter before middle school is model drawing, sometimes taught as "Singapore Math." The Singapore method, named for the Asian nation in which it was developed, teaches students how to create visuals in a systematic way to assist in solving word problems. When students learn this method and have sufficient opportunities to practice, it can go a long way toward preparing them for the guess and check strategy. The Singapore method asks students to really examine the relationship between values in a problem and carefully consider the question being asked with respect to the solution.

There is a greater language base in today's mathematics programs. Prospective teachers should be reminded to discuss problem solving with their students, emphasizing the process and eschewing exclusive focus on "the right answer." Obviously, solving problems correctly is the goal. Students cannot get credit for wrong answers on standardized tests. More importantly, if students do not understand why they got a wrong answer, they have little hope of solving similar problems successfully in the future. Guess and check enables students to thoughtfully work through problems and gain understanding of mathematical relationships. It is a strategy that can help foster success. Success tends to beget further success; students who are able to solve problems build confidence in their ability to do so. They feel good about themselves and good about their mathematics classes. They learn that math does not have to be intimidating. There are thoughtful, logical ways to approach problem solving. It is a good lesson for mathematics as well as for other academic content areas.
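As a toy illustration of guess and check (the coin puzzle and its numbers are invented for this sketch), a systematic guesser might loop over the possibilities and test each one:

```python
# Toy guess-and-check: how many nickels and dimes make 95 cents with
# 12 coins? (The puzzle and its numbers are invented for this sketch.)
for nickels in range(13):
    dimes = 12 - nickels
    if 5 * nickels + 10 * dimes == 95:
        print(nickels, "nickels and", dimes, "dimes")   # 5 nickels and 7 dimes
```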

References

Guerrero, S.M. (2010). The value of guess and check. Mathematics Teaching in the Middle

School…… [read more]


Nursing Research Analyzing Qualitative Data Essay

Essay  |  3 pages (842 words)
Bibliography Sources: 3

SAMPLE TEXT:

Statistics and Quantitative Analysis Design

Inferential statistics are based on the laws of probability and allow inferences to be drawn about a population based on a sampling of that population. Three applications for inferential statistics are: the sampling distribution of the mean; estimating parameters; testing hypotheses. The Sampling Distribution of the Mean employs an infinite number of samples from a selected population and theoretically distributes the means of those samples. Estimating Parameters consists of defining and establishing a framework for the target population from statistical samples (Polit & Beck, 2008, pp. 583-584). Finally, hypotheses are tested with objective criteria provided by data to infer whether the hypotheses are sufficiently supported by the evidence (Polit & Beck, 2008, p. 587).

Multivariate Statistics is an area of statistics concerned with the collection, analysis and interpretation of several statistical variables at once. While statistics may be artificially confined for convenience sake, health care actually involves complex relationships of variables for patients themselves, within a single health care institution, within a group of health care institutions, and within the entire health care system. Multivariate statistics observes and analyzes several of these variables at once using several types of tests for various purposes.

Multivariate Statistics analysis is integrated in quantitative analysis through a number of tests to compare a number of variables in complex relationships. Tests used in multivariate statistics include: multiple regression/correlation tests, used to understand the effects of at least 2 independent variables on one continuous dependent variable (Polit & Beck, 2008, p. 614); analysis of covariance (ANCOVA), which compares the means of at least two groups with a single central question (Polit & Beck, 2008, p. 624); multivariate analysis of covariance (MANCOVA), which involves controlling covariates -- or extraneous variables -- when the analysis involves at least two dependent variables (Polit & Beck, 2008, p. 627); discriminant function analysis, which involves using a known group to predict an unknown group with independent variables (Polit & Beck, 2008, p. 628); canonical correlation, which involves testing one or more relationships between two sets of variables (Polit & Beck, 2008, p. 638); logistic regression, which predicts the probability of an outcome based on an odds ratio (Polit & Beck, 2008, p. 640).

Inferential Statistics assists in… [read more]


Person Hired a Firm to Build Essay

Essay  |  1 pages (403 words)
Bibliography Sources: 1

SAMPLE TEXT:

¶ … Person hired a firm to build a CB radio tower. The firm charges $100 for labor for the first 10 feet. After that, the cost of the labor for each succeeding 10 feet is $25 more than for the preceding 10 feet. That is, the next 10 feet will cost $125; the next 10 feet will cost $150, etc. How much will it cost to build a 90-foot tower?

We see that there is a new price for every ten feet of tower. Each new price is $25 added to the previous price. Since repeated addition is involved, this is an arithmetic sequence. First, we need to identify the following numbers:

n = number of terms; n = 9

d = the common difference; d = 25

a1 = first term; a1 = 100

an = last term; an = a9

We know n = 9 because the tower increases in increments of ten feet, and the final height is 90 feet: 90/10 = 9.

To find the nth term of an arithmetic sequence, Page 271 of Mathematics in Our World gives us the following formula:

an = a1 + (n - 1)d, so a9 = 100 + (9 - 1)(25)

a9 = 100 + 8(25)

a9 = 100 + 200

a9 = 300…… [read more]
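A quick check of the sequence, together with the total the problem ultimately asks for (the sum of all nine segment costs, by the arithmetic series formula n(a1 + an)/2):

```python
# Check of the sequence above, plus the total cost of the 90-foot tower
# (the sum of all nine segment costs, i.e. n(a1 + an)/2).
costs = [100 + 25 * k for k in range(9)]   # 100, 125, ..., 300

print(costs[-1])    # 300: labor cost of the ninth (final) 10-foot segment
print(sum(costs))   # 1800: total labor cost, equal to (9 / 2) * (100 + 300)
```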


Correlation and Causation Understanding Essay

Essay  |  3 pages (1,147 words)
Bibliography Sources: 1+

SAMPLE TEXT:

Correlation and Causation

Understanding correlation

Within any population the variables that concern a researcher will hold different values. This difference in value for any variable becomes the basis of different types of analysis, which go beyond simply counting categories of the phenomenon. This type of analysis engages the use of variation to make statements about the nature of the relationship between variables. One of the ways to measure the association between two variables is the use of correlation. Correlation is consequently a useful tool that provides a quantitative measure of the presumed relationship between two or more variables.

Correlation therefore is a statistical technique that provides a numerical or quantitative assessment of the degree to which two variables co-vary. The idea of association is tied to the concept of co-variation. Co-variation occurs when two variables change values together. This linking of values is a conceptual association that exists as a consequence of the way in which we try to make sense of the world. Within the mind of the observer it is possible to consider that the presence of x is linked to the presence of y. This linking is a consequence of observing instances of x and seeing instances of y existing within close proximity to x. One may observe that changes in diet may result in the loss or gaining of weight. This observation forms the basis of common understandings about the relationship between things. What scientists have attempted to do is to measure the strength of that relationship, thus providing a number that can be compared to other numbers to indicate different features of the observed relationship.

The main way to represent a correlation is to use the correlation coefficient (r). The correlation coefficient is the product of a series of statistical calculations that are produced when either the Pearson product-moment correlation or the Spearman rho is computed. The correlation coefficient ranges in value from -1.0 to +1.0. The larger the size of the correlation coefficient (that is, tending toward 1 or -1), the stronger the relationship between the variables being tested. Moderate correlations are understood to begin at around 0.6 and weak correlations at around 0.4; these values may be positive or negative. If the correlation coefficient is 0, that suggests there is no relationship between the variables being tested.
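As an illustration of the sign and magnitude of r, using made-up values where one series rises with x and another falls:

```python
import numpy as np

# Made-up values where y1 rises with x (direct correlation) and y2 falls
# as x rises (inverse correlation); only the signs are the point here.
x  = np.array([1, 2, 3, 4, 5, 6, 7, 8])
y1 = np.array([2, 3, 5, 6, 8, 9, 11, 12])
y2 = np.array([12, 11, 9, 8, 6, 5, 3, 2])

print(round(float(np.corrcoef(x, y1)[0, 1]), 3))   # close to +1
print(round(float(np.corrcoef(x, y2)[0, 1]), 3))   # close to -1
```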

The positive and negative signs are very important in interpreting the correlation between two variables. While the number tells the magnitude or size of the correlation, the sign before the number indicates the direction of the correlation. The direction of the correlation can be positive or negative; these directions are also known as a direct correlation and an inverse correlation (Cooper & Schindler 2011). With a direct correlation the values of both variables increase together. Consequently, as the number of calories that an individual ingests increases, their weight may also increase. The relationship that has been described is a positive correlation, where as one variable increases the other also increases. In an indirect correlation… [read more]


Normal Distribution Curve Essay

Essay  |  2 pages (593 words)
Bibliography Sources: 4

SAMPLE TEXT:

This equality is a constituent of the normal curve and something that makes it helpful to psychology in that different measurements can be based on the normal curve and applied to varying situations.

Z scores, for instance, are raw scores that have been converted to units of standard deviation. Because of the nature of the normal distribution, these can be converted to percentiles or to other scores of measurement if necessary.

The normal distribution is the shape that happens to occur the most often in describing a population, and it is lucky that it does because it can be precisely quantified by mathematical equations. IQ scores are an ideal example of a normal distribution where you have the greatest frequency (or mean) in the middle with frequencies tapering off on either side.

Because it occurs often and because the shape is mathematically guaranteed, parametric (i.e. those based upon a normal distribution) studies and statistical tools are often more reliable than are non-parametric.

The normal distribution, finally, is also important in statistics because, under fairly mild and commonly met conditions, the sum of a large number of random variables is distributed approximately normally, and because it is a convenient choice for modeling a large variety of random variables that are generally encountered.

Moreover, of all distributions the normal distribution is the only absolutely continuous distribution whose cumulants beyond the mean and variance are zero.

References

Casella, G. & Berger, R. (2001). Statistical inference. UK: Duxbury.

Gravetter, F., & Wallnau, L. (2007). Essentials of statistics for the behavioral sciences. USA: Thomson Wadsworth.

Weinbach, RW, & Grinnel, RM. (1991).…… [read more]


How Math Explains the World Term Paper

Term Paper  |  4 pages (1,403 words)
Bibliography Sources: 2

SAMPLE TEXT:

¶ … Math Explains the World

The title of James Stein's book, How Math Explains the World, is, perhaps, a bit deceptive. The reader who is expecting simplified explanations of complex mathematical principles will be disappointed. Although Stein has simplified many concepts, they will still be challenging for the reader who struggled with math in high school or who took… [read more]


Patient Perceptions of Maternal HIV Case Study

Case Study  |  3 pages (771 words)
Bibliography Sources: 4

SAMPLE TEXT:

For each patient in this study X and Y were known, but the researchers wanted to establish a straight line through the data that minimizes the sum of the squares of the vertical distances between the plotted points and the fitted line.

Study bias. Participating patients self-selected to complete surveys, and not all survey respondents may have understood the terminology used in the survey in the same way.

Summary of Table 4. The recollections of patients regarding their physicians' practices are shown in Table 4, along with the responses of the physicians. Physician responses reflect their practice standards for recommending testing to women exhibiting certain attributes or life situations, and also cover two specific questions that the physicians ask their patients. The responses of patients with regard to various questions are shown categorically for pregnant and non-pregnant women.

Chi-Square Test -- Race and Recall. Race was not found to be a strong predictor, but a test did indicate that a patient's race is associated with her report that she had an HIV test. White, non-Hispanic and Asian women were significantly less likely to report having been tested for HIV than were African-American or Hispanic women. In the expression χ²(3) = 17.3, χ² represents the Chi-Square statistic, the "3" stands for the degrees of freedom, and 17.3 is the Chi-Square value. The p value is a measure of how much evidence there is against the null hypothesis; in other words, it indicates the probability of obtaining a result as extreme as the one observed if the null hypothesis were true. A small p value indicates that the null hypothesis can be rejected, with the understanding that there is still a possibility of making an error. Here, p < 0.01 means that, if the null hypothesis (of no differences between the groups) were true, a result this extreme would occur less than 1% of the time. Chi-Square is a non-parametric test, and though it can indicate whether two groups differ, it cannot tell the nature of the difference.
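The reported result can be converted to an exact p-value with the chi-square survival function; this is only an illustrative check of the figures quoted above.

```python
from scipy import stats

# The figure reported above: chi-square(3) = 17.3. The survival function
# gives the probability of a statistic at least this large if the null
# hypothesis of no association were true.
p = stats.chi2.sf(17.3, df=3)
print(round(p, 5))   # roughly 0.0006, consistent with p < 0.01
```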

Limiting factor. The researchers noted that physicians were used to distribute, collect, and return the patient surveys to the principal investigators and, as such, there was no way to introduce random selection of the patient sample. Also, the patients in the study were associated with only 68 physicians, so generalization may be…… [read more]
