"Mathematics / Statistics" Essays


Structured Analysis of an Experimental Research Paper

Research Paper  |  3 pages (1,207 words)
Bibliography Sources: 1

SAMPLE TEXT:

When using a sample variance to estimate the variance of a population, it is important to avoid biasing the estimate: the sum of squared deviations should be divided by (n - 1) rather than by the actual sample size n. Without this sample size correction, the computed sample variance would be a biased estimate of the population variance.
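
The correction described above (often called Bessel's correction) can be sketched in Python with the standard library; the data below are hypothetical, not from the study.

```python
import statistics

data = [2, 4, 6, 8]  # hypothetical sample
n = len(data)
mean = sum(data) / n

# Biased estimator: divides by n (the population variance formula)
biased = sum((x - mean) ** 2 for x in data) / n
# Unbiased estimator: divides by (n - 1), Bessel's correction
unbiased = sum((x - mean) ** 2 for x in data) / (n - 1)

# The stdlib distinguishes the two: pvariance divides by n, variance by n - 1
assert biased == statistics.pvariance(data)
assert unbiased == statistics.variance(data)
print(biased, unbiased)  # 5.0 vs 6.666...
```

Note that the biased estimator systematically understates the spread, which is why the (n - 1) divisor is used when estimating a population variance from a sample.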

In the 2004 study by Buller et al., the dispersion measures of variance and standard deviation were not of primary interest in themselves; the confidence intervals for the calculated results were paramount. Computing valid confidence intervals (CIs) relies, first, on establishing that the data are normally distributed and, second, on having the mean and standard deviation from which to compute the CI. The internal computations of the mean and standard deviation from the large sample were therefore key to the study's results. The range was of incidental interest to the researchers, implied by the bounds of the categorical ranges they defined for each of their various tests. As the researchers note, "the large sample size allowed outcome assessment in patients with a broad range of body weights and renal function." 1

A standard normal distribution is a formal construct: a normal distribution with a mean of zero (0) and a standard deviation of one (1). The area under the standard normal curve between two points represents the proportion of observations in the sample falling in that range, with each observation's position expressed as its distance from the mean (the center line of the graph) in positive or negative standard deviations. If a sample is observed to follow a normal distribution, it shares the characteristics of the standard normal distribution, and it becomes possible to use familiar tools to compute the probabilities of selected outcomes or the proportions falling within value ranges.
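
A minimal sketch of these familiar tools, using Python's standard library (the 100/15 sample is hypothetical):

```python
from statistics import NormalDist

std_normal = NormalDist(mu=0, sigma=1)  # the standard normal distribution

# Proportion of observations within one standard deviation of the mean
within_one_sd = std_normal.cdf(1) - std_normal.cdf(-1)
print(round(within_one_sd, 4))  # ~0.6827

# A z-score expresses an observation's distance from the mean in SD units
sample = NormalDist(mu=100, sigma=15)   # hypothetical normally distributed sample
z = (130 - sample.mean) / sample.stdev  # z = 2.0
print(1 - std_normal.cdf(z))            # probability of exceeding 130
```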

In the 2004 study by Buller et al., the majority of the data gathered were categorical in nature and were used to classify trial results as either significant or not significant for each of a large number of specific symptomatic tests. The comparisons relied on hypothesis testing and confidence intervals to determine whether each drug's effect was significant on each symptomatic test, and then on computing relative significance to compare the drugs' performance. The experimental result data were entered into a statistical analysis tool (SAS), which verified the necessary preliminary criterion that the data conformed to a normal distribution, enabling the researchers to employ the standard statistical tools.

The 2004 study by Buller et al. demonstrates the characteristics of a well-designed and appropriate statistical analysis. The researchers made a conscious effort to use very large sample sizes for each of the medication trials (n 1100), and they established a standard method of hypothesis testing with 95% confidence intervals… [read more]


Pre-Calc Trigonometry Journal

Journal  |  9 pages (2,604 words)
Bibliography Sources: 1+

SAMPLE TEXT:

Modeling Real-World Data with Sinusoidal Functions

The sinusoid, sometimes referred to as the sine wave, is a mathematical function describing a smooth, repetitive oscillation. It occurs in pure mathematics as well as in physics, electrical engineering, and signal processing, among numerous other fields. Its form as a function of time (t) is:… [read more]
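
The general sine-wave form as a function of time can be sketched as follows; the amplitude, frequency, and phase values are illustrative placeholders.

```python
import math

def sinusoid(t, amplitude=1.0, frequency=1.0, phase=0.0):
    """A general sine wave as a function of time t:
    y(t) = A * sin(2*pi*f*t + phi)."""
    return amplitude * math.sin(2 * math.pi * frequency * t + phase)

# A unit sinusoid peaks a quarter of the way through its period
print(sinusoid(0.25))           # 1.0
print(round(sinusoid(0.5), 9))  # ~0.0 (zero crossing at the half period)
```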


Bayesian Methods for Data Analysis in Transcription Networks Term Paper

Term Paper  |  6 pages (1,735 words)
Bibliography Sources: 10

SAMPLE TEXT:

¶ … The Bayesian method refers to approaches in probability and statistics, particularly those based on the degree-of-belief interpretation of probability as opposed to the frequency interpretation. In Bayesian statistics, a probability is assigned to a hypothesis itself, whereas under the frequentist approach the hypothesis is tested without assigning it such a probability. The Bayesian method, or Bayesian probability, may be seen as an extension of logic… [read more]
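
The degree-of-belief updating at the heart of the Bayesian method can be sketched with Bayes' theorem; the sensitivity, specificity, and prevalence figures below are hypothetical.

```python
def bayes_update(prior, likelihood, evidence):
    """Bayes' theorem: posterior = likelihood * prior / evidence."""
    return likelihood * prior / evidence

# Hypothetical diagnostic test: 90% sensitive, 95% specific,
# for a condition with 1% prevalence.
prior = 0.01                         # degree of belief before the test
p_pos_given_cond = 0.90              # likelihood of a positive given the condition
p_pos = 0.90 * 0.01 + 0.05 * 0.99    # total probability of a positive result

posterior = bayes_update(prior, p_pos_given_cond, p_pos)
print(round(posterior, 3))  # ~0.154: the belief is revised upward, not to certainty
```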


Representation in Algebra: A Problem Solving Approach Essay

Essay  |  12 pages (4,074 words)
Bibliography Sources: 15

SAMPLE TEXT:

Representation in Algebra: A Problem Solving Approach

The need for a solid background in mathematics for high school and college students in the 21st century is well documented (Katz & Barton 2007). A number of emerging career fields in the Age of Information are directly related to mathematical knowledge. For instance, Conaway and Rennolds emphasize that "With the onset… [read more]


Testing a Critical Element of Research Essay

Essay  |  3 pages (1,110 words)
Bibliography Sources: 3

SAMPLE TEXT:

¶ … Testing

A critical element of research is determining whether a set of observations is the product of chance or the result of the action of the independent variable on the dependent variable. This approach to knowledge creation and verification engages deductive reasoning in the decision-making process. The essential feature of this process is making the correct decision with reference to what the data indicate. To make the process more reliable and valid, researchers engage in a type of thinking and statistical analysis called hypothesis testing.

A hypothesis is a conjectured or predicted statement of the relationship between two or more variables. The researcher uses the hypothesis to represent the relationship he anticipates will explain some aspect of the variance in the dependent variable. For the purposes of hypothesis testing, the researcher will usually have a null hypothesis and an alternate hypothesis. The null hypothesis is the hypothesis that is actually tested; it is either accepted or rejected by the researcher based on the test results (Ryan 2004). To accomplish this successfully, decisions must be made in advance about when a relationship will be taken to exist.

The result of a hypothesis test is considered significant if it is highly unlikely to be the product of chance. This is determined using a specific threshold for the rejection of the null hypothesis, based on a chosen significance level. The significance level for the rejection of the null hypothesis is determined before the test is undertaken.

There are two errors that can be made in hypothesis testing, and they center on the rejection of the null hypothesis. The researcher can reject the null hypothesis when it is in fact true; this is called a Type I error, and the probability of making it is the alpha level. It stands to reason that the lower the alpha level is set, the smaller the chance of making a Type I error; it also means that a more extreme test value is required for a result to be considered significant. The other error that can be made, the Type II error, is the reverse: as the researcher guards against chance findings, the possibility increases of failing to reject the null hypothesis when it is actually false. A Type II error thus leads the researcher to conclude that there is no effect when an effect actually exists.
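
The relationship between the alpha level and the "more extreme test value" required for significance can be sketched for a two-tailed z-test; the test statistics fed in below are hypothetical.

```python
from statistics import NormalDist

alpha = 0.05  # significance level: the accepted Type I error rate
# Two-tailed critical value: the cutoff beyond which results are "extreme"
z_crit = NormalDist().inv_cdf(1 - alpha / 2)

def reject_null(z_statistic):
    """Reject H0 when the statistic is more extreme than the critical value."""
    return abs(z_statistic) > z_crit

print(round(z_crit, 2))   # 1.96
print(reject_null(2.5))   # True: unlikely under H0 at alpha = .05
print(reject_null(1.2))   # False: fail to reject H0
```

Lowering alpha to .01 pushes the critical value out to about 2.58, illustrating why a stricter alpha demands a more extreme result.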

Aron, Coups & Aron (2011) identify a five-step procedure for successful hypothesis testing. The first step involves restating the research question as a "research hypothesis and a null hypothesis about the populations." In general, the null hypothesis is the opposite of the research hypothesis and states that there is no change, no difference, or no effect. Secondly, the characteristics of the comparison distribution… [read more]


Codes Were Labeled Thus for Data Analysis A-Level Coursework

A-Level Coursework  |  4 pages (1,173 words)
Bibliography Sources: 5

SAMPLE TEXT:

¶ … codes were labeled thus for data analysis. The category of participants who found my work highly interesting was coded 1; those who found it somewhat interesting were coded 2; those who found it tedious or boring, 3; and those who found it highly disinteresting, 4. I should also have assigned a code to cases with missing data (e.g., illegible, absent, or no response). 1,673 out of 2,715 participants had failed to complete their surveys (the data were partially or entirely missing), which made for 61.6% of the survey. The cases that I could ultimately rely on numbered 1,042.

Absolute frequency denotes the raw number of respondents in a category. For instance, 650 participants responded that they were highly interested. This tells us that my presentation (assuming it was that) must have been highly interesting, or that I am popular, or perhaps that the participants were somehow influenced to respond in my favor, for well over half of the valid respondents found me highly interesting; altogether the overwhelming majority were at least positively inclined towards my presentation, whilst only 89 participants ranged from somewhat bored to utterly bored. I might conclude that my presentation was successful. The relative frequency converts the absolute frequency into a percentage of all cases: for instance, I divide the 650 respondents of Code 1 by the total of 2,715 (650 / 2,715) to obtain the relative frequency. The adjusted frequency is computed against valid cases only, and allows me to compare responses across the various spectrums (to compare each response to the whole). So, for instance, I see that the majority of valid respondents (62.4%) were highly interested in my work, whilst the smallest group of all (2.7%) loathed it. The cumulative frequency is the running total of cases as you move up the scale: 62.4% of valid respondents (i.e., 650 individuals) were highly interested, and by the time you progress to the category of the disinterested you reach 100% cumulative frequency.
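
The distinction between relative and adjusted frequency can be sketched with the counts given above (650 highly interested, 2,715 surveyed, 1,042 valid cases):

```python
total_surveyed = 2715   # all participants (from the text)
valid_cases = 1042      # surveys with usable responses
code_1_count = 650      # "highly interested": the absolute frequency

# Relative frequency: share of everyone surveyed, missing cases included
relative = code_1_count / total_surveyed
# Adjusted frequency: share of valid cases only, missing cases excluded
adjusted = code_1_count / valid_cases

print(f"{relative:.1%}")  # 23.9%
print(f"{adjusted:.1%}")  # 62.4%
```

The gap between the two figures shows why the adjusted frequency, not the relative one, supports the "majority were highly interested" reading.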

2.

Age   Frequency
16    1
17    2
18    1
19    2
20    2

(b) the genders are equally balanced in their preference for coke (n=7)

2. Mean: all the scores were added and then divided by the number of scores (i.e. 30 in this case).

Standard error: The sample is never a perfectly accurate reflection of the population. There will always be some error between sample and population and the S.E. measures the average difference that should be expected between one and the other. In this case, the S.E. is low.

Median: List the scores in order from lowest to highest; the median is the middle score in the list (the 50th percentile). Here 16 is the median.

Mode: the most frequently reported score. # 16.

Sample variance: the average squared deviation of the scores from the sample mean, measuring how spread out the sample is; because a sample never perfectly mirrors its population, it also serves as an estimate of the population's spread. In this case the sample variance is 6.21.

Sample deviation: The square… [read more]


Direction Attach Term Paper

Term Paper  |  2 pages (888 words)
Bibliography Sources: 2

SAMPLE TEXT:

¶ … prioritized items that I would take with me were I going away on a long trip

Lengthy letters, with photographs attached, from people who are emotionally closest to me, with these letters also containing personal memories, assurances of their love, and advice and encouragement for the future.

My laptop -- fully upgraded to the most secure and most functional software then in existence. It should have the basic capabilities, and the ability to access and listen to a wide range of music and movies. I would ascertain, too, that I have long-term subscriptions to the major academic databases that interest me.

A thorough compendium of global philosophy, from past to present, as well as theories from the social sciences, in particular from sociology.

One of Winget's books, likely "People are idiots and I can prove it" (2009)

An introductory textbook to a wide-ranging spectrum of mathematical and logical disciplines. The one I have in mind is called, "A survey of mathematics with applications" (Angel & Porter, 1997)

Part B. Description of Item

1. The letters: I would request those sentimentally closest to me that they describe their feelings towards me in as best a manner as they can, that they describe events that have happened between us that have positively impacted them, and I would conclude with a request for their impressions of my strongest and weakest points. I would ask them to attach their photographs, and an addendum of specific encouragement and/or advice for the future.

2. The laptop -- I might switch to MEPIS, an operating system I have read is more secure and reliable than Windows. I would ascertain that it is virus-free, with a rapid Internet connection. I would also sign up for long-term subscriptions to pertinent online academic databases; ensure that I have access to reliable music and DVD capabilities; and take along a starter base of several of my best-loved music CDs and DVDs (possibly, although not necessarily, the latter). I would ensure that I have all computer paraphernalia along with a large supply of printing paper, several empty notebooks, and a large supply of pens.

3. "People are idiots and I can prove it" (2009): In a down-to-earth, acerbic tone, Winget shows you exactly where and what you are -- he cuts through all the delusions -- whilst in an unusually commonsensical way he shows you how to see the mess in your life for what it is, and how to straighten it out. This is no self-help book; this is a self-'do' book.

4. The compendium: universal in approach, authoritative and comprehensive, an encyclopedia written in a scholarly manner covering every single…… [read more]


Statistics in Social Work Research Paper

Research Paper  |  4 pages (1,453 words)
Bibliography Sources: 5

SAMPLE TEXT:

Back-end testing of additional questions, using data from the sample and the chosen test type, performs an 'audit' of sorts on the research. While not typically necessary, since outlandish findings are invariably obvious to professionals who have been working in practice, in training this applied method of assessment is perhaps the best way to learn how to form a strong hypothesis.

b.… [read more]


NCTM Process Standards Essay

Essay  |  2 pages (649 words)
Bibliography Sources: 2

SAMPLE TEXT:

¶ … NCTM Process Standards

Problem-solving

In my class, problem-solving activities were integrated into every learning unit. Some of the methods deployed included learning how to use fractions in a hands-on fashion. As well as doing standard fraction-related problems on paper, students were asked to make visual representations of fractions and use them to solve word problems.

Learning how to make unit conversions was one of the most useful skills learned by the students. Students were given problems similar to those they might cope with in daily life, such as converting standard measurements to the metric system and vice versa. Students also were given the task of painting an imaginary room, and were asked to scale 'up' the amount of paint it would take to cover the surface area, based upon the previous amount used for the smaller, similarly-shaped room.

Students were given problems involving distance, rate, and time. All of these were intended to show the applications of problem-solving activities in math in 'real life' and teach students that understanding math required more than merely manipulating equations.

Reasoning & proof

For all problems worked on in class or at home, students were required to show how they arrived at their answers. It was not enough to simply get the right answer -- the process had to be demonstrated correctly. Focusing on the process of solving a problem over getting the right answer was stressed, contrary to how mathematics is usually taught. Using a process-based teaching strategy underlines the fact that there are different, but equally valid ways of arriving at the same answer for a problem, although some methods are more efficient.

Depending on the learning orientation of the student (verbal, visual, spatial, or kinesthetic) some activities proved more effective for certain members of the class than others, so a variety of strategies were used to teach a single concept. For example, one kinesthetic activity entitled "Walk down the line" required the students…… [read more]


Online Field Trip Comprised of Visits Essay

Essay  |  2 pages (454 words)
Bibliography Sources: 4

SAMPLE TEXT:

The online field trip comprised visits to five online locations. Each website was related to the teaching and understanding of mathematics. The contents of the sites included information for teachers, parents, and students. Some sites were concerned with the testing of particular skills; others focused on providing relevant information to interested persons. The following paragraphs will provide a brief summary of the specific websites.

The first site visited was titled "Illumination: Resources for Teaching Math." This website was well organized and vivid. There were links for activities, lessons, standards, and other online math resources. The home page of the site provided easy access to some of the more relevant resources on the site. The material on this site is designed to make the teaching of math fun and enjoyable in the classroom.

Following the "Illumination" site, the next site visited was the "National Council of Teachers of Mathematics" (NCTM) website. This website was abuzz with a multitude of links and a wealth of information specifically for teachers. The resources and articles on this site focused on teacher development. From conferences to job opportunities, the professional development of the teacher was central to this site.

The website "A Math Dictionary for Kids" by Jenny Eather employed bold, bright attention grabbing colors. The website provided definitions to mathematical concepts at a level that children could easily grasp. Selecting a…… [read more]


Curriculum Design Essay

Essay  |  2 pages (580 words)
Bibliography Sources: 3

SAMPLE TEXT:

Curriculum Design

Mathematics -- Trigonometry

Grade

Spiritual Principle: "And he [Hiram] made a molten sea, ten cubits from the one rim to the other it was round all about, and...a line of thirty cubits did compass it round about....And it was an hand breadth thick...." -- First Kings, chapter 7, verses 23 and 26

This refers to the importance of studying exact measurements (see note) and utilizing mathematical principles to perform accurate calculations.

Students will apply the basic principles of trigonometry to investigate and explain the characteristics of rational functions. Students will apply these basic principles to understand why trigonometric measurement is necessary based on the limits of geometrical measurement. Students will understand basic principles of ratio, sine, cosine, and tangent. Students will be able to explain how trigonometry might be used in their daily lives. (Why is Trigonometry Important?, 2001).

Suggested Activities and Experiences-

1. Introduction to Trigonometric Principles -- To find relevancy, students need to see why trigonometry was invented and what questions it can answer. In addition, there are different problem solving skills necessary when dealing with trigonometric functions. Break students into groups so that there are at least 4 separate groups. Here is the problem:

You are in a group which is to abseil down a rock face tomorrow. Your task is to estimate the height of the face. You have no measuring instruments. You need to determine the height to know how much rope to take. You cannot take excess rope as you are at the start of a four day exercise and you must not have extra weight with you. Tomorrow morning you will walk the track which will take you to the top of the rock face.
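
One standard solution to the rock-face problem, sketched here, is to pace out a baseline and estimate an angle of elevation; the 30 m distance and 45-degree sighting are hypothetical inputs.

```python
import math

def height_from_angle(distance_m, elevation_deg):
    """Estimate a cliff's height from a paced-out horizontal distance and
    an estimated angle of elevation to the top: h = d * tan(theta)."""
    return distance_m * math.tan(math.radians(elevation_deg))

# e.g. standing 30 m from the base and sighting the top at about 45 degrees
print(round(height_from_angle(30, 45), 1))  # 30.0 m of rope needed
```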

Questions on board to read: What…… [read more]


Objective Map Essay

Essay  |  2 pages (629 words)
Bibliography Sources: 4

SAMPLE TEXT:

Mathematics

Grade 9

H.S School Curriculum

Lynchburg, VA

Spiritual Principle

To everything there is a season, and a time to every purpose under heaven (KJV Ecclesiastes 3:1-8)

Standard 1: Students will be able to explain the principles of, graph, and solve step and piecewise functions.

They will be able to convert absolute value functions into piecewise functions.

Standard 2: Students will be able to graph and solve exponential functions and use them to model and predict real life scenarios.

Standard 3: Students will be able to solve quadratic equations and inequalities in one variable. Students will be able to determine and graph the inverses of linear, quadratic and power functions, including restricted domains

Suggested Activities and Experiences

Standard 1:

1. Students will list the types of real world experiences that must be measured in terms of functions or rates of change over time, like changes in distance, temperature, and amounts of interest.

2. Students will find real world examples of piecewise functions in the newspaper and online, such as the rates of change of distance and speed, cell phone plans, and the value of buying in bulk, and then graph these scenarios while in class (McClain & Rieves 2010, p.12).

3. The class will be divided into two halves and given transparencies and markers: one half will graph a linear function, the other half a quadratic function. After graphing both on transparencies, students will lay the graphs together and see whether the combined graph demonstrates a new type of function (McClain & Rieves 2010, p.11).

Standard 2:

1. Students will use the principles of compound interest to solve real-world investment goals. For example, a student might ask how he or she can save a specific amount of money within a defined time period to meet a life goal. If he or she has the opportunity to invest in a financial instrument yielding a particular amount of…… [read more]
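
The compound-interest modeling described above can be sketched with the standard formula; the principal, rate, and term are hypothetical figures.

```python
def compound_amount(principal, annual_rate, years, periods_per_year=1):
    """Future value under compound interest: A = P * (1 + r/n)**(n*t)."""
    r, n = annual_rate, periods_per_year
    return principal * (1 + r / n) ** (n * years)

# How much does $1,000 grow to at 5% compounded annually over 10 years?
print(round(compound_amount(1000, 0.05, 10), 2))  # 1628.89
```

Students can invert the question, varying the principal or term until the output meets a savings goal, which is the exercise the standard describes.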


Students at the End of This Grade Term Paper

Term Paper  |  3 pages (944 words)
Bibliography Sources: 4

SAMPLE TEXT:

Students at the end of this grade level must be able to investigate and solve step and piecewise functions. This means that they must be able to write absolute value functions as piecewise functions. Characteristics of piecewise functions that students must be able to explain include domain, range, vertex, axis of symmetry, intercepts, extrema, and points of discontinuity. Students must show the ability to solve absolute value equations and inequalities analytically and graphically.

Standard 2: Students must be able to explore exponential functions. This includes the ability to extend properties of exponents to include all integer exponents. They must also be able to solve exponential equations and inequalities of relative simplicity both analytically and graphically. Students must demonstrate an understanding and ability to use basic exponential functions to model reality.

Standard 3: Students must be able to solve quadratic equations and inequalities in one variable. This involves finding real and complex solutions to mathematical equations by means of processes such as factoring, square roots, and the application of the quadratic formula. They must be able to analyze the nature of roots by means of technology and the discriminant. They must be able to describe their solutions by means of linear inequalities.
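
The quadratic-formula process in Standard 3 can be sketched as follows; using complex square roots lets one routine return both the real and the complex solutions, with the discriminant determining which occur.

```python
import cmath

def quadratic_roots(a, b, c):
    """Solutions of a*x**2 + b*x + c = 0 via the quadratic formula.
    The sign of the discriminant determines the nature of the roots."""
    disc = b * b - 4 * a * c
    root = cmath.sqrt(disc)  # complex sqrt handles negative discriminants
    return (-b + root) / (2 * a), (-b - root) / (2 * a)

print(quadratic_roots(1, -3, 2))  # real roots 2 and 1 (discriminant > 0)
print(quadratic_roots(1, 0, 1))   # complex roots 1j and -1j (discriminant < 0)
```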

Standard 4: Students must be able to explore inverses of functions. This includes a discussion of functions and their inverses, by means of concepts such as one-to-oneness, domain, and range. Students must demonstrate an ability to determine the inverses of linear, quadratic and power functions, including restricted domains. They must also be familiar with the use of graphs to determine functions and their inverses. Composition must be used to verify the relationship between functions and their inverses.

Standards for Grade Level 10

Standard 1: Students must be able to analyze higher-degree polynomial function graphs. This means that they must be able to graph simple polynomial functions and understand the effects of elements such as degree, lead coefficient, and multiplicity of real zeros on the graph. Students must also be able to determine the symmetry of polynomial functions in terms of their nature as even, odd, or neither. They must also demonstrate an ability to explain polynomial functions by referring to elements such as domain and range, intercepts, zeros, relative and absolute extrema, and end behavior.

Standard 2: Students at the end of this grade level must show an ability to explore and understand logarithmic functions as inverses of exponential functions. This includes the definition and understanding of nth root functions, as well as extending the properties of exponents to include rational exponents. Students must be able to extend the laws of exponents in order to understand and use the properties of logarithms.

Standard 3: Students must be able to analyze various equations and inequalities by finding real and complex roots of higher-degree polynomial equations. They must demonstrate an ability…… [read more]


Statistics Teaching Measures Essay

Essay  |  3 pages (981 words)
Bibliography Sources: 3

SAMPLE TEXT:

Teaching Measures of Central Tendency

This paper provides a descriptive narration of measures of central tendency (the mean, the median, the mode, the weighted mean, and distribution shapes), with solved examples to illustrate each measure. As the paper describes, measures of central tendency form a category of descriptive analysis that uses a single value to describe the central representation of a dataset, making them a useful analytical tool. Because of the disparities among different data sets, the mean or average by itself may not provide the needed information about the distribution of the data. The different measures of central tendency together give adequate information concerning the distribution of a data set, and it is therefore important to understand them.

Teaching Measures of Central Tendency

Measures of central tendency form one of the two categories of descriptive statistics. They use a single value as a central representation of a data set, which is important in statistical analysis because a large set of data can be represented by a single value. Within this category, several methods can represent this central part, including the mean, median, mode, weighted mean, and distribution shapes. These methods are crucial in statistical analysis as they summarize any data set of interest.

First, we examine the mean as a measure of central tendency. The most commonly used measure, also known as the average, it is calculated by summing all values in a selected sample and dividing the total by the number of observations. Depending on whether the sample mean or the population mean is desired, the formula can differ slightly; either way, the result is a central representation of the data set. For instance, if a data set comprises the five observations 2, 7, 4, 9, and 3, the mean is obtained by summing all observations (2 + 7 + 4 + 9 + 3 = 25) and dividing by the number of observations: mean = 25/5 = 5. Therefore, the mean of the five observations is five (Donnelly, 2004, p. 46).

The next measure is the weighted mean. Unlike the ordinary mean or average, which allocates equal weight to every observation, the weighted mean gives the flexibility to allocate more weight to certain values than to others. For example, consider the scores of a student in three components of a course, where the exam carries 50% of the weight, the practical 30%, and the homework the remaining 20%. If this student scores 80, 70, and 65 in the exam, practical, and homework respectively, then the weighted mean is obtained by summing the products of each score and its respective weight, then dividing the result by the total of the weights: weighted mean = ((50*80) + (30*70) + (20*65)) / (50 + 30 + 20) = 74 (Salkind,…… [read more]
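
The worked example above can be sketched directly in code, reusing the text's scores and weights:

```python
def weighted_mean(values, weights):
    """Sum of value*weight products divided by the sum of the weights."""
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)

# The text's example: exam 80 (weight 50), practical 70 (30), homework 65 (20)
print(weighted_mean([80, 70, 65], [50, 30, 20]))  # 74.0

# With equal weights, the weighted mean reduces to the ordinary mean
print(weighted_mean([2, 7, 4, 9, 3], [1, 1, 1, 1, 1]))  # 5.0
```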


Chi-Square Analysis: The History, Development, and Applicability of a Common Statistical Tool Term Paper

Term Paper  |  3 pages (888 words)
Bibliography Sources: 3

SAMPLE TEXT:

Chi Square

An Overview of Chi-Square Analysis: The History, Development, and Applicability of a Common Statistical Tool

There are many different types of information available in the world, and each type can be utilized in very different and highly specific ways depending on both the form of the information and the needs of those utilizing it. In some perspectives, information is classified into two broader types: quantitative and qualitative. Quantitative information is information that can essentially be boiled down to numeric form; it arises from either counting or measurement, yielding discrete or continuous data points that can be further analyzed and manipulated to produce deeper understandings of quantifiable phenomena and events. Qualitative data, on the other hand, cannot be reduced to numbers and must be analyzed through other means. Statistics has developed as a field of mathematics that enables researchers to analyze both quantitative and qualitative information in ways that allow for their comparison and analysis.

The Chi-Square analysis is one statistical tool that has been developed for analyzing and manipulating qualitative data. Specifically, the Chi-Square method was developed to compare categorical data and determine what type of relationship exists between different qualitative variables (HWS 2010). A drug trial, for instance, might compare the number of people receiving a drug with the rates at which their symptoms improved, relative to another group not taking the drug; the Chi-Square test would be a necessary tool in determining the drug's true efficacy.
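
The drug-trial comparison just described can be sketched as a Pearson chi-square computation on a 2x2 contingency table; the counts below are hypothetical.

```python
def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table
    (rows: drug vs. control; columns: improved vs. not improved)."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            # Expected count under independence of treatment and outcome
            expected = row_totals[i] * col_totals[j] / grand
            stat += (observed - expected) ** 2 / expected
    return stat

# Hypothetical trial: 60/100 improved on the drug vs. 40/100 on control
print(chi_square_2x2([[60, 40], [40, 60]]))  # 8.0
```

A statistic of 8.0 exceeds the conventional 3.84 critical value for one degree of freedom at alpha = .05, so these hypothetical counts would suggest a real association.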

There are actually several different types of Chi-Square analysis that can be utilized, depending on the needs and scope of the research, but the most common of these is Pearson's Chi-Square test. Karl Pearson was a scientist, philosopher, and mathematician of some renown both during and after his day, and his development of a specific method for analyzing the goodness of fit of a sample distribution, and for testing the independence of certain variables or phenomena (as in the drug trial example given above), is only one of his contributions to the worlds of science and data analysis (Plackett 1983). In 1900, he began working with the Chair of Zoology at University College London, who supplied a great deal of data to Pearson at a time when his decade of work on correlation (methods of determining the degree to which separate observations occur together or specifically in each other's absence, suggesting some relationship) and regression analyses (determining the relationship(s) between one or more independent variables and a dependent variable) was culminating into the method…… [read more]


Geometry Manipulative Essay

Essay  |  2 pages (586 words)
Style: APA  |  Bibliography Sources: 1

SAMPLE TEXT:

Geometry Manipulative

Elementary Geometry Manipulative

Complex math problems can be difficult to introduce to elementary students. Yet there are many patterns within mathematics that, if explained properly, can be learned by young, eager minds. It is with this in mind that this geometry lesson aims to teach angle relationships to fifth graders.

The math level being explored is that of the fifth grade. This is an old enough age to begin implementing algebraic and geometrical concepts within the curriculum. Within this grade level, there are three major standards presented by the National Council of Teachers of Mathematics (NCTM): multiplicative thinking, equivalence, and computational fluency (National Council of Teachers of Mathematics, 1989). Thus, it is a perfect age for the beginning basics of geometry. Understanding the formula for finding missing degrees of angles seems very simple but needs a clear and concise explanation. Therefore, within this lesson plan, the concepts of angles, degrees, and the relationships between parallel lines and their corresponding angles will be introduced alongside the corresponding algebraic strategy for finding missing variables. In working with the unknown variable, x in most cases, students begin to understand equivalence by using x as the value that completes a specific equation. For example, if you know one angle within a split sector, you can find the other with the knowledge that the two sum to 180 degrees. Thus, the known angle plus the unknown (x) will equal 180 degrees. This concept will satisfy the beginning workings of multiplicative thinking, equivalence, and computational fluency. In order for students to grasp this concept they will need to work with the provided handout and their pencils.
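The angle relationship described above reduces to a one-line equation. A minimal sketch, where the 110-degree value is an arbitrary example:

```python
# Finding a missing angle: two angles on a straight line sum to 180 degrees,
# so the unknown x satisfies known + x = 180. The lesson has students solve
# for x by subtraction.

def missing_angle(known_degrees):
    """Solve known + x = 180 for x (angles on a straight line)."""
    return 180 - known_degrees

x = missing_angle(110)  # if one angle is 110 degrees...
# ...the other must be 70 degrees
```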

After practice with this handout, students should be able to grasp the geometrical…… [read more]


Framing the Research Problem: Basic Steps Essay

Essay  |  3 pages (875 words)
Style: APA  |  Bibliography Sources: 1

SAMPLE TEXT:

Framing the Research Problem: Basic Steps

The specific steps undertaken when framing a research problem for a study will vary with the type of discipline, subject area of research, and the level of accuracy demanded of the research. For example, a small exploratory study designed to see if there is a market for a new fitness studio in a suburban area will demand a different level of rigor than a statistical study designed to see if a new drug has dangerous side effects within certain demographic populations. However, broadly speaking, the steps of the research process are as follows (Marketing research, 2009, Quick MBA):

Step 1: Define the problem

The problem must be framed in a clear question format, and the data the research is attempting to accumulate must provide a reasonable answer to that question. For example 'is there a statistically significant correlation between hours of television watched and a child's BMI (Body Mass Index)' or 'what characteristics do mothers say influence their breakfast cereal choice when shopping for the family' are both examples of research-based questions.

Most research is framed in terms of a null hypothesis: in other words, the tested statement is the opposite of what the researcher actually wants to prove. In the case of a study regarding television watching and childhood obesity, the null hypothesis might state that there is no correlation between the hours of television a child watches and the likelihood that the child's BMI will be in the overweight or obese range. The null hypothesis often states conventional wisdom or the status of the control group.
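As an illustration of the television/BMI question, the sketch below computes a sample Pearson correlation with Python's standard library; the data points are invented, and a real study would also check the coefficient's statistical significance before rejecting the null hypothesis.

```python
# Under the null hypothesis, the correlation between TV hours and BMI is zero.
# The five (hours, BMI) pairs below are invented for illustration only.
import math

def pearson_r(xs, ys):
    """Sample Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

tv_hours = [1, 2, 3, 4, 5]
bmi      = [17, 18, 20, 22, 25]
r = pearson_r(tv_hours, bmi)
# A value of r near 0 is consistent with the null hypothesis; a value near
# +1 or -1 (checked against a significance table) would lead us to reject it.
```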

Step 2: Determine research design

Is the research merely designed to describe a specific phenomenon, such as the average age or weight of a consumer of fast food, in the form of descriptive research? Or is it designed to explore possible reasons for the statistical tendency and take the form of exploratory research? Exploratory research might follow a particular population for a period of time to suggest a correlation, such as between obesity and number of fast food restaurants located near a child's school. A causal research design that aims to show a clear cause-and-effect relationship demands a more narrow study design, and usually a control group. It strives to eliminate other possible variables that could influence the outcome: for example, children who live in areas with many fast food establishments near their school might have less access to other leisure-time pursuits because of poverty and a poor diet at home -- factors beyond the location of fast food restaurants might be more of a cause, rather than the availability of fast food alone. More fast food restaurants…… [read more]


Kde and Kme Kernel Density Estimation (Kde) Term Paper

Term Paper  |  8 pages (2,601 words)
Bibliography Sources: 1+

SAMPLE TEXT:

KDE and KME

Kernel Density Estimation (KDE)

Abstract -- Kernel Density Estimation (KDE) is also known as the Parzen Window Method, after Emanuel Parzen, the pioneer of kernel density estimation. Density estimation entails constructing an estimate based upon observed data, where the underlying probability density function cannot be observed. A kernel in turn is used as a weighting function… [read more]
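A minimal sketch of the Parzen-window idea, assuming a Gaussian kernel and an arbitrary bandwidth and data set: each observation contributes a small Gaussian "bump," and the density estimate at a point is the average of those bumps.

```python
# Gaussian kernel density estimate (Parzen window). The bandwidth and the
# observation list are illustrative assumptions, not from the paper.
import math

def gaussian_kernel(u):
    """Standard normal density, used as the weighting function."""
    return math.exp(-0.5 * u * u) / math.sqrt(2 * math.pi)

def kde(x, data, bandwidth):
    """Estimate the density at x from the observed data."""
    n = len(data)
    return sum(gaussian_kernel((x - xi) / bandwidth) for xi in data) / (n * bandwidth)

observations = [1.0, 1.5, 2.0, 2.2, 3.1]
density_at_2 = kde(2.0, observations, bandwidth=0.5)
```

Because each kernel integrates to one, the estimate itself integrates to one over the real line, as a density must.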


Healthcare Practitioners as Well as Other Professionals Essay

Essay  |  2 pages (557 words)
Style: APA  |  Bibliography Sources: 1

SAMPLE TEXT:

Healthcare practitioners as well as other professionals must know how to deal with statistical data in order to do their jobs on a daily basis. As Rumsey (2003) points out, professionals are presented with statistical data and claims constantly and they must be able to understand how such claims are formulated and whether they are accurate in order to decide what to do about the information presented in such claims. This brief paper will outline some of the most important factors that professionals must understand and apply in order to make practical use of statistics in their work obligations.

Perhaps the first and most important information a professional must have about statistical claims is how the data were gathered and what methodologies were used to crunch the numbers. While the practitioner doesn't necessarily need to know how data were coded or what formulas were used to analyze results, a basic understanding of both factors will help the practitioner see if there are any red flags in the data. For example, studies making claims about the etiology of diseases should be performed under controlled conditions with suitably large and varied populations to ensure that the data are accurate. A study that relies on self-reporting of symptoms in the form of a survey may be adequate for an exploratory study, but not for making determinations about scientific bases for disease or treatment. Therefore the practitioner must understand the difference between quantitative and qualitative research, and must know that quantitative research, when conducted with appropriate controls and adequate methodologies, can make stronger claims about causal factors.

Rumsey points out the most important statistical measures the practitioner must understand and apply…… [read more]


Geometry Proof Research Proposal

Research Proposal  |  5 pages (1,680 words)
Bibliography Sources: 10

SAMPLE TEXT:

Geometry Proof

Geometry as a subject learned in school has a primary purpose, and that is to improve the ability of students to reason logically. Logical reasoning is one of the most vital things that a student can learn, not only for mathematics, but for many of the issues that he or she will face in life (Discovering, 2009). Without… [read more]


Psychological Research "It Is Difficult to Turn Thesis

Thesis  |  6 pages (1,904 words)
Style: APA  |  Bibliography Sources: 10

SAMPLE TEXT:

¶ … Psychological Research

"It is difficult to turn the pages of a newspaper without coming across a story that makes an important claim about human nature" (American Psychological Association, 2003, par. 1).

Often, we come across specific claims about individual behavior, nature, principles, and/or dynamics which we might find interesting. These articles often cite research studies conducted on the… [read more]


Size in the Field of Statistics Thesis

Thesis  |  2 pages (758 words)
Style: APA  |  Bibliography Sources: 1

SAMPLE TEXT:

¶ … Size

In the field of statistics, the term effect size is used to refer to the degree of relationship between two variables. Quite simply, it is the size of the effect that one thing has on another. There are many different examples of effect size that we encounter in our daily lives; it is a comparison and judgment we are so used to making that it appears like second nature. Think about the last time you saw a commercial for a product that advertised itself as "30% more effective than the leading brand," or made some similar "more-than" claim. This is a very direct and open use of effect size -- or at least claimed effect size -- to make what the advertisers want you to believe is a mathematical point. They are basically saying that their product, whatever it is, has a bigger effect size on whatever that product is intended to do. For instance, if a commercial for Brand X weight-gain powder for body builders claimed it was 20% more effective than Brand Y, they would be saying that their powder makes your muscles grow 20% more than the other powder -- that their powder has a larger effect size on muscles.

Though understanding effect size is relatively simple, understanding the mathematical formula behind it can be a little trickier. There are actually many different ways to measure effect size, some of them more reliable for certain cases than others. In general, effect size applies to the meta-analysis aspects of statistics. This means it is used to analyze the analysis, in a way -- while other data is analyzed to establish a correlation, effect size is used to measure the strength or degree of that correlation -- or rather, effect size is the measure of that correlation. According to Professor Becker's overview of effect size on the University of Colorado website (2000), one of the most commonly used measures of effect size is Cohen's d (section II). The "d" stands for difference, and this measure is used to measure effect size between two independent groups of data points. The formula for calculating Cohen's d is (M1-M2)/s, where M1 is the mean of the first set of data points, M2 is the mean of the second…… [read more]
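The formula quoted above, (M1-M2)/s, can be sketched directly; here s is taken as the pooled standard deviation (one common convention), and the two groups of measurements are invented for illustration.

```python
# Cohen's d: difference between two group means divided by a standard
# deviation, here the pooled SD. The data sets are hypothetical.
import math

def cohens_d(group1, group2):
    """Cohen's d with the pooled standard deviation as the scale s."""
    n1, n2 = len(group1), len(group2)
    m1, m2 = sum(group1) / n1, sum(group2) / n2
    ss1 = sum((x - m1) ** 2 for x in group1)
    ss2 = sum((x - m2) ** 2 for x in group2)
    pooled_sd = math.sqrt((ss1 + ss2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

brand_x = [12, 14, 15, 13, 16]  # hypothetical muscle-gain measurements
brand_y = [10, 11, 13, 12, 9]
d = cohens_d(brand_x, brand_y)
# By Cohen's rough conventions, d near 0.2 is small, 0.5 medium, 0.8 large.
```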


Algebra in Daily Life it Strange Essay

Essay  |  2 pages (718 words)
Bibliography Sources: 0

SAMPLE TEXT:

Algebra in Daily Life

It is strange, though kind of comforting, to think of the many things in our lives that math in general and algebra specifically are so involved in. Strange because we don't often have to do the mathematical operations involved in order to do our daily tasks and go through our routine, and comforting because the concrete and unchanging nature of numbers adds some certainty to this world that can so often seem chaotic and entirely ungrounded. Even if algebra can't predict what will happen to oil prices and mortgage meltdowns, it can at least provide us with an explanation of what's happening and how it's happening -- and maybe even why.

I will leave this kind of math to the economists and the members of the Federal Reserve board, however; I have neither the know-how nor the inclination to become involved in that mountain of numbers. Still, there are plenty of smaller ways in which numbers play a role in my every day life. One of those ways is my bicycle. I ride my bicycle almost everywhere, and several times I have had minor breakdowns on the road. These incidents have given me a basic understanding of the way my bike and its various gears move me around, and the functions of the bike and its gears can be expressed algebraically. The actual equations that describe the bike's travel would be quite complex and would require a great deal of measurement and experimentation, but the basic equations that would be needed to calculate effort, speed, and travel time for various distances in various gears can be simulated in a simple thought experiment, using simple numbers.

First, a basic description of the bike is needed. I own a twenty-one speed mountain bike, but around town I'm usually on my three-speed cruiser, and for the sake of simplicity, let's look at the equations pertaining to that bike. Terms that need defining in terms of assigning numerical value are the tire circumference (which is also the distance traveled per revolution of the tire), the number of revolutions of the tire that result from each turn of the pedals, and effort…… [read more]
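The thought experiment described above might look like this; every number (tire circumference, gear ratios, cadence) is an assumption chosen for simple arithmetic, not a measurement from any actual bike.

```python
# A simple model of the three-speed cruiser: distance per pedal turn is the
# tire circumference times the gear ratio (wheel revolutions per pedal turn).
# All values below are invented for illustration.

tire_circumference_m = 2.10              # distance per wheel revolution, meters
gear_ratios = {1: 1.5, 2: 2.0, 3: 2.5}   # wheel revs per pedal revolution

def distance_per_pedal_turn(gear):
    """Meters traveled for one full turn of the pedals in a given gear."""
    return gear_ratios[gear] * tire_circumference_m

def travel_time_minutes(distance_m, gear, cadence_rpm):
    """Minutes to cover a distance at a steady pedaling cadence."""
    return distance_m / (distance_per_pedal_turn(gear) * cadence_rpm)

# In third gear at 60 pedal revolutions per minute, a 1 km trip takes:
minutes = travel_time_minutes(1000, 3, 60)
```

Higher gears cover more ground per pedal turn but demand more effort, which is exactly the trade-off the essay goes on to discuss.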


Group Spending Comparison Between British, German, French Term Paper

Term Paper  |  5 pages (1,291 words)
Style: Harvard  |  Bibliography Sources: 2

SAMPLE TEXT:

Group Spending Comparison Between British, German, French, And Italian Consumers

From the results, we conclude that the Germans, French, and Italians outspend the British in groups, but that the variance is higher amongst Germans. This is shown by the higher upper limit amongst Germans as compared to others. Germans have the highest standard deviation and standard error, which shows that there is more variance than amongst other nationalities. Since the means fall near the medians, we can say that our sample means are representative.

Task 1(b) - Individual Spending

[Table: individual spending by nationality (British, German, French, Italian), with rows for Mean, Sample Standard Deviation, Standard Error, Estimate of Mean, Upper Limit, and Lower Limit; the numerical values were not preserved in this excerpt.]

Comments: The French and Italians have the highest means, while the British and the Germans are close together with lower spending per person. The variances, however, are much higher for the Germans than for the British, indicating that there may be a subset of higher spenders. The same is true for the French, which could mean a skew toward the higher or lower spending range. This difference between Germans and Brits is supported by the higher upper limit for Germans. The French and Italians seem to uniformly spend more, as evidenced by their mid-sized SD and relatively high lower limit and relatively low upper limit. The spread in the German sample data is higher, as shown by the higher standard deviation.

Task 1(c) - Difference in Means of Group Spending

Group Spending

British

German

French

Italian

Mean

Sample SD

Comments: As we can see from the given results, the z-score lies within +/- the critical number of standard deviations for all the values, which means that the null hypothesis is accepted for each pair lying within the 95% confidence limit.

Task 1(d) - Difference in Means of Individual Spending

[Table: pairwise z-tests for the difference in mean individual spending between British, German, French, and Italian consumers, with sample SDs and standard errors for each nationality; the numerical values were not preserved in this excerpt.]

Comments: The above results demonstrate that the Z-score values exceed the critical value for all pairs except FI, BI, BF and BG. For FI, BI, BF and BG the null hypothesis is accepted; for all the other pairs it is rejected at the 95% confidence limit.
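Since the table values were not preserved in this excerpt, the sketch below only illustrates the form of the two-sample z-test the comments refer to; the means, standard deviations, and sample sizes are invented stand-ins.

```python
# Two-sample z-test for a difference in means, of the kind used to compare
# spending between each pair of nationalities. All inputs are hypothetical.
import math

def two_sample_z(m1, s1, n1, m2, s2, n2):
    """Z statistic for the difference between two sample means."""
    standard_error = math.sqrt(s1 ** 2 / n1 + s2 ** 2 / n2)
    return (m1 - m2) / standard_error

# e.g. hypothetical German vs. British mean spending
z = two_sample_z(550.0, 120.0, 50, 515.0, 90.0, 50)
# At the 95% confidence level the two-sided critical value is 1.96: reject
# the null hypothesis of equal means only if |z| exceeds it.
```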

Task 1(e) - Regression

Comments: The regression results above show that if no members of a given nationality go on holiday, the baseline expenditure for the respective family will be 515.80, 550.69, 545.70 and 617.42 for the respective nationalities, as given in the table. If they do go on holiday, the expenditure will to a large extent be determined by the slope of the regression equation.

Task 1(f) - Correlation

Comments: The R-squared coefficients show that 71%, 71%, 66% and 67% of the variation may be predicted by changes in actual family size for the respective nationalities, while the remaining 29%, 29%, 34% and 33% is unexplained. The values of the t-statistics for the intercept and slope indicate that they cannot be zero, and for each nationality the regression equation can be… [read more]


Ethnomathematics: Mathematics and Culture Term Paper

Term Paper  |  2 pages (741 words)
Style: APA  |  Bibliography Sources: 4

SAMPLE TEXT:

Ethnomathematics

What is "ethnomathematics," and what role should the study of indigenous counting systems play in the teaching of number and numeration?

Ethnomathematics, as its name suggests, is the study of the interaction between mathematics and culture. Ethnomathematics' most obvious application in elementary school classes may be in social studies units. Students can study how the development of different mathematical methods enabled the construction of various architectural structures that changed the way people lived and worshipped, like the pyramids. Also, the study of mathematics can be integrated into the study of history, as the development of Arabic numerals facilitated the creation of algebra. Mathematics classes may make use of word problems involving students of many ethnic backgrounds or include units such as examining the concept of slope in the designs of Navajo blankets, a technique used by one teacher in his curriculum (Fugit & Smith, 1995).

However, the application of ethnomathematics can be much broader. "Ethnomathematics is the study of mathematical techniques used by identifiable cultural groups in understanding, explaining, and managing problems and activities arising in their own environment" (Patterson, 2005). For example, the manner in which "professional basketball players estimate angles and distances differs greatly from the corresponding manner used by truck drivers. Both professional basketball players and truck drivers are identifiable cultural groups that use mathematics in their daily work. They have their own language and specific ways of obtaining these estimates and ethnomathematicians study their techniques" (Patterson, 2005). Likewise, the practical physics used by engineers is quite different from the theoretical physics explored by physicists in academia. Although ethnomathematics' use of indigenous counting techniques is often assumed to be non-Western in style, indigenous subgroups within Western society also exist. Approaching math from this practical perspective also provides a very concrete answer to the frequent complaint of many children that math has no application to 'real' life.

The importance of ethnomathematics is perhaps best illustrated by examining the origins of the word more closely. Broken down, the word "ethno" refers to culture, and culture refers to national as well as tribal status, professional status, and even age, in deference to Piaget's exploration of how children of various ages have different perceptions of depth and mass (Patterson, 2005). Culture…… [read more]


Browse the PA State Standards and Select Term Paper

Term Paper  |  4 pages (1,268 words)
Style: APA  |  Bibliography Sources: 2

SAMPLE TEXT:

Browse the PA state standards and select the standards on which you would like to base your unit. In a separate document, write two to three paragraphs explaining how your unit of instruction supports local guidelines and student academic content standards. Remember to submit this with your task.

Geometry:

Construct figures incorporating perpendicular and parallel lines, the perpendicular bisector of a line segment and an angle bisector using computer software.

Draw, label, measure and list the properties of complementary, supplementary and vertical angles.

Classify familiar polygons as regular or irregular up to a decagon.

Identify, name, draw and list all properties of squares, cubes, pyramids, parallelograms, quadrilaterals, trapezoids, polygons, rectangles, rhombi, circles, spheres, triangles, prisms and cylinders.

Construct parallel lines, draw a transversal and measure and compare angles formed (e.g., alternate interior and exterior angles).

The standards on which I wish to base my unit are those that apply to geometry for grade 8 mathematics. Geometry is one of the most crucial components of the 8th grade curriculum because it supports understanding of higher mathematics at the high school level. Not only is it a very basic component of understanding calculus and linear algebra, but it is the fundamental basis for most science and computer technology classes as well.

The unit that I will create focuses on exploring the properties of the polygon. The polygon has many unique properties and it is a very important unit because it shows students that the squares, rectangles, and triangles that they are so familiar with fit within the framework of a greater geometrical understanding, polygons. This essentially ties together all of the random "shapes" that they have had to master into a unified rule for application. According to Pennsylvania standards, students need to be able to perform several functions for polygons: they have to be able to classify polygons as regular or irregular up to the decagon, and to identify, name, and draw many different polygons and list their properties. The unit that I will construct focuses on teaching students the tools necessary for constructing and understanding the properties of polygons. It shows the universality of these shapes and how they can be constructed in accordance with their specific characteristics. My focus will be on understanding the "universal" application of polygons and then applying them to specific shapes so that students do not engage in "memorization" so much as understanding of the root concept of polygons.
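One example of the kind of "universal" polygon rule such a unit can be built around is the interior-angle-sum formula, which covers every shape from triangle to decagon with a single expression:

```python
# The sum of the interior angles of an n-sided polygon is (n - 2) * 180
# degrees -- one rule that unifies all the "random shapes" students know.

def interior_angle_sum(sides):
    """Total of the interior angles of a polygon with the given side count."""
    return (sides - 2) * 180

def regular_interior_angle(sides):
    """Each interior angle of a REGULAR polygon with the given side count."""
    return interior_angle_sum(sides) / sides

# triangle, quadrilateral, decagon
sums = [interior_angle_sum(n) for n in (3, 4, 10)]
```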

My unit fits squarely into the purpose of the PA standards for 8th grade math, and specifically addresses the need for students to have strong geometry experience going into high school. Therefore this unit is critical for the success of students in general, especially when focusing on core understanding.

B. Write four instructional goals for your unit (two for each lesson plan). Enter the goals in the Objectives field in the Unit and Lesson Builder templates.

Unit: Understanding the properties of a polygon (Geometry)

Lesson Plans:

Focus on polygons

Goals:

Understand what makes an object a… [read more]


Statistical Analysis of Restaurant Patrons Term Paper

Term Paper  |  4 pages (1,636 words)
Bibliography Sources: 1

SAMPLE TEXT:

Statistical Analysis of Restaurant Patrons

What type of research question (ie: descriptive, comparative, relationship) is being asked by the researchers?

The research question being asked by the researchers is that of comparing the expression on a patrons' face in a restaurant being a predictor of the percentage of tip given to their waiter or waitress. The researchers are using a… [read more]


History of Pi Term Paper

Term Paper  |  2 pages (749 words)
Style: MLA  |  Bibliography Sources: 3

SAMPLE TEXT:

Greek Letter Pi Equations and Notations

Some of the most complex ideas and concepts came from the earliest history of mankind. For example, the notion of the Greek letter pi, or the ratio between a circle's circumference and diameter, stems back to early biblical times.

Algebra began its development in both the nations of Egypt and Babylonia about 1650 BC. However, historians remain uncertain as to whether or not new ideas traveled between these two countries. Written relics such as papyri and the Hammurabi clay tablets of this time indicate that algebra in Egypt was less sophisticated than that in Babylonia (Gullberg, 1997), in part because it had a more primitive numeral system. It is also believed that the Babylonian influences spread to Greece, 500 BC to 300 BC, then to the Arabian Empire and India, 700 AD, and finally to Europe, 1100 AD (Baumgart, 1969).

The equations and notations that are applied today were first used around 1700 BC and standardized by about 1700 AD, primarily because of the invention of the printing press in 1450 and the ability of scholars to easily travel from one location to another. This helped spread ideas across the continents. However, there has never been complete consistency of algebraic notation, and differences are still found in various areas of the world. For instance, many Americans use a period with decimals and Europeans use a comma, writing the approximation of pi as 3.14 or 3,14 respectively (Baumgart, 1969).

The concept of pi is also found in the Bible's Old Testament. For example, 1 Kings 7:23 says: "Also he made a molten sea of ten cubits from brim to brim, round in compass, and five cubits the height thereof; and a line of thirty cubits did compass it round about" (Blatner, 13), implying, perhaps, that pi = 3. Scholars have debated this verse for centuries, and they are not much closer to knowing the truth now. Some people believe it is just an approximation, and others argue that "... the diameter perhaps was measured from outside, while the circumference was measured from inside" (Tsaban, 76).

According to Tsaban (78), most of these scholars do not notice another use of pi that is more helpful: In Hebrew, each letter equals a certain number,…… [read more]


Statistical Language Term Paper

Term Paper  |  2 pages (644 words)
Style: APA  |  Bibliography Sources: 2

SAMPLE TEXT:

¶ … larger population of cases. Term used to represent the population under study.

Population- set of cases from which a sample is drawn and to which a researcher wants to generalize.

Frequency- symbolized by f, this is the number of cases with a particular value of a variable, or values of two or more variables.

Measures of Central Tendency- representative values for a set of scores that are considered averages of univariate information.

Mean- arithmetical average of all scores; the sum of cases divided by the number of cases.

Median- value that divides an ordered set of scores in half.

Mode- most frequently occurring score on a variable.

Measures of dispersion- the distribution of statistical frequency; distribution about an average or median

Standard deviation- measure of variation in scores. It is also the square root of the variance.

Range- extent of the frequency distribution; the difference between the minimum and maximum value in a frequency distribution

Variance- square of standard deviation; statistical measure of the spread or variation of a group of values in a sample

Standard error- the standard deviation of a sampling distribution.

Descriptive statistics- refers to methods for summarizing information so that information is more intelligible, more useful or can be communicated more effectively.

Inferential statistics- refers to procedures used to generalize from a sample to the larger population and to assess the confidence we have in such generalizing.

Independent variable- variable determining the value of others; the variable in a mathematical statement whose value, when specified, determines the value of another variable or other variables

Dependent variable- an element in a mathematical expression that changes its value according to the value of other elements present

Confounding variable- variable that may be confused for the independent variable; commonly makes researchers fail to distinguish between the independent variable and confounding variable

Sampling- the process of selecting a sample group to be used as the representative or the random…… [read more]
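Several of the measures defined above can be demonstrated on a small invented sample with Python's standard library:

```python
# Measures of central tendency and dispersion from the glossary, computed on
# an illustrative set of scores.
import statistics

scores = [2, 4, 4, 4, 5, 5, 7, 9]

mean = statistics.mean(scores)            # arithmetical average of all scores
median = statistics.median(scores)        # middle value of the ordered set
mode = statistics.mode(scores)            # most frequently occurring score
value_range = max(scores) - min(scores)   # maximum minus minimum
variance = statistics.pvariance(scores)   # square of the standard deviation
std_dev = statistics.pstdev(scores)       # square root of the variance
```

Note how the definitions interlock: the standard deviation here is exactly the square root of the variance, as the glossary states.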


Derivatives Calculus Term Paper

Term Paper  |  2 pages (591 words)
Style: MLA  |  Bibliography Sources: 2

SAMPLE TEXT:

Mathematics: Derivatives

Derivatives: an Explanation

"Derivative" is a mathematical answer to the question, "how quickly does it change?" For instance, if one noted that the national debt was changing rather quickly, one could also say that the national debt had a high derivative. If one specified and went on to say that the national debt was rising rather quickly, one could also say that the national debt had a high, positive derivative. It follows that if the national debt were falling rather quickly (although that is unlikely to happen), one could also say that the national debt had a high, negative derivative.

When working with derivatives, it is important to avoid ambiguity. While most would assume that a high derivative was positive, the word "high" is not mathematically defined. For that reason, a certain vocabulary should be used when working with derivatives to ensure effective communication. The words "high" and "low" should be discarded in favor of well-defined terms like "negative" (below zero) and "positive" (above zero).

Establishing that vocabulary raises the question: what if the derivative is zero? If a derivative is the answer to the question, "how quickly does it change," and the answer is zero, that must mean it didn't change at all. Therefore, if one were to say that the national debt was stable, or not changing, then one could also say that it had a derivative of zero.

Using some basic concepts from algebra, another definition for "derivative" can be reached. A common tool in algebra is a graph, a system that plots points based on their value. Each point has two values, labeled "X" and "Y" respectively, and the point is located "X" units to the right (if positive) or left (if negative) of the origin…… [read more]
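The "how quickly does it change?" question can be made concrete with a numerical sketch; the function and step size below are arbitrary illustrations, not from the essay.

```python
# A derivative approximated numerically: the slope of f over a tiny
# symmetric interval around x.

def numerical_derivative(f, x, h=1e-6):
    """Approximate f'(x) with a symmetric difference quotient."""
    return (f(x + h) - f(x - h)) / (2 * h)

def f(x):
    return x ** 2  # f'(x) = 2x, so the derivative at x = 3 should be near 6

slope_at_3 = numerical_derivative(f, 3.0)   # positive: f is rising at x = 3
slope_at_0 = numerical_derivative(f, 0.0)   # zero: f is momentarily flat
```

The signs match the essay's vocabulary: positive where the quantity rises, negative where it falls, and zero where it is stable.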


William Gosset Term Paper

Term Paper  |  3 pages (960 words)
Bibliography Sources: 1+

SAMPLE TEXT:

William Gosset

William Sealy Gosset was one of the leading statisticians of his time, particularly for his work on the concept of standard deviation in small samples. His theories, which were published under the name of "Student," are still used today in both the study of statistics and its practical application.

Gosset was born on June 13, 1876, in Canterbury, England, to Colonel Frederic Gosset and Agnes Sealy Vidal. Gosset was well educated from the beginning, first at Winchester, a prestigious private school, then at New College, Oxford. He received his degree in mathematics in 1897, followed two years later by a degree in chemistry (O'Connor and Robertson). It was the combination of these two fields of study that gave Gosset a career and an opportunity to create his theory.

Upon graduation, Gosset was hired as a chemist by the Arthur Guinness and Son Company in Dublin. Working in the brewery required Gosset to constantly attempt to find the best varieties of barley for use in the production of Guinness. This was a complicated procedure of taking small samples to determine the best quality product. Gosset continuously played around with the results of various samples of barley in order to find ones of the best quality with the highest yields that were capable of adapting to changes in soil and weather conditions. Much of his work was trial and error both in the laboratory and on the farms, but he also spent time with Karl Pearson, a biometrician, in 1906-07 at his Galton Eugenics Laboratory at University College (O'Connor and Robertson).

Pearson assisted Gosset with the mathematics of the process. Gosset published his findings under the name of "student" because the brewery would not permit him to publish. The brewery feared that trade secrets would get out if information about the brewing process was published. Consequently, Gosset had to assume a pseudonym even though his information would not have impacted the business in the way the brewery was concerned (O'Connor and Robertson).

Gosset published his work in an article called "The Probable Error of a Mean" in a journal operated by Pearson called Biometrika. As a result of Gosset's pseudonym, his contribution to statistics is called the Student t-distribution. Gosset's work caught the attention of Sir Ronald Aylmer Fisher, a statistician and geneticist of the time. Fisher declared that Gosset had developed a "logical revolution" with his findings about small samples and t-distribution (O'Connor and Robertson).

In his work with the barley for the brewery, Gosset was concerned with estimating standard deviation for a small sample. A large sample's standard deviation has a normal distribution. However, Gosset did not have the luxury of working with large samples. He had to find a way to determine the standard deviation for a small sample without having a preliminary sample to make an estimate. Gosset developed the t-test to satisfy this need.…… [read more]
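The small-sample problem Gosset faced can be sketched as a one-sample t test; the barley yields and target mean below are invented for illustration.

```python
# Student's t for a single small sample: does the mean yield differ from a
# hypothesized target? The five yields are hypothetical.
import math

def one_sample_t(sample, hypothesized_mean):
    """Student's t statistic for a single small sample."""
    n = len(sample)
    mean = sum(sample) / n
    # the sample standard deviation divides by n - 1, the unbiased
    # small-sample form central to Gosset's problem
    sd = math.sqrt(sum((x - mean) ** 2 for x in sample) / (n - 1))
    return (mean - hypothesized_mean) / (sd / math.sqrt(n))

yields = [5.1, 4.9, 5.3, 5.0, 5.2]
t = one_sample_t(yields, 5.0)
# Compare t against the Student t-distribution with n - 1 = 4 degrees of
# freedom rather than the normal distribution, as Gosset showed.
```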


Scores of First Born and Second Term Paper

Term Paper  |  3 pages (789 words)
Bibliography Sources: 0

SAMPLE TEXT:

¶ … scores of first born and second born children on the Perceptual Aberration test. Ha: There is a significant difference in mean scores of first born and second born children on the Perceptual Aberration test. The alternative hypothesis is two-sided (non-directional), since we are not predicting the direction of any difference.

The data will be examined using an independent t test. This test is used since the groups in these circumstances are not related. If the samples were correlated, with each individual having two scores under two treatment conditions, the dependent t test would have been used.

The test statistic in this experiment represents the difference between the mean scores of the children tested divided by the standard error of the difference. This result would represent whether there was a significant difference between the means. If so, we would reject the null hypothesis. By using the t test, we are able to determine the ratio of the mean difference in test scores when compared to the error of differences in the means. A large mean difference does not guarantee a large t, hence the use of the standard error of difference.

1d. Using a .05 level of significance, and after calculating the df (92), the critical value needed to reject the null hypothesis is 2.367. Our calculations show that t = .529982. Thus, we fail to reject the null hypothesis because our calculated value (t = .529982) is less extreme than the critical values (2.367 or -2.367).

1e. The observed mean difference in test scores between first (M = 17.2963) and second (M = 16.14815) born children was not significant, t(92) = .529982, p > .05.

t = (17.2963 - 16.14815) / SQRT( ((4331.62 + 1497.407) / 92) * ((1/27) + (1/27)) )

t = 1.14815 / SQRT( (5829.037 / 92) * ((1/27) + (1/27)) )

t = 1.14815 / SQRT (63.3591 * .074074) = 1.14815 / 2.166395 = .529982
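The arithmetic above can be checked in a few lines; every figure is copied verbatim from the sample (including its stated group sizes of 27 and df of 92):

```python
import math

# Figures as printed in the sample above.
mean1, mean2 = 17.2963, 16.14815    # first-born and second-born means
ss1, ss2 = 4331.62, 1497.407        # sums of squared deviations
n1, n2, df = 27, 27, 92

pooled_variance = (ss1 + ss2) / df                          # 63.3591
se_difference = math.sqrt(pooled_variance * (1/n1 + 1/n2))  # 2.166395
t = (mean1 - mean2) / se_difference
print(round(t, 6))  # 0.529982, matching the reported value
```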

2a. H0: There is no difference in the population mean scores on the word-recall test across the different memory techniques. Ha: There is a difference in the population mean scores on the word-recall test across the different memory techniques.

2b. An independent two-sample t test is not appropriate because we have more than two samples: we are comparing the results of three groups of subjects.

2c. The between groups ANOVA test statistic will measure if there is a difference…… [read more]


Historic Mathematicians Term Paper

Term Paper  |  6 pages (2,172 words)
Bibliography Sources: 1+

SAMPLE TEXT:

Historic Mathematicians

Born on January 29, 1700, Daniel Bernoulli was a famous Swiss mathematician. His father, Johann Bernoulli, was the head of mathematics at Groningen University in the Netherlands. His father planned his future so that Daniel would become a merchant, but Daniel never wanted to become a merchant; his favorite subject was calculus. His father could not… [read more]


Chaos Theory Has Filtered Down Book Review

Book Review  |  6 pages (1,570 words)
Bibliography Sources: 1+

SAMPLE TEXT:

Chaos Theory has filtered down to the public through such short discussions of the issue as are found in films like Jurassic Park or on television documentaries. The issues are more complex than can be indicated in such media depictions, and two authors who have set out to explain chaos theory more thoroughly, though still in a popular vein, are… [read more]


Probability Statistics Term Paper

Term Paper  |  2 pages (735 words)
Bibliography Sources: 0

SAMPLE TEXT:

Probability: Its Use in Business Statistics

Business, one might say, is an exercise in probability. No one knows exactly what the market will do in the future, not even the most skilled analysts and prognosticators. One can only make educated guesses, and the use of probability models and statistics enables the professional to make such guesses, even though no consumer behaves perfectly according to mathematical economic models. If used correctly, statistical analyses can be important guides that enable one to pursue intelligent business practices and serve as aids in the decision-making process, even though they are ultimately only projected 'guesses' as to how the economic environment will evolve, given a variety of variable factors.

Probability, in its most ideal mathematical form, attempts to make use of various concepts to determine what is likely to occur, given a particular set of variable circumstances. One of the most important uses of probability in business is to determine what a particular consumer market's spending habits are likely to be, given a particular set of events. For instance, if the Federal Reserve lowers interest rates yet again, and consumer spending is likely to increase, what is the most desirable course of action, in terms of production of a business that manufactures durable goods, if all other market aspects remain relatively unchanged? Probability theory can also be used to assess what to do if a new and potentially variable competitor advances into a market, pricing comparable goods competitively against one's own product line. What will consumers do, and how will the market behave, given these circumstances?

Probability theory thus deals with what is variable and also with what is unknown in projected circumstances or futures. One must know certain fixed attributes about the circumstances, such as certain fixed production costs, but the use of probability theory allows for the introduction of a set of uncertain or variable factors.

Thus, the use of statistical probability attempts to project a variety of foreseeable futures, so that the businessperson can prepare for the possible negative aspects of those futures. These unknowns are represented in equations as variables. Various scenarios can be plugged into these placeholders, represented by variables such as x and y in the relevant functions.…… [read more]


Guillaume Francois Antoine De L'hopital Term Paper

Term Paper  |  5 pages (1,595 words)
Bibliography Sources: 1+

SAMPLE TEXT:

Apparently, out of respect to the mathematician who made much of his fame possible, L'Hopital abandoned the project.

"L'Hopital was a major figure in the early development of the calculus on the continent of Europe" (Robinson 2002). During this time of scientific and mathematical enlightenment in Europe, and particularly in France, L'Hopital established himself as one of the world's premier mathematicians and mathematical authors. It is noteworthy that many of the accomplishments L'Hopital is credited with have come into question over the years. The most obvious among these is the rule that is named after him, which every calculus student has been required to memorize for the past three hundred years. Despite these questions, perhaps the most telling thing about L'Hopital is that he was widely accepted and respected by his peers. He became the third man on continental Europe to learn calculus simply because he impressed the man who later became his tutor. "According to the testimony of his contemporaries, L'Hopital possessed a very attractive personality, being, among other things, modest and generous, two things which were not widespread among the mathematicians of his time" (Robinson 2002). He died on the second of February, 1704, in Paris, the city of his birth.
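The rule named after L'Hopital, that a 0/0 limit of f/g equals the limit of f'/g', can be verified numerically; the functions below are illustrative choices, not taken from the essay.

```python
import math

# L'Hopital's rule: if f(a) = g(a) = 0 and the limit of f'(x)/g'(x)
# exists at a, it equals the limit of f(x)/g(x).
# Example: f(x) = sin(x), g(x) = x at a = 0.
x = 1e-6                                  # a point close to the 0/0 location
ratio = math.sin(x) / x                   # direct ratio near the singularity
derivative_ratio = math.cos(0.0) / 1.0    # f'(0) / g'(0) = 1
print(ratio, derivative_ratio)            # both are essentially 1
```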

Works Cited

1. Addison and Wesley. Calculus: Graphical, Numerical, Algebraic. New York: Addison-Wesley Publishing, 1994.

2. Feinberg, Joel and Russ Shafer-Landau. Reason and Responsibility. Boston: Wadsworth Publishing, 1999.

3. Goggin, J. And R. Burkes. Traveling Concepts II: Frame, Meaning and Metaphor. Amsterdam: ASCA Press, 2002.

4. Greenberg, Michael D. Advanced Engineering Mathematics: Second Edition. Delaware: University of Delaware, 1998.

5. O'Connor, J.J. And EF Robertson. "Blaise Pascal." JOC/EFR. December 1996. School of Mathematics and Statistics, University of St. Andrews, Scotland.…… [read more]


Statistical Analysis Reported in Two Term Paper

Term Paper  |  12 pages (3,282 words)
Bibliography Sources: 1+

SAMPLE TEXT:

First, no such mention was ever made in the beginning of the study with respect to gender differences. Second, logistic regression analysis and/or techniques have no earthly association with differences. Had the authors wanted to determine whether or not differences occurred, they should have employed the proper tool, a "t" test or ANOVA. Again, this was not the case. Additionally… [read more]


Mathematician - Maria Gaetana Agnesi Term Paper

Term Paper  |  2 pages (587 words)
Bibliography Sources: 1+

SAMPLE TEXT:

She wrote two volumes of mathematical work, the Institutioni analytiche ad uso della gioventu italiana (Analytical Institutions), which cover elementary and advanced mathematics and which she began to develop while teaching mathematics to her younger brothers. Her books aim to present a complete treatment of algebra and mathematical analysis.

Maria Gaetana Agnesi was well known for "The Witch of Agnesi," which actually should be called "The Curve of Agnesi." The Italian term "versiera," the name of the curve, was mistakenly translated by John Colson into the word "witch" (Parente, 2003). Thus, "The Curve of Agnesi" also came to be known as "The Witch of Agnesi." Elif Unlu describes "The Witch of Agnesi" by stating the following.

Agnesi wrote the equation of this curve in the form y = a*sqrt (a*x-x*x)/x because she considered the x-axis to be the vertical axis and the y-axis to be the horizontal axis [Kennedy]. Reference frames today use x horizontal and y vertical, so the modern form of the curve is given by the Cartesian equation y*x^2=a^2(a-y) or y = a^3/(x^2 + a^2). It is a versed sine curve, originally studied by Fermat.
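The two modern forms quoted above, y*x^2 = a^2(a - y) and y = a^3/(x^2 + a^2), can be checked against each other numerically (the value of a and the sample points below are arbitrary choices):

```python
# The curve in its two quoted modern forms:
#   implicit:  y * x**2 = a**2 * (a - y)
#   explicit:  y = a**3 / (x**2 + a**2)
a = 2.0                                  # arbitrary positive constant
xs = [-3.0, -1.0, 0.0, 0.5, 4.0]         # arbitrary sample points
ys = [a**3 / (x**2 + a**2) for x in xs]

# Residuals of the implicit form; every one should be ~0.
residuals = [abs(y * x**2 - a**2 * (a - y)) for x, y in zip(xs, ys)]
print(max(residuals))     # effectively zero
print(ys[xs.index(0.0)])  # the curve peaks at y = a when x = 0
```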

When Agnesi first wrote her 2 volumes of Analytical Institutions, she used her genius in mathematics to teach her younger brothers, and the young Italians as well. Her prowess in mathematics was shared when, after the success of her book, she became a professor of mathematics in the University of Bologna.

Bibliography

Crowley, Paul. Maria Gaetana Agnesi.

New Advent. 08 Dec 2003. http://www.newadvent.org/cathen/01214b.htm

Unlu, Elif. Maria Gaetana Agnesi.

1995. Agnes Scott College. 08 Dec 2003. http://www.agnesscott.edu/lriddle/women/agnesi.htm

Parente, Anthony. I Wrote the First Surviving Mathematical Work by a Woman.

2003. ITALIANSRUS.com. 08 Dec 2003. http://www.italiansrus.com/articles/whoami5.htm… [read more]


Pascal's Triangle Who Really Invented Term Paper

Term Paper  |  4 pages (1,265 words)
Bibliography Sources: 1+

SAMPLE TEXT:

In fact, the understanding of probabilities that the triangle gave mathematicians has led to the development of "average gain" or "probable gain" formulas that are still used extensively in business and industry (Borel, 1963, p. 20).

The basic formula for the triangle is simple, as one expert notes.

If we assume a fictitious row of noughts prolonging each of these lines to right and left, it is possible to lay down the following rule: each number in any one of these lines is equal to the sum of whatever number lies immediately above it in the preceding line, and whatever number lies immediately to the left of that number. Thus the third number in the fifth line is 10 = 6 + 4; the fourth number in this same line is 10 = 4 + 6; the fifth number is 5 = 1 + 4 (Borel, 1963, p. 18).

There is one problem with Pascal's formula, however. As the numbers increase, the triangle takes much longer to work out, and the formula becomes ungainly. This created problems with the formula initially, but mathematicians have learned to cope with it and have created alternatives that let them work with the numbers more effectively, as this expert notes. "Mathematicians have established certain formulas that allow them to work out the numbers which appear in Pascal's Triangle, as well as the sums of whole rows of these numbers included between fixed limits" (Borel, 1963, p. 18). Thus, Pascal's triangular theory was not perfect, but the formula has lasted through time, been improved, and still makes the study of probabilities accessible.
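The addition rule Borel describes can be sketched directly, complete with the "fictitious row of noughts" on each end:

```python
def pascal_rows(n):
    """First n rows of Pascal's triangle, built by the addition rule."""
    rows = [[1]]
    for _ in range(n - 1):
        prev = rows[-1]
        # Pad with the "fictitious row of noughts" and add neighbours.
        rows.append([a + b for a, b in zip([0] + prev, prev + [0])])
    return rows

for row in pascal_rows(6):
    print(row)
# The final row printed is [1, 5, 10, 10, 5, 1], in which
# 10 = 6 + 4 as in Borel's example.
```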

However, this simple formula has made quite a difference in mathematics circles for centuries for a number of reasons. First, his treatise on these binomial coefficients later helped contribute to Sir Isaac Newton's eventual invention of the general binomial theorem for fractional and negative powers. In addition, Pascal carried on a long correspondence with Pierre de Fermat, and in 1654, this correspondence helped contribute to the development of the foundation of the theory of probability, which is one of our most important mathematical developments even today.

Interestingly enough, Pascal devoted the last eight years of his short life to philosophy and religion, and gave up his studies in the sciences and mathematics. One must wonder what he could have accomplished had he continued his studies, and indeed, what improvements he could have made to his triangle had he given it even more time and effort. His discoveries and inventions live on today, along with his name, as one of the greatest minds of all time, and he contributed greatly to our lives today, from a clearer understanding of probabilities to measuring the weather, dispensing medications, and ultimately computing our calculations quickly and efficiently.

In conclusion, Blaise Pascal died in 1662 at the age of thirty-nine - two years before the significance of his triangle would be known to those outside his academic circle, and the final formula would be published. Today, mathematicians… [read more]


Proof, a Nova Episode Aired Term Paper

Term Paper  |  3 pages (1,088 words)
Bibliography Sources: 1

SAMPLE TEXT:

This is another way of looking at solving complex problems. The show made the problem seem all encompassing (which it was to Wiles), and used a variety of experts to explain just what Wiles was attempting to prove, and why it was so important to the mathematical community. They took a topic which could have been boring and nearly incomprehensible, and made it interesting enough to keep the viewer watching. In fact, NOVA managed to get the viewer behind Wiles, and by the end of the show, when it seemed like he might not prove his theory, it was almost as if I was rooting for him to continue and not give up. To end the program, NOVA said, "Andrew Wiles is probably one of the few people on earth who had the audacity to dream that you could actually go and prove this conjecture" (NOVA). Therefore, this story is as much about dreams and goals as it is about pursuing something complex throughout your life to fruition. Andrew Wiles dared to dream, and in the end, his most complex "proof" may have been that sometimes dreams come true - with hard work, determination, and thinking "outside the box," - or in this case, the theorem.

This video is also quite important in what it shows about how people learn to do mathematics, and it was somewhat how I learned to do mathematics. Wiles broke down an extremely complex problem into bits and pieces, but he also had to look at it in unaccepted and untried ways. This is often how new truths are learned in any area. He also said that he suddenly had some kind of understanding that had not been there before. "I had this incredible revelation. [...] It was the most -- the most important moment of my working life. It was so indescribably beautiful; it was so simple and so elegant, and I just stared in disbelief for twenty minutes" (NOVA). While I have not attempted to solve complex problems such as Wiles', I had a hard time "getting" algebra at first, and it seemed like it took me years and years of study to understand even the most simple equation. Then suddenly, one day in class, I looked at an equation, and it suddenly just "made sense," and I could see the solution without struggle. I finally "got" it, and I know just how Wiles felt when the solution suddenly came to him. It was an incredible feeling, and once I had "gotten" it, not only was mathematics simpler, it was not so frightening or frustrating.

"The Proof" is an elegant look at a complex subject, and it not only made mathematics more human, it made it clear that the best problem-solving approach is one that takes a complex problem, breaks it down into more solvable areas, and then looks at every angle of the problem to find a solution. That solution might be, in the end, simple, but it needed alternate thinking… [read more]


Low Math Term Paper

Term Paper  |  8 pages (2,870 words)
Bibliography Sources: 1+

SAMPLE TEXT:

In the book, Ma provides an example of a Chinese teacher who has this profound understanding.

This teacher prepares for their lesson by considering what they will teach and what it means. They link the lesson that will be taught to the underlying concepts they want the students to learn, to the other concepts the information should link to, and… [read more]


Sine, Cosine, and Tangent Term Paper

Term Paper  |  4 pages (1,135 words)
Bibliography Sources: 1+

SAMPLE TEXT:

Because of trigonometry, it was now possible to determine the approximate volume of a star simply by finding its diameter. When it was first discovered, people used simple right-angle trigonometry to find heights of mountains and tall buildings.

It was soon discovered that the entire wave spectrum could be described in terms of frequency and amplitude, and graphed by trigonometric functions, such as sine, cosine and tangent.

The Babylonian measure of 360° formed the basis for the study of chords. With this information, sine and cosine were loosely defined on a circle of radius 1. Another Greek mathematician, Menelaus, wrote six books on chords. Ptolemy subsequently created a complete chord table. His new discoveries included a variety of theorems: a quadrilateral inscribed in a circle has the property that the product of its diagonals equals the sum of the products of its opposite sides; the half-angle theorem; the sum and difference formulae; the inverse trigonometric functions; and further sine and cosine rules.

How Sine, Cosine and Tangent are Used Today

Today, sine, cosine and tangent are still used in astronomy and geography, as well as in navigation and mapmaking. The trio is also used in physics in the study of visible light and fluid motion. Engineers today use trigonometric functions in applications ranging from military engineering to conveyor design.

Trigonometric functions are the functions of an angle. These functions are important when studying triangles and modeling periodic phenomena. The trigonometric functions may be accurately defined as ratios of two sides of a right triangle containing the angle, or as ratios of coordinates of points on the unit circle.

Of the six trigonometric functions, sine, cosine and tangent are the most important. Sine, cosine, and tangent are used when you know an angle and the length of one of the sides of a right triangle, and you want to know the length of another side. For these functions, the angle is expressed in radians, not degrees.

The sine of an angle is the ratio of the length of the opposite side to the length of the hypotenuse. (Moyer) The cosine of an angle is the ratio of the length of the adjacent side to the length of the hypotenuse. The tangent of an angle is the ratio of the length of the opposite side to the length of the adjacent side.
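The three ratio definitions can be exercised on a concrete right triangle (the 30-degree angle and hypotenuse of 10 are made-up example values):

```python
import math

angle = math.radians(30)   # the functions expect radians, as noted above
hypotenuse = 10.0          # made-up example triangle

opposite = hypotenuse * math.sin(angle)   # sine = opposite / hypotenuse
adjacent = hypotenuse * math.cos(angle)   # cosine = adjacent / hypotenuse
print(round(opposite, 2), round(adjacent, 2))  # 5.0 8.66

# tangent = opposite / adjacent, consistent with the other two ratios
assert math.isclose(math.tan(angle), opposite / adjacent)
```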

Without sine, cosine and tangent, the mathematical tables on our computer screens would only show blank pages, and scientific calculators would not react to punching in numbers. Draftsmen would make serious errors when designing buildings, geologists would have inaccuracies of measurement, and so on.

Trigonometry has even been used in analyzing motor vehicle collisions. (Kaye) Geometry is used to determine curve radii for use in circular motion calculations while sine, cosine and tangent are used in momentum, vaults and road grade determinations.

Trigonometric functions were originally developed for astronomy and geography, but scientists are now using them for other purposes, too. Besides other fields of mathematics, trigonometry is used in physics, engineering, and chemistry.

Within mathematics, trigonometry is used primarily… [read more]


Theory on Plate Tectonics Term Paper

Term Paper  |  3 pages (1,158 words)
Bibliography Sources: 1+

SAMPLE TEXT:

Tragedy was to strike again, only a year after he took up this post: in 1808 his father died, and in 1809 his wife died in childbirth, with the second son, to whom she was giving birth, also dying soon after. However, his work does not appear to have suffered in the long term, though in the short term he took time off work and devoted himself to his three children (Schaaf, 1964).

In 1810 he remarried, and there were another three children, but this is generally thought to have been a marriage of convenience rather than a love match (Schaaf, 1964).

Some of his major works included work on how to calculate the orbits of the planets. In his work Theoria Motus Corporum Coelestium he examined and discussed the use of differential equations, conic sections and elliptic orbits, and in the next volume of this work he showed how the orbit of a planet could be estimated and then the estimate further refined (Rassias, 1991). By 1817 he had made his contributions to astronomy, and despite continuing observations he did not add more to the theoretical framework of astronomy (Schaaf, 1964).

Gauss did look to other subjects; publishing a total of one hundred and fifty papers over his career, he contributed to many other areas. His papers included Methodus nova integralium valores per approximationem inveniendi, a practical essay concerning the use of approximate integration; a discussion of statistical estimators in Bestimmung der Genauigkeit der Beobachtungen; and geodesic problems in Theoria attractionis corporum sphaeroidicorum ellipticorum homogeneorum methodus nova tractate (Schaaf, 1964).

During the 1820s the work of Gauss appeared to take him more in the direction of geodesy. This may have started when, in 1818, he was requested to undertake a geodesic survey of Hanover, to link up with the Danish grid that was already in existence. He took total charge, making the measurements during the day and, in the evenings, reducing them to the calculations. It was during this survey, and as a result of the survey's needs, that he invented the heliotrope (Rassias, 1991). Unfortunately, erroneous base lines were used in the survey (Rassias, 1991).

Other work included many theories that were also discovered independently of Gauss by other mathematicians, who have gained the recognition for them. For example, he had formed the ideas of non-Euclidean geometry, claiming to have discovered it fifty-four years before Lobachevsky, whose work he nevertheless praised. The fifty-four-year timeframe may not be correct, but there are certainly some vague references to the subject in his work (Schaaf, 1964).

It was in 1832 that Gauss started to work with Weber on terrestrial magnetism; many ideas were advanced, and Dirichlet's principle was also included, but with a proof. In the Allgemeine Theorie they also proved that there could only be two poles (Schaaf, 1964).

The papers and theories have outlasted the name and reputation of their founder. However, the long-term impact of… [read more]


Exploring the Correlation Between Age and Cell Phone Use Chapter Writing

Chapter Writing  |  3 pages (732 words)
Bibliography Sources: 1

SAMPLE TEXT:

Computer Lab: Hypothesis Testing Correlations

The following null hypothesis is applicable for testing the correlation between the two variables "Age" and "Q18" ("On an average day, about how many phone calls do you make and receive on your cell phone?").

Ho = The age of the cell phone user is not related to the average number of cell phone calls made or received by the cell phone user.

The Pearson correlation coefficient is .055 (p = 0.05, 2-tailed). In addition, Spearman's rho (-.244) and Kendall's tau (-.340) both show the correlation as significant at the 0.01 level (2-tailed). The null hypothesis is rejected at 0.01. The significant rank correlations are negative, indicating that older cell phone users tend to make and receive fewer calls on an average day.
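For reference, the Pearson coefficient reported above is the covariance divided by the product of the standard deviations; a minimal sketch with made-up ages and call counts (not the survey data):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length lists."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mean_x) ** 2 for x in xs))
    sy = math.sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sx * sy)

# Made-up ages and average daily call counts for illustration only:
ages = [18, 25, 33, 41, 52, 60, 67]
calls = [30, 22, 18, 10, 8, 5, 4]
r = pearson_r(ages, calls)
print(round(r, 3))  # strongly negative for this toy data
```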

The number of observations for the "Q18" variable ("On an average day, about how many phone calls do you make and receive on your cell phone?") was 1917, and the number of observations for the "Age" variable was 2252. User-defined missing values were treated as missing. Statistics for each pair of variables were based on all the cases that had valid data for that pair.

The assumptions about the data, including normality and linearity, were tested by examining the descriptive statistics and tests for skewness and kurtosis as shown in the output table directly below. The data are assumed to be normally distributed and independent. Some outliers are present in the data, which is to be expected since not all frequent users of cell phones are young. Variables such as the type of work or employment in which cell phone users engage can strongly influence the frequency of the cell phone calls made and received. These considerations encourage additional statistical analysis of other variables and perhaps additional research.

Statistics

(Q18: "On an average day, about how many phone calls do you make and receive on your cell phone?"  AGE: "What is your age?")

                               Q18        AGE
N              Valid           1917
               Missing         0
Mean                           46.99      52.06
Std. Error of Mean             4.295      .412
Median                         5.00       52.00
Mode                           10         60
Std. Deviation                            19.565
Skewness                       4.807      .170
Std. Error of Skewness         .056       .052
Kurtosis                       21.367     -.544
Std. Error of Kurtosis…… [read more]


Math Problems and Concepts in Teaching Research Paper

Research Paper  |  2 pages (725 words)
Bibliography Sources: 2

SAMPLE TEXT:

Mathematics of Mathematical Puzzles

Using mathematics puzzles is a frequently deployed pedagogical device to teach critical concepts to math students of all ages. "Understanding in mathematics is born not only from formulas, definitions and theorems but, and even more so, from those networks of related problems… Mathematicians seek knowledge. In search of knowledge, they enjoy themselves tremendously inventing and solving new problems" (Bogomolny 2015). The mathematics of puzzles is also designed to challenge the student's instinctive conceptions of how the world works. For example, consider the question posed in 1702 of how much longer a rope encircling the earth would have to be for it to be one foot off the ground all the way around the equator; this can be used to illustrate the geometric concept of finding the circumference (Pickover 2010). The student's intuitive instinct would be that the rope must be a great deal longer, but in fact the answer is only 2pi (about 6.28) feet. "If r is the radius of the Earth, and 1 + r is the radius in feet of the enlarged circle, we can compare the rope circumference before (2pir) and after (2pi (1 + r))" (Pickover 2010).
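Pickover's rope puzzle can be confirmed in a couple of lines; the radius below is a rough figure and, as the algebra shows, it drops out of the answer entirely:

```python
import math

# Rope one foot above the ground all the way around: the extra length
# needed is 2*pi*(r + 1) - 2*pi*r = 2*pi feet, independent of r.
earth_radius_ft = 20_925_721  # rough equatorial radius in feet
extra = 2 * math.pi * (earth_radius_ft + 1) - 2 * math.pi * earth_radius_ft
print(round(extra, 4))  # about 6.2832 feet, i.e. 2*pi
```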

Other examples of these teaching devices include the multiplying 'wheat on a chessboard' dilemma. In this problem the grains of wheat, beginning with one, are doubled on every square, leading to such a proliferation of grains that it would be impossible to provide that much wheat in reality, thus illustrating the concept of geometric growth (Pickover 2010). There is also the barber's paradox, "which involves a town with one male barber who, every day, shaves every man who doesn't shave himself, and no one else," which is used to illustrate the concept of set theory (Pickover 2010).
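The chessboard doubling can be totaled directly:

```python
# One grain on the first square, doubled on each of the 64 squares:
grains_per_square = [2 ** k for k in range(64)]
total = sum(grains_per_square)
print(total)  # 18446744073709551615, i.e. 2**64 - 1
```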

The applicability of apparently pointless mathematical queries to vital foundational concepts in math has also come to light in the evolving discipline of game theory, frequently used by economists to illustrate how people make choices. For example, in the classical case of the Prisoner's Dilemma, two separately-imprisoned individuals are held in separate cells. If neither confesses, both men go free; if both confess, both get a reduced sentence but still do jail time; while if only…… [read more]


History and Math Puzzles Research Paper

Research Paper  |  2 pages (727 words)
Bibliography Sources: 1+

SAMPLE TEXT:

Mathematical puzzles are not simply fun and interesting topics on which to muse: they have great significance within the history of the discipline, often connecting many generations of mathematicians who strive to unravel such riddles. "What is the value of such puzzles and enigmas? One important value is the fact that a math puzzle has an answer. We spend much of our time puzzling over problems that do not appear to have easy answers, if they have answers at all, and math puzzles offer a simplified way of solving problems that lead to a satisfying conclusion" (Bright 2010). Math problems are frequently deployed as mental exercises to help individuals understand certain concepts better. But some math problems have been so widely regarded as unsolvable that they have become notorious.

Amazingly, many of the earliest puzzles were only solved in the modern era. For example, the Greek Archimedes constructed a puzzle asking in how many ways its 14 pieces could be assembled into a square. Only in the 21st century did a Cornell professor of mathematics discover that there "are a total of 17,152 solutions, but some can be considered equivalent if a rotation or a reflection are performed" (Pitici 2008). In another famous mathematical puzzle pertaining to spatial relations, called the bridges of Konigsberg, the central problem arose when "seven bridges were built so that the people of the city could get from one part to another;" people began to speculate how to walk over the various bridges to go straight across the city while crossing each bridge only once ("The beginnings of topology," Math Forum). The problem of the bridges was one of the first questions ever posed in the evolving discipline of topology in mathematics, which reflects the fact that the "properties of the shapes remain the same" regardless of whether they are "stretched or compressed" ("The beginnings of topology," 2015).
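Euler's resolution of the bridges puzzle reduces to counting, for each land mass, how many bridges touch it; a sketch with conventional labels for the four land masses:

```python
from collections import Counter

# The seven bridges as connections between the four land masses;
# labels A-D are conventional (C is the island of Kneiphof).
bridges = [("A", "C"), ("A", "C"), ("A", "D"),
           ("B", "C"), ("B", "C"), ("B", "D"),
           ("C", "D")]

degree = Counter()
for u, v in bridges:
    degree[u] += 1
    degree[v] += 1

# Euler's criterion: a walk crossing every bridge exactly once exists
# only if the number of odd-degree land masses is 0 or 2.
odd = [land for land, d in degree.items() if d % 2 == 1]
print(sorted(odd))  # all four are odd, so no such walk exists
```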

The concept of paradoxes was also highly significant to the discipline of mathematics. Zeno's paradox about the "infinite divisibility" of space asked the question of how motion was possible ("Zeno's paradox," 1998). Obviously, motion is possible from an observable perspective but in theory motion is impossible because to reach one point, first we need…… [read more]


Easy Explanation of Analysis of Variance Essay

Essay  |  2 pages (679 words)
Bibliography Sources: 2

SAMPLE TEXT:

¶ … goals of statistics is to create an easy way to compare two or more factors in a meaningful way. Basic comparison tests, such as the t-test, allow for the comparison of two groups. Analysis of Variance (ANOVA) allows for the comparison of three or more different groups. While the concept of ANOVA may seem complex, it is simply a way of describing how the various items that belong within a single large group vary from one another. This can be very important because there are two main types of variance in a group. One type of variance is the type of variance that can be found naturally within a group; this type of variance is frequently referred to as random variance and may also be described as in-group variance. The other type of variance is the type of variance that is caused by the impact of independent variables on the dependent variable and may also be described as between-group variance. Without understanding the underlying variance in a group, one could overestimate or underestimate the impact that an independent variable was having on the dependent variable. An ANOVA test does not provide conclusive data; instead, it provides analysts with a key to understanding other test results, so that they can understand the significance of those results (Investopedia, 2015).

An ANOVA test can help explain whether there are any significant differences between the average results for three or more different groups. In an experimental context, each of the three groups will be as similar as possible at the beginning, and the in-group variance can help explain how much variation is simply the result of random chance. Then, once the independent variable is applied to the group, the changes are examined to determine what type of impact the independent variable has on the dependent variable. The goal of the ANOVA is to test the null hypothesis, which states that the independent variable will have no impact on the dependent variable.…… [read more]
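The between-group versus in-group (within-group) logic described above can be sketched numerically. The following is a minimal "by hand" one-way ANOVA in Python; the three groups and all of their scores are hypothetical values invented purely for illustration.

```python
# Sketch of a one-way ANOVA "by hand", illustrating the between-group
# vs. within-group variance idea. All data below are hypothetical.
from statistics import mean

groups = {
    "control":     [14, 15, 13, 16, 15],
    "treatment_a": [18, 17, 19, 20, 18],
    "treatment_b": [22, 21, 23, 20, 22],
}

grand_mean = mean(x for g in groups.values() for x in g)
n_total = sum(len(g) for g in groups.values())
k = len(groups)

# Between-group sum of squares: how far each group mean sits from the grand mean.
ss_between = sum(len(g) * (mean(g) - grand_mean) ** 2 for g in groups.values())
# Within-group (in-group) sum of squares: random variation inside each group.
ss_within = sum(sum((x - mean(g)) ** 2 for x in g) for g in groups.values())

# F is the ratio of between-group to within-group mean squares.
f_stat = (ss_between / (k - 1)) / (ss_within / (n_total - k))
print(round(f_stat, 2))  # 47.23
```

A large F, as here, means the between-group variation dwarfs the random in-group variation; a statistical table (or software) would then convert F into a p-value.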


Hypothesis Testing Essay

Essay  |  2 pages (769 words)
Bibliography Sources: 2

SAMPLE TEXT:

Alpha level determines the confidence with which the researcher can decide to reject the null hypothesis. When researchers establish an alpha level prior to performing the research, they are essentially deciding that, if the null hypothesis is true, the probability of getting significant results in the study by either sampling error or chance equals the alpha level they have set (conventionally 0.05; Runyon, Coleman, & Pittenger, 2000). The choice of the alpha level is arbitrary; in other words, researchers decide where to set it before their analysis. It has been a convention in research to use the 0.05 level; however, this is certainly not set in stone, and recently this convention has come under some sharp criticism (e.g., see Ioannidis, 2005). There are times when researchers will decide to use a more liberal or conservative alpha level in their research.

For example, suppose that a researcher is developing a way to diagnose a very serious and debilitating disease that typically cannot be diagnosed until the infected individual is nearly terminal. Also let us assume that if people with this disorder can be identified before it becomes terminal, the cure is relatively safe. In this case the cost of making false positive errors (Type I errors) is low, whereas the cost of making a false negative error (Type II error) is considered high, and researchers are often motivated to relax their decision-making criterion by raising their alpha level (Runyon et al., 2000). Also, when researchers are investigating areas where very little is known about the potential outcome and they are trying to develop theoretical models, they may decide to use a more liberal alpha level in order to identify potentially significant constructs that will be scrutinized more conservatively in the future.

There are also times when researchers will opt for a more conservative alpha level. In fact, there is one specific situation where researchers should adopt a more conservative alpha level but often do not. This occurs when researchers are making multiple statistical comparisons on the same data set (e.g., when the comparisons are not independent of one another; Runyon et al., 2000). The Type I error rate for any single statistical comparison is set by the researcher prior to the analysis, and this is the…… [read more]
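The multiple-comparison situation described above is conventionally handled with a Bonferroni-style adjustment: divide alpha by the number of comparisons. A minimal sketch, assuming six comparisons (a hypothetical number, e.g. all pairwise tests among four groups):

```python
# Bonferroni adjustment sketch for m comparisons at a nominal alpha.
# m = 6 is hypothetical (all pairwise comparisons among four groups).
alpha = 0.05
m = 6

alpha_per_test = alpha / m                     # corrected per-test alpha
familywise_uncorrected = 1 - (1 - alpha) ** m  # error risk with no correction

print(round(alpha_per_test, 4))          # 0.0083
print(round(familywise_uncorrected, 3))  # 0.265
```

The second number is the point of the paragraph above: with six uncorrected tests at .05, the chance of at least one false rejection is roughly 26%, not 5%.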


Limit Your Summary Term Paper

Term Paper  |  4 pages (1,206 words)
Bibliography Sources: 1

SAMPLE TEXT:

Low working memory individuals did not suffer appreciably under pressure, suggesting that working memory is indeed crucial to success and that it is reliance on working memory that makes performance sensitive to pressure constraints such as social evaluation.

8. How are these findings relevant to everyday life? Provide at least two examples. At least one of these examples should be an original example. That is, it should not be mentioned by the author(s) of the research article.

These findings are highly relevant for a number of real life scenarios. One example is in school children or in students at university level. Students who do well on their homework and low-pressure class exercises might choke under pressure, such as when they are a leader of a team, when there are time constraints on them, or when they are being watched closely. Another example is in the workplace. Employees who depend on their working memory might choke in high-pressure scenarios, such as public speaking engagements or competing with a colleague for an important contract. The implications are that persons who have a high capacity for success should develop their skills in coping with anxiety.

9. Describe two ways in which this research expands upon the theories and concepts discussed in class? You may wish to consult your textbook and lecture notes. Please cite appropriately (in APA format) when you discuss information from your text (e.g., Goldstein, 2011) and from your class notes (e.g., Trammell, personal communication, 2013). Please limit your response to 8 sentences.

The theories and concepts discussed in class have to do more with general cognitive psychology. Short-term memory is discussed, but not in terms of working memory in the context of performance on cognitively demanding tasks like mathematics. We have not learned that working memory is like a flash drive, in that the brain is able to process material effectively and rapidly by relying on short-term memory space. Yet anxiety also takes up this short-term memory space. Invading working memory, anxiety therefore inhibits performance. This would seem to be true for everyone, but it is especially true for those who rely most on short-term or working memory. It appears that the most successful or most capable persons are those who rely strongly on their working memory.

10. What research questions remain unanswered? That is, describe at least two areas for future research. Please limit your response to 8 sentences.

This study raises several questions for future research. One is related to the different effects of different types of pressure. It would be helpful to know what types of pressure affect what types of people. For instance, some people might not find financial incentives stressful and would therefore not have any problem being distracted by that stimulus when solving math problems. Other people might find that social pressures are less important. It would also be helpful to know if time constraints and other factors are relevant. Furthermore, it would be interesting to know if there are gender differences between high capacity persons. Another question for future research is how… [read more]


And Standard Deviation Case Study

Case Study  |  1 pages (364 words)
Bibliography Sources: 2

SAMPLE TEXT:

87, which is outside of the confidence interval.

The null hypothesis is therefore rejected -- the bottles contain a mean fill that is lower than the fill allowable within the 95% confidence interval. The complaint is correct -- the bottles are not sufficiently filled.

There are a couple of reasons why the bottles might not be filled properly. Most likely this is a calibration issue with the filling machine, which is chronically underfilling. The actual volatility of the fills (the standard deviation) is high, but it does not explain the strong deviation from the expected mean of 16.0.

To solve this problem, I would check the calibration of the filler. Normally, the filler will fill to a level around the mean, usually within the 95% confidence interval, or better if it is a more expensive filler. So if the machine is doing that here, it either believes that 14.9 ounces is 16 ounces, or somebody changed the target fill level from 16 to 14.9. So the machine either needs to have its fill level target…… [read more]
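The hypothesis test behind this case can be sketched as a one-sample z-test of the mean fill against the 16.0-ounce target. The sample mean, standard deviation, and sample size below are hypothetical stand-ins, since the case's actual figures are not reproduced in this excerpt.

```python
# One-sample z-test sketch: is the mean fill below the 16.0 oz target?
# sample_mean, sigma, and n are hypothetical illustration values.
import math

target = 16.0
sample_mean = 14.9   # hypothetical observed mean fill
sigma = 0.55         # hypothetical standard deviation of fills
n = 30               # hypothetical number of bottles sampled

z = (sample_mean - target) / (sigma / math.sqrt(n))
# Two-sided p-value from the standard normal CDF (via the error function).
p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

reject = p < 0.05
print(round(z, 2), reject)
```

With numbers of this magnitude, z falls far beyond the -1.96 cutoff, which matches the essay's conclusion that the null hypothesis is rejected and the complaint is correct.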


ANOVA: Fobt, Tukey's HSD, and Effect Size Calculations Essay

Essay  |  6 pages (1,878 words)
Bibliography Sources: 0

SAMPLE TEXT:

Statistics and Probability

A researcher investigated the number of viral infections people contract as a function of the amount of stress they experienced during a 6-month period. The obtained data:

Amount of Stress: Negligible Stress, Minimal Stress, Moderate Stress, Severe Stress

What are Ho and Ha?

Ho: The level of stress experienced during a six-month period does not have a significant impact… [read more]
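The title's Fobt and effect-size calculations can be sketched from sums of squares. The sums of squares and sample sizes below are hypothetical, since the original data table did not survive in this excerpt; the point is the arithmetic relating F_obt to eta squared.

```python
# F_obt and eta-squared sketch for a four-group (stress level) design.
# ss_between, ss_within, k, and n_total are hypothetical values.
ss_between, ss_within = 54.0, 126.0
k, n_total = 4, 40   # four stress groups, 10 participants each (hypothetical)

df_between, df_within = k - 1, n_total - k
f_obt = (ss_between / df_between) / (ss_within / df_within)

# Eta squared: proportion of total variance explained by stress level.
eta_squared = ss_between / (ss_between + ss_within)

print(round(f_obt, 2), round(eta_squared, 2))  # 5.14 0.3
```

Here F_obt would be compared against the critical F for (3, 36) degrees of freedom, and eta squared reports how much of the variance in infection counts the stress factor accounts for.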


Multivariate Analysis Is Appropriate Essay

Essay  |  2 pages (579 words)
Bibliography Sources: 0

SAMPLE TEXT:

discrete data), or make an unlimited number of comparisons. Like any research tool, multivariate statistics have their limitations. Different types of multivariate tests allow a researcher to answer different types of questions, but a researcher's ability to make inferences is always limited by the type of data collected and by the methodology used to collect it.

I am interested in how gender rates differ in cases of human trafficking into the United States. Currently there is a study that proposes looking at gender differences in human trafficking cases based on data from The National Human Trafficking Resource Center. This particular research question could be adequately addressed using a simple t-test or one-way ANOVA. However, other variables could be added to the methodology, and multivariate analyses could offer much more information regarding the differences between males and females involved in human trafficking. Using other variables, such as the geographical area trafficked to (e.g., in the United States this could be divided into Midwest, North, South, etc.), the type of context into which victims are placed (e.g., hard labor, domestic work, prostitution, etc.), the ethnic background of the trafficked individual, and others, allows for a richer analysis. In this case there would be multiple independent variables that can contribute to the differences in the number of males and females trafficked into the United States, and multivariate techniques such as factorial ANOVA (or its counterpart, multiple regression, depending on the type of question) are more appropriate. In this type of study we would expect that the particular context would lead to differences in the rates of males and females trafficked into the country. For example, we would expect that people put into hard labor jobs such as… [read more]
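Before moving to a full factorial ANOVA or regression, the gender-by-context idea above can be screened with a simpler technique: a chi-square test of independence on the count data. This is a deliberate simplification of the essay's proposal, not the essay's own method, and all counts below are hypothetical.

```python
# Chi-square test of independence, computed by hand, on hypothetical
# gender-by-context trafficking counts. 5.991 is the chi-square critical
# value at alpha = .05 for df = (2-1)*(3-1) = 2.
counts = {            # columns: hard labor, domestic work, prostitution
    "female": [30, 45, 80],
    "male":   [60, 20, 15],
}

col_totals = [sum(col) for col in zip(*counts.values())]
row_totals = {g: sum(row) for g, row in counts.items()}
grand = sum(col_totals)

chi2 = 0.0
for g, row in counts.items():
    for j, observed in enumerate(row):
        # Expected count under independence of gender and context.
        expected = row_totals[g] * col_totals[j] / grand
        chi2 += (observed - expected) ** 2 / expected

print(round(chi2, 1), chi2 > 5.991)
```

A significant result here would justify exactly the richer multivariate follow-up the paragraph proposes, since it shows gender rates are not uniform across contexts.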


Meyer Et Al. Meyer, Wang Essay

Essay  |  2 pages (772 words)
Bibliography Sources: 2

SAMPLE TEXT:

In the data. Different methodologies determine the types of conclusions that can be drawn from the analyses. For example, in the current study Meyer et al. (2009) employ a correlational design; therefore they are unable to make causal inferences but can describe the relationships, their relative strengths, and in some cases their direction, given the analyses.

The type of data also influences the type of analyses one can do. There are a number of different levels of measurement in the current study due to the large number of variables, ranging from nominal (e.g., diagnoses), through ordinal (e.g., employment status coded as full-time, part-time, or casual), to ratio-level variables (e.g., years of nursing work experience). Some of the variables are categorical, such as employment status, and some are continuous, such as age.

Meyer et al. (2009) used multiple data collection methods that included collecting hospital records, daily unit data, surveys, and patient data forms. In order to ensure that different data sources and collection methods were consistent, they calculated the inter-rater reliability for all measures (which they claim was at 90% throughout the study). The use of surveys in the study was extremely important, as surveys allow the collection of anonymous data (with no identification of the person taking the survey, respondents are free to answer candidly). Questions rated on scales allow the researchers to measure such important internal constructs as a nurse's or patient's attitudes and opinions, and also allow them to collect hard external data such as the number of hours worked, education levels, etc. Surveys remain an important staple in all areas of correlational research, as the data can be coded and easily subjected to statistical analyses; however, survey data typically do not allow the researcher to determine cause-and-effect associations (Tabachnick & Fidell, 2012). Thus, survey data can be extremely useful, but have limitations.
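The inter-rater reliability figure mentioned above (about 90% agreement) can be sketched from paired codings. The two rater vectors below are hypothetical, invented to show how percent agreement, and the chance-corrected Cohen's kappa, are computed; they are not Meyer et al.'s data.

```python
# Percent agreement and Cohen's kappa from two hypothetical raters'
# codings of the same ten cases.
rater_a = ["yes", "yes", "no", "yes", "no", "yes", "no", "yes", "yes", "yes"]
rater_b = ["yes", "yes", "no", "yes", "no", "yes", "yes", "yes", "yes", "yes"]

n = len(rater_a)
agreement = sum(a == b for a, b in zip(rater_a, rater_b)) / n

# Chance agreement: probability both raters pick the same label at random,
# given each rater's marginal label frequencies.
labels = set(rater_a) | set(rater_b)
p_chance = sum((rater_a.count(l) / n) * (rater_b.count(l) / n) for l in labels)
kappa = (agreement - p_chance) / (1 - p_chance)

print(agreement, round(kappa, 2))  # 0.9 0.74
```

Kappa is lower than raw agreement because some of the 90% matching is expected by chance alone, which is why methodologists often report both.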

References

Jackson, S.L. (2012). Research methods and statistics: A critical thinking approach (4th ed.). Belmont, CA: Wadsworth.

Meyer, R.M., Wang, S., Li, X., Thomson, D., & O'Brien-Pallas, L. (2009). Evaluation of a patient care delivery model: Patient outcomes in acute cardiac care. Journal of Nursing Scholarship, 41(4), 399-410.

Tabachnick, B.G., & Fidell, L.S. (2012). Using multivariate statistics (6th…… [read more]


Experimental Research Research Proposal

Research Proposal  |  3 pages (987 words)
Style: APA  |  Bibliography Sources: 3

SAMPLE TEXT:

Experimental Research

One of the many important decisions a researcher must make during his or her work is which design to use for the research. Two main experimental designs include within-subjects and between-subjects design. Each has its own merits and drawbacks, and researchers tend to choose these according to the purpose and nature of the experiments or surveys to be conducted. For the experiment in question, where three different types of survey invitations are offered online, either the within- or between-subjects design can be used.

According to MacKenzie (2013), most empirical evaluations of input devices or interaction techniques, like the one to be conducted here, will be comparative. In this specific experiment, three types of online survey invitations are explored; the first containing only a link, the second offering to donate $10 to charity as a reward for participation, and finally, a chance to win $1,000 as a potential reward for participation. The comparative aspect lies in why people would be moved to respond to each of the invitations, and which invitation would receive the most participants. While some cases require a between-subjects design and others a within-subjects design, this particular survey could be studied by using either.

A within-subjects design means that there is one group in which each participant is tested under all the conditions. Two major advantages of this design are, first, that fewer participants are required, which means that recruiting, scheduling, briefing, demonstrating, and all the other aspects of the research procedure are somewhat easier and take less time. Second, there is less variance resulting from participant disposition. Certain dispositions will occur consistently for certain participants, which makes it easier to account for the particular behavior insofar as it influences the experiment results. Further, differences among measurements will then be due to differences in the conditions being measured rather than to disposition or behavior differences among participants (MacKenzie, 2013).

In the Zikmund experiment, a within-subject design would mean a single group of people would be tested for each type of survey invitation. In practical terms, this would mean that the group would be exposed to all three invitations and asked to choose the one they would most likely participate in. They could also be asked to give reasons for their participation. This would include both quantitative and qualitative effects in the experiment results. Quantitatively, the results would then reveal the invitation that is most enticing to participants, while the reasons can be investigated for their consistency with the results and among each other.

Although within-subject designs have distinct advantages, it is also true that there could be interference between the conditions imposed. When exposed to all three choices, for example, some participants may experience some difficulty choosing among them, which could compromise the reliability of the results. For this reason, researchers sometimes choose to opt for a between-subjects design instead (MacKenzie, 2013).

According to Shuttleworth (2013), a between-subjects design refers to an experiment that involves more than…… [read more]
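The between-subjects setup introduced above can be sketched as simple random assignment: each participant sees exactly one of the three invitation conditions. The participant IDs, group size, and seed below are arbitrary illustration values.

```python
# Random assignment sketch for a between-subjects design with the three
# invitation conditions described in the essay. IDs and seed are arbitrary.
import random

conditions = ["link_only", "charity_donation", "prize_draw"]
participants = [f"P{i:02d}" for i in range(1, 31)]  # 30 hypothetical participants

rng = random.Random(42)          # fixed seed so the assignment is reproducible
shuffled = participants[:]
rng.shuffle(shuffled)

# Deal shuffled participants round-robin so the three groups stay equal in size.
assignment = {c: shuffled[i::3] for i, c in enumerate(conditions)}
for c in conditions:
    print(c, len(assignment[c]))
```

In the within-subjects alternative, the same 30 participants would instead each see all three invitations, typically in counterbalanced order to control the interference MacKenzie warns about.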


Statistical Research II Measuring Research Paper

Research Paper  |  2 pages (675 words)
Style: APA  |  Bibliography Sources: 2

SAMPLE TEXT:

The median is the number in a set that equates to the middle if every figure in the distribution were listed in ascending or descending order. One of the most attractive aspects of the median from a statistical analysis standpoint comes from the fact that "the median is often used instead of the mean for asymmetric data because it is closer to the mode and is insensitive to extreme values in the sample" (Bickel, 2003), and this ability to resist the effects of variant data makes the median an extremely effective diagnostic tool. The median can also be conceptualized as the exact point within a number set where separation into two halves occurs, with 50% of the values in the distribution falling to either side, which is why statisticians often refer to the median as the visual center of a numerical distribution.

The crude range is the result of subtracting the lowest number in a distribution from the highest. The range equates to the number of distinct values present between the highest and lowest data points, when one includes the highest and lowest values themselves. The main limitation of the crude range and the range is that both depend entirely on the extreme ends of a distribution, which typically complicates analysis whenever variance is present. The standard deviation, by contrast, describes the precise level of variation or dispersion from the mean. When the standard deviation is low, the data points are closely situated around the mean; when the standard deviation is high, the data points span a wide spectrum.
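The three measures discussed above can be computed with Python's standard library. The small sample below is hypothetical and includes a deliberate outlier, to show the median's insensitivity to extreme values that the Bickel quotation describes.

```python
# Median, crude range, and standard deviation on a hypothetical sample.
import statistics

data = [4, 7, 7, 9, 12, 15, 41]  # 41 is a deliberate outlier

median = statistics.median(data)        # middle value; resists the outlier
mean = statistics.mean(data)            # dragged upward by the outlier
crude_range = max(data) - min(data)     # highest minus lowest
stdev = statistics.pstdev(data)         # population standard deviation

print(median, round(mean, 2), crude_range, round(stdev, 2))
```

Note that the median (9) sits where most of the data lie, while the mean (about 13.57) is pulled toward the outlier, and the crude range (37) is determined entirely by the two extremes.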

References

Bickel, D.R. (2003). Robust and efficient estimation of the mode of continuous data: The mode as a viable measure of central tendency. Journal of Statistical Computation and Simulation, 73(12), 899-912.

Manikandan, S. (2011). Measures of…… [read more]


Criminal Justice and Criminology Interpreting Simple Data Research Proposal

Research Proposal  |  7 pages (2,030 words)
Bibliography Sources: 6

SAMPLE TEXT:

¶ … Criminal Justice and Criminology

Interpreting simple data

The data collection exercise involved posting a picture of a bear on Facebook. The caption of the picture asked viewers to provide their thoughts on the picture. This caption was kept simple in order to prevent bias that could arise from asking viewers to like or dislike the pictures. The data… [read more]


ANOVA Study Analysis of Variance Research Paper

Research Paper  |  3 pages (774 words)
Bibliography Sources: 2

SAMPLE TEXT:

Null Hypothesis: There is no significant difference among the treatments to which the students are subjected. Alternative Hypothesis: At least one treatment differs significantly from the others.

Assumptions:

It is important to note here that in this case, there are two assumptions that will be made:

1. That there is a normal population distribution.

2. The variance associated with each group is the same.

Types of errors likely in ANOVA case

Accuracy in statistics is the degree of closeness of a measurement of a quantity to that quantity's true or actual value. Precision, also termed reproducibility or repeatability, is the degree to which repeated measurements under unchanged conditions yield the same results. A measurement system can be accurate but not precise, precise but not accurate, neither, or both. There are two categories of errors frequently experienced in such a calculation: Type I and Type II errors. A Type I error, also known as an error of the first kind, occurs when the null hypothesis is true but is rejected. A Type II error, also known as an error of the second kind, occurs when the null hypothesis is false but is erroneously not rejected. As an example of the relationship between accuracy and precision, if one reads out the time right to the second while knowing very well that the watch is one minute slow, the reading is precise but not accurate. In our case, the question is whether the type of treatment the students are subjected to affects their marks. A Type I error occurs when one concludes that the treatment affects performance on the GMAT when it actually does not, while a Type II error occurs when one concludes that the treatment does not affect performance when it actually does (Shera, 2006). In short, a Type I error is rejecting the null hypothesis when it should not be rejected, and a Type II error is accepting the null hypothesis when it should not be accepted.
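The meaning of the Type I error rate can be made concrete by simulation: when the null hypothesis is actually true, a two-sided test at alpha = .05 should falsely reject about 5% of the time. The sketch below uses a z-test on samples drawn from a null-true normal population; sample size, repetition count, and seed are arbitrary.

```python
# Simulating the Type I error rate of a two-sided z-test at alpha = .05.
# Each sample is drawn from N(0, 1), so the null hypothesis (mean = 0)
# is true by construction; every rejection is a Type I error.
import math
import random

rng = random.Random(7)
n, reps, rejections = 25, 4000, 0

for _ in range(reps):
    sample = [rng.gauss(0.0, 1.0) for _ in range(n)]
    z = (sum(sample) / n) / (1.0 / math.sqrt(n))  # known sigma = 1
    if abs(z) > 1.96:   # two-sided critical value at alpha = .05
        rejections += 1

type1_rate = rejections / reps
print(round(type1_rate, 3))  # close to 0.05
```

A matching simulation for Type II errors would draw the samples from a population where the null is false and count the failures to reject instead.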

References

Shera, J (2006). Statistical Errors (Type I, Type II,…… [read more]


Forensics One of the Most Important Statistical Discussion and Results Chapter

Discussion and Results Chapter  |  2 pages (522 words)
Bibliography Sources: 2

SAMPLE TEXT:

¶ … Forensics

One of the most important statistical concepts in psychological research is the population being studied. Although it may sound preliminary, deciding whom to study is the most basic, and thus important, decision in all of research. Specifically, for psychological research, population selection dictates the basis of many of the subjects involved. In something as subtle and imprecise as psychology, the population from which the research sample is selected is that much more important.

Another statistical idea that is of great importance is the relationship between validity and reliability. Once again, subtle and nuanced definitions of these terms help create arguments that become documented and eventually known as facts. Understanding the difference between and importance of each of these terms can also explain misunderstandings that occur in seemingly well-thought-out experiments and research. Either way, both concepts can lead to learning and improvement, each in their own way.

Some other statistical ideas are very interesting. One such idea is the ability to predict the future with statistical inference. Although nothing is guaranteed in life, through mathematical relationships, statistics can help create images of the future in predictive and systematic ways. This discovery is truly overlooked in many instances, and in others, too heavily relied on. Finding a balanced and reasoned approach to the incorporation of statistical inference to science remains an interesting challenge.

Dunifon (2005) raised another interesting point in dealing with statistical concepts. He wrote that "experiment is the only way to truly determine whether a treatment causes an outcome." I agree with this very interesting concept,…… [read more]


Russia's Contributions to Science Essay

Essay  |  5 pages (1,609 words)
Bibliography Sources: 3

SAMPLE TEXT:

Russia's Contribution To Science

Russian contribution to the field of science is famous for many reasons, including the invention of radio by A. Popov, the development of the periodic table by D. Mendeleev, and the creation of principles for interplanetary space flight on multistage rockets by K. Tsiolkovskiy. The Russian scientists contributed a lot to the… [read more]


Random Variable for Each Statement as Being Term Paper

Term Paper  |  3 pages (820 words)
Bibliography Sources: 3

SAMPLE TEXT:

¶ … Random Variable for Each Statement as Being Discrete or Continuous by

(a) the number of freshmen in the required course, English 101

A) Discrete B) Continuous

(b) the number of phone calls between Florida and New York on Thanksgiving day.

A) Discrete B) Continuous

(c) the height of a randomly selected student.

A) Discrete B) Continuous

(d) the number of spills that occur in a local hospital.

A) Discrete B) Continuous

(e) the braking time of a car.

A) Discrete B) Continuous

Provide an appropriate response.

List the four requirements for a binomial distribution.

(i) Observations are independent

(ii) Outcome of observation is either a success or failure

(iii) Probability of success is the same for each observation

(iv) Fixed number of observations

Identify each of the variables in the binomial probability formula.

P(x) = [n! / ((n - x)! x!)] · p^x · q^(n-x)

n = number of trials x = number of successes p = probability of success q = probability of failure

Also, explain what the fraction n! / ((n - x)! x!) computes.

Number of ways to select 'x' items from 'n' given items

4. Assume that a procedure yields a binomial distribution with a trial repeated n times. Use the binomial probability formula to find the probability of x successes given the probability p of success on a single trial.

n = 12, x = 5, p = 0.25, q = 0.75

P(x = 5) = [12! / ((12 - 5)! 5!)] · (0.25)^5 · (0.75)^(12-5)

= [12! / (7! 5!)] · (0.25)^5 · (0.75)^7

= [(12 × 11 × 10 × 9 × 8 × 7!) / (7! × 5 × 4 × 3 × 2 × 1)] · (0.000977) · (0.133484)

= 0.103241
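The hand calculation above can be verified in one line with Python's standard library, which exposes the binomial coefficient directly as `math.comb`:

```python
# Verifying the binomial probability P(x = 5) for n = 12, p = 0.25.
import math

n, x, p, q = 12, 5, 0.25, 0.75
prob = math.comb(n, x) * p**x * q**(n - x)  # comb(12, 5) = 792 ways
print(round(prob, 6))  # 0.103241
```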

CHAPTER 6

1. The Precision Scientific Instrument Company manufactures thermometers that are supposed to give readings of 0°C at the freezing point of water. Tests on a large sample of these instruments reveal that at the freezing point of water, some thermometers give readings below 0°C (denoted by negative numbers). Assume that the mean reading is 0°C, that the standard deviation of the readings is 1.00°C, and that the readings are normally distributed. If one thermometer is randomly selected, find the probability that at the freezing point of water, the reading is less than 1.57°C.

The reading is already standardized, since the mean is 0°C and the standard deviation is 1.00°C, so the z-score is simply 1.5 + 0.07 = 1.57. From the standard normal table, the area to the left of z = 1.57 is 0.9418, so P(reading < 1.57°C) = 0.9418.
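The table lookup above can be reproduced from the standard normal cumulative distribution function, which Python can evaluate exactly via the error function in its math module:

```python
# Standard normal CDF via the error function; reproduces the table value
# for z = 1.57 used in the thermometer problem.
import math

def phi(z):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

print(round(phi(1.57), 4))  # 0.9418
```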

2. If Z is the standard normal variable, find the probability that Z…… [read more]


Solved Problems Term Paper

Term Paper  |  2 pages (477 words)
Bibliography Sources: 3

SAMPLE TEXT:

Statistical Estimates

mode: (the number that appeared the most in the data)

median: (7.30 + 7.60) / 2 = 7.45

mean:

range: 8.90-6.60 = 2.30 (Highest number = 8.90, lowest number = 6.60)

Computing standard deviation sample standard deviation =

sample mean =

sample standard deviation =

Using the empirical rule

68% of values lie within mean ± 1 standard deviation

95% of values lie within mean ± 2 standard deviations

99.7% of values lie within mean ± 3 standard deviations

Given mean = 120 mm Hg, standard deviation = 12 mm Hg

Let x represent percentage of values,

Then, mean ± 1 standard deviation = 120 mm Hg ± 12 mm Hg = 108 mm Hg to 132 mm Hg

mean ± 2 standard deviations = 120 mm Hg ± 24 mm Hg = 96 mm Hg to 144 mm Hg

Thus, the approximate percentage of women between 96 mm Hg and 144 mm Hg is 95%.

Implementing Chebyshev's theorem

Z-score (critical value)

Given that -2.37 is less than -2.00 or 2.37 is greater than 2.00, we may consider this value as unusual

Chapter 4

1. Probability estimate

2. Probability odds

3. Probability odds

The outcomes could be 1 or 2 or 3 or 4, and these are mutually exclusive events. Thus,

P (rolling a number less than 5) = P (1 or 2 or 3 or 4)

= P (1) + P (2) + P (3) + P (4)

= (1/6) + (1/6) + (1/6) + (1/6)

= 4/6 = 2/3 [Answer…… [read more]
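The die probability just computed can be verified by enumerating the sample space with exact fractions:

```python
# Enumerating the sample space of a fair die to verify
# P(rolling a number less than 5) = 4/6 = 2/3.
from fractions import Fraction

outcomes = range(1, 7)                       # faces 1 through 6
favorable = [o for o in outcomes if o < 5]   # 1, 2, 3, 4
prob = Fraction(len(favorable), len(outcomes))
print(prob)  # 2/3
```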


Speeches and Presentations Essay

Essay  |  2 pages (677 words)
Bibliography Sources: 0

SAMPLE TEXT:

Speech Organization

When a presentation is made, there are certain verbal and visual supports that can be used to aid it. These supports are quite important in that they aid clarity by making complicated ideas clear. They also help develop interest in the audience by making the main points more vivid. Finally, these supports make the presentation more convincing because they provide evidence that enhances the claims made. This paper will therefore look into these supports and the impact they had on a presentation that I attended. Verbal supports include definitions, examples, stories, statistics, comparisons, quotations, citing sources, and so on. Visual supports, on the other hand, include objects, diagrams, lists and tables, photographs, and so on.

What captured my interest in the presentation?

The verbal support that captured my interest in the presentation was the use of examples. I really liked the way examples were used in the presentation. This helped me, as the audience, to understand well what the presentation was all about. The brief illustrations of the points allowed me to grasp exactly what the presenter was talking about, and they became very effective since several examples were given.

What confused in the presentation

The use of statistics really confused me. This is because there were many numbers that the speaker used to present their ideas. The choice of statistics applied, and hence the statistics presented, was overwhelming to me. Another verbal support that confused me was the use of comparisons. This is because I did not clearly get the figurative analogies that were being applied. They were very confusing to me, since I could not grasp the validity of the comparisons being made.

What bored me in the presentation

What bored me in the presentation was the way the stories were narrated. For instance, the stories given were non-fiction, and hence did not create material that helps the…… [read more]


Norway Brand Statistical Summary and Hypotheses Decisions Data Analysis Chapter

Data Analysis Chapter  |  5 pages (1,712 words)
Bibliography Sources: 0

SAMPLE TEXT:

Norway Brand

Statistical Summary and Hypotheses Decisions

The statistical method used to compare the experimental group to the control group was straightforward and fairly standard. First, with an established confidence interval of 95% and a significance level (alpha) of .05, the critical t score for the control group was established using Excel's built-in function. For each instrument item compared, the mean,… [read more]
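The critical-value step described above can be sketched without Excel. With only Python's standard library, the two-sided critical value for a 95% confidence level can be found by inverting the normal CDF with bisection; this yields the large-sample z value (1.96), whereas Excel's T.INV.2T would return the slightly larger t value for small samples. The function names here are our own, not part of any library.

```python
# Finding the two-sided critical z for a 95% confidence level (alpha = .05)
# by bisecting the standard normal CDF. Large-sample stand-in for the
# critical t score the essay computes in Excel.
import math

def phi(z):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

def critical_z(alpha):
    """Two-sided critical value: solve phi(z) = 1 - alpha/2 by bisection."""
    target, lo, hi = 1 - alpha / 2, 0.0, 10.0
    for _ in range(60):
        mid = (lo + hi) / 2
        if phi(mid) < target:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

print(round(critical_z(0.05), 3))  # approximately 1.96
```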


Offline During the Final Exam Week A-Level Coursework

A-Level Coursework  |  7 pages (2,030 words)
Style: APA  |  Bibliography Sources: 7

SAMPLE TEXT:

¶ … offline during the final exam week. Once you have completed the exam, input it into the final exam shell in the exam folder on the course web page. Good luck.

Census statistics show that college graduates make more than $254,000 more in their lifetime than non-college graduates. If you were to question the validity of this observation, what… [read more]


Teach Geometry Dear Parent Essay

Essay  |  2 pages (730 words)
Bibliography Sources: 2

SAMPLE TEXT:

Children develop their math vocabulary and learn to use appropriate terms. They have an opportunity to connect new understanding to prior knowledge. Math is not simply rote learning of facts and equations. The Common Core State Standards (CCSS), already adopted by forty-five states, were designed to facilitate higher-order thinking and problem solving skills ("Common core standards adoption by state," 2012). These abilities will better prepare students for the real world. Students will communicate with their teachers and with their peers to figure out different ways to solve problems. There is focus on problem solving as a process, so students will be able to understand where they went wrong and so they will be able to solve similar problems in the future.

As far as studying geometry instead of "the basics," geometry is the basis for much of what we do in mathematics. The foundations young students will get in geometry will support their work later on in higher-level mathematics. For example, fractional amounts and percentages are most often represented using geometric shapes. Shapes drawn on a coordinate grid are analyzed in terms of algebraic relationships. Geometry can be thought of as "a conceptual glue" (Schwartz, 2008, p. 72) that connects many different areas within mathematics. With respect to real-world applications, analysis of two- and three-dimensional shapes and the study of geometric relationships are used in fields such as landscaping and architectural design. The ability to specify locations and describe spatial relationships is necessary in transportation, navigation, and construction. Transformations and symmetry are useful in packaging and product design, as well as artistic expression. Geometry made possible the programming of computer graphics and the intuitive interface with computers (Schwartz, p. 72).

Your child's teacher

References

Chard, D.J., Baker, S.K., Clarke, B., Jungjohann, K., Davis, K., and Smolkowski, K. (2008).

Preventing early mathematics difficulties: The feasibility of a rigorous kindergarten mathematics curriculum. Learning Disability Quarterly 31(1), pp. 11-20.

Common core standards adoption by state. (2012). ASCD. Retrieved from http://www.ascd.org/common-core-state-standards/common-core-state-standards-adoption-map.aspx

Cooke, B.D., and Buccholz, D. (2005). Mathematical communication in the classroom: A teacher makes a difference. Early Childhood Education Journal 32(6), pp. 365-369.

Schwartz, J.E. (2008). Elementary…… [read more]


Database Developer (Based on Job Research Paper

Research Paper  |  2 pages (695 words)
Bibliography Sources: 0

SAMPLE TEXT:

I instructed the optimizer to use a specific access path by using a hint.

d. Access Plan Execution

-- Executes the selected access plan.

I used the EXPLAIN PLAN command to examine the execution plan of the SQL statement.

2. Provide examples of errors that you had during your professional experiences.

I had been frustrated with questions that included the following: why the query was running slow; why one query was going slower than another; whether my index was getting used, and if not, why not. The execution plan told me how the query would be executed, but I had trouble following the steps and needed time to get used to it. Syntactic analysis took a while.

As a novice, I had difficulty in the beginning reading the code (in the results of the query) as well as understanding the different graphical, text, and XML execution plans. The graphical plans were somewhat easier to read than the text plans, although the detailed data behind the graph was somewhat harder. The format that was probably the most obscure for me was SHOWPLAN_TEXT.

I had also been advised to do certain things for querying and working with the plan cache. I had to run a certain SQL script in order to see how long a plan takes to compile. In the beginning, I had to ask someone in order to understand the objects within the cache (in order to see how the optimizer and storage engine created my plan).

There were also differences between the estimated and actual execution plans. This probably occurred because the statistics were stale: as data was modified over time, the statistics gradually became mismatched to the actual data. I was told that I received bad execution plans because the statistical data was not up-to-date.

Sometimes, when I wanted a parallel query, I saw a completely different plan -- simply because the optimizer felt it could not support my request at that time. At least once or twice the estimated plan didn't work at all since it…… [read more]


Statistical Tests Can Provide More Information Data Analysis Chapter

Data Analysis Chapter  |  1 pages (383 words)
Bibliography Sources: 2

SAMPLE TEXT:

¶ … statistical tests can provide more information than a single one, allowing for more meaningful assessments of a situation. This interaction of two statistical tests (as described below) demonstrates that in this scenario younger women are by far the most likely to be the best employees for this call center.

The Pearson r provides an answer to the question of whether or not two variables are related to each other. More than simply establishing whether a relationship exists, Pearson r determines how strong that relationship is and whether it is a direct or inverse relationship. In a direct relationship, if one variable goes up then so does the other (or others).

For example, in general as an individual's height goes up, so does his/or her weight. In an inverse relationship, as one variable goes up another one goes down. An example of this would be: The fewer workers are assigned to construct a building the longer it will take to construct the building. Both of these relationships cited here make intuitive sense to us. We may never have considered them to be a part of the world of statistics,…… [read more]
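The direct and inverse relationships described above can be illustrated with a short computation. A minimal sketch in Python, with Pearson r computed by hand; both data sets are invented purely for illustration:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient for two equal-length samples."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = math.sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = math.sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)

# Direct relationship (taller people tend to weigh more): r near +1.
direct = pearson_r([60, 64, 68, 72], [120, 140, 155, 180])

# Inverse relationship (more workers, shorter build time): r near -1.
inverse = pearson_r([5, 10, 15, 20], [40, 22, 14, 9])
```

A value near +1 or -1 signals a strong relationship; the sign gives its direction.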


Calculus and Definitions Assessment

Assessment  |  5 pages (1,309 words)
Bibliography Sources: 1+

SAMPLE TEXT:

In simple terms, the Riemann sum is used to define the definite integral of a function. We begin by considering a simple case: the definition of the Riemann integral of a continuous function f over a rectangle R. Starting here, rather than with the one-variable case, helps overcome the tendency to connect integration too strongly with anti-differentiation (Buck 2003).

Formal definition

The formal definition expresses a limit precisely rather than through graphs alone; it uses the Greek letters epsilon (ε) and delta (δ). Epsilon always represents a distance on the limiting side (the y-axis), and delta represents a distance on the x-axis. The limit of a given function describes how that function behaves as it nears the given x value.

Consider two functions g(x) and f(x) defined on the real numbers, and the relationship f(x) = O(g(x)) as x → ∞.

This relationship exists only when there is a positive constant C such that, for all sufficiently large values of x, f(x) is at most C multiplied by g(x) in absolute value. That is, f(x) = O(g(x)) if and only if there exist a positive real number C and a real number x0 such that

|f(x)| ≤ C|g(x)| for all x ≥ x0.

In general, it is the growth rate that is of most interest, so the variable x going to infinity is often left unstated, and one writes more simply that f(x) = O(g(x)).
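This definition can be probed numerically. A hedged sketch in Python; the functions f and g and the candidate witnesses C and x0 are invented for illustration:

```python
# Checking f(x) = O(g(x)) by exhibiting witnesses C and x0 such that
# |f(x)| <= C * |g(x)| for every sampled x >= x0.
def f(x):
    return 3 * x * x + 10 * x      # grows like x**2 for large x

def g(x):
    return x * x

C, x0 = 4, 10                      # candidate constant and threshold
holds = all(abs(f(x)) <= C * abs(g(x)) for x in range(x0, 10_000))
```

Here 3x² + 10x ≤ 4x² exactly when x ≥ 10, so the check passes; below x0 the bound can fail, which is why the definition only demands it for sufficiently large x.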

An additional explanation: no matter how close we require the function to be to its limit, it must always be possible to find a corresponding range of x values close to the given value. Using the notations epsilon (ε) and delta (δ), we require f(x) to be within ε of L, the limit, whenever x is within δ of c (Bradley et al., 2000).

Again, since this is tricky, let's resume our example from before: f(x) = x² at x = 2. To start, let's say we want f(x) to be within 0.01 of the limit. We know by now that the limit should be 4, so we say: for ε = 0.01, there is some δ so that as long as |x − 2| < δ, then |f(x) − 4| < 0.01.

To show this, we can pick any delta (δ) that is bigger than 0, so long as it works. For example, you might pick 0.000000001, because you are absolutely sure that if x is within 0.000000001 of 2, then f(x) will be within 0.01 of 4. This works for this particular ε. But we can't just pick one specific value of ε, like 0.01, because we said in our definition "for every ε." This means that we need to be able to produce an infinite number of δs, one for each ε.
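The "one δ for each ε" requirement can be checked numerically. A small Python sketch; the sampling scheme is an informal check rather than a proof, and uses the f(x) = x² example at x = 2:

```python
# For f(x) = x**2 near x = 2 (limit L = 4), test whether a candidate
# delta works for a given eps: |x - 2| < delta should force |f(x) - 4| < eps.
def f(x):
    return x * x

def delta_works(eps, delta, c=2.0, L=4.0, samples=1000):
    for i in range(1, samples + 1):
        offset = delta * i / (samples + 1)   # offsets strictly inside (0, delta)
        for x in (c - offset, c + offset):
            if abs(f(x) - L) >= eps:
                return False
    return True

tiny_delta_ok = delta_works(0.01, 0.000000001)  # the cautious delta from the text
huge_delta_ok = delta_works(0.01, 1.0)          # far too generous a delta
```

The cautious δ passes while the generous one fails, which is exactly the game the definition asks us to win for every ε.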

In summary, integration is indefinite when the limits of integration are not given, meaning that the upper and lower limits are not stated. Integration is definite when the limits are given, and you then calculate the area under the curve from x = c to x = d.
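The summary above can be made concrete with a Riemann sum: slice the interval [c, d] into thin rectangles and add their areas. A sketch in Python; the function x² and the interval [0, 2] are chosen for illustration:

```python
# Midpoint Riemann sum approximating the definite integral of f from c to d.
def riemann_sum(f, c, d, n=10_000):
    width = (d - c) / n
    total = 0.0
    for i in range(n):
        midpoint = c + (i + 0.5) * width
        total += f(midpoint) * width   # area of one thin rectangle
    return total

# Definite integral of x**2 from x = 0 to x = 2; the exact value is 8/3.
approx = riemann_sum(lambda x: x * x, 0.0, 2.0)
```

As n grows the sum converges to the definite integral, which is the idea the Riemann sum definition formalizes.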

Examples… [read more]


Mathematical Modeling Term Paper

Term Paper  |  4 pages (1,320 words)
Bibliography Sources: 1+

SAMPLE TEXT:

¶ … Mathematical Modeling

Although even complex mathematical modeling is certainly not new, the process has been facilitated enormously in recent years by the introduction of computer-based modeling applications. Despite these innovations, there are still some significant limitations to mathematical modeling that must be taken into account when using these techniques. To gain some additional insights in this area, this paper provides a review of the relevant literature to identify the benefits and limitations of mathematical modeling, a discussion concerning the use of mathematical modeling in the author's profession and the extent to which such modeling is used as value-added to other kinds of empirical research, and the extent to which it is used in place of other kinds of empirical research. A summary of the research and important findings are presented in the conclusion.

Review and Analysis

Serious interest in mathematical modeling emerged during the mid-20th century when computer science was in its infancy but the need for ways to simulate real-world situations became pronounced. According to Maxwell (2004), "The federal government and many private enterprises have used mathematical modeling since the late 1950s as aids in developing policies, conducting research and development, and engineering complex systems" (p. 67). Today, computer-driven mathematical modeling applications have a number of real-world applications, including gambling and sports simulations as well as modeling human interactions for couples therapy and other "people prediction" applications (Albert, 2002). In this regard, Oliver and Myers report that, "Game theory provides a rich history of considering the strategies derived from various payoff structures, rules about repeating the game, and how players communicate" (p. 34). Mathematical modeling has proven efficacy in other settings as well, including the entire range of economic analyses (Oliver & Myers, n.d.) and even enormously complex weather prediction applications (Kirlik, 2006). Moreover, mathematical modeling has been used to good effect in helping researchers better understand how physiological processes operate at the molecular level. For example, Peter (2008) reports that, "Mathematical models allow researchers to investigate how complex regulatory processes are connected and how disruptions of these processes may contribute to the development of disease" (p. 49).

Furthermore, mathematical modeling can facilitate the systematic analyses of various "what-if"-type scenarios (Oliver & Myers, n.d.), formulate new hypotheses to serve as the basis for regimens of therapeutic interventions and even to evaluate the appropriateness of specific molecules for therapeutic purposes (Peter, 2008). According to Peter, "Numerous mathematical methods have been developed to address different categories of biological processes, such as metabolic processes or signaling and regulatory pathways. Today, modeling approaches are essential for biologists, enabling them to analyze complex physiological processes, as well as for the pharmaceutical industry, as a means for supporting drug discovery and development programs" (2008, p. 50). In fact, some authorities suggest that the limits of mathematical modeling are fundamentally human-based rather than technologically restricted. In this regard, Maxwell (2004) points out that, "Mathematical modeling and computer simulation are limited only by the ingenuity of the person or team conducting the analysis. They have… [read more]


Score Z Scores Z Research Proposal

Research Proposal  |  1 pages (351 words)
Bibliography Sources: 2

SAMPLE TEXT:

= 1.33

P = Area of the curve beyond 1.33

= 0.0918, or 9.18%

b) Less than 50 minutes

Using the formula Z = (X − μ) / σ

Z = (50 − 60)/15

= −10/15

= −0.67

P= Area of the curve below -0.67

P=0.2514

%=25.14%

c) Between 45 and 75 minutes

Using the formula Z = (X − μ) / σ

Z(45) = (45 − 60)/15

= −1

Z(75) = (75 − 60)/15

= 1

P between -1 and 1

= 0.3413 + 0.3413

=0.6826

% =68.26%

Question 3

Bob takes an online IQ test and finds that his IQ according to the test is 134. Assuming that the mean IQ is 100, the standard deviation is 15, and the distribution of IQ scores is normal, what proportion of the population would score higher than Bob? Lower than Bob?

Proportion higher than Bob would be the area of the curve beyond Bob's score, the proportion lower than Bob would be the area of the curve below Bob's score.

Using the formula Z = (X − μ) / σ

Z = (134 − 100)/15

= 34/15

= 2.27

P Higher than Bob's Score = 0.0116

P Lower than Bob's Score = 0.9884
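The z-score arithmetic above can be reproduced with the standard normal CDF. A sketch in Python using only the standard library; small rounding differences from printed table values are expected:

```python
import math

def z_score(x, mu, sigma):
    """Z = (X - mu) / sigma."""
    return (x - mu) / sigma

def normal_cdf(z):
    """Area under the standard normal curve to the left of z."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Bob's IQ of 134 against a mean of 100 and standard deviation of 15.
z = z_score(134, 100, 15)       # about 2.27
lower = normal_cdf(z)           # proportion scoring lower than Bob
higher = 1.0 - normal_cdf(z)    # proportion scoring higher than Bob
```

The two proportions necessarily sum to 1, a quick sanity check on any pair of "above" and "below" areas.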

References…… [read more]


Strategic Plan on Janix Healthcare Consultation Essay

Essay  |  3 pages (929 words)
Bibliography Sources: 7

SAMPLE TEXT:

¶ … people test hypotheses?

A hypothesis is a statement that predicts the outcome of an experiment or research study. Every experiment must have a hypothesis statement, which shows the aim of the experiment. It is usually an educated guess and indicates the expectations of the researcher. Carrying out a number of experiments can either support or refute a hypothesis (Moschopoulos & Davidson, 1985).

A hypothesis is formed after the literature study has been finished and the problem of the study stated. There are different types of hypotheses: an inductive hypothesis is based on specific observations; a deductive hypothesis provides evidence that expands, supports, or contradicts a theory; a non-directional hypothesis states that a relationship or difference exists between variables; a directional hypothesis specifies the expected direction of the relationship or difference between the variables; and a null hypothesis states that there is no significant relationship or difference between the variables (Dembo & Peres, 1994).

A hypothesis should be specific and its concepts clear. It should be testable and related to theory. A hypothesis should identify specific variables, must be verifiable, and must be stated in simple terms that are easy to understand.

For example, in the article about the changing roles of teachers in an era of high stakes accountability, it is hypothesized that the teachers have changing roles as high stakes accountability becomes increasingly pervasive in their day-to-day work. In the second article on the supervisor perceptions of the quality of troops to teachers program, the hypothesis is that the T3 program increases the professional education level of the teachers and has a positive impact on the achievement of students.

Theory

A theory is a statement or principle that has been well established so as to explain and describe the cause and effect of a certain research question. A theory summarizes a set of hypotheses that have been accepted as true by a number of experiments; a theory is what hypotheses become when they are accepted as true. A theory remains valid until evidence that disputes it arises. Scientific theories that have held up include Newton's theory of gravity, which has enabled man to send astronauts to the moon and launch satellites. Other theories include Maxwell's theory of electromagnetism, the periodic law underlying the periodic table, Einstein's theory of relativity, quantum theory, and Darwin's theory of evolution. These theories have been tested and accepted by the scientific community. Components of a theory can be improved upon or changed by further experiments in the future, but the overall truth of the theory is not changed (Loosen, 1997).

Sources of theories and hypothesis

The activities that happen in our daily life may serve as a source for developing a hypothesis. An example of…… [read more]


How Do We Combat Math Anxiety? Research Paper

Research Paper  |  5 pages (1,548 words)
Bibliography Sources: 5

SAMPLE TEXT:

Math Anxiety

How to Combat Math Anxiety

Causes for Anxiety

How to Begin to Help

What Schools Can Do

What Parents Can Do

Albert Einstein once stated, "Do not worry about your difficulties in mathematics; I assure you mine are greater."

Yet no matter how great this man's difficulties were, for those suffering from math anxiety, any math problem is… [read more]


Bayes Probability Can Bayes Confirmation Essay

Essay  |  8 pages (2,345 words)
Bibliography Sources: 7

SAMPLE TEXT:

[footnoteRef:7] He was studying the paradox which arises from the use of Bayes' theorem when trying to explain phenomena in the field of psychology. He states that: [7: P.E. Meehl, 'Theory-testing in psychology and physics: A methodological paradox', Philosophy of Science, vol. 34, 1967, pp. 103-115.]

'In the physical sciences [physics, and others], the usual result in an improvement in… [read more]


Devise a Standard of Existence Rule Essay

Essay  |  4 pages (1,190 words)
Bibliography Sources: 4

SAMPLE TEXT:

¶ … Existence / Rule for Existence

Existence is a philosophical question that has eluded thinkers for centuries. From as early as ancient Greece, philosophers have sought to define existence as a concept encompassing not only the physical world but also objects that exist on different non-physical planes. It is these issues that present the challenge in determining the true meaning of existence and non-existence.

An object exists when it has a form that is not in violation of any universal rules or truths. A form is any physical, metaphysical, or cognitive presentation. Universal rules are those derived in science and mathematics such as gravity, mass, geometry, and algebra. Truths are those statements that are absolute and cannot be refuted. When held against this definition of existence, a horse, the number four, and a unicorn exist whereas the square circle does not exist.

A horse is the most obvious of the items that exist. The reason is that the horse fulfills the definition perfectly. First, the horse exists in a form, two forms actually. The first is its archetypical form. This is the form in the mind that creates the definition of a horse. An object exists in an archetypical form when the mention of the object brings a specific image or definition to mind. In this case, when the word "horse" is mentioned, a person immediately conjures up the image of a four-legged mammal with hooves, a mane, and a tail. Horses also exist in a physical form as well. Their physical form exists in the third dimension alongside humans. This means that horses can be touched, smelled, heard, watched, and interacted with. It is these features that further solidify the horse's existence within the mind. Now to address the second and third parts of the definition: a horse is not in violation of any universal rules or truths. Its very definition, in fact, is solely derived from its physical form and the observations thereof. So, a horse meets the full criteria of an existing object and therefore does exist.

The number 4 also exists, except it does not exist in the same way that a horse exists. Unlike a horse, the number four is not a living thing, in the sense that it breathes, eats, or grows. It does, nonetheless, still exist. The number 4, like the horse, has two forms. The first form is the archetypical form. When the number 4 is mentioned, those trained in mathematical law immediately conjure up the image. While this time the image is not as physical as it was with the horse, the concept can still be conjured within the mind, and it is solidified when tied to another object such as the horses. When "4 horses" is mentioned, it becomes even easier to envision the number 4 in use. The second form that the number four can take is physical. Once again, unlike the horse, the physical form is not alive, but it still exists. This form, commonly referred to… [read more]


Nursing Research: Discussion Questions Quantitative Essay

Essay  |  2 pages (610 words)
Bibliography Sources: 1

SAMPLE TEXT:

The human rights of subjects must always be protected in research -- and that includes not publishing data that could result in harm to individuals, who are treated in a particular manner, based upon inaccurate data.

Q4. A simple hypothesis states the relationship between two variables. A complex hypothesis states the relationship between three or more variables. A nondirectional hypothesis states that a relationship exists between two variables. A directional hypothesis predicts the direction of the relationship between the two variables (Burns 2010: 172-173). An associative hypothesis describes phenomena that occur together, while a causal hypothesis describes one phenomenon that causes another (Burns 2010: 168). A null hypothesis states that there is no relationship between two variables (and usually the researcher wants to disprove the null hypothesis). The research hypothesis states that there is a relationship between the two variables, which the researcher is usually trying to prove.

Q5. Quantitative research attempts to accumulate numerical data about a specific phenomenon. A quantitative literature review attempts to accumulate data from a vast array of different quantitative studies, to either describe or find out specific tendencies in the types of hypotheses tested regarding the phenomenon. Of course, it is rare that all studies will reach the same conclusion, so the researchers will evaluate the quality of the studies (for example, if a study produces an anomalous result, the author of the review will likely try to determine why this is the case, such as if there was too small a sampling size). The literature review may reach a conclusion about the phenomenon, based upon statistically analyzing the data. A qualitative research study merely assesses the variety of informational studies on a particular phenomenon to paint a clearer picture of…… [read more]


Linear Regression Models (Meier, Chapter 18 Article Review

Article Review  |  5 pages (1,293 words)
Bibliography Sources: 5

SAMPLE TEXT:

Linear Regression models (Meier, Chapter 18 / 19)

These are used in order to determine whether a correlation (or relationship) exists between one element and another and, if so, in which direction (negative or positive).

The two variables are plotted on a graph: the independent variable on the x-axis (horizontal) and the dependent variable on the y-axis (vertical). The steepness and direction of the fitted line is called the 'slope'. The point where the line crosses the y-axis is called the 'intercept'.

The theorem used tells us that the slope of the line equals the change in y (DV) given a change in x (IV). The shape of the slope (its direction and gradient) describes the relationship between X and Y.

Linear regression, like the previous models, is used to apply results from a population sample to the population as a whole. Linear regression is also useful for predicting occurrences in that sphere. For instance, linear regression may be used to determine whether there is a correlation between vehicle collisions and rainy days. If so, one can predict that the stormier the weather, the greater the number of collisions.

Goodness of Fit

We will want to know the amount of error, i.e. how well the regression line fits the data. The distance of a point from the regression line is known as its error, and summing the squares of these distances quantifies the total error. Another goodness-of-fit measure is the standard error of the estimate, a calculation used to find out the extent to which results from the sampled population will correspond to the population as a whole. Thirdly, the coefficient of determination is used to measure the proportion of the total variation in the dependent variable (Y) that is explained by the independent variable (X). Complex calculations exist for this. (All of these calculations can be worked out by statistical computer programs too.)

Linear regression has various assumptions:

1. For any value in X, the errors in predicting Y are normally distributed with a mean of zero.

2. Errors do not get larger as X becomes larger; rather, the errors remain constant along the regression line regardless of the X value.

3. The errors of Y and X are independent of one another.

4. Both IV and DV (X and Y) must be interval variables (i.e. numerical data).

5. The relationships between X and Y are linear.

Ignoring these assumptions will result in faulty statistical conclusions.
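The pieces above (slope, intercept, errors, coefficient of determination) fit together in a few lines of code. A minimal ordinary-least-squares sketch in Python; the rainy-day and collision figures are invented for illustration:

```python
def fit_line(xs, ys):
    """Least-squares slope, intercept, and coefficient of determination."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Slope: change in Y (the DV) per unit change in X (the IV).
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sxx = sum((x - mean_x) ** 2 for x in xs)
    slope = sxy / sxx
    # Intercept: where the fitted line crosses the y-axis.
    intercept = mean_y - slope * mean_x
    # R squared: share of the variation in Y explained by X.
    ss_tot = sum((y - mean_y) ** 2 for y in ys)
    ss_res = sum((y - (intercept + slope * x)) ** 2 for x, y in zip(xs, ys))
    r_squared = 1 - ss_res / ss_tot
    return slope, intercept, r_squared

# Hypothetical data: rainy days per month vs. vehicle collisions.
rain = [2, 5, 7, 10, 12]
crashes = [14, 21, 27, 33, 40]
slope, intercept, r2 = fit_line(rain, crashes)
```

A positive slope with r² near 1 would support the prediction that stormier weather brings more collisions; large residuals would warn that the line fits poorly.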

Topic 2: Comparing 2 Groups

A researcher may run the same study on two different groups with one, for instance, acting as control and the other as experimental. He may then want to know whether differences are observed between the two groups.

1. Research and null hypothesis are drawn up stating that: (a) significant difference will be found, (b) significant difference will not be found between both groups.

e.g. Alternative Hyp. H1: Employees who have taken *program will have higher job scores

Null hyp (H0): There is no difference in scores between employees who have taken program and employees who have not.

2. Mean and standard deviation of each group is calculated… [read more]


Improved Your Knowledge, Skills, Abilities, and Yourself Essay

Essay  |  2 pages (704 words)
Bibliography Sources: 0

SAMPLE TEXT:

¶ … Improved Your Knowledge, Skills, Abilities, and Yourself in This Session Through This Course

The mathematical skills taught in the course are necessary for a career related to business, investment, and analysis. In other words, the course opens the path to business mathematics through the exponential and logarithmic functions that management uses in managerial information systems (MIS); these are applied, along with functions, set theory, and other allied topics, to analyze data and solve problems related to the market, and necessary decisions can be based on this knowledge. To that extent I feel I have gained a lot. Overall, the course has given me the ability to attend to problems that I once feared. Because of this course, my approach to math, which I always regarded with trepidation, has changed, and I am now ready and willing to go further in exploring mathematics, both for its academic interest and as a useful tool in my daily work.

2. Evaluation of the work you did during the session for the class and explanations of ways you could have performed better

I have been introduced to, and given guidance and training in, working out complicated functions that can provide answers to everyday questions I may have to answer in the course of my trade or occupation, and in life generally. These include working with data sets. The lessons on functions were tough, and I believe I could have put in better effort there. Functions are still hazy to me, but I found the calculation of simple things like interest and profitability very useful. I concentrated more on those, and perhaps that caused the problems in my understanding of the other topics. I believe I could have done better with the study of functions; I performed rather well overall, but I could have done better.

There were small gaps in my understanding probably a result of my anxiety…… [read more]
