"Learning Statistics with R" covers the contents of an introductory statistics class, as typically taught to undergraduate psychology students, focusing on the use of the R statistical software and adopting a light, conversational style throughout. The book discusses how to get started in R, and gives an introduction to data manipulation and writing scripts. From a statistical perspective, the book discusses descriptive statistics and graphing first, followed by chapters on probability theory, sampling and estimation, and null hypothesis testing. After introducing the theory, the book covers the analysis of contingency tables, t-tests, ANOVAs and regression. Bayesian statistics are covered at the end of the book. For more information (and the opportunity to check the book out before you buy!) visit http://ua.edu.au/ccs/teaching/lsr or http://learningstatisticswithr.com
The OpenIntro project was founded in 2009 to improve the quality and availability of education by producing exceptional books and teaching tools that are free to use and easy to modify. We feature real data whenever possible, and files for the entire textbook are freely available at openintro.org, where we also provide free videos, statistical software labs, lecture slides, course management tools, and many other helpful resources.
"While most books on statistics seem to be written as though targeting other statistics professors, John Reinard′s Communication Research Statistics is especially impressive because it is clearly intended for the student reader, filled with unusually clear explanations and with illustrations on the use of SPSS. I enjoyed reading this lucid, student-friendly book and expect students will benefit enormously from its content and presentation. Well done!" --John C. Pollock, The College of New Jersey Written in an accessible style using straightforward and direct language, Communication Research Statistics guides students through the statistics actually used in most empirical research undertaken in communication studies. This introductory textbook is the only work in communication that includes details on statistical analysis of data with a full set of data analysis instructions based on SPSS 12 and Excel XP. Key Features: Emphasizes basic and introductory statistical thinking: The basic needs of novice researchers and students are addressed, while underscoring the foundational elements of statistical analyses in research. Students learn how statistics are used to provide evidence for research arguments and how to evaluate such evidence for themselves. Prepares students to use statistics: Students are encouraged to use statistics as they encounter and evaluate quantitative research. The book details how statistics can be understood by developing actual skills to carry out rudimentary work. Examples are drawn from mass communication, speech communication, and communication disorders. Incorporates SPSS 12 and Excel: A distinguishing feature is the inclusion of coverage of data analysis by use of SPSS 12 and by Excel. Information on the use of major computer software is designed to let students use such tools immediately. Companion Web Site! A dedicated Web site includes a glossary, data sets, chapter summaries, additional readings, links to other useful sites, selected "calculators" for computation of related statistics, additional macros for selected statistics using Excel and SPSS, and extra chapters on multiple discriminant analysis and loglinear analysis. Intended Audience: Ideal for undergraduate and graduate courses in Communication Research Statistics or Methods; also relevant for many Research Methods courses across the social sciences
Statistics: A Short, Clear Guide is an accessible, humorous and easy introduction to statistics for social science students. In this refreshing book, experienced author and academic Neil Burdess shows that statistics are not the result of some mysterious "black magic", but rather the result of some very basic arithmetic. Getting rid of confusing x's and y's, he shows that it's the intellectual questions that come before and after the calculations that are important: (i) What are the best statistics to use with your data? and (ii) What do the calculated statistics tell you? Statistics: A Short, Clear Guide aims to help students make sense of the logic of statistics and to decide how best to use statistics to analyse their own data. What's more, it is not reliant on students having access to any particular kind of statistical software package. This is a very useful book for any student in the social sciences doing a statistics course or needing to do statistics for themselves for the first time.
This revised book provides a thorough explanation of the foundation of robust methods, incorporating the latest updates on R and S-Plus, robust ANOVA (Analysis of Variance) and regression. It guides advanced students and other professionals through the basic strategies used for developing practical solutions to problems, and provides a brief background on the foundations of modern methods, placing the new methods in historical context. Author Rand Wilcox includes chapter exercises and many real-world examples that illustrate how various methods perform in different situations. Introduction to Robust Estimation and Hypothesis Testing, Second Edition, focuses on the practical applications of modern, robust methods, which can greatly enhance our chances of detecting true differences among groups and true associations among variables.
* Covers the latest developments in robust regression
* Covers the latest improvements in ANOVA
* Includes the newest rank-based methods
* Describes and illustrates easy-to-use software
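To illustrate the basic idea behind the robust estimators the book develops, here is a minimal sketch in base R (the data are made up for illustration; the book itself works with much more extensive R and S-Plus functions) showing how a single outlier distorts the ordinary mean while a 20% trimmed mean is barely affected:

    # Hypothetical measurements with one extreme outlier
    x <- c(4.1, 4.3, 4.0, 4.2, 4.4, 4.1, 35.0)

    mean(x)              # ordinary mean, dragged upward by the outlier
    mean(x, trim = 0.2)  # 20% trimmed mean, a simple robust alternative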
Probability, Random Variables, Statistics, and Random Processes: Fundamentals & Applications is a comprehensive undergraduate-level textbook. With its excellent topical coverage, the focus of this book is on the basic principles and practical applications of the fundamental concepts that are extensively used in various Engineering disciplines as well as in a variety of programs in Life and Social Sciences. The text provides students with the requisite building blocks of knowledge they require to understand and progress in their areas of interest. With a simple, clear-cut style of writing, the intuitive explanations, insightful examples, and practical applications are the hallmarks of this book. The text consists of twelve chapters divided into four parts. Part-I, Probability (Chapters 1-3), lays a solid groundwork for probability theory, and introduces applications in counting, gambling, reliability, and security. Part-II, Random Variables (Chapters 4-7), discusses in detail multiple random variables, along with a multitude of frequently encountered probability distributions. Part-III, Statistics (Chapters 8-10), highlights estimation and hypothesis testing. Part-IV, Random Processes (Chapters 11-12), delves into the characterization and processing of random processes.

Other notable features include:
* Most of the text assumes no knowledge of subject matter past first-year calculus and linear algebra
* With its independent chapter structure and rich choice of topics, a variety of syllabi for different courses at the junior, senior, and graduate levels can be supported
* A supplemental website includes solutions to about 250 practice problems, lecture slides, and figures and tables from the text

Given its engaging tone, grounded approach, methodically paced flow, thorough coverage, and flexible structure, Probability, Random Variables, Statistics, and Random Processes: Fundamentals & Applications clearly serves as a must-have textbook for courses not only in Electrical Engineering, but also in Computer Engineering, Software Engineering, and Computer Science.
Mounting failures of replication in social and biological sciences give a new urgency to critically appraising proposed reforms. This book pulls back the cover on disagreements between experts charged with restoring integrity to science. It denies two pervasive views of the role of probability in inference: to assign degrees of belief, and to control error rates in a long run. If statistical consumers are unaware of assumptions behind rival evidence reforms, they can't scrutinize the consequences that affect them (in personalized medicine, psychology, etc.). The book sets sail with a simple tool: if little has been done to rule out flaws in inferring a claim, then it has not passed a severe test. Many methods advocated by data experts do not stand up to severe scrutiny and are in tension with successful strategies for blocking or accounting for cherry picking and selective reporting. Through a series of excursions and exhibits, the philosophy and history of inductive inference come alive. Philosophical tools are put to work to solve problems about science and pseudoscience, induction and falsification.