This book is an account of modeling and idealization in modern scientific practice, focusing on concrete, mathematical, and computational models. Its main topics are the nature of models, the practice of modeling, and the relationship between models and real-world phenomena. To elucidate the model/world relationship, Weisberg develops a novel account of similarity called weighted feature matching.
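Weisberg's proposal builds on Tversky-style contrast between shared and distinctive features. The sketch below illustrates that general idea only, not Weisberg's own equation: the feature sets, weights, and function names are invented for illustration.

```python
def weighted_feature_similarity(model, target, weights,
                                theta=1.0, alpha=0.5, beta=0.5):
    """Tversky-style weighted feature matching (illustrative sketch).

    model, target: sets of feature names; weights: feature -> salience.
    Shared features raise the score; features distinctive to either
    side lower it. Returns a value in [0, 1].
    """
    f = lambda feats: sum(weights.get(feat, 1.0) for feat in feats)
    shared = f(model & target)        # features the model gets right
    model_only = f(model - target)    # artifacts of the model
    target_only = f(target - model)   # target features the model omits
    denom = theta * shared + alpha * model_only + beta * target_only
    return theta * shared / denom if denom else 0.0

# Invented example: a predator-prey model compared with a real population
model_feats = {"oscillation", "phase_lag", "continuous_growth"}
target_feats = {"oscillation", "phase_lag", "discrete_generations"}
salience = {"oscillation": 2.0, "phase_lag": 1.5}   # what the modeler cares about
print(weighted_feature_similarity(model_feats, target_feats, salience))  # ~0.78
```

The weights are doing the philosophical work here: they encode which features of the target the modeler takes to matter for the purpose at hand.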
The present text sets itself apart from other titles on the subject by addressing general means and methodologies rather than taking a narrow, task-specific approach. It covers concepts, and their development, that evolved to meet the changing needs of applications, providing the reader with a general toolbox to apply to specific needs. Two important tools are presented: dimensional analysis and similarity analysis methods. The fundamental point of view, which enables one to sort all models, is that of the information flux between a model and an original, expressed through similarity and abstraction. Each chapter includes original examples and applications. In this respect, the models can be divided into several groups, and the following are dealt with in separate chapters: mathematical and physical models, physical analogues, and deterministic, stochastic, and cybernetic computer models. The mathematical models are divided into asymptotic and phenomenological models. The phenomenological models, which can also be called experimental, are usually the result of an experiment on a complex object or process. The dimensionless variables contain information about the real state of boundary conditions, parameter (non-linearity) changes, and other factors. With satisfactory measurement accuracy and experimental strategy, such models are highly credible and can be used, for example, in control systems.
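As a hedged illustration of the first of these tools: dimensional analysis rests on bookkeeping of base-unit exponents, as in the following minimal sketch (the quantity table and example are invented for illustration, not taken from the book).

```python
# Minimal dimensional-analysis check: each dimension is a vector of exponents
# over the base units (mass M, length L, time T). A combination of quantities
# is dimensionless exactly when its total exponent vector is zero.

DIMS = {  # quantity -> (M, L, T) exponents
    "density":   (1, -3, 0),   # kg m^-3
    "velocity":  (0, 1, -1),   # m s^-1
    "length":    (0, 1, 0),    # m
    "viscosity": (1, -1, -1),  # kg m^-1 s^-1 (dynamic viscosity)
}

def combine(powers):
    """Exponent vector of prod(quantity**p) for a dict {quantity: p}."""
    total = [0, 0, 0]
    for name, p in powers.items():
        for i, e in enumerate(DIMS[name]):
            total[i] += e * p
    return tuple(total)

# Reynolds number Re = density * velocity * length / viscosity
re_dims = combine({"density": 1, "velocity": 1, "length": 1, "viscosity": -1})
print(re_dims)   # (0, 0, 0) -> dimensionless, as similarity analysis requires
```

Dimensionless groups of exactly this kind are what carry information between a model and its original in similarity analysis.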
This is the most comprehensive book ever published on philosophical methodology. A team of thirty-eight of the world's leading philosophers present original essays on various aspects of how philosophy should be and is done. The first part is devoted to broad traditions and approaches to philosophical methodology (including logical empiricism, phenomenology, and ordinary language philosophy). The entries in the second part address topics in philosophical methodology, such as intuitions, conceptual analysis, and transcendental arguments. The third part of the book is devoted to essays about the interconnections between philosophy and neighbouring fields, including those of mathematics, psychology, literature and film, and neuroscience.
Develops a theory of contemporary culture that relies on displacing economic notions of cultural production in favor of notions of cultural expenditure. This book represents an effort to rethink cultural theory from the perspective of a concept of cultural materialism, one that radically redefines postmodern formulations of the body.
Although computational modeling is now a widespread technique in cognitive science and in psychology, relatively little work in developmental psychology has used it. The approach is not entirely new: a small group of researchers has attempted to create computational accounts of cognitive developmental phenomena since the inception of the technique. It should seem obvious that transition mechanisms -- how the system progresses from one level of competence to the next -- ought to be the central question for investigation in cognitive developmental psychology. Yet, if one scans the literature of modern developmental studies, the question appears to have been all but ignored. One reason is that only recently have advances in computational technology given researchers access to fully self-modifying computer languages capable of simulating cognitive change. By the beginning of the 1990s, increasing numbers of researchers in the cognitive sciences were of the opinion that the tools of mathematical modeling and computer simulation make theorizing about transition mechanisms both practical and beneficial, using both traditional symbolic computational systems and parallel distributed processing (connectionist) approaches. Computational models make it possible to define the processes by which environmental influence transforms a system from one level of competence observed in children to the next, more sophisticated level. Coded into simulations of actual cognitive change, such models become tangible entities that are accessible to systematic study. Unfortunately, little of what has been produced has appeared in journals or books where many professionals would easily find it. Feeling that developmental psychologists should be exposed to this relatively new approach, a symposium was organized at the biennial meeting of the Society for Research in Child Development. The "cost of entry" was that speakers had to have a running computational model of a documented cognitive transition. Inspired by that conference, this volume is the first collection in which each content chapter presents a fully implemented, self-modifying simulation of some aspect of cognitive development. Previous collections have tended to discuss general approaches, less-than-fully-implemented models, or non-self-modifying models. Along with introductory and review chapters, this volume presents a set of truly "developmental" computational models -- a collection that can inform the interested researcher as well as form the basis for graduate-level courses.
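To make the notion of a self-modifying simulation concrete, the toy sketch below (not a model from this volume; the task and parameters are invented) shows the minimal ingredients: a structure that experience rewrites, driving a transition from chance responding to a reliable rule.

```python
# Toy "self-modifying" learner: a delta-rule unit whose connection weights
# change with experience, a minimal stand-in for the transition mechanisms
# the chapters simulate.

w = [0.0, 0.0]                                    # modifiable structure
examples = [((1, 0), 1), ((0, 1), 0),
            ((1, 1), 1), ((0, 0), 0)]             # respond 1 iff cue 0 is present

def predict(x):
    return 1 if w[0] * x[0] + w[1] * x[1] > 0.5 else 0

for epoch in range(20):                           # environmental influence
    for x, target in examples:
        error = target - predict(x)
        for i in range(2):
            w[i] += 0.2 * error * x[i]            # delta rule rewrites the system

print([predict(x) for x, _ in examples], "final weights:", w)
# -> [1, 0, 1, 0]: the learned rule now matches every example
```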
Fluid flow in fractured porous media plays a significant role in the assessment of deep underground reservoirs, for applications such as CO2 sequestration, enhanced oil recovery, and geothermal energy development. Many methods, from laboratory experimentation to theoretical analysis and numerical simulation, have been employed and have yielded many useful conclusions. This Special Issue aims to report on current advances related to this topic. The collection of 58 papers covers a wide variety of topics, including granite permeability, grouting, coal mining, roadways, and concrete, to name but a few. We sincerely hope that the papers published in this Special Issue will be an invaluable resource for our readers.
The modeling of stochastic dependence is fundamental for understanding random systems evolving in time. When measured through linear correlation, many of these systems exhibit a slow correlation decay--a phenomenon often referred to as long-memory or long-range dependence. An example of this is the absolute returns of equity data in finance. Selfsimilar stochastic processes (particularly fractional Brownian motion) have long been postulated as a means to model this behavior, and the concept of selfsimilarity for a stochastic process is now proving to be extraordinarily useful. Selfsimilarity translates into the equality in distribution between the process under a linear time change and the same process properly scaled in space, a simple scaling property that yields a remarkably rich theory with far-flung applications. After a short historical overview, this book describes the current state of knowledge about selfsimilar processes and their applications. Concepts, definitions and basic properties are emphasized, giving the reader a road map of the realm of selfsimilarity that allows for further exploration. Such topics as noncentral limit theory, long-range dependence, and operator selfsimilarity are covered alongside statistical estimation, simulation, sample path properties, and stochastic differential equations driven by selfsimilar processes. Numerous references point the reader to current applications. Though the text uses the mathematical language of the theory of stochastic processes, researchers and end-users from such diverse fields as mathematics, physics, biology, telecommunications, finance, econometrics, and environmental science will find it an ideal entry point for studying the already extensive theory and applications of selfsimilarity.
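The scaling property described here can be stated compactly. In standard notation (with H > 0 the selfsimilarity, or Hurst, exponent; the notation is ours, not necessarily the book's):

```latex
% Selfsimilarity with exponent H: stretching time by a > 0 matches
% rescaling space by a^H, in distribution.
\{ X(at) \}_{t \ge 0} \;\overset{d}{=}\; \{ a^{H} X(t) \}_{t \ge 0}
\qquad \text{for all } a > 0 .
```

Fractional Brownian motion is the canonical Gaussian example; for Hurst index H > 1/2 its increments display exactly the slow correlation decay (long-range dependence) mentioned above, while H = 1/2 recovers ordinary Brownian motion.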
Multidimensional Similarity Structure Analysis comprises a class of models that represent similarity among entities (for example, variables, items, objects, or persons) in multidimensional space, permitting one to grasp more easily the interrelations and patterns present in the data. The book is oriented both to researchers who have little or no previous exposure to data scaling and no more than a high-school background in mathematics, and to investigators who would like to extend their analyses in the direction of hypothesis and theory testing or to understand these analytic procedures more intimately. The book is replete with examples and illustrations of the various techniques, drawn largely, but not exclusively, from the social sciences, with a heavy emphasis on the concrete, geometric or spatial aspect of the data representations.
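For readers who want a computational handle on this idea, here is a minimal sketch of classical (Torgerson) multidimensional scaling, one standard member of this family; the dissimilarity matrix is invented for illustration.

```python
import numpy as np

# Classical multidimensional scaling: turn a matrix of pairwise
# dissimilarities into point coordinates whose distances approximate the data.

D = np.array([[0.0, 1.0, 4.0],
              [1.0, 0.0, 3.0],
              [4.0, 3.0, 0.0]])          # pairwise dissimilarities among 3 items

n = D.shape[0]
J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
B = -0.5 * J @ (D ** 2) @ J              # double-centered squared dissimilarities
eigvals, eigvecs = np.linalg.eigh(B)     # eigh returns ascending eigenvalues
order = np.argsort(eigvals)[::-1]        # largest eigenvalues first
k = 2                                    # embed in 2 dimensions
scale = np.sqrt(np.clip(eigvals[order[:k]], 0, None))
X = eigvecs[:, order[:k]] * scale        # coordinates: distances approximate D

print(np.round(X, 3))
```

Plotting the rows of X gives exactly the kind of spatial representation the book emphasizes: nearby points are similar items, and the configuration's geometry exposes the structure in the data.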
The ability to successfully predict industrial product performance during service life provides benefits for producers and users. This book addresses methods to improve product quality, reliability, and durability during the product life cycle, along with methods to avoid costs that can negatively impact profitability plans. The methods presented can be applied to reducing risk in the research and design processes and can be integrated with manufacturing methods to successfully predict product performance. The approach is based on laboratory simulations whose results are combined with in-field testing to determine degradation parameters, leading to improvements in product quality, performance, safety, profitability, and customer satisfaction. Among the methods of analysis included are (a sketch of the basic acceleration arithmetic follows the list):

• Accelerated Reliability Testing (ART)
• Accelerated Durability Testing (ADT)
• system variability / input variability
• engineering risk versus time and expense
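As a hedged illustration of the kind of arithmetic behind accelerated testing, the sketch below computes an Arrhenius acceleration factor, a standard way elevated stress trades for compressed test time; the activation energy and temperatures are assumptions for illustration, not values from the book.

```python
import math

# Arrhenius acceleration factor: time at an elevated stress temperature
# stands in for a longer time at the use temperature. Ea and the two
# temperatures below are illustrative assumptions.

K_BOLTZMANN_EV = 8.617e-5      # Boltzmann constant, eV/K

def arrhenius_af(ea_ev, t_use_c, t_stress_c):
    """AF = exp[(Ea/k) * (1/T_use - 1/T_stress)], temperatures in kelvin."""
    t_use = t_use_c + 273.15
    t_stress = t_stress_c + 273.15
    return math.exp((ea_ev / K_BOLTZMANN_EV) * (1.0 / t_use - 1.0 / t_stress))

af = arrhenius_af(ea_ev=0.7, t_use_c=40.0, t_stress_c=85.0)
print(f"AF ~ {af:.1f}")   # each stress hour ~ AF hours of use-condition life
```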