The concepts of formal and informal remain central to the theory and practice of development more than half a century after they were introduced into the debate. They help structure the way that statistical services collect data on the economies of developing countries, the development of theoretical and empirical analysis, and, most importantly, the formulation and implementation of policy. This volume brings together a significant new collection of studies on formality and informality in developing countries. The volume is multidisciplinary in nature, with contributions from anthropologists, economists, sociologists, and political scientists, including some of the very best analysts in development studies. Between them, the chapters argue for moving beyond the formal-informal dichotomy. Useful as the dichotomy has proven to be, a more nuanced approach is needed in light of conceptual and empirical advances, and in light of the policy failures brought about by a characterization of the 'informal' as 'disorganized'. The wealth of empirical information in these studies, and in the literature more widely, can be used to develop guiding principles for intervention that are based on ground-level reality.
The organization of the lexicon, and especially the relations between groups of lexemes, is a strongly debated topic in linguistics. Some authors have insisted on the lack of any structure in the lexicon. In this vein, Di Sciullo & Williams (1987: 3) claim that “[t]he lexicon is like a prison – it contains only the lawless, and the only thing that its inmates have in common is lawlessness”. In the alternative view, the lexicon is assumed to have a rich structure that captures all regularities and partial regularities that exist between lexical entries. Two very different schools of linguistics have insisted on the organization of the lexicon. On the one hand, for theories like HPSG (Pollard & Sag 1994), but also some versions of construction grammar (Fillmore & Kay 1995), the lexicon is assumed to have a very rich structure which captures common grammatical properties between its members. In this approach, a type hierarchy organizes the lexicon according to common properties between items. For example, Koenig (1999: 4, among others), working from an HPSG perspective, claims that the lexicon “provides a unified model for partial regularities, medium-size generalizations, and truly productive processes”. On the other hand, from the perspective of usage-based linguistics, several authors have drawn attention to the fact that lexemes which share morphological or syntactic properties tend to be organized in clusters of surface (phonological or semantic) similarity (Bybee & Slobin 1982; Skousen 1989; Eddington 1996). This approach, often called analogical, has developed highly accurate computational and non-computational models that can predict the classes to which lexemes belong. Like the organization of lexemes in type hierarchies, analogical relations between items help speakers to make sense of intricate systems and reduce apparent complexity (Köpcke & Zubin 1984).
Despite this core commonality, and despite the fact that most linguists seem to agree that analogy plays an important role in language, there has been remarkably little work on bringing these two approaches together. Formal grammar traditions have been very successful in capturing grammatical behaviour but, in the process, have downplayed the role analogy plays in language (Anderson 2015). In this work, I aim to change this state of affairs: first, by providing an explicit formalization of how analogy interacts with grammar, and second, by showing that analogical effects and relations closely mirror the structures in the lexicon. I will show that formal grammar approaches and usage-based analogical models capture mutually compatible relations in the lexicon.
Formal Axiology and Its Critics consists of two parts, both of which present criticisms of the formal theory of values developed by Robert S. Hartman, replies to these criticisms, plus a short introduction to formal axiology. Part I consists of articles published or made public during the lifetime of Hartman to which he personally replied. It contains previously published replies to Hector Neri Castañeda, William Eckhardt, and Robert S. Brumbaugh, and previously unpublished replies to Charles Hartshorne, Rem B. Edwards, Robert E. Carter, G.R. Grice, Nicholas Rescher, Robert W. Mueller, Gordon Welty, Pete Gunter, and George K. Plochmann in an unfinished but now completed article on which Hartman was working at the time of his death in 1973. Part II consists of articles presented at recent annual meetings of the R.S. Hartman Institute for Formal and Applied Axiology that continue to criticize and further develop Hartman's formal axiology. An article by Rem B. Edwards raises serious unanswered questions about formal axiology and ethics. Another by Frank G. Forrest shows how the formal value calculus based on set theory might answer these questions, and an article by Mark A. Moore points out weaknesses in the Hartman/Forrest value calculus and develops an alternative calculus based upon the mathematics of quantum mechanics. While recognizing that unsolved problems remain, the book intends to make the theoretical foundations and future promise of formal axiology much more secure. Open Access funding for this volume has been provided by the Robert S. Hartman Institute.
Formal methods are changing how epistemology is being studied and understood. A Critical Introduction to Formal Epistemology introduces the types of formal theories being used and explains how they are shaping the subject. Beginning with the basics of probability and Bayesianism, it shows how representing degrees of belief using probabilities informs central debates in epistemology. As well as discussing induction, the paradox of confirmation, and the main challenges to Bayesianism, this comprehensive overview covers objective chance, peer disagreement, the concept of full belief, and the traditional problems of justification and knowledge. Subjecting each position to a critical analysis, it explains the main issues in formal epistemology, and the motivations and drawbacks of each position. Written in accessible language and supported by study questions, guides to further reading, and a glossary, the book places positions in their historical context to give a sense of the development of the field. As the first introductory textbook on formal epistemology, A Critical Introduction to Formal Epistemology is an invaluable resource for students and scholars of contemporary epistemology.
TRENDS IN LINGUISTICS is a series of books that open new perspectives in our understanding of language. The series publishes state-of-the-art work on core areas of linguistics across theoretical frameworks as well as studies that provide new insights by building bridges to neighbouring fields such as neuroscience and cognitive science. TRENDS IN LINGUISTICS considers itself a forum for cutting-edge research based on solid empirical data on language in its various manifestations, including sign languages. It regards linguistic variation in its synchronic and diachronic dimensions as well as in its social contexts as important sources of insight for a better understanding of the design of linguistic systems and the ecology and evolution of language. TRENDS IN LINGUISTICS publishes monographs and outstanding dissertations as well as edited volumes, which provide the opportunity to address controversial topics from different empirical and theoretical viewpoints. High quality standards are ensured through anonymous reviewing.
Formal methods for the development of computer systems have been extensively studied over the years. A range of semantic theories, specification languages, design techniques, and verification methods and tools have been developed and applied to the construction of programs used in critical applications. The challenge now is to scale up formal methods and integrate them into engineering development processes for the correct and efficient construction and maintenance of computer systems in general. This requires us to improve the state of the art on approaches and techniques for integration of formal methods into industrial engineering practice, including new and emerging practice. The now long-established series of International Conferences on Formal Engineering Methods brings together those interested in the application of formal engineering methods to computer systems. Researchers and practitioners, from industry, academia, and government, are encouraged to attend and to help advance the state of the art. This volume contains the papers presented at ICFEM 2009, the 11th International Conference on Formal Engineering Methods, held December 9–11, 2009, in Rio de Janeiro, Brazil.