This edited volume covers an array of the most relevant topics in translation cognition, taking different approaches and using different research tools. It explores theoretical and methodological issues using case studies and examining their practical and pedagogical implications. It is a valuable resource for translation studies scholars, graduate students and those interested in translation and translation training, enabling them to conceptualize translation cognition, in order to enhance their research methods and designs, manage innovations in their translation training or simply understand their own translation behaviours.
As computers and communications technology advance, greater opportunities arise for intelligent mathematical computation. While computer algebra, automated deduction and mathematical publishing each have long and successful histories, we are now seeing increasing opportunities for synergy among them. The Conferences on Intelligent Computer Mathematics (CICM 2009) is a collection of co-located meetings, allowing researchers and practitioners active in these related areas to share recent results and identify the next challenges. The specific areas of the CICM conferences and workshops are described below, but the unifying theme is the computerized handling of mathematical knowledge. The successful formalization of much of mathematics, as well as a better understanding of its internal structure, makes mathematical knowledge in many ways more tractable than general knowledge, as traditionally treated in artificial intelligence. Similarly, we can also expect the problem of effectively using mathematical knowledge in automated ways to be much more tractable. This is the goal of the work in the CICM conferences and workshops. In the long view, solving the problems addressed by CICM is an important milestone in formulating the next generation of mathematical software.
This book provides a wide variety of algorithms and models to integrate linguistic knowledge into Statistical Machine Translation (SMT). It helps advance conventional SMT to linguistically motivated SMT by enhancing the following three essential components: translation, reordering and bracketing models. It also serves the purpose of promoting the in-depth study of the impacts of linguistic knowledge on machine translation. Finally it provides a systematic introduction of Bracketing Transduction Grammar (BTG) based SMT, one of the state-of-the-art SMT formalisms, as well as a case study of linguistically motivated SMT on a BTG-based platform.
Translation is in motion. Technological developments, digitalisation and globalisation are among the many factors affecting and changing translation and, with it, translation studies. Moving Boundaries in Translation Studies offers a bird’s-eye view of recent developments and discusses their implications for the boundaries of the discipline. With 15 chapters written by leading translation scholars from around the world, the book analyses new translation phenomena, new practices and tools, new forms of organisation, new concepts and names, as well as new scholarly approaches and methods. This is key reading for scholars, researchers and advanced students of translation and interpreting studies. The Open Access version of this book, available at http://www.taylorfrancis.com, has been made available under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 license.
Translation practice and workflows have witnessed significant changes during the last decade. New market demands to handle digital content as well as technological advances are leading this transition. The development and integration of machine translation systems have given post-editing practices a reason to be in the context of professional translation services. Translators may still work from a source text, but more often than not they are presented with already translated text involving different degrees of translation automation. This scenario radically changes the cognitive demands of translation. Technological development has inevitably influenced the translation research agenda as well. It has provided new means of penetrating deeper into the cognitive processes that make translation possible and has endorsed new concepts and theories to understand the translation process. Computational analysis of eye movements and keystroke behaviour provides us with new insights into translational reading, processes of literality, effects of directionality, similarities between inter- and intralingual translation, as well as the effects of post-editing on cognitive processes and on the quality of the final outcome. All of these themes are explored in-depth in the articles in this volume which presents new and valuable insights to anyone interested in what is currently happening in empirical, process-oriented translation research.
Information modeling techniques are used during information systems analysis and design and form an important part of information systems development methodologies. An optimal information modeling technique may be defined as the technique most appropriate for a specific situation, as indicated by certain contingency factors. Optimal Information Modeling Techniques examines these methods and presents the most recent research in the field, as applied to the management applications of modern organizations.
Artificial intelligence (AI) plays a vital part in the continued development of computer science and informatics. The AI applications employed in fields such as medicine, economics, linguistics, philosophy, psychology and logical analysis, not forgetting industry, are now indispensable for the effective functioning of a multitude of systems. This book presents the papers from the 20th biennial European Conference on Artificial Intelligence, ECAI 2012, held in Montpellier, France, in August 2012. The ECAI conference remains Europe's principal opportunity for researchers and practitioners of Artificial Intelligence to gather and to discuss the latest trends and challenges in all subfields of AI, as well as to demonstrate innovative applications and uses of advanced AI technology. ECAI 2012 featured four keynote speakers, an extensive workshop program, seven invited tutorials and the new Frontiers of Artificial Intelligence track, in which six invited speakers delivered perspective talks on particularly interesting new research results, directions and trends in Artificial Intelligence or in one of its related fields. The proceedings of PAIS 2012 and the System Demonstrations Track are also included in this volume, which will be of interest to all those wishing to keep abreast of the latest developments in the field of AI.
This book constitutes the refereed proceedings of the 4th CCF Conference, NLPCC 2015, held in Nanchang, China, in October 2015. The 35 revised full papers presented together with 22 short papers were carefully reviewed and selected from 238 submissions. The papers are organized in topical sections on fundamentals of language computing; applications of language computing; NLP for search technology and ads; web mining; and knowledge acquisition and information extraction.