This textbook gives a systematized and compact summary of the most essential types of modern models for languages and computation, together with their properties and applications. Most of these models properly reflect and formalize current computational methods based on parallelism, distribution, and cooperation, and thus allow the user to develop, study, and improve these methods very effectively. The textbook also represents the first systematic treatment of modern language models for computation, covering all essential theoretical topics concerning them. From a practical viewpoint, it describes various concepts, methods, algorithms, techniques, and software units based upon these models and, building on them, several applications in biology, linguistics, and computer science. Advanced-level students of computer science, mathematics, linguistics, and biology will find this textbook a valuable resource. Theoreticians, practitioners, and researchers working in today’s theory of computation and its applications will also find it essential as a reference.
The theory of computation is used to address challenges arising in many computer science areas such as artificial intelligence, language processors, compiler writing, information and coding systems, programming language design, computer architecture, and more. To grasp topics concerning this theory, readers need to familiarize themselves with its computational and language models, which are based on concepts of discrete mathematics including sets, relations, functions, graphs, and logic.
A Concise Introduction to Computation Models and Computability Theory provides an introduction to the essential concepts of computability, using several models of computation, from the standard Turing machines and recursive functions to modern computation models inspired by quantum physics. An in-depth analysis of the basic concepts underlying each model of computation is provided. The book is divided into two parts. The first highlights the traditional computation models used in the earliest studies of computability: automata and Turing machines; recursive functions and the lambda calculus; and logic-based computation models. The second part covers object-oriented and interaction-based models. There is also a chapter on concurrency and a final chapter on emergent computation models inspired by quantum mechanics. Each chapter ends with a discussion of the use of computation models in the design of programming languages.
An introduction to a broad range of topics in deep learning, covering mathematical and conceptual background, deep learning techniques used in industry, and research perspectives. “Written by three experts in the field, Deep Learning is the only comprehensive book on the subject.” —Elon Musk, cochair of OpenAI; cofounder and CEO of Tesla and SpaceX Deep learning is a form of machine learning that enables computers to learn from experience and understand the world in terms of a hierarchy of concepts. Because the computer gathers knowledge from experience, there is no need for a human computer operator to formally specify all the knowledge that the computer needs. The hierarchy of concepts allows the computer to learn complicated concepts by building them out of simpler ones; a graph of these hierarchies would be many layers deep. This book introduces a broad range of topics in deep learning. The text offers mathematical and conceptual background, covering relevant concepts in linear algebra, probability theory and information theory, numerical computation, and machine learning. It describes deep learning techniques used by practitioners in industry, including deep feedforward networks, regularization, optimization algorithms, convolutional networks, sequence modeling, and practical methodology; and it surveys such applications as natural language processing, speech recognition, computer vision, online recommendation systems, bioinformatics, and videogames. Finally, the book offers research perspectives, covering such theoretical topics as linear factor models, autoencoders, representation learning, structured probabilistic models, Monte Carlo methods, the partition function, approximate inference, and deep generative models. Deep Learning can be used by undergraduate or graduate students planning careers in either industry or research, and by software engineers who want to begin using deep learning in their products or platforms. 
A website offers supplementary material for both readers and instructors.
Semantic change — how the meanings of words change over time — has preoccupied scholars since well before modern linguistics emerged in the late 19th and early 20th centuries, ushering in a new methodological turn in the study of language change. Ever since, the study of semantic change has progressed steadily, accumulating over more than a century a vast store of knowledge encompassing many languages and language families; even so, compared to changes in sound and grammar, semantic change remains the least understood. Historical linguists also realized early on the potential of computers as research tools, presenting papers at the very first international conferences on computational linguistics in the 1960s. Such computational studies nevertheless tended to be small-scale, method-oriented, and qualitative. Recent years, however, have witnessed a sea change in this regard. Big-data, empirical, quantitative investigations are now coming to the forefront, enabled by enormous advances in storage capacity and processing power. Diachronic corpora have grown beyond imagination, defying exploration by traditional manual qualitative methods, and language technology has become increasingly data-driven and semantics-oriented. These developments present a golden opportunity for the empirical study of semantic change over both long and short time spans. A major challenge at present is to integrate the hard-earned knowledge and expertise of traditional historical linguistics with the cutting-edge methodology explored primarily in computational linguistics. The idea for the present volume arose as a concrete response to this challenge: the 1st International Workshop on Computational Approaches to Historical Language Change (LChange'19), held at ACL 2019, brought together scholars from both fields.
This volume offers a survey of this exciting new direction in the study of semantic change, a discussion of the many remaining challenges that we face in pursuing it, and considerably updated and extended versions of a selection of the contributions to the LChange'19 workshop, addressing both more theoretical problems — e.g., discovery of "laws of semantic change" — and practical applications, such as information retrieval in longitudinal text archives.
The book is a collection of the best selected research papers presented at the International Conference on Intelligent Systems and Sustainable Computing (ICISSC 2021), held at the School of Engineering, Malla Reddy University, Hyderabad, India, during 24–25 September 2021. The book covers recent research in intelligent systems, intelligent business systems, soft computing, swarm intelligence, artificial intelligence and neural networks, data mining and data warehousing, cloud computing, distributed computing, big data analytics, Internet of Things (IoT), machine learning, speech processing, sustainable high-performance systems, VLSI and embedded systems, image and video processing, and signal processing and communication.
This three-volume set constitutes the refereed proceedings of the 12th National CCF Conference on Natural Language Processing and Chinese Computing, NLPCC 2023, held in Foshan, China, during October 12–15, 2023. The ____ regular papers included in these proceedings were carefully reviewed and selected from 478 submissions. They were organized in topical sections as follows: dialogue systems; fundamentals of NLP; information extraction and knowledge graph; machine learning for NLP; machine translation and multilinguality; multimodality and explainability; NLP applications and text mining; question answering; large language models; summarization and generation; student workshop; and evaluation workshop.
John Mauchly, J. Presper Eckert, Jr., and their team built ENIAC (Electronic Numerical Integrator and Computer), the first general-purpose electronic digital computer, in 1946. They built it primarily to perform calculations for weapons design during the Second World War. Since then, computers have entered every facet of our daily life. Nowadays, we use computers extensively to process data in banks, government offices, and commercial establishments. We use them to book train tickets, airline tickets, and hotel rooms. They control systems such as satellites and moon landers in real time. They create complex graphics and animation. They synthesize speech and music. They write essays and draw pictures. They control robots. Publishers use them as tools. They are used to play video games. Many devices, such as audio and video tape recorders and film cameras, have become obsolete and been replaced by digital devices. Computers have eliminated many jobs, such as typesetting, and created new jobs requiring better skills, such as programming. It is fascinating to trace this history. This book recounts the history of modern computing as a sequence of seventy-two anecdotes, beginning with how engineers at the University of Pennsylvania built ENIAC in 1946 and ending with the story of the evolution of ChatGPT and Gemini, the generative large language models released between 2022 and 2024 that give natural-language answers to natural-language questions, write essays, compose poems, and write computer programs. The anecdotes are short: each is between 1500 and 2500 words and recounts the story of an important invention in the evolution of modern computing and of the people who innovated. Together, the anecdotes cover the history of computer hardware, software, applications, computer communications, and artificial intelligence.
The set of anecdotes on hardware systems describes, among others, the history of the evolution of computers such as the IBM 701, the CDC 6600, the IBM 360 family, Digital Equipment Corporation's PDP series, Apple's early personal computers, Atlas (a pioneering British computer), the IBM PC, the Connection Machine, Cray supercomputers, the Beowulf computing cluster, IBM Roadrunner (the fastest and most expensive, at $600 million, computer in the world in 2022), and the Raspberry Pi (the cheapest, at $35). The group of anecdotes on software describes the evolution of Fortran, COBOL, BASIC, the Compatible Time-Sharing System, Unix, the CP/M operating system, MS-DOS, Project MAC, and the open-source software movement, among others. Some anecdotes cover computer applications, such as database management systems (DBMS), spreadsheets, cryptography, and the Global Positioning System (GPS). The anecdotes on computer communications recount the evolution of computer communication networks such as ALOHAnet, Ethernet, the ARPANET, and the Internet, among others. The anecdotes on artificial intelligence (AI) start with "Who coined the term Artificial Intelligence?" and recount early chess-playing programs, the evolution of neural networks, expert systems, and the history of chatbots and robots. These anecdotes are similar to a short-story collection: each is self-contained, and readers may read them in any order, picking whichever interests them. The language used in the book is simple, with no jargon; anyone with a high school education can understand the material.
KEY FEATURES • The book recounts the history of modern computing as a series of 72 anecdotes • Each anecdote tells the story of an important event in the history of computing • Each anecdote describes an invention and the people who invented it • Each anecdote is self-contained and may be read in any order • Suitable for a general reader with a high school education TARGET AUDIENCE • Students Pursuing Computer Science & IT Courses • IT Professionals • 10+2 students