Research Directions in Computer Science

Author: Albert R. Meyer

Publisher: MIT Press

Published: 1991

Total Pages: 490

ISBN-13: 9780262132572

Research Directions in Computer Science celebrates the twenty-fifth anniversary of the founding of MIT's Project MAC. It covers the full range of ongoing computer science research at the MIT Laboratory for Computer Science and the MIT Artificial Intelligence Laboratory, both of which grew out of the original Project MAC. Leading researchers from the faculties and staffs of the laboratories highlight current research and future activities in multiprocessors and parallel computer architectures, in languages and systems for distributed computing, in intelligent systems (AI) and robotics, in complexity and learning theory, in software methodology, in programming language theory, in software for engineering research and education, and in the relation between computers and economic productivity. Contributors: Abelson, Arvind, Rodney Brooks, David Clark, Fernando Corbató, William Dally, Michael Dertouzos, John Guttag, Berthold K. P. Horn, Barbara Liskov, Albert Meyer, Nicholas Negroponte, Marc Raibert, Ronald Rivest, Michael Sipser, Gerald Sussman, Peter Szolovits, and John Updike.


Research Directions in Object-Oriented Programming

Author: Bruce D. Shriver

Publisher:

Published: 1987

Total Pages: 604

ISBN-13:

Once a radical notion, object-oriented programming is one of today's most active research areas. It is especially well suited to the design of very large software projects involving many programmers all working on the same project. The original contributions in this book will provide researchers and students in programming languages, databases, and programming semantics with the most complete survey of the field available. Broad in scope and deep in its examination of substantive issues, the book focuses on the major topics of object-oriented languages, models of computation, mathematical models, object-oriented databases, and object-oriented environments. The object-oriented languages include Beta, the Scandinavian successor to Simula (a chapter by Bent Kristensen, whose group has had the longest experience with object-oriented programming, reveals how that experience has shaped the group's vision today); CommonObjects, a Lisp-based language with abstraction; Actors, a low-level language for concurrent modularity; and Vulcan, a Prolog-based concurrent object-oriented language. New computational models of inheritance, composite objects, block-structure layered systems, and classification are covered, and theoretical papers on functional object-oriented languages and object-oriented specification are included in the section on mathematical models. The three chapters on object-oriented databases (including David Maier's "Development and Implementation of an Object-Oriented Database Management System," which spans the programming and database worlds by integrating procedural and representational capability and the requirements of multi-user persistent storage) and the two chapters on object-oriented environments provide a representative sample of good research in these two important areas. Bruce Shriver is a researcher at IBM's Thomas J. Watson Research Center. Peter Wegner is a professor in the Department of Computer Science at Brown University. Research Directions in Object-Oriented Programming is included in the Computer Systems series, edited by Herb Schwetman.


The Technological Indian

Author: Ross Bassett

Publisher: Harvard University Press

Published: 2016-02-15

Total Pages: 397

ISBN-10: 0674495462

In the late 1800s, Indians seemed to be a people left behind by the Industrial Revolution, dismissed as “not a mechanical race.” Today Indians are among the world’s leaders in engineering and technology. In this international history spanning nearly 150 years, Ross Bassett—drawing on a unique database of every Indian to graduate from the Massachusetts Institute of Technology between its founding and 2000—charts their ascent to the pinnacle of high-tech professions. As a group of Indians sought a way forward for their country, they saw a future in technology. Bassett examines the tensions and surprising congruences between this technological vision and Mahatma Gandhi’s nonindustrial modernity. India’s first prime minister, Jawaharlal Nehru, sought to use MIT-trained engineers to build an India where the government controlled technology for the benefit of the people. In the private sector, Indian business families sent their sons to MIT, while MIT graduates established India’s information technology industry. By the 1960s, students from the Indian Institutes of Technology (modeled on MIT) were drawn to the United States for graduate training, and many of them stayed, as prominent industrialists, academics, and entrepreneurs. The MIT-educated Indian engineer became an integral part of a global system of technology-based capitalism and focused less on India and its problems—a technological Indian created at the expense of a technological India.


Funding a Revolution

Author: National Research Council

Publisher: National Academies Press

Published: 1999-02-11

Total Pages: 300

ISBN-10: 0309062780

The past 50 years have witnessed a revolution in computing and related communications technologies. The contributions of industry and university researchers to this revolution are manifest; less widely recognized is the major role the federal government played in launching the computing revolution and sustaining its momentum. Funding a Revolution examines the history of computing since World War II to elucidate the federal government's role in funding computing research, supporting the education of computer scientists and engineers, and equipping university research labs. It reviews the economic rationale for government support of research, characterizes federal support for computing research, and summarizes key historical advances in which government-sponsored research played an important role. Funding a Revolution contains a series of case studies in relational databases, the Internet, theoretical computer science, artificial intelligence, and virtual reality that demonstrate the complex interactions among government, universities, and industry that have driven the field. It offers a series of lessons that identify factors contributing to the success of the nation's computing enterprise and the government's role within it.


Algorithms, Automation, and News

Author: Neil Thurman

Publisher: Routledge

Published: 2021-05-18

Total Pages: 246

ISBN-10: 100038439X

This book examines the growing importance of algorithms and automation—including emerging forms of artificial intelligence—in the gathering, composition, and distribution of news. In it the authors connect a long line of research on journalism and computation with scholarly and professional terrain yet to be explored. Taken as a whole, these chapters share some of the noble ambitions of the pioneering publications on ‘reporting algorithms’, such as a desire to see computing help journalists in their watchdog role by holding power to account. However, they also go further, firstly by addressing the fuller range of technologies that computational journalism now consists of: from chatbots and recommender systems to artificial intelligence and atomised journalism. Secondly, they advance the literature by demonstrating the increased variety of uses for these technologies, including engaging underserved audiences, selling subscriptions, and recombining and re-using content. Thirdly, they problematise computational journalism by, for example, pointing out some of the challenges inherent in applying artificial intelligence to investigative journalism and in trying to preserve public service values. Fourthly, they offer suggestions for future research and practice, including by presenting a framework for developing democratic news recommenders and another that may help us think about computational journalism in a more integrated, structured manner. The chapters in this book were originally published as a special issue of Digital Journalism.


Computational Analysis and Deep Learning for Medical Care

Author: Amit Kumar Tyagi

Publisher: John Wiley & Sons

Published: 2021-08-24

Total Pages: 532

ISBN-10: 1119785723

The book details deep learning models such as ANNs, RNNs, and LSTMs, applied with valid and effective results in industrial sectors such as transportation, healthcare, the military, and agriculture, which will help researchers find solutions to their deep learning research problems. We have entered the era of smart world devices, where robots or machines are being used in most applications to solve real-world problems. These smart machines/devices reduce the burden on doctors, which in turn makes their lives easier and the lives of their patients better, thereby increasing patient longevity, which is the ultimate goal of computer vision. Therefore, the goal in writing this book is to provide complete information on reliable deep learning models required for e-healthcare applications. Ways in which deep learning can enhance healthcare images or text data for making useful decisions are discussed. Also presented are reliable deep learning models, such as neural networks, convolutional neural networks, backpropagation, and recurrent neural networks, which are increasingly being used in medical image processing, including for colorization of black and white X-ray images, automatic machine translation of images, object classification in photographs/images (CT scans), character or useful generation (ECG), image caption generation, etc. Hence, reliable deep learning methods for the perception or production of better results are a necessity for highly effective e-healthcare applications. Currently, the most difficult data-related problem that needs to be solved concerns the rapid increase of data occurring each day via billions of smart devices. To address the growing amount of data in healthcare applications, challenges such as the lack of standard tools, efficient algorithms, and a sufficient number of skilled data scientists need to be overcome. Hence, there is growing interest in investigating deep learning models and their use in e-healthcare applications. Audience: Researchers in artificial intelligence, big data, computer science, and electronic engineering, as well as industry engineers in transportation, healthcare, biomedicine, the military, and agriculture.


Architectural and Operating System Support for Virtual Memory

Author: Abhishek Bhattacharjee

Publisher: Springer Nature

Published: 2022-05-31

Total Pages: 168

ISBN-10: 3031017579

This book provides computer engineers, academic researchers, new graduate students, and seasoned practitioners an end-to-end overview of virtual memory. We begin with a recap of foundational concepts and discuss not only state-of-the-art virtual memory hardware and software support available today, but also emerging research trends in this space. The span of topics covers processor microarchitecture, memory systems, operating system design, and memory allocation. We show how efficient virtual memory implementations hinge on careful hardware and software cooperation, and we discuss new research directions aimed at addressing emerging problems in this space. Virtual memory is a classic computer science abstraction and one of the pillars of the computing revolution. It has long enabled hardware flexibility, software portability, and overall better security, to name just a few of its powerful benefits. Nearly all user-level programs today take for granted that they will have been freed from the burden of physical memory management by the hardware, the operating system, device drivers, and system libraries. However, despite its ubiquity in systems ranging from warehouse-scale datacenters to embedded Internet of Things (IoT) devices, the overheads of virtual memory are becoming a critical performance bottleneck today. Virtual memory architectures designed for individual CPUs or even individual cores are in many cases struggling to scale up and scale out to today's systems which now increasingly include exotic hardware accelerators (such as GPUs, FPGAs, or DSPs) and emerging memory technologies (such as non-volatile memory), and which run increasingly intensive workloads (such as virtualized and/or "big data" applications). As such, many of the fundamental abstractions and implementation approaches for virtual memory are being augmented, extended, or entirely rebuilt in order to ensure that virtual memory remains viable and performant in the years to come.


Innovative Teaching Strategies and New Learning Paradigms in Computer Programming

Author: Ricardo Queirós

Publisher: IGI Global

Published: 2014-11-30

Total Pages: 339

ISBN-10: 1466673052

Courses in computer programming combine a number of different concepts, from general problem-solving to mathematical precepts such as algorithms and computational intelligence. Due to the complex nature of computer science education, teaching the novice programmer can be a challenge. Innovative Teaching Strategies and New Learning Paradigms in Computer Programming brings together pedagogical and technological methods to address the recent challenges that have developed in computer programming courses. Focusing on educational tools, computer science concepts, and educational design, this book is an essential reference source for teachers, practitioners, and scholars interested in improving the success rate of students.


New Research Directions for the National Geospatial-Intelligence Agency

Author: National Research Council

Publisher: National Academies Press

Published: 2010-08-18

Total Pages: 71

ISBN-10: 0309159997

The National Geospatial-Intelligence Agency (NGA) within the Department of Defense has the primary mission of providing timely, relevant, and accurate imagery, imagery intelligence, and geospatial information, collectively known as geospatial intelligence (GEOINT), in support of national security. In support of its mission, NGA sponsors research that builds the scientific foundation for geospatial intelligence and that reinforces the academic base, thus training the next generation of NGA analysts while developing new approaches to analytical problems. Historically, NGA has supported research in five core areas: (1) photogrammetry and geomatics, (2) remote sensing and imagery science, (3) geodesy and geophysics, (4) cartographic science, and (5) geographic information systems (GIS) and geospatial analysis. Positioning NGA for the future is the responsibility of the InnoVision Directorate, which analyzes intelligence trends, technological advances, and emerging customer and partner concepts to provide cutting-edge technology and process solutions. At the request of InnoVision, the National Research Council (NRC) held a 3-day workshop to explore the evolution of the five core research areas and to identify emerging disciplines that may improve the quality of geospatial intelligence over the next 15 years. This workshop report offers a potential research agenda that would expand NGA's capabilities and improve its effectiveness in providing geospatial intelligence.