This volume examines the challenges faced by IBM's research and development laboratories, the technological paths they chose, and how those choices affected both the company and the computer industry.
Computers are increasingly the enabling devices of the information revolution, and computing is becoming ubiquitous in every corner of society, from manufacturing to telecommunications to pharmaceuticals to entertainment. More importantly, the face of computing is changing rapidly, as even traditional rivals such as IBM and Apple Computer begin to cooperate and new modes of computing are developed. Computing the Future presents a timely assessment of academic computer science and engineering (CS&E), examining what should be done to ensure continuing progress in making discoveries that will carry computing into the twenty-first century. Above all, it advocates a broader research and educational agenda that builds on the field's impressive accomplishments.

The volume outlines a framework of priorities for CS&E, along with detailed recommendations for education, funding, and leadership. A core research agenda is outlined for six areas: processors and multiple-processor systems, data communications and networking, software engineering, information storage and retrieval, reliability, and user interfaces.

This highly readable volume examines:
- Computer science and engineering as a discipline, and how computer scientists and engineers are pushing back the frontiers of their field.
- How CS&E must change to meet the challenges of the future.
- The influence of strategic investment by federal agencies in CS&E research.
- Recent structural changes that affect the interaction of academic CS&E and the business environment.
- Specific examples of interdisciplinary and applications research in four areas: earth sciences and the environment, computational biology, commercial computing, and the long-term goal of a national electronic library.

The volume also provides a detailed look at undergraduate CS&E education, highlighting the limitations of four-year programs, and discusses the emerging importance of a master's degree in CS&E and the prospects for broadening the scope of the Ph.D. It closes with a brief look at continuing education.
This volume contains the proceedings of the 8th Conference on Foundations of Software Technology and Theoretical Computer Science, held in Pune, India, on December 21-23, 1988. This well-established Indian conference series provides an international forum for actively investigating the interface between the theory and practice of software science, as well as an annual occasion for interaction between active research communities in India and abroad. In addition to attractive invited papers, the volume contains carefully reviewed submitted papers on the following topics: Automata and Formal Languages, Graph Algorithms and Geometric Algorithms, Distributed Computing, Parallel Algorithms, Database Theory, Logic Programming, Programming Methodology, Theory of Algorithms, Semantics, and Complexity.
The aim of the conference was to present issues in parallel computing to a community of potential engineering and scientific users. Leading scientists give an overview of the state of the art in several important research areas. The classification question is taken up at various points, ranging from parametric characterizations, communication structure, and memory distribution to control and execution schemes. Central issues in multiprocessing hardware and operation are discussed, including scalability, techniques for overcoming memory latency and synchronization overhead, and the fault tolerance of communication networks. The problem of designing and debugging parallel programs in a user-friendly environment is addressed, and a number of program transformations for enhancing vectorization and parallelization in a variety of program situations are described. Two different algorithmic techniques for the solution of certain classes of partial differential equations are discussed. The properties of domain-decomposition algorithms and their mapping onto a Cray X-MP-type architecture are investigated, and an overview is given of the merits of various approaches to exploiting the acceleration potential of multigrid methods. Finally, an abstract performance-modeling technique for the behavior of applications on parallel and vector architectures is described.
Providing a sequence of steps for matching cost engineering needs with helpful computer tools, this reference addresses the issues of project complexity and uncertainty; cost estimation, scheduling, and cost control; cost and result uncertainty; engineering and general-purpose software; utilities th
This work has been selected by scholars as being culturally important and is part of the knowledge base of civilization as we know it. This work is in the public domain in the United States of America, and possibly other nations. Within the United States, you may freely copy and distribute this work, as no entity (individual or corporate) has a copyright on the body of the work. Scholars believe, and we concur, that this work is important enough to be preserved, reproduced, and made generally available to the public. To ensure a quality reading experience, this work has been proofread and republished using a format that seamlessly blends the original graphical elements with text in an easy-to-read typeface. We appreciate your support of the preservation process, and thank you for being an important part of keeping this knowledge alive and relevant.
Computational concepts and techniques have always played a major role in control engineering, ever since the first computer-based control systems were put into operation more than twenty years ago. This role has in fact accelerated over the intervening years, as both the sophistication of the available computing methods and tools and the complexity of the control problems they are used to solve have increased. In particular, the introduction of the microprocessor, and its use as a low-cost computing element in distributed computer control systems, has had a profound effect on the way the design and implementation of a control system are carried out and, to some extent, on the theory underlying the basic design strategies. The development of interactive computing has encouraged substantial growth in the use of computer-aided design methods, and robust, efficient numerical algorithms have been produced to support these methods. Major advances have also taken place in the languages used for control system implementation, notably the recent introduction of Ada™, a language whose design is based on some very fundamental computer science concepts derived and developed over the past decade. With the extremely high rate of change in the field of computer science, the more recent developments have outpaced their incorporation into new control system design and implementation techniques.
Understanding Computers and Cognition presents an important and controversial new approach to understanding what computers do and how their functioning is related to human language, thought, and action. While it is a book about computers, Understanding Computers and Cognition goes beyond the specific issues of what computers can or can't do. It is a broad-ranging discussion exploring the background of understanding in which the discourse about computers and technology takes place. Understanding Computers and Cognition is written for a wide audience, not just those professionals involved in computer design or artificial intelligence. It represents an important contribution to the ongoing discussion about what it means to be a machine, and what it means to be human.