The information age has grown out of the work of experimental computer science, which is dedicated to the development of new hardware, software, graphics, interfaces, and other computer system technologies. While it is important to society in this larger sense, experimental computer science has found an awkward fit in university environments. This volume examines what is special about experimental computer science and what can be done to achieve a better fit for its practitioners in the academic context.
Computers, communications, digital information, software: the constituents of the information age are everywhere. Being computer literate, that is, technically competent in two or three of today's software applications, is not enough anymore. Individuals who want to realize the potential value of information technology (IT) in their everyday lives need to be computer fluent: able to use IT effectively today and to adapt to changes tomorrow. Being Fluent with Information Technology sets the standard for what everyone should know about IT in order to use it effectively now and in the future. It explores three kinds of knowledge (intellectual capabilities, foundational concepts, and skills) that are essential for fluency with IT. The book presents detailed descriptions and examples of current skills and timeless concepts and capabilities, which will be useful to individuals who use IT and to the instructors who teach them.
The U.S. information technology (IT) research and development (R&D) ecosystem was the envy of the world in 1995. However, this position of leadership is not a birthright, and it is now under pressure. In recent years, the rapid globalization of markets, labor pools, and capital flows has encouraged many strong national competitors. During the same period, national policies have not sufficiently buttressed the ecosystem, or have generated side effects that have reduced its effectiveness. As a result, the U.S. position in IT leadership today has materially eroded compared with that of prior decades, and the nation risks ceding IT leadership to other nations within a generation. Assessing the Impacts of Changes in the Information Technology R&D Ecosystem calls for a recommitment to providing the resources needed to fuel U.S. IT innovation, to removing important roadblocks that reduce the ecosystem's effectiveness in generating innovation and the fruits of innovation, and to becoming a lead innovator and user of IT. The book examines these issues and makes recommendations to strengthen the U.S. IT R&D ecosystem.
Advances in computer science and technology and in biology over the last several years have opened up the possibility for computing to help answer fundamental questions in biology and for biology to help with new approaches to computing. Making the most of the research opportunities at the interface of computing and biology requires the active participation of people from both fields. While past attempts have been made in this direction, circumstances today appear to be much more favorable for progress. To help take advantage of these opportunities, this study was requested of the NRC by the National Science Foundation, the Department of Defense, the National Institutes of Health, and the Department of Energy. The report provides the basis for establishing cross-disciplinary collaboration between biology and computing, including an analysis of potential impediments and strategies for overcoming them. The report also presents a wealth of examples that should encourage students in the biological sciences to look for ways to become more effective users of computing in their studies.
These post-proceedings contain the revised versions of the papers presented at the "Symposium on Objects and Databases," which was held in Sophia-Antipolis, France, on June 13, 2000, in conjunction with the Fourteenth European Conference on Object-Oriented Programming, ECOOP 2000. This event continued the tradition established the year before in Lisbon (Portugal) with the First Workshop on Object-Oriented Databases. The goal of the symposium was to bring together researchers working in various corners of the field of objects and databases, to discuss the current state of research in the field, and to critically evaluate existing solutions in terms of their current usage, their successes and limitations, and their potential for new applications. The organizing committee received 21 papers, which were reviewed by a program committee of people active in the field of objects and databases. There were three reviews for each paper, and the organizing committee finally selected 9 long papers, 2 short papers, and a demonstration to be presented and discussed at the symposium. The selected papers cover a wide spectrum of topics, including data modeling concepts, persistent object languages, consistency and integrity of persistent data, storage structures, class versioning and schema evolution, query languages, and temporal object-oriented databases. In addition to the regular papers, the symposium included an invited presentation given by Prof. Malcolm Atkinson from the University of Glasgow (Scotland), where he heads the Persistence and Distribution Group.
Experimental algorithmics, as its name indicates, combines algorithmic work and experimentation: algorithms are not just designed, but also implemented and tested on a variety of instances. Perhaps the most important lesson in this process is that designing an algorithm is but the first step in developing robust and efficient software for applications. Based on a seminar held at Dagstuhl Castle, Germany, in September 2000, this state-of-the-art survey presents a coherent account of the work done in the area so far. The 11 carefully reviewed chapters provide complete coverage of all current topics in experimental algorithmics.
Progress in information technology (IT) has been remarkable, but the best truly is yet to come: the power of IT as a human enabler is just beginning to be realized. Whether the nation builds on this momentum or plateaus prematurely depends on today's decisions about fundamental research in computer science (CS) and the related fields behind IT. The Computer Science and Telecommunications Board (CSTB) has often been asked to examine how innovation occurs in IT, what the most promising research directions are, and what impacts such innovation might have on society. Consistent themes emerge from CSTB studies, notwithstanding changes in information technology itself, in the IT-producing sector, and in the U.S. university system, a key player in IT research. In this synthesis report, based largely on the eight CSTB reports enumerated below, CSTB highlights these themes and updates some of the data that support them.
For nearly a year, the President's Information Technology Advisory Committee (PITAC) has studied the security of the information technology (IT) infrastructure of the United States, which is essential to national and homeland security as well as everyday life. The IT infrastructure is highly vulnerable to premeditated attacks with potentially catastrophic effects; thus, it is a prime target for cyber terrorism as well as criminal acts. The IT infrastructure encompasses not only the public Internet, including e-commerce, communication, and Web services, but also the less visible systems and connections of the nation's critical infrastructures, such as power grids, air traffic control systems, financial systems, and military and intelligence systems. All of these require a secure IT infrastructure.