This distinctive book presents a history of an increasingly important class of computers, personal workstations. It is a history seen from the unique perspective of the people who pioneered their development.
How the computer became universal. Over the past fifty years, the computer has been transformed from a hulking scientific supertool and data-processing workhorse, remote from the experiences of ordinary people, to a diverse family of devices that billions rely on to play games, shop, stream music and movies, communicate, and count their steps. In A New History of Modern Computing, Thomas Haigh and Paul Ceruzzi trace these changes. A comprehensive reimagining of Ceruzzi's A History of Modern Computing, this new volume uses each chapter to recount one such transformation, describing how a particular community of users and producers remade the computer into something new. Haigh and Ceruzzi ground their accounts of these computing revolutions in the longer and deeper history of computing technology. They begin with the story of the 1945 ENIAC computer, which introduced the vocabulary of "programs" and "programming," and proceed through email, pocket calculators, personal computers, the World Wide Web, videogames, smartphones, and our current world of computers everywhere: in phones, cars, appliances, watches, and more. Finally, they consider the Tesla Model S as an object that simultaneously embodies many strands of computing.
From the first digital computer to the dot-com crash—a story of individuals, institutions, and the forces that led to a series of dramatic transformations. This engaging history covers modern computing from the development of the first electronic digital computer through the dot-com crash. The author concentrates on five key moments of transition: the transformation of the computer in the late 1940s from a specialized scientific instrument to a commercial product; the emergence of small systems in the late 1960s; the beginning of personal computing in the 1970s; the spread of networking after 1985; and, in a chapter written for this edition, the period 1995-2001. The new material focuses on the Microsoft antitrust suit, the rise and fall of the dot-coms, and the advent of open source software, particularly Linux. Within the chronological narrative, the book traces several overlapping threads: the evolution of the computer's internal design; the effect of economic trends and the Cold War; the long-term role of IBM as a player and as a target for upstart entrepreneurs; the growth of software from a hidden element to a major character in the story of computing; and the recurring issue of the place of information and computing in a democratic society. The focus is on the United States (though Europe and Japan enter the story at crucial points), on computing per se rather than on applications such as artificial intelligence, and on systems that were sold commercially and installed in quantities.
A revelatory history of the people who created the computer and the Internet, this book examines how innovation happens in the modern world, citing the pivotal contributions of such figures as Ada Lovelace, Alan Turing, Bill Gates, and Tim Berners-Lee.
One of the most important elements in the computer revolution has been agreement on technological standards. This book tells the complete story of the battle among several competing technologies in the late 1970s and early 1980s to become the compatibility standard in one high-tech arena: the local area network (LAN) industry.
Network Evolution and Applications provides a comprehensive, integrative, and accessible approach to understanding the technologies, concepts, and milestones in the history of networking. It offers an overview of the different aspects of the networking arena, including the core technologies that are essential for communication and important in our day-to-day lives. It sheds light on past networking concepts and technologies that have been revolutionary in the history of science and technology and highly impactful, and it expands on concepts such as Artificial Intelligence, Software Defined Networking, Cloud Computing, and the Internet of Things, which are very popular at present. The book focuses on the evolution of the world of networking. One can't imagine the world without the Internet today; with the Internet and present-day networking, distance hardly matters at all. The COVID-19 pandemic brought a tough time worldwide, with global lockdowns, locked homes, empty streets, stores without consumers, and offices with no or fewer staff. Thanks to modern digital networks, the culture of working from home (WFH), or working remotely over a network/Internet connection, came to the fore, with even school and university classes going online. Although WFH is not new, the COVID-19 pandemic has given it a new look, and industries are now willingly exploring how to extend it in the future. The aim of this book is to present the timeline of networking and to show the developments made and the milestones achieved along the way.
Human–Systems Integration: From Virtual to Tangible. This book is an attempt to better formalize a systemic approach to human–systems integration (HSI). Good HSI is a matter of maturity; it takes time to mature, just as it takes time for a human being to become autonomous and then mature. HSI is a matter of human–machine teaming, in which human–machine cooperation and coordination are crucial. We cannot think about engineering design without considering the people and organizations that go with it, nor can we think about new technology, new organizations, and new jobs without considering change management. More specifically, this book is a follow-up to previous contributions in human-centered design and practice in the development of virtual prototypes, which require progressive operational tangibility toward HSI. The book discusses flexibility in design and operations, the tangibility of software-intensive systems, virtual human-centered design, increasingly autonomous complex systems, the human factors and ergonomics of sociotechnical systems, systems integration, and change management in digital organizations. It will be of interest to industry, academia, those involved with systems engineering and human factors, and the broader public.