Why do computers use so much energy? What are the fundamental physical laws governing the relationship between the precise computation run by a system, whether artificial or natural, and how much energy that computation requires? This volume integrates concepts from diverse fields, cultivating a modern, nonequilibrium thermodynamics of computation.
One of the great technological and scientific innovations of the last half of the 20th century, the computer has revolutionised how we organise information, how we communicate with each other, and the way we think about the human mind. This book offers a short history of this dynamic technology, tracing its central themes from ancient times to the present.
Before Palm Pilots and iPods, PCs and laptops, the term "computer" referred to the people who did scientific calculations by hand. These workers were neither calculating geniuses nor idiot savants but knowledgeable people who, in other circumstances, might have become scientists in their own right. When Computers Were Human represents the first in-depth account of this little-known, 200-year epoch in the history of science and technology. Beginning with the story of his own grandmother, who was trained as a human computer, David Alan Grier provides a poignant introduction to the wider world of women and men who did the hard computational labor of science. His grandmother's casual remark, "I wish I'd used my calculus," hinted at a career deferred and an education forgotten, a secret life unappreciated; like many highly educated women of her generation, she studied to become a human computer because nothing else would offer her a place in the scientific world. The book begins with the return of Halley's comet in 1758 and the effort of three French astronomers to compute its orbit. It ends four cycles later, with a UNIVAC electronic computer projecting the 1986 orbit. In between, Grier tells us about the surveyors of the French Revolution, describes the calculating machines of Charles Babbage, and guides the reader through the Great Depression to marvel at the giant computing room of the Works Progress Administration. When Computers Were Human is the sad but lyrical story of workers who gladly did the hard labor of research calculation in the hope that they might be part of the scientific community. In the end, they were rewarded by a new electronic machine that took the place and the name of those who were, once, the computers.
Konrad Zuse is one of the great pioneers of the computer age. He created the first fully automated, program-controlled, freely programmable computer using binary floating-point calculation. It was operational in 1941. He built his first machines in Berlin during the Second World War, with bombs falling all around, and after the war he built up a company that was taken over by Siemens in 1967. Zuse was an inventor in the traditional style, full of fantastic ideas, but also gifted with a powerful analytical mind. Single-handedly, he developed one of the first programming languages, the Plan Calculus, including features copied only decades later in other languages. He wrote numerous books and articles and won many honors and awards. This is his autobiography, written in an engagingly lively and pleasant style, full of anecdotes, reminiscences, and philosophical asides. It traces his life from his childhood in East Prussia, through tense wartime experiences and hard times building up his business after the war, to a ripe old age and well-earned celebrity.
A primer on the underlying technologies that allow computer programs to work. Covers topics like computer hardware, combinatorial logic, sequential logic, computer architecture, computer anatomy, and input/output. Many coders are unfamiliar with the underlying technologies that make their programs run. But why should you care when your code appears to work? Because you want it to run well and not be riddled with hard-to-find bugs. You don't want to be in the news because your code had a security problem. Lots of technical detail is available online, but it's not organized or collected in a convenient place. In The Secret Life of Programs, veteran engineer Jonathan E. Steinhart explores, in depth, the foundational concepts that underlie the machine: computer hardware, how software behaves on hardware, and how people have solved problems using technology over time. You'll learn: how the real world is converted into a form that computers understand, like bits, logic, numbers, text, and colors; the fundamental building blocks that make up a computer, including logic gates, adders, decoders, registers, and memory; why designing programs to match computer hardware, especially memory, improves performance; how programs are converted into machine language that computers understand; how software building blocks are combined to create programs like web browsers; clever tricks for making programs more efficient, like loop invariance, strength reduction, and recursive subdivision; the fundamentals of computer security and machine intelligence; and project design, documentation, scheduling, portability, maintenance, and other practical programming realities. Learn what really happens when your code runs on the machine, and you'll craft better, more efficient code.
The computing technology on which we are now so dependent has risen to its position of ascendancy so rapidly that few of us have had the opportunity to take a step back and wonder where we are headed. This book urges us to do so. Taking a big-picture perspective on digital technology, Living with Computers leads the reader on a whistle-stop tour of the history of information and information technology. This journey culminates in a deep exploration of the meaning and role of computers in our lives, and what this experience might mean for the future of human society – and the very existence of humanity itself. In the face of the transformative power of computing, this book provokes us to ask big questions. If computers become integrated into our bodies, merging with the information processing of our very DNA, will computing help to shape the evolution of biological life? If artificial intelligence advances beyond the abilities of the human brain, will this overturn our anthropocentrism and lead to a new view of reality? Will we control the computers of the future, or will they control us? These questions can be discomforting, yet they cannot be ignored. This book argues that it is time to reshape the definition of our species in the context of our interaction with computing. For although such science-fiction scenarios are not likely to happen any time soon – and may, in fact, never happen – it is nevertheless vital to consider these issues now if we wish to have any influence over whatever is to come. So, humans, let's confront our possible destiny! James W. Cortada is a Senior Research Fellow at the Charles Babbage Institute at the University of Minnesota. He holds a Ph.D. in modern history and worked at IBM in various positions for 38 years, including in IBM's management research institute, The IBM Institute for Business Value (IBV). He is the author of over a dozen books on management, and nearly two dozen books on the history of information technology.
These include the Springer title From Urban Legends to Political Fact-Checking: Online Scrutiny in America, 1990-2015 (with William Aspray).
Computers and Society explores the history and impact of modern technology on everyday human life, considering its benefits, drawbacks, and repercussions. Particular attention is paid to new developments in artificial intelligence and machine learning, and the issues that have arisen from our complex relationship with AI.