We introduce the concept of “coded computing”, a novel computing paradigm that utilizes coding theory to effectively inject and leverage data/computation redundancy to mitigate several fundamental bottlenecks in large-scale distributed computing, namely the communication bandwidth, straggler (i.e., slow or failing node) delay, privacy, and security bottlenecks.
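To make the paradigm concrete, the following is a minimal illustrative sketch (not drawn from the abstract itself) of how coded redundancy can mask a straggler: a matrix-vector product is split into two row blocks and encoded into three worker tasks with one parity task, so that any two completed tasks suffice. The worker labels, block split, and (3, 2) MDS-style parity scheme are assumptions made for illustration only.

```python
# Illustrative sketch of straggler mitigation via coded computation.
# Assumption: A is split row-wise into 2 blocks and encoded into 3 tasks,
# one of which is a parity block; any 2 of the 3 results recover A @ x.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 6))        # data matrix
x = rng.standard_normal(6)

A1, A2 = np.vsplit(A, 2)               # two systematic row blocks
tasks = {                              # coded tasks sent to 3 workers (labels hypothetical)
    "w1": A1,
    "w2": A2,
    "w3": A1 + A2,                     # parity block
}

# Suppose worker "w2" straggles and never returns; decode from the other two.
results = {w: B @ x for w, B in tasks.items() if w != "w2"}
y1 = results["w1"]
y2 = results["w3"] - results["w1"]     # recover A2 @ x from the parity result
y = np.concatenate([y1, y2])

assert np.allclose(y, A @ x)           # full product recovered despite the straggler
```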
This “sobering tale of the real consequences of gender bias” (Harvard Magazine) explores how Britain lost its early dominance in computing by systematically discriminating against its most qualified workers: women. In 1944, Britain led the world in electronic computing. By 1974, the British computer industry was all but extinct. What happened in the intervening thirty years holds lessons for all postindustrial superpowers. As Britain struggled to use technology to retain its global power, the nation’s inability to manage its technical labor force hobbled its transition into the information age. In Programmed Inequality, Mar Hicks explores the story of labor feminization and gendered technocracy that undercut British efforts to computerize. That failure sprang from the government’s systematic neglect of its largest trained technical workforce simply because they were women. Women were a hidden engine of growth in high technology from World War II to the 1960s. As computing experienced a gender flip, becoming male-identified in the 1960s and 1970s, labor problems grew into structural ones, and gender discrimination caused the nation’s largest computer user, the civil service and sprawling public sector, to make decisions that were disastrous for the British computer industry and the nation as a whole. Drawing on recently opened government files, personal interviews, and the archives of major British computer companies, Programmed Inequality takes aim at the fiction of technological meritocracy. Hicks explains why, even today, possessing technical skill is not enough to ensure that women will rise to the top in science and technology fields. Programmed Inequality shows how the disappearance of women from the field had grave macroeconomic consequences for Britain, and why the United States risks repeating those errors in the twenty-first century.
The computing profession faces a serious gender crisis. Today, fewer women enter computing than at any time in the past 25 years. This book provides an unprecedented look at the history of women and men in computing, detailing how the computing profession emerged and matured, and how the field became male-coded. Women's experiences working in offices, education, libraries, programming, and government are examined for clues on how and where women succeeded, and where they struggled. It also provides a unique international dimension with studies examining the U.S., Great Britain, Germany, Norway, and Greece. Scholars in history, gender/women's studies, and science and technology studies, as well as department chairs and hiring directors, will find this volume illuminating.
An analysis of the ways that software creates new spatialities in everyday life, from supermarket checkout lines to airline flight paths. After little more than half a century since its initial development, computer code is extensively and intimately woven into the fabric of our everyday lives. From the digital alarm clock that wakes us to the air traffic control system that guides our plane in for a landing, software is shaping our world: it creates new ways of undertaking tasks, speeds up and automates existing practices, transforms social and economic relations, and offers new forms of cultural activity, personal empowerment, and modes of play. In Code/Space, Rob Kitchin and Martin Dodge examine software from a spatial perspective, analyzing the dyadic relationship of software and space. The production of space, they argue, is increasingly dependent on code, and code is written to produce space. Examples of code/space include airport check-in areas, networked offices, and cafés that are transformed into workspaces by laptops and wireless access. Kitchin and Dodge argue that software, through its ability to do work in the world, transduces space. They then develop a set of conceptual tools for identifying and understanding the interrelationship of software, space, and everyday life, and illustrate their arguments with rich empirical material. Finally, they issue a manifesto, calling for critical scholarship into the production and workings of code rather than simply the technologies it enables, a new kind of social science focused on explaining the social, economic, and spatial contours of software.
This book constitutes the refereed proceedings of the International Conference on Embedded and Ubiquitous Computing, EUC 2007, held in Taipei, Taiwan, in December 2007. The 65 revised full papers presented were carefully reviewed and selected from 217 submissions. The papers are organized in topical sections, including power-aware computing, reconfigurable embedded systems, wireless networks, real-time/embedded operating systems, and embedded system architectures.
This book constitutes the thoroughly refereed post-proceedings of the 13th International Workshop on Languages and Compilers for Parallel Computing, LCPC 2000, held in Yorktown Heights, NY, USA, in August 2000. The 22 revised full papers presented together with 5 posters were carefully selected during two rounds of reviewing and improvement. All current aspects of parallel processing are addressed with emphasis on issues in optimizing compilers, languages, and software environments in high-performance computing.
The optimization of traffic management operations has become a considerable global challenge due to the significant increase in the number of vehicles, traffic congestion, and automobile accidents. Fortunately, there has been substantial progress in the application of intelligent computing devices to transportation processes. Vehicular ad-hoc networks (VANETs) are a specific practice that merges the connectivity of wireless technologies with smart vehicles. Despite its relevance, empirical research is lacking on the developments being made in VANETs and how certain intelligent technologies are being applied within transportation systems. IoT and Cloud Computing Advancements in Vehicular Ad-Hoc Networks provides emerging research exploring the theoretical and practical aspects of intelligent transportation systems and analyzing the modern techniques that are being applied to smart vehicles through cloud technology. Featuring coverage on a broad range of topics such as health monitoring, node localization, and fault tolerance, this book is ideally designed for network designers, developers, analysts, IT specialists, computing professionals, researchers, academics, and post-graduate students seeking current research on emerging computing concepts and developments in vehicular ad-hoc networks.
The three volume set LNAI 10462, LNAI 10463, and LNAI 10464 constitutes the refereed proceedings of the 10th International Conference on Intelligent Robotics and Applications, ICIRA 2017, held in Wuhan, China, in August 2017. The 235 papers presented in the three volumes were carefully reviewed and selected from 310 submissions. The papers in this second volume of the set are organized in topical sections on industrial robot and robot manufacturing; mechanism and parallel robotics; machine and robot vision; robot grasping and control.