In-/Near-Memory Computing

Author: Daichi Fujiki

Publisher: Springer Nature

Published: 2022-05-31

Total Pages: 124

ISBN-13: 3031017722

This book provides a structured introduction to the key concepts and techniques that enable in-/near-memory computing. For decades, processing-in-memory or near-memory computing has been attracting growing interest due to its potential to break the memory wall. Near-memory computing moves compute logic near the memory and thereby reduces data movement. Recent work has also shown that certain memories can morph themselves into compute units by exploiting the physical properties of the memory cells, enabling in-situ computing in the memory array. While in- and near-memory computing can circumvent overheads related to data movement, they come at the cost of restricted flexibility of data representation and computation, design challenges of compute-capable memories, and difficulty in system and software integration. Therefore, wide deployment of in-/near-memory computing cannot be accomplished without techniques that enable efficient mapping of data-intensive applications to such devices without sacrificing accuracy or excessively increasing hardware costs. This book describes various memory substrates amenable to in- and near-memory computing, architectural approaches for designing efficient and reliable computing devices, and opportunities for in-/near-memory acceleration of different classes of applications.
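One commonly cited example of in-situ computing is bulk bitwise logic performed directly inside the memory array (for instance by activating multiple rows at once). The toy Python sketch below is not tied to any specific device described in the book; it only models, at the functional level, how a row-wide AND produced inside the array avoids streaming both operand rows to the CPU. The array dimensions are illustrative assumptions.

```python
import numpy as np

# Toy functional model of an in-memory bulk bitwise operation.
# Rows of a hypothetical memory array are modeled as bit vectors;
# "in situ" means the result is written to another row inside the
# array instead of being shipped over the memory bus to the host.

ROW_BITS = 8192   # illustrative row width
NUM_ROWS = 1024   # illustrative array height

rng = np.random.default_rng(0)
memory_array = rng.integers(0, 2, size=(NUM_ROWS, ROW_BITS), dtype=np.uint8)

def in_situ_and(array, src_a, src_b, dst):
    """Row-wide AND computed inside the array; no data leaves the array."""
    array[dst] = array[src_a] & array[src_b]

def host_and(array, src_a, src_b):
    """Conventional path: both rows are copied to the host and combined there."""
    row_a = array[src_a].copy()   # models a row transfer over the memory bus
    row_b = array[src_b].copy()
    return row_a & row_b

in_situ_and(memory_array, 0, 1, 2)
assert np.array_equal(memory_array[2], host_and(memory_array, 0, 1))
```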


Neuromorphic Computing and Beyond

Author: Khaled Salah Mohamed

Publisher: Springer Nature

Published: 2020-01-25

Total Pages: 241

ISBN-13: 3030372243

This book discusses and compares several new trends that can be used to overcome Moore’s law limitations, including Neuromorphic, Approximate, Parallel, In Memory, and Quantum Computing. The author shows how these paradigms are used to enhance computing capability as developers face the practical and physical limitations of scaling, while the demand for computing power keeps increasing. The discussion includes a state-of-the-art overview and the essential details of each of these paradigms.


Applied Reconfigurable Computing. Architectures, Tools, and Applications

Author: Steven Derrien

Publisher: Springer Nature

Published: 2021-06-23

Total Pages: 338

ISBN-13: 3030790258

This book constitutes the proceedings of the 17th International Symposium on Applied Reconfigurable Computing, ARC 2021, held as a virtual event in June 2021. The 14 full papers and 11 short presentations in this volume were carefully reviewed and selected from 40 submissions. The papers cover a broad spectrum of applications of reconfigurable computing, from driving assistance, data and graph processing acceleration, and computer security to the societally relevant topic of supporting early diagnosis of Covid-19 infections.


The Apache Ignite Book

Author: Michael Zheludkov

Publisher: Lulu.com

Published: 2019-02-25

Total Pages: 642

ISBN-13: 0359439373

Apache Ignite is one of the most widely used open-source memory-centric distributed caching and processing platforms. It lets users employ the platform as an in-memory computing framework or as a fully functional, persistent data store with SQL and ACID transaction support. Apache Ignite can also be used to accelerate existing relational and NoSQL databases, to process events and streaming data, or to develop microservices in a fault-tolerant fashion. This book addresses anyone interested in learning in-memory computing and distributed databases. It aims to give readers with little to no experience of Apache Ignite the opportunity to learn how to use the platform effectively from scratch, taking a practical, hands-on approach. Please see the table of contents for more details.
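As a flavor of the in-memory key-value usage the book covers, here is a minimal sketch using the Python thin client (pyignite) rather than the Java APIs the book focuses on. It assumes an Ignite node is already running locally on the default thin-client port (10800); the cache name and values are invented for illustration.

```python
from pyignite import Client  # thin client: pip install pyignite

# Connect to an Apache Ignite node assumed to be running locally.
client = Client()
client.connect('127.0.0.1', 10800)

# Treat Ignite as a distributed in-memory key-value store.
cache = client.get_or_create_cache('greetings')  # cache name is illustrative
cache.put(1, 'Hello')
cache.put(2, 'World')
print(cache.get(1), cache.get(2))  # Hello World

client.close()
```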


Artificial Intelligence Hardware Design

Author: Albert Chun-Chen Liu

Publisher: John Wiley & Sons

Published: 2021-08-23

Total Pages: 244

ISBN-13: 1119810477

In Artificial Intelligence Hardware Design: Challenges and Solutions, distinguished researchers and authors Drs. Albert Chun-Chen Liu and Oscar Ming Kin Law deliver a rigorous and practical treatment of the design of circuits and systems for accelerating neural network processing, with real-world examples from leading voices in the field. Beginning with a discussion of neural networks and their developmental history, the book goes on to describe parallel architectures, streaming graphs for massively parallel computation, and convolution optimization. The authors illustrate in-memory computation through Georgia Tech's Neurocube and Stanford's Tetris accelerators built on the Hybrid Memory Cube, as well as near-memory architecture through the embedded DRAM (eDRAM) of the Institute of Computing Technology, Chinese Academy of Sciences, and other institutions. Readers will also find a discussion of 3D neural processing techniques that support multi-layer neural networks, as well as:
- A thorough introduction to neural networks, their development history, and Convolutional Neural Network (CNN) models
- Explorations of various parallel architectures, including the Intel CPU, Nvidia GPU, Google TPU, and Microsoft NPU, emphasizing hardware/software integration for performance improvement
- Discussions of streaming graphs for massively parallel computation with the Blaize GSP and Graphcore IPU
- An examination of how to optimize convolution with the UCLA Deep Convolutional Neural Network accelerator's filter decomposition
Perfect for hardware and software engineers and firmware developers, Artificial Intelligence Hardware Design is an indispensable resource for anyone working with Neural Processing Units in either a hardware or software capacity.
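To make the "convolution optimization" theme concrete, the sketch below contrasts a naive direct 2D convolution with the common im2col-plus-matrix-multiply lowering that many accelerators and libraries rely on. It is a generic NumPy illustration, not the UCLA filter-decomposition technique covered in the book, and the tensor sizes are arbitrary.

```python
import numpy as np

def conv2d_direct(x, w):
    """Naive direct convolution: x is (H, W), w is (K, K), stride 1, no padding."""
    H, W = x.shape
    K, _ = w.shape
    out = np.zeros((H - K + 1, W - K + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + K, j:j + K] * w)
    return out

def conv2d_im2col(x, w):
    """Same convolution expressed as a single matrix multiply (im2col lowering)."""
    H, W = x.shape
    K, _ = w.shape
    oh, ow = H - K + 1, W - K + 1
    cols = np.empty((oh * ow, K * K))
    for i in range(oh):
        for j in range(ow):
            cols[i * ow + j] = x[i:i + K, j:j + K].ravel()
    return (cols @ w.ravel()).reshape(oh, ow)

x = np.random.rand(8, 8)   # illustrative input feature map
w = np.random.rand(3, 3)   # illustrative 3x3 filter
assert np.allclose(conv2d_direct(x, w), conv2d_im2col(x, w))
```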


Data Analytics with Hadoop

Author: Benjamin Bengfort

Publisher: "O'Reilly Media, Inc."

Published: 2016-06

Total Pages: 288

ISBN-13: 1491913762

Ready to use statistical and machine-learning techniques across large data sets? This practical guide shows you why the Hadoop ecosystem is perfect for the job. Instead of deployment, operations, or software development usually associated with distributed computing, you'll focus on particular analyses you can build, the data warehousing techniques that Hadoop provides, and higher-order data workflows this framework can produce. Data scientists and analysts will learn how to perform a wide range of techniques, from writing MapReduce and Spark applications with Python to using advanced modeling and data management with Spark MLlib, Hive, and HBase. You'll also learn about the analytical processes and data systems available to build and empower data products that can handle (and actually require) huge amounts of data.
- Understand core concepts behind Hadoop and cluster computing
- Use design patterns and parallel analytical algorithms to create distributed data analysis jobs
- Learn about data management, mining, and warehousing in a distributed context using Apache Hive and HBase
- Use Sqoop and Apache Flume to ingest data from relational databases
- Program complex Hadoop and Spark applications with Apache Pig and Spark DataFrames
- Perform machine learning techniques such as classification, clustering, and collaborative filtering with Spark's MLlib
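As a small taste of the "MapReduce and Spark applications with Python" the book walks through, here is a minimal PySpark word count, the canonical MapReduce-style job. It assumes a local Spark installation; the input file path is purely illustrative.

```python
from pyspark.sql import SparkSession

# Start a local Spark session (cluster deployment details are out of scope here).
spark = (SparkSession.builder
         .appName("wordcount-sketch")
         .master("local[*]")
         .getOrCreate())
sc = spark.sparkContext

# Classic MapReduce pattern expressed as Spark RDD transformations.
lines = sc.textFile("input.txt")                      # illustrative input path
counts = (lines.flatMap(lambda line: line.split())    # map: line -> words
               .map(lambda word: (word, 1))           # map: word -> (word, 1)
               .reduceByKey(lambda a, b: a + b))      # reduce: sum counts per word

for word, count in counts.take(10):
    print(word, count)

spark.stop()
```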


In-Memory Data Management

Author: Hasso Plattner

Publisher: Springer Science & Business Media

Published: 2011-03-08

Total Pages: 245

ISBN-13: 3642193633

In the last 50 years the world has been completely transformed through the use of IT. We have now reached a new inflection point. Here we present, for the first time, how in-memory computing is changing the way businesses are run. Today, enterprise data is split into separate databases for performance reasons. Analytical data resides in warehouses, synchronized periodically with transactional systems. This separation makes flexible, real-time reporting on current data impossible. Multi-core CPUs, large main memories, cloud computing, and powerful mobile devices are serving as the foundation for the transition of enterprises away from this restrictive model. We describe techniques that allow analytical and transactional processing at the speed of thought and enable new ways of doing business. The book is intended for university students, IT professionals, and IT managers, but also for senior management who wish to create new business processes by leveraging in-memory computing.
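The core architectural argument, keeping transactional and analytical processing on the same in-memory data, can be sketched in a few lines. The toy example below holds "transactions" as in-memory columns and runs an aggregate directly over them, with no periodic export to a warehouse; the column names and values are invented for illustration and are not drawn from any system described in the book.

```python
import numpy as np

# Toy column store held entirely in memory: each attribute is one array.
# New transactions are appended to the same structure analytics reads from,
# so a report always sees current data (no periodic warehouse sync).
sales_region = np.array(["EU", "US", "EU", "APAC"])   # illustrative data
sales_amount = np.array([120.0, 80.0, 200.0, 50.0])

def record_sale(region, amount):
    """OLTP-style insert: append to the in-memory columns."""
    global sales_region, sales_amount
    sales_region = np.append(sales_region, region)
    sales_amount = np.append(sales_amount, amount)

def revenue_by_region():
    """OLAP-style query: aggregate directly over the live columns."""
    return {r: float(sales_amount[sales_region == r].sum())
            for r in np.unique(sales_region)}

record_sale("US", 300.0)
print(revenue_by_region())   # reflects the sale just recorded, in real time
```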


Efficient Processing of Deep Neural Networks

Author: Vivienne Sze

Publisher: Springer Nature

Published: 2022-05-31

Total Pages: 254

ISBN-13: 3031017668

This book provides a structured treatment of the key principles and techniques for enabling efficient processing of deep neural networks (DNNs). DNNs are currently widely used for many artificial intelligence (AI) applications, including computer vision, speech recognition, and robotics. While DNNs deliver state-of-the-art accuracy on many AI tasks, this accuracy comes at the cost of high computational complexity. Therefore, techniques that enable efficient processing of deep neural networks to improve key metrics, such as energy efficiency, throughput, and latency, without sacrificing accuracy or increasing hardware costs are critical to enabling the wide deployment of DNNs in AI systems. The book includes background on DNN processing; a description and taxonomy of hardware architectural approaches for designing DNN accelerators; key metrics for evaluating and comparing different designs; features of DNN processing that are amenable to hardware/algorithm co-design to improve energy efficiency and throughput; and opportunities for applying new technologies. Readers will find a structured introduction to the field as well as formalization and organization of key concepts from contemporary work that provide insights that may spark new ideas.
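To ground the "key metrics" discussion, the short calculation below estimates the multiply-accumulate (MAC) count of a single convolutional layer and the latency implied by a given peak throughput. Every number here (layer shape, peak MAC rate, utilization) is a made-up illustration, not a figure from the book.

```python
# Back-of-the-envelope metrics for one conv layer; all numbers are hypothetical.
H = W = 56             # output feature map height/width
C_in, C_out = 64, 128  # input/output channels
K = 3                  # filter size

macs = H * W * C_out * C_in * K * K      # MACs for one inference of this layer
peak_macs_per_s = 2e12                   # hypothetical accelerator: 2 TMAC/s peak
utilization = 0.5                        # hypothetical achieved fraction of peak

latency_s = macs / (peak_macs_per_s * utilization)
print(f"MACs: {macs / 1e9:.2f} G")                  # ~0.23 GMAC
print(f"Estimated latency: {latency_s * 1e3:.3f} ms")  # ~0.231 ms
```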


Programming Persistent Memory

Author: Steve Scargall

Publisher: Apress

Published: 2020-01-09

Total Pages: 387

ISBN-13: 1484249321

Beginning and experienced programmers will use this comprehensive guide to persistent memory programming. You will understand how persistent memory brings together several new software/hardware requirements and offers great promise for better performance and faster application startup times: a huge leap forward in byte-addressable capacity compared with current DRAM offerings. This revolutionary technology gives applications significant performance and capacity improvements over existing technologies. It requires a new way of thinking and developing, which makes it highly disruptive to the IT/computing industry. The full spectrum of industry sectors that will benefit from this technology includes, but is not limited to, in-memory and traditional databases, AI, analytics, HPC, virtualization, and big data. Programming Persistent Memory describes the technology and why it is exciting the industry. It covers the operating system and hardware requirements as well as how to create development environments using emulated or real persistent memory hardware. The book explains fundamental concepts; provides an introduction to persistent memory programming APIs for C, C++, JavaScript, and other languages; discusses RDMA with persistent memory; reviews security features; and presents many examples. Source code and examples that you can run on your own systems are included.
What You'll Learn
- Understand what persistent memory is, what it does, and the value it brings to the industry
- Become familiar with the operating system and hardware requirements to use persistent memory
- Know the fundamentals of persistent memory programming: why it is different from current programming methods, and what developers need to keep in mind when programming for persistence
- Look at persistent memory application development by example using the Persistent Memory Development Kit (PMDK)
- Design and optimize data structures for persistent memory
- Study how real-world applications are modified to leverage persistent memory
- Utilize the tools available for persistent memory programming, application performance profiling, and debugging
Who This Book Is For
C, C++, Java, and Python developers, but also useful to software, cloud, and hardware architects across a broad spectrum of sectors, including cloud service providers, independent software vendors, high performance compute, artificial intelligence, data analytics, big data, etc.
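PMDK itself is exposed mainly through C/C++ libraries, so the sketch below is only a loose Python analogy of the programming model the book teaches: byte-addressable loads and stores into a mapped region, followed by an explicit flush to make them durable. It uses the standard mmap module against an ordinary file; the file name and size are invented, and real persistent-memory code would use PMDK (for example libpmem or libpmemobj) instead.

```python
import mmap
import os

PATH = "store.bin"   # illustrative backing file (real code would target pmem)
SIZE = 4096

# Create and size the backing file on first run.
if not os.path.exists(PATH):
    with open(PATH, "wb") as f:
        f.truncate(SIZE)

# Map the file so it can be updated with ordinary byte-level stores
# rather than read()/write() calls.
with open(PATH, "r+b") as f:
    with mmap.mmap(f.fileno(), SIZE) as m:
        m[0:5] = b"hello"   # byte-addressable store into the mapped region
        m.flush()           # make the update durable (analogous to a pmem flush)
        print(m[0:5])       # b'hello'; the bytes survive process restarts via the file
```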


GPU Gems 2

Author: Matt Pharr

Publisher: Addison-Wesley Professional

Published: 2005

Total Pages: 814

ISBN-13: 9780321335593

More useful techniques, tips, and tricks for harnessing the power of the new generation of powerful GPUs.