Divided roughly into two sections, this book provides a brief history of the development of the ECG and of heart rate variability (HRV) algorithms, along with the engineering innovations in this area over the last decade. It reviews clinical research, gives an overview of the clinical field, and explains the importance of heart rate variability in diagnosis. The book then discusses the use of particular ECG and HRV algorithms in the context of clinical applications.
This book presents innovative research that demonstrates the potential and the advances of computing approaches that use healthcare-centric and medical datasets to solve complex healthcare problems. Computing is one of the key technologies currently used to perform medical diagnostics in the healthcare domain, thanks to the abundance of medical data being generated and collected. Medical data is now available in many different forms: MRI images, CT scans, EHR data, test reports, histopathological data, and doctor-patient conversation data. This opens up huge opportunities for applying computing techniques to derive data-driven models of very high utility for providing effective treatment to patients. Moreover, machine learning algorithms can uncover hidden patterns and relationships in medical datasets that are too complex to find without a data-driven approach. With the help of computing systems, researchers today can predict an accurate medical diagnosis for new patients using models built from previous patient data. Beyond automatic diagnostic tasks, computing techniques have also been applied to drug discovery, where they can save considerable time and money. Utilizing genomic data with various computing techniques is another emerging area, and may in fact be the key to fulfilling the dream of personalized medication. Medical prognostics is a further area in which machine learning has recently shown great promise: automatic prognostic models are being built that can predict the progression of a disease and suggest potential treatment paths to get ahead of it.
This volume brings together research on how gameplay data in serious games can be turned into valuable analytics or actionable intelligence for performance measurement, assessment, and improvement. Chapter authors use empirical research methodologies, including existing, experimental, and emerging conceptual frameworks, from various fields such as computer science, software engineering, educational data mining, statistics, and information visualization. Serious games are an emerging field in which games are built on sound learning theories and instructional design principles to maximize learning and training success. But how would stakeholders know what play-learners have done in the game environment, and whether those actions bring about learning? Could players be playing the game for fun, really learning with evidence of performance improvement, or simply gaming the system, i.e., finding loopholes to fake progress? This volume endeavors to answer these questions.
The Poincaré plot (named after Henri Poincaré) is a popular two-dimensional visualization tool for dynamic systems, thanks to its intuitive display of a system's dynamic properties from a time series. This book presents the basics of the Poincaré plot and focuses especially on traditional and new methods for analysing the geometry and the temporal and spatial dynamics disclosed by the Poincaré plot to evaluate heart rate variability (HRV). Mathematical descriptors of the Poincaré plot have been developed to quantify autonomic nervous system activity (sympathetic and parasympathetic modulation of heart rate). Poincaré plot analysis has also been used in various clinical diagnostic settings such as diabetes, chronic heart failure, chronic renal failure, and sleep apnea syndrome. The primary aims of quantifying Poincaré plots are to discriminate healthy physiological systems from pathological conditions and to classify the stage of a disease. HRV analysis by Poincaré plot has opened up ample opportunities for important clinical and research applications. The present book can therefore be used for self-study, as a supplement to courses in linear and nonlinear systems, or as a modern monograph by researchers in the field of HRV analysis.
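The most widely used of the mathematical descriptors mentioned above are SD1 and SD2: the dispersions of the plotted points perpendicular to and along the line of identity when each RR interval is plotted against its successor. A minimal sketch of their computation (not taken from the book itself; the function name and sample values are illustrative):

```python
import numpy as np

def poincare_descriptors(rr_ms):
    """SD1/SD2 descriptors of the Poincaré plot of RR intervals.

    Each interval RR[n] is paired with its successor RR[n+1].
    SD1 measures dispersion perpendicular to the line of identity
    (short-term variability); SD2 measures dispersion along it
    (long-term variability).
    """
    rr = np.asarray(rr_ms, dtype=float)
    x, y = rr[:-1], rr[1:]                      # (RR[n], RR[n+1]) pairs
    sd1 = np.std((y - x) / np.sqrt(2), ddof=1)  # across the line of identity
    sd2 = np.std((y + x) / np.sqrt(2), ddof=1)  # along the line of identity
    return sd1, sd2
```

In clinical use, a narrow, elongated cloud (small SD1 relative to SD2) is the kind of geometric signature such descriptors are designed to capture.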
The development of a new tool, analytic device, or approach frequently facilitates rapid growth in scientific understanding, although the process is seldom linear. The study of heart rate variability (HRV), defined as the extent to which the interval between successive heartbeats varies, is a rapidly maturing paradigm that integrates health and wellness observations across a wide variety of biomedical and psychosocial phenomena and illustrates this nonlinear path of development. The utility of HRV as an analytic and interventive technique goes far beyond its original application as a robust predictor of sudden cardiac death. This Research Topic aims to provide a conceptual framework for exploring the utility of HRV as a robust parameter of health status, using a broad and inclusive definition of ‘health’ and ‘well-being’. From the broadest perspective, current biomedical science emerged from shamanistic and religious healing practices and from empirically observed interventions made as humans emerged from other hominins. The exponential growth of physics, chemistry, and biology provided scientific support for the model emphasizing pathology and disorders. Even before the momentous discovery of germ theory, sanitation and other preventive strategies brought about great declines in mortality and morbidity. The revolution that is currently expanding the biomedical model is an integrative approach that includes the wide variety of non-physio/chemical factors that contribute to health. In the integrative approach, health is understood to be more than the absence of disease, and emphasis is placed on optimal overall functioning within the ecological niche occupied by the organism. This approach also includes not just interventive techniques and procedures, but also those social and cultural structures that provide access to safe and effective caring for sufferers.
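The beat-to-beat variation just defined is most often quantified with simple time-domain statistics over the series of interbeat (RR) intervals. A minimal sketch of the two most common indices (function name and sample values are illustrative, not drawn from this Research Topic):

```python
import numpy as np

def time_domain_hrv(rr_ms):
    """SDNN and RMSSD, the two most common time-domain HRV indices.

    rr_ms: sequence of interbeat (RR) intervals in milliseconds.
    SDNN is the standard deviation of all intervals (overall variability);
    RMSSD is the root mean square of successive differences
    (short-term, vagally mediated variability).
    """
    rr = np.asarray(rr_ms, dtype=float)
    sdnn = np.std(rr, ddof=1)
    rmssd = np.sqrt(np.mean(np.diff(rr) ** 2))
    return sdnn, rmssd
```

The low cost of computing such indices from an ordinary ECG or pulse recording is part of what makes HRV attractive as a broad health-status measure.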
Beyond the typical drug and surgical interventions, which many identify with the Western biomedical model that currently enjoys an unstable hegemony, such factors also include cognitive-behavioral, social, and cultural practices that have been shown to be major contributors to the prevention and treatment of disease and the promotion of health and optimal functioning. This Integrative Model of Health and Well-being also derives additional conceptual power by recognizing the role played by evolutionary processes, in which conserved, adaptive human traits and response tendencies are not congruent with current industrial and postindustrial global environmental demands and characteristics. This mismatch contributes to an increasing incidence of chronic conditions related to lifestyle and health behavior. Such a comprehensive model will make possible a truly personalized approach to health and well-being, including and going far beyond the current emphasis on genomic analysis, which has promised more than it has currently delivered. HRV offers an inexpensive and easily obtained measure of neurovisceral functioning that has been found to relate to the occurrence and severity of numerous physical disease states, as well as many cognitive-behavioral health disorders. The term neurovisceral here refers to the relationships between the nervous system and the viscera, providing a more focused and specific conceptual alternative to the now nearly archaic “mind-body” distinction. This awareness has led to the recent and growing use of HRV as a health biomarker or health-status measure of neurovisceral functioning. It facilitates studying the complex two-way interaction between the central nervous system and other key systems such as the cardiac, gastroenterological, pulmonary, and immune systems. The utility of HRV as a broad-spectrum health indicator, with possible application both clinically and to population health, has only begun to be explored.
HRV-based interventions have been demonstrated to be effective and evidence-based, with HRV biofeedback treatment for PTSD representing an empirically supported modality for this complex and highly visible affliction. As an integral measure of stress, HRV can be used to objectively assess the functioning of the central, enteric, and cardiac nervous systems, all of which are largely mediated by the vagal nervous complex. HRV has also been found to be a measure of central neurobiological concepts such as executive functioning and cognitive load. The relatively simple and inexpensive acquisition of HRV data, and its ease of network transmission and analysis, make possible a promising digital epidemiology that can facilitate objective population health studies as well as web-based clinical applications. An intriguing example is the use of HRV data obtained at motor vehicle crash sites in decision support regarding life-flight evacuations, to improve triage to critical care facilities. This Research Topic critically addresses the issues of appropriate scientific and analytic methods to capture the concept of the Integrative Health and Well-being Model. The true nature of this approach can be appreciated only by using both traditional linear quantitative statistics and nonlinear systems dynamics metrics, which tend to be qualitative. The Research Topic also provides support for further development of new and robust methods for evaluating the safety and effectiveness of interventions and practices, going beyond the sometimes tepid and misleading “gold standard” randomized controlled clinical trial.
Widespread chronic diseases (e.g., heart diseases, diabetes and its complications, stroke, cancer, brain diseases) constitute a significant cause of rising healthcare costs and pose a significant burden on quality of life for many individuals. Despite the increased need for smart healthcare sensing systems that monitor and measure patients’ physiological balance, there is no coherent theory that facilitates the modeling of human physiological processes and the design and optimization of future healthcare cyber-physical systems (HCPS). HCPS are expected to mine the patient’s physiological state from available continuous sensing, quantify risk indices corresponding to the onset of abnormality, signal the need for critical medical intervention in real time by communicating the patient’s medical information over a network from the individual to the hospital, and, most importantly, control (actuate) vital health signals (e.g., cardiac pacing, insulin level, blood pressure) within personalized homeostasis. Preventing health complications, maintaining good health, and avoiding fatal conditions call for a cross-disciplinary approach to HCPS design in which recent statistical-physics-inspired discoveries made through collaborations between physicists and physicians are shared and enriched by applied mathematicians, control theorists, and bioengineers. This critical and urgent multi-disciplinary approach has to unify the current state of knowledge and address the following fundamental challenges. One fundamental challenge is the need to mine and understand the complexity of the structure and dynamics of physiological systems in healthy homeostasis and in association with a disease (such as diabetes). Along the same lines, we need rigorous mathematical techniques for identifying the interactions between integrated physiologic systems and understanding their role within the overall networking architecture of healthy dynamics.
Another fundamental challenge calls for a deeper understanding of stochastic feedback and variability in biological systems and physiological processes in particular, and for deciphering their implications not only for how to mathematically characterize homeostasis, but also for defining new control strategies that account for intra- and inter-patient specificity: a truly mathematical approach to personalized medicine. Numerous recent studies have demonstrated that heart rate variability, blood glucose, neural signals, and other interdependent physiological processes exhibit fractal and non-stationary characteristics. Exploiting statistical-physics concepts, numerous recent research studies have demonstrated that healthy human physiological processes exhibit complex critical phenomena, with deep implications for how homeostasis should be defined and how control strategies should be developed when prolonged abnormal deviations are observed. In addition, several efforts have tried to connect these fractal characteristics with new optimal control strategies which, implemented in medical devices such as pacemakers and the artificial pancreas, could improve the efficiency of medical therapies and the quality of life of patients; these efforts, however, neglect the overall networking architecture of human physiology. Consequently, rigorously analyzing the complexity and dynamics of physiological processes (e.g., blood glucose and its associated implications and interdependencies with other physiological processes) represents a fundamental step towards providing a quantifiable (mathematical) definition of homeostasis in the context of critical phenomena, understanding the onset of chronic diseases, predicting deviations from healthy homeostasis, and developing new, more efficient medical therapies that carefully account for physiological complexity and intra- and inter-patient variability, rather than ignoring them.
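The fractal, non-stationary character of signals such as heart rate is commonly assessed with detrended fluctuation analysis (DFA), whose scaling exponent distinguishes uncorrelated noise (alpha near 0.5) from the long-range-correlated dynamics reported for healthy physiology. A minimal sketch, not tied to any specific study above (function name and scale choices are illustrative):

```python
import numpy as np

def dfa_alpha(signal, scales=(4, 8, 16, 32, 64)):
    """Estimate the DFA scaling exponent alpha of a 1-D signal.

    Integrate the mean-removed signal, split the profile into
    non-overlapping windows at each scale, remove a linear trend
    from each window, and regress log-fluctuation on log-scale.
    """
    x = np.asarray(signal, dtype=float)
    y = np.cumsum(x - x.mean())              # integrated profile
    flucts = []
    for s in scales:
        n_win = len(y) // s
        sq_errs = []
        for i in range(n_win):
            seg = y[i * s:(i + 1) * s]
            t = np.arange(s)
            coef = np.polyfit(t, seg, 1)     # linear trend in this window
            sq_errs.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        flucts.append(np.sqrt(np.mean(sq_errs)))
    alpha, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return alpha
```

For uncorrelated noise the estimate sits near 0.5, while persistent long-range correlations push it toward and beyond 1.0, which is why deviations of alpha are studied as markers of pathological dynamics.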
This Research Topic aims to open a synergistic and timely effort among physicians, physicists, applied mathematicians, and signal processing, bioengineering, and biomedical experts to organize the state of knowledge in mining the complexity of physiological systems and its implications for constructing more accurate mathematical models and designing QoL-aware control strategies implemented in the new generation of HCPS devices. By bringing together multi-disciplinary researchers seeking to understand the many aspects of human physiology and its complexity, we aim to enable a paradigm shift in designing future medical devices: one that translates mathematical characterizations into predictive mathematical models which not only quantify the degree of homeostasis, but also provide fundamentally new control strategies within the personalized medicine era.
This book constitutes the thoroughly refereed post-proceedings of three workshops and an industrial track held in conjunction with the 11th Pacific-Asia Conference on Knowledge Discovery and Data Mining, PAKDD 2007, held in Nanjing, China, in May 2007. The 62 revised full papers presented, together with an overview article for each workshop, were carefully reviewed and selected from 355 submissions.
This volume aims to introduce organizational researchers and practitioners to the role of neuroscience in building theory, research methodologies and practical applications. The volume introduces the field of organizational neuroscience and explores its influence on topics such as leadership, ethics and moral reasoning.
Photoplethysmography: Technology, Signal Analysis, and Applications is the first comprehensive volume on the theory, principles, and technology (sensors and electronics) of photoplethysmography (PPG). It provides a detailed description of the current state-of-the-art technologies/optical components enabling the extreme miniaturization of such sensors, as well as comprehensive coverage of PPG signal analysis techniques including machine learning and artificial intelligence. The book also outlines the huge range of PPG applications in healthcare, with a strong focus on the contribution of PPG in wearable sensors and PPG for cardiovascular assessment. - Presents the underlying principles and technology surrounding PPG - Includes applications for healthcare and wellbeing - Focuses on PPG in wearable sensors and devices - Presents advanced signal analysis techniques - Includes cutting-edge research, applications and future directions
In each generation, scientists must redefine their fields: abstracting, simplifying and distilling the previous standard topics to make room for new advances and methods. Sethna's book takes this step for statistical mechanics - a field rooted in physics and chemistry whose ideas and methods are now central to information theory, complexity, and modern biology. Aimed at advanced undergraduates and early graduate students in all of these fields, Sethna limits his main presentation to the topics that future mathematicians and biologists, as well as physicists and chemists, will find fascinating and central to their work. The amazing breadth of the field is reflected in the author's large supply of carefully crafted exercises, each an introduction to a whole field of study: everything from chaos through information theory to life at the end of the universe.