Segmentation and landmarking of computed tomographic (CT) images of pediatric patients are important and useful in computer-aided diagnosis (CAD), treatment planning, and objective analysis of normal as well as pathological regions. Identification and segmentation of organs and tissues in the presence of tumors are difficult. Automatic segmentation of the primary tumor mass in neuroblastoma could facilitate reproducible and objective analysis of the tumor's tissue composition, shape, and size. However, due to the heterogeneous tissue composition of the neuroblastic tumor, ranging from low-attenuation necrosis to high-attenuation calcification, segmentation of the tumor mass is a challenging problem. In this context, methods are described in this book for identification and segmentation of several abdominal and thoracic landmarks to assist in the segmentation of neuroblastic tumors in pediatric CT images. Methods to identify and segment automatically the peripheral artifacts and tissues, the rib structure, the vertebral column, the spinal canal, the diaphragm, and the pelvic surface are described. Techniques are also presented to evaluate quantitatively the results of segmentation of the vertebral column, the spinal canal, the diaphragm, and the pelvic girdle by comparing with the results of independent manual segmentation performed by a radiologist. The use of the landmarks and removal of several tissues and organs are shown to assist in limiting the scope of the tumor segmentation process to the abdomen, to lead to the reduction of the false-positive error, and to improve the result of segmentation of neuroblastic tumors. Table of Contents: Introduction to Medical Image Analysis / Image Segmentation / Experimental Design and Database / Ribs, Vertebral Column, and Spinal Canal / Delineation of the Diaphragm / Delineation of the Pelvic Girdle / Application of Landmarking / Concluding Remarks
Tremor represents one of the most common movement disorders worldwide. It affects both sexes and may occur at any age. In most cases, tremor is disabling and causes social difficulties, resulting in poorer quality of life. Tremor is now recognized as a public health issue given the aging of the population. Tremor is a complex phenomenon that has attracted the attention of scientists from various disciplines. Tremor results from dynamic interactions between multiple synaptically coupled neuronal systems and the biomechanical, physical, and electrical properties of the external effectors. There have been major advances in our understanding of tremor pathogenesis these last three decades, thanks to new imaging techniques and genetic discoveries. Moreover, significant progress in computer technologies, developments of reliable and unobtrusive wearable sensors, improvements in miniaturization, and advances in signal processing have opened new perspectives for the accurate characterization and daily monitoring of tremor. New therapies are emerging. In this book, we provide an overview of tremor from pathogenesis to therapeutic aspects. We review the definitions, the classification of the varieties of tremor, and the contribution of central versus peripheral mechanisms. Neuroanatomical, neurophysiological, neurochemical, and pharmacological topics related to tremor are pointed out. Our goals are to explain the fundamental basis of tremor generation, to show the recent technological developments, especially in instrumentation, which are reshaping research and clinical practice, and to provide up-to-date information related to emerging therapies. The integrative transdisciplinary approach has been used, combining engineering and physiological principles to diagnose, monitor, and treat tremor. Guidelines for evaluation of tremor are explained. 
This book has been written for biomedical engineering students, engineers, researchers, medical students, biologists, neurologists, and biomedical professionals of any discipline looking for an updated and multidisciplinary overview of tremor. It can be used for biomedical courses. Table of Contents: Introduction / Anatomical Overview of the Central and Peripheral Nervous System / Physiology of the Nervous System / Characterization of Tremor / Principal Disorders Associated with Tremor / Quantification of Tremor / Mechanisms of Tremor / Treatments
The field of brain imaging is developing at a rapid pace and has greatly advanced the areas of cognitive and clinical neuroscience. The availability of neuroimaging techniques, especially magnetic resonance imaging (MRI), functional MRI (fMRI), diffusion tensor imaging (DTI), magnetoencephalography (MEG), and magnetic source imaging (MSI), has brought about breakthroughs in neuroscience. To obtain comprehensive information about the activity of the human brain, different analytical approaches should be combined. Thus, in "intermodal multimodality" imaging, great efforts have been made to combine the highest spatial resolution (MRI, fMRI) with the best temporal resolution (MEG or EEG). "Intramodal multimodality" imaging combines various functional MRI techniques (e.g., fMRI, DTI, and/or morphometric/volumetric analysis). The multimodal approach is conceptually based on the combination of different noninvasive functional neuroimaging tools, their registration, and cointegration. In particular, the combination of imaging applications that map different functional systems is useful, such as fMRI as a technique for the localization of cortical function and DTI as a technique for mapping of white matter fiber bundles or tracts. This booklet gives an insight into the wide field of multimodal imaging with respect to concepts, data acquisition, and postprocessing. Examples for intermodal and intramodal multimodality imaging are also demonstrated. Table of Contents: Introduction / Neurological Measurement Techniques and First Steps of Postprocessing / Coordinate Transformation / Examples for Multimodal Imaging / Clinical Aspects of Multimodal Imaging / References / Biography
Quantitative Neurophysiology is a supplementary text for a junior- or senior-level course in neuroengineering. It may also serve as a quick-start for graduate students in engineering, physics, or neuroscience, as well as for faculty interested in becoming familiar with the basics of quantitative neuroscience. The first chapter is a review of the structure of the neuron and the anatomy of the brain. Chapters 2-6 derive the theory of active and passive membranes, electrical propagation in axons and dendrites, and the dynamics of the synapse. Chapter 7 is an introduction to modeling networks of neurons and artificial neural networks. Chapters 8 and 9 address the recording and decoding of extracellular potentials. The final chapter has descriptions of a number of more advanced or new topics in neuroengineering. Throughout the text, vocabulary is introduced which will enable students to read more advanced literature and communicate with other scientists and engineers working in the neurosciences. Numerical methods are outlined so students with programming knowledge can implement the models presented in the text. Analogies are used to clarify topics and reinforce key concepts. Finally, homework and simulation problems are available at the end of each chapter. Table of Contents: Preface / Neural Anatomy / Passive Membranes / Active Membranes / Propagation / Neural Branches / Synapses / Networks of Neurons / Extracellular Recording and Stimulation / The Neural Code / Applications / Biography / Index
This lecture book is intended to be an accessible and comprehensive introduction to random signal processing with an emphasis on the real-world applications of biosignals. Although the material has been written and developed primarily for advanced undergraduate biomedical engineering students, it will also be of interest to engineers and interested biomedical professionals of any discipline seeking an introduction to the field. Within education, most biomedical engineering programs aim to provide the knowledge required of a graduate student, while undergraduate programs are geared toward designing circuits and evaluating only cardiac signals. Very few programs teach the processes with which to evaluate brainwaves, sleep, respiratory sounds, heart valve sounds, electromyograms, electro-oculograms, or random signals acquired from the body. The primary goal of this lecture book is to help the reader understand the time- and frequency-domain processes which may be used to evaluate random physiological signals. A secondary goal is to learn the evaluation of actual mammalian data without spending most of the time writing software programs. This publication utilizes DADiSP, a digital signal processing software package from the DSP Development Corporation.
Computer software has been productive in helping individuals with cognitive disabilities. Personalizing the user interface is an important strategy in designing software for these users, because of the barriers created by conventional user interfaces for the cognitively disabled. Cognitive assistive technology (CAT) has typically been used to provide help with everyday activities, outside of cognitive rehabilitation therapy. This book describes a quarter century of computing R&D at the Institute for Cognitive Prosthetics, focusing on the needs of individuals with cognitive disabilities from brain injury. Models and methods from Human Computer Interaction (HCI) have been particularly valuable, initially in illuminating those needs. Subsequently, HCI methods have expanded CAT to be powerful rehabilitation therapy tools, restoring some damaged cognitive abilities which have resisted conventional therapy. Patient-Centered Design (PCD) emerged as a design methodology which incorporates both clinical and technical factors. PCD also takes advantage of the patient's ability to redesign and refine the user interface, and to achieve a very good fit between user and system. Cognitive Prosthetics Telerehabilitation is a powerful therapy modality. Essential characteristics are delivering service to patients in their own home, having the patient's priority activities be the focus of therapy, using cognitive prosthetic software which applies Patient-Centered Design, and videoconferencing with a workspace shared between therapist and patient. Cognitive Prosthetics Telerehabilitation has a rich set of advantages for the many stakeholders involved with brain injury rehabilitation.
Transport processes represent important life-sustaining elements in all humans. These include mass transfer processes, including gas exchange in the lungs, transport across capillaries and alveoli, transport across the kidneys, and transport across cell membranes. These mass transfer processes affect how oxygen and carbon dioxide are exchanged in your bloodstream, how metabolic waste products are removed from your blood, how nutrients are transported to tissues, and how all cells function throughout the body. A discussion of kidney dialysis and gas exchange mechanisms is included. Another element in biomedical transport processes is that of momentum transport and fluid flow. This describes how blood is propelled from the heart and throughout the cardiovascular system, how blood elements affect the body, including gas exchange, infection control, clotting of blood, and blood flow resistance, which affects cardiac work. A discussion of the measurement of the blood resistance to flow (viscosity), blood flow, and pressure is also included. A third element in transport processes in the human body is that of heat transfer, including heat transfer inside the body towards the periphery as well as heat transfer from the body to the environment. A discussion of temperature measurements and body protection in extreme heat conditions is also included. Table of Contents: Biomedical Mass Transport / Biofluid Mechanics and Momentum Transport / Biomedical Heat Transport
Increasingly, biomedical scientists and engineers are involved in projects, design, or research and development that involve humans or animals. The book presents general concepts on professionalism and the regulation of the profession of engineering, including a discussion of what ethics and moral conduct are, ethical theories, and the codes of ethics that are most relevant for engineers. An ethical decision-making process is suggested. Other issues such as conflicts of interest, plagiarism, intellectual property, confidentiality, privacy, fraud, and corruption are presented. General guidelines, the process for obtaining ethics approval from Ethics Review Boards, and the importance of obtaining informed consent from volunteers recruited for studies are presented. A discussion on research with animals is included. Ethical dilemmas focus on reproductive technologies, stem cells, cloning, genetic testing, and designer babies. The book includes a discussion on ethics and the technologies of body enhancement and of regeneration. The importance of assessing the impact of technology on people, society, and on our planet is stressed. Particular attention is given to nanotechnologies, the environment, and issues that pertain to developing countries. Ideas on gender, culture, and ethics focus on how research and access to medical services have, at times, been discriminatory towards women. The cultural aspects focus on organ transplantation in Japan, and a case study of an Aboriginal child in Canada; both examples show the impact that culture can have on how care is provided or accepted. The final section of the book discusses data collection and analysis and offers a guideline for honest reporting of results, avoiding fraud, or unethical approaches. The appendix presents a few case studies where fraud and/or unethical research have occurred.
Table of Contents: Introduction to Ethics / Experiments with Human Subjects or Animals / Examples of Ethical Dilemmas in Biomedical Research / Technology and Society / Gender, Culture, and Ethics / Data Collection and Analysis
This book provides an introduction to the principles of several of the more widely used methods in medical imaging. Intended for engineering students, it provides a final-year undergraduate- or graduate-level introduction to several imaging modalities, including MRI, ultrasound, and X-Ray CT. The emphasis of the text is on mathematical models for imaging and image reconstruction physics. Emphasis is also given to sources of imaging artefacts. Such topics are usually not addressed across the different imaging modalities in one book, and this is a notable strength of the treatment given here. Table of Contents: Introduction / Diagnostic X-Ray Imaging / X-Ray CT / Ultrasonics / Pulse-Echo Ultrasonic Imaging / Doppler Velocimetry / An Introduction to MRI
In the last ten years, many different brain imaging devices have conveyed a great deal of information about brain functioning in different experimental conditions. In every case, biomedical engineers, together with mathematicians, physicists, and physicians, are called to elaborate the signals related to brain activity in order to extract meaningful and robust information to correlate with the external behavior of the subjects. In this effort, different signal processing tools used in telecommunications and other fields of engineering, or even the social sciences, have been adapted and reused in the neuroscience field. The present book offers a short presentation of several methods for the estimation of the cortical connectivity of the human brain. The methods presented here are relatively simple to implement and robust, and can return valuable information about the causality of the activation of the different cortical areas in humans using noninvasive electroencephalographic recordings. The knowledge of such signal processing tools will enrich the arsenal of computational methods that an engineer or a mathematician can apply in the processing of brain signals. Table of Contents: Introduction / Estimation of the Effective Connectivity from Stationary Data by Structural Equation Modeling / Estimation of the Functional Connectivity from Stationary Data by Multivariate Autoregressive Methods / Estimation of Cortical Activity by the use of Realistic Head Modeling / Application: Estimation of Connectivity from Movement-Related Potentials / Application to High-Resolution EEG Recordings in a Cognitive Task (Stroop Test) / Application to Data Related to the Intention of Limb Movements in Normal Subjects and in a Spinal Cord Injured Patient / The Instantaneous Estimation of the Time-Varying Cortical Connectivity by Adaptive Multivariate Estimators / Time-Varying Connectivity from Event-Related Potentials
This is the third in a series of short books on probability theory and random processes for biomedical engineers. This book focuses on standard probability distributions commonly encountered in biomedical engineering. The exponential, Poisson, and Gaussian distributions are introduced, as well as important approximations to the Bernoulli PMF and Gaussian CDF. Many important properties of jointly Gaussian random variables are presented. The primary subjects of the final chapter are methods for determining the probability distribution of a function of a random variable. We first evaluate the probability distribution of a function of one random variable using the CDF and then the PDF. Next, the probability distribution for a single random variable is determined from a function of two random variables using the CDF. Then, the joint probability distribution is found from a function of two random variables using the joint PDF and the CDF. The aim of all three books is to provide an introduction to probability theory. The audience includes students, engineers, and researchers presenting applications of this theory to a wide variety of problems, as well as those pursuing these topics at a more advanced level. The theory material is presented in a logical manner, developing special mathematical skills as needed. The mathematical background required of the reader is basic knowledge of differential calculus. Pertinent biomedical engineering examples are used throughout the text. Drill problems, straightforward exercises designed to reinforce concepts and develop problem solution skills, follow most sections.
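As an illustration of the CDF method mentioned above (this worked example is mine, not drawn from the book): for Y = X^2 with X uniform on (0, 1), F_Y(y) = P(X^2 <= y) = P(X <= sqrt(y)) = sqrt(y), so f_Y(y) = 1/(2 sqrt(y)). A short Python sketch checks the derived CDF against simulation:

```python
import numpy as np

# CDF method for Y = X^2 with X ~ Uniform(0, 1):
# F_Y(y) = P(X^2 <= y) = P(X <= sqrt(y)) = sqrt(y), hence f_Y(y) = 1/(2*sqrt(y)).
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, size=200_000)
y = x ** 2

# Compare the empirical CDF of Y with the analytical sqrt(y) at a few points.
for point in (0.1, 0.25, 0.5, 0.9):
    empirical = np.mean(y <= point)
    analytical = np.sqrt(point)
    assert abs(empirical - analytical) < 0.01
```

With 200,000 samples the empirical CDF agrees with sqrt(y) to well within one percentage point, confirming the transformation derived analytically.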
Lung sound auscultation is often the first noninvasive resource for the detection and discrimination of respiratory pathologies available to the physician through the use of the stethoscope. For many decades, though, auditory interpretation was the only means of appreciating the diagnostic information of lung sounds. Nevertheless, in recent years, computerized auscultation combined with signal processing techniques has boosted the diagnostic capabilities of lung sounds. The latter were traditionally analyzed and characterized by morphological changes in the time domain using statistical measures, by spectral properties in the frequency domain using simple spectral analysis, or by nonstationary properties in a joint time-frequency domain using the short-time Fourier transform. Advanced signal processing techniques, however, have emerged in the last decade, broadening the perspective in lung sound analysis. The scope of this book is to present up-to-date signal processing techniques that have been applied to the area of lung sound analysis. It starts with a description of the nature of lung sounds, continues with the introduction of new domains in their representation and new denoising techniques, and concludes with some reflective implications, from both the engineers' and the physicians' perspectives. Issues of nonstationarity, nonlinearity, non-Gaussianity, modeling, and classification of lung sounds are addressed with new methodologies, revealing a more realistic approach to their pragmatic nature. Advanced denoising techniques that effectively circumvent the presence of noise (e.g., heart sound interference, background noise) in lung sound recordings are described, providing the physician with high-quality auscultative data. The book offers useful information to both engineers and physicians interested in bioacoustics, clearly demonstrating the current trends in lung sound analysis.
Table of Contents: The Nature of Lung Sound Signals / New Domains in LS Representation / Denoising Techniques / Reflective Implications
Within the context of healthcare, there has been a long-standing interest in understanding the posture and movement of the human body. Gait analysis work over the years has sought to articulate the patterns and parameters of this movement, both for a normal healthy body and in a range of movement-based disorders. In recent years, these efforts to understand the moving body have been transformed by significant advances in sensing technologies and computational analysis techniques, all offering new ways for the moving body to be tracked, measured, and interpreted. While much of this work has been largely research-focused, as the field matures, we are seeing more shifts into clinical practice. As a consequence, there is an increasing need to understand these sensing technologies over and above their specific capabilities to track, measure, and infer patterns of movement in themselves. Rather, there is an imperative to understand how the material form of these technologies enables them also to be situated in everyday healthcare contexts and practices. There are significant mutually interdependent ties between the fundamental characteristics and assumptions of these technologies and the configurations of everyday collaborative practices that they make possible. Our attention then must turn to the social, clinical, and technical relations pertaining to these various body technologies that may play out in particular ways across a range of different healthcare contexts and stakeholders. Our aim in this book is to explore these issues with key examples illustrating how social contexts of use relate to the properties and assumptions bound up in particular choices of body-tracking technology. We do this through a focus on three core application areas in healthcare (assessment, rehabilitation, and surgical interaction) and recent efforts to apply body-tracking technologies to them.
Neural interfaces are one of the most exciting emerging technologies to impact bioengineering and neuroscience because they enable an alternate communication channel linking directly the nervous system with man-made devices. This book reveals the essential engineering principles and signal processing tools for deriving control commands from bioelectric signals in large ensembles of neurons. The topics featured include analysis techniques for determining neural representation, modeling in motor systems, computing with neural spikes, and hardware implementation of neural interfaces. Beginning with an exploration of the historical developments that have led to the decoding of information from neural interfaces, this book compares the theory and performance of new neural engineering approaches for BMIs. Contents: Introduction to Neural Interfaces / Foundations of Neuronal Representations / Input-Output BMI Models / Regularization Techniques for BMI Models / Neural Decoding Using Generative BMI Models / Adaptive Algorithms for Point Processes / BMI Systems
This textbook is intended for undergraduate students (juniors or seniors) in Biomedical Engineering, with the main goal of helping these students learn about classical control theory and its application in physiological systems. In addition, students should be able to apply the Laboratory Virtual Instrumentation Engineering Workbench (LabVIEW) Controls and Simulation Modules to mammalian physiology. The first four chapters review previous work on differential equations for electrical and mechanical systems. Chapters 5 through 8 present the general types and characteristics of feedback control systems and root locus, frequency response, and analysis of stability and margins. Chapters 9 through 12 cover basic LabVIEW programming, the control module with its palettes, and the simulation module with its palettes. Chapters 13 through 17 present various physiological models with several LabVIEW control analyses. These chapters cover control of the heart (heart rate, stroke volume, and cardiac output), the vestibular system and its role in governing equilibrium and perceived orientation, the vestibulo-ocular reflex in stabilizing an image on the surface of the retina during head movement, mechanical control models of human gait (walking movement), and the respiratory control model. The latter chapters (Chapters 13-17) combine details from my class lecture notes in regard to the application of LabVIEW control programming by the class to produce the control virtual instruments and graphical displays (root locus, Bode plots, and Nyquist plots). This textbook was developed in cooperation with National Instruments personnel.
Table of Contents: Electrical System Equations / Mechanical Translation Systems / Mechanical Rotational Systems / Thermal Systems and Systems Representation / Characteristics and Types of Feedback Control Systems / Root Locus / Frequency Response Analysis / Stability and Margins / Introduction to LabVIEW / Control Design in LabVIEW / Simulation in LabVIEW / LabVIEW Control Design and Simulation Exercise / Cardiac Control / Vestibular Control System / Vestibulo-Ocular Control System / Gait and Stance Control System / Respiratory Control System
This book is concerned with the study of continuum mechanics applied to biological systems, i.e., continuum biomechanics. This vast and exciting subject allows description of when a bone may fracture due to excessive loading, how blood behaves as both a solid and fluid, down to how cells respond to mechanical forces that lead to changes in their behavior, a process known as mechanotransduction. We have written for senior undergraduate students and first year graduate students in mechanical or biomedical engineering, but individuals working at biotechnology companies that deal in biomaterials or biomechanics should also find the information presented relevant and easily accessible. Table of Contents: Tensor Calculus / Kinematics of a Continuum / Stress / Elasticity / Fluids / Blood and Circulation / Viscoelasticity / Poroelasticity and Thermoelasticity / Biphasic Theory
Evaluating biomedical technology poses a significant challenge in light of the complexity and rate of introduction in today's healthcare delivery system. Successful evaluation requires an integration of clinical medicine, science, finance, and market analysis. Little guidance, however, exists for those who must conduct comprehensive technology evaluations. The 3Q Method meets these present day needs. The 3Q Method is organized around 3 key questions dealing with 1) clinical and scientific basis, 2) financial fit and 3) strategic and expertise fit. Both healthcare providers (e.g., hospitals) and medical industry providers can use the Method to evaluate medical devices, information systems and work processes from their own perspectives. The book describes the 3Q Method in detail and provides additional suggestions for optimal presentation and report preparation. Table of Contents: Introduction / Question #1: Is It Real? / Question #2: Can We Win? / Question #3: Is It Worth It? / 3Q Case Study Example -- Pershing Medical Company / Appendix A: Health Care Technology Assessment Sample Class Syllabus / Appendix B: How do Hospitals and Clinicians Get Paid? / Appendix C: Technology Assessment PowerPoint Report Guidelines / Appendix D: Class Report Scenario Example / Appendix E: Four-Blocker Slide Templates for 3Q Reports
This is the second in a series of three short books on probability theory and random processes for biomedical engineers. This volume focuses on expectation, standard deviation, moments, and the characteristic function. In addition, conditional expectation, conditional moments, and the conditional characteristic function are also discussed. Jointly distributed random variables are described, along with joint expectation, joint moments, and the joint characteristic function. Convolution is also developed. A considerable effort has been made to develop the theory in a logical manner, developing special mathematical skills as needed. The mathematical background required of the reader is basic knowledge of differential calculus. Every effort has been made to be consistent with commonly used notation and terminology, both within the engineering community and in the probability and statistics literature. The aim is to prepare students for the application of this theory to a wide variety of problems, as well as to give practicing engineers and researchers a tool to pursue these topics at a more advanced level. Pertinent biomedical engineering examples are used throughout the text.
Heredity performs literal communication of immensely long genomes through immensely long time intervals. Genomes nevertheless incur sporadic errors referred to as mutations, which have significant and often dramatic effects after a time interval as short as a human life. How can faithfulness at a very large timescale and unfaithfulness at a very short one be reconciled? The engineering problem of literal communication was completely solved during the second half of the twentieth century. Originating in 1948 from Claude Shannon's seminal work, information theory provided means to measure information quantities and proved that communication is possible through an unreliable channel (by means left unspecified) up to a sharp limit referred to as its capacity, beyond which communication becomes impossible. The quest for engineering means of reliable communication, named error-correcting codes, did not succeed in closely approaching capacity until 1993, when Claude Berrou and Alain Glavieux invented turbocodes. Nowadays, the electronic devices which have invaded our daily lives (e.g., CD, DVD, mobile phone, digital television) could not work without highly efficient error-correcting codes. Reliable communication through unreliable channels up to the limit of what is theoretically possible has become a practical reality: an outstanding achievement, however little publicized. As an engineering problem that nature solved aeons ago, heredity is relevant to information theory. The capacity of DNA is easily shown to vanish exponentially fast, which entails that error-correcting codes must be used to regenerate genomes so as to faithfully transmit the hereditary message. Moreover, assuming that such codes exist explains basic and conspicuous features of the living world, e.g., the existence of discrete species and their hierarchical taxonomy, the necessity of successive generations, and even the trend of evolution towards increasingly complex beings.
Providing geneticists with an introduction to information theory and error-correcting codes as necessary tools of hereditary communication is the primary goal of this book. Some biological consequences of their use are also discussed, and guesses about hypothesized genomic codes are presented. Another goal is prompting communication engineers to get interested in genetics and biology, thereby broadening their horizon far beyond the technological field, and learning from the most outstanding engineer: Nature. Table of Contents: Foreword / Introduction / A Brief Overview of Molecular Genetics / An Overview of Information Theory / More on Molecular Genetics / More on Information Theory / An Outline of Error-Correcting Codes / DNA is an Ephemeral Memory / A Toy Living World / Subsidiary Hypothesis, Nested System / Soft Codes / Biological Reality Conforms to the Hypotheses / Identification of Genomic Codes / Conclusion and Perspectives
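The blurb's claim that the capacity of DNA vanishes exponentially fast can be illustrated with a toy model (my own sketch, not the book's derivation): if repeated genome copying is idealized as a cascade of n binary symmetric channels, each with per-generation error probability p, the effective crossover probability is p_n = (1 - (1 - 2p)^n)/2 and the Shannon capacity 1 - H_2(p_n) decays toward zero as generations accumulate:

```python
import math

def h2(p):
    # Binary entropy function, in bits.
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def cascade_capacity(p, n):
    # Capacity of n cascaded binary symmetric channels with per-stage
    # crossover probability p: effective crossover p_n = (1 - (1-2p)^n)/2,
    # capacity C_n = 1 - H_2(p_n), which tends to zero as n grows.
    p_n = 0.5 * (1.0 - (1.0 - 2.0 * p) ** n)
    return 1.0 - h2(p_n)

# Capacity falls monotonically toward zero as copying errors accumulate.
caps = [cascade_capacity(1e-3, n) for n in (1, 100, 1000, 10000)]
assert all(caps[i] > caps[i + 1] for i in range(len(caps) - 1))
assert caps[-1] < 0.01
```

The monotone decay toward zero capacity is what motivates the book's argument that some regeneration mechanism, akin to error-correcting decoding, must intervene between generations.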
This book provides an in-depth review of the historical and state-of-the-art use of technology by and for individuals with autism. The design, development, deployment, and evaluation of interactive technologies for use by and with individuals with autism have been increasing rapidly over the last few decades. There is great promise in the use of these technologies to enrich lives, improve the experience of interventions, help with learning, facilitate communication, support data collection, and promote understanding. Emerging technologies in this area also have the potential to enhance the assessment and diagnosis of autism, to further the understanding of the nature and lived experience of autism, and to help researchers conduct basic and applied research. The intention of this book is to give readers a comprehensive background for understanding what work has already been completed and its impact, as well as what promises and challenges lie ahead. While a large majority of existing technologies have been designed for autistic children, there is increasing interest in technology's intersection with the lived experiences of autistic adults. By providing a classification scheme and general review, this book can help technology designers, researchers, autistic people, and their advocates better understand how technologies have been successful or unsuccessful, what problems remain open, and where innovations can further address challenges and opportunities for individuals with autism and the variety of stakeholders connected to them.
The auscultation method is an important diagnostic indicator of hemodynamic anomalies, and heart sound classification and analysis play an important role in auscultative diagnosis. The term phonocardiography refers to the tracing technique of heart sounds and the recording of cardiac acoustic vibrations by means of a microphone transducer. Understanding the nature and source of this signal is therefore important, as it provides the basis for developing competent tools for further analysis and processing, in order to enhance and optimize the cardiac clinical diagnostic approach. This book gives the reader an inclusive view of the main aspects of phonocardiography signal processing. Table of Contents: Introduction to Phonocardiography Signal Processing / Phonocardiography Acoustics Measurement / PCG Signal Processing Framework / Phonocardiography Wavelets Analysis / Phonocardiography Spectral Analysis / PCG Pattern Classification / Special Application of Phonocardiography / Phonocardiography Acoustic Imaging and Mapping
Attention Deficit Hyperactivity Disorder (ADHD) is the most prevalent childhood psychiatric condition, with estimates of more than 5% of children affected worldwide, and has a profound public health, personal, and family impact. At the same time, a multitude of adults, both diagnosed and undiagnosed, are living, coping, and thriving while experiencing ADHD. It can cost families raising a child with ADHD as much as five times the amount of raising a child without ADHD (Zhao et al. 2019). Given the chronic and pervasive challenges associated with ADHD, innovative approaches for supporting children, adolescents, and adults have emerged, including the use of both novel and off-the-shelf technologies. A wide variety of connected and interactive technologies can enable new and different types of sociality, education, and work, support a variety of clinical and educational interventions, and allow for the possibility of educating the general population on issues of inclusion and varying models of disability. This book provides a comprehensive review of the historical and state-of-the-art use of technology by and for individuals with ADHD. Taking both a critical and constructive lens to this work, the book notes where great strides have been made and where there are still open questions and considerations for future work. It provides background and lays a foundation for a general understanding of both ADHD and innovative technologies in this space. The authors encourage students, researchers, and practitioners, both with and without ADHD diagnoses, to engage with this work, build upon it, and push the field further.
The presence of oriented features in images often conveys important information about the scene or the objects contained; the analysis of oriented patterns is an important task in the general framework of image understanding. As in many other applications of computer vision, the general framework for the understanding of oriented features in images can be divided into low- and high-level analysis. In the context of the study of oriented features, low-level analysis includes the detection of oriented features in images; a measure of the local magnitude and orientation of oriented features over the entire region of analysis in the image is called the orientation field. High-level analysis relates to the discovery of patterns in the orientation field, usually by associating the structure perceived in the orientation field with a geometrical model. This book presents an analysis of several important methods for the detection of oriented features in images, and a discussion of the phase portrait method for high-level analysis of orientation fields. In order to illustrate the concepts developed throughout the book, an application is presented of the phase portrait method to computer-aided detection of architectural distortion in mammograms. Table of Contents: Detection of Oriented Features in Images / Analysis of Oriented Patterns Using Phase Portraits / Optimization Techniques / Detection of Sites of Architectural Distortion in Mammograms
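The low-level notion of an orientation field described above, a per-pixel measure of local magnitude and orientation, can be sketched with elementary image gradients. This is a generic illustration, not one of the specific detection methods analyzed in the book:

```python
import numpy as np

def orientation_field(image):
    """Estimate an orientation field from local intensity gradients.

    Returns, for every pixel, the local gradient magnitude and the
    orientation of the underlying feature (perpendicular to the
    gradient direction), folded into the range [0, pi).
    """
    gy, gx = np.gradient(image.astype(float))  # row and column derivatives
    magnitude = np.hypot(gx, gy)
    orientation = (np.arctan2(gy, gx) + np.pi / 2) % np.pi
    return magnitude, orientation
```

For an image of vertical stripes (intensity varying only across columns), this yields a uniform orientation of pi/2, i.e., vertically oriented features; high-level analysis then fits a geometrical model to such a field.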
Approximately 10% of North Americans have some communication disorder. These can be physical as in cerebral palsy and Parkinson's disease, cognitive as in Alzheimer's disease and dementia generally, or both physical and cognitive as in stroke. In fact, deteriorations in language are often the early hallmarks of broader diseases associated with older age, which is especially relevant since aging populations across many nations will result in a drastic increase in the prevalence of these types of disorders. A significant change to how healthcare is administered, brought on by these aging populations, will increase the workload of speech-language pathologists, therapists, and caregivers who are often already overloaded. Fortunately, modern speech technology, such as automatic speech recognition, has matured to the point where it can now have a profound positive impact on the lives of millions of people living with various types of disorders. This book serves as a common ground for two communities: clinical linguists (e.g., speech-language pathologists) and technologists (e.g., computer scientists). This book examines the neurological and physical causes of several speech disorders and their clinical effects, and demonstrates how modern technology can be used in practice to manage those effects and improve one's quality of life. This book is intended for a broad audience, from undergraduates to more senior researchers, as well as to users of these technologies and their therapists.
This book explores the ways in which AgeTech can contribute to healthy cognitive aging and support the independence of people with dementia. Technology can play a key role in supporting the health, independence, and well-being of older adults, particularly as a response to rapid worldwide population aging. AgeTech refers to the use of technologies, such as information and communication technologies (ICTs), robotics, mobile technologies, artificial intelligence, ambient systems, and pervasive computing to drive technology-based innovation to benefit older adults. AgeTech has the potential to provide new ways of meeting the growing demands on health and social care services to support people to stay healthy and active. As such, AgeTech represents an increasingly important market sector within world economies. The book also addresses some of the research, innovation, and policy challenges that need to be resolved if technology-based products and services are to fulfill their potential and deliver real-world impacts to improve the lives of older adults and their carers, thus promoting more inclusive communities for the benefit of all.
Demographic trends and increasing support costs mean that good design for older and disabled people is an economic necessity, as well as a moral imperative. Alan Newell has been described as "a visionary who stretches the imagination of all of us" and "truly ahead of his time." This monograph describes research ranging from developing communication systems for non-speaking and hearing-impaired people to technology to support older people, and addresses the particular challenges older people have with much modern technology. Alan recounts the insights gained from this research journey, and recommends a philosophy, and design practices, to reduce the "Digital Divide" between users of information technology and those who are excluded by the poor design of many current systems. How to create and lead interdisciplinary teams, and the practical and ethical challenges of working in clinically related fields, are discussed. The concepts of "Ordinary and Extra-ordinary HCI", "User Sensitive Inclusive Design", and "Design for Dynamic Diversity", and the use of "Creative Design" techniques, are suggested as extensions of "User Centered" and "Universal Design." Also described are the use of professional theatre and other methods for raising designers' awareness of the challenges faced by older and disabled people, ways of engaging with these groups, and of ascertaining what they "want" rather than just what they "need." This monograph will give all Human Computer Interaction (HCI) practitioners and designers of both mainstream and specialized IT equipment much food for thought.
Table of Contents: 40 years--Highlights and a Brief Review / Communication Systems for Non-Speaking and Hearing-Impaired People / TV Subtitling for Hearing-Impaired People / Word Prediction for Non-Speaking People and Systems for those with Dyslexia / Providing Reusable Conversation for Non-Speaking People / Story Telling and Emotion in Synthetic Speech / Lessons Learned from Designing AAC Devices / IT Systems for Older People / Designing IT Systems for Older People / Ordinary and Extra-Ordinary Human Computer Interaction / User Sensitive Inclusive Design / The Use of Professional Theatre / Attacking the Digital Divide
Content-based image retrieval (CBIR) is the process of retrieval of images from a database that are similar to a query image, using measures derived from the images themselves, rather than relying on accompanying text or annotation. To achieve CBIR, the contents of the images need to be characterized by quantitative features; the features of the query image are compared with the features of each image in the database and images having high similarity with respect to the query image are retrieved and displayed. CBIR of medical images is a useful tool and could provide radiologists with assistance in the form of a display of relevant past cases. One of the challenging aspects of CBIR is to extract features from the images to represent their visual, diagnostic, or application-specific information content. In this book, methods are presented for preprocessing, segmentation, landmarking, feature extraction, and indexing of mammograms for CBIR. The preprocessing steps include anisotropic diffusion and the Wiener filter to remove noise and perform image enhancement. Techniques are described for segmentation of the breast and fibroglandular disk, including maximum entropy, a moment-preserving method, and Otsu's method. Image processing techniques are described for automatic detection of the nipple and the edge of the pectoral muscle via analysis in the Radon domain. By using the nipple and the pectoral muscle as landmarks, mammograms are divided into their internal, external, upper, and lower parts for further analysis. Methods are presented for feature extraction using texture analysis, shape analysis, granulometric analysis, moments, and statistical measures. The CBIR system presented provides options for retrieval using the Kohonen self-organizing map and the k-nearest-neighbor method. Methods are described for inclusion of expert knowledge to reduce the semantic gap in CBIR, including the query point movement method for relevance feedback (RFb). 
Analysis of performance is described in terms of precision, recall, and relevance-weighted precision of retrieval. Results of application to a clinical database of mammograms are presented, including the input of expert radiologists into the CBIR and RFb processes. Models are presented for integration of CBIR and computer-aided diagnosis (CAD) with a picture archival and communication system (PACS) for efficient workflow in a hospital. Table of Contents: Introduction to Content-based Image Retrieval / Mammography and CAD of Breast Cancer / Segmentation and Landmarking of Mammograms / Feature Extraction and Indexing of Mammograms / Content-based Retrieval of Mammograms / Integration of CBIR and CAD into Radiological Workflow
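The basic retrieval metrics named above have standard set-based definitions; the sketch below shows the generic formulas for precision and recall, not the book's relevance-weighted variant:

```python
def precision_recall(retrieved, relevant):
    """Precision and recall of a retrieved set against a relevant set.

    precision = |retrieved & relevant| / |retrieved|
    recall    = |retrieved & relevant| / |relevant|
    """
    retrieved, relevant = set(retrieved), set(relevant)
    hits = len(retrieved & relevant)
    precision = hits / len(retrieved) if retrieved else 0.0
    recall = hits / len(relevant) if relevant else 0.0
    return precision, recall
```

Precision rewards returning few irrelevant cases; recall rewards finding all relevant ones, and a CBIR system is typically evaluated on both as the number of retrieved images varies.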
Designed Technologies for Healthy Aging identifies and presents a variety of contemporary technologies that support older adults' ability to perform everyday activities. Efforts of industry, laboratories, and learning institutions are documented under four major categories: social connections, independent self-care, healthy home, and active lifestyle. The book contains well-documented and illustrative recent examples of designed technologies, ranging from wearable devices to mobile applications to assistive robots, in the broad areas of design and computation, including industrial design, interaction design, graphic design, human-computer interaction, software engineering, and artificial intelligence.
Take one elephant and one man to the top of a tower and drop them simultaneously. Which will hit the ground first? You are the pilot of a jet fighter performing a high-speed loop. Will you pass out during the maneuver? How can you simulate being an astronaut with your feet still firmly planted on planet Earth? In the aerospace environment, human, animal, and plant physiology differs significantly from that on Earth, and this book provides reasons for some of these changes. The challenges encountered by pilots in their missions can have implications for the health and safety of not only themselves but others. Knowing the effects of hypergravity on the human body during high-speed flight led to the development of human centrifuges. We also need to better understand the physiological responses of living organisms in space. It is therefore necessary to simulate weightlessness through the use of specially adapted equipment, such as clinostats, tilt tables, and body suspension devices. Each of these ideas, and more, is addressed in this review of the physical concepts related to space flight, microgravity, and hypergravity simulations. Basic theories, such as Newton's laws and Einstein's principles, are explained, followed by a look at the biomedical effects of experiments performed at space life sciences institutes, universities, and space agencies. Table of Contents: General Concepts in Physics - Definition of Physical Terms / The Effects of Hypergravity on Biomedical Experiments / The Effects of Microgravity on Biomedical Experiments / References
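The opening riddle has the classical answer: neglecting air resistance, the time to fall depends only on the height and the gravitational acceleration, not on mass, so the elephant and the man land together. A one-line illustration of the standard kinematics (not code from the book):

```python
import math

def fall_time(height_m, g=9.81):
    """Vacuum fall time from rest: t = sqrt(2h / g).

    Mass does not appear in the formula, so any two dropped
    objects reach the ground at the same instant.
    """
    return math.sqrt(2 * height_m / g)
```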
This short book provides basic information about bioinstrumentation and electric circuit theory. Many biomedical instruments use a transducer or sensor to convert a signal created by the body into an electric signal. Our goal here is to develop expertise in electric circuit theory applied to bioinstrumentation. We begin with a description of the variables used in circuit theory: charge, current, voltage, power, and energy. Next, Kirchhoff's current and voltage laws are introduced, followed by resistance, simplifications of resistive circuits, and voltage and current calculations. Circuit analysis techniques are then presented, followed by inductance and capacitance, and solutions of circuits using the differential equation method. Finally, the operational amplifier and time-varying signals are introduced. This lecture is written for a student, researcher, or engineer who has completed the first two years of an engineering program (i.e., 3 semesters of calculus and differential equations). A considerable effort has been made to develop the theory in a logical manner, developing special mathematical skills as needed. At the end of the short book is a wide selection of problems, ranging from simple to complex.
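The resistive-circuit simplifications described above, series and parallel combination and the voltage divider that follows from Kirchhoff's voltage law and Ohm's law, can be sketched in a few lines. These are generic textbook formulas, not the book's own examples:

```python
def series(*resistances):
    """Equivalent resistance of resistors in series: R = R1 + R2 + ..."""
    return sum(resistances)

def parallel(*resistances):
    """Equivalent resistance of resistors in parallel: 1/R = 1/R1 + 1/R2 + ..."""
    return 1.0 / sum(1.0 / r for r in resistances)

def divider_output(v_in, r1, r2):
    """Voltage across r2 in a two-resistor divider (KVL plus Ohm's law):
    v_out = v_in * r2 / (r1 + r2)."""
    return v_in * r2 / (r1 + r2)
```

For example, two equal resistors halve the input voltage, the arrangement many sensor front-ends use to scale a transducer signal into an amplifier's input range.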