Blurring is an almost omnipresent effect in natural images. Its main causes include: (a) the presence of objects at different depths within the scene, known as defocus blur; (b) motion of either objects in the scene or the imaging device; and (c) atmospheric turbulence. Automatic estimation of spatially varying sharpness/blurriness has several applications, including depth estimation, image quality assessment, information retrieval, and image restoration. In some cases blur is intentionally introduced or enhanced; for example, in artistic photography and cinematography blur is used to emphasize a certain image region. Bokeh is a technique that introduces defocus blur for aesthetic purposes. Additionally, in trending applications like augmented and virtual reality, blur is usually introduced to provide or enhance depth perception. Digital images and videos are produced every day in astonishing amounts, and the demand for higher quality is constantly rising, which creates a need for advanced image quality assessment. Image quality assessment is also important for the performance of image processing algorithms: image noise and artifacts have been shown to degrade the performance of algorithms such as face detection and recognition, image saliency detection, and video target tracking. Therefore, image quality assessment (IQA) has been a topic of intense research in the fields of image processing and computer vision. Since humans are the end consumers of multimedia signals, subjective quality metrics provide the most reliable results; however, their cost and time requirements make them unfeasible for practical applications. Thus, objective quality metrics are usually preferred.
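As a rough illustration of automatic sharpness estimation (not drawn from any specific method above), the variance of a Laplacian response is a common no-reference sharpness proxy. The sketch below, written in Python with NumPy and SciPy, computes it per image tile to obtain a spatially varying sharpness map; the tile size and kernel are illustrative assumptions.

```python
# Minimal sketch of a spatially varying sharpness map using the
# variance-of-Laplacian proxy. Tile size and the Laplacian kernel
# are illustrative choices, not taken from any book above.
import numpy as np
from scipy.ndimage import convolve

LAPLACIAN = np.array([[0,  1, 0],
                      [1, -4, 1],
                      [0,  1, 0]], dtype=float)

def sharpness_map(image, tile=32):
    """Return a per-tile sharpness score (higher = sharper)."""
    response = convolve(image.astype(float), LAPLACIAN, mode="reflect")
    h, w = image.shape
    rows, cols = h // tile, w // tile
    scores = np.zeros((rows, cols))
    for r in range(rows):
        for c in range(cols):
            block = response[r*tile:(r+1)*tile, c*tile:(c+1)*tile]
            scores[r, c] = block.var()   # low variance suggests blur
    return scores

if __name__ == "__main__":
    img = np.random.rand(256, 256)       # stand-in for a grayscale image
    print(sharpness_map(img).shape)      # (8, 8) sharpness grid
```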
Although the field of sparse representations is relatively new, research activities in academic and industrial research labs are already producing encouraging results. The sparse signal or parameter model motivated several researchers and practitioners to explore high complexity/wide bandwidth applications such as Digital TV, MRI processing, and certain defense applications. The potential signal processing advancements in this area may influence radar technologies. This book presents the basic mathematical concepts along with a number of useful MATLAB® examples to emphasize the practical implementations both inside and outside the radar field. Table of Contents: Radar Systems: A Signal Processing Perspective / Introduction to Sparse Representations / Dimensionality Reduction / Radar Signal Processing Fundamentals / Sparse Representations in Radar
This short book is for students, professors, and professionals interested in signal processing of seismic data using MATLAB™. The step-by-step demonstration of the full reflection seismic data processing workflow on a complete real seismic data set is a particularly useful feature of the book. This is especially true when students are performing their projects, and when professors and researchers are testing their newly developed algorithms in MATLAB™ for processing seismic data. The book provides the basic seismic and signal processing theory required for each chapter and shows how to process the data from raw field records to a final image of the subsurface, all using MATLAB™. The MATLAB™ codes and seismic data can be downloaded here. Table of Contents: Seismic Data Processing: A Quick Overview / Examination of A Real Seismic Data Set / Quality Control of Real Seismic Data / Seismic Noise Attenuation / Seismic Deconvolution / Carrying the Processing Forward / Static Corrections / Seismic Migration / Concluding Remarks
Linear algebra is one of the most basic foundations of a wide range of scientific domains, and most textbooks of linear algebra are written by mathematicians. This book, however, is specifically intended for students and researchers of pattern information processing, analyzing signals such as images and exploring computer vision and computer graphics applications. The author himself is a researcher in this domain. Such pattern information processing deals with a large amount of data, which are represented by high-dimensional vectors and matrices. There, the role of linear algebra is not merely numerical computation of large-scale vectors and matrices. In fact, data processing is usually accompanied by "geometric interpretation." For example, we can think of one data set being "orthogonal" to another and define a "distance" between them, or invoke geometric relationships such as "projecting" some data onto some space. Such geometric concepts not only help us mentally visualize abstract high-dimensional spaces in intuitive terms but also lead us to find what kind of processing is appropriate for what kind of goals. First, we take up the concept of "projection" of linear spaces and describe "spectral decomposition," "singular value decomposition," and "pseudoinverse" in terms of projection. As their applications, we discuss least-squares solutions of simultaneous linear equations and covariance matrices of probability distributions of vector random variables that are not necessarily positive definite. We also discuss fitting subspaces to point data and factorizing matrices in high dimensions in relation to motion image analysis. Finally, we introduce a computer vision application of reconstructing the 3D location of a point from three camera views to illustrate the role of linear algebra in dealing with data with noise. This book is expected to help students and researchers of pattern information processing deepen their geometric understanding of linear algebra.
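To illustrate the projection viewpoint described above, the short Python sketch below (not code from the book) computes a least-squares solution through the SVD-based pseudoinverse; the matrix sizes and noise level are arbitrary assumptions.

```python
# Minimal sketch (not from the book): the pseudoinverse as a projection-based
# least-squares solver, computed from the SVD.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 3))                  # overdetermined system A x = b
x_true = np.array([1.0, -2.0, 0.5])
b = A @ x_true + 0.01 * rng.standard_normal(6)   # noisy observations

U, s, Vt = np.linalg.svd(A, full_matrices=False)
A_pinv = Vt.T @ np.diag(1.0 / s) @ U.T           # Moore-Penrose pseudoinverse
x_ls = A_pinv @ b                                # least-squares solution

# A @ x_ls is the orthogonal projection of b onto the column space of A.
print(np.allclose(x_ls, np.linalg.lstsq(A, b, rcond=None)[0]))
```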
Real-time or applied digital signal processing courses are offered as follow-ups to conventional or theory-oriented digital signal processing courses in many engineering programs, for the purpose of teaching students the technical know-how for putting signal processing algorithms or theory into practical use. These courses normally involve access to a teaching laboratory that is equipped with hardware boards, in particular DSP boards, together with their supporting software. A number of textbooks have been written discussing how to achieve real-time implementation on these hardware boards. This book discusses how to use smartphones as hardware boards for real-time implementation of signal processing algorithms, thus providing an alternative to the hardware boards that are used in signal processing laboratory courses. The fact that mobile devices, in particular smartphones, have become powerful processing platforms led to the development of this book, which enables students to use their own smartphones to run signal processing algorithms in real time, considering that nearly all students now possess smartphones. Changing the hardware platforms that are currently used in applied or real-time signal processing courses to smartphones creates a truly flexible laboratory experience or environment for students. In addition, it relieves the cost burden associated with using dedicated signal processing boards, since the software development tools for smartphones are free of charge and well maintained by smartphone manufacturers. This book is written in such a way that it can be used as a textbook for real-time or applied digital signal processing courses offered at many universities. Ten lab experiments that are commonly encountered in such courses are covered in the book. It is written primarily for those who are already familiar with signal processing concepts and are interested in their real-time and practical aspects. As with existing real-time courses, knowledge of C programming is assumed. This book can also be used as a self-study guide for those who wish to become familiar with signal processing app development on either Android or iOS smartphones/tablets.
This book is designed for use as a textbook for a one-semester Signals and Systems class. It is sufficiently user friendly to be used for self-study as well. It begins with a gentle introduction to the idea of abstraction by looking at numbers, the one highly abstract concept we use all the time. It then introduces some special functions that are useful for analyzing signals and systems. It then spends some time discussing properties of systems, with the goal of introducing the idea of a linear time-invariant system, which is the focus of the rest of the book. Fourier series and discrete- and continuous-time Fourier transforms are introduced as tools for the analysis of signals. The concepts of sampling and modulation, which are very much a part of everyday life, are discussed as applications of these tools. The Laplace transform and Z transform are then introduced as tools to analyze systems. The notions of stability of systems and feedback are analyzed using these tools. The book is divided into thirty bite-sized modules. Each module also links to a video lecture through a QR code in the module. The video lectures are approximately thirty minutes long. There is a set of self-study questions at the end of each module, along with answers, to help the reader reinforce the concepts in the module.
A typical undergraduate electrical engineering curriculum incorporates a signals and systems course. The widely used approach for the laboratory component of such courses involves the utilization of MATLAB to implement signals and systems concepts. This book presents a newly developed laboratory paradigm where MATLAB codes are made to run on smartphones, which nearly all students possess. As a result, this laboratory paradigm provides an anywhere-anytime hardware platform or processing board for students to learn implementation aspects of signals and systems concepts. The book covers the laboratory experiments that are normally covered in signals and systems courses and discusses how to run MATLAB codes for these experiments as apps on both Android and iOS smartphones, thus enabling a truly mobile laboratory paradigm. A zipped file of the codes discussed in the book can be acquired via the website http://sites.fastspring.com/bookcodes/product/SignalsSystemsBookcodesThirdEdition
Compressed sensing (CS) allows signals and images to be reliably inferred from undersampled measurements. Exploiting CS allows the creation of new types of high-performance sensors, including infrared cameras and magnetic resonance imaging systems. Advances in computer vision and deep learning have enabled new applications of automated systems. In this book, we introduce reconstruction-free compressive vision, where image processing and computer vision algorithms are embedded directly in the compressive domain, without the need to first reconstruct the measurements into images or video. Reconstruction of CS images is computationally expensive and adds to system complexity. Therefore, reconstruction-free compressive vision is an appealing alternative, particularly for power-aware systems and bandwidth-limited applications that do not have on-board post-processing computational capabilities. Engineers must balance algorithm performance against both the number of measurements needed and the computational requirements of the algorithms. Our study explores the intersection of compressed sensing and computer vision, with a focus on applications in surveillance and autonomous navigation. Other applications are also discussed at the end, and a comprehensive list of references, including survey papers, is given for further reading.
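As a hedged, minimal illustration of the recovery step that reconstruction-free methods aim to avoid, the Python sketch below recovers a sparse vector from random undersampled measurements with ISTA, a basic l1 solver; the problem sizes and regularization weight are illustrative assumptions, not values from the book.

```python
# Minimal sketch (not the book's method): recovering a sparse signal from
# undersampled random measurements with ISTA, a basic l1 solver.
import numpy as np

rng = np.random.default_rng(1)
n, m, k = 256, 80, 5                     # signal length, measurements, sparsity
x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)

A = rng.standard_normal((m, n)) / np.sqrt(m)    # random sensing matrix
y = A @ x                                       # compressive measurements

def ista(A, y, lam=0.01, iters=500):
    """Iterative shrinkage-thresholding for min ||Ax - y||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2               # Lipschitz constant of gradient
    xh = np.zeros(A.shape[1])
    for _ in range(iters):
        g = xh - (A.T @ (A @ xh - y)) / L                       # gradient step
        xh = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # soft threshold
    return xh

x_hat = ista(A, y)
print("relative error:", np.linalg.norm(x_hat - x) / np.linalg.norm(x))
```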
The Kalman filter is the Bayesian optimum solution to the problem of sequentially estimating the states of a dynamical system in which the state evolution and measurement processes are both linear and Gaussian. Given the ubiquity of such systems, the Kalman filter finds use in a variety of applications, e.g., target tracking, guidance and navigation, and communications systems. The purpose of this book is to present a brief introduction to Kalman filtering. The theoretical framework of the Kalman filter is first presented, followed by examples showing its use in practical applications. Extensions of the method to nonlinear problems and distributed applications are discussed. A software implementation of the algorithm in the MATLAB programming language is provided, as well as MATLAB code for several example applications discussed in the manuscript.
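For orientation, a minimal Kalman filter sketch is given below in Python rather than the book's MATLAB; it tracks a one-dimensional constant-velocity target from noisy position measurements, and all model matrices are chosen purely for illustration.

```python
# Minimal Kalman filter sketch for a 1-D constant-velocity target.
# State = [position, velocity]; only position is measured. All model
# parameters are illustrative, not taken from the book.
import numpy as np

dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])    # constant-velocity state transition
H = np.array([[1.0, 0.0]])               # we observe position only
Q = 0.01 * np.eye(2)                     # process noise covariance
R = np.array([[0.25]])                   # measurement noise covariance

rng = np.random.default_rng(0)
truth = np.array([0.0, 1.0])             # start at 0 with unit velocity
x, P = np.zeros(2), np.eye(2)            # filter initialization

for _ in range(30):
    truth = F @ truth                                    # simulate motion
    z = H @ truth + rng.normal(0, np.sqrt(R[0, 0]))      # noisy position

    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update
    S = H @ P @ H.T + R                                  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)                       # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(2) - K @ H) @ P

print("true position:", truth[0], "estimated:", x[0])
```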
This book is Volume III of the series DSP for MATLAB™ and LabVIEW™. Volume III covers digital filter design, including the specific topics of FIR design via windowed-ideal-lowpass filter, FIR highpass, bandpass, and bandstop filter design from windowed-ideal lowpass filters, FIR design using the transition-band-optimized Frequency Sampling technique (implemented by Inverse-DFT or Cosine/Sine Summation Formulas), design of equiripple FIRs of all standard types including Hilbert Transformers and Differentiators via the Remez Exchange Algorithm, design of Butterworth, Chebyshev (Types I and II), and Elliptic analog prototype lowpass filters, conversion of analog lowpass prototype filters to highpass, bandpass, and bandstop filters, and conversion of analog filters to digital filters using the Impulse Invariance and Bilinear Transform techniques. Certain filter topologies specific to FIRs are also discussed, as are two simple FIR types, the Comb and Moving Average filters. The entire series consists of four volumes that collectively cover basic digital signal processing in a practical and accessible manner, but which nonetheless include all essential foundation mathematics. As the series title implies, the scripts (of which there are more than 200) described in the text and supplied in code form here will run on both MATLAB™ and LabVIEW™. The text for all volumes contains many examples, and many useful computational scripts, augmented by demonstration scripts and LabVIEW™ Virtual Instruments (VIs) that can be run to illustrate various signal processing concepts graphically on the user's computer screen. Volume I consists of four chapters that collectively set forth a brief overview of the field of digital signal processing, useful signals and concepts (including convolution, recursion, difference equations, LTI systems, etc.), conversion from the continuous to discrete domain and back (i.e., analog-to-digital and digital-to-analog conversion), aliasing, the Nyquist rate, normalized frequency, sample rate conversion and Mu-law compression, and signal processing principles including correlation, the correlation sequence, the Real DFT, correlation by convolution, matched filtering, simple FIR filters, and simple IIR filters. Chapter four of Volume I, in particular, provides an intuitive or "first principle" understanding of how digital filtering and frequency transforms work. Volume II provides detailed coverage of discrete frequency transforms, including a brief overview of common frequency transforms, both discrete and continuous, followed by detailed treatments of the Discrete Time Fourier Transform (DTFT), the z-Transform (including definition and properties, the inverse z-transform, frequency response via z-transform, and alternate filter realization topologies including Direct Form, Direct Form Transposed, Cascade Form, Parallel Form, and Lattice Form), and the Discrete Fourier Transform (DFT) (including Discrete Fourier Series, the DFT-IDFT pair, DFT of common signals, bin width, sampling duration, and sample rate, the FFT, the Goertzel Algorithm, Linear, Periodic, and Circular convolution, DFT Leakage, and computation of the Inverse DFT).
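As a small taste of the windowed-ideal-lowpass approach mentioned above, the Python/NumPy sketch below (not one of the book's MATLAB™/LabVIEW™ scripts) designs an FIR lowpass filter by windowing a sinc impulse response with a Hamming window; the tap count and cutoff are illustrative assumptions.

```python
# Minimal sketch of FIR lowpass design by windowing an ideal (sinc)
# impulse response with a Hamming window.
import numpy as np

def fir_lowpass(num_taps=51, cutoff=0.25):
    """cutoff is the normalized cutoff frequency (cycles/sample, < 0.5)."""
    n = np.arange(num_taps) - (num_taps - 1) / 2.0
    ideal = 2 * cutoff * np.sinc(2 * cutoff * n)     # ideal lowpass response
    window = np.hamming(num_taps)                    # taper to reduce ripple
    h = ideal * window
    return h / h.sum()                               # unity gain at DC

h = fir_lowpass()
# Quick check: magnitude response at DC and near Nyquist.
w = np.linspace(0, np.pi, 512)
H = np.abs(np.exp(-1j * np.outer(w, np.arange(len(h)))) @ h)
print(f"gain at DC: {H[0]:.3f}, gain at Nyquist: {H[-1]:.2e}")
```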
Volume IV, the culmination of the series, is an introductory treatment of LMS Adaptive Filtering and applications, and covers cost functions, performance surfaces, coefficient perturbation to estimate the gradient, the LMS algorithm, response of the LMS algorithm to narrow-band signals, and various topologies such as ANC (Active Noise Cancelling) or system modeling, Periodic Signal Removal/Prediction/Adaptive Line Enhancement (ALE), Interference Cancellation, Echo Cancellation (with single- and dual-H topologies), and Inverse Filtering/Deconvolution/Equalization. Table of Contents: Principles
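A minimal LMS sketch, in Python rather than the series' MATLAB™/LabVIEW™ scripts, is shown below for the system-modeling topology mentioned above; the unknown plant, step size, and filter length are illustrative assumptions.

```python
# Minimal LMS adaptive filter sketch: identifying an unknown FIR system
# from input/output data (system-modeling topology).
import numpy as np

rng = np.random.default_rng(3)
unknown = np.array([0.4, -0.2, 0.1])     # "plant" to be modeled (assumed)
x = rng.standard_normal(2000)            # excitation signal
d = np.convolve(x, unknown, mode="full")[:len(x)]   # desired (plant output)

mu, taps = 0.01, 3                       # step size and adaptive filter length
w = np.zeros(taps)
for n in range(taps, len(x)):
    u = x[n - taps + 1:n + 1][::-1]      # most recent inputs, newest first
    e = d[n] - w @ u                     # error = desired - filter output
    w = w + mu * e * u                   # LMS coefficient update

print("estimated coefficients:", np.round(w, 3))   # ~ [0.4, -0.2, 0.1]
```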
This book is Volume I of the series DSP for MATLAB™ and LabVIEW™. The entire series consists of four volumes that collectively cover basic digital signal processing in a practical and accessible manner, but which nonetheless include all essential foundation mathematics. As the series title implies, the scripts (of which there are more than 200) described in the text and supplied in code form here will run on both MATLAB and LabVIEW. Volume I consists of four chapters. The first chapter gives a brief overview of the field of digital signal processing. This is followed by a chapter detailing many useful signals and concepts, including convolution, recursion, difference equations, LTI systems, etc. The third chapter covers conversion from the continuous to discrete domain and back (i.e., analog-to-digital and digital-to-analog conversion), aliasing, the Nyquist rate, normalized frequency, conversion from one sample rate to another, waveform generation at various sample rates from stored wave data, and Mu-law compression. The fourth and final chapter of the present volume introduces the reader to many important principles of signal processing, including correlation, the correlation sequence, the Real DFT, correlation by convolution, matched filtering, simple FIR filters, and simple IIR filters. Chapter 4, in particular, provides an intuitive or "first principle" understanding of how digital filtering and frequency transforms work, preparing the reader for Volumes II and III, which provide, respectively, detailed coverage of discrete frequency transforms (including the Discrete Time Fourier Transform, the Discrete Fourier Transform, and the z-Transform) and digital filter design (FIR design using Windowing, Frequency Sampling, and Optimum Equiripple techniques, and Classical IIR design). Volume IV, the culmination of the series, is an introductory treatment of LMS Adaptive Filtering and applications. The text for all volumes contains many examples, and many useful computational scripts, augmented by demonstration scripts and LabVIEW Virtual Instruments (VIs) that can be run to illustrate various signal processing concepts graphically on the user's computer screen. Table of Contents: An Overview of DSP / Discrete Signals and Concepts / Sampling and Binary Representation / Transform and Filtering Principles
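To illustrate the "correlation by convolution" and matched-filtering principles covered in Chapter 4, the short Python sketch below (not one of the book's scripts) detects a known pulse buried in noise by convolving with the time-reversed template; the pulse shape and noise level are illustrative assumptions.

```python
# Minimal sketch: matched filtering as correlation implemented by
# convolution with a time-reversed template.
import numpy as np

rng = np.random.default_rng(4)
template = np.hanning(16) * np.sin(2 * np.pi * 0.2 * np.arange(16))  # known pulse
signal = rng.normal(0, 0.3, 200)
signal[90:106] += template                   # bury the pulse at sample 90

matched = np.convolve(signal, template[::-1], mode="valid")
print("detected pulse start near sample:", int(np.argmax(matched)))  # ~90
```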
Linear prediction theory has had a profound impact in the field of digital signal processing. Although the theory dates back to the early 1940s, its influence can still be seen in applications today. The theory is based on very elegant mathematics and leads to many beautiful insights into statistical signal processing. Although prediction is only a part of the more general topics of linear estimation, filtering, and smoothing, this book focuses on linear prediction. This has enabled detailed discussion of a number of issues that are normally not found in texts. For example, the theory of vector linear prediction is explained in considerable detail, and so is the theory of line spectral processes. This focus and its small size make the book different from many excellent texts which cover the topic, including a few that are actually dedicated to linear prediction. There are several examples and computer-based demonstrations of the theory. Applications are mentioned wherever appropriate, but the focus is not on the detailed development of these applications. The writing style is meant to be suitable for self-study as well as for classroom use at the senior and first-year graduate levels. The text is self-contained for readers with introductory exposure to signal processing, random processes, and the theory of matrices, and a historical perspective and detailed outline are given in the first chapter. Table of Contents: Introduction / The Optimal Linear Prediction Problem / Levinson's Recursion / Lattice Structures for Linear Prediction / Autoregressive Modeling / Prediction Error Bound and Spectral Flatness / Line Spectral Processes / Linear Prediction Theory for Vector Processes / Appendix A: Linear Estimation of Random Variables / B: Proof of a Property of Autocorrelations / C: Stability of the Inverse Filter / D: Recursion Satisfied by AR Autocorrelations
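As a compact illustration of Levinson's recursion discussed in the book, the Python sketch below (an independent implementation, not the author's) computes optimal predictor coefficients from autocorrelation lags and checks them on a synthetic AR(2) process; the AR parameters are illustrative assumptions.

```python
# Minimal sketch: Levinson-Durbin recursion for the order-p optimal
# linear predictor, given autocorrelation values r[0..p].
import numpy as np

def levinson_durbin(r, order):
    """Return (predictor coefficients a, final prediction error power)."""
    a = np.zeros(order)
    err = r[0]
    for m in range(order):
        # Reflection (PARCOR) coefficient for order m+1.
        acc = r[m + 1] - np.dot(a[:m], r[m:0:-1])
        k = acc / err
        a_new = a.copy()
        a_new[m] = k
        a_new[:m] = a[:m] - k * a[m - 1::-1]
        a = a_new
        err *= (1.0 - k * k)             # prediction error power shrinks
    return a, err

# Autocorrelation estimated from a synthetic AR(2) process (assumed parameters).
rng = np.random.default_rng(5)
x = np.zeros(5000)
for n in range(2, len(x)):
    x[n] = 0.75 * x[n - 1] - 0.5 * x[n - 2] + rng.standard_normal()
r = np.array([np.dot(x[:len(x) - l], x[l:]) / len(x) for l in range(3)])
a, err = levinson_durbin(r, 2)
print("predictor coefficients:", np.round(a, 2))   # ~ [0.75, -0.5]
```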
With human-computer interactions and hands-free communications becoming overwhelmingly important in the new millennium, recent research efforts have been increasingly focusing on state-of-the-art multi-microphone signal processing solutions to improve speech intelligibility in adverse environments. One such prominent statistical signal processing technique is blind signal separation (BSS). BSS was first introduced in the early 1990s and quickly emerged as an area of intense research activity showing huge potential in numerous applications. BSS comprises the task of 'blindly' recovering a set of unknown signals, the so-called sources, from their observed mixtures, based on very little to almost no prior knowledge about the source characteristics or the mixing structure. The goal of BSS is to process multi-sensory observations of an inaccessible set of signals in a manner that reveals their individual (and original) form, by exploiting the spatial and temporal diversity readily accessible through a multi-microphone configuration. Proceeding blindly offers a number of advantages, since assumptions about the room configuration and the source-to-sensor geometry can be relaxed without affecting overall efficiency. This booklet investigates one of the most commercially attractive applications of BSS, which is the simultaneous recovery of signals inside a reverberant (naturally echoing) environment, using two (or more) microphones. In this paradigm, each microphone captures not only the direct contributions from each source, but also several reflected copies of the original signals at different propagation delays. These recordings are referred to as the convolutive mixtures of the original sources. The goal of this booklet in the lecture series is to provide insight on recent advances in algorithms, which are ideally suited for blind signal separation of convolutive speech mixtures. More importantly, specific emphasis is given to practical applications of the developed BSS algorithms in real-life scenarios. The developed algorithms are put in the context of modern DSP devices, such as hearing aids and cochlear implants, where design requirements dictate low power consumption and call for portability and compact size. Along these lines, this booklet focuses on modern BSS algorithms which address (1) the limited amount of processing power and (2) the small number of microphones available to the end-user. Table of Contents: Fundamentals of blind signal separation / Modern blind signal separation algorithms / Application of blind signal processing strategies to noise reduction for the hearing-impaired / Conclusions and future challenges / Bibliography
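As a hedged illustration of the basic idea, the Python sketch below separates an instantaneous two-source, two-microphone mixture with scikit-learn's FastICA; the booklet's focus is the harder convolutive (reverberant) case, and the sources and mixing matrix here are illustrative assumptions.

```python
# Minimal sketch of blind signal separation for an *instantaneous* two-source,
# two-microphone mixture using FastICA. This only illustrates the idea; the
# booklet addresses the more difficult convolutive (reverberant) setting.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(6)
t = np.arange(8000) / 8000.0
s1 = np.sign(np.sin(2 * np.pi * 5 * t))          # square-wave "source"
s2 = np.sin(2 * np.pi * 13 * t)                  # sinusoidal "source"
S = np.c_[s1, s2]

A = np.array([[1.0, 0.6], [0.4, 1.0]])           # assumed mixing matrix
X = S @ A.T                                      # two-microphone observations

ica = FastICA(n_components=2, random_state=0)
S_hat = ica.fit_transform(X)                     # recovered sources (up to
                                                 # permutation and scaling)
print("estimated mixing matrix:\n", np.round(ica.mixing_, 2))
```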
Recent advances in sensor technology and information processing afford a new flexibility in the design of waveforms for agile sensing. Sensors are now developed with the ability to dynamically choose their transmit or receive waveforms in order to optimize an objective cost function. This has exposed a new paradigm of significant performance improvements in active sensing: dynamic waveform adaptation to environment conditions, target structures, or information features. The manuscript provides a review of recent advances in waveform-agile sensing for target tracking applications. A dynamic waveform selection and configuration scheme is developed for two active sensors that track one or multiple mobile targets. A detailed description of two sequential Monte Carlo algorithms for agile tracking is presented, together with relevant MATLAB code and simulation studies, to demonstrate the benefits of dynamic waveform adaptation. The work will be of interest not only to practitioners of radar and sonar, but also to those in other application areas where waveforms can be dynamically designed, such as communications and biosensing. Table of Contents: Waveform-Agile Target Tracking Application Formulation / Dynamic Waveform Selection with Application to Narrowband and Wideband Environments / Dynamic Waveform Selection for Tracking in Clutter / Conclusions / CRLB Evaluation for Gaussian Envelope GFM Chirp from the Ambiguity Function / CRLB Evaluation from the Complex Envelope
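For a flavor of sequential Monte Carlo tracking, the Python sketch below (not the authors' MATLAB code) runs a bootstrap particle filter on a scalar random-walk target observed in noise; the noise levels and particle count are illustrative assumptions.

```python
# Minimal bootstrap particle filter sketch: tracking a scalar random-walk
# state from noisy measurements.
import numpy as np

rng = np.random.default_rng(7)
T, N = 50, 500                            # time steps, particles
q, r = 0.1, 0.5                           # process / measurement noise std

# Simulate a true trajectory and measurements.
truth = np.cumsum(rng.normal(0, q, T))
z = truth + rng.normal(0, r, T)

particles = rng.normal(0, 1, N)           # initial particle cloud
estimates = []
for k in range(T):
    particles += rng.normal(0, q, N)                        # propagate
    w = np.exp(-0.5 * ((z[k] - particles) / r) ** 2)        # likelihood weights
    w /= w.sum()
    estimates.append(np.dot(w, particles))                  # MMSE estimate
    idx = rng.choice(N, N, p=w)                             # resample
    particles = particles[idx]

print("final truth vs estimate:", round(truth[-1], 2), round(estimates[-1], 2))
```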
In this new edition of the Handbook of Signal Processing Systems, many of the chapters from the previous editions have been updated, and several new chapters have been added. The new contributions include chapters on signal processing methods for light field displays, throughput analysis of dataflow graphs, modeling for reconfigurable signal processing systems, fast Fourier transform architectures, deep neural networks, programmable architectures for histogram of oriented gradients processing, high dynamic range video coding, system-on-chip architectures for data analytics, analysis of finite word-length effects in fixed-point systems, and models of architecture. There are more than 700 tables and illustrations; in this edition over 300 are in color. This new edition of the handbook is organized in three parts. Part I motivates representative applications that drive and apply state-of-the-art methods for design and implementation of signal processing systems; Part II discusses architectures for implementing these applications; and Part III focuses on compilers, as well as models of computation and their associated design tools and methodologies.
Multi-Dimensional Imaging with Synthetic Aperture Radar: Theory and Applications provides a complete description of principles, models, and data processing methods, giving an introduction to the theory that underlies recent applications such as topographic mapping and natural risk situational awareness (seismo-tectonics, active volcanoes, landslides, and subsidence monitoring), security, and urban, wide-area, and infrastructure control. Imaging radars, specifically Synthetic Aperture Radar (SAR), generally mounted onboard satellites or airplanes, are able to provide systematic high-resolution imaging of the Earth's surface. Recent advances in the field have seen applications to natural risk monitoring and security and have driven the development of many operational systems.
The manipulation of pictures and video in digital form has been an established research activity for more than twenty years. It is only recently, however, that digital image and video processing equipment has been accessible to the general public. This is due in part to the rapidly growing economy of the home computer. A major contributing factor has been the marked rise in the presence of the non-academic user on the internet, particularly the World Wide Web (WWW). Manipulating digital imagery has become synonymous with the WWW. It is the drive to present audio and visual media to the home user in an interactive form, and to increase the available range of choices, which has encouraged agreements to begin digital video television broadcasting before the turn of the century. With the increased demand for video material, there is a perceived increase in demand for material from archive sources, and this has fuelled commercial interest in automatic digital restoration processes. Furthermore, there is a continuing effort to design techniques for correcting errors in received compressed video bit streams for the purposes of live communications links over noisy channels, e.g., mobile telephones and the internet. This book introduces the reader to a range of digital restoration activities beyond the well-traversed areas of noise reduction and deblurring. It describes a number of problems associated with archived film and video.
Recently, regulators, the public, and the manufacturers of wireless devices have shown interest in issues relating to the safety of radio frequency (RF) energy. These issues require an understanding of the scientific underpinnings of both the physics of RF energy and cellular biology. This book is designed to provide precisely such cross-functional expertise. The editors of this book intend to provide a reliable source for a sound scientific understanding of the issues and to stimulate future scientific advances in this area. Therefore, the audience for this book includes such diverse groups as scientists, governmental policy-makers and regulatory bodies, representatives of industry, and the public at large.
Neural Networks in Telecommunications consists of a carefully edited collection of chapters that provides an overview of a wide range of telecommunications tasks being addressed with neural networks. These tasks range from the design and control of the underlying transport network to the filtering, interpretation and manipulation of the transported media. The chapters focus on specific applications, describe specific solutions and demonstrate the benefits that neural networks can provide. By doing this, the authors demonstrate that neural networks should be another tool in the telecommunications engineer's toolbox. Neural networks offer the computational power of nonlinear techniques, while providing a natural path to efficient massively-parallel hardware implementations. In addition, the ability of neural networks to learn allows them to be used on problems where straightforward heuristic or rule-based solutions do not exist. Together these capabilities mean that neural networks offer unique solutions to problems in telecommunications. For engineers and managers in telecommunications, Neural Networks in Telecommunications provides a single point of access to the work being done by leading researchers in this field, and furnishes an in-depth description of neural network applications.
This book comprises the peer-reviewed proceedings of the International Conference on Communications, Signal Processing and VLSI (IC2SV) 2019. It explores recent advances in the fields of signal and image processing, wireless and mobile communications, embedded systems, VLSI, microwave, and antennas. The contents provide insights into present technological challenges and discuss the emerging applications of different imaging techniques and communications systems. Given the range of topics covered, this book can be useful for students as well as researchers interested in the areas of communications, signal processing, and VLSI technologies.