Delivery: 1-2 business days

Books in The Springer International Eng series

  • by Keith D Paulsen
    1.097,95 kr.

  • by Brian T Graham
    1.101,95 kr.

    This is a milestone in machine-assisted microprocessor verification. Gordon [20] and Hunt [32] led the way with their verifications of simple designs; Cohn [12, 13] followed this with the verification of parts of the VIPER microprocessor. This work illustrates how much these, and other, pioneers achieved in developing tractable models, scalable tools, and a robust methodology. A condensed review of previous research, emphasising the behavioural model underlying this style of verification, is followed by a careful, and remarkably readable, account of the SECD architecture, its formalisation, and a report on the organisation and execution of the automated correctness proof in HOL. This monograph reports on Graham's MSc project, demonstrating that - in the right hands - the tools and methodology for formal verification can (and therefore should?) now be applied by someone with little previous expertise in formal methods, to verify a non-trivial microprocessor in a limited timescale. This is not to belittle Graham's achievement; the production of this proof, working as Graham did from the previous literature, goes well beyond a typical MSc project. The achievement is that, with this exposition to hand, an engineer tackling the verification of similar microprocessor designs will have a clear view of the milestones that must be passed on the way, and of the methods to be applied to achieve them.

  • by Steven L Salzberg
    1.099,95 kr.

    Machine Learning is one of the oldest and most intriguing areas of Artificial Intelligence. From the moment that computer visionaries first began to conceive the potential for general-purpose symbolic computation, the concept of a machine that could learn by itself has been an ever-present goal. Today, although there have been many implemented computer programs that can be said to learn, we are still far from achieving the lofty visions of self-organizing automata that spring to mind when we think of machine learning. We have established some base camps and scaled some of the foothills of this epic intellectual adventure, but we are still far from the lofty peaks that the imagination conjures up. Nevertheless, a solid foundation of theory and technique has begun to develop around a variety of specialized learning tasks. Such tasks include discovery of optimal or effective parameter settings for controlling processes, automatic acquisition or refinement of rules for controlling behavior in rule-driven systems, and automatic classification and diagnosis of items on the basis of their features. Contributions include algorithms for optimal parameter estimation, feedback and adaptation algorithms, strategies for credit/blame assignment, techniques for rule and category acquisition, theoretical results dealing with learnability of various classes by formal automata, and empirical investigations of the abilities of many different learning algorithms in a diversity of application areas.

  • by R H J M Otten
    1.677,95 kr.

    The goal of the research out of which this monograph grew was to make annealing as much as possible a general-purpose optimization routine. At first glance this may seem a straightforward task, for the formulation of its concept suggests applicability to any combinatorial optimization problem. All that is needed to run annealing on such a problem is a unique representation for each configuration, a procedure for measuring its quality, and a neighbor relation. Much more is needed, however, for obtaining acceptable results consistently in a reasonably short time. It is even doubtful whether the problem can be formulated such that annealing becomes an adequate approach for all instances of an optimization problem. Questions such as what is the best formulation for a given instance, and how the process should be controlled, have to be answered. Although much progress has been made in the years after the introduction of the concept into the field of combinatorial optimization in 1981, some important questions still do not have a definitive answer. In this book the reader will find the foundations of annealing in a self-contained and consistent presentation. Although the physical analogue from which the concept emanated is mentioned in the first chapter, all theory is developed within the framework of Markov chains. To achieve a high degree of instance independence, adaptive strategies are introduced.
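
    The three ingredients named above map directly onto the inputs of a generic annealing routine. A minimal Python sketch of that bare loop, with an illustrative geometric cooling schedule (all names here are ours, not the book's; the adaptive control strategies the monograph develops are exactly what this naive version lacks):

        import math
        import random

        def anneal(state, energy, neighbor, t0=1.0, alpha=0.95,
                   steps_per_t=200, t_min=1e-3):
            """Generic simulated annealing: needs only a configuration,
            a quality measure (energy) and a neighbor relation."""
            e = energy(state)
            best, best_e = state, e
            t = t0
            while t > t_min:
                for _ in range(steps_per_t):
                    cand = neighbor(state)
                    ce = energy(cand)
                    # Always accept improvements; accept uphill moves
                    # with the Boltzmann probability exp(-dE / t).
                    if ce <= e or random.random() < math.exp((e - ce) / t):
                        state, e = cand, ce
                        if e < best_e:
                            best, best_e = state, e
                t *= alpha  # geometric cooling (one possible control choice)
            return best, best_e

        # Toy usage: minimize a bumpy 1-D function over floats.
        f = lambda x: (x - 3) ** 2 + math.sin(5 * x)
        step = lambda x: x + random.uniform(-0.5, 0.5)
        print(anneal(0.0, f, step))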

  • by Alan L Meyrowitz
    1.689,95 kr.

    One of the most intriguing questions about the new computer technology that has appeared over the past few decades is whether we humans will ever be able to make computers learn. As is painfully obvious to even the most casual computer user, most current computers do not. Yet if we could devise learning techniques that enable computers to routinely improve their performance through experience, the impact would be enormous. The result would be an explosion of new computer applications that would suddenly become economically feasible (e.g., personalized computer assistants that automatically tune themselves to the needs of individual users), and a dramatic improvement in the quality of current computer applications (e.g., imagine an airline scheduling program that improves its scheduling method based on analyzing past delays). And while the potential economic impact of successful learning methods is sufficient reason to invest in research into machine learning, there is a second significant reason: studying machine learning helps us understand our own human learning abilities and disabilities, leading to the possibility of improved methods in education. While many open questions remain about the methods by which machines and humans might learn, significant progress has been made.

  • by Borko Furht
    1.694,95 kr.

    Multimedia computing has emerged in the last few years as a major area of research. Multimedia computer systems have opened a wide range of applications by combining a variety of information sources, such as voice, graphics, animation, images, audio, and full-motion video. Looking at the big picture, multimedia can be viewed as the merging of three industries: the computer, communications, and broadcasting industries. Research and development efforts in multimedia computing can be divided into two areas. As the first area of research, much effort has been centered on the stand-alone multimedia workstation and associated software systems and tools, such as music composition, computer-aided education and training, and interactive video. However, the combination of multimedia computing with distributed systems offers even greater potential. New applications based on distributed multimedia systems include multimedia information systems, collaborative and videoconferencing systems, on-demand multimedia services, and distance learning. Multimedia Tools and Applications is one of two volumes published by Kluwer, both of which provide a broad introduction to this fast moving area. This book covers selected tools applied in multimedia systems and key multimedia applications. Topics presented include multimedia application development techniques, techniques for content-based manipulation of image databases, techniques for selection and dissemination of digital video, and tools for digital video segmentation. Selected key applications described in the book include multimedia news services, multimedia courseware and training, interactive television systems, digital video libraries, multimedia messaging systems, and interactive multimedia publishing systems. The second book, Multimedia Systems and Techniques, covers fundamental concepts and techniques used in multimedia systems. The topics include multimedia objects and related models, multimedia compression techniques and standards, multimedia interfaces, multimedia storage techniques, multimedia communication and networking, multimedia synchronization techniques, multimedia information systems, scheduling in multimedia systems, and video indexing and retrieval techniques. Multimedia Tools and Applications, along with its companion volume, is intended for anyone involved in multimedia system design and applications and can be used as a textbook for advanced courses on multimedia.

  • by David Touretzky
    1.097,95 kr.

    arise automatically as a result of the recursive structure of the task and the continuous nature of the SRN's state space. Elman also introduces a new graphical technique for studying network behavior based on principal components analysis. He shows that sentences with multiple levels of embedding produce state space trajectories with an intriguing self-similar structure. The development and shape of a recurrent network's state space is the subject of Pollack's paper, the most provocative in this collection. Pollack looks more closely at a connectionist network as a continuous dynamical system. He describes a new type of machine learning phenomenon: induction by phase transition. He then shows that under certain conditions, the state space created by these machines can have a fractal or chaotic structure, with a potentially infinite number of states. This is graphically illustrated using a higher-order recurrent network trained to recognize various regular languages over binary strings. Finally, Pollack suggests that it might be possible to exploit the fractal dynamics of these systems to achieve a generative capacity beyond that of finite-state machines.
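
    Elman's graphical technique amounts to recording the network's hidden-state vectors over time and projecting the trajectory onto its leading principal components. A minimal sketch of just that analysis step (the states here are synthetic stand-ins; in a real study they would be recorded from a trained SRN):

        import numpy as np

        rng = np.random.default_rng(0)
        T, D = 200, 10
        states = np.cumsum(rng.normal(size=(T, D)), axis=0)  # fake trajectory

        # Principal components via SVD of the centered state matrix.
        centered = states - states.mean(axis=0)
        _, _, vt = np.linalg.svd(centered, full_matrices=False)
        coords = centered @ vt[:2].T  # 2-D view of the state-space trajectory
        print(coords.shape)           # (200, 2), ready to plot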

  • by Jill Fain Lehman
    1.105,95 kr.

    As the computer gradually automates human-oriented tasks in multiple environments, the interface between computers and the ever-wider population of human users assumes progressively increasing importance. In the office environment, for instance, clerical tasks such as document filing and retrieval, and higher-level tasks such as scheduling meetings, planning trip itineraries, and producing documents for publication, are being partially or totally automated. The range of users for office-oriented software includes clerks, secretaries, and businesspersons, none of whom are predominantly computer literate. The same phenomenon is echoed on the factory production line, on the securities trading floor, in government agencies, in educational institutions, and even in the home. The arcane command languages of yesteryear have proven too high a barrier for smooth acceptance of computerized functions into the workplace, no matter how useful these functions may be. Computer-naive users simply do not take the time to learn intimidating and complex computer interfaces. In order to place the functionality of modern computers at the disposition of diverse user populations, a number of different approaches have been tried, many meeting with a significant measure of success, to wit: special courses to train users in the simpler command languages (such as MS-DOS), designing point-and-click menu/graphics interfaces that require much less user familiarization (illustrated most clearly in the Apple Macintosh), and interacting with the user in his or her language of choice.

  • by Harry A G Wijshoff
    1.106,95 kr.

    The organization of data is clearly of great importance in the design of high-performance algorithms and architectures. Although there are several landmark papers on this subject, no comprehensive treatment has appeared. This monograph is intended to fill that gap. We introduce a model of computation for parallel computer architectures, by which we are able to express the intrinsic complexity of data organization for specific architectures. We apply this model of computation to several existing parallel computer architectures, e.g., the CDC 205 and CRAY vector computers, and the MPP binary array processor. The study of data organization in parallel computations was introduced as early as 1970. During the development of the ILLIAC IV system there was a need for a theory of possible data arrangements in interleaved memory systems. The resulting theory dealt primarily with storage schemes, also called skewing schemes, for 2-dimensional matrices, i.e., mappings from a d-dimensional array to a number of memory banks. By means of the model of computation we are able to apply the theory of skewing schemes to various kinds of parallel computer architectures. This results in a number of consequences both for the design of parallel computer architectures and for applications of parallel processing.
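
    To make "skewing scheme" concrete: the simplest such scheme stores matrix element (i, j) in memory bank (i + j) mod M, so rows and columns both spread across distinct banks. A toy illustration (our own, not the monograph's notation; which schemes are conflict-free for which access patterns and architectures is precisely what the theory characterizes):

        M = 5  # number of memory banks

        def bank(i, j):
            """Diagonal skewing scheme for a 2-D matrix."""
            return (i + j) % M

        # For a 4x4 matrix with 5 banks, any row or column hits 4 distinct
        # banks, so it can be fetched in one conflict-free parallel access.
        row = [bank(2, j) for j in range(4)]   # [2, 3, 4, 0]
        col = [bank(i, 2) for i in range(4)]   # [2, 3, 4, 0]
        assert len(set(row)) == 4 and len(set(col)) == 4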

  • by Ravi Jain
    2.174,95 kr.

    Input/Output in Parallel and Distributed Computer Systems has attracted increasing attention over the last few years, as it has become apparent that input/output performance, rather than CPU performance, may be the key limiting factor in the performance of future systems. This I/O bottleneck is caused by the increasing speed mismatch between processing units and storage devices, the use of multiple processors operating simultaneously in parallel and distributed systems, and by the increasing I/O demands of new classes of applications, like multimedia. It is also important to note that, to varying degrees, the I/O bottleneck exists at multiple levels of the memory hierarchy. All indications are that the I/O bottleneck will be with us for some time to come, and is likely to increase in importance. Input/Output in Parallel and Distributed Computer Systems is based on papers presented at the 1994 and 1995 IOPADS workshops held in conjunction with the International Parallel Processing Symposium. This book is divided into three parts. Part I, the Introduction, contains four invited chapters which provide a tutorial survey of I/O issues in parallel and distributed systems. The chapters in Parts II and III contain selected research papers from the 1994 and 1995 IOPADS workshops; many of these papers have been substantially revised and updated for inclusion in this volume. Part II collects the papers from both years which deal with various aspects of system software, and Part III addresses architectural issues. Input/Output in Parallel and Distributed Computer Systems is suitable as a secondary text for graduate level courses in computer architecture, software engineering, and multimedia systems, and as a reference for researchers and practitioners in industry.

  • by MengChu Zhou
    2.173,95 kr.

    Over the past two decades, research in the theory of Petri nets and the development of graphical tools has yielded a powerful methodology. The contributions in Petri Nets in Flexible and Agile Automation present theoretical developments of Petri nets as well as industrial applications in areas such as discrete-event control design, scheduling, performance evaluation and deadlock avoidance. These contributions also include comparative studies of Petri nets and other approaches. A primary theme of this book is to provide a unified approach to the applications of Petri nets in flexible and agile automation and, in that regard, a common notation and terminology is used. The book also allows readers to evaluate the benefits and applicability of state-of-the-art Petri net methods and apply CAD tools to problems of interest. Petri Nets in Flexible and Agile Automation is not only an essential reference for researchers, it is also a very useful tool for engineers, analysts and managers who are responsible for the design, implementation and operation of the next generation of manufacturing systems.

  • by S V Nagaraj
    1.004,95 kr.

    The last decade has seen a tremendous growth in the usage of the World Wide Web. The Web has grown so fast that it seems to be becoming an unusable and slow behemoth. Web caching is one way to tame this behemoth and make it a friendly and useful giant. The key idea in Web caching is to cache frequently accessed content so that it may be used profitably later. This book focuses entirely on Web caching techniques. Much of the material in this book is very relevant for those interested in understanding the wide gamut of Web caching research. It will be helpful for those interested in making use of the power of the Web in a more profitable way. Audience and purpose of this book: it presents key concepts in Web caching and is suited to a wide variety of readers, including advanced undergraduate and graduate students, programmers, network administrators, researchers, teachers, technologists and Internet Service Providers (ISPs).
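
    The "cache frequently accessed content" idea is always paired with a replacement policy deciding what to evict when the cache fills. A minimal sketch of one classic policy, least-recently-used (illustrative only; the book surveys far richer policies, consistency mechanisms and cooperative cache hierarchies):

        from collections import OrderedDict

        class LRUCache:
            """Toy Web cache: keeps the most recently used responses."""
            def __init__(self, capacity):
                self.capacity = capacity
                self.store = OrderedDict()  # url -> cached response body

            def get(self, url):
                if url in self.store:
                    self.store.move_to_end(url)  # refresh recency on a hit
                    return self.store[url]       # served without origin fetch
                return None                      # miss: caller fetches + puts

            def put(self, url, body):
                self.store[url] = body
                self.store.move_to_end(url)
                if len(self.store) > self.capacity:
                    self.store.popitem(last=False)  # evict least recently used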

  • by Julio Cesar Sampaio Do Prado Leite
    1.111,95 kr.

    Perspectives On Software Requirements presents perspectives on several current approaches to software requirements. Each chapter addresses a specific problem where the authors summarize their experiences and results to produce well-fit and traceable requirements. Chapters highlight familiar issues with recent results and experiences, which are accompanied by chapters describing well-tuned new methods for specific domains.

  • by P. van der Meer
    992,95 kr.

    1.1 Power-dissipation trends in CMOS circuits. Shrinking device geometry, growing chip area and increased data-processing speed are technological trends in the integrated circuit industry to enlarge chip functionality. Already in 1965 Gordon Moore predicted that the total number of devices on a chip would double every year until the 1970s and every 24 months in the 1980s. This prediction is widely known as "Moore's Law" and eventually culminated in the Semiconductor Industry Association (SIA) technology road map [1]. The SIA road map has been a guide for the industry, leading it to continued wafer and die size growth, increased transistor density and operating frequencies, and defect density reduction. To mention a few numbers: the die size increased 7% per year, the smallest feature sizes decreased 30% and the operating frequencies doubled every two years. As a consequence of these trends both the number of transistors and the power dissipation per unit area increase. In the near future the maximum power dissipation per unit area will be reached. Down-scaling of the supply voltage is not only the most effective way to reduce power dissipation in general, it is also a necessary precondition to ensure device reliability by reducing electrical fields and device temperature, to prevent device degradation. A drawback of this solution is an increased signal propagation delay, which results in lower data-processing speed.
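
    Compounding the quoted trends shows why power density climbs; a small back-of-the-envelope sketch (the text does not state the period for the 30% feature-size reduction, so a two-year process generation is assumed here):

        years = 10
        die_area  = 1.07 ** years        # +7% per year -> ~1.97x
        feature   = 0.70 ** (years / 2)  # -30% per assumed 2-year generation
        frequency = 2.0 ** (years / 2)   # doubling every two years -> 32x
        print(f"after {years} years: die area x{die_area:.2f}, "
              f"feature size x{feature:.2f}, frequency x{frequency:.0f}")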

  • by Federico Bruccoleri
    1.772,95 kr.

    Low Noise Amplifiers (LNAs) are commonly used to amplify signals that are too weak for direct processing, for example in radio or cable receivers. Traditionally, low noise amplifiers are implemented as tuned amplifiers, exploiting inductors and capacitors in resonating LC circuits. This can render very low noise, but only in a relatively narrow frequency band close to resonance. There is a clear trend to use more bandwidth for communication, both via cables (e.g. cable TV, internet) and wireless links (e.g. satellite links and Ultra Wideband). Hence wideband low-noise amplifier techniques are very much needed. Wideband Low Noise Amplifiers Exploiting Thermal Noise Cancellation explores techniques to realize wideband amplifiers capable of impedance matching while still achieving a noise figure well below 3 dB. This can be achieved with the new noise-cancelling technique described in this book. By using this technique, the thermal noise of the input transistor of the LNA can be cancelled while the wanted signal is amplified! The book gives a detailed analysis of this technique and presents several new amplifier circuits. This book is directly relevant for IC designers and researchers working on integrated transceivers. Although the focus is on CMOS circuits, the techniques can just as well be applied to other IC technologies, e.g. bipolar and GaAs, and even in discrete component technologies.
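
    An abstract numerical toy of the cancellation idea (not the book's actual circuit analysis): the input device's noise appears with the same sign at two circuit nodes, while the signal appears with opposite signs, so a properly weighted difference removes the noise term and adds the signal:

        import numpy as np

        rng = np.random.default_rng(1)
        s = rng.normal(size=10_000)   # wanted signal
        n = rng.normal(size=10_000)   # thermal noise of the input transistor

        node_a = s + n                # sense node: signal plus device noise
        node_b = -4.0 * s + n         # inverted, amplified signal, same noise
        out = node_b - node_a         # = -5.0 * s: noise cancels, signal adds
        print(np.allclose(out, -5.0 * s))  # True in this idealized model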

  • by Huan Liu
    2.172,95 kr.

    There is broad interest in feature extraction, construction, and selection among practitioners from statistics, pattern recognition, and data mining to machine learning. Data preprocessing is an essential step in the knowledge discovery process for real-world applications. This book compiles contributions from many leading and active researchers in this growing field and paints a picture of the state-of-the-art techniques that can boost the capabilities of many existing data mining tools. The objective of this collection is to increase the awareness of the data mining community about the research on feature extraction, construction and selection, which is currently conducted mainly in isolation. This book is part of our endeavor to produce a contemporary overview of modern solutions, to create synergy among these seemingly different branches, and to pave the way for developing meta-systems and novel approaches. Even with today's advanced computer technologies, discovering knowledge from data can still be fiendishly hard due to the characteristics of computer-generated data. Feature extraction, construction and selection are a set of techniques that transform and simplify data so as to make data mining tasks easier. Feature construction and selection can be viewed as two sides of the representation problem.
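
    As a concrete instance of feature selection, here is a toy univariate filter that keeps the k features most correlated with the target (our own illustration; the book's contributions cover far more sophisticated extraction, construction and selection methods):

        import numpy as np

        def select_features(X, y, k):
            """Rank features by |Pearson correlation| with y; keep top k."""
            Xc = X - X.mean(axis=0)
            yc = y - y.mean()
            denom = np.linalg.norm(Xc, axis=0) * np.linalg.norm(yc)
            corr = np.abs(Xc.T @ yc) / np.where(denom == 0, 1.0, denom)
            return np.argsort(corr)[::-1][:k]

        rng = np.random.default_rng(2)
        X = rng.normal(size=(100, 6))
        y = 3 * X[:, 4] + 0.1 * rng.normal(size=100)  # only feature 4 matters
        print(select_features(X, y, 2))  # feature index 4 should come first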

  • by Urs E Gattiker
    1.510,95 kr.

    The Dictionary of Information Security provides complete and easy-to-read explanations of common security and infrastructure protection terms (quick refresher terms). Special attention is given to terms that most often prevent educated readers from understanding journal articles or books in cryptography, computer security, information systems, role-based access management and applied fields that build on those disciplines. Also included in the dictionary are terms that refer to computing forensics, malware attacks, privacy issues, system design, security auditing and vulnerability testing. Although it is difficult for an IT professional or an IT student to keep abreast of the terminology in use today, the Dictionary of Information Security presents cutting-edge information on the most recent terms in one concisely formatted volume. Similar to dictionaries for languages, statistics, epidemiology and other disciplines, this IT security dictionary is a reference tool that should become part of any professional's and IT student's library. The Dictionary of Information Security is designed for a professional audience composed of researchers and practitioners in industry. This dictionary is also suitable for students in computer science, engineering and information sciences.

  • by Osamu Wada
    2.178,95 kr.

    As we approach the end of the present century, the elementary particles of light (photons) are seen to be competing increasingly with the elementary particles of charge (electrons/holes) in the task of transmitting and processing the insatiable amounts of information needed by society. The massive enhancements in electronic signal processing that have taken place since the discovery of the transistor elegantly demonstrate how we have learned to make use of the strong interactions that exist between assemblages of electrons and holes, disposed in suitably designed geometries, and replicated on an increasingly fine scale. On the other hand, photons interact extremely weakly amongst themselves, and all-photonic active circuit elements, where photons control photons, are presently very difficult to realise, particularly in small volumes. Fortunately, rapid developments in the design and understanding of semiconductor injection lasers, coupled with newly recognized quantum phenomena that arise when device dimensions become comparable with electronic wavelengths, have clearly demonstrated how efficient and fast the interaction between electrons and photons can be. This latter situation has therefore provided a strong incentive to devise and study monolithic integrated circuits which involve both electrons and photons in their operation. As Chapter 1 notes, it is barely fifteen years since the first simple optoelectronic integrated circuits were realised using III-V compound semiconductors; these comprised either a laser/driver or a photodetector/preamplifier combination.

  • by Francky Catthoor
    1.681,95 kr.

    Application-Driven Architecture Synthesis describes the state of the art of architectural synthesis for complex real-time processing. In order to deal with the stringent timing requirements and the intricacies of complex real-time signal and data processing, target architecture styles and target application domains have been adopted to make the synthesis approach feasible. These approaches are also heavily application-driven, which is illustrated by the many realistic demonstrations used as examples in the book. The focus is on domains where application-specific solutions are attractive, such as significant parts of audio, telecom, instrumentation, speech, robotics, medical and automotive processing, image and video processing, TV, multimedia, radar and sonar. Application-Driven Architecture Synthesis is of interest to both academics and senior design engineers and CAD managers in industry. It provides an excellent overview of what capabilities to expect from future practical design tools, and includes an extensive bibliography.

  • by Allen M Dewey
    1.679,95 kr.

    This book describes a new type of computer-aided VLSI design tool, called a VLSI System Planner, that is meant to aid designers during the early, or conceptual, stage of design. During this stage of design, the objective is to define a general design plan, or approach, that is likely to result in an efficient implementation satisfying the initial specifications, or to determine that the initial specifications are not realizable. A design plan is a collection of high-level design decisions. As an example, the conceptual design of digital filters involves choosing the type of algorithm to implement (e.g., finite impulse response or infinite impulse response), the type of polynomial approximation (e.g., equiripple or Chebyshev), the fabrication technology (e.g., CMOS or BiCMOS), and so on. Once a particular design plan is chosen, the detailed design phase can begin. It is during this phase that various synthesis, simulation, layout, and test activities occur to refine the conceptual design, gradually filling in more detail until the design is finally realized. The principal advantage of VLSI System Planning is that the increasingly expensive resources of the detailed design process are more efficiently managed. Costly redesigns are minimized because the detailed design process is guided by a more credible, consistent, and correct design plan.

  • by Ryszard S Michalski
    2.155,95 kr.

    Most machine learning research has been concerned with the development of systems that implement one type of inference within a single representational paradigm. Such systems, which can be called monostrategy learning systems, include those for empirical induction of decision trees or rules, explanation-based generalization, neural net learning from examples, genetic algorithm-based learning, and others. Monostrategy learning systems can be very effective and useful if the learning problems to which they are applied are sufficiently narrowly defined. Many real-world applications, however, pose learning problems that go beyond the capability of monostrategy learning methods. In view of this, recent years have witnessed a growing interest in developing multistrategy systems, which integrate two or more inference types and/or paradigms within one learning system. Such multistrategy systems take advantage of the complementarity of different inference types or representational mechanisms. Therefore, they have the potential to be more versatile and more powerful than monostrategy systems. On the other hand, due to their greater complexity, their development is significantly more difficult and represents a great new challenge to the machine learning community. Multistrategy Learning contains contributions characteristic of the current research in this area.

  • by Marvin K Simon
    816,95 kr.

    This handbook, now available in paperback, brings together a comprehensive collection of mathematical material in one location. It also offers a variety of new results interpreted in a form that is particularly useful to engineers, scientists, and applied mathematicians. The handbook is not specific to fixed research areas, but rather has a generic flavor that can be applied by anyone working with probabilistic and stochastic analysis and modeling. Classic results are presented in their final form without derivation or discussion, allowing for much material to be condensed into one volume. This concise compilation of disparate formulae saves time in searching different sources. Its focused application has broad interest for many disciplines: engineers, computer scientists, statisticians and physicists, as well as any researcher working in probabilistic and stochastic analysis and modeling in the natural or social sciences. The material is timeless, with intrinsic value to practicing engineers and scientists. Excerpts from reviews of the hardbound edition: "This is a unique book and an invaluable reference for engineers and scientists in the fields of electrical engineering, mathematics and statistics. There is no other single reference book that covers Gaussian and Gaussian-related distributions of random variables that are encountered in research work and practice. This is a reference book that I recommend to all my graduate students working in the area of telecommunication system analysis and design." -- John Proakis, Professor Emeritus, Northeastern University and Adjunct Professor, University of California, San Diego. "The reference book Probability Distributions Involving Gaussian Random Variables, authored by Dr. Marvin Simon, has become, in a very short time frame, one of the most useful aids to research in the field of digital communications that has come out in many years. It has numerous results that can save researchers in the field endless hours of work. It has replaced various other well known resources because of its concentration of relevant, timely, and easily accessible results." -- Larry Milstein, UCSD. "There are a small number of reference works that have proven so invaluable that I have purchased a home copy in addition to my office copy. This handbook is one of them." -- Dr. Norman C. Beaulieu, University of Alberta. "The Gaussian distribution and those derived from it are at the very core of a huge number of problems in multi-disciplinary fields of engineering, mathematics and science... The book, with its comprehensive information in analytical, tabular, and graphical form, is an invaluable tool for scientists and engineers." -- Sergio Benedetto, Politecnico di Torino. More testimonials can be found in the front of this edition.
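
    A representative example of the kind of Gaussian-related quantity such a handbook tabulates is the Gaussian Q-function (standard material, not an excerpt from the book):

        Q(x) = \frac{1}{\sqrt{2\pi}} \int_x^{\infty} e^{-t^2/2}\, dt
             = \frac{1}{2}\,\mathrm{erfc}\!\left(\frac{x}{\sqrt{2}}\right)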

  • by Cyrus Bamji
    1.099,95 kr.

    Leaf Cell and Hierarchical Compaction Techniques presents novel algorithms developed for the compaction of large layouts. These algorithms have been implemented as part of a system that has been used on many industrial designs. The focus of Leaf Cell and Hierarchical Compaction Techniques is three-fold. First, new ideas for compaction of leaf cells are presented. These cells can range from small transistor-level layouts to very large layouts generated by automatic Place and Route tools. Second, new approaches for hierarchical pitchmatching compaction are described and the concept of a Minimum Design is introduced. The system for hierarchical compaction is built on top of the leaf cell compaction engine and uses the algorithms implemented for leaf cell compaction in a modular fashion. Third, a new representation for designs called Virtual Interface, which allows for efficient topological specification and representation of hierarchical layouts, is outlined. The Virtual Interface representation binds all of the algorithms and their implementations for leaf and hierarchical compaction into an intuitive and easy-to-use system. From the Foreword: "...In this book, the authors provide a comprehensive approach to compaction based on carefully conceived abstractions. They describe the design of algorithms that provide true hierarchical compaction based on linear programming, but cut down the complexity of the computations through introduction of innovative representations that capture the provably minimum amount of required information needed for correct compaction. In most compaction algorithms, the complexity goes up with the number of design objects, but in this approach, complexity is due to the irregularity of the design, and hence is often tractable for most designs which incorporate substantial regularity. Here the reader will find an elegant treatment of the many challenges of compaction, and a clear conceptual focus that provides a unified approach to all aspects of the compaction task..." -- Jonathan Allen, Massachusetts Institute of Technology
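
    For simple spacing rules, the linear programs behind compaction reduce to longest-path computations on a constraint graph, where an edge (a, b, d) encodes x[b] >= x[a] + d. A toy one-dimensional sketch (our own; production compactors add hierarchy, symbolic layout and much richer constraints):

        from collections import defaultdict, deque

        def compact(n, edges):
            """Minimal x-coordinates satisfying x[b] >= x[a] + d for every
            (a, b, d), via longest paths in topological order."""
            adj, indeg = defaultdict(list), [0] * n
            for a, b, d in edges:
                adj[a].append((b, d))
                indeg[b] += 1
            x = [0] * n
            queue = deque(i for i in range(n) if indeg[i] == 0)
            while queue:
                a = queue.popleft()
                for b, d in adj[a]:
                    x[b] = max(x[b], x[a] + d)
                    indeg[b] -= 1
                    if indeg[b] == 0:
                        queue.append(b)
            return x

        # Three cells in a row, 2 units of minimum spacing between neighbors:
        print(compact(3, [(0, 1, 2), (1, 2, 2)]))  # [0, 2, 4]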

  • by Martial H Hebert
    2.166,95 kr.

    Intelligent Unmanned Ground Vehicles describes the technology developed and the results obtained by the Carnegie Mellon Robotics Institute in the course of the DARPA Unmanned Ground Vehicle (UGV) project. The goal of this work was to equip off-road vehicles with computer-controlled, unmanned driving capabilities. The book describes contributions in the area of mobility for UGVs including: tools for assembling complex autonomous mobility systems; on-road and off-road navigation; sensing techniques; and route planning algorithms. In addition to basic mobility technology, the book covers a number of integrated systems demonstrated in the field in realistic scenarios. The approaches presented in this book can be applied to a wide range of mobile robotics applications, from automated passenger cars to planetary exploration, and construction and agricultural machines. Intelligent Unmanned Ground Vehicles shows the progress that was achieved during this program, from brittle, specially-built robots operating under highly constrained conditions, to groups of modified commercial vehicles operating in tough environments. One measure of progress is how much of this technology is being used in other applications. For example, much of the work in road-following, architectures and obstacle detection has been the basis for the Automated Highway Systems (AHS) prototypes currently under development. AHS will lead to commercial prototypes within a few years. The cross-country technology is also being used in the development of planetary rovers with a projected launch date within a few years. The architectural tools built under this program have been used in numerous applications, from an automated harvester to an autonomous excavator. The results reported in this work provide tools for further research and development leading to practical, reliable and economical mobile robots.

  • by Thaddeus J Kowalski
    1.687,95 kr.

    Rule-Based Programming is a broad presentation of the rule-based programming method with many example programs showing the strengths of the rule-based approach. The rule-based approach has been used extensively in the development of artificial intelligence systems, such as expert systems and machine learning. This rule-based programming technique has been applied in such diverse fields as medical diagnostic systems, insurance and banking systems, as well as automated design and configuration systems. Rule-based programming is also helpful in bridging the semantic gap between an application and a program, allowing domain specialists to understand programs and participate more closely in their development. Over sixty programs are presented and all programs are available from an FTP site. Many of these programs are presented in several versions, allowing the reader to see how realistic programs are elaborated from "back-of-envelope" models. Metaprogramming is also presented as a technique for bridging the "semantic gap". Rule-Based Programming will be of interest to programmers, systems analysts and other developers of expert systems as well as to researchers and practitioners in artificial intelligence, computer science professionals and educators.
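
    To fix the flavor of the method, a toy forward-chaining rule engine (our own minimal illustration, not the book's notation): rules fire when their conditions are present in working memory, adding facts until a fixed point is reached:

        def run(rules, facts):
            """Each rule is (conditions, additions), both sets of facts."""
            facts = set(facts)
            changed = True
            while changed:
                changed = False
                for cond, add in rules:
                    if cond <= facts and not add <= facts:
                        facts |= add   # fire the rule: assert its conclusions
                        changed = True
            return facts

        rules = [({"fever", "rash"}, {"suspect_measles"}),
                 ({"suspect_measles"}, {"order_serology"})]
        print(run(rules, {"fever", "rash"}))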

  • by N. Bouden-Romdhane
    1.677,95 kr.

    From the Foreword: "Modern digital signal processing applications provide a large challenge to the system designer. Algorithms are becoming increasingly complex, and yet they must be realized with tight performance constraints. Nevertheless, these DSP algorithms are often built from many constituent canonical subtasks (e.g., IIR and FIR filters, FFTs) that can be reused in other subtasks. Design is then a problem of composing these core entities into a cohesive whole to provide both the intended functionality and the required performance. In order to organize the design process, there have been two major approaches. The top-down approach starts with an abstract, concise, functional description which can be quickly generated. On the other hand, the bottom-up approach starts from a detailed low-level design where performance can be directly assessed, but where the requisite design and interface detail take a long time to generate. In this book, the authors show a way to effectively resolve this tension by retaining the high-level conciseness of VHDL while parameterizing it to get a good fit to specific applications through reuse of core library components. Since they build on a pre-designed set of core elements, accurate area, speed and power estimates can be percolated to high-level design routines which explore the design space. Results are impressive, and the cost model provided will prove to be very useful. Overall, the authors have provided an up-to-date approach, doing a good job at getting performance out of high-level design. The methodology provided makes good use of extant design tools, and is realistic in terms of the industrial design process. The approach is interesting in its own right, but is also of direct utility, and it will give the existing DSP CAD tools a highly competitive alternative. The techniques described have been developed within ARPA's RASSP (Rapid Prototyping of Application Specific Signal Processors) project, and should be of great interest there, as well as to many industrial designers." -- Professor Jonathan Allen, Massachusetts Institute of Technology

  • by Simon W Moore
    1.674,95 kr.

    Multithreaded Processor Design takes the unique approach of designing a multithreaded processor from the ground up. Every aspect is carefully considered to form a balanced design rather than making incremental changes to an existing design and then ignoring problem areas. The general-purpose parallel computer is an elusive goal. Multithreaded processors have emerged as a promising solution to this conundrum by forming some amalgam of the commonplace control-flow (von Neumann) processor model with the more exotic data-flow approach. This new processor model offers many exciting possibilities and there is much research to be performed to make this technology widespread. Multithreaded processors utilize the simple and efficient sequential execution technique of control-flow, together with data-flow-like concurrency primitives. This supports the conceptually simple but powerful idea of rescheduling rather than blocking when waiting for data, e.g. from large and distributed memories, thereby tolerating long data transmission latencies. This makes multiprocessing far more efficient because the cost of moving data between distributed memories and processors can be hidden by other activity. The same hardware mechanisms may also be used to synchronize interprocess communications to awaiting threads, thereby alleviating operating system overheads. Supporting synchronization and scheduling mechanisms in hardware naturally adds complexity. Consequently, existing multithreaded processor designs have tended to make incremental changes to existing control-flow processor designs to resolve some problems but not others. Multithreaded Processor Design serves as an excellent reference source and is suitable as a text for advanced courses in computer architecture dealing with the subject.
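
    The "reschedule rather than block" mechanism can be mimicked in software with coroutines: each thread yields where it would stall on memory, and the scheduler simply runs another ready thread, hiding the latency. A toy sketch (illustrative only; in the book this is a hardware mechanism):

        from collections import deque

        def thread(name, loads):
            """A toy hardware thread that 'waits' once per memory load."""
            for addr in loads:
                yield f"{name}: issued load of {addr}, yielding while it completes"

        def scheduler(threads):
            ready = deque(threads)
            while ready:
                t = ready.popleft()
                try:
                    print(next(t))   # run the thread up to its next stall
                    ready.append(t)  # reschedule it behind the other threads
                except StopIteration:
                    pass             # thread finished all its work

        scheduler([thread("T0", [1, 2]), thread("T1", [7])])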

  • by Swaminathan Natarajan
    1.101,95 kr.

    Real-time systems are now used in a wide variety of applications. Conventionally, they were configured at design time to perform a given set of tasks and could not readily adapt to dynamic situations. The concept of imprecise and approximate computation has emerged as a promising approach to providing scheduling flexibility and enhanced dependability in dynamic real-time systems. The concept can be utilized in a wide variety of applications, including signal processing, machine vision, databases, networking, etc. For those who wish to build dynamic real-time systems which must deal safely with resource unavailability while continuing to operate, leading to situations where computations may not be carried through to completion, the techniques of imprecise and approximate computation facilitate the generation of partial results that may enable the system to operate safely and avert catastrophe. Audience: of special interest to researchers. May be used as a supplementary text in courses on real-time systems.
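
    The imprecise-computation idea splits work into a mandatory part that must complete and optional refinement that may be cut off, with the partial result still usable. A toy anytime sketch in that spirit (Newton iteration stands in for the application's optional computation; all names are ours):

        import time

        def sqrt_anytime(a, budget_s):
            """Refine an estimate of sqrt(a) until the time budget expires,
            then return the best result produced so far."""
            deadline = time.monotonic() + budget_s
            x = a if a > 1 else 1.0       # mandatory part: a safe estimate
            while time.monotonic() < deadline:
                x = 0.5 * (x + a / x)     # optional part: improves precision
            return x                      # usable even if refinement was cut

        print(sqrt_anytime(2.0, 0.001))   # close to 1.41421 within 1 ms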
