
Books published by now publishers Inc

  • by Lisha Chen
    1.097,95 kr.

    Deep learning has achieved remarkable success in many machine learning tasks such as image classification, speech recognition, and game playing. However, these breakthroughs are often difficult to translate into real-world engineering systems because deep learning models require a massive number of training samples, which are costly to obtain in practice. To address labeled data scarcity, few-shot meta-learning optimizes learning algorithms so that they can adapt quickly and efficiently to new tasks. While meta-learning is gaining significant interest in the machine learning literature, its working principles and theoretical foundations are not as well understood in the engineering community. This review monograph provides an introduction to meta-learning by covering principles, algorithms, theory, and engineering applications. After introducing meta-learning in comparison with conventional and joint learning, the main meta-learning algorithms are described, as well as a general bilevel optimization framework for the definition of meta-learning techniques (a schematic form of this bilevel formulation is sketched after this list of titles). Then, known results on the generalization capabilities of meta-learning from a statistical learning viewpoint are summarized. Applications to communication systems, including decoding and power allocation, are discussed next, followed by an introduction to aspects related to the integration of meta-learning with emerging computing technologies, namely neuromorphic and quantum computing. The monograph concludes with an overview of open research challenges.

  • by Lingfei Wu
    1.162,95 kr.

    Deep learning has become the dominant approach in addressing various tasks in Natural Language Processing (NLP). Although text inputs are typically represented as a sequence of tokens, there is a rich variety of NLP problems that can be best expressed with a graph structure. As a result, there is a surge of interest in developing new deep learning techniques on graphs for a large number of NLP tasks. In this monograph, the authors present a comprehensive overview of Graph Neural Networks (GNNs) for Natural Language Processing. They propose a new taxonomy of GNNs for NLP, which systematically organizes existing research on GNNs for NLP along three axes: graph construction, graph representation learning, and graph-based encoder-decoder models. They further introduce a large number of NLP applications that exploit the power of GNNs and summarize the corresponding benchmark datasets, evaluation metrics, and open-source codes. Finally, they discuss various outstanding challenges for making full use of GNNs for NLP as well as future research directions. This is the first comprehensive overview of Graph Neural Networks for Natural Language Processing. It provides students and researchers with a concise and accessible resource to quickly get up to speed with an important area of machine learning research.

  • by Bennet E. Meyers
    827,95 kr.

    The decomposition of a time series signal into components is an age-old problem, with many different approaches proposed, including traditional filtering and smoothing, seasonal-trend decomposition, Fourier and other decompositions, PCA and newer variants such as nonnegative matrix factorization, various statistical methods, and many heuristic methods. This monograph covers the well-studied problem of decomposing a vector time series signal into components with different characteristics, such as smooth, periodic, nonnegative, or sparse. It presents a general framework in which the components are defined by loss functions (which can include constraints), and the signal decomposition is carried out by minimizing the sum of the losses of the components (subject to the constraints); this formulation is sketched schematically after this list of titles. When each loss function is the negative log-likelihood of a density for the signal component, this framework coincides with maximum a posteriori probability (MAP) estimation; but it also includes many other interesting cases. Summarizing and clarifying prior results, two distributed optimization methods for computing the decomposition are presented, which find the optimal decomposition when the component class loss functions are convex, and are good heuristics when they are not. Both methods require only the masked proximal operator of each of the component loss functions, a generalization of the well-known proximal operator that handles missing entries in its argument. Both methods are distributed, i.e., they handle each component separately. Also included are tractable methods for evaluating the masked proximal operators of some loss functions that have not appeared in the literature.

  • by Emmanuel Abbe
    1.157,95 kr.

    Reed-Muller (RM) codes are among the oldest, simplest, and perhaps most ubiquitous families of codes (the standard definition is recalled after this list of titles). They are used in many areas of coding theory in both electrical engineering and computer science. Yet, many of their important properties are still under investigation. In this monograph the authors consider some of the most recent developments in RM codes that are having major impacts on the design of modern communication systems. These include the weight enumerator and the capacity-achieving properties of RM codes, as well as some of the algorithmic developments. In particular, they discuss connections established between RM codes, thresholds of Boolean functions, polarization theory, hypercontractivity, and techniques for approximating low-weight codewords. They then overview some of the algorithms for decoding RM codes. The monograph concludes by looking at some applications of RM codes in theoretical computer science and signal processing. This monograph is written in a tutorial style, introducing the reader to the basics of RM codes before building in each chapter to cover the wide-ranging topics that make this a comprehensive overview of RM codes for current and future communication systems.

  • by Rajeev K. Goel
    827,95 kr.

    Drivers of International Research Spending contributes to the literature on the economics of technical change in two ways. First, it provides an overview and a critical appraisal of the literature on the drivers of research spending, especially focusing on the extant empirical studies in recent years. Second, it provides a unique insight into the empirical determinants of research spending using micro or firm-level data on research spending decisions across a very large sample of mostly emerging nations. Firm-level information on research enables the consideration of many characteristics (e.g., size, vintage, ownership, etc.) of firms that perform research. This monograph is organized as follows. The authors begin by presenting a schematic diagram that describes their vision about what constitutes R&D, its various dimensions, and the key players involved in such activity. Next, they present an extended overview of the literature on the causes and effects of technical change, including the drivers of research spending. The authors then discuss micro-level data sets on technical changes and R&D activity, with special attention given to the Enterprise Surveys (ES) dataset organized through the World Bank, which is employed in modeling cross-country firm-level R&D decision-making. Finally, the monograph provides recommendations for technology policy and suggests some directions for future research.

  • by Ruolei Ji
    1.017,95 kr.

    In recent years, the demand for visual media has been growing exponentially, and high-resolution content constitutes an increasingly large share of the visual traffic over the Internet. With such rapid growth of digital visual media traffic, there is a growing need for image/video compression approaches that can achieve much higher compression ratios than existing conventional image/video compression methods while maintaining high visual quality. Visual compression is an application of data compression that lowers the storage and/or transmission requirements for digital images and videos. Considering that deep learning techniques have successfully revolutionized many visual tasks, learning-based compression algorithms have been explored over the years and have been shown to outperform many conventional compression methods. This monograph provides a review of visual compression algorithms, covering both end-to-end learning-based image compression approaches and hybrid image compression approaches. Some learning-based video compression methods are also discussed. In addition to describing a wide range of learning-based image compression approaches developed in recent years, the survey describes widely used datasets and discusses potential research directions.

  • by Michael Fritsch
    1.112,95 kr.

    Entrepreneurship in the Long-Run: Empirical Evidence and Historical Mechanisms reviews the available evidence on the historical roots of entrepreneurship and its relationship with economic performance across regions, defined as subnational geographic entities. Given the tremendous differences in the level and the type of entrepreneurship between regions, the authors focus on the regional level and demonstrate how historical factors can determine entrepreneurial activity in a region and may predetermine future development paths. In addition, the monograph looks at the ability of a regional economy to cope with external challenges. The main explanation for such long-term effects, and for the persistence of the level of regional entrepreneurship over long periods, is historically rooted regional cultures that change only very slowly. Generally, historical roots provide a key explanation for the development of regions along long-term trajectories that are characterized by a co-evolution of entrepreneurship, knowledge, and informal institutions. This means that regions can have persistently low or persistently high levels of entrepreneurship depending on whether historical factors shaped entrepreneurship positively or negatively. Following a brief introduction, the monograph starts with an overview of data availability and measurement (Section 2), summarizing the available empirical evidence on long-term trends in regional entrepreneurship in Section 3. Section 4 discusses potential explanations for these findings. Section 5 is devoted to the effects of persistent regional entrepreneurship on economic development. Based on these findings the authors discuss theory development in Section 6 and derive policy implications in Section 7. Section 8 reviews empirical challenges and describes the main avenues for further research. Section 9 provides the authors' conclusions.

  • by Thomas M. Moerland
    1.057,95 kr.

    Sequential decision making, commonly formalized as Markov Decision Process (MDP) optimization, is an important challenge in artificial intelligence. Two key approaches to this problem are reinforcement learning (RL) and planning. This monograph surveys an integration of both fields, better known as model-based reinforcement learning. Model-based RL has two main steps: dynamics model learning and planning-learning integration (a toy illustration of these two steps is sketched after this list of titles). In this comprehensive survey of the topic, the authors first cover dynamics model learning, including challenges such as dealing with stochasticity, uncertainty, partial observability, and temporal abstraction. They then present a systematic categorization of planning-learning integration, including aspects such as: where to start planning, what budgets to allocate to planning and real data collection, how to plan, and how to integrate planning in the learning and acting loop. In conclusion, the authors discuss implicit model-based RL as an end-to-end alternative for model learning and planning, and cover the potential benefits of model-based RL. Along the way, the authors draw connections to several related RL fields, including hierarchical RL and transfer learning. This monograph contains a broad conceptual overview of the combination of planning and learning for Markov Decision Process optimization. It provides a clear and complete introduction to the topic for students and researchers alike.

  • by Jordi Guijarro Olivares
    1.407,95 kr.

    The damaging effects of cyberattacks on an industry like Cooperative Connected and Automated Mobility (CCAM) can be tremendous. Ranging from the least to the most severe, these include damage to the reputation of vehicle manufacturers, increased reluctance of customers to adopt CCAM, lost working hours (with a direct impact on the European GDP), material damage, increased environmental pollution due, for example, to traffic jams or malicious modifications of sensor firmware, and, ultimately, grave danger to human lives, whether drivers, passengers, or pedestrians. Connected vehicles will soon become a reality on our roads, bringing along new services and capabilities, but also technical challenges and security threats. To overcome these risks, the CARAMEL project has developed several anti-hacking solutions for the new generation of vehicles. CARAMEL (Artificial Intelligence-based Cybersecurity for Connected and Automated Vehicles) is a research project co-funded by the European Union under the Horizon 2020 framework programme and carried out by a consortium of 15 organizations from 8 European countries together with 3 Korean partners. The project applies a proactive approach based on Artificial Intelligence and Machine Learning techniques to detect and prevent potential cybersecurity threats to autonomous and connected vehicles. The approach is organized around four fundamental pillars: Autonomous Mobility, Connected Mobility, Electromobility, and Remote Control Vehicle. This book presents theory and results from each of these technical directions.

  • by Sauvik Das
    1.142,95 kr.

    Cybersecurity and Privacy (S&P) unlock the full potential of computing. The use of encryption, authentication, and access control, for example, allows employees to correspond with professional colleagues via email with reduced fear of leaking confidential data to competitors or cybercriminals. It also allows, for example, parents to share photos of children with remote loved ones over the Internet with reduced fear of this data reaching the hands of unknown strangers, and anonymous whistleblowers to share information about problematic practices in the workplace with reduced fear of being outed. Conversely, failure to employ appropriate S&P measures can leave people and organizations vulnerable to a broad range of threats. In short, the security and privacy decisions we make on a day-to-day basis determine whether the data we share, manipulate, and store online is protected from theft, surveillance, and exploitation. How can end-users be encouraged to accept recommended S&P behavior from experts? In this monograph, prior art in human-centered S&P is reviewed and three barriers to end-user acceptance of expert recommendations are identified. These three barriers make up what the authors call the "Security & Privacy Acceptance Framework" (SPAF). The barriers are: (1) awareness, i.e., people may not know of relevant security threats and appropriate mitigation measures; (2) motivation, i.e., people may be unwilling to enact S&P behaviors because, e.g., the perceived costs are too high; and (3) ability, i.e., people may not know when, why, and how to effectively implement S&P behaviors. The monograph also reviews and critically analyzes prior work that has explored mitigating one or more of the barriers that make up the SPAF. Finally, using the SPAF as a lens, it discusses how the human-centered S&P community might re-orient to encourage widespread end-user acceptance of pro-S&P behaviors by employing integrative approaches that address each of the awareness, motivation, and ability barriers.

  • by Mark Bun
    1.142,95 kr.

    The ability (or inability) to represent or approximate Boolean functions by polynomials is a central concept in complexity theory, underlying interactive and probabilistically checkable proof systems, circuit lower bounds, quantum complexity theory, and more. In this book, the authors survey what is known about a particularly natural notion of approximation by polynomials, capturing pointwise approximation over the real numbers. The book covers recent progress on proving approximate degree lower and upper bounds and describes some applications of the new bounds to oracle separations, quantum query and communication complexity, and circuit complexity. The authors explain how several of these advances have been unlocked by a particularly simple and elegant technique, called dual block composition, for constructing solutions to the dual of the linear program that characterizes approximate degree (this dual formulation is sketched after this list of titles). They also provide concise coverage of even more recent lower bound techniques based on a new complexity measure called spectral sensitivity. Finally, they show how explicit constructions of approximating polynomials have been inspired by quantum query algorithms. This book provides a comprehensive review of the foundational and recent developments of an important topic in both classical and quantum computing. It condenses a considerable body of knowledge into an accessible form so that the reader can quickly understand the principles and further their own research.

  • by Sofie Beier
    992,95 kr.

  • by Justin Thaler
    1.142,95 kr.

  • by Zhe Gan
    1.142,95 kr.

  • by Claudio Adragna
    1.187,95 kr.

  • by Jan M. Rabaey
    1.187,95 kr.

  • by Warren B. Powell
    1.187,95 kr.

  • by Wenshuo Wang
    1.187,95 kr.

  • by João B. O. Souza Filho
    1.172,95 kr.

  • by William B. Bonvillian
    877,95 kr.

  • by Carlos Celemin
    1.187,95 kr.

  • by Clément L. Canonne
    1.187,95 kr.

  • by Mirco Moencks
    1.187,95 kr.

  • by Zeynep Aydin-Gokgoz
    837,95 kr.

  • by Kieron O'Hara
    1.187,95 kr.

  • by Alecos Papadopoulos
    1.047,95 kr.
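
Schematic sketches for some of the titles above:

On the meta-learning monograph by Lisha Chen: the bilevel optimization framework mentioned in its description is commonly written in the following schematic form; the notation below is a generic sketch and not necessarily the monograph's own.

\min_{\theta} \; \sum_{t=1}^{T} L_t^{\mathrm{val}}\!\big(\phi_t(\theta)\big)
\quad \text{subject to} \quad
\phi_t(\theta) \in \arg\min_{\phi} \, L_t^{\mathrm{tr}}(\phi \mid \theta), \qquad t = 1, \dots, T,

where \theta denotes the shared meta-parameters (for example, a model initialization), \phi_t the parameters adapted on task t's training split, and L_t^{\mathrm{tr}}, L_t^{\mathrm{val}} the per-task training and validation losses. The outer problem tunes \theta so that the inner, few-shot adaptation generalizes well on each task.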
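
On the signal decomposition monograph by Bennet E. Meyers: the framework described there, with components defined by loss functions and the decomposition obtained by minimizing their sum, can be written schematically as (generic notation, not necessarily the monograph's)

\text{minimize} \;\; \sum_{k=1}^{K} \phi_k(x^k)
\quad \text{subject to} \;\; x^1 + x^2 + \cdots + x^K = y,

where y is the observed vector time series, x^k is the k-th component, and \phi_k is a loss (possibly encoding constraints) expressing the desired character of that component, e.g., smooth, periodic, nonnegative, or sparse. When each \phi_k is the negative log-likelihood of a density for its component, the minimizer is the MAP decomposition mentioned in the description.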
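
On the Reed-Muller monograph by Emmanuel Abbe: for readers new to the topic, the standard definition of these codes is

\mathrm{RM}(r, m) = \Big\{ \big(f(x)\big)_{x \in \mathbb{F}_2^{m}} \;:\; f \in \mathbb{F}_2[x_1, \dots, x_m] \ \text{multilinear},\ \deg f \le r \Big\},

a binary code of length 2^m, dimension \sum_{i=0}^{r} \binom{m}{i}, and minimum distance 2^{m-r}; the weight enumerator and capacity-achieving results surveyed in the monograph concern this family.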
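
On the model-based reinforcement learning monograph by Thomas M. Moerland: below is a minimal toy sketch of the two steps its description names, learning a dynamics model from interaction and then planning with it. The chain environment and all names are illustrative assumptions, not taken from the monograph.

# Toy model-based RL sketch: (1) learn a tabular dynamics model from random
# interaction with an assumed chain environment, (2) plan on the learned model
# with value iteration. Purely illustrative; not the monograph's algorithms.
import random
from collections import defaultdict

N_STATES, ACTIONS, GAMMA = 5, (-1, +1), 0.9

def step(state, action):
    """Assumed noisy chain: the intended move succeeds 80% of the time;
    reaching the last state yields reward 1."""
    move = action if random.random() < 0.8 else -action
    nxt = min(max(state + move, 0), N_STATES - 1)
    return nxt, 1.0 if nxt == N_STATES - 1 else 0.0

# Step 1: dynamics model learning from randomly collected transitions.
counts = defaultdict(lambda: defaultdict(int))
rewards = {}
for _ in range(5000):
    s, a = random.randrange(N_STATES), random.choice(ACTIONS)
    s2, r = step(s, a)
    counts[(s, a)][s2] += 1
    rewards[(s, a, s2)] = r

def learned_model(s, a):
    """Estimated transition distribution P(s' | s, a) from the counts."""
    total = sum(counts[(s, a)].values())
    return {s2: c / total for s2, c in counts[(s, a)].items()}

# Step 2: planning on the learned model (value iteration).
values = [0.0] * N_STATES
for _ in range(100):
    values = [max(sum(p * (rewards[(s, a, s2)] + GAMMA * values[s2])
                      for s2, p in learned_model(s, a).items())
                  for a in ACTIONS)
              for s in range(N_STATES)]

print("planned state values:", [round(v, 2) for v in values])

In the deep-learning settings the monograph surveys, the count-based model is replaced by a learned function approximator and exact value iteration by approximate planning.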
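
On the approximate degree monograph by Mark Bun: in one common normalization, the \varepsilon-approximate degree of a Boolean function f : \{-1,1\}^n \to \{-1,1\} is

\widetilde{\deg}_{\varepsilon}(f) = \min\big\{ \deg p \;:\; p \in \mathbb{R}[x_1, \dots, x_n],\ |p(x) - f(x)| \le \varepsilon \ \text{for all } x \in \{-1,1\}^n \big\},

typically with \varepsilon = 1/3. By linear-programming duality, \widetilde{\deg}_{\varepsilon}(f) > d exactly when there is a dual witness \psi : \{-1,1\}^n \to \mathbb{R} with

\sum_x \psi(x) f(x) > \varepsilon, \qquad \sum_x |\psi(x)| = 1, \qquad \sum_x \psi(x)\, q(x) = 0 \ \text{for every polynomial } q \text{ of degree at most } d.

The dual block composition technique mentioned in the description combines such witnesses for an outer and an inner function into a witness for their composition.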
