Experts investigate the role of child development in promoting a culture of peace, reporting on research in biology, neuroscience, genetics, and psychology.
A critical examination of metropolitan planning in Paris--the "Grand Paris" initiative--and the building of today's networked global city.
An examination of how the availability of low-end information and communication technology has provided a basis for the emergence of a working-class network society in China.
An engaging introduction to the use of game theory to study linguistic meaning.
How the simulation and visualization technologies so pervasive in science, engineering, and design have changed our way of seeing the world. Over the past twenty years, the technologies of simulation and visualization have changed our ways of looking at the world. In Simulation and Its Discontents, Sherry Turkle examines the now dominant medium of our working lives and finds that simulation has become its own sensibility. We hear it in Turkle's description of architecture students who no longer design with a pencil, of science and engineering students who admit that computer models seem more "real" than experiments in physical laboratories. Echoing architect Louis Kahn's famous question, "What does a brick want?", Turkle asks, "What does simulation want?" Simulations want, even demand, immersion, and the benefits are clear. Architects create buildings unimaginable before virtual design; scientists determine the structure of molecules by manipulating them in virtual space; physicians practice anatomy on digitized humans. But immersed in simulation, we are vulnerable. There are losses as well as gains. Older scientists describe a younger generation as "drunk with code." Young scientists, engineers, and designers, full citizens of the virtual, scramble to capture their mentors' tacit knowledge of buildings and bodies. From both sides of a generational divide, there is anxiety that in simulation, something important is slipping away. Turkle's examination of simulation over the past twenty years is followed by four in-depth investigations of contemporary simulation culture: space exploration, oceanography, architecture, and biology.
Architects who engaged with cybernetics, artificial intelligence, and other technologies poured the foundation for digital interactivity. In Architectural Intelligence, Molly Wright Steenson explores the work of four architects in the 1960s and 1970s who incorporated elements of interactivity into their work. Christopher Alexander, Richard Saul Wurman, Cedric Price, and Nicholas Negroponte and the MIT Architecture Machine Group all incorporated technologies--including cybernetics and artificial intelligence--into their work and influenced digital design practices from the late 1980s to the present day. Alexander, long before his famous 1977 book A Pattern Language, used computation and structure to visualize design problems; Wurman popularized the notion of "information architecture"; Price designed some of the first intelligent buildings; and Negroponte experimented with the ways people experience artificial intelligence, even at architectural scale. Steenson investigates how these architects pushed the boundaries of architecture--and how their technological experiments pushed the boundaries of technology. What did computational, cybernetic, and artificial intelligence researchers have to gain by engaging with architects and architectural problems? And what was this new space that emerged within these collaborations? At times, Steenson writes, the architects in this book characterized themselves as anti-architects and their work as anti-architecture. The projects Steenson examines mostly did not result in constructed buildings, but rather in design processes and tools, computer programs, interfaces, digital environments. Alexander, Wurman, Price, and Negroponte laid the foundation for many of our contemporary interactive practices, from information architecture to interaction design, from machine learning to smart cities.
An overview of the theory and application of kernel classification methods. Linear classifiers in kernel spaces have emerged as a major topic within the field of machine learning. The kernel technique takes the linear classifier--a limited, but well-established and comprehensively studied model--and extends its applicability to a wide range of nonlinear pattern-recognition tasks such as natural language processing, machine vision, and biological sequence analysis. This book provides the first comprehensive overview of both the theory and algorithms of kernel classifiers, including the most recent developments. It begins by describing the major algorithmic advances: kernel perceptron learning, kernel Fisher discriminants, support vector machines, relevance vector machines, Gaussian processes, and Bayes point machines. Then follows a detailed introduction to learning theory, including VC and PAC-Bayesian theory, data-dependent structural risk minimization, and compression bounds. Throughout, the book emphasizes the interaction between theory and algorithms: how learning algorithms work and why. The book includes many examples, complete pseudo code of the algorithms presented, and an extensive source code library.
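The kernel idea the book builds on can be suggested in a few lines: keep a linear, mistake-driven learner but replace every inner product with a kernel evaluation, so the classifier becomes nonlinear in input space while remaining linear in the induced feature space. The sketch below is a minimal illustration of a kernel perceptron with a Gaussian kernel; it is not code from the book, and the data, kernel width, and epoch count are arbitrary demo choices.

```python
# Illustrative kernel perceptron (not from the book): a linear, mistake-driven
# learner made nonlinear by swapping inner products for kernel evaluations.
import numpy as np

def rbf_kernel(x, y, gamma=1.0):
    """Gaussian (RBF) kernel: k(x, y) = exp(-gamma * ||x - y||^2)."""
    return np.exp(-gamma * np.sum((x - y) ** 2))

def train_kernel_perceptron(X, y, kernel=rbf_kernel, epochs=10):
    """Learn dual coefficients alpha for labels y in {-1, +1}."""
    n = len(X)
    alpha = np.zeros(n)
    # Gram matrix K[i, j] = k(x_i, x_j); all learning happens through it.
    K = np.array([[kernel(xi, xj) for xj in X] for xi in X])
    for _ in range(epochs):
        for i in range(n):
            # Dual decision function: f(x_i) = sum_j alpha_j * y_j * k(x_j, x_i)
            pred = 1 if np.sum(alpha * y * K[:, i]) > 0 else -1
            if pred != y[i]:
                alpha[i] += 1.0  # classic mistake-driven update
    return alpha

def predict(X_train, y_train, alpha, x, kernel=rbf_kernel):
    score = sum(a * yi * kernel(xi, x) for a, yi, xi in zip(alpha, y_train, X_train))
    return 1 if score > 0 else -1

# XOR-style data: not linearly separable in the input space, but separable
# in the feature space induced by the RBF kernel.
X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
y = np.array([-1, 1, 1, -1])
alpha = train_kernel_perceptron(X, y)
print([predict(X, y, alpha, x) for x in X])  # [-1, 1, 1, -1]
```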
How women coped with both formal barriers and informal opposition to their entry into the traditionally masculine field of engineering in American higher education. Engineering education in the United States was long regarded as masculine territory. For decades, women who studied or worked in engineering were popularly perceived as oddities, outcasts, unfeminine (or inappropriately feminine in a male world). In Girls Coming to Tech!, Amy Bix tells the story of how women gained entrance to the traditionally male field of engineering in American higher education. As Bix explains, a few women breached the gender-reinforced boundaries of engineering education before World War II. During World War II, government, employers, and colleges actively recruited women to train as engineering aides, channeling them directly into defense work. These wartime training programs set the stage for more engineering schools to open their doors to women. Bix offers three detailed case studies of postwar engineering coeducation. Georgia Tech admitted women in 1952 to avoid a court case, over objections by traditionalists. In 1968, Caltech male students argued that nerds needed a civilizing female presence. At MIT, which had admitted women since the 1870s but treated them as a minor afterthought, feminist-era activists pushed the school to welcome more women and take their talent seriously. In the 1950s, women made up less than one percent of students in American engineering programs; in 2010 and 2011, women earned 18.4% of bachelor's degrees, 22.6% of master's degrees, and 21.8% of doctorates in engineering. Bix's account shows why these gains were hard won.
A computer scientist and a performance and new media theorist define and document the emerging field of mixed reality performance. Working at the cutting edge of live performance, an emerging generation of artists is employing digital technologies to create distinctive forms of interactive, distributed, and often deeply subjective theatrical performance. The work of these artists is not only fundamentally transforming the experience of theater, it is also reshaping the nature of human interaction with computers. In this book, Steve Benford and Gabriella Giannachi offer a new theoretical framework for understanding these experiences--which they term mixed reality performances--and document a series of landmark performances and installations that mix the real and the virtual, live performance and interactivity. Benford and Giannachi draw on a number of works that have been developed at the University of Nottingham's Mixed Reality Laboratory, describing collaborations with artists (most notably the group Blast Theory) that have gradually evolved a distinctive interdisciplinary approach to combining practice with research. They offer detailed and extended accounts of these works from different perspectives, including interviews with the artists and Mixed Reality Laboratory researchers. The authors develop an overarching theory to guide the study and design of mixed reality performances based on the approach of interleaved trajectories through hybrid structures of space, time, interfaces, and roles. Combinations of canonical, participant, and historic trajectories show how such performances establish complex configurations of real and virtual, local and global, factual and fictional, and personal and social.
A handbook to the Coq software for writing and checking mathematical proofs, with a practical engineering focus. The technology of mechanized program verification can play a supporting role in many kinds of research projects in computer science, and related tools for formal proof-checking are seeing increasing adoption in mathematics and engineering. This book provides an introduction to the Coq software for writing and checking mathematical proofs. It takes a practical engineering focus throughout, emphasizing techniques that will help users to build, understand, and maintain large Coq developments and minimize the cost of code change over time. Two topics, rarely discussed elsewhere, are covered in detail: effective dependently typed programming (making productive use of a feature at the heart of the Coq system) and construction of domain-specific proof tactics. Almost every subject covered is also relevant to interactive computer theorem proving in general, not just program verification, a relevance demonstrated through examples of verified programs applied in many different sorts of formalizations. The book develops a unique automated proof style and applies it throughout; even experienced Coq users may benefit from reading about basic Coq concepts from this novel perspective. The book also offers a library of tactics, or programs that find proofs, designed for use with examples in the book. By the end of the book, readers will have acquired the skills to reimplement these tactics in other settings. All of the code appearing in the book is freely available online.
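To give a feel for the kind of artifact the book teaches readers to build: a machine-checked proof is a script of tactics that a proof assistant verifies step by step. The book itself works in Coq; since no Coq code appears here, the small analogue below is written in Lean, a closely related proof assistant, and uses only standard-library lemmas. It is an illustration, not an example from the book.

```lean
-- A small machine-checked proof of commutativity of addition on Nat,
-- written in Lean as an analogue of the Coq developments the book teaches.
theorem add_comm_example (m n : Nat) : m + n = n + m := by
  induction m with
  | zero => simp                 -- base case: 0 + n = n + 0, closed by simp lemmas
  | succ k ih =>                 -- inductive step, with ih : k + n = n + k
    rw [Nat.succ_add, Nat.add_succ, ih]
```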
The evolution of the Boston metropolitan area, from country villages and streetcar suburbs to exurban sprawl and "smart growth." Boston's metropolitan landscape has been two hundred years in the making. From its proto-suburban village centers of 1800 to its far-flung, automobile-centric exurbs of today, Boston has been a national pacesetter for suburbanization. In The Hub's Metropolis, James O'Connell charts the evolution of Boston's suburban development. The city of Boston is compact and consolidated--famously, "the Hub." Greater Boston, however, stretches over 1,736 square miles and ranks as the world's sixth largest metropolitan area. Boston suburbs began to develop after 1820, when wealthy city dwellers built country estates that were just a short carriage ride away from their homes in the city. Then, as transportation became more efficient and affordable, the map of the suburbs expanded. The Metropolitan Park Commission's park-and-parkway system, developed in the 1890s, created a template for suburbanization that represents the country's first example of regional planning. O'Connell identifies nine layers of Boston's suburban development, each of which has left its imprint on the landscape: traditional villages; country retreats; railroad suburbs; streetcar suburbs (the first electric streetcar boulevard, Beacon Street in Brookline, was designed by Frederick Law Olmsted); parkway suburbs, which emphasized public greenspace but also encouraged commuting by automobile; mill towns, with housing for workers; upscale and middle-class suburbs accessible by outer-belt highways like Route 128; exurban, McMansion-dotted sprawl; and smart growth. Still a pacesetter, Greater Boston has pioneered antisprawl initiatives that encourage compact, mixed-use development in existing neighborhoods near railroad and transit stations. O'Connell reminds us that these nine layers of suburban infrastructure are still woven into the fabric of the metropolis. Each chapter suggests sites to visit, from Waltham country estates to Cambridge triple-deckers.
How the early Dungeons & Dragons community grappled with the nature of role-playing games, theorizing a new game genre. When Dungeons & Dragons made its debut in the mid-1970s, followed shortly thereafter by other, similar tabletop games, it sparked a renaissance in game design and critical thinking about games. D&D is now popularly considered to be the first role-playing game. But in the original rules, the term "role-playing" is nowhere to be found; D&D was marketed as a war game. In The Elusive Shift, Jon Peterson describes how players and scholars in the D&D community began to apply the term to D&D and similar games--and by doing so, established a new genre of games.
A handbook of situated design methods, with analyses and cases that range from designing study processes to understanding customer experiences to developing interactive installations. All design is situated--carried out from an embedded position. Design involves many participants and encompasses a range of interactions and interdependencies among designers, designs, design methods, and users. Design is also multidisciplinary, extending beyond the traditional design professions into such domains as health, culture, education, and transportation. This book presents eighteen situated design methods, offering cases and analyses of projects that range from designing interactive installations, urban spaces, and environmental systems to understanding customer experiences. Each chapter presents a different method, combining theoretical, methodological, and empirical discussions with accounts of actual experiences. The book describes methods for defining and organizing a design project, organizing collaborative processes, creating aesthetic experiences, and incorporating sustainability into processes and projects. The diverse and multidisciplinary methods presented include a problem- and project-based approach to design studies; a "Wheel of Rituals" intended to promote creativity; a pragmatist method for situated experience design that derives from empirical studies of film production and performance design; and ways to transfer design methods in a situated manner. The book will be an important resource for researchers, students, and practitioners of interdisciplinary design.
Lessons from and for the creative professions of art, science, design, and engineering: how to live in and with the Plenitude, that dense, knotted ecology of human-made stuff that creates the need for more of itself. We live with a lot of stuff. The average kitchen, for example, is home to stuff galore, and every appliance, every utensil, every thing, is compound--composed of tens, hundreds, even thousands of other things. Although each piece of stuff satisfies some desire, it also creates the need for even more stuff: cereal demands a spoon; a television demands a remote. Rich Gold calls this dense, knotted ecology of human-made stuff the "Plenitude." And in this book--at once cartoon treatise, autobiographical reflection, and practical essay in moral philosophy--he tells us how to understand and live with it. Gold writes about the Plenitude from the seemingly contradictory (but in his view, complementary) perspectives of artist, scientist, designer, and engineer--all professions pursued by him, sometimes simultaneously, in the course of his career. "I have spent my life making more stuff for the Plenitude," he writes, acknowledging that the Plenitude grows not only because it creates a desire for more of itself but also because it is extraordinary and pleasurable to create. Gold illustrates these creative expressions with witty cartoons. He describes "seven patterns of innovation"--including "The Big Kahuna," "Colonization" (which is illustrated by a drawing of "The real history of baseball," beginning with "Play for free in the backyard" and ending with "Pay to play interactive baseball at home"), and "Stuff Desires to Be Better Stuff" (and its corollary, "Technology Desires to Be Product"). Finally, he meditates on the Plenitude itself and its moral contradictions. How can we in good conscience accept the pleasures of creating stuff that only creates the need for more stuff? He quotes a friend: "We should be careful to make the world we actually want to live in."
Statements, dialogue, letters, epigrams, and poems by sculptor Carl Andre, a central figure in minimalism. Just as Carl Andre's sculptures are "cuts" of elemental materials, his writings are condensed expressions, "cuts" of language that emphasize the part rather than the whole. Andre, a central figure in minimalism and one of the most influential sculptors of our time, does not produce the usual critical essay. He has said that he is "not a writer of prose," and the texts included in Cuts--the most comprehensive collection of his writings yet published--appear in a wide variety of forms that are pithy and poetic rather than prosaic. Some texts are statements, many of them fifty words or less, written for catalog entries and press releases. Others are Socratic dialogues, interwoven statements, or in the form of questionnaires and interviews. Still others are letters--public and private, lengthy missives and postcards. Some are epigrams and maxims (for example, on Damien Hirst: I DON'T FEAR HIS SHARK. I FEAR HIS FORMALDEHYDE) and some are planar poems, words and letters arranged and rearranged into different patterns. They are organized alphabetically by subject, under such entries as "Art and Capitalism," "Childhood," "Entropy (After Smithson)," "Matter," "My Work," "Other Artists," and "Poetry," and they include Andre's reflections on Michelangelo and Duchamp, on Stein and Marx, and such contemporaries as Eva Hesse, Robert Smithson, Robert Morris, and Damien Hirst. Carl Andre's writing and its materiality--its stress on the visual and tactile qualities of language--takes its place beside his sculpture and its materiality, its revelation of "matter as matter rather than matter as symbol." Both assert the ethical and political primacy of matter in a culture that prizes the replica, the insubstantial, and the virtual. "I am not an idealist as an artist," says Andre. "I try to discover my visions in the conditions of the world. It's the conditions which are important."
Hannah Arendt (1906--1975) was one of the most original and interesting political thinkers of the twentieth century. In this new interpretation of her career, philosopher Richard Bernstein situates Arendt historically as an engaged Jewish intellectual and explores the range of her thinking from the perspective of her continuing confrontation with "the Jewish question." Bernstein argues that many themes that emerged in the course of Arendt's attempts to understand specifically Jewish issues shaped her thinking about politics in general and the life of the mind. By exploring pivotal events of her life story--her arrest and subsequent emigration from Germany in 1933, her precarious existence in Paris as a stateless Jew working for Zionist organizations, her internment at Gurs and her subsequent escape, and finally her flight from Europe in 1941--he shows how personal experiences and her responses to them oriented her thinking. Arendt's analysis of the Jews' lack of preparation for the vicious political anti-Semitism that arose in the last decade of the nineteenth century, Bernstein argues, led her on a quest for the ultimate meaning of politics and political responsibility. Moreover, he points out that Arendt's deepest insights about politics emerged from her reflections on statelessness and totalitarian domination. Bernstein also examines Arendt's attraction to and break with Zionism, and the reasons for her critical stance toward a Jewish sovereign state. He then turns to the issue that, in Arendt's opinion, needed most to be confronted in the aftermath of World War II: the fundamental nature of evil. He traces the nuances of her thinking from "radical evil" to "the banality of evil" and, finally, reexamines Eichmann in Jerusalem, her meditation on evil that caused a storm of protest and led some to question her loyalty to the Jewish people.
The new edition of a comprehensive, accessible, and hands-on text in historical linguistics, revised and expanded, with new material and a new layout. This accessible, hands-on textbook not only introduces students to the important topics in historical linguistics but also shows them how to apply the methods described and how to think about the issues. Abundant examples from a broad range of languages and exercises allow students to focus on how to do historical linguistics. The book is distinctive for its integration of the standard topics with others now considered important to the field, including syntactic change, grammaticalization, sociolinguistic contributions to linguistic change, distant genetic relationships, areal linguistics, and linguistic prehistory.
A comprehensive overview of important and contested issues in vaccination ethics and policy by experts from history, science, policy, law, and ethics. Vaccination has long been a familiar, highly effective form of medicine and a triumph of public health. Because vaccination is both an individual medical intervention and a central component of public health efforts, it raises a distinct set of legal and ethical issues--from debates over vaccine risks and benefits to the use of government vaccination requirements--and makes vaccine policymaking uniquely challenging. This volume examines the full range of ethical and policy issues related to the development and use of vaccines in the United States and around the world. Forty essays, articles, and reports by experts in the field look at all aspects of the vaccine life cycle. After an overview of vaccine history, they consider research and development, regulation and safety, vaccination promotion and requirements, pandemics and bioterrorism, and the frontier of vaccination. The texts cover such topics as vaccine safety controversies; the ethics of vaccine trials; vaccine injury compensation; vaccine refusal and the risks of vaccine-preventable diseases; equitable access to vaccines in emergencies; lessons from the eradication of smallpox; and possible future vaccines against cancer, malaria, and Ebola. The volume intentionally includes texts that take opposing viewpoints, offering readers a range of arguments. The book will be an essential reference for professionals, scholars, and students. Contributors: Jeffrey P. Baker, Seth Berkley, Luciana Borio, Arthur L. Caplan, R. Alta Charo, Dave A. Chokshi, James Colgrove, Katherine M. Cook, Louis Z. Cooper, Edward Cox, Douglas S. Diekema, Ezekiel J. Emanuel, Claudia I. Emerson, Geoffrey Evans, Ruth R. Faden, Chris Feudtner, David P. Fidler, Fiona Godlee, D. A. Henderson, Alan R. Hinman, Peter Hotez, Robert M. Jacobson, Aaron S. Kesselheim, Heidi J. Larson, Robert J. Levine, Donald W. Light, Adel Mahmoud, Edgar K. Marcuse, Howard Markel, Michelle M. Mello, Paul A. Offit, Saad B. Omer, Walter A. Orenstein, Gregory A. Poland, Lance E. Rodewald, Daniel A. Salmon, Anne Schuchat, Jason L. Schwartz, Peter A. Singer, Michael Specter, Alexandra Minna Stern, Jeremy Sugarman, Thomas R. Talbot, Robert Temple, Stephen P. Teret, Alan Wertheimer, Tadataka Yamada
A new analytical framework for understanding literary videogames, the literary-ludic spectrum, illustrated by close readings of selected works. In this book, Astrid Ensslin examines literary videogames--hybrid digital artifacts that have elements of both games and literature, combining the ludic and the literary. These works can be considered verbal art in the broadest sense (in that language plays a significant part in their aesthetic appeal); they draw on game mechanics; and they are digital-born, dependent on a digital medium (unlike, for example, conventional books read on e-readers). They employ narrative, dramatic, and poetic techniques in order to explore the affordances and limitations of ludic structures and processes, and they are designed to make players reflect on conventional game characteristics. Ensslin approaches these hybrid works as a new form of experimental literary art that requires novel ways of playing and reading. She proposes a systematic method for analyzing literary-ludic (L-L) texts that takes into account the analytic concerns of both literary stylistics and ludology. After establishing the theoretical underpinnings of her proposal, Ensslin introduces the L-L spectrum as an analytical framework for literary games. Based on the phenomenological distinction between deep and hyper attention, the L-L spectrum charts a work's relative emphases on reading and gameplay. Ensslin applies this analytical toolkit to close readings of selected works, moving from the predominantly literary to the primarily ludic, from online hypermedia fiction to Flash fiction to interactive fiction to poetry games to a highly designed literary "auteur" game. Finally, she considers her innovative analytical methodology in the context of contemporary ludology, media studies, and literary discourse analysis.
A detailed study of research on the psychology of expertise in weather forecasting, drawing on findings in cognitive science, meteorology, and computer science. This book argues that the human cognition system is the least understood, yet probably most important, component of forecasting accuracy. Minding the Weather investigates how people acquire massive and highly organized knowledge and develop the reasoning skills and strategies that enable them to achieve the highest levels of performance. The authors consider such topics as the forecasting workplace; atmospheric scientists' descriptions of their reasoning strategies; the nature of expertise; forecaster knowledge, perceptual skills, and reasoning; and expert systems designed to imitate forecaster reasoning. Drawing on research in cognitive science, meteorology, and computer science, the authors argue that forecasting involves an interdependence of humans and technologies. Human expertise will always be necessary.
An overview of algorithms important to computational structural biology that addresses such topics as NMR and the design and analysis of proteins. Using the tools of information technology to understand the molecular machinery of the cell offers both challenges and opportunities to computational scientists. Over the past decade, novel algorithms have been developed both for analyzing biological data and for synthetic biology problems such as protein engineering. This book explains the algorithmic foundations and computational approaches underlying areas of structural biology including NMR (nuclear magnetic resonance); X-ray crystallography; and the design and analysis of proteins, peptides, and small molecules. Each chapter offers a concise overview of important concepts, focusing on a key topic in the field. Four chapters offer a short course in algorithmic and computational issues related to NMR structural biology, giving the reader a useful toolkit with which to approach the fascinating yet thorny computational problems in this area. A recurrent theme is understanding the interplay between biophysical experiments and computational algorithms. The text emphasizes the mathematical foundations of structural biology while maintaining a balance between algorithms and a nuanced understanding of experimental data. Three emerging areas, particularly fertile ground for research students, are highlighted: NMR methodology, design of proteins and other molecules, and the modeling of protein flexibility. The next generation of computational structural biologists will need training in geometric algorithms, provably good approximation algorithms, scientific computation, and an array of techniques for handling noise and uncertainty in combinatorial geometry and computational biophysics. This book is an essential guide for young scientists on their way to research success in this exciting field.
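As a taste of the geometric computations that recur in this area, the sketch below implements the classic Kabsch superposition: finding the rotation that best aligns two conformations of the same molecule and reporting the resulting RMSD, a routine step in analyzing protein structures. This is a standard textbook algorithm offered as an illustration, not an algorithm taken from this book, and the coordinates are made up.

```python
# Illustrative sketch: optimal rigid superposition of two conformations
# (Kabsch algorithm) and the resulting RMSD.
import numpy as np

def kabsch_rmsd(P, Q):
    """RMSD between two n x 3 coordinate sets after optimally rotating P onto Q."""
    P = P - P.mean(axis=0)                   # remove translation
    Q = Q - Q.mean(axis=0)
    U, S, Vt = np.linalg.svd(P.T @ Q)        # SVD of the 3 x 3 covariance matrix
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against improper rotations
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T  # optimal rotation matrix
    diff = P @ R.T - Q
    return np.sqrt((diff ** 2).sum() / len(P))

# A toy "structure" and a rotated, slightly perturbed copy of it.
rng = np.random.default_rng(0)
P = rng.normal(size=(10, 3))
theta = 0.3
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0, 0.0, 1.0]])
Q = P @ Rz.T + 0.01 * rng.normal(size=P.shape)
print(round(kabsch_rmsd(P, Q), 3))  # small: the rotation is recovered
```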
A leading neurobiologist explores the fundamental function of dendritic spines in neural circuits by analyzing different aspects of their biology, including structure, development, motility, and plasticity. Most neurons in the brain are covered by dendritic spines, small protrusions that arise from dendrites, covering them like leaves on a tree. But a hundred and twenty years after spines were first described by Ramón y Cajal, their function is still unclear. Dozens of different functions have been proposed, from Cajal's idea that they enhance neuronal interconnectivity to hypotheses that spines serve as plasticity machines, neuroprotective devices, or even digital logic elements. In Dendritic Spines, leading neurobiologist Rafael Yuste attempts to solve the "spine problem," searching for the fundamental function of spines. He does this by examining many aspects of spine biology that have fascinated him over the years, including their structure, development, motility, plasticity, biophysical properties, and calcium compartmentalization. Yuste argues that we may never understand how the brain works without understanding the specific function of spines. In this book, he offers a synthesis of the information that has been gathered on spines (much of which comes from his own studies of the mammalian cortex), linking their function with the computational logic of the neuronal circuits that use them. He argues that once viewed from the circuit perspective, all the pieces of the spine puzzle fit together nicely into a single, overarching function. Yuste connects these two topics, integrating current knowledge of spines with that of key features of the circuits in which they operate. He concludes with a speculative chapter on the computational function of spines, searching for the ultimate logic of their existence in the brain and offering a proposal that is sure to stimulate discussions and drive future research.
The hope and hype about African digital entrepreneurship, contrasted with the reality on the ground in local ecosystems. In recent years, Africa has seen a digital entrepreneurship boom, with hundreds of millions of dollars poured into tech cities, entrepreneurship trainings, coworking spaces, innovation prizes, and investment funds. Politicians and technologists have offered Silicon Valley-influenced narratives of boundless opportunity and exponential growth, in which internet-enabled entrepreneurship allows Africa to "leapfrog" developmental stages to take a leading role in the digital revolution. This book contrasts these aspirations with empirical research about what is actually happening on the ground. The authors find that although the digital revolution has empowered local entrepreneurs, it does not untether local economies from the continent's structural legacies. Drawing on a five-year research project, the authors show how entrepreneurs creatively and productively adapt digital technologies to local markets rather than dreaming of global dominance, achieving sustainable businesses by scaling based on relationships and customizing digital platform business models for African infrastructure challenges. The authors examine African entrepreneurial ecosystems; show that African digital entrepreneurs have begun to form a new professional class, becoming part of a relatively exclusive cultural and economic elite; and discuss the impact of Silicon Valley's mythologies and expectations. Finally, they consider the implications of their findings and offer recommendations to policymakers and others.
Why dominant racial and gender groups have preferential access to jobs in computing, and how feminist labor activism in computing culture can transform the field into a force that serves democracy and social justice. Cracking the Bro Code is a bold ethnographic study of sexism and racism in contemporary computing cultures theorized through the analytical frame of the "Bro Code." Drawing from feminist anthropology and STS, Coleen Carrigan shares in this book the direct experiences of women, nonbinary individuals, and people of color, including her own experiences in tech, to show that computing has a serious cultural problem. From senior leaders in the field to undergraduates in their first year of college, participants consistently report how sexism and harassment manifest themselves in computing via values, norms, behaviors, evaluations, and policies. While other STEM fields are making strides in recruiting, retaining, and respecting women workers, computing fails year after year to do so. Carrigan connects altruism, computing, race, and gender to advance the theory that social purpose is an important factor to consider in working toward gender equity in computing. Further, she argues that transforming computing culture from hostile to welcoming has the potential to change not only who produces computing technology but also the core values of its production, with possible impacts on social applications. Cracking the Bro Code explains how digital bosses have come to operate imperiously in our society, dodging taxes and oversight, and how some programmers who look like them are enchanted with a sense of divine right. In the context of computing's powerful influence on the world, Carrigan speculates on how the cultural mechanisms sustaining sexism, harassment, and technocracy in computing workspaces impact both those harmed by such violence and society at large.
Essays that pay tribute to the wide-ranging influence of the late Herbert Simon, by friends and colleagues. Herbert Simon (1916-2001), in the course of a long and distinguished career in the social and behavioral sciences, made lasting contributions to many disciplines, including economics, psychology, computer science, and artificial intelligence. In 1978 he was awarded the Nobel Prize in economics for his research into the decision-making process within economic organizations. His well-known book The Sciences of the Artificial addresses the implications of the decision-making and problem-solving processes for the social sciences. This book (the title is a variation on the title of Simon's autobiography, Models of My Life) is a collection of short essays, all original, by colleagues from many fields who felt Simon's influence and mourn his loss. Mixing reminiscence and analysis, the book represents "a small acknowledgment of a large debt." Each of the more than forty contributors was asked to write about the one work by Simon that he or she had found most influential. The editors then grouped the essays into four sections: "Modeling Man," "Organizations and Administration," "Modeling Systems," and "Minds and Machines." The contributors include such prominent figures as Kenneth Arrow, William Baumol, William Cooper, Gerd Gigerenzer, Daniel Kahneman, David Klahr, Franco Modigliani, Paul Samuelson, and Vernon Smith. Although they consider topics as disparate as "Is Bounded Rationality Unboundedly Rational?" and "Personal Recollections from 15 Years of Monthly Meetings," each essay is a testament to the legacy of Herbert Simon--to see the unity rather than the divergences among disciplines.
How powerful new methods in nonlinear control engineering can be applied to neuroscience, from fundamental model formulation to advanced medical applications. Over the past sixty years, powerful methods of model-based control engineering have been responsible for such dramatic advances in engineering systems as autolanding aircraft, autonomous vehicles, and even weather forecasting. Over those same decades, our models of the nervous system have evolved from single-cell membranes to neuronal networks to large-scale models of the human brain. Yet until recently control theory was completely inapplicable to the types of nonlinear models being developed in neuroscience. The revolution in nonlinear control engineering in the late 1990s has made the intersection of control theory and neuroscience possible. In Neural Control Engineering, Steven Schiff seeks to bridge the two fields, examining the application of new methods in nonlinear control engineering to neuroscience. After presenting extensive material on formulating computational neuroscience models in a control environment--including some fundamentals of the algorithms helpful in crossing the divide from intuition to effective application--Schiff examines a range of applications, including brain-machine interfaces and neural stimulation. He reports on research that he and his colleagues have undertaken showing that nonlinear control theory methods can be applied to models of single cells, small neuronal networks, and large-scale networks in disease states of Parkinson's disease and epilepsy. With Neural Control Engineering the reader acquires a working knowledge of the fundamentals of control theory and computational neuroscience sufficient not only to understand the literature in this transdisciplinary area but also to begin working to advance the field. The book will serve as an essential guide for scientists in either biology or engineering and for physicians who wish to gain expertise in these areas.
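The book's starting point, placing a nonlinear neuron model inside a feedback loop, can be suggested with a toy example. The sketch below is illustrative only and is not a method from the book: it simulates a FitzHugh-Nagumo oscillator and, partway through, switches on a simple proportional controller that injects current to hold the membrane variable near a setpoint. The model parameters, gain, and setpoint are arbitrary demo values.

```python
# Illustrative sketch (not from the book): a FitzHugh-Nagumo neuron model
# placed under simple proportional feedback control.
import numpy as np

def fitzhugh_nagumo(state, I_ext, a=0.7, b=0.8, tau=12.5):
    """Reduced two-variable neuron model: v is voltage-like, w is recovery."""
    v, w = state
    dv = v - v ** 3 / 3 - w + I_ext
    dw = (v + a - b * w) / tau
    return np.array([dv, dw])

dt, steps = 0.01, 60000            # forward-Euler integration over 600 time units
state = np.array([-1.0, 1.0])
v_target, gain = -1.2, 2.0         # arbitrary setpoint and feedback gain
trace = []
for k in range(steps):
    I_drive = 0.5                  # constant drive puts the model in its spiking regime
    # After t = 300, a proportional controller opposes deviations from the setpoint.
    I_control = gain * (v_target - state[0]) if k * dt > 300.0 else 0.0
    state = state + dt * fitzhugh_nagumo(state, I_drive + I_control)
    trace.append(state[0])

# Before t = 300 the trace oscillates; afterward it settles near the setpoint.
print(f"v at t=250: {trace[25000]:+.2f}   v at t=600: {trace[-1]:+.2f}")
```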
An account of the neurobiology of motor recovery in the arm and hand after stroke by two experts in the field. Stroke is a leading cause of disability in adults and recovery is often difficult, with existing rehabilitation therapies largely ineffective. In Broken Movement, John Krakauer and S. Thomas Carmichael, both experts in the field, provide an account of the neurobiology of motor recovery in the arm and hand after stroke. They cover topics that range from behavior to physiology to cellular and molecular biology. Broken Movement is the only accessible single-volume work that covers motor control and motor learning as they apply to stroke recovery and combines them with motor cortical physiology and molecular biology. The authors cast a critical eye at current frameworks and practices, offer new recommendations for promoting recovery, and propose new research directions for the study of brain repair. Krakauer and Carmichael discuss such subjects as the behavioral phenotype of hand and arm paresis in human and non-human primates; the physiology and anatomy of the motor system after stroke; mechanisms of spontaneous recovery; the time course of early recovery; the challenges of chronic stroke; and pharmacological and stem cell therapies. They argue for a new approach in which patients are subjected to higher doses and intensities of rehabilitation in a more dynamic and enriching environment early after stroke. Finally, they review the potential of four areas to improve motor recovery: video gaming and virtual reality, invasive brain stimulation, re-opening the sensitive period after stroke, and the application of precision medicine.