The most recent volume in the Drinking Water and Health series contains the results of a two-part study on the toxicity of drinking water contaminants. The first part examines current practices in risk assessment, identifies new noncancerous toxic responses to chemicals found in drinking water, and discusses the use of pharmacokinetic data to estimate the delivered dose and response. The second part of the book provides risk assessments for 14 specific compounds, 9 presented here for the first time.
After volume 1, each volume's title page names a different panel at the beginning of its author statement.
The polygraph, often portrayed as a magic mind-reading machine, is still controversial among experts, who continue heated debates about its validity as a lie-detecting device. As the nation takes a fresh look at ways to enhance its security, can the polygraph be considered a useful tool? The Polygraph and Lie Detection puts the polygraph itself to the test, reviewing and analyzing data about its use in criminal investigation, employment screening, and counterintelligence. The book looks at the theory of how the polygraph works and evidence about how deceptiveness--and other psychological conditions--affect the physiological responses that the polygraph measures; empirical evidence on the performance of the polygraph and the success of subjects' countermeasures; and the actual use of the polygraph in the arena of national security, including its role in deterring threats to security. The book addresses the difficulties of measuring polygraph accuracy and the usefulness of the technique for aiding interrogation and for deterrence, and it includes potential alternatives, such as voice-stress analysis and brain measurement techniques.
The Small Business Innovation Research (SBIR) program is one of the largest examples of U.S. public-private partnerships. Founded in 1982, SBIR was designed to encourage small business to develop new processes and products and to provide quality research in support of the many missions of the U.S. government, including health, energy, the environment, and national defense. In response to a request from the U.S. Congress, the National Research Council assessed SBIR as administered by the five federal agencies that together make up 96 percent of program expenditures. This book, one of six in the series, reports on the SBIR program at the National Science Foundation. The study finds that the SBIR program is sound in concept and effective in practice, but that it can also be improved. Currently, the program is delivering results that meet most of the congressional objectives, including stimulating technological innovation, increasing private-sector commercialization of innovations, using small businesses to meet federal research and development needs, and fostering participation by minority and disadvantaged persons. The book suggests ways in which the program can improve operations, continue to increase private-sector commercialization, and improve participation by women and minorities.
An NRC committee was established to work with a Russian counterpart group in conducting a workshop in Moscow on the effectiveness of Russian environmental NGOs in environmental decision-making. This volume presents the proceedings of that workshop, highlighting the successes and difficulties faced by NGOs in Russia and the United States.
The Committee on Developing a Federal Materials Facilities Strategy was appointed by the National Research Council (NRC) in response to a request by the federal agencies involved in funding and operating multidisciplinary user facilities for research with synchrotron radiation, neutrons, and high magnetic fields. Starting in August 1996, a series of conversations and meetings was held among NRC staff and officials from the National Science Foundation, the Department of Energy, the National Institute of Standards and Technology (Department of Commerce), and the National Institutes of Health. The agencies were concerned that facilities originally developed to support research in materials science were increasingly used by scientists from other fields--particularly the biological sciences--whose research was supported by agencies other than those responsible for the facilities. This trend, together with the introduction of several new, large user facilities in the last decade, led the agencies to seek advice on the possible need for interagency cooperation in the management of these federal research facilities.
Since its discovery in 1610, Europa--one of Jupiter's four large moons--has been an object of interest to astronomers and planetary scientists. Much of this interest stems from observations made by NASA's Voyager and Galileo spacecraft and from Earth-based telescopes indicating that Europa's surface is quite young, with very little evidence of cratering, and made principally of water ice. More recently, theoretical models of the jovian system and Europa have suggested that tidal heating may have resulted in the existence of liquid water, and perhaps an ocean, beneath Europa's surface. NASA's ongoing Galileo mission has profoundly expanded our understanding of Europa and the dynamics of the jovian system, and may allow us to constrain theoretical models of Europa's subsurface structure. Meanwhile, since the time of the Voyagers, there has been a revolution in our understanding of the limits of life on Earth. Life has been detected thriving in environments previously thought to be untenable--around hydrothermal vent systems on the seafloor, deep underground in basaltic rocks, and within polar ice. Elsewhere in the solar system, including on Europa, environments thought to be compatible with life as we know it on Earth are now considered possible, or even probable. Spacecraft missions are being planned that may be capable of confirming the existence of such environments. Against this background, the Space Studies Board charged its Committee on Planetary and Lunar Exploration (COMPLEX) to perform a comprehensive study to assess current knowledge about Europa, outline a strategy for future spacecraft missions to Europa, and identify opportunities for complementary Earth-based studies of Europa. (See the preface for a full statement of the charge.)
In this study, the committee explores ways the National Weather Service (NWS) can take advantage of continuing advances in science and technology to meet the challenges of the future. The predictions are focused on the target year 2025. Because specific predictions about the state of science and technology or the NWS more than 25 years in the future will not be entirely accurate, the goal of this report is to identify and highlight trends that are most likely to influence change. The Panel on the Road Map for the Future National Weather Service developed an optimistic vision for 2025 based on advances in science and technology.
This study assesses the potential of new technology to reduce logistics support requirements for future Army combat systems. It describes and recommends areas of research and technology development in which the Army should invest now to field systems that will reduce logistics burdens and provide desired capabilities for an "Army After Next (AAN) battle force" in 2025.
Losses of life and property from natural disasters in the United States--and throughout the world--have been enormous, and the potential for substantially greater future losses looms. It is clearly in the public interest to reduce these impacts and to encourage the development of communities that are resilient to disasters. This goal can be achieved through wise and sustained efforts involving mitigation, preparedness, response, and recovery. Implementing such efforts, particularly in the face of limited resources and competing priorities, requires accurate information that is presented in a timely and appropriate manner to facilitate informed decisions. Substantial information already exists that could be used to this end, but there are numerous obstacles to accessing this information, and methods for integrating information from a variety of sources for decision-making are presently inadequate. Implementation of an improved national or international network for making better information available in a more timely manner could substantially improve the situation. As noted in the Preface, a federal transition team is considering the issues and needs associated with implementing a global or national disaster information network as described in the report by the Disaster Information Task Force (1997). This National Research Council report was commissioned by the transition team to provide advice on how a disaster information network could best make information available to improve decision making, with the ultimate goal of reducing losses from natural disasters. The report is intended to provide the basis for a better appreciation of which types of data and information should be generated in an information program and how this information could best be disseminated to decision makers.
In May 1998 the National Institutes of Health asked the National Academy of Sciences/National Research Council to assemble a group of experts to examine the scientific literature relevant to work-related musculoskeletal disorders of the lower back, neck, and upper extremities. A steering committee was convened to design a workshop, to identify leading researchers on the topic to participate, and to prepare a report based on the workshop discussions and their own expertise. In addition, the steering committee was asked to address, to the extent possible, a set of seven questions posed by Congressman Robert Livingston on the topic of work-related musculoskeletal disorders. The steering committee includes experts in orthopedic surgery, occupational medicine, epidemiology, ergonomics, human factors, statistics, and risk analysis. This document is based on the evidence presented and discussed at the two-day Workshop on Work-Related Musculoskeletal Injuries: Examining the Research Base, which was held on August 21 and 22, 1998, and on follow-up deliberations of the steering committee, reflecting its own expertise.
Manufacturing process controls include all systems and software that exert control over production processes. Control systems include process sensors, data processing equipment, actuators, networks to connect equipment, and algorithms to relate process variables to product attributes. Since 1995, the U.S. Department of Energy Office of Industrial Technology's (OIT) program management strategy has reflected its commitment to increasing and documenting the commercial impact of OIT programs. OIT's management strategy for research and development has been in transition from a "technology push" strategy to a "market pull" strategy based on the needs of seven energy- and waste-intensive industries--steel, forest products, glass, metal casting, aluminum, chemicals, and petroleum refining. These industries, designated as Industries of the Future (IOF), are the focus of OIT programs. In 1997, agriculture, specifically renewable bioproducts, was added to the IOF group. The National Research Council Panel on Manufacturing Process Controls is part of the Committee on Industrial Technology Assessments (CITA), which was established to evaluate the OIT program strategy, to provide guidance during the transition to the new IOF strategy, and to assess the effects of the change in program strategy on cross-cutting technology programs, that is, technologies applicable to several of the IOF industries. The panel was established to identify key processes and needs for improved manufacturing control technology, especially the needs common to several IOF industries; identify specific research opportunities for addressing these common industry needs; suggest criteria for identifying and prioritizing research and development (R&D) to improve manufacturing controls technologies; and recommend means for implementing advances in control technologies.
The primary purpose of systems engineering is to organize information and knowledge to assist those who manage, direct, and control the planning, development, production, and operation of the systems necessary to accomplish a given mission. However, this purpose can be compromised or defeated if information production and organization becomes an end unto itself. Systems engineering was developed to help resolve the engineering problems that are encountered when attempting to develop and implement large and complex engineering projects. It depends upon integrated program planning and development, disciplined and consistent allocation and control of design and development requirements and functions, and systems analysis. The key thesis of this report is that proper application of systems analysis and systems engineering will improve the management of tank wastes at the Hanford Site significantly, thereby leading to reduced life cycle costs for remediation and more effective risk reduction. The committee recognizes that evidence for cost savings from application of systems engineering has not yet been demonstrated.
No reliable acute-exposure standards have been established for the particular purpose of protecting soldiers from toxic exposures to chemical warfare (CW) agents. Some human-toxicity estimates are available for the most common CW agents--organophosphorus nerve agents and vesicants; however, most of those estimates were developed for offensive purposes (that is, to kill or incapacitate the enemy) and were intended to be interim values only. Because of the possibility of a chemical attack by a foreign power, the Army's Office of the Surgeon General asked the Army's Chemical Defense Equipment Process Action Team (CDEPAT) to review the toxicity data for the nerve agents GA (tabun), GB (sarin), GD (soman), GF, and VX, and the vesicant agent sulfur mustard (HD) and to establish a set of exposure limits that would be useful in protecting soldiers from toxic exposures to those agents. This report is an independent review of the CDEPAT report to determine the scientific validity of the proposed estimates.
In spite of its high cost and technical importance, plasma equipment is still largely designed empirically, with little help from computer simulation. Plasma process control is rudimentary. Optimization of plasma reactor operation, including adjustments to deal with increasingly stringent controls on plant emissions, is performed predominantly by trial and error. There is now a strong and growing economic incentive to improve on the traditional methods of plasma reactor and process design, optimization, and control. An obvious strategy for both chip manufacturers and plasma equipment suppliers is to employ large-scale modeling and simulation. The major roadblock to further development of this promising strategy is the lack of a database for the many physical and chemical processes that occur in the plasma. The data that are currently available are often scattered throughout the scientific literature, and assessments of their reliability are usually unavailable. Database Needs for Modeling and Simulation of Plasma Processing identifies strategies to add data to the existing database, to improve access to the database, and to assess the reliability of the available data. In addition to identifying the most important needs, this report assesses the experimental and theoretical/computational techniques that can be used, or must be developed, in order to begin to satisfy these needs.