Requirements for Annotations (page 4)




participatory economics (a recent proposal for a new economic system).

An economic system can be considered a part of the social system and hierarchically equal to the law system, political system, cultural system, etc.

History of supply and demand

Attempts to determine how supply and demand interact began with Adam Smith's The Wealth of Nations, first published in 1776. In this book, he mostly assumed that the supply price was fixed but that demand would increase or decrease as the price fell or rose. In 1817 David Ricardo published Principles of Political Economy and Taxation, in which the first idea of an economic model was proposed. In it, he more rigorously laid out the assumptions used to build his ideas of supply and demand.

During the late 19th century the marginalist school of thought emerged. This field was largely founded by William Stanley Jevons, Carl Menger, and Léon Walras. The key idea was that price is set at the margin, that is, by the last and most costly unit. This was a substantial change from Adam Smith's thoughts on determining the supply price.

Finally, most of the basics of the modern theory of supply and demand were put in place by Alfred Marshall and Léon Walras, who combined the ideas about supply with the ideas about demand and began looking at the equilibrium point where the two curves cross. They also began looking at the effect of markets on each other. Since the late 19th century, the theory of supply and demand has remained largely unchanged; most of the work has gone into examining exceptions to the model (such as oligopoly, transaction costs, and non-rationality).
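Marshall's equilibrium point, where the supply and demand curves cross, can be illustrated with a tiny numeric sketch. The linear curves and their coefficients below are invented for illustration, not taken from any source:

```python
# Hypothetical linear curves:
# demand: Qd = 100 - 2P,  supply: Qs = -20 + 4P
def demand(p):
    return 100 - 2 * p

def supply(p):
    return -20 + 4 * p

# The curves cross where Qd == Qs:  100 - 2P = -20 + 4P  =>  P = 120 / 6
equilibrium_price = (100 + 20) / (2 + 4)
equilibrium_qty = demand(equilibrium_price)

print(equilibrium_price)  # 20.0
print(equilibrium_qty)    # 60.0
```

At any other price the two quantities differ, producing the surplus or shortage that pushes the market back toward the crossing point.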

Criticism of Marshall's theory of supply and demand

Marshall's theory of supply and demand runs counter to the ideas of economists from Adam Smith and David Ricardo up through the creation of the marginalist school of thought. Although Marshall's theories are dominant in elite universities today, not everyone has taken the fork in the road that he and the marginalists proposed. One counter-theory holds that a commodity's price is already known before it reaches the market, negating the idea that some abstract market conveys price information. The only thing the market communicates is whether or not an object is exchangeable (in which case it changes from an object into a commodity). This would mean that producers create goods blindly, without already having customers, hoping that someone will buy them ("buy" meaning exchange money for the commodities). Modern producers often have market studies prepared well in advance of production decisions; even so, misallocation of the factors of production can still occur.

Keynesian economics also runs counter to the theory of supply and demand. In Keynesian theory, prices can become "sticky", or resistant to change, especially in the case of price decreases. This leads to a market failure. Modern supporters of Keynes, such as Paul Krugman, have noted this in recent history, such as when the Boston housing market dried up in the early 1990s, with neither buyers nor sellers willing to exchange at the price equilibrium.

Gregory Mankiw's work on the irrationality of actors in the markets also undermines Marshall's simplistic view of the forces involved in supply and demand.

Engineering

 

Energy and the environment

Many of the most serious environmental problems of the technological nations result from the use of energy. Every form of energy production is known to cause some damage to the surroundings. A large part of urban air pollution is probably caused by emissions from internal combustion engines. Other forms of urban air pollution result from the combustion of coal and low-grade oil in steam electric plants or central heating plants.

Hydroelectric plants are considered to cause serious environmental problems as well. One major problem is the enormous weight of the water that fills the lake behind the dam once it is constructed. The added weight places severe stresses on the geological formation, causing earthquakes in the area. The most severe such earthquake, measuring 6.5 on the Richter scale, happened as the lake behind the Koyna dam in India was being filled.

Perhaps the most tragic problem created by the Aswan High Dam on the Nile River is the increase in disease. The still waters behind the dam provide a fertile breeding ground for disease-carrying insects.

Another form of environmental degradation common to electric power generation is thermal pollution: the dumping of waste heat into streams of water or the atmosphere. The warmed water mixes rather quickly with the water of a lake, which has a harmful effect upon the lake's ecological balance.

In order to obtain enormous amounts of energy we are building powerful atomic electric stations, which open up fine prospects for the atomic power industry. However, nuclear plants are capable of polluting the environment with radioactive atoms of various elements. Moreover, nuclear reactors of the types now being built will not be widely used as a source of energy because of the scarcity of the isotope "U" which is used as fuel.

The largest potential source of nuclear energy is thermonuclear fusion, by which the nuclei of small atoms are combined to form larger nuclei. However, these power plants also contaminate the environment with radioactive elements that are released when the fuel is burnt.

Ecology

In the years since the Chernobyl NPP accident in 1986, specialized research and design documents have concentrated most of their attention on safety and on the ecological aspects of Minatom's activities. Federal and departmental standards are followed more strictly. IAEA recommendations and forecasts are taken into consideration in research and design documents. Economic assessment and public review of projects are widely applied.

Russia, like the majority of the leading nuclear countries, has initiated a program to develop a closed nuclear fuel cycle.

In the future this will make it possible to reduce uranium mining by half, to introduce a new power source, plutonium, into the fuel cycle, and to reach fuel burn-up of over 60% in both thermal and fast neutron reactors.

At the present phase of scientific and technical development, and in the future, the nuclear power industry of the Russian Federation has to tackle two basic problems:

1) safety improvement of nuclear installations;

2) assurance of safe management of spent nuclear fuel (SNF) and of radioactive waste (RW) (storage, transportation, treatment, utilization, disposal).

Russia now has procedures in place to ensure the safety of personnel, the environment, and the population. These procedures are based on vast accumulated experience in reprocessing spent fuel from Russian and foreign WWER-440 reactors and from nuclear-powered submarines.

In terms of procedures and techniques, engineering approaches, and ecological safety, present-day Russian methods of spent nuclear fuel management are fully competitive with the world's best.

Half a century of experience with these procedures and techniques for spent nuclear fuel management confirms their high level of safety.

The Russian Minatom Department for Safety, Ecology and Emergencies and Administration for Ecological and Decommissioning Problems were established to ensure safety of the industry enterprises and installations and to tackle ecological problems.

The nuclear power industry compares favorably in ecological safety with other branches of the power industry. For example, the total amount of NPP nuclear waste is hundreds of thousands of times smaller than the amount of waste resulting from the burning of organic fuel: NPPs produce one ton of waste per 10 billion kWh generated.

During 1998 the Russian NPPs generated 108 billion kWh. About 50 million tons of hard coal (740,000 freight cars) or 25 million tons of mazut (350,000 tank cars) would have to be burnt to generate the same amount of electricity at thermal power plants. Combustion of that amount of organic fuel would consume 100 million tons of oxygen and release into the atmosphere over a hundred million tons of carbon, nitrogen, and sulfur oxides, tens of millions of tons of ash, and over 160 Ci of natural radionuclides.
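The fuel-equivalence figures above can be sanity-checked with simple arithmetic. The per-car and per-tank capacities below are implied by the text's own numbers rather than stated in it:

```python
# Figures from the text
kwh_generated = 108e9      # Russian NPP output in 1998, kWh
coal_tons = 50e6           # hard coal needed for the same output
coal_cars = 740_000        # freight cars of coal
mazut_tons = 25e6
mazut_tanks = 350_000      # tank cars of mazut

# Implied capacity per car/tank (consistent with ~60-70 t rail cars)
tons_per_coal_car = coal_tons / coal_cars   # ~67.6 t per freight car
tons_per_tank = mazut_tons / mazut_tanks    # ~71.4 t per tank car

# Waste figure: one ton of NPP waste per 10 billion kWh generated
npp_waste_tons = kwh_generated / 10e9       # ~10.8 t for all of 1998

print(round(tons_per_coal_car, 1), round(tons_per_tank, 1), round(npp_waste_tons, 1))
```

The implied car capacities fall in a plausible range for rail freight, so the text's numbers are internally consistent.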

Russian scientists and engineers from Minatom have elaborated a procedure for calculating maximum permissible releases and effluents of natural and artificial radionuclides and of chemicals, meeting world standards and IAEA requirements.

At all operating NPPs it is prohibited to discharge effluents in which even one radionuclide exceeds the permissible value specified in the standard certificate. Ecological standard certificates, included among the basic documents confirming the ecological safety of an installation, have been prepared, coordinated with the regional environmental supervisory bodies, and issued for all NPPs in the Russian Federation.

Designs of new-generation NPPs include a number of new engineering approaches that reduce releases and effluents, mitigating their environmental impact by a factor of 10-20.

The analysis of the Russian Federation State Inventory and Examination data on the effluents, releases and radioactive waste and the State Statistic Reports confirms the ecological stability of nuclear fuel cycle enterprises, the efficiency of shielding and protection barriers and of treatment systems.

For example, at the RT-2 fuel reprocessing plant, provision is made for multiple-stage treatment of releases to remove aerosols, nitrogen oxides, and iodine-129, protecting the atmosphere against radionuclides.

The following systems should be available at nuclear fuel cycle enterprises: computer-aided monitoring and control systems for releases and effluents of radionuclides and of harmful and toxic chemicals, and computer-aided monitoring and control systems to ensure personnel and population radiation safety. These systems should be located at plant sites, in surveillance and supervised areas, as well as in potential contamination areas that could be affected in case of emergency.

The nuclear power industry offers major ecological advantages over the other power industry branches once the normal operation and the reliable confining of radioactive waste are ensured.

Comparison of various power generation processes demonstrates that nuclear techniques have the lowest carbon release per unit of generated electric power. Burning the available resources of organic fuel may result in a threefold increase of carbon concentration in the atmosphere.

 

Wind Energy

It is hard to imagine an energy source more benign to the environment than wind power; it produces no air or water pollution, involves no toxic or hazardous substances (other than those commonly found in large machines), and poses no threat to public safety. And yet a serious obstacle facing the wind industry is public opposition reflecting concern over the visibility and noise of wind turbines, and their impacts on wilderness areas.

One of the most misunderstood aspects of wind power is its use of land. Most studies assume that wind turbines will be spaced a certain distance apart and that all of the land in between should be regarded as occupied. This leads to some quite disturbing estimates of the land area required to produce substantial quantities of wind power. According to one widely circulated report from the 1970s, generating 20 percent of US electricity from windy areas in 1975 would have required siting turbines on 18,000 square miles, or an area about 7 percent the size of Texas.
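The report's Texas comparison can be checked directly. The area figure for Texas below is a standard approximate value, not taken from the text:

```python
# Check the 1970s report's claim: 18,000 square miles is about
# 7 percent of the area of Texas.
texas_area_sq_mi = 268_596   # approximate total area of Texas
wind_area_sq_mi = 18_000     # land the report says turbines would occupy

fraction = wind_area_sq_mi / texas_area_sq_mi
print(round(fraction * 100, 1))  # ~6.7, i.e. roughly 7 percent
```

The arithmetic holds; the misleading part, as the next paragraph explains, is treating all of that spacing area as "occupied" land.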

In reality, however, the wind turbines themselves occupy only a small fraction of this land area, and the rest can be used for other purposes or left in its natural state. For this reason, wind power development is ideally suited to farming areas. In Europe, farmers plant right up to the base of turbine towers, while in California cows can be seen peacefully grazing in their shadow. The leasing of land for wind turbines, far from interfering with farm operations, can bring substantial benefits to landowners in the form of increased income and land values. Perhaps the greatest potential for wind power development is consequently in the Great Plains, where wind is plentiful and vast stretches of farmland could support hundreds of thousands of wind turbines.

In other settings, however, wind power development can create serious land-use conflicts. In forested areas it may mean clearing trees and cutting roads, a prospect that is sure to generate controversy, except possibly in areas where heavy logging has already occurred. And near populated areas, wind projects often run into stiff opposition from people who regard them as unsightly and noisy, or who fear their presence may reduce property values.

In California, bird deaths from electrocution or collisions with spinning rotors have emerged as a problem at the Altamont Pass wind "farm", where more than 30 threatened golden eagles and 75 other raptors such as red-tailed hawks died or were injured during a three-year period. Studies under way to determine the cause of these deaths and find preventive measures may have an important impact on the public image and rate of growth of the wind industry. In appropriate areas, and with imagination, careful planning, and early contacts between the wind industry, environmental groups, and affected communities, siting and environmental problems should not be insurmountable.

 

Solar Energy

Since solar power systems generate no air pollution during operation, the primary environmental, health, and safety issues involve how they are manufactured, installed, and ultimately disposed of. Energy is required to manufacture and install solar components, and any fossil fuels used for this purpose will generate emissions. Thus, an important question is how much fossil energy input is required for solar systems compared to the fossil energy consumed by comparable conventional energy systems. Although this varies depending upon the technology and climate, the energy balance is generally favorable to solar systems in applications where they are cost effective, and it is improving with each successive generation of technology. According to some studies, for example, solar water heaters increase the amount of hot water generated per unit of fossil energy invested by at least a factor of two compared to natural gas water heating and by at least a factor of eight compared to electric water heating.

Materials used in some solar systems can create health and safety hazards for workers and anyone else coming into contact with them. In particular, the manufacturing of photovoltaic cells often requires hazardous materials such as arsenic and cadmium. Even relatively inert silicon, a major material used in solar cells, can be hazardous to workers if it is breathed in as dust. Workers involved in manufacturing photovoltaic modules and components must consequently be protected from exposure to these materials. There is an additional, probably very small, danger that hazardous fumes released from photovoltaic modules attached to burning homes or buildings could injure fire fighters.

None of these potential hazards is much different in quality or magnitude from the innumerable hazards people face routinely in an industrial society. Through effective regulation, the dangers can very likely be kept at a very low level.

The large amount of land required for utility-scale solar power plants, approximately one square kilometer for every 20-60 megawatts (MW) generated, poses an additional problem, especially where wildlife protection is a concern. But this problem is not unique to solar power plants. Generating electricity from coal actually requires as much or more land per unit of energy delivered if the land used in strip mining is taken into account. Solar-thermal plants (like most conventional power plants) also require cooling water, which may be costly or scarce in desert areas.
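The stated land requirement implies the following footprint for a plant of a given capacity. The 1,000 MW plant size below is a hypothetical example chosen for illustration:

```python
# Implied land footprint from the text's figure:
# roughly 1 km^2 per 20-60 MW of generating capacity.
km2_per_mw_best = 1 / 60    # best case: 60 MW fit in 1 km^2
km2_per_mw_worst = 1 / 20   # worst case: only 20 MW fit in 1 km^2

plant_mw = 1000             # hypothetical utility-scale plant, 1 GW

low = plant_mw * km2_per_mw_best
high = plant_mw * km2_per_mw_worst
print(round(low, 1), round(high, 1))  # ~16.7 to 50.0 km^2
```

So a single gigawatt-class solar plant would occupy on the order of tens of square kilometers, which is why the comparison with land consumed by coal strip mining matters.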

Large central power plants are not the only option for generating energy from sunlight, however, and are probably among the least promising. Because sunlight is dispersed, small-scale, dispersed applications are a better match to the resource. They can take advantage of unused space on the roofs of homes and buildings and in urban and industrial lots. And, in solar building designs, the structure itself acts as the collector, so there is no need for any additional space at all.

 

Air Pollution

Inevitably, the combustion of biomass produces air pollutants, including carbon monoxide, nitrogen oxides, and particulates such as soot and ash. The amount of pollution emitted per unit of energy generated varies widely by technology, with wood-burning stoves and fireplaces generally the worst offenders. Modern, enclosed fireplaces and wood stoves pollute much less than traditional, open fireplaces for the simple reason that they are more efficient. Specialized pollution control devices such as electrostatic precipitators (to remove particulates) are available, but without specific regulation to enforce their use it is doubtful they will catch on.

Emissions from conventional biomass-fueled power plants are generally similar to emissions from coal-fired power plants, with the notable difference that biomass facilities produce very little sulfur dioxide or toxic metals (cadmium, mercury, and others). The most serious problem is their particulate emissions, which must be controlled with special devices. More advanced technologies, such as the whole-tree burner (which has three successive combustion stages) and the gasifier/combustion turbine combination, should generate much lower emissions, perhaps comparable to those of power plants fueled by natural gas.

Facilities that burn raw municipal waste present a unique pollution-control problem. This waste often contains toxic metals, chlorinated compounds, and plastics, which generate harmful emissions. Since this problem is much less severe in facilities burning refuse-derived fuel (RDF), that is, pelletized or shredded paper and other waste with most inorganic material removed, most waste-to-energy plants built in the future are likely to use this fuel. Co-firing RDF in coal-fired power plants may provide an inexpensive way to reduce coal emissions without having to build new power plants.

Using biomass-derived methanol and ethanol as vehicle fuels, instead of conventional gasoline, could substantially reduce some types of pollution from automobiles. Both methanol and ethanol evaporate more slowly than gasoline, thus helping to reduce evaporative emissions of volatile organic compounds (VOCs), which react with heat and sunlight to generate ground level ozone (a component of smog). According to Environmental Protection Agency estimates, in cars specifically designed to burn pure methanol or ethanol, VOC emissions from the tailpipe could be reduced 85 to 95 percent, while carbon monoxide emissions could be reduced 30 to 90 percent. However, emissions of nitrogen oxides, a source of acid precipitation, would not change significantly compared to gasoline-powered vehicles.

Some studies have indicated that the use of fuel alcohol increases emissions of formaldehyde and other aldehydes, compounds identified as potential carcinogens. Others counter that these results consider only tailpipe emissions, whereas VOCs, another significant pathway of aldehyde formation, are much lower in alcohol-burning vehicles. On balance, methanol vehicles would therefore decrease ozone levels. Overall, however, alcohol-fueled cars will not solve air pollution problems in dense urban areas, where electric cars or fuel cells represent better solutions.

Ecological science and sustainability for the 21st century

Ecological science has contributed greatly to our understanding of the natural world and the impact of humans on that world. Now, we need to refocus the discipline towards research that ensures a future in which natural systems and the humans they include coexist on a more sustainable planet. Acknowledging that managed ecosystems and intensive exploitation of resources define our future, ecologists must play a greatly expanded role in communicating their research and influencing policy and decisions that affect the environment. To accomplish this, they will have to forge partnerships at scales and in forms they have not traditionally used. These alliances must act within three visionary areas: enhancing the extent to which decisions are ecologically informed; advancing innovative ecological research directed at the sustainability of the planet; and stimulating cultural changes within the science itself, thereby building a forward-looking and international ecology. We recommend: (1) a research initiative to enhance research project development, facilitate large-scale experiments and data collection, and link science to solutions; (2) procedures that will improve interactions among researchers, managers, and decision makers; and (3) efforts to build public understanding of the links between ecosystem services and humans.

In the fall of 2002, the Ecological Society of America (ESA) established a committee to develop an action plan for bolstering the research capabilities and impact of the ecological sciences. After much work and with substantial input from many people within and beyond the Society, the committee reported to the ESA Governing Board in April 2004. This article is a brief summary of recommended actions that must be taken, by members of the scientific community and others, to produce the knowledge, discoveries, and forms of communication that will ensure that ecology effectively informs decisions that influence environmental sustainability globally.

For much of the past century, ecologists have enhanced our understanding of nature by focusing on the least disturbed ecosystems on earth. This has generated tremendous insights into complex ecological interactions and has positioned ecologists to focus on the impacts of humans on the planet. A more recent body of research treats humans as one of many components of ecosystems: humans are seen not only as exploiters of ecosystem services, but as agents of change who are themselves influenced by this change. This makes sense because all organisms modify the environment in which they live; certainly humans differ in the extent to which they transform their surroundings, but they also have the ability to forecast and modify their behaviors in anticipation of tomorrow's changes.

Within the discipline of ecology, our thinking has thus evolved from a focus on humans as intruders on the natural world to humans as part of the natural world. Now, however, ecologists must go further and focus on how humans can live sustainably within the natural world. We do not deny the devastating impacts humans have had on the earth; indeed, these impacts are present in the air we breathe, the water we drink, and the land that we depend on for food and habitat. Instead, we assert that because excessive exploitation of natural resources and over-population are realities, ecologists must put massive efforts into science for, not about, a crowded planet. Current projections are that 8-11 billion people will live on earth by the end of this century. Ecologists therefore have little time to waste.

What do we mean by ecological science for a crowded planet? We mean a science in which the players are actively engaged with the public and policy makers. We mean an anticipatory science of discovery that effectively informs decisions and, by so doing, moves us closer to a sustainable world, a world in which population needs are met while still maintaining the planet's life support systems. Developing such a science will require a bold, proactive agenda based on four tenets. First, our future environment will consist largely of human-dominated ecosystems that are managed intentionally or inadvertently. Second, the scientific path to a more sustainable future involves some combination of conserved, restored, and invented ecosystems. Third, ecological science must be a critical component of the decision-making process that influences our planet's sustainability. Fourth, unprecedented regional and global partnerships between scientists, governments, corporations, and the public must be developed to advance the science and to ensure it is used effectively.

The desire to secure a sustainable future and to develop the supporting science is widespread, and has been an explicit goal of the ESA for over a decade. Nevertheless, what is needed to achieve sustainability remains ambiguous, and no clear plan exists for making progress. The global ecological science community must confront and embrace its responsibility and unique role in this endeavor by engaging much more actively with the public and policy makers, and by refocusing the discipline on questions that explicitly address how to sustain nature's services in the midst of burgeoning human populations.

Thus we emphasize the need for ecological sustainability, that is, sustainability achieved using the breadth and depth of ecological knowledge. It is focused on meeting human needs while conserving the earth's life support systems. Although the problems facing humankind in the coming century will not be solved by science alone, the knowledge and collaborative approaches developed by ecological scientists can make important contributions to creating a more sustainable future.

 

Security engineering

System security engineering is concerned with identifying security risks, requirements, and recovery strategies. It involves well-defined processes through which designers develop security mechanisms, rather than throwing random countermeasures at the design and hoping to achieve security nirvana. Ideally, security engineering should be incorporated into the system design process as early as possible, from the initial architecture specification if possible. The earlier security concerns are addressed, the less time-consuming and costly it is to fix security problems later. Despite this well-known fact, it is often the case that software engineers find themselves needing to retrofit security into an existing system. In either case, the security engineering process may be applied in a similar manner.

Threat modeling involves understanding the complexity of the system and identifying all possible threats to the system, regardless of whether or not they can be exploited. In risk management the threats are analyzed based on their criticality and likelihood, and a decision is made whether to mitigate the threat or accept the risk associated with it. Security requirements specify what the system will do to mitigate the critical threats identified in the previous stages. Once system designers determine what security mechanisms must be available to the system, the development of those mechanisms follows the general software engineering cycle of design, implementation, testing, and maintenance.
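The risk-management step described above (scoring each threat by criticality and likelihood, then deciding whether to mitigate or accept it) can be sketched as follows. The threat names, the 1-5 scoring scales, and the threshold are all invented for illustration:

```python
# Minimal risk-triage sketch: score = criticality x likelihood,
# and threats scoring above a chosen threshold are marked for mitigation.
threats = [
    {"name": "SQL injection",     "criticality": 5, "likelihood": 4},
    {"name": "physical theft",    "criticality": 4, "likelihood": 1},
    {"name": "session hijacking", "criticality": 3, "likelihood": 3},
]

MITIGATE_THRESHOLD = 8  # risks scored at or below this are accepted

def triage(threat):
    score = threat["criticality"] * threat["likelihood"]
    decision = "mitigate" if score > MITIGATE_THRESHOLD else "accept"
    return threat["name"], score, decision

for t in threats:
    print(triage(t))
```

The threats marked "mitigate" then feed the security requirements stage, which specifies what the system will do about each of them.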

Code red(s)

On July 19, 2001, the Code Red virus infected more than 20,000 systems within 10 minutes and more than 250,000 systems in just under 9 hours. An estimated 975,000 infections occurred worldwide before Code Red subsided. Code Red and Code Red II disrupted both government and business operations, principally by slowing Internet service and forcing some organizations to disconnect from the Internet.

Code Red (named after a soft drink) used a denial-of-service attack to shut down Web sites. The White House, which was the primary target of the denial-of-service attack in the first version of Code Red, had to change the numerical Internet address (IP address) of its Web site. The DoD shut down its public Web sites, and the Treasury Department's Financial Management Service was infected and had to be disconnected from the Internet.

Code Red worms also hit Microsoft's free e-mail service, Hotmail, caused outages for users of Qwest's high-speed Internet service nationwide, and caused delays in package deliveries by infecting systems belonging to FedEx. There were also numerous reports of infections around the world.

The GAO analysis of the Code Red attack reported that it was more sophisticated than those experienced in the past because the attack combined a worm with a denial-of-service attack. Furthermore, with some reprogramming, each variant of Code Red got smarter at identifying vulnerable systems. Code Red II exploited the same vulnerability to spread itself as the original Code Red. However, instead of launching a denial-of-service attack against a specific victim, it gave an attacker complete control over the infected system, thereby letting the attacker perform any number of undesirable actions.

There were many lessons to be learned from the Code Red attack. First and foremost was that thousands of systems running Microsoft's IIS server had not been updated with a patch that could have prevented Code Red as it was originally written. Second, there were thousands of computers around the world that nobody was paying any attention to. Staff in a data center in California that provided support for a hosting service reported that in the first two hours of Code Red, they received more than 68,000 alarm messages from their network monitoring system indicating that a worm was scanning the Internet from computers housed in the data center.





