Saturday, 18 November 2017

Malaria, Poverty & Wars


This is an excerpt from this article about malaria.
 
History

Ancient malaria oocysts preserved in Dominican amber
 
Although the parasite responsible for P. falciparum malaria has been in existence for 50,000–100,000 years, the population size of the parasite did not increase until about 10,000 years ago, concurrently with advances in agriculture[132] and the development of human settlements. Close relatives of the human malaria parasites remain common in chimpanzees. Some evidence suggests that P. falciparum malaria may have originated in gorillas.[133]
References to the unique periodic fevers of malaria are found throughout recorded history.[134] Hippocrates described periodic fevers, labelling them tertian, quartan, subtertian and quotidian.[135] The Roman Columella associated the disease with insects from swamps.[135] Malaria may have contributed to the decline of the Roman Empire,[136] and was so pervasive in Rome that it was known as the "Roman fever".[137] Several regions in ancient Rome were considered at-risk for the disease because of the favourable conditions present for malaria vectors. This included areas such as southern Italy, the island of Sardinia, the Pontine Marshes, the lower regions of coastal Etruria and the city of Rome along the Tiber River. Stagnant water in these places provided preferred breeding grounds for mosquitoes. Irrigated gardens, swamp-like grounds, runoff from agriculture, and drainage problems from road construction led to an increase in standing water.[138]

British doctor Ronald Ross received the Nobel Prize for Physiology or Medicine in 1902 for his work on malaria.
 
The term malaria originates from Medieval Italian: mala aria—"bad air"; the disease was formerly called ague or marsh fever due to its association with swamps and marshland.[139] The term first appeared in the English literature about 1829.[135] Malaria was once common in most of Europe and North America,[140] where it is no longer endemic,[141] though imported cases do occur.[142]
Scientific studies on malaria made their first significant advance in 1880, when Charles Louis Alphonse Laveran—a French army doctor working in the military hospital of Constantine in Algeria—observed parasites inside the red blood cells of infected people for the first time. He, therefore, proposed that malaria is caused by this organism, the first time a protist was identified as causing disease.[143] For this and later discoveries, he was awarded the 1907 Nobel Prize for Physiology or Medicine. A year later, Carlos Finlay, a Cuban doctor treating people with yellow fever in Havana, provided strong evidence that mosquitoes were transmitting disease to and from humans.[144] This work followed earlier suggestions by Josiah C. Nott,[145] and work by Sir Patrick Manson, the "father of tropical medicine", on the transmission of filariasis.[146]

Traditional Chinese medicine researcher Tu Youyou received the Nobel Prize for Physiology or Medicine in 2015 for her work on the antimalarial drug artemisinin.
 
In April 1894, the Scottish physician Sir Ronald Ross visited Sir Patrick Manson at his house on Queen Anne Street, London. This visit was the start of four years of collaboration and fervent research that culminated in 1897 when Ross, who was working in the Presidency General Hospital in Calcutta, proved the complete life-cycle of the malaria parasite in mosquitoes. He thus proved that the mosquito was the vector for malaria in humans by showing that certain mosquito species transmit malaria to birds. He isolated malaria parasites from the salivary glands of mosquitoes that had fed on infected birds.[147] For this work, Ross received the 1902 Nobel Prize in Medicine. After resigning from the Indian Medical Service, Ross worked at the newly established Liverpool School of Tropical Medicine and directed malaria-control efforts in Egypt, Panama, Greece and Mauritius.[148] The findings of Finlay and Ross were later confirmed by a medical board headed by Walter Reed in 1900. Its recommendations were implemented by William C. Gorgas in the health measures undertaken during construction of the Panama Canal. This public-health work saved the lives of thousands of workers and helped develop the methods used in future public-health campaigns against the disease.[149]

Artemisia annua, source of the antimalarial drug artemisinin
 
The first effective treatment for malaria came from the bark of the cinchona tree, which contains quinine. This tree grows on the slopes of the Andes, mainly in Peru. The indigenous peoples of Peru made a tincture of cinchona to control fever. Its effectiveness against malaria was recognised, and the Jesuits introduced the treatment to Europe around 1640; by 1677, it was included in the London Pharmacopoeia as an antimalarial treatment.[150] It was not until 1820 that the active ingredient, quinine, was extracted from the bark, isolated and named by the French chemists Pierre Joseph Pelletier and Joseph Bienaimé Caventou.[151][152]
Quinine became the predominant antimalarial medication until the 1920s, when other medications began to be developed. In the 1940s, chloroquine replaced quinine as the treatment of both uncomplicated and severe malaria until resistance supervened, first in Southeast Asia and South America in the 1950s and then globally in the 1980s.[153]
Artemisia annua has been used medicinally by Chinese herbalists in traditional Chinese medicine for 2,000 years. In 1596, Li Shizhen recommended tea made from qinghao specifically to treat malaria symptoms in his "Compendium of Materia Medica". Artemisinins, discovered by Chinese scientist Tu Youyou and colleagues in the 1970s from the plant Artemisia annua, became the recommended treatment for P. falciparum malaria, administered in combination with other antimalarials as well as in severe disease.[154] Tu says she was influenced by a traditional Chinese herbal medicine source, The Handbook of Prescriptions for Emergency Treatments, written in 340 by Ge Hong.[155] For her work on malaria, Tu Youyou received the 2015 Nobel Prize in Physiology or Medicine.[156]
Plasmodium vivax was used between 1917 and the 1940s for malariotherapy—deliberate injection of malaria parasites to induce a fever to combat certain diseases such as tertiary syphilis. In 1927, the inventor of this technique, Julius Wagner-Jauregg, received the Nobel Prize in Physiology or Medicine for his discoveries. The technique was dangerous, killing about 15% of patients, so it is no longer in use.[157]

U.S. Marines with malaria in a rough field hospital on Guadalcanal, October 1942
 
The first pesticide used for indoor residual spraying was DDT.[158] Although it was initially used exclusively to combat malaria, its use quickly spread to agriculture. In time, pest control, rather than disease control, came to dominate DDT use, and this large-scale agricultural use led to the evolution of resistant mosquitoes in many regions. The DDT resistance shown by Anopheles mosquitoes can be compared to antibiotic resistance shown by bacteria. During the 1960s, awareness of the negative consequences of its indiscriminate use increased, ultimately leading to bans on agricultural applications of DDT in many countries in the 1970s.[75] Before DDT, malaria was successfully eliminated or controlled in tropical areas like Brazil and Egypt by removing or poisoning the breeding grounds of the mosquitoes or the aquatic habitats of the larval stages, for example by applying the highly toxic arsenic compound Paris Green to places with standing water.[159]
Malaria vaccines have been an elusive goal of research. The first promising studies demonstrating the potential for a malaria vaccine were performed in 1967 by immunizing mice with live, radiation-attenuated sporozoites, which provided significant protection to the mice upon subsequent injection with normal, viable sporozoites. Since the 1970s, there has been a considerable effort to develop similar vaccination strategies for humans.[160] The first vaccine, called RTS,S, was approved by European regulators in 2015.[161]
 
Society and culture

Economic impact
 
Malaria clinic in Tanzania
 
Malaria is not just a disease commonly associated with poverty: some evidence suggests that it is also a cause of poverty and a major hindrance to economic development.[9][10] Although tropical regions are most affected, malaria's furthest influence reaches into some temperate zones that have extreme seasonal changes. The disease has been associated with major negative economic effects on regions where it is widespread. During the late 19th and early 20th centuries, it was a major factor in the slow economic development of the American southern states.[162]
A comparison of average per capita GDP in 1995, adjusted for purchasing power parity, between countries with malaria and countries without malaria gives a fivefold difference (US$1,526 versus US$8,268). In the period 1965 to 1990, countries where malaria was common had an average per capita GDP that increased by only 0.4% per year, compared to 2.4% per year in other countries.[163]
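To put those growth rates in perspective, here is a small, illustrative compound-growth calculation (not from the source article; the starting index of 100 is a hypothetical value, not real GDP data) showing how 0.4% versus 2.4% annual growth compounds over the 25 years from 1965 to 1990:

# Illustrative sketch: compounding the cited growth rates over 1965-1990
start = 100.0            # hypothetical per-capita GDP index in 1965 (not real data)
years = 1990 - 1965      # 25 years

endemic = start * (1 + 0.004) ** years   # countries where malaria was common (0.4%/yr)
others  = start * (1 + 0.024) ** years   # countries without widespread malaria (2.4%/yr)

print(round(endemic, 1))  # ~110.5, i.e. about 10% total growth
print(round(others, 1))   # ~180.9, i.e. about 81% total growth

Even over this single generation, the cited difference in growth rates compounds into a gap of roughly 70 index points from the same starting level, which is consistent with the article's point that malaria-endemic economies fall steadily further behind.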
Poverty can increase the risk of malaria since those in poverty do not have the financial capacities to prevent or treat the disease. In its entirety, the economic impact of malaria has been estimated to cost Africa US$12 billion every year. The economic impact includes costs of health care, working days lost due to sickness, days lost in education, decreased productivity due to brain damage from cerebral malaria, and loss of investment and tourism.[11] The disease has a heavy burden in some countries, where it may be responsible for 30–50% of hospital admissions, up to 50% of outpatient visits, and up to 40% of public health spending.[164]

Child with malaria in Ethiopia

Cerebral malaria is one of the leading causes of neurological disabilities in African children.[114] Studies comparing cognitive functions before and after treatment for severe malarial illness have shown significantly impaired school performance and cognitive abilities even after recovery.[112] Consequently, severe and cerebral malaria have far-reaching socioeconomic consequences that extend beyond the immediate effects of the disease.[165]

Counterfeit and substandard drugs

Sophisticated counterfeits have been found in several Asian countries such as Cambodia,[166] China,[167] Indonesia, Laos, Thailand, and Vietnam, and are an important cause of avoidable death in those countries.[168] The WHO has said that studies indicate up to 40% of artesunate-based malaria medications are counterfeit, especially in the Greater Mekong region, and it has established a rapid alert system to enable information about counterfeit drugs to be reported quickly to the relevant authorities in participating countries.[169] There is no reliable way for doctors or lay people to detect counterfeit drugs without help from a laboratory. Companies are attempting to combat the persistence of counterfeit drugs by using new technology to provide security from source to distribution.[170]
Another clinical and public health concern is the proliferation of substandard antimalarial medicines resulting from inappropriate concentration of ingredients, contamination with other drugs or toxic impurities, poor quality ingredients, poor stability and inadequate packaging.[171] A 2012 study demonstrated that roughly one-third of antimalarial medications in Southeast Asia and Sub-Saharan Africa failed chemical analysis or packaging analysis, or were falsified.[172]

War
 
World War II poster
 
Throughout history, the contraction of malaria has played a prominent role in the fates of government rulers, nation-states, military personnel, and military actions.[173] In 1910, Nobel Prize winner Ronald Ross (himself a malaria survivor) published a book titled The Prevention of Malaria that included a chapter titled "The Prevention of Malaria in War." The chapter's author, Colonel C. H. Melville, Professor of Hygiene at the Royal Army Medical College in London, addressed the prominent role that malaria has historically played during wars: "The history of malaria in war might almost be taken to be the history of war itself, certainly the history of war in the Christian era. ... It is probably the case that many of the so-called camp fevers, and probably also a considerable proportion of the camp dysentery, of the wars of the sixteenth, seventeenth and eighteenth centuries were malarial in origin."[174]
Malaria was the most significant health hazard encountered by U.S. troops in the South Pacific during World War II, where about 500,000 men were infected.[175] According to Joseph Patrick Byrne, "Sixty thousand American soldiers died of malaria during the African and South Pacific campaigns."[176]
Significant financial investments have been made to procure existing and create new anti-malarial agents. During World War I and World War II, inconsistent supplies of the natural anti-malaria drugs cinchona bark and quinine prompted substantial funding into research and development of other drugs and vaccines. American military organizations conducting such research initiatives include the Navy Medical Research Center, Walter Reed Army Institute of Research, and the U.S. Army Medical Research Institute of Infectious Diseases of the US Armed Forces.[177]
Additionally, initiatives have been founded such as Malaria Control in War Areas (MCWA), established in 1942, and its successor, the Communicable Disease Center (now known as the Centers for Disease Control and Prevention, or CDC) established in 1946. According to the CDC, MCWA "was established to control malaria around military training bases in the southern United States and its territories, where malaria was still problematic".[178]

Eradication efforts
 
Members of the Malaria Commission of the League of Nations collecting larvae on the Danube delta, 1929
 
Several notable attempts are being made to eliminate the parasite from sections of the world, or to eradicate it worldwide. In 2006, the organization Malaria No More set a public goal of eliminating malaria from Africa by 2015, and the organization plans to dissolve if that goal is accomplished.[179] Several malaria vaccines are in clinical trials, which are intended to provide protection for children in endemic areas and reduce the speed of transmission of the disease. As of 2012, The Global Fund to Fight AIDS, Tuberculosis and Malaria has distributed 230 million insecticide-treated nets intended to stop mosquito-borne transmission of malaria.[180] The U.S.-based Clinton Foundation has worked to manage demand and stabilize prices in the artemisinin market.[181] Other efforts, such as the Malaria Atlas Project, focus on analysing climate and weather information required to accurately predict the spread of malaria based on the availability of habitat of malaria-carrying parasites.[122] The Malaria Policy Advisory Committee (MPAC) of the World Health Organization (WHO) was formed in 2012, "to provide strategic advice and technical input to WHO on all aspects of malaria control and elimination".[182] In November 2013, WHO and the malaria vaccine funders group set a goal to develop vaccines designed to interrupt malaria transmission with the long-term goal of malaria eradication.[183]
Malaria has been successfully eliminated or greatly reduced in certain areas. Malaria was once common in the United States and southern Europe, but vector control programs, in conjunction with the monitoring and treatment of infected humans, eliminated it from those regions. Several factors contributed, such as the draining of wetland breeding grounds for agriculture and other changes in water management practices, and advances in sanitation, including greater use of glass windows and screens in dwellings.[184] Malaria was eliminated from most parts of the USA in the early 20th century by such methods, and the use of the pesticide DDT and other means eliminated it from the remaining pockets in the South in the 1950s as part of the National Malaria Eradication Program.[185] Bill Gates has said that he thinks global eradication is possible by 2040.[186]

Research

The Malaria Eradication Research Agenda (malERA) initiative was a consultative process to identify which areas of research and development (R&D) needed to be addressed for the worldwide eradication of malaria.[187][188]

Vaccine

A vaccine against malaria called RTS,S was approved by European regulators in 2015.[161] As of 2016, it was undergoing pilot trials in select countries.
Immunity (or, more accurately, tolerance) to P. falciparum malaria does occur naturally, but only in response to years of repeated infection.[37] An individual can be protected from a P. falciparum infection if they receive about a thousand bites from mosquitoes that carry a version of the parasite rendered non-infective by a dose of X-ray irradiation.[189] The highly polymorphic nature of many P. falciparum proteins results in significant challenges to vaccine design. Vaccine candidates that target antigens on gametes, zygotes, or ookinetes in the mosquito midgut aim to block the transmission of malaria. These transmission-blocking vaccines induce antibodies in the human blood; when a mosquito takes a blood meal from a protected individual, these antibodies prevent the parasite from completing its development in the mosquito.[190] Other vaccine candidates, targeting the blood-stage of the parasite's life cycle, have been inadequate on their own.[191] For example, SPf66 was tested extensively in areas where the disease is common in the 1990s, but trials showed it to be insufficiently effective.[192]
