Pharmaceutical Industry: Branded Drug Manufacturing

The evolving US pharmaceutical industry is responsible for the research and development of new drugs and drug compounds, for a better understanding of the chemical compounds contained in drug substances, for new technological advances, and for an increase in the number of treatable or curable diseases through the production of pharmaceutical drugs. For more than a century, the United States has used a variety of resources, including time, energy, money, and sheer will, to provide consumers, universities, for-profit companies, and regulatory agencies with new advances in the branded-drug marketplace. Over several decades, this industry has significantly increased its own capabilities through various avenues of research, development, and funding.

As drug therapies continue to expand, the United States continues to fund projects through many companies. Major pharmaceutical companies like Pfizer and Merck survived and thrived during this period. With the U.S. government creating regulatory agencies such as the Food and Drug Administration, consumers can rest assured that the products marketed to them are safe and available to anyone who needs them.

Historically speaking, the evolution of the pharmaceutical industry began in the late 1800s and continues today. A lot has changed over the years, but one thing remains constant: growth and innovation. The evolution of this industry can be broken down into four significant periods: from the 1800s to World War II, from the 1940s to the mid-1970s, from the 1970s to the new millennium, and the first decade of the new century. Each period is characterized by its own areas of focus and its own rationales for the maturation and expansion of new drugs. Since the creation of the pharmaceutical industry, the main goal has been, and will continue to be, developing new drugs to combat all possible diseases.

The United States relied heavily on the European pharmaceutical industry to supply not only the drugs themselves but also the technologies used to make those drugs. That was until a company named Pfizer was founded in 1849. Pfizer was established by two German cousins, Charles Pfizer and Charles Erhart, who opened Charles Pfizer & Company as a fine-chemicals business in New York. With the production of santonin, an antiparasitic drug, Pfizer enjoyed early success. With the Civil War looming, demand for painkillers, preservatives, and disinfectants quickly increased. "Pfizer expanded production of tartaric acid (used as a laxative and skin coolant) and cream of tartar (effective as both a diuretic and cleansing agent), as well as other vital medications, to help meet the needs of the Union military." Responding to the needs of the Union Army, Pfizer was able to increase its production and workforce to meet growth and demand. Through constant development of new products, Pfizer became a household name and remains one of the leading pharmaceutical companies in the United States.

Throughout the years following the Civil War, many drugs were manufactured and developed: aspirin, insulin, and penicillin, to name just a few. Each was a pioneer; all saved many lives and are still widely used today. With the creation of aspirin, the question of patents and registered trademarks arose. Bayer was the first to patent acetylsalicylic acid (ASA) in the United States and Great Britain. However, "the U.S. government seized Bayer's U.S. operations as enemy property and auctioned them off to Sterling Products, a company specializing in patent medicines." Sterling had primarily purchased the exclusive right to use the names "Bayer" and "Aspirin" in the United States, as the ASA patent had expired in 1917. As aspirin became a world-famous product, various companies, including Sterling, began producing and advertising it under "pseudonyms." Sterling produced a compound composed primarily of ASA under the name Anacin, which was then advertised and marketed to consumers with claims of pain and tension relief. These claims were only partly true, and as a result "the Federal Trade Commission (FTC), the overseer of drug advertising, launched a 20-year campaign to attack these unfounded claims and omissions." The false claims made by Sterling led lawmakers to step up regulation and to require research substantiating such claims, culminating in the Food, Drug, and Cosmetic Act of 1938.

The Food, Drug, and Cosmetic Act of 1938 was implemented to regulate medical devices and cosmetics in addition to drugs. It replaced the old Food and Drugs Act of 1906 and required that drugs be labeled with directions for safe use as well as information sufficiently relevant to consumers. Additionally, it required pre-market approval by the FDA to determine a drug's compliance with safety standards. "It conclusively prohibits false therapeutic claims about drugs, although a separate law gives the Federal Trade Commission jurisdiction over drug advertising." This law would lead the United States into one of the greatest eras of pharmaceutical innovation: the World Wars.

Once the United States experienced the aftermath of World War II, the need for antibiotics and pain relief intensified. "Almost all governments in developed countries have started supporting public health research." With the funding provided, researchers were able to study the mechanisms and underlying causes of disease in depth. "This explosion of research has dramatically increased medical knowledge and provided companies with rich opportunities for innovation." Previously, the methodology used to search for new drug compounds was largely unsystematic. Researchers used an approach called "drug screening," in which thousands of compounds were combed through before finding those with therapeutic value. Malerba and Orsenigo (2015) likened this process to a lottery, with useful compounds discovered by pure coincidence.

One compound in particular, streptomycin, became the subject of numerous studies as a result of these years of drug testing. Streptomycin was a promising antibiotic used to treat tuberculosis, and its use in research led to innovation in study design. Although the United States had an adequate supply of the antibiotic, its research produced less conclusive data. Meanwhile, British scientists, Hill and his colleagues, were performing experimental treatments as a true randomized study because of a lack of medication. "When the results of his study were published in 1948, Hill's use of concurrent (randomized, controlled) controls was hailed as ushering in a 'new era of medicine.'" Using information from Hill's study, American scientists began to define generalized criteria for drug testing, specifically the stages through which drug development should proceed.
"Patients had to be selected according to formal criteria, then randomly divided into treatment and control groups; trials had to be double-blind and use objective diagnostic technologies; and drug doses were to be administered on a fixed schedule, while patient observations were to be recorded at uniform intervals." This led to an increase in clinical trials and in federal funding to improve medical research and innovation. The foundations for the industry's transformation were laid: innovation, research, and development would be the main priorities for pharmaceutical companies. "The rate of innovation began to skyrocket: the R&D-to-sales ratio rose from 3.7% in 1951, to 5.8% in the 1950s, to around 10% in the 1960s, to around 15-20% in the 1980s and after." In the years following this increase in research and development, hundreds of new chemical entities and several important drug classes were discovered, ranging from antipsychotic medications to antibiotics to diuretics and beyond.

In the years leading up to the 1970s, the U.S. population was booming, and demand for pharmaceuticals increased with it, driving further innovation and industry growth. At that time, thalidomide was presented to the public as a miracle drug. Thalidomide was discovered by Wilhelm Kunz in 1953 and was first synthesized at Chemie Grünenthal, a pharmaceutical company. It was first administered to consumers in 1956 and marketed as a cold and flu medication under the brand name Grippex. This drug, along with Contergan, was promoted in several leading medical publications as well as in letters to doctors; sales reached 90,000 units, and the drugs were sold in over twenty countries, excluding the United States. Because of the high volume of sales and the lack of studies demonstrating side effects, many men, women, and children used these medications without knowing their harmful effects.

Thalidomide abnormalities were first observed in newborns of mothers who had used the drug during pregnancy. These abnormalities included "hearing loss, ocular alterations, deafness, facial paralysis, malformations of the larynx, trachea, lungs and heart, and mental retardation in 6.6% of affected individuals." "The mortality rate among the victims varied between 40% and 45%. Worldwide, between 10,000 and 15,000 children were born with the characteristic abnormalities associated with thalidomide, and 40% died within the first year of life." This tragedy gave rise to different practices in the United States and Germany. "Thanks to Frances Kelsey, the drug was not approved by the FDA, citing insufficient testing." Since thalidomide was never approved or authorized by the FDA for use in the United States, the country had little experience with its effects.