Your place to learn about all of the most important scientific discoveries of late.


What's Inside The Covid-19 Vaccine?

Archi Das Gupta
October 2021

For months on end, the world has been yearning to go back to life before COVID-19: without masks, and without fearing for their lives and their loved ones every time they step outside. Leaders around the world have depended on scientists to develop an efficacious vaccine that combats COVID-19. However, once the vaccine did reach the healthcare system, half of the population of the United States chose not to be vaccinated. One of the most alarming reasons behind this is Americans' skepticism of science. Many Americans argue that the vaccine was made too quickly to be trustworthy. Others refuse to take it because they do not know what is inside it.
    The most important ingredient in the Pfizer-BioNTech and Moderna COVID-19 vaccines, and the only active one, is the newly employed mRNA technology. To elicit an immune response, conventional vaccines use weakened or inactivated forms of a pathogen. When the immune system detects this invader, it produces antibodies, which act as troops to fight it off. Once your immune system has learned how to respond to a virus, it can react much more quickly when you are exposed to it again. mRNA vaccines, on the other hand, deliver genetic instructions that tell human cells to build the coronavirus spike protein, which induces an immune response. That response creates antibodies that give vaccinated people protection from COVID-19.
Unfortunately, due to a lack of scientific literacy in the country, some people believe that mRNA could alter human DNA. In fact, messenger RNA already exists naturally within you: it is a piece of genetic code that gives your cells instructions, such as directions for creating new proteins. The technology behind mRNA vaccines existed long before the pandemic, and research continually built upon it. Scientists discovered ways to safely alter mRNA to make proteins comparable to those present in viruses. Once those proteins are created, our immune system produces antibodies to fight that virus.
    Other ingredients include lipids, which help the genetic material reach its target cells. Lipid nanoparticles, made up of a very small quantity of fat, protect the mRNA and give it a greasy exterior that helps it slide into cells. Both Pfizer and Moderna use four different types of lipids to help the mRNA achieve its purpose. Salts are another component of the vaccine because they help balance the acidity of the human body. Sugar, or sucrose, is used because it helps the vaccine keep its structure when frozen.
    In conclusion, the public should rely on scientists and doctors to do their jobs. The research behind mRNA vaccines has been underway since the early 2000s; mRNA vaccines had already been studied for other infectious viruses such as the flu and Zika. This is why the vaccine progressed so quickly. As soon as scientists uncovered the crucial information about COVID-19, they began encoding the instructions that tell cells to synthesize the virus's unique spike protein into the mRNA vaccine.


The Johnson and Johnson Vaccine

Dorothea West
March 2021

The road to herd immunity seems much nearer than we previously thought. On February 27, the newest of the available coronavirus vaccines received emergency use authorization from the FDA. This development means that there are now three vaccines being manufactured and distributed, moving up the timeline for when everyone can be vaccinated.

What is the Johnson and Johnson Vaccine? 

The Johnson and Johnson vaccine is a single-dose, more traditional vaccine that provides hope of reaching a semblance of normalcy within the coming months. Unlike the messenger RNA approach that both Pfizer's and Moderna's vaccines use, this newly approved vaccine uses a harmless adenovirus to carry genetic instructions into cells, prompting them to produce the COVID-19 spike protein and allowing the immune system to learn to recognize the virus. The vaccine proved 66.3% effective in a clinical trial of 21,895 people over the age of 18, which, though lower than the two previous vaccines, is well above the FDA's required 50% threshold.

Though this vaccine has a lower efficacy rate than the other two options, it has very high efficacy against hospitalization. These results are helpful because even though the vaccine may not be as effective in preventing infection entirely, it is able to prevent major symptoms from arising. It is also important to acknowledge that this vaccine was tested in a different context than the other two, with more variants present and higher levels of general virus circulation.

On a positive note, this vaccine presents fewer of the challenges that the other two options have. The Johnson and Johnson vaccine does not have to be stored at temperatures as low as the Pfizer and Moderna vials do, and it lasts longer after being taken out of cold storage, meaning fewer doses are likely to be wasted.

New timeline

In all, the Johnson and Johnson vaccine seems extremely promising for reaching the goal of having every adult vaccinated by the summer. After the vaccine was approved, President Joe Biden announced that pharmaceutical giant Merck, one of Johnson and Johnson's rivals, would help produce doses of the single-shot vaccine.

One of the other benefits of this new vaccine is its single-shot formula, meaning the doses produced can cover more people while eliminating the step of scheduling and returning for a second shot. It is also promising for individuals who may be afraid of needles, as they only have to go through the process once.

Biden hopes that with increased production capacity and a smoothed-out process for distributing the doses, every adult will be able to be vaccinated by the end of May. This is a promising update, as it is two months earlier than his previous estimate.

In all, this vaccine offers a hopeful look toward returning to normalcy. In a recent address to the nation, Biden said he hopes that by July 4th we can celebrate freedom and independence, both as a nation observing the holiday and as a people getting through this pandemic.


COVID-19 vs. the Human Body

Archi Das Gupta
March 2021

The COVID-19 virus has been around for about a year now, and over time, scientists are discovering its lasting effects on different organs of the human body. COVID-19 is perceived to primarily affect the lungs, but new scientific evidence suggests that the brain and heart are affected as well. The severity of the damage is still unknown because the virus is relatively new; it has been only a little more than a year since it emerged. Regardless, the effects of the virus on various parts of the body should still be known!
It is essential to start at the minuscule level to get a better understanding. The virus uses the spike proteins on its surface to enter your body's cells and infect them, and the infected cells then start making copies of the virus. From there, the virus progresses throughout the whole body, advancing down the respiratory tract: the mouth, nose, throat, and lungs. One of the worst symptoms is difficulty breathing, which is a result of COVID going deeper into your respiratory tract.
Recently, more studies are being done to see whether the virus has harmful effects on the heart. The virus can cause your body to mount an overactive immune response, which leads to inflammation throughout the body. The heart is affected because it may need to exert more energy to efficiently pump oxygenated blood through the body. Researchers are investigating whether heart damage is a direct result of the virus, because it is found in about a quarter of hospitalized COVID-19 patients. Imaging tests taken of patients months after their recovery from COVID-19 found that the virus did lasting damage to the heart muscle, which increases the risk of heart complications.
COVID-19's effects on the brain have become widespread enough that there are now separate clinics for COVID-19 patients with neurological symptoms. A study in JAMA Neurology, a peer-reviewed medical journal, revealed that a notable portion of patients in Wuhan, China (36.4%) had neurological symptoms. These symptoms include seizures, diminished consciousness, sensory impairments, and other neurological problems. Critical symptoms include encephalopathy (a delirious state), strokes, and encephalitis (inflammation of the brain). The long-term symptoms are less severe and include headaches, nausea, numbness or tingling sensations, and other neurological issues.
According to Johns Hopkins Medicine, there are four ways COVID-19 may harm the brain. The first is the virus gaining access to the brain and causing a critical infection; virus particles could reach the brain through the bloodstream or nerve endings. The second possibility is that the brain goes into overdrive trying to fight the virus, creating a maladaptive inflammatory response that causes severe tissue and organ damage. The third possibility is brain dysfunction due to the multitude of physiological changes taking place in the body. And the final possibility is blood-clotting abnormalities, which can lead to strokes in some patients.
In conclusion, there is more research to be done on the long-lasting damage of the coronavirus. But to reduce your exposure to the virus, make sure to get vaccinated and wear your mask!


The Science of Sourdough

Dorothea West

Sourdough bread: the elusive yet delicious baked good that many people took up baking during quarantine. But how does it actually become bread?

I began baking sourdough bread a couple of years ago when my family was given a starter by a friend. My dad claimed that he was going to take care of it and learn how to bake bread, but soon I was the one who had to feed it, and I began learning new recipes to try to achieve the ultimate goal of that perfect loaf. At that point, I knew very little about the precision required to make sourdough bread, and I basically killed my starter, though I didn't realize until later that that was what I had done. Instead of dumping out most of the starter and feeding a small portion, I would just continue to add food to this massively growing starter. It never rose after feeding, and when I tried to bake bread from it, the loaves would turn out like flat little disks. I gave up on my goal of making homemade bread after about a year of failed attempts, but during quarantine, I decided to begin again. I read up on all of the techniques needed to maintain a starter and became engrossed in YouTube videos of people talking about ways to achieve a perfect loaf. After doing my research, I began to develop my starter and quickly saw dramatically different results. The starter was bubbly and had the perfect tangy smell, and I knew I was on the right track. After a bit of experimentation (and a few failed attempts), I finally arrived at a recipe that consistently gives me beautiful loaves of delicious, crusty sourdough bread.

Without commercial yeast, sourdough bread requires precise science to become the perfect loaf. Sourdough uses natural yeast, cultivated in a "starter" that gives the bread both the sour flavor and the rise that make sourdough unique. A starter is created by mixing equal parts of flour and water and placing the mixture in a sealed container at room temperature. This creates the ideal environment for fermentation to occur. During fermentation, wild yeast and bacteria present in the air, flour, and water consume the sugars from the flour, producing the carbon dioxide that makes the bread rise and the acids that give it its tang. The process of making this bread may seem daunting to many amateur bread makers because starters need to be fed on a regular schedule to maintain their yeast activity, but with consistency, it just becomes another part of your routine.

Sourdough starters are often passed down through families, some being as old as 100 years, but any ordinary person can make one through a simple process. To create a starter, you simply mix equal parts of water and flour (to start, I recommend using whole wheat flour, as it carries more microbes and will therefore produce yeast at a faster rate) and let that mixture sit, covered, for 24 hours. For the first week or so, you will need to feed this mixture every day by throwing out around 80% of the existing mixture and adding equal parts water and flour to what remains. What you throw out is called "discard," which can be added to pizza dough or other baked goods for extra flavor, or used in its own delicious recipes, such as crackers. You will know your starter has become active when it doubles in size after you feed it. After that, you can either continue feeding it daily, if you plan to bake often, or put it in the fridge and feed it weekly. Because the yeast needs a warm environment to stay active, putting it in the fridge slows the process down (you are basically putting the yeast to sleep).

The process of making sourdough bread may seem challenging, but with a bit of research, routine, and dedication, you can be making bakery-quality loaves in no time.


The Evolution and Use of Technology during COVID-19

Subita Sania

Ever since COVID-19 first started spreading around the world in January of 2020, everyone's daily lives have been uprooted, but the thing that has changed the most is the evolution of technology and its usage among the masses. Many internet services have seen their usage increase greatly since the original outbreak and throughout quarantine. These developments have occurred in all fields, including medical safety, entertainment, and working efficiently from home.

Technological advancements have been developed to lessen the risks of COVID-19. Over the course of the pandemic, many countries have applied technology that already existed in the medical field. In China, authorities use migration maps, mobile payment applications, and social media to track the movement of people who could be infected or have encountered others with the virus. In Sweden, authorities have developed a platform where healthcare workers can report real-time data on volumes of COVID-19 patients. The technology used to screen for infections has also advanced: when Iceland launched widespread testing of asymptomatic individuals, it used mobile technology to collect data on patient-reported symptoms. In the United States, officials have used digital thermometers to collect data on patients, and a national study is capturing resting heart rate with a smartwatch application. These developments show how scientists and officials in different countries are advancing existing technology to reduce the risks of COVID-19.

The entertainment industry has also been drastically impacted by the COVID-19 pandemic. Movie releases have had to move to online streaming services such as Netflix, Amazon Prime, Hulu, and others. During the lockdowns placed on American states, Netflix saw an increase of 16 million subscribers, and YouTube and TikTok both saw viewership increase by about 15%. Local news sites have also seen a vast increase in readership, because so many people are curious about the latest news on the virus. The websites of The New York Times and The Washington Post both grew their traffic by more than 50 percent over the month of April 2020. In addition, the home page of the Centers for Disease Control and Prevention has attracted millions of visitors, showing just how invested people are in learning more about COVID.

Besides the technology developed to protect people from COVID-19 and the entertainment services people turned to during the pandemic, advancements have also occurred in communication technology. Video conferencing apps have become immensely popular as most activities have moved online. Remote learning for students and workplace meetings now run on apps such as Zoom, Google Hangouts and Meet, and Microsoft Teams, to name a few. Since March 2020, the number of Zoom users has increased from around 2 million to more than 6 million, and Google Hangouts Meet usage grew by 1 million people during March 2020 alone. Another app whose usage has increased vastly during the pandemic, due to remote learning, is Google Classroom, which grew from 2 million users to 4 million during March of last year. With these increases, the apps have added new functions: Zoom introduced features intended to make learning easier for students, such as the raise-hand function, reactions, and other tools on the participant menu, and Google Hangouts/Meet added a raise-hand function to its menu bar so that students can interact with their teachers more efficiently. All of these additions to different video conferencing apps were made for the comfort of users who are unable to see each other face to face.

Due to the rise and dangers of COVID-19, technology around the world has evolved to protect people's safety and lessen their contact with the virus. Technology for preventing the spread of the coronavirus, entertainment services, video conferencing apps, and remote learning services are all used in different ways by people finding comfort in the safety of their homes. The evolution and use of technology during COVID-19 has therefore helped many people find some normalcy in their lives during this difficult time.



The Future Begins… Starting with AI

Shreya Nasker

The future is… here? When hearing the words artificial intelligence (AI), images of machines thinking and acting like humans come to mind. As the future nears, technology around the world has been getting more and more advanced. Twenty years ago, it was a dream that iPhones would exist and that we would travel in autonomous vehicles; now there are self-driving cars (thank you, Elon Musk!), automatic AirTrains at airports, facial recognition on the newest electronics, and robot appliances that clean every speck of dirt in our homes. While this all sounds fascinating, and we're excited that this is the future people hoped for decades ago, it's not something we can only be in awe of. With technology advancing at such a rapid rate around the globe, the rise of artificial intelligence and of robots doing human activities can bring consequences for society. To name a few: AI presents a risk to the human workforce, because new technologies are producing robots to perform human jobs, and, in some cases, AI machinery is misused or vulnerable to hacking, which could ultimately end in disaster.
Will robots eventually replace the human population and take over the world? It's a thought, but we're not quite there yet! What we can see in the foreseeable future, though, is that more AI machinery and programming will be incorporated into the human workforce very soon. Already, chatbots and health bots are used to assist patients and provide information alongside physicians. AI bots are programmed to aid workers, especially during crises. As this article is being written, the COVID-19 pandemic has created a surge in demand for healthcare workers and staff in hospitals, nursing homes, and other medical practices. The need for workers in the medical field, such as nurses, is expected to increase even more in the near future. The U.S. Bureau of Labor Statistics states that job prospects for nurses across all fields are expected to grow by 7% over the next decade, which is higher than most occupations. For specific jobs within the nursing field, demand is much higher than the overall 7% because factors like population growth and nurse shortages come into play. AI is becoming more and more popular in the healthcare field, and though many fear this will cost millions of jobs in the future, experts are hopeful that robots will complete tasks more efficiently and faster than humans while working alongside them.
The coronavirus has also been an eye-opening experience for the future of AI. Unlike humans, robots cannot become sick, and for the millions of Americans who have lost their jobs due to this novel virus, it is alarming how quickly robots are taking over those jobs. Robots can get more work done at a faster rate because they do not need breaks. A TIME article written in mid-2020 states that as many as two million people could become jobless by 2025 because robots will have replaced those workers. In contrast, a recent Forbes article informs readers that even though robots will replace many parts of the human workforce, AI can also create many more jobs, including new occupations that have not been invented yet! Still, the future is not set in stone, so it may differ from the world experts have in mind today. We just have to see for ourselves. The threats and risks of losing (more) jobs exist in society now more than ever, as we face this virus.
In addition to job loss, AI technology can be faulty or misused, and the possibility of AI programs being hacked is very real and alarming. According to an article from the UN, some AI systems have proven flawed because the programming failed to correctly recognize a civilian's skin color. The article states that in January 2020, an African American man was arrested and taken into custody on a shoplifting charge. Artificial intelligence facial recognition tools had been used to identify the shoplifter, but the tool was faulty: it failed to correctly distinguish the man's darker skin tone because the technology had been tested primarily on white people. The man was wrongfully arrested and held by the police for hours until the matter was resolved. Hypothetically, what would happen if the crime were on a larger scale, like murder, and people were wrongfully convicted and locked up in prison for a very long time, all because of faulty technology? The privileging of white faces is evident in this situation, as they were predominantly used as test subjects for new AI technology. The world is becoming increasingly diverse, and technology must reflect that. AI can also be very dangerous because it is prone to being hacked. In an August 2020 article about the probability of AI being hacked, writer Frederick Bussler discusses how simple it is for hackers to make their way into different programs and damage AI systems. We live in a digital world, and hacking has become far more common in our everyday lives. In his article, the author asks what would happen if self-driving vehicle programs were hacked; the worst-case scenario would be dangerous to society, as it could result in fatalities, and the damage could cost millions. Solutions to these problems include bolstering cybersecurity to prevent hacking from occurring.
This is not a guarantee that AI programs are safe from hacking, but there is hope that the problem will be addressed soon. Nonetheless, the threat remains very prevalent, and the consequences are very severe.
So, as the next generation, and for generations to come, we must brace ourselves for what the future holds and face the unwanted yet inevitable ramifications of artificial intelligence.


Western vs. Holistic Medicine

Akasha Jackson

Since the beginning of time, people have been fascinated with the human body and, more specifically, with how to properly treat and care for it. For many years, the human race has been expanding its knowledge of medicine and has developed a multitude of techniques, methods, and standards for identifying and treating illnesses and injuries. Western medicine is the more modern and conventional approach to healthcare. With this approach, doctors and healthcare professionals focus on pathology and on curing disease with drugs, radiation, and/or surgery. Holistic medicine is an alternative approach to curing disease and treating the body. This unorthodox approach focuses on improving health and wellness in the whole person (body, mind, and soul) so that the body can heal itself. Although holistic medicine is considered the atypical approach, its practitioners incorporate some conventional aspects into treatment, such as certain drugs and prescriptions. As Western medicine becomes the standard and accepted form of healthcare, alternative forms of medicine become more obscure and unknown to modern society. Though very different approaches to curing illness, Western and holistic medicine can both be very effective and should both be healthcare options readily available to the general public.

Western medicine is a scientific, evidence-based approach that targets a patient's symptoms and diseases using drugs, radiation, and surgery. As the predominant approach to healthcare in the United States and the Western world, it has become the most accepted and conventional means of treatment. In the 19th century, as scientists made rapid progress and deepened their knowledge of chemistry, biology, and procedural techniques, doctors adopted a more systematic approach to diagnosis and treatment. As new information came to light, the face of medicine changed greatly. Modern innovation and techniques, coupled with great progress in science, drove the start and growth of Western medicine. Considering that this approach began during the Industrial Revolution and an age of many scientific discoveries, it is evident that Western medicine is a product of modernity and is fairly new. Allopathic medicine, also known as Western medicine, makes use of treatments and techniques that modernists would consider advanced. Essentially, healthcare professionals generate a diagnosis by taking into account evidence gathered from symptoms and patient testimony. From that point, there is usually a standard procedure or treatment that patients undergo to cure whatever ailment they suffer. For example, with patient attestation and verification of symptoms, doctors can readily diagnose conditions such as appendicitis, and can then decide on a course of treatment, which in most cases is surgery and rounds of antibiotics. There are many advantages to an allopathic approach to care; generally, these are quicker, more efficient treatment using advanced techniques and tools. However, Western medicine often fails to identify and treat the root of a patient's problem, and this form of medicine also tends to be very costly and not universally available.

Holistic medicine is a whole-body approach to healthcare in which professionals combine both traditional and alternative methods to generate a specific treatment plan for each patient. This approach prioritizes prevention of disease before treatment, all while aiming for health and wellness in mind, body, and soul. Holistic treatment has deep ancient roots and was once considered the most effective means of medicine for the simple reason that healers focused on the whole body. The Greek philosopher Socrates, a supporter of this approach, once said that "the part can never be well unless the whole is well." Holistic health professionals stress the healing power of nature and the importance of encouraging the body's self-healing abilities. Within the realm of holistic health, frequent practices include alternative medical systems, mind-body interventions, biologically based therapies, manipulative and body-based methods, and energy therapies. Today, holistic practitioners advocate for and encourage alternatives to Western medicine that focus on nature, energy, and balance. Especially in a time of great pollution, chemical additives, and unnatural practices, these professionals believe that the best treatments must align with natural methods and true balance. Some benefits of this alternative care are personalized treatments, whole-body care, strong preventative measures, and fewer side effects. However, holistic medicine isn't always backed by science and is often stigmatized and deemed ineffective. Recently, some scientists and researchers have recognized herbal and natural medicine as a meaningful part of medicine and have agreed with some features of holistic care.

Many factors should be taken into consideration when deciding which route of medicine to follow. Values and beliefs should always be accounted for when determining whether allopathic or alternative care is better for a patient. Western medicine focuses on gathered evidence and uses standardized methods to effectively diagnose and treat a patient, whereas holistic medicine focuses on treating the whole body and fostering overall health and wellness within the patient. Always advocate for yourself and your health, and make sure to explore and educate yourself on the many different healthcare options to choose from.


The Alarming Global Warming of 2020

Subita Sania

As if 2020 weren't already one of the most problematic years, The Washington Post reported at the beginning of December that 2020 had just been recorded as the warmest calendar year yet. Global warming, a facet of climate change, refers to the rise of the Earth's temperature and has been one of the biggest problems facing the Earth since June 23, 1988, when it became a national issue. Its causes mainly arise from humans interfering with the Earth's temperature and climate by burning fossil fuels, cutting down rainforests, and farming livestock. The biggest effects of global warming include increased precipitation, droughts, wildfires, and health risks.

In 2020, there were multiple reports of the months of September and November being the warmest on record. For instance, September 2020 was reported to be the hottest September since 1880, according to scientists at NOAA's National Centers for Environmental Information. The average global temperature in September was 1.75 degrees Fahrenheit above the 20th-century average of 59 degrees, surpassing the global average temperatures of both September 2015 and September 2016 by 0.04 degrees Fahrenheit. These temperatures had an alarming effect on Arctic sea ice, whose September coverage ranked the second smallest on record. According to the Copernicus Climate Change Service, global average temperatures for November were 1.4 degrees Fahrenheit above 1981-2010 levels. Australia suffered through multiple heatwaves, while Norway, Sweden, and England set national records for their hottest November. This year alone beat previous records set by 2016, showing near-record heat, becoming Europe's hottest calendar year, and delivering Europe's hottest fall season by a large margin.

The causes of global warming seen in 2020 fall into two categories: natural and anthropogenic. Natural causes include the sun’s intensity, volcanic eruptions, and changes in naturally occurring greenhouse gas concentrations. However, according to NASA, these natural causes have little influence on the Earth’s rapid warming because they occur too slowly. The greenhouse gas emissions humans generate have become the leading driver of global warming, as the amount of these gases has increased alarmingly in recent decades. The burning of fossil fuels such as coal, oil, and gas for electricity, heat, and transportation is the primary source of these carbon emissions. Deforestation also releases carbon into the air, as logging, clear-cutting, fires, and other forms of forest degradation contribute up to 20 percent of global carbon emissions. Livestock raised for food contributes methane emissions, which trap heat in the atmosphere and generate air pollution. 

These causes of global warming transform global ecosystems and harm the places humans live. One effect is extreme weather: as the Earth’s temperature rises, the atmosphere collects, retains, and drops more water, which changes weather patterns, making drier environments drier and wetter environments wetter. This increases the frequency of disasters such as storms, floods, heatwaves, and droughts. According to the World Health Organization, global warming is expected to cause about 250,000 additional deaths per year from 2030 to 2050. Many of these health risks come with worsening air pollution: more airborne pollen and mold increase the chances of hay fever and torment those with allergies and asthma. A warmer and wetter climate also favors diseases such as dengue fever, West Nile virus, Lyme disease, and kidney disease. 

Initially, it was believed that 2020 would slow global warming, because the coronavirus pandemic halted many activities that would have required large greenhouse gas emissions. Those expectations fell flat: the year became the warmest in recorded history, and the Earth is still warming at an alarming rate. The effects of climate change can change the places people live for the worse and make humans suffer from problems of their own making, so the Earth needs to be better taken care of to prevent more long-lasting risks.



ALS and FTD are Being Linked to a Huntington's Disease Mutation

Archi Das Gupta

Discoveries that help patients are made every day in the medical field. A recent example came when the National Institute of Neurological Disorders and Stroke and the National Institute on Aging discovered that two neurodegenerative disorders, amyotrophic lateral sclerosis (ALS) and frontotemporal dementia (FTD), were linked to the genetic mutation that causes Huntington’s disease. This discovery can help treat patients with FTD and ALS.

ALS is a nervous system disease that weakens muscles and hinders physical function. It is one of many motor neuron diseases caused by the degeneration and death of motor neurons. Motor neurons extend from the brain to the spinal cord and to muscles throughout the body, so when they die, the ability to move those muscles is lost and the person becomes paralyzed. FTD is an uncommon brain disorder that affects the frontal and temporal lobes of the brain. The frontal lobe governs voluntary movement and higher-level thinking such as planning and long-term memory, while the temporal lobe handles auditory processing, perception, and memory. In FTD, neurons in the frontal and temporal lobes are lost, which causes the lobes to shrink. Huntington’s disease is a progressive neurodegenerative disorder that produces uncontrolled movements, emotional difficulties, and cognitive decline. It is caused by a defective gene that is dominant, meaning the disorder is inheritable. 

Many neurological disorders, including Huntington’s, involve repeat expansions, meaning an irregular repetition of nucleotides in the DNA. In Huntington’s disease, a three-nucleotide sequence (CAG) is repeated within the huntingtin gene. The more times the sequence repeats, the more likely patients are to experience early onset of Huntington’s.

In an international project, researchers screened the full genomes (an individual’s complete set of DNA) of a large group of FTD/ALS patients. Although many patients carried known genetic markers for FTD/ALS, researchers found that a small subset also carried the Huntington’s mutation. These patients did not show symptoms of Huntington’s; instead, they presented with symptoms of FTD or ALS. In affected individuals, an analysis of an autosomal dominant form of FTD/ALS found a hexanucleotide GGGGCC repeat expansion within the C9orf72 gene. The researchers noted that in their FTD patient series, the repeat expansion was carried by 3% of sporadic and 11.7% of familial patients; in their ALS subjects, it was carried by 4.1% of sporadic and 23.5% of familial cases. Compared with other mutations responsible for ALS, this expansion was the most common genetic cause of both sporadic and familial ALS in their clinical series. In FTD, the expansion was also the main cause of sporadic cases and was equal in frequency to progranulin (GRN) mutations among familial cases.

Due to this discovery, additional screening will now be done on FTD/ALS patients, and new therapies are being developed. One of them is antisense oligonucleotide (ASO) therapy: by silencing the causative gene, either targeting both alleles for non-essential genes or selectively targeting the mutant allele for essential genes, gain-of-function disorders may be treated. There is also the prospect of using ASOs to modulate splicing for certain loss-of-function diseases, which can restore gene function in some cases or otherwise compensate for its loss. Since c9FTD/ALS is a disease for which all of these pathogenic mechanisms have been suggested, oligonucleotide therapy is an appealing option. With discoveries being made constantly, there is hope for finding a cure for these neurodegenerative diseases that harm people’s livelihoods.  


CRISPR: Science Technology of the Future

Akasha Jackson

If given the choice, would you rewrite DNA and modify your child’s gene expression to your satisfaction? CRISPR is a technological tool that scientists use to edit genomes. It allows scientists to make modifications to a DNA sequence, resulting in changes to the genome. Investigations and experiments regarding CRISPR have been ongoing since 1987; however, only recently, in 2019, was the tool used in a clinical trial. This substantial advance makes CRISPR paramount and revolutionary in genetic engineering and in science technology overall. 

On a more scientific level, what exactly is CRISPR? CRISPR is an acronym for the phrase “clustered regularly interspaced short palindromic repeats.” The system guides a Cas9 protein to targeted DNA, allowing the enzyme to cut and edit the genome. Specifically, CRISPR is a naturally occurring system composed of short CRISPR DNA sequences and Cas genes (which make Cas proteins). When Cas proteins identify viral DNA, its code is stored in the CRISPR sequences and transcribed into RNA. Cas proteins then bind this RNA and use it as a guide to find and destroy matching viral DNA. 
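This search-and-cut mechanism can be illustrated with a toy sketch. The function name and string matching below are purely illustrative assumptions, not from any real bioinformatics library; actual Cas9 targeting works through RNA:DNA hybridization, tolerates some mismatches, and (for the commonly used S. pyogenes Cas9) requires an "NGG" PAM sequence next to the target:

```python
def find_cut_sites(genome, guide, pam="GG"):
    """Toy model of Cas9 targeting: scan a DNA string for the guide
    sequence followed by an NGG PAM ('N' = any base, then 'GG'),
    and return approximate cut positions (Cas9 cuts about 3 bases
    upstream of the PAM)."""
    sites = []
    for i in range(len(genome) - len(guide) - 2):
        target = genome[i : i + len(guide)]
        # The PAM sits immediately 3' of the target: one 'N' base, then 'GG'
        if target == guide and genome[i + len(guide) + 1 : i + len(guide) + 3] == pam:
            sites.append(i + len(guide) - 3)  # cut ~3 bases from the PAM
    return sites
```

For example, `find_cut_sites("AATTTGCATGGCCC", "TTTGCA")` finds the guide at position 2 followed by the "TGG" PAM, while the same guide with no adjacent PAM is ignored, mirroring how Cas9 only cuts where both conditions are met.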

In early 2013, Feng Zhang managed to successfully use CRISPR for genome editing in eukaryotes. Since then, scientists around the world have manipulated genes using this tool and produced what are deemed “desirable traits” in embryos. Up to 13,000 precise CRISPR edits can be made within a single cell and applied throughout the organism. Some researchers have found that they can prevent the onset or progression of a disease or condition in embryos. In 2018, He Jiankui recruited couples in which the man had HIV and, through IVF (in vitro fertilization), created embryos that were resistant to HIV by directing Cas9 to disable CCR5, a gene HIV uses to enter cells. Last year, Denis Rebrikov began work on modifying a deafness-related gene in the embryos of deaf women, announcing that he would adopt He Jiankui’s method but apply it to a different gene. Additionally, recent studies suggest that exploring CRISPR through a pharmaceutical lens can allow for efficient and successful production of anticancer drugs. 

Although Cas9 has so far been used solely for the treatment of medical conditions, many fear that the power of CRISPR will soon be abused for selfish, cosmetic gain. Some scientists believe that athletic and intellectual prowess could be enhanced through CRISPR. They worry that if studies and trials with this system succeed, many will harness CRISPR for unethical gain rather than reasonable medical applications. Others who oppose the use of CRISPR consider the tool a violation of bodily autonomy: the edited child cannot consent to the procedure or the changes made. Just as intersex children may have a gender chosen for them by their parents, CRISPR would allow parents to choose their child’s appearance and body composition. Some bioethicists think that because of this freedom without consideration for the child, CRISPR is morally wrong and unethical. 

Despite the many ethical debates and controversy regarding CRISPR and its usage, it is a major breakthrough in biotechnology, genetic engineering, and overall science technology.


What is Benford's Law and How Does it Detect Fraud?

Fratchelya Ciputra

Benford’s law is everywhere, in everything from the sizes of volcanoes and the populations of cities to baseball statistics, the numbers in magazine articles, and atomic weights. It is used in forensic science to detect faked images, IRS tax and election fraud, and potentially dangerous social media bots. So what exactly is Benford’s Law? 

First discovered by Simon Newcomb in 1881 and later rediscovered by Dr. Frank Benford in the 1930s, Benford’s law is a simple observation about the number patterns present in many pools of data. The image above perfectly describes this phenomenon. In a data set that follows the law, the leading digit 1 occurs about 30.1% of the time, 2 about 17.6%, and 3 about 12.5%, with the frequency continually decreasing until the digit 9 occurs only about 4.6% of the time. This logarithmic relationship, this perfect curve, appears in a remarkable range of naturally occurring, seemingly random data sets. 
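The percentages above come from a simple formula: the probability that a number’s leading digit is d equals log10(1 + 1/d). A minimal Python sketch (the function names here are illustrative, not from any standard library) computes these expected frequencies and tallies the leading digits of a real data set for comparison:

```python
import math
from collections import Counter

def benford_prob(d):
    """Expected share of leading digit d (1-9) under Benford's law."""
    return math.log10(1 + 1 / d)

def first_digit(x):
    """Leading nonzero digit of a number, via scientific notation."""
    return int(f"{abs(x):e}"[0])  # e.g. 0.00456 -> '4.56e-03' -> 4

def digit_frequencies(data):
    """Observed share of each leading digit (1-9) in a data set."""
    counts = Counter(first_digit(x) for x in data if x != 0)
    total = sum(counts.values())
    return {d: counts.get(d, 0) / total for d in range(1, 10)}
```

Feeding in a naturally exponential sequence such as the first hundred powers of 2 yields observed frequencies very close to `benford_prob(d)` for each digit; large gaps between the observed and expected curves are the red flag that fraud analysts look for.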

This ability to predict patterns in seemingly random data is why Benford’s Law is used by so many people to detect fraud. Whenever a data set does not match the curve predicted by Benford’s law, that is an immediate signal that something may be wrong. The IRS (Internal Revenue Service) uses Benford’s law to detect fraud in tax forms, it is used to verify whether photos have been tampered with, and it can also be applied to election data. Although irregularities in election statistics may indicate ballot tampering, Benford’s law is also sensitive to the decisions we make based on reasoning. According to the “Digits” episode of the Netflix documentary series Connected, Benford’s Law can also reveal when people voted for someone they didn’t originally want to vote for. So while Benford’s Law can be applied in court to indicate fraud, it has shortcomings when analyzing elections because it is not immune to informed voters’ genuine decisions. 

Beyond its endless uses for detecting irregularities in data, it is shocking to think that order can always be found through Benford’s Law amid the randomness and chaos of our everyday lives. Things like successful scores in soccer, volcanic eruptions, the size and distance traveled by cyclones, and how many people live in a city at any given time are meant to be random. Split-second decisions made by players on a field that could make or break a win, a slip-up made by two drunk adults, even the weather, these things aren’t meant to be predictable. But they are, according to Benford’s Law. 

Scientists and mathematicians alike are baffled as to why this pattern appears in seemingly every part of our existence. It raises the question of whether Benford’s law arises from how we structure our numbers and interpret things, or whether the universe really is interconnected and ordered in a way that escapes our understanding. 


The Race For a COVID-19 Vaccine

Archi Das Gupta

Americans are entering a frigid December still hoping that a vaccine can be administered to them over the next few months. Millions of people have faced the negative effects of the coronavirus mentally, physically, and emotionally. These unprecedented times became the “new normal” in early March, as individuals had to adhere to measures like social distancing and wearing a mask. COVID-19 affects different age groups in different ways, and cases range from asymptomatic to mild to severe. The most common symptoms of COVID-19 are loss of taste and smell and coughing; other symptoms include fever and chills, difficulty breathing, congestion, runny nose, and fatigue. Luckily, thanks to recent technological advances, vaccine development quickly got underway in several countries. 
The human body has natural defenders against foreign invaders. Blood contains red and white blood cells; the white blood cells, which consist of macrophages, B-lymphocytes, and T-lymphocytes, primarily fight infection. Macrophages take in and digest germs, leaving behind antigens. The body identifies the antigens and formulates antibodies to attack them: B-lymphocytes produce these antibodies, and T-lymphocytes attack cells that have been infected. The first time the body encounters a germ, it takes many days to mount this response, but afterward memory T-lymphocytes know how to fight the germ if it appears again. Vaccines help the body develop immunity to the virus by causing the immune system to develop antibodies and T-lymphocytes without causing illness. 
Multiple countries are working to find a COVID-19 vaccine quickly, but there are many phases a candidate vaccine must pass through before it reaches the public. This is where the race began, with 13 companies developing vaccines. A few have already taken the lead: Moderna, Pfizer-BioNTech, and AstraZeneca. In the Moderna and Pfizer-BioNTech vaccines, “genetic instructions for the coronavirus spike protein are encoded in mRNA, delivered via lipid nanoparticle,” according to BioPharma Dive. 
The general vaccine development cycle begins with the exploratory stage, which usually takes 2-4 years; scientists do basic lab research and identify the antigens. Acquiring grants and funding usually takes another 9-12 months, but Operation Warp Speed, whose goal is for patients to receive vaccinations by January 2021, quickly issued billions of dollars to any vaccine effort that met its protocols and requirements. Because COVID-19 has caused so much harm to society and the economy, the process was sped up and the exploratory stage took only 2-3 months. Next is the preclinical stage, where scientists start to meet regulatory requirements such as writing protocols, operating procedures, and quality assurance reports; this stage also determines the approximate doses required for patients. Then comes clinical development, which includes further pre-clinical research on animals and the start of clinical trials on humans. 
From this point there are three phases, and since the three leading companies have already entered phase 3, they have taken the lead in the race. In short, phase 1 is when the vaccine is administered to several dozen healthy human volunteers, with the goal of finding a dose with the fewest side effects. In phase 2, the trial ramps up to several hundred volunteers, and doctors document how their immune systems respond to the vaccine. In phase 3, the vaccine is administered to thousands of volunteers and its effectiveness is evaluated. All three competitors have entered this stage in the past few months: Moderna reported 94.1% efficacy, Pfizer-BioNTech reported 95% efficacy, and AstraZeneca reported 90% efficacy. Now it seems that Pfizer-BioNTech and Moderna are head to head more than ever.  
The vaccines are coming at the end of the year and into 2021, as the results are extremely promising. They will be administered to priority groups first: healthcare workers, essential workers, people with underlying conditions, and the elderly. For everyone else, it could take months, though it is important to note that each state has its own list of priority groups, so the order may differ from place to place. It is equally important to mention that each person needs two shots of the vaccine; the second is a booster shot, meaning it strengthens the immune response. In conclusion, the winner of the race has not yet been declared, as the two main competitors are in a deadlock. But hope is on the horizon, as science has developed quickly enough for a vaccine to arrive sooner than ever.