Are you aware that the SAT is going DIGITAL?

Since the SAT has been around for so long, American academic culture has grown accustomed to it. Numerous colleges continue to use it as a criterion for assessing applicants, and students, parents, and educators all understand its purpose and significance.

The College Board has used the SAT to assess high school students’ readiness for college for nearly a century, and while the test still exists today, it has undergone a sizable number of changes. The College Board recently announced that the SAT will go digital by 2024, which may be the biggest change yet. It’s a choice motivated by the desire to make the SAT easier for students to take and to keep the test relevant at a time when colleges don’t value it as highly as they once did.

Many colleges and universities suspended their SAT (or ACT) score submission requirements when Covid-19 forced campuses to close their doors in 2020, and two years later, only a few have brought those requirements back. But even before the pandemic, many colleges had already made the submission of test scores optional on application forms.

Despite this pattern, and the numerous test cancellations brought on by Covid, many students have continued to take the SAT and to send their scores to potential schools. Priscilla Rodriguez, the College Board’s vice president of College Readiness Assessments, reaffirmed the organization’s dedication to upholding students’ right to submit their test results, saying, “Evidence shows that when colleges consider SAT scores in the context of where students live and attend school, the SAT helps increase diversity. The SAT will continue to be one of the easiest and most affordable ways for students to stand out as we recover from the pandemic.”

Is the SAT available online right now?

Filling in the bubbles on an answer sheet with a No. 2 pencil is one of the SAT’s most enduring rituals. Over the next two years, that will change.

Even though the SAT will be administered in a digital format, it won’t be available anywhere, anytime. The new SAT is not an online test that students can take whenever and wherever they choose. Instead, it will continue to be given either at designated testing locations on weekends or in schools during the week.

This is not the first time the SAT has changed.

The College Board’s decision to develop a brand-new digital exam is consistent with the changes the organization has made to the SAT Suite of Assessments over the past eight to ten years. In addition to the SAT, the College Board offers the PSAT 8/9, the PSAT 10, and the PSAT/NMSQT (National Merit Scholarship Qualifying Test). Students in the eighth, ninth, or tenth grade take the PSAT, a condensed version of the SAT. Most of the assessments in the College Board’s Suite of Assessments are already available in digital format, and have been for a while.

The College Board launched the SAT School Day program in 2010 in an effort to give low-income students better access to the exam. Although the initiative took some time to gain traction, the College Board now has contracts with 20 states (plus Washington, D.C.) to provide free SAT testing to high school juniors. Other states offer the test on school days to reach students who may lack the funds or transportation to take it on a Saturday. Some states require the SAT and use it as a standardized assessment. Many of the states that offer the School Day SAT already administer a digital version of the current test.

But the 2024 SAT is not simply new because it will be delivered digitally rather than on paper. Additionally, it will incorporate a number of format and design changes aimed at streamlining and simplifying the questions.

Hasn’t the College Board already made significant changes to the exam?

With the College Board implementing so many changes over the past 20 years, the parents of today’s students may not realize just how different the test is from the one they took.

After finally allowing calculators and introducing student-produced (i.e., not multiple-choice) questions in 1994, the College Board shifted its focus to writing in 2005. To reflect the importance of clear and effective writing, an essay was added in which students formulated and defended a thesis in response to a prompt. The total score changed from 1600 to 2400 points, with 800 points each for reading, math, and writing.

The new format, however, was short-lived. Driven by the vision of David Coleman, who took over as CEO in 2012, the College Board sought to develop a test that better reflected the actual work students were doing in high school. Frustrated by the way students had been able to boost their scores using test-taking “tricks,” Coleman said, “We are not interested in students just picking an answer, but justifying their answers.”

In place of its previous design, the College Board adopted a more college-readiness-focused approach in 2016.

The new test had four sections: Reading, Writing and Language, No Calculator Math, and Calculator Math, as opposed to multiple short sections for each subject.

All of the questions in the Reading section now related to passages provided in the test, and vocabulary questions, possibly the last remnant of an intelligence test, were eliminated.

The 1600-point scoring system was reinstated, with 800 points for Evidence-Based Reading and Writing (combining the Reading and the Writing and Language sections) and 800 points for Math (combining the two Math sections).

The essay was revised, but it was made optional. Students had to sign up in advance to take it, and essay scores were computed independently of other test results. The essay was completely eliminated from the test by 2021, though some states still use it for their own assessment procedures.

What modifications are being made to the actual SAT?

The SAT is undergoing significant change, but some things will stay the same. The digital SAT Suite will continue to measure the knowledge and skills that students are learning in school and that matter most for college and career readiness, according to Priscilla Rodriguez in an official video statement. The test will still be given only at a school or testing facility (not at home), and Khan Academy will continue to provide free online practice materials.

Students who require special accommodations will still receive them, as the College Board remains dedicated to making the exam accessible to all students. The familiar 1600-point scoring system will remain in place, and significant scholarship opportunities will still be based on student performance.

What precisely is changing with the SAT, then?

In the Reading section, the lengthy passages with 10–11 questions each will be replaced by shorter passages, each tied to a single question. The passages themselves will cover a wider range of subjects to better reflect what college students read.

Math questions will be less wordy, and the entire Math section will allow the use of calculators. Students may still bring their own SAT-approved calculators, but an app based on the online Desmos calculator will also be available.

From timing to navigation, everything about the digital design aims to make the test more user-friendly. Because the SAT will now last only two to two and a half hours, students will have more time to answer each question. The digital exam will also have an on-screen timer and a feature for marking questions for later review. And instead of waiting weeks for their results, students will receive them in days; score reports will also connect students with nearby two-year colleges, workforce development initiatives, and career options.

Why make such a large change now?

The SAT remains important, regardless of how colleges’ views on applicants’ test scores have shifted. Even after schools went test-optional in 2020 and Covid repeatedly forced test cancellations, the SAT remained popular among students.

Students want the option to submit their SAT scores.

In a poll conducted by the College Board, 83% of students said they would like the option to submit their test results to colleges.

Priscilla Rodriguez noted that despite an increasing number of colleges making the SAT optional, students still see the scores as a valuable addition to their applications: “Some students may decide their application is stronger without test scores, while others will benefit from sending them, including the hundreds of thousands of rural, first-generation, and underrepresented students whose SAT scores strengthen their college applications.”

Digital delivery makes the SAT available to more students.

More students will be able to take the SAT now that it will be offered in digital format. This benefits not only students but also the College Board, which aims to remain relevant in an academic culture that increasingly treats the SAT as an optional component of college applications. Because the new digital test will be considerably shorter than the current one, it will also be simpler to administer. Rather than being constrained by a set schedule, states, districts, and schools will have more options for when, where, and how frequently they administer the SAT.

The shift to digital is already underway, as some schools already offer digital SAT and PSAT versions. The new design will debut on a grand scale with the international SAT in March 2023. All PSAT tests will switch to the new digital format by the fall of 2023, and the entire SAT Suite of Assessments will be available digitally in the updated format starting in March 2024.

How will the updated SAT be more user-friendly?

Recognizing that most students already complete a sizable portion of their schoolwork on digital devices, the College Board has concentrated on students’ access to technology and SAT preparation materials. Through Khan Academy, students can access official practice exams without paying for costly resources or preparation courses.

Since many students use their own devices to practice for the SAT, taking the actual test on those same devices will feel more familiar than answering questions on paper. Students will access the actual SAT on their own laptop or tablet via a secure server, and the College Board will lend a device to any student who doesn’t have one on test day or whose school can’t provide one. And if the Wi-Fi goes down or a device dies, students will be able to plug in and reconnect without losing any time or work.

What are the main advantages of the SAT changes?

The College Board ran a global pilot of the new SAT in test centers in the US and abroad in November 2021. The test was overwhelmingly supported by students, educators, and test administrators, who praised its improved design and straightforward administration.

The examination will go more smoothly.

The check-in procedures will be more efficient than what is currently in place. Students won’t have to spend time filling out paper forms, and with no physical test materials, educators won’t have to deal with packing, sorting, or shipping. Students also won’t have to stress about remembering to pack pencils, calculators, or even digital devices, because all the materials are provided.

The examination will be more practical.

Positive student feedback on the test itself was particularly noteworthy. Students said that having the material available digitally was more convenient, citing the ability to switch between the questions, an online calculator, and a formula sheet. The built-in clock made it much simpler to keep track of the remaining time: “This reduced my stress as I knew I had enough time to think critically about the question rather than panicking and choosing a random option when I was unsure of what I had just read.”

Students clearly appreciated the feature allowing them to flag questions for review: “I loved that I could go back to questions that I had flagged, since usually on paper I take extra time to find the questions I had missed.” Test-takers also found the process of answering much more efficient than on the paper-and-pencil test, and said the ability to mark a response quickly and confidently was incredibly relieving.

The examination will be more efficient.

The updated questions and simplified structure drew further praise. The redesigned Reading section was well received: “I really liked reading the brief passages and responding to just one question. That lessened my anxiety.” English language learners found the reading passages “much better and more interactive than the long reading texts on the current SAT.” In the Math section, students were able to analyze and respond to questions that they felt “were shorter, made sense, and got straight to the point.”

The examination will be safer.

Finally, test administrators and proctors appreciated the simplicity and security of the new SAT. With no forms to complete, paper tests to package, or answer sheets to fill out, test center coordinators “didn’t have to spend another half hour at the test center just to make sure that things are done.” With the current paper-and-pencil test, an entire administration can be cancelled, or student scores disregarded, if even one test form is compromised.

With the SAT going digital, each student will also receive a distinct test form, making it nearly impossible to share answers.

Why are students advised to take the SAT?

Parents and students may understandably wonder why it is necessary to take the SAT since the vast majority of colleges and universities do not require SAT scores as part of the college application process. The SAT is not irrelevant just because it isn’t a requirement for an application.

The real question is not whether you must take the test, but what advantages your SAT scores will bring.

If the SAT is optional at the majority of colleges, why does it even matter?

According to the College Board, of the 1.7 million students in the class of 2020 who took the SAT, approximately 300,000 were from rural or small towns, 600,000 were first-generation college students, and 700,000 belonged to an African American or Latino racial or ethnic group. The SAT continues to “open doors for students of all backgrounds,” according to Rodriguez.

62% of students in the class of 2021 who took the SAT were able to do so for free on a weekday at their school, and independent studies demonstrate that universal testing during the school day increases the percentage of low-income students who enroll in college. Rodriguez draws on her own experience as an immigrant kid who arrived in the U.S. on a shoestring budget to highlight what the SAT means to these students: “I know how the SAT Suite of Assessments opened doors to colleges, scholarships, and educational opportunities that I otherwise wouldn’t have known about or had access to.”

Finally, the SAT remains an objective measure available to every student. With more than 25,000 high schools in the United States, colleges cannot possibly be familiar with all of them, let alone with every student in them. And as grade inflation continues to rise in American high schools, deciding which students to accept is becoming more difficult.

While high school grades remain the most widely accepted indicator of a student’s ability, the percentage of students graduating from high school with an A average has increased from 39% in 1998 to 55% in 2021. When more than half of students receive As, colleges need additional data points to make their decisions.

For many students, that means extracurricular activities, but clubs and sports are frequently expensive and out of reach for many families. Test results, when taken into account, can support a student’s GPA and reveal strengths beyond what grades alone suggest.

According to preliminary findings from the College Board’s pilot program, the new digital SAT will give students a better opportunity to demonstrate their knowledge and stand out on college applications.

Climate Change Essay

Academic Discipline: Geography
Course Name: Geography
Assignment Subject: Mitigating against climate change
Academic Level: Undergraduate-1st Year
Referencing Style: Harvard
Word Count: 2,119

Climate change is a man-made phenomenon that threatens the survival of local and global ecosystems. According to the United Nations Environment Programme (UNEP), “the global average temperature in 2019 was 1.1 degrees Celsius above the pre-industrial period”, a rise driven by higher levels of greenhouse gas (GHG) emissions and accompanied by extreme weather events of increasing magnitude and frequency. The international community has pioneered a range of climate change mitigation agendas, which stress the importance of a multi-stakeholder approach involving governments, corporations, households and individuals alike. The purpose of this expository research is to explore some of the climate change mitigation strategies that can be used at the governmental level, identify their relative advantages and disadvantages, and evaluate which strategy is optimal for governments. It will be argued that adherence to international climate change agendas and the imposition of a carbon tax are the most effective mitigation measures.

Climate Change Mitigation Strategies
Governments are among the leading stakeholders who can facilitate climate change mitigation, primarily because they have the political legitimacy and authority to implement policies that can constrain the behaviour of other social actors, such as corporations, households and individuals. The first broad climate change mitigation approach is the imposition of a carbon tax, which charges consumers for fossil fuel consumption. The carbon tax system has been used in developed and developing countries alike, including Canada, Japan, Mexico and South Africa (Lai 2021).

The first advantage of a carbon tax system is that it is a demand-side policy that seeks to address climate change by reducing the level of demand for fossil fuels. Imposing a tax on fossil fuel users therefore acts as a disincentive that can be applied across all consumer categories (Ionescu 2019). The second advantage of a carbon tax system is that it can be uniformly applied to all consumers, meaning that its potential for standardization increases its scope of application (Kutasi and Perger 2015). The third advantage of a carbon tax system is that it encourages users to adopt alternative energy forms in order to curtail the cost associated with fossil fuel consumption. This implies that the carbon tax can spur innovation by creating an incentive for users to find alternative, renewable forms of energy for their daily needs (Ionescu 2019).

Notwithstanding its advantages, the carbon tax is also fraught with challenges that may undermine its viability as a driver of climate change mitigation at the global level. First, the carbon tax is perceived as a regressive tax, in that it imposes the highest burden on low-income earners (Osman et al. 2021). That is, as fossil fuels become more expensive, low-income households that lack alternative sources of energy must spend a higher proportion of their disposable income on the tax, leaving them worse off than their more affluent counterparts. Second, the carbon tax does not adequately capture the environmental cost of pollution in either the immediate or the long term. The implication is that governments may be uncertain about the accurate tax rate, leading to a scenario where the tax may be too low or too high, both of which would negatively affect climate change mitigation (Withey et al. 2022). Third, the carbon tax system creates an incentive for tax evasion, such that wealthy individuals, households or corporations may decide to offshore carbon production to developing economies with less stringent carbon tax measures. The implication is that countries may fail to apply the carbon tax uniformly, thereby undermining the global climate change mitigation agenda.

The second broad strategy that has been used to combat climate change is the cap-and-trade policy. The system essentially provides polluters with a carbon or emissions quota, which they are legally bound to operate within. In instances where a polluter has emissions remaining on their quota, they are free to sell the unused portion to a third party who may have exhausted theirs (Rabe 2016). The cap-and-trade system is arguably preferred to the carbon tax system, primarily because it provides a degree of autonomy to users. The first advantage of the cap-and-trade system is that carbon consumers can meet their production needs while simultaneously adhering to the expected levels of emissions. In other words, corporations can maintain their production levels while keeping emissions at or below the legally mandated levels (Rabe 2016). The second advantage is that the system provides flexibility for carbon consumers, in sharp contrast to the tax regime, which constrains users more tightly. Third, the cap-and-trade system is beneficial because it provides a valuable revenue stream for governments, which can then redirect the financial resources towards other climate change mitigation projects (Liu and Li 2017).

Despite the range of benefits associated with the cap-and-trade system, this climate change mitigation approach also has limitations that warrant consideration. First, the system is arguably biased towards corporations and social actors with high revenues, as these actors can emit beyond their respective quotas by purchasing carbon credits from users with lower total emissions. Second and relatedly, the system disadvantages smaller actors who may lack the financial liquidity to purchase additional credits to meet their own production needs. Third, the system can be problematic in that it does not necessarily encourage the development of alternative energy resources, as corporations are still permitted to emit greenhouse gases (Zhang et al. 2021). The implication is that this approach is not a viable long-term solution for eliminating carbon dependency within the economy, as it simply puts a price on emissions. Finally, the cap-and-trade system can be problematic insofar as it increases the price of fossil fuels (Liu and Li 2017). The implication is that corporations may face higher operational costs in the medium to long term, which can negatively affect production and performance at the aggregate level.

The third broad strategy that governments can use to mitigate climate change is creating incentives for social actors to switch to renewable forms of energy. For instance, governments can fund renewable energy research and development (R&D); impose import restrictions on fossil fuels; and provide tax cuts, among other incentives, for corporations that are developing renewable energy (Michaelowa 2004). The first benefit of an incentive system is that it reduces the level of carbon dependency within the economy, thereby presenting a viable long-term avenue for climate change mitigation (Dale et al. 2020). The second benefit is that it fosters competition among the different actors, thereby creating multiple solutions that can be tailored to the specific needs of energy consumers and producers. The main disadvantage of an incentive system is that it does not necessarily affect current levels of GHG emissions and fossil fuel dependency, as consumers may continue relying on fossil fuels while renewable alternatives are developed. Another potential limitation is that the approach has high utility for industrialized and affluent countries, whose governments possess the financial resources to provide adequate incentives for renewable energy, but limited utility for developing countries, thereby undermining the ability to achieve sustainable development objectives at the global level (Sforna 2019).

The fourth climate change mitigation approach that governments can use is adherence to international sustainable development frameworks. The international community has made substantial progress in this regard through the establishment of the Sustainable Development Goals (SDGs). The first advantage of an international SDG framework is that it is tailored to the needs and realities of each country, such that the goals applicable to industrialized economies are not necessarily mirrored by those applicable to developing economies (Sachs 2012). This is beneficial because it enables each country to set its own climate change mitigation agenda based on a thorough understanding of its internal strengths, capabilities, resource endowments, and constraints. The second advantage of the SDG approach is that the goals are often measurable and tangible, which makes it easier to trace progress towards the identified goals (Sachs 2012). The third advantage is that the approach offers a comprehensive framework for combating climate change, as the attainment of SDGs necessarily requires cooperation and collaboration from a range of stakeholders in the country. For instance, in order to achieve the SDG of providing potable water to local communities, the stakeholders that can contribute to this objective include corporations, nongovernmental organizations, local communities, households and individuals. Finally, the SDG approach is beneficial because it aligns environmental responsibility with a range of social and human development indicators, thereby highlighting the relationship between environmental, human and economic development (Pandey 2017).

The SDGs have been lauded as one of the most important international agreements of the contemporary era. However, to date, no country has achieved all the stipulated objectives. The first limitation of this approach, therefore, is that it can be perceived as overly ambitious, which undermines its perceived validity in meeting sustainable development objectives (Sachs 2012). The second limitation is that most SDGs highlight structural problems faced by local economies yet lack any structural-level solution that can be leveraged to achieve the objectives. For instance, one of the SDGs is the elimination of poverty and hunger. The complexity of poverty in any context highlights the importance of structural-level factors, for instance social determinants of health, institutional processes of discrimination, and the marginalization of vulnerable populations. Against this backdrop, the SDGs still lack structural-level solutions, for instance policies that could reform the global capitalist system and consider the needs of vulnerable populations in the Global South.

Analysis
The focus of this analysis has been on evaluating climate change mitigation strategies, specifically those that can be implemented by governments. The research has identified four broad strategies: the imposition of a carbon tax; the introduction of a cap-and-trade system; the creation of incentives to improve the adoption of renewable energy alternatives; and adherence to international climate change mitigation accords such as the SDGs. Each of the identified strategies is associated with both advantages and disadvantages. The purpose of this section is to evaluate which of the alternatives is the best approach for government agencies, based on the discussion in the previous sections.

The objectives of a viable and comprehensive climate change mitigation approach should encompass both the reduction of current levels of emissions and the fostering of higher adoption rates for renewable energy alternatives. By this standard, the most inferior of the climate change mitigation solutions is the cap-and-trade system, because the approach not only permits continued carbon emissions (thereby failing to reduce the actual level of emissions) but also lacks any direct orientation towards the promotion of renewable energy resources (Liu and Li 2017). An equally inferior approach is the incentive system: while it fosters innovation and the development of alternative renewable energy resources, it is a medium- to long-term plan that fails to address current levels of GHG emissions.

Adherence to international frameworks such as the SDGs is laudable because it provides each country with a unique set of objectives that can be used to address climate change (Sachs 2012). The approach’s multi-stakeholder orientation is equally laudable, as it facilitates thorough implementation involving a broad base of social actors. The carbon tax system, though politically unpopular, has notable benefits that highlight its potential as a climate change mitigation tool. Specifically, the carbon tax is beneficial because it penalizes fossil fuel consumption in the immediate term while creating a disincentive that can propel social actors towards renewable forms of energy. The implication is that governments should invest in climate change mitigation approaches that are consistent with international goals, can reduce fossil fuel dependency in the immediate term, and create an incentive for the widespread adoption of renewable forms of energy.

Conclusion
In conclusion, the purpose of this analysis was to consider the range of climate change mitigation tools typically used by governments across the globe. The intent was to evaluate the advantages and disadvantages of each approach before recommending the best approach for governments. It was argued that governments should rely on measures that are consistent with international goals, can reduce fossil fuel dependency in the immediate term, and create an incentive for the widespread adoption of renewable forms of energy. This in turn highlights the carbon tax and the SDGs as the most appropriate and comprehensive approaches to climate change mitigation.

Reference List:

Dale, A., Robinson, J., King, L., Burch, S., Newell, R., Shaw, A. & Jost, F. 2020, “Meeting the climate change challenge: local government climate action in British Columbia, Canada”, Climate Policy, vol. 20, no. 7, pp. 866-880.

Ionescu, L. 2019, “Climate Policies, Carbon Pricing, and Pollution Tax: Do Carbon Taxes Really Lead to a Reduction in Emissions?”, Geopolitics, History and International Relations, vol. 11, no. 1, pp. 92-97.

Kutasi, G. & Perger, J. 2015, “Tax incentives applied against externalities: International examples of fat tax and carbon tax”, Society and Economy, vol. 37, pp. 117-135.

Lai, O. 2021, “What Countries Have A Carbon Tax?”. Retrieved May 10 2022 from https://earth.org/what-countries-have-a-carbon-tax/

Liu, H. & Li, Z. 2017, “Carbon Cap-and-Trade in China: A Comprehensive Framework”, Emerging Markets, Finance & Trade, vol. 53, no. 5, pp. 1152-1169.

Michaelowa, A. 2004, “Firms, Governments and Climate Policy: Incentive-Based Policies for Long-Term Climate Change”, International Affairs, vol. 80, no. 2, pp. 382-384.

Osman, M., Schwartz, P. & Wodak, S. 2021, “Sustainable consumption: What works best, carbon taxes, subsidies and/or nudges?”, Basic and Applied Social Psychology, vol. 43, no. 3, pp. 169-194.

Pandey, S. 2017, “The Road From Millennium Development Goals to Sustainable Development Goals by 2030: Social Work’s Role in Empowering Women and Girls”, Affilia, vol. 32, no. 2, pp. 125-132.

Rabe, B.G. 2016, “The Durability of Carbon Cap-and-Trade Policy”, Governance, vol. 29, no. 1, pp. 103-119.

Sachs, J.D. 2012, “From millennium development goals to sustainable development goals”, The Lancet, vol. 379, no. 9832, pp. 2206-2211.

Sforna, G. 2019, “Climate change and developing countries: From background actors to protagonists of climate negotiations”, International Environmental Agreements: Politics, Law and Economics, vol. 19, no. 3, pp. 273-295.

United Nations Environment Programme (UNEP). n.d., “Facts About the Climate Emergency”. Retrieved May 10 2022 from https://www.unep.org/explore-topics/climate-action/facts-about-climate-emergency

Withey, P., Sharma, C., Lantz, V., McMonagle, G. & Ochuodho, T.O. 2022, “Economy-wide and CO2 impacts of carbon taxes and output-based pricing in New Brunswick, Canada”, Applied Economics, vol. 54, no. 26, pp. 2998-3015.

Zhang, G., Zhao, L., Zhang, Q. & Zhang, Z. 2021, “Effects of Socially Responsible Behaviors in a Supply Chain under Carbon Cap-and-Trade Regulation”, Discrete Dynamics in Nature and Society, vol. 2021, pp.1-18.

Accounting for the First World War

Academic Discipline: History
Course Name: History
Assignment Subject: Accounting for the causes of WW1
Academic Level: High School-Grade 12
Referencing Style: Chicago
Word Count: 2,023

The First World War was historically unprecedented in terms of the advanced warfare tools, casualty rates, and involvement of multiple nations from different continents. Although the onset of the war is largely attributed to the 1914 Sarajevo assassination, the causes of the war pre-date the events of 1914. The purpose of this paper is to discuss the causes of the First World War, to ascertain the relative importance of cumulative factors that led to the war. It will be argued that while several factors contributed to the outbreak of war, the single most important factor was the alliance system.

The first major cause of the First World War was the alliance system. On one side, Germany, Austria-Hungary, and Italy formed the Triple Alliance; on the other, the Triple Entente comprised France, Russia, and Britain. The alliance system contributed to the onset of war for three main reasons. First, the system was based on distinct alliances or partnerships that dictated the unfolding of war, as an attack on one member of a faction was necessarily perceived as an attack on all its members (Alpha History 2021). Second, the alliance network divided the great powers into two distinct camps, essentially pitting the two groups against each other in the battle to control Europe and the rest of the world. Third, the alliance system was based on the mandate to protect mutual interests, which were in turn perceived as mutually exclusive with the strategic interests of the other camp (Alpha History 2021).

In addition to the alliance system, Europe was engulfed in an arms race in the years leading up to 1914. The arms race was characterized by the heavy militarization of both the Triple Alliance and the Triple Entente, each intending to create naval, air and standing army capacities that would rival and dominate the other (Maurer 1997 p.286). The arms race was facilitated by the industrial revolution and subsequent technological advancements, which paved the way for the manufacture of machine guns, dreadnought battleships and other advanced warfare tools. The arms race contributed to the onset of the First World War in three distinct ways. First, it produced war equipment with the inherent capacity to devastate entire populations and raze cities to the ground, which increased the likelihood of a total war among the belligerents. Second, the arms race was fuelled by a spirit of competition, thereby intensifying the rivalry between the Triple Alliance and the Triple Entente (Maurer 1997 p.286). Third, the arms race provided each faction with a legitimate prospect of victory over the other, thereby exacerbating tension, as neither party was likely to back down from a military confrontation with the other.

The third contributing factor to the First World War was the influence of nationalism and the empire building mentality among European statesmen. Since the unification of Italy and of Germany in the late 19th century, both countries had expressed an intent to maintain their territorial empires within Europe, especially by staving off potential attacks from already established powers such as Britain and France (Williamson 1988 p.795). Nationalism and the empire building mentality contributed to the onset of war in four major ways. First, they created a hegemonic power struggle between Britain and France on the one hand and Germany and Italy on the other, as the latter powers sought to increase their diplomatic and economic importance in Europe and beyond. Second, nationalism fuelled the sentiment that each country was virtually on its own, and that protecting its strategic interests was paramount to maintaining relevance in the restructured European order. Third, the empire building mentality informed the sentiment that each country had to expand its geographic boundaries to improve its prospects of staving off an attack from real and perceived enemies. Finally, nationalism and empire building both contributed to the balance-of-power crisis that Europe experienced at the time, as a range of competing countries with comparable economic and military strength vied for the opportunity to dominate Europe.

The fourth factor contributing to the eruption of war was imperialism. All major countries in both the Triple Entente and the Triple Alliance had overseas empires that were useful in fuelling the industrialization drive in their respective domestic economies, in addition to providing critical resources that could be used for armament among other initiatives. Imperialism’s first contribution to the war was intensifying the rivalry and hostility between the European powers, as illustrated, for instance, by the fight for territories in the Scramble for Africa (Weinstein 2000 p.11). Second, imperialism fed the notions of nationalism and empire building, thereby providing an additional incentive to maintain the tension and hostility between European powers. Third, imperialism enabled the European great powers to amass resources from their colonies, which were used for industrialization and the manufacture of advanced weapons. Therefore, in the absence of colonies and imperialism, industrialization may have occurred much later, which would have also ostensibly delayed the onset of war.

An arguably more proximate cause of the war was the Balkan crisis. Prior to 1914, Austria-Hungary had tension with Serbia over the latter’s desire to unify all Slavic people, including some populations residing in Austria-Hungary (Alpha History 2021). Austria-Hungary perceived the Serbian intent as a direct assault on its nationalism and empire building imperatives, increasing the tension between the two in the region. With the assassination of Archduke Franz Ferdinand by a Serbian extremist, Austria-Hungary capitalized on the opportunity to redress its grievances with Serbia. The Balkan crisis was therefore not merely an ongoing tension between Austria-Hungary and the Balkan states, but the final impetus for war.

This paper has thus far argued that the First World War occurred because of cumulative factors, specifically the alliance system; the arms race; nationalism and empire building sentiments; imperialism; and the Balkan crisis. The next important question is which of the identified factors can be deemed most influential in leading to the First World War. This section will argue that the alliance system was the single most important contributor, primarily because none of the other identified factors was a sufficient cause that could single-handedly lead to the eruption of war. The relative importance of each factor will be considered separately, prior to defending the identified thesis.

As noted earlier, the arms race was critical in setting the stage for war, as it created the impression that each country had the capability not only to stave off an attack from an enemy, but also to successfully launch an attack for its own strategic purposes. However, the arms race was insufficient as a single cause of the war. Both historically and in the contemporary context, countries have always had different levels of armament, and the imperative to create a military force that can quell an external attack has existed since time immemorial. The implication is that the arms race was necessary insofar as it provided the tools for war, but it remained insufficient to cause the war between the European powers.

Similarly, it was established that nationalism and empire building sentiments were critical in driving the tension that eventually erupted into war. However, like the arms race, nationalism and empire building were not stand-alone causes of the war. First, nationalism is arguably an important aspect of any country, both historically and in the contemporary context, as it provides the basis for national unity, a sense of belonging and national pride. Nationalism itself, therefore, did not cause the war. Instead, the leading statesmen in both the Triple Alliance and the Triple Entente manipulated nationalistic sentiment to create a zero-sum game in Europe. That is, nationalism was used as a vehicle to fuel the othering of external societies, to the point where nationalism was perceived as synonymous with defending one’s country against enemies. By the same token, empire building would not necessarily have led to war, primarily because each great power had a subtle respect for and acceptance of the empire building ambitions of the others. For instance, Germany’s unification process led to its acquisition of Alsace-Lorraine from France. Although France resented the cession of its territory after the Franco-Prussian war, this was insufficient to make France declare war on Germany to regain it. The implication is that empire building exacerbated tension among the great powers but did not single-handedly lead to the eruption of war.

Imperialism was equally important insofar as it exacerbated tension and hostility among the European powers, especially where they vied for control over the same colonial territory. The first reason imperialism did not single-handedly lead to war is that it had existed for decades prior to the First World War, suggesting that, at the very least, it was not a proximate cause. Second, imperialism was an all-European affair, in that each country was able, and effectively permitted, to annex foreign territories to fuel industrialization or other domestic agendas. For instance, although Britain and Germany were virtually enemies by 1914, both held territories in Africa that were acknowledged and respected as such despite the rivalry between them. This demonstrates that imperialism was not a zero-sum game, and the ability of each European power to practice it implies that it was not a proximate cause of the war.

Of all the factors identified as contributing to the onset of war, the one that could have single-handedly led to its eruption was the alliance system. First, the alliance system created a reality in which an attack on one was perceived as an attack on all, meaning that any skirmish had the potential to bring all of Europe to war. Second, the Balkan crisis and the Sarajevo assassination both revealed the significance of the alliance system in leading Europe to war: Austria-Hungary secured the support of Germany, while Serbia received the protection of Russia, prior to any official declarations of war (Alpha History 2021). In other words, the Sarajevo assassination assumed such importance because the conflicting parties were assured of the intervention of their allies should they engage in warfare, underscoring the significance of the alliance system in leading to war. Finally, the alliance system was important because the war came to involve most of Europe, and eventually the United States and Japan; in the absence of the two camps, war might still have occurred, but on a much smaller scale. The magnitude of the war and the involvement of multiple parties demonstrated that the alliances created formal allegiances that had to be honored, drawing in multiple countries and setting the stage for a historically unprecedented conflict.

In conclusion, this paper sought to identify the causes of the First World War and to isolate at least one cause that could have unilaterally resulted in the war. It was argued that the main factors contributing to the onset of war included the arms race; the political significance of nationalism and empire building sentiments; imperialism; the alliance system; and the Balkan crisis. Moreover, the paper argued that of all the identified factors, the single most important determinant of the war was the alliance system. It is important to stress that each of the identified factors was significant in leading to the First World War, and that the complexity of the war is such that it cannot be attributed to any single factor. This paper demonstrated that while all the identified factors contributed to the eruption of war, the alliance system was the most significant culprit. In its absence, war may still have broken out between some great powers, but its magnitude and effects would have been on a far smaller scale. The alliance system forged networks of allegiance that had to be honored, which explains why the First World War involved so many countries, was fought in many countries, and produced more than 20 million casualties.

Bibliography
Alpha History. 2021. “Alliances as a Cause of WW1.” Accessed May 10, 2022. https://alphahistory.com/worldwar1/alliances/

Maurer, John H. 1997. “Arms Control and the Anglo-German Naval Race before World War I: Lessons for Today?” Political Science Quarterly 112 (2): 285-306.

Weinstein, Jeremy M. 2000. “Africa’s ‘Scramble for Africa’: Lessons of a Continental War.” World Policy Journal 17 (2): 11-20.

Williamson, S. R. 1988. “The Origins of World War I.” The Journal of Interdisciplinary History 18 (4): 795–818.

Stress Management Strategies for Nurses: Expository Essay

Academic Discipline: Nursing
Assignment Subject: Stress management strategies for nurses
Academic Level: Undergraduate-2nd Year
Referencing Style: MLA
Word Count: 2,055

The COVID-19 pandemic demonstrated the physical strain and peril that nurses expose themselves to in providing high quality care for patients. However, what has been more subtle is the psychological toll that nurses experience through their profession, which can create health related challenges for the nurses, dilute the quality of care provided to patients, undermine best practices in the nursing profession, and lead to burnout among other undesirable realities (Lopez-Lopez et al., 1032). Stress management is therefore a critical imperative for the nursing profession, as it can equip nurses with the necessary tools to manage the psychological burden associated with their profession. The purpose of this paper is to explore some of the stress coping strategies that nurses can pursue, thereby providing some practical utility to the nursing profession, in addition to individuals who experience stressful working, living and learning environments. Moreover, the paper will evaluate how healthcare institutions can facilitate productive stress management techniques among nurses.

The first strategy that nurses can use to cope with stress is mindfulness, which can take the form of a reflective diary or meditation. According to research, mindfulness helps individuals cope with stress by making them fully aware of their environment and themselves, so that they desist from overreacting to stimuli or feeling overwhelmed (Lee et al., 87). Nurses can adopt mindfulness not only to become fully immersed in their roles and duties, but also to maintain level-headedness when confronted with stress triggers. Keeping a diary can also improve nurses’ self-awareness, making them more attuned to their own stress triggers, their current coping strategies, the impact of those strategies on their ability to perform expected duties, and the possibility of adopting alternative coping strategies that would enhance their wellbeing (Lee et al., 87).

Healthcare institutions can facilitate mindfulness training as an institutional practice, or offer individual nurses mindfulness resources. With regard to the former, healthcare institutions can offer seminars, talks, reading materials, and/or learning modules for their nursing staff, thereby standardizing mindfulness training as a key component of guaranteeing nurses’ mental and physical safety (Mahon et al., 572). In relation to the latter, healthcare institutions can offer mindfulness training packages upon request, on the rationale that some nurses may prefer alternative stress coping strategies and should be permitted to choose the approach that resonates best with their personality, background and other personal factors.

The second strategy that nurses can use to cope with stressful work environments is making positive lifestyle choices. Specifically, nurses should be encouraged to maintain a healthy diet, exercise regularly, and ensure they have adequate time to rest and sleep (Pelit-Aksu et al., 1095). Research has identified fatigue as one of the most common causes of poor performance in the healthcare sector, which underscores the importance of a healthy lifestyle in the nursing profession (Barker & Nussbaum, 1370). The significance of a healthy lifestyle is that it alleviates fatigue-related determinants of stress while ensuring that nurses maintain positive health outcomes that improve their ability to administer quality care to patients. The main potential drawback of this approach is that nurses often have busy and/or unpredictable schedules, which can undermine their ability to establish an exercise routine, among other healthy habits (Barker & Nussbaum, 1370).

Healthcare institutions can facilitate the adoption of healthy lifestyles among nurses in several ways. First, institutions can offer important information on dietary guidelines, which nurses can then tailor to personal tastes and preferences. Second, healthcare institutions can invest in building gyms and other recreational facilities on site, enabling nurses to exercise around their schedules with maximum convenience. Third, institutions can provide nurses with access to dieticians and personal trainers, who can collaborate with nurses to create plans that fit each individual’s goals and schedule. Whichever approach is taken, it is important to stress that healthcare institutions can play a useful role in creating supportive environments that improve nurses’ ability to maintain healthy lifestyle patterns as a strategy for coping with stressful working environments.

The third strategy that nurses can use to manage stress is freely communicating their challenges to the matron, chief nursing officer, or chief nursing executive. The rationale for this approach is that nurses often feel overwhelmed by practices or expectations that are within the control and purview of the matron, implying that the latter can offer assistance or advice that is useful to the nurse. For instance, during the pandemic, a common challenge experienced by nurses was the burden of working overtime, necessitated by the high demand for healthcare services within the general population (Aggar et al., 91). Qualitative research indicated that nurses who communicated their experiences and feelings to the matrons coped better with stress, as the matrons were often able either to remove the stressor entirely (for example, by hiring more temporary nurses) or to offer advice that improved the nurses’ coping mechanisms (Aggar et al., 91; Zhang et al., 1584).

The establishment and maintenance of open communication channels between nurses and matrons is contingent on the policies adopted by the healthcare institution in question. First, it is imperative to appoint matrons who are interested in the welfare and wellbeing of nurses, implying that there should be a balance between their supervisory responsibilities and administrative duties. Second, it is important to ensure that nurses are actively encouraged to seek support, guidance and assistance from the matron, for instance through roundtable sessions where nurses openly communicate with the matron on a range of issues affecting their professional or personal wellbeing. Third, it is important that the upper levels of the nursing hierarchy adopt a leadership style that is conducive to open communication with the nursing staff. According to the literature, a transformational leadership style would be useful, as it is contingent on empowering followers to take initiative and assume an active role in resolving challenges faced within the institution (Echevarria et al., 168). Similarly, a servant-oriented leadership style would be equally useful, as it places the needs of the followers (nurses, in this instance) as a priority for the leadership (Kul et al., 1168). Moreover, qualitative research supports the assertion that the leadership style adopted by the upper hierarchy in healthcare institutions is a critical determinant of organizational culture, which in turn has a direct or indirect influence on the health and wellbeing of all organizational members (Echevarria et al., 168).

The fourth strategy that nurses can adopt to cope with stress is seeking peer support. According to Li et al., the challenges faced by nurses over the course of performing their duties are often shared by their fellow nurses (204). This implies that seeking peer support would be useful, as a nurse can benefit from feeling understood and from receiving actionable advice that can be used to alleviate stress (Li et al., 204). Peer support is also useful because it demonstrates that the nurse is not experiencing unique or novel stressors in the workplace environment, thereby potentially reducing feelings of helplessness and of being overwhelmed. Peer support is equally important because it fosters a sense of community within the nursing staff, thereby improving the likelihood of positive communication and interaction, which can also alleviate stress.

Healthcare facilities can actively promote peer support for nurses in a variety of ways. First, institutions are well placed to pair newly recruited nurses with experienced nurses, creating an automatic peer support system for new staff. Second, institutions can host seminars or talks in which more experienced nurses discuss work-related challenges and issues with junior nurses, thereby also providing a form of peer support. Third, healthcare facilities can actively encourage the creation of peer support groups within the hospital, where nurses have the prerogative to join a group that suits their personal expectations or preferences. Finally, healthcare institutions can seek input from the nurses on what kind of peer support system should be established, for instance by administering a survey among all nurses working at the institution.

The fifth strategy that nurses can use to cope with stressful environments is relying on counselling or therapy services. Research has noted that, on the one hand, nurses are expected to project a demeanour of confidence, assurance, professionalism and competence in order to carry out their expected duties and responsibilities (Aggar et al., 92). On the other hand, both personal and professional environments can create feelings of insecurity, fear, and uncertainty, which nurses are typically encouraged to hide or manage privately (Aggar et al., 92). The implication is a possible dissonance between the emotional realities that are publicly disclosed and those that are privately experienced. As such, counselling or therapy can be a useful and effective coping strategy, as it provides a space in which nurses can openly discuss feelings or thoughts that are antithetical to the professional demeanour they are expected to maintain. In counselling, nurses can discuss a range of factors with the assurance that doing so will not affect how they are treated or perceived at work.

Healthcare institutions can facilitate this coping strategy in several ways. First, they can offer counselling services for nurses within the hospital; ideally, these services would be provided by an external third party, which would improve utilization rates among the nurses. Second, healthcare institutions can offer financial support for nurses who prefer an external counsellor, for instance by shouldering some percentage of the cost or by providing medical benefits and/or insurance coverage. Third, healthcare institutions can provide support by addressing the stigma attached to nurses displaying 'unprofessional' emotions or thoughts. That is, it would be beneficial to create an environment where nurses feel supported in experiencing human feelings, emotions and realities, without this being framed as harmful to their ability to perform their expected duties and responsibilities. This approach may be useful insofar as it enables nurses to openly seek counselling or therapy to discuss negative emotions that can affect their performance and wellbeing.

The final strategy that nurses can use to cope with stress is maintaining a work-life balance. Over the course of the pandemic, this was arguably difficult to establish, as most nurses had to work overtime and dealt with the ramifications of the pandemic in both professional and personal contexts. However, the maintenance of a work-life balance is important in that it enables nurses to leave work-related stressors in the workplace, such that they can focus on rest and relaxation in their home environments (Lee et al. 93). A work-life balance is equally important because it enables the nurse to establish clear boundaries that can be useful in maintaining a less stressful existence. Healthcare institutions can play a pivotal role in supporting nurses' establishment of a work-life balance. First, institutions can respect the nurses' home environment by refraining from calling them in to work or discussing work-related issues while they are at home. Second, institutions can offer mandated 'home' periods, whereby the nurse is expected to take at least three weeks at home every six months. Third, institutions can facilitate work-life balance by encouraging nurses to limit work-related communication (for instance emails, texts and notifications) while they are at home.

In conclusion, the purpose of this essay was to examine some of the stress coping strategies that can be used by nurses. The importance of stress management is pronounced within the nursing profession, as work-related responsibilities and demands can often create stress triggers that affect performance in addition to undermining physical, emotional and psychological wellness. The paper argued that nurses can manage stress through mindfulness; the maintenance of a healthy lifestyle; the establishment of communication channels with the upper hierarchy of the nursing staff; peer support; therapy and counselling; as well as maintaining a work-life balance. The paper also argued that healthcare institutions should play a leading role in providing resources and/or creating an environment that is conducive to the adoption of positive stress coping strategies within the nursing profession.

Works Cited
Aggar, Christina, et al. "The Impact of COVID-19 Pandemic-Related Stress Experienced by Australian Nurses." International Journal of Mental Health Nursing, vol. 31, no. 1, 2022, pp. 91-103.

Barker, Linsey M., and Maury A. Nussbaum. “Fatigue, Performance and the Work Environment: A Survey of Registered Nurses.” Journal of Advanced Nursing, vol. 67, no. 6, 2011, pp. 1370-1382.

Echevarria, Ilia M., Barbara J. Patterson, and Anne Krouse. “Predictors of Transformational Leadership of Nurse Managers.” Journal of Nursing Management, vol. 25, no. 3, 2017, pp. 167-175.

Kül, Seval, and Betül Sönmez. “The Effect of Nurse Managers’ Servant Leadership on Nurses’ Innovative Behaviors and Job Performances.” Leadership & Organization Development Journal, vol. 42, no. 8, 2021, pp. 1168-1184.

Lee, Jong-Hyun, Jaejin Hwang, and Kyung-Sun Lee. “Job Satisfaction and Job-Related Stress among Nurses: The Moderating Effect of Mindfulness.” Work: Journal of Prevention, Assessment & Rehabilitation, vol. 62, no. 1, 2019, pp. 87-95.

Lee, Meng H., et al. “A Longitudinal Study of Nurses’ Work-Life Balance: A Case of a Regional Teaching Hospital in Taiwan.” Applied Research in Quality of Life, vol. 17, no. 1, 2022, pp. 93-108.

Li, H., et al. "The Effect of a Peer-Mentoring Strategy on Student Nurse Stress Reduction in Clinical Practice." International Nursing Review, vol. 58, no. 2, 2011, pp. 203-210.

López-López, Isabel M., et al. "Prevalence of Burnout in Mental Health Nurses and Related Factors: A Systematic Review and Meta-Analysis." International Journal of Mental Health Nursing, vol. 28, no. 5, 2019, pp. 1032-1041.

Mahon, Marie A., et al. “Nurses’ Perceived Stress and Compassion Following a Mindfulness Meditation and Self Compassion Training.” Journal of Research in Nursing, vol. 22, no. 8, 2017, pp. 572-583.

Pelit-Aksu, Sıdıka, et al. "Effect of Progressive Muscle Relaxation Exercise on Clinical Stress and Burnout in Student Nurse Interns." Perspectives in Psychiatric Care, vol. 57, 2020, pp. 1095-1102.

Zhang, Meng, et al. "Influence of Perceived Stress and Workload on Work Engagement in Front-Line Nurses during COVID-19 Pandemic." Journal of Clinical Nursing, vol. 30, 2021, pp. 1584-1595.

Nurses’ Potential in Combating Eating Disorders Among Adolescents

Academic Discipline: Nursing
Assignment Subject: Nurses’ potential in combating eating disorders among adolescents
Academic Level: Undergraduate-2nd Year
Referencing Style: APA
Word Count: 2,003

Social media platforms have become increasingly important in the contemporary context, as they represent a highly efficient avenue for real-time interaction and communication. Social media platforms are predominantly used by the younger demographic cohort, which has important implications for their psychological wellbeing and development (Barry et al., 2017). Notwithstanding the stated benefits of social media, the platforms are also associated with harmful effects among younger users. The focus of this analysis is on one of the more pernicious of these effects, specifically the promulgation of messages that encourage the development of body image dysmorphia among adolescents. The importance of this topic is that body dysmorphia is arguably the leading cause of the development of eating disorders, which have a disproportionate impact on adolescent females compared to any other demographic group (Jarman et al., 2021).

This essay seeks to discuss the strategies that nursing professionals can use to prevent and/or address the underlying reasons for the social-media-induced eating disorders that are prevalent among adolescents. To this end, the next section of the paper presents a brief overview of the research scope, specifically noting the relational dynamics between social media, body dysmorphia, and the development of eating disorders among adolescents. This is followed by a discussion of the strategies that nursing practitioners can use to assist teenagers dealing with eating disorders. The final section concludes with a recapitulation of the main arguments and insights. It will be argued that nursing practitioners can assist teenagers with body dysmorphia by leveraging their professional networks; disseminating information about the consequences of body dysmorphia and eating disorders; empowering teenagers to become more comfortable with their natural body types; enabling teenagers to become cognizant of the detrimental effects of both body image dysmorphia and eating disorders; as well as collaborating with other professionals in the healthcare sector, such as dieticians and counsellors, to provide a viable action plan for teenagers experiencing eating disorders and body image dysmorphia.

Contextual Overview: Social Media, Body Dysmorphia and Eating Disorders
Social media can be likened to most forms of popular culture because of its capacity to expose users to a range of lifestyle choices, patterns and behaviours that can influence perceptions about self, others and the environment (Jarman et al., 2021). Adolescent social media users often follow celebrities, social media influencers, and other pseudo-celebrity users who can affect how they perceive themselves. The beauty ideals that are typically parroted and promoted on social media platforms are consistent with the thin body type (specifically for women), which is not unique to social media but has also been noted for other Western popular culture mediums (Danthinne et al., 2022; Milkie, 1999). It is therefore commonplace for adolescents to compare themselves with the beauty images, ideals and standards they observe on social media, which can inadvertently lead to the development of body image dysmorphia.

Body image dysmorphia can be defined as feelings of intense dissatisfaction with one’s physical appearance. In the contemporary context, social media has been highlighted as one of the most significant exposure platforms that lead to the development of body image dysmorphia among adolescents (Danthinne et al., 2022). This is because social media is perceived as an accurate projection of the prevalent social norms for beauty, influencing younger users to adhere to the expected beauty standards in order to fit in (Fardouly & Holland, 2018). As noted earlier, the dominant beauty ideal is that of a thin body type, which alienates girls and women who may not naturally fit with that body type.

One of the most detrimental effects of body image dysmorphia is the development of eating disorders (Marks et al., 2020). This is especially applicable to adolescents who may be overweight or obese, such that they are pressured to reduce their body weight drastically in order to fit within the parameters of Western beauty ideals. Consequently, a substantial number of teenage social media users develop conditions such as bulimia and anorexia nervosa, which have severe physical, emotional, and psychological health implications (Marks et al., 2020). According to recent research, an estimated 2.7% of teenagers in the United States have an eating disorder (including bulimia and anorexia nervosa) (Polaris Teen Centre, 2018). Moreover, eating disorders can be associated with other comorbidities that can increase the proclivity towards suicidal ideation, among other psychological issues (Marks et al., 2020). The reality that adolescent social media users tend to develop body image dysmorphia, and subsequently eating disorders, highlights the significance of this research topic.

Typical Strategies to Mitigate Eating Disorders
According to existing research, eating disorders are typically perceived as psychological problems (Goodyear, 2020; Marks et al., 2020). Consequently, the most cited approach to addressing the challenges of body image dysmorphia and eating disorders centres on psychological intervention. Specifically, teenagers experiencing eating disorders are often encouraged to participate in cognitive behavioural therapy (CBT), which aims to address the problematic thought patterns that justify the development of the eating disorder (Yaşar et al., 2019). Other scholars have noted that eating disorders emerge because of exposure to popular culture mediums that promote the Western beauty ideal, intimating that this can be redressed by including other body types and encouraging popular culture to refrain from shaming women who do not subscribe to the Western ideals (Milkie, 1999).

Regardless of whether eating disorders are attributed to underlying psychological issues or to disproportionate exposure to Western beauty ideals, the nursing profession is often excluded from the practitioners considered capable of combating eating disorders among adolescents. That is, nurses are not perceived as critical actors who can intervene against the development of eating disorders among adolescents. The rest of this paper seeks to challenge this assumption by demonstrating the ways in which nursing practitioners can intervene to challenge both body image dysmorphia and eating disorders among teenagers. The utility of this discussion lies in improving the conceptualization of eating disorders by highlighting the value of including nurses among potential interventionists.

Nursing Professional Interventions
Nursing practitioners are often the first point of contact with the public, specifically in the context of primary healthcare facilities. Moreover, most schools are now mandated to have healthcare professionals on their premises, which further underscores the importance of nurses as potential interveners against body-dysmorphia-induced eating disorders. In addition, nursing professionals can collaborate with other healthcare professionals in devising a viable intervention for adolescents with eating disorders, which further highlights their potential importance in alleviating some of the negative consequences of social media exposure among adolescents.

The first strategy that nursing professionals can use is to thoroughly understand the development of the eating disorder, ideally from the vantage point of the teenager. That is, one of the best nursing practices in the contemporary context is the ability to practice through a patient-centric approach that perceives the patient as capable of relaying the challenges and issues they are currently facing (Lindgren et al., 2020). In either the primary care or school setting, the nursing practitioner can obtain a detailed patient history that notes when and why the eating disorder emerged, and the reasons the adolescent has struggled to cope with body image dysmorphia. A patient-centric approach would be useful because it empowers the patient to understand their own reality, which will eventually lead to the development of a sense of ownership that can be used to address the problematic behaviour (Lindgren et al., 2020).

Second, nursing practitioners can encourage adolescents to maintain a reflective journal documenting their thoughts, feelings and emotions in the aftermath of spending time on social media platforms. This would be especially useful in a school environment and would enable the adolescents to become cognizant of how social media exposure can lead to the development of body image dysmorphia. The second rationale for this strategy is that it would provide an opportunity to highlight negative thoughts and comparison tendencies, thereby redressing the unhealthy psychological processes that also lead to the development of body image dysmorphia in the aftermath of social media exposure (Yaşar et al., 2019). Consequently, teenagers would be better placed to find positive coping strategies that do not further jeopardize their health outcomes.

The third approach that nurses can use to combat eating disorders among teenagers is the dissemination of information about eating disorders, focusing on their immediate and long-term health consequences. Although social media can play a role in the development of eating disorders, it does not possess resources that can be effectively used to discourage users from developing them (Milkie, 1999). As credible healthcare professionals, nurses have accurate and evidence-based knowledge of the impact of eating disorders on adolescents. They can therefore disseminate information to ensure that teenagers are aware of the effects of eating disorders, discouraging them from responding to body image dysmorphia in extreme ways.

The fourth strategy that nursing practitioners can use is to expose adolescents to 'successful' role models who may not necessarily resonate with the Western beauty ideal of a thin body type. The rationale for this approach is that it would demonstrate that body image is not a prerequisite for social success or acceptance, which some adolescents may be unaware or sceptical of (Goodyear, 2020). Nursing professionals would be useful in designing comprehensive programs that can eventually be integrated within healthcare policy to mitigate eating disorders among adolescents. Nurses may be well placed to actively participate in this approach because they meet a high volume of patients and professionals through their work. Moreover, this recommendation would be consistent with the best nursing practice of advocacy for the patient (Lindgren et al., 2020). That is, the nurse would be advocating for the adolescents by leveraging their professional networks in order to provide adolescents with alternative conceptualizations of beauty, success, social acceptance and desirability.

Another strategy that nurses can use to assist teenagers with eating disorders is to collaborate with other healthcare professionals, for instance dieticians and therapists. Dieticians would be useful in generating an appropriate dietary plan that is tailored to the specific goals of the adolescent while maintaining the nutritional balance that reduces the possibility of developing eating disorders. Similarly, therapists would be helpful in providing counselling services that can assist the teenager in understanding the roots of the body dysmorphia and eating disorder, which would then facilitate the development of a mitigation strategy. Nursing professionals can play an instrumental role in bringing together different healthcare professions to combat this issue, thereby highlighting the potential utility of nurses in assisting teenagers with body image dysmorphia.

In conclusion, the purpose of this analysis was to evaluate the extent to which the nursing profession can be actively engaged in combating eating disorders among adolescents. The first section of the analysis demonstrated that social media use is often problematic, especially among teenage girls, as they tend to develop body dysmorphia because of exposure to Western beauty ideals that revolve around the thin body type. It was also noted that most strategies for combating body dysmorphia and eating disorders focus on involving psychological healthcare practitioners, often at the expense of nurses. The paper argued that, contrary to conventional wisdom, nurses can play an important role in mitigating the development of eating disorders among teenagers. The necessary qualification that contextualizes the arguments herein is the active engagement of nurses in learning environments, as well as their prioritization as the first point of contact in primary healthcare facilities. The paper thus argued that nurses can adopt strategies that are consistent with best practices, for instance adopting a patient-centric approach to care delivery, empowering patients across all stages from diagnosis to treatment, and pioneering collaboration within the healthcare profession to assist teenagers with eating disorders. Moreover, it was argued that nurses can be useful in disseminating evidence-based and practical information; challenging the conventional Western beauty ideal of the thin body type; encouraging teenagers to actively think about the subliminal messages conveyed through social media platforms; and creating positive dietary and exercise regimes that are compatible with the needs, expectations or desires of teenagers experiencing body dysmorphia and subsequently developing eating disorders.

References
Barry, C. T., Sidoti, C. L., Briggs, S. M., Reiter, S. R., & Lindsey, R. A. (2017). Adolescent social media use and mental health from adolescent and parent perspectives. Journal of Adolescence, 61, 1-11.

Danthinne, E. S., Giorgianni, F. E., Ando, K., & Rodgers, R. F. (2022). Real beauty: Effects of a body-positive video on body image and capacity to mitigate exposure to social media images. British Journal of Health Psychology, 27(2), 320-337.

Fardouly, J., & Holland, E. (2018). Social media is not real life: The effect of attaching disclaimer-type labels to idealized social media images on women’s body image and mood. New Media & Society, 20(11), 4311-4328.

Goodyear, V. (2020). Narrative matters: Young people, social media and body image. Child and Adolescent Mental Health, 25(1), 48-50.

Jarman, H. K., Marques, M. D., McLean, S. A., Slater, A., & Paxton, S. J. (2021). Motivations for social media use: Associations with social media engagement and body satisfaction and well-being among adolescents. Journal of Youth and Adolescence, 50(12), 2279-2293.

Lindgren, B., Molin, J., & Graneheim, U. H. (2020). Balancing between a person-centred and a common staff approach: Nursing staff’s experiences of good nursing practice for patients who self-harm. Issues in Mental Health Nursing, 42(6), 564-572.

Marks, R. J., De Foe, A., & Collett, J. (2020). The pursuit of wellness: Social media, body image and eating disorders. Children and Youth Services Review, 119, 1-8.

Milkie, M. A. (1999). Social comparisons, reflected appraisals, and mass media: The impact of pervasive beauty images on black and white girls’ self-concepts. Social Psychology Quarterly, 62(2), 190-210.

Polaris Teen Centre. (2018). 10 statistics of teenage eating disorders. Retrieved from https://polaristeen.com/articles/10-statistics-of-teenage-eating-disorders/

Yaşar, A. B., Abamor, A. E., Usta, F. D., Taycan, S. E., & Kaya, B. (2019). Two cases with avoidant/restrictive food intake disorder (ARFID): Effectiveness of EMDR and CBT combination on eating disorders (ED). Klinik Psikiyatri Dergisi: The Journal of Clinical Psychiatry, 22(4), 493-500.

Information boom or information crisis: The impact of digitization on news

Academic Discipline: Media and Communications
Course Name: Media and Communications
Assignment Subject: Information boom or information crisis: the impact of digitization on news credibility
Academic Level: Undergraduate-2nd Year
Referencing Style: MLA
Word Count: 1,898

With the rise of Web 2.0 – meaning the interactive Web where content is user-generated (Buhler et al. 215) – there has been a modification in the production and distribution of information, and thus in the overabundance (Anderson 52) and diversity (Buschow and Suhr 385) of information available. The positive outcome of this is that society now has access to more information sources, in comparison to the previously highly hierarchical information (Schapals, Maares and Hanusch 22) produced by a small and exclusive group of actors in the press industry (Hermida and Young 96). At the same time, the subsequent shift towards an abundance of information that was accessible to a much larger pool of contributors (Buhler et al. 218) has resulted in some negative outcomes, including information that is opinionated, contestable, hard to verify (Fisher 21), and in some cases purposely manipulative (Heinrich 179). In this paper, it will be argued that the digitization of news has not only ushered in a remarkable means of mobilization and action, but has also made a significant contribution to news and information dissemination that counterbalances the traditional mass media.

The digitization of news
In order to understand the reality of the processes of production and circulation of information, the material conditions that organize them need to be considered. This refers to the major distribution channels that direct the circulation of news (Parcu 103) and the main players taking part in the international flow of information (Anderson 53). Taking into account the socio-economic conditions weighing on the production and dissemination of information (Hermida and Young 94) makes it possible to qualify the discourses celebrating the advent of an era of diversity (Buschow and Suhr 385) through news digitization.

From its inception, the Web became a medium that did not have to submit to the control that the major media outlets were subjected to (Schapals, Maares and Hanusch 23) over the production and circulation of information. This was, initially, a means of ensuring a greater range of information inputs (Heinrich 180). The Internet was seen as having the capacity to circulate information across borders, having unprecedented speed and reach (Hermida and Young 93), while also being largely cost-effective. These new contributors of information would play a part in extending the field of news, including eyewitness accounts and investigative journalism (Monahan and Ettinger 490) not commissioned or funded by large media corporations. As a result, independent media centers sprang up to produce both factual and valuable information (Schapals, Maares and Hanusch 24), but also conspiracy theories (Fisher 36) and counter-information that served to undermine the dominant media corporations (Buschow and Suhr 392). The materiality of the flows of information produced (Buhler et al. 217) attested to their ability to reach mass audiences.

Information boom
The development of Web 2.0 has unsettled the classic information economy (Schapals, Maares and Hanusch 27) by giving rise to a networked economy (Parcu 104), as the latter enjoys freedom from the limitations of the controlled production and dissemination of news (Heinrich 183). Firstly, as aforementioned, the reduction in costs has resulted in an unprecedented increase in the number of actors (Buschow and Suhr 386) who have taken on the task of interpreting and disseminating information, and secondly, it has boosted the "diversity of perspectives" (Hyvonen 122) in both qualitative and quantitative terms.

With the development of Web 2.0, ordinary citizens, most with no experience in journalism, branded themselves international correspondents (Hermida and Young 98) if they found themselves in the right place at the right time, mostly with the aim of creating so-called 'infotainment' (Schapals, Maares and Hanusch 21). Their aim was to garner bigger audiences by creating engaging content that was informative but also largely entertaining, designed to draw a critical mass. These newly constructed accounts, created on popular social media platforms, namely YouTube and Facebook, and subsequently Instagram, Snapchat, and TikTok (Perez-Escoda et al. 34), competed both against one another and against the mainstream media outlets (Hyvonen 123) for their share of viewership. On the one hand, they had the advantage over the mainstream media outlets in that their viewership and engagement numbers (Buhler et al. 223) were clearly visible, which quickly drew in the advertisers (Anderson 59). On the other hand, unless they fully disclosed their sources, their accounts were potentially unverifiable (Fisher 22), and viewers still tended to trust the reports from the mainstream outlets. The content creators – the title subsequently coined for such contributors (Perez-Escoda et al. 28) – did not have the same material and bureaucratic constraints that the mainstream media outlets tend to have (Schapals, Maares and Hanusch 22), and as such felt empowered to freely tamper with the complex dynamics of information exchange (Heinrich 181) in the digital age.

Information crisis
The advent of a new era of news circulation attests to the permanence of the imbalances in the processing of information (Parcu 105). With the Internet, blogs, and especially social media platforms, each user works on building an audience. The Web is in a state of continuous flux, with information being added, subtracted, and edited, going viral or not, and producing an abundance of links to other sites and sources (Monahan and Ettinger 486). New intermediaries which have emerged on the Web constitute new gatekeepers of the process of information circulation (Dvorkin 55), as they challenge the effectiveness of the contribution of the traditional media outlets regarding the coverage of international news. The amateur production of information on the Web (Schapals, Maares and Hanusch 20) has affected the overall circulation of news by boosting competitiveness and changing the marketing and advertising forces (Anderson 59), thus effectively creating the New World Information Order (Zeller, Trakman and Walters 46). In such a system, the economic model of news content is the maximization of viewership.

Media ownership and the influence of political powers over the media (Hermida and Young 98) are the two main reasons why the traditional media has affected news dissemination and contributed to an air of distrust in the production of information (Perez-Escoda et al. 26). It is believed that some mainstream media are a powerful propaganda instrument in the service of political powers that forge significant advantageous positions at the international level (Zeller, Trakman and Walters 60) and influence international public opinion. In democratic countries, the influence of political and economic forces on the media (Dvorkin 67) may not be as obvious as in some regimes which are perceived as authoritarian, where the interference of the public authorities (Schapals, Maares and Hanusch 20) in the work of journalists and the control of information (Dvorkin 16) disseminated by the mass media (particularly those owned by the state) is very apparent.

The control exercised by political forces over the media also has a great influence on the circulation of information in periods of conflict or crisis. For these reasons, the emergence of digital social media is seen as an opportunity to redress this balance. Namely, through the use of social media, users hope to undermine the power (Hermida and Young 99) of the public authorities to control the circulation and dissemination of information. For example, while state media, as a deliberate tactic, do not cover certain topics in times of conflict (Zeller, Trakman and Walters 59), social media and public blogs and forums try to fill these gaps in the news coverage (Monahan and Ettinger 482). They tend to be based on eyewitness accounts, but also on unbridled and unreliable sources of information, and, frequently, conspiracy theories (Fisher 36).

Numerous non-governmental organizations are working to ensure the freedom of information (Heinrich 180) and highlight the need for citizens and communities, within public or private associations, to continue producing, processing, and communicating information. In the age of the Internet, driven by the hope of widening the circle of information producers (Buschow and Suhr 384), this work transforms the passive relationship to information into interactive production. Alternative news sources have thus experienced a major transformation with the emergence of the Internet and the digitization of news on online social media platforms (Parcu 103).

As a result of the digitization of news, online social media are also seen by some as repositories of disinformation that contribute to the information crisis, in that they have the capacity to be disturbing propaganda tools (Heinrich 177). For example, in addition to the proliferation of content that has been doctored or taken out of context, there is the issue of identifying and preventing the dissemination of potentially engaging posts which tend to take off quickly and become viral, as users of other social media platforms share them without any checks on their authenticity (Dvorkin 56).

Future prospects
The question of whether the impact of digitization on news has resulted in an information boom or an information crisis centres on the way people access news and how they consume it. For example, the socio-digital networks – which now play a major role in the way people access and source information (Buschow and Suhr 390) – seem to have the technical capacity to draw in significant viewership. At the very least, they provide access to more diverse visions and opinions (Parcu 107), but with the growing awareness of digital literacy (Buhler et al. 214), online users are seeking verifiable sources of information in order to gain an understanding of the subject that is rooted in facts. The proliferation of unverifiable sources and 'infotainment' has been criticized because it has an immense influence on the way people perceive information (Perez-Escoda et al. 31) and the way in which it shapes their view of the world (Zeller, Trakman and Walters 51), even if they are not aware of it.

On the one hand, ‘infotainment’ and content created with the aim to reach a critical mass on social networks appears to have encouraged people to take a greater interest in news and social and political issues (Hyvonen 126) in a way that is more engaging than traditional news reports in the press (Fisher 31). On the other hand, media and communication analysts have questioned the capacity of the content creators to bring more to this field to balance the traditional media reports (Dvorkin 82) by highlighting the way in which certain players have gradually established themselves as essential mediators (Parcu 92) between content produced by amateurs and the mainstream media.

Concluding remarks
The control of information, or rather, the authentication and verifiability of accounts (Buhler et al. 214), both by independent users and by professionals or public figures, remains a major issue, even more so in the digital era. The traditional mainstream media, which for a long time had a monopoly on information (Dvorkin 80), play a major role during these times, and their control becomes an important issue for the political and economic powers (Zeller, Trakman and Walters 60). The rise of social media and the digitization of news has contributed to reducing the power of the mass media over the control and circulation of information. By facilitating the expression of citizens, social media have contributed to the public's awareness of social and political issues and their interest in understanding the causes, effects, and consequences. Nevertheless, these new media platforms are also sources of misinformation and propaganda (Heinrich 177), which are proliferating as the competition mounts between the major corporate media and online users.

Works Cited
Anderson, C. W. "The State(s) of Things. 20 Years of Journalism Studies and Political Communication." Political Communication 21.1 (2020): 47-62.

Buhler, Julian, et al. “Developing a Model to Measure Fake News Detection Literacy of Social Media Users.” Disinformation, Misinformation, and Fake News in Social Media. Springer (2020): 213-227.

Buschow, Christopher, and Maike Suhr. "Change Management and New Organizational Forms of Content Creation." Media and Change Management. Springer (2022): 381-397.

Dvorkin, Jeffrey. Trusting the News in a Digital Age: Toward a "New" News Literacy. John Wiley & Sons, 2021.

Fisher, Caroline. "What Is Meant by 'Trust' in News Media?" Trust in Media and Journalism, edited by Kim Otto and Andreas Kohler. Springer (2018): 19-38.

Heinrich, Ansgard. "How to Build Resilient News Infrastructures? Reflections on Information Provision in Times of 'Fake News'." Resilience and Hybrid Threats. IOS Press (2019): 174-187.

Hermida, Alfred, and Mary Lynn Young. "From Peripheral to Integral? A Digital-Born Journalism Not for Profit in a Time of Crises." Media and Communication 7.4 (2019): 92-102.

Hyvonen, Mats. "As a Matter of Fact: Journalism and Scholarship in the Post-Truth Era." Post-Truth, Fake News. Springer (2018): 121-132.

Monahan, Brian, and Matthew Ettinger. "News Media and Disasters: Navigating Old Challenges and New Opportunities in the Digital Age." Handbook of Disaster Research. Springer (2018): 479-495.

Parcu, Pierluigi. "New Digital Threats to Media Pluralism in the Information Age." Competition and Regulation in Network Industries 21.2 (2020): 91-109.

Perez-Escoda, Ana, et al. "Fake News Reaching Young People on Social Networks: Distrust Challenging Media Literacy." Publications 9.2 (2021): 24-40.

Schapals, Aljosha Karim, Phoebe Maares, and Folker Hanusch. "Working on the Margins: Comparative Perspectives on the Roles and Motivations of Peripheral Actors in Journalism." Media and Communication 7.4 (2019): 19-30.

Zeller, Bruno, Leon Trakman, and Robert Walters. "The Internet of Things - The Internet of Things or of Human Objects - Mechanizing the New Social Order." Rutgers L. Rec. 47 (2019): 15-118.

The effectiveness of the COVID-19 temporary solutions in waste management

Academic Discipline: Urban Geography
Course Name: Urban Geography
Assignment Subject: The effectiveness of COVID-19 temporary solutions in waste management, urban mobility and pollution
Academic Level: Undergraduate-3rd Year
Referencing Style: Chicago
Word Count: 1,885

Medical waste tends to harbour a host of microorganisms that can infect hospital patients, healthcare workers, and the general public. In a public health crisis, infectious risks include the release of contaminating microorganisms into the environment (Meng et al. 2021, 226) as patients are treated in hospital for COVID-19. Additionally, the failure to segregate and recycle waste is one of the causes of environmental degradation, in terms of both pollution and the increase of carbon dioxide (CO2) emissions from decomposing waste.

As a result of the urgency of the pandemic, most countries were unprepared to tackle the mixed contaminated medical waste from COVID-19. This was a warning sign that the need for basic infrastructure and capacity to deal with medical waste effectively and safely should be widely communicated (Battisti 2021, 2), in accordance with the requirements of relevant multilateral environmental agreements (Liang et al. 2021, 9). Interim solutions, such as segregating waste and using locally manufactured incinerators, were implemented to meet short-term needs for COVID-19 waste treatment. The aim was, firstly, to prevent the transmission of the virus (Kumar et al. 2021, 449), and secondly, to ensure the non-accumulation of surplus waste. In this paper, it will be discussed how the immediacy of the public health crisis required a radical paradigm shift towards solutions which serve to protect the environment (Garnett et al. 2022, 6) from the waste newly generated since March 2020, in a world that was already struggling (Ghernaout and Elboughdiri 2021, 5) with waste management and waste pollution.

Waste management issue
The health risks resulting from direct contact with COVID-19-contaminated waste include potential negative impacts on human health if it is not disposed of properly, such as through the contamination of water and soil during waste treatment (Kumar et al. 2021, 449) and through air pollution, given that the virus particles are airborne (Meng et al. 2021, 227). When dumped into the natural environment or at public landfills, medical waste can lead to bacteriological or toxic contamination of the surroundings (Battisti 2021, 3), which can leak into publicly used resources. This concerns all the waste generated by the operation of a healthcare establishment (Garnett et al. 2022, 5), both at the level of its hospitalization and care services, and at the level of its medical, technical, and administrative services (Roy et al. 2021, 36). Namely, medical waste includes all waste generated in healthcare establishments (including clinics, medical and dental surgeries, and establishments for the disabled and for the elderly), by healthcare workers (on the premises and in homecare), and by research centers and laboratories related to medical procedures (Ghernaout and Elboughdiri 2021, 3), such as the research and production that went into creating the COVID-19 vaccines (Lazarus et al. 2022, 2). Responsible disposal of unused, expired or contaminated vaccines also had to be considered, especially as the surplus doses purchased by countries as a precaution (Schaefer, Leland and Emanuel 2021, 903) could not subsequently be donated (Lazarus et al. 2022, 1).

As COVID-19 is transmitted by close contact with people carrying the virus, including asymptomatic patients, hospitals, healthcare facilities, and the general public produced more medical waste (Garnett et al. 2022, 2) from personal protective equipment (PPE), antibacterial products, and surface disinfectants. There was also a sharp increase in the amount of single-use plastics (Meng et al. 2021, 224) due to the fear of cross-contamination. The issue of this newly generated waste during the pandemic has taken on variable dimensions. Firstly, the impact is associated with the quantity generated (Battisti 2021, 2), and secondly, it concerns the importance of mitigating the risks (Kumar et al. 2021, 452) to human health and the environment.

Temporary solutions proposed and implemented
Effective solutions for the control of waste throughout the pandemic included maximizing the use of available waste management capacity while seeking to avoid any potential long-term impacts on the environment. To do this, medical and healthcare establishments looked into solutions to manage the increase in waste generation by maximizing the use of existing facilities (Rubab et al. 2022, 218). For example, establishments tested the immediate implementation of systems for the safe disposal of used medical equipment, especially the segregation, collection, and management of medical waste (Roy et al. 2021, 35) according to the level of risk of cross-contamination.

In order to assess the environmental risks and potential solutions in this case, the spread of the virus particles outside of healthcare establishments also had to be considered, such as the potential spread of disease and chemical contaminants in high-density urban areas or near abundant marine environments (Battisti 2021, 2). As aforementioned, the deposit of medical care waste in uncontrolled areas can have a direct environmental effect through the contamination of natural resources. Namely, unauthorized and improper treatment, incineration, or dumping of medical waste tends to pollute resources (Liang et al. 2021, 9) and spread further into the ecosystem. As such, the solutions which were implemented to tackle this temporary but overwhelming surge of medical waste were based on the 'polluter pays' principle (Roy et al. 2021, 37), the precautionary principle (de Sousa 2021, 670), and/or the principle of proximity (Herfien et al. 2021, 39).

The 'polluter pays' principle requires all waste producers to be legally and financially responsible for the disposal of their waste in a safe manner and without negatively impacting the environment (Roy et al. 2021, 37). Ensuring that waste disposal does not affect the environment is the responsibility (de Sousa 2021, 672) of each healthcare or medical facility that generates medical waste. On the other hand, under the precautionary principle, the potential risks of the medical waste generated during a health crisis are considered, which has the effect of obliging medical facilities (Karamanian 2021, 91) which generate considerable amounts of waste to apply high standards (Herfien et al. 2021, 40) for the collection and disposal of waste, to provide thorough training in safety and hygiene to all the employees that use PPE and other medical equipment (Rubab et al. 2022, 218), and to communicate the importance of responsible waste management (Kumar et al. 2021, 453) to ensure public safety and environmental protection. In turn, the principle of proximity refers to the treatment and disposal of hazardous biomedical waste on account of it being contaminative (Meng et al. 2021, 228). This means that the management of this waste has to minimize the risks to the population (Herfien et al. 2021, 51) and ensure that the spread of the virus is mitigated. For COVID-19-contaminated waste, this meant disposing of PPE and other medically used items by locking in and storing the waste (Herfien et al. 2021, 51) so that the virus could not permeate, and sending it off to a site which could responsibly disinfect or destroy this type of waste. An extension of this principle is that each country takes steps to properly dispose of all waste within its own borders (Liang et al. 2021, 13) in order to stop the spread of the virus and to minimize additional transportation emissions.

Analysis of effectiveness
To meet the urgent need to protect the public in the face of the pandemic, the amount of PPE, additional medical treatment materials, and the billions of vaccine doses administered worldwide produced thousands of additional tonnes of waste (Lazarus et al. 2022, 2). Subsequently, as a result of reports issued by environmental agencies stating that single-use masks tend to end up in the oceans as waste (Battisti 2021, 1), the most effective solution was to encourage and clearly communicate guidance on how to reduce waste overall in a safe manner, such as through the use of non-disposable PPE and its proper and frequent disinfection and treatment after use (de Sousa 2021, 672). One solution that was financed and executed in India, Italy, and France, among others, was a wide initiative to recycle single-use masks by identifying and supporting collection and recycling as a contribution to the circular economy (Roy et al. 2021, 41).

Additionally, one of the temporary solutions shown to be the most effective was mapping the sources of waste generation to identify the changes in waste quantities (Garnett et al. 2022, 8), as practiced in Canada, China, and the UK. The aim was to maximize the use of resources and the efficiency of waste collection and treatment. This included re-directing services (Herfien et al. 2021, 51) from the locations where waste generation had been reduced as a result of stay-at-home and physical distancing directives – such as schools, offices, shopping malls, and other public places – to those shown to generate more waste (de Sousa 2021, 674) affected by COVID-19 – clinics, hospitals, other medical establishments, home care centres, research and testing laboratories, quarantine centres, etc.

Evidently, the solutions which proved effective involved allocating human and financial resources, as well as assets intended for waste collection, based on the need to address waste generation points (Garnett et al. 2022, 9): increased waste collection services, but also responsible disposal and treatment (Karamanian 2021, 91) to ensure that waste does not end up in landfills and oceans. This included the temporary on-site storage and/or thermal treatment (Herfien et al. 2021, 50) of potentially contaminant waste, and adequate and safe sanitation measures (Meng et al. 2021, 227). A solution practiced in South Korea and the United States concerned items intended for multi-material recovery (Garnett et al. 2022, 5) being stored temporarily on site for 72 hours, the known survival time of the virus particles (Liang et al. 2021, 13). After that, the collected materials were deemed safe to handle, treat, and recycle.

Potentials for long-term solutions
From the start of the pandemic at the beginning of 2020, numerous organizations mobilized to provide the masks and other PPE which were proven effective in protecting the public and mitigating the spread of the virus (Karamanian 2021, 90). In the wake of this, countries looked for ways to finance businesses which would produce PPE nationally (Ghernaout and Elboughdiri 2021, 4), as opposed to waiting for overseas shipments, which were delayed due to COVID-19 restrictions and struggled to keep up with a sudden and overwhelming demand.

Considering long-term prospects, the proposed solutions have to make it possible to set up effective local production and collection circuits (Roy et al. 2021, 41) for used single-use personal protective masks/equipment (manufacturing, distribution, collection, and safe transportation) from medical and healthcare establishments, the general public, businesses, and so on. This would involve the collection points or drop-off terminals complying with health standards (Schaefer, Leland and Emanuel 2021, 904), and the sorting and treatment of waste for reuse or recycling (Meng et al. 2021, 229). Such solutions must channel used equipment into processes of material reuse (Garnett et al. 2022, 10) in order to prevent its accumulation in landfills and oceans. The solutions should also ideally include disinfecting technologies for the safe reuse of PPE as part of a wider manufacturing process.

Conclusion
As a result of stay-at-home orders and physical distancing measures, the disruption of basic urban services, including the collection, segregation, treatment, and disposal of waste, all essential for hygiene and public health, posed new challenges (Kumar et al. 2021, 453) and required new and effective solutions. As a solution for properly disposing of the waste collected from households and from medical establishments dealing with COVID-19, municipalities had to boost their waste management services. Temporary changes to waste management operations had to be executed using existing resources and by finding quick-impact solutions to maintain the continuity and efficiency of operations (de Sousa 2021, 676). This put a strain on waste management operations and the beneficiaries of these services. Best practices and recommended guidelines on the treatment of waste included adapting municipal waste management to remedy this situation and to help decision-makers develop a solid waste management strategy in response to COVID-19.

Bibliography
Battisti, Corrado. “Not only jackals in the cities and dolphins in the harbours: less optimism and more systems thinking is needed to understand the long-term effects of the COVID-19 lockdown.” Biodiversity (2021): 1-5.

de Sousa, Fabiula Danielli Bastos. “Management of plastic waste: A bibliometric mapping and analysis.” Waste Management & Research 39, no. 5 (2021): 664-678.

Garnett, Emma, Angeliki Balayannis, Steve Hinchliffe, Thom Davies, Toni Gladding, and Phillip Nicholson. “The work of waste during COVID-19: logics of public, environmental, and occupational health.” Critical Public Health (2022): 1-11.

Ghernaout, Djamel, and Noureddine Elboughdiri. “Plastic waste pollution worsen by the COVID-19 pandemic: Substitutional technologies transforming plastic waste to value added products.” Open Access Library Journal 8, no. 7 (2021): 1-12.

Herfien, Herfien, Akbarulla Akbarulla, Andri Wijaya, Benni Mapanta, Farid Martadinata, Gunansyah Gunansyah, Kurniati Kurniati et al. “Medical Waste In The COVID-19 Pandemic Era: Management Solutions.” Science and Environmental Journal for Postgraduate 4, no. 1 (2021): 36-53.

Karamanian, Susan L. “The precautionary principle.” In Elgar Encyclopedia of Environmental Law. London: Edward Elgar Publishing Limited (2021): 90-94.

Kumar, Amit, Vartika Jain, Ankit Deovanshi, Ayush Lepcha, Chandan Das, Kuldeep Bauddh, and Sudhakar Srivastava. “Environmental impact of COVID-19 pandemic: more negatives than positives.” Environmental Sustainability 4, no. 3 (2021): 447-454.

Lazarus, Jeffrey V., Salim S. Abdool Karim, Lena van Selm, Jason Doran, Carolina Batista, Yanis Ben Amor, Margaret Hellard, Booyuel Kim, Christopher J. Kopka, and Prashant Yadav. “COVID-19 vaccine wastage in the midst of vaccine inequity: causes, types and practical steps.” BMJ Global Health 7, no. 4 (2022): 1-5.

Liang, Yangyang, Qingbin Song, Naiqi Wu, Jinhui Li, Yuan Zhong, and Wenlei Zeng. “Repercussions of COVID-19 pandemic on solid waste generation and management strategies.” Frontiers of Environmental Science & Engineering 15, no. 6 (2021): 1-18.

Meng, Jian, Qun Zhang, Yifan Zheng, Guimei He, and Huahong Shi. “Plastic waste as the potential carriers of pathogens.” Current Opinion in Food Science 41 (2021): 224-230.

Roy, Poritosh, Amar K. Mohanty, Alexis Wagner, Shayan Sharif, Hamdy Khalil, and Manjusri Misra. "Impacts of COVID-19 outbreak on the municipal solid waste management: Now and beyond the pandemic." ACS Environmental Au 1, no. 1 (2021): 32-45.

Rubab, Saddaf, Malik M. Khan, Fahim Uddin, Yawar Abbas Bangash, and Syed Ali Ammar Taqvi. "A Study on AI-Based Waste Management Strategies for the COVID-19 Pandemic." ChemBioEng Reviews 9, no. 2 (2022): 212-226.

Schaefer, G. Owen, R. J. Leland, and Ezekiel J. Emanuel. “Making vaccines available to other countries before offering domestic booster vaccinations.” JAMA 326, no. 10 (2021): 903-904.

Can Environmental Responsibility be Taught?

Academic Discipline: Education/Sociology
Course Name: Education/Sociology
Assignment Subject: Can environmental responsibility be taught?
Academic Level: Undergraduate-3rd Year
Referencing Style: APA
Word Count: 1,864

Teachers in the natural sciences, politics, history, social studies, as well as the arts and literature are already debating the degree to which education curricula should be oriented toward social transformation (Lehtonen, Salonen, & Cantell, 2019), and to what extent it is their responsibility to impart not only knowledge and facts, but also values (Dewi & Primayana, 2019). The currents of research associated with environmental education, in particular the possibility of integrating curricula with the aim of teaching environmental responsibility (Aarnio-Linnanvuori, 2019), include environmental ethics, citizenship education, and education for sustainable development. In this paper, it will be argued that, while these currents may appear to have different objectives and values, they have to be combined in order to contribute to the construction of an eco-citizenship identity in a way that will develop the emancipation, responsibility, and commitment (Levy, Oliveira, & Harris, 2021) of the future generation of eco-citizens.

Definition and scope
Teaching environmental responsibility refers to advocating specific initiatives such as reducing direct and indirect consumption, reducing and recycling waste, minimizing the frequency and quantity of polluting modes of transport, promoting solidarity and environmental justice (Alexander et al., 2021), and understanding the political, economic, and cultural links to the environment (Lehtonen, Salonen, & Cantell, 2019). Teaching environmental responsibility as such requires implementing specific teaching methods (Aarnio-Linnanvuori, 2019) which engage the subjects, in particular by inviting them to challenge the social, economic, and political systems and to question their values (Olsen et al., 2020). On the one hand, it is suggested that this should take place independently of school disciplines (for example, through parents and the community). On the other hand, teaching environmental responsibility should already be addressed in schools by teaching the roots of participatory and "deliberative democracy" (Devaney et al., 2020). This means favouring action over knowledge (Alexander et al., 2021) in a way that explicitly addresses personal and societal principles and morals, but also interdependence and interdisciplinarity (Olsen et al., 2020) in the face of a rapidly changing world.

Citizen participation is, in all discourse, and particularly in the field of the environment and sustainable development, a deliberative aim whose objective is the formation of an enlightened public opinion, preparing future generations to participate in consultative processes (Parra et al., 2020). It also serves an emancipatory aim, which seeks to harness collective awareness in order to transform the socio-environmental realities (Cachelin & Nicolosi, 2022) that are taking society down the path to an environmental crisis (Levy, Oliveira, & Harris, 2021). This is not a question of transmitting knowledge or facts (Devaney et al., 2020) so much as teaching a new generation how to apply that knowledge, so that they can appropriate the different principles and values (Aarnio-Linnanvuori, 2019) and understand how they can contribute to tackling the emerging issues.

The difference between teaching environmental science and teaching environmental responsibility
The notion of responsibility is often attached to a context or an area (Cachelin & Nicolosi, 2022), namely moral and social responsibility (Alexander et al., 2021) with respect to our surroundings and the resources we need for our survival. In the beginning, parents, the extended family, and the community are largely responsible for setting children on the right path; upon being formally educated, these children would then become responsible adults (Levy, Oliveira, & Harris, 2021) who understand how their contribution (or lack thereof) affects society at large. At the same time, parents teaching environmental responsibility early on is not a guarantee (Zhang & Gibson, 2021) that children will become socially and environmentally responsible adults. Along the same lines, a lack of awareness about environmental responsibility in childhood does not mean that, through the course of their lives, these persons would not find their own path to environmental awareness and responsibility.

The case of environmental responsibility, as opposed to teaching environmental science, requires taking ethics into consideration (Aarnio-Linnanvuori, 2019), with responsibility becoming a societal requirement (Parra et al., 2020) widely accepted as a norm. A good example of this is the recent shift in environmental consciousness (Olsen et al., 2020) and the subsequent actions, which are no longer seen as radical but mainstream. As such, teaching ethics in schools might refer to constructing an ethical frame of mind (Aarnio-Linnanvuori, 2019), so that students gradually understand how their everyday habits impact the environment. For example, this means teaching them to be ethical consumers, teaching them about their carbon footprint, how to calculate it, and how to reduce their emissions (Franch, 2020) by making alternative dietary, travel, and consumption choices. It can also refer to teaching them about their career prospects and how to align their interests and ambitions (Lehtonen, Salonen, & Cantell, 2019) with environmental responsibility and sustainability. Teaching environmental responsibility as such should integrate ethics into the educational process through self-realization (Zhang & Gibson, 2021) and a critical view of societal norms (Dewi & Primayana, 2019).
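
To make the kind of calculation mentioned above concrete, here is a minimal sketch, in Python, of how a classroom carbon-footprint estimate might work. It is an illustration only: the activity categories and emission factors are hypothetical placeholders, not figures drawn from the works cited here; a real lesson would take its factors from an authoritative source such as a national environment agency.

```python
# Minimal sketch of a personal carbon-footprint estimate (illustrative only).
# The emission factors below are hypothetical placeholders, not authoritative
# values; a real curriculum would take them from an official data source.

EMISSION_FACTORS = {          # kg CO2e per unit of activity
    "car_km": 0.19,           # per kilometre driven in an average petrol car
    "flight_km": 0.15,        # per kilometre flown, short-haul economy
    "beef_meal": 7.7,         # per beef-based meal
    "vegetarian_meal": 1.7,   # per vegetarian meal
}

def annual_footprint(activities: dict) -> float:
    """Return the estimated annual emissions in kg CO2e.

    `activities` maps an activity name to its yearly quantity,
    e.g. {"car_km": 8000, "beef_meal": 104}.
    """
    return sum(EMISSION_FACTORS[name] * quantity
               for name, quantity in activities.items())

if __name__ == "__main__":
    # A hypothetical student's year: 8,000 km by car, 2,000 km flown,
    # and two beef meals per week.
    student = {"car_km": 8000, "flight_km": 2000, "beef_meal": 104}
    print(f"Estimated footprint: {annual_footprint(student):,.0f} kg CO2e per year")
```

The pedagogical point of such an exercise is less the resulting number than letting students see how individual choices, such as swapping beef meals for vegetarian ones, change the total.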

Another aspect concerns teaching the psychosocial attitudes (Cachelin & Nicolosi, 2022) that favour responsibility. This refers to teaching how to forge strong connections in both the human and the non-human environment (Lehtonen, Salonen, & Cantell, 2019), especially the connection to the rest of the living world as a way of safeguarding and preserving it. For example, teaching students empathy and how to relate (Aarnio-Linnanvuori, 2019) to the living world would produce mindful and considerate (Olsen et al., 2020) future citizens. This would help them build a favourable perception of the environment (Zhang & Gibson, 2021) and, ideally, internalize the imparted values and principles.

Curriculum
Teaching environmental responsibility means focusing on making students understand the management of limited collective natural resources (Parra et al., 2020), an awareness that goes far beyond disciplinary teaching. In other words, it is not a matter of merely relaying scientific facts (which courses in the natural sciences already do), but of combining action and critical reflection (Olsen et al., 2020). Current educational practice often leans on responsibility framed through guilt (Aarnio-Linnanvuori, 2019), effectively transferring responsibility to subsequent generations (Lehtonen, Salonen, & Cantell, 2019). However, Franch (2020) and Aarnio-Linnanvuori (2019), among others, distinguish between education “about” the environment and education “for” the environment. In the context of teaching, then, responsibility means taking an interdisciplinary approach (Olsen et al., 2020).

In addition to curricula in science and technology, teaching environmental responsibility means making students understand energy production, sourcing, access, and conservation (Parra et al., 2020), especially renewable energy sources. It also entails understanding natural resource cycles, responsible waste and wastewater treatment (Parra et al., 2020), long-term resource management (Lehtonen, Salonen, & Cantell, 2019), and the impact of environmental pollution and degradation on the quality of water, soil, and air (Dewi & Primayana, 2019). While most consider these topics to fall within the domain of science education, teaching responsibility specifically means referencing social values and their applicability to human needs (Cachelin & Nicolosi, 2022). More precisely, it involves a critical analysis of the vision of a sustainable future, and of the significance of the practices associated with it, that both current and future generations have to exercise (Alexander et al., 2021). Such a critical approach has not sufficiently penetrated the field of educational practice (Devaney et al., 2020). Teaching environmental responsibility means teaching students to analyze values (Zhang & Gibson, 2021) and the practices that stem from them.

Teaching eco-citizenship to new generations
With regard to environmental education and education for sustainable development, teaching environmental responsibility sits at the crossroads of the two, giving rise to the proposal of eco-citizenship (Levy, Oliveira, & Harris, 2021). Namely, teaching environmental responsibility must contribute to the emergence of citizen awareness and acknowledgment (Devaney et al., 2020), ground students in a body of knowledge (Franch, 2020), and enable them to persuade others, to make informed and responsible decisions independently, and to act by contributing to local and global communities in a meaningful way. By stimulating critical thinking, teaching environmental responsibility can be an exercise in enlightened democracy (Parra et al., 2020) for future citizens.

With a view to enriching environmental science education with socially responsible citizenship (Devaney et al., 2020), and more specifically the formation of the future eco-citizen (Franch, 2020), taking socio-ecological issues into account requires integrating new objectives (Cachelin & Nicolosi, 2022) into science curricula, such as education about pressing environmental issues and the need to act (Zhang & Gibson, 2021) in order to avert the climate crisis. Understanding the roots, causes, and consequences of environmental issues (Alexander et al., 2021) requires reframing not only science education but education in general. New proposals direct educational effort not only at the relationship to the environment, but also at strengthening the network of relationships within the living environment (Zhang & Gibson, 2021), all aimed at the goal of sustainability.
Educating students so that they become environmentally responsible adults requires imparting principles and values (Parra et al., 2020) so that they understand their role in society (Devaney et al., 2020), not only in the future as adults, but right now, as students and young citizens. This has to take on a particular dimension, because, to be clear, education aimed at creating the environmentally responsible citizens of the future is different from education for sustainable development. If education is considered a fundamental instrument for the protection of the environment and for sustainable development, formal education systems would ideally incorporate the principles of environmental responsibility into education for sustainable development (Aarnio-Linnanvuori, 2019).

Conclusion
When it comes to teaching environmental responsibility, the main challenges lie in the tensions at work between research aimed at imparting knowledge (Zhang & Gibson, 2021), understanding and action expressed through the curriculum (Dewi & Primayana, 2019), and teaching commitment and environmental consciousness (Aarnio-Linnanvuori, 2019). Teaching this as an interdisciplinary subject (Olsen et al., 2020) falls within the scope of socially active issues, which are also considered an educational issue for the citizens of the future (Zhang & Gibson, 2021) who will, with the right frame of mind and the right knowledge, contribute to building a truly sustainable world (Parra et al., 2020). When students are taught to be environmentally responsible, this concerns all aspects of their lives, including their future choice of career, their personal and public life, and their individual and collective consciousness (Franch, 2020). As such, the choice of approach carries strong consequences if the issue is left unaddressed or inadequately addressed.
In the process of teaching, teachers should encourage students to develop a critical perspective on society and on established modern societal norms (Cachelin & Nicolosi, 2022), namely harmful practices on a mass scale, and to be able to challenge the dominant discourses and practices in an informed and knowledgeable way (Alexander et al., 2021). In conclusion, although it is a complex subject to tackle, and especially to implement, environmental responsibility not only can be taught, but should be taught as part of the curriculum at all levels of education, building upon previously acquired knowledge.

References
Aarnio-Linnanvuori, E. (2019). How do teachers perceive environmental responsibility? Environmental Education Research, 25(1), 46-61.

Alexander, W. L., Wells, E. C., Lincoln, M., Davis, B. Y., & Little, P. C. (2021). Environmental justice ethnography in the classroom: Teaching activism, inspiring involvement. Human Organization, 80(1), 37-48.

Cachelin, A., & Nicolosi, E. (2022). Investigating critical community engaged pedagogies for transformative environmental justice education. Environmental Education Research, 28(4), 491-507.

Devaney, L., Brereton, P., Torney, D., Coleman, M., Boussalis, C., & Coan, T. G. (2020). Environmental literacy and deliberative democracy. Climatic Change, 162(4), 1965-1984.

Dewi, P. Y. A., & Primayana, K. H. (2019). Effect of learning module with setting contextual teaching and learning to increase the understanding of concepts. International Journal of Education and Learning, 1(1), 19-26.

Franch, S. (2020). Global citizenship education: A new ‘moral pedagogy’ for the 21st century? European Educational Research Journal, 19(6), 506-524.

Lehtonen, A., Salonen, A. O., & Cantell, H. (2019). Climate change education: A new approach for a world of wicked problems. In J. W. Cook (Ed.), Sustainability, Human Well-being, and the Future of Education (pp. 339-374). Cham: Palgrave Macmillan.

Levy, B. L., Oliveira, A. W., & Harris, C. B. (2021). The potential of “civic science education”: Theory, research, practice, and uncertainties. Science Education, 105(6), 1053-1075.

Olsen, S. K., Miller, B. G., Eitel, K. B., & Cohn, T. C. (2020). Assessing teachers’ environmental citizenship based on an adventure learning workshop: A case study from a social-ecological systems perspective. Journal of Science Teacher Education, 31(8), 869-893.

Parra, G., Hansmann, R., Hadjichambis, A. C., Goldman, D., Paraskeva-Hadjichambi, D., Sund, P., … & Conti, D. (2020). Education for environmental citizenship and education for sustainability. In Conceptualizing Environmental Citizenship for 21st Century Education (pp. 149-160). Cham: Springer.

Zhang, H., & Gibson, H. J. (2021). Long-term impact of study abroad on sustainability-related attitudes and behaviors. Sustainability, 13(4), 1953-1971.

Senioritis: 20 Signs a Student Has It and The Cure

Senioritis is a phenomenon that affects high school and college seniors worldwide. If you’ve found this article, you’ve probably asked yourself why you’re suddenly low on motivation. Is it simply the senior year slump or something else to be worried about?

You’ve made it to senior year! The culmination of all the hard work you’ve been doing in school all these years. You started out excited and amped up. You’re finally on top of the school’s food chain. The finish line is in view and you’ve even made it through the first semester, but out of nowhere, you’ve lost all motivation. What used to excite you no longer feels as rewarding as it once did, and weirdly enough, it all began during your second semester.

There’s a pile of unfinished homework and upcoming deadlines, and it’s causing you worry and anxiety. You’re absolutely paralyzed and don’t know how to get back on track, or worse, you’ve lost all reason to even try. If this resonates with you in any way, you might just have caught a bad case of senioritis. But worry not, because with this diagnosis comes Homework Help Global’s prescription. Here’s everything you need to know about what senioritis is and how to overcome it.

Does Senioritis Really Exist?

While some could argue that this is just another fancy term for laziness, and an excuse for seniors to get away with doing less work than they used to on their way to the finish line, the term has been around for at least a decade. And while it’s even more demotivating to find yourself diagnosed with this supposed illness, you’re not alone. There are many ways to get your mind, body, and soul back in the groove of schooling and end the year as strong as you started.

Senioritis in college or high school is definitely a real phenomenon, based on many student testimonials. It may have started out as a joke, but according to a Southern New Hampshire University academic adviser, it is “a real thing people experience.” There’s also the fact that the term is not only defined in Urban Dictionary but also in Merriam-Webster, which describes it as “an ebbing of motivation and effort by school seniors as evidenced by tardiness, absences, and lower grades.”

In fact, here are some statistics for you: in a survey conducted by The Northwood Omniscient, the student news magazine of Northwood High School in North Carolina, USA, “78 percent of seniors admitted to having senioritis.” A similar survey was reported in The Voice, the newspaper of Farmington High School in Connecticut, where 53 percent of students said they had senioritis. These surveys factored in COVID-19 and how being a student during a pandemic has exacerbated the effects of this supposed disorder.

However, this phenomenon has been a recurring trend for high school seniors since before the pandemic. In a study by William C. Carpluck, The Senior Enigma: A Study of The Entrenched and Sustaining Source of Senioritis, 23.8 percent of high school seniors said they “strongly agree” when asked if they believed their academic efforts would most likely decline in the second semester of their senior year (which is when the illness usually hits). Another 45.9 percent of the same high school seniors answered that they “agree,” and only the remaining 30.4 percent disagreed.

Whether or not we reach a consensus on its legitimacy as a real disorder, it’s indisputable that a lot of students are plagued with it. So instead of dwelling on its validity as a real illness, let’s focus on ways of combating it.

What Are The Signs That a Student Could Be “Infected”?

Maybe it’s simply a lack of motivation, but if you find yourself nodding along to the symptoms below, you’re just going to have to stick around for our prescription.

1. Waking up in the morning becomes the hardest task to do.
2. Finding yourself already being so over school even though you still have the rest of the school year to go.
3. Finding it impossible to focus on studying.
4. When your 15-minute study break turns into an hour or more.
5. Finding it hard to stay awake during class.
6. You went from planning and laying out your outfit on the bed before the first day of school to not caring about your appearance altogether.
7. Procrastinating on your deadlines and then having to pull an all-nighter just to get things done.
8. Having an internal dialogue with yourself and finding any excuse to not go to class.
9. Forgetting to apply to colleges, or to internships and jobs for after graduation.
10. Getting a sudden surge of nostalgia. Wanting to spend more time with friends and even appreciating your parents more.
11. Feeling dread over adulting.
12. Wanting to simply skate by instead of aiming high.
13. Not wanting to do your homework.
14. Not caring about your grades.
15. No longer putting in the effort into academics or extracurriculars.
16. Asking yourself why you even need to pursue higher education.
17. Feeling guilty because you realize you haven’t been putting in the work, which makes you feel even worse and keeps you stuck in a cycle of dread and pressure.
18. Daydreaming about quitting.
19. Wanting to physically leave campus.
20. Feeling overwhelming relief as soon as the bell rings and class ends.

What Are The Causes?

There’s a slew of reasons why this can happen; here are some of the most plausible causes.
The Pressure of Perfection

While it isn’t inherently bad to strive for greatness and perfection, when students start basing their worth on their productivity, it eventually leads to burnout.

Setting goals, such as earning a high grade point average and landing internships or summer jobs, all while keeping up with extracurriculars, is important, but it can easily become too much to handle. Rest is just as important as putting in the work. If there’s anything worth perfecting, it’s how to balance your priorities. Remembering that “Wisdom is knowing when to have rest, when to have activity, and how much of each to have” (Sri Sri Ravi Shankar) might just be one of the ways to avoid unnecessary stress during your senior year.
Focusing Too Much on The Future

It’s a given that one of the main reasons students are in school is to equip themselves with the right tools for their future lives and careers. But when the focus tilts too far toward the future, there’s a tendency to grow complacent about the present, lose sight of current priorities, and put school on the back burner.

At the same time, putting too much focus on your future can cause debilitating anxiety, leaving you unable to tackle present problems for fear of consequences that haven’t happened yet. Fear of the unknown can paralyze you, so learning to take everything one step at a time could be the answer to surviving this supposed illness.
Sadness

When you’re a senior, nostalgia just hits differently. In years past, it wasn’t hard to say goodbye at the end of the school year, knowing you’d return to the same school, see the same faculty, and be with the same group of friends. But when you’re a high school senior, you realize how much your life is about to change. It could even mean moving away from your hometown and leaving your friends and family behind. Essentially, leaving high school means leaving your life’s biggest safety net, and that’s incredibly daunting. And just like any ending, it’s overwhelmingly sad. The same goes for college seniors, since the transition to full adulthood can initially feel like all the joy in life is being sucked out of you. But that’s a conversation for another time.

Sadness is another paralyzing factor that can keep students from maintaining the momentum they’ve built over the years. But with endings come sweet beginnings. It’s only the end of one chapter, so try to look forward to the start of many new ones ahead.
Complacency

There comes an air of smugness with becoming a senior, and it’s easy to blame plain laziness on some dubious disease. You’re so close to the finish line that you feel like you can get by doing the bare minimum, and this is the mistake seniors commit. What they don’t realize are the consequences of not working hard enough and just skating by. For high school seniors, some colleges can revoke your admission if your grades drop. There’s also the possibility of being put on academic probation while you get your grades up, and this does not look good on your college application. The same complacency could even deter your teachers from writing good letters of recommendation.

Slacking off now and getting comfortable with it can become a habit you carry into college or the workplace, and by then it will be even harder to bounce back. It’s easy to blame it all on a supposed sickness, but there’s a fine line between that and plain poor work ethic. All this does is tarnish your reputation at your current school, your future college, and even the companies you’d like to join one day.

Will It Go Away?

According to Buzzfeed, the cure is simply graduating. If you’ve read this far into the article, that must feel a little anticlimactic and maybe even a little disappointing. But that doesn’t mean there aren’t solutions you can adopt as you work your way to graduation day.
How to Survive Senior Year

So you’ve finally come to terms with your diagnosis. Congratulations, you’re one step closer to fighting it by addressing your problem.

One thing to note here is that it’s a matter of winning over your own mind.

It’s hard to understand how the mind works sometimes. One day you’ve got it all together; the next, it takes all of your power just to find the energy to get up and start the day. With all the unprecedented events happening around the world in recent years, students aren’t the only ones experiencing a drop in motivation. Many people are struggling to carry on while battling a myriad of problems, ranging from mental health to physical health to a surplus, or lack, of work. It’s nice to know you aren’t alone in dealing with these problems. We’re all guilty of putting pressure on ourselves in the drive for perfection and productivity, so much so that it causes an overwhelming feeling of dread about what comes next.

However, it’s important to note that our worth is not based solely on how productive we are, and working ourselves to the bone won’t matter if we aren’t taking care of the other areas of our lives. While being a student does mean a majority of your time and energy is focused on school, understanding that it isn’t necessarily your end-all, be-all helps alleviate some of the stress it causes. If anything, approach this time in your life with as much enthusiasm and excitement as you can, and make sure you’re still attending to your needs outside school. You can absolutely expect that life gets more challenging as you age and that there are larger obstacles ahead. This is precisely why it’s important to start building healthy habits now. Finding the balance between student life and the rest of your life, and exercising a strong work ethic, are imperative steps to staying motivated.

Here are tried-and-true ways and solutions to ensure that you end the school year strong and are fully equipped and ready to enter college or start your career.

Acknowledge The Problem

The first step to finding any solution is identifying and defining the problem. You’ve come down with a case of senioritis, yes, but don’t let this discourage you more than you already are. Accept that change is needed and proceed with the right tools. As listed above, there are a multitude of symptoms, and not all of them may apply to you, so identify which ones do and take it from there. Whether it’s struggling to wake up in time for class or no longer caring about your appearance or your grades, identifying your symptoms will make it easier to find the solutions.
Organize

Whether it’s your home, study area, notes, schedule, or mind, clutter and disorganization affect the brain. According to The Royal Australian College of General Practitioners, a cluttered space is a visual distraction that increases cognitive load, thereby reducing the brain’s working memory. Neuroscience researchers have also found that a clean, organized space results in better focus and lets the brain process information better, increasing productivity. Removing clutter is a simple enough task, and once you accomplish it, you’re one step closer to alleviating your problems.
Rest

As mentioned earlier, taking it easy on yourself is just as beneficial as the work you put in. Even bodybuilders and fitness lovers will tell you that your muscles grow not while you’re working out, but while your body is at rest and able to rebuild. You might think that with senioritis it would be counterintuitive to add more rest to the list of solutions, but it all boils down to what you do with your idle time. Use it to brainstorm ways to get yourself out of the slump. Instead of spending your downtime binge-watching movies or killing time on your phone, draw up a plan to reach graduation not just skating by, but passing with flying colors.
Make The Most of Your Senior Year

Instead of wallowing in sadness over a chapter that’s about to close, now’s the best time to hold everything you know and love closer to your heart. An era of your life is about to change completely, and now is the time to appreciate how far you’ve come and make amazing memories you can look back on fondly. Life is fleeting; make every second count. You don’t want to look back on your life and regret having spent your days slacking off in bed instead of learning all you could and being in class with the rest of your friends. So many people wish they had the opportunity to get an education. Count this time in your life as a blessing, and move with gratitude whenever you feel negative about going to school.
Find The Joy in Studying

There is so much joy in learning and growing as a person, and there are an infinite number of ways to do so. Everyone soaks up information differently. You might be the type of student who can’t sit still in class or is easily distracted, so find your own unique way of making learning fun. Try different methods, whether it’s studying with a close group of friends, using the Pomodoro Technique, or playing music in the background while you study. Maybe you’re more productive when you have a little snack on hand. Check out our previous blog on Brain Food for Studying, and you just might find some of your favorite foods to incorporate into your study sessions. Find what works for you and stick with it. Don’t be too hard on yourself when other people’s methods don’t work for you. We’re all wired differently, so embrace your own uniqueness!
Motivate Yourself

Your teachers and friends can only motivate you to an extent; ultimately, it’s up to you to take the reins of your own life. Decide that you want to be successful. Decide to be motivated. Wake up and affirm to yourself that you are able, talented, and ready. Manifestations do happen, and it all starts with putting the thought out into the universe. Cogito, ergo sum: I think, therefore I am. Practice strengthening your mind with affirmations and watch your life transform. Have power over your mind, not the other way around.
Keep Calm and Carry On

Change doesn’t happen overnight. If you find yourself still unable to get in the right mindset despite following the solutions above, don’t quit just yet. These are all exercises that you can apply to your day-to-day life whenever you are faced with problems. Backtrack to the first step by identifying what needs to be done, be calm, and then proceed. It’s okay to pause. Everyone gets tired, everyone loses motivation from time to time. The important thing is how you get back up once you’ve fallen. It’s an obstacle, not a roadblock. You can always persist and carry on.
Still Can’t Muster Enough Strength to Get Through Senior Year?

If you’re still unable to get out of the rut and still need help, let Homework Help Global take some of the load off you.

Whether you need help writing term papers, book reports, case studies, or even your Ph.D. dissertation, Homework Help Global has a team of smart, dedicated writers to help you. If you have an overload of assignments and are completely overwhelmed, our services also include lab reports, math solutions, PowerPoint presentations, and much more.

For high school seniors, Homework Help Global can assist with scholarship applications, admission essays, and college or university applications.

If you’re a college senior, we can even help you get started on building your resume and writing a cover letter.

Senioritis is not the end for you. There are plenty of ways to overcome it, and we at Homework Help Global would be more than happy to help you along the way.

Check out our services to see what else we can help you with, or take it a step further: use our easy online system to get a free quote, or go ahead and place your order here!

Brain Food For Studying: 25 Brain-Boosting Snacks That Will Help Improve Your Study Sessions, Exam Performance, and More

When you’re pulling a late-night study session, you might enjoy snacking on a bag of chips with an energy drink, but while those snacks taste great, they’re not the right brain food for studying effectively.

Did you know that avoiding junk food while studying can actually help you get better grades? Here’s why: to study productively, you need to call on a few significant brain functions, including concentration, memorization, focus, and energy. Junk food can actually get in the way of some of these functions, even if that tiny little chocolate bar looks harmless and oh-so-tempting.

With the right brain fuel, you can capitalize on all of these aspects of the brain to harness as much of its power as you can. The more brain power you can use, the better your memory and learning abilities will be, which in turn will get you better grades on exams, essays, and term papers.

So, what is the best brain food for studying and improving your academic performance? We’re going to show you, and we’ve even provided some snack ideas you can try using each of these 25 powerful brain-boosting superfoods.

Why Your Study Snack Matters

If you’re someone who relies on that mid-study session chocolate bar to get through the night, you might be wondering what’s so bad about it. On the outside, it seems like a harmless treat, but it’s not giving you the nutrients your brain needs to power through those study sessions.

We all know that junk food is bad for your body and can lead to heart disease, obesity, and more. However, it’s bad for your brain health, too. According to Neuroscience News, studies have shown that eating too much junk food can actually alter the structure and function of your brain, particularly in the prefrontal cortex, which is responsible for decision making, cognitive control, and other functions that are important for studying.

For effective studying, you need to be in peak brain health so you can take really good notes in class, stay alert and take in information from your study materials, and call on your brain power to remember everything when it comes time for exam day.

This list of brain food for studying will give you ideas for study snacks that do much more than satisfy your hunger. Each of these foods has been scientifically linked to at least one core function essential to effective studying and academic performance. When you choose the right brain fuel, you’ll be more than ready to take on exams, midterm projects, finals, and term papers with ease.

If you need a little advice to get your grocery list ready, check out our blog on grocery shopping with a student budget. This will help you make sure you can get the foods you need with the budget you have.

1. Eggs

Eggs are often known in the health world as nature’s multivitamin – in fact, Healthline refers to eggs as “the healthiest food on the planet.” That arguably makes them the single most effective brain food for studying. Here’s why: eggs are packed with powerful vitamins, antioxidants, nutrients, and healthy fats that have tons of important benefits, such as reducing cholesterol and helping the body build protein. But the really important nutrients in eggs for students are choline, vitamin B12, and selenium. All of these nutrients are linked to better memory storage, coordination, neurological stability, and brain development.

It’s important to note that the most important nutrients for brain function are found in the egg yolk, so tossing back an egg-white omelette won’t cut it.

Snack ideas:

● Hard boiled eggs

● Scrambled eggs with salsa

● Fried eggs on whole grain toast

● Deviled eggs

● Egg snack bites

● Omelettes

● Baked egg muffins
2. Berries

Deeply coloured berries, such as blackberries, blueberries, raspberries, and strawberries, are packed with antioxidants that help brain cells communicate properly. Additionally, most berries contain flavonoids, the compounds responsible for giving berries their colour. Flavonoids are known to help improve memory and learning (information intake). Blueberries in particular are a known superfood for tons of reasons, including cognitive performance, but all berries will help your brain in the long run.

Try to avoid dried or canned berries. The manufacturing process used to preserve berries this way requires a lot of additives and typically adds tons of extra refined sugar, which defeats the entire purpose of eating the berries in the first place.

Snack ideas:

● Triple berry smoothies

● Greek yogurt topped with berries

● Whole grain cereal topped with berries

● Berry oat snack bars

● Dehydrated strawberry “chips”

● Fruit salad

● Strawberry and goat cheese salad

3. Fatty Fish

Not everyone loves fish, but everyone should love the benefits your brain gets from eating fatty fish. Certain types of fatty fish (salmon, trout, tuna, and sardines) contain omega-3s, essential fats that make up half of the total fat in your brain. These fats are important for memory and learning, building nerve cells, and emotional regulation. In contrast, some studies show that not getting enough omega-3s can lead to learning impairments and depression.

Snack ideas:

● Fish tacos

● Albacore tuna and whole grain crackers

● Fish sticks

● Homemade fish cakes

● Smoked salmon

● Salmon mousse dip with whole grain crackers

● Smoked salmon deviled eggs
4. Avocados

Known for their high levels of healthy fats, avocados contain unsaturated fats that are linked to better cognitive performance. These fats can also help you feel full for longer, so you won’t have to listen to your stomach rumbling while you’re studying.

Avocados are also packed with powerful antioxidants and carotenoids such as lutein, which is important for eye health. You need your eyes in good working order so you can properly read your materials and not miss any details in your instructions. Lastly, avocados can help reduce blood pressure, which in turn helps the brain do its job.

Snack ideas:

● Avocado toast

● Sliced raw avocados topped with seasoning

● Guacamole with whole grain chips or carrots

● Add avocado to smoothies

● Avocado salad with chicken and kale

● Baked avocado “fries”

● Chocolate avocado pudding

5. Lamb

Red meat often gets a bad reputation in the health world, so it’s probably surprising to see it on a list of good brain food for studying. However, in recent years, research has found that weekly consumption of lamb is linked to better cognitive performance and brain function. Lamb is also packed with protein, which matters for brain function as well, because it gives your brain the energy it needs to stay alert and focused.

Snack ideas:

● Lamb kabobs

● Gyros on whole grain pita bread

● Grilled lamb bites

● Sliced lamb with feta cheese and olive oil

● Lamb stew
6. Turmeric

Turmeric is the spice that gives curry dishes their signature yellow colour, but it’s also one of the most underrated superfoods on the planet. This spice is a key component of herbal medicine in many cultures and countries around the world, and it’s a great brain food for studying, too. Many people even use turmeric supplements in order to maximize their nutritional intake and keep their bodies healthy.

The main active ingredient in turmeric is the powerful anti-inflammatory compound curcumin, which helps with memory storage and memorization in general. This is because curcumin has been linked to increases in a protein known as brain-derived neurotrophic factor (BDNF). BDNF supports the growth and survival of neurons in the brain and is extremely important for memory and learning. Curcumin also acts as an antioxidant, helping to boost the body’s immune system, protect the brain from damage by free radicals, and potentially guard against Alzheimer’s disease.

A note about turmeric: it’s often paired with black pepper because black pepper helps the body absorb the full nutritional value of the spice, so try to use this combo if you’re cooking with it.

Snack ideas:

● Turmeric tea

● Add turmeric to hummus and dips with veggies

● Sprinkle turmeric on top of deviled eggs

● Add turmeric to smoothies

● Turmeric energy balls

● Roasted carrots with turmeric spice

7. Leafy Greens

You’ve probably already heard this before, but many leafy greens are classified as superfoods for the brain. This is especially true for kale, collards, and spinach. These powerful greens contain those helpful flavonoids, as well as important brain-boosting vitamins and carotenoids (antioxidants).

Scientific studies have also shown that leafy greens help slow cognitive decline and preserve your brain function, so you’re helping your future self out when you munch down on your greens, too.

Snack ideas:

● Green smoothies

● Roasted kale chips

● Salads

● Lettuce wraps

● Make spinach pesto for pasta and pizza

● Green hummus with veggies

● Sauteed spinach with scrambled eggs

● Spinach and feta spanakopita
8. Nuts

Nuts are another food group known for containing lots of those essential brain-boosting omega-3s. Most nuts are also very rich in vitamin E, which helps keep free radicals and other environmental toxins from reaching the brain. When free radicals get in, they cause oxidative stress that can speed up cognitive decline as you age, putting you at risk for Alzheimer’s and other neurological diseases. So not only are nuts a tasty snack, but that handful can boost your brain power and help preserve it at the same time.

Snack ideas:

● Trail mix

● Natural peanut butter (without preservatives) and whole grain crackers

● Granola bars with nuts

● Roasted cashews

● Candied pecans

● Dark chocolate peanut brittle

● Nuts, cheese, and whole grain crackers

● Oatmeal walnut cookies

● Dark chocolate covered almonds

● Peanut butter energy bites

9. Cocoa and Dark Chocolate

Yes, you can have chocolate while you’re studying, as long as it’s dark chocolate. Sure, dark chocolate isn’t everyone’s favourite taste, but if it can satisfy your chocolate fix, it’s better than nothing. Some research has shown that dark chocolate, and cocoa products with high cocoa content, have positive effects on cognitive function and can help reduce your risk of mental fatigue. Dark chocolate has the highest cocoa levels, which is why it makes this list as one of the best snacks for handling a sweet craving while you study.

Snack ideas:

● Dark chocolate covered berries

● Oat clusters with dark chocolate

● Apple slices with dark chocolate and peanut butter

● Energy bites with dark chocolate chips
10. Coffee (Caffeine)

Here’s some great news for coffee lovers and students who just can’t get out of bed without that morning java: coffee is actually a good brain food for studying! Health experts have found that caffeine helps the brain stay alert, improves concentration, and can even help you retain information better. Coffee also contains antioxidants that help boost your mood and reduce the risk of neurological diseases such as Alzheimer’s.

So, you can continue to reach for that cup of coffee when you get settled for your study session. Just make sure it’s not your 10th cup of the day – avoid overdoing it because too much caffeine can lead to the jitters and throw off your sleep cycle.

Snack ideas:

● Dark chocolate covered coffee beans

● Coffee energy bites

● Banana coffee smoothies

● Iced coffee

● Zucchini coffee bread

● Oat milk lattes

● Coffee mousse

11. Broccoli

This cruciferous vegetable does more than resemble miniature trees; it can help power your brain, too. Broccoli contains a lot of glucosinolates, compounds that break down into isothiocyanates, which help protect your brain from neurological decline. This crunchy veggie also contains tons of antioxidants, vitamins, and flavonoids that help with memory and learning.

Broccoli is also rich in fibre, which helps you stay full by regulating your digestive system and can also assist in regulating blood pressure and blood sugar levels.

Snack ideas:

● Raw broccoli dipped in hummus

● Broccoli cheddar bites

● Crustless broccoli quiches

● Broccoli slaw

● Steamed broccoli with cheddar cheese

● Broccoli tots
12. Beets

While they don’t really sound like the most appealing study snack in the world, beets are an often overlooked brain food for studying. These root vegetables help improve blood flow to the frontal lobe of the brain, which is the key area of the brain that gets put to work when you’re memorizing, focusing, concentrating, and absorbing information.

Snack ideas:

● Pickled beets

● Beet puree on baguettes or crackers

● Baked beet chips

● Roasted beets

● Beet dip with veggies

13. Whole Grains

You likely already know that whole grains are good for you and should be part of any well-rounded diet, but did you know they’re also good for the brain? Whole grains are a great brain food for studying because they’re a rich source of that protective vitamin E we mentioned before. They also contain antioxidants, other important vitamins and nutrients, and tons of fibre. In fact, whole grains are so good for your brain function that they can even reduce your risk of stroke.

Here are some examples of whole grains: quinoa, brown rice, popcorn, oatmeal/oats, buckwheat, millet, whole rye, and bulgur. You can commonly find them in pastas, some cereals, bread, and some baked goods.

Snack ideas:

● Whole grain crackers and hummus

● Oatmeal with berries

● Greek yogurt topped with whole grain granola

● Sandwiches on whole grain bread (go for tuna for extra brain fuel)

● Whole grain avocado toast

● Air popped popcorn

● Quinoa muffins

● Burrito bowls with brown rice and beans
14. Sunflower Seeds

If you love to munch on sunflower seeds, we’ve got some great news for you: this crunchy snack beloved by baseball fans is also an excellent source of brain fuel. These little seeds are full of fantastic antioxidants as well as those helpful flavonoids we mentioned above that help you with better information intake.

Sunflower seeds do contain a lot of calories, however, so make sure that you’re eating them in moderation. When you’re zoned in on studying, it can be easy to keep mindlessly reaching into the bowl. Try to pour out only a portion at a time to keep yourself on track.

Snack ideas:

● Flavoured store-bought sunflower seeds

● Trail mix

● Apples dipped in sunflower seed butter

● Homemade granola bars with sunflower seeds

● Sunflower seeds sprinkled on a salad

15. Green Tea/Matcha

Reaching for a cup of green tea to energize yourself might be one of the best ideas during study time. Green tea contains L-theanine, which is linked to better brain performance, higher energy levels, and stress management. It also contains caffeine, which as we mentioned above, can help you stay alert, improve concentration, and help you absorb information.

Matcha powder contains even more L-theanine than regular green tea, and it can be added to tons of different recipes and foods. This is because matcha powder uses the entire tea leaf, so it’s able to maintain the full range of nutrients in the plant while green tea is more refined. As a result, matcha also contains a lot of natural antioxidants that help with memory and learning, increased reading time, and a better attention span.

Snack ideas:

● Matcha green tea smoothies

● Banana and matcha “ice cream”

● Matcha tea lattes

● Chia seed matcha pudding

● Matcha protein bites

● Coconut matcha bars
16. Citrus Fruits

Citrus fruits, such as grapefruits, oranges, lemons, and clementines, are known for their high vitamin C content. In fact, eating just one orange a day will give you the full recommended daily dose of vitamin C, which is important for your overall health.

However, citrus fruits also have tons of brain fuel hidden inside. They contain over 60 different antioxidants, flavonoids, and anti-inflammatory compounds that are important for brain function. These compounds also help prevent the breakdown of cells in the nervous system, which is important for fighting off neurodegenerative diseases.

Be careful if you’re opting for store-bought fruit juices instead of the fruit itself. While fruit juice does have the vitamins that raw fruit does, these juices are also packed with refined sugars that can cause weight gain, liver disease, heart issues, and more. With fruit, the pure form is always better.

Snack ideas:

● Sliced oranges with dark chocolate

● Homemade orange juice (without additives)

● Citrus fruit salsa

● Dehydrated citrus fruit snacks

● Grapefruit mousse

● Citrus fruit smoothies
17. Pumpkin Seeds

This popular fall-season snack is as healthy as it is tasty. Pumpkin seeds are rich in zinc, antioxidants, and healthy fats, all nutrients that are important for brain health and brain cell regulation.

If you’re finding yourself doing a lot of late night studying and you’re feeling the effects on your sleep cycle, pumpkin seeds can also help you with this. Pumpkin seeds have a high level of the amino acid tryptophan, which is associated with better sleep regulation because it converts to melatonin.

Snack ideas:

● Roasted pumpkin seeds

● Greek yogurt topped with pumpkin seeds

● Pumpkin muffins topped with pumpkin seeds

● Add pumpkin seeds to trail mix

● Rye and pumpkin seed whole grain crackers
18. Beans and Lentils

Known as the musical fruit if you heard that schoolyard rhyme when you were little, beans are packed with tons of great brain fuel. These small but mighty foods come full of B vitamins, fibre, and other important minerals. They also release energy in the body slowly over time, which helps you stay awake and focused for longer throughout your study session. Beans are also known to be a great source of omega-3 fatty acids that boost brain health. While all beans have some level of omega-3s, pinto and kidney beans have the highest.

Snack ideas:

● Baked beans (pairs well with eggs)

● Bean dip with veggies or whole grain crackers

● Homemade bean hummus

● Hearty minestrone soup

● Chili

● Roasted chickpeas

19. Tomatoes

While it often masquerades as a vegetable, the tomato is a fruit that does amazing things for your body and your brain health. One of the antioxidants found in tomatoes is lycopene, a very helpful nutrient that can protect your brain against damage from the free radicals and toxins we mentioned above. In fact, tomatoes are the richest source of lycopene in the Western diet, and it can be found in all forms of tomato products – even ketchup, despite its high sugar content.

Tomatoes are also approximately 95% water, which can help you stay hydrated when you’re studying for long periods of time.

Snack ideas:

● Sliced tomatoes topped with balsamic vinegar

● Homemade tomato soup or gazpacho

● Cherry tomatoes in salad

● Whole grain pasta with tomato sauce

● Baked tomatoes with cheese
20. Watermelon

This sweet summer treat comes with tons of antioxidants and vitamins that we’ve mentioned on this list, including lycopene. However, the main reason watermelon makes a great snack for studying isn’t necessarily for its brain-boosting powers, but for its hydration powers.

Staying hydrated is a very important component of making your studying time more productive and acing exams or tests. The brain relies on water and fluid intake to continue communicating and sending messages through the nerves, and if you don’t get enough hydration, you’ll quickly experience a drop in memory, learning, information intake, decision making, and more. Watermelon is 92% water, making it an excellent tool to keep you hydrated for optimal memory and learning.

Snack ideas:

● Watermelon slices or chunks

● Frozen watermelon puree (sherbet)

● Watermelon fruit popsicles

● Balsamic chicken and watermelon salad

● Fruit pizza with watermelon as the base

● Watermelon and feta salad

21. Fermented Kimchi

Fermented foods, like kimchi, are great for you because the fermenting process adds probiotics and microorganisms that are beneficial for your gut health. You may be wondering why that would even be mentioned on a list of the best snacks for brain fuel, but what you may not realize is that gut health is directly connected to brain health. Therefore, foods that contain live active cultures and probiotics can help you maintain cognitive function as well as digestive function.

Snack ideas:

● Kimchi and celery sticks

● Creamy kimchi dip

● Kimchi slaw

● Add kimchi to stir fry dishes

● Kimchi miso soup

● Brown rice with kimchi
22. Ginseng

The ginseng plant has a world of health benefits in every part of the plant, from the root to the stalk. Firstly, ginseng has been linked to improved frontal lobe function and cognitive preservation. Secondly, ginseng produces some of the benefits you can get from caffeine, as it can boost your natural energy levels to help you stay alert, awake, and focused. Lastly, ginseng has also been linked to lower blood pressure and blood sugar levels, which are both key for brain function.

Snack ideas:

● Ginseng tea

● Korean ginseng soup

● Add ginseng to smoothies

● Iced ginseng tea

● Ginseng date squares
23. Cheese

Yes, you can still eat cheese when you’re eating for brain health! Cheese is a whole food that contains B vitamins, protein, and other nutrients that help the brain regulate and grow neurotransmitters, tissues, and enzymes. As a nice bonus, it can also help prevent cavities and keep your teeth healthy. Cheese made from the milk of grass-fed cows retains the highest levels of these important nutrients, so choose grass-fed whenever possible.

However, as you may already know, eating too much cheese can lead to weight gain, heart disease, and other health problems, so make sure you don’t overdo it when you’re snacking away.

Snack ideas:

● Sliced cheese and whole grain crackers

● Cheese sticks (non-processed)

● Warm cheese dip with veggies or whole grain crackers

● Garlic bread made with whole grain bread and cheese

● Baked cheese crisps

● Taco dip with cheese, beans, and guacamole
24. Greek Yogurt

Similar to cheese, Greek yogurt contains B vitamins, protein, and tons of probiotics that help your brain function. Additionally, just like the probiotics in fermented foods we mentioned before, these probiotics can also help you maintain gut health and therefore brain function. Those powerful probiotics can also help regulate your stress levels, anxiety, and emotional wellbeing, which are also important factors for studying that you should never overlook. The more stressed and burned out you are, the more your memory and learning abilities will slip.

When you’re shopping for Greek yogurt, make sure you don’t fall for the marketing tactics of “Greek-style” yogurts. These are not the same thing; “Greek-style” yogurt is typically just regular plain yogurt with added thickening agents to replicate the thicker texture of real Greek yogurt, and it won’t contain the same nutrients that real Greek yogurt has.

Snack ideas:

● Greek yogurt parfait with whole grain granola, berries, and nuts

● Substitute for sour cream in dips or toppings

● Overnight oats with Greek yogurt and skim milk

● Greek yogurt protein pancakes

● Tzatziki dip and veggies

● Greek yogurt popsicles with berries
25. Soy

While soy can sometimes get a bad rep from the anti-GMO crowd, it can do wonders for your brain health in the long run. Soy products contain polyphenols, which are antioxidants known for reducing the risk of early onset dementia, preserving brain function over time, and improving cognitive abilities. As long as you eat soy in moderation, you can reap the brain health benefits for your studying time.

Snack ideas:

● Roasted soybeans

● Wasabi beans

● Steamed edamame

● Dark chocolate covered soybeans

● Add roasted soybeans to trail mix

Foods You Should Avoid While Studying

These foods should stay off your list of brain food for studying, and outside your study sessions they should still be consumed only in moderation.

● Alcohol: It seems fun at the time, but alcohol leads to memory loss, poor decision making skills, and poor cognitive performance. Save the adult beverages for post-exam celebrations.

● Sugary drinks: Remember when you were younger and your parents told you that drinking too much soda would rot your teeth? As it turns out, it can rot your brain, too. Sugary drinks contain high-fructose corn syrup, which is directly linked to memory and learning impairment (among many other health conditions).

● Refined sugar: This one goes hand in hand with those sugary drinks we just mentioned. Sugary foods can lead to impaired memory function because high sugar levels don’t metabolize as easily in the brain, which can lead to cell damage or slow development of new cells.

● Processed foods: Highly processed foods, like instant noodles and ready-made meals, contain a lot of calories and fat, and very little brain fuel. Studies have shown that people who eat a lot of processed foods have lower scores when they’re tested on memory and learning functions than people who have healthy diets.

● Refined carbs (such as white bread and pasta): These foods don’t have enough fibre to slow down digestion, which means they can rush through your system and increase blood sugar levels. This can lead to impaired memory function even just after one meal. If you’re going to eat bread or pasta, choose whole grain versions.
You Focus on the Food While We Focus on the Work

If you just can’t kick that junk food habit, Homework Help Global can help. We may not be able to sit in on your in-person exams for you, but we can certainly take a load of work off your plate while you sit back and enjoy your snacks.

We provide a range of resources, services, and advice to help you stay on track with all of your academic goals. From custom essay writing services to professional editing services, all of the tools for success are just a click away. Whether you want someone to write your entire assignment for you or just need some quick advice from our blog, we’re here to help.

To order your next assignment, use our online ordering system or get a free quote from our operations team here.