KZINGA JIMENEZ

Writer | Digital Humanist | Educator | Artist


Essay: Ed-Tech On A Global Scale

The benefits, risks, and implications of educational technology and its expanding role in the world

Educational technology is reshaping how societies learn at a rapid pace, and the responsibility for keeping up with these changes, and for shaping them responsibly, lies with those who use the technology and bring it into full focus. Ed-tech once meant one of two things: synchronous communication, with students viewing a teacher's real-life classroom through a virtual platform, or asynchronous coursework, where students absorbed key concepts at their own pace and within the confines of their busy schedules, as is particularly the case for adult online learners.

Now the field has evolved once more, this time as artificial intelligence systems merge with the platforms already in place, causing an uproar in the broader tech field. Pressing questions are arising, especially about the rapid rollout of digital systems that realistically need more testing and research before reaching their core audiences. The field currently wavers between continuing this rapid expansion and pausing to weigh the ethical and equitable implications of the new technology, making those perspectives a key factor in moving forward. The fact of the matter is that the current pace of change in ed-tech demands deliberate, swift, and globally conscious choices.

Imagine two classrooms, one decade apart. In 2012, a teacher on assignment in rural Kenya writes a grammar lesson on a well-worn chalkboard while students share a single outdated textbook. In 2022, a student in the same region submits an essay through a well-known cloud-based platform, receives personalized AI-generated feedback, and gains access to a virtual library with countless titles at their disposal. The transformation is a remarkable feat. Yet somewhere between those two classrooms, that same student's behavioral data was likely also harvested, algorithmically categorized, and discreetly sold to an advertising firm. Both realities belong to the same story.

Educational technology is broadly defined as the use of digital tools, platforms, and systems that facilitate learning. It has grown from an emergency and supplementary classroom aid into the very infrastructure of modern education. From AI-driven tutoring platforms to government-backed learning management systems, ed-tech now shapes the schooling of hundreds of millions of students in K-12 schools, universities, and specialized or informal learning environments across the globe. The central tension in ed-tech's global expansion is not merely technological; it is intrinsically ethical and political. The question is not whether these digital tools can improve learning outcomes, but who controls the infrastructure, who holds agency in defining the curriculum, who profits from scraped student data, and whose cultural knowledge is treated as the default when deciding how initiatives move forward.

No other event accelerated ed-tech adoption more dramatically, or more consequentially, than the COVID-19 pandemic. When schools across the world closed their brick-and-mortar buildings indefinitely in early 2020, educational systems that had spent years debating the appropriate role of digital platforms were suddenly forced, almost overnight, to rely on them entirely. Within weeks, millions of students transitioned to remote learning environments mediated wholly by digital tools. Kousa and Niemi (2023), writing in the context of Finnish education policy, observe that the pandemic's forced adoption of digital tools far outpaced the ethical and regulatory frameworks designed to govern and monitor them. Governments, school districts, and ed-tech companies collectively decided that short-term access to education mattered more than scrutiny of the backend systems providing it. That was a defensible decision under emergency conditions, but its consequences have proven long-lasting. Building on the new infrastructure continued apace: platforms adapted to the new normal, data contracts and agreements were signed, and vendor and supplier relationships were established, often opportunistically.

One key ed-tech issue is a process researchers have termed "datafication": the transformation of aspects of student life, such as behavior, performance, emotional states, and identity, into extractable, monetizable data. Sriprakash et al. (2025) argue that datafication is not a neutral technical process but a deeply political one, noting how digital systems echo historical patterns of colonial hierarchy. When a platform sorts students into predicted, near-boilerplate academic trajectories or flags behavioral anomalies for administrative review, it enters dangerous territory, producing categories that reflect and reinforce existing structures of power and exploitation. Bunn and Sridhar (2025) provide a detailed analysis of Australian ed-tech companies engaged in exactly this collection and monetization of student data, and their practices specifically involve minors. These students lack the legal capacity to consent to data collection and have little awareness that their learning activity is being commercially exploited, which makes the scenario especially troubling. Nor is it an anomaly; it reflects a business model that has embedded itself as the standard across worldwide ed-tech markets.

Unfortunately, responses to these practices have been largely insufficient. Compliance-based frameworks, which require companies to disclose data collection practices in technical terms-of-service documents, do little to address the underlying incentive structures that drive the exploitation in the first place. Kousa and Niemi (2023) complicate the picture further by pointing to the cultural limitations of the artificial intelligence tools now widespread in ed-tech. These natural language processing (NLP) systems are trained predominantly on Western, English-language data, and they have been shown to perform poorly across more diverse linguistic environments, producing systematically biased assessments of students from non-Western backgrounds. Such a frame shades into eugenicist ideologies that simply cannot function in an ever more globalized society.

Despite these faults, ed-tech retains real potential, and the proof is visible in real time. The same body of research that documents ed-tech's disadvantages also contains compelling evidence of its capacity to improve learning outcomes. The tension between these findings is not a contradiction so much as one of the field's many complexities. Fakih and Sridhar (2025) analyzed ed-tech outcomes across multiple Indian cities, documenting measurable gains among digital platform learners, gains that were highly consistent across varying socioeconomic backgrounds. This suggests that under proper, well-regulated conditions, ed-tech can function as a binding force rather than a hierarchical one.

Beyond India and Australia, similar patterns have emerged in sub-Saharan Africa. Faustino et al. (2024) document noticeable improvements in student engagement and instructional quality in East African classrooms where digital tools were introduced with adequate support and hands-on training. Amid ongoing teacher shortages and already limited resources, their case studies show ed-tech platforms functioning as a much-needed supplement to instruction, broadening access to educational content. This research complicates any purely negative assessment of ed-tech's global impact. It is also worth noting that Bunn and Sridhar (2025), whose work on Australian data exploitation was discussed above, share a researcher with Fakih and Sridhar (2025), whose findings in India highlight ed-tech's more positive potential. That overlap underscores a fundamental truth about the field: this technology can generate genuine educational benefits just as readily as serious ethical harm. Ed-tech is neither good nor bad in the abstract; it occupies a grey area, and its effects depend largely on the conditions of its deployment.

The positive outcomes noted above are real. They are also unevenly distributed, and their costs fall disproportionately on those least positioned to bear them: working-class populations and under-resourced communities in the Global South. The same technologies that improved learning outcomes for students in Indian cities were, in other contexts, being used to surveil and categorize students who had no say in the matter. The data exploitation of minors is the most ethically stark dimension of the problem. Bunn and Sridhar (2025) demonstrate that students of all ages are subject to data collection practices on ed-tech platforms that would not be tolerated in other consumer contexts. Participation is often mandatory, authorized by institutions or government bodies with no alternatives provided, a process that removes whatever modicum of consent might previously have existed.

Infrastructure barriers further stratify ed-tech's benefits. Faustino et al. (2024) document persistent challenges in East Africa: unreliable electricity, limited connectivity, inadequate devices, and insufficient teacher training. These patterns reflect underinvestment rooted in colonial economic relations as well as broader cultural bias. When Western companies design products for high-broadband environments and export them to sub-Saharan African communities, the result is access structured by and for the Global North, treated, like the English language, as the default. As a case in point, Sriprakash et al. (2025) highlight the vulnerability of Indigenous Australian communities to models that treat Western frameworks as universal defaults. When digital curricula encode only one cultural tradition's assumptions about knowledge and learning, the technology actively marginalizes people, and that marginalization is carried into the backend algorithms themselves.

As a viable alternative, a four-part framework can be built around data sovereignty, algorithmic accountability, cultural responsiveness, and democratization. Data sovereignty gives students and communities control over the educational data, statistics, and insights generated about them. In practice, this requires limits on data collection, restrictions on commercial use, and straightforward mechanisms for access and record deletion. Bunn and Sridhar (2025) ground these protections in the UN Convention on the Rights of the Child (UNCRC), arguing that meaningful implementation requires prohibiting the monetization of advertising-linked data on digital educational platforms. Algorithmic accountability demands full transparency in automated decision-making. Cultural responsiveness, the third pillar, requires designing ed-tech in genuine partnership with the communities served, treating Indigenous and non-Western practices as design knowledge rather than deficits. Finally, democratization calls for sustained, committed investment in high-quality devices and accessible training in underserved regions.

Ed-tech's trajectory since the COVID-19 era illustrates a familiar tension: powerful digital tools arriving faster than the wisdom needed to govern them. The same platforms that expanded access in Indian cities and East African classrooms have simultaneously exposed children to surveillance, reinforced cultural hierarchies, intentionally or not, and deepened the very inequities they promised to mitigate. The key insight is that ed-tech's outcomes are not determined by the software itself, but by regulatory environments, commercial incentives, cultural contexts, and levels of infrastructural investment. The stakes keep rising and extend well beyond education: ed-tech systems are among the most pervasive data collection systems ever built, and they target children. The patterns established now will shape educational standards for generations to come. Technology is, in the end, a tool, and its outcomes depend entirely on who holds and shapes it, and for whom.


SOURCES:

  • Bunn, A., & Sridhar, K. S. (2025). Educational technology (EdTech) in Australian schools: A case for better practice. UNSW Law Journal, 48(3), 871–912.
  • Fakih, A., & Sridhar, K. S. (2025). Effect of edtech start-ups on students’ performance: Selected evidence from Indian cities. Vikalpa: The Journal for Decision Makers, 50(4), 389–404. https://doi.org/10.1177/02560909251361990
  • Faustino, A., Cheema, G. K., & Bussey, M. (2024). Instructional technologies of education in East African countries: An overview. Journal of Interdisciplinary Studies in Education, 13(S1), 236–252. https://ojed.org/jise
  • Kousa, P., & Niemi, H. (2023). AI ethics and learning: EdTech companies’ challenges and solutions. Interactive Learning Environments, 31(10), 6735–6746. https://doi.org/10.1080/10494820.2022.2043908
  • Sriprakash, A., Williamson, B., Facer, K., Pykett, J., & Valladares Celis, C. (2025). Sociodigital futures of education: Reparations, sovereignty, care, and democratisation. Oxford Review of Education, 51(4), 561–578. https://doi.org/10.1080/03054985.2024.2348459
