Digital Tools, Analogue Barriers: How Open Physics AI Preserves Inequality

The promise sounds utopian. A researcher in Nairobi armed with nothing more than a laptop and an internet connection can now access the same computational physics tools that once required a Stanford affiliation and millions in grant funding. Open-source physics AI has, theoretically, levelled the playing field. But the reality emerging from laboratories across the developing world tells a far more complex story, one where democratisation and digital inequality aren't opposing forces but rather uncomfortable bedfellows in the same revolution.

When NVIDIA, Google DeepMind, and Disney Research released Newton, an open-source physics engine for robotics simulation, the announcement came wrapped in the rhetoric of accessibility. The platform features differentiable physics and highly extensible multiphysics simulation, technical capabilities that would have cost institutions hundreds of thousands of dollars just a decade ago. Genesis, another open-source physics AI engine released in December 2024, simulates physics up to 430,000 times faster than real time, achieving 43 million frames per second on a single RTX 4090 GPU. Installation is streamlined, the API intuitive, the barriers to entry seemingly demolished.
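Those two headline figures are worth sanity-checking against each other; a quick back-of-envelope calculation shows they are mutually consistent:

```python
# Sanity check: 43 million simulated frames per second at 430,000x
# real time implies a real-time simulation rate of 100 Hz,
# i.e. a 10 ms physics timestep.
frames_per_second = 43_000_000   # reported Genesis throughput on one RTX 4090
speedup = 430_000                # reported multiple of real time

implied_realtime_hz = frames_per_second / speedup
print(implied_realtime_hz)  # 100.0
```

In other words, the claim amounts to compressing a full simulated second of 100 Hz physics into about 2.3 microseconds of wall-clock time.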

Yet data from the Zindi network paints a starkly different picture. Of their 11,000 data scientists across Africa, only five percent have access to the computational power needed for AI research and innovation. African researchers, when they do gain access to GPU resources, typically must wait until users in the United States finish their workday. The irony is brutal: the tools are free, the code is open, but the infrastructure to run them remains jealously guarded by geography and wealth.

The Mirage of Equal Access

DeepMind's AlphaFold represents perhaps the most celebrated case study in open-source scientific AI. After AlphaFold effectively solved the 50-year-old protein structure prediction problem at CASP14 in 2020, the company open-sourced its code in 2021 and the scientific community erupted in celebration. The AlphaFold Database now contains predictions for over 200 million protein structures, nearly all catalogued proteins known to science. Before AlphaFold, only 17% of the roughly 20,000 proteins in the human body had experimentally determined structures. Now, 98% of the human proteome is accessible to anyone with an internet connection.

The two Nature papers describing AlphaFold have been cited more than 4,000 times. Academic laboratories and pharmaceutical companies worldwide are using it to develop vaccines, design drugs, and engineer enzymes that degrade pollutants. It is, by any measure, a triumph of open science.

But look closer at those citations and a pattern emerges. Research published in Nature in 2022 analysed nearly 20 million papers across 35 years and 150 scientific fields, revealing that leading countries in global science increasingly receive more citations than other countries producing comparable research. Developed and developing nations often study similar phenomena, yet citation counts diverge dramatically based on the authors' institutional affiliations and geography.

This isn't merely about recognition or academic vanity. Citation rates directly influence career progression, grant funding, and the ability to recruit collaborators. When scientists from Western developed countries consistently receive higher shares of citations in top-tier journals whilst researchers from developing economies concentrate in lower-tier publications, the result is a two-tiered scientific system that no amount of open-source code can remedy.

At a UN meeting in October 2023, delegates warned that the digital gap between developed and developing states is widening, threatening to exclude the world's poorest from the fourth industrial revolution. Only 36% of the population in the least developed countries use the internet, compared to 66% globally. Whilst developed nations retire 2G and 3G networks to deploy 5G, low-income countries struggle with basic connectivity due to high infrastructure costs, unreliable electricity, and regulatory constraints.

The UNESCO Science Report, published in its seventh edition as countries approached the halfway mark for delivering on Sustainable Development Goals, found that four out of five countries still spend less than 1% of GDP on research and development, perpetuating their dependence on foreign technologies. Scientific research occurs in a context of global inequalities, different political systems, and often precarious employment conditions that open-source software alone cannot address.

The Ecosystem Beyond the Code

The landscape of open-source physics simulation extends far beyond headline releases. Project Chrono, developed by the University of Parma, University of Wisconsin-Madison, and its open-source community, supports simulating rigid and soft body dynamics, collision detection, vehicle dynamics, fluid-solid interaction, and granular dynamics. It's used at tens of universities, in industry, and federal research laboratories. Hugging Face's platform hosts thousands of pre-trained AI models, including IBM and NASA's advanced open-source foundation model for understanding solar observation data and predicting how solar activity affects Earth and space-based technology.

Yet the pattern repeats: the software is open, the models are free, but the surrounding ecosystem determines whether these tools translate into research capacity or remain tantalisingly out of reach.

The LA-CoNGA Physics project offers an instructive case study. Since 2020, this initiative has worked to build computational capacity for astroparticle physics research across Latin America. Nine universities have developed laboratories and digital infrastructure, connecting physicists to global partners through the Advanced Computing System of Latin America and the Caribbean (SCALAC) and RedCLARA. Mexico's installed servers increased by 39.6% in 2024, whilst Chile and Argentina saw increases of 29.5% and 16.5% respectively. Argentina entered the TOP500 supercomputer rankings, and Brazil added two more TOP500 systems.

Yet Latin American researchers describe persistent challenges: navigating complex funding landscapes, managing enormous volumes of experimental and simulated data, exploring novel software paradigms, and implementing efficient use of high-performance computing accelerators. Having access to open-source physics AI is necessary but nowhere near sufficient. The surrounding institutional capacity, technical expertise, and sustained financial support determine whether that access translates into research productivity.

Consider the infrastructure dependencies that rarely make it into open-source documentation. NVIDIA's PhysicsNeMo framework (formerly Modulus) provides genuinely transformative resources for physics-informed machine learning. But running such platforms at scale requires stable electricity, reliable high-speed internet, and expensive GPU hardware. In sub-Saharan Africa, over 600 million people still lack access to reliable electricity, and the proportion of Africans enjoying consistent power has increased by merely 3 percentage points since 2014-2015. Urban grids suffer from widespread power quality issues, frequent blackouts, and voltage instability.

A physicist in Lagos running Genesis simulations faces a fundamentally different reality than a colleague in Lausanne. The code may be identical, the algorithms the same, but the context of infrastructure reliability transforms what “open access” actually means in practice. When power cuts interrupt multi-hour simulation runs or unstable internet connections prevent downloading critical model updates, the theoretical availability of open-source tools rings hollow.
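Researchers working under unreliable power often resort to defensive checkpointing so that an outage costs minutes rather than hours. A minimal sketch of the pattern, with an illustrative stand-in for the physics step (the `sim.ckpt` filename and the state layout are assumptions, not any particular engine's API):

```python
import json
import os
import tempfile

def save_checkpoint(state, path):
    """Write state atomically: a power cut mid-write cannot corrupt the
    previous good checkpoint, because os.replace swaps files atomically."""
    fd, tmp = tempfile.mkstemp(dir=os.path.dirname(path) or ".")
    with os.fdopen(fd, "w") as f:
        json.dump(state, f)
    os.replace(tmp, path)

def load_checkpoint(path):
    """Resume from the last checkpoint if one exists, else start fresh."""
    if os.path.exists(path):
        with open(path) as f:
            return json.load(f)
    return {"step": 0, "positions": [0.0]}

state = load_checkpoint("sim.ckpt")
total_steps = 1000
for step in range(state["step"], total_steps):
    # Stand-in for one timestep of a long-running physics simulation.
    state["positions"] = [x + 0.01 for x in state["positions"]]
    state["step"] = step + 1
    if state["step"] % 100 == 0:  # persist progress every 100 steps
        save_checkpoint(state, "sim.ckpt")
```

The atomic-rename trick matters precisely in the blackout scenario: a checkpoint interrupted halfway through writing leaves the previous file untouched. It is also overhead a colleague on a stable grid never pays.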

The Hidden Costs of Data Transfer

Even when researchers have stable power and computational resources, bandwidth costs create another barrier. In developing countries, broadband and satellite access costs are at least two to three times higher than in the developed world. For researchers searching literature databases like PubMed or Google Scholar, the internet meter ticks away, each moment representing real financial cost. When downloading gigabytes of model weights or uploading simulation results to collaborative platforms, these costs multiply dramatically.
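The compounding is easy to make concrete. The per-gigabyte rates below are hypothetical, chosen only to illustrate how a two-to-threefold connectivity price gap scales with a routine research workflow:

```python
# Hypothetical metered-bandwidth rates; the dollar figures are
# assumptions for illustration, not measured tariffs.
weights_gb = 5.0                  # one mid-sized pretrained model
monthly_downloads = 4             # model weights, datasets, updates
rate_developed = 0.50             # assumed USD per GB
rate_developing = 1.50            # assumed 3x higher USD per GB

monthly_cost_developed = weights_gb * monthly_downloads * rate_developed
monthly_cost_developing = weights_gb * monthly_downloads * rate_developing
print(monthly_cost_developed, monthly_cost_developing)  # 10.0 30.0
```

The absolute numbers are modest by Western standards; measured against a research budget where every dollar is contested, the threefold multiplier is not.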

A study covering Latin America found that the region spends approximately $2 billion annually on international bandwidth, a sum that could be cut by a third through greater use of Internet Exchange Points. East Africa, meanwhile, has no supercomputers at all. Researchers unable to access computational resources domestically find themselves sending data abroad to be governed by the terms and conditions of competing tech companies, introducing not only financial costs but also sovereignty concerns about who controls access to research data and under what conditions.

The bandwidth problem exemplifies the hidden infrastructure costs that make “free” open-source software anything but free for researchers in low-income contexts. Every download, every cloud-based computation, and every collaborative workflow that assumes high-speed, affordable connectivity imposes costs that compound over time and across research projects.

The Cost of Being Heard

Even when researchers overcome computational barriers, publishing introduces new financial obstacles. The shift from subscription-based journals to open access publishing erected a different barrier: article processing charges (APCs). The global average APC is $1,626, with most journals charging between $1,500 and $2,500. Elite publications charge substantially more, with some demanding up to $6,000 per article. For researchers in developing nations, where monthly salaries for senior scientists might not exceed these publication costs, APCs represent an insurmountable obstacle.

Publishers allow full fee waivers for authors in 81 low-income countries according to Research4Life criteria, with 50% discounts for 44 lower middle-income jurisdictions. However, scientists in Kenya and Tanzania report being denied waivers because World Bank classifications place them in “lower middle income” rather than “low income” categories. Some journals reject waiver requests when even a single co-author comes from a developed country, effectively penalising international collaboration.

Research4Life itself represents a significant initiative, providing 11,500 institutions in 125 low- and middle-income countries with online access to over 200,000 academic journals, books, and databases. Yet even this substantial intervention cannot overcome the publication paywall that APCs create. Research4Life helps researchers access existing knowledge but doesn't address the financial barriers to contributing their own findings to that knowledge base.

UNESCO's Recommendation on Open Science, adopted in November 2021, explicitly addresses this concern. The recommendation warns against negative consequences of open science practices, such as high article processing charges that may cause inequality for scientific communities worldwide. UNESCO calls for a paradigm shift where justice, inclusion, and human rights become the cornerstone of the science ecosystem, enabling science to facilitate access to basic services and reduce inequalities within and across countries.

From 2011 to 2015, researchers from developing economies published disproportionately in lower-tier megajournals whilst Western scientists claimed higher shares in prestigious venues. Diamond open access journals, which charge neither readers nor authors, offer a potential solution. These platforms published an estimated 8-9% of all scholarly articles in 2021. Yet their limited presence in physics and computational science means researchers still face pressure to publish in traditional venues where APCs reign supreme.

This financial barrier compounds the citation inequality documented earlier. Not only do researchers from developing nations receive fewer citations for comparable work, they also struggle to afford publication in the venues that might increase their visibility. It's a vicious circle where geographic origin determines both access to publication platforms and subsequent academic recognition.

The Mentorship Desert

Access to tools and publication venues addresses only part of the inequality equation. Perhaps the most pernicious barrier is invisible: the networks, mentorship relationships, and collaborative ecosystems that transform computational capacity into scientific productivity.

Research on global health collaboration identifies multiple structural problems facing scientists from the Global South: limited mentorship opportunities, weak institutional support, and colonial attitudes within international partnerships. Mentorship frameworks remain designed primarily for high-income countries, failing to account for different resource contexts or institutional structures.

Language barriers compound these issues. Non-native English speakers face disadvantages in accessing mentorship and training opportunities. When research collaborations do form, scientists from developing nations often find themselves relegated to supporting roles rather than lead authorship positions. Computer vision research over the past decade shows Africa contributing only 0.06% of publications in top-tier venues. Female researchers face compounded disadvantages, with women graduating from elite institutions slipping 15% further down the academic hierarchy compared to men from identical institutions.

UNESCO's Call to Action on closing the gender gap in science, launched in February 2024, found that despite some progress, gender equality in science remains elusive, with just one in three scientists worldwide being women. Its recommendations emphasise regularly collecting sex-disaggregated data to inform evidence-based policy and fostering collaborative research among women through formal mentorship, sponsorship, and networking programmes. These gender inequalities compound geographic disadvantages for female researchers in developing nations.

Financial constraints prevent researchers from attending international conferences where informal networking forms the foundation of future collaboration. When limited budgets must cover personnel, equipment, and fieldwork, travel becomes an unaffordable luxury. The result is scientific isolation that no amount of GitHub repositories can remedy.

Some initiatives attempt to bridge these gaps. The African Brain Data Science Academy convened its first workshop in Nigeria in late 2023, training 45 participants selected from over 300 applicants across 16 countries. African researchers have made significant progress through collective action: the African Next Voices dataset, funded by a $2.2 million Gates Foundation grant, recorded 9,000 hours of speech in 18 African languages. Masakhane, founded in 2018, has released over 400 open-source models and 20 African-language datasets, demonstrating what's possible when resources and community support align.

But such programmes remain rare, undersupported, and unable to scale to meet overwhelming demand. For every researcher who receives mentorship through these initiatives, hundreds more lack access to the guidance, networks, and collaborative relationships that translate computational tools into research productivity.

The Talent Drain Amplifier

The structural barriers facing researchers in developing nations create a devastating secondary effect: brain drain. By 2000, there were 20 million high-skilled immigrants living in OECD countries, representing a 70% increase over a decade, with two-thirds coming from developing and transition countries. Among doctoral graduates in science and engineering in the USA in 1995, 79% from India and 88% from China remained in the United States.

Developing countries produce sizeable numbers of important scientists but experience tremendous brain drain. When brilliant physicists face persistent infrastructure challenges, publication barriers, mentorship deserts, and limited career opportunities, migration to better-resourced environments becomes rational, even inevitable. The physicist who perseveres through power outages to run Genesis simulations, who scrapes together funding to publish, who builds international collaborations despite isolation, ultimately confronts the reality that their career trajectory would be dramatically different in Boston or Berlin.

Open-source physics AI, paradoxically, may amplify this brain drain. By providing researchers in developing nations with enough computational capability to demonstrate their talents whilst not removing the surrounding structural barriers, these tools create a global showcase for identifying promising scientists whom well-resourced institutions can then recruit. The developing nations that invested in education, infrastructure, and research support watch their brightest minds depart, whilst receiving countries benefit from skilled workers whose training costs they didn't bear.

International migrants increased from 75 million in 1960 to 214 million in 2010, rising to 281 million by 2020. The evidence suggests many more losers than winners among developing countries regarding brain drain impacts. Open-source physics AI tools were meant to enable scientists worldwide to contribute equally to scientific progress regardless of geography. Instead, they may inadvertently serve as a recruitment mechanism, further concentrating scientific capacity in already-advantaged regions.

The Prestige Trap

Even if a brilliant physicist in Dhaka overcomes infrastructure limitations, secures GPU access, publishes groundbreaking research, and builds international collaborations despite isolation, one final barrier awaits: the tyranny of institutional prestige.

Research analysing nearly 19,000 faculty positions across US universities reveals systematic hiring hierarchies based on PhD-granting institutions. Eighty percent of all US academics trained at just 20% of universities. Five institutions (UC Berkeley, Harvard, University of Michigan, University of Wisconsin-Madison, and Stanford) trained approximately one out of every eight professors.

Only 5-23% of researchers obtain faculty positions at institutions more prestigious than where they earned their doctorate. For physics specifically, that figure is 10%. The hiring network reveals that institutional prestige rankings, encompassing both scholastic merit and non-meritocratic factors like social status and geography, explain observed patterns far better than research output alone.

For researchers who obtained PhDs from institutions in the Global South, the prestige penalty is severe. Their work may be identical in quality to colleagues from elite Western universities, but hiring committees consistently favour pedigree over publication record. The system is simultaneously meritocratic and deeply unfair: it rewards genuine excellence whilst also encoding historical patterns of institutional wealth and geographic privilege.

When Simulation Crowds Out Experiment

There's a further, more subtle concern emerging from this computational revolution: the potential devaluation of experimental physics itself. As open-source simulation tools become more capable and accessible, the comparative difficulty and expense of experimental work creates pressure to substitute computation for empirical investigation.

The economics are compelling. Small-scale experimental physics projects typically require annual budgets between $300,000 and $1,000,000. Large-scale experiments cost orders of magnitude more. In contrast, theoretical and computational physics can proceed with minimal equipment. As one mathematician noted, many theorists require “little more than pen and paper and a few books,” whilst purely computational research may not need specialised equipment, supercomputer time, or telescope access.

Funding agencies respond to these cost differentials. As budgets tighten, experimental physics faces existential threats. In the UK, a 2024 report warned that a quarter of university physics departments risk closure, with smaller departments particularly vulnerable. Student enrolment in US physics and astronomy graduate programmes is projected to decline by approximately 13% as institutions anticipate federal budget cuts. No grant means no experiment, and reduced funding translates directly into fewer experimental investigations.

The consequences extend beyond individual career trajectories. Physics as a discipline relies on the interplay between theoretical prediction, computational simulation, and experimental verification. When financial pressures systematically favour simulation over experiment, that balance shifts in ways that may undermine the epistemic foundations of the field.

Philosophers of science have debated whether computer simulations constitute experiments or represent a distinct methodological category. Some argue that simulations produce autonomous knowledge that cannot be sanctioned entirely by comparison with observation, particularly when studying phenomena where data are sparse. Others maintain that experiments retain epistemic superiority because they involve direct physical interaction with the systems under investigation.

This debate takes on practical urgency when economic factors make experimental physics increasingly difficult to pursue. If brilliant minds worldwide can access AlphaFold but cannot afford the laboratory equipment to validate its predictions, has science genuinely advanced? If Genesis enables 43 million FPS physics simulation but experimental verification becomes financially prohibitive for all but the wealthiest institutions, has democratisation succeeded or merely shifted the inequality?

The risk is that open-source computational tools inadvertently create a two-tiered physics ecosystem: elite institutions that can afford both cutting-edge simulation and experimental validation, and everyone else limited to computational work alone. This wouldn't represent democratisation but rather a new form of stratification where some physicists work with complete methodological toolkits whilst others are confined to subset approaches.

There's also the question of scientific intuition and embodied knowledge. Experimental physicists develop understanding through direct engagement with physical systems, through the frustration of equipment failures, through the unexpected observations that redirect entire research programmes. This tacit knowledge, built through years of hands-on laboratory work, cannot be entirely captured in simulation code or replicated through computational training.

When financial pressures push young physicists towards computational work because experimental opportunities are scarce, the field risks losing this embodied expertise. The scientists who understand both simulation and experimental reality, who can judge when models diverge from physical systems and why, become increasingly rare. Open-source AI amplifies this trend by making simulation dramatically more accessible whilst experimental physics grows comparatively more difficult and expensive.

Computational Colonialism

There's a darker pattern emerging from these compounding inequalities: what some researchers describe as computational colonialism. This occurs when well-resourced institutions from developed nations use open-source tools to extract value from data and research contexts in developing countries, whilst local researchers remain marginalised from the resulting publications, patents, and scientific recognition.

The pattern follows a familiar template. Northern institutions identify interesting research questions in Global South contexts, deploy open-source computational tools to analyse data gathered from these communities, and publish papers listing researchers from prestigious Western universities as lead authors, with local collaborators relegated to acknowledgements or junior co-authorship positions.

Because citation algorithms and hiring committees privilege institutional prestige and lead authorship, the scientific credit and subsequent career benefits flow primarily to already-advantaged researchers. The communities that provided the research context and data see minimal benefit. The open-source tools that enabled this research were meant to democratise science but instead facilitated a new extraction model.

This dynamic is particularly evident in genomic research, climate science, and biodiversity studies. A 2025 study revealed significant underrepresentation of Global South authors in climate science research, despite many studies focusing explicitly on climate impacts in developing nations. The researchers who live in these contexts, who understand local conditions intimately, find themselves excluded from the scientific conversation about their own environments.

Some initiatives attempt to address this. CERN's open-source software projects, including ROOT for data analysis, Indico for conference management, and Invenio for library systems, are used by institutions worldwide. Rucio now supports scientific institutions including DUNE, LIGO, VIRGO, and SKA globally. These tools are genuinely open, and CERN's Open Source Program Office explicitly aims to maximise benefits for the scientific community broadly.

Yet even well-intentioned open-source initiatives cannot, by themselves, dismantle entrenched power structures in scientific collaboration and recognition. As long as institutional prestige, citation networks, publication venue hierarchies, and hiring practices systematically favour researchers from developed nations, open-source tools will be necessary but insufficient for genuine democratisation.

The Way Forward?

If open-source physics AI is both democratising and inequality-reproducing, simultaneously liberating and limiting, what paths forward might address these contradictions?

First, infrastructure investment must accompany software development. Open-source tools require computing infrastructure, stable electricity, reliable internet, and access to GPUs. International funding agencies and tech companies promoting open-source AI bear responsibility for ensuring that the infrastructure to use these tools is also accessible. Initiatives like SCALAC and RedCLARA demonstrate regional approaches to shared infrastructure that could be expanded with sustained international support.

Cloud computing offers partial solutions but introduces new dependencies. GPU-as-a-Service can reduce hardware costs, but ongoing cloud costs accumulate substantially, and researchers in low-income contexts may lack the institutional credit cards or international payment methods many cloud providers require.

Second, publication systems need radical reform. Diamond open access journals represent one path, but they require sustainable funding models. Some proposals suggest that universities and funding agencies redirect subscription and APC budgets toward supporting publication platforms that charge neither readers nor authors. Citation bias might be addressed through algorithmic interventions in bibliometric systems, weighting citations by novelty rather than author affiliation and highlighting under-cited work from underrepresented regions.

Third, mentorship and collaboration networks need deliberate construction. Funding agencies could require that grants include mentorship components, structured collaboration with researchers from underrepresented institutions, and explicit plans for equitable co-authorship. Training programmes like the African Brain Data Science Academy need massive expansion and sustained funding. UNESCO's recommendations on fostering collaborative research through formal mentorship, sponsorship, and networking programmes provide a framework that could be implemented across funding agencies and research institutions globally.

Fourth, institutional hiring practices must change. As long as PhD pedigree outweighs publication quality and research impact in hiring decisions, researchers from less prestigious institutions face insurmountable barriers. Blind review processes, explicit commitments to geographic diversity in faculty hiring, and evaluation criteria that account for structural disadvantages could help shift entrenched patterns.

Fifth, brain drain must be addressed not merely as an individual choice but as a structural problem requiring systemic solutions. This might include funding mechanisms that support researchers to build careers in their home countries and recognition that wealthy nations recruiting talent from developing countries benefit from educational investments they didn't make.

Sixth, the balance between computational and experimental physics needs active management. If market forces systematically disadvantage experimental work, deliberate countermeasures may be necessary to maintain methodological diversity. This might include dedicated experimental physics funding streams and training programmes that combine computational and experimental skills.

Finally, there's the question of measurement and accountability. The inequalities documented here are visible because researchers have quantified them. Continued monitoring of these patterns, disaggregated by geography, institutional affiliation, and researcher demographics, is essential for assessing whether interventions actually reduce inequalities or merely provide rhetorical cover whilst entrenched patterns persist.

The Paradox Persists

Open-source physics AI has genuinely transformed what's possible for researchers outside elite institutions. A graduate student in Mumbai can now run simulations that would have required Stanford's supercomputers a decade ago. A laboratory in Nairobi can access protein structure predictions that pharmaceutical companies spent hundreds of millions developing. These advances are real and consequential.

But access to tools isn't the same as access to scientific opportunity, recognition, or career advancement. The structural barriers that perpetuate inequality in physics research, from infrastructure deficits to citation bias to hiring hierarchies, persist regardless of software licensing. In some cases, open-source tools may inadvertently widen these gaps by enabling well-resourced institutions to work more efficiently whilst underresourced researchers struggle with the surrounding ecosystem of infrastructure, publication, mentorship, and prestige.

The geography of scientific innovation is being reshaped, but not necessarily democratised. The brilliant minds in underresourced regions do have better computational footing than before, yet translating that into meaningful scientific agency requires addressing infrastructure, economic, social, and institutional barriers that code repositories cannot solve.

The simulation revolution might indeed devalue experimental physics and embodied scientific intuition if economic pressures make experiments feasible only for wealthy institutions. When computation becomes universally accessible but experimental validation remains expensive and scarce, the entire epistemology of physics shifts in ways that deserve more attention than they're receiving.

The fundamental tension remains: open-source physics AI is simultaneously one of the most democratising developments in scientific history and a system that risks encoding and amplifying existing inequalities. Both things are true. Neither cancels out the other. And recognising this paradox is the necessary first step toward actually resolving it, presuming resolution is even what we're collectively aiming for.

The tools are open. The question is whether science itself will follow.


Sources and References

  1. NVIDIA Developer Blog: “Announcing Newton, an Open-Source Physics Engine for Robotics Simulation” (2024)
  2. MarkTechPost: “Meet Genesis: An Open-Source Physics AI Engine Redefining Robotics” (December 2024)
  3. UNDP Digital Blog: “Only five percent of Africa's AI talent has the compute power it needs” (2024)
  4. Tony Blair Institute for Global Change: “State of Compute Access 2024: How to Navigate the New Power Paradox”
  5. DeepMind: “AlphaFold reveals the structure of the protein universe” (2020-2024)
  6. Nature: “Leading countries in global science increasingly receive more citations than other countries doing similar research” (2022)
  7. UN Meetings Coverage: “Widening Digital Gap between Developed, Developing States Threatening to Exclude World's Poorest from Next Industrial Revolution” (October 2023)
  8. UNESCO: “UNESCO Science Report: the race against time for smarter development” (2021)
  9. ICTP-SAIFR: LA-CoNGA Physics Project documentation (2020-2024)
  10. Nature: “Still lacking reliable electricity from the grid, many Africans turn to alternative sources” (Afrobarometer, 2022)
  11. Project Chrono: “An Open-Source Physics Engine” (University of Parma & University of Wisconsin-Madison)
  12. IBM Newsroom: “IBM and NASA Release Groundbreaking Open-Source AI Model on Hugging Face” (2025)
  13. World Bank: “World Development Report 2021: Data for Better Lives – Connecting the world”
  14. Research4Life: Programme documentation and eligibility criteria
  15. UNESCO: “UNESCO Recommendation on Open Science” (2021)
  16. Learned Publishing: “Article processing charges for open access journal publishing: A review” (2023)
  17. The Scholarly Kitchen: “Guest Post – Article Processing Charges are a Heavy Burden for Middle-Income Countries” (March 2023)
  18. F1000Research: “The importance of mentorship and collaboration for scientific capacity-building: perspectives of African scientists” (2021)
  19. UNESCO: “Call to Action: Closing the gender gap in science” (February 2024)
  20. Kavli Foundation: “Expanding MRI Research in Africa” (African Brain Data Science Academy, 2023)
  21. iAfrica.com: “African Researchers Build Landmark AI Dataset to Close Language Gap” (African Next Voices, 2024)
  22. Masakhane open-source models and datasets (2018-2024)
  23. IZA World of Labor: “The brain drain from developing countries”
  24. Science Advances: “Systematic inequality and hierarchy in faculty hiring networks” (2015)
  25. Institute of Physics: “Quarter of UK university physics departments risk closure as funding crisis bites” (2024)
  26. AIP.org: “US Physics Departments Expect to Shrink Graduate Programs” (2024)
  27. Stanford Encyclopedia of Philosophy: “Computer Simulations in Science”
  28. CERN: “Open source for open science” and Open Source Program Office documentation
  29. Symmetry Magazine: “New strategy for Latin American physics”

Tim Green

UK-based Systems Theorist & Independent Technology Writer

Tim explores the intersections of artificial intelligence, decentralised cognition, and posthuman ethics. His work, published at smarterarticles.co.uk, challenges dominant narratives of technological progress while proposing interdisciplinary frameworks for collective intelligence and digital stewardship.

His writing has been featured on Ground News and shared by independent researchers across both academic and technological communities.

ORCID: 0009-0002-0156-9795
Email: tim@smarterarticles.co.uk
