Your Cloud Is Drying My River: The Real Cost of AI

In Mesa, Arizona, city officials approved an $800 million data centre development in the midst of the driest 12 months the region had seen in 126 years. The facility would gulp up to 1.25 million gallons of water daily, enough to supply a town of 50,000 people. Meanwhile, just miles away, state authorities were revoking construction permits for new homes because groundwater had run dry. The juxtaposition wasn't lost on residents: their taps might run empty whilst servers stayed cool.
This is the sharp edge of artificial intelligence's environmental paradox. As AI systems proliferate globally, the infrastructure supporting them has become one of the most resource-intensive industries on the planet. Yet most people interacting with ChatGPT or generating images with Midjourney have no idea that each query leaves a physical footprint measured in litres and kilowatt-hours.
The numbers paint a sobering picture. In 2023, United States data centres consumed 17 billion gallons of water directly through cooling systems, according to a 2024 report from the Lawrence Berkeley National Laboratory. That figure could double or even quadruple by 2028. Add the 211 billion gallons consumed indirectly through electricity generation, and the total water footprint becomes staggering. To put it in tangible terms: every 10 to 50 ChatGPT interactions consume roughly half a litre of water in the data centre.
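Back-of-envelope arithmetic makes the scale concrete. A minimal sketch using only the figures above; the per-query range is an illustration of the reported ratio, not a measurement:

```python
# Scale check using the figures reported above.
# The per-query range is illustrative, not a measurement.

DIRECT_GALLONS_2023 = 17e9     # direct cooling water, US data centres (LBNL)
INDIRECT_GALLONS_2023 = 211e9  # indirect water via electricity generation
LITRES_PER_GALLON = 3.785

total_litres = (DIRECT_GALLONS_2023 + INDIRECT_GALLONS_2023) * LITRES_PER_GALLON
print(f"Total 2023 water footprint: {total_litres / 1e12:.2f} trillion litres")

# Half a litre per 10-50 interactions implies a per-query water cost of:
low_ml, high_ml = 500 / 50, 500 / 10
print(f"Per ChatGPT query: {low_ml:.0f}-{high_ml:.0f} ml")
```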
On the carbon side, data centres produced 140.7 megatons of CO2 in 2024, an amount that would take roughly 6.4 billion trees a year to absorb. By 2030, these facilities may consume between 4.6 and 9.1 per cent of total U.S. electricity generation, up from an estimated 4 per cent in 2024. Morgan Stanley projects that AI-optimised data centres will quadruple their electricity consumption, with global emissions rising from 200 million metric tons today to 600 million tons annually by 2030.
The crisis is compounded by a transparency problem that borders on the Kafkaesque. Analysis by The Guardian found that actual emissions from data centres owned by Google, Microsoft, Meta and Apple were likely around 7.62 times greater than officially reported between 2020 and 2022. The discrepancy stems from creative accounting: firms claim carbon neutrality by purchasing renewable energy credits whilst their actual local emissions, generated by drawing power from carbon-intensive grids, go unreported or downplayed.
Meta's 2022 data centre operations illustrate the shell game perfectly. Using market-based accounting with purchased credits, the company reported a mere 273 metric tons of CO2. Calculate emissions using the actual grid mix that powered those facilities, however, and the figure balloons to over 3.8 million metric tons. It's the corporate equivalent of claiming you've gone vegetarian because you bought someone else's salad.
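The mechanics of the shell game are simple enough to sketch. In the toy example below, the grid mix and credit volumes are invented; only the two headline totals come from the reported figures:

```python
# Toy comparison of the two accounting methods. Grid mix and credit volumes
# are invented; only the two headline totals come from the reported figures.

consumption_kwh = {"coal_heavy_grid": 6.0e9, "cleaner_grid": 3.0e9}     # assumed
intensity_kg_per_kwh = {"coal_heavy_grid": 0.55, "cleaner_grid": 0.17}  # assumed

# Location-based: emissions of the electricity actually drawn from each grid.
location_based_t = sum(
    kwh * intensity_kg_per_kwh[grid] / 1000
    for grid, kwh in consumption_kwh.items()
)

# Market-based: renewable energy certificates bought elsewhere are netted off,
# here calibrated so the reported figure shrinks to 273 tonnes.
credits_t = location_based_t - 273
market_based_t = location_based_t - credits_t

print(f"Location-based: {location_based_t:,.0f} t CO2")  # ~3.8 million tonnes
print(f"Market-based:   {market_based_t:,.0f} t CO2")    # 273 tonnes
```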
The Opacity Economy
The lack of consistent, mandatory reporting creates an information vacuum that serves industry interests whilst leaving policymakers, communities and the public flying blind. Companies rarely disclose how much water their data centres consume. When pressed, they point to aggregate sustainability reports that blend data centre impacts with other operations, making it nearly impossible to isolate the true footprint of AI infrastructure.
This opacity isn't accidental. Without standardised metrics or mandatory disclosure requirements in most jurisdictions, companies can cherry-pick flattering data. They can report power usage effectiveness (PUE), a metric that measures energy efficiency but says nothing about absolute consumption. They can trumpet renewable energy purchases without mentioning that those credits often come from wind farms hundreds of miles away, whilst the data centre itself runs on a coal-heavy grid.
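PUE is simply total facility energy divided by the energy that reaches IT equipment, which is why a flattering score can coexist with an enormous absolute footprint. A minimal sketch with invented numbers:

```python
# PUE = total facility energy / IT equipment energy. Two hypothetical
# facilities report the same "efficient" score at very different scales.

def pue(total_mwh: float, it_mwh: float) -> float:
    """Power usage effectiveness: 1.0 is the theoretical ideal."""
    return total_mwh / it_mwh

facilities = {
    "modest":     {"it_mwh": 10_000,    "total_mwh": 12_000},
    "hyperscale": {"it_mwh": 1_000_000, "total_mwh": 1_200_000},
}

for name, f in facilities.items():
    overhead = f["total_mwh"] - f["it_mwh"]
    print(f"{name}: PUE {pue(f['total_mwh'], f['it_mwh']):.2f}, "
          f"overhead {overhead:,} MWh")
# Both print PUE 1.20, yet the hyperscale site's cooling-and-overhead
# consumption alone is 200,000 MWh -- PUE hides absolute demand.
```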
Even where data exists, comparing facilities becomes an exercise in frustration. One operator might report annual water consumption, another might report it per megawatt of capacity, and a third might not report it at all. Carbon emissions face similar inconsistencies: some companies report only Scope 1 and 2 emissions whilst conveniently omitting Scope 3 (supply chain and embodied carbon in construction).
The stakes are profound. Communities weighing whether to approve new developments lack data to assess true environmental trade-offs. Policymakers can't benchmark reasonable standards without knowing current baselines. Investors attempting to evaluate ESG risks make decisions based on incomplete figures. Consumers have no way to make informed choices.
The European Union's revised Energy Efficiency Directive, which came into force in 2024, requires data centres with power demand above 500 kilowatts to report energy and water usage annually to a publicly accessible database. The first reports, covering calendar year 2023, were due by 15 September 2024. The Corporate Sustainability Reporting Directive adds another layer, requiring large companies to disclose sustainability policies, greenhouse gas reduction goals, and detailed emissions data across all scopes starting with 2024 data reported in 2025.
The data collected includes floor area, installed power, data volumes processed, total energy consumption, PUE ratings, temperature set points, waste heat utilisation, water usage metrics, and renewable energy percentages. This granular information will provide the first comprehensive picture of European data centre environmental performance.
These mandates represent progress, but they're geographically limited and face implementation challenges. Compliance requires sophisticated monitoring systems that many operators lack. Verification mechanisms remain unclear. And crucially, the regulations focus primarily on disclosure rather than setting hard limits. You can emit as much as you like, provided you admit to it.
The Water Crisis Intensifies
Water consumption presents particular urgency because data centres are increasingly being built in regions already facing water stress. Analysis by Bloomberg found that more than 160 new AI data centres have appeared across the United States in the past three years in areas with high competition for scarce water resources, a 70 per cent increase from the prior three-year period. In some cases, data centres use over 25 per cent of local community water supplies.
Northern Virginia's Loudoun County, home to the world's greatest concentration of data centres (covering an area equivalent to 100,000 football fields), exemplifies the pressure. Data centres serviced by the Loudoun water utility increased their drinking water use by more than 250 per cent between 2019 and 2023. When the region suffered a months-long drought in 2024, data centres continued operating at full capacity, pulling millions of gallons daily whilst residents faced conservation restrictions.
The global pattern repeats with numbing regularity. In Uruguay, communities protested unsustainable water use during drought recovery. In Chile, facilities tap directly into drinking water reservoirs. In Aragon, Spain, demonstrators marched under the slogan “Your cloud is drying my river.” The irony is acute: the digital clouds we imagine as ethereal abstractions are, in physical reality, draining literal rivers.
Traditional data centre cooling relies on evaporative systems that spray water over heat exchangers or cooling towers. As warm air passes through, water evaporates, carrying heat away. It's thermodynamically efficient but water-intensive by design. Approximately 80 per cent of water withdrawn by data centres evaporates, with the remaining 20 per cent discharged to municipal wastewater facilities, often contaminated with cooling chemicals and minerals.
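The physics explains the thirst: every kilogram of water that evaporates carries away its latent heat of vaporisation, roughly 2.26 megajoules. A rough calculation, ignoring blowdown, drift and sensible (non-evaporative) heat transfer:

```python
# Each kilogram of water evaporated removes its latent heat of vaporisation.
# Rough sketch: ignores blowdown, drift and sensible cooling.

LATENT_HEAT_MJ_PER_KG = 2.26  # ~2.26 MJ/kg; slightly higher at tower temperatures
MJ_PER_MWH = 3600.0

litres_per_mwh = MJ_PER_MWH / LATENT_HEAT_MJ_PER_KG  # 1 kg of water ~= 1 litre

heat_load_mw = 100  # hypothetical AI campus rejecting 100 MW of heat
litres_per_day = litres_per_mwh * heat_load_mw * 24

print(f"~{litres_per_mwh:,.0f} litres evaporated per MWh of heat rejected")
print(f"~{litres_per_day / 1e6:.1f} million litres/day at {heat_load_mw} MW")
# ~1,600 litres/MWh and ~3.8 million litres (about 1 million gallons) a day:
# the same order of magnitude as the Mesa facility's permitted draw.
```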
On average, a data centre uses approximately 300,000 gallons of water per day. Large facilities can consume 5 million gallons daily. An Iowa data centre consumed 1 billion gallons in 2024, enough to supply all of Iowa's residential water for five days.
The water demands become even more acute when considering that AI workloads generate significantly more heat than traditional computing. Training a single large language model can require weeks of intensive computation across thousands of processors. As AI capabilities expand and model sizes grow, the cooling challenge intensifies proportionally.
Google's water consumption has increased by nearly 88 per cent since 2019, primarily driven by data centre expansion. Amazon's emissions rose to 68.25 million metric tons of CO2 equivalent in 2024, a 6 per cent increase from the previous year and the company's first emissions rise since 2021. Microsoft's greenhouse gas emissions for 2023 were 29.1 per cent higher than its 2020 baseline, directly contradicting the company's stated climate ambitions.
These increases come despite public commitments to the contrary. Before the AI boom, Amazon, Microsoft and Google all pledged to cut their carbon footprints and become water-positive by 2030. Microsoft President Brad Smith has acknowledged that the company's AI push has made it “four times more difficult” to achieve carbon-negative goals by the target date, though he maintains the commitment stands. The admission raises uncomfortable questions about whether corporate climate pledges will be abandoned when they conflict with profitable growth opportunities.
Alternative Technologies and Their Trade-offs
The good news is that alternatives exist. The challenge is scaling them economically whilst navigating complex trade-offs between water use, energy consumption and practicality.
Closed-loop liquid cooling systems circulate water or specialised coolants through a closed circuit that never evaporates. Water flows directly to servers via cold plates or heat exchangers, absorbs heat, returns to chillers where it's cooled, then circulates again. Once filled during construction, the system requires minimal water replenishment.
Microsoft has begun deploying closed-loop, chip-level liquid cooling systems that eliminate evaporative water use entirely, reducing annual consumption by more than 125 million litres per facility. Research suggests closed-loop systems can reduce freshwater use by 50 to 70 per cent compared to traditional evaporative cooling.
The trade-off? Energy consumption. Closed-loop systems typically use 10 to 30 per cent more electricity to power chillers than evaporative systems, which leverage the thermodynamic efficiency of phase change. You can save water but increase your carbon footprint, or vice versa. Optimising both simultaneously requires careful engineering and higher capital costs.
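The trade-off can be quantified, at least roughly. The sketch below takes the 50 to 70 per cent and 10 to 30 per cent ranges above at their midpoints; the facility load, water usage effectiveness and grid carbon intensity are assumptions:

```python
# Rough water-vs-carbon comparison for swapping evaporative cooling for a
# closed-loop system. Every parameter here is an assumption for illustration.

ANNUAL_IT_LOAD_MWH = 200_000   # hypothetical facility
EVAP_WUE_L_PER_KWH = 1.8       # assumed water usage effectiveness, evaporative
WATER_SAVING = 0.60            # midpoint of the 50-70% range above
EXTRA_ENERGY_FRACTION = 0.20   # midpoint of the 10-30% range above
GRID_KG_CO2_PER_KWH = 0.40     # assumed grid carbon intensity

water_saved_l = ANNUAL_IT_LOAD_MWH * 1000 * EVAP_WUE_L_PER_KWH * WATER_SAVING
extra_co2_t = ANNUAL_IT_LOAD_MWH * EXTRA_ENERGY_FRACTION * GRID_KG_CO2_PER_KWH

print(f"Water saved:     {water_saved_l / 1e6:,.0f} million litres/year")
print(f"Extra emissions: {extra_co2_t:,.0f} tonnes CO2/year")
# On a coal-heavy grid the carbon penalty grows; on a clean one it shrinks,
# which is why grid mix and local water stress should drive the choice.
```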
Immersion cooling submerges entire servers in tanks filled with non-conductive dielectric fluids, providing extremely efficient heat transfer. Companies like Iceotope and LiquidStack are pioneering commercial immersion cooling solutions that can handle the extreme heat densities generated by AI accelerators. The fluids are expensive, however, and retrofitting existing data centres is impractical.
Purple pipe systems use reclaimed wastewater for cooling instead of potable water. Data centres can embrace the energy efficiency of evaporative cooling whilst preserving drinking water supplies. In 2023, Loudoun Water in Virginia delivered 815 million gallons of reclaimed water to customers, primarily data centres, saving an equivalent amount of potable water. Expanding purple pipe infrastructure requires coordination between operators, utilities and governments, plus capital investment in dual piping systems.
Geothermal cooling methods such as aquifer thermal energy storage and deep lake water cooling utilise natural cooling from the earth's thermal mass. Done properly, they consume negligible water and require minimal energy for pumping. Geographic constraints limit deployment; you need the right geology or proximity to deep water bodies. Northern European countries with abundant groundwater and cold climates are particularly well-suited to these approaches.
Hybrid approaches are emerging that combine multiple technologies. X-Cooling, a system under development by industry collaborators, blends ambient air cooling with closed-loop liquid cooling to eliminate water use whilst optimising energy efficiency. Proponents estimate it could save 1.2 million tons of water annually for every 100 megawatts of capacity.
The crucial question isn't whether alternatives exist but rather what incentives or requirements will drive adoption at scale. Left to market forces alone, operators will default to whatever maximises their economic returns, which typically means conventional evaporative cooling using subsidised water.
The Policy Patchwork
Global policy responses remain fragmented and inconsistent, ranging from ambitious mandatory reporting in the European Union to virtually unregulated expansion in many developing nations.
The EU leads in regulatory ambition. The Climate Neutral Data Centre Pact has secured commitments from operators responsible for more than 90 per cent of European data centre capacity to achieve climate neutrality by 2030. Signatories include Amazon Web Services, Google, Microsoft, IBM, Intel, Digital Realty, Equinix and dozens of others. As of 1 January 2025, new data centres in cold climates must meet an annual PUE target of 1.3 (current industry average is 1.58), effectively mandating advanced cooling technologies.
The enforcement mechanisms and penalties for non-compliance remain somewhat nebulous, however. The pact is voluntary; signatories can theoretically withdraw if requirements become inconvenient. The reporting requirements create transparency but don't impose hard caps on consumption or emissions. This reflects the EU's broader regulatory philosophy of transparency and voluntary compliance before moving to mandatory limits, a gradualist approach that critics argue allows environmental damage to continue whilst bureaucracies debate enforcement mechanisms.
Asia-Pacific countries are pursuing varied approaches that reflect different priorities and governmental structures. Singapore launched its Green Data Centre Roadmap in May 2024, aiming to grow capacity sustainably through green energy and energy-efficient technology, with plans to introduce standards for energy-efficient IT equipment and liquid cooling by 2025. The city-state, facing severe land and resource constraints, has strong incentives to maximise efficiency per square metre.
China announced plans to decrease the average PUE of its data centres to less than 1.5 by 2025, with renewable energy utilisation increasing by 10 per cent annually. Given China's massive data centre buildout to support domestic tech companies and government digitalisation initiatives, achieving these targets would represent a significant environmental improvement. Implementation and verification remain questions, however, particularly in a regulatory environment where transparency is limited.
Malaysia and Singapore have proposed mandatory sustainability reporting starting in 2025, with Hong Kong, South Korea and Taiwan targeting 2026. Japan's Financial Services Agency is developing a sustainability disclosure standard similar to the EU's CSRD, potentially requiring reporting from 2028. This regional convergence towards mandatory disclosure suggests a recognition that voluntary approaches have proven insufficient.
In the United States, much regulatory action occurs at the state level, creating a complex patchwork of requirements that vary dramatically by jurisdiction. California's Senate Bill 253, the Climate Corporate Data Accountability Act, represents one of the most aggressive state-level requirements, mandating detailed climate disclosures from large companies operating in the state. Virginia, which hosts the greatest concentration of U.S. data centres, has seen a flood of legislative activity. In 2025 legislative sessions, 113 bills across 30 states addressed data centres, with Virginia alone considering 28 bills covering everything from tax incentives to water usage restrictions.
Virginia's House Bill 1601, which would have mandated environmental impact assessments on water usage for proposed data centres, was vetoed by Governor Glenn Youngkin in May 2025, highlighting the political tension between attracting economic investment and managing environmental impacts.
Some states are attaching sustainability requirements to tax incentives, attempting to balance economic development with environmental protection. Virginia requires data centres to source at least 90 per cent of energy from carbon-free renewable sources beginning in 2027 to qualify for tax credits. Illinois requires data centres to become carbon-neutral within two years of being placed into service to receive incentives. Michigan extended incentives through 2050 (and 2065 for redevelopment sites) whilst tying benefits to brownfield and former power plant locations, encouraging reuse of previously developed land.
Oregon has proposed particularly stringent penalties: a bill requiring data centres to reduce carbon emissions by 60 per cent by 2027, with non-compliance resulting in fines of $12,000 per megawatt-hour per day. Minnesota eliminated electricity tax relief for data centres whilst adding steep annual fees and enforcing wage and sustainability requirements. Kansas launched a 20-year sales tax exemption requiring $250 million in capital investment and 20-plus jobs, setting a high bar for qualification.
The trend is towards conditions-based incentives rather than blanket tax breaks. States recognise they have leverage at the approval stage and are using it to extract sustainability commitments. The challenge is ensuring those commitments translate into verified performance over time.
At the federal level, lawmakers in both chambers introduced the Artificial Intelligence Environmental Impacts Act in early 2024, directing the EPA to study AI's environmental footprint and develop measurement standards and a voluntary reporting system. The legislation remains in committee, stalled by partisan disagreements and industry lobbying.
Incentives, Penalties and What Might Actually Work
The question of what policy mechanisms can genuinely motivate operators to prioritise environmental stewardship requires grappling with economic realities. Data centre operators respond to incentives like any business: they'll adopt sustainable practices when profitable, required by regulation, or necessary to maintain social licence to operate.
Voluntary initiatives have demonstrated that good intentions alone are insufficient. Microsoft, Google and Amazon all committed to aggressive climate goals, yet their emissions trajectories are headed in the wrong direction. Without binding requirements and verification, corporate sustainability pledges function primarily as marketing.
Carbon pricing represents one economically efficient approach: make operators pay for emissions and let market forces drive efficiency. The challenge is setting prices high enough to drive behaviour change without crushing industry competitiveness. Coordinated international carbon pricing would solve the competitiveness problem but remains politically unlikely.
Water pricing faces similar dynamics. In many jurisdictions, industrial water is heavily subsidised or priced below its scarcity value. Tiered pricing offers a middle path: charge below-market rates for baseline consumption but impose premium prices for usage above certain thresholds. Couple this with seasonal adjustments that raise prices during drought conditions, and you create dynamic incentives aligned with actual scarcity.
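What such a tariff might look like in practice is easy to sketch; every rate and threshold below is invented for illustration:

```python
# Sketch of a tiered, seasonally adjusted industrial water tariff.
# All rates and thresholds are invented for illustration.

def monthly_water_bill(litres: float, drought: bool = False) -> float:
    """Subsidised baseline band, escalating rates above thresholds,
    and a scarcity multiplier during declared drought conditions."""
    tiers = [                 # (upper bound in litres, price per kilolitre)
        (10_000_000, 0.50),   # baseline allocation at below-market rate
        (50_000_000, 2.00),   # premium band
        (float("inf"), 8.00), # deterrent band for heavy industrial draw
    ]
    bill, lower = 0.0, 0.0
    for upper, rate in tiers:
        band = min(litres, upper) - lower
        if band <= 0:
            break
        bill += (band / 1000) * rate
        lower = upper
    return bill * (2.5 if drought else 1.0)

usage = 120_000_000  # ~120 megalitres in a month -- a large facility
print(f"Normal month:  ${monthly_water_bill(usage):,.0f}")
print(f"Drought month: ${monthly_water_bill(usage, drought=True):,.0f}")
```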
Performance standards sidestep pricing politics by prohibiting construction or operation of facilities exceeding specified thresholds for PUE, water usage effectiveness (WUE) or carbon usage effectiveness (CUE). Singapore's approach exemplifies this strategy. The downside is rigidity: standards lock in specific technologies, potentially excluding innovations that achieve environmental goals through different means.
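The three metrics follow The Green Grid's conventions: PUE divides total facility energy by IT energy, WUE divides water consumed by IT energy, and CUE divides carbon emitted by IT energy. A compliance check against hypothetical permit thresholds might look like this:

```python
# Headline efficiency metrics (Green Grid conventions) and a threshold check.
# Threshold values are hypothetical permit limits, not any real standard.

def pue(total_kwh: float, it_kwh: float) -> float:
    return total_kwh / it_kwh    # dimensionless, ideal = 1.0

def wue(water_litres: float, it_kwh: float) -> float:
    return water_litres / it_kwh  # litres per kWh of IT energy

def cue(co2_kg: float, it_kwh: float) -> float:
    return co2_kg / it_kwh        # kg CO2e per kWh of IT energy

THRESHOLDS = {"pue": 1.3, "wue": 0.4, "cue": 0.2}  # illustrative limits

def compliance(total_kwh, it_kwh, water_litres, co2_kg):
    actual = {
        "pue": pue(total_kwh, it_kwh),
        "wue": wue(water_litres, it_kwh),
        "cue": cue(co2_kg, it_kwh),
    }
    return {metric: value <= THRESHOLDS[metric] for metric, value in actual.items()}

# A facility that passes on energy efficiency but fails on water:
print(compliance(total_kwh=120e6, it_kwh=100e6, water_litres=80e6, co2_kg=15e6))
# -> {'pue': True, 'wue': False, 'cue': True}
```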
Mandatory disclosure with verification might be the most immediately viable path. Require operators to report standardised metrics on energy and water consumption, carbon emissions across all scopes, cooling technologies deployed, and renewable energy percentages. Mandate third-party audits. Make all data publicly accessible.
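A standardised disclosure record is not hard to specify. The sketch below is modelled loosely on the EU directive's reporting fields; the field names are assumptions, not an official schema:

```python
# Sketch of a standardised disclosure record, modelled loosely on the EU
# Energy Efficiency Directive's reporting fields. Field names are assumptions.

from dataclasses import dataclass

@dataclass
class DataCentreDisclosure:
    facility_id: str
    reporting_year: int
    total_energy_mwh: float
    it_energy_mwh: float
    water_consumed_m3: float
    water_source: str              # "potable", "reclaimed", "groundwater"
    scope1_tco2e: float
    scope2_location_tco2e: float   # grid-mix based, not credit-adjusted
    scope2_market_tco2e: float     # credit-adjusted, reported for comparison
    scope3_tco2e: float
    renewable_fraction: float      # 0.0 to 1.0, matched to actual grid draw
    cooling_technology: str        # "evaporative", "closed-loop", "immersion"
    waste_heat_recovered_mwh: float
    third_party_verified: bool

    @property
    def pue(self) -> float:
        return self.total_energy_mwh / self.it_energy_mwh
```

Requiring both location-based and market-based Scope 2 figures side by side would make the kind of gap seen in the Meta example immediately visible.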
Transparency creates accountability through multiple channels. Investors can evaluate ESG risks. Communities can assess impacts before approving developments. Media and advocacy groups can spotlight poor performers, creating reputational pressure. And the data provides policymakers the foundation to craft evidence-based regulations.
The EU's Energy Efficiency Directive and CSRD represent this approach. The United States could adopt similar federal requirements, building on the proposed Artificial Intelligence Environmental Impacts Act but making its reporting mandatory. The iMasons Climate Accord has called for “nutrition labels” on data centres detailing sustainability outcomes.
The key is aligning financial incentives with environmental outcomes whilst maintaining flexibility for innovation. A portfolio approach combining mandatory disclosure, performance standards for new construction, carbon and water pricing reflecting scarcity, financial incentives for superior performance, and penalties for egregious behaviour would create multiple reinforcing pressures.
International coordination would amplify effectiveness. If major economic blocs adopted comparable standards and reporting requirements, operators couldn't simply relocate to the most permissive jurisdiction. Getting international agreement is difficult, but precedents exist. The Montreal Protocol successfully addressed ozone depletion through coordinated regulation. Data centre impacts are more tractable than civilisational-scale challenges like total decarbonisation.
The Community Dimension
Lost in discussions of megawatts and PUE scores are the communities where data centres locate. These facilities occupy physical land, draw from local water tables, connect to regional grids, and compete with residents for finite resources.
Chandler, Arizona provides an instructive case. In 2015, the city passed an ordinance restricting water-intensive businesses that don't create many jobs, effectively deterring data centres. The decision reflected citizen priorities: in a desert experiencing its worst drought in recorded history, consuming millions of gallons daily to cool servers whilst generating minimal employment wasn't an acceptable trade-off.
Other communities have made different calculations, viewing data centres as economic assets despite environmental costs. The decision often depends on how transparent operators are about impacts and how equitably costs and benefits are distributed.
Best practices are emerging. Some operators fund water infrastructure improvements that benefit entire communities. Others prioritise hiring locally and invest in training programmes. Procurement of renewable energy, if done locally through power purchase agreements with regional projects, can accelerate clean energy transitions. Waste heat recovery systems that redirect data centre heat to district heating networks or greenhouses turn a liability into a resource.
Proactive engagement should be a prerequisite for approval. Require developers to conduct and publicly release comprehensive environmental impact assessments. Hold public hearings where citizens can question operators and independent technical experts. Make approval contingent on binding community benefit agreements that specify environmental performance, local hiring commitments, infrastructure investments and ongoing reporting.
Too often, data centre approvals happen through opaque processes dominated by economic development offices eager to announce investment figures. By the time residents learn the details, the decisions are a fait accompli. Shifting to participatory processes would slow approvals but produce more sustainable and equitable outcomes.
Rewiring the System
Addressing the environmental crisis created by AI data centres requires action across multiple domains simultaneously. The essential elements include:
Mandatory, standardised reporting globally. Require all data centres above a specified capacity threshold to annually report detailed metrics on energy consumption, water usage, carbon emissions across all scopes, cooling technologies, renewable energy percentages, and waste heat recovery. Mandate third-party verification and public accessibility through centralised databases.
Performance requirements for new construction tied to local environmental conditions. Water-scarce regions should prohibit evaporative cooling unless using reclaimed water. Areas with carbon-intensive grids should require on-site renewable generation. Cold climates should mandate ambitious PUE targets.
Pricing water and carbon to reflect scarcity and social cost. Eliminate subsidies that make waste economically rational. Implement tiered pricing that charges premium rates for consumption above baselines. Use seasonal adjustments to align prices with real-time conditions.
Strategic financial incentives to accelerate adoption of superior technologies. Offer tax credits for closed-loop cooling, immersion systems, waste heat recovery, and on-site renewable generation. Establish significant penalties for non-compliance, including fines and potential revocation of operating licences.
Investment in alternative cooling infrastructure at scale. Expand purple pipe systems in areas with data centre concentrations. Support geothermal system development where geology permits. Fund research into novel cooling technologies.
Reformed approval processes ensuring community voice. Require comprehensive impact assessments, public hearings and community benefit agreements before approval. Give local governments authority to impose conditions or reject proposals based on environmental capacity.
International coordination through diplomatic channels and trade agreements. Develop consensus standards and mutual recognition agreements. Use trade policy to discourage environmental dumping. Support technology transfer and capacity building in developing nations.
Demand-side solutions through research into more efficient AI architectures, better model compression and edge computing that distributes processing closer to users. Finally, cultivate cultural and corporate norm shifts where sustainability becomes as fundamental to data centre operations as uptime and security.
When the Cloud Touches Ground
The expansion of AI-powered data centres represents a collision between humanity's digital aspirations and planetary physical limits. We've constructed infrastructure that treats water and energy as infinitely abundant whilst generating carbon emissions incompatible with climate stability.
Communities are already pushing back. Aquifers are declining. Grids are straining. The “just build more” mentality is encountering limits, and those limits will only tighten as climate change intensifies water scarcity and energy systems decarbonise. The question is whether we'll address these constraints proactively through thoughtful policy or reactively through crisis-driven restrictions.
The technologies to build sustainable AI infrastructure exist. Closed-loop cooling can eliminate water consumption. Renewable energy can power operations carbon-free. Efficient design can minimise energy waste. The question is whether policy frameworks, economic incentives and social pressures will align to drive adoption before constraints force more disruptive responses.
Brad Smith's acknowledgment that AI has made Microsoft's climate goals “four times more difficult” is admirably honest but deeply inadequate as a policy response. The answer cannot be to accept that AI requires abandoning climate commitments. It must be to ensure AI development occurs within environmental boundaries through regulation, pricing and technological innovation.
Sustainable AI infrastructure is technically feasible. What's required is political will to impose requirements, market mechanisms to align incentives, transparency to enable accountability, and international cooperation to prevent a race to the bottom. None of these elements exist sufficiently today, which is why emissions rise whilst pledges multiply.
The data centres sprouting across water-stressed regions aren't abstract nodes in a cloud; they're physical installations making concrete claims on finite resources. Every litre consumed, every kilowatt drawn, every ton of carbon emitted represents a choice. We can continue making those choices unconsciously, allowing market forces to prioritise private profit over collective sustainability. Or we can choose deliberately, through democratic processes and informed by transparent data, to ensure the infrastructure powering our digital future doesn't compromise our environmental future.
The residents of Mesa, Arizona, watching data centres rise whilst their wells run dry, deserve better. So do communities worldwide facing the same calculus. The question isn't whether we can build sustainable AI infrastructure. It's whether we will, and the answer depends on whether policymakers, operators and citizens decide that environmental stewardship isn't negotiable, even when the stakes are measured in terabytes and training runs.
The technology sector has repeatedly demonstrated capacity for extraordinary innovation when properly motivated. Carbon-free data centres are vastly simpler than quantum computing or artificial general intelligence. What's lacking isn't capability but commitment. Building that commitment through robust regulation, meaningful incentives and uncompromising transparency isn't anti-technology; it's ensuring technology serves humanity rather than undermining the environmental foundations civilisation requires.
The cloud must not dry the rivers. The servers must not drain the wells. These aren't metaphors; they're material realities. Addressing them requires treating data centre environmental impacts with the seriousness they warrant: as a central challenge of sustainable technology development in the 21st century, demanding comprehensive policy responses, substantial investment and unwavering accountability.
The path forward is clear. Whether we take it depends on choices made in legislative chambers, corporate boardrooms, investor evaluations and community meetings worldwide. The infrastructure powering artificial intelligence must itself become more intelligent, operating within planetary boundaries rather than exceeding them. That transformation won't happen spontaneously. It requires us to build it, deliberately and urgently, before the wells run dry.
Sources and References
Lawrence Berkeley National Laboratory. (2024). “2024 United States Data Center Energy Usage Report.” https://eta.lbl.gov/publications/2024-lbnl-data-center-energy-usage-report
The Guardian. (2024). Analysis of data centre emissions reporting by Google, Microsoft, Meta and Apple.
Bloomberg. (2025). “The AI Boom Is Draining Water From the Areas That Need It Most.” https://www.bloomberg.com/graphics/2025-ai-impacts-data-centers-water-data/
European Commission. (2024). Energy Efficiency Directive and Corporate Sustainability Reporting Directive implementation documentation.
Climate Neutral Data Centre Pact. (2024). Signatory list and certification documentation. https://www.climateneutraldatacentre.net/
Microsoft. (2025). Environmental Sustainability Report. Brad Smith, Vice Chair and President, and Melanie Nakagawa, Chief Sustainability Officer.
Morgan Stanley. (2024). Analysis of AI-optimised data centre electricity consumption and emissions projections.
NBC News. (2021). “Drought-stricken communities push back against data centers.”
NPR. (2022). “Data centers, backbone of the digital economy, face water scarcity and climate risk.”
Various state legislative documents: Virginia HB 1601, California SB 253, Oregon data centre emissions reduction bill, Illinois carbon neutrality requirements.

Tim Green
UK-based Systems Theorist & Independent Technology Writer
Tim explores the intersections of artificial intelligence, decentralised cognition, and posthuman ethics. His work, published at smarterarticles.co.uk, challenges dominant narratives of technological progress while proposing interdisciplinary frameworks for collective intelligence and digital stewardship.
His writing has been featured on Ground News and shared by independent researchers across both academic and technological communities.
ORCID: 0009-0002-0156-9795
Email: tim@smarterarticles.co.uk