Your Data Outlives the App: The Governance Problem Nobody Has Solved

The app on your phone that you opened this morning, the one you use to check the weather or scan a receipt or convert a file format, may be one of the last of its kind. Not because it will stop working, but because the entire concept of downloading, installing, and maintaining software is hurtling toward obsolescence. In its place, something stranger and more fluid is taking shape: software that exists for minutes, hours, or days before vanishing without a trace, conjured from nothing by artificial intelligence and dissolved just as quickly once it has served its purpose.
Welcome to the age of the disposable app.
This is not a speculative fantasy plucked from a science fiction screenplay. It is a prediction grounded in converging trends across AI-assisted code generation, serverless cloud infrastructure, and a growing cultural exhaustion with the bloated, notification-heavy app ecosystems that have defined the smartphone era. By 2026, industry leaders and analysts anticipate that AI will routinely generate temporary, purpose-built software modules on demand, modules that close after serving their function and leave behind nothing but the data their users choose to keep. The implications for how we relate to technology, own our data, and understand what “software” even means are profound, disorienting, and largely uncharted.
Software That Forgets Itself
The idea of ephemeral software is not entirely new. Serverless computing, which emerged in the mid-2010s with platforms like AWS Lambda, already operates on a principle of transience: functions spin up in response to events, execute their logic, and shut down. The global serverless computing market, projected by Grand View Research to reach $52.13 billion by 2030 at a compound annual growth rate of 14.1 per cent, has normalised the concept of infrastructure that appears and vanishes on demand. What is new is the combination of large language models capable of generating entire applications from natural language prompts, serverless infrastructure that can host them without persistent servers, and a user base increasingly comfortable with the idea that code does not need to live forever.
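The serverless pattern described above can be sketched in a few lines: a function that exists only for the duration of a single invocation, keeping no state between calls. The handler shape below mirrors AWS Lambda's Python convention (`def handler(event, context)`), but the event fields and the VAT calculation are illustrative assumptions, not a real service payload.

```python
# A minimal sketch of the serverless model: spin up in response to an event,
# execute one piece of logic, return, and vanish. No state survives the call.
import json

def handler(event, context=None):
    """One-shot computation; the 'app' is gone the moment this returns."""
    amount = event.get("amount_pence", 0)
    vat = round(amount * 0.20)  # an illustrative one-off calculation
    return {
        "statusCode": 200,
        "body": json.dumps({"amount_pence": amount, "vat_pence": vat}),
    }

# Local invocation, standing in for the cloud trigger:
result = handler({"amount_pence": 1000})
print(result["body"])
```

The same shape scales from a single function to an entire AI-generated module: the platform provisions it on demand, routes one event through it, and tears it down.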
Andrej Karpathy, co-founder of OpenAI and former head of AI at Tesla, captured this shift vividly in his 2025 year-in-review blog post. He described having “vibe coded entire ephemeral apps just to find a single bug because why not,” adding that code is “suddenly free, ephemeral, malleable, discardable after single use.” The term “vibe coding,” which Karpathy coined in February 2025, describes a mode of programming where developers “fully give in to the vibes, embrace exponentials, and forget that the code even exists.” What began as an amusing experiment for weekend projects has, within a year, evolved into what Karpathy now calls “agentic engineering,” a workflow where autonomous AI agents handle the vast majority of code production while humans orchestrate and verify. Writing on his personal blog about his experience vibe coding MenuGen, an end-to-end application built entirely by Cursor and Claude, Karpathy expressed excitement about a future where “the barrier to app drop to ~zero, where anyone could build and publish an app just as easily as they can make a TikTok.”
The numbers support the trajectory. According to Stack Overflow's 2025 Developer Survey, which gathered responses from over 49,000 developers across 177 countries, 84 per cent of respondents are using or planning to use AI tools in their development process, up from 76 per cent the previous year. Fully 51 per cent of professional developers use AI tools daily. Some 44 per cent of developers are now turning to AI tools to learn to code, up from 37 per cent the year before. Meanwhile, Gartner projects that by 2026, low-code development tools will account for 75 per cent of new application development, up from less than 25 per cent in 2020. The global low-code market itself is forecast to reach $44.5 billion by 2026, growing at a compound annual rate of 19 per cent. Eighty-four per cent of enterprises have already adopted low-code or no-code tools to reduce IT backlogs, and organisations adopting low-code report 50 to 70 per cent faster development cycles compared to traditional methods.
These are not incremental improvements. They represent a fundamental rewiring of how software comes into existence.
When Apps Become Verbs
Chris Royles, Field CTO for EMEA at Cloudera and a Fellow of the British Computer Society who holds a PhD in artificial intelligence from the University of Liverpool, is among those who have articulated this vision most directly. In a set of predictions published for 2026, Royles stated that “AI will start to radically change the way we think about apps, how they function and how they're built.” Today's applications, he noted, are declarative: millions of lines of code following fixed rules. AI is tearing up that rulebook. Users will soon request temporary modules generated by code and a prompt, and “once that function has served its purpose, it closes.” These disposable apps, Royles suggested, can be “built and rebuilt in seconds.”
His colleague Paul Mackay, RVP Cloud EMEA and APAC at Cloudera, offered a complementary warning. Many organisations, Mackay observed, “will begin shelving their 'Frankenstein' AI applications they built for specific business use cases, as costs spiral and governance concerns grow.” The implication is striking: not only will new software be born ephemeral, but existing permanent software may itself be retired and replaced by disposable alternatives as organisations recognise that maintaining complex, bespoke AI applications is becoming untenable.
The shift is already visible in practice. In January 2026, the global ecommerce platform Rokt held a company-wide hackathon (internally branded as “Rokt'athon”) in which more than 700 employees, many of them non-technical, used Replit's AI agent to build 135 fully functional internal applications in a single 24-hour period. Lawyers, marketers, and operations staff built tools for hiring workflows, analytics dashboards, training games, and SQL query repositories. As one Rokt executive put it, “We're empowering people who couldn't code with the ability to build software. And it's exciting, having lawyers come up to me and say, 'I've been building in Replit.'” None of these applications went through a traditional software development lifecycle. None were designed to last indefinitely. They were built to solve a problem, and once the problem was solved, many would be retired or rebuilt from scratch.
This pattern, where software becomes a verb rather than a noun, something you do rather than something you have, represents a break with decades of computing convention. Since the dawn of the personal computer, software has been a product: boxed, licensed, installed, updated, patched, and eventually deprecated through a lifecycle measured in years. The disposable app collapses that lifecycle into days, hours, or even minutes.
The Exhaustion Economy
The appeal of ephemeral software is not purely technological. It is also cultural, born from a mounting frustration with the current state of digital life.
The mobile app ecosystem has become, by most measures, unsustainable. According to AppsFlyer's 2025 uninstall report, more than one in every two apps installed is uninstalled within 30 days of download. Mobile apps lose 77 per cent of their daily active users within the first three days. By day 30, the average retention rate drops to approximately 6 per cent, meaning 94 per cent of users churn within a month. Dating apps exhibit an uninstall rate of roughly 65 per cent, and gaming apps are not far behind at 52 per cent. Performance remains the single most decisive factor: nearly 96 per cent of users consider performance a key element in deciding whether to keep or delete an app, and more than 40 per cent now drop applications that seek unnecessary access to their device or personal data.
Meanwhile, organisations are drowning in SaaS sprawl. The average enterprise now uses 112 SaaS applications, and the global SaaS market is projected to reach approximately $408 billion in 2025. There are over 42,000 SaaS companies worldwide. Reports indicate that 91 per cent of AI tools in organisations remain unmanaged, creating both productivity drag and security vulnerabilities. Subscription fatigue is measurable and growing: users are exhausted by overlapping features across dozens of apps, endless notifications, and the cognitive overhead of managing an ever-expanding digital toolset.
Disposable apps offer an alternative logic. Rather than downloading a permanent application to perform a task you might need once, you describe what you need, an AI generates it, you use it, and it disappears. No installation. No subscription. No notification settings to configure. No account to create and subsequently forget the password for. The software exists precisely as long as it is useful and not a moment longer.
This aligns with a broader cultural movement toward what designers and technologists have begun calling “minimalist utility,” the idea that technology should do one job exceptionally well, remove friction, and respect the user's time, attention, and data. After years of maximalist design that promised ever more features, integrations, and engagement surfaces, minimalist utility promises “enough”: the smallest set of capabilities that reliably solves a real problem. The shift is not anti-innovation. It is a demand for clarity, control, and measurable value, a recognition that the app economy's relentless expansion has produced diminishing returns for the people it was supposed to serve.
Where Does the Data Go?
The most unsettling question raised by disposable software is not about the software itself. It is about the data.
When an application exists for a few hours and then vanishes, what happens to the information it processed? If an AI generates a temporary expense tracker for a business trip, analyses a set of medical records for a quick consultation, or creates a one-off survey tool for customer feedback, where do those numbers, those records, those responses reside once the app closes? Who owns them? Who is responsible for their security? Who ensures they are not retained by the AI system that generated the app, or by the cloud infrastructure that hosted it?
These questions are not hypothetical. They strike at the heart of an already fragile regulatory landscape. The European Union's General Data Protection Regulation (GDPR), which has resulted in 2,245 fines totalling 5.65 billion euros since enforcement began in 2018, grants individuals the right to erasure, commonly known as the right to be forgotten. Under Article 17, individuals can request that organisations delete their personal data. The technical burden of tracking where personal data has been stored or processed is already significant for traditional software; for ephemeral applications that spin up and dissolve across distributed cloud infrastructure, it becomes an order of magnitude more complex.
The enforcement trajectory is unambiguous. In 2025 alone, European regulators issued fines amounting to 2.3 billion euros, a 38 per cent year-over-year increase. TikTok received a 530 million euro penalty for illegal data transfers to China. Meta paid 479 million euros for consent manipulation. The French data protection authority CNIL levied a 100 million euro fine against Google for making cookie rejection harder than acceptance, establishing a precedent around dark patterns in consent interfaces. The message is clear: regulators are not slowing down. And the EU AI Act, whose most significant compliance deadline falls on 2 August 2026, introduces additional obligations for high-risk AI systems, including requirements around data governance, transparency, human oversight, and record-keeping. Organisations that fail to comply face fines of up to 35 million euros or 7 per cent of global annual turnover.
The collision between ephemeral software and persistent data regulation creates a novel governance challenge. If an AI-generated app processes personal data during its brief existence, the controller (the organisation or individual who deployed the app) remains responsible for ensuring GDPR compliance, including responding to data subject access requests and deletion requests. But if the app itself no longer exists, and its architecture was generated dynamically by an AI model, reconstructing where data flowed, how it was processed, and whether copies were retained becomes extraordinarily difficult. As the European Data Protection Board (EDPB) clarified in its April 2025 report, large language models rarely achieve anonymisation standards, meaning that any data processed through AI-generated applications is likely to retain personal data characteristics that trigger regulatory obligations.
Seventy-one per cent of organisations already cite cross-border data transfer compliance as their top regulatory challenge in 2025. Disposable apps, which may be generated in one jurisdiction, hosted in another, and accessed from a third, threaten to multiply this complexity exponentially.
The Governance Gap
The regulatory challenge extends beyond data protection. Disposable apps raise fundamental questions about software accountability and quality assurance that existing frameworks were never designed to address.
Traditional software development follows established patterns of testing, review, deployment, and maintenance. Code is written by identifiable developers, reviewed by peers, tested against defined criteria, deployed through controlled pipelines, and maintained through versioned updates. When something goes wrong, there is a trail: version numbers, commit histories, deployment logs, and responsible parties. This infrastructure of accountability has been built over decades and is baked into regulatory frameworks, industry standards, and professional practices.
Disposable AI-generated software dissolves this trail. If an AI generates a temporary tool that produces incorrect calculations, gives flawed medical guidance, or mishandles financial data, who bears responsibility? The user who described what they wanted? The AI model that generated the code? The platform that hosted the ephemeral application? The company that trained the model? The cloud provider whose serverless infrastructure executed the code? The liability chain for a piece of software that existed for ninety minutes and was generated by a prompt written in plain English is, to put it mildly, unclear.
Chris Royles, in his 2026 predictions for Cloudera, emphasised that “rigorous governance is required” for disposable apps, noting that “organisations need visibility into the reasoning processes used to create these modules to ensure errors are corrected safely.” His colleague Wim Stoop, Senior Director at Cloudera, predicted the emergence of “specialist AI agents dedicated to data governance” that would “continuously monitor, classify, and secure data wherever it resides, ensuring governance becomes an always-on function embedded into daily operations.” Stoop's vision implies a future where governance itself becomes autonomous and persistent, even as the software it oversees remains temporary and fleeting.
Yet the governance infrastructure for this new paradigm remains largely theoretical. The Stack Overflow 2025 Developer Survey found that developers show the most resistance to using AI for high-responsibility, systemic tasks: 76 per cent have no plans to use AI for deployment and monitoring, and 69 per cent resist using it for project planning. A “reputation for quality” and a “robust and complete API” rank far higher than “AI integration” when developers evaluate new technology. This caution among practitioners stands in tension with the speed at which disposable app generation is advancing. The technology is moving faster than the frameworks designed to govern it.
Trust in an Ephemeral World
The trust dynamics of disposable software are counterintuitive. On one hand, ephemeral apps could be more secure than permanent ones. A tool that exists for two hours presents a far smaller attack surface than one that sits on a device for years, accumulating vulnerabilities through outdated dependencies and unpatched security flaws. If the app is gone, there is nothing to hack. Disposable apps can also be designed with encryption, limited data collection, and proper teardown processes that destroy residual data upon closure.
On the other hand, the Stack Overflow survey reveals a troubling pattern: positive sentiment toward AI tools among developers has declined from over 70 per cent in 2023 and 2024 to just 60 per cent in 2025, even as adoption has increased. The biggest single frustration, cited by 66 per cent of developers, is dealing with “AI solutions that are almost right, but not quite,” which leads to the second biggest frustration: “Debugging AI-generated code is more time-consuming,” cited by 45 per cent. Experienced developers are the most sceptical, with the lowest “highly trust” rate (2.6 per cent) and the highest “highly distrust” rate (20 per cent). When asked about a future with advanced AI, 75 per cent of developers said the primary reason they would still ask a person for help is “when I don't trust AI's answers.”
If the people building these systems do not fully trust them, why should the people using the resulting applications? The question becomes more urgent when disposable apps move beyond internal tools and weekend projects into domains with real consequences: healthcare, finance, legal advice, education. A disposable app that helps a nurse calculate drug dosages, even for a single shift, carries stakes that demand the same rigour as permanent medical software. The ephemerality of the tool does not diminish the permanence of its potential consequences.
AI agents, which represent the next frontier of this trend, are not yet mainstream among developers. The Stack Overflow survey found that 52 per cent of developers either do not use agents or stick to simpler AI tools, and 38 per cent have no plans to adopt them. Among those who do use agents, the productivity benefits are clear: 69 per cent report improved workflow and 70 per cent report reduced time on specific tasks. But only 17 per cent believe agents have improved team collaboration. The picture that emerges is one of individual productivity gains that have not yet translated into systemic trust or organisational confidence.
Rethinking Ownership in a Post-Permanent World
The shift from permanent to ephemeral software does not merely change how we build technology. It changes how we think about ownership, identity, and the digital artefacts that define our lives.
For decades, the software on our devices has served as a form of digital identity. The apps on your phone, the programmes on your computer, the subscriptions you maintain: these are choices that reflect who you are, what you value, and how you organise your life. When software becomes ephemeral, conjured for a task and dissolved afterward, that relationship evaporates. You do not own the tool. You do not even really use the tool in the traditional sense. You describe a need, something appears, it does its job, and it is gone.
This has implications for data portability and interoperability. Current regulatory frameworks, including the GDPR's right to data portability and the EU's Digital Markets Act, assume that users have ongoing relationships with software platforms, relationships that generate data over time and create lock-in effects that regulation seeks to mitigate. Disposable apps short-circuit this model entirely. There is no lock-in because there is no permanence. But there is also no continuity: no history of preferences refined over months, no accumulated data that can be exported to a competitor, no institutional memory embedded in the tool.
The Consent Management Platform market, projected to grow from $802.85 million in 2025 to $3.59 billion by 2033, reflects the complexity of managing user consent in an era of proliferating data touchpoints. Disposable apps threaten to multiply those touchpoints dramatically. Each ephemeral application that processes personal data creates a new consent obligation, a new data processing record, and a new potential liability, all compressed into a timeframe that makes traditional compliance workflows unworkable. The 2026 regulatory landscape demands systematic consent management, including Global Privacy Control signal recognition, one-click reject mechanisms with equal prominence, and granular consent per purpose. Achieving this within a disposable app that may exist for less than an hour requires entirely new approaches to consent architecture.
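Global Privacy Control recognition, one of the requirements just mentioned, is mechanically simple: the GPC proposal signals an opt-out via the `Sec-GPC: 1` request header. The sketch below shows what honouring that signal per purpose might look like inside an ephemeral app; the `ConsentDecision` structure and the purpose names are illustrative assumptions, not a standard schema.

```python
# A hedged sketch of per-purpose consent under a Global Privacy Control signal.
# GPC is expressed as the request header "Sec-GPC: 1"; everything else here
# (record shape, purpose names) is assumed for illustration.
from dataclasses import dataclass, field

@dataclass
class ConsentDecision:
    purposes: dict = field(default_factory=dict)  # purpose -> granted?

def apply_gpc(headers: dict, requested: list) -> ConsentDecision:
    """Honour a GPC signal: refuse sale/sharing purposes automatically."""
    gpc = headers.get("Sec-GPC") == "1"
    decision = ConsentDecision()
    for purpose in requested:
        # Under GPC, sale and sharing purposes must be denied by default;
        # other purposes still require their own granular consent flow.
        decision.purposes[purpose] = not (gpc and purpose in {"sale", "sharing"})
    return decision

d = apply_gpc({"Sec-GPC": "1"}, ["functional", "sale", "sharing"])
print(d.purposes)  # functional allowed; sale and sharing denied
```

Because the decision is computed from the incoming request rather than stored configuration, even an app that exists for minutes can apply it, though recording the decision durably is the harder half of the problem.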
India's Digital Personal Data Protection Act, which entered its enforcement-heavy phase following the release of operational rules in November 2025, and new US state privacy laws taking effect in 2026, including California's Delete Act with its mandatory one-click mechanism for requesting deletion of personal data held by registered data brokers, add further layers of complexity. Three additional US state privacy laws take effect in 2026, joining the growing patchwork of jurisdictional requirements. Organisations deploying disposable apps will need to navigate this maze, much of which assumes precisely the kind of persistent, identifiable software relationships that ephemeral apps are designed to eliminate.
The Class Divide of Ephemeral Computing
There is a risk, largely unexamined, that disposable apps could deepen existing digital inequalities.
The ability to generate software on demand requires access to AI models, cloud infrastructure, and reliable internet connectivity. For knowledge workers at well-resourced organisations, disposable apps promise liberation from SaaS fatigue and IT backlogs. For individuals and communities without reliable connectivity or the digital literacy to articulate their needs to an AI, the shift may simply replace one form of exclusion with another.
Gartner's prediction that by 2026, developers outside of formal IT departments will account for at least 80 per cent of the user base for low-code development tools, up from 60 per cent in 2021, sounds like democratisation. And in many ways it is. Karpathy himself has noted that “regular people benefit a lot more from LLMs compared to professionals” and expressed excitement about seeing “the barrier to app drop to ~zero, where anyone could build and publish an app just as easily as they can make a TikTok.” Rokt's hackathon, where lawyers and marketers built functional software in hours, demonstrates the potential. Jason Wong, a Gartner analyst, has observed that “the high cost of tech talent and a growing hybrid or borderless workforce will contribute to low-code technology adoption,” suggesting that economic pressures are accelerating the shift.
But “anyone” still means anyone with access to the right tools, the right infrastructure, and the right prompts. The global serverless computing market is concentrated overwhelmingly in North America, Europe, and parts of East Asia. The countries where app uninstall rates are highest (Bangladesh at 65.56 per cent, Nepal at 65.27 per cent, Pakistan at 64.58 per cent) are also the countries least likely to benefit from the disposable app revolution, not because their populations lack ingenuity but because the infrastructure and economic conditions to participate fully are not yet in place. OpenAI's GPT models dominate the LLM landscape (82 per cent of developers in the Stack Overflow survey reported using them), and Anthropic's Claude Sonnet models are used more by professional developers (45 per cent) than by those learning to code (30 per cent). Access to the best AI code generation tools remains stratified by both geography and economic circumstance.
Building for Impermanence
What does it mean to design for a world where software is not built to last?
The answer is still forming, but several principles are emerging. First, data must be decoupled from applications more radically than ever before. If the app is temporary, the data layer cannot be. Users will need persistent, portable data stores that any ephemeral application can connect to, process, and disconnect from without taking the data with it. This is architecturally feasible; serverless databases like Amazon DynamoDB, Google Cloud SQL, and Azure Cosmos DB already provide exactly this kind of persistence. But achieving it at scale requires a fundamental shift in how users and organisations think about data stewardship. The stateless nature of serverless functions, which by design do not maintain long-term memory between invocations, makes this decoupling both necessary and technically natural. Solutions including external storage services, event-driven state passing, and managed stateful services are already bridging the gap between ephemeral execution and persistent data needs.
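The decoupling principle above can be made concrete in a few lines: the data layer outlives the app. In this sketch, Python's built-in `sqlite3` stands in for a managed persistent store (DynamoDB, Cloud SQL, Cosmos DB), and `EphemeralApp` is an illustrative name for a generated module, not a real API.

```python
# A minimal sketch of "data outlives the app": the store persists, while each
# app connects, does one job, and is discarded. sqlite3 stands in for a
# managed cloud database; the schema and class names are assumptions.
import sqlite3

def persistent_store(path=":memory:"):
    """The durable data layer, provisioned once and shared across apps."""
    conn = sqlite3.connect(path)
    conn.execute("CREATE TABLE IF NOT EXISTS user_data (k TEXT PRIMARY KEY, v TEXT)")
    return conn

class EphemeralApp:
    """Connects to the store, performs one task, keeps no data of its own."""
    def __init__(self, store):
        self.store = store

    def run(self, key, value):
        self.store.execute(
            "INSERT OR REPLACE INTO user_data VALUES (?, ?)", (key, value))
        self.store.commit()
        # The app object is discarded after this; only the store retains state.

store = persistent_store()
EphemeralApp(store).run("trip-2026-01", "expenses: 412.50")

# A later, separately generated app reads the same durable record:
row = store.execute(
    "SELECT v FROM user_data WHERE k = ?", ("trip-2026-01",)).fetchone()
print(row[0])
```

The design choice is that the schema and the record, not the app, are the unit of continuity: any number of disposable tools can be generated against the same store.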
Second, governance must become embedded rather than applied. Cloudera's prediction of AI governance agents, always-on systems that monitor and classify data regardless of which application is accessing it, points toward a model where compliance does not depend on the longevity of any particular piece of software. As Stoop put it, governance will shift from “something people do to something they oversee,” with humans “shaping the process as it runs” rather than manually enforcing every rule. The EU AI Act's requirement for transparency in AI-generated interactions, which becomes enforceable under Article 50 in August 2026, will accelerate this need. Every AI-generated interaction must be disclosed, synthetic content must be labelled, and deepfakes must be identified.
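One way to picture embedded governance is a wrapper that records every data access an ephemeral function makes, so the audit trail persists even after the function is gone. The sketch below is a hedged illustration of that idea, not Cloudera's or anyone's actual product; the log structure and field names are assumptions, and a real system would write to a durable, append-only store rather than a list.

```python
# A hedged sketch of always-on governance: any ephemeral function that touches
# data leaves a persistent record, regardless of how briefly the function
# exists. AUDIT_LOG stands in for a durable governance store.
import functools
import time

AUDIT_LOG = []  # stand-in for an append-only, durable audit store

def governed(purpose):
    """Decorator: wrap a generated function so every call is recorded."""
    def wrap(fn):
        @functools.wraps(fn)
        def inner(*args, **kwargs):
            AUDIT_LOG.append({
                "function": fn.__name__,
                "purpose": purpose,
                "timestamp": time.time(),
            })
            return fn(*args, **kwargs)
        return inner
    return wrap

@governed(purpose="expense-summary")
def summarise(expenses):
    return sum(expenses)

print(summarise([12.50, 400.00]))
print(len(AUDIT_LOG))  # one record survives the call
```

The point of the pattern is the inversion the article describes: the governance layer is permanent and automatic, while the functions it observes come and go.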
Third, the economics of software will shift from subscriptions to consumption. If apps are generated on demand and discarded after use, the per-seat, per-month licensing model that has dominated SaaS for two decades becomes obsolete. In its place, we might see usage-based pricing for AI-generated software: pay for the compute to generate the app, the time it runs, and the data it processes. Forrester projects that generative AI spending will grow at an average annual rate of 36 per cent through 2030, capturing 55 per cent of the $227 billion AI software market. Much of that spending will likely flow through consumption-based models that align with the ephemeral nature of the software being produced.
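The consumption model described above, pay for generation, runtime, and data processed, can be sketched as a simple metering formula. Every rate below is a made-up assumption for illustration; real serverless and LLM pricing varies by provider and changes frequently.

```python
# An illustrative consumption-pricing sketch for a disposable app. The three
# cost components mirror the model in the text: generation compute, runtime,
# and data processed. All rates are invented for illustration.
def app_cost(gen_tokens, runtime_seconds, mb_processed,
             per_1k_tokens=0.002, per_second=0.00001667, per_mb=0.0001):
    generation = gen_tokens / 1000 * per_1k_tokens   # cost to generate the app
    runtime = runtime_seconds * per_second           # cost while it runs
    data = mb_processed * per_mb                     # cost of data handled
    return round(generation + runtime + data, 6)

# A two-hour disposable app generated from a 50,000-token session:
print(app_cost(gen_tokens=50_000, runtime_seconds=7200, mb_processed=250))
```

Under a model like this there is no per-seat licence to amortise: the bill tracks the app's actual lifespan, which is exactly the alignment the per-month SaaS model cannot offer ephemeral software.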
Fourth, and perhaps most importantly, users will need new mental models for their relationship with technology. The permanent app trained us to think of software as a possession, something we chose, configured, and lived with. The disposable app asks us to think of software as a service in the most literal sense: a fleeting act performed on our behalf, no more permanent than a conversation. Whether that shift feels liberating or destabilising will depend largely on whether the infrastructure of data ownership, governance, and trust catches up with the pace of technical change.
After Permanence
We are not there yet. The 77 per cent of developers who say vibe coding is not part of their professional workflow, the 52 per cent who have not adopted AI agents, and the steadily declining trust in AI tools among experienced practitioners all suggest that the transition will be neither smooth nor complete. Permanent software will not vanish overnight. Mission-critical systems, regulated industries, and applications requiring years of accumulated context will continue to demand traditional development approaches for the foreseeable future.
But the direction of travel is unmistakable. The convergence of AI code generation, serverless infrastructure, and user exhaustion with permanent software is creating conditions for a genuinely new paradigm. Henen Garcia, Chief Architect for Telecommunications at Red Hat, has argued that 2026 marks a “decisive pivot towards agentic AI, autonomous software entities capable of reasoning, planning, and executing complex workflows without constant human intervention.” If those entities can build software as easily as they can execute it, the distinction between the tool and the task it performs begins to dissolve entirely.
Karpathy's vision of a world where “the barrier to app drops to ~zero” is not a prediction about some distant future. It is a description of what is already happening in hackathons, internal tools, and weekend projects around the world. The question is not whether disposable apps will arrive. They are already here. The question is whether our institutions, our regulations, and our own habits of mind can adapt to a world where the software we rely on was born this morning and will be dead by tonight. The answer will determine not just the future of technology, but the future of the data, the decisions, and the human experiences that technology is built to serve.
References and Sources
Karpathy, A. (2025). “2025 LLM Year in Review.” karpathy.bearblog.dev. Available at: https://karpathy.bearblog.dev/year-in-review-2025/
Karpathy, A. (2025). “Vibe coding.” X (formerly Twitter), 2 February 2025. Available at: https://x.com/karpathy/status/1886192184808149383
Karpathy, A. (2025). “Software in the era of AI.” Y Combinator Keynote. Discussed at: https://www.latent.space/p/s3
Karpathy, A. (2025). “Vibe coding MenuGen.” karpathy.bearblog.dev. Available at: https://karpathy.bearblog.dev/vibe-coding-menugen/
Stack Overflow (2025). “2025 Developer Survey.” Available at: https://survey.stackoverflow.co/2025/
Stack Overflow (2025). “Developers remain willing but reluctant to use AI.” stackoverflow.blog, 29 December 2025. Available at: https://stackoverflow.blog/2025/12/29/developers-remain-willing-but-reluctant-to-use-ai-the-2025-developer-survey-results-are-here/
Stack Overflow (2025). “AI Section, 2025 Developer Survey.” Available at: https://survey.stackoverflow.co/2025/ai
Gartner. “Forecast Analysis: Low-Code Development Technologies, Worldwide.” Available at: https://www.gartner.com/en/documents/7146430
Gartner (2024). “75 Percent of Enterprise Software Engineers Will Use AI Code Assistants by 2028.” Press release, 11 April 2024. Available at: https://www.gartner.com/en/newsroom/press-releases/2024-04-11-gartner-says-75-percent-of-enterprise-software-engineers-will-use-ai-code-assistants-by-2028
Kissflow (2026). “Gartner Forecasts Low Code/No Code Platform Market for 2026.” Available at: https://kissflow.com/low-code/gartner-forecasts-on-low-code-development-market/
Royles, C. (2025). Cloudera 2026 Predictions. Reported in IT Brief Asia: https://itbrief.asia/story/cloudera-forecasts-disposable-apps-ai-governance-shift
Royles, C. (2025). Cloudera 2026 Predictions. Reported in Artificial Intelligence News: https://www.artificialintelligence-news.com/news/ai-in-2026-experimental-ai-concludes-autonomous-systems-rise/
Replit (2026). “How Rokt built 135 internal applications in 24 hours.” Customer case study. Available at: https://replit.com/customers/rokt
AppsFlyer (2025). “App uninstall report, 2025 edition.” Available at: https://www.appsflyer.com/resources/reports/app-uninstall-benchmarks-report/
GetStream (2026). “2026 Guide to App Retention: Benchmarks, Stats, and More.” Available at: https://getstream.io/blog/app-retention-guide/
European Commission. “AI Act: Shaping Europe's digital future.” Available at: https://digital-strategy.ec.europa.eu/en/policies/regulatory-framework-ai
GDPR.eu. “Everything you need to know about the Right to be forgotten.” Available at: https://gdpr.eu/right-to-be-forgotten/
GDPR-info.eu. “Art. 17 GDPR, Right to erasure.” Available at: https://gdpr-info.eu/art-17-gdpr/
SecurePrivacy (2026). “EU AI Act 2026 Compliance Guide.” Available at: https://secureprivacy.ai/blog/eu-ai-act-2026-compliance
Orrick (2025). “The EU AI Act: 6 Steps to Take Before 2 August 2026.” Available at: https://www.orrick.com/en/Insights/2025/11/The-EU-AI-Act-6-Steps-to-Take-Before-2-August-2026
SecurePrivacy (2026). “Privacy Laws 2026: Global Updates and Compliance Guide.” Available at: https://secureprivacy.ai/blog/privacy-laws-2026
Forrester (2025). “Spend on Generative AI Will Grow 36% Annually to 2030.” Available at: https://www.forrester.com/blogs/spend-on-generative-ai-will-grow-36-annually-to-2030/
Forrester. “Global AI Software Forecast, 2023 to 2030.” Available at: https://www.forrester.com/report/global-ai-software-forecast-2023-to-2030/RES179806
Grand View Research. Serverless Computing Market Report. Referenced at: https://americanchase.com/future-of-serverless-computing/
Wolters Kluwer (2025). “Privacy in transition: What 2025 taught us and how to prepare for 2026.” Available at: https://www.wolterskluwer.com/en/expert-insights/privacy-in-transition-what-2025-taught-us-and-how-to-prepare-for-2026
CodeConductor (2026). “Disposable AI Apps: AI Is Changing Software Development in 2026.” Available at: https://codeconductor.ai/blog/disposable-apps-ai-changing-software-development/
Artificial Intelligence News (2025). “AI in 2026: Experimental AI concludes as autonomous systems rise.” Available at: https://www.artificialintelligence-news.com/news/ai-in-2026-experimental-ai-concludes-autonomous-systems-rise/

Tim Green, UK-based Systems Theorist & Independent Technology Writer
Tim explores the intersections of artificial intelligence, decentralised cognition, and posthuman ethics. His work, published at smarterarticles.co.uk, challenges dominant narratives of technological progress while proposing interdisciplinary frameworks for collective intelligence and digital stewardship.
His writing has been featured on Ground News and shared by independent researchers across both academic and technological communities.
ORCID: 0009-0002-0156-9795
Email: tim@smarterarticles.co.uk