Opinion

Resilient and interconnected infrastructure

Mike Hoy, chief technology officer, Pulsant, examines the obstacles that organisations are likely to face in the successful implementation of AI projects.

The next 12 to 18 months will see AI proofs of concept evolve into groundbreaking technologies. This progress will be fuelled by the ability to access and utilise a vast reservoir of private data, estimated to be nine times larger than the data available on the public internet. Overcoming the challenges of accessing this data will be vital to realising AI's true potential.

Indispensable data

Fast, reliable access to data is the cornerstone of successful AI. Without seamless access to data in a usable format, the very foundation of AI development and deployment collapses.

The reality is that organisational data is fragmented across multiple platforms and locations, transcending the boundaries of prominent ecosystems like AWS and Microsoft. AI applications require a robust and reliable network to ensure consistent latency, performance and real-time data exchange. Connectivity, therefore, becomes the linchpin for unlocking the value of these disparate data sources.

The criticality of connectivity is often overlooked by boards, who mistakenly assume it just works. This oversight can have catastrophic consequences for AI initiatives. Even the most advanced AI applications, equipped with immense computational power, can be crippled by a mere 10-millisecond delay in data retrieval. In 2025, deploying AI without a robust connectivity strategy is not merely a misstep; it’s a strategic failure with severe repercussions.
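To make that figure concrete, the sketch below (a hypothetical illustration, not drawn from the article) shows how a fixed per-request retrieval delay compounds when a single inference makes many data calls; the call counts and the 10-millisecond figure are assumptions chosen purely for illustration.

```python
# Illustrative back-of-envelope sketch: how a fixed per-request data-retrieval
# delay compounds across an AI pipeline. The call counts and the 10 ms figure
# are assumptions for illustration only.

def added_latency_ms(requests_per_inference: int, delay_ms: float = 10.0) -> float:
    """Extra latency added to one inference when every data retrieval
    incurs an additional network delay of `delay_ms` milliseconds."""
    return requests_per_inference * delay_ms

# A retrieval-augmented or feature-store-backed model might make many data
# calls per inference; the figures below are hypothetical.
for calls in (1, 10, 100):
    print(f"{calls:>3} data calls -> +{added_latency_ms(calls):.0f} ms per inference")

# Example output:
#   1 data calls -> +10 ms per inference
#  10 data calls -> +100 ms per inference
# 100 data calls -> +1000 ms per inference
```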

Cloud controversy

The connectivity challenge underscores the critical need for a new wave of cloud models designed specifically to support the demands of AI. This has reignited a broader debate about the future of cloud computing.

AI models are fundamentally different from traditional software applications. Early cloud infrastructure was ill-equipped to handle the immense scale and complexity of AI, with its billions of parameters and the constant flow of real-time data streams. This necessitates a paradigm shift in cloud design and supporting infrastructure to fully unleash the potential of AI.

While security, connectivity, and resilience – enabled by geographically distributed networks – remain fundamental, the escalating cost of operating in public clouds is forcing organisations to reassess their reliance on providers like AWS and Microsoft. The surge in workload repatriation to private clouds underscores the critical need for standardised data migration processes to ensure a smooth and efficient transition.

Standardisation

The challenge of cloud migration for AI mirrors the complexities of switching bank accounts. Just as banking regulations have streamlined this process, legislative guidance on cloud migration could be a game-changer for organisations. By establishing standardised data movement practices, organisations can more easily adopt hybrid cloud models that are perfectly suited to their AI requirements and broader business objectives.

In the face of increasingly distributed AI workloads, a standardised approach is crucial. It will not only accelerate AI adoption and foster best practices but also solidify the position of AI leaders as the market matures.

Boosting collaboration

AI’s growing demands on infrastructure necessitate increased awareness within the tech industry regarding the interplay of connectivity, cloud models, and the broader ecosystem. Successful AI implementation in the real world requires strong collaboration between organisations, suppliers, and partners.

In this new era of AI, connectivity and cloud considerations are no longer secondary concerns – they are fundamental to success. By prioritising these factors in planning and execution, businesses can effectively navigate the complexities of 2025 and beyond. 

This opinion piece was included in our February 2025 print issue.
