Access to information inside organisations today is often still too fragmented. Data is generated across a range of source systems – on-premises, in as-a-service platforms and in cloud environments. This creates data and infrastructure silos that hinder the ability to get a clear, unified view of all of an organisation's data assets.
This challenge is giving rise to a new architectural approach known as a visibility-driven architecture. At its core, it is a modern approach to data infrastructure management, designed to break down the barriers between the systems and data architectures that store and use an organisation's data, whether in cloud environments or on on-premises hardware. It establishes a unified control plane that provides end-to-end visibility and management over the infrastructure housing the data. That foundational visibility, in turn, enables a data fabric to deliver a single, unified view of the data itself, making it discoverable, accessible and manageable no matter where it resides.
This unified approach is critical to organisations’ AI ambitions.
AI introduces both advanced capabilities and complexity into modern infrastructure. It produces value by harnessing vast datasets – which puts pressure on organisations to maintain the connection between AI and data. Only when accurate, complete data is available through a unified architecture can an organisation train and use AI models with confidence.
Studies show many organisations are not yet reliably maintaining that connection.
In the banking and financial services industry for example, research shows that data “is only available when and where it is needed a quarter of the time”. The picture isn’t much better across sectors, where one quarter of IT leaders say they need help “making data available” to fulfil their AI use cases.
Two sides to data uplift
Most remediation work in this space tends to focus on improving the data itself: its fidelity, quality, standardisation, lineage, and other governance-related factors. This is a good and important starting point for preparing existing data for ingestion into AI systems.
But data availability – including cleaning and preparing data for ingestion by an AI model – addresses only one piece of the puzzle.
Organisations also need to optimise where data is stored, and how it’s aggregated for ingestion by the AI. This becomes even more important as time goes on – as the AI itself starts producing more data that it learns from; as large language models (LLMs) get more data-hungry; or as organisations adopt multi-agent AI systems, where hundreds or thousands of AI agents or assistants simultaneously draw upon subsets of data to take certain actions in a defined transactional sequence.
For organisations, it becomes a question of storing data in an appropriate location so it can be reliably called upon, and of maintaining visibility over where data is at all times, so that any problems the AI encounters when retrieving it can be detected and remediated before a customer-facing issue arises.
The need to ensure data is available when and where it’s needed is a key reason why the concept of a visibility-driven architecture has taken off and been embraced by leading AI adopters.
Unpacking the elements of a visibility-driven architecture
As with many architectural concepts, visibility-driven architectures are built on a number of key principles. Beyond that, there are several technical options and pathways, and what is selected will depend on what organisations have today and the future state they want to achieve.
At the highest level, a visibility-driven architecture establishes end-to-end, real-time visibility across the entire IT infrastructure that houses data, from cloud to on-premises to edge. This requires more than just a centralised dashboard; instead, visibility has to be incorporated into the core of the data and storage infrastructure.
In practice, this can be enabled with modern solutions such as data fabrics, event-driven technology and observability platforms that break down silos and unify data access and management across diverse environments.
Data fabrics are used to create abstraction layers atop heterogeneous systems, enabling seamless data access without costly or redundant movement. This ensures data can stay in source systems and be tapped only when and where it is needed.
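To make the abstraction-layer idea concrete, here is a minimal sketch in Python: a hypothetical `DataFabric` consults a catalog to read a dataset from whichever backend holds it, so the data stays in its source system. All class, method and dataset names here are illustrative assumptions, not a reference to any particular product.

```python
from abc import ABC, abstractmethod

class StorageBackend(ABC):
    """Uniform interface over heterogeneous storage systems."""
    @abstractmethod
    def read(self, path: str) -> bytes: ...

class OnPremBackend(StorageBackend):
    def read(self, path: str) -> bytes:
        # Read directly from a local or NAS filesystem path.
        with open(path, "rb") as f:
            return f.read()

class CloudObjectBackend(StorageBackend):
    def __init__(self, client):
        self.client = client  # e.g. an S3-compatible client (assumption)
    def read(self, path: str) -> bytes:
        bucket, _, key = path.partition("/")
        return self.client.get_object(bucket, key)

class DataFabric:
    """Routes reads to the system of record; data stays in place."""
    def __init__(self, backends: dict[str, StorageBackend],
                 catalog: dict[str, tuple[str, str]]):
        self.backends = backends  # backend name -> implementation
        self.catalog = catalog    # dataset id -> (backend name, path)

    def read(self, dataset_id: str) -> bytes:
        backend_name, path = self.catalog[dataset_id]
        return self.backends[backend_name].read(path)

# Usage: callers ask for a dataset by id, never by physical location.
# fabric = DataFabric(
#     backends={"onprem": OnPremBackend(), "cloud": CloudObjectBackend(s3)},
#     catalog={"customers-2024": ("onprem", "/mnt/nas/customers.parquet")},
# )
# raw = fabric.read("customers-2024")
```

The key design point is that the catalog, not the consumer, knows where data lives, so datasets can be relocated without rewriting the AI pipelines that depend on them.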
More organisations are also moving to real-time, event-driven approaches to data movement. As AI moves into the agentic era, it increasingly requires data access in real time. Data architecture changes that prioritise real-time data pipelines and event-driven approaches are key to unlocking data and improving its mobility. Real-time pipelines enable continuous data ingestion and analysis, supporting immediate insights and decision-making far beyond traditional batch processing constraints.
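As an illustration of the event-driven pattern, the sketch below uses the open-source kafka-python client to process change events continuously as they arrive, rather than waiting for a batch window. The topic name, broker address and handler body are assumptions for the example.

```python
import json
from kafka import KafkaConsumer  # pip install kafka-python

# Subscribe to a stream of data-change events (topic name is an assumption).
consumer = KafkaConsumer(
    "data-change-events",
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    auto_offset_reset="latest",
)

def handle(event: dict) -> None:
    # Placeholder: propagate the change downstream (feature store,
    # vector index, agent context) as soon as it lands.
    print(f"dataset={event.get('dataset')} op={event.get('op')}")

# Each event is handled the moment it arrives, not hours later
# in a nightly batch.
for message in consumer:
    handle(message.value)
```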
Additionally, observability needs to be baked into all parts of the end-to-end infrastructure supporting AI systems, and joined together using a unified control plane that spans clouds, on-premises systems, and edge devices.
Unified observability is more than monitoring: it transforms IT systems into intelligent, proactive engines that deliver context-rich insights and enable decisive action.
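What "context-rich" means in practice can be shown with a small sketch: instead of a bare device metric, each telemetry event carries enough context – which dataset, which tier, which workload – for a control plane to act on it. The event schema below is an assumption for illustration, not a standard.

```python
import json
import sys
import time
from dataclasses import dataclass, asdict

@dataclass
class TelemetryEvent:
    """A context-rich observability event (illustrative schema)."""
    timestamp: float
    site: str               # cloud region, data centre or edge location
    dataset: str            # logical dataset id, not just a device name
    tier: str               # e.g. "nvme", "object", "archive"
    read_latency_ms: float
    workload: str           # which AI pipeline or agent issued the I/O

def emit(event: TelemetryEvent) -> None:
    # In practice this would ship to a unified control plane;
    # here we simply write structured JSON to stdout.
    json.dump(asdict(event), sys.stdout)
    sys.stdout.write("\n")

emit(TelemetryEvent(time.time(), "sydney-dc1", "customers-2024",
                    "nvme", 1.8, "fraud-model-training"))
```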
With unified observability across end-to-end infrastructure in place, organisations can use that intelligence to optimise the infrastructure supporting their AI models in real time. This may involve using AI-powered techniques such as predictive analytics to forecast storage demand and potential failures before they occur, or taking advantage of automated tiering to dynamically place data on the most appropriate storage media based on access patterns and performance needs.
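As a simplified illustration of both ideas, the sketch below forecasts capacity with a naive linear trend and applies a toy tiering rule based on access recency and frequency. The thresholds and tier names are assumptions; production systems use far richer signals and models.

```python
import time

def forecast_days_until_full(used_gb: list[float], capacity_gb: float) -> float:
    """Naive linear forecast from daily usage samples (toy predictive analytics)."""
    daily_growth = (used_gb[-1] - used_gb[0]) / max(len(used_gb) - 1, 1)
    if daily_growth <= 0:
        return float("inf")  # usage flat or shrinking: no exhaustion forecast
    return (capacity_gb - used_gb[-1]) / daily_growth

def choose_tier(last_access_ts: float, accesses_per_day: float) -> str:
    """Toy tiering rule: hot data on fast media, cold data on cheap media."""
    idle_days = (time.time() - last_access_ts) / 86400
    if accesses_per_day > 100 or idle_days < 1:
        return "nvme"        # performance tier
    if idle_days < 30:
        return "object"      # capacity tier
    return "archive"         # cold tier

# Usage: a week of samples growing ~20 GB/day on a 2 TB pool gives
# roughly 21 days until full; a dataset idle for 5 days lands on "object".
print(forecast_days_until_full([1500, 1520, 1540, 1560, 1580, 1600, 1620], 2048))
print(choose_tier(last_access_ts=time.time() - 5 * 86400, accesses_per_day=2))
```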
Adopting a visibility-driven architecture is the next stage for Australian organisations to advance their AI capabilities. When all of the components and capabilities of a visibility-driven architecture are in place, a resilient, adaptive platform is forged – transforming the AI fortunes of organisations.

Authored by: Matthew Hardman, Chief Technology Officer for APAC, Hitachi Vantara
Hitachi Vantara will be participating in the Gartner IT Symposium/Xpo Gold Coast (8-10 September 2025), exhibiting at Booth #209. AI Specialist Gary Mancuso will present the session, “AI Ambitions v Realities: Your partner in AI,” on Tuesday, 9 September, at 13:45 at Stage 1, Exhibition Hall.