Sydney Local Health District cuts ‘lightyears’ off reporting


Sydney Local Health District has cut its clinical reporting time down from eight hours to a matter of seconds after building a clinical data warehouse on Snowflake.



Dr Thomas Hambly, Sydney Local Health District

The clinical data warehouse allows Sydney clinicians to easily extract datasets of patients’ medical history, including pathology and diagnoses, and run reports within 22 seconds.

“Using our original source system, it would have taken an estimated eight hours to run a report from the last three months,” explained Dr Thomas Hambly, SLHD medical informatics registrar at Snowflake World Tour in Sydney.

“I say an estimated eight hours because after 30 minutes it times out [and] we’re not actually able to pull the data out.”

The district first built a data lake platform in Snowflake on AWS to streamline and order millions of patients’ medical records stored in a 30-year-old system.

Prior to building the Snowflake data lake, NSW Health’s data sat in Cerner Millennium across 6500 data tables divided into 240 data models.

“All this data is sort of a mishmash of different types of information in a single table. Can you imagine the data structures they set up back in the 1990s [that] in some cases haven’t really aged so well?” said Hambly.

Once the data was migrated to Snowflake’s data lake, the informatics team looked at how they could improve clinicians’ use of analytics through a data warehouse containing clinical data.

SLHD was already using “extensive data warehouses” on the business side, but less so on the clinical.

“This was something we wanted to do to enable us to do more analytics in the clinical space,” Hambly said.

The informatics team started first using the clinical data warehouse in a “small” project focused on encouraging clinicians to use electronic pathology results instead of printing them out.

“So as part of this [project], we wanted to build reporting,” said Hambly. “But we wanted to test out whether… these new tools would improve our ability to deliver in this way.”

The team ran two different reports from the Snowflake data lake: the first using SLHD’s raw data and the second with data that had been filtered through the warehouse.

“It took three minutes running in Snowflake, and that was just running it off the source data, without any of the warehouse built in,” Hambly said.

“When we put this data warehouse layer in, we got it down to 22 seconds — so lightyears quicker than our original system.”
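The speed-up Hambly describes is typical of adding a warehouse layer: instead of scanning and joining raw source tables every time a report runs, the data is transformed and aggregated once up front, so each report reads a small, purpose-built table. A minimal Python sketch of the idea, using invented table and column names (the article does not describe SLHD’s actual schema):

```python
from collections import defaultdict

# Raw "source system" rows, one per pathology result (names are illustrative).
raw_results = [
    {"patient_id": 1, "department": "ED", "viewed_online": True},
    {"patient_id": 2, "department": "ED", "viewed_online": False},
    {"patient_id": 3, "department": "ICU", "viewed_online": True},
]

def report_from_raw(rows, department):
    """Report straight off the raw data: scan every row on every run."""
    viewed = total = 0
    for row in rows:
        if row["department"] == department:
            total += 1
            viewed += row["viewed_online"]
    return viewed, total

def build_warehouse(rows):
    """Warehouse layer: aggregate once, ahead of reporting time."""
    agg = defaultdict(lambda: [0, 0])  # department -> [viewed, total]
    for row in rows:
        entry = agg[row["department"]]
        entry[0] += row["viewed_online"]
        entry[1] += 1
    return dict(agg)

def report_from_warehouse(warehouse, department):
    """Each report is now a single lookup instead of a full scan."""
    viewed, total = warehouse[department]
    return viewed, total

warehouse = build_warehouse(raw_results)
print(report_from_raw(raw_results, "ED"))      # (1, 2)
print(report_from_warehouse(warehouse, "ED"))  # (1, 2) -- same answer, one lookup
```

The two report functions return identical answers; the difference is that the warehouse version does the expensive work once, which is the same trade that took SLHD’s report from three minutes on raw data to 22 seconds.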

Using the data lake, SLHD staff are now able to “pump” the data reports straight into PowerBI. Previously, they would have received a CSV file that they could “analyse themselves” after downloading it and opening it in Excel.

“[With PowerBI] you can see visually how different departments are doing. You can put in graphs and put in heat maps,” said Hambly. “It’s far from the best dashboard that we could build. It’s clearly much better than a CSV file.”
