Greenwich Hospital, a 174-bed facility in Greenwich, Conn., that is part of the Yale New Haven Health System, has always been on the cutting edge of technology. Back in 1993, when most hospitals were still relying on handwritten documentation, Greenwich went electronic, purchasing bedside terminals. Instead of doctors and nurses writing down patients’ initial admission assessments and then tracking vital signs on cumbersome clipboards, all the information would now be entered online and be accessible from anywhere in the hospital.
Similarly, when the hospital was opening a new facility about five years ago, the technology team decided to install a state-of-the-art storage area network (SAN) to replace its slow, decentralized storage.
“We had separate storage on separate file servers [located throughout the hospital], and every time you added another server, you added more local storage,” explains James Weeks, Greenwich Hospital’s chief information officer. “It was getting out of control. You would have extra storage here that you couldn’t use on another server for another application.”
“We also had different types of media,” adds Abel Jorquera, the hospital’s director of technology. “Every server had its own media. We also had a jukebox. So every time someone needed to access something, we had to find the DVD and then put it in the jukebox. That could take hours. We wanted to keep our data online and accessible.”
The hospital was also looking to install a PACS (picture archiving and communication system) for CT and MRI scans, which consumes a tremendous amount of storage — more than can be kept on a small or single server.
HP Gets the Nod
To solve all these problems, Greenwich opted for a centralized SAN. The vendor it chose, after working with consultants, talking to peers and going on several road trips, was Hewlett-Packard.
Greenwich installed its first storage area network in 2000, then added an Enterprise Virtual Array (EVA) a couple of years ago. Recently, the tech team put in new higher-speed, larger-capacity drives — and they’re still not done.
“Our storage keeps growing,” says Weeks.
Radiology is particularly storage intensive.
“The PACS keeps growing,” says Jorquera. “Our data right now is close to four or five terabytes, and we’ve only been live a few years. And I see it continuing to grow. So the only way to keep up with those demands and keep the storage online, so radiology can have access to all of their images, is we have to continue to grow our storage area network.”
Amazingly, Greenwich’s new SAN takes up only two racks of space in its new (incredibly clean, quiet and temperature-controlled) primary data center and a single rack in the backup area. And there is room for growth.
At the same time that it upgraded its storage capabilities, the hospital invested a lot of time and energy in infrastructure and security upgrades. It has a secondary system located at the other end of the campus, multiple generators, three incoming lines from Northeast Utilities (at least two of them on different routes), and it’s working with Verizon to get a SONET ring. The hospital also put in fiber, a redundant air conditioning system, redundant power, security monitors, water leak monitors, and a state-of-the-art fire suppression system in both its primary and secondary data centers.
To ensure continuous, smooth operation, Greenwich purchased a maintenance agreement that provides the hospital with 24/7 support, four-hour turnaround, and what Weeks and Jorquera term “proactive maintenance.”
HP constantly meets with Greenwich’s tech team and “proactively looks at the system. If it sees any pre-failure warnings, it notifies them and opens up a ticket,” says Weeks. “It’s a major investment, but if you’re going to go down this road, you need to make sure it’s rock solid, because this is almost equivalent to the core of the network. If the storage area network goes down, nobody’s using anything.”
“All of our critical apps are running on our storage area network,” adds Jorquera. “We use that storage for everything: for e-mail, PACS, our medical records, file shares, you name it.”
As for maintaining the storage area network, the IT staff agrees it’s pretty easy.
“The hardest part is getting updates from the vendor,” says Jorquera. “There are a lot of updates that happen on the system. Upgrades are a little painful when you have to upgrade everything before you can actually operate the storage. But for the most part, managing the system is great. If we have any trouble, tickets are opened automatically with HP. They’ll call us and say, ‘We found this issue, are you aware of this, are you looking into it?'”
Jorquera and his staff also regularly monitor the system, so if anything goes wrong, they’re on it in a flash.
As for doctors and nurses, the system is totally invisible. They just learn the application to access the information and can then get the data they need from any terminal, usually in just a few seconds. Even when the IT team recently moved everything to the new data center, the data kept flowing, with only an occasional slowdown.
Helping with HIPAA
For many hospitals, the decision to upgrade their storage capacity is tied to HIPAA, the Health Insurance Portability and Accountability Act, enacted in 1996 mainly to protect patient privacy in the electronic age. However, that was not the case at Greenwich.
“My thought on HIPAA is ‘shame on us’ that there had to be a law enacted to tell us to do that,” says Weeks. “Everybody should have been doing all of that anyway. It’s just good common sense to running a business. They just had to mandate it for those folks that were saying, ‘We’re not spending any money on redundancy or backup systems or alternate paths unless we’re forced to, because we don’t have the money.’ … But to us, the privacy and security issues that HIPAA was asking were things that were common sense that healthcare should be doing anyway. What we did was for business continuity and just made good sense. So when we were audited for how we were complying with HIPAA, we came out with great marks.”
The Bottom Line
So how much will upgrading your storage capacity and infrastructure cost? At least $1 million, say Weeks and Jorquera. And that could just be for starters. It depends on the hospital’s existing infrastructure and storage capacity and needs.
“There’s so much involved,” explains Weeks. “The network. The fiber. What kind of cabling they have in the walls. What kind of data center. Do they have air conditioning? It could be a million — but hopefully not ten million, because that means they’ve really let their infrastructure go. But we’ve seen sites where they still have Category 3 cable in the walls. They have no fiber. They maybe have only one generator, not multiple generators.”
“You need to get your infrastructure in place first. Upgrade that,” says Jorquera. “Then take a look at your storage needs. A million dollars is probably not enough, but it will get you started.”
“I say all the time, you can invest all the capital you want,” says Weeks. “You can buy the latest and greatest. But if you don’t have the folks who’ve been there and done that and know it, you’re going to fail.”
Weeks advises hospitals looking to upgrade their storage systems to be very careful when picking a vendor — look closely at both upfront and recurring costs.
“You need to investigate each vendor, and also understand what gets covered under those service agreements,” he advises. “What is the customer’s responsibility? What is the vendor’s responsibility? What is the turnaround time? Is it 24 hours? Does it cover everything? They really need to get into that contract and see, because once you put this in, it’s your life. All your systems are pointing to that. If it fails, you don’t lose one system, you lose multiple systems. It has a lot of redundancy. It has a lot of capabilities built in. But you need to make sure you know what you’re buying.”
Looking to the Future
Weeks advises storage users to look closely at vendor roadmaps when making a decision.
“Get a roadmap and benchmark it with others,” says Weeks. “Make sure that what you’re buying isn’t going to be [outmoded in a year]. This is a major investment. Get out there and talk to your peers. Even talk to people outside of healthcare. A lot of time folks cubbyhole themselves and only talk to other healthcare institutions, but you can learn a lot from financial institutions and manufacturers too. Talk to people who have been there and done that. Go to trade shows. Make sure you’re up on the latest technology, so when you make your decision you can feel confident that you made the right decision and you’re not second-guessing yourself later on.”
“I think we were fortunate to know where we wanted to be and that our partners at HP understood what we wanted to do, and they have shown us the road to get there,” says Jorquera. “Today we are looking at what’s coming up next, so we have already asked them for an appointment to talk about lifecycle. What are we going to do? This is here today, but I’m worried about three or four years from now. What are we going to do then? That’s the next step for us. We’re always looking ahead. We want to make sure that whatever we have stored here can withstand the lifecycle of this product or the next product or whatever we’re going to do.”