Healthcare is generating a significant amount of information as it transitions to a technology-driven industry. All that material—electronic health records (EHRs), test results, emails, private communications, and research—needs to be stored safely and securely, and many organizations are turning to data centers to deliver.

“Hospitals first started out with a few computers, and then they started centralizing that into a room somewhere in the basement of the hospital,” says James Coe, senior vice president and director of critical facilities at Syska Hennessy Group (New York). “That room started overheating or ran out of space or power, so they had to make the room bigger or dedicated to data.”

Today, healthcare providers are considering a host of options to serve as the next generation of data storage, including new or renovated facilities located on or off campuses, outsourcing to third-party operators, and cloud services. Forty-three percent of architecture, engineering, and construction firms that responded to Healthcare Design’s 2014 Corporate Rankings Survey reported working on data center projects in 2013, up from 37 percent in 2012.

Owners are also focused on compliance with changing regulations, such as the federal meaningful use program (established under the HITECH Act), which incentivizes providers to implement EHR technology to improve patient care while maintaining the privacy and security of that information. Meanwhile, the Affordable Care Act is bringing more patients into the system and further increasing the volume of EHRs and other information. As a result, these projects are expected to remain prevalent for some time.

Making the right choice

Industry professionals say the first step in choosing an appropriate storage option is measuring a facility’s or system’s current and projected critical load, or how much power computers, servers, and networking equipment are drawing. Organizations also need to identify their budget and how quickly they need to deploy a solution.

According to “The Convergence of Healthcare and IT,” a report from DPR Construction, mechanical and electrical systems account for more than two-thirds of a data center’s cost, so space is typically measured in terms of kilowatts (kW) of power available for computing equipment. A large healthcare organization might need a data center with the capacity to handle more than 5,000 kW, while a midsize provider might need between 1,000 and 4,000 kW.
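The sizing logic above can be sketched as simple arithmetic: multiply rack count by per-rack power draw to estimate critical load, then compare against the capacity ranges cited in the report. The per-rack power figure below is an illustrative assumption, not industry data.

```python
# Rough sizing sketch: estimate critical load from rack counts and
# classify it against the capacity ranges cited in the article.
# The 8 kW-per-rack figure is a hypothetical assumption.

def critical_load_kw(racks: int, kw_per_rack: float) -> float:
    """Total IT (critical) load in kilowatts."""
    return racks * kw_per_rack

def size_category(load_kw: float) -> str:
    """Bucket a load using the rough ranges quoted in the article."""
    if load_kw > 5000:
        return "large (>5,000 kW)"
    if load_kw >= 1000:
        return "midsize (roughly 1,000-4,000 kW)"
    return "small (<1,000 kW)"

# Example: 300 racks drawing an assumed 8 kW each -> 2,400 kW.
load = critical_load_kw(300, 8.0)
print(load, size_category(load))
```

This kind of estimate is only a starting point; real sizing also accounts for the projected growth and deployment-speed questions the paragraph above raises.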

In addition to capacity, speed to market varies: certain options, such as cloud storage and colocation within an existing data center, can be brought online quickly, while others, such as wholesale leasing of substantial data storage and processing capacity from a developer/provider or new construction, can take anywhere from a few months to more than a year.

Organizations that want tight control of their data and servers might see owning as the only option. This requires a substantial financial investment for construction as well as ongoing operations and maintenance, including staffing, utility fees, and the need to update or refresh computer and server equipment and applications on average every three to four years.

On the flip side, owning means providers control who can access the building and when maintenance and upgrades are performed. “There are a lot of issues that can come from other people in your building who have access to your equipment,” says Mark Thompson, who oversees the advanced technology market for DPR Construction (Redwood City, Calif.). “It’s not just your computer equipment but the infrastructure. You can be exposed to data center outages when you had no idea someone was in there working on the switchgear.”

Another consideration when owning a data center is whether to locate it on or off campus. On-campus facilities are required to follow the same building codes and regulations as other hospital buildings, but they can utilize existing power, network, and cooling systems. Off-campus options can be built to less-strict building codes and free up real estate on tight campuses for revenue-generating services. However, they still need to be relatively nearby to maintain reliable connectivity with a system’s network.

Many large healthcare organizations choose to operate a network of smaller data centers at each hospital and medical office building (MOB) campus, linked to several larger off-campus data facilities around the country. For smaller, more regional players, however, the price of entry for owning and operating a data center can be prohibitive. Many are opting for a colocation model, which involves leasing space in someone else’s data center, usually by the cabinet (the unit housing the servers), row, pod, or cage (a fenced-off area to secure servers and equipment). With this model, an owner provides its own equipment and pays the operator for processing power, cooling, and physical security. However, as IT needs scale up, colocation can become space constrained and expensive, with rack space leasing costs ranging from $1,100 to $3,600 a month, Thompson says.
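The rack-lease range Thompson cites makes the scaling problem easy to quantify. The sketch below is a back-of-envelope annual cost comparison using those figures; the 40-rack footprint is an illustrative assumption.

```python
# Back-of-envelope colocation cost sketch using the rack-lease range
# quoted in the article ($1,100-$3,600 per rack per month).
# The rack count is a hypothetical example, not a real deployment.

def annual_colo_cost(racks: int, monthly_rate: float) -> float:
    """Annual lease cost for a given rack footprint."""
    return racks * monthly_rate * 12

low = annual_colo_cost(40, 1100)   # 40 racks at the low end of the range
high = annual_colo_cost(40, 3600)  # same footprint at the high end
print(f"${low:,.0f} - ${high:,.0f} per year")
```

At even a modest footprint, the annual spread runs from roughly half a million to well over a million dollars, which is why growing organizations start weighing wholesale leasing or ownership.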

For midsize healthcare systems that need more substantial data and processing capacity (from 1,000 to 4,000 kW), a wholesale data center follows the same model as colocation but on a much larger scale. Wholesale providers typically lease space in 500 kW increments, making it more difficult for smaller healthcare organizations to scale up from colocation or cloud storage, Thompson says.

One of the benefits of leasing from either a colocation or wholesale company is that the data center becomes an operational cost rather than a capital cost: the healthcare organization isn’t spending its own capital or borrowing money to build the facility, but rather making monthly rent payments. “That’s very appealing to hospital systems to not spend capital,” Coe says.

Cloud services are gaining momentum for their on-demand access and ease of deployment. An owner leases everything, from the equipment and servers to the generators, uninterruptible power supply (UPS), and security system, for a premium price. “If they’re going to be able to provide you IT hardware and you can expand and shrink your IT operations on demand, that’s going to become more attractive to users,” Thompson says.

Kevin Farquhar, vice president and principal with HGA Architects and Engineers (Alexandria, Va.), says there are concerns with security protocols for cloud-based data services, so he advises using them only for storing non-sensitive information. He says healthcare groups can put their “public face” information on a managed cloud-based system while maintaining private and individual records on their own networks or private clouds.

Site considerations

Healthcare providers looking to build and operate a data center need to weigh a number of factors, starting with location. These mission critical facilities can be placed almost anywhere, but owners should try to find a site that’s unlikely to be affected by natural disasters, such as hurricanes, earthquakes, or flooding. They should also be sited away from major highways, airports, or rail lines where they could be exposed to chemical spills or accidents. Other top considerations are inexpensive and reliable power, access to water sources (for cooling equipment needs), and redundant cable service with high-speed networking.

Thompson says some states, such as Oregon, North Carolina, and Texas, also offer tax incentives to data center operators. “Typically, the data center is the tip of the iceberg on the cost,” he says. “The big expense is the capital expenses on that IT hardware. If you get sales tax incentives, in some cases that can be up to one-third of your operating costs.”

Looking at design features, one of the biggest considerations is the cooling needs of the equipment, especially as data centers get larger in terms of processing power. In recent years, though, Thompson says there’s been a push to ease back on cooling to save on energy costs. “Up until 2009, all of the hardware manufacturers were saying you had to have 60-degree air to keep your equipment cool,” he says. “We’re pushing that up now into the 80s, and it’s operating fine.”

He’s also seen a shift toward outside air (OA) economization, which uses “free” cooler air from outdoors to meet equipment cooling needs. Locations in the Southwest and Northwest are optimal for these systems because they can rely on the regions’ naturally dry air.

The industry also continues to implement hot-aisle/cold-aisle orientations, which let facilities get more efficiency out of their cooling equipment by segregating hot and cold air so that warm air is routed back to the AC system without mixing with the cold supply. Farquhar says he’s also added chimneys for the supply and return air, which force air through the cabinets more efficiently.

To protect these operations, data centers can be built to different levels of reliability by adding redundant features, such as an extra generator, to keep systems running during an outage. The Uptime Institute, a third-party organization focused on data center performance and efficiency, has established criteria for power, cooling, maintenance, and the capability to withstand a fault across four tiers. Coe says the higher the tier level—and the more built-in redundancy—the higher the construction cost, starting at around $9 million for a Tier 1 facility and upwards of $28 million for a Tier 4.

While healthcare facilities aren’t required to meet a certain tier level, Coe says in his experience many choose Tier 2 or Tier 3. “Tier 2 will provide some redundancy, but will still have single points of failure (SPOF) and will require a shut-down to accomplish certain maintenance activities,” Coe says. “Tier 3 will provide more redundancy and better fault tolerance, but will still have some SPOF. Tier 3 facilities will also have the capability to perform maintenance on any single piece of equipment without a shut-down.”

Data centers must also account for future growth and expanding data needs. Coe says a good rule of thumb in design is to be “modular and scalable,” by using a repeatable building block with a certain number of generators, UPS modules, and chillers and cooling towers. As needs continue to grow, he says, facilities can add more equipment, such as another generator or UPS, until they eventually need to rebuild the whole infrastructure. “You build in stepping stones to get to the ultimate build-out,” he says.
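Coe’s “modular and scalable” rule of thumb amounts to growing capacity in repeatable blocks rather than building everything up front. The sketch below illustrates the idea; the block size and equipment counts are hypothetical, for illustration only.

```python
# Sketch of the "modular and scalable" rule of thumb: capacity is
# added in repeatable building blocks of generators, UPS modules,
# and chillers. All block contents and sizes here are hypothetical.

from dataclasses import dataclass

@dataclass
class Block:
    kw: int            # critical load each block supports
    generators: int
    ups_modules: int
    chillers: int

def blocks_needed(target_kw: int, block: Block) -> int:
    """Number of repeatable blocks required to cover a target load."""
    return -(-target_kw // block.kw)  # ceiling division

# Assumed block: 500 kW served by 1 generator, 2 UPS modules, 1 chiller.
block = Block(kw=500, generators=1, ups_modules=2, chillers=1)
n = blocks_needed(2400, block)
print(n, "blocks ->", n * block.generators, "generators,",
      n * block.ups_modules, "UPS modules")
```

Each added block is one of Coe’s “stepping stones”: equipment arrives only as demand grows, deferring capital cost until the ultimate build-out is reached.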

One size doesn’t fit all

Conversations about planning for a data center—whether as part of a new hospital project or expansion of current infrastructure—should always involve the chief information officer to help identify overall IT goals and special equipment considerations, HGA’s Farquhar says.

Coe says he’s seen many projects that leave space in a building to be programmed for a future data program, while others have to make an existing space work. “Every building’s a compromise,” he says. “If you’re going into an existing building, you just have to find a way to use the space the best you can. If you don’t have a building already preprogrammed, you can come up with the best option for the budget.” Adds DPR’s Thompson: “If you can make the case that you’ve got a lot of growth coming and you’ve got a fast ramp-up rate, it makes sense to build your own.”

The overriding message on data centers: “One size does not fit all,” Farquhar says. “Every group is going to have a different data set and goals for future expansion and implementation of electronic records depending on how large they are, how much they can invest in capital, and how much they feel they need to hold onto versus working with a managed group.”

Anne DiNardo is senior editor of Healthcare Design. She can be reached at adinardo@vendomegrp.com.

For more on data centers, check out “Redundancy: The Key To Keeping The Power On At Your Data Center.”