When Kichler Lighting gutted its data center several years ago, company decision makers recognized that powerful storage infrastructure should be an integral component of the redesign. Today, the two-tiered storage architecture not only gives Kichler the flexibility, efficiency, capacity and speed it needs, but also provides the headroom to plan for tomorrow. Of course, how it got to that point is the interesting part of the story.
The Price of Automation
In business since 1938, the decorative lighting company had its sights set on the future when it decided to implement PeopleSoft's ERP application, software designed to automate every functional area of the business. However, in 2001, when IT went live with the application, it was clear that there were performance problems, and big ones at that.
“On a good day, a single line-item order entry would take 15-20 seconds,” said Michael S. Sink, director of the network and operations infrastructure at Kichler. With expectations of seeing 2-3 second transmission times for inbound and outbound traffic, disappointment ran high.
Not only was the system extremely slow for line-item order entry, but employees couldn’t get orders into the system on the shipping docks or print bills of lading. “We were quickly prompted to look at our infrastructure, which was clearly underpowered for an ERP environment,” said Sink.
Dogged by the poor performance of the PeopleSoft application, Cleveland-based Kichler Lighting was clear on one thing as it looked ahead to a technology refresh. “We needed to reduce data transmission times, get high system utilization, and be able to flatten the growth of online transactional data while reducing the cost of managing that data,” said Sink.
With that goal in mind, and at the urging of a new CIO, the company put its plan into action. The IT department replaced the corporate network backbone as well as the database and application servers, and redesigned the storage architecture. Sink said Kichler's overall batch-processing times improved roughly sevenfold. "Transaction times dropped from 20-25 seconds to 1-2 seconds, and sometimes the system is so fast that it's impossible to measure," he said.
IT decision makers knew from the get-go that they wanted a storage area network (SAN). "We wanted to be able to add data and hosts quickly," Sink said, noting that it was clear that more system capacity would be needed over time. "Our goal was to put as much data as possible on the SAN."
The Makeover Continues …
In 2002, the company spent six months engineering a new IT environment. The following year, the equipment arrived.
Cisco Catalyst 3508 distribution switches were replaced with redundant Cisco Catalyst 4006 backbone switches. Maxed-out SunFire 4500 database and application servers were upgraded to two SunFire 4800s. (One of the 4800s is used as a product development and testing database engine and for cluster failover.) "We created a parallel testing environment for true performance load testing, something we hadn't done previously," Sink said.
For its storage environment, Kichler ditched its outdated EMC Symmetrix box, with 500 GB of data and SCSI I/O, for a 2 Gbps SAN built around a Clariion CX600.
As a former EMC customer, the company looked at the vendor's newer offerings, Sink said. The IT department also evaluated storage equipment from Hitachi and IBM. In fact, Kichler was on the verge of purchasing equipment from Hitachi when EMC released the 600 series.
“The 600 series price/performance ratio turned out to be a better fit for us,” Sink said. Although the Hitachi equipment would have provided about 20 percent more headroom, the EMC solution was half the cost, he said. “The bottom line was that we didn’t need the extra throughput that the Hitachi product offered,” he said.
Kichler initially configured the Clariion CX600 with 4.5 TB for a combined production, test and development environment.
The company soon added 2 TB of ATA disk to take advantage of information lifecycle management principles. “We put the data we used less frequently on less expensive disks,” Sink said. More specifically, Kichler has an archival database for transactional data older than 13 months, which sits on lower-performance, lower-cost disk.
“However, it’s all managed through the same frame as the fibre channel,” Sink explained.
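The age-based tiering Sink describes can be sketched as a simple placement rule. The following is a hypothetical illustration, not Kichler's actual tooling; only the 13-month cutoff and the two disk tiers come from the article, and the tier names are invented for the example:

```python
from datetime import datetime, timedelta

# ~13 months, per the article's archival cutoff (approximated in days)
ARCHIVE_AGE_DAYS = 13 * 30

def choose_tier(last_activity: datetime, now: datetime) -> str:
    """Return the storage tier for a transactional record.

    Records older than roughly 13 months land on the low-cost
    ATA archive tier; everything newer stays on the primary
    Fibre Channel tier. Tier names here are illustrative only.
    """
    age = now - last_activity
    if age > timedelta(days=ARCHIVE_AGE_DAYS):
        return "ata_archive"
    return "fc_primary"
```

For example, an order last touched two years ago would be placed on `ata_archive`, while last month's orders stay on `fc_primary`; in practice this kind of policy runs as a periodic batch job that moves rows into the archival database.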
Today, Kichler has about 30 servers, from Sun Microsystems, Dell and IBM. Seventy percent of the servers are SAN-attached via fiber cables. For its mission-critical ERP application, the company runs dual fiber connections per server with load balancing.
The bottom line: Kichler realized a 12-month ROI on its data center upgrade.
Much More to Come
Since the initial installation of its storage equipment, Kichler has expanded the disk on the SAN from 4.5 TB to 8.5 TB. The company has since put Microsoft Exchange on the SAN, as well as its file and print server storage. More recently, the company added digital asset management to the SAN, allocating one terabyte of storage for it. The company is moving its CAD drawings, catalogs and marketing materials onto the new system.
And, there’s still room to spare. With that in mind, Kichler has plans to add a document management system as part of a 2006 initiative. “Today we have a lot of paper. The document management system will allow us to scan and store orders and customer accounting information. We see this as critical for disaster recovery,” Sink said.
In the near future, the IT department will wrap up a capacity-planning project as it prepares for a PeopleSoft ERP upgrade. According to Sink, the Clariion CX600 has 4 GB of cache per storage processor. Depending on the outcome of the planning project, Kichler may move to the CX700, though he believes the SAN still has I/O headroom.
According to Sink, EMC’s Navisphere software has taken the guesswork out of managing the storage infrastructure as the company adds applications, making it easy to provision disks or know how much space is available. “Managing the SAN is extremely intuitive,” he said.
Sink reports that the company got the benefits it was looking for from its upgraded data center plus added room to grow. “We continue to process business data a whole lot faster and more efficiently,” he said.