Chevron Makes Seismic Storage Shift


Chevron Corp. of Houston, Tex., must have one of the largest storage environments around. It has swollen to nearly 1,800TB and is growing at a rate of 2TB a day. The company uses storage directly to drive the bottom line.

“The modern organization marches on storage, like the soldier marches on his stomach,” says Alan Nunns, GM of global strategy at Chevron. “This is especially true in the oil business due to its information intensity.”

He estimates that the massive global IT and storage machine behind Chevron adds 1 cent a gallon to the price of gas. But he believes that current gas prices would be much higher now if not for storage technology advancing the art of exploration and cutting the amount of test drilling required by a factor of five.

“Without the use of 3D seismic data and a massive repository of well-managed data, gas might cost $6 a gallon by now,” says Nunns. “Since we implemented our global storage network, we have gotten much better at finding new reserves of oil.”

Previously, five to ten exploratory holes were sunk for every successful strike; that number is now down to about two. Nunns attributes the improvement mainly to new IT and storage capabilities.

Finding New Reserves

Case in point: Nigeria. A large region of that country was thought to be depleted of oil. Older two-dimensional seismic data revealed no more prospects of oil in the area.

The company switched to 3D seismic, and with the capacity available from modern arrays, began storing vast amounts of 3D seismic data. It was able to add to this data pool by adding archives of old 2D seismic data. The result: modeling became so efficient that technicians began to go over old ground with a fresh view. This led to them discovering one of the biggest new fields found in a decade.

“We thought there were no more prospects in this region, yet we discovered a billion-barrel field,” says Nunns.

In his view, this has been made possible by the level of innovation in storage, coupled with 3D modeling techniques. He explains that the company has already accumulated more than 200TB of raw 3D seismic data. The system then takes a 2TB cube of raw information and processes it down to 100GB during a simulation. These simulations are done rapidly in the quest to find more oil. 500 oil field simulations equate to 50TB — thus storage resources must continually be expanded to keep up with demand.
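The arithmetic behind those figures can be sketched quickly. This is an illustrative back-of-the-envelope calculation built only from the numbers Nunns cites; the 20:1 reduction ratio is derived from those quoted sizes, not a Chevron-published constant.

```python
# Storage math from the figures in the article, all sizes in GB.
RAW_CUBE_GB = 2 * 1000   # a 2TB cube of raw 3D seismic data
PROCESSED_GB = 100       # processed down to 100GB per simulation

def processed_storage_tb(num_simulations: int) -> float:
    """Total processed-output storage, in TB, for a batch of simulations."""
    return num_simulations * PROCESSED_GB / 1000

reduction_ratio = RAW_CUBE_GB / PROCESSED_GB   # 20:1 raw-to-processed
batch_tb = processed_storage_tb(500)           # 500 simulations -> 50TB
print(reduction_ratio, batch_tb)
```

Run against the article's numbers, 500 simulations at 100GB apiece yields the 50TB Nunns mentions, which is why the capacity curve never flattens.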

Information pours in from all corners of the globe in vast quantities. Take a single oil platform. 110,000 documents are required for design alone — about 100GB. Each platform has about 1,000 I/O points that result in a 10GB per day data stream. A refinery has 33,000 I/O points. This adds up to 1TB per day of raw data and about 1TB per year of processed data. And as many as 75,000 simulation models have to be assembled for the refinery processes.

But that’s not all. Nunns trots out some more staggering numbers. Four million commercial transactions per year on ERP, 50,000 employees generate a million e-mail messages a day, and a total of 1.5TB of network traffic each and every day.

“Our corporate storage is growing at a scary 2 TB per day,” says Nunns. “We are doubling our data storage every two years. In 1997 we had only 8 TB and we are well over a PB.”
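The "doubling every two years" claim is easy to project forward. The function below is a simple compound-growth sketch using only the 8TB/1997 starting point Nunns gives; the crossing year it produces illustrates the stated growth rate rather than any date Chevron has confirmed.

```python
# Project capacity under repeated doubling, starting from 8TB in 1997.
def projected_capacity_tb(start_tb: float, start_year: int, year: int,
                          doubling_period_years: float = 2.0) -> float:
    """Capacity projected forward by doubling every doubling_period_years."""
    return start_tb * 2 ** ((year - start_year) / doubling_period_years)

# Seven doublings after 1997: 8TB * 2**7 = 1,024TB, i.e. just past 1PB.
capacity = projected_capacity_tb(8, 1997, 2011)
print(capacity)  # 1024.0
```

Notably, the article's "nearly 1,800TB" figure implies growth even faster than a two-year doubling over that span.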

Of his nearly 1,800TB of data, two thirds is technical and the rest business. Half the data is structured in databases, ERP systems, etc., leaving the other half completely unstructured.

The Value of Data Calculated

Nunns explains the math behind storage retention. The value of data can be calculated, he says, as its business impact minus the cost of access minus the cost of storage and minus its potential liability (this last category can be in terms of having it versus not having it).
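That retention calculus can be written down as a small function. The sketch below is a minimal illustration of the formula as Nunns states it; the dataclass, field names, and sample figures are all assumptions for demonstration, not Chevron's actual model.

```python
from dataclasses import dataclass

@dataclass
class DataAsset:
    """One retained data set, with the four terms Nunns names."""
    business_impact: float  # expected value of having the data
    access_cost: float      # cost to retrieve and use it
    storage_cost: float     # cost to keep it on disk or tape
    liability: float        # exposure from retaining it

    def net_value(self) -> float:
        # value = impact - access cost - storage cost - liability
        return (self.business_impact - self.access_cost
                - self.storage_cost - self.liability)

# Hypothetical example: data whose liability outweighs its upside.
old_records = DataAsset(business_impact=50_000, access_cost=5_000,
                        storage_cost=10_000, liability=40_000)
print(old_records.net_value())  # -5000.0
```

A negative net value is exactly the case Nunns describes next: data that has become a cost center and a candidate for deletion.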

“If you don’t need it and aren’t regulated for it, attorneys are advising companies to get rid of data, as there are hidden liabilities in retaining it,” says Nunns. “Further, the value of information varies over its lifecycle. It gets to a point where the value of the data becomes negative.”

To make the cost picture even more complex at Chevron, the value of some data goes up and down as new methods and technology make it possible to take a fresh look at an old field. He hopes that the evolving ILM techniques will help him maximize the net present value of information over its lifecycle.

“You could delete information that could become valuable again, or store data you will never use again,” says Nunns. “We have to understand these lifecycles better in order to do a more efficient job of managing.”

Article courtesy of Enterprise IT Planet

Drew Robb
Drew Robb is a contributing writer for Datamation, Enterprise Storage Forum, eSecurity Planet, Channel Insider, and eWeek. He has been reporting on all areas of IT for more than 25 years. He has a degree from the University of Strathclyde in the UK and lives in the Tampa Bay area of Florida.
