The Now and Later of Large Scale Computing at Chevron


In this video from the 2013 XLDB conference at SLAC, Peter Breunig from Chevron presents: The Now and Later of Large Scale Computing at Chevron.

Finding, developing, and extracting oil and gas from the subsurface is, and has always been, a data-driven exercise. Throughout the continuum from the first seismic experiments in the early part of the 20th century, through the first nuclear, acoustic, and electromagnetic well logs, core analysis, production testing, 4D seismic, and beyond, petro-technical professionals have been dealing with "relatively" large data sets. As hardware and software technologies, sensing technologies, and math-physics have advanced, the ratio of data to compute power has remained relatively constant, because the goal is always increasing the resolution of the subsurface. The talk traces the path taken, where it might go in the future, and how it might connect to opportunities in the XLDB arena.

You can check out more talks from the XLDB conference at YouTube.

Comments

  1. Cool talk! I worked for Chevron Oil Field Research many years ago, running those big ol’ mainframes to study biomarkers in oil reservoirs. Now I do bioinformatics, where biomarkers have quite a different meaning, but it is great to see how far they have come.