Sven Treitel

From SEG Wiki
Sven Treitel
Latest company: Amoco
Membership: Honorary Member
BSc university: Massachusetts Institute of Technology
MSc: Geology and Geophysics, Massachusetts Institute of Technology[1]
PhD: Geology and Geophysics, Massachusetts Institute of Technology[2]

Sven Treitel is a German-born Argentine-American geophysicist who, with Enders Robinson, was instrumental in the transition of exploration geophysics from analog to digital recording and data processing.


Early in the technological timetable of their profession, geophysicists laboriously tracked peaks and valleys on photographic paper - data processing vis-à-vis interpretation. Today, such operations minus digital computers would appear self-contradictory.[3]

The turning point occurred in the mid-1960s, when geophysical companies began the gradual conversion from analog to digital systems - becoming, in fact, the first industry to make widespread, successful use of digital technology in its infancy, and now one of the most computer-intensive of all.

MIT's Geophysical Analysis Group (GAG)

It was in academia, however, at the Massachusetts Institute of Technology, that a need for digitization of analog signals first arose in exploring the possible application of statistical methods of time series analysis to seismic data processing. Such research was conducted in the early 1950s by a superlative nucleus of graduate students, the Geophysical Analysis Group. It received funding from several major oil and geophysical service companies who, despite this, eventually came to regard GAG's work as too esoteric for commercial application and thus not worth even the annual $30-odd thousand they collectively contributed.

The GAG members went their separate ways - most to academia, and a few to industry. Among the latter was Sven Treitel, a research consultant for Amoco in Tulsa since 1977.

In perspective, Treitel concedes that it was impossible to foretell the reach of the early GAG research - nor did the members ever try to. Industry's skepticism might have been matched by his own, had he been asked to believe in the '50s that in the '80s his children would use a home computer with many times the word capacity of the then-state-of-the-art MIT Whirlwind vacuum-tube mainframe. And only in perspective can it be said now that GAG (Treitel et al., listed in Geophysics, June 1967, p. 512) laid the foundations of the geophysical "digital revolution".

Early Years

There wasn't, however, a sense of mission (in Treitel's case he admits to a strictly worldly consideration) in joining GAG. "It was 1953. I wish I could say I was steeped in Norbert Wiener's theories and thus drawn to his ideas. In reality, had someone asked me what time series analysis was, I would have smiled sheepishly and changed the subject. Like most students I was poor, and the GAG assistantship was the ticket to complete my graduate studies."

And entering geophysics had been equally undeliberate.

Had digital computers been available when he attended MIT, Treitel would now be a naval engineer: the complex, now totally computerized manipulations required in descriptive geometry were not among his fortes. Flunking this prerequisite for his intended career, he chose to try geophysics, having had an ongoing interest in both mathematics and geology.

But soon the prosaic appeal of the group was superseded by a genuine interest in its scientific filigree. In fact, Treitel wrote his MS thesis on wavelet estimation experiments conducted in GAG. And it was a disappointment when in 1957 their research was brought to a halt because, according to Treitel: "Our sponsors felt that everything we could achieve had been achieved. Far from it. But at least we had created a framework for others to build on."

Treitel would be one of the leading others. But for three years, filter theory - the synthesis of GAG's work - would not be in the forefront of his concerns. Having earned his PhD, in 1958 he ventured into what he regarded as "the real world" of the exploration industry. Furthermore, MIT's $150 a month for a research assistantship was no financial footing for his immediate plans to marry Renata Minerbi, whom he had met in Buenos Aires.

Standard of California in Cuba

Standard of California, which for two consecutive summer jobs had sent him to Cuba, offered him a permanent position there as a gravity and seismic interpreter. He arrived in Havana shortly before Castro's takeover. By late 1959 there was no hope for free enterprise on the island, forcing Standard and other companies to leave. For Treitel it meant transferring to their lab in La Habra, California. The smog and teeming population in the general area motivated him to opt for a change of employers.

Pan American Petroleum

Pan American Petroleum in Tulsa, then a town with a livable quarter-million census and pristine air, seemed a good place to start a job search. A telegram informed the director of geophysical research, Daniel Silverman, of Treitel's immediate availability. As a former chairman of the advisory committee representing the industrial sponsors of GAG, Silverman was well aware of Treitel's qualifications, and responded with an attractive offer. Thus, in 1960 Treitel began his career at Amoco (so renamed in 1970), leading to his current status in the company's highest scientific level.

A propitious research environment soon revived Treitel's interest in the theories advanced at GAG. This resulted in a series of developments of wide application in industry, especially in the data processing area. Of his own role Treitel will speak only generically, usually in terms of previous scientific contributions which made his own possible.

In doing so, it wouldn't be surprising for Treitel to give credit to the Adam of computerdom, i.e., whoever invented the abacus five millennia ago. The test of time, after all, is of the essence if one regards scientific evolution as a small-step process - and Treitel does. Quantum leaps, so often heralded by the technical and lay press, he discredits categorically.

"Progress consists far less of the so-called breakthroughs than of the dead-ends, frustrations, and a great deal of wasted time that precede them. We rarely report this in our writings, and that's all right, as long as we remember that our mistakes outweigh our successes. In other words, let's not become too impressed with ourselves, or our sights will become hopelessly narrow."

Honors and Awards

But Treitel's quiet successes are recognized in a long list of honors. These include in 1969 the SEG Reginald Fessenden Award and the Conrad Schlumberger Award of the European Association of Exploration Geophysicists, and in 1983 SEG Honorary Membership. He is also one of only three SEG members - joining Franklyn Levin, a long-time friend, and Albert Musgrave - to have been both a Distinguished Lecturer (in 1982, with Deconvolution - Use It But Don't Abuse It) and a three-time recipient of the Best Paper in Geophysics Award: in 1964 and 1969 with respective co-authors Enders Robinson and Kenneth Peacock, and in 1995 with Larry Lines and a group at Conoco.

As for Treitel's contributions to the Society, his international outlook in chairing the Standing Committee on Translations since 1979 has had lasting significance. A scientific communicator of renown in English-speaking countries, he is equally at ease lecturing in Germany (where he was born) in that country's tongue, in Hispanic countries (he grew up in Argentina), or in Italy, where his wife's family originated.

This is clearly an advantage in the course of his frequent travels, not to mention an extra social pleasantry at the SEG's Annual Meeting. But far more than his being adroitly multilingual, it is the clarity and often stunning acuteness with which he addresses any subject that is impressive - witness the interview that follows. It may lend a clue as to why he is one of the most effective and best-liked spokesmen for his profession:


Dr. Treitel, the event of digital data processing is often referred to as "revolutionary." Can the nature of the work GAG did in the 1950s be described as such?
I take a dim view of words like revolutionary, pioneering, breakthrough, etc. which pervade the technical literature from the '60s on. Judging by the many breakthroughs reported - daily it seems - it's a wonder we still have so many unresolved scientific problems. But the truth is that we are not that good. Basically, we just build on the foundations laid by other, often unsung, scientists who came before us. So attaching revolutionary to a technical, medical, or whatever scientific advancement is but a way to scream louder - to isolate one effort amidst today's swell of scientific activity.
Science is a long sequence of small steps. On the subject of the evolution of science I would refer you to J. P. Woods's 1960 SEG presidential address, in which among several interesting things he said: "... we catalog and classify, we make rules and systems and codes, and so we invent the immutable laws of science which in due time we revise." You see, the implications of a development can only be evaluated at a distance.
Now, in perspective, the GAG work was very innovative. Only we didn't know it while it was happening - never were we under the impression of being "seminal contributors" (another pet expression of the technical press), or about to break through anything. The MIT faculty wasn't especially impressed with us either. In fact, even the professors responsible for launching the GAG project left us pretty much to our own devices - perhaps more so than hindsight might justify. What we accomplished was due to the fact that all the ingredients needed for a research team to be effective, then or now, were present. We were self-motivated individuals with similar scientific interests. And then, too, we were very lucky to come together in the same place, at the same time.
Is "luck" scientifically permissible?
It is a very important element, and now properly appreciated, particularly in the United States where we live under the Puritan work ethic by which progress is allegedly achieved only by the sweat of one's brow. Indeed, science being a slow and painful process, good working habits help, but I believe much depends on luck.
Which developments - if not breakthroughs - emanated from GAG?
There was, of course, the basic realization that the digital computer could be used to process seismograms. Then there was the method of predictive deconvolution. And another was the so-called minimum-phase or minimum-delay representation of the recorded seismogram, which has led to a whole battery of techniques that others built on to further refine our ability to make information more interpretable.
How much has the evolution of digital filtering technology diluted the original theories from which you departed almost 30 years ago?
The basic theories were drawn from communication theory. The underlying framework is as relevant now as it was then. What has happened is that the framework has just so much to be milked out of it, and we need a more general one. Now there exists a whole family of techniques, all very computer-intensive, with which we operate on the data and directly extract the subsurface configuration.
We're contemplating the possibility of feeding the recorded seismic information into a computer system with minimal human intervention (one can even someday envision robots in lieu of field crews) and have the computer pick those reflections and produce an interpretation - from scratch, that is. In 20 years or so this could become standard. Now, I could be terribly embarrassed by someone unveiling such a system next month, but that gets into artificial intelligence, which is still in its infancy. Nobody yet knows how to program a computer to make a seismic interpretation. The problem today is that we really don't know how the brain of a human interpreter works.
Is this why you, among others, consider geophysics an art?
Of course. How do you program a machine to be creative? Today's computers are nothing but high-speed morons - you must tell them exactly what to do, step by step. We have yet to learn how to design machines that will mimic the intuition, gut feelings, and imagination that are a vital part of geophysical data interpretation.
Speaking of what is now feasible, the state we are beginning to reach is where human interpreters will tell their computer terminals about their hunches, which they will express through a model. In response to it, the computer will churn out seismograms, gravitational maps, or whatever is required to correspond to the interpreter's hunch. This way there is an almost instantaneous, side-by-side comparison between the geophysicist's hunch seismogram and the computer's rendition of the actual recorded seismogram.
The problem is to bring model and observations into agreement. Today, this is done mostly by hand, but we're trying to mechanize the process so that the computer can do most of it automatically. Unfortunately, more than one model will usually fit the observations, so we'll have to develop ways to narrow the number of possible models that do mimic what we in fact observe.
Backtracking for a moment - what prompted you to resume time series analysis as an area of research after joining Pan American in 1960?
My first project in Amoco was the theory of direct currents in electrical prospecting, and I wasn't particularly interested in it, nor did I think it would call for ongoing research. So I suggested to Dan Silverman that the Amoco lab get into time series analysis. Well, considering that the demise of GAG was largely due to industry's lack of interest in technology too new to consider for immediate application (as Silverman was well aware) he wasn't overly enthusiastic. But eventually I got authorization.
For the next two years I did little but review the old GAG work. Working on problems alone, with no one to discuss them with, wasn't easy. Then, in 1962 I had an inspiration - in retrospect perhaps the best of my career - I contacted Enders Robinson, really one of the brightest of that old team of graduate students. At the time he was a professor in Sweden, but that didn't stop us from starting a correspondence that would last until his return to the United States in 1964. I think Enders will agree that was a most fruitful period. (At this point Treitel produces a bulky file - yellowing sheets of variegated stock and trim, of entirely equational contents. Such mathematical exchange, then or now, would virtually defy the most mathematically gifted, and as for the average observer offers but one clear bit of information per letter - the date.)
Our long-distance cooperation was a blessing, because it forced us to write down the most minute details, vs. finding a solution over lunch and then forgetting it, or not taking the pains to put it on paper. These notes were the basis of most of our early papers on geophysical filter theory, some of which were compiled in 1969 in the Robinson-Treitel Reader by Seismograph Service Corporation at the urging of Bob Geyer.
Aside from your shared interest in filter theory, did your common academic background with Dr. Robinson facilitate your work together?
Yes. To this day I can often recognize by talking to a researcher for a few minutes whether he or she attended MIT. All effective learning institutions, in fact, leave a certain imprint on their students, a style in which they tackle and resolve technical problems. This is true of Enders and myself, and may explain the homogeneity of style of over 25 co-authored papers. The lack of disruptive breaks between his passages or chapters and mine are not the product of editing or a conscious effort, but of our similar approach to science itself.
You have published over 40 papers on various areas of geophysics, and in 1980 you and Dr. Robinson also co-authored a book, Geophysical Signals Analysis. Is this prolific output expected from a researcher?
Only people in academia operate under the publish-or-perish principle. In industry we have other concerns, and the initiative to publish is almost entirely ours. Academic and industrial researchers have, however, this in common - they need an audience, like people in the performing arts.
If I did science by myself, locked in a lab with nobody to compare notes with, I'd die on the vine. Publishing is a way to seek approval of your peers. It isn't easy, it's time-consuming, and it's certainly not the most important thing an industrial researcher can do, but it's a great source of satisfaction. Who, for instance, wouldn't like to listen to a paper presented at an SEG meeting and hear himself referred to? (Favorably, that is ...)
Trying to outdo each other, aren't researchers drifting further apart from the user community in purely scientific pursuits?
On the contrary - the gap between the lab and the field is narrowing. In 1960 we had a sort of town-and-gown atmosphere at the Amoco lab, as was true of other research facilities. We were here, and they were there. For their part, those involved in actual exploration were suspicious of research types; their understanding of technology was usually limited to conventional things, mostly hands-on experience. Furthermore, there was little recognition of the need for continuing education.
Now, over the years, just as the line between acquisition and processing is becoming thinner, it is also getting more difficult to determine where research ends and implementation begins.
In fact, attitudes have changed so much that industry currently offers researchers a great working environment, in some respects better than our traditional sanctum - academia. In a university not only do you spend several hours a day preparing classes and conducting them, but you must also devote an inordinate amount of time to writing proposals and going after the grants that make research possible. By contrast, in a corporation I only ask for money once a year and I am paid to do all the research I want. (When my children were younger they used to ask about what I did at the office. They never quite understood when I told them that I mostly sat, smoked my pipe, and thought. I wish I could have given them something more tangible they could have been proud of, like, "I cure patients," or "I drive a fire engine," but what could I do?)
More importantly, however, in industry you are right next door to the application of your theories. And let's face it, if we divorce ourselves from implementation, research becomes baroque - irrelevant to the problems at hand.
Don't companies tend to get impatient with researchers on the payroll if results aren't forthcoming from time to time?
Certainly. But it's interesting to note that the lead time for new technology is on the order of 10 years. For example, in the late '60s I initiated a project to use the digital computer to simulate wave propagation in heterogeneous media by using the method of finite differences. I did the early coding to demonstrate the applicability of the method. Then I hired some people to develop it. Progress was encouraging, but not until the late '70s did the new method prove itself useful in the field of analysis. It benefited Amoco, and others in industry as well, but it took 10 long years.
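The finite-difference simulation of wave propagation that Treitel describes can be sketched in a few lines. The following is a minimal illustrative example, not Amoco's code: a 1D acoustic wavefield marched forward in time with an explicit second-order scheme, where the grid, time step, velocity model, and source are all invented for demonstration.

```python
import numpy as np

# 1D acoustic wave propagation by explicit finite differences.
# All parameters below are hypothetical, chosen only for illustration.
nx, nt = 200, 500          # grid points, time steps
dx, dt = 5.0, 0.001        # spatial step (m), time step (s)
c = np.full(nx, 2000.0)    # velocity model (m/s)
c[nx // 2:] = 3000.0       # heterogeneous medium: faster lower half

# Stability (CFL) condition for the explicit scheme
assert c.max() * dt / dx <= 1.0

u_prev = np.zeros(nx)      # wavefield at t - dt
u_curr = np.zeros(nx)      # wavefield at t
u_curr[20] = 1.0           # impulsive source near one end

for _ in range(nt):
    # discrete Laplacian in the interior (fixed ends act as rigid walls)
    lap = np.zeros(nx)
    lap[1:-1] = u_curr[2:] - 2.0 * u_curr[1:-1] + u_curr[:-2]
    # second-order time update: u(t+dt) = 2u(t) - u(t-dt) + (c dt/dx)^2 lap
    u_next = 2.0 * u_curr - u_prev + (c * dt / dx) ** 2 * lap
    u_prev, u_curr = u_curr, u_next

print(u_curr[:5])          # a snapshot of the propagated wavefield
```

The essential design point, then as now, is the CFL stability constraint: the explicit scheme blows up unless the wave travels less than one grid cell per time step, which is why the check on `c.max() * dt / dx` comes before the time loop.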
Unavoidably, some people (among them the researcher who must stand up for his idea when management gets restless) begin wondering if the effort and the investment are justifiable. There is a lot of soul-searching involved in R&D.
In which area of research does your main interest lie now?
It's a broad category - modeling and inversion - which is really a more sophisticated way to look at deconvolution.
Convolution is a mathematical operation that idealizes what we think happens in the earth when we excite it with some source of energy. (By the way, the convolutional model was the underpinning of GAG's work.)
The earth, however, doesn't read our articles. It isn't aware that it is supposed to be convolving, so it doesn't always obey our model. Therefore, when we state that a seismic recording is a result of convolution, we are only assuming the mechanisms which generate it. It's a very good assumption, but others may work even better.
So, in deconvolution (signal-cleaning would be a better term) we operate on the seismogram to get rid of the noise and sharpen the signal. But looking at the problem in more general terms, we arrive at what is generically called inversion. That is, looking at the seismic data recorded (it can be electrical, gravitational, or magnetic, too), we build a mathematical model for each data set - equations which we think describe the physical properties of the subsurface. It's a much newer field than deconvolution.
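The convolutional model and the "signal-cleaning" role of deconvolution that Treitel describes can be made concrete with a small sketch. This is an illustrative toy, not GAG's or Amoco's method: the wavelet, reflectivity, and filter length are invented, and the inverse (spiking) filter is found by ordinary least squares.

```python
import numpy as np

# Convolutional model: the recorded trace is idealized as the earth's
# reflectivity convolved with a source wavelet. Values are hypothetical.
wavelet = np.array([1.0, -0.5, 0.2])         # a minimum-delay wavelet
reflectivity = np.zeros(50)
reflectivity[[10, 25, 40]] = [1.0, -0.7, 0.5]
trace = np.convolve(reflectivity, wavelet)   # the "seismogram"

# Spiking deconvolution sketch: solve for an inverse filter f such that
# wavelet * f approximates a unit spike, then apply f to the trace.
n = 20                                        # inverse-filter length
W = np.zeros((len(wavelet) + n - 1, n))       # convolution matrix of wavelet
for i in range(n):
    W[i:i + len(wavelet), i] = wavelet
spike = np.zeros(len(wavelet) + n - 1)
spike[0] = 1.0                                # desired output: a spike
f, *_ = np.linalg.lstsq(W, spike, rcond=None)

# Deconvolved trace: should approximately recover the reflectivity spikes
estimate = np.convolve(trace, f)[:len(reflectivity)]
print(np.round(estimate[[10, 25, 40]], 2))
```

Because the wavelet here is minimum-delay, its inverse is stable and a short least-squares filter sharpens the trace back toward the reflectivity spikes; the more general inversion Treitel goes on to describe replaces this fixed filtering model with a physical model of the subsurface fitted to the data.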
Might this be the answer to all processing riddles?
We are unlikely to come up with a mathematical model that describes all the observations.
Perhaps you were hinting at this when you once paraphrased St. Augustine: "The good 'geophysicist' should beware of mathematicians and all those who make empty promises ..." (introduction to "Principles of Digital Multichannel Filtering," Geophysics, May 1970).
Not really. I just have an abhorrence of taking myself too seriously. So I prefer to inject two grains (perhaps a few ounces) of salt in my work. Humor works, too.
Geophysics is not just mathematics. A pure mathematician has a luxury that we in applied science don't have. He can make his own logical constructions and then investigate with great pleasure where these constructs lead him. In geophysics, although we build mathematical models to describe what we observe, mathematics alone won't do.
To be specific - when we process seismograms we use a lot of theory to make computations based on the observations. But then comes the problem of interpreting those observations in terms of the structure below the recording surface. And that's not a nuts-and-bolts approach. You cannot say, for example, based on what you see on the processed seismogram, that there is absolute certainty for an interpretation. The best you can hope to do is get a number of possible hypothetical geologies that fit the data.
But which one actually is ground-truth - that's where intuition, practical experience (not science) come into play. Technology only makes the final decision simpler by narrowing down to a few the infinite range of possibilities. To me - my Augustinian levity aside - that would justify what people like myself do who dabble in theory.

SEG Best Paper in Geophysics Award

References

  1. Treitel, Sven (1955), A wavelet model of seismic noise (Thesis), Massachusetts Institute of Technology, OCLC 31491162.
  2. Treitel, Sven (1958), On the dissipation of seismic energy from source to surface (Thesis), Massachusetts Institute of Technology, OCLC 05717342.
  3. Proubasta, Dolores (1985). "Sven Treitel". The Leading Edge 4 (2): 24–28. doi:10.1190/1.1439127. ISSN 1070-485X.
  4. Lines, Larry; Tan, Henry; Treitel, Sven; Beck, John; Chambers, Richard; Eager, John; Savage, Charles; Queen, John et al. (1995). "Integrated reservoir characterization: Beyond tomography". GEOPHYSICS 60 (2): 354–364. doi:10.1190/1.1443771. ISSN 0016-8033.
  5. Lines, Larry R.; Schultz, Alton K.; Treitel, Sven (1988). "Cooperative inversion of geophysical data". GEOPHYSICS 53 (1): 8–20. doi:10.1190/1.1442403. ISSN 0016-8033.
  6. Peacock, K. L.; Treitel, Sven (1969). "PREDICTIVE DECONVOLUTION: THEORY AND PRACTICE". GEOPHYSICS 34 (2): 155–169. doi:10.1190/1.1440003. ISSN 0016-8033.
  7. Robinson, E. A.; Treitel, S. (1964). "PRINCIPLES OF DIGITAL FILTERING". GEOPHYSICS 29 (3): 395–404. doi:10.1190/1.1439370. ISSN 0016-8033.
