VSProwess

Contents
File Commands
  Job manager
  Purge processing
  Save as route template
  Formation tops
  Welltrak
  Logs
Edit a route
  Edit
  Parameters
Execute a route
  Execute mode
Display results
  Display selection
  Display view mode
  Zoom mode
  Spectral analysis
  Hodogram display
  GroupView
  VibQC
  Profile view mode
  Wellview mode
  Print options
  Options (frequency domain)
  Filter
  Inversion display
  Log display
  Polygon mode
  Display label
Composite Displays
  Composite settings
  Uses of composite displays
Operators by class
  Overview
  Import/Export operators
  Acquisition repair operators
  Arithmetic operators
  Domain transformations
  Time domain operators
  Frequency domain operators
  Wave-number domain
  Manipulation operators
  Convenience operators
  Imaging operators
  Multiple component operators
  Miscellaneous operators
Operators
  ACQinput
  Add
  AGC
  AGIPoutput
  Append
  AutoPick
  Bin
  BinImport
  ClientShift
  Collect
  Conjugate
  CopyREF
  Correlate
  CSVimport
  CVLcalibrate
  CVLshift
  DBupdate
  DCsubtract
  Deconvolve
  Deglitch
  Depth2Time
  Designature
  Deskew
  Divide
  ELFoutput
  Enhance
  Equalise
  EventDetect
  EventLocate
  Fbandpass
  Fbandstop
  Fbutter
  Fcollapse
  Fexpand
  FKtoFX
  FKfilter
  Flow-pass
  Frontblank
  FXresample
  FXtoFK
  FXtoTX
  Graft
  HRotate
  Integrate
  Invert
  Kbandpass
  Kbandstop
  LASimport
  Magnitude
  Mark
  Mc170input
  Migrate
  MIRFinput
  Multilock
  Multiply
  NMO
  NotchFilter
  Orientate
  Overlay
  Phaserot
  PickAmplitude
  Polarize
  Profile
  ProfileIn
  ProfileX
  Prune
  Qestimation
  RayForm
  RayProx
  RayRetrace
  RayTrace
  RearBlank
  Remark
  Resample
  Rotate
  Scale
  SEG2input
  SEGYexport
  SEGYinput
  SEGYlist
  SEGYoutput
  Select
  SensorScale
  Sort
  Stack
  StackAll
  Subtract
  Synthetic
  TcorrSmooth
  Tfilter
Appendices
  SEGY header usage by SEGYoutput
  Twig Descriptors
  Dataset database identifiers
  Angles in VSProwess
  Loci file
Introduction
VSProwess
VSProwess is seismic processing software designed for the special requirements
of borehole seismic data acquisition, especially Vertical Seismic Profile (VSP)
and related techniques. VSProwess is often used at the rig site to generate quality
control displays and preliminary processed results during or shortly after the data
acquisition operation.
Pre-processor
VSProwess may also be used as a pre-processor to organise and prepare data for
export to other seismic processing systems for more advanced processing and
perhaps integration with surface seismic data.
Education
Ease of use makes VSProwess an excellent tool for training and general signal
processing education.
Research
Flexibility and seamless extension with user-written processing algorithms also
make VSProwess an excellent tool for research and special studies.
Help
The on-line help system provides all the information required to run the
VSProwess software. It is assumed that the user already has a good
understanding of the fundamentals of signal processing in general and VSP
processing in particular.
Browsing
The help system is hyper-linked to get the user to the required information as
directly as possible; however, each chapter of the help document may be read like
a book by using the browse buttons on the help toolbar.
Worksheet
The essence of any signal processing system is the means by which the user is
able to construct, modify and execute a route (a processing sequence) from a
library of operators (the available processing operations). VSProwess provides a
virtual worksheet in which the user is able to construct his/her route.
With intuitive point and click actions, the user can insert any number of
operators into a worksheet, link them together in a logically consistent manner
and control their execution. When a new worksheet is created, the user may elect
to insert a standard or user-created route template, which may then be edited if
required.
Allow editing
Very often, especially in field operations, the user does not wish to become
involved in the possibly complex business of creating or editing a route, but
instead wishes only to invoke a pre-prepared standard route and possibly change
just a few parameters. The “Allow editing” check box on the Job manager dialog
defaults to un-checked each time a job is selected, which prevents access to the
Edit mode. If the user wishes to edit the route, he/she must remember to check
this box. See also parameter locking.
If the “Allow editing” check box was checked, then editing mode is entered by
selecting the Edit mode from the main menu. This displays the edit dialog bar
from which the various editing operations are controlled.
Nodes
Each operator has a defined number of inputs, from zero to three. For example,
an Add operator needs to know which two input datasets are to be added
together. The operator box represents inputs and outputs as physical “nodes”.
Input nodes are marked at the top of the box. If an operator has two input nodes
they are annotated as “a” and “b”. The position of the output node is not marked,
but it is always the middle of the base of the operator box.
Import/export
The special “input operators” do not have an input node; instead, the input is
specified as the name of an input file entered as one of the operator’s parameters.
For a route to be executable, it must start at one of these input operators, which
are used to import external datasets into the VSProwess system. Similarly, there are "output operators" which have no output node, but instead export a dataset to a
named file, usually as SEGY.
Links
Links connecting operators in the worksheet represent logical links governing
the sequence of execution and the dataflow between those operators. Upon
execution, the links connected to an operator’s input nodes determine from
where the operator must obtain its input data. Each input node may have only
one link connection, which must be back to the output of another operator.
Operators have a maximum of just one output node, but this may be linked to
any number of input nodes.
Super-imposed links
Links are maintained if an operator box is moved around the worksheet, but it is
best to exercise a little care with the layout of the route to avoid links being
drawn over other links or boxes. Super-imposed links do not alter the processing
flow in any way, but they can make it difficult for the user to interpret the route.
About operators
An operator performs some processing operation upon an input dataset,
transforming it into an output dataset. A graphical representation of an operator
may be inserted into the worksheet as an “operator box”. From the Edit mode
dialog bar simply double-click the required operator from the scrolling list and
an operator box appears at the top right of the worksheet view.
Snap grid
The operator box may be freely moved around the worksheet except that boxes
are constrained to a snap grid. Two boxes may not occupy the same grid
position.
Instance
Each operator box inserted into the worksheet is actually a reference to a
particular instance of that operator (processing operation) complete with a
unique set of operator parameters (processing options).
Remark
A special operator, Remark, performs no processing but merely serves as a container for comments about the structure of the route or suggested parameters. Liberal use of the Remark operator is recommended.
Operator parameters
Many operators support various processing “parameters”. When a new instance
of an operator is inserted into the worksheet, it inherits the default parameters for
that operator. To modify the parameters first select Parameters mode from the
main menu, click on an operator box and its parameters should appear in a dialog
bar down the right-hand side of the screen.
Locked parameters
The parameters for a particular operator may form a structural part of a
processing route and it may not be helpful to have them easily modified. An
additional “Lock” checkbox is provided on the parameters dialog bar, which is
only visible if the “Allow editing” mode is enabled. If an operator’s parameters
are locked, and “Allow editing” is not enabled, then access is refused to that
operator’s parameters. To assist the user, a small green rectangle is displayed
next to those operators that do have accessible parameters.
Title
To clarify the processing route a title string may be assigned to any operator box.
This is normally reserved for key points in the route, typically those for which
results are likely to be displayed and printed. The title string also serves as a title
for printed results.
Execution of operators
From the Execute dialog bar the user selects the segment, or segments, of the
route to be processed, by simply clicking on the start and end boxes. Press the
Start button to commence processing. Progress of the processing operations is reported via a status display, and processing may be aborted at any time.
Colours
The status of the processing is indicated by the colour of the text within the
operator boxes. Black means not processed (or processing is no longer valid),
red means selected for processing, blue means that an operator has completed.
Order of execution
An operator cannot execute until all of the operators upon which it depends have
themselves completed execution, except in the special case of an input operator,
which imports a data file from outside of the VSProwess system. If an
independent parallel sequence is encountered then the first operator to be
processed is the one with the lowest opid number.
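As an informal illustration only (the names and data structures below are assumptions, not part of VSProwess), the ordering rule can be thought of as a dependency-ordered walk that breaks ties by choosing the lowest opid:

    # Python sketch: choose the next operator whose inputs have all completed,
    # preferring the lowest opid when several are ready.
    def next_to_run(pending, completed, inputs_of):
        ready = [opid for opid in pending
                 if all(dep in completed for dep in inputs_of[opid])]
        return min(ready) if ready else None

    # Hypothetical route: opid 1 feeds two parallel branches (2 and 3) that merge at 4.
    inputs_of = {1: [], 2: [1], 3: [1], 4: [2, 3]}
    pending, completed, order = {1, 2, 3, 4}, set(), []
    while pending:
        opid = next_to_run(pending, completed, inputs_of)
        order.append(opid)
        pending.remove(opid)
        completed.add(opid)
    print(order)   # [1, 2, 3, 4] - the parallel branch with the lower opid runs first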
Processing chain
The VSProwess system insists that there must be an unbroken chain of processed
operators leading to any valid dataset. If, for example, the processing parameters
of an intermediate operator are changed, then all dependent processing is marked
invalid and can no longer be accessed. Note that invalid processing files are not
actually deleted until the associated operator is re-executed or the user requests a
Purge operation.
Display of results
To display the results of a successfully processed operator on screen, the user
selects Display mode from the main menu. Point and click on the desired
operator box, allocate one of the available display pages and the display should
appear. Use the “Settings” dialog to set up the display mode and scales as
appropriate. Note that the type of display is automatically appropriate for the
type of data object generated by the operator. Thus a profile object results in a
suitable graph, a frequency domain object results in a frequency domain display,
etc.
Zoom mode
Certain displays offer a zoom mode activated simply by clicking at any corner of
the area to be zoomed and dragging the cursor to the opposite corner. Within the
zoom mode certain interactive tools may be available, for example trace database
display and editing, on-screen event picking, spectral analysis and hodogram.
Printing
When a display has been viewed on-screen it may be committed to paper using
the Print command from the File menu in the usual Windows way. VSProwess
can print to any printer supported by Windows (i.e. just about all of them).
Subject only to any limitations of the particular printer and driver combination
used, the print should be produced accurately to the specified scale. Printing is
resolution independent: the better the resolution of the printer, the better the
quality of the printed result.
Electronic interchange
To enable the electronic interchange of plots you must purchase and install
Adobe Acrobat software. The Acrobat PDF-writer appears to VSProwess as just
another printer option. Acrobat 4.0 allows multiple PDF (portable document
format) files to be merged into one electronic document.
Multiple component processing
A dataset may and usually does contain multiple components (twigs). An
operator normally processes all components identically, except for certain
special operators. This is a very useful feature for VSP processing because most
datasets are acquired using three component receivers.
Data organisation
VSProwess uses a hierarchical data model suited to the requirements of VSP
data acquisition and processing techniques. The actual folder structure closely
reflects this hierarchy. The top level of the hierarchy is the VSP folder. Beneath
this there are three main sub-folders, /Routes, /Welltrak and /Jobs. /Routes is a
library of user created route templates, /Welltrak is a library of well-track
databases generated by the Welltrak application and /Jobs is the parent folder for
all processing jobs. A job sub-folder is named by the user at the time the job is
created.
There may also be an /Operators folder under the VSP folder containing client-specific operator information and executables.
VSP folder
All of your work and data files are (by default) stored under the "C:\VSP"
working folder. This working folder can be placed on a different drive or
network share by specifying the desired path as a command line parameter when
Prowess is started. This is done for you during installation, but you can change
to a new location by editing the shortcut properties.
For example, to place the working folder on drive D:
<installation folder>\Prowess.exe D:\VSP
To place the working folder on a network share:
<installation folder>\Prowess.exe \\Computer_name\Share_name\VSP
You could have several VSP folders on one drive. This might be useful to organise jobs from the same client:
<installation folder>\Prowess.exe C:\VSPmyclient
Job
A job consists of a folder containing a worksheet and all of the processing files
created using that worksheet. Usually each VSP survey would be allocated one
or more job names. For example, a dual offset source VSP might be given two
jobs, one for each source, since it is probable that the datasets from the two
sources would be processed separately, at least in the early stages. A third job
might be created later to combine the processed results in some manner. It is
possible to perform the processing of both sources within the same job because a
worksheet may become arbitrarily large.
The job folder contains both the route files and the processing files. The route
files are quite small, but processing files may consume a large amount of disk
space. Use the "purge all" operation to delete all of the processing files for an
inactive job. This will leave the route files intact so that the processing may
easily be reproduced.
Remember that the processing files for an operator are not actually destroyed
when that operator is "invalidated". Thus, it is possible for a job to seem empty
while still consuming hundreds of megabytes of disk space. The "purge invalid"
command deletes these invalidated processing files while retaining any valid
data.
Opid
Each operator inserted into a worksheet is allocated a unique number within that
worksheet. This is the operator identity (opid) number. One of the uses of this
opid number is to form a unique folder name in which the results of the
operator’s processing (the output dataset) may be stored. The opid also
differentiates between multiple instances of the same operator ensuring that each
instance executes with the correct processing parameters.
Trace
Trace numbers are allocated sequentially each time an output dataset is
generated by an executing operator. Any original field record number is
preserved in the header database for later reference. A single trace is actually
multi-dimensional and may have several components (“twigs”).
Twig
A dataset may be described by more than one measurement, for example, the
three orthogonal components of a borehole receiver. Often these measurements
will be combined to produce a dataset containing a mixture of one or more of the
measurements and it is therefore important that the relationship between
corresponding measurements is retained. In VSProwess, the corresponding
measurements are stored with the same trace number but in different subfolders,
referred to as "twigs". Therefore, a trace may be considered to possess multiple
twigs. For most operators the same processing is applied to each twig of a trace,
although there are some exceptions. In this way, it is easy to retain the
relationship between corresponding measurements.
It is possible to Prune off selected twigs, apply different processes and then Graft
the twigs back together. All twigs of a trace should be intimately related, but
they do not necessarily have to be components. For example a source signature
for a particular trace contains essential information for that trace but is certainly
not a component of the received down hole signal.
Descriptors
Twigs are allocated descriptor strings for identification. The most common are
VZ, HX and HY, the orthogonal vertical, transverse and axial components of a
typical borehole receiver.
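Putting the pieces above together, a job might be laid out on disk roughly as sketched below. The exact folder and file names are assumptions for illustration only; only the /Routes, /Welltrak, /Jobs and /Operators folders and the opid/twig idea come from the text above.

    C:\VSP                    working folder
      \Routes                 library of route templates
      \Welltrak               well-track databases
      \Operators              optional client-specific operators
      \Jobs
        \MyJob                one folder per job (worksheet and route files)
          \17                 processing files for the operator with opid 17
            \VZ               one sub-folder per twig (component)
            \HX
            \HY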
File Commands
Job manager
The Job manager dialog may be used to create a new job or to switch between
existing jobs.
Select a job from the list of all currently available jobs. When the OK button is
clicked any existing route in the selected job folder is loaded. The route appears
in the same state as last viewed.
To create a new job use the Create button; you will be prompted for a name for the new job. When the OK button is clicked a folder structure for the new job is created, after which the "Route template" dialog appears, containing a list of the available standard and user-defined processing routes for selection. If a
completely new route is required then choose the "New" option and insert
operators as required.
If you wish to change parameters that have been locked or to use Edit mode to
alter the route then check the “Allow editing” box.
Purge processing
Remember that the processing files for an operator are not actually destroyed
when that operator is "invalidated". Thus, it is possible for a job to seem empty
while still consuming hundreds of megabytes of disk space. The "purge invalid"
command deletes these invalidated processing files while retaining any valid
data.
Purge invalid
Delete any processing files in the current job for operators that are marked
invalid or are pending execution. This option is useful for recovering disk space
consumed by a previous processing run.
Purge all
Deletes all processing files in the current job. Use with caution.
Welltrak
Launch Welltrak to import a deviation database into the current job. See
Welltrak help for more information.
Input operators MIRFinput, SEGYinput, SEG2input use the current job as the
default location of Welltrak databases.
ACQinput uses the Welltrak database in the folder defined by the input path
parameter. If necessary, launch Welltrak from ACQ to make sure wellhead
coordinates are suitable.
Input operators ACQinput, MIRFinput, SEGYinput and SEG2input set wellhead
(UTM) coordinates in the dataset database. The wellhead (UTM) coordinates are
found from the wellhead entry of the welltrak file. They also calculate RCX,
RCY and TVD values from the welltrak database using MD as a key. RCX and
RCY values are referenced to the wellhead.
The NMO operator uses wellhead (UTM) coordinates to reference its XOFF and
YOFF twig values. Hence Bin operator parameters may not be referenced to the
wellhead.
Wellview has access to all welltrak databases in the current job folder.
Wellview adds the wellhead UTM coordinates from the displayed opid to all source, receiver and image point coordinates before display.
Logs
The following programs are available for processing continuous velocity logs
and generating synthetic seismograms.
LogEdit
LogEdit is not yet available.
LogSynth
Launch LogSynth to generate a synthetic seismogram from logs generated by the
LogCal application. Full help for LogSynth can be found from the application.
Reports and other logs exported by LogCal are used to process the synthetic
seismogram. CSV formats exported from LogSynth can be read by CSVimport.
To display reflection coefficients use the Sparse spike display option with Tscale resampling turned off. You will need to rename the required column of the exported report to SYN or VZ in order to import it using the CSVimport operator.
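As a minimal Python sketch of that renaming step (the input column name "Synthetic" and the file names are assumptions for illustration; only the target names SYN and VZ come from the text above):

    import csv
    with open("report.csv", newline="") as src, \
         open("report_SYN.csv", "w", newline="") as dst:
        reader, writer = csv.reader(src), csv.writer(dst)
        header = next(reader)
        # rename the assumed "Synthetic" column to SYN so CSVimport can find it
        writer.writerow(["SYN" if name == "Synthetic" else name for name in header])
        writer.writerows(reader)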
Edit a route

Edit
Create or alter the structure of a processing route worksheet. Editing is only
available if the Allow editing option has been selected in the Job manager
dialog. Editing is not available in Execute mode.
Move
Move an operator using a drag operation on the central area.
Link
Make a link between operators using a drag and release operation from the
output node of one operator to the input node of another operator.
Unlink
Break a link between two operators by selecting the input node of the second operator. The link is highlighted while the mouse button is held down. To cancel the unlink operation, drag the mouse away from the link before releasing the button. Any processed operators affected by the unlink operation will be indicated. If you unlink a processed operator, it becomes orphaned. You can display and execute below an orphaned operator, but linking to it will cause dependent data to become invalid.
Delete
Delete a selected operator.
Stretch
Row down. Insert a blank row into the processing route by moving all operators
from the selected operator down one row.
Row up. Remove a blank row from the processing route by moving all operators
from the selected operator up one row. The operation will only succeed if a
blank row occurs directly below the selected operator.
Column right. Insert a blank column into the processing route by moving all
operators from and to the right of the selected operator to the right by one
column.
Column left. Remove a blank column from the processing route by moving all
operators from and to the right of the selected operator to the left by one column.
The operation will only succeed if a blank column exists to the left of the
selected operator.
Title
Assign a title string to the selected operator. The title appears to the left of the
operator on the processing route unless the Right title box is checked. Titles may
be used to identify important stages within a processing route and are used as
default titles for printed displays. See Composite displays for more on title.
Edit per-twig description used during display. This description appears
immediately after the automatic label twig identifier. It is only available for the
first five twigs.
Label
Edit a line of text that will be added to the label file, immediately after the
selected operator has been executed.
This annotation is stored with the route, so it is not lost if the selected operator is re-executed. It will be saved when a route template file is created.
Optionally, you can clear any existing label before this line is added. This is
useful for displays where you do not want the detail of earlier processing to
appear on the display.
Operator insertion
To insert a new operator into a route, select the required operator from the list of operators and either click the Insert button or just double-click the list item. New operators are inserted at the nearest empty grid position to where the context menu was activated.
Copy
Copy the selected operator.
Paste
Paste the currently copied operator near where the context menu was activated.
All parameters are copied but not processed data.
Execute a route

Execute mode
Execute mode is used to execute the processing route. Select the range, or ranges, of operators to be executed by clicking on the bounding operators, then press the Start button.
Start of processing
Execution can start at one of the special input operators, or at any operator
provided all inputs to that operator have already completed execution, as
indicated by blue links. The operators to be executed change colour to red.
Selecting the operator again will restore the original status of the operator.
Be aware that executing a previously executed operator invalidates the existing
processing results, along with the results of all dependent operators up to and
including any output operator. Such operators are indicated by a light red colour.
Execution sequence
Operators are processed sequentially. Parallel operators are processed in an
arbitrary order determined by the opid numbers. During the execution of each
operator, a dialog box is displayed indicating progress and providing two
methods of interrupting execution.
The compatibility of the input data for each operator is checked before
execution. If a problem is found the operator execution is aborted.
Progress dialog
The progress dialog is displayed while an operator is running. If an operator
appears to be taking too much time, it can be aborted or "killed".
Abort
The Abort button is the normal method of aborting an operator. If the operator
responds to the abort request then it was executing normally and probably would
eventually have completed its work.
Kill
Use the Kill button if the Abort button is ineffective. It destroys the operator
process, allowing VSProwess to recover from a crashed or hung operator.
Display results

Display selection
You can associate any processed dataset with any one of the twelve available display pages.
To display a dataset
Double click on a processed dataset and the display page allocation dialog should appear. Select the page number that you wish to associate with the dataset.
If "Now" is left checked then the system goes into View mode and the selected
dataset is immediately rendered onto the screen. Uncheck "Now" if you want to
stay in display selection mode.
If “Load settings” is checked, you can load default page settings from another
page or from stored settings. For more details see Page allocation section below.
Page settings
The following options are available for the currently selected page:
Change the Settings of the selected page.
Load the settings from another page (.cfg) or from Stored settings (.txt). If you
choose to “Reset Xscale limits”, default X scale limits will be loaded from the
dataset database.
Store settings for use at some other time.
Change the scrollable Limits.
Use Show Page to view the selected page.
Use Clear page to de-allocate the selected page, releasing memory resources.
Composite displays
Use Settings to configure the composite display from the available displayed
pages. Only one set of composite display settings is available at a time. See
composite display section for more information on composite settings.
Store composite display settings if you want to use them again. Composite
display settings are stored in the Composite folder for the job. They consist of a
text file containing composite settings and plot settings for each individual
display page in the composite display. The opid title, seen on the route, is used to
identify the dataset required for each page in the composite. Each opid in a
composite display must have a title before the composite can be stored.
Load composite display settings to reactivate the selected composite display.
Note that the opid title on the route is used to load the required dataset into the
relevant display page. If that opid does not have suitable data, it will not be
displayed. If you choose to “Update Xscale limits during load”, default X scale
limits will be loaded from the dataset database.
Use PrintPreview and Print to preview and print the composite display.
Use Clear to clear all composite settings ready to start a new composite.
Page selection
It is possible to switch rapidly between allocated pages using either the page
number buttons or the keyboard (keys 1-12) or the up and down arrow keys. The
keyboard is fastest and allows displays to be compared in a similar way to
flicking between paper plots.
Twigs
Any combination of the available twigs may be paneled onto one display.
Geos
Any combination of available geophones (receivers) may be selected. Any range
marking and interpolation is carried out on all receivers, not just the currently
displayed receivers.
Zoom
There are two mouse drag operations available on the main display. If Dip is
unchecked, then a drag operation may be used to define an area of the display to
zoom, which takes you to the Zoom mode.
Paneled (multiple twig) displays cannot be zoomed.
Dip
With Dip checked a drag operation provides a method of determining the dip
(gradient) of a particular data alignment. This information is useful for choosing
the dip search and dip accept ranges for spatial filters such as Enhance. Units for
the dip measurement are the same as the units of the display axes and are shown
on the status bar. For example if the mapping parameter of a metric time domain
display is Measured depth, then the dip is presented in milliseconds per metre.
Polygon
“Polygon mode” is only available for wave number (FK) displays.
Zslice
The display of multiple three-dimensional datasets in VSProwess is limited to a time or depth slice. On selecting “Zslice” you will be asked to select the required XY coordinates: target, receiver or source. A square appears in the top left-hand corner of the display. Place the cursor over a trace of interest in the main display and select it using the left mouse button. The position of that trace
within the XY grid will be highlighted. Drag the cursor up and down to view
each Z (time/depth) slice. This option is intended for use with large cubes of
data.
Label
“Edit label”, allows the dataset label to be edited.
Clear
Clear de-allocates the display, releasing memory resources.
Twig
Select the twig to be displayed.
ExportDB
Produces a CSV file from the currently displayed plot. See the DBupdate operator for information on the format of this file. The CSV file contains one entry per trace keyed on trace number (TR). You may choose to save only a restricted number of database values. Use ExportDB to preserve interactive trace editing
such as trace marking and time picks. To import an exported CSV file use the
DBupdate operator.
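As an illustration only (the file name and any columns other than TR are assumptions), an exported file can be inspected with a few lines of Python before it is passed to DBupdate:

    import csv
    with open("exported_db.csv", newline="") as f:
        rows = {int(row["TR"]): row for row in csv.DictReader(f)}
    print(rows.get(12))   # database entry for a hypothetical trace number 12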
Zoom mode
Zoom mode is used to view a portion of the main display in detail and provides
access to a number of trace analysis tools.
Use the Cancel button or the Esc key to return to the main display view. If
changes are made to the database, you will be prompted to save the changes. If
changes are saved, subsequent processing will be invalidated.
Cursor
Select the cursor type to be displayed. Make sure to select the correct cursor
before carrying out an extensive picking operation. Remember that Aux cursor
picks are not saved. See also Unpick trace function.
Selecting a trace
To select a trace, click the mouse with the cursor near the trace to be selected.
The selected trace is drawn in blue and information is displayed in the status bar.
The cursor keys may be used to step to an adjacent trace.
It is also possible to select a trace by clicking the right mouse button near the
trace to be selected. A menu will popup containing trace functions.
Deselecting a trace
To deselect a trace, press the Escape key.
Zoom utilities
Unpick all: Unpick all times for the selected cursor.
Unmark all: Unmark all traces.
Interpolate: Any unpicked (pick time = 0) times for the selected cursor receive a linearly interpolated/extrapolated pick. There must be at least two picks for the interpolation to proceed. This function is particularly helpful when picking indistinct arrival events such as a secondary shear wave.
Un-interpolate: Undo the interpolate operation. Any manually adjusted picks remain.
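A rough numpy sketch of the interpolation idea (not the VSProwess implementation; the choice of depth as the interpolation axis and the values shown are assumptions, and numpy.interp clamps rather than extrapolates at the ends):

    import numpy as np
    depths = np.array([1000.0, 1010.0, 1020.0, 1030.0])   # one value per trace
    picks  = np.array([ 812.0,    0.0,    0.0,  830.0])   # ms; 0 means unpicked
    picked = picks != 0
    if picked.sum() >= 2:                                  # at least two picks needed
        picks[~picked] = np.interp(depths[~picked], depths[picked], picks[picked])
    print(picks)   # [812. 818. 824. 830.]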
Trace functions
Some of these functions are only available from the menu invoked by clicking
the right mouse button over a trace.
Mark
Mark toggles the marked status of the selected trace. This is used in conjunction
with the Select operator. To Mark (or Unmark) a range of traces, first Mark
the starting trace, then select the last trace of your range and use Mark range (or
Unmark range). It is also possible to mark traces using the Mark operator.
Mark/Unmark record
Mark (or Unmark) all traces that have the same field record number as the
currently selected trace.
Spectra
Use Spectra to obtain a spectral analysis centered at the cursor time.
Database edit/Properties
Use “Database edit” to view and edit information in the trace database. This is
available by choosing Properties on the popup menu.
Unpick
Unpick the current cursor time of the selected trace.
Trace dump
Generate a CSV file of the currently selected trace.
Spectral analysis
The Spectra mode displays the spectral analysis of the selected zoomed trace
centered at the cursor time. Any combination of the time domain, frequency and
phase spectra may be displayed.
Phase graph
The phase response is blanked at frequencies where the spectral amplitude falls
off the graph. This produces a tidier and less confusing plot because the phase
tends to go haywire over the frequency regions where the signal energy has
dropped close to the noise floor.
It is possible to subtract the phase of the stored analysis from the current
analysis. One use for this option is to compare vibroseis reference sweeps. If the
phase difference is less than 20 degrees, the display range of the phase graph will
decrease.
Store
The analysis may be stored in memory for comparison.
Show
Show or hide the stored analysis.
Subtract
Subtract the phase of the stored analysis from the current analysis. When this
option is selected the phase graph will automatically be displayed and the
Normalise setting will be turned on.
Settings
The following options are available from the Settings dialog.
Graph selection
By default, data is displayed in the frequency and time domain. Either graph may
be turned off. There is also an option to display phase response.
Window
A Hann window may be applied to minimise distortion caused by waveform
truncation.
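A minimal numpy sketch of the idea, not the VSProwess algorithm (the sample interval and window length are assumptions):

    import numpy as np
    dt = 0.001                                   # assumed 1 ms sample interval
    segment = np.random.randn(256)               # stand-in for the windowed trace data
    tapered = segment * np.hanning(segment.size) # Hann taper reduces truncation leakage
    spectrum = np.fft.rfft(tapered)
    freqs = np.fft.rfftfreq(segment.size, dt)    # frequency axis in Hz
    amplitude = np.abs(spectrum)
    phase = np.degrees(np.angle(spectrum))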
Normalise
Self normalise both the current and the stored analysis. This option is
automatically turned on when phase subtraction is displayed.
Key position
Choose the position of the key within the time, frequency or phase graphs.
Key text
Edit the text that will appear in the key and select whether the dc and rms values
are to be displayed.
Hodogram display
Hodogram mode is only available when at least two suitable twigs are present in
the dataset.
Hodogram mode displays a three dimensional hodogram (particle motion
analysis) of the currently selected trace, centred at the cursor time. The azimuth
and inclination of the hodogram may be varied by dragging the mouse over the
three dimensional area of the display.
Magnitude
Select the “Magnitude” setting to change the time domain view to signal
magnitude. This mode is especially useful if the first arrival energy is coming
from a direction significantly away from the vertical axis, as may occur for
example with fixed element receivers in a deviated borehole.
Dynamic
In dynamic mode, the azimuth and inclination of the hodogram are changed
dynamically as an aid to interpretation.
Settings
The window length (milliseconds) may be set as required. The default setting is
from 5ms before the pick time to 5ms after. The time scale of the time domain
window may also be changed from the default value of 100 cm/sec.
Show
Show or hide the stored hodogram.
Trace
Step through traces using this item. Trace stepping can also be accomplished
using the keyboard up and down arrow keys.
Grouping
This allows various trace combinations to be displayed. Select Adjacent to view a number of traces around the currently selected trace. If two adjacent traces are requested, five traces in total will be displayed: two traces on each side of the currently selected trace. This option is useful for comparing waveforms, e.g. for
timing.
Select Record to display all traces with the same record number as the currently
selected trace. Trace stepping is restricted within the current record, but use the
associated spin buttons or the page up/page down keys to step to the next record.
Rotation
On entering the hodogram display, rotation mode is set to None. Traces in the time domain window are not rotated.
Select Azimuth to view X and Y time domain traces rotated using the current
azimuth. Dragging the 3D hodogram changes the azimuth. Inclination is rotated
to zero in this mode, hence the Z trace is unchanged. X’ is the trace rotated in line
with the azimuth.
Select Inclination to view Z and X’ traces rotated around the Y’ axis using the current inclination. Dragging the 3D hodogram changes the inclination. Z’ is rotated in line with the inclination.
Select Free to rotate freely by dragging the 3D hodogram.
Y’ = - X.sin(azimuth) + Y.cos(azimuth)
X’ = X.cos(azimuth) + Y.sin(azimuth)
X’’ = - X’.sin(inclination) + Z.cos(inclination)
Z’ = X’.cos(inclination) + Z.sin(inclination)
Azimuth is defined clockwise from the Y axis.
Inclination is defined clockwise from the Z axis.
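The same relationships written as a small Python sketch (angles in degrees, converted to radians for numpy; an illustration of the formulas above, not the VSProwess code):

    import numpy as np
    def rotate(X, Y, Z, azimuth_deg, inclination_deg):
        az, inc = np.radians(azimuth_deg), np.radians(inclination_deg)
        Xp  =  X * np.cos(az)  + Y * np.sin(az)    # X'
        Yp  = -X * np.sin(az)  + Y * np.cos(az)    # Y'
        Xpp = -Xp * np.sin(inc) + Z * np.cos(inc)  # X''
        Zp  =  Xp * np.cos(inc) + Z * np.sin(inc)  # Z'
        return Xpp, Yp, Zp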
Rotated traces are coloured green. Only the active trace of a grouping is rotated.
Rotated traces are not saved. Apply rotations using the Polarize, Rotate or
HRotate operators.
Preview
This option is available on the right click menu and previews the result of
Polarize and Rotate operators.
Polarize evaluates rotation angles using samples at the pick time from the requested pair of twigs. Rotate uses rotation angles from angle1 or angle2 of the
dataset database.
Save azimuth/Inclination
This option is available on the right click menu. It uses the current
azimuth/inclination angle to set a value in angle1 or angle2 of the dataset
database. The Rotate operator can then be used to apply the rotation. Rotate
assumes angle1 defines rotation to the first of the twig pair to rotate so angle1 is
set to 90+azimuth.
GroupView
GroupView mode displays a group of traces linked by the original field record number of the currently selected zoom trace. It is based on the record trace display of field records in the acquisition software, ACQ.
GroupView requires field channel numbers and field record numbers to be set in
the dataset database. These should be available for datasets imported by
ACQinput and MIRFinput. See the SEGYinput documentation for information on
setting up field channel numbers if they are not available in a SEGY file.
Access GroupView from the context menu over the required operator and select a record group to view.
Timescale accuracy
Because the actual size of the monitor display area is unknown to the program,
the time scale can only be a rough approximation on the screen. Printed display
views are accurately scaled.
Display pages
There are eight display pages available, each of which can be configured to show
more clearly some particular aspect of the data. Any two or more pages may be
linked into a chain, which you can step through using the spacebar.
Typically the default page is set up to show all channels with the time scale
compressed so that the full record length is visible. A second page might show
all components at an expanded time scale. A third page might show just the
reference and vertical components at a more expanded time scale for accurate
cursor adjustment.
Cursors
Display cursors allow for the accurate measurement of event arrival times. The
position of a display cursor also determines the centre of the region of data to be
used by the spectral analysis and hodogram tools.
Linked cursors
Separate display cursors are provided for each channel, but cursors associated
with the same receiver are linked together, so that, for example, moving the
cursor for a VZ channel also moves the cursors for the associated HX, HY and
DH channels.
Active channel
The last channel to have been selected becomes the “active” channel. Boxes are
drawn around both the channel identification and attribute fields, which appear, respectively, in the left and right margins of the display area.
To change the active channel without moving the associated cursor, select the
channel within the right-hand margin.
If you change to a display page configured not to show the active channel, then
the top channel automatically becomes the new active channel.
The status bar at the bottom of the display shows various attributes for the active
channel, including channel allocation, cursor time and amplitude at the cursor.
Moving cursors
Use the mouse to quickly drag (mouse button held down) a cursor near to the
desired time. If the cursor is initially off screen, it will snap to the mouse pointer
after a small amount of drag.
Fine adjustment is easily accomplished by clicking the mouse button while the
pointer is to the left or right of the cursor. The cursor moves left or right by just
one step per click. The size of the step is determined by the current sample step
setting, which can be as small as one tenth of a sample, allowing very precise
sub-sample picks.
Spectra/Hodogram
These tools are described in detail elsewhere. The current channel and cursor
position are analysed by these tools.
Next/Previous
Step forward or backward through field record numbers. The active trace in the
zoom window mirrors this trace movement.
Page
Change the current display page. It is normally more convenient to use the
spacebar to step through a linked list of pages.
Attribute
Any one of the following channel attributes may be displayed.
Amp shows the signal amplitude in volts at the cursor time.
Ident shows the trace descriptors.
SNR gives an estimate for the signal to noise ratio of a window centred on the
cursor time. This is useful as a data quality check.
Tabs shows absolute cursor times relative to the start of the record.
Amplify
This control allows you to apply additional gain in 6dB steps. The additional
gain is applied to all channels, after the usual normalisation.
For example, extra gain is sometimes required to emphasize the direct arrival if
the channel has normalised to a higher magnitude tube wave.
Normalise
There are three types of normalisation available: Trace, Cross and Fixed. The
default normalisation type can be selected from the settings dialog. The display
reverts to the default type after record and replay. The normalisation type can be
selected on a temporary basis from the display dialog bar, or use the “X” key to
toggle between Trace and Cross normalisation types.
Trace means that the display gain applied to each channel is just sufficient so
that the highest magnitude signal within that channel is not clipped on the
screen. In other words, each trace is independently normalised.
Selecting Cross displays the channels cross normalised to the highest magnitude
signal within an appropriate group of channels. The geophone components (VZ,
HX, HY) form one group and down-hole hydrophones (DH) form another,
because of the large difference in magnitude between these two classes of
transducer. Reference traces are not cross normalised.
Use Fixed to display the geophone components normalised to a user value
specified in the settings dialog.
Auto-pick
This button auto-picks the data as displayed on the current display page.
Automatic picking is performed according to the currently configured settings.
Pick criteria are described under the AutoPick operator.
All currently displayed components for a particular group are combined before
picking. For example, if only the vertical component is currently displayed, that
component alone is used for picking; if all or any combination of VZ, HX and
HY are displayed, the pick is found from the magnitude of the components
present.
The length of data searched during autopick is determined by the current scroll
position and time scale.
Settings
The settings dialog provides the following controls.
The channels associated with a particular geophone (receiver) may be included
or excluded from the display page.
The channels associated with a particular descriptor (twig) may be included or
excluded from the display page. For example, you might set up a page showing
only the vertical components (VZ descriptor).
The traces are normally displayed in channel order down the screen. Select the
group components option to group all similar components together, e.g. all of
the vertical components followed by all of the X horizontals, etc.
Overlay all selected components of a tool, using Overlay.
Select the Lock REF/Lock AUX at zero option to ensure that a reference or
auxiliary signal remains visible on a page with an expanded time scale. Locked
traces are drawn in a different colour.
Enter the required start time, in milliseconds, for the reference trace using the
Lock REF at option.
Autopick parameters for Geo and Ref can be configured. See AutoPick operator
for more details.
Choose a display style from the Trace mode drop list. Standard, Alternate,
Scheme1 and Scheme2 draw wiggle traces in various colours. Grey scale draws
the traces as shades of grey depending on trace amplitude and is most useful for
multi-geophone, grouped component displays.
VibQC
Vibrator quality control (VibQC), available through GroupView, has been
designed for vibrator QC.
The various graphs produced by VibQC and their Settings are described below.
Channel allocations
Pilot and Ground-force (GF) sweep channels must be identified for VibQC to
proceed. Reaction-mass (RM) and Base-plate (BP) outputs may also be assigned.
By default, VibQC identifies the required channels from the dataset database
twig descriptors. This behaviour may be overridden, allowing channels to be
assigned using drop lists.
Profile view mode
Settings
Select which graphs (time, velocity, magnitude and Q, if available) are to be
displayed. Set the start depth for the top of the graphs; this is useful for profiles
without shallow entries, or to effectively zoom in on the bottom of the well.
Choose the units of display, whether to Show points to indicate check level
depths, and whether to show RMS velocity.
VRMS² = Σ(Vi² Δti) / Σ(Δti)
where Vi and Δti are the interval velocities and interval times from ALL entries
included on the curve. The first interval time and velocity are the average time
and velocity of the shallowest included entry.
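For example, the RMS velocity calculation above can be sketched as follows (illustrative Python/numpy only; the function name is hypothetical):

import numpy as np

def vrms(interval_velocities, interval_times):
    # VRMS^2 = sum(Vi^2 * dti) / sum(dti), as defined above.
    v = np.asarray(interval_velocities, dtype=float)
    dt = np.asarray(interval_times, dtype=float)
    return np.sqrt(np.sum(v * v * dt) / np.sum(dt))

# Example: three intervals with velocities in m/s and interval times in s.
print(vrms([1800.0, 2200.0, 2600.0], [0.10, 0.12, 0.08]))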
Cursor
A small box indicates the current cursor entry. Data for this entry are displayed
on the control panel. To move the current entry, use the cursor, page up, page
down, home or end keys. Alternatively, drag the cursor directly, or to single step
just click above or below the cursor.
Include on curve
Select whether the current cursor entry will appear as a cross on the graph
instead of being joined by a line to adjacent points. A point that has been
excluded from the curve cannot be selected for interval velocity calculation.
Interval velocity
Select whether the depth point at the current cursor entry will be used in the
calculation of interval velocities. Point must be included on curve.
Interval Q
Select whether the depth point at the current cursor entry will be used in the
calculation of interval Q. Point must be included on curve. Interval Q is
calculated from the following equation.
Interval Q between depth 1 and depth 2 =
(Ttran2 – Ttran1)/ [(Ttran2/Q2) – (Ttran1/Q1)]
Q1 = average Q value at depth 1
Q2 = average Q value at depth 2
Ttran1 = transit time from source to receiver at depth 1
Ttran2 = transit time from source to receiver at depth 2
If average Q is calculated from the shallowest receiver, Ttran1 and Ttran2 are
modified by subtracting the transit time to the shallowest receiver.
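A minimal sketch of this calculation (illustrative Python only; names are hypothetical):

def interval_q(q1, q2, ttran1, ttran2, t_shallow=0.0):
    # Interval Q between depth 1 and depth 2, as defined above. If average Q
    # was calculated from the shallowest receiver, pass its transit time as
    # t_shallow so it is subtracted from both transit times first.
    t1 = ttran1 - t_shallow
    t2 = ttran2 - t_shallow
    return (t2 - t1) / (t2 / q2 - t1 / q1)

print(interval_q(q1=60.0, q2=80.0, ttran1=0.40, ttran2=0.55))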
Listing
Display the profile database in a tabulated format suitable for direct printout.
Store
A Profile may be stored in memory for comparison against another one. This
option is not available for multiple pick profiles.
Rotate
Rotate through the currently available multiple pick profiles making each the
active profile in turn. This option is only available when multiple pick profiles
are present.
Show
Show or hide the stored profile.
Insert
You can add additional depth/velocity pairs to extend the profile database above
or below the region covered by the VSP. For example to improve the accuracy of
a subsequent inversion operation by including estimated velocities from below
the deepest recorded level.
Delete
Delete an inserted point.
Edit
Edit an inserted point.
Wellview mode
This special three-dimensional display mode is useful to visualise source and
receiver locations, and you can choose to add a WellTrak well deviation
database if available. This display mode is also useful to visualise the results of
special studies such as fracture monitoring and salt-flank proximity analysis.
Certain display options are intended for fracture monitoring/passive seismic
applications. These include Timeline and Monitoring information such as
pressure data.
Access Wellview by selecting the context menu over the required operator.
Drag the screen to rotate the viewpoint in three dimensions.
The wellhead marks the origin of the display. The square base of the well
representation lies at the reference level elevation unless the start depth of the
grid is greater than the reference level elevation; in this case, the wellhead
appears as a square with a cross on the base of the display.
Click near a receiver, source or image point to view information about its position.
Use the arrow up and down keys to step through the database in trace order. Use
Page up and Page down keyboard keys to step through record groups.
Once a point has been selected, it is possible to zoom in on that point. Selected
traces that are marked in the dataset database are coloured yellow.
Press P to switch to plan view.
Settings
Various options are provided to allow the user to configure the display.
Grid: Turn the grid lines on or off.
Receivers: If selected, receiver coordinates (TVDSD, RCX, RCY) are displayed
as small diamonds. When outside the depth range of the display, receiver
coordinates appear as squares on the base of the display.
More
Select OK More to save the current settings and access further settings.
TVDSD,X,Y,YEAR,MONTH,DAY,HOUR,MINUTE,SECOND,ANNOT
m,m,m,n,n,n,n,n,n,n
1000,200,-300,2005,5,29,15,10,5,some annotation
2000,200,-400,2005,5,29,16,20,7,some other annotation
The locations can be displayed as single points or joined, both in a user-defined
colour. The X and Y coordinates must be relative to the origin of the 3D view.
Units are specified by the second line.
The information can be annotated with the filename OR the last field of a row.
A date and time of the file can be specified OR each row can have its own date
and time. This allows the points to be displayed as part of the Timeline.
Welltrack
Browse to find a Welltrak well deviation database (*.welltrak) file. You will be
prompted to supply WRE (well reference elevation) and colour for each selected
welltrak. Default WRE is taken from the currently displayed dataset database.
Marked
This selectable button indicates and sets the marked state of the selected trace.
Label
Add text to the display label (see display label documentation).
Attribute
Use this option to display colour-coded database information at either source,
receiver or image point locations. Select the database attributes from the
dropdown list or type in any other database identifier; see appendices.
FSD: If selected, all relevant database values to be displayed are searched and
the maximum and minimum values are used to configure the colour scale. If
FSD is not selected, you may choose the limits yourself. This is useful when
comparing views from other opids.
Only show information for geo <n>: Only database entries matching the
selected geophone number will be displayed. Note this has implications for FSD.
Only show information at selected MD, Only show information at selected
record: Only relevant database entries will be displayed.
For Image points and receivers select FORMATION to colour the point by
formation colour.
Timeline settings
Access settings for Timeline and Monitoring graphs. These graphs are mainly
intended for monitoring passive seismic event statistics but may also be useful
for other survey types.
There are two types of graphs. One shows information from the dataset database,
while the other reads information from a CSV file, e.g. Borehole Pressure.
The default length/duration of the Monitor graphs is defined by the earliest and
latest times of record in the dataset database, or by the current Time-line time.
The user can override the start and end times or return to the default times.
If the Use Event Window to Window Timeline replay check box is selected, the
Event Window interval is used to specify a time window for the Time-line
replay. Only information within the time window is displayed.
Percentage of display allocated to Time-line graphs: defines how much of the
screen/page is taken up by the time-line graphs. A zero width removes the
graphs from the display.
Time-line graphs annotation grid (hours): defines how often times are
annotated at the base of the group of graphs, e.g. 0.1 = every 6 minutes.
Up to four dataset database graphs can be displayed: a count of events/records
in a user-defined window, the offset of the image point or receiver from the
source, the depth of various dataset database depth values, and a user-defined
entry from the dataset database. Settings include the height, the maximum and
minimum graph extents and a user-defined attribute.
Monitor File Settings: Select this to access monitor file settings.
The monitor file must be formatted as a CSV file.
The first line of the file that does not have a # in the first position is split and
used as graph annotation.
The first column of the file must contain time values. The start date and time of
the first numerical line of the file must be set so that it can be synchronised with
the dataset database times. The first column can be at a user-specified constant
time interval, or time values can be read from the file using the selected format.
To speed up redrawing, lines can be skipped, thus decimating the file.
Up to three columns can be graphed. The user must allocate the column number
and height in mm of the graph and define the minimum, maximum and colour of
each column displayed.
X scale
There are several possible parameters that control the horizontal position (or
mapping) of a trace. The values of these parameters for each trace are held in the
dataset database. For each parameter, the range of values present in the dataset is
displayed as a guide.
Scale
The “Horizontal scale” and “Trace origin” settings control the mapping from the
chosen parameter to the horizontal position of each trace. A range of standard
horizontal scales is provided. If none of these will do, you can type in your own.
Direction
Following industry practice, the traces are normally displayed from left to right
in descending order of mapping parameter value. This order may be reversed
using the Direction setting. The exception is constant trace mapping, which
displays traces in ascending order from left to right.
Units
Apart from constant trace mapping, “Units of distance” defines how the traces
will be annotated.
Multiple twigs
When multiple twigs are displayed, the page is divided into equal size panels for
each twig. Remember that the number of traces displayed is determined by the
size of the display panel and the horizontal mapping and scale. There is no
guarantee that all traces are visible.
FK plots
For wave-number domain data, a special Width setting controls the width in
millimetres of the display.
Mapping parameter
Select the required mapping parameter from the following options. Use the
default “Constant trace” if you are going to pick arrival times, because this is the
only option which guarantees that traces are never superimposed.
Line
When receiver, source or target offset is requested as a mapping parameter, the
two-dimensional coordinates can be interpreted in a number of ways.
One approach is to Follow the trace mapping, i.e. integrate the offsets between
adjacent traces, starting from the defined Origin. This is useful when comparing
VSP data with “Random line” surface seismic datasets. NB the highest trace
number has the smallest offset.
If Follow trace mapping is not selected, another approach is to project the
coordinates onto a display line. The origin and direction of the display line are
defined by the Origin and End X and Y coordinates. If the start and end
coordinates are the same, the line is not defined.
If the line is defined, Source (SCX, SCY), Receiver (RCX, RCY) and Target
(TCX, TCY) coordinates are projected onto the line. Offsets are calculated from
the line origin. Projected coordinates in the direction from the line origin to the
line end result in positive offsets. Normal plot direction will display the largest
(most positive) offset on the left of the plot; use Reverse plot direction to put the
largest offset on the right. Projected coordinates of the left-most and right-most
traces are displayed in the display label. Use the trace origin display setting to
define the offset of the left-most trace.
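The projection described above amounts to taking the component of each coordinate along the line direction. A sketch, assuming Python/numpy (function and argument names are hypothetical):

import numpy as np

def project_offsets(points_xy, origin_xy, end_xy):
    # Project (x, y) coordinates onto the display line from origin to end and
    # return signed offsets measured from the line origin; offsets are positive
    # in the direction from the origin towards the end of the line.
    p = np.asarray(points_xy, dtype=float)
    o = np.asarray(origin_xy, dtype=float)
    d = np.asarray(end_xy, dtype=float) - o
    length = np.hypot(d[0], d[1])
    if length == 0.0:
        raise ValueError("line is not defined: origin and end coincide")
    return (p - o) @ (d / length)

# Receiver coordinates (RCX, RCY) projected onto a line from (0, 0) to (1000, 0).
print(project_offsets([[100.0, 50.0], [400.0, -20.0]], (0.0, 0.0), (1000.0, 0.0)))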
Associated OPID
The associated opid plot parameter appears when a dataset is displayed against
measured depth, TVD or TVDSD. It allows an OPID containing continuous log
data, e.g. velocity (CVL), density, gamma, etc., to be displayed above the VSP
using the depth scale of the VSP.
The associated dataset must either be sampled on depth (assumed to be TVDSD)
or contain a twig matching the currently chosen depth scale (MD, TVD,
TVDSD). The associated OPID need not be linearly sampled and could have
been imported using CSVimport or LASimport.
Once a relevant OPID has been associated it is necessary to accept the settings
and open them again to access the velocity or other log settings. The width
setting defines the height of the “sideways” log.
Only log data within the depth scale of the display is used. Use the origin option
to extend the log to deeper or shallower depths depending on the direction of
plot (normal or reverse).
Select the displayed twigs of the associated opid using the usual Twig selection
dialog activated from the plot dialog bar. Twig selection appears first for the
main display, select OK and the associated opid twig selection appears. Depth
scales cannot be displayed at present.
Associated opid displays always appear at the top of the display, attributes
appear below associated opid twigs.
Display style
Style Effect
Block colour: Sample amplitude is represented by coloured blocks.
Block grey: Sample amplitude is represented by grey scale blocks.
Block rainbow: Sample amplitude is represented by coloured blocks.
Sparse dots: Sample amplitude is represented by dots.
Sparse spikes: Sample amplitude is represented by lines.
Variable area: Sample amplitude is represented by a wiggle with filled loops to the right of the trace baseline.
Variable area bipolar: The loops are filled with a colour to represent the peak amplitude encountered within the loop in this wiggle trace.
Variable area bipolar2: Like variable area bipolar but with only two colours.
Variable area rainbow: Like variable area bipolar but with a full spectrum of colours.
Wiggle: Sample amplitude is represented by a simple wiggle curve.
Add wiggle
Overlay wiggle traces (see above) of the colour defined in the Colour tab on any
of the above display styles.
Polarity
If Reverse polarity is chosen, each sample amplitude is multiplied by minus one
before display. In “Variable area” style, negative numbers are filled. In “Block
grey” style, negative numbers are filled with darker shades than positive
numbers. In “Block colour” and “Variable area bipolar” styles negative numbers
are shades of blue, positive numbers are shades of red.
Trace width mm
Specifies the maximum trace width in mm.
Arrival curves
Specifies which arrival time will be joined by a line. One use of the curves is for
finding the position of an arrival once it has been subtracted from the data. Three
arrival curves are available; their colours and fill options can be selected from
the Colours settings tab. See also ClientShift documentation.
Fill
If an arrival curve type is selected you can choose to fill the display background
starting at the curve selected for fill to the next available curve. The last area is
filled to the bottom of the display. The fill colour is the curve colour defined on
the colour tab.
Overlay curve
The Overlay curve is turned on by the Overlay operator. If turned on it can be
displayed and/or filled with the colour defined in the Colour tab.
Normalisation
Traces may be displayed either individually normalised (by default) or cross-
normalised.
Trace normalise
Each trace is normalised to its own maximum magnitude. There is no
relationship between the relative amplitudes of each trace.
Cross normalise
All traces are normalised to the same “Normaliser” value. This mode is useful
when magnitude variations across the display are of interest. To increase the
amplitude of traces on a cross normalised display decrease the “Normaliser”
value.
Use the X key during display to select cross normalisation. Use the +/- keys to
double/half the display amplitude.
Max
Shows the maximum sample magnitude found within the dataset. This is useful
for choosing a suitable value for the “Normaliser”.
Colour
You may select your preferred colours for the following items.
Overlay Colour
Select the colour of the overlay outline set by the Overlay operator.
Print options
You have a degree of control over how your plot is printed.
Print title
The default print title is derived from the opid title. If requested, the print title
appears on printed display or at the top of the display label if the display page is
part of a composite display.
Display style
The following display styles are available.
Style Effect
Colour: Sample amplitude is represented by coloured blocks.
Grey: Sample amplitude is represented by grey scale blocks.
Variable area colour: Sample amplitude is represented by filled colour curves.
Range dB
The displayed dynamic range is either selected from a range of values or user specified.
Trace width mm
Specifies the maximum trace width in mm. Only available for variable area style.
Normalisation
Datasets in the frequency domain are always displayed cross-normalised to the
maximum value found in the dataset.
Filter
A dataset may be filtered before display. By default the filter checkbox is
unchecked and displays are not filtered.
Parameters
The filter is a zero-phase band-pass filter defined by a four-point frequency
template with linear ramps. The four frequencies must be monotonic and must be
specified in order of increasing frequency. The highest specified frequency must
be less than the Nyquist frequency for the dataset.
Filter template
Below the first frequency, the samples are set to zero. Between the first and
second frequency the samples are attenuated linearly. Between the second and
third frequency the samples are unchanged. Between the third and the fourth
frequency, the samples are attenuated linearly. Above the fourth frequency, the
samples are set to zero.
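A sketch of this four-point template, assuming Python/numpy (illustrative only; VSProwess's own implementation details are not shown here):

import numpy as np

def template_gain(freqs_hz, f1, f2, f3, f4):
    # Zero below f1, linear ramp up between f1 and f2, unity between f2 and f3,
    # linear ramp down between f3 and f4, zero above f4.
    f = np.asarray(freqs_hz, dtype=float)
    gain = np.zeros_like(f)
    up = (f >= f1) & (f < f2)
    flat = (f >= f2) & (f <= f3)
    down = (f > f3) & (f <= f4)
    gain[up] = (f[up] - f1) / (f2 - f1)
    gain[flat] = 1.0
    gain[down] = (f4 - f[down]) / (f4 - f3)
    return gain

def zero_phase_bandpass(trace, dt_s, f1, f2, f3, f4):
    # Zero phase because only the (real) amplitude spectrum is modified.
    spec = np.fft.rfft(trace)
    freqs = np.fft.rfftfreq(len(trace), dt_s)
    return np.fft.irfft(spec * template_gain(freqs, f1, f2, f3, f4), n=len(trace))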
AGC
An automatic gain correction may be applied. For each trace, an RMS value is
calculated over a moving Window specified in ms. Gain is calculated by
normalising this RMS value to the maximum magnitude in the trace. The Max
gain parameter controls the size of the gain applied. Gains are ramped between
neighbouring windows.
Inversion display
There is a special display mode for transposed data from the Invert operator.
This display provides mirrored versions of the transposed VSP data and the
inverted velocity curve of the specified trace. Use the Prune operator to turn off
this special display. The following display settings are available.
Log display
Logs can be imported into VSProwess using the CSVimport or LASimport
operator. There is a special display mode for such datasets. Logs of various types
can be combined, resulting in a multi-twig dataset. Twig combinations can be
selected using the Twigs button. Null value is –999.25. Null values are not
displayed.
Cursor
The log display does not support a zoom mode, but you can click and hold the
mouse button to display a draggable cursor. Values for all logs are displayed on the
status bar. You must set an appropriate timescale and start time to ensure the
region of interest is within the displayable region.
Velocity
The width in mm, the minimum and maximum values and the velocity scale type
can be entered.
If Log velocity is selected, the inverted velocity will be displayed on a
logarithmic scale for better comparison with a conventional sonic log.
Blocky style is ignored during display of Invert operator and linear sampled logs.
Auxiliary scale
The auxiliary scale is derived from a TVDSD twig for a time domain dataset or a
TWT twig for a depth domain dataset. The auxiliary scale can be turned off.
Depth units and Depth referenced to datum define how a depth scale will be
annotated.
Other logs
The width in mm, minimum and maximum values, major annotation grid values
and colour of the acoustic impedance, density, gamma, resistivity and caliper
logs are configurable.
Multiple log twigs may be overlaid. The twig with the lowest twig number is
the major twig.
Log values outside the max and min values will “wrap around”. Grid values at
major grid steps will be annotated. The major grid value may force the
maximum and minimum values to be adjusted.
Polygon mode
Polygon editing is only available for FK displays. Any number of arbitrary
polygons may be created for use as a template by the FKfilter operator. If no
current polygons are displayed, editing must begin by inserting a new polygon.
The following operations are available for editing polygons. On completion of
each operation, the mode reverts to Move point.
Insert point
Select an edge of a polygon into which a point is to be inserted.
Delete point
Select the polygon point to be deleted.
Move point
Move a point using a drag operation.
Insert polygon
Insert a triangular polygon around the selected cursor position.
Delete polygon
Select inside the polygon to be deleted.
Display label
Each time an operator executes it copies a label file from the input A folder to
the output folder. Certain key operators may add a line or two of text to this label
file in the process. You are free to edit this text or add some extra text of your
own. The text from the label file, which is present in the output folder of a
displayed operator, is superimposed on the display as the label. Some additional
text is added automatically to describe the display parameters.
Remember that the label file for any given operator is replaced each time that
operator is executed. Typically, you would make your changes when you are
ready to make a hard copy (or PDF) of a display. You can also add text to the
label in route edit mode. Any such text is then saved as part of the route.
The operator title from route edit is used as a title for the label. Any twig
dependent titles are shown as part of the title.
Label position
Select the position of the label with respect to the display outline. Select any one
of the four corners within the display, center top or bottom or outside to the top
right. You may also request the label to appear sideways.
Composite settings
Up to twelve display pages can be printed side-by-side on one print page.
The size and orientation of the composite display is dependent on the paper size
(or print area in Adobe Acrobat) chosen from the print driver dialog.
To print or view a composite display use the Print and Print Preview buttons in
the composite group box on the dialog bar.
Well name, client name and contractor name are found from the first plot in the
composite display.
Opid title
The opid title is entered during route Edit. If you wish to store the composite,
each display page must have an opid title.
Display page
The number of each required display page must be selected from the Display
page drop lists. Only currently allocated pages are available. The topmost drop
list in the dialog defines the leftmost display page. The order of subsequent
display pages is defined by row order within the dialog.
Display pages can be printed in any order. A display page may appear more than
once.
Any display that extends beyond the end of the print page will be truncated or
omitted.
Scales and display styles must be configured before the start of printing.
Scale annotation
X and/or Y annotation can be turned off for each display. Annotation consists of
the scale description and the scale numbers.
If displays have the same Y-scale, annotating the first may be enough. Individual
Y-scales will automatically be turned off if the global Y-scale option is chosen.
If X-scale and Y-scale are both turned off all scale information is removed
except grid lines.
Formation tops
Turning on the global-Y-scale allows you to display formation tops, if available.
Formation tops can be set up using the File/Formation tops menu option. Choose
the position of the formation top name, whether to simply display the name and
a line stub only as a key, and whether to fill under the formation top using the
background colour.
Print title
You have to enter the required print title. Remember, individual composite pages
have their own titles independent of this print title. You may choose to display
the title or not.
Overview
In order to be able to take full advantage of the power and flexibility of the
VSProwess signal processing system it is necessary to become familiar with the
full range of available operators. This chapter introduces the operators by
grouping them according to function.
Import/Export operators
The import operators are used to import seismic data into the VSProwess
environment.
Depending upon the recording system in use, a VSP dataset may have been
originally recorded to tape or disk in a number of different proprietary data
formats.
SEGY
SEGYinput imports SEGY, the most common data exchange format. Because
the SEGY standard was designed for surface seismic applications, the available
header information is very often incomplete or ambiguous for borehole seismic data.
The SEGYlist operator provides a means of examining the header information
from a SEGY file.
SEGYoutput exports data from the VSProwess environment in the form of a
SEGY file. The header location usage for exported SEGY files is rigidly defined
(see the appendices) but includes most of the information available within
VSProwess. A file produced by SEGYoutput is an excellent way of archiving a
processed dataset. Such a file may be easily and unambiguously re-imported into
VSProwess by the SEGYinput operator, using the default parameters.
SEGYexport also exports a SEG-Y file but header location usage is user defined.
It should be possible to fit the header specifications of most clients using this
flexible operator.
Some clients have a defined header specification to which SEGY data supplied
to them should conform. Supporting this requirement are the AGIPoutput,
ELFoutput and SCHLUMout operators.
Wavelet
Use the Wavelet operator if you need either a single sample pulse or an arbitrarily complex wavelet.
CSVimport
Import data samples from a CSV file. Designed for importing logs but can be
used for any twig descriptor.
ProfileIn
If you need to import a velocity model from a simple list of depth/velocity pairs,
we have the ProfileIn operator.
DBupdate
Occasionally you will need to make wholesale changes to the trace header
database, for example to add source coordinates. The best tool for this is your
favourite spreadsheet program. Display the initial dataset and use the
“ExportDB” button to export the trace header database. Use your spreadsheet
program to make the necessary changes and be sure to save your work in CSV
format. You can now use the DBupdate operator to re-import your modified
trace database.
Use a similar method to preserve interactive trace editing.
DCsubtract
Calculate and subtract DC.
Deglitch
Remove glitches due to telemetry errors.
Deskew
Remove recording skew.
Arithmetic operators
The arithmetic operators perform simple arithmetic operations and usually work
in any domain. In general, both of the input datasets must be compatible with
each other. They include the Add, Subtract, Multiply and Divide operators, all of
which require two input datasets. If you need to multiply a dataset by a constant
value then use the Scale operator. For example to reverse the polarity use Scale
with the multiplier value set to minus one.
The Magnitude operator only works in the time domain.
Tshift
Perhaps the most ubiquitous of the time domain operators is Tshift. This operator
may be used to apply a fixed time shift, a pick related time shift (for event
alignment) or both simultaneously. Internally Tshift actually works in the
frequency domain, which enables it to apply sub-sample time shifts. Tshift is
also useful for truncating trace length or extending trace length by zero padding.
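The frequency-domain approach to sub-sample shifting can be sketched as follows (illustrative Python/numpy, not the Tshift implementation itself):

import numpy as np

def time_shift(trace, dt_s, shift_s):
    # Apply a (possibly sub-sample) time shift by adding a linear phase ramp in
    # the frequency domain; a positive shift_s delays the trace. The shift is
    # circular, so pad with zeros first if wrap-around at the ends matters.
    spec = np.fft.rfft(trace)
    freqs = np.fft.rfftfreq(len(trace), dt_s)
    return np.fft.irfft(spec * np.exp(-2j * np.pi * freqs * shift_s), n=len(trace))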
Tstatics
If all of the necessary information is in the input database, then this module will
apply the correct static corrections to a dataset.
FrontBlank/RearBlank
This pair of operators may be used to “blank” (set to zero) sections of the
dataset.
AutoPick
The AutoPick operator can save you a lot of time by automatically picking
arrival times (one-tenth sample resolution). This operator is not foolproof and
you should always examine the resulting arrival curve carefully.
Equalise/Unequalise
Non-linear spatial filtering processes (median filters) require that adjacent traces
be of comparable amplitude. The Equalise operator may be useful to compensate
for the effects of variable source amplitude. The Unequalise operator is able to
undo the effects of the last Equalise.
Tramp
This operator is normally used to compensate for the amplitude variation across
a dataset caused by the effects of spherical divergence.
Enhance
The Enhance operator is a powerful median based spatial filter. It may be used to
enhance horizontal or sloping events in a dataset.
Integrate
Integrate works only in the time domain and typically is used to transform data
from accelerometers into a form more comparable with conventional velocity
sensitive geophones.
Flow-pass/Fbandpass/Fbandstop
These operators are used to apply zero-phase filtering in the frequency domain.
Fbutter
The Fbutter operator implements a Butterworth band-pass filter with selectable
filter slopes. Butterworth filters are useful because they have a minimum phase
characteristic.
PhaseRot
The PhaseRot operator is used to rotate the phase of a dataset, perhaps to
approximate a better match to a seismic wavelet.
Fcollapse/Fexpand
This pair of operators is used to halve or double the sampling time interval of a
dataset. Be careful when using Fcollapse to first apply a suitable anti-alias filter.
Conjugate
The Conjugate operator obviously works only with complex data (i.e. frequency
domain).
FXresample
Sometimes, for geophysical reasons, borehole seismic is acquired with variable
receiver geometry. Certain processes, notably FK transformations, may require
constant trace separation. The FXresample operator can be used to resample a
dataset horizontally, but remember to first filter the dataset to remove any energy
above the frequency at which spatial alias occurs, else you will distort the data.
Qestimation
Qestimation calculates Q, a measure of seismic energy absorption.
Symmetric spectrum
At present all frequency domain data is assumed to have a symmetric spectrum.
This is a useful characteristic of “real” seismic data that allows us to reduce the
size (and therefore disk storage and execution time) of a complex frequency
domain trace by 50%. The negative frequencies are easily obtained as required.
However, this practice prohibits the use of complex time domain signals, so may
be discontinued in a future release.
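The saving comes from the conjugate symmetry of the spectrum of a real signal, which can be demonstrated with numpy (illustrative only):

import numpy as np

trace = np.random.randn(1024)          # a real time-domain trace
full = np.fft.fft(trace)               # full complex spectrum
half = np.fft.rfft(trace)              # positive frequencies only (N/2 + 1 values)
# Negative frequencies are the complex conjugates of the positive ones, so the
# half-length spectrum carries the same information as the full spectrum.
print(np.allclose(full[1:], np.conj(full[1:][::-1])))
print(np.allclose(np.fft.irfft(half, n=len(trace)), trace))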
Wave-number domain
The following operators work in FK space.
FKfilter
Apply your own arbitrarily shaped spatial filters in FK space.
Manipulation operators
This group of operators provides the means of sorting, splitting or combining a
dataset. They (generally) work in any domain, but if two datasets are to be
combined, they must be compatible.
Sort
Borehole seismic data traces are rarely acquired in strict depth order. The Sort
operator re-orders the input traces according to a required parameter.
Select
This operator is used to extract a subset of the input traces.
Append
To merge two datasets into one, first use Append followed by a Sort.
Prune/Graft
This pair of operators is used to extract or recombine a twig (e.g. component)
from a dataset.
Stack
If you need to stack raw data, this is the answer.
StackAll
Generate a corridor stack or use to duplicate single traces.
Mark
Mark traces within a range of dataset database values.
Resample
Change sample interval in a more general way than Fexpand and Fcollapse.
Convenience operators
Some operators have been provided which encapsulate a common sequence of
operations into a single operator. It is perfectly possible to perform the same
function with a sequence of primitive operators, but the processing flow is
usually easier to understand if these “convenience” operators are used.
Deconvolve
The deconvolution process is actually performed in the frequency domain for
computational efficiency. Note that this process is deconvolution in the strict
mathematical sense (i.e. the dual of division in the frequency domain).
Tfilter/NotchFilter
Filters with time domain input and output may be required at several stages of a
borehole seismic processing flow. These operators actually perform the filtering
in the frequency domain.
Correlate
Internally the correlation process is performed in the frequency domain for
computational efficiency.
Imaging operators
Imaging operators are available which can be used together to image-reconstruct
an offset source dataset. Any survey configuration can be processed.
RayTrace
Use RayTrace to 3D ray-trace through a horizontally layered velocity model,
using source and receiver positions from a dataset database. Rays and reflection
point loci can be displayed in three dimensions using WellView.
NMO
Using information generated by RayTrace, convert datasets trace by trace from
one-way time samples to two-way time or depth samples.
Bin
Reconstruct the output traces from NMO into vertical bin lines, producing a
vertically binned offset source dataset.
BinImport
Import bin coordinates.
Migrate
Migrate an NMO dataset.
RayProx
Proximity processing, e.g. salt proximity. RayProx uses arrival time and
direction to image the flank of a high velocity near vertical structure.
Polarize/Rotate/Hrotate/ToolOrientate
Orientate and rotate three component datasets.
RayForm
Apply time variant rotation to multiple component datasets. The primary purpose
of RayForm is wave-mode separation of up going wave fields.
Miscellaneous operators
These operators are not related.
Transpose
Turn your VSP into a form suitable for comparison to surface seismic.
Profile
Generate a time against depth profile.
Invert
Convert your VSP into something that a geologist will recognise.
Overlay
Overlay two datasets, e.g. VSP over surface seismic.
PickAmplitude
Find pick amplitudes for use with WellView Attribute option.
Synthetic
Generate a synthetic seismogram from velocity or impedance logs.
ACQinput
ACQinput imports a stacked dataset as generated by the ACQ data acquisition
software.
Requirements
All information required for processing and static corrections should be present
in the dataset database. It is assumed that a satisfactory corrected time against
depth graph has been obtained and that any traces to be excluded from VSP
processing have been de-selected.
Description
The ACQinput operator imports a stacked, timed, sorted and edited MIRF dataset
as produced using the ACQ data acquisition software.
Vertical depths and receiver coordinates are extracted from a welltrak database if
this is present. If no welltrak database is found, a vertical well is assumed. If a
Welltrak database is present, wellhead UTM coordinates are set in the output
dataset database.
The welltrak database must be called well.welltrak. If a well.welltrak file is not
found in the VSP job folder, the input path is searched and any well.welltrak file
is copied to the VSP job folder.
ACQinput is especially useful for rig-site processing because of its close
integration with the acquisition software. For a more flexible method of
importing any MIRF data, see the MIRFinput operator.
Information carried in the dataset database includes:
External reference delay (i.e. RSS channel delay)
Source coordinates relative to wellhead (SCX & SCY)
Receiver coordinates relative to wellhead (RCX & RCY)
Measured depth relative to reference elevation
True vertical depth relative to datum
Time correction from source to datum (Ts).
ACQinput -Parameters
Path
Use the “PATH” button to browse for the pathname of the acquisition job, e.g.
“C:\Jobs\Testhole”.
Dataset
Specify which dataset is to be imported.
The stack files and database (dset.cdb) that comprise the dataset are expected to
reside in a dset_nnn folder under the job folder.
Output samples
Input a suitable number of output samples depending upon the sample interval
and the total depth of the well. With a one-millisecond sample interval, 4096
samples are usually adequate.
For deep targets it may be necessary to select 5000 samples, but be aware that
this triggers 8192 point FFTs, which increases processing time and disk storage
requirements.
Reverse polarity
You can opt to reverse the polarity of either geophone or reference channels.
Most CGG geophone systems (ASR, BSR, GCH, SST500) are wired SEG
REVERSE so that the first arrival is a down-break. You should reverse the
geophone polarity when importing data recorded with these systems.
To conform to the industry standard SEG convention for VSP polarity, the first
arrival should be positive going, appearing as a black right-side peak on the
display.
Hydrophone signals should have a negative-going first arrival.
Descriptors
Select which descriptors (components) are to be imported as twigs.
Remember that processing is performed on all imported twigs, so don't import
horizontal components if they are not needed.
Geophones
This option might be used for example to exclude traces from a geophone known
to be faulty.
Add
Add forms the arithmetic sum of two datasets.
Requirements
Datasets must have the same sample domain.
Datasets must have the same trace domain.
Datasets must have the same number of twigs.
Datasets must have the same number of traces.
Datasets must have the same number of samples.
Datasets must have the same sample interval.
Considerations
The output dataset derives its database from input dataset A.
Description
Produces the arithmetic sum of two datasets. Addition is performed sample-by-
sample, trace-by-trace and twig-by-twig. The requirement that both input
datasets must have the same sample domain implies that complex numbers may
not be added to non-complex numbers.
Parameters
None.
AGC
AGC applies “automatic gain correction”, effectively compressing the dynamic
range of the dataset, helping to make some details stand out more clearly. But
AGC is a non-linear process and generally best avoided in a processing route.
Requirements
Dataset must be in the time domain.
Description
For each twig and each trace an automatic gain correction is applied. An RMS
value is calculated over a moving Window specified in ms. Gain is calculated by
normalising this RMS value to the maximum magnitude in the trace. The
Maximum applied gain parameter controls the size of the gain applied. Gains
are ramped between neighbouring windows.
WARNING
AGC distorts relative amplitudes within a trace, but it may be used after the final
stage of processing, perhaps for comparison with surface seismic data.
AGC – parameters
Window (ms)
This defines the length in ms of the moving window.
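A minimal sketch of a moving-window RMS AGC along these lines, assuming Python/numpy (the exact windowing and ramping used by the AGC operator may differ):

import numpy as np

def agc(trace, dt_ms, window_ms, max_gain):
    # Compute an RMS value per window, normalise it to the trace maximum
    # magnitude, limit the result to max_gain and ramp (interpolate) the gains
    # between window centres before applying them.
    x = np.asarray(trace, dtype=float)
    n = len(x)
    win = max(1, min(n, int(round(window_ms / dt_ms))))
    centres = np.arange(win // 2, n, win)
    rms = np.array([np.sqrt(np.mean(x[max(0, c - win // 2):c + win // 2 + 1] ** 2))
                    for c in centres])
    peak = np.max(np.abs(x))
    gains = np.minimum(peak / np.maximum(rms, 1e-30), max_gain)
    return x * np.interp(np.arange(n), centres, gains)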
AGIPoutput
Output a VSProwess dataset as a SEG-Y file according to AGIP specifications.
Requirements
Dataset must be in the time domain.
Description
Archive all twigs, traces and samples of a VSProwess dataset into a single SEG-
Y file.
The parameters are self-explanatory and conform to AGIP requirements. The
reference twig is output first, as required by AGIP.
Parameters
None.
Append
Append two datasets side by side.
Requirements
Datasets must have the same sample domain.
Datasets must have the same trace domain.
Datasets must have the same number of twigs.
Datasets must have the same number of samples.
Datasets must have the same sample interval.
Datasets must not have wave-number trace domain.
Considerations
The output dataset derives its database header from input dataset A.
Description
Form one dataset by appending two datasets end to end. The number of output
traces is the sum of the number of traces from the two input datasets.
Parameters
None.
AutoPick
Automatically pick Pick1(primary) and/or reference arrivals.
Requirements
Dataset must be in the time domain.
Description
AutoPick locates peaks or breaks of either polarity which occur within the
specified search window.
Pick1(Primary) arrivals are picked from the first twig with the “VZ” descriptor,
if one is present, otherwise twig1 is used.
Reference arrivals are always picked from a twig with the “REF” descriptor.
A threshold level is specified as a percentage of the maximum magnitude of the
signal contained within the search window.
The search proceeds from the start of the window until the signal exceeds the
specified threshold level.
Depending upon the required pick event the search proceeds either forwards
until the signal level begins to fall again, or backwards (break) until the signal
falls below 10% of the peak amplitude, or forwards (zero crossing) until the
signal crosses zero.
A straight line is defined through amplitudes at 90% and 10% of the peak
magnitude or either side of the zero crossing. The break or zero crossing is
defined where this line crosses zero amplitude.
Picking accuracy is to one-tenth of the sample interval, achieved by polynomial
interpolation between samples.
Failure to locate an event is indicated in a text file that is opened when the output
is displayed and the trace remains unpicked.
Remember to switch on the first arrival curve when viewing the results of this
operator.
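A simplified sketch of the threshold-and-break logic described above, assuming Python/numpy (this omits the event-type options and the sub-sample polynomial interpolation, and the names are hypothetical):

import numpy as np

def pick_break(trace, dt_ms, win_start_ms, win_end_ms, threshold_pct=25.0):
    # Find the first sample in the window exceeding the threshold, walk forward
    # to the local peak, then project a line through the 10% and 90% amplitude
    # points of the rising flank back to zero amplitude to define the break.
    x = np.asarray(trace, dtype=float)
    first, last = int(win_start_ms / dt_ms), int(win_end_ms / dt_ms)
    level = threshold_pct / 100.0 * np.max(np.abs(x[first:last]))
    hits = np.nonzero(np.abs(x[first:last]) > level)[0]
    if hits.size == 0:
        return None                              # no event located
    i = first + hits[0]
    s = np.sign(x[i])                            # polarity of the event
    while i + 1 < last and s * x[i + 1] > s * x[i]:
        i += 1                                   # forward to the local peak
    peak = s * x[i]
    j10 = i
    while j10 > first and s * x[j10] > 0.1 * peak:
        j10 -= 1                                 # back to the 10% point
    j90 = i
    while j90 > first and s * x[j90] > 0.9 * peak:
        j90 -= 1                                 # back to the 90% point
    if j90 == j10:
        return j10 * dt_ms
    slope = s * (x[j90] - x[j10]) / ((j90 - j10) * dt_ms)
    return j10 * dt_ms - s * x[j10] / slope      # where the fitted line is zero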
AutoPick -Parameters
Pick reference arrival
Primary pick parameters
Choose what type of event and polarity to pick.
Specify the limits of the window in which to search for the event.
Specify a threshold level that the signal must exceed for an event to be located.
The threshold is specified as a percentage of the maximum signal magnitude
within the search window.
At shallow levels, the first arrival usually has the maximum magnitude.
However, strong tube wave arrivals for example can have a greater magnitude.
The default threshold of 25% is usually appropriate. Too low a threshold
increases the chance of an incorrect pick caused by noise.
Automatic picking can give misleading results in the presence of early refracted
arrivals, a common phenomenon if, for example, the borehole passes close to the
flank of a salt dome.
Use_magnitude
Check this box if you expect that the first arrival energy does not arrive from the
direction of the VZ component. This may occur with fixed (i.e. non-gimballed)
sensors in a deviated well, or if the source is significantly offset from the
wellhead. AutoPick will calculate and scan the vector magnitude of the trace to
find the first arrival. The signal magnitude is calculated from all the input twigs
except the REF twig and is output as a new MAG twig, which can be displayed
so that picks can be checked and manually adjusted if necessary.
Remember that magnitude is always positive; do not bother to try searching for a
negative peak or break. In addition, because there are no negative excursions in
signal magnitude the search algorithm may become unreliable. This problem can
be alleviated by over-sampling the dataset. (See Fexpand operator).
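The vector magnitude itself is straightforward; for example, in Python/numpy (illustrative only):

import numpy as np

def vector_magnitude(vz, hx, hy):
    # Magnitude trace from the geophone components (the REF twig is excluded).
    # The result is always positive, so search only for positive peaks.
    comps = np.vstack([vz, hx, hy]).astype(float)
    return np.sqrt(np.sum(comps ** 2, axis=0))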
Bin
Reconstruct the output from the NMO operator into a line of vertical bins.
Uses
Use after NMO operator as part of a mapping procedure.
Requirements
Must be an NMO corrected dataset.
XOFF and YOFF twigs produced by the NMO operator must be present.
Description
For each twig, all input traces are searched and reconstructed into vertical bins.
The position of each vertical bin is defined by the coordinates of the bin line
and the offset increment along the bin line from the origin. Each vertical bin
produces one output trace. The bin location is stored in both the target and the
source coordinate locations of the dataset database.
For each sample of each output trace, all input traces are searched for samples
that fall into a disc centred on the sample value and bin coordinate. The radius
of the disc is defined by the bin width. Input sample coordinates are read from
the XOFF and YOFF twigs.
Input trace coordinates, from XOFF and YOFF, can be projected onto the bin
line, if necessary. This is useful when the vertical plane containing the source
and receiver is not the same for all traces. A projection limit can be set so that
only coordinates, which lie within the projection limit distance of the bin, will be
projected. In this way, a grid of vertical bin lines can be produced, although,
each line must be processed separately. A zero projection limit allows all loci
points to be projected.
The output sample can be either the sample closest to the bin position or the
average of all samples found in the bin.
There can be situations where no valid input-samples are found for an output-
sample. This can result in “gaps” within the body of the dataset. This may
happen when reflection point loci are significantly bent. In this case, there is an
option to interpolate in order to improve the spatial sampling of the input dataset.
This is done by creating samples halfway between those samples that span the
bin. Make sure the bin increment is acceptable before resorting to this option.
Display the result from Bin using target coordinates. Display uses the target
coordinates of the first and last trace, as default display line coordinates.
In order to check the bin density, a second twig is output, holding the total
number of samples found for each bin sample. It is best to view this data using
the block colour or block grey display mode, making sure cross normalisation is
selected.
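A simplified sketch of the binning of a single output trace, assuming Python/numpy (illustrative only; array names and shapes are hypothetical):

import numpy as np

def bin_one_trace(data, xoff, yoff, bin_xy, bin_width, average=True):
    # data, xoff and yoff are (ntraces, nsamples) arrays. For every sample
    # index, samples whose (xoff, yoff) coordinates lie within bin_width of the
    # bin coordinate are averaged (or the closest is taken). Also returns the
    # fold, i.e. the number of samples found for each output sample.
    ntraces, nsamples = data.shape
    bx, by = bin_xy
    out = np.zeros(nsamples)
    fold = np.zeros(nsamples, dtype=int)
    for s in range(nsamples):
        d = np.hypot(xoff[:, s] - bx, yoff[:, s] - by)
        inside = d <= bin_width
        fold[s] = np.count_nonzero(inside)
        if fold[s] == 0:
            continue                     # a gap: no valid input sample here
        out[s] = data[inside, s].mean() if average else data[np.argmin(d), s]
    return out, fold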
Considerations
View the reflection point loci to find the position and length of the bin line, the
bin spacing, and whether it is necessary to project the reflection point loci onto
the bin line. This can be done using WellView on the dataset output from
RayTrace. Use the full report from NMO to find accurate coordinates.
Prune off the XOFF and YOFF twigs prior to any processing of the NMO
dataset, such as image enhancement, to avoid distorting the loci. Graft the
dataset back together before Bin.
Parameters
Bin start X
X coordinate at start of bin line, output trace 1, absolute coordinates, in survey
units.
Bin start Y
Y coordinate at start of bin line, output trace 1, absolute coordinates, in survey
units.
Bin end X
X coordinate at the end of bin line, last trace, absolute coordinates, in survey
units.
Bin end Y
Y coordinate at the end of bin line, last trace, absolute coordinates, in survey
units.
Bin width
This defines the width of the bin in survey units and is usually half the bin
increment.
Projection limit
Only input coordinates that lie within the projection limit distance of the bin will
be projected. A zero projection limit allows all input coordinates to be projected.
Average
Output the average of all samples found in the output sample bin. Otherwise,
output the closest sample to the bin center.
Interpolate
Try to fill in the gaps caused by spatial under sampling. Check the bin increment
first.
Project
Project input coordinates, onto the bin line, before binning.
BinImport
Import a list of Bin coordinates to be used by Migrate.
Description
BinImport can be used to import a CSV file list of bin coordinates or a single
straight line of bins can be created using parameters.
In both cases for each output bin, a short zero valued trace is output. Bin X and
Y coordinates are set in SCX, SCY, RCX, RCY and TCX, TCY header
locations.
The CSV file formats are illustrated below
TCX, TCY
m, m
111, 222
333, 444
or
LINE TCX, TCY
n, m, m
1,111, 222
1,333, 444
In the first example, only the compulsory TCX and TCY columns are present.
Line number will be set to zero. In the second example, the line number will be
set to one. Use line number to identify groups of bins that are linked.
The output “dataset” is time domain.
BinImport –Parameters
Use CSV file
Feet
If checked, distances above are assumed to be measured in feet.
ClientShift
Apply a client two-way time shift to a dataset.
Requirements
Dataset must be in the time domain.
Description
This operator is intended to be used to apply a static time shift to final processed
datasets in order to tie the VSP with other time domain client data e.g. surface
seismic. Tcorrected arrival curves and formation tops displayed on a client
shifted dataset will be modified by the client shift.
A common, sub-sample resolution time shift is applied to all twigs and all traces
of a dataset, except when the Client data parameter is selected.
Log data (velocity, density, depth scales) may suffer from edge effects if a
sub-sample shift is applied.
The client shift is stored in the dataset database. The Client data parameter allows
the client shift to be stored in the dataset database without shifting the data
samples. This is useful for displaying VSP arrivals on a surface seismic dataset.
All displays in a composite plot must have the same Client shift value in their
dataset database, even if the data samples have not been shifted. This allows
VSP and Client datasets to be displayed in the same composite.
After the application of ClientShift, dataset database Tcorr times will have half
the client shift added. In particular, the Tcorrectedx2 arrival curve plot will be
shifted by the client shift.
Formation top two-way times, plotted on a composite display with the global
Yscale derived from a ClientShifted dataset, will be modified by the client shift.
Profile or ProfileX executed after ClientShift will have their Ts corrections
modified by the addition of half the client shift. This allows listings of time
versus depth including the client shift.
This operator may also be used to truncate or extend the number of samples in a
dataset.
ClientShift –parameters
Client data
The Client data parameter allows the client shift to be stored in the dataset
database without shifting the data samples. This is useful for displaying VSP
arrivals on a surface seismic dataset and combining VSP and client datasets on a
composite display.
Output samples
Specify the number of output samples. This may be used either to extend the
data with zeros or to truncate the data.
Client shift
A common shift may be applied identically to all traces.
Common T start
The start time for all traces may be set as required. If the dataset is to be supplied
to the client in SEG Y format it is best to make T start zero.
A negative value may be specified to allow the display of data before time zero.
This is useful for viewing two-way time datasets that straddle datum.
A positive value may be specified, for example to discard dead samples before
the first arrival. This is useful to reduce the length of data required for
processing.
Collect
Assemble a collection of data. See also the Append operator.
Requirements
Input dataset must have the same number of twigs and samples as previously
collected data.
Input dataset must be time domain and have the same trace domain and sample
interval as previously collected data.
Description
Unlike most other operators, Collect does not delete the previous dataset each
time it is executed. Instead, the new traces are appended. Collect allows the
accumulation of a large dataset from small segments.
This operator is useful if it is necessary to process a very large dataset with only
limited disk space. For example, correlation and stacking of a large Vibroseis
dataset might be performed for just a few dozen traces at a time.
To clear the accumulated dataset it is necessary to purge the route.
Parameters
None.
Conjugate
Form the complex conjugate.
Requirements
Sample domain must be complex.
Description
Produces the complex conjugate of every sample in a complex dataset.
Parameters
None.
CopyREF
Copy reference traces a specified number of times.
Requirements
Dataset must have the twig descriptor "REF".
Dataset must have only one twig.
Dataset trace domain must be distance.
Description
Replicate reference traces a specified number of times in order to make up the
correct number of reference traces for multiple geophone array surveys.
This is a special purpose operator to allow processing of jobs where, for example
the sample interval of the reference channel is different to that of the geophone
channels. The reference traces can be read in separately, resampled to the
required sample interval, replicated as necessary and finally grafted back to the
geophone components.
CopyREF -Parameters
Number of copies
The number of times each trace will be replicated. Usually the number of
satellites in the receiver array.
Correlate
Correlate all twigs with the reference twig.
Requirements
Dataset must be in the time domain.
Dataset must contain a reference twig.
Dataset must contain at least one non-reference twig.
Description
Correlate all twigs with the reference twig. The auto-correlated (i.e. correlated
with itself) reference twig is produced as a quality control check. The auto-
correlated reference signal should be a symmetrical wavelet centred at time zero.
To allow display of the auto-correlated wavelet, all twigs are delayed (shifted)
by a fixed interval of 100 ms. The reference pick time is set to the same interval
so that the Tstatics operator can automatically correct for this time shift.
Before the reference twig is used for correlation, it is normalised to its total
power, which ensures that correlated signal amplitudes are independent of
Vibroseis pilot sweep amplitude and allows useful signal level comparisons for
different sweep periods.
If reference signal level information is important to your task, this can be
restored by scaling to the peak amplitude of the auto-correlated wavelet.
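As an illustration only (this is not the VSProwess implementation), the
normalisation and correlation of a single trace might be sketched in Python as
follows; the fixed 100 ms delay and the database bookkeeping are omitted.

import numpy as np

def correlate_with_reference(trace, reference):
    # Normalise the reference to its total power so that correlated
    # amplitudes do not depend on the pilot sweep amplitude.
    ref = reference / np.sum(reference ** 2)
    # Cross-correlate the trace with the normalised reference.
    return np.correlate(trace, ref, mode="full")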
Correlate -Parameters
Number of output samples
Specifies the number of samples required in the output dataset.
CSVimport
Import traces into VSProwess from a CSV file.
Requirements
The CSV file must conform to the following requirements.
The first line contains headings; each heading must be a VSProwess twig
descriptor, or one of TVD, VINT or VALUE.
The first column must be one of: TIME, TWT, DEPTH, TVDSD or TVD.
First column values must either increase linearly, i.e. by a constant amount from
line to line, or increase or decrease non-linearly.
Where applicable, the second line must contain the measurement units of each
column. Recognised units are m, ft, s, ms, m/s, ft/s, us/ft and us/m; they apply to
the DEPTH, MD, TVD, TVDSD, TIME, TWT, TT, velocity, DT and VINT
columns. At present, other units (e.g. gm/cc) are ignored.
The third line starts the data samples.
CSV file interpretation
The first column of the CSV file defines the sample domain, distance or time.
The difference between the first and second data sample of the first column
defines the sample interval. If column one indicates non-linear sampling, the
sample interval will be set to zero and VSProwess will then use the first column
as a sample map.
The default user measurement units are defined by the TVDSD, TVD or MD
column. If none of these is present, measurement units are metres.
If the first column heading is TVD or DEPTH, the reference elevation will be
subtracted from all samples before further interpretation.
Other columns can be any twig descriptor (see appendix), VINT or DT.
VALUE is a special column heading; the twig descriptor for a VALUE column
is taken from the name of the input file.
The output dataset will contain either multiple twigs each with one trace, or one
twig with multiple traces. If all column headings, except column one, have the
same twig descriptor, the output dataset will contain one twig and multiple
traces. Otherwise, the output dataset will contain multiple twigs with one trace
each. The exception to this is velocity: there can be two velocity twigs. This
is useful when comparing calibrated and un-calibrated velocity logs.
If CSVimport reads a blank data value it will assume there are no more data
values of interest in any column. Blank data values should be replaced by a null
value of –999.25. Null values are not displayed.
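For example, a minimal CSV file with a linearly sampled depth column and an
interval velocity column might look like the following (the values are
hypothetical):

TVDSD,VINT
m,m/s
1000.0,2450.0
1005.0,2480.0
1010.0,2515.0

Here the sample interval is 5 m, taken from the difference between the first two
values of column one.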
Uses
CSVimport may be used to import log information such as velocity and density
for display. It may also be used to import seismic traces from a text file.
CSVimport –Parameters
CSV file
Full path and filename of CSV file.
Wellname
Well-name to be used for display annotation.
Client
Client name to be used for display annotation.
Contractor
Contractor name to be used for display annotation.
Reference elevation
This is the reference elevation, e.g. the elevation of the KB. It is used in
conjunction with a TVD column to correct the depths to seismic datum. The
value is set in the dataset database for use by the display. The measurement
units of this parameter are those of the TVD column if one is present;
otherwise, the default measurement units are assumed.
CVLcalibrate
Calculate a calibration/drift curve by comparing check shot times from input A
with integrated continuous velocity log times from input B. Best fit lines
between knee points are found and saved in …<job>/logwork folder/CVLcalibrate<n>,
where n is the opid of the CVLcalibrate operator.
Knee points can only be selected and changed when a linked pair of
CVLcalibrate and CVLshift operators are both processed. Default plot
parameters for calibration are triggered by such a linked pair. Add knee points
through the context menu. Change knee points by dragging. The start and end of
the continuous velocity log are immovable knee points.
Continuous velocity log samples have to be related to TVDSD. See the LASimport
operator. If the log is depth sampled and no TVDSD twig is present, samples are
assumed to be TVD, and the well reference elevation (WRE) and seismic datum
elevation (SDE) are used to reference depths to seismic datum.
See also CVLshift.
This operator is still under development. Request more information from
[email protected].
CVLshift
Use calibration information from …<job>/logwork folder/CVLcalibrate<n>,
where n is the opid of the CVLcalibrate operator linked to the A input of the
CVLshift operator, to apply a calibration to the continuous velocity log linked to
the B input. The B input for CVLshift and CVLcalibrate must be connected to
the same operator. A TWT (two-way time) twig is generated.
See also CVLcalibrate.
Use Depth2Time operator to convert from depth to time and Resample to make a
linear time scale.
Use context menu Export to export log data in CSV or LAS format.
DBupdate
Modify a VSProwess dataset database using information supplied by parameters
or within a comma separated value (CSV) file.
Requirements
Input dataset must be time domain.
Description
The input dataset is copied to the output dataset without change. However, the
output database trace header values are modified according to the information
supplied by parameters or in the CSV file.
Well, contractor and client name can be entered using parameters.
A welltrak file can be used to update TVD, RCX, RCY, TOOLAZ and
TOOLINC database locations using MD as a key. Wellhead UTM values will
also be set. However, if Downhole source is selected, it is assumed that the
current source depth is the source measured depth below reference level on the
selected welltrak. Source depth, SCX and SCY will be set using the selected
welltrak, but will be relative to the wellhead coordinates of the input dataset
database. Use Wellview to verify coordinates.
The user also has a choice of three database locations into which to set a value
for all traces in the database. Parameter updates are intended for information that
applies to all traces and may not be available from the recorded dataset.
If more complicated database editing is required the user can use a CSV file.
Only the header values specified in the CSV key row are modified. All other
trace headers are copied across to the output database without change.
“REC”,”REC”,”SCX”,”SCY”
“”,””,”m”,”m”
1,322,100.4,-300.4
323,655,130.4,-150.2
656,1004,144.2,-160.4
The double “primary key” in this example is a special construct, which allows us
to modify all of the traces having a primary key falling within the specified
range, in this example the original field record number.
The second line specifies the units and overrides the unit parameter.
After importing this simple file with DBupdate we will have modified the source
coordinates for all traces extracted from 1,004 original field records. If this
survey were recorded using a 16 level array, then we would have modified 1,004
x 16 = 16,064 traces.
It is also possible to use the ExportDB tool in conjunction with DBupdate to
preserve interactive trace editing, instead of using SEGYoutput.
Database keys
Values are assigned to output database locations using the pre-defined keys as
listed below. All times specified in the CSV file must be in seconds, all distances
are in the units specified by the distance units parameter.
The CSV file must contain a heading row used to attach a key to each column of
data. There must be a key for every column of data.
The first column defines the “primary key” which controls which traces are
going to be modified. The primary key field cannot be blank. If the primary key
also appears in the second column, then we will match all traces having a
primary key value anywhere within the range defined by the two values of
primary key. In addition, if the second value for the primary key is blank, it is
taken to be the same as the first value.
If a column value for any key other than the primary key is blank, then that value
is taken instead from the input dataset database, i.e. the value is copied from
input to output without change.
Valid keys
TR, VSProwess trace (can only be used as a primary key)
For other valid keys see the appendices.
Units
Valid units are m (metres), ft (feet), m/s, ft/s, s (seconds) and ms (milli-seconds).
DBupdate –parameters
CSV file
A check box to specify whether the CSV file is being used. If not checked any
CSV file pathname will be ignored.
Identification
If checked wellname, client and contractor information will be copied to the
dataset database.
Welltrak
If checked, the specified welltrak file will be used to update TVD, RCX, RCY,
TOOLAZ and TOOLINC dataset database locations using MD as a key to the
welltrak file. Wellhead coordinates from the welltrak wellhead entry will be
copied to the dataset database.
DCsubtract
Calculate and remove a DC offset.
Requirements
Dataset must be in the time domain.
Dataset should be unfiltered.
Dataset should contain as many recorded samples as possible.
Description
It is assumed, that in the absence of DC, the sum over a large number of samples
will be zero. Hence, the average value found in a trace is assumed to be the DC
offset.
The calculation is performed for each trace and each twig independently. An
average sample value is calculated, for all non-zero samples in a trace. The
output trace is calculated by subtracting the average value from all non-zero
samples.
Zero values are omitted from the calculation, as it is extremely unlikely that real
data will contain any samples at all whose value is precisely zero, except in the
regions removed by the blanking operations.
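A minimal sketch of the per-trace calculation, in Python (illustrative only, not
the VSProwess code):

import numpy as np

def dc_subtract(trace):
    nonzero = trace != 0.0
    if not np.any(nonzero):
        return trace                     # nothing to correct on a dead trace
    dc = trace[nonzero].mean()           # average of the non-zero samples
    out = trace.copy()
    out[nonzero] -= dc                   # blanked (zero) samples are left untouched
    return out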
Deconvolve
Deconvolve dataset A with dataset B.
Requirements
Datasets must both be in the time domain.
Datasets must have the same number of twigs or else dataset B must have only
one twig.
Datasets must have the same number of traces or else dataset B must have only
one trace.
Datasets must have the same number of samples.
Datasets must have the same sample interval.
Considerations
The output database is derived from input dataset A.
Output start time is derived from the difference in input A and input B start
times. This attempts to correct the “time shift” caused by spectral division by a
delayed signal, but this will only be successful if dataset B is aligned so that the
first arrival is at zero time with a negative start time. Aligning dataset B other
than this will require an extra Tshift to be added to the route.
Description
This operator performs the deconvolution of dataset A with dataset B. The
deconvolution is performed trace-by-trace and twig-by-twig except for two
special cases. If dataset B has only one twig then all twigs of dataset A are
deconvolved with the same twig. If dataset B has only one trace then all traces of
dataset A are deconvolved with the same trace.
The deconvolved data may be band-limited using the integrated zero-phase
band-pass filter.
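One common way to implement stabilised spectral division is sketched below in
Python. It is illustrative only: the exact placement of the stabilisation noise, the
blanking window and the start-time bookkeeping in VSProwess may differ.

import numpy as np

def deconvolve(a, b, noise_percent=5.0):
    # Transform both traces to the frequency domain (equal lengths assumed).
    A = np.fft.rfft(a)
    B = np.fft.rfft(b)
    # Stabilising term: a fraction of the average spectral power of B.
    eps = (noise_percent / 100.0) * np.mean(np.abs(B) ** 2)
    out = A * np.conj(B) / (np.abs(B) ** 2 + eps)
    return np.fft.irfft(out, n=len(a))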
Deconvolve -Parameters
Blank dataset B from (ms)
A deconvolution window period may be specified beyond which Dataset B is
blanked.
Stabilisation noise (%)
A certain amount of noise, typically 5%, must be added to dataset A in order to
stabilise the result of the deconvolution.
Deglitch
Remove glitches due to telemetry errors.
Requirements
Dataset must be in the time domain.
Dataset must be unfiltered.
Description
Borehole digital telemetry systems may sometimes introduce the occasional
spike or “glitch”. Filtering is not effective against such glitches because they
contain aliased frequencies. This operator attempts to identify the location of
each glitch and replace it with a new value interpolated from the surrounding
samples. Isolated glitches are usually completely removed by this process.
Glitch detection is optionally carried out in two ways.
Firstly, glitches may be identified by having magnitudes above the expected
maximum seismic magnitude.
All samples, whose magnitude is greater than the Maximum seismic magnitude
parameter, are replaced by values, interpolated from the adjacent samples. If the
adjacent samples are also identified as glitches, the glitch sample is replaced by a
zero value.
Secondly, they may be identified by relying on the fact that the glitch almost
certainly contains higher frequencies than the surrounding seismic signal.
The algorithm first applies a high pass filter to reject the underlying seismic
signal. The maximum magnitude in the trace is located. If the magnitude of the
sample is n times greater than the RMS amplitude in a window around the sample,
it is assumed to be a glitch. The located glitch sample values on the input trace are
replaced by values interpolated from the adjacent samples. The algorithm is
repeated upon the modified trace, up to the specified maximum number of
searches, or until no more glitches are found.
A list of the glitches removed is generated.
Limitations
Adjacent glitches may not be completely removed. The process works best if the
seismic energy is concentrated in the lower half of the available bandwidth,
which is usually the case.
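The magnitude test might be sketched as follows in Python (illustrative only; the
treatment of trace edges here is an assumption):

import numpy as np

def deglitch_by_magnitude(trace, max_magnitude):
    out = trace.copy()
    bad = np.abs(out) > max_magnitude
    for i in np.where(bad)[0]:
        left, right = i - 1, i + 1
        if left >= 0 and right < len(out) and not bad[left] and not bad[right]:
            out[i] = 0.5 * (out[left] + out[right])   # interpolate from neighbours
        else:
            out[i] = 0.0    # adjacent samples are also glitches (or the trace edge)
    return out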
Deglitch –parameters
Detect glitches by magnitude
Window length
The number of samples around the glitch sample, on the filtered trace, that are
to be used to calculate an RMS value.
Threshold ratio
A glitch is detected when the ratio of the filtered glitch sample value to the RMS
value around the glitch sample is greater than this value.
Full report
Output a more detailed report of detected glitches.
Depth2Time
Convert from depth to time domain using TWT twig.
This help topic is still under development.
Designature
Special purpose operator to simplify source signature deconvolution.
Requirements
Dataset must contain one reference twig ("REF" descriptor).
Dataset must contain at least one non-reference twig.
Dataset must be in F-X space.
Considerations
The Reference twig is not propagated.
Reference picks are zeroed.
Description
The Designature operator attempts to collapse the source signature from the
down-hole data. The reference twig must be the source signature spectrum. The
process is applied in the frequency domain by division of the down-hole
spectrum with the source signature spectrum. A certain amount of noise must be
added to the denominator in order to stabilise the result. Division of each non-
reference twig with the reference twig is performed sample-by-sample and
trace-by-trace.
After Designature, the phase delay of the recorded source signature should have
been removed from the data. The database is therefore modified by subtracting
the Reference pick time from non-zero pick times. The Reference pick time is
reset to zero.
Designature -Parameters
Stabilising noise
Specifies the amount of stabilising noise to be added to the denominator. The
noise is specified as a percentage of the total signature power. A typical value
used is 5%.
Deskew
Remove inter-channel skew introduced by a supported acquisition system.
Requirements
Dataset must be in the time domain.
Dataset must hold original record channel in the dataset database.
Description
This operator will remove inter-channel skew.
Deskew –parameters
Geophone system
Select required geophone system. Only the Delta geophone system is supported
at present.
Divide
Divide dataset A by dataset B.
Requirements
Datasets must have the same sample domain.
Datasets must have the same trace domain.
Datasets must have the same number of twigs or else dataset B must have only
one twig.
Datasets must have the same number of traces or else dataset B must have only
one trace.
Datasets must have the same number of samples.
Datasets must have the same sample interval.
Considerations
The output database is derived from input dataset A.
Description
Division of two datasets. The division is performed sample-by-sample, trace-by-
trace and twig-by-twig, except that if dataset B has only one twig, then all twigs
of dataset A are divided by the same twig. This is useful for signature
deconvolution.
In addition, if dataset B has only one trace then all traces of dataset A are divided
by the same trace. This is useful for wavelet deconvolution.
A certain amount of white noise must be added to the denominator in order to
stabilise the result (prevent division by very small numbers). The requirement
that both input datasets must have the same sample domain implies that complex
numbers are never divided by non-complex numbers and vice versa.
Divide -Parameters
Stabilising noise
Specifies the amount of stabilising noise to be added to the denominator. The
noise is specified as a percentage of the total power in input trace B. The typical
value used is 5%.
ELFoutput
Output a VSProwess dataset as a SEG-Y file according to the Elf Aquitaine
specification.
Requirements
Dataset must be in the time domain.
Description
Archive all twigs, traces and samples of a VSProwess dataset into a single SEG-
Y file. You should refer to the current ELF specification and enter all required
information using the card parameters provided.
Parameters
None.
Enhance
Spatially enhance data alignments.
Requirements
Dataset must be time domain.
Considerations
Input dataset must be sorted into a trace order appropriate to the required
mapping parameter.
Dips must be specified relative to the required mapping parameter.
Description
This operator uses a median operator to spatially enhance data alignments within
a specified range of dips. Each output sample is derived by analysing data from
spatially adjacent samples. Several methods are available to determine the value
of each output sample. A range of dips may be analysed, or "searched". The
noise rejection option limits the range of dips that may be "accepted".
Because Enhance operates on a dataset in order of trace number, it is imperative
that the dataset should be sorted into an appropriate order for the required
mapping parameter. This may be achieved using the Sort operator. It does not
matter if the mapping parameter increases or decreases with trace order, but the
relationship must be monotonic.
A VSP may be acquired at varying depth intervals. For example, the deepest
section of a well might be surveyed with fifteen-metre geophone spacing, with
the rest of the well surveyed at thirty-metre spacing. In order to cope with such a
dataset Enhance does not require the mapping parameter increment to be
constant from trace to trace. For each output trace, Enhance uses all traces for
which the mapping parameter falls within the specified range.
The VSProwess display system provides an interactive dip-picking mechanism
that makes it easy to choose the required dip range. However, remember that the
numerical value of a dip depends upon the selected horizontal mapping
parameter, so you must remember to display the dataset against the same
mapping parameter.
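For the simplest case, enhancing horizontal alignments only, the core of the
calculation can be sketched as follows in Python; the dip search, semblance
options and edge compensation are omitted (illustrative only):

import numpy as np

def enhance_horizontal(traces, params, param_range):
    # traces: 2-D array (n_traces, n_samples); params: mapping parameter per trace
    out = np.empty_like(traces)
    half = param_range / 2.0
    for i, p in enumerate(params):
        members = np.abs(params - p) <= half    # traces within the parameter range
        out[i] = np.median(traces[members], axis=0)
    return out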
Enhance -Parameters
Parameter
Specifies which database variable is to be used as the mapping parameter. For a
zero offset VSP in a vertical well, the mapping parameter is usually vertical
depth. For a vertical incidence VSP in a deviated well, a more useful mapping
parameter may be receiver offset.
Units
Select either imperial or metric units for the parameter range and dip values.
Parameter range
The parameter range is the horizontal range over which input data samples
contribute to output data samples. This is roughly equivalent to defining the
order of a conventional median filter.
The parameter range is defined in mapping parameter units. The parameter range
must be less than the horizontal extent of the dataset and must be longer than the
average horizontal trace separation. For example, suppose a vertical well is
surveyed at thirty-metre intervals, the mapping parameter is vertical depth and
the range is specified as three hundred metres. In this case, Enhance applies a
median filter
using eleven input traces centred upon the output trace.
Edge compensation
Two methods are available for the handling of edge traces.
The "Dead" method pads out the median operator with dead (zeroed) traces.
The "Repeat" method simply repeats the nearest output trace that is calculated
from the full complement of input traces. This method should only be used with
a horizontal operator (search dips from 0 to 0) and is useful for down-wave
enhancement.
Enhance dips
If selected the remainder of the parameters will be used to enhance dipping
arrivals. Otherwise, only horizontal alignments will be enhanced.
Method
Select the method used to calculate an output sample from the adjacent samples
over the "search" range of dips.
• Maximum median: this method uses the median value of the dip with the
highest median magnitude.
• Unweighted semblance: a "semblance" of one represents a perfect alignment; a
semblance of zero defines a random alignment. This method uses the median
value for the dip with the highest semblance.
• Weighted semblance: the median value of the dip with the highest semblance is
scaled by the semblance value itself. This has the effect of reducing the
amplitude of a data alignment having a poor semblance value.
The weighted semblance is often the best option for dip enhancement because
the amplitude of each resulting data alignment provides an indication of the
confidence level that may be attached to that alignment.
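For reference, a conventional definition of semblance over a window of samples
gathered along a candidate dip is sketched below; the exact formulation used by
Enhance may differ.

import numpy as np

def semblance(samples):
    # samples: 2-D array (n_traces, n_window_samples) gathered along one dip
    stack = samples.sum(axis=0)
    numerator = np.sum(stack ** 2)
    denominator = samples.shape[0] * np.sum(samples ** 2)
    return numerator / denominator if denominator > 0.0 else 0.0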
Equalise
Equalise amplitudes between traces.
Requirements
Dataset must be time domain.
The RMS magnitude in the specified window must be greater than zero.
Description
Each trace in the selected "master twig" is scaled so that the RMS magnitude
becomes equal to the required value. All other twigs are scaled by the same
factor so that the relative amplitude between twigs is unaffected.
The scaling factor applied to each trace is saved in the database for later use by
the Unequalise operator.
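A simplified sketch of the scaling, in Python (illustrative only; window indices
are in samples rather than milliseconds):

import numpy as np

def equalise(twigs, master, start, length, required_rms):
    # twigs: dict of 2-D arrays (n_traces, n_samples); master: key of the master twig
    out = {name: data.copy() for name, data in twigs.items()}
    factors = []
    for tr in range(twigs[master].shape[0]):
        window = twigs[master][tr, start:start + length]
        factor = required_rms / np.sqrt(np.mean(window ** 2))  # window RMS must be non-zero
        factors.append(factor)
        for name in out:
            out[name][tr] *= factor      # same factor applied to every twig
    return out, factors                  # factors would be saved for Unequalise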
Equalise -Parameters
Window start
Specify in milliseconds the start of the window used to calculate the RMS value.
Window length
Specify in milliseconds the length of the window used to calculate the RMS
value.
Master twig
Specify the twig from which RMS values are calculated, usually the twig with
the best signal to noise ratio.
Required magnitude
Specify the required RMS magnitude.
EventDetect
EventDetect is designed to scan long record files for events. Any detected events
are output to a new and hopefully much more compact set of files. EventDetect is
intended to perform the initial data reduction stage of passive monitoring
analysis. If used after MIRFinput, EventDetect should produce the same output
as the MIRFevents utility, see ACQ manual.
Requirements
Dataset must be time domain.
VZ, HX or H1 and HY or H2 twigs must be present.
Traces must be grouped by Record numbers.
Not every input record may have an event.
Input records may have more than one event.
Limitations
A maximum of 200 receivers is allowed.
Description
Marked traces are ignored. For each input record, the lowest RMS value over a
running window, defined by the event windows, is calculated from the
magnitude of all three components. This lowest RMS value is assumed to be the
background noise value for that receiver.
For each receiver, a running short-term RMS value over the RMS window is
calculated from the magnitude of all three components. The search for events
then proceeds as follows (a simplified sketch follows the list below).
• For each requested receiver, short-term RMS values are searched to
find a valid event. An event has an RMS value greater than the
background RMS * event threshold/100.
• The number of receivers with events within the event window is
found. A valid event must appear on at least the user specified
number of receivers.
• If the event is invalid, the search restarts after the shortest event
time.
• If the event is valid a record is output starting at the shortest event
time – a third of the output duration. The record header is copied
and the following output values are changed. Per receiver event
“Pick1” times are set. The time from the start of the record being
searched to the time at the start of the output record is added to the
microsecond header location. The original field record is set in the
stack id header location. Data length values are updated.
• The search moves on so that there is no overlap between output
records.
• When the entire record has been searched, the next record is
processed.
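A much simplified sketch of the detection test for a single receiver is given
below in Python. It is illustrative only: record assembly, the multi-receiver
validation and the construction of output records are omitted, and the definition
of the background RMS is an assumption.

import numpy as np

def detect_samples(magnitude, rms_window, event_threshold_percent):
    # magnitude: 1-D array of three-component magnitudes for one receiver
    kernel = np.ones(rms_window) / rms_window
    short_rms = np.sqrt(np.convolve(magnitude ** 2, kernel, mode="same"))
    background = short_rms.min()         # lowest RMS taken as the background noise
    threshold = background * event_threshold_percent / 100.0
    return np.where(short_rms > threshold)[0]   # sample indices flagged as possible events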
EventDetect -Parameters
Event window (seconds)
To be a valid event, events on receivers within a record must lie within this time
window i.e. longest event time - shortest event time < event window.
Event Threshold %
Event threshold defines the percentage of the lowest RMS value that defines an
event i.e. an event threshold of 200 specifies that an event RMS must be at least
twice the smallest RMS. The lowest RMS value is described above.
Output Magnitude
A twig containing the short-term RMS magnitudes can optionally be output.
This may help decide detection parameters.
EventLocate
Locate passive seismic events. This operator is still under development.
Fbandpass
Apply a zero-phase band-pass filter.
Requirements
Dataset must be in the frequency domain and the highest specified frequency
point must be less than the nyquist frequency for the dataset.
Description
Filter data with a zero-phase band-pass filter. The filter is specified as a four-
point template with linear ramps.
Fbandpass -Parameters
F1
Below this frequency samples are set to zero.
F2
At F2 data is unchanged by the filter. Between F1 and F2 the frequency samples
are attenuated linearly. F2 must be higher frequency than F1.
F3
Between F2 and F3 samples are unchanged by the filter. F3 must be higher
frequency than F2.
F4
Above this frequency samples are set to zero. Between F3 and F4 the frequency
samples are attenuated linearly. F4 must be higher than F3. F4 must be less than
half the Nyquist frequency.
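Taken together, F1 to F4 define a trapezoidal amplitude template that is applied
to the frequency samples. A minimal sketch in Python (illustrative only,
assuming a one-sided real-signal spectrum):

import numpy as np

def bandpass_template(freqs, f1, f2, f3, f4):
    # Piecewise-linear template: 0 below F1, ramp up to 1 at F2,
    # flat to F3, ramp down to 0 at F4, 0 above F4.
    return np.interp(freqs, [f1, f2, f3, f4], [0.0, 1.0, 1.0, 0.0], left=0.0, right=0.0)

# Usage: spectrum *= bandpass_template(freqs, f1, f2, f3, f4)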
Fbandstop
Apply a zero-phase band-reject (notch) filter. The filter is specified as a four-
point frequency template with linear ramps.
Requirements
Dataset must be in the frequency domain. Sample interval is unimportant except
that the nyquist frequency must be greater than the highest specified frequency
template point.
Description
Filters all data with a zero-phase notch filter.
Fbandstop -Parameters
F1
Frequencies below F1 are unchanged.
F2
At frequency F2, the data is attenuated by the specified rejection. Between F1
and F2, the attenuation varies linearly with frequency. F2 must be higher
frequency than F1.
F3
Between frequencies F2 and F3, the data is attenuated by the specified rejection.
F3 must be higher frequency than F2.
F4
Frequencies above F4 are unchanged. Between F3 and F4, the attenuation varies
linearly with frequency. F4 must be higher than F3 and less than half the Nyquist
frequency.
Rejection
Specify the attenuation in decibels to be applied to the reject band (notch).
Fbutter
Apply a Butterworth filter.
Requirements
Dataset must be in the frequency domain.
Description
The Butterworth filter is useful because it yields a minimum phase response that
may sometimes provide a better match to surface seismic processing.
Fbutter -Parameters
Filter type
Select the filter type to be Low-pass, High-pass or Band-pass.
Roll-off slope
The roll-off slope determines the order of the filter characteristic to be applied.
For example, the roll-off slope for a second order filter is 12 dB/octave.
Fcollapse
Resample down, or double the sampling interval of a dataset by halving the
number of frequency samples.
Requirements
Dataset must be in the frequency domain.
To avoid alias, the dataset must not contain any frequencies greater than half the
nyquist frequency for the input sample interval.
Description
Halves the number of frequency domain samples in a dataset by dropping the top
half of the spectrum. If the dropped frequency components contained no energy,
then no signal information is lost by this process.
The most common use for this operation is to "decimate" an over-sampled
dataset. Over-sampling is a data acquisition technique used to increase the
effective dynamic range of a digitised signal by spreading the quantisation noise
over a much wider bandwidth than that occupied by the signal. Filtering down to
the signal bandwidth reduces the total quantisation noise. The filtered signal may
then be "decimated" to an appropriate sampling interval.
Parameters
None.
Fexpand
Resample up, or double the number of samples in a frequency domain dataset.
Requirements
Dataset must be in the frequency domain.
Description
Doubles the number of frequency domain samples in a dataset by adding zeroed
high frequency samples. After transformation back to the time domain, this
process results in a halving of the sample interval.
The most common use for this operation is to improve the appearance of a
display, particularly if the dataset contains signal energy near to the nyquist
frequency. For example, a dataset sampled at two milliseconds may contain
useful signal energy up to about two hundred hertz. However, with just one or
two samples per "wiggle" it would be very difficult to interpret such signals.
By resampling a couple of times, we may obtain a much clearer display. It is
important to understand that the signal quality and bandwidth are unchanged; we
have simply interpolated additional samples.
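The equivalent trace-level operation can be sketched as follows in Python
(illustrative only; within VSProwess the dataset is already in the frequency
domain, so only the zero-padding step applies):

import numpy as np

def fexpand_trace(trace):
    # Interpolate a trace to twice the number of samples by zero-padding its
    # spectrum; the signal bandwidth is unchanged (even sample count assumed).
    n = len(trace)
    spectrum = np.fft.rfft(trace)
    padded = np.concatenate([spectrum, np.zeros(n // 2, dtype=spectrum.dtype)])
    return np.fft.irfft(padded, n=2 * n) * 2.0   # rescale for the doubled length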
Parameters
None.
FKtoFX
Transform a dataset from the FK domain into the FX domain.
Requirements
Dataset must be in FK domain.
Considerations
The output dataset derives trace database information from the data processed in
the closest connected FXtoFK operator.
Description
Transforms a dataset from the FK domain to the FX domain.
Parameters
None.
FKfilter
Apply an FK filter template.
Requirements
Dataset must be in FK domain.
A polygon template file must be present in the input dataset folder.
Description
Applies an FK filter polygon template as created with the Display tool.
Horizontal smoothing at the edge of each polygon may be applied to reduce
truncation effects.
FKfilter -Parameters
Mode
Specify whether the data within a polygon is to be accepted or rejected.
No of edge traces
Specify the number of traces over which smoothing will occur. Specify zero to
inhibit smoothing.
Flow-pass
Apply a zero-phase low-pass filter.
Requirements
Dataset must be in the frequency domain. Sample interval is unimportant except
that the nyquist frequency must be greater than the highest specified frequency
point.
Description
Filters all data with a zero-phase low-pass filter.
Flow-pass -Parameters
F1
Frequencies below F1 are unchanged.
F2
Frequencies above F2 are set to zero. Between F1 and F2, attenuation varies
linearly with frequency. F2 must be higher than F1.
Frontblank
Blank a dataset from the start of each trace until some specified pick.
Requirements
Dataset must be in the time or depth domain.
Description
Front blank a dataset from the start of each trace until the specified event pick by
setting the samples to precisely zero.
Frontblank -Parameters
Blank until
Select the event pick at which blanking is to end. The "End offset" is added to
this event pick.
End Offset ms
An offset in milliseconds or user units to be added to the selected end of
blanking event pick.
Example
For example, to blank the first 50 milliseconds of each trace, select "Tzero" and
set "End offset" to 50 milliseconds.
To blank until 20 milliseconds before the Pick1 arrival, select "Pick1" and set
"End offset" to be -20 milliseconds.
FXresample
Spatially resample a variably spaced dataset to a constant spacing.
Requirements
Dataset must be in the FX domain.
Considerations
The lowest frequency at which spatial alias occurs restricts the useful bandwidth
of the input dataset.
Description
Spatially resample a dataset to constant trace spacing. The trace spacing may be
specified against various mapping parameters. Related parameters are
interpolated as necessary. For example, a dataset recorded with regular measured
depth intervals might be resampled to produce a dataset with regular true vertical
depth intervals.
Resampling is achieved by the polynomial interpolation of adjacent traces in the
FX domain.
The input dataset must be filtered down to the lowest frequency at which spatial
alias occurs or else the resampled dataset will be distorted, probably severely.
Spatial alias may be most easily detected in FK space where it is manifested as a
wrap-around.
FXresample -Parameters
Parameter
Specify the mapping parameter to be used for interpolation. Possible parameters
are measured depth, vertical depth, receiver offset, source offset or target offset
(from wellhead) or receiver offset, source offset or target offset along the line
defined by the coordinates (highest trace number has smallest offset).
Units
Determines whether the supplied values are in feet or metres.
Spacing
Specify the spacing of interpolated traces in horizontal units. The spacing must
be less than the horizontal units extent of the dataset. Spacing can be a positive
or negative value.
FXtoFK
Transform a dataset from FX (frequency/distance) space to FK (frequency/wave-
number) space.
Requirements
Dataset must be in FX space.
Considerations
In order for an FK transform to produce meaningful velocity alignments, the
input dataset must be linearly sampled in depth (constant depth increment per
trace). An input dataset having non-linear depth sampling will require a very
cautious interpretation in FK space. The FXresample operator may be used to
generate a linear depth sampled dataset.
You should try to minimise the peak amplitude difference between the deepest
and shallowest traces. Remove any high amplitude shallow traces (use the Select
operator) and try using the Equalise operator to compensate for the residual
amplitude spread.
Spectral energy peaks caused by resonance or coherent noise can cause the
velocity alignments of interest to be compressed and obscured. Apply a low-cut
filter and/or a notch filter to the input dataset to attenuate any such peaks.
Description
Transforms a dataset from FX space to FK space by Fourier transforming across
the traces.
The number of traces in the wave-number domain is a power of two and must be
greater or equal to the number of input traces. Traces in the wave-number
domain are assigned arbitrary wave-number units from -0.5 for trace one to 0.5
for the last trace.
Trace database information from the FX domain data is invalid in the FK
domain. Because of this, when data is returned to the FX domain, trace database
information is derived from the closest connected FXtoFK operator.
FXtoFK -Parameters
Minimum FFT length
Specify the minimum length of the Fourier transform. Using a greater than
necessary minimum Fourier transform length may improve the clarity of the FK
space display.
FXtoTX
Transform a dataset from FX space to TX space.
Requirements
Dataset must be in FX space.
Description
Transforms a dataset from FX space to TX space. The resultant time domain
sample interval is derived from the Fourier transform length. Within
VSProwess, a dataset in the frequency domain always has a number of samples
that is an exact power of two.
FXtoTX -Parameters
Number of output samples
Specify the required number of time domain output samples. If the specified
number of samples is greater than the number available (which is always some
power of two) then the output trace is padded with zeroes.
Graft
Graft together two datasets having different twigs to form a single merged
dataset.
Requirements
Both datasets must have the same number of traces (except that dataset B may
have only one trace).
Datasets must have the same number of samples.
Datasets must have the same sample interval.
Datasets must have the same sample domain.
Datasets must have the same trace domain.
Datasets may not have duplicate twig descriptors.
Considerations
Trace database information is derived from input A.
Description
The Graft operator is used to merge together two datasets which each have non-
overlapping sets of twig descriptors.
The Prune and Graft operators form a complementary pair.
Example
For example, a particular processing route may require that the reference twig be
"pruned" from a dataset, subjected to some special processing and then "grafted"
back on to the rest of the original dataset.
Special case
Both datasets must contain the same number of traces except for one exception;
dataset B may contain only one trace. In this special case the single trace (which
may have multiple twigs) from dataset B, will be replicated for every trace in the
output dataset.
This feature allows the creation of a dataset suitable for the Correlate operator
when only a single pilot trace is available.
HRotate
Rotate HX and HY components, using tool azimuth.
Requirements
Dataset must contain HX and HY twig descriptors.
Dataset must be in the time domain.
The tool azimuth must be available in the dataset database.
Description
For each input trace, HRotate rotates the plane of the HX and HY components
with respect to the tool azimuth in the dataset database. The resultant twigs are
named H1 and H2. All other twigs are copied unchanged. Rotation can be
specified to rotate the output H1 component toward the source coordinates or
toward north. The angle used to rotate the HX and HY components is saved in
the dataset database.
A rotation to the source results in an H1 component rotated toward the source
and an H2 component transverse to the source direction.
A rotation to north results in an H1 component rotated to north and an H2
component transverse to this.
HRotate requires the tool azimuth to be in the dataset database. If tool azimuth
has been measured, use DBupdate to set it in the dataset database. ToolOrientate
calculates tool azimuth from HX and HY components.
See angles appendix for more information.
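The underlying operation is a standard two-component rotation. One common
convention is sketched below; the sign convention actually used by VSProwess
is defined in the angles appendix and may differ.

import numpy as np

def rotate_horizontals(hx, hy, angle_deg):
    # Rotate the HX/HY plane by the given angle; H1 points along the rotation
    # direction and H2 is transverse to it.
    a = np.radians(angle_deg)
    h1 = hx * np.cos(a) + hy * np.sin(a)
    h2 = -hx * np.sin(a) + hy * np.cos(a)
    return h1, h2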
HRotate -Parameters
Rotate to
Choose from Source or North.
Save angle as
Choose database location to save calculated angle from Angle1 or Angle2.
Integrate
Integration of time domain dataset.
Requirements
Dataset must be in the time domain.
Description
Numerically integrates time domain traces using the trapezium rule.
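For one trace, trapezium-rule integration can be sketched as follows (illustrative
only, with dt the sample interval in seconds):

import numpy as np

def integrate_trace(trace, dt):
    # Cumulative trapezium-rule integration of a time domain trace.
    out = np.zeros(len(trace))
    out[1:] = np.cumsum(0.5 * (trace[1:] + trace[:-1])) * dt
    return out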
Parameters
None.
Invert
Inversion of a time domain dataset to velocity.
Requirements
Input A must be from a time domain dataset.
Input B must be from a Profile or ProfileIn operator.
Considerations
The dataset to be inverted should be an accurate and linear representation of the
vertical reflectivity sequence. Avoid noise rejection and signal weighting
algorithms. A priori knowledge of bulk velocities below the total depth of the
well may be accommodated by inserting time depth pairs into the input time
versus depth profile.
A special composite display mode is used for the output from a Transpose
operator, which includes the transposed input traces, the velocity profile and an
integrated depth scale. This special display can be turned off by processing the
Invert dataset through a Prune operator.
It is possible to invert a vertical incidence dataset recorded in a deviated well.
Offset source datasets that have been processed through RayForm, NMO and
Bin or Migrate can also be inverted. The effects of amplitude versus angle
(AVA) will affect reflection coefficients and thus inverted velocities. Execute
Prune after Invert and display. Display will use a multi-trace mode of display if
there are more than 20 traces in the dataset and block trace modes can be used.
Invert data can be exported to a CSV file by using the Export context menu
option. Note the VZ Trace will not be exported.
Description
Each input trace is inverted to generate an estimated velocity profile. The Invert
operator uses the reflectivity series to estimate the velocity profile within the
seismic bandwidth and a time versus depth model to guide the very low
frequencies. Density information is approximated using Gardner's rule. See
Tutorial for more information.
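Gardner's rule is an empirical relation between density and P-wave velocity. A
commonly quoted form (used here only as an illustration; the constants used by
VSProwess may differ) is density ≈ 0.31 × V^0.25, with velocity in m/s and
density in g/cc:

def gardner_density(velocity_mps):
    # Gardner's empirical relation: density in g/cc from P-wave velocity in m/s.
    return 0.31 * velocity_mps ** 0.25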
Invert -Parameters
Zero-phase filter
Specify the four filter points that define the usable bandwidth of the input
dataset.
Kbandpass
Apply a wave-number band-pass filter.
Requirements
Dataset must be in FK space.
Description
Kbandpass rejects traces having wave-number values outside of the specified
pass-band. A common use is to enhance vertically aligned down-wave data. The
pass-band is specified as a four point wave-number template.
Kbandpass -Parameters
K1
Traces with wave-number less than K1 are zeroed.
K2
Traces with wave-number between K1 and K2 are attenuated linearly with wave-
number. K2 must be a higher wave-number than K1.
K3
Traces with wave-number between K2 and K3 are unchanged. K3 must be a
higher wave-number than K2.
K4
Traces with wave-number above K4 are zeroed. Traces with wave-number
between K3 and K4 are attenuated linearly with wave-number. K4 must be a
higher wave-number than K3.
Kbandstop
Apply a wave-number reject (notch) filter.
Requirements
Dataset must be in FK space.
Description
Kbandstop rejects traces having wave-number values inside of the specified
reject band. A common use is to reject vertically aligned down-wave data. The
rejection band is specified as a four point wave-number template.
Kbandstop -Parameters
K1
Traces with wave-number less than K1 are unchanged.
K2
Traces with wave-number between K1 and K2 are attenuated linearly with wave-
number. K2 must be a higher wave-number than K1.
K3
Traces with wave-number between K2 and K3 are zeroed. K3 must be a higher
wave-number than K2.
K4
Traces with wave-number above K4 are unchanged. Traces with wave-number
between K3 and K4 are attenuated linearly with wave-number. K4 must be a
higher wave-number than K3.
LASimport
Import LAS format files.
Requirements
File must conform to LAS standard. Documentation for current LAS standard is
included on the release CD. An example LAS file is included in the SAMPLES
folder in ..Programme files/Avalon Sciences; it can be viewed using a text editor
such as TextPad.
Description
LASimport imports LAS files into VSProwess. LAS column definitions are
associated with VSProwess twig descriptors using parameter lines. The default
parameter file contains some common associations.
The first column of the LAS file can be depth or time, linearly or non-linearly
sampled, and must be associated with the MD, TVD, TVDSD or TWT
VSProwess twig descriptor.
Parameters are entered using a text file.
Well reference elevation (WRE) can be read from the LAS file or by parameter
input. Seismic datum elevation (SDE) can be set by parameter input. It is
important that WRE and SDE are correct.
Welltrak
A welltrak can be used to create XOFF and YOFF twigs, but only if the
LAS file contains a curve that can be associated with a measured depth twig.
Existing TVD or TVDSD twigs will be deleted. The output dataset will be
sampled using TVDSD and an MD twig will be output. Wellhead UTM
coordinates are set from the welltrak file. Well reference elevation is used to
convert from TVD to TVDSD.
The following parameter line will initiate the use of a welltrak database.
WELLTRAK Example
The above line informs LASimport to read the welltrak database
Example.welltrak in the current job folder.
Magnitude
Find sample-by-sample magnitude.
Requirements
Dataset must be in the time domain.
Description
Output the magnitude of each sample, for each trace and each twig.
Uses
Use as a general mathematical operator.
Mark
Mark traces within a range of dataset database values.
Requirements
Dataset must be in the time domain.
Description
If mark all traces within range is selected, mark traces whose specified dataset
database value lies within, and including, the defined range.
If mark all traces within range is not selected, then a step value can be specified.
Only traces whose specified database value lies within (and including) the
defined range, and whose value is equal to the start of the range plus a multiple
of the step parameter, are marked.
Uses
For instance, use in conjunction with Select or Stack to omit a range of depths or
geophone numbers, or decimate a dataset by marking even traces.
Mark -Parameters
Mark traces with
Choose the required dataset database location. See appendix for a description of
dataset database location identifiers.
values from/to
The start and end of the range of specified dataset database values. All times are
in seconds and all depths are in user units.
Step
Enter the value used to step through the range. A step of zero will mark all
traces in the range.
Mc170input
To import data recorded with an OYO McSeis 170f instrument.
Limitations
This operator has initially been written and tested on the assumption that the data
to be imported is an up-hole survey acquired from a vertical hydrophone array.
In addition, at the time of writing we have received very little information from
OYO concerning the capabilities and use of the McSeis 170f recorder. The
known issues are as follows:
Any source and receiver geometry information in the headers is ignored.
Pre-trigger delays are ignored.
All distances, depths and offsets must be supplied in metres.
Description
The Mc170input operator reads the specified files from the specified folder, in
the same order in which those files appear in the file list. Any channels not
defined in the channel list are ignored. The channel list must precede the file list
in the parameters file. Provision is made for the entry of source and receiver
geometry.
Limited support for the McSeis 160 instrument has been added.
Parameters -general
The parameters for this operator are supplied as statement lines in plain ASCII
text. Inserting a hash (#) at the start of a line causes that entire line to be treated
as a comment, i.e. ignored. Only one statement is allowed per line of text.
Mc160
The presence of this switch causes the operator to read Mc160 formatted files.
F <record no> <tool depth>
File statements allow you to specify which files are to be read and to associate a
“tool depth” with each file. This is the depth of the datum point that you have
chosen for the down-hole receiver array.
The depth of each receiver in an array is calculated by adding the “depth offset”
for that receiver to the “tool depth”.
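For illustration, a fragment of a parameters file might look like the following;
the record numbers and tool depths are hypothetical, and the channel list
statements (which must precede the file list) are not shown.

# File list: record number followed by tool depth (metres)
F 101 1250.0
F 102 1235.0
F 103 1220.0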
Migrate
Migrate is not yet fully implemented. More technical information will appear in
later versions of this help.
Migrate a dataset to its true sub-surface location.
Uses
Use after NMO operator as part of an image reconstruction procedure.
Requirements
Input A must be a time NMO corrected dataset.
Inputs
Migrate has three inputs
• Input A a time NMO corrected dataset. Xoff and Yoff twigs do not
need to be generated by the NMO operator, thus reducing the
amount of data generated by NMO particularly for large datasets.
• Input B directly from RayTrace operator. This tells Migrate the
location of the required loci.csv file. See appendix for loci file
format.
• Input C Bin coordinates. The bin coordinates specify the bins into
which migrated samples are added. The dataset database location
for bin coordinates is specified by parameter. Coordinates can be
SCX, SCY or RCX, RCY or TCX, TCY. These coordinates can be
taken from any VSProwess dataset database e.g. imported surface
seismic lines. Bin coordinates can also be generated from the
BinImport operator. Bin coordinates can describe lines, a grid
pattern or random coordinates. Use line number (LN) to group bin
locations into lines.
Description
Each sample of each input trace of input A is migrated into bins defined by the
input C dataset database location requested in the parameters. The migration
ellipse is derived from information from the current input A trace and the
loci.csv file defined by input B.
The RMS velocity from the loci.csv file, and source and receiver locations for
the current trace, are used to define an ellipse (or in the 3D case, an elliptical
surface) that passes through the reflection point location found from the loci.csv
file.
Each sample is migrated along the ellipse by a distance derived from the in-line
aperture parameter (and additionally in the 3D case the cross-line parameter).
The sample amplitude is added into each relevant bin location crossed by the
migration ellipse.
After all samples from all traces have been migrated, each output sample is
scaled by the total number of samples added to its bin. A Bin twig containing the
bin density is output.
The line (LN) dataset database location is used to define bin lines. Samples can
be projected onto a bin line if the Project parameter is selected.
If all apertures are zero, Migrate will produce a result similar to the Bin operator.
Limitations
Currently 3D is not implemented. The 2D migration can be used for situations
where the source, receiver and all image points lie in a similar plane e.g. static
offset source in a vertical well, rig source in a deviated well that deviates along
the bin location line.
For some 3D situations, a simple 2D solution can produce a reasonable result. In
the simple 2D solution, we assume a symmetrical ellipse about the vertical. This
is a reasonable approximation when reflection point is much deeper than the
receiver, or when the source lies vertically above the receiver. Reflections close
to the receiver will not be migrated correctly except for vertical incidence.
Output results lie on a 2D plane assumed to pass through the well.
For a rig source in a vertical well and vertical incidence surveys, use the simple
2D case.
For walkaway lines that do not pass over the receiver, use the simple 2D case.
Migration parameters
Migration apertures
Apertures restrict migration ellipses. Facing north, a positive aperture dips up to
the west. At present, only inline apertures are implemented.
Binning parameters
Choose the dataset database location for input C bin X and Y coordinates.
Enter the width of the bin, i.e. how far from the bin samples can be captured.
Enter the projection limit, i.e. how far away from the current line samples can
be projected.
Simple 2D
Causes simple 2D case to be implemented, see above.
Average
Only for zero aperture case. Output the average of all samples found in the
output sample bin. Otherwise, output the closest sample to the bin center.
Interpolate
Only for zero aperture case. Try to fill in the gaps caused by spatial under
sampling. Check the bin increment first.
Project
Project input coordinates onto the bin line before binning.
MIRFinput
To import data recorded in MIRF (Media Independent Record Format) into the
VSProwess environment.
For rig-site processing the preferred import operator is ACQinput, which directly
uses the stack files and database generated by the ACQ program during data
acquisition.
Requirements
Data must be recorded as MIRF files, as used by the PDAQ, Multilock, MAU,
ASP, SST500 and Geochain acquisition systems.
Description
MIRFinput is a general purpose MIRF import tool. Typically, it is used to import
raw data, perhaps for conversion to SEGY using SEGYoutput.
Data may be selectively imported according to dataset number, geophone
number and channel descriptor. It is also possible to output a decimated subset of
the input dataset by selecting to output every nth trace, where n is set by
parameter.
Either raw (f_nnnnnn.rcd) or stack (f_nnnnnn.stk) files may be imported.
All traces are output with the same data length; long traces are truncated and
short traces are padded with zeros. If a deviation database file is specified then
the vertical depth and receiver offset values in the headers, are updated to take
account of the current well deviation data and wellhead UTM coordinates are set
in the output dataset database.
It is possible to reallocate channel descriptors and owners and correct measured
depth offsets between tools by using a channel reallocation file whose format is
described below.
MIRFinput -Parameters
Path
Enter the full pathname of the folder containing the input files. There is a browse
button to help you.
Wellname
Well-name to be used for display annotation.
Client
Client name to be used for display annotation.
Contractor
Contractor name to be used for display annotation.
Dataset
Only MIRF files with the specified dataset number in the header are imported.
Set this value to zero to import all files regardless of dataset number.
Geophones
Use this option to include geophones (i.e. receivers in a multi-receiver tool).
Non-existent geophones are ignored. Normally all geophones should be
included.
Descriptor
This option allows the inclusion of channels according to descriptor. For
example, the user may wish to import only channels containing a vertical
component (VZ). Each selected descriptor is allocated a twig, usually with the
same name.
If the REF descriptor is selected and the input file contains more than one REF
channel, then only the source reference channel is imported (as the “REF” twig).
If the AUX descriptor is selected then all AUX and REF channels are output as
the twigs “CHn”, where n = channel number. This is useful for importing test
records or if the usual descriptor names are not applicable.
End at record
Only files with a record number equal to or lower than the end record are
imported. This setting defaults at 999 and may need to be increased for very
large datasets.
For most survey configurations Ts remains constant throughout the survey. If
this is not the case, for example if the source depth changed, then you can use the
database edit function from the VSProwess display system to modify Ts for each
trace.
Welltrak
If the “Use welltrak database” toggle is selected then the specified Welltrak
database is used to supply receiver: coordinates, vertical depths and orientation
(TOOLAZ, TOOLINC). This option is useful if receiver coordinates are not
available from the record headers, or if the well deviation data has been updated.
Welltrak name
Identifies the Welltrak database to be used. If a full pathname is not specified the
default path is the current jobpath.
Channel reallocation
If the “Use channel reallocation” toggle is selected then the specified channel
reallocation file is used to reallocate channel descriptors and owners. This option
should be used in the unlikely event that channels were incorrectly allocated
during acquisition.
Multilock
Correct phase response and skew of data recorded from the Multilock geophone
system.
Requirements
Dataset must be in the time domain.
Considerations
Only supports the 12 channel, 2 ms systems as used by CGG. Assumes data
acquired using the MAU interface panel and that reference channels do not
require correction.
Description
This operator is designed to compensate for the channel characteristics of the
Multilock geophone system electronics.
The ACQ acquisition software provides an identical correction through its
display filter, but this correction is never applied to the actual data recorded to
disk.
A Multilock operator should therefore be placed directly after either a MIRFinput
or ACQinput operator.
The dataset should be timed after correction by this operator.
A processing route containing the Multilock operator must not be used to process
data recorded from a different geophone system.
Corrections applied
An inverse filter is applied to correct for the low frequency phase and amplitude
distortion.
A channel dependent sub-sample time shift is applied to correct for the inter-
channel skew.
Multiply
Form the arithmetic product of two datasets.
Requirements
Datasets must have the same sample domain.
Datasets must have the same trace domain.
Datasets must have the same number of twigs or else dataset B may have only
one twig.
Datasets must have the same number of traces or else dataset B may have only
one trace.
Datasets must have the same number of samples.
Datasets must have the same sample interval.
Considerations
The output dataset derives its database from input dataset A.
Description
Produces the arithmetic product of two datasets. Multiplication is performed
sample-by-sample, trace-by-trace and twig-by-twig. The first exception is that
input dataset B may have only one twig, in which case all of the twigs in input
dataset A are multiplied by the same twig. This behaviour is useful for
implementing correlation in the frequency domain. The second exception is that
dataset B may have only one trace, in which case all of the traces in dataset A are
multiplied by the same trace. This facility is useful for convolution with an
arbitrary wavelet.
The requirement that both input datasets must have the same sample domain
implies that complex numbers are never multiplied by non-complex numbers.
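The twig and trace rules above behave like array broadcasting. The Python sketch below is a minimal illustration of that behaviour, not VSProwess code; the array names and shapes (twigs, traces, samples) are hypothetical.

import numpy as np

A = np.random.rand(3, 50, 1024)               # dataset A: three twigs, 50 traces
B_single_twig = np.random.rand(1, 50, 1024)   # dataset B: a single twig
out1 = A * B_single_twig                      # every twig of A multiplied by the same twig of B

B_single_trace = np.random.rand(3, 1, 1024)   # dataset B: a single trace per twig
out2 = A * B_single_trace                     # every trace of A multiplied by the same trace of B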
Parameters
None.
NMO
Convert the input dataset from one-way time samples to two-way vertical time
or vertical depth below datum.
Requirements
Input A, recorded traces at one-way time from source to receiver.
Must be time domain.
Input B, linked to an operator which has produced a text file (loci.csv defined in
the appendices) containing reflection point loci information e.g. RayTrace.
Description
For each twig, for each input trace, an output trace is produced by mapping
one-way time input samples to output samples at two-way time or vertical depth
below datum (TVDSD). The reflection point loci file Loci.csv from input B
defines the mapping.
Optionally, two further twigs, XOFF and YOFF, are output, containing the
reflection point X and Y coordinates respectively for each output sample. These
reflection points are absolute values referenced to an origin defined by the
wellhead UTM coordinates set by input operators; ACQinput, MIRFinput,
SEGYinput and SEG2input. Mapping stops at the deepest reflection in the
modeled data. XOFF and YOFF twigs are required by the Bin operator.
Two-way vertical times or TVDSDs are set in the Pick1 database location.
A text report is output listing, for each trace, a comparison between the receiver
depth and Pick1 time in the input A dataset database and those in the Loci.csv
file. The two-way vertical time or depth from the Loci.csv file is also listed. This
report should be checked to confirm a one-to-one correspondence between traces
in input A and the Loci.csv file, and to verify that the ray-traced first arrival
times match those in input A. If the time match is poor (greater than a few
milliseconds), the results from NMO should be treated with caution.
If requested, a more detailed report lists the mapping of one-way times to two-
way time or depth and X and Y coordinates.
Uses
Use after RayTrace and before Bin operators, as part of a CDP mapping
procedure.
Use after RayTrace as an alternative to shifting one-way time vertical incidence
data to two-way time below datum.
Use after RayTrace to convert vertical incidence data to depth below datum.
NMO - parameters
Number of output samples
In conjunction with the Output sample interval, this defines the length of
output data.
Output sample interval
In ms, or in survey units if Depth is selected. At an average velocity of
10000 ft/s (3000 m/s), a sample interval of 1 ms is equivalent to a depth sample
interval of 10 ft (3 m).
Depth
If selected, output traces are sampled in depth.
Full report
If selected, a detailed report of mapping information is output.
Output_XandY_twigs
If selected, the XOFF and YOFF twigs are output; these are required by the Bin
operator.
NotchFilter
Apply a zero-phase band-reject (notch) filter. The filter is specified as a center
frequency, a frequency width and a rejection dB.
Requirements
Dataset must be in the time domain.
Description
Filters all data with a zero-phase notch filter. This operator is useful where
explicit conversion of the dataset into the frequency domain and back, just to
apply a simple filter, would make the processing flowchart unwieldy.
Let Fstart = Notch centre - (Notch width/2) and Fend = Notch centre + (Notch
width/2). The notch filter is applied in the following way.
Frequencies below Fstart and above Fend are unchanged. At the notch centre the
data is attenuated by the specified rejection. Between Fstart and the notch centre,
and between the notch centre and Fend, the attenuation varies linearly with
frequency.
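As a rough illustration of that attenuation shape, the Python sketch below builds a multiplicative gain curve. Whether the linear variation is in amplitude or in dB is not stated here; linear variation in dB is assumed, so this is a sketch rather than the exact response applied by NotchFilter.

import numpy as np

def notch_gain(freq, centre, width, rejection_db):
    # Piecewise-linear attenuation (assumed linear in dB) between Fstart, the
    # notch centre and Fend; unity gain outside the notch.
    fstart, fend = centre - width / 2.0, centre + width / 2.0
    atten_db = np.zeros_like(freq, dtype=float)
    rising = (freq > fstart) & (freq <= centre)
    falling = (freq > centre) & (freq < fend)
    atten_db[rising] = rejection_db * (freq[rising] - fstart) / (centre - fstart)
    atten_db[falling] = rejection_db * (fend - freq[falling]) / (fend - centre)
    return 10.0 ** (-atten_db / 20.0)     # multiplicative gain applied to the spectrum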
NotchFilter -Parameters
Notch centre Hz
Defines the center of the notch.
Notch width Hz
Defines the width of the notch.
Rejection dB
Specify the attenuation in decibels to be applied to the reject band (notch).
Orientate
To orientate a pair of orthogonal components.
Requirements
The input dataset must contain at least two orthogonal components.
The input database must contain Pick1 times.
Dataset must be in the time domain.
Considerations
If a MAG (magnitude) twig is present in the input dataset it will not be present in
the output dataset.
The pick type parameter must be set correctly.
Description
For the selected pair of orthogonal components, for each input trace, Orientate
calculates the direction from which the picked arrival has come. It then rotates
the plane of the selected pair of components so that one component of the
selected pair points in that direction, and the other component is transverse to
that direction. All other twigs, with the exception of the MAG twig, will be
copied unchanged.
The orientation angle is calculated at a sample defined by the Pick1 time. If the
Primary pick is at a break, Orientate will find a peak time by searching forward
from the break time, to the next peak magnitude, calculated from all the
available components. For QC purposes, this peak time is saved in the Pick2
location of the output dataset database. The orientation angle is saved in the
output dataset database.
X = amplitude of the first of the twig pair.
Y = amplitude of the second of the twig pair.
Orientation angle = atan(Y/X).
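A minimal sketch of the rotation implied by these formulas is shown below (Python; the trace arrays and function name are hypothetical). Sign and naming conventions inside VSProwess may differ, so this is for illustration only.

import numpy as np

def orientate_pair(x, y, pick_index):
    # x, y: the two orthogonal component traces; the angle is measured at the pick sample.
    angle = np.arctan2(y[pick_index], x[pick_index])
    h1 = x * np.cos(angle) + y * np.sin(angle)    # points toward the picked arrival
    h2 = -x * np.sin(angle) + y * np.cos(angle)   # transverse component
    return h1, h2, np.degrees(angle)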
Orientate can be used to orientate the horizontals of a dataset, having a
significant source receiver offset, by selecting the twig pair containing the
horizontal components.
Orientate names the processed pair of output twigs as follows,
• Chosen pair named HX, HY, H1 or H2, output will be H1 and H2.
H1 will point toward the pick energy.
• One of, chosen pair, not named HX, HY, H1 or H2, output will be
E1 and E2. E1 will point toward the pick energy.
When HX and HY twig descriptors are chosen for orientation, the azimuth of the
tool is calculated and stored in the dataset database. Tool azimuth is calculated
using SCX, SCY, RCX and RCY database locations.
Orientate –parameters
Orientate twig pair
Select twig pair used to orientate from 1+2, 1+3 and 2+3.
To confirm which twigs are assigned to twigs one, two and three, display the
dataset and look at the drop down list of twigs. For instance, twig one will be
assigned to the first twig on the list which is not a REF or MAG twig.
Save angle as
Choose database location to save calculated angle from Angle1 or Angle2.
Pick type
Select either Trough or Break depending on the pick criteria. If Trough is
selected the pick will be that defined in the dataset database. If Break is selected
Orientate searches forward, from just before the break time, to the peak of the
magnitude, calculated from all the available components and uses this time
instead of the time in the dataset database. This peak time is saved in the Pick2
location of the output dataset database. If problems arise when picking the
magnitude see AutoPick use_magnitude parameter help.
Overlay
Overlay a small section of seismic over a larger section of seismic. In particular,
overlay a VSP dataset, with horizontal coverage, over a surface seismic section.
Requirements
Datasets must be time or depth domain.
Datasets must have the same sample interval.
Datasets must have the same horizontal trace separation.
Input A is expected to have more extensive coverage than input B and will
usually be a surface seismic section.
Input B will usually be a VSP dataset of limited horizontal and temporal extent,
e.g. vertical incidence survey in a deviated well, image reconstructed offset
source survey.
Description
Overlay mimics the process of cutting out a VSP dataset and overlaying it on a
surface seismic section.
The number of output samples and traces is defined by input A.
Input A is copied unchanged until the trace in input A that corresponds to
trace one of input B is reached. For this and subsequent traces, up to the last
corresponding trace of input B, input A samples are replaced with the
corresponding samples from input B.
Overlay –parameters
Trace in input A that corresponds to trace 1 of input B
Specifies the trace number within input A at which the overlay of input B begins.
Phaserot
Rotate the phase of a dataset.
Requirements
Dataset must be in frequency domain.
Description
Rotates the phase of a dataset. A rotation of 180 degrees will reverse the phase of
the dataset and is equivalent to a time domain polarity reversal.
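For illustration, a constant phase rotation of a real trace can be written using the Hilbert transform, which is mathematically equivalent to rotating the positive-frequency spectrum by e^(i*theta). The Python sketch below is not the VSProwess implementation; it simply demonstrates that 180 degrees reverses polarity.

import numpy as np
from scipy.signal import hilbert

def phase_rotate(trace, degrees):
    theta = np.radians(degrees)
    analytic = hilbert(trace)                     # trace + i * (Hilbert transform of trace)
    return trace * np.cos(theta) - analytic.imag * np.sin(theta)

# phase_rotate(trace, 180.0) returns -trace, i.e. a polarity reversal.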
Phaserot -Parameters
Rotation angle
Specifies the rotation angle in degrees.
PickAmplitude
Calculate the amplitude at or around the requested pick location and set it in the
dataset database.
Requirements
Dataset must be time or depth sampled.
Description
PickAmplitude sets amplitude values in the dataset database. The amplitude may
be calculated from an RMS window or from a single sample. The location of the
amplitude depends on the pick type and can be the Pick1, Pick2, Pick3 or
All_picks.
Warning
Most operators change sample amplitudes, so place PickAmplitude at the point in
the route where the amplitudes to be measured are still meaningful.
Uses
Run PickAmplitude before ProfileX if Profile display magnitude graph is
required. PickAmplitude can also be used in conjunction with the WellView
attribute option.
PickAmplitude –parameters
RMS window
Specifies the window length used to calculate an RMS value. Window units
depend on the sample domain of the input: for time domain datasets this is
seconds, for depth domain datasets it is the user units, metres or feet.
The window is centred on the pick sample unless Break pick is selected. In that
case, the window starts at the pick sample.
If the RMS window is zero, only the pick sample will be used unless Break pick
is selected. In that case PickAmplitude searches forward, from just before the
pick sample, to find the amplitude at the next peak or trough.
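The sketch below illustrates these windowing rules for a single trace (Python; array and variable names are hypothetical, and the forward search used for zero-length break-pick windows is omitted).

import numpy as np

def pick_amplitude(trace, dt, pick_time, rms_window, break_pick=False):
    i = int(round(pick_time / dt))                # pick sample index
    n = int(round(rms_window / dt))               # window length in samples
    if n <= 0:
        return trace[i]                           # single-sample amplitude
    start = i if break_pick else i - n // 2       # break picks: window starts at the pick
    window = trace[max(0, start):start + n]
    return np.sqrt(np.mean(window ** 2))          # RMS amplitude over the window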
Pick type
Choose either Pick1, Pick2, Pick3 or All_picks.
Break pick
See RMS window parameter.
Polarize
To orientate a pair of orthogonal components.
Requirements
The input dataset must contain at least two orthogonal components.
The input database must contain Pick1 (Primary pick) times.
Dataset must be in the time domain.
Considerations
If a MAG (magnitude) twig is present in the input dataset it will not be present in
the output dataset.
The pick type parameter must be set correctly.
Description
For the selected pair of orthogonal components, for each input trace, Polarize
calculates the direction from which the picked arrival has come. It then rotates
the plane of the selected pair of components so that one component of the
selected pair points in that direction, and the other component is transverse to
that direction.
Polarize –Parameters
Polarize twig pair
Select twig pair used to polarize from 1+2, 1+3 and 2+3.
To confirm which twigs are assigned to twigs one, two and three, display the
dataset and look at the drop down list of twigs. For instance, twig one will be
assigned to the first twig on the list which is not a REF or MAG twig.
Save angle as
Choose database location to save calculated angle from Angle1 or Angle2.
Pick type
Select either Trough or Break depending on the Pick criteria. If Trough is
selected Pick1 will be that defined in the dataset database. If Break is selected
Polarize searches forward, from just before the break time, to the peak of the
magnitude, calculated from all the available components and uses this time
instead of the time in the dataset database. This peak time is saved in the Pick2
location of the output dataset database. If problems arise when picking the
magnitude see AutoPick use_magnitude parameter help.
Profile
Generate a velocity profile database from Pick1 time versus depth information.
Requirements
The dataset must be sorted into measured depth order (see Sort).
The dataset database must contain all information required for the computation
listings.
Considerations
In order for the listing from profile display mode to show the source to monitor
distance, the source to monitor delay and any external delay, the input dataset
should come from very early in the processing route, before the Tstatics and
Designature operators have been applied.
Input traces which are marked are excluded from the curves.
Description
The Profile operator extracts Pick1 time versus depth information from the input
dataset database and generates a velocity profile database.
The main purpose of Profile is the production of time versus depth curves and
computation listings (similar to those created using our ACQ data acquisition
software). However, the velocity profile database generated by Profile is also
used by operators such as Invert, and Profile replaces the TvsDmodel operator.
Parameters
None.
ProfileIn
Create a velocity profile database from a comma-separated value (CSV) file.
Requirements
The CSV file must contain at least two columns labeled TVDSD (vertical depth
below datum) and either VINT (interval velocity between TVDSD and next
shallowest TVDSD) or TCORR (vertical travel time from datum to TVDSD).
An optional units line may be included. Valid units are m (metres), ft (feet), m/s,
ft/s, s (seconds) and ms (milli-seconds).
Multiple velocities can be input using VINT1 (or VINT), VINT2 and VINT3
columns.
Anisotropy information can be included using the EPSILON, DELTA and
GAMMA columns, see tutorial on Anisotropy on the release disk. If anisotropy
information is found the complete input CSV file will be copied to the data
folder of the opid. Anisotropy information from the copied CSV file is used by
RayRetrace and RayTrace operators.
Dip information can be included using the DIP and AZIMUTH columns. Layers
with positive dip, dip up in their azimuth direction. Positive and negative dips
are allowed. TVDSD of a dipping layer is defined at XX,YY.
Example
“TVDSD”,“VINT”
“m”, “m/s”
100,2500
200,3000
300,3500
This example defines a velocity of 2500 m/s down to 100 m, a velocity of
3000 m/s from 100 m to 200 m and a velocity of 3500 m/s from 200 m to 300 m.
For an extended example, export from Profile and choose Ray model.
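If the model is being prepared programmatically, a CSV matching the example above can be written as in the Python sketch below (the output file name is hypothetical).

import csv

rows = [("TVDSD", "VINT"), ("m", "m/s"), (100, 2500), (200, 3000), (300, 3500)]
with open("velocity_model.csv", "w", newline="") as f:
    # QUOTE_NONNUMERIC quotes the header and units strings, as in the example above.
    csv.writer(f, quoting=csv.QUOTE_NONNUMERIC).writerows(rows)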
ProfileIn –parameters
Pathname of CSV file
Full path and filename of the CSV file.
ProfileX
Like Profile but can process multiple picks.
If required, pick amplitudes must be calculated using the PickAmplitude
operator prior to executing ProfileX. Pick1 amplitude is expected to be in the
USER1 database location etc.
If a database entry for a selected pick is unpicked the entry will be marked.
ProfileX –parameters
Use pick1
Generate a velocity profile database from pick1.
Use pick2
Generate a velocity profile database from pick2.
Prune
Select twigs from a multiple twig dataset.
Requirements
The selected twigs must be present in the dataset.
Description
Select twigs by pruning them off for separate processing.
Prune -Parameters
Twig descriptors
Select the twigs required for output.
Qestimation
Estimate Q (absorption) for down-going arrivals at pick1 times.
Requirements
Dataset must contain a single twig.
Dataset must be in FX space.
Dataset database must contain source and receiver positions and pick1 times.
Although absolute amplitudes are not essential, removing spherical divergence,
prior to transformation to FX space, improves interpretation of results.
The arrival, described by the pick1 time, must be isolated before transformation
to FX space e.g. up-wave arrivals minimised by Enhance.
The pick arrival must be shaped e.g. by Twindow operator, to remove sharp cut
offs prior to transformation to FX space.
Description
For each trace of the single input twig, a slope, in dB per wavelength, is
calculated over the selected range of frequencies. An estimated average Q value
is derived from this slope.
Average Q is calculated over a user-defined range of frequencies. The frequency
range can be derived using an FX display selecting Variable area colour display
mode or by spectral analysis of traces in the windowed time domain dataset. The
slope in dB over the frequency range should be a straight line.
The shallowest receiver may be used as a source reference. In this case, the
spectrum of the shallowest trace is subtracted from subsequent spectra before
slope calculation. A flag is set in the dataset database to notify Profile displays
that average Q values are referenced to the shallowest receiver.
Average Q values are stored in the dataset database and a flag is set to notify
Profile displays of the presence of Q estimations.
A CSV file is output to aid interpretation of results.
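For orientation, the constant-Q model relates attenuation per wavelength to Q by approximately 27.3/Q dB, so a measured slope can be converted as in the Python sketch below. This is the standard textbook relation, given here for illustration only; it is not necessarily the exact formula used by Qestimation.

import numpy as np

def q_from_slope(slope_db_per_wavelength):
    # Constant-Q model: amplitude decays by exp(-pi/Q) per wavelength,
    # i.e. roughly 27.3/Q dB per wavelength.
    return (20.0 * np.pi / np.log(10.0)) / slope_db_per_wavelength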
Qestimation-Parameters
Start and end frequency Hz
Average Q values are calculated over this frequency range.
RayForm
Use time-variant ray angles to form rotated output traces from two orthogonal
input traces. The primary purpose of RayForm is wave-mode separation of
up-going wave fields.
Requirements
Input A, must be time domain.
Input A, must contain at least two twigs.
In straightforward applications, Input A will usually only contain a mixture of P
and shear up wave energy.
Input A horizontal component must be rotated toward the source. Thus, due to
the constraint of horizontal layers, upwave events in vertical and horizontal
components have the same polarity.
Input B, linked to an operator e.g. RayTrace, which has produced a text file
(defined below) containing angle information for one wave-mode type.
RayForm -Description
For the selected pair of orthogonal components, for each input trace, RayForm
rotates each time sample so that one component of the selected pair points in the
direction of the angle defined at that time sample, and the other component is
perpendicular to that direction. All other twigs are copied unchanged. Ray-
formed components are named E1 and E2.
Time variant angle information is obtained from a text file, produced by the
RayTrace operator, or, by some other external ray-tracing package. The format
of this file is described below. In the case of RayTrace, the inclination and the
azimuth of up wave rays, at the receiver, from each interface, is output. RayForm
linearly interpolates the angles between the interfaces.
If the angle information has been obtained from the RayTrace operator the input
A dataset must be aligned at one-way time below source. It is important that the
ray-traced dataset closely matches the input A dataset.
RayForm –parameters
Ray form twig pair
Select twig pair used to ray form from 1+2, 1+3 and 2+3.
To confirm which twigs are assigned to twigs one, two and three, display the
dataset and look at the drop down list of twigs.
RayProx
Find the edge of a high velocity structure, such as a salt dome.
Requirements
Input A must be the output from a RayRetrace operator.
Input A dataset database must contain Pick1 times and the following static time
correction information: reference pick time, source to monitor distance and
external delay.
Input A must satisfy the following source and receiver geometry constraints.
• Travel paths from the source to the receiver must enter the high
velocity structure at a similar depth.
• Velocity model between the source and the entry point of the high
velocity structure must be known and be nearly horizontal and
isotropic.
Input B must be a Profile or ProfileIn operator that describes the velocity model
above the entry point to the high velocity structure.
• The last velocity layer is the high velocity layer. This will be
extrapolated to the deepest receiver location.
• The second last velocity layer is the layer just above the entry
point. The depth of this layer defines the depth of the entry point to
the high velocity layer.
RayProx -Description
We will call the near vertical high velocity structure the proximity structure.
For each source receiver pair the following procedure is applied.
Rays are calculated from the source down through the proximity structure. Ray
angles range from vertical to horizontal. The angle increment is defined by a
parameter. As layers are horizontal, these rays are used through all azimuths. We
will call these rays the source rays.
The relevant ray is read from the RayRetrace ray file. This should describe the
ray from the receiver, through the velocity model, to the entry point of the
proximity structure.
Limitations
• Critical refractions are ignored
• Horizontal layered model
• No anisotropy
RayProx -Parameters
Time error (ms)
The time error allowed when comparing ray times against transit times.
RayRetrace
Find source image point by ray tracing from receiver to source, through a
horizontal layered velocity model, using picked arrival inclination and azimuth
to define the start of the ray-path.
Requirements
Input A dataset database Angle1 entry, must contain the azimuth of the arrival
being retraced and Angle2 entry must contain the inclination.
Input B, must be Profile or ProfileIn operator describing the velocity between the
source and receiver. In the special case of proximity processing, the velocity
model must describe velocities near the well track.
Anisotropy will be taken into account if anisotropy information is included in the
CSV file used by ProfileIn. ProfileIn will copy the CSV file containing the
anisotropy information. Weak polar anisotropy is assumed as described by Leon
Thomsen, see tutorial on anisotropy for more information.
Limitations
• Critical refractions ignored
• Horizontal layered model
• Only arrival described by angle1 and angle2 is retraced
• Azimuthal anisotropy is not yet included
• No reflections occur
• Signal must appear to have come from gimballed geophones.
Description
Angle1 and angle2 define the azimuth and inclination, respectively, at the start of
the retraced ray. The ray is traced back through the horizontal layered velocity
model of input B. Ray tracing continues until the ray reaches the source depth or
the bottom of the model, or the ray offset is greater than twice the source offset,
from the well head.
The coordinate at the end of the retraced ray is stored in IPX, IPY and IPD
database locations. These can be viewed using Wellview. A ray file is output to
allow Wellview to display the retraced ray and for use in the RayProx operator.
The travel time through the retraced ray, from the receiver to the image point, is
stored in the USER8 database location. USER8 is modified to account for static
time corrections, e.g. reference pick time, external delay and source to monitor
delay.
Assuming azimuth and inclination values are correct, arrivals that have traveled
through an isotropic horizontal layered earth should be retraced to their source
location, i.e. their image point coordinates should match their source coordinates
and their Pick1 and USER8 times should be consistent.
Only database values are changed.
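A minimal isotropic sketch of the ray stepping described above is given below (Python). It steps a ray through horizontal layers with Snell's law, accumulating offset and travel time, and stops at a critical refraction. Layer depths, velocities and conventions are hypothetical, and anisotropy, the offset limit and the database bookkeeping are omitted.

import numpy as np

def retrace_upward(inclination_deg, receiver_depth, tops, velocities, source_depth=0.0):
    # tops: layer top depths (shallow to deep); velocities: interval velocity per layer.
    theta = np.radians(inclination_deg)              # inclination measured from the vertical
    offset, time, z = 0.0, 0.0, float(receiver_depth)
    layer = int(np.searchsorted(tops, z, side="right")) - 1
    while z > source_depth and layer >= 0:
        z_top = max(tops[layer], source_depth)
        dz = z - z_top
        offset += dz * np.tan(theta)                 # horizontal distance in this layer
        time += dz / (velocities[layer] * np.cos(theta))
        z = z_top
        if layer > 0:                                # refract into the layer above (Snell's law)
            s = np.sin(theta) * velocities[layer - 1] / velocities[layer]
            if abs(s) >= 1.0:                        # critical refraction: ignored, stop here
                break
            theta = np.arcsin(s)
        layer -= 1
    return offset, time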
Angle preparation
If it was not known, find the tool azimuth using ToolOrientate. This may have
to be performed on a dataset specifically designed to find the tool azimuth. See
angles appendix for more information.
Use DBupdate to transfer the tool azimuth to the recorded components of the
dataset to be retraced.
Use Rotate to direct the H1 component to north using the tool azimuth.
Execute a Polarize operator to rotate the horizontal components. Save the
rotation angle in angle1.
Execute a Polarize operator to rotate the in-line horizontal and the vertical
component. Save the rotation angle in angle2.
RayTrace
3D ray-trace, isotropic or anisotropic, through a horizontally or simple dip
layered velocity model, using source and receiver coordinates from a VSProwess
dataset database.
Requirements
Input A, is a time domain dataset, whose database contains source and receiver
X, Y and Z coordinates. Tstatics corrections must have been applied.
Input B, must be a Profile operator, with at least two points included as interval
velocities, or a ProfileIn operator.
Description
Ray-tracing is performed through the input B velocity model, using trace by
trace source and receiver coordinates from the input A trace database.
Only points included as interval velocities in Profile/ProfileIn are included in the
velocity model. The velocity model is extrapolated using the deepest velocity to
a user defined maximum two-way time. Mode converted arrivals can be ray
traced. To simplify ray-trace information, either downwave or upwave
information is generated.
Anisotropy will be taken into account if it is included in the CSV file used by
ProfileIn. ProfileIn will copy the CSV file containing the anisotropy
information. Weak polar anisotropy is assumed as described by Leon Thomsen,
see tutorial on anisotropy for more information.
Each layer is allowed to dip across the model. The tie point is defined by XX,
YY and TVDSD values in the model. XX and YY default to 0.0. Dipping layers
will terminate shallower layers. Dip values are defined by the DIP column,
default 0.0.
In some cases dip azimuth can be defined using the AZIMUTH column, default
0.0. AZIMUTH will be ignored if there is a significant source receiver offset
(atan(SGO/TVDSD) > 5 degrees), in which case the azimuth used is the azimuth
of the source to receiver direction.
RayTrace - parameters
Maximum two-way time of model
The two-way time, in seconds, to which the deepest velocity is extrapolated.
This defines the two-way time of the deepest interface in the model. Ray tracing
continues until this two-way time or the data length of input A, whichever is
longer. Model entries from input B beyond this time will be ignored during ray
tracing.
Downwave
If selected only primary down wave arrivals are ray-traced.
Capture radius
Defines the maximum distance a ray can be from the receiver in order to be
captured, in survey units.
RearBlank
Rear blank each trace of a dataset beyond the specified event time.
Requirements
Dataset must be in the time or depth domain.
Description
Rear blanks a dataset beyond the specified event pick by setting the samples to
precisely zero.
RearBlank -Parameters
Blank from
Select the event pick from which blanking is to begin. The "offset" is added to
this event pick. Tcorr is not allowed for depth datasets.
Start offset ms
An offset time in milliseconds or user units to be added to the selected start of
blanking event pick.
Examples
To rear blank each trace starting from 50ms before the Pick2 time for that trace,
select "Pick2" and set "Start offset ms" to -50.
To rear blank all traces from 100ms onwards, select “Tzero” and set “Start offset
ms” to 100.
Remark
Provide a method for the entry of explanatory remarks.
Description
The Remark operator is not really an operator at all because it is not executable
and performs no processing. The Remark operator simply provides a convenient
method of attaching an explanatory comment to a processing sequence.
Resample
Re-sample from any sample interval to any other sample interval in the same
sample domain.
Requirements
Dataset must be time or distance domain.
Description
Resample uses Shannon’s Theorem. Each sample is replaced by a sinc function
weighted by the sample amplitude. When all the sinc functions are added
together, the resulting “continuous” signal can be sampled at the required sample
interval. A good description of this process appears in the IFP publication
“Signal Processing for Geologists and Geophysicists” by J.L.Mari et al.
To speed up processing the sinc function is limited to one second or one
thousand metres. This limits the minimum frequency that can be re-sampled.
If the dataset is to be under-sampled (resampled to a larger sample interval), it is
necessary to band-limit the input dataset to frequencies below the Nyquist
frequency of the output sample interval before linking to the Resample operator.
If the dataset is to be over-sampled, it is important to understand that the signal
quality and bandwidth of the output dataset are unchanged; additional samples
are simply interpolated.
The advantage of Resample over the Fexpand and Fcollapse operators is that any
output sample interval, within reason, can be requested, not just double or half
the input sample interval. This may be important for re-sampling “odd” sample
intervals imposed by digital signal transmissions from multiple receivers. The
disadvantage of Resample is that it may be slow.
Continuous log twigs with NULL values can be resampled.
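A bare-bones Whittaker-Shannon (sinc) interpolation of the kind described above is sketched below in Python. It omits the one-second/one-thousand-metre truncation of the sinc function and all NULL handling, so it is illustrative only and much slower than the operator itself.

import numpy as np

def sinc_resample(x, dt_in, dt_out):
    n_in = len(x)
    t_out = np.arange(0.0, n_in * dt_in, dt_out)     # output sample times
    n = np.arange(n_in)
    # Each output sample is a sinc-weighted sum of all input samples.
    return np.array([np.sum(x * np.sinc((t - n * dt_in) / dt_in)) for t in t_out])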
Resample -Parameters
Required sample interval
The units of the required sample interval depend on the sample domain of the
input dataset and, in the case of a depth domain dataset, the measurement units.
Units can be either: seconds, metres or feet.
Rotate
Rotate pairs of orthogonal components.
Description
For each input trace, Rotate rotates the plane of the selected pair of components
by an angle specified by the Rotation mode parameter. The angle can be
specified as, TOOLAZ, TOOLINC, TOOLROT, Angle1 or Angle2 from the
dataset database and can be applied in a forward or reverse direction. There is
also an option to rotate all traces by a constant angle.
REF and MAG twigs cannot be rotated and are copied unchanged.
Rotate has no knowledge of the absolute alignment of the processed twigs. It is
therefore up to the processor to name the processed output twigs. This is done
using parameters.
A description of angles used in VSProwess is included in the appendices.
Rotate -Parameters
Rotate twig pair
Select twig pair from 1+2, 1+3 and 2+3.
To confirm which twigs are assigned to twigs one, two and three, display the
dataset and look at the drop down list of twigs. For instance, twig one will be
assigned to the first twig on the list that is not a REF or MAG twig.
Rotate by
Modes are TOOLAZ, TOOLINC, TOOLROT, ANGLE1, ANGLE2,
CONSTANT_ANGLE
Use to specify rotation or reverse rotation, in the plane of the selected pair of
orthogonal components, using the requested angle from the dataset database, or
rotation by a constant angle.
Scale
Scale a dataset by a constant value.
Description
Scale all samples of a dataset by multiplication with a constant.
Scale -Parameters
Multiplier
The constant that is to be multiplied with all samples.
SEG2input
To import VSP field data recorded in SEG-2 format.
Requirements
TBD
Description
The SEG2input operator is designed to import VSP data recorded in the SEG2
data format. The operator has been tested with datasets recorded with both Sabre
and OYO equipment. It is assumed that the dataset consists of a number of SEG2
files, where each file contains the data for a single sweep or shot.
Each SEG2 file contains one “File Descriptor Block” and several “Trace
Descriptor Blocks”. The information in these blocks is associated with
"Descriptor Strings". The user must associate his SEG2 trace descriptor strings
with VSProwess database values
A FILETEMPLATE parameter may be used to selectively read files whose
names conform to a specified pattern. By this means, it is possible to selectively
import files recorded with a particular source or belonging to a particular trace
descriptor.
The precise format of SEG2 trace descriptor information depends upon the
contractor supplying the data. When the format for a particular contractor has
been deciphered, this information may be preserved by creating a standard
import route.
SEG2input –Parameters
Because of the great flexibility required it is not practical to provide a simple
GUI based parameter input window. Instead, the parameters for this operator are
in text form, following as closely as possible the syntax used by the similar
SEGYinput operator.
The parameters are supplied as statement lines in plain ASCII text. Inserting a
hash (#) at the start of a line causes that entire line to be treated as a comment,
i.e. ignored. Only one statement is allowed per line of text. Specification
statement lines may appear in any order apart from TRACE statements that must
be correctly ordered.
FILETEMPLATE <template>
Use this statement to select files with a particular filename specification. The
default template is < *.sg2 >. Remember that Windows filenames are not
case-sensitive.
For example,
FILETEMPLATE sal*.sg2
The above statement would select all files within the DATAPATH folder whose
filenames start with "sal" and end in ".sg2" (e.g. sal0001.sg2 ).
WELLTRAK <pathname>
This statement is used to specify the name of a Welltrak database. If present,
then vertical depths (TVD), receiver coordinates (RCX, RCY) and tool
alignment (TOOLAZ, TOOLINC) are extracted from the Welltrak database
using the measured depth (MD) as a key. If the welltrak name is not a full
pathname the welltrak database is assumed to be in the current job folder.
The wellhead UTM coordinates from the Welltrak database are set in the output
dataset database.
If the CLIENT statement is omitted the SEG2 CLIENT file descriptor string will
be used.
If the CONTRACTOR statement is omitted the SEG2 COMPANY file
descriptor string will be used.
There is no corresponding WELLNAME file descriptor string supplied by
SEG2.
TRACE CH1
TRACE CH2
TRACE VZ GEO 1 MDO -20.0 CH 17
TRACE HX GEO 1 MDO -20.0 CH 18
TRACE HY GEO 1 MDO -20.0 CH 19
TRACE VZ GEO 2 CH 20
TRACE HX GEO 2 CH 21
TRACE HY GEO 2 CH 22
For the above example if we assume that GEO1 happens to be the upper
geophone but that the geophone array is actually zeroed on GEO2, the lower
geophone, then depending upon the acquisition contractor it is possible that the
measured depth contained in the trace header is only correct for GEO2. In such a
case we can use the MDO (measured depth offset) token to supply a depth offset
to be added to the key measured depth for any geophone. MDO must be
specified in the depth units as specified in the SABRE file. In the example we
are saying that GEO1 is 20 <units> above GEO2.
OUTPUT NS <value>
The number of samples per trace to read (NS) must be > zero.
Example: output four twigs each with 2000 samples, but for GEO 1 only,
assuming the input traces are as described in the example above.
OUTPUT NS 2000
OUTPUT TWIG VZ
OUTPUT TWIG HX
OUTPUT TWIG HY
OUTPUT GEO 1
Transfer statement
The transfer statement is used to transfer one item of information from the SEG2
trace descriptor block into the dataset database. An item of information is
referenced by its descriptor string. A transfer statement starts with the keyword
identifying the VSProwess database item of interest, followed by the SEG2
descriptor string of the relevant information within the trace descriptor block.
<KEYWORD> <descriptor string>
In some cases SEG2 descriptor strings may identify more than one value, in
these cases, square brackets surround the position of the required value.
RECEIVER_LOCATION, SOURCE_LOCATION and
STATIC_CORRECTIONS must include square brackets even if only one value
is present.
For example, <KEYWORD> RECEIVER_LOCATION[3] would assign the
third element of RECEIVER_LOCATION to the dataset database item defined
by the keyword.
Constant statement
If an item of information is not correct or not present in the SEG-2 header, it
may be set to a constant value as follows.
<KEYWORD> VALUE <signed constant>
For example, to set the source to datum correction to 0.0085 seconds
TS VALUE 0.0085
DESCALE
Usually gets value from DESCALING_FACTOR trace descriptor string.
Defaults to one.
POLARITY
Usually gets value from POLARITY trace descriptor string.
Defaults to one.
CHAN
Channel number within original field record.
SI
Usually gets value from SAMPLE_INTERVAL trace descriptor string
Specified in seconds, defaults to 0.001. Must be the same for all traces.
SCX
Source offset east of wellhead.
SCY
Source offset north of wellhead.
RCX
Receiver offset east of wellhead.
RCY
Receiver offset north of wellhead.
SD
Source depth below SREF.
WRE
Well reference elevation w.r.t. datum, e.g. KB.
TREF
The reference arrival time in seconds. Used to modify PICK1(TPRIMARY) and
PICK2(TSECONDARY) after source signature deconvolution. Also used by the
Tstatics operator.
SRE
Source reference elevation w.r.t. datum.
SDE
Seismic datum elevation w.r.t. datum.
S2M
Source to monitor distance. Typically the distance of the monitor hydrophone
from an air-gun array, always a positive value.
TEXT
External delay in seconds. The external delay is any delay to which only the
reference signal is subjected before it reaches the recording system, for example,
the transmission delay for a remote source might be 0.003 seconds.
VW
Water velocity.
TS
The Source to datum time correction (Ts) in seconds. Ts is essential for the
calculation of corrections to datum. Ts is the value that must be added after
vertically correcting the time below the source.
In the marine case Ts is always the source depth divided by water velocity
(VW).
In the land case, the situation is rather more complex and if possible, Ts is
measured directly, otherwise it must be derived indirectly.
SRC
Source number
STK
Stack number (or level number)
PICK1(TPRIMARY)
Primary arrival time in seconds.
TSTART
Start time for all traces in seconds, normally zero.
NOINSTACK
Number of records contributing to a stacked trace. Used in conjunction with
DESCALING_FACTOR to correct data samples.
REC
Original field record number.
USER1 to USER8
Dataset database locations for the user, not used by VSProwess.
SEGYexport
Output a VSProwess dataset as a SEG-Y format file.
Requirements
Dataset must be sampled in the time or depth domain.
SEGYexport - Statements
Because of the huge variation in usage of SEG Y header fields, it is not practical
to provide a simple GUI based parameter input window. Instead, the parameters
for this operator are in text form and are edited using a standard pop-up text
editor.
The parameters are supplied as statement lines in plain ASCII text. Inserting a
hash (#) at the start of a line causes that entire line to be treated as a comment,
i.e. ignored. Only one statement is allowed per line of text. The parameter text is
split into various sections. The start of each section is identified by one of the
following keywords.
ASSIGN_SEGY_ID
TEXTUAL_FILE_HEADER
BINARY_FILE_HEADER
TRACE_HEADER
The sections must appear in this order. The first two are optional. The second
two must be present. No information is set unless specified by a statement line.
Statement lines can appear in any order within a section.
Note that the default parameters for a newly inserted operator are correct for the
exporting of standard VSProwess format archive files as generated by the
SEGYoutput operator.
DATAFILE <pathname>
This statement is always required and must appear first. Supply the full path and
file name of the output SEG-Y file. Use the DATAFILE button to browse.
ASSIGN_SEGY_ID
This optional statement must precede assignments of VSProwess twig
descriptors to SEG Y trace identifiers. The format of the lines, following this
statement, is described below.
TEXTUAL_FILE_HEADER
This optional statement must precede assignments to the Textual file header.
The first 3200 bytes of the SEG Y file is the Textual file header. The Textual file
header record contains 40 lines of textual information.
All lines of the Textual file header can be configured.
C<N> <Text>
The first three characters must be a C followed by a number.
N is an integer from 1 to 40 and specifies the position within the textual file
header.
There must be a space between the number and the text. The Text may be up to
36 characters long.
The following keywords can be used, in conjunction with your text, to draw
information from VSProwess.
<WELLNAME>
<CLIENTNAME>
<CONTRACTORNAME>
<LABEL1>
<LABEL2>
.
.
<LABEL15>
For example, the following line would place the first line of the display label, if
it exists, into line 21 of the Textual file header.
C21 <LABEL1>
To conform to SEG Y standards, the last two lines of the Textual file header
should be,
C39 SEG Y REV1
C40 END TEXTUAL HEADER
BINARY_FILE_HEADER
This compulsory statement must precede assignments to the Binary file header.
The 400-byte, Binary file header, record contains binary values that affect the
whole SEG Y file.
In order to set information in the Binary file header block the user must create a
list of transfer statements, as described below.
The Binary header values described in points 1-4 below are considered to be the
minimum requirement to enable someone to read the SEG Y file.
The floating-point format of output data samples is defined by two-byte integer
Binary file header location 25. See point 3 below.
Any of the keywords listed below may be used to assign values to the Binary file
header.
Certain binary header values are mandatory when conforming to the SEG-Y
standard. Great care should be taken when setting values in the Binary file
header.
SEGYexport does not support “SEG Y rev 1” extended Textual file header
records as specified by Binary file header two-byte integer starting at byte 305.
TRACE_HEADER
This compulsory statement must precede assignments to the Trace header.
Each trace in the SEG Y file is preceded by a 240-byte header block containing
trace attributes.
In order to set information in the 240-byte trace header block the user must
create a list of transfer statements, as described below.
Any of the keywords listed below may be used to assign values to the Trace
header.
1: The sample interval in seconds (SI) from the dataset database will be
multiplied by 1000000 and entered into the SEG Y Binary file header, in two-
byte integer form, starting at byte 17. It is recommended that the SI keyword
always be used for the Binary file header two-byte integer starting at byte 17.
2: The number of samples from the dataset database (NS) will be entered into the
SEG Y Binary file header, in two-byte integer form, starting at byte 21. It is
recommended that the NS keyword always be used for the Binary file header
two-byte integer starting at byte 21.
SEGYinput
Import a SEG-Y format file into the VSProwess environment.
Requirements
The input file must be in SEG-Y format. The input SEG-Y file is expected to
contain a 3200 byte reel header, a 400 byte line header and a series of traces each
consisting of a 240 byte trace header followed by the data samples for that trace.
Description
Because there is no completely standard SEG-Y trace header format to hold all
the necessary VSP information, it is left up to the user to associate SEG-Y trace
header words with VSProwess database values. Simple arithmetic operations
may be performed upon header values or they may be set constant. It is possible
to selectively read traces recorded with a particular source or belonging to a
particular trace descriptor.
The format of SEG-Y trace header information usually depends upon the
contractor supplying the data. A list of the header information may be supplied
by the contractor but, if it is not, the SEGYlist operator may be used to identify
useful information. Once the format for a particular contractor has been
deciphered, the information may be saved by creating a standard route ready for
the next time.
SEGYinput -Parameters
Because of the huge variation in usage of SEG-Y header fields, it is not practical
to provide a simple GUI based parameter input window. Instead, the parameters
for this operator are in text form and are edited using a standard pop-up text
editor.
DATAFILE <pathname>
This statement is always required. Supply the full path and file name of the input
SEG-Y file. Use the DATAFILE button to browse.
PROWESSLINEHEADER
This statement informs SEGYinput that the SEG-Y input file is in the standard
VSProwess archive format, as created by the SEGYoutput operator. Any
TRACE or OUTPUT statements are ignored if this keyword is present and traces
and all samples are read.
If you wish to read in a SEGY file not generated by VSProwess then you must
delete this statement, or comment it out by placing a # symbol at the beginning
of the line.
DATA_FORMAT <IBM/SEGY2/SEGY3/IEEE754>
The data format is normally obtained automatically from the Binary file header,
but may be over-ridden by this optional statement.
Data format may be IBM or IEEE754. IEEE754 is the native format used by all
modern workstations and is therefore the normal format used for VSProwess
archive files as produced by SEGYoutput. Also supported are data format codes
2 and 3 (SEGY2, SEGY3).
WELLTRAK <pathname>
This optional statement is used to specify the name of a Welltrak database in the
current job folder or the full pathname of a Welltrak database, including file
extension. If present, then vertical depths (TVD), receiver coordinates (RCX,
RCY), tool inclination (TOOLINC) and tool azimuth (TOOLAZ) are extracted
from the Welltrak database using the measured depth (MD) as a key. Wellhead
coordinates from the Welltrak database are set in the dataset database.
DEPTHSAMPLEDDATA
If this statement is present, the sample interval (SI) is assumed to be in depth
units, defined by measurement units found in Binary file header location 55 or
UNITS statement, if one is present.
UNITS <1/2>
If this statement is present, the measurement unit, found in Binary file header
location 55, will be overridden by the specified units, 1=metres 2=feet.
OUTPUT NS <value>
The number of samples per trace to read (NS) must be greater than zero. Since
the default is zero, there must be an OUTPUT NS statement present.
Transfer statement
The transfer statement is used to transfer one item of information from the
SEGY trace header block into the dataset database.
Information may be present in the SEGY trace header block in either 16 or 32 bit
binary integer form. An item of information is referenced by its starting position,
or "start byte" number within the header block.
A transfer statement starts with the keyword identifying the header item of
interest, followed by the size and start byte of the relevant information within the
trace header block.
<KEYWORD> INT32 <start byte> (32 bit integer number)
<KEYWORD> INT16 <start byte> (16 bit integer number)
The following types are not standard SEGY formats but are included for
completeness.
<KEYWORD> UINT16 <start byte> (16 bit unsigned integer number)
<KEYWORD> IBMREAL <start byte> (32 bit IBM real number)
<KEYWORD> IEEEFLOAT <start byte> (32 bit IEEE float number)
Constant statement
If a required item of information is not present in the SEGY header, it may be set
to a constant value as follows.
<KEYWORD> VALUE <signed constant>
For example, to set the source to datum correction to 8500 microseconds
TS VALUE 8500
Modifier statement
After a header item has been assigned a value (either a constant or by transfer
from the header), this value may be offset or scaled using a modifier statement.
MD
Measured depth below well reference level.
TVD
True vertical depth below well reference level.
SCY
Source offset north of wellhead.
RCX
Receiver offset east of wellhead.
RCY
Receiver offset north of wellhead.
TCX
Target offset east of wellhead.
TCY
Target offset north of wellhead.
All coordinates must be referenced to wellhead. If they are supplied relative to
some other point then use the ADD modifier to correct them to wellhead.
If the WELLTRAK statement is present then TVD, RCX and RCY are all found
from the specified Welltrak database.
PICK1 or TPRIMARY
Pick1 (Primary) arrival time in microseconds.
PICK2 or TSECONDARY
Pick2 (Secondary) arrival time in microseconds.
TREF
The reference arrival time in microseconds. Used to modify TPRIMARY and
TSECONDARY after source signature deconvolution. Also used by the Tstatics
operator.
SI
Sample interval in microseconds.
NS
Number of samples per trace.
SRC
Source number (if available).
STK
Stack number (or level number).
NGEO
Geophone number.
LN
Line number for walk-aways.
TSTART
Start time for all traces in microseconds, normally zero.
TDELAY
The start time in microseconds for an individual trace. Might be set for
Schlumberger data, in which case it is probably in milliseconds; use MUL 1000
to convert to microseconds.
SDE
Seismic datum elevation w.r.t. datum.
WRE
Well reference elevation w.r.t. datum, e.g. KB.
SRE
Source reference elevation w.r.t. datum.
SD
Source depth below SREF.
S2M
Source to monitor distance. Typically the distance of the monitor hydrophone
from an air-gun array, always a positive value.
SCORR
Source elevation error. This error may be due to the roll of the shooting vessel
but is not currently measured by any acquisition contractor.
TEXT
External delay in microseconds. The external delay is any delay to which only
the reference signal is subjected before it reaches the recording system, for
example, the transmission delay for a remote source is typically 3000
microseconds.
VW
Water velocity, either in feet per second or metres per second depending upon
the units specified by the MU identifier.
TS
The Source to datum time correction (Ts) in microseconds. Ts is essential for the
calculation of corrections to datum. Ts is the value that must be added after
vertically correcting the time below the source.
In the marine case Ts is always the source depth divided by the water velocity.
REC
Original field record number.
CHAN
Channel number within original field record.
NOINSTK
Number of records contributing to a stacked trace.
USEC
Microseconds after second.
SEGYlist
List the contents of the headers of a SEG-Y format file. Useful for examining the
contents of an anonymous data file.
Requirements
The input file should follow the SEG-Y convention.
Description
The header fields of a SEG-Y file are listed into a text file located in the Opid
data folder. To view the listing, select the relevant SEGYlist operator while in
Display Select mode.
The 3200 byte Textual file header is translated from EBCDIC code. Blank lines
are ignored.
The last 200 bytes of the Binary file header are not listed.
The line header and trace headers are interpreted four bytes at a time.
Unfortunately there exists a good deal of variation in how header fields are
formatted, with some header values sometimes using 2 bytes and others
sometimes using 4 bytes. To overcome this problem we display each group of
four bytes interpreted as two 16-bit values and as a 32-bit value. In either case
the fields are interpreted as big-endian two’s complement values.
Alternatively, if one of the optional outputs is selected, the four bytes are
interpreted as either IBMREAL or IEEEFLOAT. These values are listed last in
each line.
SEGYlist -Parameters
SEGY ID present
Specifies whether the identification header is present in the input file.
Optional output
None, IBMREAL or IEEEFLOAT specifies optional real number interpretation.
SEGYoutput
Output a VSProwess dataset as a SEG-Y format file.
Requirements
Dataset must be sampled in the time or depth domain.
Description
Archive all twigs, traces and samples of a VSProwess dataset into a single SEG-
Y file. The file and trace header format output by VSProwess is a superset of
standard SEG-Y and is specified at the end of this document.
The SEGYoutput operator is commonly used to archive processed data, perhaps
to preserve some interactive pre-processing such as first arrival picking or stack
editing.
The number of samples output to the SEG-Y file is truncated to 64000.
A text file is also created which lists some of the SEG-Y header information.
SEGYoutput -Parameters
Output file
Specifies the root file name for the output SEG-Y and list files. The SEG-Y data
is given a “.sgy” extension, while the text file that lists header information is
given a “.lst” extension.
Format
Specifies the floating-point format. Either IBM floating point format or IEEE754
floating-point format, which is nowadays the native format used by all
workstations.
Select
Accept or reject "marked" traces in a dataset.
Description
Accepts or rejects marked traces in a dataset. Traces may be interactively
"marked" using the Display tool.
The Select operator is commonly used to discard traces from further processing.
A dataset may be split into two by using two complementary Select operators,
one set to accept only marked traces and the other set to reject the marked traces.
Select -Parameters
Marked trace
Specify if marked traces are rejected or accepted.
SensorScale
Scale individual sensors. A simple use may be to reverse the polarity of a
wrongly wired sensor.
Requirements
Dataset must be in time domain.
Description
Scale sensor amplitudes by multiplying samples by a user defined scale value.
SensorScale -Parameters
A text file formatted in comma separated columns. Column 1 is the receiver
number. Column 2, 3 and 4 are the VZ, HX and HY scale factors for the
receiver. In the following example, all receiver 17 HX values, and all receiver 20
HY values, will be reversed.
17,1.0,-1.0,1.0
20,1.0,1.0,-1.0
Sort
Sort a dataset by re-numbering the traces depending upon a selected mapping
parameter.
Requirements
The trace domain of the dataset must be distance.
Description
Sorts a dataset by re-numbering the traces so that the specified mapping
parameter increases or decreases monotonically with trace number. Trace order
is important for operations such as Enhance and FXtoFK. The Sort operator is
also useful for ordering raw data.
Sort -Parameters
Sort direction
Specifies if the selected mapping parameter increases or decreases with trace
number. By convention, traces are usually processed in decreasing depth order.
Stack
Create a stacked dataset, usually from raw data.
Requirements
The trace domain must be distance.
Sample domain must be time.
Description
During VSP acquisition, it is normal practice to acquire several records at each
geophone station. If the records for a particular station are stacked (averaged)
together the effect is to improve the signal to noise ratio. The “law of
diminishing returns“ applies: assuming that the background noise has a “white”
(flat) frequency distribution, then for each doubling of the number of traces in
the stack we should obtain a 3dB improvement in signal to noise ratio.
If the background noise contains impulsive events (e.g. micro-slip, particle
impact or cable relaxation) and if enough records are available (at least five or
more) then the “trimmed mean” algorithm may provide an effective remedy.
Each output sample is the average of the input samples after the two least similar
samples have been excluded.
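A sketch of a trimmed-mean stack is shown below (Python). The manual does not state how "least similar" is measured; deviation from the per-sample median is assumed here, so this is illustrative rather than the exact VSProwess algorithm.

import numpy as np

def trimmed_mean_stack(records):
    # records: array shaped (n_records, n_samples); five or more records recommended.
    x = np.asarray(records, dtype=float)
    deviation = np.abs(x - np.median(x, axis=0))     # assumed similarity measure
    order = np.argsort(deviation, axis=0)
    keep = order[:-2, :]                             # drop the two least similar samples
    cols = np.arange(x.shape[1])
    return x[keep, cols].mean(axis=0)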
The Stack operator produces a stacked output trace for each group of adjacent
input traces that have the same specified database value (usually measured
depth). It is usually necessary to sort the input traces (see the Sort operator) into
depth order before stacking.
Any traces that have been interactively "marked" by the user are excluded from
the stack. The database values for a stacked trace are taken from the highest trace
number in that stack, except for the source coordinates, which are the median of
the source coordinates of the stacked traces.
If requested, a report is generated in a form that can be used by DBupdate. This
could be used to correct source coordinates for instance.
Stack -Parameters
Stack Method
Options are "mean", “median” or "trimmed mean". The trimmed mean option is
useful if impulsive noise is present in the input data.
Tolerance
Traces whose database values lie within the tolerance are stacked together. This
option is only available with further stack criteria.
StackAll
To stack all input traces into a single, replicated, output trace.
Requirements
Dataset must be in the time domain.
Description
The StackAll operator stacks all input traces into a single output trace, which
may be replicated as required. The most common use of this operator is to create
a “corridor stack”.
Zero samples are ignored
The input dataset may well contain arbitrary zones of zeroed data because of
preceding FrontBlank and RearBlank operations. A simple stack of all input
traces may therefore result in amplitude distortion in the output trace. To avoid
this problem, the StackAll operator ignores any samples with the value of
precisely zero. It is extremely unlikely that real data (or filtered synthetic data)
will contain any samples at all whose value is precisely zero, except in the
regions removed by the blanking operations. Do not apply any filtering or
domain change operations to blanked data before executing StackAll, because
zeroed samples would be changed to near-zeroed samples, which are not ignored
by StackAll.
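The zero-sample handling can be pictured with the Python sketch below, which averages only the non-zero contributions at each sample position. Whether StackAll normalises by the count of contributing samples in exactly this way is an assumption.

import numpy as np

def stack_ignoring_zeros(traces):
    x = np.asarray(traces, dtype=float)              # shape (n_traces, n_samples)
    contributing = x != 0.0                          # samples that are precisely zero are ignored
    count = contributing.sum(axis=0)
    total = np.where(contributing, x, 0.0).sum(axis=0)
    return np.divide(total, count, out=np.zeros_like(total), where=count > 0)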
Subtract
Subtract dataset B from dataset A.
Requirements
Datasets must have the same sample domain.
Datasets must have the same trace domain.
Datasets must have the same number of twigs.
Datasets must have the same number of traces.
Datasets must have the same number of samples.
Datasets must have the same sample interval.
Considerations
The output dataset derives its database from input dataset A.
Description
Produces the arithmetic difference of two datasets. Subtraction is performed
sample-by-sample, trace-by-trace and twig-by-twig. The requirement that both
input datasets must have the same sample domain implies that complex numbers
are never subtracted from non-complex numbers.
Parameters
None
Synthetic
Generate a “spiky” synthetic seismogram from velocity, velocity and density, or
acoustic impedance logs.
Requirements
Input must contain at least a velocity or an acoustic impedance twig.
Description
From consecutive acoustic impedance values, I(n) and I(n-1), generate a
reflection coefficient from the equation:
R = (I(n) – I(n-1)) / (I(n) + I(n-1))
The first sample of the reflection coefficient series is zero.
If there is no input velocity twig, one is calculated from the acoustic impedance
using Gardner's relationship.
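For example, the reflection coefficient series can be computed from an acoustic impedance log as follows (a minimal sketch; the Gardner-based velocity step is omitted):

    import numpy as np

    def reflection_coefficients(impedance):
        # impedance: 1-D acoustic impedance log, one value per sample
        I = np.asarray(impedance, dtype=float)
        R = np.zeros_like(I)
        # R(n) = (I(n) - I(n-1)) / (I(n) + I(n-1)); the first sample is zero.
        R[1:] = (I[1:] - I[:-1]) / (I[1:] + I[:-1])
        return R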
Parameters
None
TcorrSmooth
Smooth Tcorrected times.
Description
Smooth Tcorrected times (TCORR) by fitting a best-fit straight line over a
user-defined window, and alter Pick1 accordingly. The USER7 database location
holds the original pick time; USER8 holds the difference (original minus
smoothed pick). Use Trace attribute from the context menu to graph the
differences. Only database information is changed.
Parameters
Smooth length definition
Choose any database field to define the meaning of length.
Length
The length of the window over which smoothing occurs.
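One possible interpretation of the smoothing, sketched in Python (assumptions: the window is centred on each trace and measured in the chosen database field, e.g. measured depth):

    import numpy as np

    def smooth_picks(field, tcorr, length):
        # field:  values of the database field that defines the window (e.g. MD)
        # tcorr:  Tcorrected pick times, one per trace
        # length: window length in the same units as field
        field = np.asarray(field, dtype=float)
        tcorr = np.asarray(tcorr, dtype=float)
        smoothed = tcorr.copy()
        for i, x in enumerate(field):
            inside = np.abs(field - x) <= length / 2.0
            if inside.sum() >= 2:
                slope, intercept = np.polyfit(field[inside], tcorr[inside], 1)
                smoothed[i] = slope * x + intercept
        difference = tcorr - smoothed   # original minus smoothed (cf. USER8)
        return smoothed, difference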
Tfilter
Apply a zero-phase band-pass filter in the time domain.
Requirements
Dataset must be in the time domain (or depth domain).
Tfilter -Parameters
F1
Frequencies below F1 are completely attenuated.
F2
Frequencies between F1 and F2 are attenuated linearly. F2 must be higher
frequency than F1.
F3
Frequencies between F2 and F3 are passed unaltered by the filter. F3 must be a
higher frequency than F2.
F4
Frequencies between F3 and F4 are attenuated linearly; frequencies above F4 are
attenuated completely. F4 must be higher than F3 and less than half the Nyquist
frequency.
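The trapezoidal response defined by F1–F4 can be sketched as follows (illustrative numpy code, not the VSProwess implementation; a purely real frequency response keeps the filter zero phase):

    import numpy as np

    def trapezoid_bandpass(trace, dt, f1, f2, f3, f4):
        # trace: 1-D time-domain samples; dt: sample interval in seconds
        # f1 < f2 < f3 < f4 in Hz, defining the trapezoidal pass band
        trace = np.asarray(trace, dtype=float)
        n = trace.size
        freqs = np.fft.rfftfreq(n, d=dt)
        # 0 below F1, ramp up to 1 at F2, flat to F3, ramp down to 0 at F4.
        response = np.interp(freqs, [f1, f2, f3, f4], [0.0, 1.0, 1.0, 0.0],
                             left=0.0, right=0.0)
        return np.fft.irfft(np.fft.rfft(trace) * response, n=n)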
Time2Depth2
Convert from time to depth domain using TVDSD twig.
This help topic is still under development.
ToolOrientate
Purpose
Find tool azimuth or tool rotation. ToolOrientate does not apply the orientation
angle.
Requirements
Dataset must contain VZ, HX and HY twig descriptors.
Dataset must be in the time domain.
The tool inclination and azimuth must be available in the dataset database. See
appendix describing angle use in VSProwess.
Description
An orientation angle is calculated at the Pick1 time. If Pick1 is at a break,
ToolOrientate will find a peak time by searching forward from the break time to
the next peak in the vector magnitude, calculated from all the available
components. For QC purposes, this peak time is saved in the Pick2 location of
the output dataset database.
Uses
Use ToolOrientate to estimate unknown tool orientations.
For fixed or gimballed geophone elements, in a vertical portion of the borehole
(defined by deviation threshold) use ToolOrientate to find tool azimuth.
For fixed geophone elements, in a deviated portion of the borehole (defined by
deviation threshold) use ToolOrientate to find the tool rotation.
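The orientation estimate can be pictured with the following sketch (a simplification: the peak search is reduced to a global maximum, and only the relation between HX and HY at the peak is shown; relating this angle to tool azimuth or rotation also requires the survey geometry and the database angles):

    import numpy as np

    def horizontal_orientation(vz, hx, hy, dt, pick1):
        # vz, hx, hy: component traces; dt: sample interval (s); pick1: break (s)
        vz, hx, hy = (np.asarray(c, dtype=float) for c in (vz, hx, hy))
        start = int(round(pick1 / dt))
        # Peak of the vector magnitude after the break (cf. the Pick2 time).
        mag = np.sqrt(vz[start:] ** 2 + hx[start:] ** 2 + hy[start:] ** 2)
        i = start + int(np.argmax(mag))
        # Direction of horizontal particle motion, measured clockwise from HY
        # towards HX (HX is defined as 90 degrees clockwise from HY).
        return np.degrees(np.arctan2(hx[i], hy[i])) % 360.0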
ToolOrientate -Parameters
Deviation threshold
Choose from Source or North.
Tramp
Apply time-dependent scaling to a dataset to compensate for spherical
divergence.
Requirements
Dataset must be in the time domain.
Description
Applies a simple time-dependent scaling using the factor (sample time)^exp,
where exp is an exponent. An exponent of one will correct amplitudes
proportionally with time.
Tramp -Parameters
Exponent
Specifies the exponent. Commonly somewhere between one and two.
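In sketch form (illustrative Python, not VSProwess code):

    import numpy as np

    def tramp(trace, dt, exponent=1.5):
        # trace: 1-D time-domain samples; dt: sample interval in seconds
        trace = np.asarray(trace, dtype=float)
        t = np.arange(trace.size) * dt
        # Scale each sample by (sample time)^exponent; an exponent of one
        # scales amplitudes proportionally with time.
        return trace * t ** exponent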
Transpose
Transpose a corridor of data near the first arrival curve.
Requirements
Dataset must be in the time domain or depth sampled.
Considerations
Dataset should be aligned at two-way vertical time or True Vertical Depth below
datum.
Trace 1 must be the deepest trace.
Description
The corridor of data just below the primary arrival curve will normally have a
better signal-to-noise ratio than data far below the primary arrival curve, and
will usually be the most useful part of the VSP dataset for comparison against a
seismic section or synthetic seismogram. It is convenient to be able to extract
and display this corridor in a “transposed” form.
The Transpose operator synthesises N output traces as follows: For each
sampling instant take the row of N samples, starting at the first trace which
crosses the primary arrival curve. The first output trace therefore contains
samples from immediately below the primary arrival curve. Subsequent output
traces contain samples from progressively further below the primary arrival
curve.
To visualise the transposition process, imagine an “abacus beads” model with a
pattern of coloured beads resembling a VSP: slide N beads from the first row
fully to the left, followed by N beads from the second row, and so on.
It is usual to generate from 5 to 20 output traces. If events appear to be consistent
across the transposed traces, it is often useful to stack all the traces together and
replicate this stacked trace N times. See the “StackAll” operator.
Remember that transposed data from below TD may suffer from a poor
signal-to-noise ratio because the deepest VSP traces often have the worst data
quality due to open-hole washouts, etc. Consider removing these traces if
necessary.
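The row-by-row extraction can be sketched as follows (an illustration only; it assumes trace 0 is the deepest trace and that the primary arrival times are supplied in the same aligned time frame as the data):

    import numpy as np

    def transpose_corridor(traces, arrival_times, dt, n_out=10):
        # traces:        2-D array (n_traces, n_samples); trace 0 is the deepest
        # arrival_times: primary arrival time of each trace, in seconds
        # n_out:         number of transposed output traces (typically 5 to 20)
        traces = np.asarray(traces, dtype=float)
        arrival_times = np.asarray(arrival_times, dtype=float)
        n_traces, n_samples = traces.shape
        out = np.zeros((n_out, n_samples))
        for s in range(n_samples):
            t = s * dt
            # first trace whose primary arrival lies at or above this sample
            below = np.where(arrival_times <= t)[0]
            if below.size == 0:
                continue                 # this row is entirely above the curve
            j = below[0]
            for k in range(min(n_out, n_traces - j)):
                out[k, s] = traces[j + k, s]
        return out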
Tshift
Apply time shifts to a dataset, including arrival alignment and common shift.
Requirements
Dataset must be in the time domain.
Description
This operator is used to apply a sub-sample resolution time-shift to a dataset.
The time-shift may be controlled by the arrival picks to allow horizontal
alignment of events.
A multiplier may be specified, for example to permit shifts to two-way time.
An additional common shift may be applied identically to all traces.
Note that the “Tcorrected” option assumes that all information required for the
calculation of vertical datum corrected times is actually available in the input
database.
A shift to “Tcorrectedx2” sets a flag in the output database that turns on a
suitable arrival curve when the output dataset is displayed.
This operator may also be used to truncate or extend the number of samples in a
dataset.
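Sub-sample shifts are conveniently expressed as a linear phase ramp in the frequency domain; the following sketch illustrates the idea (it is not the VSProwess implementation and, unlike it, wraps samples around the trace ends):

    import numpy as np

    def subsample_shift(trace, dt, shift):
        # trace: 1-D time-domain samples; dt: sample interval (s)
        # shift: time shift in seconds (positive moves events later); it need
        #        not be a whole number of samples
        trace = np.asarray(trace, dtype=float)
        n = trace.size
        freqs = np.fft.rfftfreq(n, d=dt)
        spectrum = np.fft.rfft(trace) * np.exp(-2j * np.pi * freqs * shift)
        return np.fft.irfft(spectrum, n=n)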
Tshift -Parameters
Shift using database
If selected, dataset database information will be used to define the size of the
shift “per trace”.
Amount of shift
Controls how much time shift is applied on a trace-by-trace basis. However, note
that any non-zero common time shift is applied globally to all traces in addition
to the per trace shift selected from the following list.
Multiplier
The pick-controlled shift may be multiplied by 1, 2 or 3. Common shift is not
affected.
Output samples
Specify the number of output samples. This may be used either to extend the
data with zeros or to truncate the data.
Common T shift
A common shift may be applied identically to all traces.
Common T start
The start time for all traces may be set as required.
A negative value may be specified to allow the display of data before time zero.
This is useful for viewing, for example, down-wave aligned data and auto-
correlated data.
A positive value may be specified, for example to discard dead samples before
the first arrival. This is useful to reduce the length of data required for
processing.
Tstatics
Apply source-related static time corrections to a dataset.
Requirements
Dataset must be in time domain.
Description
This operator applies sub-sample resolution time shift corrections to the input
dataset.
The value of each correction may vary for each trace and must be present in the
input database.
Any pick times present in the input database are also adjusted accordingly.
Tstatics -Parameters
External delay
If enabled then all non-reference traces are time shifted by the external delay
values extracted, for each trace, from the input database. The external delay
fields in the output database are zeroed after this correction has been applied.
External delay is usually the instrument delay involved in transmitting the source
signature from a remote source. The correction is necessary to restore the
correct phase relationship between receiver and reference traces.
Twindow
Extract a windowed segment of time domain data.
Requirements
Dataset must be in the time domain.
Description
Twindow multiplies each trace by a raised cosine template defined by the four
times T1–T4.
Time-variant filter
The most common use for Twindow is the implementation of a time-variant
filter. A trace may be broken into two or more overlapping segments that are
separately filtered and subsequently recombined using the Add operator. The
ramps at the beginning and end of each segment should overlap perfectly and be
sufficiently long to keep truncation effects at an acceptable level.
Twindow -Parameters
T1
All samples before this time are set to zero.
T2
The sample at time T2 is unchanged. The samples between times T1 and T2 are
attenuated linearly with time. T2 must be greater than T1.
T3
The samples between times T2 and T3 are unchanged. T3 must be greater than
T2.
T4
The samples between times T3 and T4 are attenuated linearly with time. All
samples after T4 are set to zero. T4 must be greater than T3.
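A sketch of the windowing (illustrative only; raised-cosine ramps are assumed between T1–T2 and T3–T4):

    import numpy as np

    def twindow(trace, dt, t1, t2, t3, t4):
        # trace: 1-D time-domain samples; dt: sample interval (s)
        # t1 < t2 <= t3 < t4 define the window, in seconds
        trace = np.asarray(trace, dtype=float)
        t = np.arange(trace.size) * dt
        template = np.zeros_like(trace)
        template[(t >= t2) & (t <= t3)] = 1.0          # pass band
        rise = (t > t1) & (t < t2)
        fall = (t > t3) & (t < t4)
        template[rise] = 0.5 - 0.5 * np.cos(np.pi * (t[rise] - t1) / (t2 - t1))
        template[fall] = 0.5 + 0.5 * np.cos(np.pi * (t[fall] - t3) / (t4 - t3))
        return trace * template                        # zero outside T1-T4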
TXtoFX
Transform a dataset from TX space to FX space.
Requirements
Dataset must be in TX space, i.e. the trace domain must be distance and the
sample domain must be time. This is the usual case for recorded data.
Description
TXtoFX is used to transform a dataset from the time domain to the frequency
domain using an FFT (Fast Fourier Transform) algorithm. Because the FFT
requires that the number of samples be an exact power of two, TXtoFX
automatically pads each trace with zero values up to the next power of two.
For this reason, you should try to ensure that the number of samples in the input
data is no larger than necessary. For example, a 5000-sample trace takes
considerably longer to process and consumes more than twice the disk space of a
4096-sample trace.
The inverse operator is FXtoTX, which transforms the dataset from the frequency
domain back to the time domain. If no further operators have been applied while
the data was in the frequency domain then the transformed/inverse-transformed
dataset should be identical to the original dataset to a high degree of accuracy.
Seismic data in the time domain is mathematically real, which implies that after
transformation to the frequency domain the resulting complex data is
conjugate-symmetric about the zero-frequency axis. This property is exploited
to halve the size of the transformed dataset by discarding the negative
frequencies.
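The same steps can be sketched with numpy's real FFT, which pads to the next power of two and keeps only the non-negative frequencies (illustrative only):

    import numpy as np

    def tx_to_fx(traces):
        # traces: 2-D real array (n_traces, n_samples) in TX space
        traces = np.asarray(traces, dtype=float)
        n = traces.shape[1]
        n_fft = 1 << (n - 1).bit_length()      # next power of two
        padded = np.pad(traces, ((0, 0), (0, n_fft - n)))
        # For real input the spectrum is conjugate-symmetric, so only the
        # non-negative frequencies need to be stored.
        return np.fft.rfft(padded, axis=1)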
Parameters
None.
Unequalise
Reverse the effect of the Equalise operator.
Requirements
Dataset must be in the time domain.
Description
Reverses the effect of the last Equalise operator applied to the dataset.
Parameters
None.
Wavelet
Generate a single trace containing a user specified wavelet.
Wavelet -Parameters
Wavelet type
Select a single sample impulse or use the arbitrary template file specified in the
Template file parameter.
Template file
Browse for a template file.
Sample interval in ms
Enter the required sample interval in milliseconds.
File header
The first 3600 bytes of a SEGY file form the “File header”, which consists of a
3200-byte “Textual file header” and a 400-byte “Binary file header”.
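For reference, the file header can be read as follows (a minimal sketch, not VSProwess code; byte offsets follow the SEGY standard, and the textual header is assumed to be EBCDIC or ASCII):

    import struct

    def read_segy_file_header(path):
        # The 3600-byte file header: 3200-byte textual + 400-byte binary header.
        with open(path, "rb") as f:
            textual = f.read(3200)
            binary = f.read(400)
        # The textual header is traditionally EBCDIC (conventionally starting
        # with "C"); cp037 is the EBCDIC codec, otherwise assume ASCII.
        encoding = "cp037" if textual[:1] == b"\xc3" else "ascii"
        text = textual.decode(encoding, errors="replace")
        # A few commonly used binary-header fields (big-endian 16-bit values):
        sample_interval_us, = struct.unpack(">h", binary[16:18])
        samples_per_trace, = struct.unpack(">h", binary[20:22])
        data_format_code, = struct.unpack(">h", binary[24:26])
        return text, sample_interval_us, samples_per_trace, data_format_code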
Twig Descriptors
This section describes the various twig descriptors used by VSProwess.
Recorded information
VZ is the vertical component.
HX is a horizontal component.
HY is a horizontal component.
REF is the reference trace.
DH is a down-hole hydrophone.
CH1 – CH8 are other channels, e.g. other reference traces.
PILOT is the vibrator pilot sweep.
GF is the vibrator ground force.
RM is the vibrator reaction mass.
BP is the vibrator baseplate.
Processed information
MAG is the vector magnitude twig from AutoPick.
SYN is synthetic data, e.g. from Wavelet.
XOFF is the X offset twig from NMO.
YOFF is the Y offset twig from NMO.
Angles in VSProwess
Recorded components (all tool types)
VZ twig contains the vertical component.
HX twig contains a horizontal component perpendicular to VZ and HY.
HY twig contains a horizontal component perpendicular to VZ and HX.
A down-going direct P-wave arrival peak on the VZ component must be a
positive number.
The direct P-wave arrival peak in the HY component must be a positive number
when the HY component points toward the source. The HX component is
expected to be 90 degrees clockwise from the HY component. It may be
necessary to alter VZ, HX and HY polarities. Use MIRFinput and ACQinput to
do this if possible, otherwise, use SensorScale.
Processed components
Multi-component operators in VSProwess attempt to retain true vector
information, wherever possible. To produce results in a known direction they
require the following component orientation at the start of processing.
VZ is vertical.
H1 is horizontal and pointing north.
H2 is horizontal and pointing west.
Pre-processing depends on the survey type. For more information, create a new
job and load in the “Avalon Sciences/Routes/Three component route”.
Walkaways
Walkaways are a special case. TOOLINC, TOOLAZ and TOOLROT should be
the same for all source positions into the same tool location. If necessary, use
ToolOrientate to find TOOLROT/TOOLAZ dataset database values. Export the
database and graph the resultant values to pick the most suitable value. Import
the required values using DBupdate.
For multi-level walkaways use NGEO as the key in DBupdate and have one line
for each geophone number. Alternatively, use MD or TVD as the key.
The ray trace above illustrates the reflection point loci for a single trace
(source-receiver pair).
Format
The loci file is an ASCII file with Comma Separated Values (CSV).
There must be no more than 20 columns (0 to 19). Lines starting with # are
ignored.
There must be one row, before any data rows, describing the values in each
column. The following strings are recognized in the descriptor row:
TR, XX, YY, TVDSD, TT, TCORR, INC, AZIMUTH, RANGLE, TYPE,
LAYER, PLUNGE, STRIKE, VRMS, VRMSI, VRMSX
The columns can be in any order but it is recommended that the TR column
appears first.
The row after the descriptors is a row of unit identifiers. The following strings
are recognized: m, ft, m/s, ft/s, s, ms, deg, n (no units).
Depth and offset values are in feet or meters as defined by the unit row.
Time values are seconds or milliseconds as defined by the unit row.
Angles are degrees.
TR is the trace number.
• A new TR value is required for each source receiver pair.
• Rows with the same TR value must be grouped together.
All source and receiver depths are below or at datum. For some land datasets a
datum other than seismic datum may need to be used.
XX is the East coordinate of the reflection point, source or receiver.
YY is the North coordinate of the reflection point, source or receiver.
TVDSD is the vertical depth of the reflection point, source or receiver below the
top of the model.
TT is the modeled slant travel time from source to reflection point to receiver.
TCORR is the vertical travel time from the top of the model to the reflection
depth, through the model, i.e. the one-way vertical time.
INC is the incident angle of the last ray of the reflection ray path at the
receiver. INC = 0 for a vertical upward travel path; INC = 90 for a horizontal
travel path. INC is positive if the receiver-to-reflection-point vector is in the
AZIMUTH direction, which is the usual situation. Steep events may cause ray
paths to pass the receiver before reflecting, giving a negative INC.
AZIMUTH is the azimuth of the last ray of the reflection ray path at the
receiver. An azimuth of zero is north; positive values are clockwise from north.
RANGLE is the angle of reflection relative to the normal to the layer. An
RANGLE of zero is normal incidence.
TYPE is the type of reflection, 1000=PdownPup (PP), 2000=PdownSup (PS),
3000=PdownSdownSup (PSS).
LAYER is an integer identifier for the layer at the reflection point.
PLUNGE is the local model layer plunge (dip) at the reflection point.
STRIKE is the local model layer strike direction at the reflection point. Looking
in the strike direction a layer will be shallowest to the left.
VRMS is the RMS velocity along the travel path from source to reflection point
to receiver.
VRMSI and VRMSX are inline and cross-line RMS velocity values which will
differ from VRMS in an anisotropic model.
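A loci file in this format can be parsed with a few lines of Python (a sketch under the assumptions above: one descriptor row, one unit row, numeric data values, and “#” comment lines):

    import csv

    def read_loci_file(path):
        header = units = None
        rows = []
        with open(path, newline="") as f:
            for record in csv.reader(f):
                if not record or record[0].lstrip().startswith("#"):
                    continue                     # skip blank and comment lines
                if header is None:
                    header = [c.strip().upper() for c in record]
                elif units is None:
                    units = [c.strip() for c in record]
                else:
                    rows.append(dict(zip(header, (float(c) for c in record))))
        # Rows with the same TR value belong to one source-receiver pair.
        groups = {}
        for row in rows:
            groups.setdefault(row["TR"], []).append(row)
        return header, units, groups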
CSV
Comma Separated Values, a universal text-based data interchange format
supported by virtually all spreadsheet and database software.