The on-coming developers are responsible for these processes!
The intent of these processes is to establish database configurations for a new expedition.
If the off-going developers want to help and time permits, that's wonderful. Relying on that help, though, is a Catch-22 that probably indicates you need some rest.
Connection Notes
Log into the operating-system oracle user first; then conduct database user logins.
- Open a SQL Developer session.
- Connect to oracle@k1.ship.iodp.tamu.edu.
- From the k1$ prompt, run the commands shown.
- Supply the appropriate credentials.
Example of the above from Windows Terminal (wt.exe), PowerShell, PuTTY, HyperTerminal, or any other such tool:
    ssh oracle@k1.ship.iodp.tamu.edu
    cd /backup/export
    . oraenv
    ORACLE_SID = [oracle] ? LIMSJR    (or provide SHIPTEST)
    sql me@limsjr

    # If you are conducting DBA-privileged activities, use your DBA account
    sql my_dba@limsjr
NOTE: For command-line tools such as sql, sqlplus, and rman: if your password contains spaces or symbol characters, you must quote the password.
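For example (a sketch only; the account name and password are placeholders), wrap the whole connect string in single quotes so the shell passes the double-quoted password through intact:

    sql 'me/"my pass word"@limsjr'

Alternatively, connect as in the session example above and let the tool prompt for the password; no quoting is needed then.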
Run data cleaning and preening scripts
Connect to Oracle as TRANSFER (schema owner) or your DBA account. Doing this work in SQL Developer is often convenient.
- Make appropriate substitutions for current expeditions.
- If you are new to the process, read the footnotes. Ask.
The box_lims_expedition_delete() step is the longest, and monitoring it is helpful. It currently takes about 30 minutes, mostly to delete millions of results. Historically it ranged from 2 to 8 hours (older hardware, fewer CPUs, slower disks).
If material from another expedition was analyzed on this expedition, it may be necessary to delete that content too, e.g. call box_lims_expedition_delete('344(312)');
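As a concrete sketch (the expedition labels are placeholders; substitute the actual off-going expedition and any parenthetical set as described above):

    -- delete all content for the off-going expedition
    call box_lims_expedition_delete('398');

    -- repeat for material from another expedition analyzed on this one
    call box_lims_expedition_delete('344(312)');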
- VARIANCE: If you need to keep some data from the previous expedition for the next expedition, skip this step. Even if you forget, you can still copy what you need from TRANSFER.NEW_tables without having to resort to the data pump import tool.
- VARIANCE: If you need to keep some data from the previous expedition for reference on the next expedition, then custom SQL is required to preserve what you need. And you must communicate with the MCS to ensure what you need is preserved on the ASMAN file system--their routine is to expunge the previous expedition.
- VARIANCE: If multiple expeditions need to be cleaned, repeat the command for each expedition.
- You must look up and specify ###. The script as written deletes all audit keys <= ### from x_lime_audit_main. The deletion cascades to linked audit detail. (See the sketch after this list.)
- Defragmentation of the space used by SAMPLE, TEST, and RESULT is recommended, but not required if you are pressed for time. Defragmentation on 398 took an hour to run with current hardware and disk allocation; older hardware took 9-14 hours. (See the sketch after this list.)
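A minimal sketch of the last two bullets. The AUDIT_KEY column, the LIMS schema qualifier, and the MOVE-plus-rebuild approach are assumptions rather than extracts from the actual script, so prefer the script's own statements:

    -- purge audit rows at or below the cutoff key you looked up (supply ### when prompted)
    DELETE FROM x_lime_audit_main WHERE audit_key <= &audit_cutoff;
    COMMIT;

    -- one way to defragment SAMPLE, TEST, and RESULT: move each table,
    -- then rebuild its indexes, which MOVE leaves UNUSABLE
    ALTER TABLE lims.result MOVE;
    ALTER TABLE lims.test MOVE;
    ALTER TABLE lims.sample MOVE;
    -- ALTER INDEX lims.<index_name> REBUILD;   (repeat for each affected index)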
Cleaning OVERVIEW Menu and Summary Data
Participants on a new expedition have no need to see either menus or data from the previous expedition in OVERVIEW.
Menus. OVERVIEW menus are stored in LIMS.LIST_ENTRY. View the content where LIST = 'MENU'. Change old expedition entries to the new expedition and drop menus that are not needed. The ODP and DSDP menus should always be retained.
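A sketch of the menu review, using the table and list name given above. The UPDATE and DELETE predicates are left as comments because the relevant column names are not shown here; check the table before changing anything:

    -- review the OVERVIEW menus
    SELECT * FROM lims.list_entry WHERE list = 'MENU';

    -- then change entries that reference the old expedition and remove menus
    -- that are no longer wanted, always keeping the ODP and DSDP menus, e.g.
    -- UPDATE lims.list_entry SET <column> = <new value> WHERE list = 'MENU' AND <condition>;
    -- DELETE FROM lims.list_entry WHERE list = 'MENU' AND <condition>;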
Data clean. The summary content displayed by OVERVIEW lives in LIMS.X_DBOVERVIEW. The table is indexed by expedition, site, and hole. It is sufficient to clean out specific content with a query like the sketch below.
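The column name EXPEDITION and the expedition label are assumptions here (the label is a placeholder for the off-going expedition); verify both against the table before running:

    DELETE FROM lims.x_dboverview WHERE expedition = '398';
    COMMIT;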
Containers to be loaded for MAD analyses
See the MAD container load process and the additional related documentation on the shore wiki. The names and masses of the containers are delivered in a spreadsheet (or text file).
This is not done every expedition.