...
```sql
set timing on

-- GEODESC. Remove last expedition's GEODESC project.
drop user geodp### cascade;

-- LIMS. Remove last expedition's LIMS content.
-- Review TRANSFER.LOG entries to monitor these activities.
call transfer.util_cleanout_transfer_new_tables(); -- (1)
call transfer.box_asman_catalog_delete('EXP346'); -- (2) Last expedition
call transfer.box_lime_audit_delete(###); -- (4)
call transfer.box_lims_expedition_delete('346'); -- (3) Last expedition
call transfer.box_lims_cleanup();
call transfer.box_lims_defrag(); -- (5)

-- OPS. Remove last expedition's IRIS data.
-- The timestamps below are placeholders and will not be correct.
-- Determine when the database export was conducted and adjust the time range accordingly.
delete from ops.ris_data where observed_time <= to_timestamp('2023-08-11 13:00:00.0', 'yyyy-mm-dd hh24:mi:ssxff');
delete from ops.ris_configuration where last_modified <= to_timestamp('2023-08-11 13:00:00.0', 'yyyy-mm-dd hh24:mi:ssxff');
delete from ops.ris_event where event_timestamp <= to_timestamp('2023-08-11 13:00:00.0', 'yyyy-mm-dd hh24:mi:ssxff');
```
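Before running the OPS deletes, it can help to confirm how many rows fall within the chosen time range. A minimal sketch, using the same placeholder cutoff as above (adjust it to the actual export time):

```sql
-- Count OPS/IRIS rows that the chosen cutoff would remove.
-- The timestamp is a placeholder; set it to when the database export was conducted.
select count(*) from ops.ris_data
 where observed_time <= to_timestamp('2023-08-11 13:00:00.0', 'yyyy-mm-dd hh24:mi:ssxff');
select count(*) from ops.ris_configuration
 where last_modified <= to_timestamp('2023-08-11 13:00:00.0', 'yyyy-mm-dd hh24:mi:ssxff');
select count(*) from ops.ris_event
 where event_timestamp <= to_timestamp('2023-08-11 13:00:00.0', 'yyyy-mm-dd hh24:mi:ssxff');
```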
...
- VARIANCE: If you need to keep some data from the previous expedition for the next expedition, skip this step. Even if you forget, you can still copy what you need from the TRANSFER.NEW_* tables without having to resort to the data pump import tool.
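A sketch of preserving rows from one of the TRANSFER.NEW_* tables before cleanout; the table name NEW_SAMPLE, the target table name, and the expedition value are all hypothetical placeholders:

```sql
-- Hypothetical sketch: copy rows worth keeping out of a TRANSFER.NEW_* table
-- before util_cleanout_transfer_new_tables() empties it.
create table transfer.keep_exp346_sample as
  select *
    from transfer.new_sample       -- placeholder NEW_* table name
   where x_expedition = '346';     -- placeholder expedition
```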
- VARIANCE: If you need to keep some data from the previous expedition for reference on the next expedition, custom SQL is required to preserve what you need. You must also communicate with the MCS to ensure what you need is preserved on the ASMAN file system; their routine is to expunge the previous expedition.
- VARIANCE: If multiple expeditions need to be cleaned, repeat the command for each expedition.
Run the command again if you see errors like this:

```
Error report -
ORA-20000: ORU-10027: buffer overflow, limit of 1000000 bytes
ORA-06512: at "SYS.DBMS_OUTPUT", line 32
...
```
The above error should no longer occur: use of DBMS_OUTPUT has been replaced with inserts to TRANSFER.LOG for all TRANSFER procedures and some LIMS functions and procedures.
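Since progress is now written to TRANSFER.LOG, recent activity can be reviewed there. A sketch only; the log table's column names are not documented here, so the ordering is by first column as a placeholder:

```sql
-- Sketch: review recent TRANSFER.LOG entries to monitor cleanup progress.
-- Ordering by column position is a placeholder; adjust to the actual timestamp column.
select *
  from transfer.log
 order by 1 desc
 fetch first 50 rows only;
```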
Alternatively, `set serveroutput off`, then use SQL like the example below to monitor the progress of transfer.box_lims_expedition_delete() from another terminal window or SQL Developer worksheet:

```sql
select x_expedition, count(*)
  from lims.sample
 group by x_expedition
 order by x_expedition;
```
- Look up and specify ### (lims.x_lime_audit_main.audit_key). The script as written deletes all audit_key <= ### from lims.x_lime_audit_main. The deletion cascades to the linked lims.x_lime_audit_detail.
- Defragmentation of the space used by SAMPLE, TEST, and RESULT is recommended, but not required if you are pressed for time. Defragmentation on 398 took an hour to run with current ODAs and configurations; older ODAs and configurations took 9-14 hours.
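One way to find a value for ### before calling box_lime_audit_delete(). A minimal sketch: only the audit_key column is confirmed by this document, so any additional filter column must be checked against the actual table definition:

```sql
-- Sketch: inspect the current high-water mark of audit keys.
-- Add a WHERE clause appropriate to the rows you intend to purge;
-- unconstrained, max(audit_key) would target every audit record.
select max(audit_key)
  from lims.x_lime_audit_main;
```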
2/13/2024: Removed CONSTRAINTS from the end of the drop user geodp### command; it was causing the script to fail.
Remove SCALES not relevant to current shipboard work
...
There is presently no script for this process. The SCORS application will cancel affines and splices, but it does not expunge the records.
The BOX_LIMS_EXPEDITION_DELETE() procedure contains a superset of this process, but it must usually be copied and customized for the specific need. Review the code.
Use SQL Developer to remove content not relevant to the current expedition.
Info: When removing records from LIMS.X_SAMPLE_DEPTH, do not remove any records with scale_id = 11311 or 11331; some of these records are needed for 999 and QAQC samples. Removal of X_SAMPLE_DEPTH records should be done selectively by expedition and scale. If you have already removed 11311/11331 records, the script below will restore them.

```sql
-- Restore X_SAMPLE_DEPTH records for scales 11311 and 11331
-- on the 999 and QAQC expeditions, where missing.
INSERT INTO lims.x_sample_depth (sample_number, scale_id, top, bot)
select s.sample_number, '11311' as scale_id, 0 as top, 0 as bot
  from lims.sample s
  left outer join lims.x_sample_depth sd
    on sd.sample_number = s.sample_number
   and sd.scale_id = '11311'
 where s.x_expedition = '999'
   -- and s.x_site = 'U9999'
   -- and s.x_hole = 'A'
   -- and s.x_core = '1'
   and sd.sample_number is null;

INSERT INTO lims.x_sample_depth (sample_number, scale_id, top, bot)
select s.sample_number, '11331' as scale_id, 0 as top, 0 as bot
  from lims.sample s
  left outer join lims.x_sample_depth sd
    on sd.sample_number = s.sample_number
   and sd.scale_id = '11331'
 where s.x_expedition = '999'
   -- and s.x_site = 'U9999'
   -- and s.x_hole = 'A'
   -- and s.x_core = '1'
   and sd.sample_number is null;

INSERT INTO lims.x_sample_depth (sample_number, scale_id, top, bot)
select s.sample_number, '11311' as scale_id, 0 as top, 0 as bot
  from lims.sample s
  left outer join lims.x_sample_depth sd
    on sd.sample_number = s.sample_number
   and sd.scale_id = '11311'
 where s.x_expedition = 'QAQC'
   -- and s.x_site = 'U9999'
   -- and s.x_hole = 'A'
   -- and s.x_core = '1'
   and sd.sample_number is null;

INSERT INTO lims.x_sample_depth (sample_number, scale_id, top, bot)
select s.sample_number, '11331' as scale_id, 0 as top, 0 as bot
  from lims.sample s
  left outer join lims.x_sample_depth sd
    on sd.sample_number = s.sample_number
   and sd.scale_id = '11331'
 where s.x_expedition = 'QAQC'
   -- and s.x_site = 'U9999'
   -- and s.x_hole = 'A'
   -- and s.x_core = '1'
   and sd.sample_number is null;
```
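A sketch of such a selective delete, built from the same tables and columns used in the restore script; the scale and expedition values are hypothetical placeholders and must never target scales 11311 or 11331:

```sql
-- Selectively delete X_SAMPLE_DEPTH records by expedition and scale.
-- Placeholder values; do NOT target scale_id 11311 or 11331
-- (needed for 999 and QAQC samples).
delete from lims.x_sample_depth sd
 where sd.scale_id = '12345'           -- hypothetical scale to remove
   and sd.sample_number in (
         select s.sample_number
           from lims.sample s
          where s.x_expedition = '345' -- hypothetical expedition
       );
```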
Cleaning OVERVIEW Menu and Summary Data
...