...
The intent of these processes is to clean and preen the database and establish required database configurations for a new expedition. Only a small portion of this content is needed for any one expedition. Additional scenarios and combinations of data handling are described and considered below.
Process ----> DO THIS ON BUILD BOX
...
via a shared account like SHIP\DAQ
- Because it may take a long time and you don't want to tie up your laptop
...
- Because your opposite may need to check in on progress if it runs longer than your shift.
Connection Notes
...
Log into the operating system oracle user first; then conduct database user logins.
- Open a PuTTY session.
- Connect to oracle@k1.ship.iodp.tamu.edu.
- From the k1 $ prompt, run the commands shown.
- Supply the appropriate credentials.
NOTE: For command-line tools like sqlplus and rman, if your password has spaces or symbol characters, you must quote the password, e.g.
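A minimal sketch of what this looks like in the PuTTY window (the user, password, and connect string are all illustrative):

$ sqlplus 'transfer/"my pass&word"@limsjr'

The single quotes protect the string from the shell; the double quotes are what Oracle requires around a password containing spaces or symbols. The same quoting applies to rman.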
Run the SQL scripts
Connect to Oracle as the TRANSFER schema owner. We recommend sqlplus. Make appropriate substitutions for the current expedition. If you are new to the process, read the footnotes carefully. Ask other developers or DBAs.
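A representative session, using the procedures documented below (the expedition and catalog values are illustrative; substitute your own):

set serveroutput on
call asman_catalog_delete('EXP339');
call lims_expedition_delete('339');
commit;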
NOTE: participants_remove.sql, found in SVN under C:\develop\wapps\Tools\DbScript\participants, needs to be run to remove all the old JRS_ scientist user IDs. It must be run under a DBA-privileged account in SQL Developer. The process removes the user IDs and also removes dependent user configuration from the X_AUTH_ACCOUNT table.
...
Logging the effort quantifies the experience for future improvements. Please place the log file in an expedition folder under r:\ad\support\###...\box\
- The lims_expedition_delete() step is the longest. Monitoring is required.
- If material from another expedition was analyzed on this expedition, it may be necessary to delete that content too, e.g.
call lims_expedition_delete('344(312)');
- VARIANCE: If you need to keep some data from the previous expedition for the next expedition, skip this step. Then you can copy what you need from TRANSFER.NEW_tables without having to resort to the data pump import tool.
- VARIANCE: If you need to keep some data from the previous expedition for reference on the next expedition, then custom SQL is required to preserve what you need. And you must communicate with the MCS to ensure what you need is preserved on the ASMAN file system--their routine is to expunge the previous expedition.
- VARIANCE: If multiple expeditions need to be cleaned, repeat the command for each expedition.
- You must look up and specify ###. The script as written deletes all audit keys <= ### from x_lime_audit_main. The deletion cascades to linked audit detail.
- Defragmentation of the space used by SAMPLE, TEST, and RESULT is recommended, but not required if you are pressed for time. "Coalescing" the RESULT table alone typically takes 9 hours in the shipboard environment; on a warehouse with spinning disk it can take days. A sketch of one approach follows this list.
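One way to do the defragmentation, assuming segment shrink is the mechanism in use and that the tables live in the LIMS schema (row movement must be enabled before a shrink; repeat for SAMPLE and TEST):

alter table lims.result enable row movement;
alter table lims.result shrink space cascade;
alter table lims.result disable row movement;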
...
Participants on a new expedition have no need to see either menus or data from previous expeditions in OVERVIEW.
Menus. OVERVIEW menus are stored in LIMS.LIST_ENTRY. View the content where LIST = 'MENU'.
Change old expeditions for new expeditions. Drop menus that are not needed. The ODP and DSDP menus should always be retained.
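A starting point for the menu review (the table and the LIST column are documented above; any other column names would need to be checked):

select * from lims.list_entry where list = 'MENU';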
Data clean. The summary content displayed by OVERVIEW lives in LIMS.X_DBOVERVIEW. The table is indexed by expedition, site, and hole. It is sufficient to clean out specific content with a query like the example below.
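A minimal sketch (the EXPEDITION column is assumed from the stated index; the expedition value is illustrative):

delete from lims.x_dboverview where expedition = '398';
commit;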
Discussion
Database Schemas
...
- Request a starter set of credentials and permissions for OEM from the DBAs. Specify which environment you are in.
- OEM credentials are distinct from RDBMS credentials. You must have DBA level privileges and credentials to the database being managed.
- OEM and RDBMS credentials are distinct from operating system credentials. You must have access to the host oracle account for some operations.
(1) Connect to OEM at https://oemjr.ship.iodp.tamu.edu:7802/em. Log in with your OEM credential.
...
Run this SQL as the DBA to verify the archive logging status. Expect output of either ARCHIVELOG or NOARCHIVELOG.
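A minimal version of the check, assuming the standard data dictionary view suffices:

select log_mode from v$database;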
How to Invoke PL/SQL Scripts
All scripts are run from procedures owned by the TRANSFER account. The typical data cleanup and preening session is shown above under "Run the SQL scripts".
Most scripts take a single string parameter for expedition. The ASMAN scripts take a single string parameter indicating the catalog to be deleted--usually formed as 'EXP' + the expedition number, e.g. 'EXP339'. The DESCINFO and DROPS catalogs are kept per user request.
Run the scripts from SQL*Plus or from within a SQLDeveloper worksheet. SQL*Plus has the advantage of being lightweight, and it provides facilities to spool everything that transpires for post-cleanup review and comparison. SQLDeveloper has the advantage of making it easy to review and verify the scripts being invoked.
The scripts are designed to provide feedback; however, the Oracle mechanisms are not real-time. To turn on useful feedback, apply these commands from the SQL*Plus command line (mileage will vary in SQLDeveloper):
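Likely along these lines (both are standard SQL*Plus settings; serveroutput assumes the scripts emit their feedback via dbms_output):

set serveroutput on
set timing on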
To capture the content of a SQL*Plus session, do something like this:
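For example (the spool file name and expedition are illustrative; direct the file to the expedition support folder noted above):

spool exp339_cleanup.log
call lims_expedition_delete('339');
spool off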
The duration of the session is very dependent on the quantity of data collected. Be patient. It takes time to delete 14 million rows. Times between 9 and 20 hours are normal.
Special behavior for scripts is noted below, e.g. order of execution is important in some cases; some processes are fast, some are slow; some should be double-checked.
Data cleanup is routine, but will have variances. The additional scenarios provided below attempt to clarify the variances we encounter. Apply the scenario or combination of scenarios that best fits current expedition requirements.
You will see a number of procedures in TRANSFER that are not documented here. These processes have been in disuse for so long that additional thought, testing, and documentation should be applied to them when they come up again. Specific questions? Ask around. Read the code.
...
Irrevocably and quickly drop all content in the TRANSFER schema tables with "NEW" in their names. No parameter required.
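To see which tables will be affected before (and after) running it, a standard data dictionary query works:

select table_name from all_tables
 where owner = 'TRANSFER' and table_name like '%NEW%';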
Clean out the ASMAN file catalogs (this will also remove autosaves from Descinfo2)
This script removes the database records only. It is a separate step to clean out the file system for the catalog. You are responsible for the database cleanup. The MCS take care of the file system. Talk with each other.
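The call takes the catalog name, as documented above:

call asman_catalog_delete('EXP339');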
Repeat asman_catalog_delete for as many catalogs as you desire to remove. Confirm removal by re-running the survey script. Commit when done.
|
Clean out LIMS
This is the big one. Monitor archive logs. Start it in a stable environment that you know won't be rebooted for the 6-20 hours it may run. Check on it using SQL to count sample/test/result rows, or watch it via Monitor Sessions in SQLDeveloper.
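The call shape is documented above; the expedition value here is illustrative:

call lims_expedition_delete('339');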
Repeat lims_expedition_delete for as many expeditions as required. The smaller the quantity of data, the faster it will go. See the detailed process above for setting expedition-specific variables.
We recommend the various select count(*) variants as routine content checks before and after the removal step.
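Minimal before/after checks (the three tables are documented; the LIMS owner is assumed; add a where clause for the expedition if you want finer granularity):

select count(*) from lims.sample;
select count(*) from lims.test;
select count(*) from lims.result;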
This procedure is slow. Data we want to keep is intermixed with data to be deleted, so we delete a batch of rows at a time. The database is a 24/7 tool--there's always some activity against it.
...
Preserved for historical reference; these scripts still exist. The preferred syntax is used in complement, bracketing the work-to-be-done in the middle. The alternate, table-level syntax is robust against gradual change over time: it ensures no triggers are missed on the tables we work with most: sample, test, result.
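The table-level form looks like this (standard Oracle syntax; the LIMS owner is assumed; bracket the work between the disable and enable statements):

alter table lims.sample disable all triggers;
alter table lims.test disable all triggers;
alter table lims.result disable all triggers;
-- ... perform the load here ...
alter table lims.sample enable all triggers;
alter table lims.test enable all triggers;
alter table lims.result enable all triggers;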
Disabling triggers is a technique to speed up data loads. It prevents automated renaming and computation of depths for legacy content which may already have this metadata. Remember to re-enable when done. Disabling of triggers should not occur during data collection.
CONTAINER LOAD. See Excel spreadsheet and SQL script method at https://build.iodp.tamu.edu/svn/wapps/Tools/containers.
LIMS_LOAD_LEGACY. Deprecated. Not used.
...
For new participants and staff to experience and practice using integrated shipboard systems, it is helpful to have some data to play with. The same is required for development that continues between expeditions.
Records are de facto accumulated against an expedition/project called TEST 999. After a while it is helpful to clean out tests and results. Be judicious and selective. Ask around: is anyone using this data? For example, sometimes end-user testing or development is in progress that you may not have been aware of. Once you have the go-ahead, it is often sufficient to clean out tests and results. Unless specifically testing sample cataloging, we prefer to avoid the effort of recreating a complete and representative set of samples. These clean-out scripts should only be used in the scenario where samples are being preserved [curatorial tests and results], but science data [non-curatorial tests and results] are being removed.
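The actual scripts live in TRANSFER; this is a heavily simplified sketch of the idea only. The EXPEDITION column on TEST, the TEST_NUMBER join column, and the ANALYSIS_GROUP curatorial filter are all assumptions:

delete from lims.result
 where test_number in (select test_number from lims.test
                        where expedition = '999'
                          and analysis_group <> 'CURATORIAL');
delete from lims.test
 where expedition = '999'
   and analysis_group <> 'CURATORIAL';
commit;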