Expedition 330 Technical Reports



LOUISVILLE SEAMOUNT TRAIL






LO–ALO Handover notes

Bill Mills, Chieh Peng & Steve Prinz

Expedition 329 ended in Auckland, New Zealand, with the first line ashore at 0630 on Dec 13th, 2010 (Wynyard Wharf). Logistic activities included the loading of standard science and operational supplies and the offloading of core, samples, and data collected during Expedition 329. Local PR activities and tours were conducted for visiting dignitaries and VIPs.
On Dec 18th, the last line was away at 0806, starting our transit to our first Louisville Seamount site.
Throughout the expedition, seafloor magnetic data were collected routinely on all transits. There were no seismic surveys or VSP operations.

SPECIAL PROJECTS


LOGISTICS

  • Check-out sheets were printed black and white with various printer settings. The sheets will print in color when using Firefox instead of Internet Explorer, though Rakesh has advised there are bugs when using AMS with Firefox. Grant has been troubleshooting this problem for Internet Explorer. The current solution is to set the SO Color printer as the default printer in Windows 7. On the final print menu, just check the box for "choose paper source by PDF page size"; no other choices for landscape or paper size are needed. It will print in color and on 11x17 paper.
  • Usage history in AMS has been updated, so "physical count" now reflects the actual physical count number. However, random numbers still show up sometimes; the problem has been reported.
  • New issues (bugs?) were encountered as the offgoing shipment was being prepared. I could not add/save a new parcel for 4 of the scientists. Some research revealed that these names & address IDs appeared several times in the address file, from previous cruises or from other people with the same name. This may present a conflict for AMS. I deleted and re-added these names, but this didn't help. The only way I found to trick the program was to change the address ID to a unique or dummy value. This was reported to Rakesh, who said not to create dummy IDs, but as no other solution was presented, this was the only way I found to get AMS to work properly. This issue will be reviewed after the cruise.
  • Unexpectedly high recovery resulted in high usage of curatorial supplies, such as D-tubes, shrink wrap, epoxy, snap-cap vials, and dividers. Additional supplies will need to be sent air freight to Costa Rica.
  • Items ordered for MBIO expeditions were added to AMS as inventory items.
  • Ship store items were also cleaned up: items no longer active were marked obsolete, and items with new sizes were added. We suggest that the ship store operation be handed over to the Publications group. It is more practical for the pub specialist to handle all ship-store-related matters, including maintaining inventory levels.
  • The check-out sheet for FPUB was verified with the pub specialist, and none of the items matches what they have in the office. A query was sent back to Debbie Partain to clarify what they would like to keep in the inventory. We have not heard any reply.
  • Chris Bennight requested and was granted AMS editing privileges in Item Master, including ROP/SL and location. Shelf/aisle/bin locations were added to most of the FCL items.
  • AA and AAA batteries were moved to the LO office, and the inventory was moved to the BLO ship location.
  • The inventory system in the chemistry lab underwent a substantial overhaul. The inventory sheets were reviewed and physical counts updated, duplicate items removed, and locations normalized. In addition, an in-lab location system was created, and this location was added in AMS for all items on the checkout sheet.


AREA-BY-AREA SUMMARY

Bridge Deck

Technical Support Office: Ship's electricians and engineers worked on the HVAC supply for the Technical and ODL offices in an attempt to improve cooling. Some minor improvements were made to the air flow. An inline booster fan has been ordered (similar to the one installed for the Science office); hopefully this will solve the HVAC problems. We spoke with the captain about painting the deck above the offices grey for the upcoming tropical cruises to help with the heat load ...the idea was not enthusiastically embraced.
The office monitor was remounted on a frame that pivots off the wall, so you can now access the rear inputs. HDMI-DVI cables were installed from the monitor to each of the PC stations. This allows you to add the office monitor as a second monitor to your computer and share your content with the office. The tech computer was set up to display VBrick and other internet information (between football games).

Core Deck

Core Labs General
High recovery at the first sites resulted in a 6-day processing backlog. Without sufficient rack space in the lab, un-sampled working halves were boxed and taken to the reefer. During the peak of the backlog, cores were returned to the conference room so that scientists could select their personal samples; afterwards the cores were boxed and returned to the reefer. Then, as time permitted, cores were returned to the lab for actual sampling and wrapping, and then boxed once again. This was extremely inefficient and put a great deal of strain on a technical team working hard to keep up with core processing. An extended logging program and low recovery at our last sites allowed us to process the backlog fully.
Dri-Dek tiles were installed in the core lab around the Physical Properties Lab, Description Area, and Paleo Prep Lab.

Catwalk:
No issues
Core Entry
Core Entry Station: Chris demonstrated a new catwalk application to replace Sample Master for core logging. It was well received by staff. We need management support to proceed with this project.
Laser Engraver: Was assembled and installed at the beginning of the expedition. Chris created a "printer" for Sample Master that would send the engraving information to a LabView application that controlled the actual engraving process. At this time the laser is not available for routine use as we cannot adequately vent the fumes generated. An extraction system has been ordered. Also, the mechanism that turns the cores needs to be re-engineered. The system is scheduled for routine operation after the tie-up period.
New XRF: The unit worked to the manufacturer's specifications; unfortunately, that was not good enough for discriminating lava flows. However, our sedimentologist used it for identifying phosphate and manganese in sediment cores. The unit was set up on the thermcon bench, and the thermcon equipment was moved to the strat correlator's station. The instrument was left onboard for further testing.
Physical Properties Lab
WCMSL: Another expedition has come and gone and still no success in transferring the new application to the slow track (WCMSL).
STMSL: Trevor experimented with a new digital interface with the MS loop with partial success.
NGR: Rebuilt the electronics support framing and installed full side shields and door cover. While not fully shielding the equipment, it should block any minor water splashing from the splitting room or core rack. The data processing code was incorporated into the NGR track software so that the data processing is now automatic at the end of each run. Numerous NGR code issues were fixed and the UI upgraded.
Thermcon: All of the Thermcon's half-space probes failed. This is a continuing problem that needs to be addressed with the supplier.
Splitting Room
The splitting room was heavily used this trip without any major issues. Staff recommend that we rebuild the drain system with clean-outs. We need to add a second parallel saw if the new spinner becomes a standard instrument.
Core Description
DESClogik: Continues to be a source of frustration for staff and the science party and a major resource drain. Staff worked hard to fix small problems, but will it ever end?
SHMSL: No problems to report.
SHIL: No problems to report.
Tables: The auxiliary table was configured for description purposes.
Magnetics Lab
CryoMag: The cryo-compressor was problematic throughout the cruise. In the last week of the expedition we reduced the He pressure, and the unit seems to be working fine. The SRM software was modified to allow simultaneous measurement of the SQUIDs, improving sampling speed from a few seconds to less than 0.5 s per measured position. Scientists strongly recommend that the SRM software be replaced.
D-Tech 2000 Demag: Continuous communication problems throughout the expedition. A replacement controller and software have been ordered.
Thermal Demag: Insulation is failing and heating elements are exposed. The whole unit needs to be replaced.
AGICO Spinner: Modification of the internal indexing sensor fixed an alignment problem with the hex key actuator. Scientists went through a number of sample holders; spares have been ordered along with alignment tools.
Paleo Lab: Saw minimal use; no problems to report.
Close-Up Photography: The thin section imaging system PICAT was well received. No other problems reported.
Microscope Labs: No issues.

Fo'c'sle Deck

Hazmat Response Locker: No problems to report.
Chemistry Lab:
General: Manuals were created, reviewed, and updated for various instruments (Coulometer, ICP, GC-PFT, DIC/DOC, general chemistry SOP). Numerous instrument hosts were upgraded to Win7.
Coulometer: Software was replaced with a new version which should alleviate some of the common complaints and issues with the previous software.
CHNS: No issues other than the lack of silver capsules for analysis. This item is now on AMS.
GC-PFT: Was not used this leg but Chris worked on issues raised on the previous leg.
ICP: Saw heavy use this leg with no serious issues.
Carver Presses: Oil leak was fixed.
XRD Lab: No issues
Thin Section: Concern was expressed regarding the care and upkeep of the equipment without full time tech on the other crew. No other issues to report.
Sample Prep Lab: A step-up transformer was ordered so the furnace can reach the correct temperature. A new technique for crushing microbiology samples was developed using the X-Press, eliminating the need for the flaming box of death. New racks for the furnace will be fabricated back on shore.
Microbiology Van: The van was used throughout the cruise for stable isotopes.
Underway Lab and Fantail: The new magnetometer (maggie) worked without problems. The new Trimble GPS unit was installed in the Technical Support office and was used in conjunction with the fantail GPS to orient the pmag logging tools. It was suggested that we order a second Trimble and retire the Ashtechs. The new Trimbles can be networked on the ship's LAN, eliminating the need to hardwire separate units to WinFrog 1 and 2.

UPPER TWEEN and Hold Decks

Logistics Stores: No problems to report.
Shop: No problems to report.
Pallet Storage and Staging: Plans for reorganizing the space during tie-up were submitted and approved by management and ODL. Issues regarding ODL labor have not been addressed yet.
Core Storage: No problems to report.

LOWER TWEEN DECK

Data Center, User Room and Offices: No problems to report. Floors look like crap.
IT Services
Phones: Throughout the expedition we had numerous issues with the phones. Staff worked with Rignet and Campus to reconfigure the system resolving the problems.
Satellite: Found that the threshold on the satellite system was not configured correctly.
MCS website: Migrated to the wiki.
VOL 1: Increased volume size on servers.
Servers: Enabled SAMBA so that our customers can access our servers without installing additional software.
Developers: Kept very busy, refer to their report.
Gym: Numerous floor cracks have appeared and need to be repaired during the transit or tie-up. The staff wants to reduce the size of the universal gym and order a second treadmill. ODL and our ETs made a heroic effort to keep the current treadmill running.
Lounge and Movie Room: Our video server has died and will be returned to shore for repair.

ALL OTHER AREAS

Elevator: No problems to report.
Analytical Gas: We planned on installing the He manifold but never got around to it! We should probably wait until we finish the reorganization of the pallet storage area.

Core Lab

Johanna Suhonen

Summary

The Louisville Seamount Trail expedition was by no means a high recovery hard rock leg, but two of our sites had an extremely high recovery percentage, which led to long hours spent in the splitting room especially during the second half of the expedition.
The rock saws were used constantly. An additional 8-section core table was set up in the middle of the splitting room to hold sections being sampled, and as a labeling station.
Several fans were utilized in the splitting room in an attempt to dry the split rocks in a reasonable amount of time. For the wet, porous pieces, an air nozzle with a needle valve (for continuous operation) was attached to a clamp stand. This method also worked well on the fragile pieces of basalt that could not be removed from the shrink wrap after being split. The air nozzle was clamped to the stand and pointed down at the rock, with the pressure adjusted according to the fragility of the sample. The nozzle was left blowing until the rock was dry, often in excess of 30 minutes.
During the high recovery sites the core rack space in the lab was not adequate to store all the unsampled working halves, thus the section halves were boxed and taken down to the reefer with the top end of the box left unstapled.
Dri-Dek was laid down to cover the extent of the core description area floor. There should be enough remaining for covering the paleo wet lab floor if necessary.
The laser engraver was installed in the physical properties track area. It is currently under testing and development.
The portable XRF was set up on the counter across from the stairwell door (former thermal conductivity station).

Equipment Performance Summary

The splitting room saw a lot of activity during the expedition. The core splitter worked well.
The splitting table is still experiencing serious drainage issues when the ship is listing to port. The preferred solution would be adding a drain at the port end of the table. The drain can simply run from the portmost corner of the splitting table into the super saw well.
The super saw did a great job, except for very long or skinny pieces of rock, which occasionally came out uneven.
The super saw drain is leaking. The whole drainage system should be re-designed, as under the current configuration it is very difficult to clear out the drain should a blockage occur.

Rock Saws

  • The rock saws were in constant use. The issues that had been brought up on Exp 317 still remain unresolved. The water supply to the saws, the drill press and the wash down hose is inadequate when multiple pieces of equipment are being operated at once. The drainage system makes the saw counter unusable for anything else due to all the run-off from the saws ending up on the counter top. This also leads to unnecessary amounts of time spent washing down the counter. The best solution would be to enclose the saws and to drain them individually.
  • The big rock saw's foot switch broke, and it was replaced with a spare. The original switch was repaired and is now in fine working order. It is located in the unlabeled bottom drawer.
  • The solenoid valve (valve #6) on the small rock saw is leaking. The saw was in constant use, thus it could not be taken down for repairs.
  • It would be beneficial to separate the power supplied to the saws onto different circuits, as the saws are creating a high load when they all are being run simultaneously.
  • A temporary waterproof LED string was installed inside the saw enclosures. It would be great if the saws could have permanent lighting.
  • A hands-free rock rinsing station within comfortable reach of the saws is essential. An ideal set-up would be a foot-pedal-controlled, dial-style hose nozzle facing down over a sink, with a means of adjusting the water pressure (connect the nozzle to a tap with a control on the bottom, so that it can be set to a desired flow).

Drill Presses

  • The drill press was not used on Exp 330.

CURATION

  • The refurbished handheld Brady label printer never worked properly, and it was eventually taken out of service and will be sent back to the manufacturer for repair.
  • The current batch of shrink wrap is greenish and creates noxious fumes when heated. All of the good old clear wrap was used up. Brad is looking into the issue; hopefully we will be able to revert to the less smelly clear shrink wrap in the future. Due to the fumes, an archive-half wrapping station was set up in the paleo lab in front of the fume hood. The working halves were wrapped in the splitting room, where the exhaust fan can be used to get rid of the fumes.

Portable XRF

The portable XRF worked well and proved most useful for identifying minerals in sedimentary intervals and zones of alteration, especially where the material quantity was not adequate for XRD analysis.
Set up the Niton XL3d handheld XRF unit. The science participants and technical staff who would be using or assisting were directed to read and sign off on the Niton safety instructions. These people were also issued radiation badges by the L.O. A bottle of high-purity helium was set up with the instrument's dedicated pressure regulator. -cobine


Core Description

Thomas Gorgas

SUMMARY

This expedition was a natural extension and continuation of an ongoing improvement process with and around DESClogik – an IODP software application designed to digitally describe rock & sediment material and store the corresponding digital data in a global database.
The goals for EXP330 were:

  • Offer the science party DESClogik as a tool that satisfies their needs for detailed core description without a lot of internal resistance against the tool;
  • Overcome the strongly negative sentiments of scientists who had worked with this tool in the past – most prominently expressed during EXP327 with wall-board writings such as "We HATE DESClogik!" – a very stark expression of frustration, and also helplessness, when it comes to using the application.
  • Create & receive overall positive feedback from the science party about using DESClogik, with an overall rating of "good" (or even "better than good").
  • Monitor the "feedback" from scientists along the way, and effectively address currently existing issues, and fix them with the help of our developers on board (the science party's feedback from "WEEK-4" is collectively summarized in a separate document; cruise evaluations are pending).
  • Improve the application by advancing from "workarounds" to "true solutions", implemented in the form of updates & upgrades throughout the voyage. In that respect, the goal was also to "unlearn" certain ways of "doing workarounds" and truly "fix limitations".
  • By using DESClogik, further add petrologic & sediment description data to a global database, which other scientists can retrieve in the near future via the Internet.

The above goals were set high & challenging because both the DESClogik Tech and the Yeo-Person (the latter working with the DESClogik data indirectly through LIMS2EXCEL and STRATER) were new in their respective positions; certain skills and "unknowns" of this portion of the operation needed to be learned in the first place, and hopefully improved upon.
Despite this "Mission Impossible"-like challenge, the task was also considered as a great learning & teaching opportunity for both tech's and the science party, approaching our daily tasks with fresh eyes. Working with top-experts in their respective fields as a novel "DESClogik" tech/user, helped us to detect and discover limitations of the system that have been "worked with" in the past by creating "workarounds" – instead of developing "true solutions".
Whether we mastered the aforementioned challenges and achieved our goals (or not), must be seen and analyzed by reading the "Expedition Evaluations" from the science party at the very end of EXP330.

WORKFLOW

The general working steps for core description during EXP330 were as follows:

  • Imaging the core sections with the SHIL system – an advanced core image logger. This process allows the scientists to print the imaged core sections and annotate written notes on the hard copies – a very valuable addition to the work flow prior to entering those data into the digital database via DESClogik. The images plus annotations were digitized with the CANON Copy & Scan machine proximal to the conference room. Scanned images were then logged and stored locally on the shipboard server (see also the new "QUICKGUIDE for scanning images"). The key point of this working step is that the scientists always remain disciplined enough to enter all their annotated data into the LIMS database via DESClogik. We are certain that it is nearly impossible to accomplish this task to a full 100% – simply because the notes are often more complex than the entry options within DESClogik can (yet) capture. However, the VCDs, as a visual representation of the core description, were accepted as "good".
  • Scanning the core sections with the SHMSL allowed us to obtain light reflectivity & intensity over a wavelength range of approximately 300 to 950 nm. In addition to color/light reflectivity, point-source magnetic susceptibility was also measured as part of the PHYSICAL PROPERTY data.
  • Core description using DESClogik: The working steps for sediment & rock descriptions were split into individual working groups comprising "Sedimentology", "Igneous Petrology", "Structural Geology", "Alteration" and "Paleontology". DESClogik was the binding element allowing these working groups to collaborate effectively on describing the core sections in a digital format.
  • Overall, this process of describing cores using DESClogik went relatively smoothly, while helping to further improve this critical step of creating a global database. Constructive feedback from the involved scientists was obtained throughout the voyage, and IODP developers effectively targeted the main limitations of the system ASAP (special thanks in that respect go to the developers on board EXP330 – namely Chris Bennight, David Fackler and James Ziao – who always & immediately responded to the needs and new requests from the DESClogik Tech, Yeo-Person and scientists to improve the digital work flow). A diligent process of "tracking requests" was started from "DAY-1" of the expedition by collecting all scientists' complaints and requests on a "Logging Sheet". This simple "tool" was the basis for implementing the most pressing fixes ASAP.


IMPROVEMENTS

The main improvements of DESClogik (from Version 3.22.9.4 to 3.22.9.8) and for the overall work process during EXP330 can be summarized as follows:

  • Enhancing the speed of the UPLOAD & DOWNLOAD process significantly (e.g., for a large data set such as U1374A, moving from version 9.4 to 9.8 reduced the download time from approximately 4 min to 35 s!).
  • Implementing an automatic "CLEAR SCREEN" upon "DOWNLOAD" of a data set.
  • Implementing an individual "CANCEL CELL" function (via RMC when clicking on an individual cell). This was an important improvement because of an irregular (and often irritating) "Data Retrieval" process prior to this upgrade: data from previous uploads kept re-appearing and/or prevented the newly updated data from properly replacing the old data. This new function made it much easier to ensure that old entries are deleted and replaced with new ones.
  • Improving the content of individual templates for the specific needs of EXP330, and thereby also for future rock & sediment expeditions. Especially during the first two weeks we fine-tuned the "Tabs" & "Columns" used for describing core sections. The main development during EXP330 in that respect was creating a new template, "VESICLES_VEINS", which allowed three working groups ("Igneous Petrology", "Structure", and "Alteration") to fill in their respective information in one template (rather than three separate ones). Eventually we also enabled the scientists to describe "Voids" (plus the infilling minerals of those voids) within the same template.
  • Improving the THIN SECTION REPORT generating tool, which allowed faster and more precise data retrieval; this process now also includes the option to draw lines and add color-shaded areas on the report (using an XLS macro) – a BIG improvement over the previous version of the report-generating tool. Columns for "VESICLES INFILLING MINERAL" & other information, such as the total % for "GROUNDMASS", "PHENOCRYSTS", and "VESICLES", now satisfactorily complement the report.
  • Ensuring that value lists remain properly selected: the sub-value lists were not accessible to the user at all times after being "globalized". This was tiresome & tedious (= frustrating) because a user had to select a preferred sub-value list every single time he/she logged out & back in – instead of keeping the preferred list active for a particular working period, they had to manually select it over & over again. The "workaround" to ensure the proper "Value List" selection was that the DESClogik Tech saved the preferred sub-value list for the respective scientist, which then – upon restarting the application – was readily usable. Ideally, this scenario is replaced by allowing the scientist to select the preferred value list so that it remains selected thereafter; this probably requires a change in the user's "administrative" & "permission" powers to execute that working step on their own.
  • Creating a "Suggestion Tracking Sheet" and a large clip-board for attaching core section image print-out for hand-written annotations. As simple as these tools are, they turned out to be very valuable to keep all scientists on track with their respective "wishes", to improve and implement changes, new features, value lists, etc. It was a bit of an "old fashion" approach, but it worked quite well. The large "Clip Board" with Velcro strips for attaching rulers and pencils worked also very well !
  • Last but not least, a THIN SECTION LIST in XLS format was created to keep track of all the thin sections that have been described and completed – a very good tool to use on future expeditions as well! This tracking sheet should also include a "link" within the TS Report area to see a photograph of the described area, clast, etc.
  • "000" is ALWAYS reported as such in corresponding columns ('dip'; 'azimuth', etc.) - some vital information for structural geologists, which had not been reported properly before (instead of "000", values were reported as "0", and so forth).


OBSERVED & EXISTING DESClogik LIMITATIONS

These are the main limitations that currently still require attention. A more detailed summary of the scientists' feedback is provided in a separate document ("EXP330 Scientists Feedback"), giving further insight into the perception of working with DESClogik from the users'/scientists' point of view:

  • High definition of certain columns in certain templates is currently still limited: for example, "Porphyritic Texture" & "Magmatic Texture" (in the THIN SECTION template) both comprise the same entry values (in this case, "texture_name"). This also pertains to the use of "s-group" vs. "structure_group" (in "Summary Description"). Whenever that situation occurs (i.e., a lack of "high definition"), it may create confusion within the database when retrieving data and reporting them back to the user – in this case for the THIN SECTION REPORT. The question is: how can we create uniquely defined data cells within DESClogik via properly defined "value entries" when not enough "value entries" and "qualifiers" exist? This seems trivial, but it turns out not to be. The rocks observed & studied during this expedition are highly complex breccias, and the "Columns" are not sufficiently defined to truly describe them accurately in accordance with their complexity – at least this is the current perception from the scientists' point of view. The scientists actually recommended creating newly improved "Sediment", "Sediment Rock" and "Thin Section" templates for those particular "study cases" – a task which will require a lot of future development work to accomplish.
  • Improving the THIN SECTION Report: in the context of the above (templates which feed the THIN SECTION REPORT), this part of the work flow definitely requires the greatest attention for additional improvements. The data correction & revision process remains tedious & time consuming (and so is the post-processing for the THIN SECTION report). Despite numerous improvements throughout EXP330, there is still a sense of uncertainty about data accuracy when downloading DL data from LIMS into the TS Reporting Tool. As a long-term solution, we suggest creating an entirely new "THIN SECTION Data Entry Interface" which resembles the actual data-entry process. Problems occurred in particular when we had more than one observation on a single slide for different observed CLASTS (e.g., U1372A-5R-1-W-31_34-BILLET7-SLIDE7). We partly solved this problem by revising the definition of some of the thin section columns, but it is still highly problematic to acquire & deal with this type of data.
  • "UPLOADING" is still sometimes returned with an ERROR message – which in the end, upon retrieving data, turns out NOT to be an error (thus, it is almost double-confusing…) This occurred, for example, when attempting to upload a few new entries as part of a large data set (e.g., for U1374A); whenever facing this situation, we simply closed the application, turned off the PC for a minute & restarted the computer. In most cases, the error was not observed thereafter – and yet, this seems to rather awkward and unsatisfactory as a long-term solution.


WHAT HAPPENS NEXT? – SOME SUGGESTIONS

Concretely, it is recommended at this juncture to change & modify the following:

  • Make it possible to retain the height of the column header upon opening a template (it is annoying to resize the header height every time a chosen template is re-opened).
  • Remove the "ALL TABS" and "TEMPLATE" and "USER" buttons entirely (unless they are dedicated as a "function" – see below: ALL TABS); instead, move the "CLEAR SCREEN" button closer to the center of the screen, proximal to the "UP/DOWNLOAD" buttons, or also take this button off the screen if not used anymore in the 3.22.9.8 version. However, when downloading data into a tab of a template, allow to load ALL Data into ALL tabs automatically (or keep the original ALL TAB button active)
  • Allow UPDATE using the SELECT SAMPLE button without having to close that window repeatedly when selecting samples from a new hole (say, switching sample selection from U1374A to U1372A). Right now the previous selection of a SITE/HOLE/etc. is retained from one session to the next, but the UPDATE function to switch from one HOLE to another still does not work. A minor issue, yet annoying.
  • Find a better way to deal with "data gaps" & "overlaps", which has become a problem in particular for the high-recovery "breccia"-type of rocks encountered during EXP330.
  • Implement a "BULK CANCEL ROW" function so that the user/tech does not have to go through every row individually and "delete" (i.e., "cancel") corresponding data values.
  • Implement a "Bulk Value List Change" function – especially handy when you deal with dozens/hundreds of "species" columns as the paleontologists do.
  • Dealing with the CORE SUMMARY DESCRIPTION column (as part of the THIN SECTION REPORT template) is too tedious. Right now the WORD.DOC upload function does not make it easy for the user to keep track of which version of the document has been most recently uploaded. It would be useful to have a "feedback message" of some sort in place to confirm that the document has indeed been uploaded.
  • Allow "multi-mineral" voids & vesicles, etc., to be entered & acknowledged in one row, without requiring a multi-row entry for the same depth interval –which is especially critical when dealing with "Clasts" and other complex rocks.
  • Continue to develop the "Digital VCD". It is a helpful tool (similar to LIMSPEAK) in establishing a continuous & quick overview of the growing VCD (all the powers to Chris Bennight to keep going with this development!)
  • Improve on the depiction of a certain depth-interval and thus "range" of typical Paleontology data (interval of first & last occurrence) using DESClogik.
  • Improve DESClogik entry capabilities for complex rock conglomerates ("breccias") – i.e., create new templates, perhaps based on the currently existing "Sediment Description" template. EXP330 provided an excellent playing field for recognizing the needs of this type of rock description. One of the scientists' comments was: "It is a real problem that we cannot describe these kinds of complex rocks properly because there is always a column missing to fill in the corresponding information. We annotate a lot of information on the image print-outs, which is ultimately lost because we do not have the proper ways to enter it into DL." (pers. comment, Rebecca Williams, U.K.) A "form-based data entry" (as should also be implemented for the SUMMARY DESCRIPTION WORD.doc upload – see above) is the goal for the next step of this type of development (as it should be also for generating the THIN SECTION REPORTS).
  • Further enhance the "GRAPHICAL TOOL BOX" within DESClogik to fully replace the current working step of printing out core section images & writing annotations on corresponding hard copy print-out's. Ideally, this improvement allows the describer to digitally select a certain area on the graphical core depiction, and writes the text description into a metafile pop-up window. This information then is transferred to a designated "Hot Spot" in LIMS. This type of "visual tool" would combine the describer's acquired skills with the powerful capabilities of a fully digital working tool.
  • Continue with "Screenshot" videos (using CAMTASIA) of typical DESCLogik working steps ("Bulk Entry", "QC-ing data", etc.). A series of four brief introductory videos exists already, and one additional video has been recorded and produced throughout EXP330 (thanks to our Image Specialist on EXP330, "Photo Ninja", Mr. Bill Crawford). This effort is considered to be vital to further advance & accelerate the process of developing DESClogik from a "Prototype BETA" to a fully useable digital core description tool.


SUMMARY

No doubt, limitations in the DESClogik application still exist to date. However, many of the aforementioned "issues" were addressed & resolved throughout EXP330 by upgrading DESClogik from version 3.22.9.4 to 3.22.9.8.
Ultimately, the willingness & kind patience of all scientists involved in the challenging task of describing rock material using DESClogik, combined with an incredible effort by all developers on board of EXP330, allowed us to accomplish most of our set goals.
Further improvements are expected to be implemented with future IODP expeditions, and also in the context of the upcoming DESC workshop in College Station, which is part of preparing for EXP335 ("Superfast" – also a rock expedition).
Ultimately, we hope to have contributed some positive aspects to the development of DESClogik by working together as a team throughout EXP330.

ACKNOWLEDGEMENT

Without the additional support of all fellow techs and the LO/ALO team of EXP330, our progress with DESClogik during this voyage would hardly have been achievable (DESClogik in itself is a full-time job). THANK YOU!

Addendum

Suggestions

  • Watch the 4 short training videos (either BEFORE or during transit to the first site); they are located on one of the DESClogik stations under "DESClogik VIDEOS".
  • EXPORT your results to EXCEL regularly via DESClogik (once/hour minimum, or with every section being completed).
  • TIME-STAMP/Date the name of that back-up EXCEL file for tracking purposes
  • Copy that file into the USERVOL/LABGROUPS/Your-Respective-Lab-Group folder (so we all have access to it as well)
  • Immediately DOWNLOAD the data thereafter for quality-control (QC) purposes – are any gaps observed in the data sets? (An illustrative gap-check sketch is included at the end of this addendum.)
  • Report any data gaps or odd data in your DESClogik spreadsheet. The ultimate "Pro" user highlights anything "unusual" with "red" in the EXCEL spreadsheet so we can easily spot it when opening that file at a later time. This is actually really important!
  • For columns that should be alphabetically ordered (e.g., "FORAMS", etc.), please order them inside the original EXCEL spreadsheet before we apply the "Bulk Entry" procedure to turn the EXCEL spreadsheet into a DESClogik template.
  • Issues with the UPLOADING/DOWNLOADING process may be resolved as follows:
  • Download data onto screen and compare with EXCEL back-up spreadsheet; then apply CLEAR-SHEET and enter new data; then upload & subsequently download data again for QC purposes.
  • IF you enter SAMPLE 1 & SAMPLE 2, make sure they are not from the same SAMPLE INTERVAL (e.g. CORE1-1A & CORE1-1A) – i.e., simply leave that field blank – also make sure that the interval depths are properly defined so that one does not end up with "deeper" above "shallower" depth intervals.
  • VALUE LISTS: Click on the FULL LIST first, and then select the one you prefer to work with (e.g., "330_pf_preservation", etc.). Let the DL Tech know which list you prefer and have him/her select it for you permanently. Check once in a while whether that list is still active, and inform the Tech if not!
  • Do not enter "symbols" such as "<", ">" into certain columns! Pay attention to "SPELLING ERRORS" (as the value words are then not recognized anymore by the value lists and/or STRATER). That includes also paying attention to correctly enter data and write readable notes in the THINSECTION report templates/logging sheets.
  • In general: abstain from HIDING COLUMNS using the ICON on the DESClogik interface (instead, use a RIGHT-MOUSE click on the column).
  • Better yet: abstain from making any changes to the template yourself. Instead, work with your DESClogik specialist on improving your respective TEMPLATES & VALUE LISTS and on adding/hiding/editing columns, etc.
  • For THINSECTIONS: Fill in all cells diligently, EXPORT-2-EXCEL (for Back-Up purposes), upload, clear the sheet & download for QC-purposes. THIN SECTIONS remains THE biggest problem child in this entire process…
  • Double-check BIN & CORE SECTION lengths: Do they make sense? Use the BIN SIZE length as the length & size of the ROCK you describe (puzzled? Me, too…) Perhaps check with the YEO-Person and also Curator for data accuracy.
  • Work as a team! Communicate often & well among team members and groups. However, should you hit a wall (e.g., of frustration, etc.) - choose to go outside or to the gym (instead of going after someone else…), take a good breath of sea air, and enjoy the vista – then come back & continue happily ever after…
  • Make sure that also "ALTERATION" and "VEINS" tabs are filled with information inside the THIN SECTION template (required for TS Report)
  • Turn OFF the computer once a day or so. DEPTH errors may occur due to changes in the curated core section lengths; a shut-down of the application (and perhaps sometimes also of the PC) is required to pick up newly logged "curated depths".
  • Always mark on the IMAGE PRINT OUT's where THIN SECTIONS are taken (helps to put them into the visual context of the entire core section)
  • Delete each cell (if warranted) individually with an RMC (a window pops up with a "Cancel Cell" option); then re-fill the cell content with what you want, then upload, then download for QC purposes.
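
As an illustration of the gap check suggested above, the short Python/pandas sketch below compares a back-up EXCEL export against a freshly downloaded set and flags rows that went missing. It is only a sketch: the file names and the identifying columns ("Core", "Section", "Top offset (cm)") are assumptions, not DESClogik's actual export layout.

    # Illustrative QC helper for the download-and-compare step above; file and
    # column names are hypothetical, not DESClogik's actual export format.
    import pandas as pd

    backup = pd.read_excel("U1374A_sed_backup_2011-01-15.xlsx")
    downloaded = pd.read_excel("U1374A_sed_downloaded.xlsx")

    key = ["Core", "Section", "Top offset (cm)"]   # assumed identifying columns
    missing = (backup.merge(downloaded[key].drop_duplicates(), on=key,
                            how="left", indicator=True)
                     .query("_merge == 'left_only'"))

    if missing.empty:
        print("No gaps: every back-up row is present in the downloaded data.")
    else:
        print(missing[key])   # rows to re-upload or report to the DL Tech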


Physical Properties

Patrick Riley

Summary

Expedition 330 had relatively high recovery for a hard rock cruise and very little soft sediment was encountered. All cores recovered were drilled using the RCB method.

Equipment Performance Summary

WRMSL

  • Sections were measured at 2-cm spatial resolution. The PWL was turned on only for the soft sediment cores that were recovered. Although these were taken with the rotary core barrel, good P-wave results were obtained as the liner was completely full. GRAPE calibrations were performed as necessary based on monitoring the drift of the water standard. On a number of occasions, section data were saved with the file name of a previous section; from reading past technical reports this appears to be an ongoing problem.


  • PWL actuator clamping & not releasing: what was reported as a hardware issue was actually a configuration change that went wrong – the "Position Error" had been enabled. The PWL actuator operates by moving to a position that it will never reach, going into current limit and clamping the whole round for good transducer contact. Enabling position errors caused an error to be sent when the actuator went into current limit, hanging the program. The configuration file was changed and the new configuration downloaded, disabling position errors, and the problem was solved. –cobine


  • Moved the Galil amp to a position on the back wall away from the monitor arm. Discussed at length with Etienne Claassen moving the WRMSL Galil amp to the back wall and possibly, but not necessarily, moving the NGR Galil amp there as well. –cobine


  • WRMSL code development: installed the National Instruments PCIe-6251 counter/DIO card in the STMSL host and completed the modification of the MSL version 2 software for differential logic switching. Encountered a problem with a race condition which had not been seen in any previous testing. The first fix was to redefine the error as a dropped data point and continue. The real fix was to institute more rigorous handshaking of the communication between the control component of the code and the sensor driver components (a sketch of this handshaking idea follows this list). Testing shows that this prevents the error or dropped data from happening.


  • The new code base has not been migrated to the Whole Round MSL track –cobine


  • Briefly tested with the version 2 code. All sensors (Bartington MS, GRA, PWL) run concurrently on one computer. This was a brief test only, not a formal testing session. As this track does not have the updated Galil driver library (for some reason), switching of the Galil driver versions is necessary for experimentation. This added to the reluctance to do much testing on this system (discretion being the better part of valor). –cobine
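
The handshaking fix mentioned in the WRMSL code development item above can be pictured with the minimal sketch below. It is written in Python purely for illustration (the shipboard MSL version 2 code is LabVIEW), and all names and queues here are hypothetical: the point is simply that the control loop triggers every sensor driver explicitly and waits for an explicit acknowledgement from each one before stepping the track, so no driver can be caught mid-measurement.

    # Illustrative Python sketch of control/driver handshaking; not the real code.
    import queue
    import threading

    def sensor_driver(name, trigger_q, result_q):
        """Driver loop: measure only when triggered, then acknowledge with data."""
        while True:
            position = trigger_q.get()          # block until the controller asks
            if position is None:                # shutdown sentinel
                break
            value = 0.0                         # placeholder for real acquisition
            result_q.put((name, position, value))   # explicit acknowledgement

    def run_positions(positions, sensor_names):
        result_q = queue.Queue()
        triggers = {}
        for name in sensor_names:
            q = queue.Queue()
            triggers[name] = q
            threading.Thread(target=sensor_driver, args=(name, q, result_q),
                             daemon=True).start()

        for pos in positions:
            for q in triggers.values():
                q.put(pos)                      # trigger every sensor for this step
            # Wait for an acknowledgement from every driver before stepping the
            # track again; this handshake is what removes the race condition.
            acks = [result_q.get(timeout=30) for _ in sensor_names]
            # ...store acks for this position...

        for q in triggers.values():
            q.put(None)                         # shut the drivers down

    run_positions(range(0, 150, 2), ["GRA", "MS", "PWL"])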

STMSL

cobine

  • Running on version 2 code. The source code is compiled into an executable program.
  • Running LabVIEW 2009 SP1 on MS Windows XP. There is still an issue with the ORTEC driver for the DigiBase under the MS Windows 7 OS. The updated driver doesn't work, but this time it doesn't crash the operating system when installed.
  • Included on this system is the IODP MagSus 1 magnetic susceptibility sensor prototype, which has been showing itself to be a reliable replacement for the Bartington MS2 meter box. It utilizes the Bartington MS2C loop. It has not been tested with other Bartington MS2 sensor loops.
    • Major advantage of the IODP MagSus 1 sensor is there is no data rollover at 9999 on the display.
    • Other advantages: it is a calibrated sensor (it requires calibration), so there are no corrections for coil factors, and the units of measure are whatever the calibration standards use.
    • The sample time for this sensor is variable from 0.1 to 10 seconds, depending on the degree of instrument noise that can be tolerated. The recommended time is 1 second for reasonable noise performance. For calibration the time is set to 10 seconds to ensure that the calibration is as low-noise as possible.
  • Digital I/O is via National Instruments MIO card and differential (Hi-Low, Low-Hi) logic levels for the ultimate in noise rejection.
  • Bugs: very randomly (once or twice in 300 measurements) the sensor does not measure (timeout). The problem is more observable when the measuring interval is small (0.5 cm), but this may simply be because the higher density of measurements shows the problem more readily, not because of the increased work load. Originally this produced an error which stopped the system from continuing. The error was 'redefined' to become a dropped data point: a null value (NaN, not a number) is output and the measuring process continues uninterrupted.

Fix, attempt one: the first attempt at fixing this was to send, with verification, the "Measure" command up to 3 times before timing out and creating the null data point. This seems to have reduced the incidence but not eradicated it. The cause is unknown. It had not been observed in the MSL prototype code developed on X327, nor earlier in this expedition during initial testing. Testing with an increased instrument load is required to see whether its occurrence increases with the number of instruments. (A sketch of this retry logic follows.)

Another bug has to do with Sequence Modifiers over-riding the standard flag that is read off the label when the standard is to be included within the measuring sequence rather than at the end of it. Normally the standard flag signals the end of a measuring sequence. This requires some simple logic to resolve, for all conditions. As this is a non-standard operating condition it may or may not have been fixed yet.

For touch-screen operation some dialog boxes need to be recoded to include large, finger-sized buttons (64 x 105 pixels instead of the default 23 x 47 pixels). As express dialog boxes were used as the foundation during initial development, button sizes cannot be readily changed, so recoding is required.
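
For reference, the retry-with-verification behavior described above amounts to the small sketch below. It is Python and purely illustrative; the function names are hypothetical stand-ins for the real LabVIEW driver calls.

    # Illustrative sketch of "retry the Measure command, then drop the point".
    import math

    MAX_ATTEMPTS = 3

    def measure_with_retry(send_command, read_reply, timeout_s=2.0):
        """Send "Measure" up to MAX_ATTEMPTS times; on repeated timeout, return
        NaN so the run continues instead of stopping on an error."""
        for attempt in range(MAX_ATTEMPTS):
            send_command("Measure")
            reply = read_reply(timeout_s)    # assumed to return None on timeout
            if reply is not None:
                return float(reply)
        return math.nan                      # dropped data point; run continues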

NGR

  • Calibrations were performed prior to arriving at the first site and at the end of Site U1374, which was approximately halfway through the expedition. Background measurements were recorded for 20,000 s prior to recovering core at the first site, and at any location more than 150 nmi from the previous one. Data acquisition times varied greatly due to the high recovery and backlog of cores in the lab. As time permitted, sections were run for 1.5 hours, but as the backlog of core increased the count times were shortened to 0.5-1.0 hr, depending on the composition of the rocks.
  • The electronics rack was enclosed with a protective hinged cover using an alloy frame work and Perspex.


  • The NGR Master Data Reduction application (NGR Tester) has been incorporated into the NGR program, so sample data are selected automatically and compiled with the background and calibration files at the end of each run. Use the "Run the NGR v330" desktop icon to launch the current application. The "Configuration Settings" display has been modified and looks very different from the previous version (see screen captures); this is where one enters the acquisition time (System Setup tab) and selects the data paths for compiling the sample, background and calibration files (Folders and Files tab). Another change to the program is that the "Run the Experiment" button is grayed out until a sample length is entered, ensuring that all measurements will have a depth in section assigned to them. When launching the NGR program it takes an extremely long time to open; exercise patience. Also, the first data set compiled after opening the program takes some time, but subsequent runs perform the task much more quickly. Both of these issues are supposed to be corrected soon.
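
As a rough illustration of the kind of compilation the program now performs automatically, the sketch below subtracts a live-time-normalized background spectrum from a sample spectrum before any calibration is applied. This is an assumption-laden Python outline, not the actual NGR LabVIEW reduction code; the arrays, live times, and the downstream calibration step are stand-ins.

    # Rough illustration only: per-channel, live-time-normalized background
    # subtraction; the real reduction lives inside the NGR application.
    import numpy as np

    def net_spectrum(sample_counts, sample_live_time,
                     background_counts, background_live_time):
        """Return net counts per second per channel for one measured position."""
        sample_cps = np.asarray(sample_counts, float) / sample_live_time
        background_cps = np.asarray(background_counts, float) / background_live_time
        return sample_cps - background_cps

    def total_cps(net):
        # total counts/s over the spectrum, before any detector calibration
        return float(np.sum(net))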

The drawer containing the calibration standards should be identified with a "Radioactive Material" label.




Thermal Conductivity

  • Both half-space probes failed during the expedition. Water was infiltrating the epoxy that is supposed to keep the wires waterproof. After the probes failed to return values on samples, they were oven dried and a series of test measurements was performed on the Macor standard. The probes performed well on the bench, but after a number of measurements while immersed in water both probes again failed to return a thermal conductivity value for the standard. The test results were saved on the thermcon computer under C:\Data\Thermcon\Exp 330 half-space probes. This is a continuing problem that seems to have no apparent resolution.
  • The thermal conductivity station was moved back to the stratigraphic correlator location to make room for the hand-held XRF.


Stratigraphic Correlator

  • The Mac used for stratigraphic correlation was removed and stored with the MCS. Presumably it will be set up on the next expedition that requires it.



MAD

  • Moisture and density measurements were performed on cubes that were shared with the PMAG group. The cubes were saturated and placed under vacuum (-45 kPa) for 24 hrs prior to measuring the wet mass.
  • Other than being extremely tedious to use, there were no problems with the balance, pycnometer, or gantry software.
  • The pycnometer He cylinder was changed once during the expedition. Calibration of the pycnometer was performed at the beginning of the expedition and when the cylinder was changed. QAQC measurements were performed during each run.
  • Pycnometer cell 6 was returned to the lab from shore but not installed. It is being stored under the balance station as a spare.
  • The PP lab received a clamshell vacuum chamber for connecting to the vacuum pump. Both the pump and chamber are stored under the balance station. Also, two more vacuum chambers are in the Upper Tween Stores. At times the lab needed to employ 3 vacuum chambers to accommodate all of the samples.
  • At the pycnometer station a regulator was installed on the airline and set to 20 psi, which is sufficient for cleaning the pycnometer cells.
  • After being turned off and back on during one of the daily meetings, the temperature of the Lauda circulating bath was fluctuating dramatically. Inspection revealed lots of dust and lint on the cooling fins of the heat exchanger. Once cleaned, the temperature stabilized. A protective cover was siliconed in place over the on/off switch to keep it from being turned off during cross over meetings.
  • Additional vials for soft sediment moisture and density measurements are stored under the sink in the Phys Props area.


Discrete P-wave

  • Measurements were performed on saturated cubes just after the wet mass for MAD was recorded. Calibrations and QAQC measurements were performed on a routine basis.
  • The half-core bayonets were not used, but were calibrated and tested at the beginning of the expedition.
  • New copper and aluminum calibration standards were sent to the ship. They have been entered into LIMS, have QAQC labels printed, and are stored in the drawer with the acrylic calibration pieces


Gantry: Shear Strength

  • The AVS system was not used.



Gantry: Velocity

  • Progress was made on writing a user-friendly version of the Velocity Gantry software. Alas, it will not be sufficiently advanced for preliminary demonstration and comment at the STP meeting on board. –cobine


P-mag Laboratory

Trevor Cobine

Overview

This expedition was a hard rock expedition where p-mag was a major component of the objectives. Unfortunately the new thermal demagnetizer and the new controller and software for the D-Tech 2000 did not arrive as anticipated. There were some issues with the D-Tech 2000. The new AGICO JR6a was used extensively and has performed well.

  • All instruments were used for the analyses performed. The SRM was used exclusively in continuous sample mode, and the new AGICO spinner was used exclusively for the discrete samples.
  • Two pieces of equipment gave continuing problems: the cryo-compressor with thermal overloads and the D-Tech with serial communication problems. All other equipment worked as expected.
  • The SRM programs were updated, with concurrent reading of the 3 SQUIDs being implemented.


Issue Summary

  • Cryo Compressor: The thermal overloads experienced on the previous expedition continued. ET Jurie Kotze replaced the main power switch, which includes the thermal breaker, but the overloads continued. It was noted that the pressure in the cryo-compressor was higher than recommended. This was bled down to the minimum working pressure and the overheating has abated.
  • D-Tech 2000: The D-Tech had problems with the serial communications between the PC and the controller. This has been diagnosed as a generational problem. A new D-Tech controller and software are on order, awaiting delivery.
  • The AGICO JR6a sample spinner magnetometer gave problems with the automatic cradle alignment mechanism. After some fine tuning and mechanical adjustments it worked fine again without problems.
  • SRM: Rock fragments fell behind the Home switch, jamming it in the on position and preventing the tray from moving out of the low-field region. The fragments were cleaned out and all worked OK.


Instrument Status

SRM

  • Issues with the cryo-compressor overheating, awaiting the delivery of a new cryo-compressor. There is no spare cryo-compressor on board.
  • The boat stayed in the low-field region and could not be coaxed out. The problem was that small pieces of rock had lodged behind the HOME switch actuator plate and held the switch in the operated position. The pieces of rock were removed, the area was cleaned of other debris build-up, and the SRM returned to service. The source of the rock pieces was the sanding of the sharp edges of cubes to fit the AGICO sample holders; pieces would crumble off, collect on the bench, and then accidentally fall in behind the switch.
  • SRM communication with the SQUIDs was improved, reducing acquisition time from a few seconds to less than 0.5 s. The communication protocols were changed to read a specific number of bytes instead of waiting on a 500 ms timeout, the serial communication VIs were set to re-entrant, and the code logic was altered to communicate with all 3 SQUIDs simultaneously (a schematic sketch of this concurrent read appears at the end of this SRM section). – mills
  • The SRM software still causes frustration with the scientists, the problems being the usual complaints of clunky interfaces, idiosyncratic operation, and bugs – particularly errors causing the program to hang with no indication of what's happening.


  • We are running a source code version of the Continuous Sample program that includes the ability to vary the measuring speed. As this is a hard rock expedition this is important for preventing flux jumps on highly magnetized samples. I know of no reason why this could not be an executable program, other than it has been a 'work in progress' this expedition.


  • There is an executable version of the Continuous Sample program that doesn't have the speed control and a compiled version of the Discrete program.


  • Discrete program is not being used for two reasons:
    • Jeff Gee was not happy with the results it was giving.
    • Continuous sections & high recovery kept the continuous program busy. The AGICO JR6a has been used extensively (in fact exclusively) for discrete samples. My observation is, "if the tools are there and they are of good quality, they will be used - even if they were meant to be a back-up for something else." Since the KLY-2 and the Molspins were replaced, both replacements (KLY-4S & JR6a) have had considerably increased use over their predecessors.
  • Bugs
    • Randomly, the sample boat will be moved to the low-field region and sit there, while the instrument thinks the boat is moving and measures and demags as usual. This occurs on the NRM or first pass; I don't believe it occurs after the measuring sequence has commenced. Restart the measurement sequence and it runs properly, with no other resetting or restarting needed.
    • Clunkiness of the user interface has been an issue. There is quite a bit of dissatisfaction with the code, the interfaces and their idiosyncrasies. Badly labeled buttons, or default labels that have never been changed, cause some issues. The behavior of some modules is quirky, and users take time to work out how to accommodate this.
    • The code reaching dead ends and hanging still occurs under error conditions, and will continue to do so given the programming style unique to this code base.
    • Error handling is not well implemented (although there is error logging) and is likely to cause a hang-up as it hits a dead end in the execution.
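
The concurrent SQUID read noted in the first group of bullets above can be pictured with the schematic below. It is Python rather than the actual re-entrant LabVIEW VIs, and the port names, query string, and reply length are assumptions; the point is that the three serial exchanges overlap and each read returns after a fixed byte count rather than a fixed 500 ms timeout.

    # Schematic only: query all three SQUID axes at once and collect replies,
    # rather than polling them one after another on a fixed timeout.
    from concurrent.futures import ThreadPoolExecutor
    import serial  # pyserial

    REPLY_BYTES = 16   # assumed fixed-length reply; read exactly this many bytes

    def read_axis(port_name):
        with serial.Serial(port_name, 9600, timeout=1.0) as port:
            port.write(b"D?\r")            # hypothetical "send data" query
            return port.read(REPLY_BYTES)  # returns once the bytes arrive

    def read_all_axes(ports=("COM3", "COM4", "COM5")):
        # One thread per axis: the three serial exchanges overlap, so a
        # measured position costs roughly one exchange instead of three.
        with ThreadPoolExecutor(max_workers=3) as pool:
            return list(pool.map(read_axis, ports))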


Agico JR6a

  • After initial problems, it is now working well.
  • The brass actuator arm was jamming on the extension move and then giving an error message. The super-fine thread was cleaned and lubricated with DryLube (MoS2 in a volatile distillate) applied with an eye dropper. Lubrication and cleaning seem to have fixed the problem. There was a lot of rock dust inside the case, which was cleaned out.
  • Inside the case there is a slotted disc and an optical switch that signals the "home" position of the hex key. The optical sensor provides a TTL signal (0-5 V). What we found was that the switch was outputting a 2.3-2.7 V signal that would randomly signal the home position. Jurie modified the switch's mount to increase the gap between the disc and sensor. With this modification the optical switch outputs a solid 5 V when the slot is sensed and provides the correct home position, solving the hex key alignment problem. -mills


Agico KLY4-s:

  • No issues


D-Tech 2000

  • Awaiting replacement, new controller & software have been ordered


Schonstedt thermal demagnetizer:

  • No issues but there are still only 2 quartz boats until the replacement thermal demagnetizer arrives.


Items on Order

  • 12 Agico automatic cube sample holders and alignment tools
  • 3 Haskris 17 sq. in. suction filters
  • Brooks Automation Cryo-Compressor 8200
  • ASC Thermal demagnetizer
  • D-Tech 2000 controller and software


Miscellaneous

  • The absorber in the cryo-compressor was changed. Due again 5 January 2012.
  • A Bartington MS2 meter with MS2F probe had been set up for the scientist to measure discrete samples after each demagnetization step.


  • Email from Jeff Gee To Margaret Hastedt

Also, when the compressor/cold head shut off the SQUIDs exhibited very strange behavior. Basically, they flatlined (with noise at 10^-8 emu or far less than normal) even though a basalt core was running through the magnetometer. In fact, the 10^-5 A/m output is how we first noticed that something was wrong. Ever seen this type of behavior? jeff
Reply from Margaret Hastedt 6 Jan 2011
Yes, I did see this once on x.329, after the cryocooler had been off for at least 12 hours (none of us noticed!). Although still superconducting, the SQUIDs quit counting. The temperatures were all elevated according to their voltages, but why that would have ANYTHING to do with the counting electronics had me completely baffled. Maybe something to do with the striplines? The behavior ceased as soon as the temperatures cooled back down.
Maggie

  • Implemented the IODP MagSus1 sensor code and tested it with the MSL version 2 software. Better magnetic susceptibility standards are required for calibration purposes: the Bartington MS2c standards do not cover a large enough range, and the measured values (Bartington MS2 meter) do not agree with the designated values. This sensor is still at the prototype stage and has not been instituted as an approved sensor.
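If a wider set of standards becomes available, the calibration itself is a simple least-squares fit of measured against designated values. The sketch below is illustrative only; the standard values shown are placeholders, not real MS2 data.

# Minimal calibration sketch; the standard values below are placeholders, not real data.
import numpy as np

designated = np.array([50.0, 200.0, 1000.0])   # nominal susceptibility of standards (arbitrary units)
measured = np.array([47.5, 193.0, 962.0])      # hypothetical meter readings of the same standards

# Fit measured = slope * designated + offset, then invert to correct new readings.
slope, offset = np.polyfit(designated, measured, 1)

def correct(reading):
    # Map a raw meter reading back onto the designated-value scale.
    return (reading - offset) / slope

print(round(correct(500.0), 1))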



ET Report

Jurie Kotze: Etienne Claassen

Core Lab

1. The NGR rack and frame were sagging and deemed unsafe for use by techs and scientists. To protect the equipment and for safety, we removed the entire frame surrounding the NGR equipment piece by piece and built a new rack for the NGR electronics that is safe and neat. We also built a new Plexiglas cover for the NGR to protect the instrument from splashes, which have occurred in the past when core liners exploded over gassy sediment cores.
2. Built a new core description table, mounted two new screens, modified the screen arms with heavier-duty hydraulic lever arms, and laid all cables between the screens and the computer. Repaired the power pack for the booster and rerouted all cables (screen and USB) in the ceiling.
3. Helped with the installation and development of the new laser core liner engraver; wired the instrument interlocks and safety switches together with some electronic control additions. Some further details still have to be designed and installed before it can be put into full operation. This was the trial period to establish what else needed to be changed and developed, and it will be functional in the near future.
4. Endless difficulties occurred with the cryomag helium compressor; it kept tripping on over-current or over-temperature. Several attempts were made to change the Haskris water temperatures and clean the filters, and a suspicion of a clogged heat exchanger inside the compressor came to mind. The power-up switch was replaced as well, since it is not just a switch but monitors several parameters of the unit and trips it when the unit goes out of spec. The absorber filter was also changed, as it was due for a change in January. The compressor current ran at the amperage the manual prescribes (8.5 A), and a mains and current logging monitor was installed throughout this time, but it only ever showed normal current consumption. The only thing that caused some concern was the helium static pressure, which was sitting at 275 lbs; according to the manual and the marking on the gauge it was supposed to be at a static pressure of 250 lbs. After letting some gas out under controlled conditions we dropped it to 255 lbs static pressure. After restart the unit ran cooler, the running current dropped by 0.5 A, and it never failed again. This suggests the gas pressure may have been too high for its comfort.
5. The D-Tech 2000 kept failing due to communication hang-ups, although nothing electronic was at fault with the machine. The only change to the system was a new PC and operating system with new software. With a lot of patience we constantly nursed it back into operation, and it then kept going without failure for the last two days. The D-Tech will be replaced with a new model machine, as the new unit did not arrive in port in time. Our conclusion is that the problem is simply the large generation gap between the new PC and software and the old-generation machine; it should be a thing of the past the moment the new machine is installed. The D-Tech used to be a very reliable instrument and we never had this kind of trouble previously; it has just reached the point where replacement is due because of its age.
6. The Agico JR6a spinner magnetometer gave problems with the automatic cradle alignment mechanism. After some fine tuning and mechanical adjustments it worked fine again without problems.

Chemistry Lab

1. The fan of the Fisher Scientific high-temperature furnace in the chemistry lab became noisy. We removed the fan, cleaned and lubricated the bushes (as no spare fan was available), refitted the fan, and rerouted wiring that could also have caused noise by touching the fan blades. We checked that the fan runs satisfactorily and ordered 2 new fans for the furnace.
2. Ordered new Omega handheld temperature meter plus type K shielded thermocouple (old one failed and is beyond repair) for monitoring the inside temp while in use.
3. Ordered a new 5 kVA step-up transformer for voltage compensation so the furnace can reach its maximum temperature of about 1150°C at the factory voltage of 230 V; with the ship's 208 V it would only reach 980°C.
4. Two 30 A relays ordered for the lab incubator are still outstanding and might need a follow-up check: one to replace the failed relay and one spare. The rest of the spares for it arrived two expeditions earlier.
5. The Carver press that gave problems during the previous expedition was removed. The controlling pressure gauge was removed and repaired. The reported oil leak was searched for, but nothing leaking could be found on the unit itself. The whole machine was washed to get rid of all oil on it, then piped into a closed circuit and kept running at a pressure of 40,000 lbs for a 12 hr period; no leak showed up. The cause might have been overfilling with hydraulic fluid. During the expedition some oil appeared again on the floor in the vicinity, so it might be worth going through the rest of the piping fixtures for leaks elsewhere if it is not overfilling on one of the other machines. We suggest asking the ETs for help, or reporting to the ALOs, if oil levels need to be checked.

Paleo lab

None, other than a service on an ultrasonic cleaner.

DHML

1. Assisted with setting up and impedance matching the Göttingen Borehole Magnetometer (GBM) tool to the Lamont 10 km logging cable. Several adjustments and matching stubs had to be made for the tool to establish and maintain communication with the onboard equipment used by Lamont and the third-party equipment from the University of Göttingen.
2. Installed a Trimble GPS antenna outside on the top deck railing for a Trimble receiver in the LO's office, in aid of ship positioning for Lamont logging operations.

Thin section lab

1. Built a low-voltage LED lighting circuit for the lab rock cutting equipment to give better light in and around the cutting area. The LEDs were made watertight, not because of any electrical danger but purely to prevent corrosion.
2. Relocated the X-ray lab bead maker extraction fan situated in the thin section lab, which was making a very loud and annoying sound that was unacceptable for the techs working there; we moved it above the ceiling board and secured it to the steelwork. All piping and wiring remain in the same position as before; only the fan has moved away from the direct working space.

Underway

Assisted with Maggie deployments and retrievals during transits.

Bridge deck offices

Requested fresh license codes for eight of the ten Rigwatch dongles from Canrig and refreshed their license codes for the next 255-day period.
Repaired the coffee grinder on the bridge deck: replaced the springs between the two blades, cleaned the coffee machine, and checked all moving parts.

Gym

The treadmill failed; the motor started to burn due to worn-down brush gear causing excessive sparking on the armature commutator strips. Cleaned the controller board, repaired dry joints, rinsed the board with contact cleaner, reassembled the board, and replaced the heat-sink compound on the power transistors. Removed the motor, had the mechanics skim the commutator contacts on the lathe and clean the spacers between contacts, shaped new brushes from old ODL motor brushes as no brushes were available for our machine, checked the bearings and armature, reassembled the motor, reinstalled it in the treadmill, and tested it: working and drawing 6.8 A. All checks were done as instructed by Cybex Co.: voltage to spec, amps to spec at no load and full load. Realigned the belt on the treadmill and checked the PC board settings for speed control; everything is working satisfactorily again. Spares were ordered for the treadmill: new belt, rip belt, deck, and brushes.

ET- Shop

Built aluminum frame for TV and storage shelves in ET shop and DHML.
Ordered spares for Shop's general stock and various dedicated spares for equipment.
Cleaning operations will be done in due time.

General

It was a trip filled with challenges, and I, Jurie, want to thank my colleague Etienne for a major amount of mechanical construction and building projects that I could not have produced at the standard of workmanship that he did.

CURATORIAL REPORT


Lara Mile

Summary

Samples: 5359 samples were taken on Expedition 330; of these 1208 were shipboard samples and 4151 were personal samples.
Core recovery was 808 m, consisting mainly of volcaniclastic breccia, sandstone, and some limestone.

Shipments

CORE:
Core-There is a total of 144 boxes of core (72 archive, 72 working)
Hand Carry- Jonathon Kell will hand carry smear slides he made on board.
RESIDUES:
Residues- The residue distribution is as follows:
ICP: Chen
MADC: Fitton, Sano, Fulton
PMAG: Deschamps
TS: GCR, Buchs
TSB: GCR
XRD: Rausch
SS: GCR
Frozen & Refrigerated Shipments- At this time we estimate 2 boxes of frozen and 3 boxes of refrigerated shipments will be sent from Auckland, New Zealand.
Thin Sections- 286 thin section slides were made from 265 thin section billets on Expedition 330. All thin sections will be sent to the GCR for inventory, before they can be requested by shipboard scientists. The thin section inventory is attached in the same email as this report.
Smear Slides- The sedimentologists prepared and described 30 smear slides. The Smear Slide inventory is attached with this report. All smear slides will be shipped to the GCR.

Core Flow Activity and Sampling

Core flow followed the same procedure as Exp 327.

Hard Rock Sample Parties

There were a total of 10 Sample Parties held during Exp. 330. Due to the large volume of recovery both the conference room and the core deck (description, sampling, cutting room, and down hole) were utilized.

Action Items

Pieces for CT Scan:

  • There is a request from Dr. Shin-ichi Sano to borrow several whole pieces from U1376.
  • There is one piece (U1374A_3R1_pc 1) for CT scanning that has been packaged and sent with the thin sections.
  • Please see attached inventory list for Loan and CT SCAN pieces.
  • There is a request to borrow thin sections 876IODP.
  • There are a few cubes that will be cut at the GCR and the labels for these cubes have been included in the thin section shipment.

Computer Software

Curatorial Oddities Report (COR):
This new report was a great help in maintaining an accurate database. http://ararat.ship.iodp.tamu.edu:8080/UWQ/
Piece Length report:
Scientist Lara Kalnins wrote Matlab code that checks for irregularities in bin lengths/overlaps and core lengths for hard rock cores. I will get a copy of it and store it in the Curation folder on the server. An illustrative sketch of this kind of check follows.
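The actual checker is written in Matlab; the following Python sketch only illustrates the kind of consistency test it performs. The CSV layout and column names are assumptions, not the real data format.

# Illustrative only; the real checker is Matlab. CSV column names are assumptions.
import csv

def check_pieces(path):
    # Flag bins with zero/negative length and bins that overlap the previous bin in the same section.
    with open(path, newline="") as f:
        rows = sorted(csv.DictReader(f), key=lambda r: (r["section"], float(r["bin_top_cm"])))
    problems = []
    prev = None
    for row in rows:
        top, bottom = float(row["bin_top_cm"]), float(row["bin_bottom_cm"])
        if bottom <= top:
            problems.append(f"{row['section']} piece {row['piece']}: zero or negative bin length")
        if prev and prev["section"] == row["section"] and top < float(prev["bin_bottom_cm"]):
            problems.append(f"{row['section']} piece {row['piece']}: overlaps the previous bin")
        prev = row
    return problems

# for problem in check_pieces("piece_log.csv"):
#     print(problem)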
Latest Version of Sample Master: The latest version of Sample Master on the ship as of Expedition 330 is v2.1.1.6. The curatorial staff in the GCR needs to ensure that this is the version installed on all workstations in the repository.
Problems encountered
Printing and Uploading in Sample Master
When entering and uploading samples for shipboard analysis there was a constant problem with printing. Samples would be entered and "print on upload" selected, but an error window would come up saying there was nothing to print.
When "print on upload" did work, it would not print all of the labels; typically it would leave out the first or second core. The problem was not consistent in which line or lines it left out.
For the sample parties I used the Parameter Search in Sample Master to print out the labels by core. However, when trying to print out several cores at once the same problem would occur where some of the labels would be skipped. Again, it didn't matter if there were 5 labels or 50.
Parameter Search and EDIT TAB in Sample Master
The parameter search in Sample Master version 2.1.1.6 was unstable. The red "X" would appear randomly, whether it was loading 8 samples or 300. Towards the end of the cruise it became less and less reliable, to the point that it would crash every time I tried to retrieve samples, regardless of how many samples were requested or which server I was on.
Features to Sample Master:

  • It would be nice if, when using the parameter search "Core" criterion, a range of cores could be input instead of just one at a time.
  • When picking through different cores to view/edit the sample table, sometimes the screen will still show the last core that was selected instead of the current core. A refresh button that updates Sample Master to show the currently selected core would be a great help.




IMAGING LAB

William Crawford

SPECIAL PROJECTS- Whole round imaging

The main goal of this special project is to secure and hold whole round sections of the core precisely enough to allow us to rotate and photograph the complete 360 degree outer surface of the core and combine the multiple images into a single contiguous photograph. These images can then be directly compared with those generated during downhole imaging of the formation using the FMS logging tool, which will enable robust correlations of structural properties of the formation (e.g., between core and log data). This comparison, which is currently not possible, will permit an accurate placement of the photographed sample relative to its location below the seafloor, as determined with logging data.
Design efforts to develop a method to image the exterior of whole round core sections began on shore and continued during Exp 330. I worked with LO Bill Mills (who generated drawings in SolidWorks) and with the machinist on board (NAME) to create a working prototype using cylinders of solid nylon and core tubing material. The next phase of the project entails: 1) feasibility studies and material selection, which will also be conducted with assistance from professional machinists; 2) integration with core track devices, with assistance from line scan camera track developers; and 3) development and implementation of software and image inscription protocols, with assistance from programmers and scripting experts.
The project will be completed and tested on shore, prior to a planned shipboard installation at the beginning of the SuperFast Expedition, in April 2011.
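For the image-combination step, one simple approach is to photograph the core at fixed rotation increments, crop the central strip of each frame, and paste the strips side by side into an unrolled 360 degree panorama. The Python/Pillow sketch below is conceptual only; the file naming, 10 degree increment, and strip width are assumptions, not the design that will eventually be implemented.

# Conceptual sketch only; file naming, the 10-degree increment, and strip width are assumptions.
from PIL import Image

def unroll(paths, strip_frac=0.05):
    # Crop the central vertical strip from each rotation-increment image and
    # paste the strips side by side into one contiguous "unrolled" panorama.
    images = [Image.open(p) for p in paths]
    width, height = images[0].size
    strip_w = max(1, int(width * strip_frac))
    left = (width - strip_w) // 2
    panorama = Image.new("RGB", (strip_w * len(images), height))
    for i, im in enumerate(images):
        panorama.paste(im.crop((left, 0, left + strip_w, height)), (i * strip_w, 0))
    return panorama

# unroll([f"core_{deg:03d}.jpg" for deg in range(0, 360, 10)]).save("core_unrolled.jpg")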

SYSTEMS STATUS

DIGITAL IMAGING TRACK

Operational issues: The track-mounted line scanner worked well during the whole expedition, with one exception. There was a computer shutdown, and when it was brought back up the operators failed to notice that the camera had defaulted to a previous calibration; as a result a number of scanned sections were too dark, but at a density that could be corrected. No sections needed to be rescanned, and I corrected the images in the database. To reduce the likelihood of this happening in the future, two actions were taken: 1) both user areas were populated with the current calibration settings, so that if power fails again the camera will pick up one of those two sets of data and default back to the latest and correct calibration; 2) Trevor Cobine worked with the ETs to build a power supply that is independent of the computer, decreasing the chances of the camera power being shut down, i.e., from now on, if they have to shut down the computer it will not affect the camera's memory.
Quality control: The images were constantly monitored and a calibration check was done between each site and hole.
Printing of images: A new practice of printing out sections of core side by side on tabloid-sized paper was implemented to allow scientists to generate hand annotations as the core was being described. Human interface with DescLogic was found to be deficient, so scientists resorted to this method.
It is my understanding that the density-corrected images are being pulled from the workstation (not the database) and configured for printing. In the case of a lighter core, the density correction used was found to be in error. It should be noted, however, that the correction was not created by the Imaging Specialist, nor was it seen by the Imaging Specialist prior to deployment. My recommendation here is that whenever there is an issue that has to do with images (be it stills, video, etc.), the Imaging Specialist must be consulted, in every instance.
The practice of printing out core images to annotate them by hand, as well as the downloading of data to Excel, are clear indications of the scientists' continued dissatisfaction with the interface DescLogic offers.

MICROSCOPES

Configuration and alignment- The microscopes were configured as per the request of the users:
1) microscopes for petrographic use were configured with a range of objectives beginning with 2.5x;
2) the axiophot scope at the very bow end of the lab was configured with objectives ending with 100x oil for use by the nanofossil expert;
3) the foram work station was configured with a stereo scope and appropriate lighting;
4) two stereo scopes were deployed in the smear section area, one stereo scope in the chemistry lab and another in the ET shop.
Overall Microscope Assessment- After configuring the microscopes for specific needs in the first few weeks after departure, the microscopes required little attention other than cleaning and minor adjustments, except for the following instances where repairs were needed:
1) The slider mechanism (which switches the light path between the oculars and the camera) in the axioskop located on the starboard lab wall became jammed. My assessment is that the slider rod was forced over so that it pinched with the mechanism of the other rod. The prism was disassembled and repairs were made.
2) A second instance of problems with a slider mechanism occurred on the Axiophot scope. I believe that this was also caused by user abuse, which caused an aluminum shaving to jam the mechanism. A major disassembly of the microscope to access the slider area was necessary. The mechanism was cleaned and polished, and the microscope was reassembled. Total time: 4 hours.
3) The bulb socket in the axiophot was replaced, after two unsuccessful attempts to clean the bulb and socket.
4) The Axiophot computer in one of the workstations developed a glitch; Chris Bennight implemented a gain adjustment to clear an error message in the SPOT program.
Data-base interface- The deployment of Chris Bennight's interface with the spot cam software was a tremendous success! The ease of operation was evidenced by zero complaints, and with a record 1257 images uploaded to the database to date.

CLOSE-UP PHOTOGRAPHY

The close-up table was rigged up and put in working order. The system worked very well, and the captured images were displayed in real time on the monitor, which allowed for immediate visual inspection and feedback. I was then able to make subtle adjustments to the lights to achieve the desired effects. For example, for many of the small crystal structures requested, the lighting needed to be adjusted, and it even required the use of a handheld source to accurately control the direction and angle so as to optimize contrast or texture and in some cases illuminate crevices. I found this to be challenging and rewarding.
Once again, working with Chris Bennight and David Fackler in integrating image acquisition and data-base archiving, immediate printing of appropriate labels and the ease of uploading the final images made the process of capturing close-up images a joy.
The acrylic box provided by Beck and Kuro was used extensively. This device allows for a controlled intensity of the background lighting level so as to achieve a clean shadowless white. Background light separation allows the sample to be illuminated at the ratios and angles required for each particular sample.
We did have a failure of one of the lights. It was requested that two lights be ordered, but there is some debate as to the need for this. I feel, however, that it is essential to have a spare light as a backup. Otherwise, there will only be one light to illuminate the background, and this will not cover much more than a 20 cm sample. The challenges imposed by a lack of spare lights are, I believe, unnecessary.

Video and still efforts

My video efforts were limited to sharing my B-roll footage with the Ocean Leadership videographer who sailed on this expedition, and learning the new audio systems. B-roll capture of various activities from the lab stack to the drill floor was downloaded and archived. The new video camera and wide-angle lens, coupled with the shotgun microphone, improved image gathering ability.
Still image production was normal with no malfunction of equipment. AA batteries were added to the list of consumable items in inventory.
Established workflow protocols using Adobe Bridge to append metadata to each image. The Bridge window for metadata entry can be populated using cut and paste from Excel and Word documents. This functionality can be exploited to allow rapid identification of each of the people in a given image. Eventually a complete personnel list will be populated at the beginning of each expedition, and the routine personnel lists (e.g., for each of the A and B crews) can be easily imported from archived lists. Using this feature easily adds all relevant metadata to the raw files and ensures correct spellings and titles for all personnel (a small sketch of preparing such paste-ready lists follows). A downside of this method is that it does not allow for listing names on an image as they appear from left to right. On the plus side, multiple images with similar content can be selected so that the same metadata can be added to all of the related images at the same time. I find the implementation of Adobe Bridge to append metadata to each image very user friendly, and the resulting metadata provides the much needed base for building a searchable image management system.
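As a minimal sketch of the list-preparation step, the snippet below builds one comma-separated, paste-ready keyword string per crew from a roster exported to CSV. The file name and column names (name, title, crew) are assumptions made for illustration; the actual paste into Bridge is still done by hand.

# Sketch only; "roster.csv" and its columns (name, title, crew) are assumptions.
import csv
from collections import defaultdict

def crew_keywords(path="roster.csv"):
    # Return one comma-separated keyword string per crew, ready to paste into
    # the Bridge metadata panel for a whole batch of selected images.
    crews = defaultdict(list)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            crews[row["crew"]].append(f"{row['name']} ({row['title']})")
    return {crew: ", ".join(sorted(people)) for crew, people in crews.items()}

# for crew, line in crew_keywords().items():
#     print(crew, ":", line)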

Petrographic image capture and archiving tool (picat)

The Picat system worked well and was received very well by all scientists. At the beginning of the expedition I imaged all the thin sections, but soon after that the duty was taken over by Johanna Suhonen, who assisted Gus Gustafson with thin section manufacture. Johanna has an interest in photography and picked up the procedure quickly.
Maintenance- Picat had minimal maintenance issues. We did, however, burn out two bulbs and had problems with the power supply switch, which was temporarily repaired by the ETs. Bulbs had been placed on the inventory list on Expedition 327, but that request does not appear to have made it through the system. A new request was made and confirmed with shore that the proper bulbs will be in stock for future expeditions, with a stock level of 5 and a reorder level of 3. The problem with the light source power supply keeps recurring due to a faulty switch. This issue needs follow-up on shore and possibly requires replacement with a more robust component.
All in all the system worked well, and 579 images were collected and archived. This reflects two images taken from each thin section (single polarization and cross polarization), with multiple images and exposures taken from some samples where a larger dynamic range was needed. This, however, was rare.
Having the PICAT workstation next to the close-up table was deemed advantageous, as this allows for combined storage of supplies and lenses. Many times the 65 mm macro lens was needed for the smaller requests, and the lenses needed were within reach.
Data-base interface- At the beginning of the expedition, entering the thin section data was problematic since this crew was not familiar with adding the sample request and the curated thin section to the database. However, once protocols were established for an efficient information transfer between the curator and the thin-section techs, a good workflow was achieved and thin section data archiving became routine and very easy.
Proposed improvements- In addition to exploring a more robust power supply switch, I would like to investigate the use of a brighter light source. Currently the bulb we use is 12 V AC with a 100 W power supply, at a color temperature of approximately 3200 K. The amount of illumination is adequate; however, to achieve the best exposures in our work environment (e.g., vibration issues) we have to use higher shutter speeds, and medium f-stops are also desirable to achieve better edge sharpness. To achieve both, it is common to use ISO 6400, the maximum value the camera can handle, but pushing the sensor to that limit results in undesirable noise. So far this does not appear to be detrimental to the type of images we are harvesting with the Picat, since the grainy texture is not usually noticed in this type of application. However, working at the limits of the tool is not desirable, and such large ISO values may compromise image quality. Therefore I plan to explore the use of a brighter light source as one of the proposed improvements to the system.
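The arithmetic behind the brighter-source proposal is straightforward: keeping the same shutter speed and f-stop, each halving of ISO requires a doubling of light. The short sketch below assumes a target of ISO 1600 purely for illustration.

# Exposure arithmetic sketch; the ISO 1600 target is an assumption for illustration.
import math

current_iso, target_iso = 6400, 1600
light_multiplier = current_iso / target_iso   # linear brightness factor required
stops = math.log2(light_multiplier)           # same factor expressed in exposure stops

print(f"Need ~{light_multiplier:g}x the light ({stops:g} stops) to shoot at ISO {target_iso}.")
# -> Need ~4x the light (2 stops) to shoot at ISO 1600.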

Summary

213 Close-up Images
737 Line Scan Images
1257 Micrograph Images
579 Thin Section Images
44.5 GB Raw Still Images




Publications Specialist Technical Report


Rhonda Kappler


Issue Summary


  • Provided administrative, graphics and publications support at port call and throughout the expedition. Will also provide support to 330T Auckland port call since there is not a yeoperson sailing.


  • Was responsible for all expedition paperwork, wireless devices, communication policy, photo release forms, manuscript and photo copyright forms, etc. Worked with the IODP travel administrator to coordinate and verify hotel arrangements for the end of the cruise. Acted as a contact for providing port call information to visitors. In addition to the normal customs/immigration documentation, New Zealand authorities required that written proof and itineraries of flights leaving the country be placed in the passports of all 330 participants, along with the arrival form, upon return to port in Auckland.


  • Inventoried Pubs Office supplies on hand and checked the ship's storeroom for availability. Suggested that the Pubs Specialist's shore supervisor determine which supplies are required on a regular basis to perform duties, then have the ship's inventory supply list updated to accommodate those needs.


  • Six sites with eight holes were drilled, with 1114 m of total drill advancement and 807 m of rock recovered at an average recovery of 72.5%; Hole U1374A achieved 88% recovery, setting a new "hard rock" record. Attended science site presentations, crossover, and other meetings. Worked with the staff scientist and co-chiefs, as well as very closely with science party members and the Desclogik administrator, in creating graphic data reports and troubleshooting various software issues.


  • Provided support for Visual Core Descriptions (over 720 total for sediment and igneous) and Core Recovery Summary figures. Created methods legend figures for both sediment and igneous recovery and made them available to the scientists. Collected, organized, and tracked all text, tables, plates, and figures for the expedition volume.


  • Encountered and addressed multiple database and download "bugs" and issues. See also the Core Describer technical report for many of the same problems submitted in detail by Thomas Gorgas as well as numerous emails and issues reported to programming personnel and JR Developer.


  • Helped coordinate social events: birthday celebrations, holidays, parties, contests, expedition logo design, t-shirt making, etc. Also worked closely with the Captain in aiding a ship's catering crew member who had experienced an unexpected loss.



Equipment and software performance summary


Strater2


Performed well and only crashed once. The handling of the core summary descriptions is improved; however, a remaining disadvantage is the description "cutting off" on igneous sheets. A request has gone in to Golden Software to work on the problem. If the description feature worked better, it could potentially save the Pubs Specialist much of the work of printing hundreds of Word files over and over again. Due to constant changes and corrections for Site U1374A, the 370 Word files were printed 4 times x2, with each reprint requiring about 8 hours.
New for version 2 is batch printing, which is helpful but doesn't work as well as it should. The printing sequence sorts on the primary number as text (for example, 21-1 prints directly after 2-1 instead of 2-3 printing next), requiring pages to be resorted before being punched and put in binders. Also, pages often print at random rather than in order (for example, core-section 36R-6 may end up printing prior to core 34). A small sketch of the numeric ordering that would avoid the resorting follows.
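If the label list can be pre-sorted before it is handed to the batch print, a "natural" numeric sort puts the cores in the order a reader expects. The sketch below is illustrative only and operates on example labels.

# Natural-sort sketch for core-section labels; the labels below are examples.
import re

def core_section_key(label):
    # Extract the numbers so "2R-1" sorts before "21R-1" instead of after it.
    return tuple(int(n) for n in re.findall(r"\d+", label))

labels = ["21R-1", "2R-1", "2R-3", "36R-6", "34R-2"]
print(sorted(labels, key=core_section_key))
# -> ['2R-1', '2R-3', '21R-1', '34R-2', '36R-6']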

Database, entries and data upload or download related


Multiple issues with database information and LIMS2Excel downloads delayed production drastically. This was especially true of the deepest, highest-recovery hole, U1374A. Unfortunately, in a few instances where "fixes" were needed, the result was that the fix itself required a fix.
Many of the issues and bugs prompted changing the xml files from utilizing "create worksheet" in LIMS2Excel to using the "Desc template" download selections, as that seemed more trouble free. In addition, the xml files were changed to allow longer summary description line lengths in order to control hard returns. With longer lengths, hard returns only appear where intentionally placed, whereas shorter lengths caused returns at the end of the character limit when brought into Strater.
Sediment = 180 characters
Igneous = 4000 characters
Scientists expressed frustration over issues with Desclogik. This included the slowness of uploading, which prompted questions about the process; for example, why descriptions have to be in three separate places for igneous (a Word-generated file, copied to the Desc summary cell, and the Word file uploaded into Desc). They also commented on data not canceling properly or "changing" later.
In addition, scientists wanted the ability to enter data as they saw fit, and "molding" data entries to fit the needs of the database was limiting and discouraging to some. This caused unexpected use of Desclogik templates and columns, resulting in problems with plotting in Strater.
For example, with alteration, scientists were entering data for smaller intervals that fell within larger intervals. A section might be "moderately altered" for the majority of an interval, but a slight occurrence may also fall within it. Strater cannot decipher and properly plot intervals that fall within or overlap other intervals.
The same was true of data entered for vesicularity. The vesicularity abundance percent column was intended to record only data associated with vesicles, but it ended up containing percentages for voids, vugs, and other features as well. The result was hand manipulation to delete the unwanted percent entries from the download in Excel. To help alleviate this issue, a modified template with a "non-vesicle" entry column was developed to give scientists the flexibility to keep these entries while eliminating the hand manipulation and allowing correct plotting in Strater. A small filtering sketch follows.
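Where hand manipulation cannot be avoided, it can at least be scripted. The sketch below drops non-vesicle rows from a downloaded table before plotting; the file name and column names are assumptions, not the actual LIMS2Excel output layout.

# Sketch only; the file name and column names are assumptions, not the real download layout.
import csv

def vesicle_rows(path="vesicularity_download.csv"):
    # Keep only rows describing vesicles; voids, vugs, etc. are dropped so the
    # abundance column plots as true vesicularity.
    with open(path, newline="") as f:
        return [row for row in csv.DictReader(f)
                if row["feature_type"].strip().lower() == "vesicle"]

# rows = vesicle_rows()
# with open("vesicularity_clean.csv", "w", newline="") as out:
#     writer = csv.DictWriter(out, fieldnames=rows[0].keys())
#     writer.writeheader()
#     writer.writerows(rows)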
Ideally, information in a download should match what was entered and seen in Desclogik. However, issues with this expedition's data became so problematic and time consuming that they required additional manual manipulation within the downloaded Excel files in order to get the work done. These were not associated with intentional science data entries, as in the cases above. Excel files were extensively reviewed for problem areas against entries in Desclogik and then adjusted in Excel to match. Some discrepancies were human error, but by far the majority were process and system issues. Such "corrections" were only done to get work out, and only after consulting the scientist and looking over the Desc entries to be sure that it was the download that was the problem.
All issues have been reported for proper correction in the database. In some cases the LIMS2Excel program had "bugs"; in others it was more Desclogik related. Deleted data entries in Desclogik did not cancel out properly, causing "hidden" entries to be downloaded. These were often associated with LIMS2Excel combining exact intervals and placing the values or data entries in one cell separated by a comma delimiter; or, if the intervals were not exact, an "extra" line of data that was meant to have been deleted would be obtained.
A particularly puzzling problem pertained to inconsistent depths, which happened too many times. What would come up in Desclogik (for stratigraphic/lith unit, groundmass, vesicularity, phenocrysts, etc.) as automatically generated when the scientists first began describing core and filling in entries did not always match depths from WebTabular reports or LIMS2Excel downloads using SCALINPUTS, SECT, PC, or other scales. This occurred primarily with section bottom depths at multiple sites and sections; some discrepancies were as little as 1 cm, whereas others were as much as the section depth.
One example is U1375B-1R-1:
WebTabular – LIMS Section Summary = 0.61 m
Image download = 0.61 m
Orientation, Piece and Section scale = 0.6 m
Desclogik entries = 0.57 m
Random changes to value lists caused problems with mapping data in Strater using the schemes created. For example, the value list selection somehow changed to exclude a space between words from what seemed to be one site to the next, yet no changes had been made to the actual list in Desclogik.
As must be true of all expeditions, sometimes scientists did not utilize the value lists, especially during the first and into the second site. At other times the value entries themselves contained inconsistencies, also causing problems, because multiple choices for the same thing were available within a pull-down list (for example, with a plural "s" vs. without). In one instance, "conjugate joint" was entered with one space between the words and also with two. There are quite a few more examples that could be provided. These inconsistencies can cause work to have to be redone and definitely slow progress; it took a while to determine why an entry was not plotting in Strater, since at a glance the values look the same. A small sketch for flagging such near-duplicates follows.
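Near-duplicate value-list entries of this kind (extra spaces, plural forms) can be flagged automatically before they break the Strater schemes. The sketch below is illustrative; the example entries are invented and the normalization rules would need tuning for real value lists.

# Illustrative sketch; the entries and the normalization rules are simplifications.
from collections import defaultdict

def normalize(value):
    # Collapse repeated spaces, lower-case, and strip a trailing plural "s".
    v = " ".join(value.split()).lower()
    return v[:-1] if v.endswith("s") else v

def find_near_duplicates(values):
    groups = defaultdict(set)
    for v in values:
        groups[normalize(v)].add(v)
    return {key: sorted(group) for key, group in groups.items() if len(group) > 1}

entries = ["conjugate joint", "conjugate  joint", "vein", "veins"]
print(find_near_duplicates(entries))
# -> {'conjugate joint': ['conjugate  joint', 'conjugate joint'], 'vein': ['vein', 'veins']}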
Many emails to the JR Developer, as well as "right now" type requests to the programmers, contain other problem areas and details that may not be listed within this report.

Other


The server Vol1 lost its link and disappeared multiple times from the Mac platform, requiring MCS help to regain access.

Special note


It was extremely helpful to have conscientious and knowledgeable people to turn to on board this expedition. Core Describer and Desclogik Administrator Thomas Gorgas made extra efforts to check how data and Desclogik were affecting the Pubs Specialist's work in order to try to help alleviate issues. In addition to multiple emails, those with programming powers and deeper database knowledge (including David Fackler and James Zhao as well as Chris Bennight) were often interrupted by on-the-spot help requests. The biggest portion of the work would not have been completed without such support.

Additional duties and information


  • Added movies to the ship's collection and two copies of Oceanography (Vol. 23, No. 1, March 2010), the special issue on Mountains in the Sea, contributed by co-chief Anthony Koppers.


  • Updated the "Cabin Info" form under Deficiencies to include hard hats needed in cabins and saved for access with other Publications Specialist materials.


  • Helped the ship's Radio Operator with collection of cell phones twice during the expedition.


  • Assumed duties as the ship's stores sales primary contact.


  • Opened and learned how to use Font Creator for the first time to develop new Lithsymbol fonts for use with VCDs and Methods legends. Created six new symbols.


  • Developed spreadsheets for Desclogik templates, tabs, and columns utilized for importing into Strater2 and creating VCDs so that the Core Describer and Desclogik administrator, Thomas Gorgas, and scientists would be better informed.


  • Learned how to create a percentage recovery template and draft figure at the request of a scientist.


  • Made several modifications to the Strater VCD templates, including:


  • Divided igneous template structures data to load in two columns to prevent symbol overlap and improve readability.


  • On the sediment template, divided several columns (samples, lith accessories, etc.) into two, changed the shipboard samples to utilize schemes, and changed the section column to display lithology units at the request of the staff scientist.


  • Modified the sediment Age Zone columns for PF and Nannofossils to contain an age range for multiple overlapping entries at the request of and with help from the primary scientists.


  • At the request of the structural scientists, changed the dip azimuth and angle columns to enable a more accurate representation of the degrees. For example, azimuth should always be three digits, maintaining any preceding zeros, as a standard (see the short formatting sketch at the end of this list). This had not been done for previous expeditions. It required changes to the Strater VCD template and to the type of log column being utilized. Alternative methods were attempted in both Excel and Strater, but it was found that this also required the special attention of programmers to "mark" and manipulate these specific columns in Desclogik so that the downloaded data are maintained as text, which is necessary for this to work properly. This had to be done several times to accommodate new sites and data entries.


  • Provided comprehensive email updates required by the Publications Specialist Supervisor often followed up by detailed replies to specific questions.
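Regarding the azimuth formatting item above, a minimal sketch of the requested convention (three-digit azimuth, leading zeros preserved, stored as text) is shown below; the example values are invented for illustration.

# Minimal sketch of the requested azimuth formatting; the example values are invented.
def format_azimuth(azimuth_deg):
    # Zero-pad to three digits and return as text so leading zeros survive downloads.
    return f"{int(azimuth_deg) % 360:03d}"

print(format_azimuth(7))    # -> '007'
print(format_azimuth(255))  # -> '255'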



Suggestions


  • Provide better instruction for scientists at the beginning on how Strater2 works with database downloads to create VCDs, and on the implications of not utilizing value lists or not maintaining consistent methods and entries. Better utilization of value lists earlier on would also be a big help. Many said it would have been good to know more specifically how important value lists and consistency were, and to which particular columns, for generating reports and utilizing Strater.


  • I would also suggest continuing to hand out spreadsheets that clearly illustrate where we have such issues. See the attached.


  • Create a notification and check system for changes, including re-curated core data.


  • Find a way for computers to be restarted and Desclogik refreshed on a more regular basis, since the computers and programs are used for hours upon hours and even many shifts. This was posed as one of several possible causes contributing to "wrong" depths coming up in Desclogik. I also personally found that in order to see changes you must sometimes completely log out and back in.


  • A better system is needed for notifying the Pubs Specialist when data are truly ready or complete. It is understandable that data will change while core is still coming up and describing is going on. However, there were several times I was informed that data entries were complete, all corrections made, and everything ready for download, only to discover later that this was not the case. For U1374A, I had updated and printed materials (370 VCDs and 370 Word files) three times, only to discover while at the copier/printer sorting the latest set that the data had changed and descriptions from thin sections had been added. What was to be the final version never made it past the recycle can. This is extremely challenging time-wise when dealing with large recovery and trying to move on to other work.


  • Add a programming provision for all Desclogik templates to treat and maintain the degrees data for WebTabular Reports and LIMS2Excel downloading in the manner scientists have requested.


  • More user-friendly WebTabular reports. Reading and using the data generated really requires background knowledge of Desclogik; the "general" user may have much difficulty with data interpretation. The headings and table format are cumbersome to use.


  • Fixes are needed, but avoid making unnecessary programming changes to software being used on ship and test on shore instead.


  • Extensive follow-up on shore regarding issues: more practical training; investigating the accuracy of the information being provided and generated through reports; data interaction, etc.; as well as the others listed within this report.


Chemistry Lab Technicians Report

Chris Bennight

Overview

Expedition 330 was a "hard rock" leg with a low amount of chemistry samples (primarily ICP). The reduced total throughput allowed for various general chemistry lab improvement projects to be undertaken, including:

  • Documentation – Manuals were created, reviewed, and updated for various instruments (Coulometer, ICP, GC-PFT, DIC/DOC, general chemistry SOP).
  • Inventory – The inventory system in the chemistry lab underwent a substantial overhaul. The inventory sheets where reviewed and physical counts updated, duplicate items removed, and locations normalized. In addition an in-lab location system was created, and all items on the checkout sheet had this location added in AMS.
  • Instrument host upgrades – Previously an attempt had been made to upgrade the instrument hosts to Windows 7, but this had to be rolled back due to various issues (primarily due to a lack of chemistry technician involvement in the process). During this expedition Grant Banta and I were able to upgrade numerous systems to Windows 7, and document the procedure. Where not possible the issues preventing upgrades were documented and set aside for future follow-up (software incompatibilities, etc.).
  • Addressing issues raised by the previous expedition cruise evaluations. There were various issues raised in the previous cruise evaluation, and I was able to begin work addressing some of the more pressing of these issues, primarily the GC-PFT instrument/workflow status.
  • Coulometer software upgrade – The coulometer software was replaced with a new version which should alleviate some of the common complaints and issues with the (previous) software.

Issue Summary

Included below are outstanding issues that the laboratory officer or others may need to be aware of.

  • Carver Presses – various fittings (which we don't have replacements for) are leaking/cracked. I was told this was to be followed up with on the transit, so am simply flagging this to keep it in the spotlight.
  • Parts on order:
    • A new inner door glass was ordered for the ICP to replace the current broken piece.
    • A new column for the GC-PFT was ordered from Restek to address issues from 329.
    • Rechargeable aerosol cans were ordered (to be filled with nitrogen) to replace compressed air usage.
    • Alcohol thermometers were ordered to replace the current mercury units still in service. Note that we will have these mercury thermometers ready for disposal at port call.
  • Helium gas system – unsafe and unsuitable, especially with the new industrial 1/8" banding now being used. We do not even own tin snips or other cutting devices capable of breaking this banding; we have to borrow them from the core tech shop. Injuries will occur if this is not rectified ASAP.
  • Mettler balance - The Mettler balance system requires two XS204 balances to operate. One of the balances stopped functioning on the transit prior to the last Victoria tie-up. The balance has supposedly been shipped off for repair, but I am unable to obtain information from anyone as to its current disposition (or why it isn't back yet). Chemistry lab workflow will be impacted without a functional balance (for masses > 150 mg; the Cahn covers masses < 150 mg).
  • 13C isotope spill in the radioisotope area: a small amount of bicarbonate solution (10 mL). The spill was absorbed as much as possible, and then the area was wiped down with a 1 M HCl solution, with the door open and ventilation running, to convert all the 13C to CO2 and remove it to the atmosphere.

Instrument Status

Balance-CAHN

The Cahn 31 was replaced last leg due to an internal issue. It was tested this leg and appears to be working properly after fixes by the ET's, but the Cahn 29 was used for actual analysis this leg. No issues to report. The computer was successfully upgraded to Windows 7 x64.

Balance-Mettler

The Mettler balance system has been down one unit since the transit before the Victoria tie-up, and no one seems to be following up on getting the broken unit repaired and back out to the ship. Its disposition is unknown, but the balance is inoperable without this unit.

Coulometer

The coulometer was used to analyze the few sediment samples obtained on the earlier holes. The software has previously been very "finicky" about how it operated (or if), and the opportunity was taken this leg to develop a replacement interface for the instrument.
The software was completed and tested successfully this leg, and is now in service. The coulometer host was also successfully upgraded to Windows 7 x64. The coulometer manual was updated to reflect the new OS and detail operation from the new instrument interface.

CHNS

The CHNS was used this expedition to analyze the same samples as the coulometer. No issues with the instrument were encountered. There was a desire to perform direct TOC analysis on the CHNS, but we did not have an adequate supply of large silver capsules. These were apparently a special order item that had been used routinely, but never added to the inventory (and are now exhausted). As this has become a semi-standard (on request) analysis technique the required supplies (large silver capsules) have been added to AMS.
The CHNS software will not run under windows 7 x64. Initial attempts to discuss this with the vendor were unsuccessful; follow-up will take place post expedition from shore.

SR Analyzer

The instrument had been removed and placed in storage for EXP 329. It was removed from storage and set back up. The software itself will not run under windows 7 x64, so the instrument has been reverted back to the XP configuration.

Water System

No issues to report.

Dionex

The Dionex was not used this expedition for routine analysis. A new version of Chromeleon was delivered with promised Windows 7 compatibility, but after various support discussions it was determined that Windows 7 64-bit compatibility will have to wait until version 7.1, which is due in early February. The instrument was reverted to Windows XP.

DA

The DA was not used for any routine analysis this expedition. The instrument host was successfully upgraded to windows 7 x64 and the DA software is fully operational.

GC3

The GC3 was not used this expedition for routine analysis. The software will not currently run on windows 7 x64. A new version is approved for windows 7 x32, but no action was taken this expedition.

NGA

The NGA was not used this expedition for routine analysis. The software will not currently run on windows 7 x64. A new version is approved for windows 7 x32, but no action was taken this expedition.

GC-MSD

The GC-MSD was not used this expedition for routine analysis. The software will not currently run on windows 7 x64. A new version is approved for windows 7 x32, but no action was taken this expedition.

Alkalinity

The Alkalinity device was not used this expedition for routine analysis. It had been stored away the previous expedition, and was brought out of storage, setup, and tested to verify proper operation. Additionally the OS was updated to windows 7 and the alkalinity was moved up to Labview 9.0 (our currently supported version) from 8.5. Additional testing validated proper functionality.

GC-PFT

The instrument itself was not used for routine analysis this expedition. As it was a source of significant issues last expedition, effort was expended investigating and trying to solve some of those issues. The autosampler was converted back to headspace mode and proper operation was validated. The method used on the previous expedition was validated with the headspace autosampler. The Restek RTX 802 column (mega-bore) was tested, but proper separation was only achievable at oven temperatures of 50°C or less and was not very reliable. The installed Al2O3 PLOT column gave proper separation but was too narrow in diameter (0.250 mm) for the sample volumes being injected (1-2 mL). A slightly longer (50 m) Al2O3 (KCl deactivation) PLOT column with a 0.530 mm internal diameter was ordered. This column is popular in the literature for halogenated light hydrocarbon separation and should perform well in this application. It is expected that this column, together with proper headspace autosampler usage, will remedy the problems discovered the previous expedition.
The manual is in the process of being developed/updated. Many spares/parts for this instrument were also not in AMS and so have been added this expedition.
The software itself is incapable of running under windows 7 x64 so no upgrade took place. A newer version of the software is validated for windows 7 x32, but no action has been undertaken in regards to this.

ICP

The ICP was used extensively this expedition, with no significant issues to report.
The software was updated to accommodate new methodologies suggested from the previous expedition (4 replicate measurements) as well as to allow for the automatic validation against check standards.
Reproducibility issues were discovered in the LOI process, with the end result being that all LOIs will now use 5 g of material and weigh to a count of 1000 data points. See the X-Ray technician report for more detailed information.
The computer was updated to Windows 7 x64, and the software has been validated as operating properly under it.

Carver Presses

The presses were all leaking or broken in various ways at the beginning of the expedition. This was a known problem and was communicated at crossover. New fittings (many of the existing ones were cracked) are on order and should be installed during the upcoming transit.

DIC/DOC

This instrument was not used this expedition. The software is capable of running under windows 7 x64, but the instrument shares a host computer with the GC-PFT (which isn't), so no upgrade took place.
No supplies or consumables were present in AMS for this instrument so a new list was drawn up and the items added.
No manual existed for this instrument so a new manual was started.

Miscellaneous

General Lab

A new inventory system was put in place on top of the existing one for the chemistry lab (see Appendix A). All items on the checkout sheet were assigned to a location in the chemistry lab based on aisle and drawer number (or cabinet, shelf, etc.). All drawers, cabinets, and shelves were numbered, and the locations associated with the Aisle/Rack/Bin field in AMS. This should assist with locating items, keeping a proper inventory, and managing materials in the lab. In the process, physical counts were performed on all items in the lab, drawers were reorganized where appropriate, and obsolete supplies/documents were returned to shore.

Rock Sampling Technique (Microbiology)

The technique for cracking rocks for microbiology sampling was altered slightly to prefer crushing the rocks in the X-press (with autoclave-sterilized core liners and Delrin disks, and flame-sterilized steel spacers). This cuts down on the noise, flying rock debris, and impact to the epoxy countertop.

XRD Lab

Heather Barnes

Summary:

All equipment operated without any serious problems. A couple of issues need to be addressed (discussed below in more detail): 1. increase future LOI aliquots to 5 g (previously 1 g) to minimize the effects of balance error on the Mettler Toledo system; and 2. find a permanent solution for the LOI ashing furnace so that it reaches temperatures greater than 1000°C.
Expedition analyses:
Total XRD: 45
Total ICP: 80
Equipment:

Equipment Summary

Bruker D4 Diffractometer:

No problems were encountered with the diffractometer. There was no recurrence of Diffrac Commander aborting at the end of a sample scan.
Some Vaseline was added to the 'z-direction' rods on the sample changer to ease friction.

ICP Bead Maker:

Platinum crucibles are in great condition and beads 'pop off' without any sticking. Make sure to check crucible condition at port call. I suggest evaluating the crucibles during tie-up; it may be beneficial to recast the crucibles every couple of years.
The beadmaker's extraction fan has become very noisy again. We took apart the fan to check the bearings. They are in good condition (bearings were replaced in June, 2010). To eliminate some of the noise, the fan was moved into the ceiling in Thin Section.

LOI Furnace:

A maximum temperature of 933°C was obtained and determined sufficient for the expedition LOI's. However, scientist John Mahoney said that we should seriously look into getting the temperature up over 1000 °C for future expeditions. The ET's are scoping out a step up transformer that we could permanently mount. Please check into this during CRISP Exp 334 because it is important that the furnace temperature is able to reach over 1000 °C for SuperFast Exp 335.
The crucible trays for the ashing furnace were 'flaking' in the oven, causing black powder (Fe?) to be deposited on and in the crucibles. One batch of samples had to be discarded due to black powder entering the samples. The trays onboard were sandblasted to remove any buildup of black material; however, they continue to produce a small amount of black powder. I have been in contact with Darrel Manney (Fisher Scientific representative) about ordering new trays (see emails below). Unfortunately, Fisher no longer carries trays for our size of oven. Bill Mills is going to speak with Al about having some trays made.
Tray specs: 5.5" x 9.5" x 3/4" (W x L x H), holds 15 glass crucibles, withstand temperatures > 1000 °C (Fig. 1).

Figure 1: LOI crucible tray for use in Muffle Furnace.

Mettler Toledo Balance system:

Midway through the expedition the balance software would not load, giving a 'fatal error'. This error was due to a 'blank' in the 'tare' field within the programming code. For some reason, unexplainable by the programmers, the balance software code did not provide a number in the 'tare' field. After entering the number 0 into the 'tare' field within the programming code, the balance ran fine.
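The fix amounts to treating an empty tare field as zero rather than failing. A minimal sketch of that defensive default is shown below; the field name and its text form are assumptions, not the actual balance code.

# Sketch of the defensive default; the field name and format are assumptions.
def read_tare(field_text):
    # Return the stored tare value, treating a blank field as 0.0 instead of
    # raising a fatal error when the configuration contains no number.
    text = field_text.strip()
    return float(text) if text else 0.0

print(read_tare(""))         # -> 0.0
print(read_tare(" 1.2345"))  # -> 1.2345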

Haskris:

New filters for Haskris water filters are ordered (PMAG initiated this order).

Sample Preparation:


ICP LOI Issues:

I ignited five duplicates for LOI (the five samples that were contaminated with black powder from the sample trays) and found LOI discrepancies greater than 10%. The scientists onboard are updating their explanatory notes to indicate the LOI error (see the insert from John Mahoney's explanatory notes below).
The >10% error is due to the fact that our balances are only accurate to the third decimal place at best. When using a 1 g aliquot for LOI, the weight difference on ignition is in the hundredths to thousandths of a gram, so a balance error in the hundredths and thousandths creates a substantial error in LOI percent (see spreadsheet below).
For future expeditions I am going to suggest:

  • Increase the count time to 1000 for all measurements. Time should not be an issue; 1000 counts only takes about 3 minutes.
  • Increase the amount of material ignited to 5 g. This will decrease the percent error in weighing and thus decrease the error in LOI values.
  • Below is an Excel spreadsheet showing the decrease in weight variability, and thus the lower variability in LOI, when the sample weight is increased to 5 g.
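In addition to that spreadsheet, the scaling can be illustrated with a short back-of-the-envelope calculation (a Python sketch using hypothetical round numbers, not the expedition data): the same ~1 mg weighing uncertainty is a much larger fraction of the mass lost from a 1 g aliquot than from a 5 g aliquot.

    # Illustrative error-propagation sketch (hypothetical numbers, not expedition data).
    BALANCE_ERROR_G = 0.001  # assumed ~1 mg uncertainty per weighing

    def loi_relative_error(aliquot_g, loi_fraction=0.05):
        """Approximate relative error (%) on LOI from weighing alone.

        Two weighings (before and after ignition) each contribute ~1 mg of error;
        the mass actually lost is aliquot_g * loi_fraction.
        """
        mass_lost_g = aliquot_g * loi_fraction
        worst_case_error_g = 2 * BALANCE_ERROR_G
        return 100.0 * worst_case_error_g / mass_lost_g

    for aliquot in (1.0, 5.0):
        print(f"{aliquot:.0f} g aliquot: ~{loi_relative_error(aliquot):.1f}% weighing error on a 5% LOI")
    # prints ~4.0% for 1 g and ~0.8% for 5 g: the larger aliquot cuts the weighing contribution five-fold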


Updated explanatory notes (supplied from John Mahoney):

"After grinding, an aliquant of the sample powder was weighed on a Mettler Toledo balance and ignited to determine weight loss on ignition (LOI). Samples were ignited at 930°C to 960°C for 4 h. For samples from Sites U1372 to U1374, the amount of sample weighed for the LOI measurement was 1000.0 ± 0.5 mg. Estimated relative uncertainty on LOI values for these samples is about 14%, on the basis of duplicate measurements. For samples from Sites U1375 to U1377, a 5000.0 ± 0.5 mg aliquant was used and the estimated relative uncertainty on LOI values is about 4%."

ICP sample preparation

The new Delrin plugs for the Xpress ICP preparation are rubbing off onto the rock samples. Even using the lowest pressure, I am still finding white residue from the plugs on the sample surface. Could it be that the new Delrin plugs are softer than usual? This needs to be monitored.
New stainless steel Xpress sample holders arrived during port call and are in the upper drawer in the ICP prep lab.
New Fisher Scientific glass sample bottles are being added to inventory for the ICP sample powders (similar to the Qorpak 60ml bottles).

Misc


Emails to/from Darrel Manney regarding crucible trays for the ashing furnace:
Hi Darrel,
Thanks.
The oven is old - not sure how old but I would guess 10-15 yrs?
The oven is onboard the research vessel and we are off the coast of New Zealand at the moment - due into port (Auckland) next week if you want to stop by
Do you have any other solutions?  If not, I will have to look somewhere else.
 
thanks for your help,
Heather
 
>>> "Manney, Darrel" <darrel.manney@thermofisher.com> 2/4/2011 7:58 PM >>>Hi Heather, It looks like that is the only tray that is currently available for the isotemp ovens.  They couldn't look the unit up by the model or serial number for some reason.  Do you know how old your oven is? I can stop by Monday or Tuesday to view. Regards, Darrel A. ManneySales RepresentativeFisher Scientific9999 Veterans Memorial DriveHouston, Texas  77038Phone 979.574.9335Fax 979.703.7562darrel.manney@thermofisher.com From: JR Heather Barnes [mailto:jr_barnes@ship.iodp.tamu.edu] Sent: Wednesday, February 02, 2011 6:38 PMTo: Manney, DarrelCc: Steven Prinz; JR Laboratory Officer; JR Chieh Peng; JR Steve PrinzSubject: RE: IODP/furnace tray information Hello Darrel, thanks for getting back to me.The model number I have for the furnace is 750-58, Serial number 60900021, 0.58 cu ft. The tray you provided the price quote for is too wide.I am looking for trays with dimensions:5.5" x 9.5" x 3/4" (W x L x H) holds 15 glass crucibles.I have attached a photo. Many thanks, Heather >>> "Manney, Darrel" <darrel.manney@thermofisher.com> 2/2/2011 5:02 PM >>>Hi Heather, The model # you gave wasn't able to be pulled up with the manufacturer, so I had to guesstimate that you were using the 1.26cu.ft. oven (fisher part 10-750-126N).  If this is the case then you would want to use part 10 497 10N for the oven. Can you verify the size of your oven? Regards, Darrel A. ManneySales RepresentativeFisher Scientific9999 Veterans Memorial DriveHouston, Texas  77038Phone 979.574.9335Fax 979.703.7562darrel.manney@thermofisher.com From: JR Heather Barnes [mailto:jr_barnes@ship.iodp.tamu.edu] Sent: Friday, January 28, 2011 5:51 PMTo: Manney, DarrelCc: Eric Jackson; JR Eric Jackson; JR Chieh Peng; JR Steve PrinzSubject: IODP/furnace tray information Hello Darrel, We are trying to find a crucible rack/tray that holds 15 crucible (made from SS that can withstand temperatures >1300 degree Celcius). See attached picture. We use them in our Fisher Scientific Forced Draft Muffle Furnace. The furnace model number I have is 750-58. The crucible rack dimensions are:5.5" x 9.5" x 3/4" (W x L x H) holds 15 crucibles crucible hole dimensions 35mmwith handle slot at front of crucible tray On the Fisher website I found a rack with catalogue number 10-497-5 - but it is too big for our furnace.Does Fisher carry any with the dimensions listed above. If so could you send me a price quote. Many thanks, Heather


THIN SECTION

Gus

Summary

This was a busy expedition with 286 thin sections requested.
In an attempt to expedite section preparation a helper was trained to assist with various aspects of the process and was quite helpful.

Special projects

  • LED strip lights were added to the Petrothin to help illuminate the blade and cup wheel.
  • A sink insert was installed to raise the working height inside the sink.


Problems encountered

  • The LP-50 lap wheel was 12-17 microns concave at the start of the expedition, and over time this has been reduced to 5-8 microns. The flatness monitor was not used, as it tended to exacerbate the problem, and the machine was run throughout the trip in static mode. A more aggressive effort to flatten the wheel would have been pursued if there weren't concerns over having sufficient grit to adequately complete the thin section work. A new wheel has been ordered.
  • The Petrothin developed an internal vacuum tube leak where it had chafed on either the belt or pulley. Vacuum is now directly connected from the spindle shaft to the vacuum manifold on the bulkhead.


Miscellaneous

  • The supply of aluminum is almost gone and there is no more 3 micron diamond powder. This was reflected in the inventory in the early part of the trip.
  • The last of the Struers Lube was consumed and the polisher slurry carrier now in use is Ethane Diol (ethylene glycol).
  • Routine cleaning and maintenance performed as needed and at the end of the trip.


FANTAIL

Gus

Summary

Navigation, bathymetry, and magnetometer data were collected while transiting. Data recording was compromised on a couple of occasions; these problems would have been immediately observed had we been standing formal U/W watches.

Special projects

  • None.


Problems encountered

  • None.


Miscellaneous

  • Ball valves for the high pressure air lines have not yet been replaced due to other high priority welding tasks.
  • Routine winch and crane maintenance performed.


Underway Geophysics LAB

Erik Moortgat

Data Summary

Expedition 330 consisted of seven transits (∑ = 2574 nm) and six sites.
Transits:
1. Auckland to U1372 (L1T – 824 nm)
2. U1372 to U1373 (L2T – 146 nm)
3. U1373 to U1374 (L3T – 6 nm in DP mode)
4. U1374 to U1375 (L4T – 321 nm) – incorrect zero pressure on the magnetometer
5. U1375 to U1376 (L5T – 91 nm) – bathymetric data not collected
6. U1376 to U1377 (L6T – 391 nm)
7. U1377 to Auckland (L7T – 795 nm)
Sites:
U1372A(LOUI-1C)
U1373A (LOUI-6A)
U1374A (LOUI-6B)
U1375A/B (LOUI-2B)
U1376A (LOUI-7A)
U1377A/B (LOUI-4B)

Site fixes & PDR depths observed

Event data points were gathered every sixty seconds when underway and every thirty seconds when on-site.
Underway watches were not conducted, but it is my opinion that we should conduct these watches, for safety and data integrity reasons.
I also believe that the U/W Laboratory Technician should assign a 'backup person' who is responsible for the lab; if the U/W Lab Tech is not on shift, this person must either be informed of issues or wake the U/W Lab Tech.

Equipment Performance Summary

WinFrog

WinFrog1 was used entirely for primary navigation acquisition. There were no problems encountered.
WinFrog2 was used for position/heading data collection during the GBM (Göttingen Borehole Magnetometer) downhole tool runs. The Trimble's GPS moonpool offset was removed and data collection frequency was one second.
WinFrog1 was used for position/heading data collection at our last site (U1377) for testing purposes by the LO & scientist Jeff Gee. The Trimble's GPS moonpool offset was removed and data collection frequency was one second.
WinFrogs 1 & 2 were set-up for time synchronization from the GPS.

I/O devices added for LO's Trimble tests:

COM0 NMEAOUT – JRNav (Port 124 – listen)
  $GPGGA – position
  $GPVTG – speed
  $GPZDA – time
  $HEHDT – heading
COM0 NMEAOUT – DPTH/HDG (Port 8083 – listen)
  $GPGGA – position
  $GPDBT – depth
  $GPHDT – heading
  $GPZDA – time
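For anyone post-processing these feeds after the fact, here is a minimal sketch (in Python; an illustrative helper, not part of WinFrog or the shipboard acquisition software) of parsing and checksum-verifying one of the sentences listed above:

    # Minimal NMEA parsing sketch (illustrative only; not part of WinFrog).
    def parse_gpgga(sentence):
        """Return (utc_time, latitude, longitude) from a $GPGGA sentence, verifying the checksum."""
        body, _, checksum = sentence.strip().lstrip("$").partition("*")
        if checksum:
            calc = 0
            for ch in body:
                calc ^= ord(ch)  # NMEA checksum is the XOR of everything between '$' and '*'
            if f"{calc:02X}" != checksum.upper():
                raise ValueError("bad NMEA checksum")
        fields = body.split(",")
        if fields[0] != "GPGGA":
            raise ValueError("not a GPGGA sentence")

        def to_degrees(value, hemisphere):
            degrees = int(float(value) / 100)      # ddmm.mmmm -> dd
            minutes = float(value) - degrees * 100
            decimal = degrees + minutes / 60.0
            return -decimal if hemisphere in ("S", "W") else decimal

        return fields[1], to_degrees(fields[2], fields[3]), to_degrees(fields[4], fields[5])

    # Standard textbook example sentence; prints ('123519', 48.1173, 11.5166...)
    print(parse_gpgga("$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47"))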

Gyro

NMEA GYRO #3 from DP was the gyro used for the duration of the Expedition.

Trimble/Ashtech GPS

The Trimble GPS was the primary GPS used for the duration of the Expedition. No problems were encountered. The Ashtech GPS unit was available as a backup.
The new Trimble unit (165.91.72.82) that arrived in port was used by the Lab Officer for testing and data collection (differential positioning) for the GBM downhole tool runs, per the request of GBM specialist Sebastian Ehmann and Jeff Gee. The antenna was installed (permanently?) on the roof of the lab stack. Another unit was requested by the LO for installation aft.

Bathy 2010

There were no problems encountered with the bathymetric system (raw data collection that is).

Magnetometer

Marine Magnetics sent out a new version (8.00047) of the software to troubleshoot our midnight-GMT lost GPS sync issue. It looks like this issue has been resolved.
The towfish was successfully deployed on all transits, except for L3T (transit in DP mode). Deployment on L4T had an incorrect zero pressure performed.
If there are plans to move the tow cable to the port seismic winch, facilitating a direct deployment from the winch, I would highly recommend acquiring a second deck cable. (40m deck leader cable, P/N M-SS2304 from Marine Magnetics).

EPC recorders

Event printing from the Bathymetric software was successfully tested.

Misc


Items arrived in port:
Trimble GPS antennae/receiver
SeaSpy tail fins (2 each)
Requisition made:
Trimble GPS antennae/receiver
New set (16 sheet) of general bathymetric charts of the oceans (GEBCO)
Items sent back to College Station post-Expedition:
Geometrics power supply/console w/ cables (2 units)
Data backups:
data files for the Expedition were copied to:
\\JR1\Vol1\data1\EXP330\1.5 Ops Navigation
\\JR1\Vol1\data1\EXP330\1.6 Ops Bathymetry - PDR pulse depth recorder
\\JR1\Vol1\data1\EXP330\1.7 Ops Magnetics




Information Technology Report

Grant Banta and Andrew Trefethen 
 

Summary

 
The IT infrastructure performed as intended, with little to no interruption to operations. Some operational issues still persist with GroupWise Web Access: from time to time services stop and require a restart. Apple File Protocol seems to be working much better for file sharing and file access with the OSX systems, and restarting of services was minimal during this expedition. We also implemented a new service (SAMBA) that should work well for both Windows and OSX scientists/users in the future; they will no longer be required to install the Novell client to access data on the servers. A problem with our management software on one of the OES servers, identified on EXP329, is still not resolved; service is planned during the transit. The Cumulus service on the Solaris server is still causing the message log to fill up, requiring a restart of the resource.
Finally, after a scheduled move of Rignet services from Riverside to Napa Valley in California, we started noticing that the phones were almost impossible to use because incoming communication quality was so poor. After working with Texas A&M and Rignet, the problem was finally identified, and communication and the previous level of service were restored. We also lost phone service for about a day; that outage was NOT related to the move. Again, after working with Texas A&M and Rignet, the problem was resolved and services were fully restored.
 
 

Servers (Microsoft)

    • Installed software to monitor the room alerts from our server room.
    • Moved our admin software for managing our Network to another server for security reasons.
    • Updated all servers to McAfee patch level 4.


Servers (Novell - OES)

    • SAMBA services were installed to give oncoming scientists a way to access the servers without the need for the Novell client.
    • Increased the size of our hard drive volumes to give the users more work space.
    • iManager still isn't working correctly on Ararat, an issue identified on EXP329.


Servers (Sun Solaris)

    • Still received a couple of SMELT messages that /var was 91% full. For some reason the messages log fills /var to 91%, but after restarting the service we are able to free up about 60% of the space.

 

EVA4000 Storage

  • No problems with the EVA.


Network:

    • We started having systems hit the failsafe role. After we opened a ticket with Enterasys, they escalated it to their engineers to find a solution. A solution should be put in place prior to the next expedition.
    • Moved NetFlow services from our core network switch to the Sonicwall Firewall to see if we can get more insight into network traffic.


Printers and/or Printing:

    • Users were not able to print in landscape orientation in OSX. This problem started after upgrading all systems to 10.6.5; further research showed the problem was with OSX 10.6.5 itself, and the easiest solution was to upgrade to 10.6.6.


PCs Workstations:

    • Updated all of our Windows images with the latest updates and made new master images.
    • Created a new Win7 32bit image for possible use to replace the XP 32bit systems.
    • Upgraded several systems from XP in the Chemlab.


Apple MAC Workstations:

    • Updated all OSX systems to 10.6.6.


Equipment Repairs:

    • Voice quality issues after Rignet moved equipment to a new location on shore were resolved.
    • The phones went down and had to be configured again with Rignet and Campus.
    • Found cables labeled incorrectly. Corrected and documented.
    • Found the threshold on the satellite systems not configured correctly.
    • After UPS repairs in port we had a power supply that failed and had to be repaired.
    • Replaced VBRICK monitor that failed in the user room.
    • Replaced a few failed VBRICK units.


Special Projects:

    • Migrated MCS website to the wiki.
    • Updated Scrutinizer to 8.5 for support of the Sonicwall Firewall.
    • Increased Volume size for VOL1 on the servers.
    • Enabled SAMBA so that our customers can gain access to our servers without having to install any additional software, keeping their systems in their current configuration.


DEVELOPER'S Report

James Zhao and David Fackler (Wm. Mills, Trevor Cobine, and Chris Bennight)

Overview

This document highlights changes in laboratory data acquisition and collection systems, and changes in the development work environment. It is a synopsis of efforts from many individuals. There will be overlaps here with the routine technical reports provided by laboratory staff. Where there is variance, this document explicitly defers. Notable issues and changes are presented in-line. Issues not addressed and features requested are collated in Addendum C: Issues and Features.

Curation and Core Handling

SampleMaster

Production is at release 2.1.1.6. Pending release 2.2.0.1 implements responsiveness improvements—presently undergoing evaluation and test. Curatorial staff noted numerous issues—see addenda. No "show-stoppers", but enough unexpected behavior to warrant repeated validation checks on content uploaded and printed, with the attendant implicit reduction of faith in the tool.

Labeling services

The label print web-services were updated to accommodate feeding label content to the laser engraver. Integration of the laser engraver into core-flow is stalled due to technical impediments (vendor proprietary communications protocols) to controlling the engraver from USIO software.

Geophysics

Whole-round logger (WRMSL)

Installed these updates for further vetting of MSL software on this track: National Instruments device driver distribution Nov 2010, GalilTools 1.4.4-102, LabVIEW 2009 (9.0.1 32-bit), LabVIEW 2009 run-time environment. The updated device drivers are retained. All others rolled back.
The MSL (multi-sensor logger implementation framework) code-base is revised and tested for use with shipboard WRMSL hardware. Support for multiple deployment targets is enabled via the specification of "conditional disable" variables in LabVIEW projects. One variable is defined for each kind of build variance, e.g.:

  • run-time vs. development environment
  • different hardware used to perform similar functions (Galil vs National Instruments for acquiring digital I/O, or for choosing between several data acquisition boards that may be used to acquire the same kind of data)
  • operating system variation (PC/Linux/Mac )

The MSL codes are NOT presently deployed on WRMSL. This decision will be discussed and revisited in port.

Special task logger (STMSL)

This system retains a Windows XP image re-loaded from the May 3, 2010 Victoria tie-up period. GalilTools 1.4.4-102 was installed and WSDK removed. LabVIEW 2009 run-time environment only is installed. If the development system is found on this box, it will be removed.
The MSL codes are buildable and deployable from development laptops and BUILD environments for multiple hardware configurations.
The LabVIEW build process is coupled to the layout of the source tree. To enable repeatability of the build and deployment process across multiple developers for production-ready products, recommend the USIO adopt a common set of standards for layout of source, libraries, configuration, documentation, build products and installer products.

Natural gamma (NGR)

The external data reduction step has been integrated into the routine operation of the program. This required a re-work of the internal data structure and the implementation of a functional configuration user interface. Other code modification included:

  • Replaced the LabVIEW configuration (INI) VI with the INI handler developed for IMS (SHMSL code).
  • The NGR configuration panel is now functional. This panel must be used to identify the Calibration and Background Files for the NGR MASTER data processing code. The Configuration utility must also be used to set other NGR acquisition parameters such as acquisition time. Note: this panel is basically a clone of the original configuration VI and still contains data controls that appear to have no use.
  • The "Sample Information" vi UI was dressed up. The NaN values were removed (chokes NGR MASTER). The state of the Cap Lock key is no longer an issue when scanning in labels. The "run" button is disabled until a non-zero value is entered for core length.
  • The NGR front panel vi has been dressed up. A tab control now handles the Summary and Detail views of data acquisition; as well as, access to the motion control utilities. "Tab Events" manages the front panel controls and prevents access of the Motion utilities during data acquisition.
  • The NGR front panel menu bar works now and provides user access to the configuration panel and program exit. Redundant buttons on the front panel have been removed along with unnecessary Event and Consumer cases in the code.
  • The cyclic condition caused by a DMC error has been fixed. Now when the user chooses to exit, they do.
  • The NGR code base is now organized under the folder C:\NGR_TRACK. All of the code is now within a LabView project along with project scripts for building a Source Distribution, Application Executable, and Installer. The current application for the user is an installed executable with short cuts in the Start Menu and Desktop.
  • The entire LabVIEW environment and driver base was uninstalled and re-installed: LabVIEW 2009 (9.0.1 32-bit)—no National Instruments drivers required. National Instruments components required to drive the Tektronix oscilloscope were reinstalled.
  • Note! NGR MASTER uses an unsigned DLL, which causes a significant delay when the application opens and again when NGR MASTER runs the first time. The delay is long enough that the application may appear to be hung; just wait. Chris is looking into the issue.


WRMSL, STMSL, NGR.

These instrument hosts may be updated to Windows 7 32-bit. The impediment to installing and configuring the ORTEC multi-channel buffer drivers for this platform is removed beginning with the 6.11.01 driver release. Install Scintivision or Maestro first using the Windows 7 Run as… > Administrator option, reboot, then run the 6.11.01 driver release installer, again using Run as… > Administrator.

MATLAB scripting.

Physical properties scientists managed the revision and processing of MATLAB scripts inherited from expedition 324 to filter and reduce physical properties data content for specific presentation in expedition barrel sheets. The data reduction and presentation methodology are published as part of the expedition proceedings. Codes available in these locations:

  • Expedition 330 proceedings: \\jr1\vol1\uservol\Lab Groups\I_Physical Properties\Matlab\Code
  • Digital master library on ship: \\jr1\vol1\tas\dml\software\labsystems\physprops\matlab filtering-330-fulton-kalnins-ebuna
  • Digital master library on shore: \\odpads\tas\dml\software\labsystems\physprops\matlab filtering-330-fulton-kalnins-ebuna
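The expedition scripts themselves live at the paths above. Purely as an illustration of the general kind of filtering step involved (and not the actual 324/330 MATLAB code), a short despiking sketch in Python:

    # Illustrative despiking sketch; not the expedition MATLAB scripts.
    from statistics import median

    def despike(values, window=5, threshold=3.0):
        """Replace points that are far from the local median with None."""
        half = window // 2
        cleaned = []
        for i, v in enumerate(values):
            neighborhood = values[max(0, i - half): i + half + 1]
            local = median(neighborhood)
            spread = median(abs(x - local) for x in neighborhood) or 1e-9
            cleaned.append(None if abs(v - local) / spread > threshold else v)
        return cleaned

    # The single spike is dropped; the rest of the hypothetical series passes through.
    print(despike([1.0, 1.1, 1.05, 9.0, 1.02, 0.98, 1.01]))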

Sonic velocity (GANTRY)

  • A revised version of the interface is implemented. Improves process workflow for measuring cubes. Includes significant improvements in the codes applied for signal analysis and data reduction. Incorporates multiple sonic velocity signal time-break methods.


DESCRIPTION TRACKS

Line-scan imager (SHIL).

The imager control software is unchanged. Additional duty was added to the logger through the provision of a script and protocol to print each acquired image on 11x17 pages. These printouts are the basis for handwritten core description notes, which are subsequently entered into DESCLogik.
The camera calibration was lost at the end of Site U1372. The system was rewired so the camera is back on its own dedicated power supply, independent of the computer.

Reflectance and susceptibility (SHMSL)

Revised the laser profile output. The revised format is required for the MATLAB scripts applied by the physical properties scientists.

Discrete sample systems

Thermal conductivity (TCON)

No change.

Moisture and density (MAD)

No change. Continues to require recurring support effort to repair PYC and MAD_MASS data cataloging errors.

MadMax

The LabVIEW portion of the code was revised at the beginning of the expedition: a digital I/O board reset that stopped the LabVIEW-based DLL from functioning at program initialization was removed. No further effort was expended on the total package.

Paleomagnetics

Point susceptibility

Bartington MS2C point susceptibility used to acquire additional susceptibility observations on discrete cubes. Readings acquired manually and recorded in spreadsheets as part of the expedition proceedings.

Agico J6A

Spinner magnetometer data acquired using vendor software. Methods and codes supplied by Scripps (Gee) for data transformation and post-processing.

Kappa Bridge KLY-4S

Magnetic susceptibility data acquired using Scripps codes (Gee): MacAMS. Methods and codes provided by Scripps also for data transformation and post-processing.

GPS sighting.

A LabVIEW application was put together to simultaneously log data from WinFrog and the new Trimble GPS antenna mounted over the LO's office. This provided a baseline for a GPS-corrected heading vs. the ship's gyro, and that baseline was used to set the GBM's (Göttinger Bohrlochmagnetometer) heading reference.
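As an illustration of the comparison involved (a Python sketch with hypothetical readings, not the LabVIEW application itself), the offset between the two heading sources can be summarized with a circular mean of the differences:

    import math

    def mean_heading_offset(gps_headings, gyro_headings):
        """Circular mean of (GPS - gyro) heading differences, in degrees, mapped to [-180, 180)."""
        s = c = 0.0
        for gps, gyro in zip(gps_headings, gyro_headings):
            d = math.radians(gps - gyro)
            s += math.sin(d)
            c += math.cos(d)
        return (math.degrees(math.atan2(s, c)) + 180.0) % 360.0 - 180.0

    # Hypothetical simultaneous readings: GPS-derived headings vs. gyro headings.
    print(mean_heading_offset([359.8, 0.2, 0.1], [358.9, 359.4, 359.2]))  # ~0.87 degrees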

Superconducting magnetometer (SRM section)

USIO codes were modified in situ to increase the speed of load/unload motion and to parallelize the SQUID data acquisition. Consider the code on the box a hack rather than a release. There is a significant defect log and list of feature requests.

Superconducting magnetometer (SRM discrete)


No change. Not used, due to time constraints from running archive halves and concerns over data quality based on runs of the Scripps standards.
This is a challenging and [rightfully] demanding laboratory to support. Of all the laboratories, this one is the most quantitatively intensive: the sheer bulk of observations made requires automated reduction and visualization support. This participant group came prepared to tackle the analytical tasks with their own software, scripts, toolkits, and skillsets, including adopting and adapting to systems new to this lab or used frequently only on hard-rock expeditions (Agico J6A, KLY-4S). The pressure is no less to provide improved integration and support for their needs.
Codes for data acquisition, reduction, and presentation are available in the locations below. Turning these tools into routine components of our data acquisition will require dedicated technical effort and collaboration with the Scripps Institution of Oceanography Paleomagnetics Lab (Gee, J.; Tauxe, L., et al.).


  • Expedition 330 proceedings: \\jr1\vol1\ Uservol\Lab Groups\J_Paleomagnetism\Programs
  • Digital master library on ship: \\jr1\vol1\tas\dml\software\ labsystems\pmag apps from scripps-330-gee
  • Digital master library on shore: \\odpads\tas\dml\software\ labsystems\pmag apps from scripps-330-gee


Underway

Navigation, bathymetry feed

NMEA data outputs were at some point re-configured in WinFrog. This broke the feed into the OPS database (the program only looked for three records but was receiving five). The codes were changed to accommodate more variability in the output, but were not further tested or deployed; the proper configuration was restored instead.

Magnetometer

Processed data were made available as part of the expedition proceedings content. Archival content was placed on Data1. There is a 12-hour hole in the archival data at EOX due to Fackler debugging the above without proper communication with, and support of, the underway technician. Know your responsibilities and limits. Collaborate. Communicate.

Geochemistry

MCS and chemistry technician co-ordinated to upgrade most instrument host systems in Chemistry to Windows 7 x64. Defer to the chemistry report.

Alkalinity

Brought forward to LabVIEW 9.0.1. Compiled version pending.

Coulometer

LabVIEW retired on this instrument host. Coulometry now controlled by a dotNET application.

DESCRIPTION

DESCLogik

Production is operating at release candidate 3.22.9.8. Various fixes: speed improvement on download (new web service); fixes to right-click cancel behavior (immediate cell cancel); fixes to delete cell behavior (complete cancellation on upload); the download button now implicitly clears the current template tab; value lists defined with a global template are now sticky for non-owning users of the template. The next candidate, the 3.23.x series, is queued but not released.
This project has strong internal advocacy and is well-managed. The milestone/bug/feature list is held by the Geology working group (Blum, P., et al.) and is current with respect to shipboard activity. It is improving steadily. Detail-oriented technical staff and participants helped identify and reproduce long-standing bugs made difficult to repeat by the complexity of the process, and by the complexity of the terminology used to describe the process, model, and tool.
The system is a functional work-horse, but not loved. Discussions of the product touch a nerve with every participant involved in the descriptive process. It is bound to collaborative process and convention, not to individual preference and control. Division of labor by column for content within the same template tabs leads to collaborative complexities we don't encounter with any other data set.
The power of the descriptive concepts is readily apparent, but the end-to-end implementation falls short of the vision. The first descriptions are still written with paper and pencil and then entered in tabular form. Suggest an incremental form of that process be prototyped and examined using slate/stylus form-factor computing devices. This is sufficiently important to the work we do to specify it carefully and budget for outsourcing to a professional software house.
Lagniappe: ownership and custodianship of the data during the expedition are important issues to lay out and discuss up front. Strongly recommend retiring the use of shared accounts; traceability for problem-solving and troubleshooting is more important. We got bitten by data management issues one more time at the end of the expedition. The details are available only by personal anecdote, but the episode highlights data custodianship issues, collaborative complexity, and the need to communicate.

Statistics

Interpret at your own risk: a scoreboard/dashboard/breakdown of descriptive content by template, tab, hole, and individual. These data are a useful post-cruise reference for what data are available on which tabs from DESCLogik. See Addendum A: DESCLogik Data Dashboard.

Barrel sheet production (VCD)

Defer to the publication specialist.

  • The VCD process is complex, error-prone, and above all demanding. What can be done to:
  • streamline the tools
  • distribute the data-validation efforts
  • streamline the review, debug, and ["slave to the database"] edit cycles
  • improve the round-trip time by two orders of magnitude
  • Detailed efforts of the publications specialist helped identify, reproduce, and resolve long-standing bugs with Lims2Excel and DESCLogik.



Closeup, Microphoto, Thinsection Capture

Single tool with three modes of operation. At production release 1.0.7.8. End-users are actually happy with this system. The only feature to add is the ability to cancel an image for those times you forget to change an objective or filter description before you upload. Otherwise it just works—minimal support effort required.
Closeup capture station is in need of a more appropriate printer solution than the Zebras. A smaller, lighter duty desktop printer is desirable. Must be able to support both the large and small form-factor labels applied to closeup images. A networkable printer supporting an ftp or similar protocol is desirable for ease-of-integration with current label print service architecture.

Operations

Core winch counter video overlay

No change.
The Xbob software frequently stops polling the core-winch counter (Veeder-Root). Outage is evidenced by Xbob's front-panel elapsed-time counter continually increasing without acquiring new data. RigWatch stops receiving data at the same time. Restarting Xbob is often sufficient. Presently no serial port splitter is in place. Further analysis required.
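Because the failure signature is simply "elapsed time keeps increasing while no new data arrive," a small watchdog along the following lines could flag the stall automatically (an illustrative Python sketch, not part of Xbob or RigWatch; the data source shown is hypothetical):

    # Illustrative stalled-feed watchdog sketch; not part of Xbob or RigWatch.
    import time

    STALL_SECONDS = 60  # assumed: how long without new data before calling it a stall

    def watch(read_sample):
        """Poll read_sample() and warn when no new value has arrived for STALL_SECONDS."""
        last_value = None
        last_change = time.monotonic()
        while True:
            value = read_sample()  # hypothetical callable returning the latest counter reading
            now = time.monotonic()
            if value != last_value:
                last_value, last_change = value, now
            elif now - last_change > STALL_SECONDS:
                print("WARNING: core-winch feed appears stalled; consider restarting Xbob")
                last_change = now  # avoid repeating the warning on every poll
            time.sleep(5)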

Reporting Systems

Revised the Science Applications web-index page to launch applications via the load-balancer rather than via the "webserv" alias.

Web-tabular reports

No change. See addenda.

Lims to Excel

Fixes to prevent reporting of canceled or rejected samples/tests/results. Ignores empty DESCLogik templates. Fixes handling of embedded whitespace in summary and unit descriptions. Revised handling provided for replicate or overlapping presentation of descriptive content.

Thin section report

Revisions to accommodate expedition-specific changes in thin section reporting. Fixes to prevent reporting of canceled or rejected samples/tests/results. Speed improvements. Revised to use depths from the revised sample depth model. Macro provided to format output more closely to paper template—primarily as a visual aid to the data validation process.

LimsPeak

Added support for display of various new image products: microphoto, thinsection, and closeup. Added right-click support on images to obtain descriptive content and metadata. Added a prototype barrel-sheet display toward reducing time between data entry and validation of VCD output. Fixed handling of cancelled images so that they cease to display.

Laboratory System Services

MegaUploadaTron

At production release 2.0.6.6. Additional revisions pending formal release: Revised to support 100 character instrument names. Revised to support multiple file extensions for the same analysis. Revised XRD to enable upload of comments via files named like Text_ID.comment.
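As an illustration of that naming convention (a Python sketch with hypothetical file names, not the uploader code), pairing a data file with its optional comment file could look like:

    # Illustrative pairing of data files with Text_ID.comment files (hypothetical names).
    from pathlib import Path

    def find_comment(data_file):
        """Return the matching .comment file for a data file, if one exists."""
        text_id = Path(data_file).stem                      # e.g. "SHLF123456" from "SHLF123456.raw"
        candidate = Path(data_file).with_name(f"{text_id}.comment")
        return candidate if candidate.exists() else None

    print(find_comment("upload_queue/SHLF123456.raw"))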

LimsOnLine

At production release 0.3.4. Notable additions: updated data editing support for all current analysis as of first weeks of the expedition; errors occurring during logging of standards (attempting to compute depths for standards) fixed in underlying backend services.

Database

Model Changes

  • The depth data model went through two revisions. The newer series of tables (x_scale, x_hole_to_scale, x_sample_depth) is retained in production, though still under evaluation and test. The older series of "x_depth…" tables was removed from the production database after verifying that they were not referenced.
  • Revised the DESCINFO schema on ship and shore to ensure sequence numbering for new content is uniquely identified across both databases. Key generation triggers were revised to create new database keys only when incoming records present a null or zero ID field. This enables transfer of content between the two environments without key collisions (see the sketch after this list).
  • Updated model applied in support of affine and splice files recorded as outputs of the stratigraphic correlation process. Progress in improving/streamlining the tools and interfaces used to manage that process. Progress in integrating that depth computation variant into the revised depth management and computation model.
  • INSTRUMENT field extended to accommodate instrument names up to 100 characters.
  • Function [FUC_GETENTRY()] revised to ignore canceled and rejected values. Revised for public visibility and execution. Helper function for retrieving offsets recorded with analytical data.
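A minimal sketch of that key-assignment rule (Python pseudologic with hypothetical names; the production logic lives in the database triggers themselves):

    # Illustrative key-assignment rule (hypothetical names); production logic is in DB triggers.
    import itertools

    _sequence = itertools.count(1000)  # stand-in for the database sequence

    def assign_key(record):
        """Generate a new key only when the incoming record has a null or zero ID."""
        if not record.get("id"):       # None, missing, or 0 -> assign a new key
            record["id"] = next(_sequence)
        # non-zero IDs pass through unchanged, so ship/shore content can be
        # transferred without key collisions
        return record

    print(assign_key({"id": 0, "name": "new unit"}))
    print(assign_key({"id": 42, "name": "transferred unit"}))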


SAMPLE & DATA STATISTICS

Samples. Counts of sample records.

  • 34794 total samples (1)
  • 24253 exp 330 samples (2)
  • 626 standard reference materials (3)
  • 420 exp 330 samples canceled (4)
  • 4151 samples fulfilling requests (5)
  • 20102 routine shipboard samples (6)
  • 8 hole (7)
  • 161 core
  • 722 section
  • 1444 section half
  • 16323 pieces logger
  • 286 thin section
  • 92 smear slide


Method reference. (1) select count(*) from sample; (2) select count(*) from sample where x_expedition='330' and standard='F' and instr('ICPAR', status) > 0; (3) select count(*) from sample where x_expedition='QAQC' and standard='T' and instr('ICPAR', status) > 0 and sample_name not like 'CONT%'; (4) select count(*) from sample where x_expedition='330' and standard='F' and instr('X', status) > 0; (5) select count(*) from sample where x_expedition='330' and (x_req_code is not null or x_requestor is not null) and instr('ICPA', status) > 0; (6) select count(*) from sample where x_expedition='330' and (x_req_code is null and x_requestor is null) and instr('ICPA', status) > 0; (7) select sample_type, count(*) from sample where x_expedition='330' and instr('ICPA', status) > 0 and sample_type in ('HOLE','CORE','SECT','SHLF','PC','TS','SS') group by sample_type;
Tests. Counts of test records.

  • 264703 total tests (1)
  • 170976 exp 330 tests conducted (2)
  • 36990 exp 330 tests canceled (3)
  • 2333 qaqc during 330 (Dec 11 – Feb 9) (4)

Method reference. (1) select count(*) from test; (2) select count(*) from test where sample_number in (select sample_number from sample where x_expedition='330' and instr('ICPAR', status) > 0) and instr('ICPAR', status) > 0; (3) select count(*) from test where sample_number in (select sample_number from sample where x_expedition='330' and instr('ICPAR', status) > 0) and instr('X', status) > 0; (4) select count(*) from test where sample_number in (select sample_number from sample where standard='T' and instr('ICPAR', status) > 0) and instr('ICPAR', status) > 0 and date_received > to_date('2010-12-11 00:00:00', 'yyyy-mm-dd hh24:mi:ss') and date_received < to_date('2011-02-09 23:59:59', 'yyyy-mm-dd hh24:mi:ss');
Results. Counts of result records.

  • 13835354 total results (1)
  • 11325427 exp 330 results (2)
  • 456137 exp 330 results canceled (3)
  • 193513 qaqc during 330 (Dec 11 – Feb 9) (4)


Method reference. (1) select count(*) from result; (2) select count(*) from result where sample_number in (select sample_number from sample where x_expedition='330' and instr('ICPAR', status) > 0) and instr('EMAR', status) > 0; (3) select count(*) from result where sample_number in (select sample_number from sample where x_expedition='330' and instr('ICPAR', status) > 0) and instr('X', status) > 0; (4) select count(*) from result where sample_number in (select sample_number from sample where standard='T' and instr('ICPAR', status) > 0) and instr('EMAR', status) > 0 and entered_on > to_date('2010-12-11 00:00:00', 'yyyy-mm-dd hh24:mi:ss') and entered_on < to_date('2011-02-09 23:59:59', 'yyyy-mm-dd hh24:mi:ss');

Data Acquired

See Addendum B: Data Acquired.

Developer Resources

DesKtops


Monitors. Removed one 30" monitor. Dell Latitude 630 video hardware won't drive them. Swapped MCS for two smaller screens.
Desktop box. The development desktop remains a fully functional development box. Applied a series of incremental software updates [noted in labhelp email archive "CHANGED: build and development desktop box updated"]. Desktop xw4400 now also drives the large form-factor monitor at full resolution. The video card carries a single-link and dual-link DVI video connector. Only the dual-link DVI connector is capable of driving the monitor at full resolution.
Macintosh. Installed seismic analysis packages here for Dr. Ebuna: SIOSEIS, Seismic Unix. MCS applied latest updates and patches from Apple. Downloaded and applied current Xcode development environment. Searched for and installed matching release level of the GNU gfortran compiler.


Servers


Load balancing. Load balancing is implemented for these additional services: resteasy-desclogik-services, resteasy-printer, resteasy-monitor, resteasy-drillreport, resteasy-MAD, resteasy-reqmgmt [PENDING on transit in, attempt two].
BUILD box. Various incremental software updates—same as noted above for the desktop box. Most notably: retired Visual Studio 2008, updated Java to 1.6.0_23, updated VisualSVN to version 2.1.5, updated Eclipse to version 3.6.1 [HELIOS], updated SQLdeveloper to 3.0. VisualSVN updated simultaneously on the shore BUILD box.
BUILD box is underpowered, particularly for conducting application builds under Visual Studio 2010, and especially when multiple developers access the system concurrently. Replace it. Recommend this as an entry-point to gain experience with solid-state drives on the server-side. Outstanding requests regarding this server: (1) obtain "new" hardware—at parity with existing hardware or newer; (2) must support a 64-bit Windows operating system; (3) must support 3-5 concurrent development users without "bogging down"; (4) dedicated for development use; (5) purchase a COMVault agent for backups of this system.
Digital master library. This local cache of instrument host and development system drivers and control software now has a permanent home:
JR: \\jr1\vol1\tas\dml
HQ: \\odpads\tas\dml
Test servers. No change. Rather than expend further effort on this path, returned the available Solaris UltraSPARC system to the MCS. Pursuing alternatives.
Testing practice. Production systems suffice for most testing. Apply "good" test practice by considering the context and kind of tests being performed. Database read/write changes may be tested in individual schema accounts without affecting production LIMS content. A small body of test data is always available in the production server intended for more carefully constrained read/write situations. Web-service changes may always be vetted against a production server by applying the update to a less-used or un-used server node (e.g. web). More typically a node may be removed from load balancing for the duration of a specific test.
Krakatoa. The Development account is now only used for backup of the BUILD box. The Ops account and console are used to manage the RigWatch master console, the XBob video overlay, and the navigation/bathymetry data feed. Developers and MCS need to negotiate at the beginning of the expedition who will manage each component, purely to avoid stepping on each other.


Addendum B: Data Acquired

Breakdowns of samples analyzed, tests conducted, results collected organized by analytical tests.

Analysis     Samples(3)  Tests(2)  Results(1)
BHA          8           8         8
BIT          8           10        60
CALIPER      1           1         2
CHNS         14          23        276
CLOSEUP      81          216       3837
COREPHOTO    147         147       294
COUL         14          16        80
DEPLETED     244         247       1205
DESC         1202        69982     608906
DRILLING     161         161       3299
GRA          715         819       349055
ICPAES       80          80        39005
LATLONG      8           8         16
LSIMG        715         742       12560
MAD          405         405       7695
MAD_MASS     405         411       2009
MICROIMG     229         985       13598
MS           715         818       246222
MSPOINT      719         761       325539
NGR          687         703       70841
OBSLENGTH    24254       52107     208428
ORIGDEPTH    8           16        36
PC_ORIENT    5441        5507      5507
PROFILE      720         766       4596
PWAVE_C      406         1212      10900
PWAVE_L      19          20        1131
PYC          408         408       3268
RSC          714         743       713906
SCALINPUTS   24254       24254     48669
SPINNER      134         1882      22584
SRM          692         5451      8611112
TCON         48          54        3273
TSIMAGE      289         593       7123
XRD          45          67        427

Method. (1) select analysis, count(*) from result where sample_number in (select sample_number from sample where x_expedition='330' and instr('ICPAR', status) > 0) and instr('EMAR', status) > 0 group by analysis order by analysis; (2) select analysis, count(*) from test where sample_number in (select sample_number from sample where x_expedition='330' and instr('ICPAR', status) > 0) and instr('ICPAR', status) > 0 group by analysis order by analysis; (3) select analysis, count(distinct sample_number) from test where sample_number in (select sample_number from sample where x_expedition='330' and instr('ICPAR', status) > 0) and instr('ICPAR', status) > 0 group by analysis order by analysis;


Addendum C: Issues and Features

Items in green have been addressed. Others are outstanding. To be forwarded to the appropriate Lab Working Groups.

Section Half Imager

BUG An artifact of the additional print image workflow. Have had to re-scan cores several times. Sometimes the cropping dialog will not display. Conjecture: printing the image from IrfanView before the crop dialog is displayed leaves insufficient memory to display the cropped image and therefore quits the dialog? Or the previous run closed the save dialog but it doesn't become apparent until the next run--perhaps the save button value is not being properly reset before invoking? Needs to be more robust.
BUG If ImageSave dialog is interrupted on previous run, will automatically click through on the next invocation of that window.

SRM Section

BUG Tray Background.vi: master array wired in different order for Get Background Moments from Squid.vi than with Get Moments from Squid.vi.
FEATURE SRM Recover data from backup: recovers data to the sample last loaded in memory! Did that six times this morning, resulting in good data attached to the wrong identifiers. Just cancelled the lot. It was distinguishable in this case because the core was short: data from 0-74 cm was from the "recovered" set.
BUG PMAG hung again at end of travel. No cause identified. Shut-down program and restart (4th time). Additionally noted: limit switch trip-foil has been sheared off.
BUG/FEAT Data Upload Service: SRM moment intensities and MS raw_counts are inappropriately rounded and reformatted on upload. The formatted entry should reflect what's uploaded. Need some flag to indicate this. The back-end service does no formatting on components identified as text.

MegaUploadaTron

BUG Uploader or web-service is transforming scientific notation inputs from SRM into non-scientific-notation formatted entries. SRM requires data content to stay in scientific notation (for units AM2, AM_PER_M). Fixed in back-end log samples services. Also affects handling of not-a-number values, and data from DESCLogik formatted as-entered (dip angle, dip azimuth).
BUG When file collisions occur on move, XRD is overwriting files. Preferable to apply an autonumbering extension.
BUG Uploader still behaving badly on SRM instrument host. Crashes frequently, leaves zero length files in queue after move attempts. Ameliorated significantly by disabling virus scanning of the data directory tree. Still occurs though.
BUG There's a launch loophole that allows multiple copies of MUT to run. They contend with each other for file access. When a running copy is detected, the facility to cause the UI to display should be invoked (un-minimize from tray) instead of starting a new instance. Get rid of the TEST message.

LimsPeak

FEATURE Change the message displayed when image tiles are not available. Remind the user to check whether LSIMG images have been scanned and uploaded yet.
BUG Scroll mouse button Zoom function for the Section Photo column does not work on Macintoshes (10.6)--neither in Safari nor Firefox.
FEATURE Enable the Samples column sample selection box to support section specifications as well as cores: e.g. typing "5R1" or "5r-1" or "5r 1" scales and zooms in on that specific section (see the parsing sketch after this list).
FEATURE Add right-click display to the Thin Section photo column which pulls up all the DESC content for that thin section. The explicit request was to make this an access point for the thin section report content.
FEATURE Section photo column. Add check-buttons to enable/disable additional features and functionality: (1) show/hide sample overlay (where samples children of working section halves have been taken); (2) check-box to do/don't preserve image aspect ratio (allow stretching across column and leave it that way); (3) show clickable "links" to other content.
FEATURE Samples column. Add navigation buttons: up, down, next/previous core, next/previous section, top, bottom.
FEATURE Add templates specifically for track system use: NGR, PWL, MS, GRA--don't bother with images—as these are not available yet when technicians are checking this content in the core flow process.
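A minimal sketch (in Python; a hypothetical helper, not LimsPeak code) of normalizing the core/section strings mentioned in the feature request above:

    # Illustrative parsing of "5R1" / "5r-1" / "5r 1" style core/section specs (hypothetical helper).
    import re

    SPEC = re.compile(r"^\s*(\d+)\s*([A-Za-z])\s*[- ]?\s*(\d+)?\s*$")

    def parse_spec(text):
        """Return (core_number, core_type, section_number or None) from a user-typed spec."""
        match = SPEC.match(text)
        if not match:
            raise ValueError(f"unrecognized spec: {text!r}")
        core, core_type, section = match.groups()
        return int(core), core_type.upper(), int(section) if section else None

    for spec in ("5R1", "5r-1", "5r 1", "17R"):
        print(spec, "->", parse_spec(spec))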

LimsOnLine

FEATURE Often need to see timestamps to determine older vs. newer reading. Typical use-case cancels older observations.
FEATURE Data > Physical Properties lacks the analyses PWAVE_C and PWAVE_B. Would like to be able to reject readings here too. Other analyses were added in the 0.3.4 release.
BUG After cancelling data, unable to select and cancel data against another sample. Must re-run the whole application. Occurs on the development laptop; not seen on other workstations.
BUG Test Status shows result status codes. Something not quite right for that display. Test statuses are in "UICPARX". Result statuses are in "NEMARX".
BUG: Moves Sample in internal representation, even if actual move fails. Gives error message. Fixed in backend web-services. Back-end was inappropriately trying to compute depths for standard reference materials.

DESCLogik


All of these have previously been reported back to Blum, Gorgas.
FEATURE Revise cataloging of DESC observations so that columns are uniquely identified by both value entry and value entry qualifiers.
BUG DESCLogik 3.22.9.6 Regression: Export to Excel was made the default. Fixed the filter number in the IF-THEN-ELSE to match that re-ordering--it was exporting just CSV named as XLS.
BUG DESCLogik 3.22.9.6 There is no "delete" function. What should it do? How should it behave? Suggestion Right-click in cell gives you a Cancel this cell option. Revised to work properly in version 3.22.9.8.
BUG DESCLogik Select row by clicking on row-header. Ctrl-C to copy. Click in Sample1 field of empty row. Paste. All data is offset by one column.
BUG DESCLogik feedback bug. Edit a new row. Upload it. Tab into a cell, type a new value. Tab out. Tab back, change value back. Registers as orange [edited]. But orange doesn't clear on next upload. Color should change to reflect status. Fixed in release 3.22.9.8.
BUG DESCLogik: several sets of data with same row groups, but slightly different sample interval. Not cancelled: DESC doesn't display the data, Lims2Excel does. How did the data get in that way, fix display issues. Artifact of malfunction of delete/cancel buttons. Users were also doing things like copying a row, pasting it, immediately canceling the entire source row, then re-uploading the new content. Error prone.
BUG DESCLogik Cancelling a description results in a crash 3.22.9.8. Right-click cancelling gives errors. Have resorted to editing rows again by copy, paste, cancel previous replicate. Ackkk! Not reproduced. Encountered when summary descriptions had been erroneously pasted into the unit descriptions on the 330_...template.
BUG DESCLogik 3.22.9.8 Masks duplicate uncancelled entries. DESC only displays the latest of two entries. If the user cancels the visible entry (via delete upload cycle) the other value becomes visible. Masks data replication issues that must be cleaned up to obtain clean barrel-sheets.
BUG DESCLogik. Valuelist editor lets you enter data in cells beyond Level 9. Then generates uncaught exception when saving.
BUG DescLogik 3.22.9.4 Rows being reported on upload have no match to the actual number of rows in the spreadsheet. In spite of having cleared the sheet and then cutting and pasting in new values. Revised message and changed counters in 3.22.9.8 release.
BUG Opening of tabular sheet still takes 45-90 seconds. Too long without providing some kind of progress bar. Progress bar has been added with 3.22.9.8. Leave open for consideration of additional optimizations.
FEATURE [Nichols 330] Request feedback on-screen that upload is complete. Suggestions: asterisk next to name in window. Status line that indicates this. Asterisks that clear per row when (changed) data is uploaded. Or even by clearing color highlights to indicate upload is complete.
FEATURE Regression. Really want the ability to Export all tabs to Excel--that did work well: fold it in as a function of that button. Actually this feature didn't change.
BUG Will not upload new data entered in cell. Occurred for Alex twice within 330_Thinsection_U1372. Magmatic Texture tab, modified texture column, uploaded. Nothing actually uploaded. General Lithology tab, Lithology modifier column one cell was changed, uploaded. Download to verify, didn't keep value (or never uploaded). Repaired in 3.22.9.8. When cutting and pasting content between rows and cells, cached information about that cell was not properly being propagated/updated.
BUG Alex testing 3.22.9.4 version. 330_ThinSection_U1372. Added row to Magmatic texture. Uploaded. Oops. Some values were wrong. Cut and paste down a row. Cancel previous row. Make edits in new row. Upload. Download to verify--now have two rows: original and modified content. Believed resolved, but needs more validation/testing due to unusual workflow.
BUG/FEATURE Range selector for data download is not inclusive of the end-point. E.g. downloading section descriptions by range 330-U1372A-17R-1 0, 330-U1372A-28R-1-A 131 fails to pick up section 28R-1.
FEATURE Give message whether upload fails or completes.
BUG DESCLogik--the big tooltip buttons take focus away from the button being pressed, so the first press fails. Get rid of tooltips, or change their properties so that they don't take focus away from the button press. Revised behavior of the tooltips with 3.22.9.8. Failed clicks much ameliorated.


Web Tabular and LIMS Reports

BUG Web-tabular still doesn't honor reject flag (only cancel). Technically a resteasy-lims-webservices issue.
FEATURE Multiple image report: would be nice to have a single report that lists all images, that way you can do a lookup in one place rather than sifting through four distinct reports.
BUG UWQ or DEPTHCALC CSF-B shows overlap when it shouldn't. Verify against the DB. CSF-B depth computation incorrect in some way for hard-rock processing.
BUG Running JR UWQ twice fails on the 2nd run. To reproduce: select & submit the first report; it completes OK. Following up by submitting a 2nd report fails on Submit: it sits and spins. Not repeated since. May have been an artifact of multiple different versions of the product deployed under a single load-balance environment.
BUG MAD_MASS selection in WTR. Though the label_id shows 77/79, the interval columns in WTR show 77 & 77. Occurs for all MAD samples. Incorrect offsets.
FEATURE Tabular report showing the analyses done for a section half--particularly as a way to check that all images have been scanned. The quickest check currently is to use LimsPeak or pull composite Virtual Photo Table images.

Thin Section Report

BUG Report downloads items that have clearly been cancelled.
BUG Report is splitting some text across cell boundaries.
BUG Cancelled sample being picked up by ThinSectionReport before the actual sample. Prevents retrieval of ThinSectionReport: cancelled sample 330-U1373A-1 sn:2726241; sample it should be pulling from: parent core 330-U1373A-1R sn:2724471, actual thinsection 72 at sn:2734701330-U1373A-1R-1-W 8/12-BILLET 72-SLIDE 72. Is picking up cancelled sections. Fixed in 2.0 and 3.0 release series.

Sample Label Printing

BUG Sample labels are printing with empty fields unsubstituted. Still reported with some label variants. Means there is another path through the code where substitution variables are not being handled in all cases entirely.

Lims to Excel

BUG Hangs when a DESC template definition is null.
BUG Is picking up first matching canceled sample instead of normal samples. Part of sample browser functionality.
BUG Some depths not being reported correctly. How did they get that way? All cases traced so far have been procedural (rebuild of depths, recuration of sample) or entry error (miskeying).
FEATURE In the Definition File Components box, provide a check-box for each downloadable component set. If the box is checked, that set of components is run. If the box is not checked, that component set is not run. This enables the publications specialist to use a single template for all aspects of managing the data for a single VCD set.
FEATURE Would save time round-tripping if new content could be replaced at the worksheet or column level. For example: specify that only the alteration worksheet is being downloaded; point at an existing file; replace just the alteration sheet. Suggested that this is a significant process time-saver for the publications specialist. If we can find an Excel read/write component of sufficient quality, it would be nice to provide this capability and support at the granularity of spreadsheet columns: only go and update the latest and greatest.

Java Balance

BUG Chris, Heather--Java Balance allows NULL value to be saved for TARE in the calibration properties file. Causes the balance software to crash with stack dump.

SampleMaster

BUG SampleMaster Entering sections from the catwalk: why do the bottom intervals go red for the last one or two sections entered? Validation there is not quite correct.
BUG SampleMaster Occasionally does not print a label in a list of samples uploaded. Most typically the first sample is missed. Sometimes a sample further down is missed. Work-around: find missing and reprint. Not repeated. Capturing various SampleMaster station log files to see if we can catch the failure in action there.
BUG SampleMaster Entering pieces, validation code causes bin intervals to go red inappropriately.
BUG SampleMaster does not clear a cell when you start typing; it appends to the existing content. You should be able to tab to the next cell and type, not have to click three times and then enter. Both TAB and ENTER should move across the row. F2 should highlight the content of the cell; arrowing over should allow editing; starting to type should replace all content. Up and down arrow keys should move up and down rows. Left and right arrow keys should move across cells. This should work on ALL tabs, not just some.
BUG Web-services, but only those used by SampleMaster: top and bottom depths are being cataloged with lower-case "cm" units in LIMS.
FEATURE SampleMaster Log pieces on the respective section halves. Clarify. What about the uncut piece?
FEATURE SampleMaster Would like to be able to edit the piece number from within SampleMaster.
BUG SampleMaster Issues with entering/editing a section out of sequence. Behaved strangely. Particularly the recalculate offsets button is totally dependent on order.
BUG SampleMaster Won't accept "#" characters in the comment field. Causes upload to fail.
BUG SampleMaster cannot handle an ampersand in the comment stream or any of its fields--upload refuses to complete.
BUG SampleMaster Columns exported by SampleMaster do not match columns imported by SampleMaster. Current length observation is left out.
BUG SampleMaster YUKI samples not uploading. Why? Shifted column is corrected. Looked up and added curated parent length for reference. Still no go. Debugging indicates many fields not pulled for upload though present in the spreadsheet. Reproduction case in expedition 330 support folder.
BUG SampleMaster sometimes labels are dropping--requires additional double-checking. Not repeatable on demand.
FEATURE SampleMaster In browser lists, sort items by their numeric component rather than as plain text. E.g. piece numbers should sort numerically (1, 2, ..., 10), not alphabetically (1, 10, 2, ...). Similarly for cores. (See the comparator sketch at the end of this list.)
BUG When clicking through the hierarchical browser, it is still possible to click in an empty portion of the core, section, half, etc. lists and cause an untrapped exception to occur.
FEATURE SampleMaster In sample Edit/View tabs, sort samples by Exp, Site, Hole, Core, Type, Sect, Interval.
FEATURE SampleMaster Curatorial request: Sample table tab--drop the sampling tool concept. Get rid of it.
FEATURE SampleMaster Remove original length (x_orig_len) and current length (x_curr_len) from the Sample Table tab. Not used for these kinds of samples. Useful for sections and halves only.
BUG SampleMaster Frustratingly slow on sample browser updates; partially due to back-end server loading, partially due to server-side speeds. Users expect it to "pop": <3-5 sec response time for browsing activities. Chieh: partially addressed by the back-end service changes in release 2.2.0.1 of SampleMaster. Further optimization is needed to achieve the order-of-magnitude improvement that end users really appreciate.
BUG SampleMaster Edit Sample Table tab. Select parent sample. Enter three samples. Upload them. Copy and paste those three rows down. Modify. Upload again. The first sample of the copy-paste batch is assigned to the selected parent sample. The others are fine. Work-around has been to cancel and re-enter the samples.
FEATURE SampleMaster Enable samples to be moved within the sample hierarchy. Typically: sub-sample of archive half moved to working half where it was actually taken from; allow re-arranging of the sample hierarchy (maybe two shifts cataloged material differently: one against SHLF, the other against pieces under section half). Work-around: use LimsOnLine.
BUG SampleMaster On upload of a TS (with Print on Upload enabled), SampleMaster indicates "No labels were printed. There does not appear to be anything to print." Behavior seen when pointed at the load-balancer web service; the label print service is not presently load balanced. Cause identified, fix pending.
BUG SampleMaster or web-service or database sample rename trigger is leaving trailing spaces on label_id. E.g. this thin section got named with trailing spaces upon entry from SampleMaster: "TEST-TESTA-16H-5-W 20/22-TS14 ".
BUG Upload of a MADC sample from SampleMaster generates a new OBSLENGTH analysis for each upload. Also generates a new MAD_MASS container_number component for each upload (it does cancel the previous one). Similarly for the PYC analysis: a new PYC container_number analysis is generated for each upload, cancelling the previous one as it goes.
BUG Curator frequently receives the red "X" of death when doing simple queries in the parameter search window: supply the search, press Get Samples, and it crashes during sample retrieval. Seen it happen; reproduction cases gathered.
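
The cell-editing behavior requested above (type to replace the content, TAB/ENTER to move on) is standard data-grid behavior. Below is a minimal sketch of the overwrite-on-type part, assuming a Swing JTable; this is an illustration only and may not match SampleMaster's actual UI toolkit.

    import javax.swing.DefaultCellEditor;
    import javax.swing.JTable;
    import javax.swing.JTextField;
    import javax.swing.table.DefaultTableModel;

    public class OverwriteOnTypeTable {
        public static JTable build() {
            JTable table = new JTable(new DefaultTableModel(5, 3));
            // Begin editing as soon as the user types, and hand the keystroke to the editor.
            table.putClientProperty("JTable.autoStartsEdit", Boolean.TRUE);
            table.setSurrendersFocusOnKeystroke(true);

            // Editor that selects the existing content when editing starts,
            // so the first keystroke replaces it instead of appending to it.
            JTextField field = new JTextField();
            DefaultCellEditor editor = new DefaultCellEditor(field) {
                @Override
                public java.awt.Component getTableCellEditorComponent(
                        JTable t, Object value, boolean isSelected, int row, int column) {
                    java.awt.Component c =
                            super.getTableCellEditorComponent(t, value, isSelected, row, column);
                    ((JTextField) c).selectAll();
                    return c;
                }
            };
            table.setDefaultEditor(Object.class, editor);
            return table;
        }
    }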
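
The "#" and "&" upload failures above look like free text reaching the web-service request without being escaped. A minimal sketch, assuming the comment travels as a URL query parameter; the helper name is illustrative, not the actual SampleMaster code.

    import java.io.UnsupportedEncodingException;
    import java.net.URLEncoder;

    public class CommentEncoding {
        // Hypothetical helper: percent-encode free text before it is placed in a
        // query string, so characters like '#' and '&' cannot truncate the request.
        static String encodeForQuery(String comment) throws UnsupportedEncodingException {
            return URLEncoder.encode(comment, "UTF-8");
        }

        public static void main(String[] args) throws Exception {
            String comment = "Piece 3A & 3B, see note #12";
            // '#' would otherwise be read as a fragment delimiter and '&' as a
            // parameter separator by the receiving web service.
            System.out.println(encodeForQuery(comment));
            // prints: Piece+3A+%26+3B%2C+see+note+%2312
        }
    }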
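
For the piece/core sorting item above, a minimal natural-order comparator sketch in Java; illustrative only, not SampleMaster's actual list code.

    import java.util.Arrays;
    import java.util.Comparator;

    public class NaturalSort {
        // Compare labels such as "2", "10", "2A" by their leading numeric part
        // first, then by the remaining text, so "10" sorts after "2".
        static final Comparator<String> PIECE_ORDER = (a, b) -> {
            int[] na = splitNumber(a);
            int[] nb = splitNumber(b);
            if (na[0] != nb[0]) return Integer.compare(na[0], nb[0]);
            return a.substring(na[1]).compareTo(b.substring(nb[1]));
        };

        // Returns {numeric value of the leading digits, index where the digits end}.
        static int[] splitNumber(String s) {
            int i = 0;
            while (i < s.length() && Character.isDigit(s.charAt(i))) i++;
            int value = (i == 0) ? 0 : Integer.parseInt(s.substring(0, i));
            return new int[] { value, i };
        }

        public static void main(String[] args) {
            String[] pieces = { "10", "2", "1", "2A", "11", "3" };
            Arrays.sort(pieces, PIECE_ORDER);
            System.out.println(Arrays.toString(pieces)); // [1, 2, 2A, 3, 10, 11]
        }
    }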

Multi-Sensor Logger Framework


FEATURE MSL Revised various sensor Save Output file methods to return data as formatted for upload. Analyses, file extensions, etc. are specified in the sensor configuration files.
BUG MSL Run a section. Change the interval selection before specifying the next section. Measure it. The track still moves at the previous interval and no measurements are taken. Recommend greying out the buttons.
BUG MSL Interval buttons are not properly aligned with more than two sensor packages. Manually aligned for presentation of up to 3 sensors. Still need to look into the automated code to see what throws the button and graph alignment off.
BUG MSL Closing the main window should also stop and close all the instrument driver windows. Doesn't always.
BUG MSL Calibrate hangs the UI, doesn't move. Fixed.
BUG MSL Reset Controller menu item hangs the UI. No apparent effect on the Galil controller (no feedback, no messages). Fixed.
BUG MSL Find Laser works. But needs to be more robust: check for early tripping of laser switch. Leaving a core on the track breaks the current Find Laser algorithm. Added additional methods to detect obstructing material left on the track.
BUG MSL Configure Sensor: GRA Read config file hangs its UI. IODP MagSus: no configure window shows and the UI hangs. Fixed.
BUG MSL Find Home. No activity, no feedback. Hangs UI. Fixed. Has a functioning state diagram for the event to trigger.
BUG MSL About info: show release/version info automatically. Implementation is ugly; re-work.
FEATURE MSL Add menu items to enable/disable Context Help. Add Run-Time menus to do so.
FEATURE MSL Show units and data type for graphed values instead of just "values".
FEATURE MSL Identify the sample as part of the graph display, not as separate text boxes. Lengthen graph display.
FEATURE MSL Allow user to resize main display, auto-resize graphs to fit length and width.
BUG MSL Handle samples whose specified length is too short more gracefully. Affects the previously run material if you have to cancel or abort the current run. Stop and ask the operator for the proper curated length, or offer the option to go back and re-submit via Sample Entry.
BUG MSL GRA When the raw counts array is empty, a carriage return that is required for parsing is omitted from the data stream. Fixed.
BUG MSL If the config file path is omitted from the command line, the config file path ends up empty. When not specified, it should default to the hard-wired internal setting. Fixed.
BUG MSL Motion hangs for no apparent reason when running 0.5 cm intervals for GRA and MS. Often can kick-start again with Abort and Resume, though not reliably. The hang still occurs and seems to be associated with a VISA (timeout?) failure on the Bartington. No longer causes interference with other instruments; revised to not generate NaN data content. To be merged between code bases.
FEATURE WRMSL EOT switch hit; acted as if it was going to recover, then died. Finish rewiring the WRMSL to match the STMSL.
BUG MSL Running PWL with GRA and MS: runs to the PWL, then hangs and beep-boops on the first PWL measurement, leaving the PWL calipers clamped. No dialog; must kill the process via Task Manager. Fixed, but not released on the WRMSL. The WRMSL runs different hardware for the PWL than was available at HQ; the codebase is configurable for multiple hardware platforms.
BUG MSL code. P-wave files are being output with the default analysis GENERIC and extension "generic". Update the base sensor class to handle all of these fields, and refactor the other sensors to inherit and extend the base class functionality instead of duplicating it and re-doing it a different way. To be merged with the production codebase.
BUG MSL code. Running at 0.5 cm with three instruments, occasionally stops with a Galil 2010 error. Recoverable via the Abort and Resume dialog. Occurs with greater frequency later in the run, often leading to an incomplete run. Not usable as-is. Worked around in the current production code (don't generate a NaN observation); the stop still occurs.
BUG MSL code. Occasionally an instrument will time out, causing a whole measurement-point cycle to fail. Need to change how we handle a time-out and a single failed measurement (see the retry sketch at the end of this list). Worked around in production code.
BUG MSL All sensors receive the measure method, but one or two sensors fail to set their measured flag [done-measuring indicator] on 1 in 300 measurements. Same as above. Worked around in production code.
BUG MSL 2.0.3.3 MagSus1 zero measurement now always NaN. Fix it.
BUG MSL Saw GRA not write out the analysis code in the header. Regression. Fixed.
FEATURE MSL Stop GRA writing a separate calibration file; just write it to the configuration file under [Last Calibration], as this file gets uploaded when changed. Fixed in the Subversion code base.
BUG MSL No place to put the user name! Add it to the SampleEntry screen; the class has a spot for it. Fixed: added to the Sample cluster. Dropped use of the authentication class.
BUG MSL Fails to measure 1 in ~200 times with multiple sensors operating. See below. Cause unknown. Known to occur more often when remote desktop is eating CPU cycles and multiple instruments are operating at once. Captured traces can be matched to the times below. Worked around in code. Fixed in the production release; not yet fixed to par in the Subversion code base.
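
The time-out items above (a single instrument time-out failing the whole measurement-point cycle, the 1-in-~200/300 missed measured flag) all call for the same defensive pattern: bound the wait, retry once or twice, and flag the point rather than aborting the run. The sketch below is in Java purely for illustration, with a hypothetical Sensor interface; it is not MSL code.

    import java.util.Optional;
    import java.util.concurrent.*;

    public class MeasurementRetry {
        // Hypothetical sensor interface standing in for an MSL instrument driver.
        interface Sensor {
            double measure() throws Exception; // blocking call to the instrument
            String name();
        }

        static final ExecutorService POOL = Executors.newCachedThreadPool();

        // Try a measurement up to maxAttempts times, each bounded by timeoutMs.
        // Returns empty instead of throwing, so one bad point cannot kill the run.
        static Optional<Double> measureWithRetry(Sensor sensor, long timeoutMs, int maxAttempts) {
            for (int attempt = 1; attempt <= maxAttempts; attempt++) {
                Future<Double> f = POOL.submit(sensor::measure);
                try {
                    return Optional.of(f.get(timeoutMs, TimeUnit.MILLISECONDS));
                } catch (TimeoutException e) {
                    f.cancel(true);
                    System.err.printf("%s timed out (attempt %d of %d)%n",
                            sensor.name(), attempt, maxAttempts);
                } catch (Exception e) {
                    System.err.printf("%s failed: %s%n", sensor.name(), e);
                }
            }
            return Optional.empty(); // caller flags the point and moves on
        }
    }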


Visual Core Description

BUG VCD production. Descriptions entered against a section; the section is then recurated; the description plots in the "wrong place" on the VCD. Need to recompute depths on recurated sample material. Takes a long time to show up in the VCD, so it isn't caught early. No automated check for this.
BUG VCD production. Describers don't take time to set up the value list, sometimes find it inconvenient to use, or vary from person to person in how things should be described. This results in many variants of a description that have to be trapped to plot it consistently; double spaces versus single spaces in names may not be trapped. (See the normalization sketch after this list.)
BUG VCD production. L2E retrieves content with the wrong depth--a previously cancelled depth was being picked up.
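
For the describer-variance item above, most variants can be collapsed before plotting with a simple normalization pass: trim, collapse internal whitespace, fold case, and map known synonyms. A minimal sketch; the synonym entries are made-up examples, not the project value list.

    import java.util.Locale;
    import java.util.Map;

    public class DescriptorNormalizer {
        // Illustrative synonym map; the real entries would come from the value list
        // the describers are supposed to be using.
        static final Map<String, String> CANONICAL = Map.of(
                "xtal tuff", "crystal tuff",
                "volc. breccia", "volcanic breccia");

        static String normalize(String raw) {
            // Trim, collapse runs of whitespace to one space, lower-case.
            String key = raw.trim().replaceAll("\\s+", " ").toLowerCase(Locale.ROOT);
            return CANONICAL.getOrDefault(key, key);
        }

        public static void main(String[] args) {
            System.out.println(normalize("  Crystal   Tuff "));  // crystal tuff
            System.out.println(normalize("xtal  tuff"));         // crystal tuff
        }
    }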

Closeup, Microphoto, Thinsection Capture

BUG CloseupCapture Thin Section mode needs to show the interval in the file-name label shown for the image. Fixed in 1.0.7.3.
FEATURE MicroscopeCapture needs to be able to import images saved directly via the SPOT software. Needs a function to browse the file system and import a batch of images, or a panel that displays the content of a selected directory. Acquire and Import functionality are technically mutually exclusive; only TIFF and JPG reading need to be supported.
BUG Exported images from CloseupCapture are actually PNG files? Verify (see the file-signature sketch after this list). Labelling them as JPGs causes occasional issues with import into Illustrator and Photoshop on board. Content in T:\Uservol\Thin Section Images\U1372\Specials was renamed (ren *.jpg *.png) after verifying with ImageMagick. Fixed in 1.0.7.3.
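
To settle the PNG-vs-JPG question above without trusting the file extension, check the file signature directly: PNG files begin with the bytes 0x89 'P' 'N' 'G', JPEGs with 0xFF 0xD8. A small stand-alone sketch:

    import java.io.IOException;
    import java.io.InputStream;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;

    public class ImageSignature {
        // Report the real format of a file from its first bytes, ignoring the extension.
        static String detect(Path file) throws IOException {
            try (InputStream in = Files.newInputStream(file)) {
                int b0 = in.read(), b1 = in.read(), b2 = in.read(), b3 = in.read();
                if (b0 == 0x89 && b1 == 'P' && b2 == 'N' && b3 == 'G') return "PNG";
                if (b0 == 0xFF && b1 == 0xD8) return "JPEG";
                return "unknown";
            }
        }

        public static void main(String[] args) throws IOException {
            for (String name : args) {
                Path p = Paths.get(name);
                System.out.println(p + ": " + detect(p));
            }
        }
    }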

Virtual Photo Table

FEATURE Koppers requested an annotated version of the VPT image, showing unit boundaries, lithologies, and a legend.

Moisture and Density: Pycnometer and Balance

FEATURE Pycnometer--don't select sample just by text_id; provide more identifying info just like the Balance does (consistent set of parameters).
FEATURE Pycnometer needs to show more info on the sample selection, preferably matching the content the Balance shows.
BUG Spelling Pycnometer uploads "CELCIUS" as the temperature unit rather than "CELSIUS".
BUG Duplicate MAD_MASS analyses are being generated: same sample_number, same test_number, different result_numbers. This has been happening for three expeditions now. Balance change? Log test change? Unknown; checking. Should use the existing test record created by SampleMaster at sample cataloging time (see the sketch after this list). We are obtaining duplicate container and mass_dry entries on MAD_MASS records: one comes from SampleMaster, the other from the balance, the web service, or the invocation combination.
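
For the duplicate MAD_MASS item above, whichever client is at fault should look up the record SampleMaster created at cataloging time and attach new results to it, creating new records only when none exist; the same look-before-insert applies to the container and mass_dry results. A hedged JDBC sketch: the table and column names beyond sample_number/test_number are assumptions about the LIMS schema, not verified.

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;

    public class MadMassUpload {
        // Find the existing MAD_MASS test for this sample, if any, instead of
        // creating a new one on every upload. Table/column names "test" and
        // "analysis" are assumed; sample_number/test_number come from the notes.
        static Long findExistingTest(Connection con, long sampleNumber) throws SQLException {
            String sql = "SELECT test_number FROM test "
                       + "WHERE sample_number = ? AND analysis = 'MAD_MASS'";
            try (PreparedStatement ps = con.prepareStatement(sql)) {
                ps.setLong(1, sampleNumber);
                try (ResultSet rs = ps.executeQuery()) {
                    return rs.next() ? rs.getLong(1) : null;
                }
            }
        }

        static long testNumberFor(Connection con, long sampleNumber) throws SQLException {
            Long existing = findExistingTest(con, sampleNumber);
            if (existing != null) {
                return existing;          // attach new results to the catalog-time test
            }
            return createMadMassTest(con, sampleNumber); // only when none exists
        }

        // Placeholder for the insert that SampleMaster normally performs at cataloging time.
        static long createMadMassTest(Connection con, long sampleNumber) throws SQLException {
            throw new UnsupportedOperationException("create the test via the existing upload service");
        }
    }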

Other

FEATURE Crawford. Would be nice to have contact sheets for cores and for smear slide/thin section/microphoto images.
BUG resteasy-lims-webservices: OBSLENGTH.observed_length is cataloged with unit "m" instead of METER--in one place in the code.
BUG resteasy-lims-webservices: OBSLENGTH. Other components should be cataloged as NONE for the unit--METER is being applied inappropriately in all cases.
BUG SCALINPUTS catalogs its lengths using "m" rather than METER.
BUG Revise the precision of data values on the WTR tabular report: should be five significant figures, i.e. four places behind the decimal point for these values (see the formatting sketch after this list). Really needs to be revisited on a per-analysis basis.
BUG Asman's use of the resteasy service does not provide a Content-Disposition header on file responses, so we lose the filenames on download. (See the Content-Disposition sketch after this list.)
BUG Web-services: child samples with 0 length (i.e. length is meaningless) should inherit their parent sample's interval and depth. DESCLogik is currently picking up the 0 offset and 0 length applied, e.g., for the child (TS). (See the inheritance sketch after this list.)
FEATURE Request from Craig (drill shack): a mini-computer for the drill shack to run SampleMaster, so they can move the monitor and console up into a better field of view. Even better: just provide them with a web page with all the necessary core-entry functionality and no more.
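
On the WTR precision item above: four decimal places and five significant figures only coincide for values between 1 and 10, so if significant figures are truly the intent, the report formatter needs significant-figure rounding rather than a fixed decimal format. A small Java illustration, not the report code.

    import java.math.BigDecimal;
    import java.math.MathContext;

    public class Precision {
        // Round to five significant figures regardless of magnitude.
        static String fiveSigFigs(double value) {
            return new BigDecimal(value).round(new MathContext(5)).toPlainString();
        }

        public static void main(String[] args) {
            double v = 0.01234567;
            System.out.println(String.format("%.4f", v)); // 0.0123   (4 decimal places)
            System.out.println(fiveSigFigs(v));           // 0.012346 (5 significant figures)
        }
    }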
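
For the Asman filename item above, the file-serving resource just needs to add a Content-Disposition header to its response; the client can then keep the original filename. A minimal JAX-RS-style sketch; the resource path and lookup are illustrative, not the actual resteasy-lims-webservices code.

    import java.io.File;
    import javax.ws.rs.GET;
    import javax.ws.rs.Path;
    import javax.ws.rs.PathParam;
    import javax.ws.rs.core.Response;

    @Path("/files")
    public class FileDownloadResource {
        @GET
        @Path("/{id}")
        public Response download(@PathParam("id") String id) {
            File file = lookupFile(id); // however the service resolves the stored file
            return Response.ok(file, "application/octet-stream")
                    // Without this header the client has no way to recover the filename.
                    .header("Content-Disposition",
                            "attachment; filename=\"" + file.getName() + "\"")
                    .build();
        }

        private File lookupFile(String id) {
            // Placeholder for the existing Asman file lookup; path is hypothetical.
            return new File("/asman/store/" + id);
        }
    }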
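
For the zero-length child sample item above, the web service could walk up to the parent whenever a child's length is zero and report the parent's interval and depths. A hedged sketch with hypothetical field names, not the actual LIMS object model.

    public class DepthResolver {
        // Hypothetical, simplified view of a cataloged sample.
        static class Sample {
            double topOffsetCm, bottomOffsetCm;   // interval within parent
            double topDepthM, bottomDepthM;       // computed depths
            Sample parent;
            double length() { return bottomOffsetCm - topOffsetCm; }
        }

        // A zero-length child (e.g. a TS) carries no interval of its own,
        // so report its parent's interval and depths instead of 0/0.
        static double[] effectiveDepths(Sample s) {
            Sample source = s;
            while (source.length() == 0.0 && source.parent != null) {
                source = source.parent;
            }
            return new double[] { source.topDepthM, source.bottomDepthM };
        }
    }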