A simpler overview is here: Workflow
- Make sure everything is properly prepared for upload (see Preparing Collections)
(If new to this process, see Command-line_Work_on_Linux_Server)
- Run makeFits script on server in scripts directory
- This will take some time, probably hours
- Check the corresponding output file in the scripts output directory
- If there are errors, run FixFits (in the same directory); if not, the output will say "Everything looks fine."
- If this is an ongoing collection, add today's date (no spaces) to the end of the collection directory name, to distinguish it from other sets of batches.
- Run holdContent, a script which
- puts TIFFs, FITS, OCR, and collection XML (if present) in the hold4metadata directory on the server
- makes a collection directory in For_Metadata_Librarians and puts the Metadata directory and contents there
- generates smaller, non-watermarked images for metadata librarians in a Jpegs directory there
- deletes the collection in Digital_Coll_in_Progress
- Wait until you hear that the MODS are ready: the metadata librarians will move the collection to Return_To_Digital_Services
- a weekly script, checkStatusDS, watches for content returned there and will:
- check their jpegs against our tiffs
- check their MODS against their jpegs
- if all good, the script will delete their jpegs and move the collection to Digital_Coll_Complete for processing -- then notify us
- if problems, the script will send us a list of errors, and leave the collection in Return_To_Digital_Services
- If collection is still in Return_To_Digital_Services, resolve issues (may require renaming TIFFs and regenerating FITS to replace those on server), then rerun checkStatusDS (in ds /home/scripts/) till all is good
*** NOTE: checkStatusDS will FAIL if the metadata librarians have renamed either the Jpegs folder OR the collection folder, because the script depends on these names to test against the TIFFs in the hold4metadata directory on the server. This is INTENTIONAL, so that we can give the metadata librarians multiple batches of scans within a single collection folder AND/OR multiple sets of content for a single collection, since we may digitize far more quickly than they create metadata. Reused names would be a real problem for ongoing large collections. Number/name batches and/or sets of collection content appropriately, and DO NOT REUSE any name already in their directory.
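The name-matching requirement above can be sketched as a small shell check. The directory names below are placeholders for demonstration, not the real server layout, and the real checkStatusDS (in /home/ds/scripts/) does considerably more than this:

```shell
#!/bin/sh
# Hedged sketch: a returned collection must keep its original folder name
# and its Jpegs folder, or checkStatusDS will fail. Paths are placeholders.
check_collection() {
    # $1 = hold4metadata path, $2 = returned collection path
    name=$(basename "$2")
    [ -d "$1/$name" ] || { echo "MISMATCH: $name not in hold4metadata (renamed?)"; return 1; }
    [ -d "$2/Jpegs" ] || { echo "MISMATCH: $name is missing its Jpegs folder"; return 1; }
    echo "OK: $name"
}

# Demo with a throwaway layout: one good collection, one renamed one.
tmp=$(mktemp -d)
mkdir -p "$tmp/hold4metadata/coll_20240101" \
         "$tmp/Return_To_Digital_Services/coll_20240101/Jpegs" \
         "$tmp/Return_To_Digital_Services/coll_renamed"
for c in "$tmp/Return_To_Digital_Services"/*/; do
    check_collection "$tmp/hold4metadata" "${c%/}"
done
rm -rf "$tmp"
```

The renamed collection is flagged rather than silently skipped, which mirrors the intentional-failure behavior described in the note.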
- Run jpegs2Server, a script which:
- copies the MODS to the UploadArea for online delivery and Deposits for archiving, and deletes the version from the Share drive
- copies any .txt files in the Metadata to the Deposits for archiving
- creates watermarked JPEG derivatives and thumbs of all the TIFFs and puts them into the UploadArea
- generates MIX files and logs technical metadata to the database
- moves any Excel spreadsheets to Administrative\Pipeline\collectionInfo\Storage_Excel
- moves the TIFFs to Deposits
- uploads transcripts & OCR to the Acumen database
- checks the collection XML (if there is one; warns you if it needs one) and gets it in the database, online, and in Deposits
- removes the hold4metadata folder and the Digital_Coll_Complete folders if empty
- sends you an email to check the output, and then run relocate_all to get stuff in Acumen
- check the output of jpegs2Server, and look to make sure all the content is gone from Digital_Coll_Complete; also eyeball the Upload directories to make sure MODS and Jpegs are there ready and waiting.
- Run relocate_all, a script which puts that content in the UploadArea in the right places in Acumen
- Check error report. If no errors, run findMissing. Correct any problems found.
Problems? See Troubleshooting
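Most steps above end with "check the output." A minimal sketch of automating that check, assuming the logs are plain text and problems are flagged with the word "error" (the real output format may differ):

```shell
#!/bin/sh
# Hedged sketch: scan a script's output log for error lines.
# Matching on the word "error" is an assumption; adjust to the real logs.
scan_log() {
    if grep -qi 'error' "$1"; then
        echo "Problems found in $1:"
        grep -ni 'error' "$1"
        return 1
    fi
    echo "Everything looks fine."
}

# Demo against a throwaway log file:
log=$(mktemp)
printf 'processed 14 tiffs\nERROR: bad checksum on scan0007.tif\n' > "$log"
scan_log "$log"
rm -f "$log"
```

A nonzero return status makes the check easy to chain into a larger script or cron job.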
Important things to remember about working with the server
- Think long and hard before you run upload scripts on more than one collection at a time. Yes, it is possible to run the same scripts on multiple collections at a time. But just because it's possible doesn't mean you should do it. If something goes wrong, it can be very difficult to disentangle the separate collections in order to fix the problem.
- Do not run the same script more than once at a time on a particular collection. If a script encounters a problem such that it doesn't finish running, do not just run it again. We do not want two instances of the same script running simultaneously. This gets very confusing and multiplies the possibility of error exponentially!
- If you need to kill a script, please see someone who manages the scripts (Jody or the Repository Manager). Closing the current Terminal Window (the command line interface) of SSH Secure File Transfer does not kill the scripts that are running. This is especially important to remember for makeJpegs, which splits into two processes as soon as you run it (whether you see results of this in the UploadArea or not).
- When checking a libcontent folder for ongoing upload status, hit Refresh. The SSH File Transfer Window will not refresh on its own.
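The "never two instances at once" rule above can be checked before launching anything. A pgrep-based sketch (the process name below is a runtime-built placeholder, not a real script):

```shell
#!/bin/sh
# Hedged sketch: check whether another instance of a script is already
# running before starting it. pgrep -f matches against full command lines.
is_running() {
    pgrep -f "$1" > /dev/null 2>&1
}

# Demo: build a placeholder name at runtime so nothing real can match it.
name="holdContent_demo_$$"
if is_running "$name"; then
    echo "$name is already running -- do not start another instance"
else
    echo "$name is not running; safe to start"
fi
```

Remember that even if the check passes, killing a half-finished script is a job for whoever manages the scripts, as noted above.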
Preliminary steps on the server
- make sure that the Windows share is mounted: type into the ssh window on libcontent: `ls /cifs-mount` -- if no listing appears, or the window hangs, you need to mount the share drive. Otherwise, proceed to getting content from the share drive...
- To mount the Windows drive, type this in on the commandline on libcontent: `sudo mount -t cifs -o username=jjcolonnaromano,domain=lib //libfs1.lib.ua-net.ua.edu/share/Digital\ Projects/ /cifs-mount` and use the password for share for Jeremiah. If successful, the command in the last step will show you the directories within the Digital Projects folder on Share.
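The manual `ls /cifs-mount` check above can be wrapped in a tiny guard. This is a hedged sketch using a throwaway directory in place of the real mount point:

```shell
#!/bin/sh
# Hedged sketch: verify a CIFS mount point is populated before proceeding.
# On the real server the path is /cifs-mount; here we demo with a temp dir.
share_ready() {
    # A mounted, readable share should list at least one entry.
    [ -d "$1" ] && [ -n "$(ls -A "$1" 2>/dev/null)" ]
}

# Demo: empty dir stands in for an unmounted share, then we populate it.
tmp=$(mktemp -d)
share_ready "$tmp" || echo "not mounted (or empty) -- run the sudo mount command above"
touch "$tmp/Digital_Projects_placeholder"
share_ready "$tmp" && echo "share looks mounted; proceed"
rm -rf "$tmp"
```

An empty-but-mounted share would be flagged too, which is usually what you want, since an empty Digital Projects listing also means something is wrong.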
- Make sure that all quality control scripts have been run, and all corrections made.
- Add today's date (no spaces) to the collection directory name if this is an ongoing collection. Make sure scan batch numbers are not repeats.
- Log onto libcontent and change into the scripts directory. (see Command-line_Work_on_Linux_Server)
- Create FITS with the makeFits script on the server (in the scripts directory, not in the UploadArea).
- Change into UploadArea/scripts, and run holdContent, which should email you.
- Check for any errors in output; your directory should have disappeared from the working area, and part of it should reappear in For_Metadata_Librarians, and part in the /srv/deposits/hold4metadata/ directory.
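The date-suffix step above (adding today's date to an ongoing collection's directory name) can be sketched as follows. The YYYYMMDD format is an assumption; match whatever convention the existing directories already use:

```shell
#!/bin/sh
# Hedged sketch: append today's date (no spaces) to a collection directory
# name, as required for ongoing collections.
add_date_suffix() {
    mv "$1" "${1}_$(date +%Y%m%d)"
}

# Demo with a throwaway directory:
tmp=$(mktemp -d)
mkdir "$tmp/MyCollection"
add_date_suffix "$tmp/MyCollection"
ls "$tmp"
rm -rf "$tmp"
```

Doing this with a one-liner instead of a GUI rename avoids accidentally introducing spaces into the name.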
After Content Returns from Metadata Librarians
- If the content remains in Return_To_Digital_Services, check the problems list in the email notification.
- You may need to rename the collection directory or batch directories (on the share drive) back to their original names
- You may need to rename TIFFs in /srv/deposits/hold4metadata/ and either modify & rename FITS files, or regenerate them
- Then rerun checkStatusDS till all is clear, and collection gets moved to Digital_Coll_Complete.
- Run jpegs2Server on the content that the last script moved to Digital_Coll_Complete, without changing anything.
- Check output file, MODS and JPEGs in UploadArea, and ensure collection disappeared from the Completed folder.
- If/When all is good, run relocate_all
- Check output file for any issues.
- Run findMissing in /home/ds/scripts/ and check the output. This script will hunt through Acumen to make sure there is a MODS file for every item, and at least one derivative for each MODS file. Any errors will be found in the output file written to the scripts/output directory. If errors are found, get Metadata folks to regenerate those MODS -- or you regenerate JPEGs/MP3s, then rerun relocate_all. Then run this script again to ensure all errors have been remedied. (Script here: File:FindMissing.txt)
- Check the indexing of the uploaded content the next day to verify. (Content is currently being indexed each day overnight.)
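The MODS/derivative pairing check that findMissing performs can be sketched like this. The directory layout and file naming are illustrative assumptions; the real script knows Acumen's actual structure:

```shell
#!/bin/sh
# Hedged sketch: every item should have a MODS file, and every MODS file
# should have at least one derivative (e.g. a JPEG). Layout is illustrative.
check_item() {
    # $1 = item directory; expects <id>.mods.xml and at least one .jpg
    id=$(basename "$1")
    [ -f "$1/$id.mods.xml" ] || { echo "MISSING MODS: $id"; return 1; }
    ls "$1"/*.jpg > /dev/null 2>&1 || { echo "MISSING DERIVATIVE: $id"; return 1; }
    echo "OK: $id"
}

# Demo: one complete item, one with no derivative.
tmp=$(mktemp -d)
mkdir -p "$tmp/item_0001" "$tmp/item_0002"
touch "$tmp/item_0001/item_0001.mods.xml" \
      "$tmp/item_0001/item_0001.jpg" \
      "$tmp/item_0002/item_0002.mods.xml"
for item in "$tmp"/*/; do
    check_item "${item%/}"
done
rm -rf "$tmp"
```

As in the real workflow, a missing MODS goes back to the metadata folks, while a missing derivative means regenerating JPEGs/MP3s and rerunning relocate_all.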
Quick Reference
- Log in as DS and navigate to the scripts directory
- Type in makeFits (check output)
- Navigate to UploadArea/scripts
- Type in holdContent (check output)
- WAIT FOR CONTENT TO BE RETURNED
- Repair problems if any, and run checkStatusDS until clear
- Run jpegs2Server (check output)
- Run relocate_all (check output)
- Run findMissing (check output)
The first sections of the attached diagram which pertain to Digital Services are delineated in Preparing_Collections_on_the_S_Drive_for_Online_Delivery_and_Storage.
See lines 9-25 and 31-33 on this page: Moving_Content_To_Long-Term_Storage