xBio:D Database Server Management (osuc.biosci.ohio-state.edu)


General Locations

  • Apache Server Config: /etc/httpd/
  • Apache Web Directory: /var/www/
  • Oracle Home Directory: /opt/app/oracle/product/11.2.0/db_2
  • Oracle Base Directory: /opt/app/oracle


Applications / Services

Startup Database

Although the database and all of the web services should start automatically when the server boots, the database and its listener do not reliably do so. These instructions describe how to start the database manually; a sketch of how the working sequence could be scripted follows the notes below. The assumption is that the user is already logged into the OSUC server.

  1. Start up the database:
    1. $ sqlplus
    2. $ conn sys/password as sysdba /nolog
    3. $ dbstart $ORACLE_HOME
  2. Start the database listener: $ lsnrctl start
  3. 01 Apr 2016 (no, this is not an April Fools' joke):
    1. $ sqlplus /nolog    // connects to an idle instance
    2. $ conn sys/password as sysdba
    3. $ startup

I'm not sure how or why this worked, as we could not execute the first set of steps. For now, everything seems to be working. We'll see if that lasts.
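
For convenience, here is a minimal Python sketch of how the sequence that did work (SQL*Plus STARTUP followed by lsnrctl start) could be scripted, assuming the Oracle home listed under General Locations and OS authentication for the oracle account; the script itself and the connection method are assumptions, not a procedure actually installed on the server.

  #!/usr/bin/env python
  """Hypothetical wrapper for the manual startup steps above; a sketch only,
  not a script actually installed on osuc.biosci.ohio-state.edu."""

  import os
  import subprocess

  # Matches the Oracle home listed under "General Locations".
  ORACLE_HOME = "/opt/app/oracle/product/11.2.0/db_2"
  ENV = dict(os.environ,
             ORACLE_HOME=ORACLE_HOME,
             PATH=os.path.join(ORACLE_HOME, "bin") + ":" + os.environ["PATH"])

  def startup_database():
      # Mirror the 01 Apr 2016 steps: connect to the idle instance and STARTUP.
      # Using "/ AS SYSDBA" assumes OS authentication for the oracle account,
      # which is an assumption, not something documented above.
      sql = "CONNECT / AS SYSDBA\nSTARTUP\nEXIT\n"
      subprocess.run(["sqlplus", "/nolog"], input=sql.encode(), env=ENV, check=True)

  def start_listener():
      # Equivalent of the manual "lsnrctl start" step.
      subprocess.run(["lsnrctl", "start"], env=ENV, check=True)

  if __name__ == "__main__":
      startup_database()
      start_listener()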

Database Backup

The database is backed up daily at 10 PM via the db_backup.sh script. This script creates a database dump using the exp utility and transfers the file via FTP to the dumpfiles directory of the hymfiles web directory.
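
The contents of db_backup.sh are not reproduced here; the following Python sketch only illustrates the described behavior (an exp dump followed by an FTP transfer to hymfiles). The dump file name, connect string, FTP host, credentials, and target path are all placeholders.

  #!/usr/bin/env python
  """Illustration of the nightly backup described above (not db_backup.sh):
  create a dump with exp, then push it to the dumpfiles directory on the
  hymfiles server.  Every name and credential below is a placeholder."""

  import subprocess
  from datetime import date
  from ftplib import FTP

  DUMP_FILE = "/tmp/xbiod_%s.dmp" % date.today().isoformat()

  # The connect string and FULL=y are assumptions about how the export is run.
  subprocess.run(["exp", "system/password", "FULL=y", "FILE=%s" % DUMP_FILE],
                 check=True)

  # Hostname, login, and target directory stand in for the real dumpfiles
  # location inside the hymfiles web directory.
  ftp = FTP("hymfiles.example.edu")
  ftp.login("backup_user", "backup_password")
  ftp.cwd("dumpfiles")
  with open(DUMP_FILE, "rb") as fh:
      ftp.storbinary("STOR " + DUMP_FILE.split("/")[-1], fh)
  ftp.quit()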


Dataset Export

Data sets that are intended to be automatically updated in the xBio:D IPT must be specified within the export_dwca_collections.py script found in the cgi-bin directory of the web server. Exports are placed in the data exports directory of the web server and are then available for harvesting by a supplementary script on the hymfiles server, where the IPT resides.
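
The internals of export_dwca_collections.py are not documented here, so the sketch below only illustrates the general pattern this paragraph describes: a hard-coded list of collections whose Darwin Core Archives are written into the data exports directory for later harvesting. The collection codes, directory path, and write_dwca helper are hypothetical.

  #!/usr/bin/env python
  """Sketch of the export pattern described above, not the actual
  export_dwca_collections.py.  Collection codes, the export directory, and
  the write_dwca helper are hypothetical."""

  import os
  import zipfile

  # Collections to refresh automatically in the xBio:D IPT (placeholders).
  COLLECTIONS = ["OSUC", "EXAMPLE"]

  # Stand-in for the web server's data exports directory.
  EXPORT_DIR = "/var/www/data_exports"

  def write_dwca(collection, target_dir):
      # Placeholder for the real Darwin Core Archive export; it only writes an
      # empty archive so the harvesting script on hymfiles has a file to fetch.
      path = os.path.join(target_dir, "%s_dwca.zip" % collection.lower())
      with zipfile.ZipFile(path, "w") as archive:
          archive.writestr("occurrence.txt", "")  # real exports hold occurrence records
      return path

  if __name__ == "__main__":
      for code in COLLECTIONS:
          print("exported " + write_dwca(code, EXPORT_DIR))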


Automatic File Cleanup

The database server hosts the xBio:D database, the APIs, HOL, vSysLab, DB Manager, HNS, and the various HOL-based sites, but its available hard disk space is somewhat limited. Trace and log files quickly fill that space, so they must be cleaned up periodically. Two scripts on the server remove temporary files that could otherwise cause problems.

  • The first script, removeOldTraceFiles.py, removes the Oracle trace files that the database writes for diagnostic and tuning purposes. If a trace file is older than five days, the script removes it to save space; a minimal sketch of this kind of age-based cleanup appears after this list.
  • The second script, removeOldDataExports.py, removes data exports produced by the various web applications. Any export that has resided on the server for over a month is deleted.
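
Both scripts follow the same age-based pattern; the sketch below shows that pattern in minimal form, assuming a simple modification-time check. The directory paths and the exact retention logic of removeOldTraceFiles.py and removeOldDataExports.py are assumptions.

  #!/usr/bin/env python
  """Age-based cleanup in the spirit of removeOldTraceFiles.py and
  removeOldDataExports.py; directories and thresholds are placeholders."""

  import os
  import time

  def remove_older_than(directory, days):
      # Delete regular files whose modification time is more than `days` days old.
      cutoff = time.time() - days * 86400
      for name in os.listdir(directory):
          path = os.path.join(directory, name)
          if os.path.isfile(path) and os.path.getmtime(path) < cutoff:
              os.remove(path)

  if __name__ == "__main__":
      remove_older_than("/opt/app/oracle/diag", 5)       # assumed trace file location
      remove_older_than("/var/www/data_exports", 31)     # assumed data exports location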