New Builds – Do You Have a Checklist?

In case anyone is wondering why I’m currently just barely managing one post a week, the first and most obvious reason is preparation for Black Friday/Cyber Monday. Retail clients really stress their databases on these days, and a bit of proactive work can go a long way. The other reason is that somehow someone thought it would be a good idea to schedule a go-live for a rather large client for early/mid November. If I figure out who, I may have to complain. But it got me thinking about the checklist in my own head that I roll through before go-live, so I thought this week’s two-hours-or-less post would be on that.

Purpose and Details of a Go-Live Checklist

Many people who work on WebSphere Commerce for just one smallish employer may see only one go-live, or maybe two or three over ten years, to cover hardware refreshes and major software/site upgrades. The DBAs where I work handle about 6-12 per year, on top of ongoing support, builds, and training. After a while we’ve built up a list of things to double-check. Most of these we do during a build process that runs two months or longer, and we simply check them off as we get close to go-live.

My list is very specifically DBA-focused – there is plenty more for WebSphere Commerce or your specific application that it may not cover.

DBA E-Commerce Checklist for Go-Live

The things I check are for WebSphere Commerce, but most of them apply to any e-commerce database and even many DW/DSS databases.
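
For several of the items below, I’ve dropped quick command sketches after the list to show roughly what I mean.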

  1. Basic database maintenance is scheduled. This includes:
    • Backups
    • Reorgs, Runstats, Rbinds
    • Performance monitoring/Snapshots
    • Archiving of db2diag.log, instance.nfy, and cleaning up of DIAGPATH
    • Fail-safe maintenance check script to find silent maintenance failures
  2. Standard starting settings are in place. This includes:
    • DB2 Registry
    • DBM Configuration
    • DB Configuration
  3. Standard permissions scheme in place. This includes:
    • Revoking CONNECT from PUBLIC
    • Giving the WebSphere Commerce database ID DBADM and/or DATAACCESS
    • Creating groups for read-only and select/update/insert/delete access – per server if LDAP is used for database connections
  4. Data pruning strategy in place. This is one step that can be done either pre-go-live or shortly after go-live. This includes:
    • Discussing detailed plan with dev team and client to identify any gaps or custom pruning areas needed
    • Sign-off by the dev team, project manager, and client on all retention guidelines
    • Coding of custom pruning areas if needed
    • Scheduling of pruning scripts in crontab
  5. Failover tests for all types of HA and DR solutions used
  6. Performance review. This includes:
    • Preferably load testing. Please, please, please let this client do full load testing.
    • Detailed review of physical database performance, and tweaking of parameters
    • SQL review – even if client doesn’t do load testing, look for problem SQL
  7. Quick physical storage review – is any filesystem getting full as we load data in?
  8. Code – may not have time to apply a DB2 FixPack, but I like to know if I’m going to need to in the near future. Sometimes a new one came out between the time I built the servers and the time we go live.
  9. Licensing – did I remember to apply a license file to all servers?
  10. db2look – I like to capture the structure of the database objects with db2look as it stands during the pre-go-live code freeze
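
For item 1, here’s a minimal sketch of what the scheduled maintenance might look like in cron. The database name (MALL), the wrapper script names, and the paths are all hypothetical examples, not anything standard:

    # hypothetical crontab entries on the database server
    0 2 * * * /home/db2inst1/scripts/online_backup.sh MALL >> /home/db2inst1/logs/backup.log 2>&1
    0 3 * * 0 /home/db2inst1/scripts/reorg_runstats_rbind.sh MALL >> /home/db2inst1/logs/maint.log 2>&1
    0 6 * * * /home/db2inst1/scripts/check_maint_success.sh >> /home/db2inst1/logs/maint_check.log 2>&1

The wrapper scripts would issue the usual commands, something along these lines:

    db2 backup database MALL online to /db2backups compress include logs
    db2 reorgchk update statistics on table all
    db2rbind MALL -l /tmp/db2rbind.log all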
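
For item 2, pulling the current settings to compare against our standard starting values is quick (MALL is a placeholder database name):

    db2set -all              # DB2 registry variables
    db2 get dbm cfg          # instance (database manager) configuration
    db2 get db cfg for MALL  # database configuration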
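
For item 3, a minimal sketch of the kinds of statements involved. The user name, group names, schema, and table here are hypothetical examples only:

    db2 connect to MALL
    db2 "revoke connect on database from public"
    db2 "grant dbadm on database to user wcsuser"
    db2 "grant dataaccess on database to user wcsuser"
    # the read-only and DML groups then get object-level grants, for example:
    db2 "grant select on table wscomusr.orders to group app_ro"
    db2 "grant select, insert, update, delete on table wscomusr.orders to group app_rw"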
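
For item 5, if HADR happens to be the HA solution, the failover test is essentially a planned takeover run from the standby, followed by a role check; other HA/DR technologies have their own equivalents:

    # run on the standby server
    db2 takeover hdr on database MALL
    # confirm the roles switched
    db2pd -db MALL -hadr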
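
For item 6, even when the client won’t do load testing, the package cache can point at expensive SQL. A sketch of the kind of query I mean, using MON_GET_PKG_CACHE_STMT (available in DB2 9.7 FP1 and later):

    db2 "select num_executions,
                total_cpu_time,
                substr(stmt_text, 1, 100) as stmt
         from table(mon_get_pkg_cache_stmt(null, null, null, -2)) as t
         order by total_cpu_time desc
         fetch first 10 rows only"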
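
For item 7, it’s usually just a quick look at the filesystems plus tablespace usage as the data loads progress (the paths are examples only):

    df -h /db2data /db2logs
    db2 "select substr(tbsp_name, 1, 20) as tbsp_name,
                tbsp_used_pages,
                tbsp_total_pages
         from table(mon_get_tablespace(null, -2)) as t
         order by tbsp_used_pages desc"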
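
For item 8, checking the current code level is a one-liner, and a quick look at IBM Fix Central tells me whether a newer fix pack has shipped since the build:

    db2level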
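
For item 9, the license check is equally quick, run on each server:

    db2licm -l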
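
For item 10, something along these lines captures the DDL as it stands at the code freeze (the output file name is just an example):

    db2look -d MALL -e -l -x -o pre_golive_ddl.sql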

So what am I missing?

Most of this is in my head and the head of one of our other DBAs – between the two of us, we handle most of the go-lives.

Ember Crooks

Ember is always curious and thrives on change. She has built internationally recognized expertise in IBM Db2, spent a year working with high-volume MySQL, and is now learning Snowflake. Ember shares both posts about her core skill sets and her journey learning Snowflake.

Ember lives in Denver and works from home.
