...

GOALS:
  1. Cause a Remote Backup from the Containers
  2. Bring that Remote Backup locally, as an added safeguard (Note: at times, people have accidentally removed the WindRiver AAF Test VM. Extra backups made reconstruction reasonably simple)
  3. Use Backup Data to update Identity and Initialized Cassandra Data files in the local "authz" file area
  4. Use filtered data from the backup to populate Initialization Data within AAF Containers, ready for check-in to the Repo
  5. Perform normal Checkin Procedures for authz

Steps 1-2 are accomplished with one local Shell Script, combined with Shell Scripts deployed on remote Directories and in Containers (see the script to trace each one).
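
To illustrate that layering, here is a minimal sketch of the container-side backup. The container name, table, and paths are placeholders, not the deployed values.

Code Block (bash): container-side backup (hypothetical sketch)
# Placeholders only -- the deployed scripts, container name, and paths differ.
# Inside the Cassandra container, export each authz table with cqlsh COPY:
docker exec aaf-cass cqlsh -e "COPY authz.locate TO '/opt/backup/dats/locate.dat'"
# ...repeated for the other authz tables (cred, ns, perm, role, user_role, etc.)

# On the remote VM, bundle the exports with today's "DAT Date":
cd /opt/backup && tar -czf "dat$(date +%Y%m%d).gz" dats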

IT SHOULD BE NOTED that backups can be run often, but that doesn't mean you HAVE to push the data to the REPO daily. That is why there are two scripts. Currently, we wait until ONAP Project leads ask for a push, or until the appropriate Milestone.

This is done via a shell script. Documented here is a template which the identified person can modify for their particular keys, but it requires:

...

D) This is run from a subdirectory that said user works in. This subdirectory should NOT be within a Repo Boundary. Here is a recommended structure, assuming Unix:

/home/me               fictitious home directory
/home/me/open          Directory where I keep all Open Source work, including ONAP
/home/me/open/authz    Created by 'git clone "https://gerrit.onap.org/r/aaf/authz"'
/home/me/open/...      Don't forget all the other AAF and ONAP projects, like 'cadi', 'doc', 'oom', etc.
/home/me/open/backup   The directory we'll work with in this effort...
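
A possible sequence to create this layout (the clone command comes from the table above; the rest is ordinary shell usage):

Code Block (bash): set up the working directories
# Create the non-repo working area and clone authz next to it
mkdir -p /home/me/open/backup
cd /home/me/open
git clone "https://gerrit.onap.org/r/aaf/authz"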


WARNING: the "DAT Date" mechanism is simple and doesn't adjust for global time zones. Just choose a time (morning, for example) when the WindRiver day is the same as your location's day.

...
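
As a rough guide, a local pull.sh might look like the following minimal sketch. The VM host, remote paths, and script names here are assumptions, not the real values (which require the keys and access described above).

Code Block (bash): pull.sh (hypothetical sketch)
#!/bin/sh
# Hypothetical sketch only -- host, paths, and script names are placeholders.
HOST="me@aaf-test-vm"        # the WindRiver AAF Test VM (assumed alias)
DATE=$(date +%Y%m%d)         # see the "DAT Date" warning above

# 1) Trigger the backup scripts deployed on the remote VM and Containers
ssh "$HOST" 'bash backup.sh'

# 2) Bring the backup and the latest identities file local
scp "$HOST:backup/dat$DATE.gz" .
scp "$HOST:backup/identities.dat" .

# 3) Un-tar the data files into ./dats, ready for updating the repo
tar -xzvf "dat$DATE.gz"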

Results: In the directory, after running, you should find the following (this sample was run on March 4, 2020; OBVIOUSLY, dates will vary):

  • a "tar gz" backup for today of Test Data
  • the latest "identities.dat" file
  • a subdirectory called "dats", which has the latest data file un-tarred, ready for updating repo.
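
To sanity-check the tarball without re-extracting it (the filename comes from the sample run below):

Code Block (bash): inspect the backup
# List the contents of the day's backup without extracting it
tar -tzf dat20200304.gz | head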


Code Block (bash): Run pull.sh
backup me$ bash pull.sh
Using 7 child processes

Starting copy of authz.locate with columns [name, hostname, port, latitude, longitude, major, minor, patch, pkg, port_key, protocol, subprotocol].
Processed: 16 rows; Rate:      92 rows/s; Avg. rate:      92 rows/s
16 rows exported to 1 files in 0.183 seconds.
Using 7 child processes
...
x dats/run_lock.dat
x dats/user_role.dat
x dats/x509.dat

backup me$ ls -altr
...
-rw-r--r--   1 jon  staff     533 Mar  4 08:09 pull.sh
-rw-r--r--   1 jon  staff  323591 Mar  4 08:06 dat20200304.gz
-rw-r--r--   1 jon  staff    4186 Mar  4 08:06 identities.dat
drwxr-xr-x  23 jon  staff     736 Mar  4 08:06 dats
...


Steps 3-4:

WHEN it is appropriate to update data in the Repo (when asked by another ONAP Project PTL, or when a Milestone arrives that needs a new Docker Image), the process of moving the "Golden Source" of aaf-test to the repo is as follows.

  • go to "authz"... "git status" should be updated, so you don't mix up commits.  
  • Note: This script uses "dos2unix", which is typically available to Unix.  This keeps DOSisms out, and allows for a cleaner check in.
  • Either Finish checking in, stash, whatever..  Just try to make commits of new aaf-test data CLEAN.
  • cd - (go back to "backup" directory)
Code Block (bash): toSample.sh
#!/bin/sh

# CORRECT THIS DIRECTORY to YOUR directory structure
DIR="/home/me/open/authz/auth/sample"

# Copy the latest identities file into the repo's sample data
cp identities.dat "$DIR/data/sample.identities.dat"

# Ensure a clean "dats" staging area inside the repo
cd "$DIR/cass_data"
mkdir -p dats
rm -f dats/*
cd - >/dev/null

pwd
# Sort each exported .dat file into the repo's staging area
FILES=$(ls dats/*.dat)
for D in $FILES; do
  sort "$D" > "$DIR/cass_data/$D"
done

# Scrub out data that shouldn't be in Initialization Data,
# then strip DOSisms for a clean check-in
cd "$DIR/cass_data"
bash scrub.sh
dos2unix *.dat
cd - >/dev/null

Code Block (bash): run toSample.sh
backup me$ bash toSample.sh
2020-09-04 08:34:13.000+0000
Create default Passwords for all Identities in cred.dat
Scrubbing user_roles not in ../data/sample.identities.dat
Removed IDs from user_roles
> aaf-authz@aaf.osaaf.org
> aaronh@people.osaaf.org
> af_admin@people.osaaf.org
> appc123@appc.onap.org
> dglfromatt@people.osaaf.org
> djtimoney@people.osaaf.org
> jimmy@people.osaaf.org
> kirank@people.osaaf.org
> m99501@dmaapBC.openecomp.org
> m99751@dmaapBC.openecomp.org
> ragarwal@people.osaaf.org
> richardt@people.osaaf.org
> ryan@appc.onap.org
> ryany@people.osaaf.org
> saratp@people.osaaf.org
> sunilu@people.osaaf.org
> xuegao@people.osaaf.org
dos2unix: converting file artifact.dat to Unix format...
dos2unix: converting file config.dat to Unix format...
dos2unix: converting file cred.dat to Unix format...
dos2unix: converting file ns.dat to Unix format...
dos2unix: converting file ns_attrib.dat to Unix format...
dos2unix: converting file perm.dat to Unix format...
dos2unix: converting file role.dat to Unix format...
dos2unix: converting file user_role.dat to Unix format...

backup me$ cd -
authz me$ git status
On branch master
Your branch is up to date with 'origin/master'.

Changes not staged for commit:
  (use "git add <file>..." to update what will be committed)
  (use "git checkout -- <file>..." to discard changes in working directory)

	modified:   auth/sample/cass_data/cred.dat
	modified:   auth/sample/cass_data/perm.dat
	modified:   auth/sample/cass_data/role.dat
	modified:   auth/sample/cass_data/user_role.dat
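
For reference, the user_role filtering shown in the output above can be approximated like this. This is a minimal sketch, assuming pipe-delimited identities and cqlsh CSV exports; the real scrub.sh in auth/sample/cass_data may differ.

Code Block (bash): scrub (hypothetical sketch)
#!/bin/bash
# Minimal sketch of the scrub idea; the real scrub.sh may differ.
IDS="../data/sample.identities.dat"

# identities.dat is pipe-delimited; the first field is the id
cut -d'|' -f1 "$IDS" | sort -u > /tmp/sample_ids

echo "Scrubbing user_roles not in $IDS"
echo "Removed IDs from user_roles"
# user_role.dat rows are CSV; column 1 looks like "user@domain".
# Keep rows whose user is a sample identity; report the rest.
touch user_role.keep
awk -F',' '
  NR==FNR { keep[$1]; next }      # first file: load the sample ids
  { split($1, a, "@")
    if (a[1] in keep) print > "user_role.keep"
    else              print "> " $1 }
' /tmp/sample_ids user_role.dat
mv user_role.keep user_role.dat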


Step 5:

You are ready to do normal AAF authz code check-in procedures with "git commit" (and "--amend") and "git review".
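
For example (a standard Gerrit flow; the commit message and file list are illustrative):

Code Block (bash): check in the updated sample data
cd /home/me/open/authz
git add auth/sample/data/sample.identities.dat auth/sample/cass_data/*.dat
git commit -s -m "Update aaf-test sample data"   # or amend: git commit --amend
git review                                       # push to Gerrit for review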


SUMMARY:

This process:

  • Created a Backup of aaf-test data remotely
  • Copied Backup locally
  • Filtered data that shouldn't be in Initialization data
  • Prepared this data in "authz" repo directory, ready for check in


Obviously, to be part of the containers, you have to follow ONAP Release procedures, which are not covered here.