...

The initial ONAP identities are fictitious, forming a euphemistic company hierarchy.  Obviously, a real company would want to replace this with their own.

identities.dat

This file is stored in the repository (starting from the "authz" clone; see above)


Code Block
languagebash
titleLocation of identities.dat
auth/sample/data/sample.identities.dat


When deployed, you will find the file "identities.dat" in the Docker directory "/opt/app/osaaf/data", which lives on a configured Docker Volume named "config".  (The Docker Volume keeps this file persistent, whether or not the App containers exist.)
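
To verify the deployed file, a quick check along these lines works (the container name "aaf-service" is just an example; use whichever AAF container is running):

Code Block
languagebash
titleSketch: verifying the deployed identities.dat
# "aaf-service" is an example container name; substitute your own
docker volume inspect config
docker exec -t aaf-service ls -l /opt/app/osaaf/data/identities.dat
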

Before each AAF Docker component is launched, it is preceded by the "aaf-config" init-container. Among other things, the aaf-config init-container:

  • checks whether the "identities.dat" file exists.
  • If not, it copies "/opt/app/aaf_config/data/sample.identities.dat" from the aaf-config Docker Image and places it at /opt/app/osaaf/data/identities.dat, which is on the persistent Docker Volume "config", as noted.

This ensures that identities.dat is seeded only when it doesn't exist, and DOESN'T overwrite work that Companies may be doing.  Keeping the sample in the "aaf-config" image avoids size implications in the other components.
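
The whole pattern boils down to a copy-if-missing check; here is a minimal sketch (illustrative only, not the actual init script):

Code Block
languagebash
titleSketch: aaf-config copy-if-missing check
#!/bin/bash
# Illustrative sketch; paths per the description above
SRC=/opt/app/aaf_config/data/sample.identities.dat
DST=/opt/app/osaaf/data/identities.dat

if [ ! -e "$DST" ]; then
  # First start: seed the persistent "config" volume from the image
  cp "$SRC" "$DST"
fi
# An existing file is left untouched, so Company edits survive restarts
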

* This "identities.dat" file is utilized exclusively by the "DefaultOrganization".  Companies are welcome and encouraged, if they wish, to create their own "Plugin that implements the 'Organization' interface", and connects to their own data how they please.  

** If this is too much work, they are free to update the "identities.dat" file from their own Organization information on a regular basis (see the cron sketch after these notes).

*** Companies should note that this mechanism was written for an ONAP member company whose nightly feed included more than 1.3 million records.  It handles that volume very efficiently, without synchronizing data.
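
For the regular-update approach in the note above, even a simple scheduled export suffices; the feed script here is a hypothetical placeholder:

Code Block
languagebash
titleSketch: hypothetical nightly identities refresh (crontab entry)
# Hypothetical: export_identities.sh stands in for a company's own identity feed
0 2 * * * /opt/company/bin/export_identities.sh > /opt/app/osaaf/data/identities.dat
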

Cassandra Data:

Cassandra data, which is interrelated with the above "Identities" and with itself, also needs enough records to be functional with all the required ONAP apps.

In the repo, these files (ending in ".dat") are found at

Code Block
languagebash
titleCassandra Initial Data
auth/sample/cass_data

Included here are the ESSENTIAL data files for Cassandra (other Cassandra data files are fine to be empty at the beginning, e.g. the history data).
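
For reference, these are the same files scrubbed later on this page (see the dos2unix output under "run toSample.sh"):

Code Block
languagebash
titleEssential cass_data files
ls auth/sample/cass_data/*.dat
# artifact.dat   config.dat   cred.dat   ns.dat
# ns_attrib.dat  perm.dat     role.dat   user_role.dat
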

Similar to the identity file, Cassandra keeps its INIT files in a Docker image; but unlike the AAF components, which use the separate aaf-config image, Cassandra simply keeps them in the aaf-cass Docker image itself.

IF the container, when starting, finds that the Cassandra tables are not defined, THEN the container will initialize Cassandra from the "cql" files located in the image at "/opt/app/aaf/cass_init" (copied from the repo directory auth/auth-cass/cass_init).  The container executes "keyspace.cql", "init.cql", "temp_identity.cql" and "onap.cql" to set up the initial tables.

IF the container finds that the Cassandra data is uninitialized, THEN it will load the data from the Docker image directory "/opt/app/aaf/cass_init/".  It does NOT initialize if data already exists, so as not to overwrite running systems.

Please see "auth/auth-cass/cass_init/cmd.sh" in the repo to see how this is accomplished.
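
The shape of that logic is roughly as follows; this is a sketch only (it assumes the keyspace is named "authz"), not the actual cmd.sh:

Code Block
languagebash
titleSketch: check-then-init pattern for aaf-cass
#!/bin/bash
# Sketch only; see cmd.sh in the repo for the real logic
CASS_INIT=/opt/app/aaf/cass_init

# Assumption: an existing "authz" keyspace means Cassandra is already set up
if ! cqlsh -e "DESCRIBE KEYSPACE authz" >/dev/null 2>&1; then
  # Tables not defined: create keyspace and tables, then load initial data
  cqlsh -f "$CASS_INIT/keyspace.cql"
  cqlsh -f "$CASS_INIT/init.cql"
  cqlsh -f "$CASS_INIT/temp_identity.cql"
  cqlsh -f "$CASS_INIT/onap.cql"
else
  echo "Cassandra already initialized; leaving existing data untouched"
fi
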

Relationship to various ONAP Test environments:

ONAP

...

Testing typically starts off "from scratch", meaning there is no existing implementation or configuration; in some cases, this happens every day.  This validates that all of ONAP can start, as a System, "from scratch", and gives a clean environment for various tests.

AAF utilizes the above configuration mechanisms to ensure that each brand-new ONAP environment starts off with existing data.  This includes NEW ONAP components, documented in "sample.identities.dat", as well as the Permissions, Roles, etc. that the Apps create within the AAF Test system.

As described above, all the Identities and Data are pushed into Containers (init or otherwise), which can start up and configure when the data does not exist, or avoid overwriting it when it does.

Processes to copy data from the AAF Test system in WindRiver into ONAP Docker Images, when required for Testing or other purposes

The expectation is that this process 

  • Is done from the development machine of ONAP AAF PTL (or other responsible party within ONAP)
  • This person already works with the ONAP "aaf" project and has the "authz" git repo set up and ready.
  • This person has VPN access to WindRiver and a key to the AAF WindRiver Project's K8s environment, specifically the "aaf-test" instance running there (which should never be removed)
GOALS:
  1. Cause a Remote Backup from the Containers
  2. Bring that Remote Backup locally, as added safeguard (Note: at times, people have removed the WindRiver AAF Test VM accidentally.  Extra backups made reconstruction reasonably simple)
  3. Use filtered data from backup to populate Initialization Data within AAF Containers, ready for check-in to Repo
  4. Use Backup Data to update Identity and Initialized Cassandra Data files into local "authz" file area
  5. Perform normal Checkin Procedures for authz
Steps 1-2 are accomplished with one local shell script, combined with shell scripts deployed in remote directories and in containers (see the script to trace each one)

IT SHOULD BE NOTED that backups can be run often, but that doesn't mean you HAVE to push the data to the REPO daily.  That is why there are two scripts.  Currently, we wait until ONAP Project leads ask for the data to be pushed, or until the appropriate Milestone.

This is done via a shell script.  Documented here is a template which the identified person can modify for their particular keys; it requires:

A) VPN Tunnel access to WindRiver, turned on

B) Private Key setup for WindRiver VM.  This is typically stored locally in "pem" format.  We'll call it "aaf-onap.pem" for the following template.

C) Your "/etc/hosts" file has the entry "10.12.5.145 aaf-onap-test.osaaf.org" (or whatever the current IP of the ONAP aaf-test VM is)

D) This is being run from a subdirectory owned by said user.  This subdirectory should NOT be within a Repo boundary.  Here is a recommendation for this structure, assuming Unix (a combined setup sketch follows the layout):

/home/me                  fictitious home directory
/home/me/open             Directory where I keep all Open Source work, including ONAP
/home/me/open/authz       Created by 'git clone "https://gerrit.onap.org/r/aaf/authz"'
/home/me/open/...         Don't forget all the other AAF and ONAP projects, like 'cadi', 'doc', 'oom', etc.
/home/me/open/backup      The directory we'll work with in this effort...
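
Putting requirements B) through D) together, the one-time setup looks roughly like this (paths and the key location are examples, not prescriptions):

Code Block
languagebash
titleSketch: one-time setup
# Example paths; adjust "me" and the key directory to your own
chmod 600 /home/me/keys/aaf-onap.pem       # B) protect the WindRiver key
grep aaf-onap-test /etc/hosts              # C) confirm the hosts entry exists
mkdir -p /home/me/open/backup              # D) working dir outside any repo
cd /home/me/open
git clone "https://gerrit.onap.org/r/aaf/authz"
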


WARNING:  the "DAT Date" mechanism is simple, and doesn't adjust for global time zones.  Just choose a time (morning, for example) when WindRiver's day is the same as your location's day.
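
If the days do differ, pin the date explicitly instead of relying on $(date); the commented line in pull.sh below shows the hardcoded form, and forcing UTC is another option if the VM's day follows UTC (an assumption to verify):

Code Block
languagebash
titleSketch: pinning the DAT date
# Option 1: hardcode the date (mirrors the commented line in pull.sh)
DAT_TODAY="dat20190222.gz"
# Option 2: compute in UTC -- ONLY if the WindRiver VM's day follows UTC
DAT_TODAY="dat$(TZ=UTC date +%Y%m%d).gz"
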

Code Block
languagebash
titlepull.sh
#!/bin/bash

# NOTE: You MUST change this to point to YOUR personal location for the PEM-based ONAP VM Key
PEM=/home/me/<directory for ONAP keys>/aaf-onap.pem

# Run the remote backup scripts (Cassandra export, then identity pull)
ssh -i $PEM ubuntu@aaf-onap-test.osaaf.org -t 'cd ~/authz/auth/auth-cass/docker;bash backup.sh'
ssh -i $PEM ubuntu@aaf-onap-test.osaaf.org -t 'cd ~/authz/auth/sample/data; bash pull.sh'

# Today's backup name; hardcode (see commented line) if days differ
#DAT_TODAY="dat20190222.gz"
DAT_TODAY="dat$(date +%Y%m%d).gz"

# Copy the backup and the latest identities.dat locally
scp -i $PEM ubuntu@aaf-onap-test.osaaf.org:~/authz/auth/auth-cass/docker/$DAT_TODAY .
scp -i $PEM ubuntu@aaf-onap-test.osaaf.org:~/authz/auth/sample/data/identities.dat .

# Clear any previous extraction before un-tarring into "dats/"
if [ -e dats ]; then
  rm dats/*
fi

tar -xvf $DAT_TODAY


Results:  In the directory, after running, you should find the following (this sample was run on March 4, 2020; OBVIOUSLY, dates will vary):

  • a "tar gz" backup for today of Test Data
  • the latest "identities.dat" file
  • a subdirectory called "dats", which has the latest data files un-tarred, ready for updating the repo.


Code Block
languagebash
titleRun pull.sh
backup me$ bash pull.sh
Using 7 child processes

Starting copy of authz.locate with columns [name, hostname, port, latitude, longitude, major, minor, patch, pkg, port_key, protocol, subprotocol].
Processed: 16 rows; Rate:      92 rows/s; Avg. rate:      92 rows/s
16 rows exported to 1 files in 0.183 seconds.
Using 7 child processes
...
x dats/run_lock.dat
x dats/user_role.dat
x dats/x509.dat

backup me$ ls -altr
...
-rw-r--r--   1 jon  staff     533 Mar  4 08:09 pull.sh
-rw-r--r--   1 jon  staff  323591 Mar  4 08:06 dat20200304.gz
-rw-r--r--   1 jon  staff    4186 Mar  4 08:06 identities.dat
drwxr-xr-x  23 jon  staff     736 Mar  4 08:06 dats
...


Steps 3-4:

WHEN it is appropriate to update data in the Repo (when asked by another ONAP Project PTL, or when a Milestone arrives that needs a new Docker Image), the process of moving the "Golden Source" of aaf-test to the repo is as follows.

  • go to "authz"... "git status" should be updated, so you don't mix up commits.  
  • Note: This script uses "dos2unix", which is typically available to Unix.  This keeps DOSisms out, and allows for a cleaner check in.
  • Either Finish checking in, stash, whatever..  Just try to make commits of new aaf-test data CLEAN.
  • cd - (go back to "backup" directory)
Code Block
languagebash
titletoSample.sh
#!/bin/sh

# CORRECT THIS DIRECTORY to YOUR directory structure
DIR="/home/me/open/authz/auth/sample"

# Install the freshly pulled identities as the repo's sample file
cp identities.dat $DIR/data/sample.identities.dat

# Ensure a clean "dats" staging area under cass_data
cd "$DIR/cass_data"
mkdir -p dats
rm -f dats/*
cd - >/dev/null

pwd
# Sort each exported .dat file into the repo's staging area
FILES=$(ls dats/*.dat)
for D in $FILES; do
  sort "$D" > "$DIR/cass_data/$D"
done

# Scrub test-only entries, then normalize line endings for a clean check-in
cd "$DIR/cass_data"
bash scrub.sh
dos2unix *.dat
cd - >/dev/null

Code Block
languagebash
titlerun toSample.sh
backup me$ bash toSample.sh
2020-09-04 08:34:13.000+0000
Create default Passwords for all Identities in cred.dat
Scrubbing user_roles not in ../data/sample.identities.dat
Removed IDs from user_roles
> aaf-authz@aaf.osaaf.org
> aaronh@people.osaaf.org
> af_admin@people.osaaf.org
> appc123@appc.onap.org
> dglfromatt@people.osaaf.org
> djtimoney@people.osaaf.org
> jimmy@people.osaaf.org
> kirank@people.osaaf.org
> m99501@dmaapBC.openecomp.org
> m99751@dmaapBC.openecomp.org
> ragarwal@people.osaaf.org
> richardt@people.osaaf.org
> ryan@appc.onap.org
> ryany@people.osaaf.org
> saratp@people.osaaf.org
> sunilu@people.osaaf.org
> xuegao@people.osaaf.org
dos2unix: converting file artifact.dat to Unix format...
dos2unix: converting file config.dat to Unix format...
dos2unix: converting file cred.dat to Unix format...
dos2unix: converting file ns.dat to Unix format...
dos2unix: converting file ns_attrib.dat to Unix format...
dos2unix: converting file perm.dat to Unix format...
dos2unix: converting file role.dat to Unix format...
dos2unix: converting file user_role.dat to Unix format...

backup me$ cd -
authz me$ git status
On branch master
Your branch is up to date with 'origin/master'.

Changes not staged for commit:
  (use "git add <file>..." to update what will be committed)
  (use "git checkout -- <file>..." to discard changes in working directory)

	modified:   auth/sample/cass_data/cred.dat
	modified:   auth/sample/cass_data/perm.dat
	modified:   auth/sample/cass_data/role.dat
	modified:   auth/sample/cass_data/user_role.dat


Step 5:

You are ready to do the normal AAF authz code check-in procedures with "git commit" (and --amend), and "git review".
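
For example, a typical Gerrit-style check-in of these files might look like this (the commit message is illustrative):

Code Block
languagebash
titleSketch: typical check-in
cd /home/me/open/authz
git add auth/sample/data/sample.identities.dat auth/sample/cass_data/*.dat
git commit -s -m "Update sample identities and initial Cassandra data from aaf-test"
# ...or fold into an existing change with: git commit --amend
git review
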


SUMMARY:

This process

  • Created a Backup of aaf-test data remotely
  • Copied the Backup locally
  • Filtered out data that shouldn't be in the Initialization data
  • Prepared this data in the "authz" repo directory, ready for check-in


Obviously, for this data to become part of the containers, you have to follow the ONAP Release procedures, which are not covered here.