## Requirements

- Docker
- An IAM role with the `SageMakerFullAccess` and `AmazonS3FullAccess` policies attached
- IAM user credentials with the `SageMakerFullAccess`, `AWSCloudFormationFullAccess`, `IAMFullAccess`, `AmazonEC2FullAccess`, and `AmazonS3FullAccess` permissions
## Creating S3 buckets with datasets

Change the bucket name in the `s3_sample_data_creator.py` script and run it to create an S3 bucket with the sample MNIST dataset in the region where you want to run the tests.
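As a rough illustration of what such a script does, the following is a minimal `boto3` sketch, not the actual `s3_sample_data_creator.py`: the key prefix, file name, and all argument values shown are placeholder assumptions.

```python
def mnist_s3_key(prefix: str, filename: str) -> str:
    # Join a key prefix and a file name into a single S3 object key.
    return f"{prefix.rstrip('/')}/{filename}"


def create_bucket_and_upload(bucket_name: str, region: str, local_path: str) -> None:
    import boto3  # deferred import; only needed when actually talking to AWS

    s3 = boto3.client("s3", region_name=region)
    if region == "us-east-1":
        # us-east-1 rejects an explicit LocationConstraint.
        s3.create_bucket(Bucket=bucket_name)
    else:
        s3.create_bucket(
            Bucket=bucket_name,
            CreateBucketConfiguration={"LocationConstraint": region},
        )
    # Placeholder prefix and file name; the real script defines its own layout.
    s3.upload_file(
        local_path,
        bucket_name,
        mnist_s3_key("mnist_kmeans_example/data", "mnist.pkl.gz"),
    )
```

Calling `create_bucket_and_upload("my-test-bucket", "us-west-2", "mnist.pkl.gz")` would then create the bucket and upload the dataset file under the chosen key.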
## Steps to run integration tests

- Copy the `.env.example` file to `.env` and, in the following steps, modify the fields of this new file:
  - Configure the AWS credentials fields with those of your IAM user.
  - Update `SAGEMAKER_EXECUTION_ROLE_ARN` with the ARN of the role created earlier.
  - Update the `S3_DATA_BUCKET` parameter with the name of the bucket created earlier.
  - (Optional) If you have already created an EKS cluster for testing, replace the `EKS_EXISTING_CLUSTER` field with its name.
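After these edits, a `.env` file might look something like the sketch below. Only the variables named above are taken from this document; the credential field names and all values are placeholder assumptions, so check `.env.example` for the exact fields.

```
# Assumed credential field names; replace values with your IAM user's keys.
AWS_ACCESS_KEY_ID=AKIA...
AWS_SECRET_ACCESS_KEY=...
AWS_DEFAULT_REGION=us-west-2

SAGEMAKER_EXECUTION_ROLE_ARN=arn:aws:iam::123456789012:role/sagemaker-test-role
S3_DATA_BUCKET=my-mnist-test-bucket

# Optional: reuse an existing EKS cluster instead of creating one.
EKS_EXISTING_CLUSTER=my-test-cluster
```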
- Build the image by doing the following:
  - Navigate to the `components/aws` directory.
  - Run `docker build . -f sagemaker/tests/integration_tests/Dockerfile -t amazon/integration_test`
- Run the image, injecting your environment variable file:
  - Navigate to the `components/aws` directory.
  - Run `docker run --env-file sagemaker/tests/integration_tests/.env amazon/integration_test`