## Requirements

- Docker
- An IAM Role with SageMakerFullAccess, RoboMakerFullAccess, and AmazonS3FullAccess permissions
- IAM User credentials with SageMakerFullAccess, RoboMakerFullAccess, AWSCloudFormationFullAccess, IAMFullAccess, AmazonEC2FullAccess, and AmazonS3FullAccess permissions
- The SageMaker WorkTeam and GroundTruth Component tests expect that at least one private workteam already exists in the region where you are running these tests (see the check below).
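If you are unsure whether a private workteam already exists in your target region, a quick CLI check such as the following can confirm it (the region shown is only an example):

```
# List the private workteams in the region where the tests will run;
# the output should contain at least one entry.
aws sagemaker list-workteams --region us-east-1
```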
## Creating S3 buckets with datasets

- In the `s3_sample_data_creator.py` Python script, change the bucket name, then run the script to create an S3 bucket with the sample MNIST dataset in the region where you want to run the tests.
- To prepare the dataset for the SageMaker GroundTruth Component test, follow the steps in the GroundTruth Sample README.
- To prepare the processing script for the SageMaker Processing Component tests, upload the `scripts/kmeans_preprocessing.py` script to your bucket. This can be done by replacing `<my-bucket>` with your bucket name and running `aws s3 cp scripts/kmeans_preprocessing.py s3://<my-bucket>/mnist_kmeans_example/processing_code/kmeans_preprocessing.py`.
- Prepare RoboMaker Simulation App sources and Robot App sources and place them in the data bucket under the `/robomaker` key. The easiest way to create the files you need is to copy them from the public buckets that are used to store the RoboMaker Hello World demos:

  ```
  aws s3 cp s3://aws-robomaker-samples-us-east-1-1fd12c306611/hello-world/melodic/gazebo9/1.4.0.62/1.2.0/simulation_ws.tar .
  aws s3 cp ./simulation_ws.tar s3://<your_bucket_name>/robomaker/simulation_ws.tar

  aws s3 cp s3://aws-robomaker-samples-us-east-1-1fd12c306611/hello-world/melodic/gazebo9/1.4.0.62/1.2.0/robot_ws.tar .
  aws s3 cp ./robot_ws.tar s3://<your_bucket_name>/robomaker/robot_ws.tar
  ```

  The files in the `/robomaker` directory on S3 should follow this pattern:

  ```
  /robomaker/simulation_ws.tar
  /robomaker/robot_ws.tar
  ```
- Prepare RLEstimator sources and place them in the data bucket under the `/rlestimator` key. The easiest way to create the files you need is to follow the notebooks outlined in the RLEstimator Samples README. The files in the `/rlestimator` directory on S3 should follow this pattern:

  ```
  /rlestimator/sourcedir.tar.gz
  ```
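Once the data bucket is populated, a quick listing can confirm that its layout matches what the tests expect (a sketch only; replace `<my-bucket>` with your bucket name):

```
# Verify the uploaded artifacts are where the tests expect them.
aws s3 ls s3://<my-bucket>/mnist_kmeans_example/processing_code/
aws s3 ls s3://<my-bucket>/robomaker/
aws s3 ls s3://<my-bucket>/rlestimator/
```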
## Steps to run integration tests

- Copy the `.env.example` file to `.env` and, in the following steps, modify the fields of this new file (a filled-in example is sketched after this list):
  - Configure the AWS credentials fields with those of your IAM User.
  - Update the `SAGEMAKER_EXECUTION_ROLE_ARN` field with the ARN of the role created earlier.
  - Update the `S3_DATA_BUCKET` field with the name of the bucket created earlier.
  - (Optional) If you have already created an EKS cluster for testing, replace the `EKS_EXISTING_CLUSTER` field with its name.
- Build the image:
  - Navigate to the root of this github directory.
  - Run `docker build . -f components/aws/sagemaker/tests/integration_tests/Dockerfile -t amazon/integration_test`
- Run the image, injecting your environment variable file and mounting the repo files into the container:
  - Run `docker run -v <path_to_this_repo_on_your_machine>:/pipelines --env-file components/aws/sagemaker/tests/integration_tests/.env amazon/integration_test`
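For reference, a filled-in `.env` might look roughly like the following. This is only a sketch with placeholder values; start from `.env.example` and keep its exact field names, including the AWS credential fields for your IAM User.

```
# Placeholder values only -- copy .env.example and keep its exact field names,
# including the AWS credential fields for your IAM User.
SAGEMAKER_EXECUTION_ROLE_ARN=arn:aws:iam::111122223333:role/my-sagemaker-test-role
S3_DATA_BUCKET=my-integration-test-bucket
# Optional: only if you already have an EKS test cluster.
EKS_EXISTING_CLUSTER=my-eks-cluster
```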