README.md

Requirements

  1. Docker
  2. IAM Role with SageMakerFullAccess, RoboMakerFullAccess, and AmazonS3FullAccess permissions
  3. IAM User credentials with SageMakerFullAccess, RoboMakerFullAccess, AWSCloudFormationFullAccess, IAMFullAccess, AmazonEC2FullAccess, AmazonS3FullAccess permissions
  4. The SageMaker WorkTeam and GroundTruth Component tests expect that at least one private workteam already exists in the region where you are running these tests.
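The role in requirement 2 can be created with the AWS CLI. The sketch below is illustrative: the role name is a placeholder, and the managed policy names (in particular the RoboMaker one, AWSRoboMaker_FullAccess) should be checked against your account. The run helper only echoes each command; remove the echo to execute for real.

```shell
# Placeholder role name -- choose your own.
ROLE_NAME=kfp-sagemaker-test-role

# Trust policy letting SageMaker and RoboMaker assume the role.
cat > trust.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [{
    "Effect": "Allow",
    "Principal": {"Service": ["sagemaker.amazonaws.com", "robomaker.amazonaws.com"]},
    "Action": "sts:AssumeRole"
  }]
}
EOF

# Echo each command instead of running it; delete the echo to execute for real.
run() { echo "+ $*"; }

run aws iam create-role --role-name "$ROLE_NAME" \
  --assume-role-policy-document file://trust.json
for policy in AmazonSageMakerFullAccess AWSRoboMaker_FullAccess AmazonS3FullAccess; do
  run aws iam attach-role-policy --role-name "$ROLE_NAME" \
    --policy-arn "arn:aws:iam::aws:policy/$policy"
done
```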

Creating S3 buckets with datasets

  1. Change the bucket name in s3_sample_data_creator.py and run the script to create an S3 bucket with the sample MNIST dataset in the region where you want to run the tests.
  2. To prepare the dataset for the SageMaker GroundTruth Component test, follow the steps in the GroundTruth Sample README.
  3. To prepare the processing script for the SageMaker Processing Component tests, upload the scripts/kmeans_preprocessing.py script to your bucket. This can be done by replacing <my-bucket> with your bucket name and running aws s3 cp scripts/kmeans_preprocessing.py s3://<my-bucket>/mnist_kmeans_example/processing_code/kmeans_preprocessing.py
  4. Prepare RoboMaker Simulation App sources and Robot App sources and place them in the data bucket under the /robomaker key. The easiest way to create the files you need is to copy them from the public buckets that are used to store the RoboMaker Hello World demos:
    aws s3 cp s3://aws-robomaker-samples-us-east-1-1fd12c306611/hello-world/melodic/gazebo9/1.4.0.62/1.2.0/simulation_ws.tar .
    aws s3 cp ./simulation_ws.tar s3://<your_bucket_name>/robomaker/simulation_ws.tar
    aws s3 cp s3://aws-robomaker-samples-us-east-1-1fd12c306611/hello-world/melodic/gazebo9/1.4.0.62/1.2.0/robot_ws.tar .
    aws s3 cp ./robot_ws.tar s3://<your_bucket_name>/robomaker/robot_ws.tar
    
    The files in the /robomaker directory on S3 should follow this pattern:
    /robomaker/simulation_ws.tar
    /robomaker/robot_ws.tar
    
  5. Prepare RLEstimator sources and place them in the data bucket under the /rlestimator key. The easiest way to create the files you need is to follow the notebooks outlined in the RLEstimator Samples README. The files in the /rlestimator directory on S3 should follow this pattern:
    /rlestimator/sourcedir.tar.gz
    

Steps to run the integration tests

  1. Copy the .env.example file to .env and modify the fields of this new file as follows:
    1. Configure the AWS credentials fields with those of your IAM User.
    2. Update SAGEMAKER_EXECUTION_ROLE_ARN with the ARN of the role created earlier.
    3. Update the S3_DATA_BUCKET parameter with the name of the bucket created earlier.
    4. (Optional) If you have already created an EKS cluster for testing, set the EKS_EXISTING_CLUSTER field to its name.
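    A filled-in .env might look like the sketch below. Every value is a placeholder, and the exact field names should be taken from .env.example:

```shell
# Hypothetical values -- copy the real field names from .env.example.
AWS_ACCESS_KEY_ID=AKIAEXAMPLEKEY
AWS_SECRET_ACCESS_KEY=example-secret-key
AWS_DEFAULT_REGION=us-west-2
SAGEMAKER_EXECUTION_ROLE_ARN=arn:aws:iam::123456789012:role/kfp-sagemaker-test-role
S3_DATA_BUCKET=my-kfp-test-bucket
# EKS_EXISTING_CLUSTER=my-eks-cluster   # optional: reuse an existing cluster
```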
  2. Build the image by doing the following:
    1. Navigate to the root of this GitHub repository.
    2. Run docker build . -f components/aws/sagemaker/tests/integration_tests/Dockerfile -t amazon/integration_test
  3. Run the image, injecting your environment variable files and mounting the repo files into the container:
    1. Run docker run -v <path_to_this_repo_on_your_machine>:/pipelines --env-file components/aws/sagemaker/tests/integration_tests/.env amazon/integration_test
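Steps 2 and 3 together, as a sketch. The repo path is a placeholder, and the run helper echoes each docker command instead of executing it; remove the echo to run for real:

```shell
REPO=/path/to/pipelines   # placeholder: local clone of this repository
TESTS=components/aws/sagemaker/tests/integration_tests
IMAGE=amazon/integration_test

run() { echo "+ $*"; }    # echo instead of execute

# Build the test image, then run it with the .env file injected
# and the repo mounted into the container.
run docker build . -f "$TESTS/Dockerfile" -t "$IMAGE"
run docker run -v "$REPO:/pipelines" --env-file "$TESTS/.env" "$IMAGE"
```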