* SDK - Reduce python component limitations - no import errors for custom type annotations
By default, `create_component_from_func` copies the source code of the function and creates a component using that source code. No global imports are captured. This is problematic for the function definition, since any annotation that uses a type that needs to be imported will cause an error. There were some special provisions for
NamedTuple, InputPath and OutputPath, but even they were brittle (for example, `typing.NamedTuple` or `components.InputPath` annotations still caused failures at runtime).
This commit fixes the issue by stripping the type annotations from function declarations.
Fixes cases that were failing before:
```python
import typing
import collections

MyFuncOutputs = typing.NamedTuple('Outputs', [('sum', int), ('product', int)])

@create_component_from_func
def my_func(
    param1: CustomType,              # This caused failure previously
    param2: collections.OrderedDict, # This caused failure previously
) -> MyFuncOutputs:                  # This caused failure previously
    pass
```
* Fixed the compiler tests
* Fixed crashes on print function
The code `print(line, end="")` was causing the error: "lib2to3.pgen2.parse.ParseError: bad input: type=22, value='=', context=('', (2, 15))"
* Using the strip_hints library to strip the annotations
* Updating test workflow yamls
* Workaround for bug in untokenize
* Switched to the new strip_string_to_string method
* Fixed typo.
Co-authored-by: Jiaxiao Zheng <jxzheng@google.com>
Added the `create_component_from_func` function as an alias for `func_to_container_op`.
It behaves exactly the same, but the name now does not imply that you'll always get `ContainerOp` from it.
Some function parameters are not added at this moment as they're not widely used and might be deprecated in the future.
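A minimal usage sketch, assuming the new name behaves exactly like `func_to_container_op` with default parameters (the `add` function is just an illustration):
```python
from kfp.components import create_component_from_func

def add(a: float, b: float) -> float:
    '''Returns the sum of two numbers.'''
    return a + b

# Behaves exactly like func_to_container_op; only the name differs.
add_op = create_component_from_func(add)
```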
* SDK - Components refactoring
This change is a pure refactoring of the implementation of component task creation.
For pipelines compiled using the DSL compiler (the compile() function or the command-line program) nothing should change.
The main goal of the refactoring is to change the way the component instantiation can be customized.
Previously, the flow was like this:
`ComponentSpec` + arguments --> `TaskSpec` --resolving+transform--> `ContainerOp`
This PR changes it to more direct path:
`ComponentSpec` + arguments --constructor--> `ContainerOp`
or
`ComponentSpec` + arguments --constructor--> `TaskSpec`
or
`ComponentSpec` + arguments --constructor--> `SomeCustomTask`
The original approach, where the flow always passes through `TaskSpec`, had some issues, since `TaskSpec` only accepts string arguments (and two
other reference classes). This made it harder to handle custom types of arguments like `PipelineParam` or `Channel`.
Low-level refactoring changes:
Resolving of command-line argument placeholders has been extracted into a function usable by different task constructors.
Changed `_components._created_task_transformation_handler` to `_components._container_task_constructor`. Previously, the handler was receiving a `TaskSpec` instance. Now it receives `ComponentSpec` + arguments [+ `ComponentReference`].
Moved the `ContainerOp` construction handler setup to the `kfp.dsl.Pipeline` context class as planned.
Extracted `TaskSpec` creation to `_components._create_task_spec_from_component_and_arguments`.
Refactored `_dsl_bridge.create_container_op_from_task` to `_components._resolve_command_line_and_paths` which returns `_ResolvedCommandLineAndPaths`.
Renamed `_dsl_bridge._create_container_op_from_resolved_task` to `_dsl_bridge._create_container_op_from_component_and_arguments`.
The signature of `_components._resolve_graph_task` was changed and it now returns `_ResolvedGraphTask` instead of modified `TaskSpec`.
Some of the component tests still expect ContainerOp and its attributes.
These tests will be changed later.
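A rough sketch of what a custom task constructor could look like under the new flow. The handler attribute name comes from this change; the custom task class, the registration pattern and the exact handler signature are illustrative assumptions:
```python
from kfp.components import _components

# Hypothetical task class used only for illustration.
class SomeCustomTask:
    def __init__(self, component_spec, arguments, component_ref=None):
        self.component_spec = component_spec
        self.arguments = arguments
        self.component_ref = component_ref

def _create_custom_task(component_spec, arguments, component_ref=None):
    # The constructor now receives ComponentSpec + arguments (+ ComponentReference)
    # instead of an already-built TaskSpec.
    return SomeCustomTask(component_spec, arguments, component_ref)

# Assumed registration point; for ContainerOp this is set up by the kfp.dsl.Pipeline context.
_components._container_task_constructor = _create_custom_task
```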
* Adapted the _python_op tests
* Fixed linter failure
I do not want to add any top-level kfp imports in this file to prevent circular references.
* Added docstrings
* Fixed the return type forward reference
* Replaced `_instance_to_dict(obj)` with `obj.to_dict()`
* Fixed the capitalization in _python_function_name_to_component_name
It now only changes the case of the first letter.
* Replaced the _extract_component_metadata function with _extract_component_interface
* Stopped adding newline to the component description.
* Handling None inputs and outputs
* Not including empty inputs and outputs in component spec
* Renamed the private attributes that the @pipeline decorator sets
* Changed _extract_pipeline_metadata to use _extract_component_interface
* Fixed issues based on feedback
* SDK/Components - Added JsonSchema spec for the component format
Added the schema outline
* Fixed missing "required"
* Replaced PrimitiveTypes with just string
* Renamed CommandlineArgumentType to StringOrPlaceholder
Made more ContainerSpec properties support placeholders
* Removed support for type inheritance and generic types as requested by Ning and Ajay
* Some people are scared of graphs/pipelines - removing them
* Some people do not like optional inputs and conditionals - removing them
Sorry, Yasser and Bradley.
* Some people might be scared of predicates or conditional execution - removing them
Sure, Argo and the DSL support it, but some people care more about the spec file size, even when that means dropping already supported features.
Sorry, Bradley and Riley.
* Reverting the last 4 commits
Making those big compromises did not have any noticeable effect on people asking for them.
* Removed list-style type specifications
We've standardized on the map-style specification.
* Renamed TypeType to TypeSpecType
* Updated the structure of graphInput
* Added the type attribute to taskOutput and graphInput
* Updated the execution options structure
* Using the official Kubernetes PodSpec schema instead of Argo's subset
* SDK - Components - Fixed YAML formatting for some components
This fixes formatting for components where the function does not have a return annotation.
The low-level cause of the issue: trailing whitespace when there are no serializers.
Trailing whitespace triggers ugly YAML string formatting.
* Addressed feedback
This makes the graph input references consistent with task output references.
This is a breaking change, but the graph components are not exposed in the documentation or samples yet.
* SDK - Python components - Fixed handling multiline decorators
* Switched to using dedent
* Added error checking
* Testing multiline decorator
* Test calling the component created from decorated function
Also fixed `helper_test_component_against_func_using_local_call`.
This makes the generated files more readable.
The attributes were properly ordered before, but the ordering broke when the `.to_dict` methods started outputting `dict` instead of `OrderedDict`.
Also fixed the existing generated `component.yaml` files.
This part of the spec was unused, so this is not a breaking change.
Consolidating Kubernetes-related options under a single attribute: `TaskSpec.execution_options.kubernetes_options`.
`TaskSpec.k8s_container_options` -> `TaskSpec.execution_options.kubernetes_options.main_container`
`TaskSpec.k8s_pod_options.spec` -> `TaskSpec.execution_options.kubernetes_options.pod_spec`
Added the `TaskSpec.execution_options.retry_strategy.max_retries` attribute.
* SDK/Components - Creating graph components from python pipeline function
`create_graph_component_from_pipeline_func` converts a python pipeline function to a graph component object that can be saved, shared, composed or submitted for execution.
Example:
```python
producer_op = load_component(component_with_0_inputs_and_2_outputs)
processor_op = load_component(component_with_2_inputs_and_2_outputs)

def pipeline1(pipeline_param_1: int):
    producer_task = producer_op()
    processor_task = processor_op(pipeline_param_1, producer_task.outputs['Output 2'])
    return OrderedDict([
        ('Pipeline output 1', producer_task.outputs['Output 1']),
        ('Pipeline output 2', processor_task.outputs['Output 2']),
    ])

graph_component = create_graph_component_from_pipeline_func(pipeline1)
```
* Changed the signatures of exported functions
The non-public create_graph_component_spec_from_pipeline_func creates a ComponentSpec.
The public create_graph_component_from_pipeline_func creates a component and writes it to a file.
* Switched to using _extract_component_interface to analyze function signature
Stopped humanizing the input names for now. I think it's beneficial to extract the interface from the function signature the same way for both container and graph python components.
* Support outputs declared using pipeline function's return annotation
* Cleaned up the test
* Stop including the whole parent tasks in task output references
* By default, do not include task component specs in the graph component
Remove the component spec from the component reference unless that would make the reference empty or unless explicitly asked by the user.
* Exported the create_graph_component_from_pipeline_func function
* Fixed imports
* Updated the copyright year.
* SDK - Components - Verify the object type when serializing primitive arguments
Fixes an issue where, if an input had a primitive type (e.g. `Integer`), you could pass anything to it (e.g. booleans, `ContainerOp`s, functions etc.), because it just used `str` as the serializer. Now the serializers check the value type and raise an error if the type is incorrect.
* Allow serializing integer when float is required.
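A minimal sketch of the stricter serialization (the function name and error text are illustrative; the actual SDK serializers may differ):
```python
def _serialize_float(value) -> str:
    # bool is a subclass of int, so exclude it explicitly.
    if isinstance(value, bool) or not isinstance(value, (int, float)):
        raise TypeError('Value "{}" cannot be serialized as Float.'.format(value))
    # Integers are accepted where a float is required.
    return str(float(value))
```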
* SDK - Refactoring - Passing the parameters explicitly in python_op.
This helps avoid problems when new parameters are added.
* SDK - Components - Added package installation support to func_to_container_op
Example:
```python
op = func_to_container_op(my_func, packages_to_install=['pandas==0.24'])
```
* Make pip quieter
* Added the test_packages_to_install_feature test
Fixed accessing inputs and outputs without checking for None.
Fixed the case where the default value of a graph component input has to be passed to a component as an argument.
* SDK - Lightweight - Convert the names of file inputs and outputs
Removing the "_path" and "_file" suffixes from the names of file inputs and outputs.
Problem: When accepting file inputs (or outputs), the function inside the component receives file paths (or file streams), so it's natural to call the function parameter "something_file_path" (e.g. `model_file_path` or `number_file_path`).
But from the outside perspective there are no files or paths: the actual data objects (or references to them) are passed in.
It looks very strange when the argument-passing code looks like this: `component(number_file_path=42)`. This looks like an error, since 42 is not a path; it's not even a string.
It's much more natural to strip the "_file" and "_path" suffixes from the names of file inputs and outputs. Then the argument-passing code looks natural: `component(number=42)`.
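A hedged illustration of the renaming (the `print_number` component is an assumption based on the description above):
```python
from kfp.components import func_to_container_op, InputPath

@func_to_container_op
def print_number(number_file_path: InputPath(int)):
    with open(number_file_path) as number_file:
        print(number_file.read())

# The component input is exposed as "number" rather than "number_file_path",
# so the call site inside a pipeline reads naturally:
# print_number(number=42)
```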
* Removed the _FEATURE_STRIP_FILE_IO_NAME_PARTS feature switch
Problem: It's hard to distinguish components loaded by name (e.g. using `ComponentStore`) from components that were never loaded (e.g. just created from a python function).
`component_ref.name` was previously always set, since it was a required parameter.
`component_ref.name` should only be set if the component was loaded by name.
Lightweight components now allow a function to mark some outputs that it wants to produce by writing data to files instead of returning it as in-memory data objects.
This is useful when the data is expected to be big.
Example 1 (writing a big amount of data to an output file at a provided path):
```python
@func_to_container_op
def write_big_data(big_file_path: OutputPath(str)):
    with open(big_file_path, 'w') as big_file:
        for i in range(1000000):
            big_file.write('Hello world\n')
```
Example 2 (writing a big amount of data to a provided output file stream):
```python
@func_to_container_op
def write_big_data(big_file: OutputTextFile(str)):
    for i in range(1000000):
        big_file.write('Hello world\n')
```
Lightweight components now allow a function to mark some inputs that it wants to consume as files instead of as in-memory data objects.
This is useful when the data is expected to be big.
Example 1:
```python
def consume_big_file_path(big_file_path: InputPath(str)) -> int:
    line_count = 0
    with open(big_file_path) as f:
        while f.readline():
            line_count = line_count + 1
    return line_count
```
Example 2:
```python
def consume_big_file(big_file: InputTextFile(str)) -> int:
    line_count = 0
    while big_file.readline():
        line_count = line_count + 1
    return line_count
```
* SDK - Tests - Added better helper functions for testing python components
* SDK - Python components - Properly serializing outputs
Background:
Component arguments are already properly serialized when calling the component program and then deserialized before the execution of the component function.
But the component outputs were only serialized using `str()`, which is inadequate for data types like lists or dictionaries.
This commit fixes the mismatch: the outputs are now serialized the same way as arguments and default values.
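As a hedged illustration, list and dict outputs can now round-trip the same way arguments do (treating them as JSON here is an assumption about the shared serializers, not a confirmed detail of this commit):
```python
from typing import NamedTuple

def produce_list() -> NamedTuple('Outputs', [('items', list)]):
    # With str() the output file would contain "['a', 'b']", which downstream
    # components could not reliably parse; proper serialization produces ["a", "b"].
    return (['a', 'b'],)
```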
* SDK - Refactoring - Replaced the ParameterMeta class with InputSpec and OutputSpec
* SDK - Refactoring - Replaced the internal PipelineMeta class with ComponentSpec
* SDK - Refactoring - Replaced the internal ComponentMeta class with ComponentSpec
* SDK - Refactoring - Replaced the *Meta classes with the *Spec classes
Replaced the ComponentMeta class with ComponentSpec
Replaced the PipelineMeta class with ComponentSpec
Replaced the ParameterMeta class with InputSpec and OutputSpec
* Removed empty fields
* SDK - Components - Hiding signature attribute from CloudPickle
Cloudpickle has some issues with pickling type annotations in python versions < 3.7, so they disabled it. https://github.com/cloudpipe/cloudpickle/issues/196
`create_component_from_airflow_op` spoofs the function signature by setting the `func.__signature__` attribute. Cloudpickle then tries to pickle that attribute, which leads to failures during unpickling.
To prevent this we remove the `.__signature__` attribute before pickling.
* Added comments
# Hack to prevent cloudpickle from trying to pickle generic types that might be present in the signature. See https://github.com/cloudpipe/cloudpickle/issues/196
# Currently the __signature__ is only set by Airflow components as a means to spoof/pass the function signature to _func_to_component_spec
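A minimal sketch of the workaround (the helper name and the restore step are illustrative assumptions):
```python
import cloudpickle

def _pickle_without_signature(func):
    # Temporarily hide __signature__ so cloudpickle does not try to pickle
    # the generic types that may be referenced from the spoofed signature.
    signature = getattr(func, '__signature__', None)
    if signature is not None:
        del func.__signature__
    try:
        return cloudpickle.dumps(func)
    finally:
        if signature is not None:
            func.__signature__ = signature
```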
* SDK - Components - Added type to TaskOutputReference
Now the task output references taken from TaskSpec instances can be
type-checked when passed to components.
* Renamed TypeType to TypeSpecType
Problem: When the user loads a component using the `load_component` function, the object they get back is a task factory function. Since it's a normal function object, the user cannot inspect any of the attributes of the component they just loaded (they can only see the name, description and input names). For example, the user cannot see the list of component outputs, the annotations, etc.
This change fixes the issue by adding the original component properties to the function object.
Example usage:
```python
train_op = load_component_from_url(...)
print(train_op.outputs)
```
* SDK - Added support for raw artifact values to ContainerOp
* `ContainerOp` now gets artifact arguments from the command line instead of the constructor.
* Added back input_artifact_arguments to the ContainerOp constructor.
In some scenarios it's hard to provide the artifact arguments through the `command` list when it already has resolved artifact paths.
* Exporting InputArtifactArgument from kfp.dsl
* Updated the sample
* Properly passing artifact arguments as task arguments
as opposed to default input values.
* Renamed input_artifact_arguments to artifact_arguments to reduce confusion
* Renamed InputArtifactArgument to InputArgumentPath
Also renamed input_artifact_arguments to artifact_argument_paths in the ContainerOp's constructor
* Replaced getattr with isinstance checks.
getattr is too fragile and can be broken by renames.
* Fixed the type annotations
* Unlocked the input artifact support in components
Added the test_input_path_placeholder_with_constant_argument test
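A hedged usage sketch of passing a raw artifact value through the command line after the rename (the image, command and argument value are placeholders):
```python
import kfp.dsl as dsl

task = dsl.ContainerOp(
    name='consume-artifact',
    image='alpine',
    # InputArgumentPath makes the constant argument available to the container as a file path.
    command=['cat', dsl.InputArgumentPath('Hello world')],
)
```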
Added `kfp.components.set_default_base_image`, which sets the name of the container image that will be used for component creation when `base_image` is not specified.
Alternatively, the base image can also be set to a factory function that returns the image name.
The support is added for both Lightweight components and python container components.
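A brief hedged sketch of the usage (the image name is just an example):
```python
import kfp.components as comp

# Components created afterwards without an explicit base_image will use this image.
comp.set_default_base_image('tensorflow/tensorflow:1.13.1-py3')

# Alternatively, a factory function returning the image name can be set.
comp.set_default_base_image(lambda: 'tensorflow/tensorflow:1.13.1-py3')
```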
* SDK - Components - Improved serialization and deserialization of arguments and defaults
Properly serialize default values and passed arguments using the same code.
Check the types of passed argument values and issue warnings.
Improved argument reference type compatibility checking. When types do not match there is always either an error or a warning.
When creating component from python function, the input types are now canonicalized.
* Addressed the feedback
* SDK - Refactoring - Replaced the TypeMeta class
The PipelineParam no longer exposes the private TypeMeta class.
Fixes #1420
The refactoring PR is part of a series of PRs which unify the metadata and specification types.
This PR fixes a bug in Airflow op creation.
The `_run_airflow_op` helper function was not captured along with the `_run_airflow_op_closure` function, because they belong to different modules (`_run_airflow_op_closure` was module-less).
This was not discovered during the notebook testing of the code, since in that environment `_run_airflow_op` was also module-less, as it was defined in a notebook (not in a .py file).
* Lint Python code for undefined names
* Lint Python code for undefined names
* Exclude tfdv.py to work around an overzealous pytest
* Fixup for tfdv.py
* Fixup for tfdv.py
* Fixup for tfdv.py
* SDK - Lightweight - Added support for "None" default values
Previously it was impossible to pass None to components since it was being converted to the string "None".
* is_required = not input.optional for now
As asked by @gaoning777
It's required to correctly handle None arguments or None default values (also needed for optional and variable-number inputs).
It's easier to understand and generates better command-line code.
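A hedged example of what the "None" default fix enables (the component function is illustrative):
```python
from kfp.components import func_to_container_op

@func_to_container_op
def greet(name: str = None) -> str:
    # With this fix an omitted argument arrives as a real None,
    # not as the string "None".
    return 'Hello, {}!'.format(name or 'anonymous')
```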
* SDK - Refactored the code in kfp.components._python_op._capture_function_code_using_cloudpickle
* SDK/Lightweight - Added python version compatibility checks
See my compatibility analysis: https://github.com/cloudpipe/cloudpickle/issues/293
I've introduced code pickling to capture dependencies in https://github.com/kubeflow/pipelines/pull/1372
Later I've discovered that there is a serious opcode incompatibility between python versions 3.5 and 3.6+. See my analysis of the issue: https://github.com/cloudpipe/cloudpickle/issues/293
Due to this issue I decided to switch back to using source code copying by default and to continue improving it.
Until we stop supporting python 3.5 (https://github.com/kubeflow/pipelines/pull/668) it's too dangerous to use code pickling by default.
Code pickling can be enabled by specifying `pickle_code=True` when calling `func_to_container_op`.
Due to its nature, Argo will replace any strings it encounters that are enclosed in double curly braces, which would make the code non-executable. To work around this, the code is encoded in the Argo YAML template and decoded on the fly before execution.
* SDK - Controlling which modules are captured with Lightweight components
All func_to_* functions now accept the modules_to_capture parameter: a list of names of modules whose code will be captured (instead of just referenced) during the dependency scan. By default the func.__module__ is captured.
* Described the behavior more in depth.
* Added a test to check that only dependencies are captured
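A hedged usage sketch of modules_to_capture (the helper module is hypothetical, and depending on the SDK version capturing module code may also require enabling code pickling):
```python
from kfp.components import func_to_container_op
import my_helpers  # hypothetical user module whose code should travel with the component

def train(data_path: str) -> float:
    return my_helpers.compute_metric(data_path)

train_op = func_to_container_op(
    train,
    modules_to_capture=['my_helpers', train.__module__],
)
```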
* Transitively capturing code dependencies
Using cloudpickle.
* Got rid of func_type_declarations_code variable
* Extracted the function code extraction functions
* Improved support for capturing module-level dependencies
* Added test for capturing module-level dependencies
* Removed the _capture_function_code_using_source_copy function
As requested by Ning
* SDK - Made ComponentSpec.implementation field optional
Improved the error message when trying to convert tasks to ContainerOp.
* Switched from attribute checking to type checking
* Feature: sidecar for ContainerOp
* replace f-string with string format for compatibility with py3.5
* ContainerOp can now be updated with any k8s V1Container attributes, as well as with sidecars via the Sidecar class. ContainerOp accepts PipelineParam in any valid k8s properties.
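A hedged sketch of the sidecar usage (image names, commands and the port are placeholders):
```python
import kfp.dsl as dsl

echo_task = dsl.ContainerOp(
    name='echo',
    image='alpine:3.9',
    command=['sh', '-c', 'wget -q -O- http://localhost:5678/'],
    sidecars=[
        dsl.Sidecar(
            name='http-echo',
            image='hashicorp/http-echo',
            args=['-listen=:5678', '-text=hello'],
        )
    ],
)
```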
* WIP: fix conflicts and bugs with recent master. TODO: more complex template with pipeline params
* fix proxy args
* Fixed to work with latest master head
* Added container_kwargs to ContainerOp to pass in k8s container kwargs
* Fix comment bug, updated with example in ContainerOp docstring
* fix copyright year
* expose match_serialized_pipelineparam as public for compiler to process serialized pipeline params
* fixed pydoc example and removed unnecessary ContainerOp.container.parent
* Fix conflicts in compiler tests
* add core types and type checking function
* fix unit test bug
* avoid defining dynamic classes
* typo fix
* add component metadata format
* add a construct for the component decorator
* add default values for the meta classes
* add input/output types to the metadata
* add from_dict in TypeMeta
* small fix
* add unit tests
* use python struct for the openapi schema
* add default in parameter
* add default value
* remove the str restriction for the param default
* bug fix
* add pipelinemeta
* add pipeline metadata
* ignore annotation if it is not str/BaseType/dict
* update param name in the check_type functions
remove schema validators for GCRPath, and adjust for GCRPath, GCSPath
change _check_valid_dict to _check_valid_type_dict to avoid confusion
fix typo in the comments
adjust function order for readability
* remove default values for non-primitive types in the function signature
update the _check_valid_type_dict name
* pass metadata from component decorator and task factory to containerOp
* pass pipeline metadata to Pipeline
* fix unit test
* typo in the comments
* move the metadata classes to a separate module
* fix unit test
* small change
* add __eq__ to meta classes
not export _metadata classes
* nothing
* fix unit test
* unit test python component
* unit test python pipeline
* fix bug: duplicate variable of args
* fix unit tests
* move python_component and _component decorator in _component file
* remove the print
* change parameter default value to None
* add functools wraps around _component decorator
* TypeMeta accept both str and dict
* fix indent, add unit test for type as strings
* do not set default value for the name field in ParameterMeta, ComponentMeta, and PipelineMeta
* add type check in task factory
* output error message
* add type check in component decorator; move the metadata assignment out of the containerop __init__ function
* fix bug; add unit test
* add more unit tests
* more unit tests; fix bugs
* more unit tests; fix bugs
* add unit tests
* more unit tests
* add type check switch; add unit tests
* add compiler option for type check
* resolving pr comments
* add unit test for pipeline param check with component types; fix the bug; also fix the bug when there is not a single return annotation
The zip-packed components are supported in all load_component APIs:
`kfp.components.load_component`
`kfp.components.load_component_from_file`
`kfp.components.load_component_from_url`
`kfp.components.ComponentStore.load_component`
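A hedged usage sketch (the archive name is a placeholder; any of the APIs listed above should accept the zip-packed form):
```python
import kfp.components as comp

# Loading a zip-packed component works the same as loading a component.yaml file.
my_op = comp.load_component_from_file('my_component.zip')
```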
When the DSL bridge code was written, `ContainerOp` did not support env, so we did not pass it. Now we're adding the passing code.
Added a test that checks that the env variables get to the `ContainerOp`.
Ultimately, the command line is an array of strings. Component yaml files should have the arguments as strings, instead of the Python SDK doing the conversion sometimes.