@@ -207,6 +207,8 @@ You can set them using the ``with_{property}`` functions:
 - ``with_spark_version``
 - ``with_warehouse_bucket_uri``
 - ``with_private_endpoint_id`` (`doc <https://docs.oracle.com/en-us/iaas/data-flow/using/pe-allowing.htm#pe-allowing>`__)
+- ``with_defined_tags``
+- ``with_freeform_tags``
 
 For more details, see `Data Flow class documentation <https://docs.oracle.com/en-us/iaas/tools/ads-sdk/latest/ads.jobs.html#module-ads.jobs.builders.infrastructure.dataflow>`__.
 
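Condensing the two new setters from the list above into a minimal sketch (the tag namespace and values are illustrative, and the singular ``with_defined_tag``/``with_freeform_tag`` method names follow the code examples later in this diff):

.. code-block:: python

    from ads.jobs import DataFlow

    # Illustrative values only: a defined tag must belong to an existing
    # tag namespace in the tenancy; freeform tag keys are arbitrary.
    df = (
        DataFlow()
        .with_defined_tag(**{"Oracle-Tags": {"CreatedBy": "test_name@oracle.com"}})
        .with_freeform_tag(test_freeform_key="test_freeform_value")
    )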
@@ -229,10 +231,10 @@ create applications.
 
 In the following "hello-world" example, ``DataFlow`` is populated with ``compartment_id``,
 ``driver_shape``, ``driver_shape_config``, ``executor_shape``, ``executor_shape_config``
-and ``spark_version``. ``DataFlowRuntime`` is populated with ``script_uri`` and
-``script_bucket``. The ``script_uri`` specifies the path to the script. It can be
-local or remote (an Object Storage path). If the path is local, then
-``script_bucket`` must be specified additionally because Data Flow
+, ``spark_version``, ``defined_tags`` and ``freeform_tags``. ``DataFlowRuntime`` is
+populated with ``script_uri`` and ``script_bucket``. The ``script_uri`` specifies the
+path to the script. It can be local or remote (an Object Storage path). If the path
+is local, then ``script_bucket`` must be specified additionally because Data Flow
 requires a script to be available in Object Storage. ADS
 performs the upload step for you, as long as you give the bucket name
 or the Object Storage path prefix to upload the script. Either can be
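The local-versus-remote distinction described in this paragraph can be sketched as follows (the bucket names and paths are placeholders, not part of this diff):

.. code-block:: python

    from ads.jobs import DataFlowRuntime

    # Remote script: already in Object Storage, so no bucket is needed.
    runtime_remote = DataFlowRuntime().with_script_uri(
        "oci://bucket@namespace/path/to/script.py"
    )

    # Local script: ADS uploads it for you, so a destination bucket or
    # Object Storage path prefix must also be given.
    runtime_local = (
        DataFlowRuntime()
        .with_script_uri("script.py")
        .with_script_bucket("oci://bucket@namespace/prefix")
    )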
@@ -272,6 +274,10 @@ accepted. In the next example, the prefix is given for ``script_bucket``.
         .with_executor_shape("VM.Standard.E4.Flex")
         .with_executor_shape_config(ocpus=4, memory_in_gbs=64)
         .with_spark_version("3.0.2")
+        .with_defined_tag(
+            **{"Oracle-Tags": {"CreatedBy": "test_name@oracle.com"}}
+        )
+        .with_freeform_tag(test_freeform_key="test_freeform_value")
     )
     runtime_config = (
         DataFlowRuntime()
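As a follow-on to the hunk above, a hedged sketch of assembling the configured infrastructure and runtime into a job; the names reuse the variables from the snippet, and the tags set there should land on the created Data Flow application:

.. code-block:: python

    from ads.jobs import Job

    # Assumes dataflow_configs and runtime_config from the example above.
    df_job = Job(
        name="dataflow_app_name",
        infrastructure=dataflow_configs,
        runtime=runtime_config,
    )
    df_job.create()  # the defined/freeform tags are applied at creation
    df_run = df_job.run()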
@@ -393,6 +399,10 @@ In the next example, ``archive_uri`` is given as an Object Storage location.
             "spark.driverEnv.myEnvVariable": "value1",
             "spark.executorEnv.myEnvVariable": "value2",
         })
+        .with_defined_tag(
+            **{"Oracle-Tags": {"CreatedBy": "test_name@oracle.com"}}
+        )
+        .with_freeform_tag(test_freeform_key="test_freeform_value")
     )
     runtime_config = (
         DataFlowRuntime()
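For context on the ``spark.driverEnv.*``/``spark.executorEnv.*`` entries kept in this hunk: inside the submitted script those values surface as ordinary environment variables on the driver and executors, so reading one back (variable name taken from the configuration above) is a sketch like:

.. code-block:: python

    import os

    # "myEnvVariable" comes from spark.driverEnv.myEnvVariable /
    # spark.executorEnv.myEnvVariable in the configuration above.
    my_value = os.environ.get("myEnvVariable")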
@@ -566,6 +576,11 @@ into the ``Job.from_yaml()`` function to build a Data Flow job:
       numExecutors: 1
       sparkVersion: 3.2.1
       privateEndpointId: <private_endpoint_ocid>
+      definedTags:
+        Oracle-Tags:
+          CreatedBy: test_name@oracle.com
+      freeformTags:
+        test_freeform_key: test_freeform_value
     type: dataFlow
   name: dataflow_app_name
   runtime:
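A minimal sketch of feeding the YAML above into ``Job.from_yaml()``, as the surrounding text describes (the file name is illustrative):

.. code-block:: python

    from ads.jobs import Job

    # Load the YAML definition and create the Data Flow job; the
    # definedTags/freeformTags blocks are carried into the application.
    job = Job.from_yaml(uri="dataflow_job.yaml")
    job.create()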
@@ -647,6 +662,12 @@ into the ``Job.from_yaml()`` function to build a Data Flow job:
     configuration:
       required: false
       type: dict
+    definedTags:
+      required: false
+      type: dict
+    freeformTags:
+      required: false
+      type: dict
     type:
       allowed:
         - dataFlow
@@ -694,7 +715,10 @@ into the ``Job.from_yaml()`` function to build a Data Flow job:
     configuration:
       required: false
       type: dict
-    freeform_tag:
+    definedTags:
+      required: false
+      type: dict
+    freeformTags:
       required: false
       type: dict
     scriptBucket: