google-native.dataplex/v1.Task

Google Cloud Native is in preview. Google Cloud Classic is fully supported.

Google Cloud Native v0.32.0 published on Wednesday, Nov 29, 2023 by Pulumi

Creates a task resource within a lake. Auto-naming is currently not supported for this resource.
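
Before the placeholder reference examples below, here is a minimal sketch of a scheduled Spark SQL task in Python. It assumes an existing lake, placeholder project, region, and service-account values, and the RECURRING member of the trigger-type enum; adjust these to your environment.

import pulumi_google_native as google_native

# Minimal sketch (placeholder values throughout): a Spark SQL task that runs
# hourly inside an existing lake.
sql_task = google_native.dataplex.v1.Task("hourly-sql-task",
    project="my-project",          # placeholder project ID
    location="us-central1",        # placeholder region
    lake_id="my-lake",             # placeholder: ID of an existing lake
    task_id="hourly-sql-task",
    execution_spec={
        # Placeholder service account; it needs permission to run Dataplex jobs.
        "service_account": "task-runner@my-project.iam.gserviceaccount.com",
    },
    trigger_spec={
        "type": google_native.dataplex.v1.GoogleCloudDataplexV1TaskTriggerSpecType.RECURRING,
        "schedule": "0 * * * *",   # hourly cron schedule
    },
    spark={
        "sql_script": "SELECT 1",  # inline Spark SQL to execute
    })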

Create Task Resource

Resources are created with functions called constructors. To learn more about declaring and configuring resources, see Resources.

Constructor syntax

new Task(name: string, args: TaskArgs, opts?: CustomResourceOptions);
@overload
def Task(resource_name: str,
         args: TaskArgs,
         opts: Optional[ResourceOptions] = None)

@overload
def Task(resource_name: str,
         opts: Optional[ResourceOptions] = None,
         execution_spec: Optional[GoogleCloudDataplexV1TaskExecutionSpecArgs] = None,
         lake_id: Optional[str] = None,
         task_id: Optional[str] = None,
         trigger_spec: Optional[GoogleCloudDataplexV1TaskTriggerSpecArgs] = None,
         description: Optional[str] = None,
         display_name: Optional[str] = None,
         labels: Optional[Mapping[str, str]] = None,
         location: Optional[str] = None,
         notebook: Optional[GoogleCloudDataplexV1TaskNotebookTaskConfigArgs] = None,
         project: Optional[str] = None,
         spark: Optional[GoogleCloudDataplexV1TaskSparkTaskConfigArgs] = None)
func NewTask(ctx *Context, name string, args TaskArgs, opts ...ResourceOption) (*Task, error)
public Task(string name, TaskArgs args, CustomResourceOptions? opts = null)
public Task(String name, TaskArgs args)
public Task(String name, TaskArgs args, CustomResourceOptions options)
type: google-native:dataplex/v1:Task
properties: # The arguments to resource properties.
options: # Bag of options to control resource's behavior.

Parameters

name This property is required. string
The unique name of the resource.
args This property is required. TaskArgs
The arguments to resource properties.
opts CustomResourceOptions
Bag of options to control resource's behavior.
resource_name This property is required. str
The unique name of the resource.
args This property is required. TaskArgs
The arguments to resource properties.
opts ResourceOptions
Bag of options to control resource's behavior.
ctx Context
Context object for the current deployment.
name This property is required. string
The unique name of the resource.
args This property is required. TaskArgs
The arguments to resource properties.
opts ResourceOption
Bag of options to control resource's behavior.
name This property is required. string
The unique name of the resource.
args This property is required. TaskArgs
The arguments to resource properties.
opts CustomResourceOptions
Bag of options to control resource's behavior.
name This property is required. String
The unique name of the resource.
args This property is required. TaskArgs
The arguments to resource properties.
options CustomResourceOptions
Bag of options to control resource's behavior.

Constructor example

The following reference example uses placeholder values for all input properties.

var exampletaskResourceResourceFromDataplexv1 = new GoogleNative.Dataplex.V1.Task("exampletaskResourceResourceFromDataplexv1", new()
{
    ExecutionSpec = new GoogleNative.Dataplex.V1.Inputs.GoogleCloudDataplexV1TaskExecutionSpecArgs
    {
        ServiceAccount = "string",
        Args = 
        {
            { "string", "string" },
        },
        KmsKey = "string",
        MaxJobExecutionLifetime = "string",
        Project = "string",
    },
    LakeId = "string",
    TaskId = "string",
    TriggerSpec = new GoogleNative.Dataplex.V1.Inputs.GoogleCloudDataplexV1TaskTriggerSpecArgs
    {
        Type = GoogleNative.Dataplex.V1.GoogleCloudDataplexV1TaskTriggerSpecType.TypeUnspecified,
        Disabled = false,
        MaxRetries = 0,
        Schedule = "string",
        StartTime = "string",
    },
    Description = "string",
    DisplayName = "string",
    Labels = 
    {
        { "string", "string" },
    },
    Location = "string",
    Notebook = new GoogleNative.Dataplex.V1.Inputs.GoogleCloudDataplexV1TaskNotebookTaskConfigArgs
    {
        Notebook = "string",
        ArchiveUris = new[]
        {
            "string",
        },
        FileUris = new[]
        {
            "string",
        },
        InfrastructureSpec = new GoogleNative.Dataplex.V1.Inputs.GoogleCloudDataplexV1TaskInfrastructureSpecArgs
        {
            Batch = new GoogleNative.Dataplex.V1.Inputs.GoogleCloudDataplexV1TaskInfrastructureSpecBatchComputeResourcesArgs
            {
                ExecutorsCount = 0,
                MaxExecutorsCount = 0,
            },
            ContainerImage = new GoogleNative.Dataplex.V1.Inputs.GoogleCloudDataplexV1TaskInfrastructureSpecContainerImageRuntimeArgs
            {
                Image = "string",
                JavaJars = new[]
                {
                    "string",
                },
                Properties = 
                {
                    { "string", "string" },
                },
                PythonPackages = new[]
                {
                    "string",
                },
            },
            VpcNetwork = new GoogleNative.Dataplex.V1.Inputs.GoogleCloudDataplexV1TaskInfrastructureSpecVpcNetworkArgs
            {
                Network = "string",
                NetworkTags = new[]
                {
                    "string",
                },
                SubNetwork = "string",
            },
        },
    },
    Project = "string",
    Spark = new GoogleNative.Dataplex.V1.Inputs.GoogleCloudDataplexV1TaskSparkTaskConfigArgs
    {
        ArchiveUris = new[]
        {
            "string",
        },
        FileUris = new[]
        {
            "string",
        },
        InfrastructureSpec = new GoogleNative.Dataplex.V1.Inputs.GoogleCloudDataplexV1TaskInfrastructureSpecArgs
        {
            Batch = new GoogleNative.Dataplex.V1.Inputs.GoogleCloudDataplexV1TaskInfrastructureSpecBatchComputeResourcesArgs
            {
                ExecutorsCount = 0,
                MaxExecutorsCount = 0,
            },
            ContainerImage = new GoogleNative.Dataplex.V1.Inputs.GoogleCloudDataplexV1TaskInfrastructureSpecContainerImageRuntimeArgs
            {
                Image = "string",
                JavaJars = new[]
                {
                    "string",
                },
                Properties = 
                {
                    { "string", "string" },
                },
                PythonPackages = new[]
                {
                    "string",
                },
            },
            VpcNetwork = new GoogleNative.Dataplex.V1.Inputs.GoogleCloudDataplexV1TaskInfrastructureSpecVpcNetworkArgs
            {
                Network = "string",
                NetworkTags = new[]
                {
                    "string",
                },
                SubNetwork = "string",
            },
        },
        MainClass = "string",
        MainJarFileUri = "string",
        PythonScriptFile = "string",
        SqlScript = "string",
        SqlScriptFile = "string",
    },
});
example, err := dataplex.NewTask(ctx, "exampletaskResourceResourceFromDataplexv1", &dataplex.TaskArgs{
	ExecutionSpec: &dataplex.GoogleCloudDataplexV1TaskExecutionSpecArgs{
		ServiceAccount: pulumi.String("string"),
		Args: pulumi.StringMap{
			"string": pulumi.String("string"),
		},
		KmsKey:                  pulumi.String("string"),
		MaxJobExecutionLifetime: pulumi.String("string"),
		Project:                 pulumi.String("string"),
	},
	LakeId: pulumi.String("string"),
	TaskId: pulumi.String("string"),
	TriggerSpec: &dataplex.GoogleCloudDataplexV1TaskTriggerSpecArgs{
		Type:       dataplex.GoogleCloudDataplexV1TaskTriggerSpecTypeTypeUnspecified,
		Disabled:   pulumi.Bool(false),
		MaxRetries: pulumi.Int(0),
		Schedule:   pulumi.String("string"),
		StartTime:  pulumi.String("string"),
	},
	Description: pulumi.String("string"),
	DisplayName: pulumi.String("string"),
	Labels: pulumi.StringMap{
		"string": pulumi.String("string"),
	},
	Location: pulumi.String("string"),
	Notebook: &dataplex.GoogleCloudDataplexV1TaskNotebookTaskConfigArgs{
		Notebook: pulumi.String("string"),
		ArchiveUris: pulumi.StringArray{
			pulumi.String("string"),
		},
		FileUris: pulumi.StringArray{
			pulumi.String("string"),
		},
		InfrastructureSpec: &dataplex.GoogleCloudDataplexV1TaskInfrastructureSpecArgs{
			Batch: &dataplex.GoogleCloudDataplexV1TaskInfrastructureSpecBatchComputeResourcesArgs{
				ExecutorsCount:    pulumi.Int(0),
				MaxExecutorsCount: pulumi.Int(0),
			},
			ContainerImage: &dataplex.GoogleCloudDataplexV1TaskInfrastructureSpecContainerImageRuntimeArgs{
				Image: pulumi.String("string"),
				JavaJars: pulumi.StringArray{
					pulumi.String("string"),
				},
				Properties: pulumi.StringMap{
					"string": pulumi.String("string"),
				},
				PythonPackages: pulumi.StringArray{
					pulumi.String("string"),
				},
			},
			VpcNetwork: &dataplex.GoogleCloudDataplexV1TaskInfrastructureSpecVpcNetworkArgs{
				Network: pulumi.String("string"),
				NetworkTags: pulumi.StringArray{
					pulumi.String("string"),
				},
				SubNetwork: pulumi.String("string"),
			},
		},
	},
	Project: pulumi.String("string"),
	Spark: &dataplex.GoogleCloudDataplexV1TaskSparkTaskConfigArgs{
		ArchiveUris: pulumi.StringArray{
			pulumi.String("string"),
		},
		FileUris: pulumi.StringArray{
			pulumi.String("string"),
		},
		InfrastructureSpec: &dataplex.GoogleCloudDataplexV1TaskInfrastructureSpecArgs{
			Batch: &dataplex.GoogleCloudDataplexV1TaskInfrastructureSpecBatchComputeResourcesArgs{
				ExecutorsCount:    pulumi.Int(0),
				MaxExecutorsCount: pulumi.Int(0),
			},
			ContainerImage: &dataplex.GoogleCloudDataplexV1TaskInfrastructureSpecContainerImageRuntimeArgs{
				Image: pulumi.String("string"),
				JavaJars: pulumi.StringArray{
					pulumi.String("string"),
				},
				Properties: pulumi.StringMap{
					"string": pulumi.String("string"),
				},
				PythonPackages: pulumi.StringArray{
					pulumi.String("string"),
				},
			},
			VpcNetwork: &dataplex.GoogleCloudDataplexV1TaskInfrastructureSpecVpcNetworkArgs{
				Network: pulumi.String("string"),
				NetworkTags: pulumi.StringArray{
					pulumi.String("string"),
				},
				SubNetwork: pulumi.String("string"),
			},
		},
		MainClass:        pulumi.String("string"),
		MainJarFileUri:   pulumi.String("string"),
		PythonScriptFile: pulumi.String("string"),
		SqlScript:        pulumi.String("string"),
		SqlScriptFile:    pulumi.String("string"),
	},
})
var exampletaskResourceResourceFromDataplexv1 = new Task("exampletaskResourceResourceFromDataplexv1", TaskArgs.builder()
    .executionSpec(GoogleCloudDataplexV1TaskExecutionSpecArgs.builder()
        .serviceAccount("string")
        .args(Map.of("string", "string"))
        .kmsKey("string")
        .maxJobExecutionLifetime("string")
        .project("string")
        .build())
    .lakeId("string")
    .taskId("string")
    .triggerSpec(GoogleCloudDataplexV1TaskTriggerSpecArgs.builder()
        .type("TYPE_UNSPECIFIED")
        .disabled(false)
        .maxRetries(0)
        .schedule("string")
        .startTime("string")
        .build())
    .description("string")
    .displayName("string")
    .labels(Map.of("string", "string"))
    .location("string")
    .notebook(GoogleCloudDataplexV1TaskNotebookTaskConfigArgs.builder()
        .notebook("string")
        .archiveUris("string")
        .fileUris("string")
        .infrastructureSpec(GoogleCloudDataplexV1TaskInfrastructureSpecArgs.builder()
            .batch(GoogleCloudDataplexV1TaskInfrastructureSpecBatchComputeResourcesArgs.builder()
                .executorsCount(0)
                .maxExecutorsCount(0)
                .build())
            .containerImage(GoogleCloudDataplexV1TaskInfrastructureSpecContainerImageRuntimeArgs.builder()
                .image("string")
                .javaJars("string")
                .properties(Map.of("string", "string"))
                .pythonPackages("string")
                .build())
            .vpcNetwork(GoogleCloudDataplexV1TaskInfrastructureSpecVpcNetworkArgs.builder()
                .network("string")
                .networkTags("string")
                .subNetwork("string")
                .build())
            .build())
        .build())
    .project("string")
    .spark(GoogleCloudDataplexV1TaskSparkTaskConfigArgs.builder()
        .archiveUris("string")
        .fileUris("string")
        .infrastructureSpec(GoogleCloudDataplexV1TaskInfrastructureSpecArgs.builder()
            .batch(GoogleCloudDataplexV1TaskInfrastructureSpecBatchComputeResourcesArgs.builder()
                .executorsCount(0)
                .maxExecutorsCount(0)
                .build())
            .containerImage(GoogleCloudDataplexV1TaskInfrastructureSpecContainerImageRuntimeArgs.builder()
                .image("string")
                .javaJars("string")
                .properties(Map.of("string", "string"))
                .pythonPackages("string")
                .build())
            .vpcNetwork(GoogleCloudDataplexV1TaskInfrastructureSpecVpcNetworkArgs.builder()
                .network("string")
                .networkTags("string")
                .subNetwork("string")
                .build())
            .build())
        .mainClass("string")
        .mainJarFileUri("string")
        .pythonScriptFile("string")
        .sqlScript("string")
        .sqlScriptFile("string")
        .build())
    .build());
exampletask_resource_resource_from_dataplexv1 = google_native.dataplex.v1.Task("exampletaskResourceResourceFromDataplexv1",
    execution_spec={
        "service_account": "string",
        "args": {
            "string": "string",
        },
        "kms_key": "string",
        "max_job_execution_lifetime": "string",
        "project": "string",
    },
    lake_id="string",
    task_id="string",
    trigger_spec={
        "type": google_native.dataplex.v1.GoogleCloudDataplexV1TaskTriggerSpecType.TYPE_UNSPECIFIED,
        "disabled": False,
        "max_retries": 0,
        "schedule": "string",
        "start_time": "string",
    },
    description="string",
    display_name="string",
    labels={
        "string": "string",
    },
    location="string",
    notebook={
        "notebook": "string",
        "archive_uris": ["string"],
        "file_uris": ["string"],
        "infrastructure_spec": {
            "batch": {
                "executors_count": 0,
                "max_executors_count": 0,
            },
            "container_image": {
                "image": "string",
                "java_jars": ["string"],
                "properties": {
                    "string": "string",
                },
                "python_packages": ["string"],
            },
            "vpc_network": {
                "network": "string",
                "network_tags": ["string"],
                "sub_network": "string",
            },
        },
    },
    project="string",
    spark={
        "archive_uris": ["string"],
        "file_uris": ["string"],
        "infrastructure_spec": {
            "batch": {
                "executors_count": 0,
                "max_executors_count": 0,
            },
            "container_image": {
                "image": "string",
                "java_jars": ["string"],
                "properties": {
                    "string": "string",
                },
                "python_packages": ["string"],
            },
            "vpc_network": {
                "network": "string",
                "network_tags": ["string"],
                "sub_network": "string",
            },
        },
        "main_class": "string",
        "main_jar_file_uri": "string",
        "python_script_file": "string",
        "sql_script": "string",
        "sql_script_file": "string",
    })
const exampletaskResourceResourceFromDataplexv1 = new google_native.dataplex.v1.Task("exampletaskResourceResourceFromDataplexv1", {
    executionSpec: {
        serviceAccount: "string",
        args: {
            string: "string",
        },
        kmsKey: "string",
        maxJobExecutionLifetime: "string",
        project: "string",
    },
    lakeId: "string",
    taskId: "string",
    triggerSpec: {
        type: google_native.dataplex.v1.GoogleCloudDataplexV1TaskTriggerSpecType.TypeUnspecified,
        disabled: false,
        maxRetries: 0,
        schedule: "string",
        startTime: "string",
    },
    description: "string",
    displayName: "string",
    labels: {
        string: "string",
    },
    location: "string",
    notebook: {
        notebook: "string",
        archiveUris: ["string"],
        fileUris: ["string"],
        infrastructureSpec: {
            batch: {
                executorsCount: 0,
                maxExecutorsCount: 0,
            },
            containerImage: {
                image: "string",
                javaJars: ["string"],
                properties: {
                    string: "string",
                },
                pythonPackages: ["string"],
            },
            vpcNetwork: {
                network: "string",
                networkTags: ["string"],
                subNetwork: "string",
            },
        },
    },
    project: "string",
    spark: {
        archiveUris: ["string"],
        fileUris: ["string"],
        infrastructureSpec: {
            batch: {
                executorsCount: 0,
                maxExecutorsCount: 0,
            },
            containerImage: {
                image: "string",
                javaJars: ["string"],
                properties: {
                    string: "string",
                },
                pythonPackages: ["string"],
            },
            vpcNetwork: {
                network: "string",
                networkTags: ["string"],
                subNetwork: "string",
            },
        },
        mainClass: "string",
        mainJarFileUri: "string",
        pythonScriptFile: "string",
        sqlScript: "string",
        sqlScriptFile: "string",
    },
});
type: google-native:dataplex/v1:Task
properties:
    description: string
    displayName: string
    executionSpec:
        args:
            string: string
        kmsKey: string
        maxJobExecutionLifetime: string
        project: string
        serviceAccount: string
    labels:
        string: string
    lakeId: string
    location: string
    notebook:
        archiveUris:
            - string
        fileUris:
            - string
        infrastructureSpec:
            batch:
                executorsCount: 0
                maxExecutorsCount: 0
            containerImage:
                image: string
                javaJars:
                    - string
                properties:
                    string: string
                pythonPackages:
                    - string
            vpcNetwork:
                network: string
                networkTags:
                    - string
                subNetwork: string
        notebook: string
    project: string
    spark:
        archiveUris:
            - string
        fileUris:
            - string
        infrastructureSpec:
            batch:
                executorsCount: 0
                maxExecutorsCount: 0
            containerImage:
                image: string
                javaJars:
                    - string
                properties:
                    string: string
                pythonPackages:
                    - string
            vpcNetwork:
                network: string
                networkTags:
                    - string
                subNetwork: string
        mainClass: string
        mainJarFileUri: string
        pythonScriptFile: string
        sqlScript: string
        sqlScriptFile: string
    taskId: string
    triggerSpec:
        disabled: false
        maxRetries: 0
        schedule: string
        startTime: string
        type: TYPE_UNSPECIFIED

Task Resource Properties

To learn more about resource properties and how to use them, see Inputs and Outputs in the Architecture and Concepts docs.

Inputs

In Python, inputs that are objects can be passed either as argument classes or as dictionary literals.
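
For example, an execution_spec value could be supplied in either form. This is a sketch with placeholder values; it assumes the GoogleCloudDataplexV1TaskExecutionSpecArgs class is exported from the dataplex/v1 module as shown in the constructor signature above.

import pulumi_google_native as google_native

# 1) Typed args class (placeholder service account):
spec_as_class = google_native.dataplex.v1.GoogleCloudDataplexV1TaskExecutionSpecArgs(
    service_account="task-runner@my-project.iam.gserviceaccount.com",
)

# 2) Equivalent dictionary literal with snake_case keys:
spec_as_dict = {
    "service_account": "task-runner@my-project.iam.gserviceaccount.com",
}

# Either value can be passed as the Task's execution_spec argument.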

The Task resource accepts the following input properties:

ExecutionSpec This property is required. Pulumi.GoogleNative.Dataplex.V1.Inputs.GoogleCloudDataplexV1TaskExecutionSpec
Spec related to how a task is executed.
LakeId
This property is required.
Changes to this property will trigger replacement.
string
TaskId
This property is required.
Changes to this property will trigger replacement.
string
Required. Task identifier.
TriggerSpec This property is required. Pulumi.GoogleNative.Dataplex.V1.Inputs.GoogleCloudDataplexV1TaskTriggerSpec
Spec related to how often and when a task should be triggered.
Description string
Optional. Description of the task.
DisplayName string
Optional. User-friendly display name.
Labels Dictionary<string, string>
Optional. User-defined labels for the task.
Location Changes to this property will trigger replacement. string
Notebook Pulumi.GoogleNative.Dataplex.V1.Inputs.GoogleCloudDataplexV1TaskNotebookTaskConfig
Config related to running scheduled Notebooks.
Project Changes to this property will trigger replacement. string
Spark Pulumi.GoogleNative.Dataplex.V1.Inputs.GoogleCloudDataplexV1TaskSparkTaskConfig
Config related to running custom Spark tasks.
ExecutionSpec This property is required. GoogleCloudDataplexV1TaskExecutionSpecArgs
Spec related to how a task is executed.
LakeId
This property is required.
Changes to this property will trigger replacement.
string
TaskId
This property is required.
Changes to this property will trigger replacement.
string
Required. Task identifier.
TriggerSpec This property is required. GoogleCloudDataplexV1TaskTriggerSpecArgs
Spec related to how often and when a task should be triggered.
Description string
Optional. Description of the task.
DisplayName string
Optional. User-friendly display name.
Labels map[string]string
Optional. User-defined labels for the task.
Location Changes to this property will trigger replacement. string
Notebook GoogleCloudDataplexV1TaskNotebookTaskConfigArgs
Config related to running scheduled Notebooks.
Project Changes to this property will trigger replacement. string
Spark GoogleCloudDataplexV1TaskSparkTaskConfigArgs
Config related to running custom Spark tasks.
executionSpec This property is required. GoogleCloudDataplexV1TaskExecutionSpec
Spec related to how a task is executed.
lakeId
This property is required.
Changes to this property will trigger replacement.
String
taskId
This property is required.
Changes to this property will trigger replacement.
String
Required. Task identifier.
triggerSpec This property is required. GoogleCloudDataplexV1TaskTriggerSpec
Spec related to how often and when a task should be triggered.
description String
Optional. Description of the task.
displayName String
Optional. User-friendly display name.
labels Map<String,String>
Optional. User-defined labels for the task.
location Changes to this property will trigger replacement. String
notebook GoogleCloudDataplexV1TaskNotebookTaskConfig
Config related to running scheduled Notebooks.
project Changes to this property will trigger replacement. String
spark GoogleCloudDataplexV1TaskSparkTaskConfig
Config related to running custom Spark tasks.
executionSpec This property is required. GoogleCloudDataplexV1TaskExecutionSpec
Spec related to how a task is executed.
lakeId
This property is required.
Changes to this property will trigger replacement.
string
taskId
This property is required.
Changes to this property will trigger replacement.
string
Required. Task identifier.
triggerSpec This property is required. GoogleCloudDataplexV1TaskTriggerSpec
Spec related to how often and when a task should be triggered.
description string
Optional. Description of the task.
displayName string
Optional. User-friendly display name.
labels {[key: string]: string}
Optional. User-defined labels for the task.
location Changes to this property will trigger replacement. string
notebook GoogleCloudDataplexV1TaskNotebookTaskConfig
Config related to running scheduled Notebooks.
project Changes to this property will trigger replacement. string
spark GoogleCloudDataplexV1TaskSparkTaskConfig
Config related to running custom Spark tasks.
execution_spec This property is required. GoogleCloudDataplexV1TaskExecutionSpecArgs
Spec related to how a task is executed.
lake_id
This property is required.
Changes to this property will trigger replacement.
str
task_id
This property is required.
Changes to this property will trigger replacement.
str
Required. Task identifier.
trigger_spec This property is required. GoogleCloudDataplexV1TaskTriggerSpecArgs
Spec related to how often and when a task should be triggered.
description str
Optional. Description of the task.
display_name str
Optional. User-friendly display name.
labels Mapping[str, str]
Optional. User-defined labels for the task.
location Changes to this property will trigger replacement. str
notebook GoogleCloudDataplexV1TaskNotebookTaskConfigArgs
Config related to running scheduled Notebooks.
project Changes to this property will trigger replacement. str
spark GoogleCloudDataplexV1TaskSparkTaskConfigArgs
Config related to running custom Spark tasks.
executionSpec This property is required. Property Map
Spec related to how a task is executed.
lakeId
This property is required.
Changes to this property will trigger replacement.
String
taskId
This property is required.
Changes to this property will trigger replacement.
String
Required. Task identifier.
triggerSpec This property is required. Property Map
Spec related to how often and when a task should be triggered.
description String
Optional. Description of the task.
displayName String
Optional. User-friendly display name.
labels Map<String>
Optional. User-defined labels for the task.
location Changes to this property will trigger replacement. String
notebook Property Map
Config related to running scheduled Notebooks.
project Changes to this property will trigger replacement. String
spark Property Map
Config related to running custom Spark tasks.

Outputs

All input properties are implicitly available as output properties. Additionally, the Task resource produces the following output properties (a short export sketch follows the listing):

CreateTime string
The time when the task was created.
ExecutionStatus Pulumi.GoogleNative.Dataplex.V1.Outputs.GoogleCloudDataplexV1TaskExecutionStatusResponse
Status of the latest task executions.
Id string
The provider-assigned unique ID for this managed resource.
Name string
The relative resource name of the task, of the form: projects/{project_number}/locations/{location_id}/lakes/{lake_id}/tasks/{task_id}.
State string
Current state of the task.
Uid string
System generated globally unique ID for the task. This ID will be different if the task is deleted and re-created with the same name.
UpdateTime string
The time when the task was last updated.
CreateTime string
The time when the task was created.
ExecutionStatus GoogleCloudDataplexV1TaskExecutionStatusResponse
Status of the latest task executions.
Id string
The provider-assigned unique ID for this managed resource.
Name string
The relative resource name of the task, of the form: projects/{project_number}/locations/{location_id}/lakes/{lake_id}/tasks/{task_id}.
State string
Current state of the task.
Uid string
System generated globally unique ID for the task. This ID will be different if the task is deleted and re-created with the same name.
UpdateTime string
The time when the task was last updated.
createTime String
The time when the task was created.
executionStatus GoogleCloudDataplexV1TaskExecutionStatusResponse
Status of the latest task executions.
id String
The provider-assigned unique ID for this managed resource.
name String
The relative resource name of the task, of the form: projects/{project_number}/locations/{location_id}/lakes/{lake_id}/tasks/{task_id}.
state String
Current state of the task.
uid String
System generated globally unique ID for the task. This ID will be different if the task is deleted and re-created with the same name.
updateTime String
The time when the task was last updated.
createTime string
The time when the task was created.
executionStatus GoogleCloudDataplexV1TaskExecutionStatusResponse
Status of the latest task executions.
id string
The provider-assigned unique ID for this managed resource.
name string
The relative resource name of the task, of the form: projects/{project_number}/locations/{location_id}/lakes/{lake_id}/tasks/{task_id}.
state string
Current state of the task.
uid string
System generated globally unique ID for the task. This ID will be different if the task is deleted and re-created with the same name.
updateTime string
The time when the task was last updated.
create_time str
The time when the task was created.
execution_status GoogleCloudDataplexV1TaskExecutionStatusResponse
Status of the latest task executions.
id str
The provider-assigned unique ID for this managed resource.
name str
The relative resource name of the task, of the form: projects/{project_number}/locations/{location_id}/lakes/{lake_id}/tasks/{task_id}.
state str
Current state of the task.
uid str
System generated globally unique ID for the task. This ID will be different if the task is deleted and re-created with the same name.
update_time str
The time when the task was last updated.
createTime String
The time when the task was created.
executionStatus Property Map
Status of the latest task executions.
id String
The provider-assigned unique ID for this managed resource.
name String
The relative resource name of the task, of the form: projects/{project_number}/locations/{location_id}/lakes/{lake_id}/tasks/{task_id}.
state String
Current state of the task.
uid String
System generated globally unique ID for the task. This ID will be different if the task is deleted and re-created with the same name.
updateTime String
The time when the task was last updated.
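
The sketch below shows how these output properties might be read back and exported from a program; the resource arguments and the ON_DEMAND trigger member are placeholders, not values taken from this reference.

import pulumi
import pulumi_google_native as google_native

# Sketch: create a task with placeholder arguments, then export output properties.
task = google_native.dataplex.v1.Task("example-task",
    lake_id="my-lake",      # placeholder: ID of an existing lake
    task_id="example-task",
    execution_spec={"service_account": "task-runner@my-project.iam.gserviceaccount.com"},
    trigger_spec={"type": google_native.dataplex.v1.GoogleCloudDataplexV1TaskTriggerSpecType.ON_DEMAND},
    spark={"sql_script": "SELECT 1"})

# name, uid, state, create_time, and update_time become available after creation.
pulumi.export("taskName", task.name)
pulumi.export("taskState", task.state)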

Supporting Types

GoogleCloudDataplexV1JobResponse
, GoogleCloudDataplexV1JobResponseArgs

EndTime This property is required. string
The time when the job ended.
ExecutionSpec This property is required. Pulumi.GoogleNative.Dataplex.V1.Inputs.GoogleCloudDataplexV1TaskExecutionSpecResponse
Spec related to how a task is executed.
Labels This property is required. Dictionary<string, string>
User-defined labels for the task.
Message This property is required. string
Additional information about the current state.
Name This property is required. string
The relative resource name of the job, of the form: projects/{project_number}/locations/{location_id}/lakes/{lake_id}/tasks/{task_id}/jobs/{job_id}.
RetryCount This property is required. int
The number of times the job has been retried (excluding the initial attempt).
Service This property is required. string
The underlying service running a job.
ServiceJob This property is required. string
The full resource name for the job run under a particular service.
StartTime This property is required. string
The time when the job was started.
State This property is required. string
Execution state for the job.
Trigger This property is required. string
Job execution trigger.
Uid This property is required. string
System generated globally unique ID for the job.
EndTime This property is required. string
The time when the job ended.
ExecutionSpec This property is required. GoogleCloudDataplexV1TaskExecutionSpecResponse
Spec related to how a task is executed.
Labels This property is required. map[string]string
User-defined labels for the task.
Message This property is required. string
Additional information about the current state.
Name This property is required. string
The relative resource name of the job, of the form: projects/{project_number}/locations/{location_id}/lakes/{lake_id}/tasks/{task_id}/jobs/{job_id}.
RetryCount This property is required. int
The number of times the job has been retried (excluding the initial attempt).
Service This property is required. string
The underlying service running a job.
ServiceJob This property is required. string
The full resource name for the job run under a particular service.
StartTime This property is required. string
The time when the job was started.
State This property is required. string
Execution state for the job.
Trigger This property is required. string
Job execution trigger.
Uid This property is required. string
System generated globally unique ID for the job.
endTime This property is required. String
The time when the job ended.
executionSpec This property is required. GoogleCloudDataplexV1TaskExecutionSpecResponse
Spec related to how a task is executed.
labels This property is required. Map<String,String>
User-defined labels for the task.
message This property is required. String
Additional information about the current state.
name This property is required. String
The relative resource name of the job, of the form: projects/{project_number}/locations/{location_id}/lakes/{lake_id}/tasks/{task_id}/jobs/{job_id}.
retryCount This property is required. Integer
The number of times the job has been retried (excluding the initial attempt).
service This property is required. String
The underlying service running a job.
serviceJob This property is required. String
The full resource name for the job run under a particular service.
startTime This property is required. String
The time when the job was started.
state This property is required. String
Execution state for the job.
trigger This property is required. String
Job execution trigger.
uid This property is required. String
System generated globally unique ID for the job.
endTime This property is required. string
The time when the job ended.
executionSpec This property is required. GoogleCloudDataplexV1TaskExecutionSpecResponse
Spec related to how a task is executed.
labels This property is required. {[key: string]: string}
User-defined labels for the task.
message This property is required. string
Additional information about the current state.
name This property is required. string
The relative resource name of the job, of the form: projects/{project_number}/locations/{location_id}/lakes/{lake_id}/tasks/{task_id}/jobs/{job_id}.
retryCount This property is required. number
The number of times the job has been retried (excluding the initial attempt).
service This property is required. string
The underlying service running a job.
serviceJob This property is required. string
The full resource name for the job run under a particular service.
startTime This property is required. string
The time when the job was started.
state This property is required. string
Execution state for the job.
trigger This property is required. string
Job execution trigger.
uid This property is required. string
System generated globally unique ID for the job.
end_time This property is required. str
The time when the job ended.
execution_spec This property is required. GoogleCloudDataplexV1TaskExecutionSpecResponse
Spec related to how a task is executed.
labels This property is required. Mapping[str, str]
User-defined labels for the task.
message This property is required. str
Additional information about the current state.
name This property is required. str
The relative resource name of the job, of the form: projects/{project_number}/locations/{location_id}/lakes/{lake_id}/tasks/{task_id}/jobs/{job_id}.
retry_count This property is required. int
The number of times the job has been retried (excluding the initial attempt).
service This property is required. str
The underlying service running a job.
service_job This property is required. str
The full resource name for the job run under a particular service.
start_time This property is required. str
The time when the job was started.
state This property is required. str
Execution state for the job.
trigger This property is required. str
Job execution trigger.
uid This property is required. str
System generated globally unique ID for the job.
endTime This property is required. String
The time when the job ended.
executionSpec This property is required. Property Map
Spec related to how a task is executed.
labels This property is required. Map<String>
User-defined labels for the task.
message This property is required. String
Additional information about the current state.
name This property is required. String
The relative resource name of the job, of the form: projects/{project_number}/locations/{location_id}/lakes/{lake_id}/tasks/{task_id}/jobs/{job_id}.
retryCount This property is required. Number
The number of times the job has been retried (excluding the initial attempt).
service This property is required. String
The underlying service running a job.
serviceJob This property is required. String
The full resource name for the job run under a particular service.
startTime This property is required. String
The time when the job was started.
state This property is required. String
Execution state for the job.
trigger This property is required. String
Job execution trigger.
uid This property is required. String
System generated globally unique ID for the job.

GoogleCloudDataplexV1TaskExecutionSpec
, GoogleCloudDataplexV1TaskExecutionSpecArgs

ServiceAccount This property is required. string
Service account to use to execute a task. If not provided, the default Compute service account for the project is used.
Args Dictionary<string, string>
Optional. The arguments to pass to the task. The args can use placeholders of the format ${placeholder} as part of key/value string. These will be interpolated before passing the args to the driver. Currently supported placeholders: - ${task_id} - ${job_time} To pass positional args, set the key as TASK_ARGS. The value should be a comma-separated string of all the positional arguments. To use a delimiter other than comma, refer to https://cloud.google.com/sdk/gcloud/reference/topic/escaping. In case of other keys being present in the args, then TASK_ARGS will be passed as the last argument.
KmsKey string
Optional. The Cloud KMS key to use for encryption, of the form: projects/{project_number}/locations/{location_id}/keyRings/{key-ring-name}/cryptoKeys/{key-name}.
MaxJobExecutionLifetime string
Optional. The maximum duration after which the job execution is expired.
Project string
Optional. The project in which jobs are run. By default, the project containing the Lake is used. If a project is provided, the ExecutionSpec.service_account must belong to this project.
ServiceAccount This property is required. string
Service account to use to execute a task. If not provided, the default Compute service account for the project is used.
Args map[string]string
Optional. The arguments to pass to the task. The args can use placeholders of the format ${placeholder} as part of key/value string. These will be interpolated before passing the args to the driver. Currently supported placeholders: - ${task_id} - ${job_time} To pass positional args, set the key as TASK_ARGS. The value should be a comma-separated string of all the positional arguments. To use a delimiter other than comma, refer to https://cloud.google.com/sdk/gcloud/reference/topic/escaping. In case of other keys being present in the args, then TASK_ARGS will be passed as the last argument.
KmsKey string
Optional. The Cloud KMS key to use for encryption, of the form: projects/{project_number}/locations/{location_id}/keyRings/{key-ring-name}/cryptoKeys/{key-name}.
MaxJobExecutionLifetime string
Optional. The maximum duration after which the job execution is expired.
Project string
Optional. The project in which jobs are run. By default, the project containing the Lake is used. If a project is provided, the ExecutionSpec.service_account must belong to this project.
serviceAccount This property is required. String
Service account to use to execute a task. If not provided, the default Compute service account for the project is used.
args Map<String,String>
Optional. The arguments to pass to the task. The args can use placeholders of the format ${placeholder} as part of key/value string. These will be interpolated before passing the args to the driver. Currently supported placeholders: - ${task_id} - ${job_time} To pass positional args, set the key as TASK_ARGS. The value should be a comma-separated string of all the positional arguments. To use a delimiter other than comma, refer to https://cloud.google.com/sdk/gcloud/reference/topic/escaping. In case of other keys being present in the args, then TASK_ARGS will be passed as the last argument.
kmsKey String
Optional. The Cloud KMS key to use for encryption, of the form: projects/{project_number}/locations/{location_id}/keyRings/{key-ring-name}/cryptoKeys/{key-name}.
maxJobExecutionLifetime String
Optional. The maximum duration after which the job execution is expired.
project String
Optional. The project in which jobs are run. By default, the project containing the Lake is used. If a project is provided, the ExecutionSpec.service_account must belong to this project.
serviceAccount This property is required. string
Service account to use to execute a task. If not provided, the default Compute service account for the project is used.
args {[key: string]: string}
Optional. The arguments to pass to the task. The args can use placeholders of the format ${placeholder} as part of key/value string. These will be interpolated before passing the args to the driver. Currently supported placeholders: - ${task_id} - ${job_time} To pass positional args, set the key as TASK_ARGS. The value should be a comma-separated string of all the positional arguments. To use a delimiter other than comma, refer to https://cloud.google.com/sdk/gcloud/reference/topic/escaping. In case of other keys being present in the args, then TASK_ARGS will be passed as the last argument.
kmsKey string
Optional. The Cloud KMS key to use for encryption, of the form: projects/{project_number}/locations/{location_id}/keyRings/{key-ring-name}/cryptoKeys/{key-name}.
maxJobExecutionLifetime string
Optional. The maximum duration after which the job execution is expired.
project string
Optional. The project in which jobs are run. By default, the project containing the Lake is used. If a project is provided, the ExecutionSpec.service_account must belong to this project.
service_account This property is required. str
Service account to use to execute a task. If not provided, the default Compute service account for the project is used.
args Mapping[str, str]
Optional. The arguments to pass to the task. The args can use placeholders of the format ${placeholder} as part of key/value string. These will be interpolated before passing the args to the driver. Currently supported placeholders: - ${task_id} - ${job_time} To pass positional args, set the key as TASK_ARGS. The value should be a comma-separated string of all the positional arguments. To use a delimiter other than comma, refer to https://cloud.google.com/sdk/gcloud/reference/topic/escaping. In case of other keys being present in the args, then TASK_ARGS will be passed as the last argument.
kms_key str
Optional. The Cloud KMS key to use for encryption, of the form: projects/{project_number}/locations/{location_id}/keyRings/{key-ring-name}/cryptoKeys/{key-name}.
max_job_execution_lifetime str
Optional. The maximum duration after which the job execution is expired.
project str
Optional. The project in which jobs are run. By default, the project containing the Lake is used. If a project is provided, the ExecutionSpec.service_account must belong to this project.
serviceAccount This property is required. String
Service account to use to execute a task. If not provided, the default Compute service account for the project is used.
args Map<String>
Optional. The arguments to pass to the task. The args can use placeholders of the format ${placeholder} as part of key/value string. These will be interpolated before passing the args to the driver. Currently supported placeholders: - ${task_id} - ${job_time} To pass positional args, set the key as TASK_ARGS. The value should be a comma-separated string of all the positional arguments. To use a delimiter other than comma, refer to https://cloud.google.com/sdk/gcloud/reference/topic/escaping. In case of other keys being present in the args, then TASK_ARGS will be passed as the last argument.
kmsKey String
Optional. The Cloud KMS key to use for encryption, of the form: projects/{project_number}/locations/{location_id}/keyRings/{key-ring-name}/cryptoKeys/{key-name}.
maxJobExecutionLifetime String
Optional. The maximum duration after which the job execution is expired.
project String
Optional. The project in which jobs are run. By default, the project containing the Lake is used. If a project is provided, the ExecutionSpec.service_account must belong to this project.
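
As a sketch of the args behavior described above (placeholder keys and values; the output_table key is illustrative, not a Dataplex-defined name):

# Sketch of an execution_spec with interpolated placeholders and positional args.
execution_spec = {
    "service_account": "task-runner@my-project.iam.gserviceaccount.com",  # placeholder
    "args": {
        # ${task_id} and ${job_time} are interpolated before being passed to the driver.
        "output_table": "results_${task_id}_${job_time}",
        # TASK_ARGS holds positional arguments as a comma-separated string and is
        # passed last when other keys are present.
        "TASK_ARGS": "--mode,incremental,--date,2023-11-29",
    },
}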

GoogleCloudDataplexV1TaskExecutionSpecResponse
, GoogleCloudDataplexV1TaskExecutionSpecResponseArgs

Args This property is required. Dictionary<string, string>
Optional. The arguments to pass to the task. The args can use placeholders of the format ${placeholder} as part of key/value string. These will be interpolated before passing the args to the driver. Currently supported placeholders: - ${task_id} - ${job_time} To pass positional args, set the key as TASK_ARGS. The value should be a comma-separated string of all the positional arguments. To use a delimiter other than comma, refer to https://cloud.google.com/sdk/gcloud/reference/topic/escaping. In case of other keys being present in the args, then TASK_ARGS will be passed as the last argument.
KmsKey This property is required. string
Optional. The Cloud KMS key to use for encryption, of the form: projects/{project_number}/locations/{location_id}/keyRings/{key-ring-name}/cryptoKeys/{key-name}.
MaxJobExecutionLifetime This property is required. string
Optional. The maximum duration after which the job execution is expired.
Project This property is required. string
Optional. The project in which jobs are run. By default, the project containing the Lake is used. If a project is provided, the ExecutionSpec.service_account must belong to this project.
ServiceAccount This property is required. string
Service account to use to execute a task. If not provided, the default Compute service account for the project is used.
Args This property is required. map[string]string
Optional. The arguments to pass to the task. The args can use placeholders of the format ${placeholder} as part of key/value string. These will be interpolated before passing the args to the driver. Currently supported placeholders: - ${task_id} - ${job_time} To pass positional args, set the key as TASK_ARGS. The value should be a comma-separated string of all the positional arguments. To use a delimiter other than comma, refer to https://cloud.google.com/sdk/gcloud/reference/topic/escaping. In case of other keys being present in the args, then TASK_ARGS will be passed as the last argument.
KmsKey This property is required. string
Optional. The Cloud KMS key to use for encryption, of the form: projects/{project_number}/locations/{location_id}/keyRings/{key-ring-name}/cryptoKeys/{key-name}.
MaxJobExecutionLifetime This property is required. string
Optional. The maximum duration after which the job execution is expired.
Project This property is required. string
Optional. The project in which jobs are run. By default, the project containing the Lake is used. If a project is provided, the ExecutionSpec.service_account must belong to this project.
ServiceAccount This property is required. string
Service account to use to execute a task. If not provided, the default Compute service account for the project is used.
args This property is required. Map<String,String>
Optional. The arguments to pass to the task. The args can use placeholders of the format ${placeholder} as part of key/value string. These will be interpolated before passing the args to the driver. Currently supported placeholders: - ${task_id} - ${job_time} To pass positional args, set the key as TASK_ARGS. The value should be a comma-separated string of all the positional arguments. To use a delimiter other than comma, refer to https://cloud.google.com/sdk/gcloud/reference/topic/escaping. In case of other keys being present in the args, then TASK_ARGS will be passed as the last argument.
kmsKey This property is required. String
Optional. The Cloud KMS key to use for encryption, of the form: projects/{project_number}/locations/{location_id}/keyRings/{key-ring-name}/cryptoKeys/{key-name}.
maxJobExecutionLifetime This property is required. String
Optional. The maximum duration after which the job execution is expired.
project This property is required. String
Optional. The project in which jobs are run. By default, the project containing the Lake is used. If a project is provided, the ExecutionSpec.service_account must belong to this project.
serviceAccount This property is required. String
Service account to use to execute a task. If not provided, the default Compute service account for the project is used.
args This property is required. {[key: string]: string}
Optional. The arguments to pass to the task. The args can use placeholders of the format ${placeholder} as part of key/value string. These will be interpolated before passing the args to the driver. Currently supported placeholders: - ${task_id} - ${job_time} To pass positional args, set the key as TASK_ARGS. The value should be a comma-separated string of all the positional arguments. To use a delimiter other than comma, refer to https://cloud.google.com/sdk/gcloud/reference/topic/escaping. In case of other keys being present in the args, then TASK_ARGS will be passed as the last argument.
kmsKey This property is required. string
Optional. The Cloud KMS key to use for encryption, of the form: projects/{project_number}/locations/{location_id}/keyRings/{key-ring-name}/cryptoKeys/{key-name}.
maxJobExecutionLifetime This property is required. string
Optional. The maximum duration after which the job execution is expired.
project This property is required. string
Optional. The project in which jobs are run. By default, the project containing the Lake is used. If a project is provided, the ExecutionSpec.service_account must belong to this project.
serviceAccount This property is required. string
Service account to use to execute a task. If not provided, the default Compute service account for the project is used.
args This property is required. Mapping[str, str]
Optional. The arguments to pass to the task. The args can use placeholders of the format ${placeholder} as part of key/value string. These will be interpolated before passing the args to the driver. Currently supported placeholders: - ${task_id} - ${job_time} To pass positional args, set the key as TASK_ARGS. The value should be a comma-separated string of all the positional arguments. To use a delimiter other than comma, refer to https://cloud.google.com/sdk/gcloud/reference/topic/escaping. In case of other keys being present in the args, then TASK_ARGS will be passed as the last argument.
kms_key This property is required. str
Optional. The Cloud KMS key to use for encryption, of the form: projects/{project_number}/locations/{location_id}/keyRings/{key-ring-name}/cryptoKeys/{key-name}.
max_job_execution_lifetime This property is required. str
Optional. The maximum duration after which the job execution is expired.
project This property is required. str
Optional. The project in which jobs are run. By default, the project containing the Lake is used. If a project is provided, the ExecutionSpec.service_account must belong to this project.
service_account This property is required. str
Service account to use to execute a task. If not provided, the default Compute service account for the project is used.
args This property is required. Map<String>
Optional. The arguments to pass to the task. The args can use placeholders of the format ${placeholder} as part of key/value string. These will be interpolated before passing the args to the driver. Currently supported placeholders: - ${task_id} - ${job_time} To pass positional args, set the key as TASK_ARGS. The value should be a comma-separated string of all the positional arguments. To use a delimiter other than comma, refer to https://cloud.google.com/sdk/gcloud/reference/topic/escaping. In case of other keys being present in the args, then TASK_ARGS will be passed as the last argument.
kmsKey This property is required. String
Optional. The Cloud KMS key to use for encryption, of the form: projects/{project_number}/locations/{location_id}/keyRings/{key-ring-name}/cryptoKeys/{key-name}.
maxJobExecutionLifetime This property is required. String
Optional. The maximum duration after which the job execution is expired.
project This property is required. String
Optional. The project in which jobs are run. By default, the project containing the Lake is used. If a project is provided, the ExecutionSpec.service_account must belong to this project.
serviceAccount This property is required. String
Service account to use to execute a task. If not provided, the default Compute service account for the project is used.

GoogleCloudDataplexV1TaskExecutionStatusResponse
, GoogleCloudDataplexV1TaskExecutionStatusResponseArgs

LatestJob This property is required. Pulumi.GoogleNative.Dataplex.V1.Inputs.GoogleCloudDataplexV1JobResponse
Latest job execution.
UpdateTime This property is required. string
Last update time of the status.
LatestJob This property is required. GoogleCloudDataplexV1JobResponse
Latest job execution.
UpdateTime This property is required. string
Last update time of the status.
latestJob This property is required. GoogleCloudDataplexV1JobResponse
Latest job execution.
updateTime This property is required. String
Last update time of the status.
latestJob This property is required. GoogleCloudDataplexV1JobResponse
Latest job execution.
updateTime This property is required. string
Last update time of the status.
latest_job This property is required. GoogleCloudDataplexV1JobResponse
Latest job execution.
update_time This property is required. str
Last update time of the status.
latestJob This property is required. Property Map
Latest job execution.
updateTime This property is required. String
Last update time of the status.

GoogleCloudDataplexV1TaskInfrastructureSpec
, GoogleCloudDataplexV1TaskInfrastructureSpecArgs

batch Property Map
Compute resources needed for a Task when using Dataproc Serverless.
containerImage Property Map
Container Image Runtime Configuration.
vpcNetwork Property Map
VPC network.

GoogleCloudDataplexV1TaskInfrastructureSpecBatchComputeResources
, GoogleCloudDataplexV1TaskInfrastructureSpecBatchComputeResourcesArgs

ExecutorsCount int
Optional. Total number of job executors. Executor Count should be between 2 and 100. Default=2
MaxExecutorsCount int
Optional. Max configurable executors. If max_executors_count > executors_count, then auto-scaling is enabled. Max Executor Count should be between 2 and 1000. Default=1000
ExecutorsCount int
Optional. Total number of job executors. Executor Count should be between 2 and 100. Default=2
MaxExecutorsCount int
Optional. Max configurable executors. If max_executors_count > executors_count, then auto-scaling is enabled. Max Executor Count should be between 2 and 1000. Default=1000
executorsCount Integer
Optional. Total number of job executors. Executor Count should be between 2 and 100. Default=2
maxExecutorsCount Integer
Optional. Max configurable executors. If max_executors_count > executors_count, then auto-scaling is enabled. Max Executor Count should be between 2 and 1000. Default=1000
executorsCount number
Optional. Total number of job executors. Executor Count should be between 2 and 100. Default=2
maxExecutorsCount number
Optional. Max configurable executors. If max_executors_count > executors_count, then auto-scaling is enabled. Max Executor Count should be between 2 and 1000. Default=1000
executors_count int
Optional. Total number of job executors. Executor Count should be between 2 and 100. Default=2
max_executors_count int
Optional. Max configurable executors. If max_executors_count > executors_count, then auto-scaling is enabled. Max Executor Count should be between 2 and 1000. Default=1000
executorsCount Number
Optional. Total number of job executors. Executor Count should be between 2 and 100. Default=2
maxExecutorsCount Number
Optional. Max configurable executors. If max_executors_count > executors_count, then auto-scaling is enabled. Max Executor Count should be between 2 and 1000. Default=1000
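
The two counts also control autoscaling: with only executorsCount set, the job runs at a fixed size, and setting maxExecutorsCount above it lets the batch scale up to that ceiling. A short sketch with illustrative values:

// Fixed-size batch of 4 executors (no autoscaling).
const fixedBatch = { executorsCount: 4 };

// Autoscaling batch: starts at 2 executors and may grow to 50, because
// maxExecutorsCount > executorsCount. Per the API, executorsCount must be
// in [2, 100] and maxExecutorsCount in [2, 1000].
const autoscalingBatch = { executorsCount: 2, maxExecutorsCount: 50 };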

GoogleCloudDataplexV1TaskInfrastructureSpecBatchComputeResourcesResponse
, GoogleCloudDataplexV1TaskInfrastructureSpecBatchComputeResourcesResponseArgs

ExecutorsCount This property is required. int
Optional. Total number of job executors. Executor Count should be between 2 and 100. Default=2
MaxExecutorsCount This property is required. int
Optional. Max configurable executors. If max_executors_count > executors_count, then auto-scaling is enabled. Max Executor Count should be between 2 and 1000. Default=1000
ExecutorsCount This property is required. int
Optional. Total number of job executors. Executor Count should be between 2 and 100. Default=2
MaxExecutorsCount This property is required. int
Optional. Max configurable executors. If max_executors_count > executors_count, then auto-scaling is enabled. Max Executor Count should be between 2 and 1000. Default=1000
executorsCount This property is required. Integer
Optional. Total number of job executors. Executor Count should be between 2 and 100. Default=2
maxExecutorsCount This property is required. Integer
Optional. Max configurable executors. If max_executors_count > executors_count, then auto-scaling is enabled. Max Executor Count should be between 2 and 1000. Default=1000
executorsCount This property is required. number
Optional. Total number of job executors. Executor Count should be between 2 and 100. Default=2
maxExecutorsCount This property is required. number
Optional. Max configurable executors. If max_executors_count > executors_count, then auto-scaling is enabled. Max Executor Count should be between 2 and 1000. Default=1000
executors_count This property is required. int
Optional. Total number of job executors. Executor Count should be between 2 and 100. Default=2
max_executors_count This property is required. int
Optional. Max configurable executors. If max_executors_count > executors_count, then auto-scaling is enabled. Max Executor Count should be between 2 and 1000. Default=1000
executorsCount This property is required. Number
Optional. Total number of job executors. Executor Count should be between 2 and 100. Default=2
maxExecutorsCount This property is required. Number
Optional. Max configurable executors. If max_executors_count > executors_count, then auto-scaling is enabled. Max Executor Count should be between 2 and 1000. Default=1000

GoogleCloudDataplexV1TaskInfrastructureSpecContainerImageRuntime
, GoogleCloudDataplexV1TaskInfrastructureSpecContainerImageRuntimeArgs

Image string
Optional. Container image to use.
JavaJars List<string>
Optional. A list of Java JARS to add to the classpath. Valid input includes Cloud Storage URIs to Jar binaries. For example, gs://bucket-name/my/path/to/file.jar
Properties Dictionary<string, string>
Optional. Override to common configuration of open source components installed on the Dataproc cluster. The properties to set on daemon config files. Property keys are specified in prefix:property format, for example core:hadoop.tmp.dir. For more information, see Cluster properties (https://cloud.google.com/dataproc/docs/concepts/cluster-properties).
PythonPackages List<string>
Optional. A list of python packages to be installed. Valid formats include Cloud Storage URI to a PIP installable library. For example, gs://bucket-name/my/path/to/lib.tar.gz
Image string
Optional. Container image to use.
JavaJars []string
Optional. A list of Java JARS to add to the classpath. Valid input includes Cloud Storage URIs to Jar binaries. For example, gs://bucket-name/my/path/to/file.jar
Properties map[string]string
Optional. Override to common configuration of open source components installed on the Dataproc cluster. The properties to set on daemon config files. Property keys are specified in prefix:property format, for example core:hadoop.tmp.dir. For more information, see Cluster properties (https://cloud.google.com/dataproc/docs/concepts/cluster-properties).
PythonPackages []string
Optional. A list of python packages to be installed. Valid formats include Cloud Storage URI to a PIP installable library. For example, gs://bucket-name/my/path/to/lib.tar.gz
image String
Optional. Container image to use.
javaJars List<String>
Optional. A list of Java JARS to add to the classpath. Valid input includes Cloud Storage URIs to Jar binaries. For example, gs://bucket-name/my/path/to/file.jar
properties Map<String,String>
Optional. Override to common configuration of open source components installed on the Dataproc cluster. The properties to set on daemon config files. Property keys are specified in prefix:property format, for example core:hadoop.tmp.dir. For more information, see Cluster properties (https://cloud.google.com/dataproc/docs/concepts/cluster-properties).
pythonPackages List<String>
Optional. A list of python packages to be installed. Valid formats include Cloud Storage URI to a PIP installable library. For example, gs://bucket-name/my/path/to/lib.tar.gz
image string
Optional. Container image to use.
javaJars string[]
Optional. A list of Java JARS to add to the classpath. Valid input includes Cloud Storage URIs to Jar binaries. For example, gs://bucket-name/my/path/to/file.jar
properties {[key: string]: string}
Optional. Override to common configuration of open source components installed on the Dataproc cluster. The properties to set on daemon config files. Property keys are specified in prefix:property format, for example core:hadoop.tmp.dir. For more information, see Cluster properties (https://cloud.google.com/dataproc/docs/concepts/cluster-properties).
pythonPackages string[]
Optional. A list of python packages to be installed. Valid formats include Cloud Storage URI to a PIP installable library. For example, gs://bucket-name/my/path/to/lib.tar.gz
image str
Optional. Container image to use.
java_jars Sequence[str]
Optional. A list of Java JARS to add to the classpath. Valid input includes Cloud Storage URIs to Jar binaries. For example, gs://bucket-name/my/path/to/file.jar
properties Mapping[str, str]
Optional. Override to common configuration of open source components installed on the Dataproc cluster. The properties to set on daemon config files. Property keys are specified in prefix:property format, for example core:hadoop.tmp.dir. For more information, see Cluster properties (https://cloud.google.com/dataproc/docs/concepts/cluster-properties).
python_packages Sequence[str]
Optional. A list of python packages to be installed. Valid formats include Cloud Storage URI to a PIP installable library. For example, gs://bucket-name/my/path/to/lib.tar.gz
image String
Optional. Container image to use.
javaJars List<String>
Optional. A list of Java JARS to add to the classpath. Valid input includes Cloud Storage URIs to Jar binaries. For example, gs://bucket-name/my/path/to/file.jar
properties Map<String>
Optional. Override to common configuration of open source components installed on the Dataproc cluster. The properties to set on daemon config files. Property keys are specified in prefix:property format, for example core:hadoop.tmp.dir. For more information, see Cluster properties (https://cloud.google.com/dataproc/docs/concepts/cluster-properties).
pythonPackages List<String>
Optional. A list of python packages to be installed. Valid formats include Cloud Storage URI to a PIP installable library. For example, gs://bucket-name/my/path/to/lib.tar.gz
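
A sketch of a container image runtime block; the image path, JAR, property, and package values are placeholders. Note the prefix:property key format expected in properties.

// Shape mirrors GoogleCloudDataplexV1TaskInfrastructureSpecContainerImageRuntimeArgs; values are placeholders.
const containerImage = {
    // Optional custom runtime image.
    image: "us-docker.pkg.dev/my-project/my-repo/dataplex-runtime:1.0",
    // Extra JARs for the classpath, referenced by Cloud Storage URI.
    javaJars: ["gs://my-bucket/jars/extra-udfs.jar"],
    // Dataproc-style properties in prefix:property form.
    properties: { "spark:spark.executor.memory": "4g" },
    // pip-installable packages, referenced by Cloud Storage URI.
    pythonPackages: ["gs://my-bucket/libs/my_lib.tar.gz"],
};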

GoogleCloudDataplexV1TaskInfrastructureSpecContainerImageRuntimeResponse
, GoogleCloudDataplexV1TaskInfrastructureSpecContainerImageRuntimeResponseArgs

Image This property is required. string
Optional. Container image to use.
JavaJars This property is required. List<string>
Optional. A list of Java JARS to add to the classpath. Valid input includes Cloud Storage URIs to Jar binaries. For example, gs://bucket-name/my/path/to/file.jar
Properties This property is required. Dictionary<string, string>
Optional. Override to common configuration of open source components installed on the Dataproc cluster. The properties to set on daemon config files. Property keys are specified in prefix:property format, for example core:hadoop.tmp.dir. For more information, see Cluster properties (https://cloud.google.com/dataproc/docs/concepts/cluster-properties).
PythonPackages This property is required. List<string>
Optional. A list of python packages to be installed. Valid formats include Cloud Storage URI to a PIP installable library. For example, gs://bucket-name/my/path/to/lib.tar.gz
Image This property is required. string
Optional. Container image to use.
JavaJars This property is required. []string
Optional. A list of Java JARS to add to the classpath. Valid input includes Cloud Storage URIs to Jar binaries. For example, gs://bucket-name/my/path/to/file.jar
Properties This property is required. map[string]string
Optional. Override to common configuration of open source components installed on the Dataproc cluster. The properties to set on daemon config files. Property keys are specified in prefix:property format, for example core:hadoop.tmp.dir. For more information, see Cluster properties (https://cloud.google.com/dataproc/docs/concepts/cluster-properties).
PythonPackages This property is required. []string
Optional. A list of python packages to be installed. Valid formats include Cloud Storage URI to a PIP installable library. For example, gs://bucket-name/my/path/to/lib.tar.gz
image This property is required. String
Optional. Container image to use.
javaJars This property is required. List<String>
Optional. A list of Java JARS to add to the classpath. Valid input includes Cloud Storage URIs to Jar binaries. For example, gs://bucket-name/my/path/to/file.jar
properties This property is required. Map<String,String>
Optional. Override to common configuration of open source components installed on the Dataproc cluster. The properties to set on daemon config files. Property keys are specified in prefix:property format, for example core:hadoop.tmp.dir. For more information, see Cluster properties (https://cloud.google.com/dataproc/docs/concepts/cluster-properties).
pythonPackages This property is required. List<String>
Optional. A list of python packages to be installed. Valid formats include Cloud Storage URI to a PIP installable library. For example, gs://bucket-name/my/path/to/lib.tar.gz
image This property is required. string
Optional. Container image to use.
javaJars This property is required. string[]
Optional. A list of Java JARS to add to the classpath. Valid input includes Cloud Storage URIs to Jar binaries. For example, gs://bucket-name/my/path/to/file.jar
properties This property is required. {[key: string]: string}
Optional. Override to common configuration of open source components installed on the Dataproc cluster. The properties to set on daemon config files. Property keys are specified in prefix:property format, for example core:hadoop.tmp.dir. For more information, see Cluster properties (https://cloud.google.com/dataproc/docs/concepts/cluster-properties).
pythonPackages This property is required. string[]
Optional. A list of python packages to be installed. Valid formats include Cloud Storage URI to a PIP installable library. For example, gs://bucket-name/my/path/to/lib.tar.gz
image This property is required. str
Optional. Container image to use.
java_jars This property is required. Sequence[str]
Optional. A list of Java JARS to add to the classpath. Valid input includes Cloud Storage URIs to Jar binaries. For example, gs://bucket-name/my/path/to/file.jar
properties This property is required. Mapping[str, str]
Optional. Override to common configuration of open source components installed on the Dataproc cluster. The properties to set on daemon config files. Property keys are specified in prefix:property format, for example core:hadoop.tmp.dir. For more information, see Cluster properties (https://cloud.google.com/dataproc/docs/concepts/cluster-properties).
python_packages This property is required. Sequence[str]
Optional. A list of python packages to be installed. Valid formats include Cloud Storage URI to a PIP installable library. For example, gs://bucket-name/my/path/to/lib.tar.gz
image This property is required. String
Optional. Container image to use.
javaJars This property is required. List<String>
Optional. A list of Java JARS to add to the classpath. Valid input includes Cloud Storage URIs to Jar binaries. For example, gs://bucket-name/my/path/to/file.jar
properties This property is required. Map<String>
Optional. Override to common configuration of open source components installed on the Dataproc cluster. The properties to set on daemon config files. Property keys are specified in prefix:property format, for example core:hadoop.tmp.dir. For more information, see Cluster properties (https://cloud.google.com/dataproc/docs/concepts/cluster-properties).
pythonPackages This property is required. List<String>
Optional. A list of python packages to be installed. Valid formats include Cloud Storage URI to a PIP installable library. For example, gs://bucket-name/my/path/to/lib.tar.gz

GoogleCloudDataplexV1TaskInfrastructureSpecResponse
, GoogleCloudDataplexV1TaskInfrastructureSpecResponseArgs

Batch This property is required. GoogleCloudDataplexV1TaskInfrastructureSpecBatchComputeResourcesResponse
Compute resources needed for a Task when using Dataproc Serverless.
ContainerImage This property is required. GoogleCloudDataplexV1TaskInfrastructureSpecContainerImageRuntimeResponse
Container Image Runtime Configuration.
VpcNetwork This property is required. GoogleCloudDataplexV1TaskInfrastructureSpecVpcNetworkResponse
VPC network.
batch This property is required. GoogleCloudDataplexV1TaskInfrastructureSpecBatchComputeResourcesResponse
Compute resources needed for a Task when using Dataproc Serverless.
containerImage This property is required. GoogleCloudDataplexV1TaskInfrastructureSpecContainerImageRuntimeResponse
Container Image Runtime Configuration.
vpcNetwork This property is required. GoogleCloudDataplexV1TaskInfrastructureSpecVpcNetworkResponse
VPC network.
batch This property is required. GoogleCloudDataplexV1TaskInfrastructureSpecBatchComputeResourcesResponse
Compute resources needed for a Task when using Dataproc Serverless.
containerImage This property is required. GoogleCloudDataplexV1TaskInfrastructureSpecContainerImageRuntimeResponse
Container Image Runtime Configuration.
vpcNetwork This property is required. GoogleCloudDataplexV1TaskInfrastructureSpecVpcNetworkResponse
VPC network.
batch This property is required. GoogleCloudDataplexV1TaskInfrastructureSpecBatchComputeResourcesResponse
Compute resources needed for a Task when using Dataproc Serverless.
container_image This property is required. GoogleCloudDataplexV1TaskInfrastructureSpecContainerImageRuntimeResponse
Container Image Runtime Configuration.
vpc_network This property is required. GoogleCloudDataplexV1TaskInfrastructureSpecVpcNetworkResponse
VPC network.
batch This property is required. Property Map
Compute resources needed for a Task when using Dataproc Serverless.
containerImage This property is required. Property Map
Container Image Runtime Configuration.
vpcNetwork This property is required. Property Map
VPC network.

GoogleCloudDataplexV1TaskInfrastructureSpecVpcNetwork
, GoogleCloudDataplexV1TaskInfrastructureSpecVpcNetworkArgs

Network string
Optional. The Cloud VPC network in which the job is run. By default, the Cloud VPC network named Default within the project is used.
NetworkTags List<string>
Optional. List of network tags to apply to the job.
SubNetwork string
Optional. The Cloud VPC sub-network in which the job is run.
Network string
Optional. The Cloud VPC network in which the job is run. By default, the Cloud VPC network named Default within the project is used.
NetworkTags []string
Optional. List of network tags to apply to the job.
SubNetwork string
Optional. The Cloud VPC sub-network in which the job is run.
network String
Optional. The Cloud VPC network in which the job is run. By default, the Cloud VPC network named Default within the project is used.
networkTags List<String>
Optional. List of network tags to apply to the job.
subNetwork String
Optional. The Cloud VPC sub-network in which the job is run.
network string
Optional. The Cloud VPC network in which the job is run. By default, the Cloud VPC network named Default within the project is used.
networkTags string[]
Optional. List of network tags to apply to the job.
subNetwork string
Optional. The Cloud VPC sub-network in which the job is run.
network str
Optional. The Cloud VPC network in which the job is run. By default, the Cloud VPC network named Default within the project is used.
network_tags Sequence[str]
Optional. List of network tags to apply to the job.
sub_network str
Optional. The Cloud VPC sub-network in which the job is run.
network String
Optional. The Cloud VPC network in which the job is run. By default, the Cloud VPC network named Default within the project is used.
networkTags List<String>
Optional. List of network tags to apply to the job.
subNetwork String
Optional. The Cloud VPC sub-network in which the job is run.
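
A sketch of the VPC network block; the network name, subnetwork path, and tags are placeholders, and the assumption here is that you identify the network either by network or by subNetwork, not both.

// Run the job on a named VPC network, with firewall tags applied to the workers.
const vpcByNetwork = {
    network: "my-shared-vpc",
    networkTags: ["dataplex", "allow-dataproc"],
};

// Alternatively, pin the job to a specific sub-network.
const vpcBySubnetwork = {
    subNetwork: "projects/my-project/regions/us-central1/subnetworks/my-subnet",
    networkTags: ["dataplex"],
};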

GoogleCloudDataplexV1TaskInfrastructureSpecVpcNetworkResponse
, GoogleCloudDataplexV1TaskInfrastructureSpecVpcNetworkResponseArgs

Network This property is required. string
Optional. The Cloud VPC network in which the job is run. By default, the Cloud VPC network named Default within the project is used.
NetworkTags This property is required. List<string>
Optional. List of network tags to apply to the job.
SubNetwork This property is required. string
Optional. The Cloud VPC sub-network in which the job is run.
Network This property is required. string
Optional. The Cloud VPC network in which the job is run. By default, the Cloud VPC network named Default within the project is used.
NetworkTags This property is required. []string
Optional. List of network tags to apply to the job.
SubNetwork This property is required. string
Optional. The Cloud VPC sub-network in which the job is run.
network This property is required. String
Optional. The Cloud VPC network in which the job is run. By default, the Cloud VPC network named Default within the project is used.
networkTags This property is required. List<String>
Optional. List of network tags to apply to the job.
subNetwork This property is required. String
Optional. The Cloud VPC sub-network in which the job is run.
network This property is required. string
Optional. The Cloud VPC network in which the job is run. By default, the Cloud VPC network named Default within the project is used.
networkTags This property is required. string[]
Optional. List of network tags to apply to the job.
subNetwork This property is required. string
Optional. The Cloud VPC sub-network in which the job is run.
network This property is required. str
Optional. The Cloud VPC network in which the job is run. By default, the Cloud VPC network named Default within the project is used.
network_tags This property is required. Sequence[str]
Optional. List of network tags to apply to the job.
sub_network This property is required. str
Optional. The Cloud VPC sub-network in which the job is run.
network This property is required. String
Optional. The Cloud VPC network in which the job is run. By default, the Cloud VPC network named Default within the project is used.
networkTags This property is required. List<String>
Optional. List of network tags to apply to the job.
subNetwork This property is required. String
Optional. The Cloud VPC sub-network in which the job is run.

GoogleCloudDataplexV1TaskNotebookTaskConfig
, GoogleCloudDataplexV1TaskNotebookTaskConfigArgs

Notebook This property is required. string
Path to input notebook. This can be the Cloud Storage URI of the notebook file or the path to a Notebook Content. The execution args are accessible as environment variables (TASK_key=value).
ArchiveUris List<string>
Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
FileUris List<string>
Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.
InfrastructureSpec Pulumi.GoogleNative.Dataplex.V1.Inputs.GoogleCloudDataplexV1TaskInfrastructureSpec
Optional. Infrastructure specification for the execution.
Notebook This property is required. string
Path to input notebook. This can be the Cloud Storage URI of the notebook file or the path to a Notebook Content. The execution args are accessible as environment variables (TASK_key=value).
ArchiveUris []string
Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
FileUris []string
Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.
InfrastructureSpec GoogleCloudDataplexV1TaskInfrastructureSpec
Optional. Infrastructure specification for the execution.
notebook This property is required. String
Path to input notebook. This can be the Cloud Storage URI of the notebook file or the path to a Notebook Content. The execution args are accessible as environment variables (TASK_key=value).
archiveUris List<String>
Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
fileUris List<String>
Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.
infrastructureSpec GoogleCloudDataplexV1TaskInfrastructureSpec
Optional. Infrastructure specification for the execution.
notebook This property is required. string
Path to input notebook. This can be the Cloud Storage URI of the notebook file or the path to a Notebook Content. The execution args are accessible as environment variables (TASK_key=value).
archiveUris string[]
Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
fileUris string[]
Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.
infrastructureSpec GoogleCloudDataplexV1TaskInfrastructureSpec
Optional. Infrastructure specification for the execution.
notebook This property is required. str
Path to input notebook. This can be the Cloud Storage URI of the notebook file or the path to a Notebook Content. The execution args are accessible as environment variables (TASK_key=value).
archive_uris Sequence[str]
Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
file_uris Sequence[str]
Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.
infrastructure_spec GoogleCloudDataplexV1TaskInfrastructureSpec
Optional. Infrastructure specification for the execution.
notebook This property is required. String
Path to input notebook. This can be the Cloud Storage URI of the notebook file or the path to a Notebook Content. The execution args are accessible as environment variables (TASK_key=value).
archiveUris List<String>
Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
fileUris List<String>
Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.
infrastructureSpec Property Map
Optional. Infrastructure specification for the execution.
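
A notebook task combines the required notebook path with optional supporting files and an infrastructure spec. A minimal sketch; every URI, ID, and the service account are placeholders, and the key set in executionSpec.args surfaces inside the notebook as an environment variable of the form TASK_env=prod.

import * as google_native from "@pulumi/google-native";

const notebookTask = new google_native.dataplex.v1.Task("nightly-notebook", {
    lakeId: "my-lake",
    taskId: "nightly-notebook",
    location: "us-central1",
    executionSpec: {
        serviceAccount: "dataplex-tasks@my-project.iam.gserviceaccount.com",
        // Available to the notebook as TASK_env=prod.
        args: { env: "prod" },
    },
    triggerSpec: { type: "RECURRING", schedule: "TZ=America/New_York 0 2 * * *" },
    notebook: {
        notebook: "gs://my-bucket/notebooks/nightly_report.ipynb",
        fileUris: ["gs://my-bucket/config/report.yaml"],
        infrastructureSpec: { batch: { executorsCount: 2, maxExecutorsCount: 10 } },
    },
});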

GoogleCloudDataplexV1TaskNotebookTaskConfigResponse
, GoogleCloudDataplexV1TaskNotebookTaskConfigResponseArgs

ArchiveUris This property is required. List<string>
Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
FileUris This property is required. List<string>
Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.
InfrastructureSpec This property is required. Pulumi.GoogleNative.Dataplex.V1.Inputs.GoogleCloudDataplexV1TaskInfrastructureSpecResponse
Optional. Infrastructure specification for the execution.
Notebook This property is required. string
Path to input notebook. This can be the Cloud Storage URI of the notebook file or the path to a Notebook Content. The execution args are accessible as environment variables (TASK_key=value).
ArchiveUris This property is required. []string
Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
FileUris This property is required. []string
Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.
InfrastructureSpec This property is required. GoogleCloudDataplexV1TaskInfrastructureSpecResponse
Optional. Infrastructure specification for the execution.
Notebook This property is required. string
Path to input notebook. This can be the Cloud Storage URI of the notebook file or the path to a Notebook Content. The execution args are accessible as environment variables (TASK_key=value).
archiveUris This property is required. List<String>
Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
fileUris This property is required. List<String>
Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.
infrastructureSpec This property is required. GoogleCloudDataplexV1TaskInfrastructureSpecResponse
Optional. Infrastructure specification for the execution.
notebook This property is required. String
Path to input notebook. This can be the Cloud Storage URI of the notebook file or the path to a Notebook Content. The execution args are accessible as environment variables (TASK_key=value).
archiveUris This property is required. string[]
Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
fileUris This property is required. string[]
Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.
infrastructureSpec This property is required. GoogleCloudDataplexV1TaskInfrastructureSpecResponse
Optional. Infrastructure specification for the execution.
notebook This property is required. string
Path to input notebook. This can be the Cloud Storage URI of the notebook file or the path to a Notebook Content. The execution args are accessible as environment variables (TASK_key=value).
archive_uris This property is required. Sequence[str]
Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
file_uris This property is required. Sequence[str]
Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.
infrastructure_spec This property is required. GoogleCloudDataplexV1TaskInfrastructureSpecResponse
Optional. Infrastructure specification for the execution.
notebook This property is required. str
Path to input notebook. This can be the Cloud Storage URI of the notebook file or the path to a Notebook Content. The execution args are accessible as environment variables (TASK_key=value).
archiveUris This property is required. List<String>
Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
fileUris This property is required. List<String>
Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.
infrastructureSpec This property is required. Property Map
Optional. Infrastructure specification for the execution.
notebook This property is required. String
Path to input notebook. This can be the Cloud Storage URI of the notebook file or the path to a Notebook Content. The execution args are accessible as environment variables (TASK_key=value).

GoogleCloudDataplexV1TaskSparkTaskConfig
, GoogleCloudDataplexV1TaskSparkTaskConfigArgs

ArchiveUris List<string>
Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
FileUris List<string>
Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.
InfrastructureSpec Pulumi.GoogleNative.Dataplex.V1.Inputs.GoogleCloudDataplexV1TaskInfrastructureSpec
Optional. Infrastructure specification for the execution.
MainClass string
The name of the driver's main class. The jar file that contains the class must be in the default CLASSPATH or specified in jar_file_uris. The execution args are passed in as a sequence of named process arguments (--key=value).
MainJarFileUri string
The Cloud Storage URI of the jar file that contains the main class. The execution args are passed in as a sequence of named process arguments (--key=value).
PythonScriptFile string
The Cloud Storage URI of the main Python file to use as the driver. Must be a .py file. The execution args are passed in as a sequence of named process arguments (--key=value).
SqlScript string
The query text. The execution args are used to declare a set of script variables (set key="value";).
SqlScriptFile string
A reference to a query file. This can be the Cloud Storage URI of the query file or it can be the path to a SqlScript Content. The execution args are used to declare a set of script variables (set key="value";).
ArchiveUris []string
Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
FileUris []string
Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.
InfrastructureSpec GoogleCloudDataplexV1TaskInfrastructureSpec
Optional. Infrastructure specification for the execution.
MainClass string
The name of the driver's main class. The jar file that contains the class must be in the default CLASSPATH or specified in jar_file_uris. The execution args are passed in as a sequence of named process arguments (--key=value).
MainJarFileUri string
The Cloud Storage URI of the jar file that contains the main class. The execution args are passed in as a sequence of named process arguments (--key=value).
PythonScriptFile string
The Cloud Storage URI of the main Python file to use as the driver. Must be a .py file. The execution args are passed in as a sequence of named process arguments (--key=value).
SqlScript string
The query text. The execution args are used to declare a set of script variables (set key="value";).
SqlScriptFile string
A reference to a query file. This can be the Cloud Storage URI of the query file or it can be the path to a SqlScript Content. The execution args are used to declare a set of script variables (set key="value";).
archiveUris List<String>
Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
fileUris List<String>
Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.
infrastructureSpec GoogleCloudDataplexV1TaskInfrastructureSpec
Optional. Infrastructure specification for the execution.
mainClass String
The name of the driver's main class. The jar file that contains the class must be in the default CLASSPATH or specified in jar_file_uris. The execution args are passed in as a sequence of named process arguments (--key=value).
mainJarFileUri String
The Cloud Storage URI of the jar file that contains the main class. The execution args are passed in as a sequence of named process arguments (--key=value).
pythonScriptFile String
The Cloud Storage URI of the main Python file to use as the driver. Must be a .py file. The execution args are passed in as a sequence of named process arguments (--key=value).
sqlScript String
The query text. The execution args are used to declare a set of script variables (set key="value";).
sqlScriptFile String
A reference to a query file. This can be the Cloud Storage URI of the query file or it can be the path to a SqlScript Content. The execution args are used to declare a set of script variables (set key="value";).
archiveUris string[]
Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
fileUris string[]
Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.
infrastructureSpec GoogleCloudDataplexV1TaskInfrastructureSpec
Optional. Infrastructure specification for the execution.
mainClass string
The name of the driver's main class. The jar file that contains the class must be in the default CLASSPATH or specified in jar_file_uris. The execution args are passed in as a sequence of named process arguments (--key=value).
mainJarFileUri string
The Cloud Storage URI of the jar file that contains the main class. The execution args are passed in as a sequence of named process arguments (--key=value).
pythonScriptFile string
The Cloud Storage URI of the main Python file to use as the driver. Must be a .py file. The execution args are passed in as a sequence of named process arguments (--key=value).
sqlScript string
The query text. The execution args are used to declare a set of script variables (set key="value";).
sqlScriptFile string
A reference to a query file. This can be the Cloud Storage URI of the query file or it can be the path to a SqlScript Content. The execution args are used to declare a set of script variables (set key="value";).
archive_uris Sequence[str]
Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
file_uris Sequence[str]
Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.
infrastructure_spec GoogleCloudDataplexV1TaskInfrastructureSpec
Optional. Infrastructure specification for the execution.
main_class str
The name of the driver's main class. The jar file that contains the class must be in the default CLASSPATH or specified in jar_file_uris. The execution args are passed in as a sequence of named process arguments (--key=value).
main_jar_file_uri str
The Cloud Storage URI of the jar file that contains the main class. The execution args are passed in as a sequence of named process arguments (--key=value).
python_script_file str
The Cloud Storage URI of the main Python file to use as the driver. Must be a .py file. The execution args are passed in as a sequence of named process arguments (--key=value).
sql_script str
The query text. The execution args are used to declare a set of script variables (set key="value";).
sql_script_file str
A reference to a query file. This can be the Cloud Storage URI of the query file or it can be the path to a SqlScript Content. The execution args are used to declare a set of script variables (set key="value";).
archiveUris List<String>
Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
fileUris List<String>
Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.
infrastructureSpec Property Map
Optional. Infrastructure specification for the execution.
mainClass String
The name of the driver's main class. The jar file that contains the class must be in the default CLASSPATH or specified in jar_file_uris. The execution args are passed in as a sequence of named process arguments (--key=value).
mainJarFileUri String
The Cloud Storage URI of the jar file that contains the main class. The execution args are passed in as a sequence of named process arguments (--key=value).
pythonScriptFile String
The Cloud Storage URI of the main Python file to use as the driver. Must be a .py file. The execution args are passed in as a sequence of named process arguments (--key=value).
sqlScript String
The query text. The execution args are used to declare a set of script variables (set key="value";).
sqlScriptFile String
A reference to a query file. This can be the Cloud Storage URI of the query file or it can be the path to a SqlScript Content. The execution args are used to declare a set of script variables (set key="value";).
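
The Spark config treats mainClass, mainJarFileUri, pythonScriptFile, sqlScript, and sqlScriptFile as alternative entry points: set exactly one of them. A sketch of a recurring PySpark task; the lake, bucket, and service-account names are placeholders, and the named execution arg reaches the driver as --input_table=my_dataset.events.

import * as google_native from "@pulumi/google-native";

const sparkTask = new google_native.dataplex.v1.Task("daily-pyspark", {
    lakeId: "my-lake",
    taskId: "daily-pyspark",
    location: "us-central1",
    executionSpec: {
        serviceAccount: "dataplex-tasks@my-project.iam.gserviceaccount.com",
        // Passed to the driver as --input_table=my_dataset.events.
        args: { input_table: "my_dataset.events" },
    },
    triggerSpec: { type: "RECURRING", schedule: "CRON_TZ=Etc/UTC 30 4 * * *" },
    spark: {
        pythonScriptFile: "gs://my-bucket/jobs/transform.py",
        fileUris: ["gs://my-bucket/config/transform.json"],
        archiveUris: ["gs://my-bucket/deps/site-packages.tar.gz"],
        infrastructureSpec: {
            batch: { executorsCount: 2, maxExecutorsCount: 20 },
            vpcNetwork: { networkTags: ["dataplex"] },
        },
    },
});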

GoogleCloudDataplexV1TaskSparkTaskConfigResponse
, GoogleCloudDataplexV1TaskSparkTaskConfigResponseArgs

ArchiveUris This property is required. List<string>
Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
FileUris This property is required. List<string>
Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.
InfrastructureSpec This property is required. Pulumi.GoogleNative.Dataplex.V1.Inputs.GoogleCloudDataplexV1TaskInfrastructureSpecResponse
Optional. Infrastructure specification for the execution.
MainClass This property is required. string
The name of the driver's main class. The jar file that contains the class must be in the default CLASSPATH or specified in jar_file_uris. The execution args are passed in as a sequence of named process arguments (--key=value).
MainJarFileUri This property is required. string
The Cloud Storage URI of the jar file that contains the main class. The execution args are passed in as a sequence of named process arguments (--key=value).
PythonScriptFile This property is required. string
The Cloud Storage URI of the main Python file to use as the driver. Must be a .py file. The execution args are passed in as a sequence of named process arguments (--key=value).
SqlScript This property is required. string
The query text. The execution args are used to declare a set of script variables (set key="value";).
SqlScriptFile This property is required. string
A reference to a query file. This can be the Cloud Storage URI of the query file or it can be the path to a SqlScript Content. The execution args are used to declare a set of script variables (set key="value";).
ArchiveUris This property is required. []string
Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
FileUris This property is required. []string
Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.
InfrastructureSpec This property is required. GoogleCloudDataplexV1TaskInfrastructureSpecResponse
Optional. Infrastructure specification for the execution.
MainClass This property is required. string
The name of the driver's main class. The jar file that contains the class must be in the default CLASSPATH or specified in jar_file_uris. The execution args are passed in as a sequence of named process arguments (--key=value).
MainJarFileUri This property is required. string
The Cloud Storage URI of the jar file that contains the main class. The execution args are passed in as a sequence of named process arguments (--key=value).
PythonScriptFile This property is required. string
The Cloud Storage URI of the main Python file to use as the driver. Must be a .py file. The execution args are passed in as a sequence of named process arguments (--key=value).
SqlScript This property is required. string
The query text. The execution args are used to declare a set of script variables (set key="value";).
SqlScriptFile This property is required. string
A reference to a query file. This can be the Cloud Storage URI of the query file or it can be the path to a SqlScript Content. The execution args are used to declare a set of script variables (set key="value";).
archiveUris This property is required. List<String>
Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
fileUris This property is required. List<String>
Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.
infrastructureSpec This property is required. GoogleCloudDataplexV1TaskInfrastructureSpecResponse
Optional. Infrastructure specification for the execution.
mainClass This property is required. String
The name of the driver's main class. The jar file that contains the class must be in the default CLASSPATH or specified in jar_file_uris. The execution args are passed in as a sequence of named process arguments (--key=value).
mainJarFileUri This property is required. String
The Cloud Storage URI of the jar file that contains the main class. The execution args are passed in as a sequence of named process arguments (--key=value).
pythonScriptFile This property is required. String
The Cloud Storage URI of the main Python file to use as the driver. Must be a .py file. The execution args are passed in as a sequence of named process arguments (--key=value).
sqlScript This property is required. String
The query text. The execution args are used to declare a set of script variables (set key="value";).
sqlScriptFile This property is required. String
A reference to a query file. This can be the Cloud Storage URI of the query file or it can be the path to a SqlScript Content. The execution args are used to declare a set of script variables (set key="value";).
archiveUris This property is required. string[]
Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
fileUris This property is required. string[]
Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.
infrastructureSpec This property is required. GoogleCloudDataplexV1TaskInfrastructureSpecResponse
Optional. Infrastructure specification for the execution.
mainClass This property is required. string
The name of the driver's main class. The jar file that contains the class must be in the default CLASSPATH or specified in jar_file_uris. The execution args are passed in as a sequence of named process arguments (--key=value).
mainJarFileUri This property is required. string
The Cloud Storage URI of the jar file that contains the main class. The execution args are passed in as a sequence of named process arguments (--key=value).
pythonScriptFile This property is required. string
The Cloud Storage URI of the main Python file to use as the driver. Must be a .py file. The execution args are passed in as a sequence of named process arguments (--key=value).
sqlScript This property is required. string
The query text. The execution args are used to declare a set of script variables (set key="value";).
sqlScriptFile This property is required. string
A reference to a query file. This can be the Cloud Storage URI of the query file or it can be the path to a SqlScript Content. The execution args are used to declare a set of script variables (set key="value";).
archive_uris This property is required. Sequence[str]
Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
file_uris This property is required. Sequence[str]
Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.
infrastructure_spec This property is required. GoogleCloudDataplexV1TaskInfrastructureSpecResponse
Optional. Infrastructure specification for the execution.
main_class This property is required. str
The name of the driver's main class. The jar file that contains the class must be in the default CLASSPATH or specified in jar_file_uris. The execution args are passed in as a sequence of named process arguments (--key=value).
main_jar_file_uri This property is required. str
The Cloud Storage URI of the jar file that contains the main class. The execution args are passed in as a sequence of named process arguments (--key=value).
python_script_file This property is required. str
The Cloud Storage URI of the main Python file to use as the driver. Must be a .py file. The execution args are passed in as a sequence of named process arguments (--key=value).
sql_script This property is required. str
The query text. The execution args are used to declare a set of script variables (set key="value";).
sql_script_file This property is required. str
A reference to a query file. This can be the Cloud Storage URI of the query file or it can be the path to a SqlScript Content. The execution args are used to declare a set of script variables (set key="value";).
archiveUris This property is required. List<String>
Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
fileUris This property is required. List<String>
Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.
infrastructureSpec This property is required. Property Map
Optional. Infrastructure specification for the execution.
mainClass This property is required. String
The name of the driver's main class. The jar file that contains the class must be in the default CLASSPATH or specified in jar_file_uris. The execution args are passed in as a sequence of named process arguments (--key=value).
mainJarFileUri This property is required. String
The Cloud Storage URI of the jar file that contains the main class. The execution args are passed in as a sequence of named process arguments (--key=value).
pythonScriptFile This property is required. String
The Cloud Storage URI of the main Python file to use as the driver. Must be a .py file. The execution args are passed in as a sequence of named process arguments (--key=value).
sqlScript This property is required. String
The query text. The execution args are used to declare a set of script variables (set key="value";).
sqlScriptFile This property is required. String
A reference to a query file. This can be the Cloud Storage URI of the query file or it can be the path to a SqlScript Content. The execution args are used to declare a set of script variables (set key="value";).

GoogleCloudDataplexV1TaskTriggerSpec
, GoogleCloudDataplexV1TaskTriggerSpecArgs

Type This property is required. Pulumi.GoogleNative.Dataplex.V1.GoogleCloudDataplexV1TaskTriggerSpecType
Immutable. Trigger type of the user-specified Task.
Disabled bool
Optional. Prevent the task from executing. This does not cancel already running tasks. It is intended to temporarily disable RECURRING tasks.
MaxRetries int
Optional. Number of retry attempts before aborting. Set to zero to never attempt to retry a failed task.
Schedule string
Optional. Cron schedule (https://en.wikipedia.org/wiki/Cron) for running tasks periodically. To explicitly set a timezone to the cron tab, apply a prefix in the cron tab: "CRON_TZ=${IANA_TIME_ZONE}" or "TZ=${IANA_TIME_ZONE}". The ${IANA_TIME_ZONE} may only be a valid string from IANA time zone database. For example, CRON_TZ=America/New_York 1 * * * *, or TZ=America/New_York 1 * * * *. This field is required for RECURRING tasks.
StartTime string
Optional. The first run of the task will be after this time. If not specified, the task will run shortly after being submitted if ON_DEMAND and based on the schedule if RECURRING.
Type This property is required. GoogleCloudDataplexV1TaskTriggerSpecType
Immutable. Trigger type of the user-specified Task.
Disabled bool
Optional. Prevent the task from executing. This does not cancel already running tasks. It is intended to temporarily disable RECURRING tasks.
MaxRetries int
Optional. Number of retry attempts before aborting. Set to zero to never attempt to retry a failed task.
Schedule string
Optional. Cron schedule (https://en.wikipedia.org/wiki/Cron) for running tasks periodically. To explicitly set a timezone to the cron tab, apply a prefix in the cron tab: "CRON_TZ=${IANA_TIME_ZONE}" or "TZ=${IANA_TIME_ZONE}". The ${IANA_TIME_ZONE} may only be a valid string from IANA time zone database. For example, CRON_TZ=America/New_York 1 * * * *, or TZ=America/New_York 1 * * * *. This field is required for RECURRING tasks.
StartTime string
Optional. The first run of the task will be after this time. If not specified, the task will run shortly after being submitted if ON_DEMAND and based on the schedule if RECURRING.
type This property is required. GoogleCloudDataplexV1TaskTriggerSpecType
Immutable. Trigger type of the user-specified Task.
disabled Boolean
Optional. Prevent the task from executing. This does not cancel already running tasks. It is intended to temporarily disable RECURRING tasks.
maxRetries Integer
Optional. Number of retry attempts before aborting. Set to zero to never attempt to retry a failed task.
schedule String
Optional. Cron schedule (https://en.wikipedia.org/wiki/Cron) for running tasks periodically. To explicitly set a timezone to the cron tab, apply a prefix in the cron tab: "CRON_TZ=${IANA_TIME_ZONE}" or "TZ=${IANA_TIME_ZONE}". The ${IANA_TIME_ZONE} may only be a valid string from IANA time zone database. For example, CRON_TZ=America/New_York 1 * * * *, or TZ=America/New_York 1 * * * *. This field is required for RECURRING tasks.
startTime String
Optional. The first run of the task will be after this time. If not specified, the task will run shortly after being submitted if ON_DEMAND and based on the schedule if RECURRING.
type This property is required. GoogleCloudDataplexV1TaskTriggerSpecType
Immutable. Trigger type of the user-specified Task.
disabled boolean
Optional. Prevent the task from executing. This does not cancel already running tasks. It is intended to temporarily disable RECURRING tasks.
maxRetries number
Optional. Number of retry attempts before aborting. Set to zero to never attempt to retry a failed task.
schedule string
Optional. Cron schedule (https://en.wikipedia.org/wiki/Cron) for running tasks periodically. To explicitly set a timezone to the cron tab, apply a prefix in the cron tab: "CRON_TZ=${IANA_TIME_ZONE}" or "TZ=${IANA_TIME_ZONE}". The ${IANA_TIME_ZONE} may only be a valid string from IANA time zone database. For example, CRON_TZ=America/New_York 1 * * * *, or TZ=America/New_York 1 * * * *. This field is required for RECURRING tasks.
startTime string
Optional. The first run of the task will be after this time. If not specified, the task will run shortly after being submitted if ON_DEMAND and based on the schedule if RECURRING.
type This property is required. GoogleCloudDataplexV1TaskTriggerSpecType
Immutable. Trigger type of the user-specified Task.
disabled bool
Optional. Prevent the task from executing. This does not cancel already running tasks. It is intended to temporarily disable RECURRING tasks.
max_retries int
Optional. Number of retry attempts before aborting. Set to zero to never attempt to retry a failed task.
schedule str
Optional. Cron schedule (https://en.wikipedia.org/wiki/Cron) for running tasks periodically. To explicitly set a timezone to the cron tab, apply a prefix in the cron tab: "CRON_TZ=${IANA_TIME_ZONE}" or "TZ=${IANA_TIME_ZONE}". The ${IANA_TIME_ZONE} may only be a valid string from IANA time zone database. For example, CRON_TZ=America/New_York 1 * * * *, or TZ=America/New_York 1 * * * *. This field is required for RECURRING tasks.
start_time str
Optional. The first run of the task will be after this time. If not specified, the task will run shortly after being submitted if ON_DEMAND and based on the schedule if RECURRING.
type This property is required. "TYPE_UNSPECIFIED" | "ON_DEMAND" | "RECURRING"
Immutable. Trigger type of the user-specified Task.
disabled Boolean
Optional. Prevent the task from executing. This does not cancel already running tasks. It is intended to temporarily disable RECURRING tasks.
maxRetries Number
Optional. Number of retry attempts before aborting. Set to zero to never attempt to retry a failed task.
schedule String
Optional. Cron schedule (https://en.wikipedia.org/wiki/Cron) for running tasks periodically. To explicitly set a timezone to the cron tab, apply a prefix in the cron tab: "CRON_TZ=${IANA_TIME_ZONE}" or "TZ=${IANA_TIME_ZONE}". The ${IANA_TIME_ZONE} may only be a valid string from IANA time zone database. For example, CRON_TZ=America/New_York 1 * * * *, or TZ=America/New_York 1 * * * *. This field is required for RECURRING tasks.
startTime String
Optional. The first run of the task will be after this time. If not specified, the task will run shortly after being submitted if ON_DEMAND and based on the schedule if RECURRING.
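
Two trigger shapes cover most tasks: ON_DEMAND fires once shortly after creation, while RECURRING is driven by a cron schedule (required for that type). A sketch of both, with the timezone prefix and the optional retry, start-time, and disable knobs shown on the recurring one; all values are illustrative.

// Runs once, shortly after the task is created.
const onDemandTrigger = { type: "ON_DEMAND" };

// Runs daily at 06:15 New York time, retrying a failed run up to 3 times.
// Setting disabled: true would pause future runs without cancelling ones in flight.
const recurringTrigger = {
    type: "RECURRING",
    schedule: "CRON_TZ=America/New_York 15 6 * * *",
    maxRetries: 3,
    startTime: "2024-01-01T00:00:00Z",
};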

GoogleCloudDataplexV1TaskTriggerSpecResponse
, GoogleCloudDataplexV1TaskTriggerSpecResponseArgs

Disabled This property is required. bool
Optional. Prevent the task from executing. This does not cancel already running tasks. It is intended to temporarily disable RECURRING tasks.
MaxRetries This property is required. int
Optional. Number of retry attempts before aborting. Set to zero to never attempt to retry a failed task.
Schedule This property is required. string
Optional. Cron schedule (https://en.wikipedia.org/wiki/Cron) for running tasks periodically. To explicitly set a timezone to the cron tab, apply a prefix in the cron tab: "CRON_TZ=${IANA_TIME_ZONE}" or "TZ=${IANA_TIME_ZONE}". The ${IANA_TIME_ZONE} may only be a valid string from IANA time zone database. For example, CRON_TZ=America/New_York 1 * * * *, or TZ=America/New_York 1 * * * *. This field is required for RECURRING tasks.
StartTime This property is required. string
Optional. The first run of the task will be after this time. If not specified, the task will run shortly after being submitted if ON_DEMAND and based on the schedule if RECURRING.
Type This property is required. string
Immutable. Trigger type of the user-specified Task.
Disabled This property is required. bool
Optional. Prevent the task from executing. This does not cancel already running tasks. It is intended to temporarily disable RECURRING tasks.
MaxRetries This property is required. int
Optional. Number of retry attempts before aborting. Set to zero to never attempt to retry a failed task.
Schedule This property is required. string
Optional. Cron schedule (https://en.wikipedia.org/wiki/Cron) for running tasks periodically. To explicitly set a timezone to the cron tab, apply a prefix in the cron tab: "CRON_TZ=${IANA_TIME_ZONE}" or "TZ=${IANA_TIME_ZONE}". The ${IANA_TIME_ZONE} may only be a valid string from IANA time zone database. For example, CRON_TZ=America/New_York 1 * * * *, or TZ=America/New_York 1 * * * *. This field is required for RECURRING tasks.
StartTime This property is required. string
Optional. The first run of the task will be after this time. If not specified, the task will run shortly after being submitted if ON_DEMAND and based on the schedule if RECURRING.
Type This property is required. string
Immutable. Trigger type of the user-specified Task.
disabled This property is required. Boolean
Optional. Prevent the task from executing. This does not cancel already running tasks. It is intended to temporarily disable RECURRING tasks.
maxRetries This property is required. Integer
Optional. Number of retry attempts before aborting. Set to zero to never attempt to retry a failed task.
schedule This property is required. String
Optional. Cron schedule (https://en.wikipedia.org/wiki/Cron) for running tasks periodically. To explicitly set a timezone to the cron tab, apply a prefix in the cron tab: "CRON_TZ=${IANA_TIME_ZONE}" or "TZ=${IANA_TIME_ZONE}". The ${IANA_TIME_ZONE} may only be a valid string from IANA time zone database. For example, CRON_TZ=America/New_York 1 * * * *, or TZ=America/New_York 1 * * * *. This field is required for RECURRING tasks.
startTime This property is required. String
Optional. The first run of the task will be after this time. If not specified, the task will run shortly after being submitted if ON_DEMAND and based on the schedule if RECURRING.
type This property is required. String
Immutable. Trigger type of the user-specified Task.
disabled This property is required. boolean
Optional. Prevent the task from executing. This does not cancel already running tasks. It is intended to temporarily disable RECURRING tasks.
maxRetries This property is required. number
Optional. Number of retry attempts before aborting. Set to zero to never attempt to retry a failed task.
schedule This property is required. string
Optional. Cron schedule (https://en.wikipedia.org/wiki/Cron) for running tasks periodically. To explicitly set a time zone for the cron tab, prefix it with "CRON_TZ=${IANA_TIME_ZONE}" or "TZ=${IANA_TIME_ZONE}", where ${IANA_TIME_ZONE} must be a valid name from the IANA time zone database. For example, CRON_TZ=America/New_York 1 * * * * or TZ=America/New_York 1 * * * *. This field is required for RECURRING tasks.
startTime This property is required. string
Optional. The first run of the task will be after this time. If not specified, the task will run shortly after being submitted if ON_DEMAND and based on the schedule if RECURRING.
type This property is required. string
Immutable. Trigger type of the user-specified Task.
disabled This property is required. bool
Optional. Prevent the task from executing. This does not cancel already running tasks. It is intended to temporarily disable RECURRING tasks.
max_retries This property is required. int
Optional. Number of retry attempts before aborting. Set to zero to never attempt to retry a failed task.
schedule This property is required. str
Optional. Cron schedule (https://en.wikipedia.org/wiki/Cron) for running tasks periodically. To explicitly set a time zone for the cron tab, prefix it with "CRON_TZ=${IANA_TIME_ZONE}" or "TZ=${IANA_TIME_ZONE}", where ${IANA_TIME_ZONE} must be a valid name from the IANA time zone database. For example, CRON_TZ=America/New_York 1 * * * * or TZ=America/New_York 1 * * * *. This field is required for RECURRING tasks.
start_time This property is required. str
Optional. The first run of the task will be after this time. If not specified, the task will run shortly after being submitted if ON_DEMAND and based on the schedule if RECURRING.
type This property is required. str
Immutable. Trigger type of the user-specified Task.
disabled This property is required. Boolean
Optional. Prevent the task from executing. This does not cancel already running tasks. It is intended to temporarily disable RECURRING tasks.
maxRetries This property is required. Number
Optional. Number of retry attempts before aborting. Set to zero to never attempt to retry a failed task.
schedule This property is required. String
Optional. Cron schedule (https://en.wikipedia.org/wiki/Cron) for running tasks periodically. To explicitly set a time zone for the cron tab, prefix it with "CRON_TZ=${IANA_TIME_ZONE}" or "TZ=${IANA_TIME_ZONE}", where ${IANA_TIME_ZONE} must be a valid name from the IANA time zone database. For example, CRON_TZ=America/New_York 1 * * * * or TZ=America/New_York 1 * * * *. This field is required for RECURRING tasks.
startTime This property is required. String
Optional. The first run of the task will be after this time. If not specified, the task will run shortly after being submitted if ON_DEMAND and based on the schedule if RECURRING.
type This property is required. String
Immutable. Trigger type of the user-specified Task.
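
The trigger properties above combine as in the following minimal sketch, which configures a RECURRING trigger with a time-zone-prefixed cron schedule using the Python SDK. All identifiers (project, location, lake and task IDs, service account, Spark main class) are placeholder assumptions, and the execution and Spark settings are illustrative rather than a complete configuration.

import pulumi_google_native as google_native

# Sketch only: a recurring Dataplex task driven by a cron schedule.
# Every identifier below ("my-project", "my-lake", the service account,
# the Spark main class) is a placeholder, not a value from this page.
nightly_task = google_native.dataplex.v1.Task(
    "nightly-task",
    project="my-project",
    location="us-central1",
    lake_id="my-lake",
    task_id="nightly-spark-task",
    trigger_spec=google_native.dataplex.v1.GoogleCloudDataplexV1TaskTriggerSpecArgs(
        type=google_native.dataplex.v1.GoogleCloudDataplexV1TaskTriggerSpecType.RECURRING,
        # Run at 02:30 every day, evaluated in the America/New_York time zone.
        schedule="CRON_TZ=America/New_York 30 2 * * *",
        max_retries=3,    # retry a failed run up to three times before aborting
        disabled=False,   # set True to pause the schedule without deleting the task
    ),
    execution_spec=google_native.dataplex.v1.GoogleCloudDataplexV1TaskExecutionSpecArgs(
        service_account="dataplex-task@my-project.iam.gserviceaccount.com",  # placeholder
    ),
    spark=google_native.dataplex.v1.GoogleCloudDataplexV1TaskSparkTaskConfigArgs(
        main_class="com.example.NightlyJob",  # placeholder
    ),
)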

GoogleCloudDataplexV1TaskTriggerSpecType, GoogleCloudDataplexV1TaskTriggerSpecTypeArgs

TypeUnspecified
TYPE_UNSPECIFIED
Unspecified trigger type.
OnDemand
ON_DEMAND
The task runs once, shortly after task creation.
Recurring
RECURRING
The task is scheduled to run periodically.
GoogleCloudDataplexV1TaskTriggerSpecTypeTypeUnspecified
TYPE_UNSPECIFIED
Unspecified trigger type.
GoogleCloudDataplexV1TaskTriggerSpecTypeOnDemand
ON_DEMAND
The task runs once, shortly after task creation.
GoogleCloudDataplexV1TaskTriggerSpecTypeRecurring
RECURRING
The task is scheduled to run periodically.
TypeUnspecified
TYPE_UNSPECIFIED
Unspecified trigger type.
OnDemand
ON_DEMAND
The task runs once, shortly after task creation.
Recurring
RECURRING
The task is scheduled to run periodically.
TypeUnspecified
TYPE_UNSPECIFIED
Unspecified trigger type.
OnDemand
ON_DEMAND
The task runs once, shortly after task creation.
Recurring
RECURRING
The task is scheduled to run periodically.
TYPE_UNSPECIFIED
TYPE_UNSPECIFIED
Unspecified trigger type.
ON_DEMAND
ON_DEMAND
The task runs once, shortly after task creation.
RECURRING
RECURRING
The task is scheduled to run periodically.
"TYPE_UNSPECIFIED"
TYPE_UNSPECIFIED
Unspecified trigger type.
"ON_DEMAND"
ON_DEMAND
The task runs once, shortly after task creation.
"RECURRING"
RECURRING
The task is scheduled to run periodically.
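
As a brief illustration, the generated enum members can be passed directly wherever the trigger type is expected; the raw string values from the table above are equivalent. The max_retries value below is an illustrative assumption.

import pulumi_google_native as google_native

# Sketch: an on-demand trigger spec. The enum member and the raw string
# "ON_DEMAND" are interchangeable values for the type field.
on_demand_trigger = google_native.dataplex.v1.GoogleCloudDataplexV1TaskTriggerSpecArgs(
    type=google_native.dataplex.v1.GoogleCloudDataplexV1TaskTriggerSpecType.ON_DEMAND,
    max_retries=0,  # illustrative: do not retry a failed one-time run
)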

Package Details

Repository
Google Cloud Native pulumi/pulumi-google-native
License
Apache-2.0
