Commands for working with Dagster assets.

dagster asset [OPTIONS] COMMAND [ARGS]...

Commands:
  wipe  Eliminate asset key indexes from event logs.
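A sketch of invoking the wipe subcommand; the positional asset-key argument and the --all flag are assumptions based on typical usage rather than details confirmed by this reference:

```shell
# Remove the event-log index for a single asset key
# (my_asset_key is a placeholder)
dagster asset wipe my_asset_key

# Or wipe the indexes for every asset key in the instance
dagster asset wipe --all
```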
Commands for debugging Dagster pipeline/job runs.

dagster debug [OPTIONS] COMMAND [ARGS]...

Commands:
  export  Export the relevant artifacts for a…
  import  Import the relevant artifacts for a…
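A round-trip sketch of the two subcommands; the run-id placeholder and output filename are illustrative, and the exact positional arguments are assumptions:

```shell
# Export the artifacts for a run to a compressed debug file
dagster debug export <run_id> run_debug.gzip

# Later, import that file into another Dagster instance
dagster debug import run_debug.gzip
```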
Commands for working with the current Dagster instance.

dagster instance [OPTIONS] COMMAND [ARGS]...

Commands:
  info     List the information about the current…
  migrate  Automatically migrate an out of date…
  reindex  Rebuild index over historical runs for…
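A typical maintenance sequence, assuming DAGSTER_HOME points at the instance to operate on:

```shell
# Print configuration for the instance resolved from $DAGSTER_HOME
dagster instance info

# Bring an out-of-date instance schema up to date
dagster instance migrate

# Rebuild the index over historical runs
dagster instance reindex
```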
Commands for working with Dagster jobs.

dagster job [OPTIONS] COMMAND [ARGS]...

Commands:
  backfill         Backfill a partitioned job.
  execute          Execute a job.
  launch           Launch a job using the run launcher…
  list             List the jobs in a repository.
  list_versions    Display the freshness of memoized results…
  print            Print a job.
  scaffold_config  Scaffold the config for a job.
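A sketch of listing and executing jobs from a repository loaded from a Python file; the file name, job name, and the -j/--job selector flag are assumptions for illustration:

```shell
# List the jobs defined in a repository loaded from repo.py
dagster job list -f repo.py

# Execute one of them; -j selects the job when the repository
# contains more than one
dagster job execute -f repo.py -j hello_job
```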
Commands for working with Dagster pipeline/job runs.

dagster run [OPTIONS] COMMAND [ARGS]...

Commands:
  delete  Delete a run by id and its associated…
  list    List the runs in the current Dagster…
  wipe    Eliminate all run history and event logs.
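The three subcommands in sketch form; the run-id placeholder is illustrative:

```shell
# Show runs recorded in the current instance
dagster run list

# Delete a single run and its associated event logs
dagster run delete <run_id>

# Drop all run history and event logs (destructive)
dagster run wipe
```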
Commands for working with Dagster schedules.

dagster schedule [OPTIONS] COMMAND [ARGS]...

Commands:
  debug    Debug information about the scheduler.
  list     List all schedules that correspond to a…
  logs     Get logs for a schedule.
  preview  Preview changes that will be performed by…
  restart  Restart a running schedule.
  start    Start an existing schedule.
  stop     Stop an existing schedule.
  wipe     Delete the schedule history and turn off…
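A sketch of a common schedule workflow; the schedule name is a placeholder, and the exact positional arguments are assumptions:

```shell
# Show the schedules defined in the workspace, then turn one on
dagster schedule list
dagster schedule start my_schedule

# Inspect the logs for that schedule
dagster schedule logs my_schedule
```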
Commands for working with Dagster sensors.

dagster sensor [OPTIONS] COMMAND [ARGS]...

Commands:
  cursor   Set the cursor value for an existing sensor.
  list     List all sensors that correspond to a…
  preview  Preview an existing sensor execution.
  start    Start an existing sensor.
  stop     Stop an existing sensor.
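A sketch of starting a sensor and seeding its cursor; the sensor name, cursor value, and the --set flag on the cursor subcommand are illustrative assumptions:

```shell
# Turn a sensor on, then set its cursor so it skips older events
dagster sensor start my_sensor
dagster sensor cursor my_sensor --set 2021-09-01

# Dry-run the sensor's evaluation function without submitting runs
dagster sensor preview my_sensor
```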
Run a GraphQL query against the dagster interface to a specified repository or pipeline/job.
Can only use ONE of --workspace/-w, --python-file/-f, --module-name/-m, --grpc-port, --grpc-socket.
Examples:
dagster-graphql
dagster-graphql -y path/to/workspace.yaml
dagster-graphql -f path/to/file.py -a define_repo
dagster-graphql -m some_module -a define_repo
dagster-graphql -f path/to/file.py -a define_pipeline
dagster-graphql -m some_module -a define_pipeline
dagster-graphql [OPTIONS]
Options:
  --version                    Show the version and exit.
  -t, --text <text>            GraphQL document to execute, passed as a string.
  -f, --file <file>            GraphQL document to execute, passed as a file.
  -p, --predefined <predefined>
                               GraphQL document to execute, from a predefined set provided by
                               dagster-graphql. Choices: launchPipelineExecution.
  -v, --variables <variables>  A JSON-encoded string containing the variables for GraphQL
                               execution.
  -r, --remote <remote>        A URL for a remote instance running the dagit server to send the
                               GraphQL request to.
  -o, --output <output>        A file path to store the GraphQL response to. This flag is useful
                               when making pipeline/job execution queries, since pipeline/job
                               execution causes logs to print to stdout and stderr.
  --ephemeral-instance         Use an ephemeral DagsterInstance instead of resolving via
                               DAGSTER_HOME.
  --empty-workspace            Allow an empty workspace.
  -w, --workspace <workspace>  Path to workspace file. Argument can be provided multiple times.
  -d, --working-directory <working_directory>
                               Specify working directory to use when loading the repository or
                               pipeline/job.
  -f, --python-file <python_file>
                               Specify Python file where repository or pipeline/job function lives.
  --package-name <package_name>
                               Specify Python package where repository or pipeline/job function
                               lives.
  -m, --module-name <module_name>
                               Specify module where repository or pipeline/job function lives.
  -a, --attribute <attribute>  Attribute that is either 1) a repository or pipeline/job or 2) a
                               function that returns a repository or pipeline/job.
  --grpc-port <grpc_port>      Port to use to connect to gRPC server.
  --grpc-socket <grpc_socket>  Named socket to use to connect to gRPC server.
  --grpc-host <grpc_host>      Host to use to connect to gRPC server; defaults to localhost.
  --use-ssl                    Use a secure channel when connecting to the gRPC server.
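A sketch of combining the target, query, and output options; the repository file, attribute name, and the query body are illustrative, not taken from this reference:

```shell
# Query the repository loaded from repo.py for its repository names,
# writing the response to a file instead of stdout
dagster-graphql -f repo.py -a define_repo \
  -t '{ repositoriesOrError { ... on RepositoryConnection { nodes { name } } } }' \
  -o response.json
```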
Run dagit. Loads a repository or pipeline/job.
Can only use ONE of --workspace/-w, --python-file/-f, --module-name/-m, --grpc-port, --grpc-socket.
Examples:
dagit (works if .workspace.yaml exists)
dagit -w path/to/workspace.yaml
dagit -f path/to/file.py
dagit -f path/to/file.py -d path/to/working_directory
dagit -m some_module
dagit -f path/to/file.py -a define_repo
dagit -m some_module -a define_repo
dagit -p 3333
Options can also provide arguments via environment variables prefixed with DAGIT
For example, DAGIT_PORT=3333 dagit
dagit [OPTIONS]
Options:
  --use-ssl                    Use a secure channel when connecting to the gRPC server.
  --grpc-host <grpc_host>      Host to use to connect to gRPC server; defaults to localhost.
  --grpc-socket <grpc_socket>  Named socket to use to connect to gRPC server.
  --grpc-port <grpc_port>      Port to use to connect to gRPC server.
  -a, --attribute <attribute>  Attribute that is either 1) a repository or pipeline/job or 2) a
                               function that returns a repository or pipeline/job.
  -m, --module-name <module_name>
                               Specify module where repository or pipeline/job function lives.
  --package-name <package_name>
                               Specify Python package where repository or pipeline/job function
                               lives.
  -f, --python-file <python_file>
                               Specify Python file where repository or pipeline/job function lives.
  -d, --working-directory <working_directory>
                               Specify working directory to use when loading the repository or
                               pipeline/job.
  -w, --workspace <workspace>  Path to workspace file. Argument can be provided multiple times.
  --empty-workspace            Allow an empty workspace.
  -h, --host <host>            Host to run server on. Default: 127.0.0.1.
  -p, --port <port>            Port to run server on. Default: 3000.
  -l, --path-prefix <path_prefix>
                               The path prefix where Dagit will be hosted (e.g. /dagit).
  --db-statement-timeout <db_statement_timeout>
                               The timeout in milliseconds to set on database statements sent to
                               the DagsterInstance. Not respected in all configurations.
                               Default: 15000.
  --read-only                  Start Dagit in read-only mode, where all mutations such as
                               launching runs and turning schedules on/off are turned off.
  --suppress-warnings          Filter all warnings when hosting Dagit.
  --version                    Show the version and exit.
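A sketch combining the serving options above; the workspace path and chosen port are placeholders:

```shell
# Serve dagit on a non-default port, read-only, under a path prefix
dagit -w workspace.yaml -p 4000 --read-only --path-prefix /dagit

# Equivalent port selection via the DAGIT-prefixed environment variable form
DAGIT_PORT=4000 dagit -w workspace.yaml --read-only --path-prefix /dagit
```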
Run any daemons configured on the DagsterInstance.
dagster-daemon run [OPTIONS]
Options:
  --use-ssl                    Use a secure channel when connecting to the gRPC server.
  --grpc-host <grpc_host>      Host to use to connect to gRPC server; defaults to localhost.
  --grpc-socket <grpc_socket>  Named socket to use to connect to gRPC server.
  --grpc-port <grpc_port>      Port to use to connect to gRPC server.
  -a, --attribute <attribute>  Attribute that is either 1) a repository or pipeline/job or 2) a
                               function that returns a repository or pipeline/job.
  -m, --module-name <module_name>
                               Specify module where repository or pipeline/job function lives.
  --package-name <package_name>
                               Specify Python package where repository or pipeline/job function
                               lives.
  -f, --python-file <python_file>
                               Specify Python file where repository or pipeline/job function lives.
  -d, --working-directory <working_directory>
                               Specify working directory to use when loading the repository or
                               pipeline/job.
  -w, --workspace <workspace>  Path to workspace file. Argument can be provided multiple times.
  --empty-workspace            Allow an empty workspace.
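A sketch of launching the daemon; the DAGSTER_HOME path and workspace file are placeholders. The daemon must resolve the same DAGSTER_HOME as dagit so that both operate on one instance:

```shell
# Point the daemon at the shared instance, then run it against a workspace
export DAGSTER_HOME=/opt/dagster/home
dagster-daemon run -w workspace.yaml
```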
Log all heartbeat statuses
dagster-daemon debug heartbeat-dump [OPTIONS]
Serve the Dagster inter-process API over GRPC
dagster api grpc [OPTIONS]
Options:
  -p, --port <port>            Port over which to serve. You must pass one and only one of
                               --port/-p or --socket/-s.
  -s, --socket <socket>        Serve over a UDS socket. You must pass one and only one of
                               --port/-p or --socket/-s.
  -h, --host <host>            Hostname at which to serve. Default is localhost.
  -n, --max_workers <max_workers>
                               Maximum number of (threaded) workers to use in the gRPC server.
  --heartbeat                  If set, the gRPC server will shut itself down when it fails to
                               receive a heartbeat after a timeout configurable with
                               --heartbeat-timeout.
  --heartbeat-timeout <heartbeat_timeout>
                               Timeout after which to shut down if --heartbeat is set and a
                               heartbeat is not received.
  --lazy-load-user-code        Wait until the first LoadRepositories call to actually load the
                               repositories, instead of loading them when the server is launched.
                               Useful for surfacing errors when the server is managed directly
                               from Dagit.
  -a, --attribute <attribute>  Attribute that is either 1) a repository or pipeline/job or 2) a
                               function that returns a repository or pipeline/job.
  -m, --module-name <module_name>
                               Specify module where repository or pipeline/job function lives.
  --package-name <package_name>
                               Specify Python package where repository or pipeline/job function
                               lives.
  -f, --python-file <python_file>
                               Specify Python file where repository or pipeline/job function lives.
  -d, --working-directory <working_directory>
                               Specify working directory to use when loading the repository or
                               pipeline/job.
  --use-python-environment-entry-point
                               If this flag is set, the server will signal to clients that they
                               should launch dagster commands using <this server's python
                               executable> -m dagster, instead of the default dagster entry point.
                               This is useful when there are multiple Python environments running
                               on the same machine, so a single dagster entry point is not enough
                               to uniquely determine the environment.
  --empty-working-directory    Indicates that the working directory should be empty and should not
                               be set to the current directory as a default.
  --ipc-output-file <ipc_output_file>
                               [INTERNAL] This option should generally not be used by users.
                               Internal param used by dagster when it automatically spawns gRPC
                               servers to communicate the success or failure of the server
                               launching.
  --fixed-server-id <fixed_server_id>
                               [INTERNAL] This option should generally not be used by users.
                               Internal param used by dagster to spawn a gRPC server with the
                               specified server id.
  --override-system-timezone <override_system_timezone>
                               [INTERNAL] This option should generally not be used by users.
                               Override the system timezone for tests.
  --log-level <log_level>      Level at which to log output from the gRPC server process.
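A sketch of serving a repository over gRPC; the port, file, and attribute names are placeholders. A workspace file can then point clients at the running server via its host and port:

```shell
# Serve repo.py's repository on a fixed port, shutting down if no
# heartbeat arrives for 60 seconds
dagster api grpc --port 4266 -f repo.py -a my_repo \
  --heartbeat --heartbeat-timeout 60
```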
Commands for working with Dagster pipelines/jobs.

dagster pipeline [OPTIONS] COMMAND [ARGS]...

Commands:
  backfill         Backfill a partitioned pipeline/job.
  execute          Execute a pipeline.
  launch           Launch a pipeline using the run launcher…
  list             List the pipelines/jobs in a repository.
  list_versions    Display the freshness of memoized results…
  print            Print a pipeline/job.
  scaffold_config  Scaffold the config for a pipeline.
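The legacy pipeline commands broadly mirror the job commands above; in this sketch the file name, pipeline name, and the -p selector flag are assumptions:

```shell
# List the pipelines/jobs in a repository loaded from repo.py
dagster pipeline list -f repo.py

# Execute a specific pipeline from that repository
dagster pipeline execute -f repo.py -p my_pipeline
```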