Airflow: get params from context.

I've tried to trigger another DAG with some parameters in a TriggerDagRunOperator, but in the triggered DAG the dag_run object is always None. The TaskFlow tutorial introduces the API added in Airflow 2.0 and contrasts it with DAGs written using the traditional paradigm.

ds (str) – anchor date in YYYY-MM-DD format to add to.

Feb 26, 2019 · I just started using Airflow; can anyone enlighten me on how to pass a parameter into PythonOperator, like below: t5_send_notification = PythonOperator(task_id='t5_send_notification', …).

Oct 11, 2021 · Documentation on the nature of context is pretty sparse at the moment. If None, the current context is used. The result of templated arguments can be checked with airflow tasks render. It can be used to parameterize your DAGs. Below is sample code: from datetime import datetime.

Jan 10, 2011 · I'm using Airflow 1.x. While each component does not require all settings, some configurations need to be the same across components, otherwise they will not work as expected; use the same configuration across all the Airflow components. So I think the problem in your code is related to how you are trying to access those params.

static xcom_push(context, key, value, execution_date=None) – make an XCom available for tasks to pull.

Airflow Variables can also be created and managed using environment variables. In this case, you can retrieve the default args from the context. When you request the attribute my_conn_id (like connection.my_conn_id) it will perform a lookup using Airflow's connection machinery.

def run_after(): context = get_current_context() – so, problem solved; anyway, it would be great if the context were also available in the kwargs.

A Param can declare a value that accepts multiple types (e.g. Param(5, type=["null", "number", "string"])) or one that can assume a fixed set of values. These Airflow default variables are only instantiated in the context of a task instance for a given DAG run, and thus they are only available in the templated fields of each operator.

Apr 28, 2020 · This operator takes two parameters: google_cloud_storage_conn_id and dest_aws_conn_id. See the Templates reference.

Dynamic Task Mapping is similar to defining your tasks in a for loop, but instead of having the DAG file fetch the data and do that itself, the scheduler can do it based on the output of a previous task.

Given a list of dag_ids, get a set of paused DAG ids.

To use the conf parameters in the overall DAG: DAG(dag_id="my_dag", default_args=args, …).

I have returned two examples for you to see a) the state and b) the last execution date; you can explore this further by just returning last_dag_run[0]. Args: dag_id (str): the dag_id to check.

This set of kwargs corresponds exactly to what you can use in your Jinja templates. The status of the DAG Run depends on the tasks' states. You will see a similar result as in the screenshot below.

callbacks (list[Callable] | None) – list of callbacks to call.

From the Python function bound to the PythonOperator: if the operator has provide_context=True, the function will accept a **kwargs argument with extra context information for that task. op_kwargs (dict, templated) – a dictionary of keyword arguments that will get unpacked in your function.

Marking the question as solved. The context objects are accessible just by declaring the parameters in the task signature, as sketched below.

This page contains the list of all the available Airflow configurations that you can set in airflow.cfg or with environment variables. AirflowParsingContext – context of parsing for the DAG.
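As noted above, the context objects can be received simply by naming them in the task's signature, or fetched explicitly with get_current_context(). The following is a minimal sketch, assuming Airflow 2.4+ (the DAG and task names are illustrative, not from the original snippets):

```python
import pendulum
from airflow.decorators import dag, task
from airflow.operators.python import get_current_context


@dag(schedule=None, start_date=pendulum.datetime(2023, 1, 1), catchup=False)
def context_demo():
    @task
    def via_signature(ti=None, dag_run=None, params=None):
        # ti, dag_run and params are filled in from the runtime context
        # because they are named in the signature.
        print(ti.task_id, dag_run.run_id, params)

    @task
    def via_helper():
        # The same dictionary that Jinja templates see.
        context = get_current_context()
        print(context["ds"], context["params"])

    via_signature() >> via_helper()


context_demo()
```

On older 2.x releases, replace schedule=None with schedule_interval=None; the signature-injection behaviour is the same.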
0, TaskFlow means a "Python callable" is sometimes just a function annotated with @task. @task. From my experience, an array in a Param can either mean that you want to delcare a parameter that can have multiple types (i. Some airflow specific macros are also defined: airflow. Returns: Jan 21, 2021 · This ConnectionGrabber provides dynamic/managed attributes, so when you request attribute my_conn_id (like connection. json. The names of the connections that you pass into these parameters should be entered into your airflow connections screen and the operator should then connect to the right source and target. V. Ok so I could access the context with. join(runpath, "mnist") test = os. python_operator import PythonOperator from time import sleep from datetime import datetime def my_func(*op_args): print(op_args) return op_args[0] with DAG('python_dag', description='Python DAG', schedule_interval='*/5 This context is the same context you get in jinja templates Used the same code and modified params like Startdate etc. path. This can also be used in templates by accessing { {context. from airflow import DAG. Once you have the context dict, the 'params' key contains the arguments sent to the Dag via REST API. task = task context May 16, 2017 · In Airflow 2. Sep 13, 2018 · Another thing you might keep in mind if you find yourself working with stats like task duration a lot is Airflow's StatsD integration which gathers metrics on Airflow itself at execution time. labels – labels to apply to the Pod. key – A key for the XCom. For example, export AIRFLOW_VAR_FOO= BAR. Would like to access all the parameters passed while triggering the DAG. (There is a long discussion in the Github repo about "making the concept less nebulous". param. The ideal use case of this class is to implicitly convert args passed to a method decorated by @dag. One of the most common values to retrieve from the Airflow context is the ti / task_instance keyword, which allows you to access attributes and methods of the taskinstance object. LoggingMixin. dag_id='airflow_run_conf' , start_date=datetime ( 2022, 8, 14 ), default_args=default_args , schedule_interval=None , task = BashOperator (. def conditionally_trigger(context, dag_run_obj): if context['params']['condition_param']: Aug 4, 2021 · 1. Deprecated function that calls @task. maybe the issue is that I am trying backfill the DAG that has been run with the default parameters. This set Jan 18, 2019 · 4. . The dynamic nature of Airflow allows for the generation of pipelines that can adjust to varying workloads and data patterns. def get_job_dts(**kwargs): #Do something to determine the appropriate job_start_dt and job_end_dt #Package up as a list as inputs to other PythonCallables using op_args job_params = [job_start_dt, job_end_dt] # Push job_params into XCom kwargs['ti']. on_failure_callback = lambda context: my_function(context, arg2) Full example:-. Not sure what details you will find in the exception, when you use PythonOperator, but you are increasing your chances of getting a macros. Now here I need to get that value and pass it inside my pull task. def my_function(context, agr2): # function code here. Aug 24, 2022 · In fact, I was preparing an example to explain my use case, but fortunately, I tested it in a new Airflow docker container, and I found the answer to my question. Jun 23, 2021 · When triggering this DAG from the UI you could add an extra param: Params could be accessed in templated fields, as in BashOperator case: bash_task = BashOperator(. 
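The export AIRFLOW_VAR_FOO=BAR fragment above refers to Airflow's convention for creating Variables from environment variables (AIRFLOW_VAR_<KEY>). A short sketch of reading such Variables back; the key names are made up:

```python
# Assumes these were exported before the scheduler/worker started:
#   export AIRFLOW_VAR_FOO=BAR
#   export AIRFLOW_VAR_MY_DICT_VAR='{"key1": "value1"}'
from airflow.models import Variable

foo = Variable.get("foo")                                     # plain string -> "BAR"
my_dict = Variable.get("my_dict_var", deserialize_json=True)  # parsed JSON dict
print(foo, my_dict["key1"])
```

In templated fields the same values are reachable as {{ var.value.foo }} and, for JSON values, {{ var.json.my_dict_var.key1 }}.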
models import DagRun. How do I read the JSON string passed as the --conf parameter in the command line trigger_dag command, in the python DAG file. A DAG (Directed Acyclic Graph) is the core concept of Airflow, collecting Tasks together, organized with dependencies and relationships to say how they should run. Context) [source] ¶ This is the main method to derive when creating an operator. python import get_current_context. get_parsing_context () Return the current (DAG) parsing context info. task_id='bash_task', bash_command='echo bash_task: {{ params. from airflow. log. I am trying to trigger an airflow DAG externally and passing some parameters to the DAG. Jan 10, 2014 · Ignored when in_cluster is True. Managing dependencies in Apache Airflow - FAQ November 2023 Mar 20, 2019 · The best way to do this is to push your value into XCom in get_job_dts, and pull the value back from Xcom in first_task. overwrite_params_with_dag_run_conf (self, params, dag_run) [source] ¶ Overwrite Task Params with DagRun. start_date will provide the start date (as opposed to the execution date) of the task: from datetime import datetime, timedelta. value ( Any) – The value to be updated for the Param. 18. t") May 14, 2021 · The code above works just fine but, the so called context objects, are directly accesible in task-decorated functions. Jan 10, 2012 · Ignored when in_cluster is True. :param templates_dict: a dictionary where the values are templates that will get templated by the Airflow engine sometime between ``__init__`` and ``execute`` takes place and are made available in your callable's context after the template has been applied:param templates_exts: a list of file extensions to New in version 1. Connection attributes like host, login, password, etc. param1 }}') Params are accessible within execution context, like in python_callable: Jan 7, 2021 · There is a new function get_current_context() to fetch the context in Airflow 2. You should use PythonOperator if you want the exceptions to propagate. You can overwrite its value by setting it on conf when you trigger your DagRun. Functions. Apr 5, 2022 · To be clear, when you say "top level code", this includes code within the context manager of your DAG definition that is outside of a task, correct? So really, "any code that is outside of a task" might be another way to phrase that? Feb 26, 2020 · Having problems passing parameters to an external bash script from a BashOperator. dumps to create the string. key1 }}. python. The KubernetesPodOperator uses the Kubernetes API to launch a pod in a Kubernetes cluster. Apr 28, 2020 at 15:22. airflow. """ import smtplib, ssl from email. Jun 21, 2019 · def notify_email(context): import inspect """Send custom email alerts. default_args = {. determine_kwargs (self, context: Mapping [str, Any]) → Mapping [str, Any] [source] ¶ execute Mar 2, 2022 · Utilise the find method of DagRun class. Oct 9, 2023 · 2. Other common reasons to access the Airflow context are: You want to use DAG-level parameters in your Airflow tasks. The following code solved the issue. Feb 16, 2019 · This is how you can pass arguments for a Python operator in Airflow. and then inside the task. This tutorial builds on the regular Airflow Tutorial and focuses specifically on writing data pipelines using the TaskFlow API paradigm which is introduced as part of Airflow 2. task_instance_scheduling_decisions. Step 1 returns a list, so we sort it by the last execution date. format(table_name) do some process and push a data value to xcom. 
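Several snippets here ask how to read the JSON passed with airflow trigger_dag ... --conf '{"key":"value"}' (airflow dags trigger --conf in Airflow 2). A minimal sketch of reading it from the context inside a task; the task id and the assumption that a DAG object already exists are mine:

```python
from airflow.operators.python import PythonOperator


def read_conf(**context):
    # dag_run.conf holds whatever JSON was passed at trigger time;
    # on scheduled runs it is typically empty.
    conf = context["dag_run"].conf or {}
    print("key =", conf.get("key"))


read_conf_task = PythonOperator(
    task_id="read_conf",
    python_callable=read_conf,
    dag=dag,  # assumes a DAG object named `dag` defined elsewhere
)
```

The same value is available in templated fields as {{ dag_run.conf['key'] }}; provide_context=True is only needed on Airflow 1.x.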
') the_db = kwargs['client'] the ds (and all other macros are passed to kwargs as you set provide_context=True, you can either use named params like you did or let the ds be passed into kwargs as well) Since in your code you don't Allows a workflow to "branch" or follow a path following the execution of this task. Configuration Reference. If you use JSON, you are also able to walk nested structures, such as dictionaries like: {{ var. variable: set & get global parameter among different dags in airflow system level Xcome : set & get parameter amongh different tasks of certain dag level. But my new question is: Can I use the parameter from the dag_run on a def when using **kwargs? So I can retrieve the xcom values and the dag_run Airflow Variables in Templates. Class that represents a DAG run parameter. Any time the DAG is executed, a DAG Run is created and all tasks inside it are executed. For example, if you want to display example_bash_operator DAG then you can use the following command: airflow dags show example_bash_operator --imgcat. py' file, 'get_airflow_context_vars' function, and 'context' parameter. utils. 11. ssh. python import BranchPythonOperator. logging_mixin. 10. Base, airflow. schedule_interval="@daily", start_date=dats_ago(1) ) as dag. To do this, you should use the --imgcat switch in the airflow dags show command. startup_timeout_seconds – timeout in seconds to startup the pod. As I know airflow test has -tp that can pass params to the task. By supplying an image URL and a command with optional arguments, the operator uses the Kube Python Client to generate a Kubernetes API request that dynamically launches those individual pods. I have prepared a simple DAG with task that displays execution date (ds) as a parameter: Understanding Airflow's Dynamic Context. How can i get the value from the "task1" variable or How can i get the value which is returned from Task1 method? updated : . an Airflow task. Operators describe what to do; hooks determine how to do work. As a heads up my work around was to use a lambda function to get the context parameter, then just pass that into the function you want on the other side of the lambda:-. Dynamic Task Mapping allows a way for a workflow to create a number of tasks at runtime based upon current data, rather than the DAG author having to know in advance how many tasks would be needed. reattach_on_restart – if the scheduler dies while the pod is running, reattach and monitor. info('Checking for inactive campaign types. def branch_function(**kwargs): if some_condition: return 'first_branch_task'. dummy_operator import DummyOperator from airflow. Bonus, you can give extra context information with op_kwargs parameter – Oct 7, 2020 · 1. May 7, 2022 · Basically, the DAG can take upto 10 values for a param (say, number). The DAG is scheduled to run every 3 minutes. operators. That method is returning a value,that value i need to pass to the next PythonOperator. python-3. Dump the Param as a dictionary. Dec 15, 2020 · I am new to Airflow. Also note that only tasks *immediately* downstream of the previous task instance are waited for; the statuses of any tasks further downstream are ignored. for conn_id in {"prod_db", "test_db"}: with DAG( conn_id = conn_id, ) But I am not sure how to do that really, as I did the change the Oct 24, 2020 · op_kwargs={'tablename':table_name}, python_callable=pgexport, provide_context=True, ) There was a task called push_result_{}. Context | None) – Context to pass to all callbacks. random. 
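The branch_function / 'first_branch_task' fragments scattered through these snippets belong to the BranchPythonOperator pattern. A hedged sketch, with illustrative task ids and a hypothetical condition_param:

```python
from airflow.operators.python import BranchPythonOperator


def branch_function(**context):
    # Branch on a DAG-level parameter; the key name is illustrative.
    if context["params"].get("condition_param"):
        return "first_branch_task"
    return "second_branch_task"


branch = BranchPythonOperator(
    task_id="branch",
    python_callable=branch_function,
    dag=dag,  # assumes an existing DAG object
)
# Wiring (both targets must be directly downstream of `branch`):
# branch >> [first_branch_task, second_branch_task]
```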
Is there a way to do this in Python programmatically to get the param that the user passes/enters and then, call the DAG? python. dagparam. Returns. Templating the PythonOperator works different from other operators; variables are passed to the provided callable. Now I need to adjust the code and add another connection to our test environment, I was advised to create a for cycle like. python import task, get_current_context. ). Explore FAQs on Airflow initialization, role of 'airflow_cluster: main', 'airflow_local_settings. Consider this example: Use params to provide the parameters to the DAG (could be also done from the UI), in this example: {"enabled": True} DAGs. Use Airflow JSON Conf to pass JSON data to a single DAG run. Paused Dag_ids. :type wait_for_downstream: bool :param dag: a reference to the dag the task is attached to (if any) :type dag: airflow. import os. Based on the number param value passed, the corresponding notebook will be called. multipart import MIMEMultipart sender_email = '[email protected]' receiver_email = '[email protected]' password = "abc" message = MIMEMultipart("alternative") #task_instance = context['task']. Can I use a TriggerDagRunOperator to pass a parameter to the triggered dag? Airflow from a previous question I know that I can send parameter using a TriggerDagRunOperator. The standard lib’s random. get_active_runs [source] ¶ Return a list of dag run execution dates currently running Dec 7, 2018 · So can I create such an airflow DAG, when it's scheduled, that the default time range is from 01:30 yesterday to 01:30 today. get_default_view (self) [source] ¶ Get the Default DAG View, returns the default config value if DagModel does not have a value. The TaskFlow API is new as of Airflow 2. It is also possible to fetch a variable by string if needed (for example Working with TaskFlow. import airflow from datetime import Mar 25, 2022 · Each DAG is supposed to have context information, that could be expressed as constants, that I would like to share with the alerting stack. Feb 27, 2019 · Same here. Should you need them then you must pass them when initializing DockerOperator. I mean, when DAG was executed by airflow scheduler it didn't have any params. Validates & returns all the Params object stored in the dictionary. conf) is None. class airflow. Having Bash in-between Airflow (Python) and your python code, you are loosing the exception information as @Taragolis mentioned. DagParam (current_dag, name: str, default: Optional [Any] = None) [source] ¶ Class that represents a DAG run parameter & binds a simple Param object to a name within a DAG instance, so that it can be resolved during the run time via {{context Dec 5, 2022 · Accessing the Context Object, Including DagRun Params, Requires the TaskFlow API If you are using the Airflow REST API and passing in a conf object to the DAGRun endpoint, for example, you cannot access these arguments from within a classic style operator such as PythonOperator. context (Any) – Execution Context Dictionary. params}} dictionary. context (airflow. DagParam(current_dag, name: str, default: Optional[Any] = None)[source] ¶. from typing import Sequence. get_connection_from_secrets and return that, from there you can use the a. Instead, you must use the TaskFlow API designed for usage with DTM. baseoperator import BaseOperator. JSON can be passed either from. Based on the variables defined above, example logic of setting the source code related fields is shown here: Module Contents. m. 
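For the recurring question about passing parameters with TriggerDagRunOperator: in Airflow 2.x the conf argument replaces the 1.x python_callable/dag_run_obj.payload mechanism. A sketch with hypothetical DAG ids:

```python
from airflow.operators.trigger_dagrun import TriggerDagRunOperator

trigger = TriggerDagRunOperator(
    task_id="trigger_target",
    trigger_dag_id="target_dag",
    conf={"message": "hello from the parent DAG"},
    dag=dag,  # assumes an existing DAG object
)


# In the triggered DAG, any task can read the value back from the context:
def read_message(**context):
    print(context["dag_run"].conf.get("message"))
```

If dag_run (or its conf) turns out to be None in the triggered DAG, the run was usually started by the scheduler rather than by the trigger, so no conf was attached.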
Refer to get_template_context for more context. In the callable, if kwargs['test_mode'] is set, you can retrieve the parameters to build a dummy DagRun object like so: from airflow. You should do: def get_campaign_active(ds, **kwargs): logging. To print execution date inside the callable function of your PythonOperator you can use the following in your Airflow Script and also can add start_time and end_time as follows: def python_func(**kwargs): execution_date = kwargs["execution_date"] #<datetime> type with timezone. render_templates (self, context: Optional [airflow. You can have these metrics go into a push-based system like StatsD itself, or into a pull-based system like Prometheus / Grafana by using statsd_exporter . This operator allows you to run different tasks based on the outcome of a Python function: from airflow. – Simon D. Two ways to change your DAG behavior: Use Airflow variables like mentioned by Bryan in his answer. txt' import datetime as dt from airflow import DAG from airflow. Currently, I am only able to send the dag_id I retrieve from the context, via context['ti']. edited May 7, 2022 at 10:42. train = os. Sep 30, 2022 · 1. kwargs['dag_run']. ex: airflow trigger_dag 'dag_name' -r 'run_id' --conf '{"key":"value"}' Additionally, default_args might contain zip_path parameter to run the extra step of uploading the source code before deploying it. I tried creating a task which uses get_current_context() then transforms the params to a string (without Aug 9, 2023 · How to update the params on the fly? I want to update 'input_file' param to the path present in file 'temp_file. Feb 15, 2022 · I have an existing dag that cleans up airflow_db from our production airflow db. May 3, 2018 · From PythonOperator i am calling "Task1" method. So if your variable key is FOO then the variable name should be AIRFLOW_VAR_FOO. Context] = None) → None [source] ¶ Render templates in the operator Unfortunately, Airflow does not support serializing var, ti and task_instance due to incompatibilities with the underlying library. To use them, just import and call get on the Variable model: from airflow. models. We need to do some in-place updates to ensure the template context reflects the unmapped task instead. This one took some digging as I wanted to use the conf to create a dynamic workflow, not just to be used in other operators. To avoid this you can use Airflow DAGs as context managers to Sep 3, 2021 · 1. execute (self, context: airflow. Triggers the callbacks with the given context. task(python_callable: Optional[Callable] = None, multiple_outputs: Optional[ bool] = None, **kwargs)[source] ¶. Context is the same dictionary used as when rendering jinja templates. _driver_status = "SUBMITTED Apr 20, 2016 · 9. dummy_operator. 0, and you are likely to encounter DAGs written for previous versions of Airflow that instead use PythonOperator to achieve similar goals, albeit with a lot more code. provide_context – if set to true, Airflow will pass a set of keyword arguments that can be used in your function. So if the arguments you passed in the conf were. Dec 4, 2018 · @P. ds_add(ds, days) [source] ¶. Tasks can also be set to execute conditionally using the BranchPythonOperator. In the TriggerDagRunOperator, the message param is added into dag_run_obj's payload. e. Please use the following instead: from airflow. Note that args are split by newline. Python Operator: it can be task instances. 
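To print the execution date inside a PythonOperator callable, as described in the surrounding snippets, it is enough to name the context variables in the function signature and Airflow fills them in. A sketch (the task id is made up; execution_date is deprecated in favour of logical_date on newer releases):

```python
from airflow.operators.python import PythonOperator


def python_func(ds=None, execution_date=None, **kwargs):
    # ds is the logical date as a YYYY-MM-DD string,
    # execution_date a timezone-aware datetime.
    print("ds:", ds)
    print("execution_date:", execution_date)


print_date = PythonOperator(
    task_id="print_date",
    python_callable=python_func,
    dag=dag,  # assumes an existing DAG object
)
```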
render_templates(), which won't update the Jinja context, only the task attibute, after rendering any of the template_fields or template_exts. A DAG Run is an object representing an instantiation of the DAG in time. A DAG run can be created by the scheduler (i. or from. Operators can communicate with other systems via hooks. Nov 16, 2018 · 1. It derives the PythonOperator and expects a Python function that returns a single task_id or list of task_ids to follow. I cleared the instance, then executed command: airflow backfill -conf the str (context ['dag_run']. Preview of DAG in iTerm2. Invocation instance of a DAG. py file) above just has 2 tasks, but if you have 10 or more then the redundancy becomes more evident. The data pipeline chosen here is a simple pattern with three separate # We want the Airflow job to wait until the Spark driver is finished if self. Sep 7, 2023 · That works fine if I only need the context directly inside that function, but where this actually popped up in practice was a DAG that used some shared lib functions that used get_current_context, which of course works fine when called from normal tasks but blew up when called from a virtualenv task. May 4, 2023 · 1. provide_context ( bool) – if set to true, Airflow will pass a set of keyword arguments that can be used in your function. days ( int) – number of days to add to the ds, you can use negative values. static deserialize(data, version)[source] ¶. Return repr (self). param import process_params context ["task"] = context ["ti"]. Oct 24, 2020 · Here is an airflow operator example t3 = BashOperator( task_id='templated', params={'my_param': 'Parameter I passed in'}, dag=dag, ) Is that possible to use params in params, like this Apr 15, 2019 · kaxil. from airflow import DAG from airflow. So my SQL templated query will get the value and pass it to the pgexport function. _should_track_driver_status: if self. value (Any) – A value for the XCom. join(runpath, "mnist. python and allows users to turn a python function into. Here are some key aspects of Airflow's dynamic context: Since ``get_template_context ()`` is called before unmapping, the context contains information about the mapped task. mime. You may use user_defined_macros parameter when instantiating DAG and pass your decision function here. You can access them as either plain-text or JSON. scheduled runs), or by an external trigger (i. set_is_paused (self, is_paused: bool, including_subdags: bool = True Nov 20, 2017 · 13. dag_id When using get_rendered_k8s_spec (self, session = None) [source] ¶ Fetch rendered template fields from DB. 3. For this to work, you need to define Feb 13, 2023 · The problem lays in part with the way you've declared the parameters using Param. When I run a local command, the params are substituted correctly: task_id='log_cleanup_task', provide_context=True, bash_command = log_cleanup, params = {'BASE_LOG_FOLDER': "/var/opt"}, dag=dagInstance, But if I call an external bash script, the params don't Type of return for DagRun. More context around the addition and design of the TaskFlow API can be found as part of its Airflow Improvement Proposal AIP-31 Feb 18, 2024 · I have a master dag that executes a certain number of times a sub dag. Each DAG Run is run separately from one another, meaning that you can have many runs of a DAG at the same time. It can be used to parameterize a DAG. Any use of the threading, subprocess or multiprocessing module within an operator needs to be cleaned up or it will leave ghost processes behind. Connection. 
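The docs excerpt about a basic DAG of four tasks (A, B, C, and D) and their ordering can be sketched as follows; EmptyOperator (the successor of DummyOperator) and the ids are placeholders, and schedule=None assumes Airflow 2.4+:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.empty import EmptyOperator

with DAG(
    dag_id="basic_dag",
    start_date=datetime(2023, 1, 1),
    schedule=None,
    catchup=False,
) as dag:
    a = EmptyOperator(task_id="a")
    b = EmptyOperator(task_id="b")
    c = EmptyOperator(task_id="c")
    d = EmptyOperator(task_id="d")

    # A runs first, B and C in parallel afterwards, D last.
    a >> [b, c] >> d
```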
Trying to use them outside of this context will not work. embed and logging). Aug 29, 2017 · I am trying to run a airflow DAG and need to pass some parameters for the tasks. :meta private: """ from airflow. task_id dag_instance=context['dag_id']. I want the number of times the dag is executed to be determined by a parameter passed in a JSON file from the Postman. I'm using json. models import Variable # Normal call style foo DAG run parameter reference. dag_id, and eventually the conf (parameters). serialize()[source] ¶. get_logs – get the stdout of the container as logs of the tasks. manual runs). UI - manual trigger from tree view UI - create new DAG run from browse > DAG runs > create new record. 3, it's caused by TaskInstance. 0 Hot Network Questions Medieval fantasy movie where a sorceress disguises herself as the queen to have a child by the king May 7, 2019 · Airflow useful concept: DAG/Tasks: You can view & track in the airflow admin web->dag page. on_kill [source] ¶ Override this method to cleanup subprocesses when a task instance gets killed. The environment variable naming convention is AIRFLOW_VAR_{VARIABLE_NAME}, all uppercase. To pass the params from CLI the correct way is pretty much how you did it (unless you are really missing the closing ' as in your post above): airflow tasks test killer_dag get_idle_queries 20210802 -t '{"pid":"12345"}'. In the last case, you also need to provide an empty sourceUploadUrl parameter in the body. If true and validations fails, the return value would be None. suppress_exception ( bool) – To raise an exception or not when the validations fails. cfg file or using environment variables. Bases: NamedTuple. If these values are not None, they will contain the specific DAG and Task ID that Airflow is requesting to execute. Apache Airflow's dynamic context is essential for creating flexible and dynamic DAGs (Directed Acyclic Graphs). Jan 1, 2018 · There is no --conf option for the airflow test command but you can work around this by passing parameters to the task's python_callable. session – ORM Session. Dec 25, 2018 · The example (example_dag. xcom_push(key='job_params class airflow. op_args (list (templated)) – a list of positional arguments that will get unpacked when calling your callable. In a few places in the documentation it's referred to as a "context dictionary" or even an "execution context dictionary", but never really spelled out what that is. Context of parsing for the DAG. Param in Airflow is used to perform parameter validation. macros. base. Then if anything wrong with the data source, I need to manually trigger the DAG and manually pass the time range as parameters. The var template variable allows you to access Airflow Variables. Users can specify a kubeconfig file using the config_file Jun 2, 2022 · The code you are executing within DockerOperator is in a "closed environment" it can not access the Airflow resources. This means that there is no need to import get_current_context anymore. Add or subtract days from a YYYY-MM-DD. As of Airflow 1. Variables are Airflow’s runtime configuration concept - a general key/value store that is global and can be queried from your tasks, and easily set via Airflow’s user interface, or bulk-uploaded as a JSON file. 2. operato Jan 10, 2012 · op_args ( list) – a list of positional arguments that will get unpacked when calling your callable. 2k 3 61 82. Mar 30, 2018 · Parameter passing to a shell script using BashOperator in Airflow 2. 
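The t3 = BashOperator(..., params={'my_param': ...}) fragment above shows operator-level params; they are merged into the Jinja context of that task's templated fields. A completed, hedged version of that snippet:

```python
from airflow.operators.bash import BashOperator

t3 = BashOperator(
    task_id="templated",
    # `bash_command` is a templated field, so {{ params.my_param }} is rendered
    # from the params dict below before the command runs.
    bash_command="echo value passed in: {{ params.my_param }}",
    params={"my_param": "Parameter I passed in"},
    dag=dag,  # assumes an existing DAG object
)
```

Note that the params dict itself is not a templated field, so placing Jinja inside params values ("params in params") is not rendered.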
:param priority_weight: priority weight of this task against other tasks.

Jul 20, 2023 · I'd like to pass the whole user-defined {{ params }} dictionary as an argument to a KubernetesPodOperator which runs a script that uses argparse to read it and convert it back to a usable dict. I've just spent a few hours (days?) finding the cause of the problem (god save IPython.embed and logging).

For Airflow context variables, make sure that you either have access to Airflow by setting system_site_packages to True or add apache-airflow to the requirements argument. The value is pickled and stored in the database.

Here's a basic example DAG: it defines four tasks (A, B, C, and D) and dictates the order in which they have to run, and which tasks depend on what others.

Jul 21, 2021 · I think a good way to solve this is with BranchPythonOperator, to branch dynamically based on the provided DAG parameters. The task_id(s) returned should point to a task directly downstream from {self}.

The command parameter is a templated field, so simply use Jinja to achieve that: p1_auth_task = DockerOperator(…).

I would like to read the trigger-DAG configuration passed by the user and store it as a variable which can be passed as a job argument to the actual code. My problem is that the parameters are only being used by the first DAG run. dag_id – the dag_id of the DAG to find.

if self._driver_id is None: raise AirflowException("No driver id is known: something went wrong when executing the spark submit command")  # We start with the SUBMITTED status as initial status
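A sketch of the templating idea above: command on DockerOperator is a templated field, so params and macros can be injected with Jinja before the container starts. The image name, ids, parameter names, and the DAG object are assumptions:

```python
from airflow.providers.docker.operators.docker import DockerOperator

p1_auth_task = DockerOperator(
    task_id="p1_auth",
    image="my-image:latest",  # placeholder image
    # `command` is a templated field, so the rendered string is what the
    # container actually receives.
    command="python run.py --user {{ params.user }} --run-date {{ ds }}",
    dag=dag,  # assumes an existing DAG object
)
```

KubernetesPodOperator works the same way through its templated arguments field, which is one way to hand user-defined parameters to a script that parses them with argparse.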