jina.executors.metas
The default meta config that all executors follow; it can be overridden by the YAML config.
Warning
When you define your own Executor class, make sure your attribute/method names do not conflict with the names listed below.
Note
Essentially, the meta config can be set in two places: as part of the YAML file, or as a class attribute via __init__() or in the class definition. When multiple meta specifications exist, the overwrite priority is:
metas defined in YAML > metas defined as class attributes > meta default values listed below
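The priority chain above can be sketched in plain Python. This is an illustrative sketch of the merge order, not the actual Jina implementation; the function name and the reduced set of defaults are assumptions for the example.

```python
# Illustrative sketch (not the actual Jina code) of the overwrite priority:
# metas in YAML > metas as class attributes > default values.
DEFAULT_METAS = {'is_trained': False, 'on_gpu': False, 'batch_size': None}

def resolve_metas(class_metas: dict, yaml_metas: dict) -> dict:
    """Merge meta configs; later updates override earlier ones."""
    merged = dict(DEFAULT_METAS)   # lowest priority: defaults listed below
    merged.update(class_metas)     # middle: class attribute / __init__()
    merged.update(yaml_metas)      # highest: YAML config
    return merged

resolved = resolve_metas({'on_gpu': True},
                         {'on_gpu': False, 'is_trained': True})
```

Here the YAML value wins for on_gpu, the YAML fills is_trained, and batch_size falls back to its default.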
Any executor inherited from BaseExecutor always has the following meta fields:
is_trained
Indicates whether the executor is trained. If not, methods decorated by @required_train() cannot be executed.
- Type
bool
- Default
False
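A minimal sketch of how such a guard could work, assuming a simplified decorator; this is a hypothetical stand-in for @required_train(), not Jina's implementation, and MyExecutor is an invented example class.

```python
import functools

# Hypothetical guard in the spirit of @required_train(): the decorated
# method refuses to run while self.is_trained is False.
def required_train(func):
    @functools.wraps(func)
    def wrapper(self, *args, **kwargs):
        if not self.is_trained:
            raise RuntimeError(f'{func.__name__} requires a trained executor')
        return func(self, *args, **kwargs)
    return wrapper

class MyExecutor:
    is_trained = False  # default meta value

    @required_train
    def encode(self, data):
        return [len(d) for d in data]
```

Calling encode() on an untrained executor raises; flipping is_trained to True unlocks it.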
is_updated
Indicates whether the executor has been updated or changed since the last save. If not, save() will do nothing. A forced save is possible by calling touch() before save().
- Type
bool
- Default
False
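The touch()-before-save() semantics can be sketched as follows; SketchExecutor and its save counter are invented for illustration and do not reflect Jina's actual serialization logic.

```python
# Assumed semantics of is_updated: save() is a no-op unless the executor
# changed since the last save; touch() forces the next save through.
class SketchExecutor:
    def __init__(self):
        self.is_updated = False
        self.saved = 0

    def touch(self):
        self.is_updated = True   # mark as changed to force the next save

    def save(self):
        if not self.is_updated:
            return False         # nothing changed since last save
        self.saved += 1          # stand-in for actual serialization
        self.is_updated = False
        return True
```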
batch_size
The size of each batch. Methods decorated by @batching() will respect this. Useful when incoming data is too large to fit into (GPU) memory.
- Type
int
- Default
None
workspace
The working directory for persisting the artifacts of the executor. An artifact is a file or a collection of files used during a workflow run.
- Type
str
- Default
the environment variable JINA_EXECUTOR_WORKDIR; if not set, the current working directory (aka cwd) is used.
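The documented fallback can be expressed in one line; the helper name is invented for the example, but the environment variable is the one named above.

```python
import os

# Sketch of the documented default: use JINA_EXECUTOR_WORKDIR if set,
# otherwise fall back to the current working directory (cwd).
def default_workspace() -> str:
    return os.environ.get('JINA_EXECUTOR_WORKDIR', os.getcwd())
```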
name
The name of the executor.
- Type
str
- Default
class name plus a random string
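One way to read "class name plus a random string" is sketched below; the suffix length and separator are assumptions for illustration, not Jina's exact naming scheme.

```python
import random
import string

# Hypothetical sketch of the documented default name:
# the class name plus a short random suffix.
def default_name(cls) -> str:
    suffix = ''.join(random.choices(string.ascii_lowercase, k=4))
    return f'{cls.__name__}-{suffix}'

class MyAwesomeExecutor:
    pass
```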
on_gpu
Whether the executor is set to run on GPU.
- Type
bool
- Default
False
py_modules
The external Python module paths. Useful when you want to load external Python modules using BaseExecutor.load_config() from a YAML file. If a relative path is given, the root path is set to the path of the current YAML file.
Example of py_modules usage:
- This is a valid structure and it is RECOMMENDED:
  - “my_cust_module” is a Python module
  - all core logic of your customized executor goes to __init__.py
  - to import foo.py, you can use a relative import, e.g. from .foo import bar
  - helper.py needs to be put BEFORE __init__.py in the YAML py_modules
  This is also the structure generated by the jina hub new CLI.

  my_cust_module
    |- __init__.py
    |- helper.py
    |- config.yml
         |- py_modules
               |- helper.py
               |- __init__.py
- This is a valid structure but not recommended:
  - “my_cust_module” is not a Python module (it lacks an __init__.py under the root)
  - to import foo.py, you must use from jinahub.foo import bar
  - jinahub is a common namespace for all plugin modules, not changeable
  - helper.py needs to be put BEFORE my_cust.py in the YAML py_modules

  my_cust_module
    |- my_cust.py
    |- helper.py
    |- config.yml
         |- py_modules
               |- helper.py
               |- my_cust.py
- Type
str/List[str]
- Default
None
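The ordering constraint described above can be sketched in a config.yml fragment; the executor name here is a placeholder, and only the helper-before-__init__ ordering is taken from the text.

```yaml
# Hypothetical config.yml for the recommended layout:
# note that helper.py is listed before __init__.py.
!MyCustomExecutor
metas:
  py_modules:
    - helper.py
    - __init__.py
```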
pea_id
The integer index used to distinguish each parallel pea of this executor; useful in pea_workspace.
- Type
int
- Default
'${{root.metas.pea_id}}'
separated_workspace
Whether to isolate the data of the parallel peas of this executor. If True, each parallel pea works in its own workspace specified by pea_workspace.
- Type
bool
- Default
'${{root.metas.separated_workspace}}'
pea_workspace
The workspace of each parallel pea; useful when separated_workspace is set to True. All data and IO operations related to this parallel pea will be conducted under this workspace. It is often set as a sub-directory of workspace.
- Type
str
- Default
'${{root.metas.workspace}}/${{root.metas.name}}-${{root.metas.pea_id}}'
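Expanding the default template above is a plain string substitution; the helper name is invented for the example.

```python
# Sketch of expanding the documented default template
# '${{root.metas.workspace}}/${{root.metas.name}}-${{root.metas.pea_id}}'.
def pea_workspace(workspace: str, name: str, pea_id: int) -> str:
    return f'{workspace}/{name}-{pea_id}'
```

For example, workspace '/data', name 'my_transformer' and pea_id 0 resolve to '/data/my_transformer-0'.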
read_only
Do not allow the pod to modify the model; save calls will be ignored. If set to True, the executor will not be serialized.
- Type
bool
- Default
False
Warning
name and workspace must be set if you want to serialize/deserialize this executor.
Note
separated_workspace, pea_workspace and pea_id are set in such a way that when the executor A is used as a component of a jina.executors.compound.CompoundExecutor B, A's settings will be overridden by B's counterparts.

These meta fields can be accessed via self.is_trained or loaded from a YAML config via load_config():

!MyAwesomeExecutor
with: ...
metas:
  name: my_transformer   # a customized name
  is_trained: true       # indicate the model has been trained
  workspace: ./          # path for serialize/deserialize