Executors can be served and accessed remotely without manually instantiating a Flow.
This is especially useful when debugging an Executor in a remote setting. It also lets you run external/shared Executors that are used in multiple Flows.
There are different options for deploying and running a standalone Executor:

- Run the Executor directly from Python with the Executor.serve() method
- Run the static Executor.to_kubernetes_yaml() method to generate K8s deployment configuration files
- Run the static Executor.to_docker_compose_yaml() method to generate a Docker Compose service file
Served vs. shared Executor
In Jina there are two ways of running standalone Executors: Served Executors and shared Executors.
A served Executor is launched by one of the following methods: Executor.serve(), Executor.to_kubernetes_yaml(), or Executor.to_docker_compose_yaml(). It resides behind a Gateway and can thus be directly accessed by a Client. It can also be used as part of a Flow.
A shared Executor is launched using the Jina CLI and does not sit behind a Gateway. It is intended to be used in one or more Flows. Because a shared Executor does not reside behind a Gateway, it cannot be directly accessed by a Client, but it requires fewer networking hops when used inside a Flow.
Both served and shared Executors can be used as part of a Flow, by adding them as an external Executor.
from docarray import DocumentArray, Document
from jina import Executor, requests


class MyExec(Executor):
    @requests
    def foo(self, docs: DocumentArray, **kwargs):
        docs.texts = ['executed MyExec']  # custom logic goes here


MyExec.serve(port=12345)
from jina import Client, DocumentArray

print(Client(port=12345).post(inputs=DocumentArray.empty(1), on='/foo').texts)
The serve() method internally creates and starts a Flow. Therefore, it can take all associated parameters: uses_with, uses_metas and uses_requests are passed to the internal flow.add() call, stop_event stops the Executor, and **kwargs is passed to the internal Flow() initialisation call.
For more details on these arguments and the workings of a Flow, see the Flow section.
Serve via Kubernetes
You can generate Kubernetes configuration files for your containerized Executor by using the static
Executor.to_kubernetes_yaml() method. This works like deploying a Flow in Kubernetes, because your Executor is wrapped automatically in a Flow and uses the very same deployment techniques.
from jina import Executor

Executor.to_kubernetes_yaml(
    output_base_path='/tmp/config_out_folder',
    port_expose=8080,
    uses='jinahub+docker://DummyHubExecutor',
    executor_type=Executor.StandaloneExecutorType.EXTERNAL,
)
kubectl apply -R -f /tmp/config_out_folder
The above example deploys the
DummyHubExecutor from Jina Hub into your Kubernetes cluster.
The Executor you use needs to be already containerized and stored in a registry accessible from your Kubernetes cluster. We recommend Jina Hub for this.
Serve via Docker Compose
You can generate a Docker Compose service file for your containerized Executor by using the static
to_docker_compose_yaml() method. This works like running a Flow with Docker Compose, because your Executor is wrapped automatically in a Flow and uses the very same deployment techniques.
from jina import Executor

Executor.to_docker_compose_yaml(
    output_path='/tmp/docker-compose.yml',
    port_expose=8080,
    uses='jinahub+docker://DummyHubExecutor',
)
docker-compose -f /tmp/docker-compose.yml up
The above example runs the
DummyHubExecutor from Jina Hub locally on your computer using Docker Compose.
The Executor you use needs to be already containerized and stored in an accessible registry. We recommend Jina Hub for this.