Compare commits: main...devx_subpk (28 commits)

SHA1:
d0876bb0a4, 850b9999ff, 4571b8cc84, db58f6e1b5, 76b7006977,
bd1885bce1, 066a35322e, 2ebc30d708, 57a5b7eb6f, e269aa3751,
7fc9297104, 9208708b3a, cf2f2adec2, f28abc6720, 6f33a9891e,
79604b7f98, cec4a2a0ab, 4089e4b3ac, 5ec48310b6, 697900deb1,
2e55c124b1, 0f21c8ba6a, 7b7410bc0f, b59cba74cd, 7e39ef7ed1,
c8ea0fdf53, 885319e9ae, b815b61707
@@ -20,7 +20,7 @@ jobs:
       - name: Setup python
         uses: actions/setup-python@v2
         with:
-          python-version: '3.11'
+          python-version: '3.10'

       - name: Install dependencies
         run: pip install -U . --upgrade-strategy eager -r requirements-test.txt

@@ -41,7 +41,7 @@ jobs:
       - name: Setup python
         uses: actions/setup-python@v2
         with:
-          python-version: '3.11'
+          python-version: '3.10'

       - name: Build sdist
         run: python setup.py sdist --formats=zip

@@ -59,7 +59,7 @@ jobs:
     fail-fast: false
     matrix:
       os: [ubuntu-latest]
-      python: ['3.11']
+      python: ['3.10']
       spawn_backend: [
         'trio',
         'mp_spawn',
docs/README.rst (149 changes)
@@ -1,23 +1,20 @@
-|logo| ``tractor``: distributed structurred concurrency
+|logo| ``tractor``: next-gen Python parallelism

 |gh_actions|
 |docs|

-``tractor`` is a `structured concurrency`_ (SC), multi-processing_ runtime built on trio_.
+``tractor`` is a `structured concurrent`_, (optionally
+distributed_) multi-processing_ runtime built on trio_.

-Fundamentally, ``tractor`` provides parallelism via
-``trio``-"*actors*": independent Python **processes** (i.e.
-*non-shared-memory threads*) which can schedule ``trio`` tasks whilst
-maintaining *end-to-end SC* inside a *distributed supervision tree*.
+Fundamentally, ``tractor`` gives you parallelism via
+``trio``-"*actors*": independent Python processes (aka
+non-shared-memory threads) which maintain structured
+concurrency (SC) *end-to-end* inside a *supervision tree*.

 Cross-process (and thus cross-host) SC is accomplished through the
-combined use of our,
-
-- "actor nurseries_" which provide for spawning multiple, and
-  possibly nested, Python processes each running a ``trio`` scheduled
-  runtime - a call to ``trio.run()``,
-- an "SC-transitive supervision protocol" enforced as an
-  IPC-message-spec encapsulating all RPC-dialogs.
+combined use of our "actor nurseries_" and an "SC-transitive IPC
+protocol" constructed on top of multiple Pythons each running a ``trio``
+scheduled runtime - a call to ``trio.run()``.

 We believe the system adheres to the `3 axioms`_ of an "`actor model`_"
 but likely **does not** look like what **you** probably *think* an "actor
@@ -30,7 +27,6 @@ The first step to grok ``tractor`` is to get an intermediate
 knowledge of ``trio`` and **structured concurrency** B)

 Some great places to start are,

 - the seminal `blog post`_
 - obviously the `trio docs`_
 - wikipedia's nascent SC_ page
@@ -39,84 +35,22 @@ Some great places to start are,

 Features
 --------
-- **It's just** a ``trio`` API!
-- *Infinitely nesteable* process trees running embedded ``trio`` tasks.
-- Swappable, OS-specific, process spawning via multiple backends.
-- Modular IPC stack, allowing for custom interchange formats (eg.
-  as offered from `msgspec`_), varied transport protocols (TCP, RUDP,
-  QUIC, wireguard), and OS-env specific higher-perf primitives (UDS,
-  shm-ring-buffers).
-- Optionally distributed_: all IPC and RPC APIs work over multi-host
-  transports the same as local.
-- Builtin high-level streaming API that enables your app to easily
-  leverage the benefits of a "`cheap or nasty`_" `(un)protocol`_.
-- A "native UX" around a multi-process safe debugger REPL using
-  `pdbp`_ (a fork & fix of `pdb++`_)
-- "Infected ``asyncio``" mode: support for starting an actor's
-  runtime as a `guest`_ on the ``asyncio`` loop allowing us to
-  provide stringent SC-style ``trio.Task``-supervision around any
-  ``asyncio.Task`` spawned via our ``tractor.to_asyncio`` APIs.
-- A **very naive** and still very much work-in-progress inter-actor
-  `discovery`_ sys with plans to support multiple `modern protocol`_
-  approaches.
-- Various ``trio`` extension APIs via ``tractor.trionics`` such as,
-  - task fan-out `broadcasting`_,
-  - multi-task-single-resource-caching and fan-out-to-multi
-    ``__aenter__()`` APIs for ``@acm`` functions,
-  - (WIP) a ``TaskMngr``: one-cancels-one style nursery supervisor.
-
-
-Install
--------
-``tractor`` is still in a *alpha-near-beta-stage* for many
-of its subsystems, however we are very close to having a stable
-lowlevel runtime and API.
-
-As such, it's currently recommended that you clone and install the
-repo from source::
-
-    pip install git+git://github.com/goodboy/tractor.git
-
-
-We use the very hip `uv`_ for project mgmt::
-
-    git clone https://github.com/goodboy/tractor.git
-    cd tractor
-    uv sync --dev
-    uv run python examples/rpc_bidir_streaming.py
-
-Consider activating a virtual/project-env before starting to hack on
-the code base::
-
-    # you could use plain ol' venvs
-    # https://docs.astral.sh/uv/pip/environments/
-    uv venv tractor_py313 --python 3.13
-
-    # but @goodboy prefers the more explicit (and shell agnostic)
-    # https://docs.astral.sh/uv/configuration/environment/#uv_project_environment
-    UV_PROJECT_ENVIRONMENT="tractor_py313
-
-    # hint hint, enter @goodboy's fave shell B)
-    uv run --dev xonsh
-
-Alongside all this we ofc offer "releases" on PyPi::
-
-    pip install tractor
-
-Just note that YMMV since the main git branch is often much further
-ahead then any latest release.
-
-
-Example codez
--------------
-In ``tractor``'s (very lacking) documention we prefer to point to
-example scripts in the repo over duplicating them in docs, but with
-that in mind here are some definitive snippets to try and hook you
-into digging deeper.
-
+- **It's just** a ``trio`` API
+- *Infinitely nesteable* process trees
+- Builtin IPC streaming APIs with task fan-out broadcasting
+- A "native" multi-core debugger REPL using `pdbp`_ (a fork & fix of
+  `pdb++`_ thanks to @mdmintz!)
+- Support for a swappable, OS specific, process spawning layer
+- A modular transport stack, allowing for custom serialization (eg. with
+  `msgspec`_), communications protocols, and environment specific IPC
+  primitives
+- Support for spawning process-level-SC, inter-loop one-to-one-task oriented
+  ``asyncio`` actors via "infected ``asyncio``" mode
+- `structured chadcurrency`_ from the ground up

 Run a func in a process
-***********************
+-----------------------
 Use ``trio``'s style of focussing on *tasks as functions*:

 .. code:: python
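(For context while reading this hunk: the README's "tasks as functions" example itself isn't shown in this compare view. Below is a minimal runnable sketch, *not* taken from the diff, of that pattern using the `run_in_actor()`/`result()` calls that do appear in later hunks; the function name `say_hello` is illustrative.)

```python
import trio
import tractor


async def say_hello(other_actor: str):
    # an ordinary async func which will run in its own process
    return f'hello {other_actor}'


async def main():
    async with tractor.open_nursery() as n:
        # spawn a subactor, schedule the func, get a portal back
        portal = await n.run_in_actor(say_hello, other_actor='bob')
        print(await portal.result())  # -> 'hello bob'


if __name__ == '__main__':
    trio.run(main)
```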
@@ -174,7 +108,7 @@ might want to check out `trio-parallel`_.

 Zombie safe: self-destruct a process tree
-*****************************************
+-----------------------------------------
 ``tractor`` tries to protect you from zombies, no matter what.

 .. code:: python
@@ -230,7 +164,7 @@ it **is a bug**.

 "Native" multi-process debugging
-********************************
+--------------------------------
 Using the magic of `pdbp`_ and our internal IPC, we've
 been able to create a native feeling debugging experience for
 any (sub-)process in your ``tractor`` tree.
@@ -285,7 +219,7 @@ We're hoping to add a respawn-from-repl system soon!

 SC compatible bi-directional streaming
-**************************************
+--------------------------------------
 Yes, you saw it here first; we provide 2-way streams
 with reliable, transitive setup/teardown semantics.
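(A minimal sketch of the 2-way streaming pattern referenced above, pieced together from the `open_context()`/`ctx.started()`/`open_stream()` calls visible in later hunks of this compare; `echo_back` is an illustrative name, not code from the diff.)

```python
import trio
import tractor


@tractor.context
async def echo_back(ctx: tractor.Context) -> None:
    # complete the ctx handshake then open the 2-way stream
    await ctx.started()
    async with ctx.open_stream() as stream:
        async for msg in stream:
            await stream.send(msg)


async def main():
    async with tractor.open_nursery() as n:
        portal = await n.start_actor('echoer', enable_modules=[__name__])
        async with (
            portal.open_context(echo_back) as (ctx, first),
            ctx.open_stream() as stream,
        ):
            await stream.send('ping')
            assert await stream.receive() == 'ping'
        await portal.cancel_actor()


if __name__ == '__main__':
    trio.run(main)
```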
@@ -377,7 +311,7 @@ hear your thoughts on!

 Worker poolz are easy peasy
-***************************
+---------------------------
 The initial ask from most new users is *"how do I make a worker
 pool thing?"*.
@@ -399,10 +333,10 @@ This uses no extra threads, fancy semaphores or futures; all we need
 is ``tractor``'s IPC!

 "Infected ``asyncio``" mode
-***************************
+---------------------------
 Have a bunch of ``asyncio`` code you want to force to be SC at the process level?

-Check out our experimental system for `guest`_-mode controlled
+Check out our experimental system for `guest-mode`_ controlled
 ``asyncio`` actors:

 .. code:: python
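(A rough, hedged sketch of the "infected asyncio" shape seen in this compare's example-file hunks: a `@tractor.context` func on the `trio` side opening a channel into an `asyncio.Task` via `to_asyncio.open_channel_from()`. Exact channel-arg semantics may differ by version; `aio_hello` and `trio_ctx` here are illustrative names.)

```python
import asyncio

import trio
import tractor
from tractor import to_asyncio


async def aio_hello(to_trio, from_trio) -> None:
    # asyncio-side task: the first send sync-s with the trio-side opener
    to_trio.send_nowait('start')
    await asyncio.sleep(0.1)


@tractor.context
async def trio_ctx(ctx: tractor.Context) -> None:
    async with to_asyncio.open_channel_from(aio_hello) as (first, chan):
        assert first == 'start'
        await ctx.started(first)


async def main():
    async with tractor.open_nursery() as n:
        portal = await n.start_actor(
            'aio_daemon',
            enable_modules=[__name__],
            infect_asyncio=True,  # run this actor's runtime as an asyncio guest
        )
        async with portal.open_context(trio_ctx) as (ctx, first):
            assert first == 'start'
        await portal.cancel_actor()


if __name__ == '__main__':
    trio.run(main)
```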
@@ -508,7 +442,7 @@ We need help refining the `asyncio`-side channel API to be more

 Higher level "cluster" APIs
-***************************
+---------------------------
 To be extra terse the ``tractor`` devs have started hacking some "higher
 level" APIs for managing actor trees/clusters. These interfaces should
 generally be condsidered provisional for now but we encourage you to try
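(A runnable sketch of those provisional cluster APIs, assembled from the `open_actor_cluster(modules=[__name__])`/`portal_map` usage shown in the `sleepy_jane` hunks further down this page; not itself part of the diff.)

```python
import trio
import tractor


async def sleepy_jane():
    uid = tractor.current_actor().uid
    print(f'Yo i am actor {uid}')
    await trio.sleep_forever()


async def main():
    # spawn a flat cluster: one subactor per detected core,
    # each with this module enabled for remote invocation
    async with (
        tractor.open_actor_cluster(modules=[__name__]) as portal_map,
        trio.open_nursery() as n,
    ):
        for name, portal in portal_map.items():
            n.start_soon(portal.run, sleepy_jane)
        await trio.sleep(0.5)
        raise KeyboardInterrupt  # simulate user teardown


if __name__ == '__main__':
    try:
        trio.run(main)
    except KeyboardInterrupt:
        print('trio cancelled by KBI')
```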
@@ -565,6 +499,18 @@ spawn a flat cluster:
 .. _full worker pool re-implementation: https://github.com/goodboy/tractor/blob/master/examples/parallelism/concurrent_actors_primes.py


+Install
+-------
+From PyPi::
+
+    pip install tractor
+
+
+From git::
+
+    pip install git+git://github.com/goodboy/tractor.git
+
+
 Under the hood
 --------------
 ``tractor`` is an attempt to pair trionic_ `structured concurrency`_ with
@@ -668,26 +614,21 @@ channel`_!
 .. _adherance to: https://www.youtube.com/watch?v=7erJ1DV_Tlo&t=1821s
 .. _trio gitter channel: https://gitter.im/python-trio/general
 .. _matrix channel: https://matrix.to/#/!tractor:matrix.org
-.. _broadcasting: https://github.com/goodboy/tractor/pull/229
-.. _modern procotol: https://en.wikipedia.org/wiki/Rendezvous_protocol
 .. _pdbp: https://github.com/mdmintz/pdbp
 .. _pdb++: https://github.com/pdbpp/pdbpp
-.. _cheap or nasty: https://zguide.zeromq.org/docs/chapter7/#The-Cheap-or-Nasty-Pattern
-.. _(un)protocol: https://zguide.zeromq.org/docs/chapter7/#Unprotocols
-.. _discovery: https://zguide.zeromq.org/docs/chapter8/#Discovery
-.. _modern protocol: https://en.wikipedia.org/wiki/Rendezvous_protocol
+.. _guest mode: https://trio.readthedocs.io/en/stable/reference-lowlevel.html?highlight=guest%20mode#using-guest-mode-to-run-trio-on-top-of-other-event-loops
 .. _messages: https://en.wikipedia.org/wiki/Message_passing
 .. _trio docs: https://trio.readthedocs.io/en/latest/
 .. _blog post: https://vorpus.org/blog/notes-on-structured-concurrency-or-go-statement-considered-harmful/
 .. _structured concurrency: https://en.wikipedia.org/wiki/Structured_concurrency
 .. _SC: https://en.wikipedia.org/wiki/Structured_concurrency
 .. _libdill-docs: https://sustrik.github.io/libdill/structured-concurrency.html
+.. _structured chadcurrency: https://en.wikipedia.org/wiki/Structured_concurrency
 .. _unrequirements: https://en.wikipedia.org/wiki/Actor_model#Direct_communication_and_asynchrony
 .. _async generators: https://www.python.org/dev/peps/pep-0525/
 .. _trio-parallel: https://github.com/richardsheridan/trio-parallel
-.. _uv: https://docs.astral.sh/uv/
 .. _msgspec: https://jcristharif.com/msgspec/
-.. _guest: https://trio.readthedocs.io/en/stable/reference-lowlevel.html?highlight=guest%20mode#using-guest-mode-to-run-trio-on-top-of-other-event-loops
+.. _guest-mode: https://trio.readthedocs.io/en/stable/reference-lowlevel.html?highlight=guest%20mode#using-guest-mode-to-run-trio-on-top-of-other-event-loops


 .. |gh_actions| image:: https://img.shields.io/endpoint.svg?url=https%3A%2F%2Factions-badge.atrox.dev%2Fgoodboy%2Ftractor%2Fbadge&style=popout-square
@@ -21,12 +21,75 @@ import trio
 import pytest


+async def break_ipc(
+    stream: MsgStream,
+    method: str|None = None,
+    pre_close: bool = False,
+
+    def_method: str = 'eof',
+
+) -> None:
+    '''
+    XXX: close the channel right after an error is raised
+    purposely breaking the IPC transport to make sure the parent
+    doesn't get stuck in debug or hang on the connection join.
+    this more or less simulates an infinite msg-receive hang on
+    the other end.
+
+    '''
+    # close channel via IPC prot msging before
+    # any transport breakage
+    if pre_close:
+        await stream.aclose()
+
+    method: str = method or def_method
+    print(
+        '#################################\n'
+        'Simulating CHILD-side IPC BREAK!\n'
+        f'method: {method}\n'
+        f'pre `.aclose()`: {pre_close}\n'
+        '#################################\n'
+    )
+
+    match method:
+        case 'trans_aclose':
+            await stream._ctx.chan.transport.stream.aclose()
+
+        case 'eof':
+            await stream._ctx.chan.transport.stream.send_eof()
+
+        case 'msg':
+            await stream._ctx.chan.send(None)
+
+        # TODO: the actual real-world simulated cases like
+        # transport layer hangs and/or lower layer 2-gens type
+        # scenarios..
+        #
+        # -[ ] already have some issues for this general testing
+        # area:
+        #  - https://github.com/goodboy/tractor/issues/97
+        #  - https://github.com/goodboy/tractor/issues/124
+        #  - PR from @guille:
+        #    https://github.com/goodboy/tractor/pull/149
+        # case 'hang':
+        # TODO: framework research:
+        #
+        # - https://github.com/GuoTengda1993/pynetem
+        # - https://github.com/shopify/toxiproxy
+        # - https://manpages.ubuntu.com/manpages/trusty/man1/wirefilter.1.html
+
+        case _:
+            raise RuntimeError(
+                f'IPC break method unsupported: {method}'
+            )
+
+
 async def break_ipc_then_error(
     stream: MsgStream,
     break_ipc_with: str|None = None,
     pre_close: bool = False,
 ):
-    await _testing.break_ipc(
+    await break_ipc(
         stream=stream,
         method=break_ipc_with,
         pre_close=pre_close,
@@ -58,32 +121,25 @@ async def recv_and_spawn_net_killers(
     Receive stream msgs and spawn some IPC killers mid-stream.

     '''
-    broke_ipc: bool = False
     await ctx.started()
     async with (
         ctx.open_stream() as stream,
-        trio.open_nursery(
-            strict_exception_groups=False,
-        ) as tn,
+        trio.open_nursery() as n,
     ):
         async for i in stream:
             print(f'child echoing {i}')
-            if not broke_ipc:
-                await stream.send(i)
-            else:
-                await trio.sleep(0.01)
+            await stream.send(i)

             if (
                 break_ipc_after
                 and
                 i >= break_ipc_after
             ):
-                broke_ipc = True
-                tn.start_soon(
+                n.start_soon(
                     iter_ipc_stream,
                     stream,
                 )
-                tn.start_soon(
+                n.start_soon(
                     partial(
                         break_ipc_then_error,
                         stream=stream,
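(The `-` side above passes `strict_exception_groups=False` to `trio.open_nursery()`. A standalone demo of what that flag changes, pure `trio` with no `tractor`, is below: in non-strict mode a single failure propagates bare rather than wrapped in an `ExceptionGroup`.)

```python
import trio


async def main():
    try:
        async with trio.open_nursery(
            strict_exception_groups=False,  # the `-` side's setting
        ) as tn:
            tn.start_soon(trio.sleep, 1)  # child gets cancelled
            raise ValueError('boom')
    except ValueError:
        # with strict mode (trio's newer default) this would instead
        # arrive wrapped as an ExceptionGroup
        print('caught the bare, un-grouped error')


trio.run(main)
```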
@@ -186,13 +242,14 @@ async def main(
             # await stream._ctx.chan.send(None)
             # await stream._ctx.chan.transport.stream.send_eof()
             await stream._ctx.chan.transport.stream.aclose()

             ipc_break_sent = True

         # it actually breaks right here in the
-        # mp_spawn/forkserver backends and thus the
-        # zombie reaper never even kicks in?
+        # mp_spawn/forkserver backends and thus the zombie
+        # reaper never even kicks in?
+        print(f'parent sending {i}')
         try:
-            print(f'parent sending {i}')
             await stream.send(i)
         except ContextCancelled as ctxc:
             print(

@@ -205,13 +262,6 @@ async def main(
             # TODO: is this needed or no?
             raise

-        except trio.ClosedResourceError:
-            # NOTE: don't send if we already broke the
-            # connection to avoid raising a closed-error
-            # such that we drop through to the ctl-c
-            # mashing by user.
-            await trio.sleep(0.01)
-
         # timeout: int = 1
         # with trio.move_on_after(timeout) as cs:
         async with stuff_hangin_ctlc() as timeout:
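(The `-` side adds a `trio.ClosedResourceError` handler around `stream.send()`. A tiny standalone demo of when `trio` raises that error, using a memory channel in place of the IPC stream:)

```python
import trio


async def main():
    send, recv = trio.open_memory_channel(0)
    await send.aclose()
    try:
        await send.send('hi')
    except trio.ClosedResourceError:
        # same error type the parent-side send loop guards against
        # once the IPC transport has been deliberately broken
        print('send on a closed resource -> ClosedResourceError')


trio.run(main)
```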
@@ -1,16 +1,8 @@
-'''
-Examples of using the builtin `breakpoint()` from an `asyncio.Task`
-running in a subactor spawned with `infect_asyncio=True`.
-
-'''
 import asyncio

 import trio
 import tractor
-from tractor import (
-    to_asyncio,
-    Portal,
-)
+from tractor import to_asyncio


 async def aio_sleep_forever():
@@ -25,21 +17,21 @@ async def bp_then_error(

 ) -> None:

-    # sync with `trio`-side (caller) task
+    # sync with ``trio``-side (caller) task
     to_trio.send_nowait('start')

     # NOTE: what happens here inside the hook needs some refinement..
     # => seems like it's still `._debug._set_trace()` but
     # we set `Lock.local_task_in_debug = 'sync'`, we probably want
-    # some further, at least, meta-data about the task/actor in debug
-    # in terms of making it clear it's `asyncio` mucking about.
-    breakpoint()  # asyncio-side
+    # some further, at least, meta-data about the task/actoq in debug
+    # in terms of making it clear it's asyncio mucking about.
+    breakpoint()

     # short checkpoint / delay
-    await asyncio.sleep(0.5)  # asyncio-side
+    await asyncio.sleep(0.5)

     if raise_after_bp:
-        raise ValueError('asyncio side error!')
+        raise ValueError('blah')

     # TODO: test case with this so that it gets cancelled?
     else:
@@ -57,21 +49,23 @@ async def trio_ctx(
     # this will block until the ``asyncio`` task sends a "first"
     # message, see first line in above func.
     async with (

         to_asyncio.open_channel_from(
             bp_then_error,
-            # raise_after_bp=not bp_before_started,
+            raise_after_bp=not bp_before_started,
         ) as (first, chan),

-        trio.open_nursery() as tn,
+        trio.open_nursery() as n,
     ):

         assert first == 'start'

         if bp_before_started:
-            await tractor.pause()  # trio-side
+            await tractor.breakpoint()

-        await ctx.started(first)  # trio-side
+        await ctx.started(first)

-        tn.start_soon(
+        n.start_soon(
             to_asyncio.run_task,
             aio_sleep_forever,
         )
@@ -79,50 +73,37 @@ async def trio_ctx(


 async def main(
-    bps_all_over: bool = True,
+    bps_all_over: bool = False,

-    # TODO, WHICH OF THESE HAZ BUGZ?
-    cancel_from_root: bool = False,
-    err_from_root: bool = False,

 ) -> None:

-    async with tractor.open_nursery(
-        debug_mode=True,
-        maybe_enable_greenback=True,
-        # loglevel='devx',
-    ) as an:
-        ptl: Portal = await an.start_actor(
+    async with tractor.open_nursery() as n:
+
+        p = await n.start_actor(
             'aio_daemon',
             enable_modules=[__name__],
             infect_asyncio=True,
             debug_mode=True,
-            # loglevel='cancel',
+            loglevel='cancel',
         )

-        async with ptl.open_context(
+        async with p.open_context(
             trio_ctx,
             bp_before_started=bps_all_over,
         ) as (ctx, first):

             assert first == 'start'

-            # pause in parent to ensure no cross-actor
-            # locking problems exist!
-            await tractor.pause()  # trio-root
-
-            if cancel_from_root:
-                await ctx.cancel()
-
-            if err_from_root:
-                assert 0
-            else:
-                await trio.sleep_forever()
+            if bps_all_over:
+                await tractor.breakpoint()

+            # await trio.sleep_forever()
+            await ctx.cancel()
+            assert 0

         # TODO: case where we cancel from trio-side while asyncio task
         # has debugger lock?
-        # await ptl.cancel_actor()
+        # await p.cancel_actor()


 if __name__ == '__main__':
@@ -1,5 +1,5 @@
 '''
-Fast fail test with a `Context`.
+Fast fail test with a context.

 Ensure the partially initialized sub-actor process
 doesn't cause a hang on error/cancel of the parent
@@ -4,15 +4,9 @@ import trio

 async def breakpoint_forever():
     "Indefinitely re-enter debugger in child actor."
-    try:
-        while True:
-            yield 'yo'
-            await tractor.pause()
-    except BaseException:
-        tractor.log.get_console_log().exception(
-            'Cancelled while trying to enter pause point!'
-        )
-        raise
+    while True:
+        yield 'yo'
+        await tractor.breakpoint()


 async def name_error():
@@ -21,14 +15,11 @@ async def name_error():


 async def main():
-    '''
-    Test breakpoint in a streaming actor.
-
-    '''
+    """Test breakpoint in a streaming actor.
+    """
     async with tractor.open_nursery(
         debug_mode=True,
-        loglevel='cancel',
-        # loglevel='devx',
+        loglevel='error',
     ) as n:

         p0 = await n.start_actor('bp_forever', enable_modules=[__name__])
@@ -41,7 +32,7 @@ async def main():
         try:
             await p1.run(name_error)
         except tractor.RemoteActorError as rae:
-            assert rae.boxed_type is NameError
+            assert rae.type is NameError

         async for i in stream:
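(A self-contained sketch of the remote-error assertion this hunk touches, built only from calls that appear elsewhere in this compare; `.boxed_type` is the `-` side's newer accessor for the shipped-over exception type, `.type` the older one.)

```python
import trio
import tractor


async def name_error():
    getattr(doggypants)  # noqa - undefined name, raises NameError remotely


async def main():
    async with tractor.open_nursery() as n:
        portal = await n.run_in_actor(name_error)
        try:
            await portal.result()
        except tractor.RemoteActorError as rae:
            # newer tractor: `.boxed_type`; older releases: `.type`
            assert rae.boxed_type is NameError


if __name__ == '__main__':
    trio.run(main)
```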
@@ -10,7 +10,7 @@ async def name_error():
 async def breakpoint_forever():
     "Indefinitely re-enter debugger in child actor."
     while True:
-        await tractor.pause()
+        await tractor.breakpoint()

         # NOTE: if the test never sent 'q'/'quit' commands
         # on the pdb repl, without this checkpoint line the

@@ -45,7 +45,6 @@ async def spawn_until(depth=0):
     )


-# TODO: notes on the new boxed-relayed errors through proxy actors
 async def main():
     """The main ``tractor`` routine.
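(The `tractor.pause()` vs `tractor.breakpoint()` swap recurs across many hunks on this page; one minimal runnable sketch covers them all. `pause()` is the `-` side's newer spelling of the older `breakpoint()` on the `+` side; both drop the current task into the multi-process REPL.)

```python
import trio
import tractor


async def main():
    async with tractor.open_nursery(
        debug_mode=True,  # required for the multi-process REPL
    ) as n:
        await tractor.pause()  # older code: `await tractor.breakpoint()`


if __name__ == '__main__':
    trio.run(main)
```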
@@ -40,7 +40,7 @@ async def main():
     """
     async with tractor.open_nursery(
         debug_mode=True,
-        loglevel='devx',
+        # loglevel='cancel',
     ) as n:

         # spawn both actors
@@ -6,7 +6,7 @@ async def breakpoint_forever():
     "Indefinitely re-enter debugger in child actor."
     while True:
         await trio.sleep(0.1)
-        await tractor.pause()
+        await tractor.breakpoint()


 async def name_error():

@@ -38,7 +38,6 @@ async def main():
     """
     async with tractor.open_nursery(
         debug_mode=True,
-        # loglevel='runtime',
     ) as n:

         # Spawn both actors, don't bother with collecting results
@@ -23,6 +23,5 @@ async def main():
     n.start_soon(debug_actor.run, die)
     n.start_soon(crash_boi.run, die)

-
 if __name__ == '__main__':
     trio.run(main)
@@ -1,56 +0,0 @@
-import trio
-import tractor
-
-
-@tractor.context
-async def name_error(
-    ctx: tractor.Context,
-):
-    '''
-    Raise a `NameError`, catch it and enter `.post_mortem()`, then
-    expect the `._rpc._invoke()` crash handler to also engage.
-
-    '''
-    try:
-        getattr(doggypants)  # noqa (on purpose)
-    except NameError:
-        await tractor.post_mortem()
-        raise
-
-
-async def main():
-    '''
-    Test 3 `PdbREPL` entries:
-    - one in the child due to manual `.post_mortem()`,
-    - another in the child due to runtime RPC crash handling.
-    - final one here in parent from the RAE.
-
-    '''
-    # XXX NOTE: ideally the REPL arrives at this frame in the parent
-    # ONE UP FROM the inner ctx block below!
-    async with tractor.open_nursery(
-        debug_mode=True,
-        # loglevel='cancel',
-    ) as an:
-        p: tractor.Portal = await an.start_actor(
-            'child',
-            enable_modules=[__name__],
-        )
-
-        # XXX should raise `RemoteActorError[NameError]`
-        # AND be the active frame when REPL enters!
-        try:
-            async with p.open_context(name_error) as (ctx, first):
-                assert first
-        except tractor.RemoteActorError as rae:
-            assert rae.boxed_type is NameError
-
-            # manually handle in root's parent task
-            await tractor.post_mortem()
-            raise
-        else:
-            raise RuntimeError('IPC ctx should have remote errored!?')
-
-
-if __name__ == '__main__':
-    trio.run(main)
@@ -6,44 +6,19 @@ import tractor


 async def main() -> None:
+    async with tractor.open_nursery(debug_mode=True) as an:

-    # intially unset, no entry.
-    orig_pybp_var: int = os.environ.get('PYTHONBREAKPOINT')
-    assert orig_pybp_var in {None, "0"}
+        assert os.environ['PYTHONBREAKPOINT'] == 'tractor._debug._set_trace'

-    async with tractor.open_nursery(
-        debug_mode=True,
-    ) as an:
-        assert an
-        assert (
-            (pybp_var := os.environ['PYTHONBREAKPOINT'])
-            ==
-            'tractor.devx._debug._sync_pause_from_builtin'
-        )
-
         # TODO: an assert that verifies the hook has indeed been, hooked
         # XD
-        assert (
-            (pybp_hook := sys.breakpointhook)
-            is not tractor.devx._debug._set_trace
-        )
+        assert sys.breakpointhook is not tractor._debug._set_trace

-        print(
-            f'$PYTHONOBREAKPOINT: {pybp_var!r}\n'
-            f'`sys.breakpointhook`: {pybp_hook!r}\n'
-        )
-        breakpoint()  # first bp, tractor hook set.
+        breakpoint()

-    # XXX AFTER EXIT (of actor-runtime) verify the hook is unset..
-    #
-    # YES, this is weird but it's how stdlib docs say to do it..
-    # https://docs.python.org/3/library/sys.html#sys.breakpointhook
-    assert os.environ.get('PYTHONBREAKPOINT') is orig_pybp_var
+    # TODO: an assert that verifies the hook is unhooked..
     assert sys.breakpointhook
-
-    # now ensure a regular builtin pause still works
-    breakpoint()  # last bp, stdlib hook restored
+    breakpoint()


 if __name__ == '__main__':
     trio.run(main)
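(Stdlib mechanics this file exercises, for reference: `breakpoint()` dispatches to `sys.breakpointhook`, which honours the `PYTHONBREAKPOINT` env var on *every* call; tractor swaps in its own hook while `debug_mode=True` is active. A runnable, non-interactive demo:)

```python
import os
import sys

# the default hook is installed at interpreter startup
assert sys.breakpointhook is sys.__breakpointhook__

os.environ['PYTHONBREAKPOINT'] = '0'  # '0' disables breakpoint() entirely
breakpoint()  # no-op thanks to the env var above
print('breakpoint() skipped as expected')
```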
@@ -10,7 +10,7 @@ async def main():

     await trio.sleep(0.1)

-    await tractor.pause()
+    await tractor.breakpoint()

     await trio.sleep(0.1)
@@ -2,16 +2,13 @@ import trio
 import tractor


-async def main(
-    registry_addrs: tuple[str, int]|None = None
-):
+async def main():

     async with tractor.open_root_actor(
         debug_mode=True,
-        # loglevel='runtime',
     ):
         while True:
-            await tractor.pause()
+            await tractor.breakpoint()


 if __name__ == '__main__':
@@ -1,83 +0,0 @@
-'''
-Verify we can dump a `stackscope` tree on a hang.
-
-'''
-import os
-import signal
-
-import trio
-import tractor
-
-@tractor.context
-async def start_n_shield_hang(
-    ctx: tractor.Context,
-):
-    # actor: tractor.Actor = tractor.current_actor()
-
-    # sync to parent-side task
-    await ctx.started(os.getpid())
-
-    print('Entering shield sleep..')
-    with trio.CancelScope(shield=True):
-        await trio.sleep_forever()  # in subactor
-
-    # XXX NOTE ^^^ since this shields, we expect
-    # the zombie reaper (aka T800) to engage on
-    # SIGINT from the user and eventually hard-kill
-    # this subprocess!
-
-
-async def main(
-    from_test: bool = False,
-) -> None:
-
-    async with (
-        tractor.open_nursery(
-            debug_mode=True,
-            enable_stack_on_sig=True,
-            # maybe_enable_greenback=False,
-            loglevel='devx',
-        ) as an,
-    ):
-        ptl: tractor.Portal = await an.start_actor(
-            'hanger',
-            enable_modules=[__name__],
-            debug_mode=True,
-        )
-        async with ptl.open_context(
-            start_n_shield_hang,
-        ) as (ctx, cpid):
-
-            _, proc, _ = an._children[ptl.chan.uid]
-            assert cpid == proc.pid
-
-            print(
-                'Yo my child hanging..?\n'
-                # "i'm a user who wants to see a `stackscope` tree!\n"
-            )
-
-            # XXX simulate the wrapping test's "user actions"
-            # (i.e. if a human didn't run this manually but wants to
-            # know what they should do to reproduce test behaviour)
-            if from_test:
-                print(
-                    f'Sending SIGUSR1 to {cpid!r}!\n'
-                )
-                os.kill(
-                    cpid,
-                    signal.SIGUSR1,
-                )
-
-                # simulate user cancelling program
-                await trio.sleep(0.5)
-                os.kill(
-                    os.getpid(),
-                    signal.SIGINT,
-                )
-            else:
-                # actually let user send the ctl-c
-                await trio.sleep_forever()  # in root
-
-
-if __name__ == '__main__':
-    trio.run(main)
@@ -1,88 +0,0 @@
-import trio
-import tractor
-
-
-async def cancellable_pause_loop(
-    task_status: trio.TaskStatus[trio.CancelScope] = trio.TASK_STATUS_IGNORED
-):
-    with trio.CancelScope() as cs:
-        task_status.started(cs)
-        for _ in range(3):
-            try:
-                # ON first entry, there is no level triggered
-                # cancellation yet, so this cp does a parent task
-                # ctx-switch so that this scope raises for the NEXT
-                # checkpoint we hit.
-                await trio.lowlevel.checkpoint()
-                await tractor.pause()
-
-                cs.cancel()
-
-                # parent should have called `cs.cancel()` by now
-                await trio.lowlevel.checkpoint()
-
-            except trio.Cancelled:
-                print('INSIDE SHIELDED PAUSE')
-                await tractor.pause(shield=True)
-        else:
-            # should raise it again, bubbling up to parent
-            print('BUBBLING trio.Cancelled to parent task-nursery')
-            await trio.lowlevel.checkpoint()
-
-
-async def pm_on_cancelled():
-    async with trio.open_nursery() as tn:
-        tn.cancel_scope.cancel()
-        try:
-            await trio.sleep_forever()
-        except trio.Cancelled:
-            # should also raise `Cancelled` since
-            # we didn't pass `shield=True`.
-            try:
-                await tractor.post_mortem(hide_tb=False)
-            except trio.Cancelled as taskc:
-
-                # should enter just fine, in fact it should
-                # be debugging the internals of the previous
-                # sin-shield call above Bo
-                await tractor.post_mortem(
-                    hide_tb=False,
-                    shield=True,
-                )
-                raise taskc
-
-        else:
-            raise RuntimeError('Dint cancel as expected!?')
-
-
-async def cancelled_before_pause(
-):
-    '''
-    Verify that using a shielded pause works despite surrounding
-    cancellation called state in the calling task.
-
-    '''
-    async with trio.open_nursery() as tn:
-        cs: trio.CancelScope = await tn.start(cancellable_pause_loop)
-        await trio.sleep(0.1)
-
-    assert cs.cancelled_caught
-
-    await pm_on_cancelled()
-
-
-async def main():
-    async with tractor.open_nursery(
-        debug_mode=True,
-    ) as n:
-        portal: tractor.Portal = await n.run_in_actor(
-            cancelled_before_pause,
-        )
-        await portal.result()
-
-        # ensure the same works in the root actor!
-        await pm_on_cancelled()
-
-
-if __name__ == '__main__':
-    trio.run(main)
@@ -4,9 +4,9 @@ import trio

 async def gen():
     yield 'yo'
-    await tractor.pause()
+    await tractor.breakpoint()
     yield 'yo'
-    await tractor.pause()
+    await tractor.breakpoint()


 @tractor.context

@@ -15,7 +15,7 @@ async def just_bp(
 ) -> None:

     await ctx.started()
-    await tractor.pause()
+    await tractor.breakpoint()

     # TODO: bps and errors in this call..
     async for val in gen():
@@ -3,20 +3,17 @@ import tractor


 async def breakpoint_forever():
-    '''
-    Indefinitely re-enter debugger in child actor.
-
-    '''
+    """Indefinitely re-enter debugger in child actor.
+    """
     while True:
         await trio.sleep(0.1)
-        await tractor.pause()
+        await tractor.breakpoint()


 async def main():

     async with tractor.open_nursery(
         debug_mode=True,
-        loglevel='cancel',
     ) as n:

         portal = await n.run_in_actor(
@@ -3,26 +3,16 @@ import tractor


 async def name_error():
-    getattr(doggypants)  # noqa (on purpose)
+    getattr(doggypants)


 async def main():
     async with tractor.open_nursery(
         debug_mode=True,
-        # loglevel='transport',
-    ) as an:
-
-        # TODO: ideally the REPL arrives at this frame in the parent,
-        # ABOVE the @api_frame of `Portal.run_in_actor()` (which
-        # should eventually not even be a portal method ... XD)
-        # await tractor.pause()
-        p: tractor.Portal = await an.run_in_actor(name_error)
-
-        # with this style, should raise on this line
-        await p.result()
-
-        # with this alt style should raise at `open_nusery()`
-        # return await p.result()
+    ) as n:
+        portal = await n.run_in_actor(name_error)
+        await portal.result()


 if __name__ == '__main__':
@@ -1,169 +0,0 @@
-from functools import partial
-import time
-
-import trio
-import tractor
-
-# TODO: only import these when not running from test harness?
-# can we detect `pexpect` usage maybe?
-# from tractor.devx._debug import (
-#     get_lock,
-#     get_debug_req,
-# )
-
-
-def sync_pause(
-    use_builtin: bool = False,
-    error: bool = False,
-    hide_tb: bool = True,
-    pre_sleep: float|None = None,
-):
-    if pre_sleep:
-        time.sleep(pre_sleep)
-
-    if use_builtin:
-        breakpoint(hide_tb=hide_tb)
-
-    else:
-        # TODO: maybe for testing some kind of cm style interface
-        # where the `._set_trace()` call doesn't happen until block
-        # exit?
-        # assert get_lock().ctx_in_debug is None
-        # assert get_debug_req().repl is None
-        tractor.pause_from_sync()
-        # assert get_debug_req().repl is None
-
-    if error:
-        raise RuntimeError('yoyo sync code error')
-
-
-@tractor.context
-async def start_n_sync_pause(
-    ctx: tractor.Context,
-):
-    actor: tractor.Actor = tractor.current_actor()
-
-    # sync to parent-side task
-    await ctx.started()
-
-    print(f'Entering `sync_pause()` in subactor: {actor.uid}\n')
-    sync_pause()
-    print(f'Exited `sync_pause()` in subactor: {actor.uid}\n')
-
-
-async def main() -> None:
-    async with (
-        tractor.open_nursery(
-            debug_mode=True,
-            maybe_enable_greenback=True,
-            enable_stack_on_sig=True,
-            # loglevel='warning',
-            # loglevel='devx',
-        ) as an,
-        trio.open_nursery() as tn,
-    ):
-        # just from root task
-        sync_pause()
-
-        p: tractor.Portal = await an.start_actor(
-            'subactor',
-            enable_modules=[__name__],
-            # infect_asyncio=True,
-            debug_mode=True,
-        )
-
-        # TODO: 3 sub-actor usage cases:
-        # -[x] via a `.open_context()`
-        # -[ ] via a `.run_in_actor()` call
-        # -[ ] via a `.run()`
-        # -[ ] via a `.to_thread.run_sync()` in subactor
-        async with p.open_context(
-            start_n_sync_pause,
-        ) as (ctx, first):
-            assert first is None
-
-            # TODO: handle bg-thread-in-root-actor special cases!
-            #
-            # there are a couple very subtle situations possible here
-            # and they are likely to become more important as cpython
-            # moves to support no-GIL.
-            #
-            # Cases:
-            # 1. root-actor bg-threads that call `.pause_from_sync()`
-            #   whilst an in-tree subactor also is using ` .pause()`.
-            # |_ since the root-actor bg thread can not
-            #   `Lock._debug_lock.acquire_nowait()` without running
-            #   a `trio.Task`, AND because the
-            #   `PdbREPL.set_continue()` is called from that
-            #   bg-thread, we can not `._debug_lock.release()`
-            #   either!
-            #  |_ this results in no actor-tree `Lock` being used
-            #    on behalf of the bg-thread and thus the subactor's
-            #    task and the thread trying to to use stdio
-            #    simultaneously which results in the classic TTY
-            #    clobbering!
-            #
-            # 2. mutiple sync-bg-threads that call
-            #   `.pause_from_sync()` where one is scheduled via
-            #   `Nursery.start_soon(to_thread.run_sync)` in a bg
-            #   task.
-            #
-            #   Due to the GIL, the threads never truly try to step
-            #   through the REPL simultaneously, BUT their `logging`
-            #   and traceback outputs are interleaved since the GIL
-            #   (seemingly) on every REPL-input from the user
-            #   switches threads..
-            #
-            #   Soo, the context switching semantics of the GIL
-            #   result in a very confusing and messy interaction UX
-            #   since eval and (tb) print output is NOT synced to
-            #   each REPL-cycle (like we normally make it via
-            #   a `.set_continue()` callback triggering the
-            #   `Lock.release()`). Ideally we can solve this
-            #   usability issue NOW because this will of course be
-            #   that much more important when eventually there is no
-            #   GIL!
-
-            # XXX should cause double REPL entry and thus TTY
-            # clobbering due to case 1. above!
-            tn.start_soon(
-                partial(
-                    trio.to_thread.run_sync,
-                    partial(
-                        sync_pause,
-                        use_builtin=False,
-                        # pre_sleep=0.5,
-                    ),
-                    abandon_on_cancel=True,
-                    thread_name='start_soon_root_bg_thread',
-                )
-            )
-
-            await tractor.pause()
-
-            # XXX should cause double REPL entry and thus TTY
-            # clobbering due to case 2. above!
-            await trio.to_thread.run_sync(
-                partial(
-                    sync_pause,
-                    # NOTE this already works fine since in the new
-                    # thread the `breakpoint()` built-in is never
-                    # overloaded, thus NO locking is used, HOWEVER
-                    # the case 2. from above still exists!
-                    use_builtin=True,
-                ),
-                # TODO: with this `False` we can hang!??!
-                # abandon_on_cancel=False,
-                abandon_on_cancel=True,
-                thread_name='inline_root_bg_thread',
-            )
-
-            await ctx.cancel()
-
-        # TODO: case where we cancel from trio-side while asyncio task
-        # has debugger lock?
-        await p.cancel_actor()
-
-
-if __name__ == '__main__':
-    trio.run(main)
@@ -1,11 +1,6 @@
 import time
 import trio
 import tractor
-from tractor import (
-    ActorNursery,
-    MsgStream,
-    Portal,
-)


 # this is the first 2 actors, streamer_1 and streamer_2

@@ -17,18 +12,14 @@ async def stream_data(seed):

 # this is the third actor; the aggregator
 async def aggregate(seed):
-    '''
-    Ensure that the two streams we receive match but only stream
+    """Ensure that the two streams we receive match but only stream
     a single set of values to the parent.
-
-    '''
-    an: ActorNursery
-    async with tractor.open_nursery() as an:
-        portals: list[Portal] = []
+    """
+    async with tractor.open_nursery() as nursery:
+        portals = []
         for i in range(1, 3):
-            # fork/spawn call
-            portal = await an.start_actor(
+            # fork point
+            portal = await nursery.start_actor(
                 name=f'streamer_{i}',
                 enable_modules=[__name__],
             )

@@ -52,11 +43,7 @@ async def aggregate(seed):
         async with trio.open_nursery() as n:

             for portal in portals:
-                n.start_soon(
-                    push_to_chan,
-                    portal,
-                    send_chan.clone(),
-                )
+                n.start_soon(push_to_chan, portal, send_chan.clone())

             # close this local task's reference to send side
             await send_chan.aclose()

@@ -73,7 +60,7 @@ async def aggregate(seed):
         print("FINISHED ITERATING in aggregator")

-        await an.cancel()
+        await nursery.cancel()
         print("WAITING on `ActorNursery` to finish")
         print("AGGREGATOR COMPLETE!")

@@ -88,21 +75,18 @@ async def main() -> list[int]:

     '''
     # yes, a nursery which spawns `trio`-"actors" B)
-    an: ActorNursery
-    async with tractor.open_nursery(
-        loglevel='cancel',
-        # debug_mode=True,
-    ) as an:
+    nursery: tractor.ActorNursery
+    async with tractor.open_nursery() as nursery:

         seed = int(1e3)
         pre_start = time.time()

-        portal: Portal = await an.start_actor(
+        portal: tractor.Portal = await nursery.start_actor(
             name='aggregator',
             enable_modules=[__name__],
         )

-        stream: MsgStream
+        stream: tractor.MsgStream
         async with portal.open_stream_from(
             aggregate,
             seed=seed,

@@ -111,12 +95,11 @@ async def main() -> list[int]:
             start = time.time()
             # the portal call returns exactly what you'd expect
             # as if the remote "aggregate" function was called locally
-            result_stream: list[int] = []
+            result_stream = []
             async for value in stream:
                 result_stream.append(value)

-            cancelled: bool = await portal.cancel_actor()
-            assert cancelled
+            await portal.cancel_actor()

             print(f"STREAM TIME = {time.time() - start}")
             print(f"STREAM + SPAWN TIME = {time.time() - pre_start}")
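(A minimal standalone sketch of the `Portal.open_stream_from()` pattern the hunks above rename around: an async-gen in a subactor exposed as an IPC stream. Not from the diff; `count` is an illustrative name.)

```python
import trio
import tractor


async def count(to: int):
    # an async generator; `open_stream_from()` exposes its output
    # as an IPC stream, just like `aggregate()` above
    for i in range(to):
        yield i


async def main():
    async with tractor.open_nursery() as an:
        portal = await an.start_actor('counter', enable_modules=[__name__])
        async with portal.open_stream_from(count, to=3) as stream:
            async for i in stream:
                print(i)  # 0, 1, 2
        await portal.cancel_actor()


if __name__ == '__main__':
    trio.run(main)
```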
@@ -3,18 +3,20 @@ import trio
 import tractor


-async def sleepy_jane() -> None:
-    uid: tuple = tractor.current_actor().uid
+async def sleepy_jane():
+    uid = tractor.current_actor().uid
     print(f'Yo i am actor {uid}')
     await trio.sleep_forever()


 async def main():
     '''
-    Spawn a flat actor cluster, with one process per detected core.
+    Spawn a flat actor cluster, with one process per
+    detected core.

     '''
     portal_map: dict[str, tractor.Portal]
+    results: dict[str, str]

     # look at this hip new syntax!
     async with (

@@ -23,16 +25,11 @@ async def main():
             modules=[__name__]
         ) as portal_map,

-        trio.open_nursery(
-            strict_exception_groups=False,
-        ) as tn,
+        trio.open_nursery() as n,
     ):

         for (name, portal) in portal_map.items():
-            tn.start_soon(
-                portal.run,
-                sleepy_jane,
-            )
+            n.start_soon(portal.run, sleepy_jane)

         await trio.sleep(0.5)

@@ -44,4 +41,4 @@ if __name__ == '__main__':
     try:
         trio.run(main)
     except KeyboardInterrupt:
-        print('trio cancelled by KBI')
+        pass
@ -9,7 +9,7 @@ async def main(service_name):
     async with tractor.open_nursery() as an:
         await an.start_actor(service_name)

-        async with tractor.get_registry('127.0.0.1', 1616) as portal:
+        async with tractor.get_arbiter('127.0.0.1', 1616) as portal:
             print(f"Arbiter is listening on {portal.channel}")

         async with tractor.wait_for_actor(service_name) as sockaddr:
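As context for the `get_registry()`/`get_arbiter()` rename above, here is a hedged sketch of the name-discovery flow (the service name is hypothetical; `wait_for_actor()` is used just as in the hunk):

```
import trio
import tractor


async def main():
    async with tractor.open_nursery() as an:
        # spawn a named daemon actor then resolve it by name
        await an.start_actor('echo_service')
        async with tractor.wait_for_actor('echo_service') as portal:
            print(f'found service at {portal.channel}')
        await an.cancel()


if __name__ == '__main__':
    trio.run(main)
```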
@ -1,18 +0,0 @@
-First generate a built dist:
-
-```
-python -m pip install --upgrade build
-python -m build --sdist --outdir dist/alpha5/
-```
-
-Then try a test ``pypi`` upload:
-
-```
-python -m twine upload --repository testpypi dist/alpha5/*
-```
-
-Then push to `pypi` for realz:
-
-```
-python -m twine upload dist/alpha5/*
-```
pyproject.toml
@ -1,111 +1,3 @@
-[build-system]
-requires = ["hatchling"]
-build-backend = "hatchling.build"
-
-# ------ build-system ------
-
-[project]
-name = "tractor"
-version = "0.1.0a6dev0"
-description = 'structured concurrent `trio`-"actors"'
-authors = [{ name = "Tyler Goodlet", email = "goodboy_foss@protonmail.com" }]
-requires-python = ">= 3.11"
-readme = "docs/README.rst"
-license = "AGPL-3.0-or-later"
-keywords = [
-    "trio",
-    "async",
-    "concurrency",
-    "structured concurrency",
-    "actor model",
-    "distributed",
-    "multiprocessing",
-]
-classifiers = [
-    "Development Status :: 3 - Alpha",
-    "Operating System :: POSIX :: Linux",
-    "Framework :: Trio",
-    "License :: OSI Approved :: GNU Affero General Public License v3 or later (AGPLv3+)",
-    "Programming Language :: Python :: Implementation :: CPython",
-    "Programming Language :: Python :: 3 :: Only",
-    "Programming Language :: Python :: 3.11",
-    "Topic :: System :: Distributed Computing",
-]
-dependencies = [
-    # trio runtime and friends
-    # (poetry) proper range specs,
-    # https://packaging.python.org/en/latest/discussions/install-requires-vs-requirements/#id5
-    # TODO, for 3.13 we must go to `0.27` which means we have to
-    # disable strict egs or port to handling them internally!
-    "trio>0.27",
-    "tricycle>=0.4.1,<0.5",
-    "wrapt>=1.16.0,<2",
-    "colorlog>=6.8.2,<7",
-    # built-in multi-actor `pdb` REPL
-    "pdbp>=1.6,<2",  # windows only (from `pdbp`)
-    # typed IPC msging
-    "msgspec>=0.19.0",
-]
-
-# ------ project ------
-
-[dependency-groups]
-dev = [
-    # test suite
-    # TODO: maybe some of these layout choices?
-    # https://docs.pytest.org/en/8.0.x/explanation/goodpractices.html#choosing-a-test-layout-import-rules
-    "pytest>=8.3.5",
-    "pexpect>=4.9.0,<5",
-    # `tractor.devx` tooling
-    "greenback>=1.2.1,<2",
-    "stackscope>=0.2.2,<0.3",
-    "pyperclip>=1.9.0",
-    "prompt-toolkit>=3.0.50",
-    "xonsh>=0.19.2",
-]
-# TODO, add these with sane versions; were originally in
-# `requirements-docs.txt`..
-# docs = [
-#     "sphinx>="
-#     "sphinx_book_theme>="
-# ]
-
-# ------ dependency-groups ------
-
-# ------ dependency-groups ------
-
-[tool.uv.sources]
-# XXX NOTE, only for @goodboy's hacking on `pprint(sort_dicts=False)`
-# for the `pp` alias..
-# pdbp = { path = "../pdbp", editable = true }
-
-# ------ tool.uv.sources ------
-# TODO, distributed (multi-host) extensions
-# linux kernel networking
-# 'pyroute2'
-
-# ------ tool.uv.sources ------
-
-[tool.uv]
-# XXX NOTE, prefer the sys python bc apparently the distros from
-# `astral` are built in a way that breaks `pdbp`+`tabcompleter`,
-# likely due to linking against `libedit` over `readline`..
-# |_https://docs.astral.sh/uv/concepts/python-versions/#managed-python-distributions
-# |_https://gregoryszorc.com/docs/python-build-standalone/main/quirks.html#use-of-libedit-on-linux
-#
-# https://docs.astral.sh/uv/reference/settings/#python-preference
-python-preference = 'system'
-
-# ------ tool.uv ------
-
-[tool.hatch.build.targets.sdist]
-include = ["tractor"]
-
-[tool.hatch.build.targets.wheel]
-include = ["tractor"]
-
-# ------ tool.hatch ------
-
 [tool.towncrier]
 package = "tractor"
 filename = "NEWS.rst"

@ -115,27 +7,26 @@ title_format = "tractor {version} ({project_date})"
 template = "nooz/_template.rst"
 all_bullets = true

 [[tool.towncrier.type]]
 directory = "feature"
 name = "Features"
 showcontent = true

 [[tool.towncrier.type]]
 directory = "bugfix"
 name = "Bug Fixes"
 showcontent = true

 [[tool.towncrier.type]]
 directory = "doc"
 name = "Improved Documentation"
 showcontent = true

 [[tool.towncrier.type]]
 directory = "trivial"
 name = "Trivial/Internal Changes"
 showcontent = true

-# ------ tool.towncrier ------

 [tool.pytest.ini_options]
 minversion = '6.0'

@ -151,8 +42,7 @@ addopts = [
     '--show-capture=no',
 ]
 log_cli = false

 # TODO: maybe some of these layout choices?
 # https://docs.pytest.org/en/8.0.x/explanation/goodpractices.html#choosing-a-test-layout-import-rules
 # pythonpath = "src"

-# ------ tool.pytest ------
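The towncrier stanzas kept above are identical on both branches; for reference, release notes would typically be compiled from the `nooz/` fragments with an invocation like the following hedged sketch (the `--draft` flag previews without writing `NEWS.rst`):

```
towncrier build --version 0.1.0a6dev0 --draft
```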
@ -1,8 +0,0 @@
-# vim: ft=ini
-# pytest.ini for tractor
-
-[pytest]
-# don't show frickin captured logs AGAIN in the report..
-addopts = --show-capture='no'
-log_cli = false
-; minversion = 6.0
@ -0,0 +1,2 @@
+sphinx
+sphinx_book_theme
@ -0,0 +1,8 @@
+pytest
+pytest-trio
+pytest-timeout
+pdbp
+mypy
+trio_typing
+pexpect
+towncrier
ruff.toml
@ -1,82 +0,0 @@
-# from default `ruff.toml` @
-# https://docs.astral.sh/ruff/configuration/
-
-# Exclude a variety of commonly ignored directories.
-exclude = [
-    ".bzr",
-    ".direnv",
-    ".eggs",
-    ".git",
-    ".git-rewrite",
-    ".hg",
-    ".ipynb_checkpoints",
-    ".mypy_cache",
-    ".nox",
-    ".pants.d",
-    ".pyenv",
-    ".pytest_cache",
-    ".pytype",
-    ".ruff_cache",
-    ".svn",
-    ".tox",
-    ".venv",
-    ".vscode",
-    "__pypackages__",
-    "_build",
-    "buck-out",
-    "build",
-    "dist",
-    "node_modules",
-    "site-packages",
-    "venv",
-]
-
-# Same as Black.
-line-length = 88
-indent-width = 4
-
-# Assume Python 3.11.
-target-version = "py311"
-
-[lint]
-# Enable Pyflakes (`F`) and a subset of the pycodestyle (`E`) codes by default.
-# Unlike Flake8, Ruff doesn't enable pycodestyle warnings (`W`) or
-# McCabe complexity (`C901`) by default.
-select = ["E4", "E7", "E9", "F"]
-ignore = [
-    'E402',  # https://docs.astral.sh/ruff/rules/module-import-not-at-top-of-file/
-]
-
-# Allow fix for all enabled rules (when `--fix` is provided).
-fixable = ["ALL"]
-unfixable = []
-
-# Allow unused variables when underscore-prefixed.
-# dummy-variable-rgx = "^(_+|(_+[a-zA-Z0-9_]*[a-zA-Z0-9]+?))$"
-
-[format]
-# Use single quotes in `ruff format`.
-quote-style = "single"
-
-# Like Black, indent with spaces, rather than tabs.
-indent-style = "space"
-
-# Like Black, respect magic trailing commas.
-skip-magic-trailing-comma = false
-
-# Like Black, automatically detect the appropriate line ending.
-line-ending = "auto"
-
-# Enable auto-formatting of code examples in docstrings. Markdown,
-# reStructuredText code/literal blocks and doctests are all supported.
-#
-# This is currently disabled by default, but it is planned for this
-# to be opt-out in the future.
-docstring-code-format = false
-
-# Set the line length limit used when formatting code snippets in
-# docstrings.
-#
-# This only has an effect when the `docstring-code-format` setting is
-# enabled.
-docstring-code-line-length = "dynamic"
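Since the removed config above drives both the linter and the formatter, the standard `ruff` invocations it configured are, for reference (stock CLI, not branch-specific):

```
ruff check .    # lint with the `select`/`ignore` rules above
ruff format .   # apply the single-quote, space-indent style above
```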
@ -0,0 +1,102 @@
+#!/usr/bin/env python
+#
+# tractor: structured concurrent "actors".
+#
+# Copyright 2018-eternity Tyler Goodlet.
+
+# This program is free software: you can redistribute it and/or modify
+# it under the terms of the GNU Affero General Public License as published by
+# the Free Software Foundation, either version 3 of the License, or
+# (at your option) any later version.
+
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
+# GNU Affero General Public License for more details.
+
+# You should have received a copy of the GNU Affero General Public License
+# along with this program.  If not, see <https://www.gnu.org/licenses/>.
+
+from setuptools import setup
+
+with open('docs/README.rst', encoding='utf-8') as f:
+    readme = f.read()
+
+
+setup(
+    name="tractor",
+    version='0.1.0a6dev0',  # alpha zone
+    description='structured concurrent `trio`-"actors"',
+    long_description=readme,
+    license='AGPLv3',
+    author='Tyler Goodlet',
+    maintainer='Tyler Goodlet',
+    maintainer_email='goodboy_foss@protonmail.com',
+    url='https://github.com/goodboy/tractor',
+    platforms=['linux', 'windows'],
+    packages=[
+        'tractor',
+        'tractor.experimental',  # wacky ideas
+        'tractor.trionics',  # trio extensions
+        'tractor.msg',  # lowlevel data types
+        'tractor.devx',  # "dev-experience"
+    ],
+    install_requires=[
+
+        # trio related
+        # proper range spec:
+        # https://packaging.python.org/en/latest/discussions/install-requires-vs-requirements/#id5
+        'trio >= 0.24',
+
+        # 'async_generator',  # in stdlib mostly!
+        # 'trio_typing',  # trio==0.23.0 has type hints!
+        # 'exceptiongroup',  # in stdlib as of 3.11!
+
+        # tooling
+        'stackscope',
+        'tricycle',
+        'trio_typing',
+        'colorlog',
+        'wrapt',
+
+        # IPC serialization
+        'msgspec',
+
+        # debug mode REPL
+        'pdbp',
+
+        # TODO: distributed transport using
+        # linux kernel networking
+        # 'pyroute2',
+
+        # pip ref docs on these specs:
+        # https://pip.pypa.io/en/stable/reference/requirement-specifiers/#examples
+        # and pep:
+        # https://peps.python.org/pep-0440/#version-specifiers
+
+    ],
+    tests_require=['pytest'],
+    python_requires=">=3.10",
+    keywords=[
+        'trio',
+        'async',
+        'concurrency',
+        'structured concurrency',
+        'actor model',
+        'distributed',
+        'multiprocessing'
+    ],
+    classifiers=[
+        "Development Status :: 3 - Alpha",
+        "Operating System :: POSIX :: Linux",
+        "Operating System :: Microsoft :: Windows",
+        "Framework :: Trio",
+        "License :: OSI Approved :: GNU Affero General Public License v3 or later (AGPLv3+)",
+        "Programming Language :: Python :: Implementation :: CPython",
+        "Programming Language :: Python :: 3 :: Only",
+        "Programming Language :: Python :: 3.10",
+        "Intended Audience :: Science/Research",
+        "Intended Audience :: Developers",
+        "Topic :: System :: Distributed Computing",
+    ],
+)
@ -41,46 +41,22 @@ no_windows = pytest.mark.skipif(

 def pytest_addoption(parser):
     parser.addoption(
-        "--ll",
-        action="store",
-        dest='loglevel',
+        "--ll", action="store", dest='loglevel',
         default='ERROR', help="logging level to set when testing"
     )

     parser.addoption(
-        "--spawn-backend",
-        action="store",
-        dest='spawn_backend',
+        "--spawn-backend", action="store", dest='spawn_backend',
         default='trio',
         help="Processing spawning backend to use for test run",
     )

-    parser.addoption(
-        "--tpdb", "--debug-mode",
-        action="store_true",
-        dest='tractor_debug_mode',
-        # default=False,
-        help=(
-            'Enable a flag that can be used by tests to set the '
-            '`debug_mode: bool` for engaging the internal '
-            'multi-proc debugger sys.'
-        ),
-    )


 def pytest_configure(config):
     backend = config.option.spawn_backend
     tractor._spawn.try_set_start_method(backend)


-@pytest.fixture(scope='session')
-def debug_mode(request):
-    debug_mode: bool = request.config.option.tractor_debug_mode
-    # if debug_mode:
-    #     breakpoint()
-    return debug_mode
-
-
 @pytest.fixture(scope='session', autouse=True)
 def loglevel(request):
     orig = tractor.log._default_loglevel

@ -95,12 +71,6 @@ def spawn_backend(request) -> str:
     return request.config.option.spawn_backend


-# @pytest.fixture(scope='function', autouse=True)
-# def debug_enabled(request) -> str:
-#     from tractor import _state
-#     if _state._runtime_vars['_debug_mode']:
-#         breakpoint()

 _ci_env: bool = os.environ.get('CI', False)


@ -123,18 +93,12 @@ _reg_addr: tuple[str, int] = (
     '127.0.0.1',
     random.randint(1000, 9999),
 )
+_arb_addr = _reg_addr


 @pytest.fixture(scope='session')
-def reg_addr() -> tuple[str, int]:
-    # globally override the runtime to the per-test-session-dynamic
-    # addr so that all tests never conflict with any other actor
-    # tree using the default.
-    from tractor import _root
-    _root._default_lo_addrs = [_reg_addr]
-
-    return _reg_addr
+def arb_addr():
+    return _arb_addr


 def pytest_generate_tests(metafunc):

@ -159,18 +123,6 @@ def pytest_generate_tests(metafunc):
     metafunc.parametrize("start_method", [spawn_backend], scope='module')


-# TODO: a way to let test scripts (like from `examples/`)
-# guarantee they won't registry addr collide!
-# @pytest.fixture
-# def open_test_runtime(
-#     reg_addr: tuple,
-# ) -> AsyncContextManager:
-#     return partial(
-#         tractor.open_nursery,
-#         registry_addrs=[reg_addr],
-#     )


 def sig_prog(proc, sig):
     "Kill the actor-process with ``sig``."
     proc.send_signal(sig)

@ -188,35 +140,30 @@ def sig_prog(proc, sig):
 def daemon(
     loglevel: str,
     testdir,
-    reg_addr: tuple[str, int],
+    arb_addr: tuple[str, int],
 ):
     '''
-    Run a daemon root actor as a separate actor-process tree and
-    "remote registrar" for discovery-protocol related tests.
+    Run a daemon actor as a "remote arbiter".

     '''
     if loglevel in ('trace', 'debug'):
-        # XXX: too much logging will lock up the subproc (smh)
-        loglevel: str = 'info'
+        # too much logging will lock up the subproc (smh)
+        loglevel = 'info'

-    code: str = (
-        "import tractor; "
-        "tractor.run_daemon([], registry_addrs={reg_addrs}, loglevel={ll})"
-    ).format(
-        reg_addrs=str([reg_addr]),
-        ll="'{}'".format(loglevel) if loglevel else None,
-    )
-    cmd: list[str] = [
-        sys.executable,
-        '-c', code,
+    cmdargs = [
+        sys.executable, '-c',
+        "import tractor; tractor.run_daemon([], registry_addr={}, loglevel={})"
+        .format(
+            arb_addr,
+            "'{}'".format(loglevel) if loglevel else None)
     ]
-    kwargs = {}
+    kwargs = dict()
     if platform.system() == 'Windows':
         # without this, tests hang on windows forever
         kwargs['creationflags'] = subprocess.CREATE_NEW_PROCESS_GROUP

     proc = testdir.popen(
-        cmd,
+        cmdargs,
         stdout=subprocess.PIPE,
         stderr=subprocess.PIPE,
         **kwargs,
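Tying the `main`-side pieces above together, a test would consume the session-scoped `reg_addr` fixture roughly like this hedged sketch (the test and actor names are hypothetical; the `registry_addrs` kwarg spelling comes from the commented `open_test_runtime` helper above):

```
import trio
import tractor


def test_uses_session_registry(reg_addr: tuple[str, int]):
    # the randomized per-session port means concurrent test runs
    # never collide on the default registry address
    async def main():
        async with tractor.open_nursery(
            registry_addrs=[reg_addr],
        ) as an:
            portal = await an.start_actor('sleeper')
            await portal.cancel_actor()

    trio.run(main)
```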
@ -1,243 +0,0 @@
-'''
-`tractor.devx.*` tooling sub-pkg test space.
-
-'''
-import time
-from typing import (
-    Callable,
-)
-
-import pytest
-from pexpect.exceptions import (
-    TIMEOUT,
-)
-from pexpect.spawnbase import SpawnBase
-
-from tractor._testing import (
-    mk_cmd,
-)
-from tractor.devx._debug import (
-    _pause_msg as _pause_msg,
-    _crash_msg as _crash_msg,
-    _repl_fail_msg as _repl_fail_msg,
-    _ctlc_ignore_header as _ctlc_ignore_header,
-)
-from ..conftest import (
-    _ci_env,
-)
-
-
-@pytest.fixture
-def spawn(
-    start_method,
-    testdir: pytest.Pytester,
-    reg_addr: tuple[str, int],
-
-) -> Callable[[str], None]:
-    '''
-    Use the `pexpect` module shipped via `testdir.spawn()` to
-    run an `./examples/..` script by name.
-
-    '''
-    if start_method != 'trio':
-        pytest.skip(
-            '`pexpect` based tests only supported on `trio` backend'
-        )
-
-    def unset_colors():
-        '''
-        Python 3.13 introduced colored tracebacks that break pattern
-        matching,
-
-        https://docs.python.org/3/using/cmdline.html#envvar-PYTHON_COLORS
-        https://docs.python.org/3/using/cmdline.html#using-on-controlling-color
-
-        '''
-        import os
-        os.environ['PYTHON_COLORS'] = '0'
-
-    def _spawn(
-        cmd: str,
-        **mkcmd_kwargs,
-    ):
-        unset_colors()
-        return testdir.spawn(
-            cmd=mk_cmd(
-                cmd,
-                **mkcmd_kwargs,
-            ),
-            expect_timeout=3,
-            # preexec_fn=unset_colors,
-            # ^TODO? get `pytest` core to expose underlying
-            # `pexpect.spawn()` stuff?
-        )
-
-    # such that test-dep can pass input script name.
-    return _spawn
-
-
-@pytest.fixture(
-    params=[False, True],
-    ids='ctl-c={}'.format,
-)
-def ctlc(
-    request,
-    ci_env: bool,
-
-) -> bool:
-
-    use_ctlc = request.param
-
-    node = request.node
-    markers = node.own_markers
-    for mark in markers:
-        if mark.name == 'has_nested_actors':
-            pytest.skip(
-                f'Test {node} has nested actors and fails with Ctrl-C.\n'
-                'The test can sometimes run fine locally but until '
-                'we solve this issue this CI test will be xfail:\n'
-                'https://github.com/goodboy/tractor/issues/320'
-            )
-
-        if mark.name == 'ctlcs_bish':
-            pytest.skip(
-                f'Test {node} prolly uses something from the stdlib (namely `asyncio`..)\n'
-                f'The test and/or underlying example script can *sometimes* run fine '
-                f'locally but more than likely, until the cpython peeps get their sh#$ together, '
-                f'this test will definitely not behave like `trio` under SIGINT..\n'
-            )
-
-    if use_ctlc:
-        # XXX: disable pygments highlighting for auto-tests
-        # since some envs (like actions CI) will struggle
-        # with the added color-char encoding..
-        from tractor.devx._debug import TractorConfig
-        TractorConfig.use_pygements = False
-
-    yield use_ctlc
-
-
-def expect(
-    child,
-
-    # normally a `pdb` prompt by default
-    patt: str,
-
-    **kwargs,
-
-) -> None:
-    '''
-    Expect wrapper that prints last seen console
-    data before failing.
-
-    '''
-    try:
-        child.expect(
-            patt,
-            **kwargs,
-        )
-    except TIMEOUT:
-        before = str(child.before.decode())
-        print(before)
-        raise
-
-
-PROMPT = r"\(Pdb\+\)"
-
-
-def in_prompt_msg(
-    child: SpawnBase,
-    parts: list[str],
-
-    pause_on_false: bool = False,
-    err_on_false: bool = False,
-    print_prompt_on_false: bool = True,
-
-) -> bool:
-    '''
-    Predicate check if (the prompt's) std-streams output has all
-    `str`-parts in it.
-
-    Can be used in test asserts for bulk matching expected
-    log/REPL output for a given `pdb` interact point.
-
-    '''
-    __tracebackhide__: bool = False
-
-    before: str = str(child.before.decode())
-    for part in parts:
-        if part not in before:
-            if pause_on_false:
-                import pdbp
-                pdbp.set_trace()
-
-            if print_prompt_on_false:
-                print(before)
-
-            if err_on_false:
-                raise ValueError(
-                    f'Could not find pattern in `before` output?\n'
-                    f'part: {part!r}\n'
-                )
-            return False
-
-    return True
-
-
-# TODO: support terminal color-chars stripping so we can match
-# against call stack frame output from the 'll' command and the like!
-# -[ ] SO answer for stripping ANSI codes: https://stackoverflow.com/a/14693789
-def assert_before(
-    child: SpawnBase,
-    patts: list[str],
-
-    **kwargs,
-
-) -> None:
-    __tracebackhide__: bool = False
-
-    assert in_prompt_msg(
-        child=child,
-        parts=patts,
-
-        # since this is an "assert" helper ;)
-        err_on_false=True,
-        **kwargs
-    )
-
-
-def do_ctlc(
-    child,
-    count: int = 3,
-    delay: float = 0.1,
-    patt: str|None = None,
-
-    # expect repl UX to reprint the prompt after every
-    # ctrl-c send.
-    # XXX: no idea but, in CI this never seems to work even on 3.10 so
-    # needs some further investigation potentially...
-    expect_prompt: bool = not _ci_env,
-
-) -> str|None:
-
-    before: str|None = None
-
-    # make sure ctl-c sends don't do anything but repeat output
-    for _ in range(count):
-        time.sleep(delay)
-        child.sendcontrol('c')
-
-        # TODO: figure out why this makes CI fail..
-        # if you run this test manually it works just fine..
-        if expect_prompt:
-            time.sleep(delay)
-            child.expect(PROMPT)
-            before = str(child.before.decode())
-            time.sleep(delay)
-
-            if patt:
-                # should see the last line on console
-                assert patt in before
-
-    # return the console content up to the final prompt
-    return before
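To show how the removed helpers above composed in practice, here is a hedged sketch of a typical consumer test (the script name is hypothetical; the helper names match the file above):

```
from .conftest import (
    PROMPT,
    _pause_msg,
    assert_before,
    do_ctlc,
)


def test_example_repl_flow(spawn, ctlc: bool):
    # run an `examples/debugging/` script under `pexpect` and wait
    # for the `pdbp` REPL prompt
    child = spawn('some_debugging_example')
    child.expect(PROMPT)

    # bulk-match expected pre-prompt output lines
    assert_before(
        child,
        [_pause_msg, "('root'"],
    )
    if ctlc:
        # SIGINT sends should only reprint the prompt
        do_ctlc(child)

    child.sendline('c')
```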
@ -1,381 +0,0 @@
-'''
-That "foreign loop/thread" debug REPL support better ALSO WORK!
-
-Same as `test_native_pause.py`.
-All these tests can be understood (somewhat) by running the
-equivalent `examples/debugging/` scripts manually.
-
-'''
-from contextlib import (
-    contextmanager as cm,
-)
-# from functools import partial
-# import itertools
-import time
-# from typing import (
-#     Iterator,
-# )
-
-import pytest
-from pexpect.exceptions import (
-    TIMEOUT,
-    EOF,
-)
-
-from .conftest import (
-    # _ci_env,
-    do_ctlc,
-    PROMPT,
-    # expect,
-    in_prompt_msg,
-    assert_before,
-    _pause_msg,
-    _crash_msg,
-    _ctlc_ignore_header,
-    # _repl_fail_msg,
-)
-
-
-@cm
-def maybe_expect_timeout(
-    ctlc: bool = False,
-) -> None:
-    try:
-        yield
-    except TIMEOUT:
-        # breakpoint()
-        if ctlc:
-            pytest.xfail(
-                'Some kinda redic threading SIGINT bug i think?\n'
-                'See the notes in `examples/debugging/sync_bp.py`..\n'
-            )
-        raise
-
-
-@pytest.mark.ctlcs_bish
-def test_pause_from_sync(
-    spawn,
-    ctlc: bool,
-):
-    '''
-    Verify we can use the `pdbp` REPL from sync functions AND from
-    any thread spawned with `trio.to_thread.run_sync()`.
-
-    `examples/debugging/sync_bp.py`
-
-    '''
-    child = spawn('sync_bp')
-
-    # first `sync_pause()` after nurseries open
-    child.expect(PROMPT)
-    assert_before(
-        child,
-        [
-            # pre-prompt line
-            _pause_msg,
-            "<Task '__main__.main'",
-            "('root'",
-        ]
-    )
-    if ctlc:
-        do_ctlc(child)
-        # ^NOTE^ subactor not spawned yet; don't need extra delay.
-
-    child.sendline('c')
-
-    # first `await tractor.pause()` inside `p.open_context()` body
-    child.expect(PROMPT)
-
-    # XXX shouldn't see gb loaded message with PDB loglevel!
-    # assert not in_prompt_msg(
-    #     child,
-    #     ['`greenback` portal opened!'],
-    # )
-    # should be same root task
-    assert_before(
-        child,
-        [
-            _pause_msg,
-            "<Task '__main__.main'",
-            "('root'",
-        ]
-    )
-
-    if ctlc:
-        do_ctlc(
-            child,
-            # NOTE: setting this to 0 (or some other sufficiently
-            # small val) can cause the test to fail since the
-            # `subactor` suffers a race where the root/parent
-            # sends an actor-cancel prior to it hitting its pause
-            # point; by def the value is 0.1
-            delay=0.4,
-        )
-
-    # XXX, fwiw without a brief sleep here the SIGINT might actually
-    # trigger "subactor" cancellation by its parent before the
-    # shield-handler is engaged.
-    #
-    # => similar to the `delay` input to `do_ctlc()` below, setting
-    # this too low can cause the test to fail since the `subactor`
-    # suffers a race where the root/parent sends an actor-cancel
-    # prior to the context task hitting its pause point (and thus
-    # engaging the `sigint_shield()` handler in time); this value
-    # seems to be good enuf?
-    time.sleep(0.6)
-
-    # one of the bg thread or subactor should have
-    # `Lock.acquire()`-ed
-    # (NOT both, which will result in REPL clobbering!)
-    attach_patts: dict[str, list[str]] = {
-        'subactor': [
-            "'start_n_sync_pause'",
-            "('subactor'",
-        ],
-        'inline_root_bg_thread': [
-            "<Thread(inline_root_bg_thread",
-            "('root'",
-        ],
-        'start_soon_root_bg_thread': [
-            "<Thread(start_soon_root_bg_thread",
-            "('root'",
-        ],
-    }
-    conts: int = 0  # for debugging below matching logic on failure
-    while attach_patts:
-        child.sendline('c')
-        conts += 1
-        child.expect(PROMPT)
-        before = str(child.before.decode())
-        for key in attach_patts:
-            if key in before:
-                attach_key: str = key
-                expected_patts: str = attach_patts.pop(key)
-                assert_before(
-                    child,
-                    [_pause_msg]
-                    +
-                    expected_patts
-                )
-                break
-        else:
-            pytest.fail(
-                f'No keys found?\n\n'
-                f'{attach_patts.keys()}\n\n'
-                f'{before}\n'
-            )
-
-        # ensure no other task/threads engaged a REPL
-        # at the same time as the one that was detected above.
-        for key, other_patts in attach_patts.copy().items():
-            assert not in_prompt_msg(
-                child,
-                other_patts,
-            )
-
-        if ctlc:
-            do_ctlc(
-                child,
-                patt=attach_key,
-                # NOTE same as comment above
-                delay=0.4,
-            )
-
-    child.sendline('c')
-
-    # XXX TODO, weird threading bug it seems despite the
-    # `abandon_on_cancel: bool` setting to
-    # `trio.to_thread.run_sync()`..
-    with maybe_expect_timeout(
-        ctlc=ctlc,
-    ):
-        child.expect(EOF)
-
-
-def expect_any_of(
-    attach_patts: dict[str, list[str]],
-    child,  # what type?
-    ctlc: bool = False,
-    prompt: str = _ctlc_ignore_header,
-    ctlc_delay: float = .4,
-
-) -> list[str]:
-    '''
-    Receive any of a `list[str]` of patterns provided in
-    `attach_patts`.
-
-    Used to test racing prompts from multiple actors and/or
-    tasks using a common root process' `pdbp` REPL.
-
-    '''
-    assert attach_patts
-
-    child.expect(PROMPT)
-    before = str(child.before.decode())
-
-    for attach_key in attach_patts:
-        if attach_key in before:
-            expected_patts: str = attach_patts.pop(attach_key)
-            assert_before(
-                child,
-                expected_patts
-            )
-            break  # from for
-    else:
-        pytest.fail(
-            f'No keys found?\n\n'
-            f'{attach_patts.keys()}\n\n'
-            f'{before}\n'
-        )
-
-    # ensure no other task/threads engaged a REPL
-    # at the same time as the one that was detected above.
-    for key, other_patts in attach_patts.copy().items():
-        assert not in_prompt_msg(
-            child,
-            other_patts,
-        )
-
-    if ctlc:
-        do_ctlc(
-            child,
-            patt=prompt,
-            # NOTE same as comment above
-            delay=ctlc_delay,
-        )
-
-    return expected_patts
-
-
-@pytest.mark.ctlcs_bish
-def test_sync_pause_from_aio_task(
-    spawn,
-    ctlc: bool
-    # ^TODO, fix for `asyncio`!!
-):
-    '''
-    Verify we can use the `pdbp` REPL from an `asyncio.Task` spawned using
-    APIs in `.to_asyncio`.
-
-    `examples/debugging/asyncio_bp.py`
-
-    '''
-    child = spawn('asyncio_bp')
-
-    # RACE on whether trio/asyncio task bps first
-    attach_patts: dict[str, list[str]] = {
-
-        # first pause in guest-mode (aka "infecting")
-        # `trio.Task`.
-        'trio-side': [
-            _pause_msg,
-            "<Task 'trio_ctx'",
-            "('aio_daemon'",
-        ],
-
-        # `breakpoint()` from `asyncio.Task`.
-        'asyncio-side': [
-            _pause_msg,
-            "<Task pending name='Task-2' coro=<greenback_shim()",
-            "('aio_daemon'",
-        ],
-    }
-
-    while attach_patts:
-        expect_any_of(
-            attach_patts=attach_patts,
-            child=child,
-            ctlc=ctlc,
-        )
-        child.sendline('c')
-
-    # NOW in race order,
-    # - the asyncio-task will error
-    # - the root-actor parent task will pause
-    #
-    attach_patts: dict[str, list[str]] = {
-
-        # error raised in `asyncio.Task`
-        "raise ValueError('asyncio side error!')": [
-            _crash_msg,
-            "<Task 'trio_ctx'",
-            "@ ('aio_daemon'",
-            "ValueError: asyncio side error!",
-
-            # XXX, we no longer show this frame by default!
-            # 'return await chan.receive()',  # `.to_asyncio` impl internals in tb
-        ],
-
-        # parent-side propagation via actor-nursery/portal
-        # "tractor._exceptions.RemoteActorError: remote task raised a 'ValueError'": [
-        "remote task raised a 'ValueError'": [
-            _crash_msg,
-            "src_uid=('aio_daemon'",
-            "('aio_daemon'",
-        ],
-
-        # a final pause in root-actor
-        "<Task '__main__.main'": [
-            _pause_msg,
-            "<Task '__main__.main'",
-            "('root'",
-        ],
-    }
-    while attach_patts:
-        expect_any_of(
-            attach_patts=attach_patts,
-            child=child,
-            ctlc=ctlc,
-        )
-        child.sendline('c')
-
-    assert not attach_patts
-
-    # final boxed error propagates to root
-    assert_before(
-        child,
-        [
-            _crash_msg,
-            "<Task '__main__.main'",
-            "('root'",
-            "remote task raised a 'ValueError'",
-            "ValueError: asyncio side error!",
-        ]
-    )
-
-    if ctlc:
-        do_ctlc(
-            child,
-            # NOTE: setting this to 0 (or some other sufficiently
-            # small val) can cause the test to fail since the
-            # `subactor` suffers a race where the root/parent
-            # sends an actor-cancel prior to it hitting its pause
-            # point; by def the value is 0.1
-            delay=0.4,
-        )
-
-    child.sendline('c')
-    # with maybe_expect_timeout():
-    child.expect(EOF)
-
-
-def test_sync_pause_from_non_greenbacked_aio_task():
-    '''
-    Where the `breakpoint()` caller task is NOT spawned by
-    `tractor.to_asyncio` and thus never activates
-    a `greenback.ensure_portal()` beforehand, presumably bc the task
-    was started by some lib/dep as is often seen in the field.
-
-    Ensure sync pausing works when the pause is in,
-
-    - the root actor running in infected-mode?
-      |_ since we don't need any IPC to acquire the debug lock?
-      |_ is there some way to handle this like the non-main-thread case?
-
-    All other cases need to error out appropriately right?
-
-    - for any subactor we can't avoid needing the repl lock..
-      |_ is there a way to hook into `asyncio.ensure_future(obj)`?
-
-    '''
-    pass
@ -1,172 +0,0 @@
-'''
-That "native" runtime-hackin toolset better be dang useful!
-
-Verify the function of a variety of "developer-experience" tools we
-offer from the `.devx` sub-pkg:
-
-- use of the lovely `stackscope` for dumping actor `trio`-task trees
-  during operation and hangs.
-
-TODO:
-- demonstration of `CallerInfo` call stack frame filtering such that
-  for logging and REPL purposes a user sees exactly the layers needed
-  when debugging a problem inside the stack vs. in their app.
-
-'''
-import os
-import signal
-import time
-
-from .conftest import (
-    expect,
-    assert_before,
-    in_prompt_msg,
-    PROMPT,
-    _pause_msg,
-)
-from pexpect.exceptions import (
-    # TIMEOUT,
-    EOF,
-)
-
-
-def test_shield_pause(
-    spawn,
-):
-    '''
-    Verify the `tractor.pause()/.post_mortem()` API works inside an
-    already cancelled `trio.CancelScope` and that you can step to the
-    next checkpoint wherein the cancellation will get raised.
-
-    '''
-    child = spawn(
-        'shield_hang_in_sub'
-    )
-    expect(
-        child,
-        'Yo my child hanging..?',
-    )
-    assert_before(
-        child,
-        [
-            'Entering shield sleep..',
-            'Enabling trace-trees on `SIGUSR1` since `stackscope` is installed @',
-        ]
-    )
-
-    script_pid: int = child.pid
-    print(
-        f'Sending SIGUSR1 to {script_pid}\n'
-        f'(kill -s SIGUSR1 {script_pid})\n'
-    )
-    os.kill(
-        script_pid,
-        signal.SIGUSR1,
-    )
-    time.sleep(0.2)
-    expect(
-        child,
-        # end-of-tree delimiter
-        "end-of-\('root'",
-    )
-    assert_before(
-        child,
-        [
-            # 'Trying to dump `stackscope` tree..',
-            # 'Dumping `stackscope` tree for actor',
-            "('root'",  # uid line
-
-            # TODO!? this used to show?
-            # -[ ] mk reproducible for @oremanj?
-            #
-            # parent block point (non-shielded)
-            # 'await trio.sleep_forever()  # in root',
-        ]
-    )
-    expect(
-        child,
-        # end-of-tree delimiter
-        "end-of-\('hanger'",
-    )
-    assert_before(
-        child,
-        [
-            # relay to the sub should be reported
-            'Relaying `SIGUSR1`[10] to sub-actor',
-
-            "('hanger'",  # uid line
-
-            # TODO!? SEE ABOVE
-            # hanger LOC where it's shield-halted
-            # 'await trio.sleep_forever()  # in subactor',
-        ]
-    )
-
-    # simulate the user sending a ctl-c to the hanging program.
-    # this should result in the terminator kicking in since
-    # the sub is shield blocking and can't respond to SIGINT.
-    os.kill(
-        child.pid,
-        signal.SIGINT,
-    )
-    expect(
-        child,
-        'Shutting down actor runtime',
-        timeout=6,
-    )
-    assert_before(
-        child,
-        [
-            'raise KeyboardInterrupt',
-            # 'Shutting down actor runtime',
-            '#T-800 deployed to collect zombie B0',
-            "'--uid', \"('hanger',",
-        ]
-    )
-
-
-def test_breakpoint_hook_restored(
-    spawn,
-):
-    '''
-    Ensures our actor runtime sets a custom `breakpoint()` hook
-    on open then restores the stdlib's default on close.
-
-    The hook state validation is done via `assert`s inside the
-    invoked script with only `breakpoint()` (not `tractor.pause()`)
-    calls used.
-
-    '''
-    child = spawn('restore_builtin_breakpoint')
-
-    child.expect(PROMPT)
-    assert_before(
-        child,
-        [
-            _pause_msg,
-            "<Task '__main__.main'",
-            "('root'",
-            "first bp, tractor hook set",
-        ]
-    )
-    child.sendline('c')
-    child.expect(PROMPT)
-    assert_before(
-        child,
-        [
-            "last bp, stdlib hook restored",
-        ]
-    )
-
-    # since the stdlib hook was already restored there should be NO
-    # `tractor` `log.pdb()` content from console!
-    assert not in_prompt_msg(
-        child,
-        [
-            _pause_msg,
-            "<Task '__main__.main'",
-            "('root'",
-        ],
-    )
-    child.sendline('c')
-    child.expect(EOF)
@ -3,6 +3,7 @@ Sketchy network blackoutz, ugly byzantine gens, puedes eschuchar la
 cancelacion?..

 '''
+import itertools
 from functools import partial
 from types import ModuleType

@ -12,7 +13,6 @@ import trio
 import tractor
 from tractor._testing import (
     examples_dir,
-    break_ipc,
 )


@ -90,14 +90,11 @@ def test_ipc_channel_break_during_stream(

         # non-`trio` spawners should never hit the hang condition that
         # requires the user to do ctl-c to cancel the actor tree.
-        # expect_final_exc = trio.ClosedResourceError
-        expect_final_exc = tractor.TransportClosed
+        expect_final_exc = trio.ClosedResourceError

     mod: ModuleType = import_path(
-        examples_dir() / 'advanced_faults'
-        / 'ipc_failure_during_stream.py',
+        examples_dir() / 'advanced_faults' / 'ipc_failure_during_stream.py',
         root=examples_dir(),
-        consider_namespace_packages=False,
     )

     # by def we expect KBI from user after a simulated "hang

@ -157,7 +154,7 @@ def test_ipc_channel_break_during_stream(
     if pre_aclose_msgstream:
         expect_final_exc = KeyboardInterrupt

-    # NOTE when the parent IPC side dies (even if the child does as well
+    # NOTE when the parent IPC side dies (even if the child's does as well
     # but the child fails BEFORE the parent) we always expect the
     # IPC layer to raise a closed-resource, NEVER do we expect
     # a stop msg since the parent-side ctx apis will error out

@ -169,8 +166,7 @@ def test_ipc_channel_break_during_stream(
         and
         ipc_break['break_child_ipc_after'] is False
     ):
-        # expect_final_exc = trio.ClosedResourceError
-        expect_final_exc = tractor.TransportClosed
+        expect_final_exc = trio.ClosedResourceError

     # BOTH but, PARENT breaks FIRST
     elif (

@ -181,8 +177,7 @@ def test_ipc_channel_break_during_stream(
         ipc_break['break_parent_ipc_after']
     )
     ):
-        # expect_final_exc = trio.ClosedResourceError
-        expect_final_exc = tractor.TransportClosed
+        expect_final_exc = trio.ClosedResourceError

     with pytest.raises(
         expected_exception=(

@ -201,8 +196,8 @@ def test_ipc_channel_break_during_stream(
                 **ipc_break,
             )
         )
-    except KeyboardInterrupt as _kbi:
-        kbi = _kbi
+    except KeyboardInterrupt as kbi:
+        _err = kbi
         if expect_final_exc is not KeyboardInterrupt:
             pytest.fail(
                 'Rxed unexpected KBI !?\n'

@ -211,28 +206,16 @@ def test_ipc_channel_break_during_stream(

         raise

-    except tractor.TransportClosed as _tc:
-        tc = _tc
-        if expect_final_exc is KeyboardInterrupt:
-            pytest.fail(
-                'Unexpected transport failure !?\n'
-                f'{repr(tc)}'
-            )
-        cause: Exception = tc.__cause__
-        assert (
-            type(cause) is trio.ClosedResourceError
-            and
-            cause.args[0] == 'another task closed this fd'
-        )
-        raise
-
     # get raw instance from pytest wrapper
     value = excinfo.value
     if isinstance(value, ExceptionGroup):
-        excs = value.exceptions
-        assert len(excs) == 1
-        final_exc = excs[0]
-        assert isinstance(final_exc, expect_final_exc)
+        value = next(
+            itertools.dropwhile(
+                lambda exc: not isinstance(exc, expect_final_exc),
+                value.exceptions,
+            )
+        )
+        assert value
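For clarity on the `devx_subpk`-side matching above: `itertools.dropwhile` skips exceptions until the predicate flips, so `next()` yields the first group member that *is* an instance of the expected type (or raises `StopIteration` if none match). A standalone sketch with a stand-in exception type:

```
import itertools


class TransportClosed(Exception):
    ...


excs = [ValueError('x'), TransportClosed('ipc down'), KeyError('y')]

# drop leading exceptions until one matches the expected type
match = next(
    itertools.dropwhile(
        lambda exc: not isinstance(exc, TransportClosed),
        excs,
    )
)
assert isinstance(match, TransportClosed)
```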
@tractor.context

@ -241,30 +224,23 @@ async def break_ipc_after_started(
 ) -> None:
     await ctx.started()
     async with ctx.open_stream() as stream:
-
-        # TODO: make a test which verifies the error
-        # for this, i.e. raises a `MsgTypeError`
-        # await ctx.chan.send(None)
-
-        await break_ipc(
-            stream=stream,
-            pre_close=True,
-        )
+        await stream.aclose()
+        await trio.sleep(0.2)
+        await ctx.chan.send(None)
         print('child broke IPC and terminating')


 def test_stream_closed_right_after_ipc_break_and_zombie_lord_engages():
     '''
-    Verify that if a subactor's IPC goes down just after bringing up
-    a stream the parent can trigger a SIGINT and the child will be
-    reaped out-of-IPC by the localhost process supervision machinery:
-    aka "zombie lord".
+    Verify that if a subactor's IPC goes down just after bringing up a stream
+    the parent can trigger a SIGINT and the child will be reaped out-of-IPC by
+    the localhost process supervision machinery: aka "zombie lord".

     '''
     async def main():
         with trio.fail_after(3):
-            async with tractor.open_nursery() as an:
-                portal = await an.start_actor(
+            async with tractor.open_nursery() as n:
+                portal = await n.start_actor(
                     'ipc_breaker',
                     enable_modules=[__name__],
                 )
@ -307,15 +307,7 @@ async def inf_streamer(

     async with (
         ctx.open_stream() as stream,
-
-        # XXX TODO, INTERESTING CASE!!
-        # - if we don't collapse the eg then the embedded
-        # `trio.EndOfChannel` doesn't propagate directly to the above
-        # .open_stream() parent, resulting in it also raising instead
-        # of gracefully absorbing as normal.. so how to handle?
-        trio.open_nursery(
-            strict_exception_groups=False,
-        ) as tn,
+        trio.open_nursery() as tn,
     ):
         async def close_stream_on_sentinel():
             async for msg in stream:
@ -14,7 +14,7 @@ import tractor
 from tractor._testing import (
     tractor_test,
 )
-from .conftest import no_windows
+from conftest import no_windows


 def is_win():

@ -77,7 +77,7 @@ def test_remote_error(reg_addr, args_err):
             # of this actor nursery.
             await portal.result()
         except tractor.RemoteActorError as err:
-            assert err.boxed_type == errtype
+            assert err.type == errtype
             print("Look Maa that actor failed hard, hehh")
             raise

@ -86,33 +86,20 @@ def test_remote_error(reg_addr, args_err):
         with pytest.raises(tractor.RemoteActorError) as excinfo:
             trio.run(main)

-        assert excinfo.value.boxed_type == errtype
+        assert excinfo.value.type == errtype

     else:
-        # the root task will also error on the `Portal.result()`
-        # call so we expect an error from there AND the child.
-        # |_ tho seems like on new `trio` this doesn't always
-        #    happen?
-        with pytest.raises((
-            BaseExceptionGroup,
-            tractor.RemoteActorError,
-        )) as excinfo:
+        # the root task will also error on the `.result()` call
+        # so we expect an error from there AND the child.
+        with pytest.raises(BaseExceptionGroup) as excinfo:
             trio.run(main)

-        # ensure boxed errors are `errtype`
-        err: BaseException = excinfo.value
-        if isinstance(err, BaseExceptionGroup):
-            suberrs: list[BaseException] = err.exceptions
-        else:
-            suberrs: list[BaseException] = [err]
-
-        for exc in suberrs:
-            assert exc.boxed_type == errtype
+        # ensure boxed errors
+        for exc in excinfo.value.exceptions:
+            assert exc.type == errtype


-def test_multierror(
-    reg_addr: tuple[str, int],
-):
+def test_multierror(reg_addr):
     '''
     Verify we raise a ``BaseExceptionGroup`` out of a nursery where
     more than one actor errors.

@ -130,7 +117,7 @@ def test_multierror(
             try:
                 await portal2.result()
             except tractor.RemoteActorError as err:
-                assert err.boxed_type is AssertionError
+                assert err.type == AssertionError
                 print("Look Maa that first actor failed hard, hehh")
                 raise

@ -182,7 +169,7 @@ def test_multierror_fast_nursery(reg_addr, start_method, num_subactors, delay):

     for exc in exceptions:
         assert isinstance(exc, tractor.RemoteActorError)
-        assert exc.boxed_type is AssertionError
+        assert exc.type == AssertionError


 async def do_nothing():

@ -323,7 +310,7 @@ async def test_some_cancels_all(num_actors_and_errs, start_method, loglevel):
                     await portal.run(func, **kwargs)

                 except tractor.RemoteActorError as err:
-                    assert err.boxed_type == err_type
+                    assert err.type == err_type
                     # we only expect this first error to propagate
                     # (all other daemons are cancelled before they
                     # can be scheduled)

@ -342,11 +329,11 @@ async def test_some_cancels_all(num_actors_and_errs, start_method, loglevel):
             assert len(err.exceptions) == num_actors
             for exc in err.exceptions:
                 if isinstance(exc, tractor.RemoteActorError):
-                    assert exc.boxed_type == err_type
+                    assert exc.type == err_type
                 else:
                     assert isinstance(exc, trio.Cancelled)
         elif isinstance(err, tractor.RemoteActorError):
-            assert err.boxed_type == err_type
+            assert err.type == err_type

         assert n.cancelled is True
         assert not n._children

@ -425,7 +412,7 @@ async def test_nested_multierrors(loglevel, start_method):
                 elif isinstance(subexc, tractor.RemoteActorError):
                     # on windows it seems we can't exactly be sure wtf
                     # will happen..
-                    assert subexc.boxed_type in (
+                    assert subexc.type in (
                         tractor.RemoteActorError,
                         trio.Cancelled,
                         BaseExceptionGroup,

@ -435,7 +422,7 @@ async def test_nested_multierrors(loglevel, start_method):
                     for subsub in subexc.exceptions:

                         if subsub in (tractor.RemoteActorError,):
-                            subsub = subsub.boxed_type
+                            subsub = subsub.type

                         assert type(subsub) in (
                             trio.Cancelled,

@ -450,16 +437,16 @@ async def test_nested_multierrors(loglevel, start_method):
                 # we get back the (sent) cancel signal instead
                 if is_win():
                     if isinstance(subexc, tractor.RemoteActorError):
-                        assert subexc.boxed_type in (
+                        assert subexc.type in (
                             BaseExceptionGroup,
                             tractor.RemoteActorError
                         )
                     else:
                         assert isinstance(subexc, BaseExceptionGroup)
                 else:
-                    assert subexc.boxed_type is ExceptionGroup
+                    assert subexc.type is ExceptionGroup
             else:
-                assert subexc.boxed_type in (
+                assert subexc.type in (
                     tractor.RemoteActorError,
                     trio.Cancelled
                 )

@ -504,9 +491,7 @@ def test_cancel_via_SIGINT_other_task(
     if is_win():  # smh
         timeout += 1

-    async def spawn_and_sleep_forever(
-        task_status=trio.TASK_STATUS_IGNORED
-    ):
+    async def spawn_and_sleep_forever(task_status=trio.TASK_STATUS_IGNORED):
         async with tractor.open_nursery() as tn:
             for i in range(3):
                 await tn.run_in_actor(

@ -519,9 +504,7 @@ def test_cancel_via_SIGINT_other_task(
     async def main():
         # should never timeout since SIGINT should cancel the current program
         with trio.fail_after(timeout):
-            async with trio.open_nursery(
-                strict_exception_groups=False,
-            ) as n:
+            async with trio.open_nursery() as n:
                 await n.start(spawn_and_sleep_forever)
                 if 'mp' in spawn_backend:
                     time.sleep(0.1)

@ -614,12 +597,6 @@ def test_fast_graceful_cancel_when_spawn_task_in_soft_proc_wait_for_daemon(
             nurse.start_soon(delayed_kbi)
|
nurse.start_soon(delayed_kbi)
|
||||||
|
|
||||||
await p.run(do_nuthin)
|
await p.run(do_nuthin)
|
||||||
|
|
||||||
# need to explicitly re-raise the lone kbi..now
|
|
||||||
except* KeyboardInterrupt as kbi_eg:
|
|
||||||
assert (len(excs := kbi_eg.exceptions) == 1)
|
|
||||||
raise excs[0]
|
|
||||||
|
|
||||||
finally:
|
finally:
|
||||||
duration = time.time() - start
|
duration = time.time() - start
|
||||||
if duration > timeout:
|
if duration > timeout:
|
||||||
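The bulk of the assertion changes in the file above swap the old
``RemoteActorError.type`` accessor for the new ``.boxed_type``
attribute when inspecting the boxed (remote) exception. A minimal
sketch of the new pattern, assuming a hypothetical ``assert_err``
remote task::

    import trio
    import tractor


    async def assert_err():
        # hypothetical remote task which errors
        assert 0


    async def main():
        async with tractor.open_nursery() as n:
            portal = await n.run_in_actor(assert_err)
            try:
                await portal.result()
            except tractor.RemoteActorError as err:
                # `.boxed_type` replaces the old `.type` accessor
                assert err.boxed_type is AssertionError
                raise  # unwind the nursery too


    # running this re-raises the boxed error in the root (possibly
    # wrapped in an `ExceptionGroup` on newer `trio`, per the test
    # changes above)
    trio.run(main)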
@ -95,8 +95,8 @@ async def trio_main(

# stash a "service nursery" as "actor local" (aka a Python global)
global _nursery
tn = _nursery
n = _nursery
assert tn
assert n

async def consume_stream():
async with wrapper_mngr() as stream:

@ -104,10 +104,10 @@ async def trio_main(
print(msg)

# run 2 tasks to ensure broadcaster chan use
tn.start_soon(consume_stream)
n.start_soon(consume_stream)
tn.start_soon(consume_stream)
n.start_soon(consume_stream)

tn.start_soon(trio_sleep_and_err)
n.start_soon(trio_sleep_and_err)

await trio.sleep_forever()


@ -117,10 +117,8 @@ async def open_actor_local_nursery(
ctx: tractor.Context,
):
global _nursery
async with trio.open_nursery(
async with trio.open_nursery() as n:
strict_exception_groups=False,
_nursery = n
) as tn:
_nursery = tn
await ctx.started()
await trio.sleep(10)
# await trio.sleep(1)

@ -134,7 +132,7 @@ async def open_actor_local_nursery(
# never yields back.. aka a scenario where the
# ``tractor.context`` task IS NOT in the service n's cancel
# scope.
tn.cancel_scope.cancel()
n.cancel_scope.cancel()


@pytest.mark.parametrize(

@ -159,7 +157,7 @@ def test_actor_managed_trio_nursery_task_error_cancels_aio(
async with tractor.open_nursery() as n:
p = await n.start_actor(
'nursery_mngr',
infect_asyncio=asyncio_mode,  # TODO, is this enabling debug mode?
infect_asyncio=asyncio_mode,
enable_modules=[__name__],
)
async with (

@ -173,4 +171,4 @@ def test_actor_managed_trio_nursery_task_error_cancels_aio(

# verify boxed error
err = excinfo.value
assert err.boxed_type is NameError
assert isinstance(err.type(), NameError)
@ -6,7 +6,6 @@ sync-opening a ``tractor.Context`` beforehand.

'''
from itertools import count
import math
import platform
from pprint import pformat
from typing import (

@ -25,7 +24,6 @@ from tractor._exceptions import (
StreamOverrun,
ContextCancelled,
)
from tractor._state import current_ipc_ctx

from tractor._testing import (
tractor_test,

@ -38,9 +36,9 @@ from tractor._testing import (
# - standard setup/teardown:
#  ``Portal.open_context()`` starts a new
#  remote task context in another actor. The target actor's task must
#  call ``Context.started()`` to unblock this entry on the parent side.
#  call ``Context.started()`` to unblock this entry on the caller side.
#  the child task executes until complete and returns a final value
#  the callee task executes until complete and returns a final value
#  which is delivered to the parent side and retreived via
#  which is delivered to the caller side and retreived via
#  ``Context.result()``.

# - cancel termination:
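The setup/teardown lifecycle sketched in the comment block above boils
down to roughly the following minimal sketch, assuming a hypothetical
``echo`` context task and actor name (the ``@tractor.context``,
``Context.started()`` and ``Context.result()`` APIs are the ones
exercised throughout this file)::

    import trio
    import tractor
    from tractor import Context


    @tractor.context
    async def echo(
        ctx: Context,
        msg: str,
    ) -> str:
        # unblocks the parent's `open_context()` entry
        await ctx.started(msg)
        # the final return value is retrieved (remotely) via
        # `Context.result()`
        return msg


    async def main():
        async with tractor.open_nursery() as an:
            portal = await an.start_actor(
                'echoer',
                enable_modules=[__name__],
            )
            async with portal.open_context(
                echo,
                msg='yo',
            ) as (ctx, first):
                assert first == 'yo'
                assert await ctx.result() == 'yo'

            await portal.cancel_actor()


    if __name__ == '__main__':
        trio.run(main)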
@ -145,8 +143,6 @@ async def simple_setup_teardown(
global _state
_state = True

assert current_ipc_ctx() is ctx

# signal to parent that we're up
await ctx.started(data + 1)

@ -170,9 +166,9 @@ async def assert_state(value: bool):
[False, ValueError, KeyboardInterrupt],
)
@pytest.mark.parametrize(
'child_blocks_forever',
'callee_blocks_forever',
[False, True],
ids=lambda item: f'child_blocks_forever={item}'
ids=lambda item: f'callee_blocks_forever={item}'
)
@pytest.mark.parametrize(
'pointlessly_open_stream',

@ -181,7 +177,7 @@ async def assert_state(value: bool):
)
def test_simple_context(
error_parent,
child_blocks_forever,
callee_blocks_forever,
pointlessly_open_stream,
debug_mode: bool,
):

@ -204,13 +200,12 @@ def test_simple_context(
portal.open_context(
simple_setup_teardown,
data=10,
block_forever=child_blocks_forever,
block_forever=callee_blocks_forever,
) as (ctx, sent),
):
assert current_ipc_ctx() is ctx
assert sent == 11

if child_blocks_forever:
if callee_blocks_forever:
await portal.run(assert_state, value=True)
else:
assert await ctx.result() == 'yo'

@ -220,7 +215,7 @@ def test_simple_context(
if error_parent:
raise error_parent

if child_blocks_forever:
if callee_blocks_forever:
await ctx.cancel()
else:
# in this case the stream will send a

@ -250,18 +245,18 @@ def test_simple_context(
trio.run(main)
except error_parent:
pass
except BaseExceptionGroup as beg:
except trio.MultiError as me:
# XXX: on windows it seems we may have to expect the group error
from tractor._exceptions import is_multi_cancelled
assert is_multi_cancelled(beg)
assert is_multi_cancelled(me)
else:
trio.run(main)


@pytest.mark.parametrize(
'child_returns_early',
'callee_returns_early',
[True, False],
ids=lambda item: f'child_returns_early={item}'
ids=lambda item: f'callee_returns_early={item}'
)
@pytest.mark.parametrize(
'cancel_method',

@ -273,14 +268,14 @@ def test_simple_context(
[True, False],
ids=lambda item: f'chk_ctx_result_before_exit={item}'
)
def test_parent_cancels(
def test_caller_cancels(
cancel_method: str,
chk_ctx_result_before_exit: bool,
child_returns_early: bool,
callee_returns_early: bool,
debug_mode: bool,
):
'''
Verify that when the opening side of a context (aka the parent)
Verify that when the opening side of a context (aka the caller)
cancels that context, the ctx does not raise a cancelled when
either calling `.result()` or on context exit.


@ -294,7 +289,7 @@ def test_parent_cancels(

if (
cancel_method == 'portal'
and not child_returns_early
and not callee_returns_early
):
try:
res = await ctx.result()

@ -318,7 +313,7 @@ def test_parent_cancels(
pytest.fail(f'should not have raised ctxc\n{ctxc}')

# we actually get a result
if child_returns_early:
if callee_returns_early:
assert res == 'yo'
assert ctx.outcome is res
assert ctx.maybe_error is None

@ -362,14 +357,14 @@ def test_parent_cancels(
)
timeout: float = (
0.5
if not child_returns_early
if not callee_returns_early
else 2
)
with trio.fail_after(timeout):
async with (
expect_ctxc(
yay=(
not child_returns_early
not callee_returns_early
and cancel_method == 'portal'
)
),

@ -377,13 +372,13 @@ def test_parent_cancels(
portal.open_context(
simple_setup_teardown,
data=10,
block_forever=not child_returns_early,
block_forever=not callee_returns_early,
) as (ctx, sent),
):

if child_returns_early:
if callee_returns_early:
# ensure we block long enough before sending
# a cancel such that the child has already
# a cancel such that the callee has already
# returned it's result.
await trio.sleep(0.5)


@ -421,7 +416,7 @@ def test_parent_cancels(
# which should in turn cause `ctx._scope` to
# catch any cancellation?
if (
not child_returns_early
not callee_returns_early
and cancel_method != 'portal'
):
assert not ctx._scope.cancelled_caught

@ -430,11 +425,11 @@ def test_parent_cancels(


# basic stream terminations:
# - child context closes without using stream
# - callee context closes without using stream
# - parent context closes without using stream
# - caller context closes without using stream
# - parent context calls `Context.cancel()` while streaming
# - caller context calls `Context.cancel()` while streaming
#   is ongoing resulting in child being cancelled
#   is ongoing resulting in callee being cancelled
# - child calls `Context.cancel()` while streaming and parent
# - callee calls `Context.cancel()` while streaming and caller
#   sees stream terminated in `RemoteActorError`

# TODO: future possible features

@ -443,6 +438,7 @@ def test_parent_cancels(

@tractor.context
async def close_ctx_immediately(

ctx: Context,

) -> None:

@ -453,24 +449,13 @@ async def close_ctx_immediately(
async with ctx.open_stream():
pass

print('child returning!')


@pytest.mark.parametrize(
'parent_send_before_receive',
[
False,
True,
],
ids=lambda item: f'child_send_before_receive={item}'
)
@tractor_test
async def test_child_exits_ctx_after_stream_open(
async def test_callee_closes_ctx_after_stream_open(
debug_mode: bool,
parent_send_before_receive: bool,
):
'''
child context closes without using stream.
callee context closes without using stream.

This should result in a msg sequence
|_<root>_

@ -484,9 +469,6 @@ async def test_child_exits_ctx_after_stream_open(
=> {'stop': True, 'cid': <str>}

'''
timeout: float = (
0.5 if not debug_mode else 999
)
async with tractor.open_nursery(
debug_mode=debug_mode,
) as an:

@ -495,7 +477,7 @@ async def test_child_exits_ctx_after_stream_open(
enable_modules=[__name__],
)

with trio.fail_after(timeout):
with trio.fail_after(0.5):
async with portal.open_context(
close_ctx_immediately,


@ -507,56 +489,41 @@ async def test_child_exits_ctx_after_stream_open(

with trio.fail_after(0.4):
async with ctx.open_stream() as stream:
if parent_send_before_receive:
print('sending first msg from parent!')
await stream.send('yo')

# should fall through since ``StopAsyncIteration``
# should be raised through translation of
# a ``trio.EndOfChannel`` by
# ``trio.abc.ReceiveChannel.__anext__()``
msg = 10
async for _ in stream:
async for msg in stream:
# trigger failure if we DO NOT
# get an EOC!
assert 0
else:
# never should get anythinig new from
# the underlying stream
assert msg == 10

# verify stream is now closed
try:
with trio.fail_after(0.3):
print('parent trying to `.receive()` on EoC stream!')
await stream.receive()
assert 0, 'should have raised eoc!?'
except trio.EndOfChannel:
print('parent got EoC as expected!')
pass
# raise

# TODO: should be just raise the closed resource err
# directly here to enforce not allowing a re-open
# of a stream to the context (at least until a time of
# if/when we decide that's a good idea?)
try:
with trio.fail_after(timeout):
with trio.fail_after(0.5):
async with ctx.open_stream() as stream:
pass
except trio.ClosedResourceError:
pass

# if ctx._rx_chan._state.data:
#     await tractor.pause()

await portal.cancel_actor()

@tractor.context
async def expect_cancelled(
ctx: Context,
send_before_receive: bool = False,

) -> None:
global _state

@ -566,10 +533,6 @@ async def expect_cancelled(

try:
async with ctx.open_stream() as stream:

if send_before_receive:
await stream.send('yo')

async for msg in stream:
await stream.send(msg)  # echo server

@ -596,49 +559,26 @@ async def expect_cancelled(
raise

else:
assert 0, "child wasn't cancelled !?"
assert 0, "callee wasn't cancelled !?"


@pytest.mark.parametrize(
'child_send_before_receive',
[
False,
True,
],
ids=lambda item: f'child_send_before_receive={item}'
)
@pytest.mark.parametrize(
'rent_wait_for_msg',
[
False,
True,
],
ids=lambda item: f'rent_wait_for_msg={item}'
)
@pytest.mark.parametrize(
'use_ctx_cancel_method',
[
[False, True],
False,
'pre_stream',
'post_stream_open',
'post_stream_close',
],
ids=lambda item: f'use_ctx_cancel_method={item}'
)
@tractor_test
async def test_parent_exits_ctx_after_child_enters_stream(
async def test_caller_closes_ctx_after_callee_opens_stream(
use_ctx_cancel_method: bool|str,
use_ctx_cancel_method: bool,
debug_mode: bool,
rent_wait_for_msg: bool,
child_send_before_receive: bool,
):
'''
Parent-side of IPC context closes without sending on `MsgStream`.
caller context closes without using/opening stream

'''
async with tractor.open_nursery(
debug_mode=debug_mode,
) as an:

root: Actor = current_actor()
portal = await an.start_actor(
'ctx_cancelled',

@ -647,52 +587,41 @@ async def test_parent_exits_ctx_after_child_enters_stream(

async with portal.open_context(
expect_cancelled,
send_before_receive=child_send_before_receive,
) as (ctx, sent):
assert sent is None

await portal.run(assert_state, value=True)

# call `ctx.cancel()` explicitly
if use_ctx_cancel_method == 'pre_stream':
if use_ctx_cancel_method:
await ctx.cancel()

# NOTE: means the local side `ctx._scope` will
# have been cancelled by an ctxc ack and thus
# `._scope.cancelled_caught` should be set.
async with (
try:
expect_ctxc(
# XXX: the cause is US since we call
# `Context.cancel()` just above!
yay=True,

# XXX: must be propagated to __aexit__
# and should be silently absorbed there
# since we called `.cancel()` just above ;)
reraise=True,
) as maybe_ctxc,
):
async with ctx.open_stream() as stream:
async for msg in stream:
pass

if rent_wait_for_msg:
except tractor.ContextCancelled as ctxc:
async for msg in stream:
# XXX: the cause is US since we call
print(f'PARENT rx: {msg!r}\n')
# `Context.cancel()` just above!
break
assert (
ctxc.canceller
==
current_actor().uid
==
root.uid
)

if use_ctx_cancel_method == 'post_stream_open':
# XXX: must be propagated to __aexit__
await ctx.cancel()
# and should be silently absorbed there
# since we called `.cancel()` just above ;)
raise

if use_ctx_cancel_method == 'post_stream_close':
else:
await ctx.cancel()
assert 0, "Should have context cancelled?"

ctxc: tractor.ContextCancelled = maybe_ctxc.value
assert (
ctxc.canceller
==
current_actor().uid
==
root.uid
)

# channel should still be up
assert portal.channel.connected()

@ -703,20 +632,13 @@ async def test_parent_exits_ctx_after_child_enters_stream(
value=False,
)

# XXX CHILD-BLOCKS case, we SHOULD NOT exit from the
# `.open_context()` before the child has returned,
# errored or been cancelled!
else:
try:
with trio.fail_after(
with trio.fail_after(0.2):
0.5  # if not debug_mode else 999
await ctx.result()
):
res = await ctx.wait_for_result()
assert res is not tractor._context.Unresolved
assert 0, "Callee should have blocked!?"
except trio.TooSlowError:
# NO-OP -> since already triggered by
# NO-OP -> since already called above
# `trio.fail_after()` above!
await ctx.cancel()

# NOTE: local scope should have absorbed the cancellation since

@ -756,7 +678,7 @@ async def test_parent_exits_ctx_after_child_enters_stream(


@tractor_test
async def test_multitask_parent_cancels_from_nonroot_task(
async def test_multitask_caller_cancels_from_nonroot_task(
debug_mode: bool,
):
async with tractor.open_nursery(

@ -808,6 +730,7 @@ async def test_multitask_parent_cancels_from_nonroot_task(

@tractor.context
async def cancel_self(

ctx: Context,

) -> None:

@ -847,11 +770,11 @@ async def cancel_self(


@tractor_test
async def test_child_cancels_before_started(
async def test_callee_cancels_before_started(
debug_mode: bool,
):
'''
Callee calls `Context.cancel()` while streaming and parent
Callee calls `Context.cancel()` while streaming and caller
sees stream terminated in `ContextCancelled`.

'''

@ -872,12 +795,10 @@ async def test_child_cancels_before_started(

# raises a special cancel signal
except tractor.ContextCancelled as ce:
_ce = ce  # for debug on crash
ce.type == trio.Cancelled
ce.boxed_type == trio.Cancelled

# the traceback should be informative
assert 'itself' in ce.tb_str
assert 'itself' in ce.msgdata['tb_str']
assert ce.tb_str == ce.msgdata['tb_str']

# teardown the actor
await portal.cancel_actor()

@ -898,13 +819,14 @@ async def never_open_stream(


@tractor.context
async def keep_sending_from_child(
async def keep_sending_from_callee(

ctx: Context,
msg_buffer_size: int|None = None,

) -> None:
'''
Send endlessly on the child stream.
Send endlessly on the calleee stream.

'''
await ctx.started()

@ -912,7 +834,7 @@ async def keep_sending_from_child(
msg_buffer_size=msg_buffer_size,
) as stream:
for msg in count():
print(f'child sending {msg}')
print(f'callee sending {msg}')
await stream.send(msg)
await trio.sleep(0.01)


@ -920,13 +842,10 @@ async def keep_sending_from_child(
@pytest.mark.parametrize(
'overrun_by',
[
('parent', 1, never_open_stream),
('caller', 1, never_open_stream),
('child', 0, keep_sending_from_child),
('callee', 0, keep_sending_from_callee),
],
ids=[
ids='overrun_condition={}'.format,
('parent_1buf_never_open_stream'),
('child_0buf_keep_sending_from_child'),
]
)
def test_one_end_stream_not_opened(
overrun_by: tuple[str, int, Callable],

@ -950,50 +869,50 @@ def test_one_end_stream_not_opened(
enable_modules=[__name__],
)

with trio.fail_after(1):
async with portal.open_context(
async with portal.open_context(
entrypoint,
entrypoint,
) as (ctx, sent):
) as (ctx, sent):
assert sent is None
assert sent is None

if 'parent' in overrunner:
if 'caller' in overrunner:
async with ctx.open_stream() as stream:

# itersend +1 msg more then the buffer size
async with ctx.open_stream() as stream:
# to cause the most basic overrun.
for i in range(buf_size):
print(f'sending {i}')
await stream.send(i)

else:
# itersend +1 msg more then the buffer size
# expect overrun error to be relayed back
# to cause the most basic overrun.
# and this sleep interrupted
for i in range(buf_size):
await trio.sleep_forever()
print(f'sending {i}')
await stream.send(i)

else:
# child overruns parent case so we do nothing here
# expect overrun error to be relayed back
await trio.sleep_forever()
# and this sleep interrupted
await trio.sleep_forever()

else:
# callee overruns caller case so we do nothing here
await trio.sleep_forever()

await portal.cancel_actor()

# 2 overrun cases and the no overrun case (which pushes right up to
# the msg limit)
if (
overrunner == 'parent'
overrunner == 'caller'
):
with pytest.raises(tractor.RemoteActorError) as excinfo:
trio.run(main)

assert excinfo.value.boxed_type == StreamOverrun
assert excinfo.value.type == StreamOverrun

elif overrunner == 'child':
elif overrunner == 'callee':
with pytest.raises(tractor.RemoteActorError) as excinfo:
trio.run(main)

# TODO: embedded remote errors so that we can verify the source
# error? the child delivers an error which is an overrun
# error? the callee delivers an error which is an overrun
# wrapped in a remote actor error.
assert excinfo.value.boxed_type == tractor.RemoteActorError
assert excinfo.value.type == tractor.RemoteActorError

else:
trio.run(main)

@ -1001,7 +920,8 @@ def test_one_end_stream_not_opened(

@tractor.context
async def echo_back_sequence(
ctx: Context,
seq: list[int],
wait_for_cancel: bool,
allow_overruns_side: str,

@ -1010,12 +930,12 @@ async def echo_back_sequence(

) -> None:
'''
Send endlessly on the child stream using a small buffer size
Send endlessly on the calleee stream using a small buffer size
setting on the contex to simulate backlogging that would normally
cause overruns.

'''
# NOTE: ensure that if the parent is expecting to cancel this task
# NOTE: ensure that if the caller is expecting to cancel this task
# that we stay echoing much longer then they are so we don't
# return early instead of receive the cancel msg.
total_batches: int = (

@ -1024,7 +944,7 @@ async def echo_back_sequence(
)

await ctx.started()
# await tractor.pause()
# await tractor.breakpoint()
async with ctx.open_stream(
msg_buffer_size=msg_buffer_size,


@ -1065,18 +985,18 @@ async def echo_back_sequence(
if be_slow:
await trio.sleep(0.05)

print('child waiting on next')
print('callee waiting on next')

print(f'child echoing back latest batch\n{batch}')
print(f'callee echoing back latest batch\n{batch}')
for msg in batch:
print(f'child sending msg\n{msg}')
print(f'callee sending msg\n{msg}')
await stream.send(msg)

try:
return 'yo'
finally:
print(
'exiting child with context:\n'
'exiting callee with context:\n'
f'{pformat(ctx)}\n'
)

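The ``echo_back_sequence`` task above (together with
``test_maybe_allow_overruns_stream`` below) exercises the
``msg_buffer_size`` and ``allow_overruns`` knobs on
``Context.open_stream()``. A condensed sketch of the overrun-tolerant
receive pattern, assuming a hypothetical ``producer`` context task::

    async with portal.open_context(producer) as (ctx, sent):
        async with ctx.open_stream(
            # a tiny buffer makes the receiver back up quickly..
            msg_buffer_size=1,
            # ..but overruns are absorbed by the runtime instead
            # of relaying a `StreamOverrun` error to the sender
            allow_overruns=True,
        ) as stream:
            async for msg in stream:
                print(f'rx: {msg}')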
@ -1130,68 +1050,59 @@ def test_maybe_allow_overruns_stream(
debug_mode=debug_mode,
) as an:
portal = await an.start_actor(
'child_sends_forever',
'callee_sends_forever',
enable_modules=[__name__],
loglevel=loglevel,
debug_mode=debug_mode,
)
seq = list(range(10))
async with portal.open_context(
echo_back_sequence,
seq=seq,
wait_for_cancel=cancel_ctx,
be_slow=(slow_side == 'child'),
allow_overruns_side=allow_overruns_side,

# stream-sequence batch info with send delay to determine
) as (ctx, sent):
# approx timeout determining whether test has hung.
assert sent is None
total_batches: int = 2
num_items: int = 10
seq = list(range(num_items))
parent_send_delay: float = 0.16
timeout: float = math.ceil(
total_batches * num_items * parent_send_delay
)
with trio.fail_after(timeout):
async with portal.open_context(
echo_back_sequence,
seq=seq,
wait_for_cancel=cancel_ctx,
be_slow=(slow_side == 'child'),
allow_overruns_side=allow_overruns_side,

) as (ctx, sent):
async with ctx.open_stream(
assert sent is None
msg_buffer_size=1 if slow_side == 'parent' else None,
allow_overruns=(allow_overruns_side in {'parent', 'both'}),
) as stream:

async with ctx.open_stream(
total_batches: int = 2
msg_buffer_size=1 if slow_side == 'parent' else None,
for _ in range(total_batches):
allow_overruns=(allow_overruns_side in {'parent', 'both'}),
for msg in seq:
) as stream:
# print(f'root tx {msg}')
await stream.send(msg)
if slow_side == 'parent':
# NOTE: we make the parent slightly
# slower, when it is slow, to make sure
# that in the overruns everywhere case
await trio.sleep(0.16)

for _ in range(total_batches):
batch = []
for msg in seq:
async for msg in stream:
# print(f'root tx {msg}')
print(f'root rx {msg}')
await stream.send(msg)
batch.append(msg)
if slow_side == 'parent':
if batch == seq:
# NOTE: we make the parent slightly
break
# slower, when it is slow, to make sure
# that in the overruns everywhere case
await trio.sleep(parent_send_delay)

batch = []
async for msg in stream:
print(f'root rx {msg}')
batch.append(msg)
if batch == seq:
break

if cancel_ctx:
# cancel the remote task
print('Requesting `ctx.cancel()` in parent!')
await ctx.cancel()

res: str|ContextCancelled = await ctx.result()

if cancel_ctx:
assert isinstance(res, ContextCancelled)
# cancel the remote task
assert tuple(res.canceller) == current_actor().uid
print('Requesting `ctx.cancel()` in parent!')
await ctx.cancel()

else:
res: str|ContextCancelled = await ctx.result()
print(f'RX ROOT SIDE RESULT {res}')
assert res == 'yo'
if cancel_ctx:
assert isinstance(res, ContextCancelled)
assert tuple(res.canceller) == current_actor().uid

else:
print(f'RX ROOT SIDE RESULT {res}')
assert res == 'yo'

# cancel the daemon
await portal.cancel_actor()

@ -1220,7 +1131,7 @@ def test_maybe_allow_overruns_stream(
# NOTE: i tried to isolate to a deterministic case here
# based on timeing, but i was kinda wasted, and i don't
# think it's sane to catch them..
assert err.boxed_type in (
assert err.type in (
tractor.RemoteActorError,
StreamOverrun,
)

@ -1228,12 +1139,11 @@ def test_maybe_allow_overruns_stream(
elif (
slow_side == 'child'
):
assert err.boxed_type == StreamOverrun
assert err.type == StreamOverrun

elif slow_side == 'parent':
assert err.boxed_type == tractor.RemoteActorError
assert err.type == tractor.RemoteActorError
assert 'StreamOverrun' in err.tb_str
assert 'StreamOverrun' in err.msgdata['tb_str']
assert err.tb_str == err.msgdata['tb_str']

else:
# if this hits the logic blocks from above are not
@ -12,26 +12,27 @@ TODO:
"""
from functools import partial
import itertools
from typing import Optional
import platform
import pathlib
import time

import pytest
import pexpect
from pexpect.exceptions import (
TIMEOUT,
EOF,
)

from .conftest import (
from tractor._testing import (
do_ctlc,
examples_dir,
PROMPT,
)
from tractor.devx._debug import (
_pause_msg,
_crash_msg,
_repl_fail_msg,
)
from .conftest import (
from conftest import (
expect,
_ci_env,
in_prompt_msg,
assert_before,
)

# TODO: The next great debugger audit could be done by you!

@ -51,6 +52,15 @@ if platform.system() == 'Windows':
)


def mk_cmd(ex_name: str) -> str:
'''
Generate a command suitable to pass to ``pexpect.spawn()``.

'''
script_path: pathlib.Path = examples_dir() / 'debugging' / f'{ex_name}.py'
return ' '.join(['python', str(script_path)])


# TODO: was trying to this xfail style but some weird bug i see in CI
# that's happening at collect time.. pretty soon gonna dump actions i'm
# thinkin...

@ -69,6 +79,136 @@ has_nested_actors = pytest.mark.has_nested_actors
# )


@pytest.fixture
def spawn(
start_method,
testdir,
reg_addr,
) -> 'pexpect.spawn':

if start_method != 'trio':
pytest.skip(
"Debugger tests are only supported on the trio backend"
)

def _spawn(cmd):
return testdir.spawn(
cmd=mk_cmd(cmd),
expect_timeout=3,
)

return _spawn


PROMPT = r"\(Pdb\+\)"


def expect(
child,

# prompt by default
patt: str = PROMPT,

**kwargs,

) -> None:
'''
Expect wrapper that prints last seen console
data before failing.

'''
try:
child.expect(
patt,
**kwargs,
)
except TIMEOUT:
before = str(child.before.decode())
print(before)
raise


def in_prompt_msg(
prompt: str,
parts: list[str],

pause_on_false: bool = False,
print_prompt_on_false: bool = True,

) -> bool:
'''
Predicate check if (the prompt's) std-streams output has all
`str`-parts in it.

Can be used in test asserts for bulk matching expected
log/REPL output for a given `pdb` interact point.

'''
for part in parts:
if part not in prompt:

if pause_on_false:
import pdbp
pdbp.set_trace()

if print_prompt_on_false:
print(prompt)

return False

return True

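For reference, typical usage of the ``in_prompt_msg()`` predicate
shown above (this is its pre-refactor form taking the decoded prompt
text; the branch's newer call sites pass the ``child`` process object
directly) looks like::

    child.expect(PROMPT)
    before = str(child.before.decode())

    # bulk-match expected REPL/log output at this interact point
    assert in_prompt_msg(
        prompt=before,
        parts=[_crash_msg, "('root'"],
    )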
def assert_before(
child,
patts: list[str],

**kwargs,

) -> None:

# as in before the prompt end
before: str = str(child.before.decode())
assert in_prompt_msg(
prompt=before,
parts=patts,

**kwargs
)


@pytest.fixture(
params=[False, True],
ids='ctl-c={}'.format,
)
def ctlc(
request,
ci_env: bool,

) -> bool:

use_ctlc = request.param

node = request.node
markers = node.own_markers
for mark in markers:
if mark.name == 'has_nested_actors':
pytest.skip(
f'Test {node} has nested actors and fails with Ctrl-C.\n'
f'The test can sometimes run fine locally but until'
' we solve' 'this issue this CI test will be xfail:\n'
'https://github.com/goodboy/tractor/issues/320'
)

if use_ctlc:
# XXX: disable pygments highlighting for auto-tests
# since some envs (like actions CI) will struggle
# the the added color-char encoding..
from tractor._debug import TractorConfig
TractorConfig.use_pygements = False

yield use_ctlc


@pytest.mark.parametrize(
'user_in_out',
[

@ -77,10 +217,7 @@ has_nested_actors = pytest.mark.has_nested_actors
],
ids=lambda item: f'{item[0]} -> {item[1]}',
)
def test_root_actor_error(
def test_root_actor_error(spawn, user_in_out):
spawn,
user_in_out,
):
'''
Demonstrate crash handler entering pdb from basic error in root actor.


@ -92,15 +229,14 @@ def test_root_actor_error(
# scan for the prompt
expect(child, PROMPT)

before = str(child.before.decode())

# make sure expected logging and error arrives
assert in_prompt_msg(
child,
before,
[
[_crash_msg, "('root'"]
_crash_msg,
"('root'",
'AssertionError',
]
)
assert 'AssertionError' in before

# send user command
child.sendline(user_input)

@ -119,10 +255,8 @@ def test_root_actor_error(
ids=lambda item: f'{item[0]} -> {item[1]}',
)
def test_root_actor_bp(spawn, user_in_out):
'''
"""Demonstrate breakpoint from in root actor.
Demonstrate breakpoint from in root actor.
"""

'''
user_input, expect_err_str = user_in_out
child = spawn('root_actor_breakpoint')


@ -136,7 +270,7 @@ def test_root_actor_bp(spawn, user_in_out):
child.expect('\r\n')

# process should exit
child.expect(EOF)
child.expect(pexpect.EOF)

if expect_err_str is None:
assert 'Error' not in str(child.before)

@ -144,6 +278,38 @@ def test_root_actor_bp(spawn, user_in_out):
assert expect_err_str in str(child.before)


def do_ctlc(
child,
count: int = 3,
delay: float = 0.1,
patt: Optional[str] = None,

# expect repl UX to reprint the prompt after every
# ctrl-c send.
# XXX: no idea but, in CI this never seems to work even on 3.10 so
# needs some further investigation potentially...
expect_prompt: bool = not _ci_env,

) -> None:

# make sure ctl-c sends don't do anything but repeat output
for _ in range(count):
time.sleep(delay)
child.sendcontrol('c')

# TODO: figure out why this makes CI fail..
# if you run this test manually it works just fine..
if expect_prompt:
before = str(child.before.decode())
time.sleep(delay)
child.expect(PROMPT)
time.sleep(delay)

if patt:
# should see the last line on console
assert patt in before

def test_root_actor_bp_forever(
|
def test_root_actor_bp_forever(
|
||||||
spawn,
|
spawn,
|
||||||
ctlc: bool,
|
ctlc: bool,
|
||||||
|
@ -183,7 +349,7 @@ def test_root_actor_bp_forever(
|
||||||
|
|
||||||
# quit out of the loop
|
# quit out of the loop
|
||||||
child.sendline('q')
|
child.sendline('q')
|
||||||
child.expect(EOF)
|
child.expect(pexpect.EOF)
|
||||||
|
|
||||||
|
|
||||||
@pytest.mark.parametrize(
|
@pytest.mark.parametrize(
|
||||||
|
@ -205,12 +371,10 @@ def test_subactor_error(
|
||||||
# scan for the prompt
|
# scan for the prompt
|
||||||
child.expect(PROMPT)
|
child.expect(PROMPT)
|
||||||
|
|
||||||
|
before = str(child.before.decode())
|
||||||
assert in_prompt_msg(
|
assert in_prompt_msg(
|
||||||
child,
|
before,
|
||||||
[
|
[_crash_msg, "('name_error'"]
|
||||||
_crash_msg,
|
|
||||||
"('name_error'",
|
|
||||||
]
|
|
||||||
)
|
)
|
||||||
|
|
||||||
if do_next:
|
if do_next:
|
||||||
|
@ -229,15 +393,17 @@ def test_subactor_error(
|
||||||
child.sendline('continue')
|
child.sendline('continue')
|
||||||
|
|
||||||
child.expect(PROMPT)
|
child.expect(PROMPT)
|
||||||
|
before = str(child.before.decode())
|
||||||
|
|
||||||
|
# root actor gets debugger engaged
|
||||||
assert in_prompt_msg(
|
assert in_prompt_msg(
|
||||||
child,
|
before,
|
||||||
[
|
[_crash_msg, "('root'"]
|
||||||
_crash_msg,
|
)
|
||||||
# root actor gets debugger engaged
|
# error is a remote error propagated from the subactor
|
||||||
"('root'",
|
assert in_prompt_msg(
|
||||||
# error is a remote error propagated from the subactor
|
before,
|
||||||
"('name_error'",
|
[_crash_msg, "('name_error'"]
|
||||||
]
|
|
||||||
)
|
)
|
||||||
|
|
||||||
# another round
|
# another round
|
||||||
|
@ -248,7 +414,7 @@ def test_subactor_error(
|
||||||
child.expect('\r\n')
|
child.expect('\r\n')
|
||||||
|
|
||||||
# process should exit
|
# process should exit
|
||||||
child.expect(EOF)
|
child.expect(pexpect.EOF)
|
||||||
|
|
||||||
|
|
||||||
def test_subactor_breakpoint(
|
def test_subactor_breakpoint(
|
||||||
|
@ -258,11 +424,14 @@ def test_subactor_breakpoint(
|
||||||
"Single subactor with an infinite breakpoint loop"
|
"Single subactor with an infinite breakpoint loop"
|
||||||
|
|
||||||
child = spawn('subactor_breakpoint')
|
child = spawn('subactor_breakpoint')
|
||||||
|
|
||||||
|
# scan for the prompt
|
||||||
child.expect(PROMPT)
|
child.expect(PROMPT)
|
||||||
|
|
||||||
|
before = str(child.before.decode())
|
||||||
assert in_prompt_msg(
|
assert in_prompt_msg(
|
||||||
child,
|
before,
|
||||||
[_pause_msg,
|
[_pause_msg, "('breakpoint_forever'"]
|
||||||
"('breakpoint_forever'",]
|
|
||||||
)
|
)
|
||||||
|
|
||||||
# do some "next" commands to demonstrate recurrent breakpoint
|
# do some "next" commands to demonstrate recurrent breakpoint
|
||||||
|
@ -278,8 +447,9 @@ def test_subactor_breakpoint(
|
||||||
for _ in range(5):
|
for _ in range(5):
|
||||||
child.sendline('continue')
|
child.sendline('continue')
|
||||||
child.expect(PROMPT)
|
child.expect(PROMPT)
|
||||||
|
before = str(child.before.decode())
|
||||||
assert in_prompt_msg(
|
assert in_prompt_msg(
|
||||||
child,
|
before,
|
||||||
[_pause_msg, "('breakpoint_forever'"]
|
[_pause_msg, "('breakpoint_forever'"]
|
||||||
)
|
)
|
||||||
|
|
||||||
|
@ -292,12 +462,9 @@ def test_subactor_breakpoint(
|
||||||
# child process should exit but parent will capture pdb.BdbQuit
|
# child process should exit but parent will capture pdb.BdbQuit
|
||||||
child.expect(PROMPT)
|
child.expect(PROMPT)
|
||||||
|
|
||||||
assert in_prompt_msg(
|
before = str(child.before.decode())
|
||||||
child,
|
assert "RemoteActorError: ('breakpoint_forever'" in before
|
||||||
['RemoteActorError:',
|
assert 'bdb.BdbQuit' in before
|
||||||
"('breakpoint_forever'",
|
|
||||||
'bdb.BdbQuit',]
|
|
||||||
)
|
|
||||||
|
|
||||||
if ctlc:
|
if ctlc:
|
||||||
do_ctlc(child)
|
do_ctlc(child)
|
||||||
|
@ -306,17 +473,11 @@ def test_subactor_breakpoint(
|
||||||
child.sendline('c')
|
child.sendline('c')
|
||||||
|
|
||||||
# process should exit
|
# process should exit
|
||||||
child.expect(EOF)
|
child.expect(pexpect.EOF)
|
||||||
|
|
||||||
assert in_prompt_msg(
|
before = str(child.before.decode())
|
||||||
child, [
|
assert "RemoteActorError: ('breakpoint_forever'" in before
|
||||||
'MessagingError:',
|
assert 'bdb.BdbQuit' in before
|
||||||
'RemoteActorError:',
|
|
||||||
"('breakpoint_forever'",
|
|
||||||
'bdb.BdbQuit',
|
|
||||||
],
|
|
||||||
pause_on_false=True,
|
|
||||||
)
|
|
||||||
|
|
||||||
|
|
||||||
@has_nested_actors
|
@has_nested_actors
|
||||||
|
@@ -336,7 +497,7 @@ def test_multi_subactors(

     before = str(child.before.decode())
     assert in_prompt_msg(
-        child,
+        before,
         [_pause_msg, "('breakpoint_forever'"]
     )

@@ -357,14 +518,12 @@ def test_multi_subactors(

     # first name_error failure
     child.expect(PROMPT)
+    before = str(child.before.decode())
     assert in_prompt_msg(
-        child,
-        [
-            _crash_msg,
-            "('name_error'",
-            "NameError",
-        ]
+        before,
+        [_crash_msg, "('name_error'"]
     )
+    assert "NameError" in before

     if ctlc:
         do_ctlc(child)
@@ -388,8 +547,9 @@ def test_multi_subactors(
     # breakpoint loop should re-engage
     child.sendline('c')
     child.expect(PROMPT)
+    before = str(child.before.decode())
     assert in_prompt_msg(
-        child,
+        before,
         [_pause_msg, "('breakpoint_forever'"]
     )

@@ -452,7 +612,7 @@ def test_multi_subactors(

     # process should exit
     child.sendline('c')
-    child.expect(EOF)
+    child.expect(pexpect.EOF)

     # repeat of previous multierror for final output
     assert_before(child, [
@@ -482,28 +642,25 @@ def test_multi_daemon_subactors(
     # the root's tty lock first so anticipate either crash
     # message on the first entry.

-    bp_forev_parts = [
-        _pause_msg,
-        "('bp_forever'",
-    ]
+    bp_forev_parts = [_pause_msg, "('bp_forever'"]
     bp_forev_in_msg = partial(
         in_prompt_msg,
         parts=bp_forev_parts,
     )

-    name_error_msg: str = "NameError: name 'doggypants' is not defined"
-    name_error_parts: list[str] = [name_error_msg]
+    name_error_msg = "NameError: name 'doggypants' is not defined"
+    name_error_parts = [name_error_msg]

     before = str(child.before.decode())

-    if bp_forev_in_msg(child=child):
+    if bp_forev_in_msg(prompt=before):
         next_parts = name_error_parts

     elif name_error_msg in before:
         next_parts = bp_forev_parts

     else:
-        raise ValueError('Neither log msg was found !?')
+        raise ValueError("Neither log msg was found !?")

     if ctlc:
         do_ctlc(child)
@@ -528,7 +685,7 @@ def test_multi_daemon_subactors(
     # now the root actor won't clobber the bp_forever child
     # during it's first access to the debug lock, but will instead
     # wait for the lock to release, by the edge triggered
-    # ``devx._debug.Lock.no_remote_has_tty`` event before sending cancel messages
+    # ``_debug.Lock.no_remote_has_tty`` event before sending cancel messages
     # (via portals) to its underlings B)

     # at some point here there should have been some warning msg from
@@ -572,12 +729,14 @@ def test_multi_daemon_subactors(
     # wait for final error in root
     # where it crashs with boxed error
     while True:
-        child.sendline('c')
-        child.expect(PROMPT)
-        if not in_prompt_msg(
-            child,
-            bp_forev_parts
-        ):
+        try:
+            child.sendline('c')
+            child.expect(PROMPT)
+            assert_before(
+                child,
+                bp_forev_parts
+            )
+        except AssertionError:
             break

     assert_before(
@@ -586,14 +745,13 @@ def test_multi_daemon_subactors(
             # boxed error raised in root task
             # "Attaching to pdb in crashed actor: ('root'",
             _crash_msg,
-            "('root'",  # should attach in root
-            "_exceptions.RemoteActorError:",  # with an embedded RAE for..
-            "('name_error'",  # the src subactor which raised
+            "('root'",
+            "_exceptions.RemoteActorError: ('name_error'",
         ]
     )

     child.sendline('c')
-    child.expect(EOF)
+    child.expect(pexpect.EOF)


 @has_nested_actors
@@ -669,7 +827,7 @@ def test_multi_subactors_root_errors(
     ])

     child.sendline('c')
-    child.expect(EOF)
+    child.expect(pexpect.EOF)

     assert_before(child, [
         # "Attaching to pdb in crashed actor: ('root'",
@@ -689,11 +847,10 @@ def test_multi_nested_subactors_error_through_nurseries(
     # https://github.com/goodboy/tractor/issues/320
     # ctlc: bool,
 ):
-    '''
-    Verify deeply nested actors that error trigger debugger entries
+    """Verify deeply nested actors that error trigger debugger entries
     at each actor nurserly (level) all the way up the tree.

-    '''
+    """
     # NOTE: previously, inside this script was a bug where if the
     # parent errors before a 2-levels-lower actor has released the lock,
     # the parent tries to cancel it but it's stuck in the debugger?
@@ -713,31 +870,22 @@ def test_multi_nested_subactors_error_through_nurseries(
         except EOF:
             break

-    assert_before(
-        child,
-        [ # boxed source errors
-            "NameError: name 'doggypants' is not defined",
-            "tractor._exceptions.RemoteActorError:",
-            "('name_error'",
-            "bdb.BdbQuit",
+    assert_before(child, [

-            # first level subtrees
-            # "tractor._exceptions.RemoteActorError: ('spawner0'",
-            "src_uid=('spawner0'",
+        # boxed source errors
+        "NameError: name 'doggypants' is not defined",
+        "tractor._exceptions.RemoteActorError: ('name_error'",
+        "bdb.BdbQuit",

-            # "tractor._exceptions.RemoteActorError: ('spawner1'",
+        # first level subtrees
+        "tractor._exceptions.RemoteActorError: ('spawner0'",
+        # "tractor._exceptions.RemoteActorError: ('spawner1'",

         # propagation of errors up through nested subtrees
-            # "tractor._exceptions.RemoteActorError: ('spawn_until_0'",
-            # "tractor._exceptions.RemoteActorError: ('spawn_until_1'",
-            # "tractor._exceptions.RemoteActorError: ('spawn_until_2'",
-            # ^-NOTE-^ old RAE repr, new one is below with a field
-            # showing the src actor's uid.
-            "src_uid=('spawn_until_0'",
-            "relay_uid=('spawn_until_1'",
-            "src_uid=('spawn_until_2'",
-        ]
-    )
+        "tractor._exceptions.RemoteActorError: ('spawn_until_0'",
+        "tractor._exceptions.RemoteActorError: ('spawn_until_1'",
+        "tractor._exceptions.RemoteActorError: ('spawn_until_2'",
+    ])


 @pytest.mark.timeout(15)
@@ -758,13 +906,10 @@ def test_root_nursery_cancels_before_child_releases_tty_lock(
     child = spawn('root_cancelled_but_child_is_in_tty_lock')

     child.expect(PROMPT)
-    assert_before(
-        child,
-        [
-            "NameError: name 'doggypants' is not defined",
-            "tractor._exceptions.RemoteActorError: ('name_error'",
-        ],
-    )
+
+    before = str(child.before.decode())
+    assert "NameError: name 'doggypants' is not defined" in before
+    assert "tractor._exceptions.RemoteActorError: ('name_error'" not in before
     time.sleep(0.5)

     if ctlc:
@@ -802,7 +947,7 @@ def test_root_nursery_cancels_before_child_releases_tty_lock(

     for i in range(3):
         try:
-            child.expect(EOF, timeout=0.5)
+            child.expect(pexpect.EOF, timeout=0.5)
             break
         except TIMEOUT:
             child.sendline('c')
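The recurring `EOF` vs `pexpect.EOF` (and `TIMEOUT`) churn in these hunks is purely an import-style difference; a hedged sketch assuming the test module's imports, which this diff doesn't show (the spawned command is illustrative):

```python
import pexpect
from pexpect import EOF, TIMEOUT  # bare names, as on the minus side

child = pexpect.spawn('python example.py')  # hypothetical target script
child.expect(EOF, timeout=0.5)              # equivalent to the plus side's
# child.expect(pexpect.EOF, timeout=0.5)    # module-qualified spelling
```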
@@ -844,7 +989,7 @@ def test_root_cancels_child_context_during_startup(
     do_ctlc(child)

     child.sendline('c')
-    child.expect(EOF)
+    child.expect(pexpect.EOF)


 def test_different_debug_mode_per_actor(
@@ -855,8 +1000,9 @@ def test_different_debug_mode_per_actor(
     child.expect(PROMPT)

     # only one actor should enter the debugger
+    before = str(child.before.decode())
     assert in_prompt_msg(
-        child,
+        before,
         [_crash_msg, "('debugged_boi'", "RuntimeError"],
     )

@@ -864,240 +1010,18 @@ def test_different_debug_mode_per_actor(
         do_ctlc(child)

     child.sendline('c')
-    child.expect(EOF)
+    child.expect(pexpect.EOF)

+    before = str(child.before.decode())

     # NOTE: this debugged actor error currently WON'T show up since the
     # root will actually cancel and terminate the nursery before the error
     # msg reported back from the debug mode actor is processed.
     # assert "tractor._exceptions.RemoteActorError: ('debugged_boi'" in before

+    assert "tractor._exceptions.RemoteActorError: ('crash_boi'" in before

     # the crash boi should not have made a debugger request but
     # instead crashed completely
-    assert_before(
-        child,
-        [
-            "tractor._exceptions.RemoteActorError:",
-            "src_uid=('crash_boi'",
-            "RuntimeError",
-        ]
-    )
+    assert "tractor._exceptions.RemoteActorError: ('crash_boi'" in before
+    assert "RuntimeError" in before
-
-
-def test_post_mortem_api(
-    spawn,
-    ctlc: bool,
-):
-    '''
-    Verify the `tractor.post_mortem()` API works in an exception
-    handler block.
-
-    '''
-    child = spawn('pm_in_subactor')
-
-    # First entry is via manual `.post_mortem()`
-    child.expect(PROMPT)
-    assert_before(
-        child,
-        [
-            _crash_msg,
-            "<Task 'name_error'",
-            "NameError",
-            "('child'",
-            "tractor.post_mortem()",
-        ]
-    )
-    if ctlc:
-        do_ctlc(child)
-    child.sendline('c')
-
-    # 2nd is RPC crash handler
-    child.expect(PROMPT)
-    assert_before(
-        child,
-        [
-            _crash_msg,
-            "<Task 'name_error'",
-            "NameError",
-            "('child'",
-        ]
-    )
-    if ctlc:
-        do_ctlc(child)
-    child.sendline('c')
-
-    # 3rd is via RAE bubbled to root's parent ctx task and
-    # crash-handled via another manual pm call.
-    child.expect(PROMPT)
-    assert_before(
-        child,
-        [
-            _crash_msg,
-            "<Task '__main__.main'",
-            "('root'",
-            "NameError",
-            "tractor.post_mortem()",
-            "src_uid=('child'",
-        ]
-    )
-    if ctlc:
-        do_ctlc(child)
-    child.sendline('c')
-
-    # 4th and FINAL is via RAE bubbled to root's parent ctx task and
-    # crash-handled via another manual pm call.
-    child.expect(PROMPT)
-    assert_before(
-        child,
-        [
-            _crash_msg,
-            "<Task '__main__.main'",
-            "('root'",
-            "NameError",
-            "src_uid=('child'",
-        ]
-    )
-    if ctlc:
-        do_ctlc(child)
-
-
-    # TODO: ensure we're stopped and showing the right call stack frame
-    # -[ ] need a way to strip the terminal color chars in order to
-    #     pattern match... see TODO around `assert_before()` above!
-    # child.sendline('w')
-    # child.expect(PROMPT)
-    # assert_before(
-    #     child,
-    #     [
-    #         # error src block annot at ctx open
-    #         '-> async with p.open_context(name_error) as (ctx, first):',
-    #     ]
-    # )
-
-    # # step up a frame to ensure the it's the root's nursery
-    # child.sendline('u')
-    # child.expect(PROMPT)
-    # assert_before(
-    #     child,
-    #     [
-    #         # handler block annotation
-    #         '-> async with tractor.open_nursery(',
-    #     ]
-    # )
-
-    child.sendline('c')
-    child.expect(EOF)
-
-
-def test_shield_pause(
-    spawn,
-):
-    '''
-    Verify the `tractor.pause()/.post_mortem()` API works inside an
-    already cancelled `trio.CancelScope` and that you can step to the
-    next checkpoint wherein the cancelled will get raised.
-
-    '''
-    child = spawn('shielded_pause')
-
-    # First entry is via manual `.post_mortem()`
-    child.expect(PROMPT)
-    assert_before(
-        child,
-        [
-            _pause_msg,
-            "cancellable_pause_loop'",
-            "('cancelled_before_pause'", # actor name
-        ]
-    )
-
-    # since 3 tries in ex. shield pause loop
-    for i in range(3):
-        child.sendline('c')
-        child.expect(PROMPT)
-        assert_before(
-            child,
-            [
-                _pause_msg,
-                "INSIDE SHIELDED PAUSE",
-                "('cancelled_before_pause'", # actor name
-            ]
-        )
-
-    # back inside parent task that opened nursery
-    child.sendline('c')
-    child.expect(PROMPT)
-    assert_before(
-        child,
-        [
-            _crash_msg,
-            "('cancelled_before_pause'", # actor name
-            _repl_fail_msg,
-            "trio.Cancelled",
-            "raise Cancelled._create()",
-
-            # we should be handling a taskc inside
-            # the first `.port_mortem()` sin-shield!
-            'await DebugStatus.req_finished.wait()',
-        ]
-    )
-
-    # same as above but in the root actor's task
-    child.sendline('c')
-    child.expect(PROMPT)
-    assert_before(
-        child,
-        [
-            _crash_msg,
-            "('root'", # actor name
-            _repl_fail_msg,
-            "trio.Cancelled",
-            "raise Cancelled._create()",
-
-            # handling a taskc inside the first unshielded
-            # `.port_mortem()`.
-            # BUT in this case in the root-proc path ;)
-            'wait Lock._debug_lock.acquire()',
-        ]
-    )
-    child.sendline('c')
-    child.expect(EOF)
-
-
-# TODO: better error for "non-ideal" usage from the root actor.
-# -[ ] if called from an async scope emit a message that suggests
-#    using `await tractor.pause()` instead since it's less overhead
-#    (in terms of `greenback` and/or extra threads) and if it's from
-#    a sync scope suggest that usage must first call
-#    `ensure_portal()` in the (eventual parent) async calling scope?
-def test_sync_pause_from_bg_task_in_root_actor_():
-    '''
-    When used from the root actor, normally we can only implicitly
-    support `.pause_from_sync()` from the main-parent-task (that
-    opens the runtime via `open_root_actor()`) since `greenback`
-    requires a `.ensure_portal()` call per `trio.Task` where it is
-    used.
-
-    '''
-    ...
-
-# TODO: needs ANSI code stripping tho, see `assert_before()` # above!
-def test_correct_frames_below_hidden():
-    '''
-    Ensure that once a `tractor.pause()` enages, when the user
-    inputs a "next"/"n" command the actual next line steps
-    and that using a "step"/"s" into the next LOC, particuarly
-    `tractor` APIs, you can step down into that code.
-
-    '''
-    ...
-
-
-def test_cant_pause_from_paused_task():
-    '''
-    Pausing from with an already paused task should raise an error.
-
-    Normally this should only happen in practise while debugging the call stack of `tractor.pause()` itself, likely
-    by a `.pause()` line somewhere inside our runtime.
-
-    '''
-    ...
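`assert_before()` (alongside `in_prompt_msg()`) is the workhorse assertion across all the debugger tests above. A hedged sketch of its presumable contract, inferred purely from its call sites here rather than from the repo's actual code:

```python
# a guess at the helper's behavior: every pattern must appear in the
# terminal output pexpect captured just before the last matched
# prompt (or EOF) on the spawned child process.
def assert_before(child, patts: list[str]) -> None:
    before: str = str(child.before.decode())
    for patt in patts:
        assert patt in before, (
            f'pattern {patt!r} not found in child output:\n{before}'
        )
```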
@@ -26,7 +26,7 @@ async def test_reg_then_unreg(reg_addr):
     portal = await n.start_actor('actor', enable_modules=[__name__])
     uid = portal.channel.uid

-    async with tractor.get_registry(*reg_addr) as aportal:
+    async with tractor.get_arbiter(*reg_addr) as aportal:
         # this local actor should be the arbiter
         assert actor is aportal.actor

@@ -160,7 +160,7 @@ async def spawn_and_check_registry(
     async with tractor.open_root_actor(
         registry_addrs=[reg_addr],
     ):
-        async with tractor.get_registry(*reg_addr) as portal:
+        async with tractor.get_arbiter(*reg_addr) as portal:
             # runtime needs to be up to call this
             actor = tractor.current_actor()

@@ -181,9 +181,7 @@ async def spawn_and_check_registry(

     try:
         async with tractor.open_nursery() as n:
-            async with trio.open_nursery(
-                strict_exception_groups=False,
-            ) as trion:
+            async with trio.open_nursery() as trion:

                 portals = {}
                 for i in range(3):
@@ -300,7 +298,7 @@ async def close_chans_before_nursery(
     async with tractor.open_root_actor(
         registry_addrs=[reg_addr],
     ):
-        async with tractor.get_registry(*reg_addr) as aportal:
+        async with tractor.get_arbiter(*reg_addr) as aportal:
             try:
                 get_reg = partial(unpack_reg, aportal)

@@ -318,9 +316,7 @@ async def close_chans_before_nursery(
                 async with portal2.open_stream_from(
                     stream_forever
                 ) as agen2:
-                    async with trio.open_nursery(
-                        strict_exception_groups=False,
-                    ) as n:
+                    async with trio.open_nursery() as n:
                         n.start_soon(streamer, agen1)
                         n.start_soon(cancel, use_signal, .5)
                         try:
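The `strict_exception_groups=False` flag appearing on the minus side of the two hunks above is real `trio` API: under strict semantics (the modern default) a nursery wraps even a single child error in an `ExceptionGroup`, while the flag restores the legacy bare-exception behavior the older test code assumes. A minimal, self-contained sketch of the difference:

```python
import trio

async def boom() -> None:
    raise ValueError('lone child error')

async def main() -> None:
    # with the flag, a single child exception propagates unwrapped;
    # drop it and the same error arrives boxed in an ExceptionGroup,
    # which is why the newer tests pass it explicitly.
    async with trio.open_nursery(strict_exception_groups=False) as n:
        n.start_soon(boom)

try:
    trio.run(main)
except ValueError as exc:  # not ExceptionGroup, thanks to the flag
    print(f'caught bare exception: {exc}')
```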
@@ -19,8 +19,8 @@ from tractor._testing import (
 @pytest.fixture
 def run_example_in_subproc(
     loglevel: str,
-    testdir: pytest.Pytester,
-    reg_addr: tuple[str, int],
+    testdir,
+    arb_addr: tuple[str, int],
 ):

     @contextmanager
@@ -81,36 +81,27 @@ def run_example_in_subproc(

     # walk yields: (dirpath, dirnames, filenames)
     [
-        (p[0], f)
-        for p in os.walk(examples_dir())
-        for f in p[2]
+        (p[0], f) for p in os.walk(examples_dir()) for f in p[2]

-        if (
-            '__' not in f
-            and f[0] != '_'
-            and 'debugging' not in p[0]
-            and 'integration' not in p[0]
-            and 'advanced_faults' not in p[0]
-            and 'multihost' not in p[0]
-        )
+        if '__' not in f
+        and f[0] != '_'
+        and 'debugging' not in p[0]
+        and 'integration' not in p[0]
+        and 'advanced_faults' not in p[0]
     ],

     ids=lambda t: t[1],
 )
-def test_example(
-    run_example_in_subproc,
-    example_script,
-):
-    '''
-    Load and run scripts from this repo's ``examples/`` dir as a user
+def test_example(run_example_in_subproc, example_script):
+    """Load and run scripts from this repo's ``examples/`` dir as a user
     would copy and pasing them into their editor.

     On windows a little more "finessing" is done to make
     ``multiprocessing`` play nice: we copy the ``__main__.py`` into the
     test directory and invoke the script as a module with ``python -m
     test_example``.
-    '''
-    ex_file: str = os.path.join(*example_script)
+    """
+    ex_file = os.path.join(*example_script)

     if 'rpc_bidir_streaming' in ex_file and sys.version_info < (3, 9):
         pytest.skip("2-way streaming example requires py3.9 async with syntax")
@@ -136,8 +127,7 @@ def test_example(
             # shouldn't eventually once we figure out what's
             # a better way to be explicit about aio side
             # cancels?
-            and
-            'asyncio.exceptions.CancelledError' not in last_error
+            and 'asyncio.exceptions.CancelledError' not in last_error
         ):
             raise Exception(errmsg)

@@ -1,946 +0,0 @@
-'''
-Low-level functional audits for our
-"capability based messaging"-spec feats.
-
-B~)
-
-'''
-from contextlib import (
-    contextmanager as cm,
-    # nullcontext,
-)
-import importlib
-from typing import (
-    Any,
-    Type,
-    Union,
-)
-
-from msgspec import (
-    # structs,
-    # msgpack,
-    Raw,
-    # Struct,
-    ValidationError,
-)
-import pytest
-import trio
-
-import tractor
-from tractor import (
-    Actor,
-    # _state,
-    MsgTypeError,
-    Context,
-)
-from tractor.msg import (
-    _codec,
-    _ctxvar_MsgCodec,
-    _exts,
-
-    NamespacePath,
-    MsgCodec,
-    MsgDec,
-    mk_codec,
-    mk_dec,
-    apply_codec,
-    current_codec,
-)
-from tractor.msg.types import (
-    log,
-    Started,
-    # _payload_msgs,
-    # PayloadMsg,
-    # mk_msg_spec,
-)
-from tractor.msg._ops import (
-    limit_plds,
-)
-
-
-def enc_nsp(obj: Any) -> Any:
-    actor: Actor = tractor.current_actor(
-        err_on_no_runtime=False,
-    )
-    uid: tuple[str, str]|None = None if not actor else actor.uid
-    print(f'{uid} ENC HOOK')
-
-    match obj:
-        # case NamespacePath()|str():
-        case NamespacePath():
-            encoded: str = str(obj)
-            print(
-                f'----- ENCODING `NamespacePath` as `str` ------\n'
-                f'|_obj:{type(obj)!r} = {obj!r}\n'
-                f'|_encoded: str = {encoded!r}\n'
-            )
-            # if type(obj) != NamespacePath:
-            #     breakpoint()
-            return encoded
-        case _:
-            logmsg: str = (
-                f'{uid}\n'
-                'FAILED ENCODE\n'
-                f'obj-> `{obj}: {type(obj)}`\n'
-            )
-            raise NotImplementedError(logmsg)
-
-
-def dec_nsp(
-    obj_type: Type,
-    obj: Any,
-
-) -> Any:
-    # breakpoint()
-    actor: Actor = tractor.current_actor(
-        err_on_no_runtime=False,
-    )
-    uid: tuple[str, str]|None = None if not actor else actor.uid
-    print(
-        f'{uid}\n'
-        'CUSTOM DECODE\n'
-        f'type-arg-> {obj_type}\n'
-        f'obj-arg-> `{obj}`: {type(obj)}\n'
-    )
-    nsp = None
-    # XXX, never happens right?
-    if obj_type is Raw:
-        breakpoint()
-
-    if (
-        obj_type is NamespacePath
-        and isinstance(obj, str)
-        and ':' in obj
-    ):
-        nsp = NamespacePath(obj)
-        # TODO: we could built a generic handler using
-        # JUST matching the obj_type part?
-        # nsp = obj_type(obj)
-
-    if nsp:
-        print(f'Returning NSP instance: {nsp}')
-        return nsp
-
-    logmsg: str = (
-        f'{uid}\n'
-        'FAILED DECODE\n'
-        f'type-> {obj_type}\n'
-        f'obj-arg-> `{obj}`: {type(obj)}\n\n'
-        f'current codec:\n'
-        f'{current_codec()}\n'
-    )
-    # TODO: figure out the ignore subsys for this!
-    # -[ ] option whether to defense-relay backc the msg
-    #   inside an `Invalid`/`Ignore`
-    # -[ ] how to make this handling pluggable such that a
-    #   `Channel`/`MsgTransport` can intercept and process
-    #   back msgs either via exception handling or some other
-    #   signal?
-    log.warning(logmsg)
-    # NOTE: this delivers the invalid
-    # value up to `msgspec`'s decoding
-    # machinery for error raising.
-    return obj
-    # raise NotImplementedError(logmsg)
-
-
-def ex_func(*args):
-    '''
-    A mod level func we can ref and load via our `NamespacePath`
-    python-object pointer `str` subtype.
-
-    '''
-    print(f'ex_func({args})')
-
-
-@pytest.mark.parametrize(
-    'add_codec_hooks',
-    [
-        True,
-        False,
-    ],
-    ids=['use_codec_hooks', 'no_codec_hooks'],
-)
-def test_custom_extension_types(
-    debug_mode: bool,
-    add_codec_hooks: bool
-):
-    '''
-    Verify that a `MsgCodec` (used for encoding all outbound IPC msgs
-    and decoding all inbound `PayloadMsg`s) and a paired `MsgDec`
-    (used for decoding the `PayloadMsg.pld: Raw` received within a given
-    task's ipc `Context` scope) can both send and receive "extension types"
-    as supported via custom converter hooks passed to `msgspec`.
-
-    '''
-    nsp_pld_dec: MsgDec = mk_dec(
-        spec=None,  # ONLY support the ext type
-        dec_hook=dec_nsp if add_codec_hooks else None,
-        ext_types=[NamespacePath],
-    )
-    nsp_codec: MsgCodec = mk_codec(
-        # ipc_pld_spec=Raw,  # default!
-
-        # NOTE XXX: the encode hook MUST be used no matter what since
-        # our `NamespacePath` is not any of a `Any` native type nor
-        # a `msgspec.Struct` subtype - so `msgspec` has no way to know
-        # how to encode it unless we provide the custom hook.
-        #
-        # AGAIN that is, regardless of whether we spec an
-        # `Any`-decoded-pld the enc has no knowledge (by default)
-        # how to enc `NamespacePath` (nsp), so we add a custom
-        # hook to do that ALWAYS.
-        enc_hook=enc_nsp if add_codec_hooks else None,
-
-        # XXX NOTE: pretty sure this is mutex with the `type=` to
-        # `Decoder`? so it won't work in tandem with the
-        # `ipc_pld_spec` passed above?
-        ext_types=[NamespacePath],
-
-        # TODO? is it useful to have the `.pld` decoded *prior* to
-        # the `PldRx`?? like perf or mem related?
-        # ext_dec=nsp_pld_dec,
-    )
-    if add_codec_hooks:
-        assert nsp_codec.dec.dec_hook is None
-
-        # TODO? if we pass `ext_dec` above?
-        # assert nsp_codec.dec.dec_hook is dec_nsp
-
-        assert nsp_codec.enc.enc_hook is enc_nsp
-
-    nsp = NamespacePath.from_ref(ex_func)
-
-    try:
-        nsp_bytes: bytes = nsp_codec.encode(nsp)
-        nsp_rt_sin_msg = nsp_pld_dec.decode(nsp_bytes)
-        nsp_rt_sin_msg.load_ref() is ex_func
-    except TypeError:
-        if not add_codec_hooks:
-            pass
-
-    try:
-        msg_bytes: bytes = nsp_codec.encode(
-            Started(
-                cid='cid',
-                pld=nsp,
-            )
-        )
-        # since the ext-type obj should also be set as the msg.pld
-        assert nsp_bytes in msg_bytes
-        started_rt: Started = nsp_codec.decode(msg_bytes)
-        pld: Raw = started_rt.pld
-        assert isinstance(pld, Raw)
-        nsp_rt: NamespacePath = nsp_pld_dec.decode(pld)
-        assert isinstance(nsp_rt, NamespacePath)
-        # in obj comparison terms they should be the same
-        assert nsp_rt == nsp
-        # ensure we've decoded to ext type!
-        assert nsp_rt.load_ref() is ex_func
-
-    except TypeError:
-        if not add_codec_hooks:
-            pass
-
-@tractor.context
-async def sleep_forever_in_sub(
-    ctx: Context,
-) -> None:
-    await trio.sleep_forever()
-
-
-def mk_custom_codec(
-    add_hooks: bool,
-
-) -> tuple[
-    MsgCodec,  # encode to send
-    MsgDec,  # pld receive-n-decode
-]:
-    '''
-    Create custom `msgpack` enc/dec-hooks and set a `Decoder`
-    which only loads `pld_spec` (like `NamespacePath`) types.
-
-    '''
-
-    # XXX NOTE XXX: despite defining `NamespacePath` as a type
-    # field on our `PayloadMsg.pld`, we still need a enc/dec_hook() pair
-    # to cast to/from that type on the wire. See the docs:
-    # https://jcristharif.com/msgspec/extending.html#mapping-to-from-native-types
-
-    # if pld_spec is Any:
-    #     pld_spec = Raw
-
-    nsp_codec: MsgCodec = mk_codec(
-        # ipc_pld_spec=Raw,  # default!
-
-        # NOTE XXX: the encode hook MUST be used no matter what since
-        # our `NamespacePath` is not any of a `Any` native type nor
-        # a `msgspec.Struct` subtype - so `msgspec` has no way to know
-        # how to encode it unless we provide the custom hook.
-        #
-        # AGAIN that is, regardless of whether we spec an
-        # `Any`-decoded-pld the enc has no knowledge (by default)
-        # how to enc `NamespacePath` (nsp), so we add a custom
-        # hook to do that ALWAYS.
-        enc_hook=enc_nsp if add_hooks else None,
-
-        # XXX NOTE: pretty sure this is mutex with the `type=` to
-        # `Decoder`? so it won't work in tandem with the
-        # `ipc_pld_spec` passed above?
-        ext_types=[NamespacePath],
-    )
-    # dec_hook=dec_nsp if add_hooks else None,
-    return nsp_codec
-
-
-@pytest.mark.parametrize(
-    'limit_plds_args',
-    [
-        (
-            {'dec_hook': None, 'ext_types': None},
-            None,
-        ),
-        (
-            {'dec_hook': dec_nsp, 'ext_types': None},
-            TypeError,
-        ),
-        (
-            {'dec_hook': dec_nsp, 'ext_types': [NamespacePath]},
-            None,
-        ),
-        (
-            {'dec_hook': dec_nsp, 'ext_types': [NamespacePath|None]},
-            None,
-        ),
-    ],
-    ids=[
-        'no_hook_no_ext_types',
-        'only_hook',
-        'hook_and_ext_types',
-        'hook_and_ext_types_w_null',
-    ]
-)
-def test_pld_limiting_usage(
-    limit_plds_args: tuple[dict, Exception|None],
-):
-    '''
-    Verify `dec_hook()` and `ext_types` need to either both be
-    provided or we raise a explanator type-error.
-
-    '''
-    kwargs, maybe_err = limit_plds_args
-    async def main():
-        async with tractor.open_nursery() as an:  # just to open runtime
-
-            # XXX SHOULD NEVER WORK outside an ipc ctx scope!
-            try:
-                with limit_plds(**kwargs):
-                    pass
-            except RuntimeError:
-                pass
-
-            p: tractor.Portal = await an.start_actor(
-                'sub',
-                enable_modules=[__name__],
-            )
-            async with (
-                p.open_context(
-                    sleep_forever_in_sub
-                ) as (ctx, first),
-            ):
-                try:
-                    with limit_plds(**kwargs):
-                        pass
-                except maybe_err as exc:
-                    assert type(exc) is maybe_err
-                    pass
-
-
-def chk_codec_applied(
-    expect_codec: MsgCodec|None,
-    enter_value: MsgCodec|None = None,
-
-) -> MsgCodec:
-    '''
-    buncha sanity checks ensuring that the IPC channel's
-    context-vars are set to the expected codec and that are
-    ctx-var wrapper APIs match the same.
-
-    '''
-    # TODO: play with tricyle again, bc this is supposed to work
-    # the way we want?
-    #
-    # TreeVar
-    # task: trio.Task = trio.lowlevel.current_task()
-    # curr_codec = _ctxvar_MsgCodec.get_in(task)
-
-    # ContextVar
-    # task_ctx: Context = task.context
-    # assert _ctxvar_MsgCodec in task_ctx
-    # curr_codec: MsgCodec = task.context[_ctxvar_MsgCodec]
-    if expect_codec is None:
-        assert enter_value is None
-        return
-
-    # NOTE: currently we use this!
-    # RunVar
-    curr_codec: MsgCodec = current_codec()
-    last_read_codec = _ctxvar_MsgCodec.get()
-    # assert curr_codec is last_read_codec
-
-    assert (
-        (same_codec := expect_codec) is
-        # returned from `mk_codec()`
-
-        # yielded value from `apply_codec()`
-
-        # read from current task's `contextvars.Context`
-        curr_codec is
-        last_read_codec
-
-        # the default `msgspec` settings
-        is not _codec._def_msgspec_codec
-        is not _codec._def_tractor_codec
-    )
-
-    if enter_value:
-        assert enter_value is same_codec
-
-
-@tractor.context
-async def send_back_values(
-    ctx: Context,
-    rent_pld_spec_type_strs: list[str],
-    add_hooks: bool,
-
-) -> None:
-    '''
-    Setup up a custom codec to load instances of `NamespacePath`
-    and ensure we can round trip a func ref with our parent.
-
-    '''
-    uid: tuple = tractor.current_actor().uid
-
-    # init state in sub-actor should be default
-    chk_codec_applied(
-        expect_codec=_codec._def_tractor_codec,
-    )
-
-    # load pld spec from input str
-    rent_pld_spec = _exts.dec_type_union(
-        rent_pld_spec_type_strs,
-        mods=[
-            importlib.import_module(__name__),
-        ],
-    )
-    rent_pld_spec_types: set[Type] = _codec.unpack_spec_types(
-        rent_pld_spec,
-    )
-
-    # ONLY add ext-hooks if the rent specified a non-std type!
-    add_hooks: bool = (
-        NamespacePath in rent_pld_spec_types
-        and
-        add_hooks
-    )
-
-    # same as on parent side config.
-    nsp_codec: MsgCodec|None = None
-    if add_hooks:
-        nsp_codec = mk_codec(
-            enc_hook=enc_nsp,
-            ext_types=[NamespacePath],
-        )
-
-    with (
-        maybe_apply_codec(nsp_codec) as codec,
-        limit_plds(
-            rent_pld_spec,
-            dec_hook=dec_nsp if add_hooks else None,
-            ext_types=[NamespacePath] if add_hooks else None,
-        ) as pld_dec,
-    ):
-        # ?XXX? SHOULD WE NOT be swapping the global codec since it
-        # breaks `Context.started()` roundtripping checks??
-        chk_codec_applied(
-            expect_codec=nsp_codec,
-            enter_value=codec,
-        )
-
-        # ?TODO, mismatch case(s)?
-        #
-        # ensure pld spec matches on both sides
-        ctx_pld_dec: MsgDec = ctx._pld_rx._pld_dec
-        assert pld_dec is ctx_pld_dec
-        child_pld_spec: Type = pld_dec.spec
-        child_pld_spec_types: set[Type] = _codec.unpack_spec_types(
-            child_pld_spec,
-        )
-        assert (
-            child_pld_spec_types.issuperset(
-                rent_pld_spec_types
-            )
-        )
-
-        # ?TODO, try loop for each of the types in pld-superset?
-        #
-        # for send_value in [
-        #     nsp,
-        #     str(nsp),
-        #     None,
-        # ]:
-        nsp = NamespacePath.from_ref(ex_func)
-        try:
-            print(
-                f'{uid}: attempting to `.started({nsp})`\n'
-                f'\n'
-                f'rent_pld_spec: {rent_pld_spec}\n'
-                f'child_pld_spec: {child_pld_spec}\n'
-                f'codec: {codec}\n'
-            )
-            # await tractor.pause()
-            await ctx.started(nsp)
-
-        except tractor.MsgTypeError as _mte:
-            mte = _mte
-
-            # false -ve case
-            if add_hooks:
-                raise RuntimeError(
-                    f'EXPECTED to `.started()` value given spec ??\n\n'
-                    f'child_pld_spec -> {child_pld_spec}\n'
-                    f'value = {nsp}: {type(nsp)}\n'
-                )
-
-            # true -ve case
-            raise mte
-
-        # TODO: maybe we should add our own wrapper error so as to
-        # be interchange-lib agnostic?
-        # -[ ] the error type is wtv is raised from the hook so we
-        #   could also require a type-class of errors for
-        #   indicating whether the hook-failure can be handled by
-        #   a nasty-dialog-unprot sub-sys?
-        except TypeError as typerr:
-            # false -ve
-            if add_hooks:
-                raise RuntimeError('Should have been able to send `nsp`??')
-
-            # true -ve
-            print('Failed to send `nsp` due to no ext hooks set!')
-            raise typerr
-
-        # now try sending a set of valid and invalid plds to ensure
-        # the pld spec is respected.
-        sent: list[Any] = []
-        async with ctx.open_stream() as ipc:
-            print(
-                f'{uid}: streaming all pld types to rent..'
-            )
-
-            # for send_value, expect_send in iter_send_val_items:
-            for send_value in [
-                nsp,
-                str(nsp),
-                None,
-            ]:
-                send_type: Type = type(send_value)
-                print(
-                    f'{uid}: SENDING NEXT pld\n'
-                    f'send_type: {send_type}\n'
-                    f'send_value: {send_value}\n'
-                )
-                try:
-                    await ipc.send(send_value)
-                    sent.append(send_value)
-
-                except ValidationError as valerr:
-                    print(f'{uid} FAILED TO SEND {send_value}!')
-
-                    # false -ve
-                    if add_hooks:
-                        raise RuntimeError(
-                            f'EXPECTED to roundtrip value given spec:\n'
-                            f'rent_pld_spec -> {rent_pld_spec}\n'
-                            f'child_pld_spec -> {child_pld_spec}\n'
-                            f'value = {send_value}: {send_type}\n'
-                        )
-
-                    # true -ve
-                    raise valerr
-                    # continue
-
-            else:
-                print(
-                    f'{uid}: finished sending all values\n'
-                    'Should be exiting stream block!\n'
-                )
-
-        print(f'{uid}: exited streaming block!')
-
-
-
-@cm
-def maybe_apply_codec(codec: MsgCodec|None) -> MsgCodec|None:
-    if codec is None:
-        yield None
-        return
-
-    with apply_codec(codec) as codec:
-        yield codec
-
-
-@pytest.mark.parametrize(
-    'pld_spec',
-    [
-        Any,
-        NamespacePath,
-        NamespacePath|None,  # the "maybe" spec Bo
-    ],
-    ids=[
-        'any_type',
-        'only_nsp_ext',
-        'maybe_nsp_ext',
-    ]
-)
-@pytest.mark.parametrize(
-    'add_hooks',
-    [
-        True,
-        False,
-    ],
-    ids=[
-        'use_codec_hooks',
-        'no_codec_hooks',
-    ],
-)
-def test_ext_types_over_ipc(
-    debug_mode: bool,
-    pld_spec: Union[Type],
-    add_hooks: bool,
-):
-    '''
-    Ensure we can support extension types coverted using
-    `enc/dec_hook()`s passed to the `.msg.limit_plds()` API
-    and that sane errors happen when we try do the same without
-    the codec hooks.
-
-    '''
-    pld_types: set[Type] = _codec.unpack_spec_types(pld_spec)
-
-    async def main():
-
-        # sanity check the default pld-spec beforehand
-        chk_codec_applied(
-            expect_codec=_codec._def_tractor_codec,
-        )
-
-        # extension type we want to send as msg payload
-        nsp = NamespacePath.from_ref(ex_func)
-
-        # ^NOTE, 2 cases:
-        # - codec hooks noto added -> decode nsp as `str`
-        # - codec with hooks -> decode nsp as `NamespacePath`
-        nsp_codec: MsgCodec|None = None
-        if (
-            NamespacePath in pld_types
-            and
-            add_hooks
-        ):
-            nsp_codec = mk_codec(
-                enc_hook=enc_nsp,
-                ext_types=[NamespacePath],
-            )
-
-        async with tractor.open_nursery(
-            debug_mode=debug_mode,
-        ) as an:
-            p: tractor.Portal = await an.start_actor(
-                'sub',
-                enable_modules=[__name__],
-            )
-            with (
-                maybe_apply_codec(nsp_codec) as codec,
-            ):
-                chk_codec_applied(
-                    expect_codec=nsp_codec,
-                    enter_value=codec,
-                )
-                rent_pld_spec_type_strs: list[str] = _exts.enc_type_union(pld_spec)
-
-                # XXX should raise an mte (`MsgTypeError`)
-                # when `add_hooks == False` bc the input
-                # `expect_ipc_send` kwarg has a nsp which can't be
-                # serialized!
-                #
-                # TODO:can we ensure this happens from the
-                # `Return`-side (aka the sub) as well?
-                try:
-                    ctx: tractor.Context
-                    ipc: tractor.MsgStream
-                    async with (
-
-                        # XXX should raise an mte (`MsgTypeError`)
-                        # when `add_hooks == False`..
-                        p.open_context(
-                            send_back_values,
-                            # expect_debug=debug_mode,
-                            rent_pld_spec_type_strs=rent_pld_spec_type_strs,
-                            add_hooks=add_hooks,
-                            # expect_ipc_send=expect_ipc_send,
-                        ) as (ctx, first),
-
-                        ctx.open_stream() as ipc,
-                    ):
-                        with (
-                            limit_plds(
-                                pld_spec,
-                                dec_hook=dec_nsp if add_hooks else None,
-                                ext_types=[NamespacePath] if add_hooks else None,
-                            ) as pld_dec,
-                        ):
-                            ctx_pld_dec: MsgDec = ctx._pld_rx._pld_dec
-                            assert pld_dec is ctx_pld_dec
-
-                            # if (
-                            #     not add_hooks
-                            #     and
-                            #     NamespacePath in
-                            # ):
-                            #     pytest.fail('ctx should fail to open without custom enc_hook!?')
-
-                            await ipc.send(nsp)
-                            nsp_rt = await ipc.receive()
-
-                            assert nsp_rt == nsp
-                            assert nsp_rt.load_ref() is ex_func
-
-                # this test passes bc we can go no further!
-                except MsgTypeError as mte:
-                    # if not add_hooks:
-                    #     # teardown nursery
-                    #     await p.cancel_actor()
-                    #     return
-
-                    raise mte
-
-                await p.cancel_actor()
-
-    if (
-        NamespacePath in pld_types
-        and
-        add_hooks
-    ):
-        trio.run(main)
-
-    else:
-        with pytest.raises(
-            expected_exception=tractor.RemoteActorError,
-        ) as excinfo:
-            trio.run(main)
-
-        exc = excinfo.value
-        # bc `.started(nsp: NamespacePath)` will raise
-        assert exc.boxed_type is TypeError
-
-
-# def chk_pld_type(
-#     payload_spec: Type[Struct]|Any,
-#     pld: Any,
-
-#     expect_roundtrip: bool|None = None,
-
-# ) -> bool:
-
-#     pld_val_type: Type = type(pld)
-
-#     # TODO: verify that the overridden subtypes
-#     # DO NOT have modified type-annots from original!
-#     # 'Start',  .pld: FuncSpec
-#     # 'StartAck',  .pld: IpcCtxSpec
-#     # 'Stop',  .pld: UNSEt
-#     # 'Error',  .pld: ErrorData
-
-#     codec: MsgCodec = mk_codec(
-#         # NOTE: this ONLY accepts `PayloadMsg.pld` fields of a specified
-#         # type union.
-#         ipc_pld_spec=payload_spec,
-#     )
-
-#     # make a one-off dec to compare with our `MsgCodec` instance
-#     # which does the below `mk_msg_spec()` call internally
-#     ipc_msg_spec: Union[Type[Struct]]
-#     msg_types: list[PayloadMsg[payload_spec]]
-#     (
-#         ipc_msg_spec,
-#         msg_types,
-#     ) = mk_msg_spec(
-#         payload_type_union=payload_spec,
-#     )
-#     _enc = msgpack.Encoder()
-#     _dec = msgpack.Decoder(
-#         type=ipc_msg_spec or Any,  # like `PayloadMsg[Any]`
-#     )
-
-#     assert (
-#         payload_spec
-#         ==
-#         codec.pld_spec
-#     )
-
-#     # assert codec.dec == dec
-#     #
-#     # ^-XXX-^ not sure why these aren't "equal" but when cast
-#     # to `str` they seem to match ?? .. kk
-
-#     assert (
-#         str(ipc_msg_spec)
-#         ==
-#         str(codec.msg_spec)
-#         ==
-#         str(_dec.type)
-#         ==
-#         str(codec.dec.type)
-#     )
-
-#     # verify the boxed-type for all variable payload-type msgs.
-#     if not msg_types:
-#         breakpoint()
-
-#     roundtrip: bool|None = None
-#     pld_spec_msg_names: list[str] = [
-#         td.__name__ for td in _payload_msgs
-#     ]
-#     for typedef in msg_types:
-
-#         skip_runtime_msg: bool = typedef.__name__ not in pld_spec_msg_names
-#         if skip_runtime_msg:
-#             continue
-
-#         pld_field = structs.fields(typedef)[1]
-#         assert pld_field.type is payload_spec # TODO-^ does this need to work to get all subtypes to adhere?
-
-#         kwargs: dict[str, Any] = {
-#             'cid': '666',
-#             'pld': pld,
-#         }
-#         enc_msg: PayloadMsg = typedef(**kwargs)
-
-#         _wire_bytes: bytes = _enc.encode(enc_msg)
-#         wire_bytes: bytes = codec.enc.encode(enc_msg)
-#         assert _wire_bytes == wire_bytes
-
-#         ve: ValidationError|None = None
-#         try:
-#             dec_msg = codec.dec.decode(wire_bytes)
-#             _dec_msg = _dec.decode(wire_bytes)
-
-#             # decoded msg and thus payload should be exactly same!
-#             assert (roundtrip := (
-#                 _dec_msg
-#                 ==
-#                 dec_msg
-#                 ==
-#                 enc_msg
-#             ))
-
-#             if (
-#                 expect_roundtrip is not None
-#                 and expect_roundtrip != roundtrip
-#             ):
-#                 breakpoint()
-
-#             assert (
-#                 pld
-#                 ==
-#                 dec_msg.pld
-#                 ==
-#                 enc_msg.pld
-#             )
-#             # assert (roundtrip := (_dec_msg == enc_msg))
-
-#         except ValidationError as _ve:
-#             ve = _ve
-#             roundtrip: bool = False
-#             if pld_val_type is payload_spec:
-#                 raise ValueError(
-#                    'Got `ValidationError` despite type-var match!?\n'
-#                     f'pld_val_type: {pld_val_type}\n'
-#                     f'payload_type: {payload_spec}\n'
-#                 ) from ve
-
-#             else:
-#                 # ow we good cuz the pld spec mismatched.
-#                 print(
-#                     'Got expected `ValidationError` since,\n'
-#                     f'{pld_val_type} is not {payload_spec}\n'
-#                 )
-#         else:
-#             if (
-#                 payload_spec is not Any
-#                 and
-#                 pld_val_type is not payload_spec
-#             ):
-#                 raise ValueError(
-#                    'DID NOT `ValidationError` despite expected type match!?\n'
-#                     f'pld_val_type: {pld_val_type}\n'
-#                     f'payload_type: {payload_spec}\n'
-#                 )
-
-#     # full code decode should always be attempted!
-#     if roundtrip is None:
-#         breakpoint()
-
-#     return roundtrip
-
-
-# ?TODO? maybe remove since covered in the newer `test_pldrx_limiting`
-# via end-2-end testing of all this?
-# -[ ] IOW do we really NEED this lowlevel unit testing?
-#
-# def test_limit_msgspec(
-#     debug_mode: bool,
-# ):
-#     '''
-#     Internals unit testing to verify that type-limiting an IPC ctx's
-#     msg spec with `Pldrx.limit_plds()` results in various
-#     encapsulated `msgspec` object settings and state.
-
-#     '''
-#     async def main():
-#         async with tractor.open_root_actor(
-#             debug_mode=debug_mode,
-#         ):
-#             # ensure we can round-trip a boxing `PayloadMsg`
-#             assert chk_pld_type(
-#                 payload_spec=Any,
-#                 pld=None,
-#                 expect_roundtrip=True,
-#             )
-
-#             # verify that a mis-typed payload value won't decode
-#             assert not chk_pld_type(
-#                 payload_spec=int,
-#                 pld='doggy',
-#             )
-
-#             # parametrize the boxed `.pld` type as a custom-struct
-#             # and ensure that parametrization propagates
-#             # to all payload-msg-spec-able subtypes!
-#             class CustomPayload(Struct):
-#                 name: str
-#                 value: Any
-
-#             assert not chk_pld_type(
-#                 payload_spec=CustomPayload,
-#                 pld='doggy',
-#             )
-
-#             assert chk_pld_type(
-#                 payload_spec=CustomPayload,
-#                 pld=CustomPayload(name='doggy', value='urmom')
-#             )
-
-#             # yah, we can `.pause_from_sync()` now!
-#             # breakpoint()
-
-#     trio.run(main)
File diff suppressed because it is too large
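The removed test module above drives tractor's codec wrappers (`mk_codec`, `mk_dec`, `limit_plds`), but the underlying mechanism is plain `msgspec` extension hooks, per the msgspec docs the module itself links. A minimal standalone sketch (the `Point` type and hook bodies here are illustrative, not from the repo):

```python
import msgspec

class Point:
    # a custom type msgspec can't natively (de)serialize
    def __init__(self, x: float, y: float):
        self.x, self.y = x, y

def enc_hook(obj):
    # called for any object the encoder doesn't natively support
    if isinstance(obj, Point):
        return (obj.x, obj.y)  # lower to a native tuple on the wire
    raise NotImplementedError(f'cannot encode {type(obj)}')

def dec_hook(typ, obj):
    # called when the target type isn't natively constructible
    if typ is Point:
        return Point(*obj)
    raise NotImplementedError(f'cannot decode {typ}')

enc = msgspec.msgpack.Encoder(enc_hook=enc_hook)
dec = msgspec.msgpack.Decoder(type=Point, dec_hook=dec_hook)

pt = dec.decode(enc.encode(Point(1.0, 2.0)))
assert (pt.x, pt.y) == (1.0, 2.0)
```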
@@ -16,11 +16,6 @@ from tractor import ( # typing
     Portal,
     Context,
     ContextCancelled,
-    RemoteActorError,
-)
-from tractor._testing import (
-    # tractor_test,
-    expect_ctxc,
 )

 # XXX TODO cases:
@@ -55,10 +50,9 @@ from tractor._testing import (

-
 @tractor.context
-async def open_stream_then_sleep_forever(
+async def sleep_forever(
     ctx: Context,
     expect_ctxc: bool = False,

 ) -> None:
     '''
     Sync the context, open a stream then just sleep.
@@ -68,10 +62,6 @@ async def open_stream_then_sleep_forever(
    '''
    try:
        await ctx.started()
-
-        # NOTE: the below means this child will send a `Stop`
-        # to it's parent-side task despite that side never
-        # opening a stream itself.
        async with ctx.open_stream():
            await trio.sleep_forever()

@@ -105,7 +95,7 @@ async def error_before_started(
     '''
     async with tractor.wait_for_actor('sleeper') as p2:
         async with (
-            p2.open_context(open_stream_then_sleep_forever) as (peer_ctx, first),
+            p2.open_context(sleep_forever) as (peer_ctx, first),
             peer_ctx.open_stream(),
         ):
             # NOTE: this WAS inside an @acm body but i factored it
@@ -166,11 +156,10 @@ def test_do_not_swallow_error_before_started_by_remote_contextcancelled(
     ):
         await trio.sleep_forever()

-    with pytest.raises(RemoteActorError) as excinfo:
+    with pytest.raises(tractor.RemoteActorError) as excinfo:
         trio.run(main)

-    rae = excinfo.value
-    assert rae.boxed_type is TypeError
+    assert excinfo.value.type == TypeError


 @tractor.context
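The hunk above swaps between two spellings of the same check: the child's `TypeError` arrives at the parent boxed inside a `RemoteActorError`, and which attribute exposes the boxed exception class (`.boxed_type` versus the older `.type`) depends on the tractor version, exactly as the +/- lines show. A hedged sketch of the assertion pattern (the helper name is hypothetical):

```python
import pytest
import trio
import tractor

def assert_boxed(main, expect_type: type) -> None:
    # run the async entrypoint and expect the remote error wrapper,
    # then check which exception type it boxes from the subactor.
    with pytest.raises(tractor.RemoteActorError) as excinfo:
        trio.run(main)
    rae = excinfo.value
    assert rae.boxed_type is expect_type  # `.type` on older versions
```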
@@ -190,10 +179,6 @@ async def sleep_a_bit_then_cancel_peer(
     await trio.sleep(cancel_after)
     await peer.cancel_actor()
-
-    # such that we're cancelled by our rent ctx-task
-    await trio.sleep(3)
-    print('CANCELLER RETURNING!')


 @tractor.context
 async def stream_ints(
@@ -209,13 +194,9 @@ async def stream_ints(
 @tractor.context
 async def stream_from_peer(
     ctx: Context,
-    debug_mode: bool,
     peer_name: str = 'sleeper',
 ) -> None:

-    # sanity
-    assert tractor._state.debug_mode() == debug_mode
-
     peer: Portal
     try:
         async with (
@@ -249,54 +230,20 @@ async def stream_from_peer(
                 assert msg is not None
                 print(msg)

-    # NOTE: cancellation of the (sleeper) peer should always cause
-    # a `ContextCancelled` raise in this streaming actor.
-    except ContextCancelled as _ctxc:
-        ctxc = _ctxc
+    # NOTE: cancellation of the (sleeper) peer should always
+    # cause a `ContextCancelled` raise in this streaming
+    # actor.
+    except ContextCancelled as ctxc:
+        ctxerr = ctxc

-        # print("TRYING TO ENTER PAUSSE!!!")
-        # await tractor.pause(shield=True)
-        re: ContextCancelled = peer_ctx._remote_error
+        assert peer_ctx._remote_error is ctxerr
+        assert peer_ctx._remote_error.msgdata == ctxerr.msgdata

-        # XXX YES XXX, remote error should be unpacked only once!
-        assert (
-            re
-            is
-            peer_ctx.maybe_error
-            is
-            ctxc
-            is
-            peer_ctx._local_error
-        )
-        # NOTE: these errors should all match!
-        #   ------ - ------
-        # XXX [2024-05-03] XXX
-        #   ------ - ------
-        # broke this due to a re-raise inside `.msg._ops.drain_to_final_msg()`
-        # where the `Error()` msg was directly raising the ctxc
-        # instead of just returning up to the caller inside
-        # `Context.return()` which would results in a diff instance of
-        # the same remote error bubbling out above vs what was
-        # already unpacked and set inside `Context.
-        assert (
-            peer_ctx._remote_error.msgdata
-            ==
-            ctxc.msgdata
-        )
-        # ^-XXX-^ notice the data is of course the exact same.. so
-        # the above larger assert makes sense to also always be true!
-
-        # XXX YES XXX, bc should be exact same msg instances
-        assert peer_ctx._remote_error._ipc_msg is ctxc._ipc_msg
-
-        # XXX NO XXX, bc new one always created for property accesss
-        assert peer_ctx._remote_error.ipc_msg != ctxc.ipc_msg
-
         # the peer ctx is the canceller even though it's canceller
         # is the "canceller" XD
         assert peer_name in peer_ctx.canceller

-        assert "canceller" in ctxc.canceller
+        assert "canceller" in ctxerr.canceller

         # caller peer should not be the cancel requester
         assert not ctx.cancel_called
@ -320,13 +267,12 @@ async def stream_from_peer(
|
||||||
|
|
||||||
# TODO / NOTE `.canceller` won't have been set yet
|
# TODO / NOTE `.canceller` won't have been set yet
|
||||||
# here because that machinery is inside
|
# here because that machinery is inside
|
||||||
# `Portal.open_context().__aexit__()` BUT, if we had
|
# `.open_context().__aexit__()` BUT, if we had
|
||||||
# a way to know immediately (from the last
|
# a way to know immediately (from the last
|
||||||
# checkpoint) that cancellation was due to
|
# checkpoint) that cancellation was due to
|
||||||
# a remote, we COULD assert this here..see,
|
# a remote, we COULD assert this here..see,
|
||||||
# https://github.com/goodboy/tractor/issues/368
|
# https://github.com/goodboy/tractor/issues/368
|
||||||
#
|
#
|
||||||
# await tractor.pause()
|
|
||||||
# assert 'canceller' in ctx.canceller
|
# assert 'canceller' in ctx.canceller
|
||||||
|
|
||||||
# root/parent actor task should NEVER HAVE cancelled us!
|
# root/parent actor task should NEVER HAVE cancelled us!
|
||||||
|
@ -430,13 +376,12 @@ def test_peer_canceller(
|
||||||
try:
|
try:
|
||||||
async with (
|
async with (
|
||||||
sleeper.open_context(
|
sleeper.open_context(
|
||||||
open_stream_then_sleep_forever,
|
sleep_forever,
|
||||||
expect_ctxc=True,
|
expect_ctxc=True,
|
||||||
) as (sleeper_ctx, sent),
|
) as (sleeper_ctx, sent),
|
||||||
|
|
||||||
just_caller.open_context(
|
just_caller.open_context(
|
||||||
stream_from_peer,
|
stream_from_peer,
|
||||||
debug_mode=debug_mode,
|
|
||||||
) as (caller_ctx, sent),
|
) as (caller_ctx, sent),
|
||||||
|
|
||||||
canceller.open_context(
|
canceller.open_context(
|
||||||
|
@ -462,11 +407,10 @@ def test_peer_canceller(
|
||||||
|
|
||||||
# should always raise since this root task does
|
# should always raise since this root task does
|
||||||
# not request the sleeper cancellation ;)
|
# not request the sleeper cancellation ;)
|
||||||
except ContextCancelled as _ctxc:
|
except ContextCancelled as ctxerr:
|
||||||
ctxc = _ctxc
|
|
||||||
print(
|
print(
|
||||||
'CAUGHT REMOTE CONTEXT CANCEL\n\n'
|
'CAUGHT REMOTE CONTEXT CANCEL\n\n'
|
||||||
f'{ctxc}\n'
|
f'{ctxerr}\n'
|
||||||
)
|
)
|
||||||
|
|
||||||
# canceller and caller peers should not
|
# canceller and caller peers should not
|
||||||
|
@ -477,7 +421,7 @@ def test_peer_canceller(
|
||||||
# we were not the actor, our peer was
|
# we were not the actor, our peer was
|
||||||
assert not sleeper_ctx.cancel_acked
|
assert not sleeper_ctx.cancel_acked
|
||||||
|
|
||||||
assert ctxc.canceller[0] == 'canceller'
|
assert ctxerr.canceller[0] == 'canceller'
|
||||||
|
|
||||||
# XXX NOTE XXX: since THIS `ContextCancelled`
|
# XXX NOTE XXX: since THIS `ContextCancelled`
|
||||||
# HAS NOT YET bubbled up to the
|
# HAS NOT YET bubbled up to the
|
||||||
|
@ -488,7 +432,7 @@ def test_peer_canceller(
|
||||||
|
|
||||||
# CASE_1: error-during-ctxc-handling,
|
# CASE_1: error-during-ctxc-handling,
|
||||||
if error_during_ctxerr_handling:
|
if error_during_ctxerr_handling:
|
||||||
raise RuntimeError('Simulated RTE re-raise during ctxc handling')
|
raise RuntimeError('Simulated error during teardown')
|
||||||
|
|
||||||
# CASE_2: standard teardown inside in `.open_context()` block
|
# CASE_2: standard teardown inside in `.open_context()` block
|
||||||
raise
|
raise
|
||||||
|
@ -553,9 +497,6 @@ def test_peer_canceller(
|
||||||
# should be cancelled by US.
|
# should be cancelled by US.
|
||||||
#
|
#
|
||||||
if error_during_ctxerr_handling:
|
if error_during_ctxerr_handling:
|
||||||
print(f'loc_err: {_loc_err}\n')
|
|
||||||
assert isinstance(loc_err, RuntimeError)
|
|
||||||
|
|
||||||
# since we do a rte reraise above, the
|
# since we do a rte reraise above, the
|
||||||
# `.open_context()` error handling should have
|
# `.open_context()` error handling should have
|
||||||
# raised a local rte, thus the internal
|
# raised a local rte, thus the internal
|
||||||
|
@ -564,6 +505,9 @@ def test_peer_canceller(
|
||||||
# a `trio.Cancelled` due to a local
|
# a `trio.Cancelled` due to a local
|
||||||
# `._scope.cancel()` call.
|
# `._scope.cancel()` call.
|
||||||
assert not sleeper_ctx._scope.cancelled_caught
|
assert not sleeper_ctx._scope.cancelled_caught
|
||||||
|
|
||||||
|
assert isinstance(loc_err, RuntimeError)
|
||||||
|
print(f'_loc_err: {_loc_err}\n')
|
||||||
# assert sleeper_ctx._local_error is _loc_err
|
# assert sleeper_ctx._local_error is _loc_err
|
||||||
# assert sleeper_ctx._local_error is _loc_err
|
# assert sleeper_ctx._local_error is _loc_err
|
||||||
assert not (
|
assert not (
|
||||||
|
@ -600,12 +544,9 @@ def test_peer_canceller(
|
||||||
|
|
||||||
else: # the other 2 ctxs
|
else: # the other 2 ctxs
|
||||||
assert (
|
assert (
|
||||||
isinstance(re, ContextCancelled)
|
re.canceller
|
||||||
and (
|
==
|
||||||
re.canceller
|
canceller.channel.uid
|
||||||
==
|
|
||||||
canceller.channel.uid
|
|
||||||
)
|
|
||||||
)
|
)
|
||||||
|
|
||||||
# since the sleeper errors while handling a
|
# since the sleeper errors while handling a
|
||||||
|
@ -798,16 +739,14 @@ def test_peer_canceller(
|
||||||
with pytest.raises(ContextCancelled) as excinfo:
|
with pytest.raises(ContextCancelled) as excinfo:
|
||||||
trio.run(main)
|
trio.run(main)
|
||||||
|
|
||||||
assert excinfo.value.boxed_type == ContextCancelled
|
assert excinfo.value.type == ContextCancelled
|
||||||
assert excinfo.value.canceller[0] == 'canceller'
|
assert excinfo.value.canceller[0] == 'canceller'
|
||||||
|
|
||||||
|
|
||||||
@tractor.context
|
@tractor.context
|
||||||
async def basic_echo_server(
|
async def basic_echo_server(
|
||||||
ctx: Context,
|
ctx: Context,
|
||||||
peer_name: str = 'wittle_bruv',
|
peer_name: str = 'stepbro',
|
||||||
|
|
||||||
err_after: int|None = None,
|
|
||||||
|
|
||||||
) -> None:
|
) -> None:
|
||||||
'''
|
'''
|
||||||
|
@ -835,30 +774,17 @@ async def basic_echo_server(
|
||||||
# assert 0
|
# assert 0
|
||||||
await ipc.send(resp)
|
await ipc.send(resp)
|
||||||
|
|
||||||
if (
|
|
||||||
err_after
|
|
||||||
and i > err_after
|
|
||||||
):
|
|
||||||
raise RuntimeError(
|
|
||||||
f'Simulated error in `{peer_name}`'
|
|
||||||
)
|
|
||||||
|
|
||||||
|
|
||||||
@tractor.context
|
@tractor.context
|
||||||
async def serve_subactors(
|
async def serve_subactors(
|
||||||
ctx: Context,
|
ctx: Context,
|
||||||
peer_name: str,
|
peer_name: str,
|
||||||
debug_mode: bool,
|
|
||||||
|
|
||||||
) -> None:
|
) -> None:
|
||||||
async with open_nursery() as an:
|
async with open_nursery() as an:
|
||||||
|
|
||||||
# sanity
|
|
||||||
assert tractor._state.debug_mode() == debug_mode
|
|
||||||
|
|
||||||
await ctx.started(peer_name)
|
await ctx.started(peer_name)
|
||||||
async with ctx.open_stream() as ipc:
|
async with ctx.open_stream() as reqs:
|
||||||
async for msg in ipc:
|
async for msg in reqs:
|
||||||
peer_name: str = msg
|
peer_name: str = msg
|
||||||
peer: Portal = await an.start_actor(
|
peer: Portal = await an.start_actor(
|
||||||
name=peer_name,
|
name=peer_name,
|
||||||
|
@ -869,7 +795,7 @@ async def serve_subactors(
|
||||||
f'{peer_name}\n'
|
f'{peer_name}\n'
|
||||||
f'|_{peer}\n'
|
f'|_{peer}\n'
|
||||||
)
|
)
|
||||||
await ipc.send((
|
await reqs.send((
|
||||||
peer.chan.uid,
|
peer.chan.uid,
|
||||||
peer.chan.raddr,
|
peer.chan.raddr,
|
||||||
))
|
))
|
||||||
|
@ -881,20 +807,14 @@ async def serve_subactors(
|
||||||
async def client_req_subactor(
|
async def client_req_subactor(
|
||||||
ctx: Context,
|
ctx: Context,
|
||||||
peer_name: str,
|
peer_name: str,
|
||||||
debug_mode: bool,
|
|
||||||
|
|
||||||
# used to simulate a user causing an error to be raised
|
# used to simulate a user causing an error to be raised
|
||||||
# directly in thread (like a KBI) to better replicate the
|
# directly in thread (like a KBI) to better replicate the
|
||||||
# case where a `modden` CLI client would hang afer requesting
|
# case where a `modden` CLI client would hang afer requesting
|
||||||
# a `Context.cancel()` to `bigd`'s wks spawner.
|
# a `Context.cancel()` to `bigd`'s wks spawner.
|
||||||
reraise_on_cancel: str|None = None,
|
reraise_on_cancel: str|None = None,
|
||||||
sub_err_after: int|None = None,
|
|
||||||
|
|
||||||
) -> None:
|
) -> None:
|
||||||
# sanity
|
|
||||||
if debug_mode:
|
|
||||||
assert tractor._state.debug_mode()
|
|
||||||
|
|
||||||
# TODO: other cases to do with sub lifetimes:
|
# TODO: other cases to do with sub lifetimes:
|
||||||
# -[ ] test that we can have the server spawn a sub
|
# -[ ] test that we can have the server spawn a sub
|
||||||
# that lives longer then ctx with this client.
|
# that lives longer then ctx with this client.
|
||||||
|
@ -916,7 +836,6 @@ async def client_req_subactor(
|
||||||
spawner.open_context(
|
spawner.open_context(
|
||||||
serve_subactors,
|
serve_subactors,
|
||||||
peer_name=peer_name,
|
peer_name=peer_name,
|
||||||
debug_mode=debug_mode,
|
|
||||||
) as (spawner_ctx, first),
|
) as (spawner_ctx, first),
|
||||||
):
|
):
|
||||||
assert first == peer_name
|
assert first == peer_name
|
||||||
|
@ -938,7 +857,6 @@ async def client_req_subactor(
|
||||||
await tell_little_bro(
|
await tell_little_bro(
|
||||||
actor_name=sub_uid[0],
|
actor_name=sub_uid[0],
|
||||||
caller='client',
|
caller='client',
|
||||||
err_after=sub_err_after,
|
|
||||||
)
|
)
|
||||||
|
|
||||||
# TODO: test different scope-layers of
|
# TODO: test different scope-layers of
|
||||||
|
@ -950,7 +868,9 @@ async def client_req_subactor(
|
||||||
# TODO: would be super nice to have a special injected
|
# TODO: would be super nice to have a special injected
|
||||||
# cancel type here (maybe just our ctxc) but using
|
# cancel type here (maybe just our ctxc) but using
|
||||||
# some native mechanism in `trio` :p
|
# some native mechanism in `trio` :p
|
||||||
except trio.Cancelled as err:
|
except (
|
||||||
|
trio.Cancelled
|
||||||
|
) as err:
|
||||||
_err = err
|
_err = err
|
||||||
if reraise_on_cancel:
|
if reraise_on_cancel:
|
||||||
errtype = globals()['__builtins__'][reraise_on_cancel]
|
errtype = globals()['__builtins__'][reraise_on_cancel]
|
||||||
|
@ -977,9 +897,7 @@ async def client_req_subactor(
|
||||||
|
|
||||||
async def tell_little_bro(
|
async def tell_little_bro(
|
||||||
actor_name: str,
|
actor_name: str,
|
||||||
|
caller: str = ''
|
||||||
caller: str = '',
|
|
||||||
err_after: int|None = None,
|
|
||||||
):
|
):
|
||||||
# contact target actor, do a stream dialog.
|
# contact target actor, do a stream dialog.
|
||||||
async with (
|
async with (
|
||||||
|
@ -988,12 +906,10 @@ async def tell_little_bro(
|
||||||
) as lb,
|
) as lb,
|
||||||
lb.open_context(
|
lb.open_context(
|
||||||
basic_echo_server,
|
basic_echo_server,
|
||||||
|
|
||||||
# XXX proxy any delayed err condition
|
|
||||||
err_after=err_after,
|
|
||||||
) as (sub_ctx, first),
|
) as (sub_ctx, first),
|
||||||
|
sub_ctx.open_stream(
|
||||||
sub_ctx.open_stream() as echo_ipc,
|
basic_echo_server,
|
||||||
|
) as echo_ipc,
|
||||||
):
|
):
|
||||||
actor: Actor = current_actor()
|
actor: Actor = current_actor()
|
||||||
uid: tuple = actor.uid
|
uid: tuple = actor.uid
|
||||||
|
@ -1020,15 +936,9 @@ async def tell_little_bro(
|
||||||
'raise_client_error',
|
'raise_client_error',
|
||||||
[None, 'KeyboardInterrupt'],
|
[None, 'KeyboardInterrupt'],
|
||||||
)
|
)
|
||||||
@pytest.mark.parametrize(
|
|
||||||
'raise_sub_spawn_error_after',
|
|
||||||
[None, 50],
|
|
||||||
)
|
|
||||||
def test_peer_spawns_and_cancels_service_subactor(
|
def test_peer_spawns_and_cancels_service_subactor(
|
||||||
debug_mode: bool,
|
debug_mode: bool,
|
||||||
raise_client_error: str,
|
raise_client_error: str,
|
||||||
reg_addr: tuple[str, int],
|
|
||||||
raise_sub_spawn_error_after: int|None,
|
|
||||||
):
|
):
|
||||||
# NOTE: this tests for the modden `mod wks open piker` bug
|
# NOTE: this tests for the modden `mod wks open piker` bug
|
||||||
# discovered as part of implementing workspace ctx
|
# discovered as part of implementing workspace ctx
|
||||||
|
@ -1042,21 +952,10 @@ def test_peer_spawns_and_cancels_service_subactor(
|
||||||
# and the server's spawned child should cancel and terminate!
|
# and the server's spawned child should cancel and terminate!
|
||||||
peer_name: str = 'little_bro'
|
peer_name: str = 'little_bro'
|
||||||
|
|
||||||
def check_inner_rte(rae: RemoteActorError):
|
|
||||||
'''
|
|
||||||
Validate the little_bro's relayed inception!
|
|
||||||
|
|
||||||
'''
|
|
||||||
assert rae.boxed_type is RemoteActorError
|
|
||||||
assert rae.src_type is RuntimeError
|
|
||||||
assert 'client' in rae.relay_uid
|
|
||||||
assert peer_name in rae.src_uid
|
|
||||||
|
|
||||||
async def main():
|
async def main():
|
||||||
async with tractor.open_nursery(
|
async with tractor.open_nursery(
|
||||||
# NOTE: to halt the peer tasks on ctxc, uncomment this.
|
# NOTE: to halt the peer tasks on ctxc, uncomment this.
|
||||||
debug_mode=debug_mode,
|
debug_mode=debug_mode,
|
||||||
registry_addrs=[reg_addr],
|
|
||||||
) as an:
|
) as an:
|
||||||
server: Portal = await an.start_actor(
|
server: Portal = await an.start_actor(
|
||||||
(server_name := 'spawn_server'),
|
(server_name := 'spawn_server'),
|
||||||
|
@ -1075,24 +974,14 @@ def test_peer_spawns_and_cancels_service_subactor(
|
||||||
server.open_context(
|
server.open_context(
|
||||||
serve_subactors,
|
serve_subactors,
|
||||||
peer_name=peer_name,
|
peer_name=peer_name,
|
||||||
debug_mode=debug_mode,
|
|
||||||
|
|
||||||
) as (spawn_ctx, first),
|
) as (spawn_ctx, first),
|
||||||
|
|
||||||
client.open_context(
|
client.open_context(
|
||||||
client_req_subactor,
|
client_req_subactor,
|
||||||
peer_name=peer_name,
|
peer_name=peer_name,
|
||||||
debug_mode=debug_mode,
|
|
||||||
reraise_on_cancel=raise_client_error,
|
reraise_on_cancel=raise_client_error,
|
||||||
|
|
||||||
# trigger for error condition in sub
|
|
||||||
# during streaming.
|
|
||||||
sub_err_after=raise_sub_spawn_error_after,
|
|
||||||
|
|
||||||
) as (client_ctx, client_says),
|
) as (client_ctx, client_says),
|
||||||
):
|
):
|
||||||
root: Actor = current_actor()
|
|
||||||
spawner_uid: tuple = spawn_ctx.chan.uid
|
|
||||||
print(
|
print(
|
||||||
f'Server says: {first}\n'
|
f'Server says: {first}\n'
|
||||||
f'Client says: {client_says}\n'
|
f'Client says: {client_says}\n'
|
||||||
|
@ -1102,7 +991,6 @@ def test_peer_spawns_and_cancels_service_subactor(
|
||||||
# (grandchild of this root actor) "little_bro"
|
# (grandchild of this root actor) "little_bro"
|
||||||
# and ensure we can also use it as an echo
|
# and ensure we can also use it as an echo
|
||||||
# server.
|
# server.
|
||||||
sub: Portal
|
|
||||||
async with tractor.wait_for_actor(
|
async with tractor.wait_for_actor(
|
||||||
name=peer_name,
|
name=peer_name,
|
||||||
) as sub:
|
) as sub:
|
||||||
|
@ -1114,138 +1002,56 @@ def test_peer_spawns_and_cancels_service_subactor(
|
||||||
f'.uid: {sub.actor.uid}\n'
|
f'.uid: {sub.actor.uid}\n'
|
||||||
f'chan.raddr: {sub.chan.raddr}\n'
|
f'chan.raddr: {sub.chan.raddr}\n'
|
||||||
)
|
)
|
||||||
|
await tell_little_bro(
|
||||||
|
actor_name=peer_name,
|
||||||
|
caller='root',
|
||||||
|
)
|
||||||
|
|
||||||
async with expect_ctxc(
|
# signal client to raise a KBI
|
||||||
yay=raise_sub_spawn_error_after,
|
await client_ctx.cancel()
|
||||||
reraise=False,
|
print('root cancelled client, checking that sub-spawn is down')
|
||||||
):
|
|
||||||
await tell_little_bro(
|
|
||||||
actor_name=peer_name,
|
|
||||||
caller='root',
|
|
||||||
)
|
|
||||||
|
|
||||||
if not raise_sub_spawn_error_after:
|
async with tractor.find_actor(
|
||||||
|
name=peer_name,
|
||||||
|
) as sub:
|
||||||
|
assert not sub
|
||||||
|
|
||||||
# signal client to cancel and maybe raise a KBI
|
print('root cancelling server/client sub-actors')
|
||||||
await client_ctx.cancel()
|
|
||||||
print(
|
|
||||||
'-> root cancelling client,\n'
|
|
||||||
'-> root checking `client_ctx.result()`,\n'
|
|
||||||
f'-> checking that sub-spawn {peer_name} is down\n'
|
|
||||||
)
|
|
||||||
|
|
||||||
try:
|
# await tractor.pause()
|
||||||
res = await client_ctx.result(hide_tb=False)
|
res = await client_ctx.result(hide_tb=False)
|
||||||
|
assert isinstance(res, ContextCancelled)
|
||||||
# in remote (relayed inception) error
|
assert client_ctx.cancel_acked
|
||||||
# case, we should error on the line above!
|
assert res.canceller == current_actor().uid
|
||||||
if raise_sub_spawn_error_after:
|
|
||||||
pytest.fail(
|
|
||||||
'Never rxed proxied `RemoteActorError[RuntimeError]` !?'
|
|
||||||
)
|
|
||||||
|
|
||||||
assert isinstance(res, ContextCancelled)
|
|
||||||
assert client_ctx.cancel_acked
|
|
||||||
assert res.canceller == root.uid
|
|
||||||
|
|
||||||
except RemoteActorError as rae:
|
|
||||||
_err = rae
|
|
||||||
assert raise_sub_spawn_error_after
|
|
||||||
|
|
||||||
# since this is a "relayed error" via the client
|
|
||||||
# sub-actor, it is expected to be
|
|
||||||
# a `RemoteActorError` boxing another
|
|
||||||
# `RemoteActorError` otherwise known as
|
|
||||||
# an "inception" (from `trio`'s parlance)
|
|
||||||
# ((or maybe a "Matryoshka" and/or "matron"
|
|
||||||
# in our own working parlance)) which
|
|
||||||
# contains the source error from the
|
|
||||||
# little_bro: a `RuntimeError`.
|
|
||||||
#
|
|
||||||
check_inner_rte(rae)
|
|
||||||
assert rae.relay_uid == client.chan.uid
|
|
||||||
assert rae.src_uid == sub.chan.uid
|
|
||||||
|
|
||||||
assert not client_ctx.cancel_acked
|
|
||||||
assert (
|
|
||||||
client_ctx.maybe_error
|
|
||||||
is client_ctx.outcome
|
|
||||||
is rae
|
|
||||||
)
|
|
||||||
raise
|
|
||||||
# await tractor.pause()
|
|
||||||
|
|
||||||
else:
|
|
||||||
assert not raise_sub_spawn_error_after
|
|
||||||
|
|
||||||
# cancelling the spawner sub should
|
|
||||||
# transitively cancel it's sub, the little
|
|
||||||
# bruv.
|
|
||||||
print('root cancelling server/client sub-actors')
|
|
||||||
await spawn_ctx.cancel()
|
|
||||||
async with tractor.find_actor(
|
|
||||||
name=peer_name,
|
|
||||||
) as sub:
|
|
||||||
assert not sub
|
|
||||||
|
|
||||||
|
await spawn_ctx.cancel()
|
||||||
# await server.cancel_actor()
|
# await server.cancel_actor()
|
||||||
|
|
||||||
except RemoteActorError as rae:
|
|
||||||
# XXX more-or-less same as above handler
|
|
||||||
# this is just making sure the error bubbles out
|
|
||||||
# of the
|
|
||||||
_err = rae
|
|
||||||
assert raise_sub_spawn_error_after
|
|
||||||
raise
|
|
||||||
|
|
||||||
# since we called `.cancel_actor()`, `.cancel_ack`
|
# since we called `.cancel_actor()`, `.cancel_ack`
|
||||||
# will not be set on the ctx bc `ctx.cancel()` was not
|
# will not be set on the ctx bc `ctx.cancel()` was not
|
||||||
# called directly fot this confext.
|
# called directly fot this confext.
|
||||||
except ContextCancelled as ctxc:
|
except ContextCancelled as ctxc:
|
||||||
_ctxc = ctxc
|
print('caught ctxc from contexts!')
|
||||||
print(
|
assert ctxc.canceller == current_actor().uid
|
||||||
f'{root.uid} caught ctxc from ctx with {client_ctx.chan.uid}\n'
|
|
||||||
f'{repr(ctxc)}\n'
|
|
||||||
)
|
|
||||||
|
|
||||||
if not raise_sub_spawn_error_after:
|
|
||||||
assert ctxc.canceller == root.uid
|
|
||||||
else:
|
|
||||||
assert ctxc.canceller == spawner_uid
|
|
||||||
|
|
||||||
assert ctxc is spawn_ctx.outcome
|
assert ctxc is spawn_ctx.outcome
|
||||||
assert ctxc is spawn_ctx.maybe_error
|
assert ctxc is spawn_ctx.maybe_error
|
||||||
raise
|
raise
|
||||||
|
|
||||||
if raise_sub_spawn_error_after:
|
# assert spawn_ctx.cancel_acked
|
||||||
pytest.fail(
|
assert spawn_ctx.cancel_acked
|
||||||
'context block(s) in PARENT never raised?!?'
|
assert client_ctx.cancel_acked
|
||||||
)
|
|
||||||
|
|
||||||
if not raise_sub_spawn_error_after:
|
await client.cancel_actor()
|
||||||
# assert spawn_ctx.cancel_acked
|
await server.cancel_actor()
|
||||||
assert spawn_ctx.cancel_acked
|
|
||||||
assert client_ctx.cancel_acked
|
|
||||||
|
|
||||||
await client.cancel_actor()
|
# WOA WOA WOA! we need this to close..!!!??
|
||||||
await server.cancel_actor()
|
# that's super bad XD
|
||||||
|
|
||||||
# WOA WOA WOA! we need this to close..!!!??
|
# TODO: why isn't this working!?!?
|
||||||
# that's super bad XD
|
# we're now outside the `.open_context()` block so
|
||||||
|
# the internal `Context._scope: CancelScope` should be
|
||||||
|
# gracefully "closed" ;)
|
||||||
|
|
||||||
# TODO: why isn't this working!?!?
|
# assert spawn_ctx.cancelled_caught
|
||||||
# we're now outside the `.open_context()` block so
|
|
||||||
# the internal `Context._scope: CancelScope` should be
|
|
||||||
# gracefully "closed" ;)
|
|
||||||
|
|
||||||
# assert spawn_ctx.cancelled_caught
|
trio.run(main)
|
||||||
|
|
||||||
if raise_sub_spawn_error_after:
|
|
||||||
with pytest.raises(RemoteActorError) as excinfo:
|
|
||||||
trio.run(main)
|
|
||||||
|
|
||||||
rae: RemoteActorError = excinfo.value
|
|
||||||
check_inner_rte(rae)
|
|
||||||
|
|
||||||
else:
|
|
||||||
trio.run(main)
|
|
||||||
|
|
|
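A recurring change in the hunks above is the error-introspection API: `main` (the `-` side) asserts on `RemoteActorError.boxed_type`, while this branch still uses the older `.type` property. A minimal sketch of the newer-side pattern; `bad_target` is a hypothetical endpoint and the nursery wiring is assumed from `tractor`'s public API:

    import pytest
    import trio
    import tractor

    async def bad_target() -> None:
        # hypothetical remote task that errors immediately
        raise RuntimeError('boom')

    def test_boxed_type_sketch():
        async def main():
            async with tractor.open_nursery() as an:
                # run the target in a subactor and wait on its result
                portal = await an.run_in_actor(bad_target)
                await portal.result()

        # the child's error arrives wrapped in a `RemoteActorError`
        with pytest.raises(tractor.RemoteActorError) as excinfo:
            trio.run(main)

        # newer-side API per the hunks above: the boxed (source)
        # exception type is exposed as `.boxed_type`
        assert excinfo.value.boxed_type is RuntimeError
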
@@ -38,13 +38,10 @@ async def async_gen_stream(sequence):
     assert cs.cancelled_caught


-# TODO: deprecated either remove entirely
-# or re-impl in terms of `MsgStream` one-sides
-# wrapper, but at least remove `Portal.open_stream_from()`
 @tractor.stream
 async def context_stream(
     ctx: tractor.Context,
-    sequence: list[int],
+    sequence
 ):
     for i in sequence:
         await ctx.send_yield(i)

@@ -38,7 +38,7 @@ async def test_self_is_registered_localportal(reg_addr):
     "Verify waiting on the arbiter to register itself using a local portal."
     actor = tractor.current_actor()
     assert actor.is_arbiter
-    async with tractor.get_registry(*reg_addr) as portal:
+    async with tractor.get_arbiter(*reg_addr) as portal:
         assert isinstance(portal, tractor._portal.LocalPortal)

         with trio.fail_after(0.2):

@@ -10,7 +10,7 @@ import tractor
 from tractor._testing import (
     tractor_test,
 )
-from .conftest import (
+from conftest import (
     sig_prog,
     _INT_SIGNAL,
     _INT_RETURN_CODE,
@@ -32,7 +32,7 @@ def test_abort_on_sigint(daemon):
 @tractor_test
 async def test_cancel_remote_arbiter(daemon, reg_addr):
     assert not tractor.current_actor().is_arbiter
-    async with tractor.get_registry(*reg_addr) as portal:
+    async with tractor.get_arbiter(*reg_addr) as portal:
         await portal.cancel_actor()

     time.sleep(0.1)
@@ -41,7 +41,7 @@ async def test_cancel_remote_arbiter(daemon, reg_addr):

     # no arbiter socket should exist
     with pytest.raises(OSError):
-        async with tractor.get_registry(*reg_addr) as portal:
+        async with tractor.get_arbiter(*reg_addr) as portal:
             pass

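Every hunk in this file reduces to the same one-line rename: `main`'s `tractor.get_registry()` is still spelled `tractor.get_arbiter()` on this branch. Usage is otherwise identical, e.g. (a sketch; `reg_addr` is assumed to be a `(host, port)` fixture value):

    import tractor

    async def cancel_registrar(reg_addr: tuple[str, int]) -> None:
        # newer-side name; the old spelling was `get_arbiter(*reg_addr)`
        async with tractor.get_registry(*reg_addr) as portal:
            await portal.cancel_actor()
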
@@ -1,364 +0,0 @@
-'''
-Audit sub-sys APIs from `.msg._ops`
-mostly for ensuring correct `contextvars`
-related settings around IPC contexts.
-
-'''
-from contextlib import (
-    asynccontextmanager as acm,
-)
-
-from msgspec import (
-    Struct,
-)
-import pytest
-import trio
-
-import tractor
-from tractor import (
-    Context,
-    MsgTypeError,
-    current_ipc_ctx,
-    Portal,
-)
-from tractor.msg import (
-    _ops as msgops,
-    Return,
-)
-from tractor.msg import (
-    _codec,
-)
-from tractor.msg.types import (
-    log,
-)
-
-
-class PldMsg(
-    Struct,
-
-    # TODO: with multiple structs in-spec we need to tag them!
-    # -[ ] offer a built-in `PldMsg` type to inherit from which takes
-    #      case of these details?
-    #
-    # https://jcristharif.com/msgspec/structs.html#tagged-unions
-    # tag=True,
-    # tag_field='msg_type',
-):
-    field: str
-
-
-maybe_msg_spec = PldMsg|None
-
-
-@acm
-async def maybe_expect_raises(
-    raises: BaseException|None = None,
-    ensure_in_message: list[str]|None = None,
-    post_mortem: bool = False,
-    timeout: int = 3,
-) -> None:
-    '''
-    Async wrapper for ensuring errors propagate from the inner scope.
-
-    '''
-    if tractor._state.debug_mode():
-        timeout += 999
-
-    with trio.fail_after(timeout):
-        try:
-            yield
-        except BaseException as _inner_err:
-            inner_err = _inner_err
-            # wasn't-expected to error..
-            if raises is None:
-                raise
-
-            else:
-                assert type(inner_err) is raises
-
-                # maybe check for error txt content
-                if ensure_in_message:
-                    part: str
-                    err_repr: str = repr(inner_err)
-                    for part in ensure_in_message:
-                        for i, arg in enumerate(inner_err.args):
-                            if part in err_repr:
-                                break
-                        # if part never matches an arg, then we're
-                        # missing a match.
-                        else:
-                            raise ValueError(
-                                'Failed to find error message content?\n\n'
-                                f'expected: {ensure_in_message!r}\n'
-                                f'part: {part!r}\n\n'
-                                f'{inner_err.args}'
-                            )
-
-                if post_mortem:
-                    await tractor.post_mortem()
-
-        else:
-            if raises:
-                raise RuntimeError(
-                    f'Expected a {raises.__name__!r} to be raised?'
-                )
-
-
-@tractor.context(
-    pld_spec=maybe_msg_spec,
-)
-async def child(
-    ctx: Context,
-    started_value: int|PldMsg|None,
-    return_value: str|None,
-    validate_pld_spec: bool,
-    raise_on_started_mte: bool = True,
-
-) -> None:
-    '''
-    Call ``Context.started()`` more then once (an error).
-
-    '''
-    expect_started_mte: bool = started_value == 10
-
-    # sanaity check that child RPC context is the current one
-    curr_ctx: Context = current_ipc_ctx()
-    assert ctx is curr_ctx
-
-    rx: msgops.PldRx = ctx._pld_rx
-    curr_pldec: _codec.MsgDec = rx.pld_dec
-
-    ctx_meta: dict = getattr(
-        child,
-        '_tractor_context_meta',
-        None,
-    )
-    if ctx_meta:
-        assert (
-            ctx_meta['pld_spec']
-            is curr_pldec.spec
-            is curr_pldec.pld_spec
-        )
-
-    # 2 cases: hdndle send-side and recv-only validation
-    # - when `raise_on_started_mte == True`, send validate
-    # - else, parent-recv-side only validation
-    mte: MsgTypeError|None = None
-    try:
-        await ctx.started(
-            value=started_value,
-            validate_pld_spec=validate_pld_spec,
-        )
-
-    except MsgTypeError as _mte:
-        mte = _mte
-        log.exception('started()` raised an MTE!\n')
-        if not expect_started_mte:
-            raise RuntimeError(
-                'Child-ctx-task SHOULD NOT HAVE raised an MTE for\n\n'
-                f'{started_value!r}\n'
-            )
-
-        boxed_div: str = '------ - ------'
-        assert boxed_div not in mte._message
-        assert boxed_div not in mte.tb_str
-        assert boxed_div not in repr(mte)
-        assert boxed_div not in str(mte)
-        mte_repr: str = repr(mte)
-        for line in mte.message.splitlines():
-            assert line in mte_repr
-
-        # since this is a *local error* there should be no
-        # boxed traceback content!
-        assert not mte.tb_str
-
-        # propagate to parent?
-        if raise_on_started_mte:
-            raise
-
-    # no-send-side-error fallthrough
-    if (
-        validate_pld_spec
-        and
-        expect_started_mte
-    ):
-        raise RuntimeError(
-            'Child-ctx-task SHOULD HAVE raised an MTE for\n\n'
-            f'{started_value!r}\n'
-        )
-
-    assert (
-        not expect_started_mte
-        or
-        not validate_pld_spec
-    )
-
-    # if wait_for_parent_to_cancel:
-    #     ...
-    #
-    # ^-TODO-^ logic for diff validation policies on each side:
-    #
-    # -[ ] ensure that if we don't validate on the send
-    #   side, that we are eventually error-cancelled by our
-    #   parent due to the bad `Started` payload!
-    # -[ ] the boxed error should be srced from the parent's
-    #   runtime NOT ours!
-    # -[ ] we should still error on bad `return_value`s
-    #   despite the parent not yet error-cancelling us?
-    #   |_ how do we want the parent side to look in that
-    #     case?
-    #     -[ ] maybe the equiv of "during handling of the
-    #       above error another occurred" for the case where
-    #       the parent sends a MTE to this child and while
-    #       waiting for the child to terminate it gets back
-    #       the MTE for this case?
-    #
-
-    # XXX should always fail on recv side since we can't
-    # really do much else beside terminate and relay the
-    # msg-type-error from this RPC task ;)
-    return return_value
-
-
-@pytest.mark.parametrize(
-    'return_value',
-    [
-        'yo',
-        None,
-    ],
-    ids=[
-        'return[invalid-"yo"]',
-        'return[valid-None]',
-    ],
-)
-@pytest.mark.parametrize(
-    'started_value',
-    [
-        10,
-        PldMsg(field='yo'),
-    ],
-    ids=[
-        'Started[invalid-10]',
-        'Started[valid-PldMsg]',
-    ],
-)
-@pytest.mark.parametrize(
-    'pld_check_started_value',
-    [
-        True,
-        False,
-    ],
-    ids=[
-        'check-started-pld',
-        'no-started-pld-validate',
-    ],
-)
-def test_basic_payload_spec(
-    debug_mode: bool,
-    loglevel: str,
-    return_value: str|None,
-    started_value: int|PldMsg,
-    pld_check_started_value: bool,
-):
-    '''
-    Validate the most basic `PldRx` msg-type-spec semantics around
-    a IPC `Context` endpoint start, started-sync, and final return
-    value depending on set payload types and the currently applied
-    pld-spec.
-
-    '''
-    invalid_return: bool = return_value == 'yo'
-    invalid_started: bool = started_value == 10
-
-    async def main():
-        async with tractor.open_nursery(
-            debug_mode=debug_mode,
-            loglevel=loglevel,
-        ) as an:
-            p: Portal = await an.start_actor(
-                'child',
-                enable_modules=[__name__],
-            )
-
-            # since not opened yet.
-            assert current_ipc_ctx() is None
-
-            if invalid_started:
-                msg_type_str: str = 'Started'
-                bad_value: int = 10
-            elif invalid_return:
-                msg_type_str: str = 'Return'
-                bad_value: str = 'yo'
-            else:
-                # XXX but should never be used below then..
-                msg_type_str: str = ''
-                bad_value: str = ''
-
-            maybe_mte: MsgTypeError|None = None
-            should_raise: Exception|None = (
-                MsgTypeError if (
-                    invalid_return
-                    or
-                    invalid_started
-                ) else None
-            )
-            async with (
-                maybe_expect_raises(
-                    raises=should_raise,
-                    ensure_in_message=[
-                        f"invalid `{msg_type_str}` msg payload",
-                        f'{bad_value}',
-                        f'has type {type(bad_value)!r}',
-                        'not match type-spec',
-                        f'`{msg_type_str}.pld: PldMsg|NoneType`',
-                    ],
-                    # only for debug
-                    # post_mortem=True,
-                ),
-                p.open_context(
-                    child,
-                    return_value=return_value,
-                    started_value=started_value,
-                    validate_pld_spec=pld_check_started_value,
-                ) as (ctx, first),
-            ):
-                # now opened with 'child' sub
-                assert current_ipc_ctx() is ctx
-
-                assert type(first) is PldMsg
-                assert first.field == 'yo'
-
-                try:
-                    res: None|PldMsg = await ctx.result(hide_tb=False)
-                    assert res is None
-                except MsgTypeError as mte:
-                    maybe_mte = mte
-                    if not invalid_return:
-                        raise
-
-                    # expected this invalid `Return.pld` so audit
-                    # the error state + meta-data
-                    assert mte.expected_msg_type is Return
-                    assert mte.cid == ctx.cid
-                    mte_repr: str = repr(mte)
-                    for line in mte.message.splitlines():
-                        assert line in mte_repr
-
-                    assert mte.tb_str
-                    # await tractor.pause(shield=True)
-
-                    # verify expected remote mte deats
-                    assert ctx._local_error is None
-                    assert (
-                        mte is
-                        ctx._remote_error is
-                        ctx.maybe_error is
-                        ctx.outcome
-                    )
-
-            if should_raise is None:
-                assert maybe_mte is None
-
-            await p.cancel_actor()
-
-    trio.run(main)
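The file removed above audits `tractor`'s payload-spec machinery: a `@tractor.context` endpoint can declare a `pld_spec` so that any `Started`/`Return` payload not matching the spec raises a `MsgTypeError` on the validating side. A distilled sketch of the pattern the deleted test exercises (`Point` and `ep` are illustrative names):

    from msgspec import Struct
    import tractor
    from tractor import Context

    class Point(Struct):
        # hypothetical payload struct, mirroring `PldMsg` above
        x: int
        y: int

    @tractor.context(pld_spec=Point|None)
    async def ep(
        ctx: Context,
    ) -> None:
        # a payload outside `Point|None` would instead raise
        # a `MsgTypeError` during spec validation
        await ctx.started(Point(x=1, y=2))
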
@@ -1,248 +0,0 @@
-'''
-Special attention cases for using "infect `asyncio`" mode from a root
-actor; i.e. not using a std `trio.run()` bootstrap.
-
-'''
-import asyncio
-from functools import partial
-
-import pytest
-import trio
-import tractor
-from tractor import (
-    to_asyncio,
-)
-from tests.test_infected_asyncio import (
-    aio_echo_server,
-)
-
-
-@pytest.mark.parametrize(
-    'raise_error_mid_stream',
-    [
-        False,
-        Exception,
-        KeyboardInterrupt,
-    ],
-    ids='raise_error={}'.format,
-)
-def test_infected_root_actor(
-    raise_error_mid_stream: bool|Exception,
-
-    # conftest wide
-    loglevel: str,
-    debug_mode: bool,
-):
-    '''
-    Verify you can run the `tractor` runtime with `Actor.is_infected_aio() == True`
-    in the root actor.
-
-    '''
-    async def _trio_main():
-        with trio.fail_after(2 if not debug_mode else 999):
-            first: str
-            chan: to_asyncio.LinkedTaskChannel
-            async with (
-                tractor.open_root_actor(
-                    debug_mode=debug_mode,
-                    loglevel=loglevel,
-                ),
-                to_asyncio.open_channel_from(
-                    aio_echo_server,
-                ) as (first, chan),
-            ):
-                assert first == 'start'
-
-                for i in range(1000):
-                    await chan.send(i)
-                    out = await chan.receive()
-                    assert out == i
-                    print(f'asyncio echoing {i}')
-
-                    if (
-                        raise_error_mid_stream
-                        and
-                        i == 500
-                    ):
-                        raise raise_error_mid_stream
-
-                    if out is None:
-                        try:
-                            out = await chan.receive()
-                        except trio.EndOfChannel:
-                            break
-                        else:
-                            raise RuntimeError(
-                                'aio channel never stopped?'
-                            )
-
-    if raise_error_mid_stream:
-        with pytest.raises(raise_error_mid_stream):
-            tractor.to_asyncio.run_as_asyncio_guest(
-                trio_main=_trio_main,
-            )
-    else:
-        tractor.to_asyncio.run_as_asyncio_guest(
-            trio_main=_trio_main,
-        )
-
-
-
-async def sync_and_err(
-    # just signature placeholders for compat with
-    # ``to_asyncio.open_channel_from()``
-    to_trio: trio.MemorySendChannel,
-    from_trio: asyncio.Queue,
-    ev: asyncio.Event,
-
-):
-    if to_trio:
-        to_trio.send_nowait('start')
-
-    await ev.wait()
-    raise RuntimeError('asyncio-side')
-
-
-@pytest.mark.parametrize(
-    'aio_err_trigger',
-    [
-        'before_start_point',
-        'after_trio_task_starts',
-        'after_start_point',
-    ],
-    ids='aio_err_triggered={}'.format
-)
-def test_trio_prestarted_task_bubbles(
-    aio_err_trigger: str,
-
-    # conftest wide
-    loglevel: str,
-    debug_mode: bool,
-):
-    async def pre_started_err(
-        raise_err: bool = False,
-        pre_sleep: float|None = None,
-        aio_trigger: asyncio.Event|None = None,
-        task_status=trio.TASK_STATUS_IGNORED,
-    ):
-        '''
-        Maybe pre-started error then sleep.
-
-        '''
-        if pre_sleep is not None:
-            print(f'Sleeping from trio for {pre_sleep!r}s !')
-            await trio.sleep(pre_sleep)
-
-        # signal aio-task to raise JUST AFTER this task
-        # starts but has not yet `.started()`
-        if aio_trigger:
-            print('Signalling aio-task to raise from `trio`!!')
-            aio_trigger.set()
-
-        if raise_err:
-            print('Raising from trio!')
-            raise TypeError('trio-side')
-
-        task_status.started()
-        await trio.sleep_forever()
-
-    async def _trio_main():
-        # with trio.fail_after(2):
-        with trio.fail_after(999):
-            first: str
-            chan: to_asyncio.LinkedTaskChannel
-            aio_ev = asyncio.Event()
-
-            async with (
-                tractor.open_root_actor(
-                    debug_mode=False,
-                    loglevel=loglevel,
-                ),
-            ):
-                # TODO, tests for this with 3.13 egs?
-                # from tractor.devx import open_crash_handler
-                # with open_crash_handler():
-                async with (
-                    # where we'll start a sub-task that errors BEFORE
-                    # calling `.started()` such that the error should
-                    # bubble before the guest run terminates!
-                    trio.open_nursery() as tn,
-
-                    # THEN start an infect task which should error just
-                    # after the trio-side's task does.
-                    to_asyncio.open_channel_from(
-                        partial(
-                            sync_and_err,
-                            ev=aio_ev,
-                        )
-                    ) as (first, chan),
-                ):
-
-                    for i in range(5):
-                        pre_sleep: float|None = None
-                        last_iter: bool = (i == 4)
-
-                        # TODO, missing cases?
-                        # -[ ] error as well on
-                        #    'after_start_point' case as well for
-                        #    another case?
-                        raise_err: bool = False
-
-                        if last_iter:
-                            raise_err: bool = True
-
-                        # trigger aio task to error on next loop
-                        # tick/checkpoint
-                        if aio_err_trigger == 'before_start_point':
-                            aio_ev.set()
-
-                            pre_sleep: float = 0
-
-                        await tn.start(
-                            pre_started_err,
-                            raise_err,
-                            pre_sleep,
-                            (aio_ev if (
-                                    aio_err_trigger == 'after_trio_task_starts'
-                                    and
-                                    last_iter
-                                ) else None
-                            ),
-                        )
-
-                        if (
-                            aio_err_trigger == 'after_start_point'
-                            and
-                            last_iter
-                        ):
-                            aio_ev.set()
-
-    with pytest.raises(
-        expected_exception=ExceptionGroup,
-    ) as excinfo:
-        tractor.to_asyncio.run_as_asyncio_guest(
-            trio_main=_trio_main,
-        )
-
-    eg = excinfo.value
-    rte_eg, rest_eg = eg.split(RuntimeError)
-
-    # ensure the trio-task's error bubbled despite the aio-side
-    # having (maybe) errored first.
-    if aio_err_trigger in (
-        'after_trio_task_starts',
-        'after_start_point',
-    ):
-        assert len(errs := rest_eg.exceptions) == 1
-        typerr = errs[0]
-        assert (
-            type(typerr) is TypeError
-            and
-            'trio-side' in typerr.args
-        )
-
-    # when aio errors BEFORE (last) trio task is scheduled, we should
-    # never see anythinb but the aio-side.
-    else:
-        assert len(rtes := rte_eg.exceptions) == 1
-        assert 'asyncio-side' in rtes[0].args[0]
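This second deleted file covers "infected asyncio" mode in the root actor: instead of bootstrapping via `trio.run()`, the `trio` side is booted as a guest run on top of an `asyncio` event loop via `run_as_asyncio_guest()`. The skeleton, reduced from the test above:

    import tractor
    from tractor import to_asyncio

    async def _trio_main():
        # normal root-actor setup, but scheduled as a guest run
        async with tractor.open_root_actor():
            ...

    # drive the whole thing from an `asyncio` loop instead of
    # calling `trio.run(_trio_main)` directly
    tractor.to_asyncio.run_as_asyncio_guest(
        trio_main=_trio_main,
    )
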
@@ -15,19 +15,9 @@ async def sleep_back_actor(
     func_name,
     func_defined,
     exposed_mods,
-    *,
-    reg_addr: tuple,
 ):
     if actor_name:
-        async with tractor.find_actor(
-            actor_name,
-
-            # NOTE: must be set manually since
-            # the subactor doesn't have the reg_addr
-            # fixture code run in it!
-            # TODO: maybe we should just set this once in the
-            # _state mod and derive to all children?
-            registry_addrs=[reg_addr],
-        ) as portal:
+        async with tractor.find_actor(actor_name) as portal:
             try:
                 await portal.run(__name__, func_name)
             except tractor.RemoteActorError as err:
@@ -36,7 +26,7 @@ async def sleep_back_actor(
             if not exposed_mods:
                 expect = tractor.ModuleNotExposed

-            assert err.boxed_type is expect
+            assert err.type is expect
             raise
     else:
         await trio.sleep(float('inf'))
@@ -62,17 +52,11 @@ async def short_sleep():
         'fail_on_syntax',
     ],
 )
-def test_rpc_errors(
-    reg_addr,
-    to_call,
-    testdir,
-):
-    '''
-    Test errors when making various RPC requests to an actor
+def test_rpc_errors(reg_addr, to_call, testdir):
+    """Test errors when making various RPC requests to an actor
     that either doesn't have the requested module exposed or doesn't define
     the named function.
-
-    '''
+    """
     exposed_mods, funcname, inside_err = to_call
     subactor_exposed_mods = []
     func_defined = globals().get(funcname, False)
@@ -100,13 +84,8 @@ def test_rpc_errors(

     # spawn a subactor which calls us back
     async with tractor.open_nursery(
-        registry_addrs=[reg_addr],
+        arbiter_addr=reg_addr,
         enable_modules=exposed_mods.copy(),
-
-        # NOTE: will halt test in REPL if uncommented, so only
-        # do that if actually debugging subactor but keep it
-        # disabled for the test.
-        # debug_mode=True,
     ) as n:

         actor = tractor.current_actor()
@@ -123,7 +102,6 @@ def test_rpc_errors(
             exposed_mods=exposed_mods,
             func_defined=True if func_defined else False,
             enable_modules=subactor_exposed_mods,
-            reg_addr=reg_addr,
         )

     def run():
@@ -150,4 +128,4 @@ def test_rpc_errors(
         ))

         if getattr(value, 'type', None):
-            assert value.boxed_type is inside_err
+            assert value.type is inside_err

@@ -2,9 +2,7 @@
 Spawning basics

 """
-from typing import (
-    Any,
-)
+from typing import Optional

 import pytest
 import trio
@@ -27,12 +25,15 @@ async def spawn(
     async with tractor.open_root_actor(
         arbiter_addr=reg_addr,
     ):

         actor = tractor.current_actor()
         assert actor.is_arbiter == is_arbiter
         data = data_to_pass_down

         if actor.is_arbiter:
-            async with tractor.open_nursery() as nursery:
+            async with tractor.open_nursery(
+            ) as nursery:

                 # forks here
                 portal = await nursery.run_in_actor(
@@ -54,9 +55,7 @@ async def spawn(
     return 10


-def test_local_arbiter_subactor_global_state(
-    reg_addr,
-):
+def test_local_arbiter_subactor_global_state(reg_addr):
     result = trio.run(
         spawn,
         True,
@@ -95,9 +94,7 @@ async def test_movie_theatre_convo(start_method):
     await portal.cancel_actor()


-async def cellar_door(
-    return_value: str|None,
-):
+async def cellar_door(return_value: Optional[str]):
     return return_value


@@ -107,18 +104,16 @@ async def cellar_door(
 )
 @tractor_test
 async def test_most_beautiful_word(
-    start_method: str,
-    return_value: Any,
-    debug_mode: bool,
+    start_method,
+    return_value
 ):
     '''
     The main ``tractor`` routine.

     '''
     with trio.fail_after(1):
-        async with tractor.open_nursery(
-            debug_mode=debug_mode,
-        ) as n:
+        async with tractor.open_nursery() as n:
             portal = await n.run_in_actor(
                 cellar_door,
                 return_value=return_value,

@@ -2,9 +2,7 @@
 Broadcast channels for fan-out to local tasks.

 """
-from contextlib import (
-    asynccontextmanager as acm,
-)
+from contextlib import asynccontextmanager
 from functools import partial
 from itertools import cycle
 import time
@@ -17,7 +15,6 @@ import tractor
 from tractor.trionics import (
     broadcast_receiver,
     Lagged,
-    collapse_eg,
 )


@@ -65,7 +62,7 @@ async def ensure_sequence(
         break


-@acm
+@asynccontextmanager
 async def open_sequence_streamer(

     sequence: list[int],
@@ -77,9 +74,9 @@ async def open_sequence_streamer(
     async with tractor.open_nursery(
         arbiter_addr=reg_addr,
         start_method=start_method,
-    ) as an:
+    ) as tn:

-        portal = await an.start_actor(
+        portal = await tn.start_actor(
             'sequence_echoer',
             enable_modules=[__name__],
         )
@@ -158,12 +155,9 @@ def test_consumer_and_parent_maybe_lag(
     ) as stream:

         try:
-            async with (
-                collapse_eg(),
-                trio.open_nursery() as tn,
-            ):
+            async with trio.open_nursery() as n:

-                tn.start_soon(
+                n.start_soon(
                     ensure_sequence,
                     stream,
                     sequence.copy(),
@@ -236,8 +230,8 @@ def test_faster_task_to_recv_is_cancelled_by_slower(

     ) as stream:

-        async with trio.open_nursery() as tn:
-            tn.start_soon(
+        async with trio.open_nursery() as n:
+            n.start_soon(
                 ensure_sequence,
                 stream,
                 sequence.copy(),
@@ -259,7 +253,7 @@ def test_faster_task_to_recv_is_cancelled_by_slower(
             continue

         print('cancelling faster subtask')
-        tn.cancel_scope.cancel()
+        n.cancel_scope.cancel()

         try:
             value = await stream.receive()
@@ -277,7 +271,7 @@ def test_faster_task_to_recv_is_cancelled_by_slower(
             # the faster subtask was cancelled
             break

-        # await tractor.pause()
+        # await tractor.breakpoint()
         # await stream.receive()
         print(f'final value: {value}')

@@ -377,13 +371,13 @@ def test_ensure_slow_consumers_lag_out(
             f'on {lags}:{value}')
             return

-        async with trio.open_nursery() as tn:
+        async with trio.open_nursery() as nursery:

             for i in range(1, num_laggers):

                 task_name = f'sub_{i}'
                 laggers[task_name] = 0
-                tn.start_soon(
+                nursery.start_soon(
                     partial(
                         sub_and_print,
                         delay=i*0.001,
@@ -503,7 +497,6 @@ def test_no_raise_on_lag():
                     # internals when the no raise flag is set.
                     loglevel='warning',
                 ),
-                collapse_eg(),
                 trio.open_nursery() as n,
             ):
                 n.start_soon(slow)
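These hunks mostly rename nursery variables (`tn` vs `n`/`nursery`) and drop the `collapse_eg()` wrapper, but the subject under test is `tractor.trionics.broadcast_receiver`: one receive channel fanned out to multiple subscriber tasks, with a too-slow subscriber seeing `Lagged`. A rough usage sketch (the buffer sizes and channel wiring are assumptions):

    import trio
    from tractor.trionics import broadcast_receiver, Lagged

    async def consume() -> None:
        tx, rx = trio.open_memory_channel(8)
        # wrap the receive side for multi-task fan-out
        brx = broadcast_receiver(rx, 8)

        async with brx.subscribe() as sub:
            try:
                async for value in sub:
                    print(value)
            except Lagged:
                # this subscriber fell too far behind the fastest one
                pass
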
@@ -3,10 +3,6 @@ Reminders for oddities in `trio` that we need to stay aware of and/or
 want to see changed.

 '''
-from contextlib import (
-    asynccontextmanager as acm,
-)
-
 import pytest
 import trio
 from trio import TaskStatus
@@ -64,9 +60,7 @@ def test_stashed_child_nursery(use_start_soon):
     async def main():

         async with (
-            trio.open_nursery(
-                strict_exception_groups=False,
-            ) as pn,
+            trio.open_nursery() as pn,
         ):
             cn = await pn.start(mk_child_nursery)
             assert cn
@@ -86,118 +80,3 @@ def test_stashed_child_nursery(use_start_soon):

     with pytest.raises(NameError):
         trio.run(main)
-
-
-@pytest.mark.parametrize(
-    ('unmask_from_canc', 'canc_from_finally'),
-    [
-        (True, False),
-        (True, True),
-        pytest.param(False, True,
-            marks=pytest.mark.xfail(reason="never raises!")
-        ),
-    ],
-    # TODO, ask ronny how to impl this .. XD
-    # ids='unmask_from_canc={0}, canc_from_finally={1}',#.format,
-)
-def test_acm_embedded_nursery_propagates_enter_err(
-    canc_from_finally: bool,
-    unmask_from_canc: bool,
-    debug_mode: bool,
-):
-    '''
-    Demo how a masking `trio.Cancelled` could be handled by unmasking from the
-    `.__context__` field when a user (by accident) re-raises from a `finally:`.
-
-    '''
-    import tractor
-
-    @acm
-    async def maybe_raise_from_masking_exc(
-        tn: trio.Nursery,
-        unmask_from: BaseException|None = trio.Cancelled
-
-        # TODO, maybe offer a collection?
-        # unmask_from: set[BaseException] = {
-        #     trio.Cancelled,
-        # },
-    ):
-        if not unmask_from:
-            yield
-            return
-
-        try:
-            yield
-        except* unmask_from as be_eg:
-
-            # TODO, if we offer `unmask_from: set`
-            # for masker_exc_type in unmask_from:
-
-            matches, rest = be_eg.split(unmask_from)
-            if not matches:
-                raise
-
-            for exc_match in be_eg.exceptions:
-                if (
-                    (exc_ctx := exc_match.__context__)
-                    and
-                    type(exc_ctx) not in {
-                        # trio.Cancelled, # always by default?
-                        unmask_from,
-                    }
-                ):
-                    exc_ctx.add_note(
-                        f'\n'
-                        f'WARNING: the above error was masked by a {unmask_from!r} !?!\n'
-                        f'Are you always cancelling? Say from a `finally:` ?\n\n'
-
-                        f'{tn!r}'
-                    )
-                    raise exc_ctx from exc_match
-
-
-    @acm
-    async def wraps_tn_that_always_cancels():
-        async with (
-            trio.open_nursery() as tn,
-            maybe_raise_from_masking_exc(
-                tn=tn,
-                unmask_from=(
-                    trio.Cancelled
-                    if unmask_from_canc
-                    else None
-                ),
-            )
-        ):
-            try:
-                yield tn
-            finally:
-                if canc_from_finally:
-                    tn.cancel_scope.cancel()
-                    await trio.lowlevel.checkpoint()
-
-    async def _main():
-        with tractor.devx.maybe_open_crash_handler(
-            pdb=debug_mode,
-        ) as bxerr:
-            assert not bxerr.value
-
-            async with (
-                wraps_tn_that_always_cancels() as tn,
-            ):
-                assert not tn.cancel_scope.cancel_called
-                assert 0
-
-        assert (
-            (err := bxerr.value)
-            and
-            type(err) is AssertionError
-        )
-
-    with pytest.raises(ExceptionGroup) as excinfo:
-        trio.run(_main)
-
-    eg: ExceptionGroup = excinfo.value
-    assert_eg, rest_eg = eg.split(AssertionError)
-
-    assert len(assert_eg.exceptions) == 1
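Note: the `test_acm_embedded_nursery_propagates_enter_err` test dropped above exercises exception "masking": re-raising from a `finally:` hides the in-flight error on the new exception's `.__context__`. A minimal standalone sketch of just that mechanic (the names and messages below are illustrative, not from the test suite):

    import trio

    async def main() -> None:
        try:
            try:
                raise ValueError('the real error')
            finally:
                # raising from a `finally:` while the `ValueError` is in
                # flight chains it onto the new exc's `.__context__`
                raise RuntimeError('masks the error above')
        except RuntimeError as exc:
            masked = exc.__context__
            assert type(masked) is ValueError
            # "unmask" by re-raising the original from the masker
            raise masked from exc

    try:
        trio.run(main)
    except ValueError as err:
        print(f'unmasked: {err!r}')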
@@ -18,53 +18,76 @@
 tractor: structured concurrent ``trio``-"actors".

 """
+from exceptiongroup import BaseExceptionGroup

-from ._clustering import (
-    open_actor_cluster as open_actor_cluster,
-)
+from ._clustering import open_actor_cluster
 from ._context import (
-    Context as Context,  # the type
-    context as context,  # a func-decorator
+    Context,  # the type
+    context,  # a func-decorator
 )
 from ._streaming import (
-    MsgStream as MsgStream,
-    stream as stream,
+    MsgStream,
+    stream,
 )
 from ._discovery import (
-    get_registry as get_registry,
-    find_actor as find_actor,
-    wait_for_actor as wait_for_actor,
-    query_actor as query_actor,
-)
-from ._supervise import (
-    open_nursery as open_nursery,
-    ActorNursery as ActorNursery,
+    get_arbiter,
+    find_actor,
+    wait_for_actor,
+    query_actor,
 )
+from ._supervise import open_nursery
 from ._state import (
-    current_actor as current_actor,
-    is_root_process as is_root_process,
-    current_ipc_ctx as current_ipc_ctx,
-    debug_mode as debug_mode
+    current_actor,
+    is_root_process,
 )
 from ._exceptions import (
-    ContextCancelled as ContextCancelled,
-    ModuleNotExposed as ModuleNotExposed,
-    MsgTypeError as MsgTypeError,
-    RemoteActorError as RemoteActorError,
-    TransportClosed as TransportClosed,
+    RemoteActorError,
+    ModuleNotExposed,
+    ContextCancelled,
 )
 from .devx import (
-    breakpoint as breakpoint,
-    pause as pause,
-    pause_from_sync as pause_from_sync,
-    post_mortem as post_mortem,
+    breakpoint,
+    pause,
+    pause_from_sync,
+    post_mortem,
 )
-from . import msg as msg
+from . import msg
 from ._root import (
-    run_daemon as run_daemon,
-    open_root_actor as open_root_actor,
+    run_daemon,
+    open_root_actor,
 )
-from ._ipc import Channel as Channel
-from ._portal import Portal as Portal
-from ._runtime import Actor as Actor
-# from . import hilevel as hilevel
+from ._ipc import Channel
+from ._portal import Portal
+from ._runtime import Actor

+__all__ = [
+    'Actor',
+    'BaseExceptionGroup',
+    'Channel',
+    'Context',
+    'ContextCancelled',
+    'ModuleNotExposed',
+    'MsgStream',
+    'Portal',
+    'RemoteActorError',
+    'breakpoint',
+    'context',
+    'current_actor',
+    'find_actor',
+    'query_actor',
+    'get_arbiter',
+    'is_root_process',
+    'msg',
+    'open_actor_cluster',
+    'open_nursery',
+    'open_root_actor',
+    'pause',
+    'post_mortem',
+    'pause_from_sync',
+    'query_actor',
+    'run_daemon',
+    'stream',
+    'to_asyncio',
+    'wait_for_actor',
+]
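Note: the two sides of this hunk reflect two re-export conventions. `main` uses the redundant `import X as X` alias form, which type checkers treat as an explicit re-export per PEP 484, while this branch keeps plain imports plus an `__all__` list. A tiny standalone illustration using stdlib names (not tractor's):

    # style used on `main`: a redundant alias marks an intentional
    # re-export for type checkers (eg. mypy's --no-implicit-reexport)
    from os.path import join as join

    # style used on this branch: plain imports plus `__all__`, which
    # also bounds what `from <pkg> import *` exposes
    from os.path import basename

    __all__ = [
        'basename',
        'join',
    ]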
@@ -36,7 +36,6 @@ def parse_ipaddr(arg):


 if __name__ == "__main__":
-    __tracebackhide__: bool = True

     parser = argparse.ArgumentParser()
     parser.add_argument("--uid", type=parse_uid)
@@ -19,13 +19,10 @@ Actor cluster helpers.

 '''
 from __future__ import annotations
-from contextlib import (
-    asynccontextmanager as acm,
-)
+from contextlib import asynccontextmanager as acm
 from multiprocessing import cpu_count
-from typing import (
-    AsyncGenerator,
-)
+from typing import AsyncGenerator, Optional

 import trio
 import tractor
tractor/_context.py (1833 changed lines)
File diff suppressed because it is too large
@@ -15,71 +15,52 @@
 # along with this program.  If not, see <https://www.gnu.org/licenses/>.

 """
-Discovery (protocols) API for automatic addressing and location
-management of (service) actors.
+Actor discovery API.

 """
-from __future__ import annotations
 from typing import (
+    Optional,
+    Union,
     AsyncGenerator,
-    AsyncContextManager,
-    TYPE_CHECKING,
 )
 from contextlib import asynccontextmanager as acm

-from tractor.log import get_logger
-from .trionics import gather_contexts
 from ._ipc import _connect_chan, Channel
 from ._portal import (
     Portal,
     open_portal,
     LocalPortal,
 )
-from ._state import (
-    current_actor,
-    _runtime_vars,
-)
-
-if TYPE_CHECKING:
-    from ._runtime import Actor
-
-
-log = get_logger(__name__)
+from ._state import current_actor, _runtime_vars


 @acm
-async def get_registry(
-
+async def get_arbiter(
     host: str,
     port: int,
-
-) -> AsyncGenerator[
-    Portal | LocalPortal | None,
-    None,
-]:
+) -> AsyncGenerator[Union[Portal, LocalPortal], None]:
     '''
     Return a portal instance connected to a local or remote
-    registry-service actor; if a connection already exists re-use it
-    (presumably to call a `.register_actor()` registry runtime RPC
-    ep).
+    arbiter.

     '''
-    actor: Actor = current_actor()
-    if actor.is_registrar:
+    actor = current_actor()
+
+    if not actor:
+        raise RuntimeError("No actor instance has been defined yet?")
+
+    if actor.is_arbiter:
         # we're already the arbiter
         # (likely a re-entrant call from the arbiter actor)
-        yield LocalPortal(
-            actor,
-            Channel((host, port))
-        )
+        yield LocalPortal(actor, Channel((host, port)))
     else:
-        # TODO: try to look pre-existing connection from
-        # `Actor._peers` and use it instead?
-        async with (
-            _connect_chan(host, port) as chan,
-            open_portal(chan) as regstr_ptl,
-        ):
-            yield regstr_ptl
+        async with _connect_chan(host, port) as chan:
+
+            async with open_portal(chan) as arb_portal:
+
+                yield arb_portal


 @acm
@@ -87,125 +68,51 @@ async def get_root(
     **kwargs,
 ) -> AsyncGenerator[Portal, None]:

-    # TODO: rename mailbox to `_root_maddr` when we finally
-    # add and impl libp2p multi-addrs?
     host, port = _runtime_vars['_root_mailbox']
     assert host is not None

-    async with (
-        _connect_chan(host, port) as chan,
-        open_portal(chan, **kwargs) as portal,
-    ):
-        yield portal
-
-
-def get_peer_by_name(
-    name: str,
-    # uuid: str|None = None,
-
-) -> list[Channel]|None:  # at least 1
-    '''
-    Scan for an existing connection (set) to a named actor
-    and return any channels from `Actor._peers`.
-
-    This is an optimization method over querying the registrar for
-    the same info.
-
-    '''
-    actor: Actor = current_actor()
-    to_scan: dict[tuple, list[Channel]] = actor._peers.copy()
-    pchan: Channel|None = actor._parent_chan
-    if pchan:
-        to_scan[pchan.uid].append(pchan)
-
-    for aid, chans in to_scan.items():
-        _, peer_name = aid
-        if name == peer_name:
-            if not chans:
-                log.warning(
-                    'No IPC chans for matching peer {peer_name}\n'
-                )
-                continue
-            return chans
-
-    return None
+    async with _connect_chan(host, port) as chan:
+        async with open_portal(chan, **kwargs) as portal:
+            yield portal


 @acm
 async def query_actor(
     name: str,
-    regaddr: tuple[str, int]|None = None,
+    arbiter_sockaddr: Optional[tuple[str, int]] = None,

-) -> AsyncGenerator[
-    tuple[str, int]|None,
-    None,
-]:
+) -> AsyncGenerator[tuple[str, int], None]:
     '''
-    Lookup a transport address (by actor name) via querying a registrar
-    listening @ `regaddr`.
+    Simple address lookup for a given actor name.

-    Returns the transport protocol (socket) address or `None` if no
-    entry under that name exists.
+    Returns the (socket) address or ``None``.

     '''
-    actor: Actor = current_actor()
-    if (
-        name == 'registrar'
-        and actor.is_registrar
-    ):
-        raise RuntimeError(
-            'The current actor IS the registry!?'
-        )
-
-    maybe_peers: list[Channel]|None = get_peer_by_name(name)
-    if maybe_peers:
-        yield maybe_peers[0].raddr
-        return
-
-    reg_portal: Portal
-    regaddr: tuple[str, int] = regaddr or actor.reg_addrs[0]
-    async with get_registry(*regaddr) as reg_portal:
-        # TODO: return portals to all available actors - for now
-        # just the last one that registered
-        sockaddr: tuple[str, int] = await reg_portal.run_from_ns(
+    actor = current_actor()
+    async with get_arbiter(
+        *arbiter_sockaddr or actor._arb_addr
+    ) as arb_portal:
+
+        sockaddr = await arb_portal.run_from_ns(
             'self',
             'find_actor',
             name=name,
         )
-        yield sockaddr
-
-
-@acm
-async def maybe_open_portal(
-    addr: tuple[str, int],
-    name: str,
-):
-    async with query_actor(
-        name=name,
-        regaddr=addr,
-    ) as sockaddr:
-        pass
-
-    if sockaddr:
-        async with _connect_chan(*sockaddr) as chan:
-            async with open_portal(chan) as portal:
-                yield portal
-    else:
-        yield None
+
+        # TODO: return portals to all available actors - for now just
+        # the last one that registered
+        if name == 'arbiter' and actor.is_arbiter:
+            raise RuntimeError("The current actor is the arbiter")
+
+        yield sockaddr if sockaddr else None


 @acm
 async def find_actor(
     name: str,
-    registry_addrs: list[tuple[str, int]]|None = None,
-
-    only_first: bool = True,
-    raise_on_none: bool = False,
-
-) -> AsyncGenerator[
-    Portal | list[Portal] | None,
-    None,
-]:
+    arbiter_sockaddr: tuple[str, int] | None = None
+
+) -> AsyncGenerator[Optional[Portal], None]:
     '''
     Ask the arbiter to find actor(s) by name.

@@ -213,102 +120,43 @@ async def find_actor(
     known to the arbiter.

     '''
-    # optimization path, use any pre-existing peer channel
-    maybe_peers: list[Channel]|None = get_peer_by_name(name)
-    if maybe_peers and only_first:
-        async with open_portal(maybe_peers[0]) as peer_portal:
-            yield peer_portal
-            return
-
-    if not registry_addrs:
-        # XXX NOTE: make sure to dynamically read the value on
-        # every call since something may change it globally (eg.
-        # like in our discovery test suite)!
-        from . import _root
-        registry_addrs = (
-            _runtime_vars['_registry_addrs']
-            or
-            _root._default_lo_addrs
-        )
-
-    maybe_portals: list[
-        AsyncContextManager[tuple[str, int]]
-    ] = list(
-        maybe_open_portal(
-            addr=addr,
-            name=name,
-        )
-        for addr in registry_addrs
-    )
-    portals: list[Portal]
-    async with gather_contexts(
-        mngrs=maybe_portals,
-    ) as portals:
-        # log.runtime(
-        #     'Gathered portals:\n'
-        #     f'{portals}'
-        # )
-        # NOTE: `gather_contexts()` will return a
-        # `tuple[None, None, ..., None]` if no contact
-        # can be made with any regstrar at any of the
-        # N provided addrs!
-        if not any(portals):
-            if raise_on_none:
-                raise RuntimeError(
-                    f'No actor "{name}" found registered @ {registry_addrs}'
-                )
-            yield None
-            return
-
-        portals: list[Portal] = list(portals)
-        if only_first:
-            yield portals[0]
-
+    async with query_actor(
+        name=name,
+        arbiter_sockaddr=arbiter_sockaddr,
+    ) as sockaddr:
+
+        if sockaddr:
+            async with _connect_chan(*sockaddr) as chan:
+                async with open_portal(chan) as portal:
+                    yield portal
         else:
-            # TODO: currently this may return multiple portals
-            # given there are multi-homed or multiple registrars..
-            # SO, we probably need de-duplication logic?
-            yield portals
+            yield None


 @acm
 async def wait_for_actor(
     name: str,
-    registry_addr: tuple[str, int] | None = None,
+    arbiter_sockaddr: tuple[str, int] | None = None,
+    # registry_addr: tuple[str, int] | None = None,

 ) -> AsyncGenerator[Portal, None]:
     '''
-    Wait on at least one peer actor to register `name` with the
-    registrar, yield a `Portal to the first registree.
+    Wait on an actor to register with the arbiter.
+
+    A portal to the first registered actor is returned.

     '''
-    actor: Actor = current_actor()
+    actor = current_actor()

-    # optimization path, use any pre-existing peer channel
-    maybe_peers: list[Channel]|None = get_peer_by_name(name)
-    if maybe_peers:
-        async with open_portal(maybe_peers[0]) as peer_portal:
-            yield peer_portal
-            return
-
-    regaddr: tuple[str, int] = (
-        registry_addr
-        or
-        actor.reg_addrs[0]
-    )
-    # TODO: use `.trionics.gather_contexts()` like
-    # above in `find_actor()` as well?
-    reg_portal: Portal
-    async with get_registry(*regaddr) as reg_portal:
-        sockaddrs = await reg_portal.run_from_ns(
+    async with get_arbiter(
+        *arbiter_sockaddr or actor._arb_addr,
+    ) as arb_portal:
+        sockaddrs = await arb_portal.run_from_ns(
             'self',
             'wait_for_actor',
             name=name,
         )
-        # get latest registered addr by default?
-        # TODO: offer multi-portal yields in multi-homed case?
-        sockaddr: tuple[str, int] = sockaddrs[-1]
+        sockaddr = sockaddrs[-1]

         async with _connect_chan(*sockaddr) as chan:
             async with open_portal(chan) as portal:
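Note: for orientation, the arbiter-flavored discovery API on this branch is typically driven as nested async context managers. A hedged usage sketch follows; the service name is made up and the use of `open_nursery()` to implicitly boot the root actor and arbiter is an assumption about this version of the runtime:

    import tractor
    import trio

    async def main() -> None:
        # assumed: `open_nursery()` boots a root actor + arbiter here
        async with tractor.open_nursery() as an:
            await an.start_actor('some_service', enable_modules=[])

            # `find_actor()` yields `None` when nothing is registered
            # under that name with the arbiter
            async with tractor.find_actor('some_service') as portal:
                assert portal is not None
                print(f'found peer @ {portal.channel.raddr}')

            await an.cancel()

    trio.run(main)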
@@ -20,9 +20,6 @@ Sub-process entry points.
 """
 from __future__ import annotations
 from functools import partial
-import multiprocessing as mp
-import os
-import textwrap
 from typing import (
     Any,
     TYPE_CHECKING,
@@ -35,7 +32,6 @@ from .log import (
     get_logger,
 )
 from . import _state
-from .devx import _debug
 from .to_asyncio import run_as_asyncio_guest
 from ._runtime import (
     async_main,
@@ -51,8 +47,8 @@ log = get_logger(__name__)

 def _mp_main(

-    actor: Actor,
-    accept_addrs: list[tuple[str, int]],
+    actor: Actor,  # type: ignore
+    accept_addr: tuple[str, int],
     forkserver_info: tuple[Any, Any, Any, Any, Any],
     start_method: SpawnMethodKey,
     parent_addr: tuple[str, int] | None = None,
@@ -60,31 +56,29 @@ def _mp_main(

 ) -> None:
     '''
-    The routine called *after fork* which invokes a fresh `trio.run()`
+    The routine called *after fork* which invokes a fresh ``trio.run``

     '''
     actor._forkserver_info = forkserver_info
     from ._spawn import try_set_start_method
-    spawn_ctx: mp.context.BaseContext = try_set_start_method(start_method)
-    assert spawn_ctx
+    spawn_ctx = try_set_start_method(start_method)

     if actor.loglevel is not None:
         log.info(
-            f'Setting loglevel for {actor.uid} to {actor.loglevel}'
-        )
+            f"Setting loglevel for {actor.uid} to {actor.loglevel}")
         get_console_log(actor.loglevel)

-    # TODO: use scops headers like for `trio` below!
-    # (well after we libify it maybe..)
+    assert spawn_ctx
     log.info(
-        f'Started new {spawn_ctx.current_process()} for {actor.uid}'
-        # f"parent_addr is {parent_addr}"
-    )
-    _state._current_actor: Actor = actor
+        f"Started new {spawn_ctx.current_process()} for {actor.uid}")
+    _state._current_actor = actor
+    log.debug(f"parent_addr is {parent_addr}")
     trio_main = partial(
         async_main,
-        actor=actor,
-        accept_addrs=accept_addrs,
+        actor,
+        accept_addr,
         parent_addr=parent_addr
     )
     try:
@@ -97,114 +91,12 @@ def _mp_main(
         pass  # handle it the same way trio does?

     finally:
-        log.info(
-            f'`mp`-subactor {actor.uid} exited'
-        )
-
-
-# TODO: move this func to some kinda `.devx._conc_lang.py` eventually
-# as we work out our multi-domain state-flow-syntax!
-def nest_from_op(
-    input_op: str,
-    #
-    # ?TODO? an idea for a syntax to the state of concurrent systems
-    # as a "3-domain" (execution, scope, storage) model and using
-    # a minimal ascii/utf-8 operator-set.
-    #
-    # try not to take any of this seriously yet XD
-    #
-    # > is a "play operator" indicating (CPU bound)
-    # exec/work/ops required at the "lowest level computing"
-    #
-    # execution primititves (tasks, threads, actors..) denote their
-    # lifetime with '(' and ')' since parentheses normally are used
-    # in many langs to denote function calls.
-    #
-    # starting = (
-    # >( opening/starting; beginning of the thread-of-exec (toe?)
-    # (> opened/started,  (finished spawning toe)
-    # |_<Task: blah blah..>  repr of toe, in py these look like <objs>
-    #
-    # >) closing/exiting/stopping,
-    # )> closed/exited/stopped,
-    # |_<Task: blah blah..>
-    #   [OR <), )< ?? ]
-    #
-    # ending = )
-    # >c) cancelling to close/exit
-    # c)> cancelled (caused close), OR?
-    #  |_<Actor: ..>
-    #   OR maybe "<c)" which better indicates the cancel being
-    #   "delivered/returned" / returned" to LHS?
-    #
-    # >x) erroring to eventuall exit
-    # x)> errored and terminated
-    #  |_<Actor: ...>
-    #
-    # scopes: supers/nurseries, IPC-ctxs, sessions, perms, etc.
-    # >{  opening
-    # {>  opened
-    # }>  closed
-    # >}  closing
-    #
-    # storage: like queues, shm-buffers, files, etc..
-    # >[  opening
-    # [>  opened
-    #  |_<FileObj: ..>
-    #
-    # >]  closing
-    # ]>  closed
-
-    # IPC ops: channels, transports, msging
-    # =>  req msg
-    # <=  resp msg
-    # <=> 2-way streaming (of msgs)
-    # <-  recv 1 msg
-    # ->  send 1 msg
-    #
-    # TODO: still not sure on R/L-HS approach..?
-    # =>( send-req to exec start (task, actor, thread..)
-    # (<= recv-req to ^
-    #
-    # (<= recv-req ^
-    # <=( recv-resp opened remote exec primitive
-    # <=) recv-resp closed
-    #
-    # )<=c req to stop due to cancel
-    # c=>) req to stop due to cancel
-    #
-    # =>{ recv-req to open
-    # <={ send-status that it closed
-
-    tree_str: str,
-
-    # NOTE: so move back-from-the-left of the `input_op` by
-    # this amount.
-    back_from_op: int = 0,
-) -> str:
-    '''
-    Depth-increment the input (presumably hierarchy/supervision)
-    input "tree string" below the provided `input_op` execution
-    operator, so injecting a `"\n|_{input_op}\n"`and indenting the
-    `tree_str` to nest content aligned with the ops last char.
-
-    '''
-    return (
-        f'{input_op}\n'
-        +
-        textwrap.indent(
-            tree_str,
-            prefix=(
-                len(input_op)
-                -
-                (back_from_op + 1)
-            ) * ' ',
-        )
-    )
+        log.info(f"Actor {actor.uid} terminated")


 def _trio_main(
-    actor: Actor,
+    actor: Actor,  # type: ignore
     *,
     parent_addr: tuple[str, int] | None = None,
     infect_asyncio: bool = False,
@@ -214,8 +106,6 @@ def _trio_main(
     Entry point for a `trio_run_in_process` subactor.

     '''
-    _debug.hide_runtime_frames()
-
     _state._current_actor = actor
     trio_main = partial(
         async_main,
@@ -225,6 +115,7 @@ def _trio_main(

     if actor.loglevel is not None:
         get_console_log(actor.loglevel)
+    import os
     actor_info: str = (
         f'|_{actor}\n'
         f' uid: {actor.uid}\n'
@@ -233,54 +124,27 @@ def _trio_main(
         f' loglevel: {actor.loglevel}\n'
     )
     log.info(
-        'Starting new `trio` subactor:\n'
+        'Started new trio process:\n'
         +
-        nest_from_op(
-            input_op='>(',  # see syntax ideas above
-            tree_str=actor_info,
-            back_from_op=2,  # since "complete"
-        )
+        actor_info
     )
-    logmeth = log.info
-    exit_status: str = (
-        'Subactor exited\n'
-        +
-        nest_from_op(
-            input_op=')>',  # like a "closed-to-play"-icon from super perspective
-            tree_str=actor_info,
-            back_from_op=1,
-        )
-    )
     try:
         if infect_asyncio:
             actor._infected_aio = True
             run_as_asyncio_guest(trio_main)
         else:
             trio.run(trio_main)

     except KeyboardInterrupt:
-        logmeth = log.cancel
-        exit_status: str = (
-            'Actor received KBI (aka an OS-cancel)\n'
+        log.cancel(
+            'Actor received KBI\n'
             +
-            nest_from_op(
-                input_op='c)>',  # closed due to cancel (see above)
-                tree_str=actor_info,
-            )
+            actor_info
         )
-    except BaseException as err:
-        logmeth = log.error
-        exit_status: str = (
-            'Main actor task exited due to crash?\n'
-            +
-            nest_from_op(
-                input_op='x)>',  # closed by error
-                tree_str=actor_info,
-            )
-        )
-        # NOTE since we raise a tb will already be shown on the
-        # console, thus we do NOT use `.exception()` above.
-        raise err
-
     finally:
-        logmeth(exit_status)
+        log.info(
+            'Actor terminated\n'
+            +
+            actor_info
        )
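Note: since `nest_from_op()` only exists on the `main` side, a standalone copy of its core (taken verbatim from the removed lines above, minus the syntax-brainstorm comments) shows the log-header style it produced; the actor-info string below is made up:

    import textwrap

    def nest_from_op(
        input_op: str,
        tree_str: str,
        back_from_op: int = 0,
    ) -> str:
        # indent `tree_str` so it nests under the op's last char
        return (
            f'{input_op}\n'
            +
            textwrap.indent(
                tree_str,
                prefix=(
                    len(input_op) - (back_from_op + 1)
                ) * ' ',
            )
        )

    actor_info = (
        '|_Actor: example\n'
        '  uid: (example, 1234)\n'
    )
    print(
        'Starting new `trio` subactor:\n'
        +
        nest_from_op(
            input_op='>(',
            tree_str=actor_info,
            back_from_op=2,
        )
    )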
File diff suppressed because it is too large

tractor/_ipc.py (436 changed lines)
@@ -23,17 +23,13 @@ from collections.abc import (
     AsyncGenerator,
     AsyncIterator,
 )
-from contextlib import (
-    asynccontextmanager as acm,
-    contextmanager as cm,
-)
+from contextlib import asynccontextmanager as acm
 import platform
 from pprint import pformat
 import struct
 import typing
 from typing import (
     Any,
-    Callable,
     runtime_checkable,
     Protocol,
     Type,
@@ -45,38 +41,15 @@ from tricycle import BufferedReceiveStream
 import trio

 from tractor.log import get_logger
-from tractor._exceptions import (
-    MsgTypeError,
-    pack_from_raise,
-    TransportClosed,
-    _mk_send_mte,
-    _mk_recv_mte,
-)
-from tractor.msg import (
-    _ctxvar_MsgCodec,
-    # _codec,  XXX see `self._codec` sanity/debug checks
-    MsgCodec,
-    types as msgtypes,
-    pretty_struct,
-)
+from tractor._exceptions import TransportClosed

 log = get_logger(__name__)

 _is_windows = platform.system() == 'Windows'


-def get_stream_addrs(
-    stream: trio.SocketStream
-) -> tuple[
-    tuple[str, int],  # local
-    tuple[str, int],  # remote
-]:
-    '''
-    Return the `trio` streaming transport prot's socket-addrs for
-    both the local and remote sides as a pair.
-
-    '''
-    # rn, should both be IP sockets
+def get_stream_addrs(stream: trio.SocketStream) -> tuple:
+    # should both be IP sockets
     lsockname = stream.socket.getsockname()
     rsockname = stream.socket.getpeername()
     return (
@@ -85,22 +58,16 @@ def get_stream_addrs(
     )


-# from tractor.msg.types import MsgType
-# ?TODO? this should be our `Union[*msgtypes.__spec__]` alias now right..?
-# => BLEH, except can't bc prots must inherit typevar or param-spec
-# vars..
-MsgType = TypeVar('MsgType')
+MsgType = TypeVar("MsgType")
+# TODO: consider using a generic def and indexing with our eventual
+# msg definition/types?
+# - https://docs.python.org/3/library/typing.html#typing.Protocol
+# - https://jcristharif.com/msgspec/usage.html#structs


-# TODO: break up this mod into a subpkg so we can start adding new
-# backends and move this type stuff into a dedicated file.. Bo
-#
 @runtime_checkable
 class MsgTransport(Protocol[MsgType]):
-    #
-    # ^-TODO-^ consider using a generic def and indexing with our
-    # eventual msg definition/types?
-    # - https://docs.python.org/3/library/typing.html#typing.Protocol

     stream: trio.SocketStream
     drained: list[MsgType]
@@ -135,9 +102,9 @@ class MsgTransport(Protocol[MsgType]):
     ...


-# TODO: typing oddity.. not sure why we have to inherit here, but it
-# seems to be an issue with `get_msg_transport()` returning
-# a `Type[Protocol]`; probably should make a `mypy` issue?
+# TODO: not sure why we have to inherit here, but it seems to be an
+# issue with ``get_msg_transport()`` returning a ``Type[Protocol]``;
+# probably should make a `mypy` issue?
 class MsgpackTCPStream(MsgTransport):
     '''
     A ``trio.SocketStream`` delivering ``msgpack`` formatted data
@@ -156,16 +123,6 @@ class MsgpackTCPStream(MsgTransport):
         stream: trio.SocketStream,
         prefix_size: int = 4,
-
-        # XXX optionally provided codec pair for `msgspec`:
-        # https://jcristharif.com/msgspec/extending.html#mapping-to-from-native-types
-        #
-        # TODO: define this as a `Codec` struct which can be
-        # overriden dynamically by the application/runtime?
-        codec: tuple[
-            Callable[[Any], Any]|None,  # coder
-            Callable[[type, Any], Any]|None,  # decoder
-        ]|None = None,

     ) -> None:

         self.stream = stream
@@ -175,44 +132,30 @@ class MsgpackTCPStream(MsgTransport):
         self._laddr, self._raddr = get_stream_addrs(stream)

         # create read loop instance
-        self._aiter_pkts = self._iter_packets()
+        self._agen = self._iter_packets()
         self._send_lock = trio.StrictFIFOLock()

         # public i guess?
         self.drained: list[dict] = []

-        self.recv_stream = BufferedReceiveStream(
-            transport_stream=stream
-        )
+        self.recv_stream = BufferedReceiveStream(transport_stream=stream)
         self.prefix_size = prefix_size

-        # allow for custom IPC msg interchange format
-        # dynamic override Bo
-        self._task = trio.lowlevel.current_task()
-
-        # XXX for ctxvar debug only!
-        # self._codec: MsgCodec = (
-        #     codec
-        #     or
-        #     _codec._ctxvar_MsgCodec.get()
-        # )
+        # TODO: struct aware messaging coders
+        self.encode = msgspec.msgpack.Encoder().encode
+        self.decode = msgspec.msgpack.Decoder().decode  # dict[str, Any])

     async def _iter_packets(self) -> AsyncGenerator[dict, None]:
-        '''
-        Yield `bytes`-blob decoded packets from the underlying TCP
-        stream using the current task's `MsgCodec`.
-
-        This is a streaming routine implemented as an async generator
-        func (which was the original design, but could be changed?)
-        and is allocated by a `.__call__()` inside `.__init__()` where
-        it is assigned to the `._aiter_pkts` attr.
+        '''Yield packets from the underlying stream.

         '''
+        import msgspec  # noqa
         decodes_failed: int = 0

         while True:
             try:
-                header: bytes = await self.recv_stream.receive_exactly(4)
+                header = await self.recv_stream.receive_exactly(4)

             except (
                 ValueError,
                 ConnectionResetError,
@@ -221,122 +164,25 @@ class MsgpackTCPStream(MsgTransport):
                 # seem to be getting racy failures here on
                 # arbiter/registry name subs..
                 trio.BrokenResourceError,
-            ) as trans_err:
-
-                loglevel = 'transport'
-                match trans_err:
-                    # case (
-                    #     ConnectionResetError()
-                    # ):
-                    #     loglevel = 'transport'
-
-                    # peer actor (graceful??) TCP EOF but `tricycle`
-                    # seems to raise a 0-bytes-read?
-                    case ValueError() if (
-                        'unclean EOF' in trans_err.args[0]
-                    ):
-                        pass
-
-                    # peer actor (task) prolly shutdown quickly due
-                    # to cancellation
-                    case trio.BrokenResourceError() if (
-                        'Connection reset by peer' in trans_err.args[0]
-                    ):
-                        pass
-
-                    # unless the disconnect condition falls under "a
-                    # normal operation breakage" we usualy console warn
-                    # about it.
-                    case _:
-                        loglevel: str = 'warning'
-
+            ):
                 raise TransportClosed(
-                    message=(
-                        f'IPC transport already closed by peer\n'
-                        f'x]> {type(trans_err)}\n'
-                        f' |_{self}\n'
-                    ),
-                    loglevel=loglevel,
-                ) from trans_err
-
-            # XXX definitely can happen if transport is closed
-            # manually by another `trio.lowlevel.Task` in the
-            # same actor; we use this in some simulated fault
-            # testing for ex, but generally should never happen
-            # under normal operation!
-            #
-            # NOTE: as such we always re-raise this error from the
-            # RPC msg loop!
-            except trio.ClosedResourceError as closure_err:
-                raise TransportClosed(
-                    message=(
-                        f'IPC transport already manually closed locally?\n'
-                        f'x]> {type(closure_err)} \n'
-                        f' |_{self}\n'
-                    ),
-                    loglevel='error',
-                    raise_on_report=(
-                        closure_err.args[0] == 'another task closed this fd'
-                        or
-                        closure_err.args[0] in ['another task closed this fd']
-                    ),
-                ) from closure_err
-
-            # graceful TCP EOF disconnect
-            if header == b'':
-                raise TransportClosed(
-                    message=(
-                        f'IPC transport already gracefully closed\n'
-                        f']>\n'
-                        f' |_{self}\n'
-                    ),
-                    loglevel='transport',
-                    # cause=???  # handy or no?
+                    f'transport {self} was already closed prior ro read'
+                )
+
+            if header == b'':
+                raise TransportClosed(
+                    f'transport {self} was already closed prior ro read'
                 )

-            size: int
             size, = struct.unpack("<I", header)

             log.transport(f'received header {size}')  # type: ignore
-            msg_bytes: bytes = await self.recv_stream.receive_exactly(size)
+
+            msg_bytes = await self.recv_stream.receive_exactly(size)

             log.transport(f"received {msg_bytes}")  # type: ignore
             try:
-                # NOTE: lookup the `trio.Task.context`'s var for
-                # the current `MsgCodec`.
-                codec: MsgCodec = _ctxvar_MsgCodec.get()
-
-                # XXX for ctxvar debug only!
-                # if self._codec.pld_spec != codec.pld_spec:
-                #     assert (
-                #         task := trio.lowlevel.current_task()
-                #     ) is not self._task
-                #     self._task = task
-                #     self._codec = codec
-                #     log.runtime(
-                #         f'Using new codec in {self}.recv()\n'
-                #         f'codec: {self._codec}\n\n'
-                #         f'msg_bytes: {msg_bytes}\n'
-                #     )
-                yield codec.decode(msg_bytes)
-
-            # XXX NOTE: since the below error derives from
-            # `DecodeError` we need to catch is specially
-            # and always raise such that spec violations
-            # are never allowed to be caught silently!
-            except msgspec.ValidationError as verr:
-                msgtyperr: MsgTypeError = _mk_recv_mte(
-                    msg=msg_bytes,
-                    codec=codec,
-                    src_validation_error=verr,
-                )
-                # XXX deliver up to `Channel.recv()` where
-                # a re-raise and `Error`-pack can inject the far
-                # end actor `.uid`.
-                yield msgtyperr
+                yield self.decode(msg_bytes)

             except (
                 msgspec.DecodeError,
                 UnicodeDecodeError,
@@ -346,15 +192,14 @@ class MsgpackTCPStream(MsgTransport):
                 # do with a channel drop - hope that receiving from the
                 # channel will raise an expected error and bubble up.
                 try:
-                    msg_str: str|bytes = msg_bytes.decode()
+                    msg_str: str | bytes = msg_bytes.decode()
                 except UnicodeDecodeError:
                     msg_str = msg_bytes

-                log.exception(
-                    'Failed to decode msg?\n'
-                    f'{codec}\n\n'
-                    'Rxed bytes from wire:\n\n'
-                    f'{msg_str!r}\n'
+                log.error(
+                    '`msgspec` failed to decode!?\n'
+                    'dumping bytes:\n'
+                    f'{msg_str!r}'
                 )
                 decodes_failed += 1
             else:
@@ -362,79 +207,24 @@ class MsgpackTCPStream(MsgTransport):

     async def send(
         self,
-        msg: msgtypes.MsgType,
+        msg: Any,

-        strict_types: bool = True,
-        hide_tb: bool = False,
-
+        # hide_tb: bool = False,
     ) -> None:
         '''
-        Send a msgpack encoded py-object-blob-as-msg over TCP.
+        Send a msgpack coded blob-as-msg over TCP.

-        If `strict_types == True` then a `MsgTypeError` will be raised on any
-        invalid msg type
-
         '''
-        __tracebackhide__: bool = hide_tb
+        # __tracebackhide__: bool = hide_tb

-        # XXX see `trio._sync.AsyncContextManagerMixin` for details
-        # on the `.acquire()`/`.release()` sequencing..
         async with self._send_lock:

-            # NOTE: lookup the `trio.Task.context`'s var for
-            # the current `MsgCodec`.
-            codec: MsgCodec = _ctxvar_MsgCodec.get()
-
-            # XXX for ctxvar debug only!
-            # if self._codec.pld_spec != codec.pld_spec:
-            #     self._codec = codec
-            #     log.runtime(
-            #         f'Using new codec in {self}.send()\n'
-            #         f'codec: {self._codec}\n\n'
-            #         f'msg: {msg}\n'
-            #     )
-
-            if type(msg) not in msgtypes.__msg_types__:
-                if strict_types:
-                    raise _mk_send_mte(
-                        msg,
-                        codec=codec,
-                    )
-                else:
-                    log.warning(
-                        'Sending non-`Msg`-spec msg?\n\n'
-                        f'{msg}\n'
-                    )
-
-            try:
-                bytes_data: bytes = codec.encode(msg)
-            except TypeError as _err:
-                typerr = _err
-                msgtyperr: MsgTypeError = _mk_send_mte(
-                    msg,
-                    codec=codec,
-                    message=(
-                        f'IPC-msg-spec violation in\n\n'
-                        f'{pretty_struct.Struct.pformat(msg)}'
-                    ),
-                    src_type_error=typerr,
-                )
-                raise msgtyperr from typerr
+            bytes_data: bytes = self.encode(msg)

             # supposedly the fastest says,
             # https://stackoverflow.com/a/54027962
             size: bytes = struct.pack("<I", len(bytes_data))
-            return await self.stream.send_all(size + bytes_data)

-            # ?TODO? does it help ever to dynamically show this
-            # frame?
-            # try:
-            #     <the-above_code>
-            # except BaseException as _err:
-            #     err = _err
-            #     if not isinstance(err, MsgTypeError):
-            #         __tracebackhide__: bool = False
-            #     raise
+            return await self.stream.send_all(size + bytes_data)

     @property
     def laddr(self) -> tuple[str, int]:
@@ -445,7 +235,7 @@ class MsgpackTCPStream(MsgTransport):
         return self._raddr

     async def recv(self) -> Any:
-        return await self._aiter_pkts.asend(None)
+        return await self._agen.asend(None)

     async def drain(self) -> AsyncIterator[dict]:
         '''
@@ -462,7 +252,7 @@ class MsgpackTCPStream(MsgTransport):
             yield msg

     def __aiter__(self):
-        return self._aiter_pkts
+        return self._agen

     def connected(self) -> bool:
         return self.stream.socket.fileno() != -1
@@ -517,7 +307,7 @@ class Channel:
         # set after handshake - always uid of far end
         self.uid: tuple[str, str]|None = None

-        self._aiter_msgs = self._iter_msgs()
+        self._agen = self._aiter_recv()
         self._exc: Exception|None = None  # set if far end actor errors
         self._closed: bool = False

@@ -528,9 +318,7 @@ class Channel:

     @property
     def msgstream(self) -> MsgTransport:
-        log.info(
-            '`Channel.msgstream` is an old name, use `._transport`'
-        )
+        log.info('`Channel.msgstream` is an old name, use `._transport`')
         return self._transport

     @property
@@ -561,45 +349,11 @@ class Channel:
         stream: trio.SocketStream,
         type_key: tuple[str, str]|None = None,
-
-        # XXX optionally provided codec pair for `msgspec`:
-        # https://jcristharif.com/msgspec/extending.html#mapping-to-from-native-types
-        codec: MsgCodec|None = None,
-
     ) -> MsgTransport:
-        type_key = (
-            type_key
-            or
-            self._transport_key
-        )
-        # get transport type, then
-        self._transport = get_msg_transport(
-            type_key
-            # instantiate an instance of the msg-transport
-        )(
-            stream,
-            codec=codec,
-        )
+        type_key = type_key or self._transport_key
+        self._transport = get_msg_transport(type_key)(stream)
         return self._transport

-    @cm
-    def apply_codec(
-        self,
-        codec: MsgCodec,
-
-    ) -> None:
-        '''
-        Temporarily override the underlying IPC msg codec for
-        dynamic enforcement of messaging schema.
-
-        '''
-        orig: MsgCodec = self._transport.codec
-        try:
-            self._transport.codec = codec
-            yield
-        finally:
-            self._transport.codec = orig
-
-    # TODO: do a .src/.dst: str for maddrs?
     def __repr__(self) -> str:
         if not self._transport:
             return '<Channel with inactive transport?>'
@@ -643,53 +397,33 @@ class Channel:
         )
         return transport

-    # TODO: something like,
-    # `pdbp.hideframe_on(errors=[MsgTypeError])`
-    # instead of the `try/except` hack we have rn..
-    # seems like a pretty useful thing to have in general
-    # along with being able to filter certain stack frame(s / sets)
-    # possibly based on the current log-level?
     async def send(
         self,
         payload: Any,

-        hide_tb: bool = False,
+        # hide_tb: bool = False,

     ) -> None:
         '''
         Send a coded msg-blob over the transport.

         '''
-        __tracebackhide__: bool = hide_tb
-        try:
-            log.transport(
-                '=> send IPC msg:\n\n'
-                f'{pformat(payload)}\n'
-            )
-            # assert self._transport  # but why typing?
-            await self._transport.send(
-                payload,
-                hide_tb=hide_tb,
-            )
-        except BaseException as _err:
-            err = _err  # bind for introspection
-            if not isinstance(_err, MsgTypeError):
-                # assert err
-                __tracebackhide__: bool = False
-            else:
-                assert err.cid
-
-            raise
+        # __tracebackhide__: bool = hide_tb
+        log.transport(
+            '=> send IPC msg:\n\n'
+            f'{pformat(payload)}\n'
+        )  # type: ignore
+        assert self._transport
+
+        await self._transport.send(
+            payload,
+            # hide_tb=hide_tb,
+        )

     async def recv(self) -> Any:
         assert self._transport
         return await self._transport.recv()

-    # TODO: auto-reconnect features like 0mq/nanomsg?
-    # -[ ] implement it manually with nods to SC prot
-    #      possibly on multiple transport backends?
-    #  -> seems like that might be re-inventing scalability
-    #     prots tho no?
     # try:
     #     return await self._transport.recv()
     # except trio.BrokenResourceError:
@@ -716,11 +450,8 @@ class Channel:
         await self.aclose(*args)

     def __aiter__(self):
-        return self._aiter_msgs
+        return self._agen

-    # ?TODO? run any reconnection sequence?
-    # -[ ] prolly should be impl-ed as deco-API?
-    #
     # async def _reconnect(self) -> None:
     #     """Handle connection failures by polling until a reconnect can be
     #     established.
@@ -738,6 +469,7 @@ class Channel:
     #             else:
     #                 log.transport("Stream connection re-established!")

+    #         # TODO: run any reconnection sequence
     #         # on_recon = self._recon_seq
     #         # if on_recon:
     #         #     await on_recon(self)
@@ -751,42 +483,23 @@ class Channel:
     #                 " for re-establishment")
     #             await trio.sleep(1)

-    async def _iter_msgs(
+    async def _aiter_recv(
         self
     ) -> AsyncGenerator[Any, None]:
         '''
-        Yield `MsgType` IPC msgs decoded and deliverd from
-        an underlying `MsgTransport` protocol.
-
-        This is a streaming routine alo implemented as an async-gen
-        func (same a `MsgTransport._iter_pkts()`) gets allocated by
-        a `.__call__()` inside `.__init__()` where it is assigned to
-        the `._aiter_msgs` attr.
+        Async iterate items from underlying stream.

         '''
         assert self._transport
         while True:
             try:
-                async for msg in self._transport:
-                    match msg:
-                        # NOTE: if transport/interchange delivers
-                        # a type error, we pack it with the far
-                        # end peer `Actor.uid` and relay the
-                        # `Error`-msg upward to the `._rpc` stack
-                        # for normal RAE handling.
-                        case MsgTypeError():
-                            yield pack_from_raise(
-                                local_err=msg,
-                                cid=msg.cid,
-
-                                # XXX we pack it here bc lower
-                                # layers have no notion of an
-                                # actor-id ;)
-                                src_uid=self.uid,
-                            )
-                        case _:
-                            yield msg
+                async for item in self._transport:
+                    yield item
+                    # sent = yield item
+                    # if sent is not None:
+                    #     # optimization, passing None through all the
+                    #     # time is pointless
+                    #     await self._transport.send(sent)

             except trio.BrokenResourceError:

                 # if not self._autorecon:
@@ -804,9 +517,7 @@ class Channel:

 @acm
 async def _connect_chan(
-    host: str,
-    port: int
-
+    host: str, port: int
 ) -> typing.AsyncGenerator[Channel, None]:
     '''
     Create and connect a channel with disconnect on context manager
@@ -816,5 +527,4 @@ async def _connect_chan(
     chan = Channel((host, port))
     await chan.connect()
     yield chan
-    with trio.CancelScope(shield=True):
-        await chan.aclose()
+    await chan.aclose()
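Note: both sides of the `MsgpackTCPStream` diff share the same wire format: a 4-byte little-endian length header followed by a `msgpack` blob (see the `struct.pack("<I", ...)` / `receive_exactly(4)` pairs above). A self-contained round-trip of that framing, with an illustrative payload dict:

    import struct
    import msgspec

    encode = msgspec.msgpack.Encoder().encode
    decode = msgspec.msgpack.Decoder().decode

    msg = {'cmd': 'ping', 'cid': '42'}  # made-up payload
    bytes_data: bytes = encode(msg)

    # sender: prefix the blob with its size as a little-endian uint32
    size: bytes = struct.pack("<I", len(bytes_data))
    wire: bytes = size + bytes_data

    # receiver: read exactly 4 header bytes, then exactly `length` more
    (length,) = struct.unpack("<I", wire[:4])
    assert decode(wire[4:4 + length]) == msg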
@@ -1,151 +0,0 @@
-# tractor: structured concurrent "actors".
-# Copyright 2018-eternity Tyler Goodlet.
-
-# This program is free software: you can redistribute it and/or modify
-# it under the terms of the GNU Affero General Public License as published by
-# the Free Software Foundation, either version 3 of the License, or
-# (at your option) any later version.
-
-# This program is distributed in the hope that it will be useful,
-# but WITHOUT ANY WARRANTY; without even the implied warranty of
-# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
-# GNU Affero General Public License for more details.
-
-# You should have received a copy of the GNU Affero General Public License
-# along with this program. If not, see <https://www.gnu.org/licenses/>.
-
-'''
-Multiaddress parser and utils according the spec(s) defined by
-`libp2p` and used in dependent project such as `ipfs`:
-
-- https://docs.libp2p.io/concepts/fundamentals/addressing/
-- https://github.com/libp2p/specs/blob/master/addressing/README.md
-
-'''
-from typing import Iterator
-
-from bidict import bidict
-
-# TODO: see if we can leverage libp2p ecosys projects instead of
-# rolling our own (parser) impls of the above addressing specs:
-# - https://github.com/libp2p/py-libp2p
-# - https://docs.libp2p.io/concepts/nat/circuit-relay/#relay-addresses
-# prots: bidict[int, str] = bidict({
-prots: bidict[int, str] = {
-    'ipv4': 3,
-    'ipv6': 3,
-    'wg': 3,
-
-    'tcp': 4,
-    'udp': 4,
-
-    # TODO: support the next-gen shite Bo
-    # 'quic': 4,
-    # 'ssh': 7,  # via rsyscall bootstrapping
-}
-
-prot_params: dict[str, tuple[str]] = {
-    'ipv4': ('addr',),
-    'ipv6': ('addr',),
-    'wg': ('addr', 'port', 'pubkey'),
-
-    'tcp': ('port',),
-    'udp': ('port',),
-
-    # 'quic': ('port',),
-    # 'ssh': ('port',),
-}
-
-
-def iter_prot_layers(
-    multiaddr: str,
-) -> Iterator[
-    tuple[
-        int,
-        list[str]
-    ]
-]:
-    '''
-    Unpack a libp2p style "multiaddress" into multiple "segments"
-    for each "layer" of the protocoll stack (in OSI terms).
-
-    '''
-    tokens: list[str] = multiaddr.split('/')
-    root, tokens = tokens[0], tokens[1:]
-    assert not root  # there is a root '/' on LHS
-    itokens = iter(tokens)
-
-    prot: str | None = None
-    params: list[str] = []
-    for token in itokens:
-        # every prot path should start with a known
-        # key-str.
-        if token in prots:
-            if prot is None:
-                prot: str = token
-            else:
-                yield prot, params
-                prot = token
-
-            params = []
-
-        elif token not in prots:
-            params.append(token)
-
-    else:
-        yield prot, params
-
-
-def parse_maddr(
-    multiaddr: str,
-) -> dict[str, str | int | dict]:
-    '''
-    Parse a libp2p style "multiaddress" into its distinct protocol
-    segments where each segment is of the form:
-
-        `../<protocol>/<param0>/<param1>/../<paramN>`
-
-    and is loaded into a (order preserving) `layers: dict[str,
-    dict[str, Any]` which holds each protocol-layer-segment of the
-    original `str` path as a separate entry according to its approx
-    OSI "layer number".
-
-    Any `paramN` in the path must be distinctly defined by a str-token in the
-    (module global) `prot_params` table.
-
-    For eg. for wireguard which requires an address, port number and publickey
-    the protocol params are specified as the entry:
-
-        'wg': ('addr', 'port', 'pubkey'),
-
-    and are thus parsed from a maddr in that order:
-        `'/wg/1.1.1.1/51820/<pubkey>'`
-
-    '''
-    layers: dict[str, str | int | dict] = {}
-    for (
-        prot_key,
-        params,
-    ) in iter_prot_layers(multiaddr):
-
-        layer: int = prots[prot_key]  # OSI layer used for sorting
-        ep: dict[str, int | str] = {'layer': layer}
-        layers[prot_key] = ep
-
-        # TODO; validation and resolving of names:
-        # - each param via a validator provided as part of the
-        #   prot_params def? (also see `"port"` case below..)
-        # - do a resolv step that will check addrs against
-        #   any loaded network.resolv: dict[str, str]
-        rparams: list = list(reversed(params))
-        for key in prot_params[prot_key]:
-            val: str | int = rparams.pop()
-
-            # TODO: UGHH, dunno what we should do for validation
-            # here, put it in the params spec somehow?
-            if key == 'port':
-                val = int(val)
-
-            ep[key] = val
-
-    return layers
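
Following the docstring above, a wireguard segment carries `(addr, port, pubkey)` in path order; a usage sketch of the deleted parser (output derived directly from the code above, the key value is illustrative)::

    layers = parse_maddr('/wg/1.1.1.1/51820/deadbeefpubkey')
    assert layers == {
        'wg': {
            'layer': 3,            # OSI layer from the `prots` table
            'addr': '1.1.1.1',
            'port': 51820,         # 'port' params are int-coerced
            'pubkey': 'deadbeefpubkey',
        },
    }
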
@@ -31,7 +31,7 @@ from typing import (
     Any,
     Callable,
     AsyncGenerator,
-    TYPE_CHECKING,
+    # Type,
 )
 from functools import partial
 from dataclasses import dataclass
@@ -45,14 +45,9 @@ from ._state import (
 )
 from ._ipc import Channel
 from .log import get_logger
-from .msg import (
-    # Error,
-    PayloadMsg,
-    NamespacePath,
-    Return,
-)
+from .msg import NamespacePath
 from ._exceptions import (
-    # unpack_error,
+    unpack_error,
     NoResult,
 )
 from ._context import (
@@ -63,12 +58,41 @@ from ._streaming import (
     MsgStream,
 )

-if TYPE_CHECKING:
-    from ._runtime import Actor
-
 log = get_logger(__name__)


+# TODO: rename to `unwrap_result()` and use
+# `._raise_from_no_key_in_msg()` (after tweak to
+# accept a `chan: Channel` arg) in key block!
+def _unwrap_msg(
+    msg: dict[str, Any],
+    channel: Channel,
+
+    hide_tb: bool = True,
+
+) -> Any:
+    '''
+    Unwrap a final result from a `{return: <Any>}` IPC msg.
+
+    '''
+    __tracebackhide__: bool = hide_tb
+
+    try:
+        return msg['return']
+    except KeyError as ke:
+
+        # internal error should never get here
+        assert msg.get('cid'), (
+            "Received internal error at portal?"
+        )
+
+        raise unpack_error(
+            msg,
+            channel
+        ) from ke
+
+
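
On the branch a final result thus rides a plain `{'return': <value>}` dict-msg and anything without that key escalates as a relayed remote error; the unwrap rule in miniature::

    def unwrap(msg: dict):
        # mirrors `_unwrap_msg()`: a 'return' key means success,
        # otherwise the msg should carry an error payload instead.
        try:
            return msg['return']
        except KeyError:
            raise RuntimeError(f'remote error relayed: {msg!r}')

    assert unwrap({'cid': '1', 'return': 42}) == 42
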
 class Portal:
     '''
     A 'portal' to a memory-domain-separated `Actor`.
@@ -92,26 +116,17 @@ class Portal:
     # connected (peer) actors.
     cancel_timeout: float = 0.5

-    def __init__(
-        self,
-        channel: Channel,
-    ) -> None:
-
-        self._chan: Channel = channel
+    def __init__(self, channel: Channel) -> None:
+        self.chan = channel
         # during the portal's lifetime
-        self._final_result_pld: Any|None = None
-        self._final_result_msg: PayloadMsg|None = None
+        self._result_msg: dict|None = None

         # When set to a ``Context`` (when _submit_for_result is called)
         # it is expected that ``result()`` will be awaited at some
         # point.
-        self._expect_result_ctx: Context|None = None
+        self._expect_result: Context | None = None
         self._streams: set[MsgStream] = set()
-        self.actor: Actor = current_actor()
-
-    @property
-    def chan(self) -> Channel:
-        return self._chan
+        self.actor = current_actor()

     @property
     def channel(self) -> Channel:
@@ -125,8 +140,6 @@ class Portal:
         )
         return self.chan

-    # TODO: factor this out into a `.highlevel` API-wrapper that uses
-    # a single `.open_context()` call underneath.
     async def _submit_for_result(
         self,
         ns: str,
@@ -134,34 +147,32 @@ class Portal:
         **kwargs
     ) -> None:

-        if self._expect_result_ctx is not None:
-            raise RuntimeError(
-                'A pending main result has already been submitted'
-            )
-
-        self._expect_result_ctx: Context = await self.actor.start_remote_task(
-            self.channel,
-            nsf=NamespacePath(f'{ns}:{func}'),
-            kwargs=kwargs,
-            portal=self,
-        )
-
-    # TODO: we should deprecate this API right? since if we remove
-    # `.run_in_actor()` (and instead move it to a `.highlevel`
-    # wrapper api (around a single `.open_context()` call) we don't
-    # really have any notion of a "main" remote task any more?
-    #
-    # @api_frame
-    async def wait_for_result(
-        self,
-        hide_tb: bool = True,
-    ) -> Any:
+        assert self._expect_result is None, (
+            "A pending main result has already been submitted"
+        )
+
+        self._expect_result = await self.actor.start_remote_task(
+            self.channel,
+            nsf=NamespacePath(f'{ns}:{func}'),
+            kwargs=kwargs
+        )
+
+    async def _return_once(
+        self,
+        ctx: Context,
+
+    ) -> dict[str, Any]:
+
+        assert ctx._remote_func_type == 'asyncfunc'  # single response
+        msg: dict = await ctx._recv_chan.receive()
+        return msg
+
+    async def result(self) -> Any:
         '''
-        Return the final result delivered by a `Return`-msg from the
-        remote peer actor's "main" task's `return` statement.
+        Return the result(s) from the remote actor's "main" task.

         '''
-        __tracebackhide__: bool = hide_tb
+        # __tracebackhide__ = True
         # Check for non-rpc errors slapped on the
         # channel for which we always raise
         exc = self.channel._exc
@@ -169,7 +180,7 @@ class Portal:
             raise exc

         # not expecting a "main" result
-        if self._expect_result_ctx is None:
+        if self._expect_result is None:
             log.warning(
                 f"Portal for {self.channel.uid} not expecting a final"
                 " result?\nresult() should only be called if subactor"
@@ -177,40 +188,16 @@ class Portal:
             return NoResult

         # expecting a "main" result
-        assert self._expect_result_ctx
-
-        if self._final_result_msg is None:
-            try:
-                (
-                    self._final_result_msg,
-                    self._final_result_pld,
-                ) = await self._expect_result_ctx._pld_rx.recv_msg(
-                    ipc=self._expect_result_ctx,
-                    expect_msg=Return,
-                )
-            except BaseException as err:
-                # TODO: wrap this into `@api_frame` optionally with
-                # some kinda filtering mechanism like log levels?
-                __tracebackhide__: bool = False
-                raise err
-
-        return self._final_result_pld
-
-    # TODO: factor this out into a `.highlevel` API-wrapper that uses
-    # a single `.open_context()` call underneath.
-    async def result(
-        self,
-        *args,
-        **kwargs,
-    ) -> Any|Exception:
-        typname: str = type(self).__name__
-        log.warning(
-            f'`{typname}.result()` is DEPRECATED!\n'
-            f'Use `{typname}.wait_for_result()` instead!\n'
-        )
-        return await self.wait_for_result(
-            *args,
-            **kwargs,
-        )
+        assert self._expect_result
+
+        if self._result_msg is None:
+            self._result_msg = await self._return_once(
+                self._expect_result
+            )
+
+        return _unwrap_msg(
+            self._result_msg,
+            self.channel,
+        )

     async def _cancel_streams(self):
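
The `main` side keeps the old `.result()` name alive as a thin shim over `.wait_for_result()`; the same deprecation pattern written generically (using stdlib `warnings` here, where the diff emits via `log.warning()`)::

    import warnings

    class PortalLike:
        async def wait_for_result(self):
            return 'final-value'

        async def result(self, *args, **kwargs):
            # steer callers to the new name while the old one works
            warnings.warn(
                f'`{type(self).__name__}.result()` is deprecated, '
                'use `.wait_for_result()` instead!',
                DeprecationWarning,
                stacklevel=2,
            )
            return await self.wait_for_result(*args, **kwargs)
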
@@ -253,8 +240,6 @@ class Portal:
         purpose.

         '''
-        __runtimeframe__: int = 1  # noqa
-
         chan: Channel = self.channel
         if not chan.connected():
             log.runtime(
@@ -263,15 +248,14 @@ class Portal:
             return False

         reminfo: str = (
-            f'c)=> {self.channel.uid}\n'
+            f'`Portal.cancel_actor()` => {self.channel.uid}\n'
             f'   |_{chan}\n'
         )
         log.cancel(
-            f'Requesting actor-runtime cancel for peer\n\n'
+            f'Sending runtime `.cancel()` request to peer\n\n'
             f'{reminfo}'
         )

-        # XXX the one spot we set it?
         self.channel._cancel_called: bool = True
         try:
             # send cancel cmd - might not get response
@@ -311,8 +295,6 @@ class Portal:
             )
             return False

-    # TODO: do we still need this for low level `Actor`-runtime
-    # method calls or can we also remove it?
     async def run_from_ns(
         self,
         namespace_path: str,
@@ -335,23 +317,21 @@ class Portal:
         internals!

         '''
-        __runtimeframe__: int = 1  # noqa
         nsf = NamespacePath(
             f'{namespace_path}:{function_name}'
         )
-        ctx: Context = await self.actor.start_remote_task(
+        ctx = await self.actor.start_remote_task(
             chan=self.channel,
             nsf=nsf,
             kwargs=kwargs,
-            portal=self,
         )
-        return await ctx._pld_rx.recv_pld(
-            ipc=ctx,
-            expect_msg=Return,
+        ctx._portal = self
+        msg = await self._return_once(ctx)
+        return _unwrap_msg(
+            msg,
+            self.channel,
         )

-    # TODO: factor this out into a `.highlevel` API-wrapper that uses
-    # a single `.open_context()` call underneath.
     async def run(
         self,
         func: str,
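
As the docstring warns, `run_from_ns()` is really for runtime internals; the call shape on both sides looks roughly like (the method name below is purely illustrative, not a stable API)::

    async def query_runtime(portal):
        # hypothetical: invoke a method on the remote `Actor`
        # instance via the special 'self' namespace path.
        return await portal.run_from_ns(
            'self',
            'some_rt_method',  # illustrative name
            some_kwarg=1,
        )
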
@@ -367,8 +347,6 @@ class Portal:
         remote rpc task or a local async generator instance.

         '''
-        __runtimeframe__: int = 1  # noqa
-
         if isinstance(func, str):
             warnings.warn(
                 "`Portal.run(namespace: str, funcname: str)` is now"
@@ -399,15 +377,13 @@ class Portal:
             self.channel,
             nsf=nsf,
             kwargs=kwargs,
-            portal=self,
         )
-        return await ctx._pld_rx.recv_pld(
-            ipc=ctx,
-            expect_msg=Return,
+        ctx._portal = self
+        return _unwrap_msg(
+            await self._return_once(ctx),
+            self.channel,
         )

-    # TODO: factor this out into a `.highlevel` API-wrapper that uses
-    # a single `.open_context()` call underneath.
     @acm
     async def open_stream_from(
         self,
@@ -415,14 +391,6 @@ class Portal:
         **kwargs,

     ) -> AsyncGenerator[MsgStream, None]:
-        '''
-        Legacy one-way streaming API.
-
-        TODO: re-impl on top `Portal.open_context()` + an async gen
-        around `Context.open_stream()`.
-
-        '''
-        __runtimeframe__: int = 1  # noqa
-
         if not inspect.isasyncgenfunction(async_gen_func):
             if not (
@@ -436,8 +404,8 @@ class Portal:
             self.channel,
             nsf=NamespacePath.from_ref(async_gen_func),
             kwargs=kwargs,
-            portal=self,
         )
+        ctx._portal = self

         # ensure receive-only stream entrypoint
         assert ctx._remote_func_type == 'asyncgen'
@@ -446,13 +414,13 @@ class Portal:
             # deliver receive only stream
             async with MsgStream(
                 ctx=ctx,
-                rx_chan=ctx._rx_chan,
-            ) as stream:
-                self._streams.add(stream)
-                ctx._stream = stream
-                yield stream
+                rx_chan=ctx._recv_chan,
+            ) as rchan:
+                self._streams.add(rchan)
+                yield rchan

         finally:

             # cancel the far end task on consumer close
             # NOTE: this is a special case since we assume that if using
             # this ``.open_fream_from()`` api, the stream is one a one
@@ -471,7 +439,7 @@ class Portal:

             # XXX: should this always be done?
             # await recv_chan.aclose()
-            self._streams.remove(stream)
+            self._streams.remove(rchan)

     # NOTE: impl is found in `._context`` mod to make
     # reading/groking the details simpler code-org-wise. This
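
Typical consumer usage of this one-way streaming API, sketched per the `asyncgen` requirement asserted above (spawn-and-stream flow as shown in `tractor`'s public docs)::

    import tractor
    import trio

    async def counter(limit: int = 3):
        for i in range(limit):
            yield i

    async def main():
        async with tractor.open_nursery() as an:
            portal = await an.start_actor(
                'streamer',
                enable_modules=[__name__],
            )
            # receive-only stream over the spawned actor's asyncgen
            async with portal.open_stream_from(counter, limit=3) as stream:
                async for value in stream:
                    print(value)

            await portal.cancel_actor()

    if __name__ == '__main__':
        trio.run(main)
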
@@ -493,12 +461,7 @@ class LocalPortal:
     actor: 'Actor'  # type: ignore # noqa
     channel: Channel

-    async def run_from_ns(
-        self,
-        ns: str,
-        func_name: str,
-        **kwargs,
-    ) -> Any:
+    async def run_from_ns(self, ns: str, func_name: str, **kwargs) -> Any:
         '''
         Run a requested local function from a namespace path and
         return it's result.
@@ -513,7 +476,7 @@ class LocalPortal:
 async def open_portal(

     channel: Channel,
-    tn: trio.Nursery|None = None,
+    nursery: trio.Nursery|None = None,
     start_msg_loop: bool = True,
     shield: bool = False,
@@ -521,23 +484,15 @@ async def open_portal(
     '''
     Open a ``Portal`` through the provided ``channel``.

-    Spawns a background task to handle RPC processing, normally
-    done by the actor-runtime implicitly via a call to
-    `._rpc.process_messages()`. just after connection establishment.
+    Spawns a background task to handle message processing (normally
+    done by the actor-runtime implicitly).

     '''
     actor = current_actor()
     assert actor
-    was_connected: bool = False
+    was_connected = False

-    async with maybe_open_nursery(
-        tn,
-        shield=shield,
-        strict_exception_groups=False,
-        # ^XXX^ TODO? soo roll our own then ??
-        # -> since we kinda want the "if only one `.exception` then
-        # just raise that" interface?
-    ) as tn:
+    async with maybe_open_nursery(nursery, shield=shield) as nursery:

         if not channel.connected():
             await channel.connect()
@@ -549,7 +504,7 @@ async def open_portal(
     msg_loop_cs: trio.CancelScope|None = None
     if start_msg_loop:
         from ._runtime import process_messages
-        msg_loop_cs = await tn.start(
+        msg_loop_cs = await nursery.start(
             partial(
                 process_messages,
                 actor,
@@ -566,10 +521,12 @@ async def open_portal(
         await portal.aclose()

         if was_connected:
-            await channel.aclose()
+            # gracefully signal remote channel-msg loop
+            await channel.send(None)
+            # await channel.aclose()

         # cancel background msg loop task
-        if msg_loop_cs is not None:
+        if msg_loop_cs:
             msg_loop_cs.cancel()

-        tn.cancel_scope.cancel()
+        nursery.cancel_scope.cancel()
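
The branch's teardown signals the remote msg loop with a `None` sentinel instead of hard-closing the socket; the sentinel-shutdown idea in miniature with `trio` memory channels::

    import trio

    async def msg_loop(rx):
        async for msg in rx:
            if msg is None:  # graceful-shutdown sentinel
                break
            print('handling', msg)

    async def main():
        tx, rx = trio.open_memory_channel(0)
        async with trio.open_nursery() as n:
            n.start_soon(msg_loop, rx)
            await tx.send({'cmd': 'ping'})
            await tx.send(None)  # ask the loop to exit

    trio.run(main)
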
347  tractor/_root.py
@@ -18,15 +18,14 @@
 Root actor runtime ignition(s).

 '''
-from contextlib import asynccontextmanager as acm
+from contextlib import asynccontextmanager
 from functools import partial
 import importlib
-import inspect
 import logging
-import os
 import signal
 import sys
-from typing import Callable
+import os
+import typing
 import warnings
@@ -48,131 +47,60 @@ from ._exceptions import is_multi_cancelled


 # set at startup and after forks
-_default_host: str = '127.0.0.1'
-_default_port: int = 1616
-
-# default registry always on localhost
-_default_lo_addrs: list[tuple[str, int]] = [(
-    _default_host,
-    _default_port,
-)]
+_default_arbiter_host: str = '127.0.0.1'
+_default_arbiter_port: int = 1616


 logger = log.get_logger('tractor')


-@acm
+@asynccontextmanager
 async def open_root_actor(

     *,
     # defaults are above
-    registry_addrs: list[tuple[str, int]]|None = None,
+    arbiter_addr: tuple[str, int] | None = None,

     # defaults are above
-    arbiter_addr: tuple[str, int]|None = None,
+    registry_addr: tuple[str, int] | None = None,

-    name: str|None = 'root',
+    name: str | None = 'root',

     # either the `multiprocessing` start method:
     # https://docs.python.org/3/library/multiprocessing.html#contexts-and-start-methods
     # OR `trio` (the new default).
-    start_method: _spawn.SpawnMethodKey|None = None,
+    start_method: _spawn.SpawnMethodKey | None = None,

     # enables the multi-process debugger support
     debug_mode: bool = False,
-    maybe_enable_greenback: bool = True,  # `.pause_from_sync()/breakpoint()` support
-    enable_stack_on_sig: bool = False,

     # internal logging
-    loglevel: str|None = None,
+    loglevel: str | None = None,

-    enable_modules: list|None = None,
-    rpc_module_paths: list|None = None,
-
-    # NOTE: allow caller to ensure that only one registry exists
-    # and that this call creates it.
-    ensure_registry: bool = False,
-
-    hide_tb: bool = True,
-
-    # XXX, proxied directly to `.devx._debug._maybe_enter_pm()`
-    # for REPL-entry logic.
-    debug_filter: Callable[
-        [BaseException|BaseExceptionGroup],
-        bool,
-    ] = lambda err: not is_multi_cancelled(err),
-
-    # TODO, a way for actors to augment passing derived
-    # read-only state to sublayers?
-    # extra_rt_vars: dict|None = None,
-
-) -> Actor:
+    enable_modules: list | None = None,
+    rpc_module_paths: list | None = None,
+
+) -> typing.Any:
     '''
     Runtime init entry point for ``tractor``.

     '''
-    _debug.hide_runtime_frames()
-    __tracebackhide__: bool = hide_tb
-
-    # TODO: stick this in a `@cm` defined in `devx._debug`?
-    #
     # Override the global debugger hook to make it play nice with
     # ``trio``, see much discussion in:
     # https://github.com/python-trio/trio/issues/1155#issuecomment-742964018
-    builtin_bp_handler: Callable = sys.breakpointhook
-    orig_bp_path: str|None = os.environ.get(
-        'PYTHONBREAKPOINT',
-        None,
-    )
-    if (
-        debug_mode
-        and maybe_enable_greenback
-        and (
-            maybe_mod := await _debug.maybe_init_greenback(
-                raise_not_found=False,
-            )
-        )
-    ):
-        logger.info(
-            f'Found `greenback` installed @ {maybe_mod}\n'
-            'Enabling `tractor.pause_from_sync()` support!\n'
-        )
-        os.environ['PYTHONBREAKPOINT'] = (
-            'tractor.devx._debug._sync_pause_from_builtin'
-        )
-        _state._runtime_vars['use_greenback'] = True
-
-    else:
-        # TODO: disable `breakpoint()` by default (without
-        # `greenback`) since it will break any multi-actor
-        # usage by a clobbered TTY's stdstreams!
-        def block_bps(*args, **kwargs):
-            raise RuntimeError(
-                'Trying to use `breakpoint()` eh?\n\n'
-                'Welp, `tractor` blocks `breakpoint()` built-in calls by default!\n'
-                'If you need to use it please install `greenback` and set '
-                '`debug_mode=True` when opening the runtime '
-                '(either via `.open_nursery()` or `open_root_actor()`)\n'
-            )
-
-        sys.breakpointhook = block_bps
-        # lol ok,
-        # https://docs.python.org/3/library/sys.html#sys.breakpointhook
-        os.environ['PYTHONBREAKPOINT'] = "0"
+    builtin_bp_handler = sys.breakpointhook
+    orig_bp_path: str | None = os.environ.get('PYTHONBREAKPOINT', None)
+    os.environ['PYTHONBREAKPOINT'] = 'tractor.devx._debug.pause_from_sync'

     # attempt to retreive ``trio``'s sigint handler and stash it
     # on our debugger lock state.
-    _debug.DebugStatus._trio_handler = signal.getsignal(signal.SIGINT)
+    _debug.Lock._trio_handler = signal.getsignal(signal.SIGINT)

     # mark top most level process as root actor
     _state._runtime_vars['_is_root'] = True

     # caps based rpc list
-    enable_modules = (
-        enable_modules
-        or
-        []
-    )
+    enable_modules = enable_modules or []

     if rpc_module_paths:
         warnings.warn(
@@ -188,19 +116,20 @@ async def open_root_actor(

     if arbiter_addr is not None:
         warnings.warn(
-            '`arbiter_addr` is now deprecated\n'
-            'Use `registry_addrs: list[tuple]` instead..',
+            '`arbiter_addr` is now deprecated and has been renamed to'
+            '`registry_addr`.\nUse that instead..',
             DeprecationWarning,
             stacklevel=2,
         )
-        registry_addrs = [arbiter_addr]

-    registry_addrs: list[tuple[str, int]] = (
-        registry_addrs
-        or
-        _default_lo_addrs
+    registry_addr = (host, port) = (
+        registry_addr
+        or arbiter_addr
+        or (
+            _default_arbiter_host,
+            _default_arbiter_port,
+        )
     )
-    assert registry_addrs

     loglevel = (
         loglevel
@@ -228,7 +157,6 @@ async def open_root_actor(
     ):
         loglevel = 'PDB'

-
     elif debug_mode:
         raise RuntimeError(
             "Debug mode is only supported for the `trio` backend!"
@@ -239,185 +167,103 @@ async def open_root_actor(
     assert _log

     # TODO: factor this into `.devx._stackscope`!!
-    if (
-        debug_mode
-        and
-        enable_stack_on_sig
-    ):
-        from .devx._stackscope import enable_stack_on_sig
-        enable_stack_on_sig()
-
-    # closed into below ping task-func
-    ponged_addrs: list[tuple[str, int]] = []
-
-    async def ping_tpt_socket(
-        addr: tuple[str, int],
-        timeout: float = 1,
-    ) -> None:
-        '''
-        Attempt temporary connection to see if a registry is
-        listening at the requested address by a tranport layer
-        ping.
-
-        If a connection can't be made quickly we assume none no
-        server is listening at that addr.
-
-        '''
-        try:
-            # TODO: this connect-and-bail forces us to have to
-            # carefully rewrap TCP 104-connection-reset errors as
-            # EOF so as to avoid propagating cancel-causing errors
-            # to the channel-msg loop machinery. Likely it would
-            # be better to eventually have a "discovery" protocol
-            # with basic handshake instead?
-            with trio.move_on_after(timeout):
-                async with _connect_chan(*addr):
-                    ponged_addrs.append(addr)
-
-        except OSError:
-            # TODO: make this a "discovery" log level?
-            logger.info(
-                f'No actor registry found @ {addr}\n'
-            )
-
-    async with trio.open_nursery() as tn:
-        for addr in registry_addrs:
-            tn.start_soon(
-                ping_tpt_socket,
-                tuple(addr),  # TODO: just drop this requirement?
-            )
-
-    trans_bind_addrs: list[tuple[str, int]] = []
-
-    # Create a new local root-actor instance which IS NOT THE
-    # REGISTRAR
-    if ponged_addrs:
-        if ensure_registry:
-            raise RuntimeError(
-                f'Failed to open `{name}`@{ponged_addrs}: '
-                'registry socket(s) already bound'
-            )
+    if debug_mode:
+        try:
+            logger.info('Enabling `stackscope` traces on SIGUSR1')
+            from .devx import enable_stack_on_sig
+            enable_stack_on_sig()
+        except ImportError:
+            logger.warning(
+                '`stackscope` not installed for use in debug mode!'
+            )
+
+    try:
+        # make a temporary connection to see if an arbiter exists,
+        # if one can't be made quickly we assume none exists.
+        arbiter_found = False
+
+        # TODO: this connect-and-bail forces us to have to carefully
+        # rewrap TCP 104-connection-reset errors as EOF so as to avoid
+        # propagating cancel-causing errors to the channel-msg loop
+        # machinery. Likely it would be better to eventually have
+        # a "discovery" protocol with basic handshake instead.
+        with trio.move_on_after(1):
+            async with _connect_chan(host, port):
+                arbiter_found = True
+
+    except OSError:
+        # TODO: make this a "discovery" log level?
+        logger.warning(f"No actor registry found @ {host}:{port}")
+
+    # create a local actor and start up its main routine/task
+    if arbiter_found:

         # we were able to connect to an arbiter
-        logger.info(
-            f'Registry(s) seem(s) to exist @ {ponged_addrs}'
-        )
+        logger.info(f"Arbiter seems to exist @ {host}:{port}")

         actor = Actor(
-            name=name or 'anonymous',
-            registry_addrs=ponged_addrs,
+            name or 'anonymous',
+            arbiter_addr=registry_addr,
             loglevel=loglevel,
             enable_modules=enable_modules,
         )
-        # DO NOT use the registry_addrs as the transport server
-        # addrs for this new non-registar, root-actor.
-        for host, port in ponged_addrs:
-            # NOTE: zero triggers dynamic OS port allocation
-            trans_bind_addrs.append((host, 0))
-
-    # Start this local actor as the "registrar", aka a regular
-    # actor who manages the local registry of "mailboxes" of
-    # other process-tree-local sub-actors.
+        host, port = (host, 0)
+
     else:
-
-        # NOTE that if the current actor IS THE REGISTAR, the
-        # following init steps are taken:
-        # - the tranport layer server is bound to each (host, port)
-        #   pair defined in provided registry_addrs, or the default.
-        trans_bind_addrs = registry_addrs
-
-        # - it is normally desirable for any registrar to stay up
-        #   indefinitely until either all registered (child/sub)
-        #   actors are terminated (via SC supervision) or,
-        #   a re-election process has taken place.
-        # NOTE: all of ^ which is not implemented yet - see:
-        # https://github.com/goodboy/tractor/issues/216
-        # https://github.com/goodboy/tractor/pull/348
-        # https://github.com/goodboy/tractor/issues/296
+        # start this local actor as the arbiter (aka a regular actor who
+        # manages the local registry of "mailboxes")
+
+        # Note that if the current actor is the arbiter it is desirable
+        # for it to stay up indefinitely until a re-election process has
+        # taken place - which is not implemented yet FYI).

         actor = Arbiter(
-            name or 'registrar',
-            registry_addrs=registry_addrs,
+            name or 'arbiter',
+            arbiter_addr=registry_addr,
             loglevel=loglevel,
             enable_modules=enable_modules,
         )
-    # XXX, in case the root actor runtime was actually run from
-    # `tractor.to_asyncio.run_as_asyncio_guest()` and NOt
-    # `.trio.run()`.
-    actor._infected_aio = _state._runtime_vars['_is_infected_aio']

-    # Start up main task set via core actor-runtime nurseries.
     try:
         # assign process-local actor
         _state._current_actor = actor

         # start local channel-server and fake the portal API
         # NOTE: this won't block since we provide the nursery
-        ml_addrs_str: str = '\n'.join(
-            f'@{addr}' for addr in trans_bind_addrs
-        )
-        logger.info(
-            f'Starting local {actor.uid} on the following transport addrs:\n'
-            f'{ml_addrs_str}'
-        )
+        logger.info(f"Starting local {actor} @ {host}:{port}")

         # start the actor runtime in a new task
-        async with trio.open_nursery(
-            strict_exception_groups=False,
-            # ^XXX^ TODO? instead unpack any RAE as per "loose" style?
-        ) as nursery:
-
-            # ``_runtime.async_main()`` creates an internal nursery
-            # and blocks here until any underlying actor(-process)
-            # tree has terminated thereby conducting so called
-            # "end-to-end" structured concurrency throughout an
-            # entire hierarchical python sub-process set; all
-            # "actor runtime" primitives are SC-compat and thus all
-            # transitively spawned actors/processes must be as
-            # well.
+        async with trio.open_nursery() as nursery:
+
+            # ``_runtime.async_main()`` creates an internal nursery and
+            # thus blocks here until the entire underlying actor tree has
+            # terminated thereby conducting structured concurrency.
+
             await nursery.start(
                 partial(
                     async_main,
                     actor,
-                    accept_addrs=trans_bind_addrs,
+                    accept_addr=(host, port),
                     parent_addr=None
                 )
             )
             try:
                 yield actor
+
             except (
                 Exception,
                 BaseExceptionGroup,
             ) as err:

-                # TODO, in beginning to handle the subsubactor with
-                # crashed grandparent cases..
-                #
-                # was_locked: bool = await _debug.maybe_wait_for_debugger(
-                #     child_in_debug=True,
-                # )
-                # XXX NOTE XXX see equiv note inside
-                # `._runtime.Actor._stream_handler()` where in the
-                # non-root or root-that-opened-this-mahually case we
-                # wait for the local actor-nursery to exit before
-                # exiting the transport channel handler.
-                entered: bool = await _debug._maybe_enter_pm(
-                    err,
-                    api_frame=inspect.currentframe(),
-                    debug_filter=debug_filter,
-                )
+                entered = await _debug._maybe_enter_pm(err)

                 if (
                     not entered
                     and
-                    not is_multi_cancelled(
-                        err,
-                    )
+                    not is_multi_cancelled(err)
                 ):
-                    logger.exception('Root actor crashed\n')
+                    logger.exception('Root actor crashed:\n')

-                # ALWAYS re-raise any error bubbled up from the
-                # runtime!
+                # always re-raise
                 raise

             finally:
@@ -438,21 +284,13 @@ async def open_root_actor(
         _state._current_actor = None
         _state._last_actor_terminated = actor

-        # restore built-in `breakpoint()` hook state
-        if (
-            debug_mode
-            and
-            maybe_enable_greenback
-        ):
-            if builtin_bp_handler is not None:
-                sys.breakpointhook = builtin_bp_handler
-
-            if orig_bp_path is not None:
-                os.environ['PYTHONBREAKPOINT'] = orig_bp_path
-
-            else:
-                # clear env back to having no entry
-                os.environ.pop('PYTHONBREAKPOINT', None)
+        # restore breakpoint hook state
+        sys.breakpointhook = builtin_bp_handler
+        if orig_bp_path is not None:
+            os.environ['PYTHONBREAKPOINT'] = orig_bp_path
+        else:
+            # clear env back to having no entry
+            os.environ.pop('PYTHONBREAKPOINT')

         logger.runtime("Root actor terminated")
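
Both sides stash and later restore `sys.breakpointhook` plus the `PYTHONBREAKPOINT` env var; that restore idiom isolated into a context manager::

    from contextlib import contextmanager
    import os
    import sys

    @contextmanager
    def custom_breakpointhook(hook):
        # stash the current hook state ...
        prev_hook = sys.breakpointhook
        prev_env = os.environ.get('PYTHONBREAKPOINT')
        sys.breakpointhook = hook
        try:
            yield
        finally:
            # ... then restore it exactly, clearing the env key if it
            # wasn't set before (the same teardown done inline above).
            sys.breakpointhook = prev_hook
            if prev_env is not None:
                os.environ['PYTHONBREAKPOINT'] = prev_env
            else:
                os.environ.pop('PYTHONBREAKPOINT', None)
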
@@ -462,23 +300,19 @@ def run_daemon(

     # runtime kwargs
     name: str | None = 'root',
-    registry_addrs: list[tuple[str, int]] = _default_lo_addrs,
+    registry_addr: tuple[str, int] = (
+        _default_arbiter_host,
+        _default_arbiter_port,
+    ),

     start_method: str | None = None,
     debug_mode: bool = False,
-
-    # TODO, support `infected_aio=True` mode by,
-    # - calling the appropriate entrypoint-func from `.to_asyncio`
-    # - maybe init-ing `greenback` as done above in
-    #   `open_root_actor()`.
-
     **kwargs

 ) -> None:
     '''
-    Spawn a root (daemon) actor which will respond to RPC; the main
-    task simply starts the runtime and then blocks via embedded
-    `trio.sleep_forever()`.
+    Spawn daemon actor which will respond to RPC; the main task simply
+    starts the runtime and then sleeps forever.

     This is a very minimal convenience wrapper around starting
     a "run-until-cancelled" root actor which can be started with a set
@@ -491,8 +325,9 @@ def run_daemon(
         importlib.import_module(path)

     async def _main():
+
         async with open_root_actor(
-            registry_addrs=registry_addrs,
+            registry_addr=registry_addr,
             name=name,
             start_method=start_method,
             debug_mode=debug_mode,
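
A quick usage sketch matching the branch-side signature (assuming, as in released `tractor`, that `enable_modules` is the leading parameter; the module path is illustrative)::

    import tractor

    # start a long-lived root actor exposing RPC from the listed
    # modules; blocks until cancelled (e.g. via SIGINT).
    tractor.run_daemon(
        ['my_service.api'],  # enable_modules, illustrative path
        name='service-root',
    )
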
1022  tractor/_rpc.py  (diff suppressed: too large)
1308  tractor/_runtime.py  (diff suppressed: too large)
@@ -43,16 +43,12 @@ from tractor._state import (
     is_main_process,
     is_root_process,
     debug_mode,
-    _runtime_vars,
 )
 from tractor.log import get_logger
 from tractor._portal import Portal
 from tractor._runtime import Actor
 from tractor._entry import _mp_main
 from tractor._exceptions import ActorFailure
-from tractor.msg.types import (
-    SpawnSpec,
-)


 if TYPE_CHECKING:
@@ -143,13 +139,11 @@ async def exhaust_portal(
     '''
     __tracebackhide__ = True
     try:
-        log.debug(
-            f'Waiting on final result from {actor.uid}'
-        )
+        log.debug(f"Waiting on final result from {actor.uid}")

         # XXX: streams should never be reaped here since they should
         # always be established and shutdown using a context manager api
-        final: Any = await portal.wait_for_result()
+        final: Any = await portal.result()

     except (
         Exception,
@@ -198,10 +192,7 @@ async def cancel_on_completion(
     # if this call errors we store the exception for later
     # in ``errors`` which will be reraised inside
     # an exception group and we still send out a cancel request
-    result: Any|Exception = await exhaust_portal(
-        portal,
-        actor,
-    )
+    result: Any|Exception = await exhaust_portal(portal, actor)
     if isinstance(result, Exception):
         errors[actor.uid]: Exception = result
         log.cancel(
@@ -223,11 +214,7 @@ async def cancel_on_completion(

 async def hard_kill(
     proc: trio.Process,
-
     terminate_after: int = 1.6,
-    # NOTE: for mucking with `.pause()`-ing inside the runtime
-    # whilst also hacking on it XD
-    # terminate_after: int = 99999,

     # NOTE: for mucking with `.pause()`-ing inside the runtime
     # whilst also hacking on it XD
@@ -250,9 +237,8 @@ async def hard_kill(

     '''
     log.cancel(
-        'Terminating sub-proc\n'
-        f'>x)\n'
-        f' |_{proc}\n'
+        'Terminating sub-proc:\n'
+        f'|_{proc}\n'
     )
     # NOTE: this timeout used to do nothing since we were shielding
     # the ``.wait()`` inside ``new_proc()`` which will pretty much
@@ -298,13 +284,14 @@ async def hard_kill(
         log.critical(
             # 'Well, the #ZOMBIE_LORD_IS_HERE# to collect\n'
             '#T-800 deployed to collect zombie B0\n'
-            f'>x)\n'
-            f' |_{proc}\n'
+            f'|\n'
+            f'|_{proc}\n'
         )
         proc.kill()


 async def soft_kill(

     proc: ProcessType,
     wait_func: Callable[
         [ProcessType],
@@ -327,27 +314,13 @@ async def soft_kill(
     uid: tuple[str, str] = portal.channel.uid
     try:
         log.cancel(
-            f'Soft killing sub-actor via portal request\n'
-            f'\n'
-            f'(c=> {portal.chan.uid}\n'
-            f' |_{proc}\n'
+            'Soft killing sub-actor via `Portal.cancel_actor()`\n'
+            f'|_{proc}\n'
         )
         # wait on sub-proc to signal termination
         await wait_func(proc)

     except trio.Cancelled:
-        with trio.CancelScope(shield=True):
-            await maybe_wait_for_debugger(
-                child_in_debug=_runtime_vars.get(
-                    '_debug_mode', False
-                ),
-                header_msg=(
-                    'Delaying `soft_kill()` subproc reaper while debugger locked..\n'
-                ),
-                # TODO: need a diff value then default?
-                # poll_steps=9999999,
-            )
-
         # if cancelled during a soft wait, cancel the child
         # actor before entering the hard reap sequence
         # below. This means we try to do a graceful teardown
@@ -392,7 +365,7 @@ async def new_proc(
     errors: dict[tuple[str, str], Exception],

     # passed through to actor main
-    bind_addrs: list[tuple[str, int]],
+    bind_addr: tuple[str, int],
     parent_addr: tuple[str, int],
     _runtime_vars: dict[str, Any],  # serialized and sent to _child
@@ -414,7 +387,7 @@ async def new_proc(
             actor_nursery,
             subactor,
             errors,
-            bind_addrs,
+            bind_addr,
             parent_addr,
             _runtime_vars,  # run time vars
             infect_asyncio=infect_asyncio,
@@ -429,7 +402,7 @@ async def trio_proc(
     errors: dict[tuple[str, str], Exception],

     # passed through to actor main
-    bind_addrs: list[tuple[str, int]],
+    bind_addr: tuple[str, int],
     parent_addr: tuple[str, int],
     _runtime_vars: dict[str, Any],  # serialized and sent to _child
     *,
@@ -475,9 +448,10 @@ async def trio_proc(
     proc: trio.Process|None = None
     try:
         try:
-            proc: trio.Process = await trio.lowlevel.open_process(spawn_cmd)
+            # TODO: needs ``trio_typing`` patch?
+            proc = await trio.lowlevel.open_process(spawn_cmd)
             log.runtime(
-                'Started new child\n'
+                'Started new sub-proc\n'
                 f'|_{proc}\n'
             )
@@ -515,20 +489,18 @@ async def trio_proc(
                 portal,
             )

-            # send a "spawning specification" which configures the
-            # initial runtime state of the child.
-            await chan.send(
-                SpawnSpec(
-                    _parent_main_data=subactor._parent_main_data,
-                    enable_modules=subactor.enable_modules,
-                    reg_addrs=subactor.reg_addrs,
-                    bind_addrs=bind_addrs,
-                    _runtime_vars=_runtime_vars,
-                )
-            )
+            # send additional init params
+            await chan.send({
+                "_parent_main_data": subactor._parent_main_data,
+                "enable_modules": subactor.enable_modules,
+                "_arb_addr": subactor._arb_addr,
+                "bind_host": bind_addr[0],
+                "bind_port": bind_addr[1],
+                "_runtime_vars": _runtime_vars,
+            })

             # track subactor in current nursery
-            curr_actor: Actor = current_actor()
+            curr_actor = current_actor()
             curr_actor._actoruid2nursery[subactor.uid] = actor_nursery

             # resume caller at next checkpoint now that child is up
@@ -559,9 +531,8 @@ async def trio_proc(
             # cancel result waiter that may have been spawned in
             # tandem if not done already
             log.cancel(
-                'Cancelling portal result reaper task\n'
-                f'>c)\n'
-                f' |_{subactor.uid}\n'
+                'Cancelling existing result waiter task for '
+                f'{subactor.uid}'
             )
             nursery.cancel_scope.cancel()
@@ -570,13 +541,9 @@ async def trio_proc(
         # allowed! Do this **after** cancellation/teardown to avoid
         # killing the process too early.
         if proc:
-            log.cancel(
-                f'Hard reap sequence starting for subactor\n'
-                f'>x)\n'
-                f' |_{subactor}@{subactor.uid}\n'
-            )
+            log.cancel(f'Hard reap sequence starting for {subactor.uid}')

             with trio.CancelScope(shield=True):

                 # don't clobber an ongoing pdb
                 if cancelled_during_spawn:
                     # Try again to avoid TTY clobbering.
@@ -635,7 +602,7 @@ async def mp_proc(
     subactor: Actor,
     errors: dict[tuple[str, str], Exception],
     # passed through to actor main
-    bind_addrs: list[tuple[str, int]],
+    bind_addr: tuple[str, int],
     parent_addr: tuple[str, int],
     _runtime_vars: dict[str, Any],  # serialized and sent to _child
     *,
@@ -693,7 +660,7 @@ async def mp_proc(
         target=_mp_main,
         args=(
             subactor,
-            bind_addrs,
+            bind_addr,
             fs_info,
             _spawn_method,
             parent_addr,
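
The reap sequence both sides implement is two-phase: a graceful soft-kill wait, then SIGKILL escalation after `terminate_after` seconds; the skeleton in plain `trio` terms::

    import trio

    async def reap(proc: trio.Process, terminate_after: float = 1.6):
        # phase 1: politely signal, then wait for exit
        proc.terminate()
        with trio.move_on_after(terminate_after) as cs:
            await proc.wait()
        if cs.cancelled_caught:
            # phase 2: zombie collection, no more mr nice guy
            proc.kill()
            await proc.wait()
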
@@ -19,35 +19,21 @@ Per process state

 """
 from __future__ import annotations
-from contextvars import (
-    ContextVar,
-)
 from typing import (
     Any,
     TYPE_CHECKING,
 )

-from trio.lowlevel import current_task
-
 if TYPE_CHECKING:
     from ._runtime import Actor
-    from ._context import Context


 _current_actor: Actor|None = None  # type: ignore # noqa
 _last_actor_terminated: Actor|None = None

-# TODO: mk this a `msgspec.Struct`!
 _runtime_vars: dict[str, Any] = {
     '_debug_mode': False,
     '_is_root': False,
-    '_root_mailbox': (None, None),
-    '_registry_addrs': [],
-
-    '_is_infected_aio': False,
-
-    # for `tractor.pause_from_sync()` & `breakpoint()` support
-    'use_greenback': False,
+    '_root_mailbox': (None, None)
 }
@@ -72,10 +58,9 @@ def current_actor(
     '''
     if (
         err_on_no_runtime
-        and
-        _current_actor is None
+        and _current_actor is None
     ):
-        msg: str = 'No local actor has been initialized yet?\n'
+        msg: str = 'No local actor has been initialized yet'
         from ._exceptions import NoRuntime

         if last := last_actor():
@@ -88,8 +73,8 @@ def current_actor(
         # this process.
         else:
             msg += (
-                # 'No last actor found?\n'
-                '\nDid you forget to call one of,\n'
+                'No last actor found?\n'
+                'Did you forget to open one of:\n\n'
                 '- `tractor.open_root_actor()`\n'
                 '- `tractor.open_nursery()`\n'
             )
@@ -108,7 +93,6 @@ def is_main_process() -> bool:
     return mp.current_process().name == 'MainProcess'


-# TODO, more verby name?
 def debug_mode() -> bool:
     '''
     Bool determining if "debug mode" is on which enables
@@ -120,26 +104,3 @@ def debug_mode() -> bool:

 def is_root_process() -> bool:
     return _runtime_vars['_is_root']
-
-
-_ctxvar_Context: ContextVar[Context] = ContextVar(
-    'ipc_context',
-    default=None,
-)
-
-
-def current_ipc_ctx(
-    error_on_not_set: bool = False,
-) -> Context|None:
-    ctx: Context = _ctxvar_Context.get()
-
-    if (
-        not ctx
-        and error_on_not_set
-    ):
-        from ._exceptions import InternalError
-        raise InternalError(
-            'No IPC context has been allocated for this task yet?\n'
-            f'|_{current_task()}\n'
-        )
-    return ctx
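
`main`'s `current_ipc_ctx()` is just a task-local lookup via stdlib `contextvars`; the mechanism in miniature::

    from contextvars import ContextVar

    _ctxvar: ContextVar = ContextVar('ipc_context', default=None)

    def current_ipc_ctx():
        # returns whatever the runtime last bound for this task tree
        return _ctxvar.get()

    tok = _ctxvar.set('some-ctx')
    assert current_ipc_ctx() == 'some-ctx'
    _ctxvar.reset(tok)
    assert current_ipc_ctx() is None
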
@@ -26,7 +26,6 @@ import inspect
 from pprint import pformat
 from typing import (
     Any,
-    AsyncGenerator,
     Callable,
     AsyncIterator,
     TYPE_CHECKING,
@@ -36,27 +35,17 @@ import warnings
 import trio

 from ._exceptions import (
+    _raise_from_no_key_in_msg,
     ContextCancelled,
-    RemoteActorError,
 )
 from .log import get_logger
 from .trionics import (
     broadcast_receiver,
     BroadcastReceiver,
 )
-from tractor.msg import (
-    Error,
-    Return,
-    Stop,
-    MsgType,
-    PayloadT,
-    Yield,
-)

 if TYPE_CHECKING:
-    from ._runtime import Actor
     from ._context import Context
-    from ._ipc import Channel


 log = get_logger(__name__)
@@ -70,9 +59,10 @@ log = get_logger(__name__)
 class MsgStream(trio.abc.Channel):
     '''
     A bidirectional message stream for receiving logically sequenced
-    values over an inter-actor IPC `Channel`.
+    values over an inter-actor IPC ``Channel``.

+    This is the type returned to a local task which entered either
+    ``Portal.open_stream_from()`` or ``Context.open_stream()``.

     Termination rules:
@@ -88,109 +78,46 @@ class MsgStream(trio.abc.Channel):
         self,
         ctx: Context,  # typing: ignore # noqa
         rx_chan: trio.MemoryReceiveChannel,
-        _broadcaster: BroadcastReceiver|None = None,
+        _broadcaster: BroadcastReceiver | None = None,

     ) -> None:
         self._ctx = ctx
         self._rx_chan = rx_chan
         self._broadcaster = _broadcaster

-        # any actual IPC msg which is effectively an `EndOfStream`
-        self._stop_msg: bool|Stop = False
-
         # flag to denote end of stream
         self._eoc: bool|trio.EndOfChannel = False
         self._closed: bool|trio.ClosedResourceError = False

-    @property
-    def ctx(self) -> Context:
-        '''
-        A read-only ref to this stream's inter-actor-task `Context`.
-
-        '''
-        return self._ctx
-
-    @property
-    def chan(self) -> Channel:
-        '''
-        Ref to the containing `Context`'s transport `Channel`.
-
-        '''
-        return self._ctx.chan
-
-    # TODO: could we make this a direct method bind to `PldRx`?
-    # -> receive_nowait = PldRx.recv_pld
-    # |_ means latter would have to accept `MsgStream`-as-`self`?
-    #  => should be fine as long as,
-    #  -[ ] both define `._rx_chan`
-    #  -[ ] .ctx is bound into `PldRx` using a `@cm`?
-    #
     # delegate directly to underlying mem channel
|
||||||
def receive_nowait(
|
def receive_nowait(
|
||||||
self,
|
self,
|
||||||
expect_msg: MsgType = Yield,
|
allow_msg_keys: list[str] = ['yield'],
|
||||||
) -> PayloadT:
|
):
|
||||||
ctx: Context = self._ctx
|
msg: dict = self._rx_chan.receive_nowait()
|
||||||
(
|
for (
|
||||||
msg,
|
i,
|
||||||
pld,
|
key,
|
||||||
) = ctx._pld_rx.recv_msg_nowait(
|
) in enumerate(allow_msg_keys):
|
||||||
ipc=self,
|
try:
|
||||||
expect_msg=expect_msg,
|
return msg[key]
|
||||||
)
|
except KeyError as kerr:
|
||||||
|
if i < (len(allow_msg_keys) - 1):
|
||||||
|
continue
|
||||||
|
|
||||||
# ?TODO, maybe factor this into a hyper-common `unwrap_pld()`
|
_raise_from_no_key_in_msg(
|
||||||
#
|
ctx=self._ctx,
|
||||||
match msg:
|
msg=msg,
|
||||||
|
src_err=kerr,
|
||||||
# XXX, these never seems to ever hit? cool?
|
log=log,
|
||||||
case Stop():
|
expect_key=key,
|
||||||
log.cancel(
|
stream=self,
|
||||||
f'Msg-stream was ended via stop msg\n'
|
|
||||||
f'{msg}'
|
|
||||||
)
|
)
|
||||||
case Error():
|
|
||||||
log.error(
|
|
||||||
f'Msg-stream was ended via error msg\n'
|
|
||||||
f'{msg}'
|
|
||||||
)
|
|
||||||
|
|
||||||
# XXX NOTE, always set any final result on the ctx to
|
|
||||||
# avoid teardown race conditions where previously this msg
|
|
||||||
# would be consumed silently (by `.aclose()` doing its
|
|
||||||
# own "msg drain loop" but WITHOUT those `drained: lists[MsgType]`
|
|
||||||
# being post-close-processed!
|
|
||||||
#
|
|
||||||
# !!TODO, see the equiv todo-comment in `.receive()`
|
|
||||||
# around the `if drained:` where we should prolly
|
|
||||||
# ACTUALLY be doing this post-close processing??
|
|
||||||
#
|
|
||||||
case Return(pld=pld):
|
|
||||||
log.warning(
|
|
||||||
f'Msg-stream final result msg for IPC ctx?\n'
|
|
||||||
f'{msg}'
|
|
||||||
)
|
|
||||||
# XXX TODO, this **should be covered** by higher
|
|
||||||
# scoped runtime-side method calls such as
|
|
||||||
# `Context._deliver_msg()`, so you should never
|
|
||||||
# really see the warning above or else something
|
|
||||||
# racy/out-of-order is likely going on between
|
|
||||||
# actor-runtime-side push tasks and the user-app-side
|
|
||||||
# consume tasks!
|
|
||||||
# -[ ] figure out that set of race cases and fix!
|
|
||||||
# -[ ] possibly return the `msg` given an input
|
|
||||||
# arg-flag is set so we can process the `Return`
|
|
||||||
# from the `.aclose()` caller?
|
|
||||||
#
|
|
||||||
# breakpoint() # to debug this RACE CASE!
|
|
||||||
ctx._result = pld
|
|
||||||
ctx._outcome_msg = msg
|
|
||||||
|
|
||||||
return pld
|
|
||||||
|
|
||||||
async def receive(
|
async def receive(
|
||||||
self,
|
self,
|
||||||
hide_tb: bool = False,
|
|
||||||
|
hide_tb: bool = True,
|
||||||
):
|
):
|
||||||
'''
|
'''
|
||||||
Receive a single msg from the IPC transport, the next in
|
Receive a single msg from the IPC transport, the next in
|
||||||
|
@ -200,8 +127,9 @@ class MsgStream(trio.abc.Channel):
|
||||||
'''
|
'''
|
||||||
__tracebackhide__: bool = hide_tb
|
__tracebackhide__: bool = hide_tb
|
||||||
|
|
||||||
# NOTE FYI: `trio.ReceiveChannel` implements EOC handling as
|
# NOTE: `trio.ReceiveChannel` implements
|
||||||
# follows (aka uses it to gracefully exit async for loops):
|
# EOC handling as follows (aka uses it
|
||||||
|
# to gracefully exit async for loops):
|
||||||
#
|
#
|
||||||
# async def __anext__(self) -> ReceiveType:
|
# async def __anext__(self) -> ReceiveType:
|
||||||
# try:
|
# try:
|
||||||
|
@ -209,7 +137,7 @@ class MsgStream(trio.abc.Channel):
|
||||||
# except trio.EndOfChannel:
|
# except trio.EndOfChannel:
|
||||||
# raise StopAsyncIteration
|
# raise StopAsyncIteration
|
||||||
#
|
#
|
||||||
# see `.aclose()` for notes on the old behaviour prior to
|
# see ``.aclose()`` for notes on the old behaviour prior to
|
||||||
# introducing this
|
# introducing this
|
||||||
if self._eoc:
|
if self._eoc:
|
||||||
raise self._eoc
|
raise self._eoc
|
||||||
|
@ -219,33 +147,62 @@ class MsgStream(trio.abc.Channel):
|
||||||
|
|
||||||
src_err: Exception|None = None # orig tb
|
src_err: Exception|None = None # orig tb
|
||||||
try:
|
try:
|
||||||
ctx: Context = self._ctx
|
try:
|
||||||
pld = await ctx._pld_rx.recv_pld(
|
msg = await self._rx_chan.receive()
|
||||||
ipc=self,
|
return msg['yield']
|
||||||
expect_msg=Yield,
|
|
||||||
)
|
except KeyError as kerr:
|
||||||
return pld
|
src_err = kerr
|
||||||
|
|
||||||
|
# NOTE: may raise any of the below error types
|
||||||
|
# includg EoC when a 'stop' msg is found.
|
||||||
|
_raise_from_no_key_in_msg(
|
||||||
|
ctx=self._ctx,
|
||||||
|
msg=msg,
|
||||||
|
src_err=kerr,
|
||||||
|
log=log,
|
||||||
|
expect_key='yield',
|
||||||
|
stream=self,
|
||||||
|
)
|
||||||
|
|
||||||
# XXX: the stream terminates on either of:
|
# XXX: the stream terminates on either of:
|
||||||
# - `self._rx_chan.receive()` raising after manual closure
|
# - via `self._rx_chan.receive()` raising after manual closure
|
||||||
# by the rpc-runtime,
|
# by the rpc-runtime OR,
|
||||||
# OR
|
# - via a received `{'stop': ...}` msg from remote side.
|
||||||
# - via a `Stop`-msg received from remote peer task.
|
# |_ NOTE: previously this was triggered by calling
|
||||||
# NOTE
|
# ``._rx_chan.aclose()`` on the send side of the channel inside
|
||||||
# |_ previously this was triggered by calling
|
# `Actor._push_result()`, but now the 'stop' message handling
|
||||||
# `._rx_chan.aclose()` on the send side of the channel
|
# has been put just above inside `_raise_from_no_key_in_msg()`.
|
||||||
# inside `Actor._deliver_ctx_payload()`, but now the 'stop'
|
except (
|
||||||
# message handling gets delegated to `PldRFx.recv_pld()`
|
trio.EndOfChannel,
|
||||||
# internals.
|
) as eoc:
|
||||||
except trio.EndOfChannel as eoc:
|
|
||||||
# a graceful stream finished signal
|
|
||||||
self._eoc = eoc
|
|
||||||
src_err = eoc
|
src_err = eoc
|
||||||
|
self._eoc = eoc
|
||||||
|
|
||||||
# a `ClosedResourceError` indicates that the internal feeder
|
# TODO: Locally, we want to close this stream gracefully, by
|
||||||
# memory receive channel was closed likely by the runtime
|
# terminating any local consumers tasks deterministically.
|
||||||
# after the associated transport-channel disconnected or
|
# Once we have broadcast support, we **don't** want to be
|
||||||
# broke.
|
# closing this stream and not flushing a final value to
|
||||||
|
# remaining (clone) consumers who may not have been
|
||||||
|
# scheduled to receive it yet.
|
||||||
|
# try:
|
||||||
|
# maybe_err_msg_or_res: dict = self._rx_chan.receive_nowait()
|
||||||
|
# if maybe_err_msg_or_res:
|
||||||
|
# log.warning(
|
||||||
|
# 'Discarding un-processed msg:\n'
|
||||||
|
# f'{maybe_err_msg_or_res}'
|
||||||
|
# )
|
||||||
|
# except trio.WouldBlock:
|
||||||
|
# # no queued msgs that might be another remote
|
||||||
|
# # error, so just raise the original EoC
|
||||||
|
# pass
|
||||||
|
|
||||||
|
# raise eoc
|
||||||
|
|
||||||
|
# a ``ClosedResourceError`` indicates that the internal
|
||||||
|
# feeder memory receive channel was closed likely by the
|
||||||
|
# runtime after the associated transport-channel
|
||||||
|
# disconnected or broke.
|
||||||
except trio.ClosedResourceError as cre: # by self._rx_chan.receive()
|
except trio.ClosedResourceError as cre: # by self._rx_chan.receive()
|
||||||
src_err = cre
|
src_err = cre
|
||||||
log.warning(
|
log.warning(
|
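
As the EoC notes above describe, a `trio.EndOfChannel` raised out of `.receive()` is exactly what terminates an `async for` over the stream. A minimal consumer sketch (hedged: `stream` is assumed to be a `MsgStream` yielded by an enclosing `ctx.open_stream()` block):

    async def consume(stream) -> list:
        results = []
        # `MsgStream.__anext__()` converts the `trio.EndOfChannel`
        # raised by `.receive()` into `StopAsyncIteration`, so a remote
        # 'stop' msg simply ends this loop instead of crashing the task.
        async for value in stream:
            results.append(value)
        return results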
@@ -257,60 +214,47 @@ class MsgStream(trio.abc.Channel):
         # terminated and signal this local iterator to stop
         drained: list[Exception|dict] = await self.aclose()
         if drained:
-            # ^^^^^^^^TODO? pass these to the `._ctx._drained_msgs:
-            #  deque` and then iterate them as part of any
-            #  `.wait_for_result()` call?
-            #
-            # -[ ] move the match-case processing from
-            #    `.receive_nowait()` instead to right here, use it from
-            #    a for msg in drained:` post-proc loop?
-            #
+            # from .devx import pause
+            # await pause()
             log.warning(
-                'Drained context msgs during closure\n\n'
+                'Drained context msgs during closure:\n'
                 f'{drained}'
             )
+            # TODO: pass these to the `._ctx._drained_msgs: deque`
+            # and then iterate them as part of any `.result()` call?

         # NOTE XXX: if the context was cancelled or remote-errored
         #           but we received the stream close msg first, we
         #           probably want to instead raise the remote error
         #           over the end-of-stream connection error since likely
         #           the remote error was the source cause?
-        # ctx: Context = self._ctx
+        ctx: Context = self._ctx
         ctx.maybe_raise(
             raise_ctxc_from_self_call=True,
-            from_src_exc=src_err,
         )

-        # propagate any error but hide low-level frame details from
-        # the caller by default for console/debug-REPL noise
-        # reduction.
+        # propagate any error but hide low-level frame details
+        # from the caller by default for debug noise reduction.
         if (
             hide_tb
-            and (
-
-                # XXX NOTE special conditions: don't reraise on
-                # certain stream-specific internal error types like,
-                #
-                # - `trio.EoC` since we want to use the exact instance
-                #   to ensure that it is the error that bubbles upward
-                #   for silent absorption by `Context.open_stream()`.
-                not self._eoc
-
-                # - `RemoteActorError` (or subtypes like ctxc)
-                #    since we want to present the error as though it is
-                #    "sourced" directly from this `.receive()` call and
-                #    generally NOT include the stack frames raised from
-                #    inside the `PldRx` and/or the transport stack
-                #    layers.
-                or isinstance(src_err, RemoteActorError)
-            )
+            # XXX NOTE XXX don't reraise on certain
+            # stream-specific internal error types like,
+            #
+            # - `trio.EoC` since we want to use the exact instance
+            #   to ensure that it is the error that bubbles upward
+            #   for silent absorption by `Context.open_stream()`.
+            and not self._eoc
+
+            # - `RemoteActorError` (or `ContextCancelled`) if it gets
+            #   raised from `_raise_from_no_key_in_msg()` since we
+            #   want the same (as the above bullet) for any
+            #   `.open_context()` block bubbled error raised by
+            #   any nearby ctx API remote-failures.
+            # and not isinstance(src_err, RemoteActorError)
         ):
             raise type(src_err)(*src_err.args) from src_err
         else:
-            # for any non-graceful-EOC we want to NOT hide this frame
-            if not self._eoc:
-                __tracebackhide__: bool = False
-
             raise src_err

     async def aclose(self) -> list[Exception|dict]:

@@ -327,6 +271,9 @@ class MsgStream(trio.abc.Channel):
         - more or less we try to maintain adherence to trio's `.aclose()` semantics:
           https://trio.readthedocs.io/en/stable/reference-io.html#trio.abc.AsyncResource.aclose
         '''
+        # rx_chan = self._rx_chan

         # XXX NOTE XXX
         # it's SUPER IMPORTANT that we ensure we don't DOUBLE
         # DRAIN msgs on closure so avoid getting stuck handing on

@@ -338,16 +285,14 @@ class MsgStream(trio.abc.Channel):
             # this stream has already been closed so silently succeed as
             # per ``trio.AsyncResource`` semantics.
             # https://trio.readthedocs.io/en/stable/reference-io.html#trio.abc.AsyncResource.aclose
-            # import tractor
-            # await tractor.pause()
             return []

         ctx: Context = self._ctx
         drained: list[Exception|dict] = []
         while not drained:
             try:
-                maybe_final_msg: Yield|Return = self.receive_nowait(
-                    expect_msg=Yield|Return,
+                maybe_final_msg = self.receive_nowait(
+                    allow_msg_keys=['yield', 'return'],
                 )
                 if maybe_final_msg:
                     log.debug(

@@ -432,30 +377,17 @@ class MsgStream(trio.abc.Channel):
         # await rx_chan.aclose()

         if not self._eoc:
-            this_side: str = self._ctx.side
-            peer_side: str = self._ctx.peer_side
-            message: str = (
-                f'Stream self-closed by {this_side!r}-side before EoC from {peer_side!r}\n'
-                # } bc a stream is a "scope"/msging-phase inside an IPC
-                f'x}}>\n'
-                f'  |_{self}\n'
-            )
-            log.cancel(message)
-            self._eoc = trio.EndOfChannel(message)
-
-            if (
-                (rx_chan := self._rx_chan)
-                and
-                (stats := rx_chan.statistics()).tasks_waiting_receive
-            ):
-                log.cancel(
-                    f'Msg-stream is closing but there is still reader tasks,\n'
-                    f'{stats}\n'
-                )
+            log.cancel(
+                'Stream closed before it received an EoC?\n'
+                'Setting eoc manually..\n..'
+            )
+            self._eoc: bool = trio.EndOfChannel(
+                f'Context stream closed by {self._ctx.side}\n'
+                f'|_{self}\n'
+            )

         # ?XXX WAIT, why do we not close the local mem chan `._rx_chan` XXX?
         # => NO, DEFINITELY NOT! <=
-        # if we're a bi-dir `MsgStream` BECAUSE this same
+        # if we're a bi-dir ``MsgStream`` BECAUSE this same
         # core-msg-loop mem recv-chan is used to deliver the
         # potential final result from the surrounding inter-actor
         # `Context` so we don't want to close it until that

@@ -537,9 +469,6 @@ class MsgStream(trio.abc.Channel):
             self,
             # use memory channel size by default
             self._rx_chan._state.max_buffer_size,  # type: ignore

-            # TODO: can remove this kwarg right since
-            # by default behaviour is to do this anyway?
             receive_afunc=self.receive,
         )

@@ -586,10 +515,11 @@ class MsgStream(trio.abc.Channel):

         try:
             await self._ctx.chan.send(
-                payload=Yield(
-                    cid=self._ctx.cid,
-                    pld=data,
-                ),
+                payload={
+                    'yield': data,
+                    'cid': self._ctx.cid,
+                },
+                # hide_tb=hide_tb,
             )
         except (
             trio.ClosedResourceError,

@@ -603,224 +533,6 @@ class MsgStream(trio.abc.Channel):
         else:
             raise

-    # TODO: msg capability context api1
-    # @acm
-    # async def enable_msg_caps(
-    #     self,
-    #     msg_subtypes: Union[
-    #         list[list[Struct]],
-    #         Protocol,   # hypothetical type that wraps a msg set
-    #     ],
-    # ) -> tuple[Callable, Callable]:  # payload enc, dec pair
-    #     ...
-
-
-@acm
-async def open_stream_from_ctx(
-    ctx: Context,
-    allow_overruns: bool|None = False,
-    msg_buffer_size: int|None = None,
-
-) -> AsyncGenerator[MsgStream, None]:
-    '''
-    Open a `MsgStream`, a bi-directional msg transport dialog
-    connected to the cross-actor peer task for an IPC `Context`.
-
-    This context manager must be entered in both the "parent" (task
-    which entered `Portal.open_context()`) and "child" (RPC task
-    which is decorated by `@context`) tasks for the stream to
-    logically be considered "open"; if one side begins sending to an
-    un-opened peer, depending on policy config, msgs will either be
-    queued until the other side opens and/or a `StreamOverrun` will
-    (eventually) be raised.
-
-    ------ - ------
-
-    Runtime semantics design:
-
-    A `MsgStream` session adheres to "one-shot use" semantics,
-    meaning if you close the scope it **can not** be "re-opened".
-
-    Instead you must re-establish a new surrounding RPC `Context`
-    (RTC: remote task context?) using `Portal.open_context()`.
-
-    In the future this *design choice* may need to be changed but
-    currently there seems to be no obvious reason to support such
-    semantics..
-
-    - "pausing a stream" can be supported with a message implemented
-      by the `tractor` application dev.
-
-    - any remote error will normally require a restart of the entire
-      `trio.Task`'s scope due to the nature of `trio`'s cancellation
-      (`CancelScope`) system and semantics (level triggered).
-
-    '''
-    actor: Actor = ctx._actor
-
-    # If the surrounding context has been cancelled by some
-    # task with a handle to THIS, we error here immediately
-    # since it likely means the surrounding lexical-scope has
-    # errored, been `trio.Cancelled` or at the least
-    # `Context.cancel()` was called by some task.
-    if ctx._cancel_called:
-
-        # XXX NOTE: ALWAYS RAISE any remote error here even if
-        # it's an expected `ContextCancelled` due to a local
-        # task having called `.cancel()`!
-        #
-        # WHY: we expect the error to always bubble up to the
-        # surrounding `Portal.open_context()` call and be
-        # absorbed there (silently) and we DO NOT want to
-        # actually try to stream - a cancel msg was already
-        # sent to the other side!
-        ctx.maybe_raise(
-            raise_ctxc_from_self_call=True,
-        )
-        # NOTE: this is diff then calling
-        # `._maybe_raise_remote_err()` specifically
-        # because we want to raise a ctxc on any task entering this `.open_stream()`
-        # AFTER cancellation was already been requested,
-        # we DO NOT want to absorb any ctxc ACK silently!
-        # if ctx._remote_error:
-        #     raise ctx._remote_error
-
-        # XXX NOTE: if no `ContextCancelled` has been responded
-        # back from the other side (yet), we raise a different
-        # runtime error indicating that this task's usage of
-        # `Context.cancel()` and then `.open_stream()` is WRONG!
-        task: str = trio.lowlevel.current_task().name
-        raise RuntimeError(
-            'Stream opened after `Context.cancel()` called..?\n'
-            f'task: {actor.uid[0]}:{task}\n'
-            f'{ctx}'
-        )
-
-    if (
-        not ctx._portal
-        and not ctx._started_called
-    ):
-        raise RuntimeError(
-            'Context.started()` must be called before opening a stream'
-        )
-
-    # NOTE: in one way streaming this only happens on the
-    # parent-ctx-task side (on the side that calls
-    # `Actor.start_remote_task()`) so if you try to send
-    # a stop from the caller to the callee in the
-    # single-direction-stream case you'll get a lookup error
-    # currently.
-    ctx: Context = actor.get_context(
-        chan=ctx.chan,
-        cid=ctx.cid,
-        nsf=ctx._nsf,
-        # side=ctx.side,
-
-        msg_buffer_size=msg_buffer_size,
-        allow_overruns=allow_overruns,
-    )
-    ctx._allow_overruns: bool = allow_overruns
-    assert ctx is ctx
-
-    # XXX: If the underlying channel feeder receive mem chan has
-    # been closed then likely client code has already exited
-    # a ``.open_stream()`` block prior or there was some other
-    # unanticipated error or cancellation from ``trio``.
-
-    if ctx._rx_chan._closed:
-        raise trio.ClosedResourceError(
-            'The underlying channel for this stream was already closed!\n'
-        )
-
-    # NOTE: implicitly this will call `MsgStream.aclose()` on
-    # `.__aexit__()` due to stream's parent `Channel` type!
-    #
-    # XXX NOTE XXX: ensures the stream is "one-shot use",
-    # which specifically means that on exit,
-    # - signal ``trio.EndOfChannel``/``StopAsyncIteration`` to
-    #   the far end indicating that the caller exited
-    #   the streaming context purposefully by letting
-    #   the exit block exec.
-    # - this is diff from the cancel/error case where
-    #   a cancel request from this side or an error
-    #   should be sent to the far end indicating the
-    #   stream WAS NOT just closed normally/gracefully.
-    async with MsgStream(
-        ctx=ctx,
-        rx_chan=ctx._rx_chan,
-    ) as stream:
-
-        # NOTE: we track all existing streams per portal for
-        # the purposes of attempting graceful closes on runtime
-        # cancel requests.
-        if ctx._portal:
-            ctx._portal._streams.add(stream)
-
-        try:
-            ctx._stream_opened: bool = True
-            ctx._stream = stream
-
-            # XXX: do we need this?
-            # ensure we aren't cancelled before yielding the stream
-            # await trio.lowlevel.checkpoint()
-            yield stream
-
-            # XXX: (MEGA IMPORTANT) if this is a root opened process we
-            # wait for any immediate child in debug before popping the
-            # context from the runtime msg loop otherwise inside
-            # ``Actor._deliver_ctx_payload()`` the msg will be discarded and in
-            # the case where that msg is global debugger unlock (via
-            # a "stop" msg for a stream), this can result in a deadlock
-            # where the root is waiting on the lock to clear but the
-            # child has already cleared it and clobbered IPC.
-            #
-            # await maybe_wait_for_debugger()
-
-            # XXX TODO: pretty sure this isn't needed (see
-            # note above this block) AND will result in
-            # a double `.send_stop()` call. The only reason to
-            # put it here would be to due with "order" in
-            # terms of raising any remote error (as per
-            # directly below) or bc the stream's
-            # `.__aexit__()` block might not get run
-            # (doubtful)? Either way if we did put this back
-            # in we also need a state var to avoid the double
-            # stop-msg send..
-            #
-            # await stream.aclose()
-
-        # NOTE: absorb and do not raise any
-        # EoC received from the other side such that
-        # it is not raised inside the surrounding
-        # context block's scope!
-        except trio.EndOfChannel as eoc:
-            if (
-                eoc
-                and
-                stream.closed
-            ):
-                # sanity, can remove?
-                assert eoc is stream._eoc
-
-                log.warning(
-                    'Stream was terminated by EoC\n\n'
-                    # NOTE: won't show the error <Type> but
-                    # does show txt followed by IPC msg.
-                    f'{str(eoc)}\n'
-                )
-
-        finally:
-            if ctx._portal:
-                try:
-                    ctx._portal._streams.remove(stream)
-                except KeyError:
-                    log.warning(
-                        f'Stream was already destroyed?\n'
-                        f'actor: {ctx.chan.uid}\n'
-                        f'ctx id: {ctx.cid}'
-                    )


 def stream(func: Callable) -> Callable:
     '''

@@ -829,7 +541,7 @@ def stream(func: Callable) -> Callable:
     '''
    # TODO: apply whatever solution ``mypy`` ends up picking for this:
    # https://github.com/python/mypy/issues/2087#issuecomment-769266912
-    func._tractor_stream_function: bool = True  # type: ignore
+    func._tractor_stream_function = True  # type: ignore

     sig = inspect.signature(func)
     params = sig.parameters
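
The docstring for the (deleted) `open_stream_from_ctx()` above describes the two-sided open requirement and "one-shot use" semantics. A hedged end-to-end sketch of that flow using the public `Portal.open_context()`/`Context.open_stream()` API (actor and function names are illustrative only):

    import tractor
    import trio

    @tractor.context
    async def echo(ctx: tractor.Context) -> None:
        await ctx.started()
        # child side: the stream is "open" only once both tasks enter
        # their respective `.open_stream()` blocks.
        async with ctx.open_stream() as stream:
            async for msg in stream:
                await stream.send(msg)

    async def main() -> None:
        async with tractor.open_nursery() as an:
            portal = await an.start_actor('echoer', enable_modules=[__name__])
            async with (
                portal.open_context(echo) as (ctx, _first),
                ctx.open_stream() as stream,
            ):
                await stream.send('ping')
                assert await stream.receive() == 'ping'
            # exiting the stream block closes it for good: per the
            # one-shot-use rule a new `open_context()` is needed to
            # stream again.
            await portal.cancel_actor()

    trio.run(main)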
@@ -22,7 +22,10 @@ from contextlib import asynccontextmanager as acm
 from functools import partial
 import inspect
 from pprint import pformat
-from typing import TYPE_CHECKING
+from typing import (
+    Optional,
+    TYPE_CHECKING,
+)
 import typing
 import warnings

@@ -80,33 +83,30 @@ class ActorNursery:
     '''
     def __init__(
         self,
-        # TODO: maybe def these as fields of a struct looking type?
         actor: Actor,
         ria_nursery: trio.Nursery,
         da_nursery: trio.Nursery,
         errors: dict[tuple[str, str], BaseException],

     ) -> None:
         # self.supervisor = supervisor  # TODO
         self._actor: Actor = actor
-        # TODO: rename to `._tn` for our conventional "task-nursery"
+        self._ria_nursery = ria_nursery
         self._da_nursery = da_nursery

         self._children: dict[
             tuple[str, str],
             tuple[
                 Actor,
                 trio.Process | mp.Process,
-                Portal | None,
+                Optional[Portal],
             ]
         ] = {}
+        # portals spawned with ``run_in_actor()`` are
+        # cancelled when their "main" result arrives
+        self._cancel_after_result_on_exit: set = set()
         self.cancelled: bool = False
         self._join_procs = trio.Event()
         self._at_least_one_child_in_debug: bool = False
         self.errors = errors
-        self._scope_error: BaseException|None = None
         self.exited = trio.Event()

         # NOTE: when no explicit call is made to

@@ -117,48 +117,28 @@ class ActorNursery:
         # and syncing purposes to any actor opened nurseries.
         self._implicit_runtime_started: bool = False

-        # TODO: remove the `.run_in_actor()` API and thus this 2ndary
-        # nursery when that API get's moved outside this primitive!
-        self._ria_nursery = ria_nursery
-        # portals spawned with ``run_in_actor()`` are
-        # cancelled when their "main" result arrives
-        self._cancel_after_result_on_exit: set = set()
-
     async def start_actor(
         self,
         name: str,

         *,
-        bind_addrs: list[tuple[str, int]] = [_default_bind_addr],
-        rpc_module_paths: list[str]|None = None,
-        enable_modules: list[str]|None = None,
-        loglevel: str|None = None,  # set log level per subactor
-        debug_mode: bool|None = None,
+        bind_addr: tuple[str, int] = _default_bind_addr,
+        rpc_module_paths: list[str] | None = None,
+        enable_modules: list[str] | None = None,
+        loglevel: str | None = None,  # set log level per subactor
+        nursery: trio.Nursery | None = None,
+        debug_mode: Optional[bool] | None = None,
         infect_asyncio: bool = False,

-        # TODO: ideally we can rm this once we no longer have
-        # a `._ria_nursery` since the dependent APIs have been
-        # removed!
-        nursery: trio.Nursery|None = None,
-
     ) -> Portal:
         '''
         Start a (daemon) actor: a process that has no designated
         "main task" besides the runtime.

         '''
-        __runtimeframe__: int = 1  # noqa
-        loglevel: str = (
-            loglevel
-            or self._actor.loglevel
-            or get_loglevel()
-        )
+        loglevel = loglevel or self._actor.loglevel or get_loglevel()

         # configure and pass runtime state
         _rtv = _state._runtime_vars.copy()
         _rtv['_is_root'] = False
-        _rtv['_is_infected_aio'] = infect_asyncio

         # allow setting debug policy per actor
         if debug_mode is not None:

@@ -181,9 +161,7 @@ class ActorNursery:
             # modules allowed to invoked funcs from
             enable_modules=enable_modules,
             loglevel=loglevel,
-            # verbatim relay this actor's registrar addresses
-            registry_addrs=current_actor().reg_addrs,
+            arbiter_addr=current_actor()._arb_addr,
         )
         parent_addr = self._actor.accept_addr
         assert parent_addr

@@ -200,29 +178,21 @@ class ActorNursery:
                 self,
                 subactor,
                 self.errors,
-                bind_addrs,
+                bind_addr,
                 parent_addr,
                 _rtv,  # run time vars
                 infect_asyncio=infect_asyncio,
             )
         )

-    # TODO: DEPRECATE THIS:
-    # -[ ] impl instead as a hilevel wrapper on
-    #   top of a `@context` style invocation.
-    #  |_ dynamic @context decoration on child side
-    #  |_ implicit `Portal.open_context() as (ctx, first):`
-    #    and `return first` on parent side.
-    #  |_ mention how it's similar to `trio-parallel` API?
-    # -[ ] use @api_frame on the wrapper
     async def run_in_actor(
         self,

         fn: typing.Callable,
         *,

-        name: str | None = None,
-        bind_addrs: tuple[str, int] = [_default_bind_addr],
+        name: Optional[str] = None,
+        bind_addr: tuple[str, int] = _default_bind_addr,
         rpc_module_paths: list[str] | None = None,
         enable_modules: list[str] | None = None,
         loglevel: str | None = None,  # set log level per subactor

@@ -240,19 +210,18 @@ class ActorNursery:
         the actor is terminated.

         '''
-        __runtimeframe__: int = 1  # noqa
         mod_path: str = fn.__module__

         if name is None:
             # use the explicit function name if not provided
             name = fn.__name__

-        portal: Portal = await self.start_actor(
+        portal = await self.start_actor(
             name,
             enable_modules=[mod_path] + (
                 enable_modules or rpc_module_paths or []
             ),
-            bind_addrs=bind_addrs,
+            bind_addr=bind_addr,
             loglevel=loglevel,
             # use the run_in_actor nursery
             nursery=self._ria_nursery,

@@ -276,24 +245,19 @@ class ActorNursery:
         )
         return portal

-    # @api_frame
     async def cancel(
         self,
         hard_kill: bool = False,

     ) -> None:
         '''
-        Cancel this actor-nursery by instructing each subactor's
-        runtime to cancel and wait for all underlying sub-processes
-        to terminate.
+        Cancel this nursery by instructing each subactor to cancel
+        itself and wait for all subactors to terminate.

-        If `hard_kill` is set then kill the processes directly using
-        the spawning-backend's API/OS-machinery without any attempt
-        at (graceful) `trio`-style cancellation using our
-        `Actor.cancel()`.
+        If ``hard_kill`` is set to ``True`` then kill the processes
+        directly without any far end graceful ``trio`` cancellation.

         '''
-        __runtimeframe__: int = 1  # noqa
         self.cancelled = True

         # TODO: impl a repr for spawn more compact
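
A hedged usage sketch of the `run_in_actor()` API changed above: it spawns a subactor, runs a single function as its "main task", and cancels the actor once that one result arrives (`cube` is a made-up example function and the classic `Portal.result()` accessor is assumed):

    import tractor
    import trio

    async def cube(x: int) -> int:
        return x ** 3

    async def main() -> None:
        async with tractor.open_nursery() as an:
            # the subactor is auto-cancelled after its "main" result,
            # per the `_cancel_after_result_on_exit` bookkeeping above.
            portal = await an.run_in_actor(cube, x=3)
            assert await portal.result() == 27

    trio.run(main)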
@@ -374,15 +338,11 @@
 @acm
 async def _open_and_supervise_one_cancels_all_nursery(
     actor: Actor,
-    tb_hide: bool = False,

 ) -> typing.AsyncGenerator[ActorNursery, None]:

-    # normally don't need to show user by default
-    __tracebackhide__: bool = tb_hide
-
-    outer_err: BaseException|None = None
-    inner_err: BaseException|None = None
+    # TODO: yay or nay?
+    __tracebackhide__ = True

     # the collection of errors retrieved from spawned sub-actors
     errors: dict[tuple[str, str], BaseException] = {}

@@ -392,26 +352,20 @@ async def _open_and_supervise_one_cancels_all_nursery(
     # handling errors that are generated by the inner nursery in
     # a supervisor strategy **before** blocking indefinitely to wait for
     # actors spawned in "daemon mode" (aka started using
-    # `ActorNursery.start_actor()`).
+    # ``ActorNursery.start_actor()``).

     # errors from this daemon actor nursery bubble up to caller
-    async with trio.open_nursery(
-        strict_exception_groups=False,
-        # ^XXX^ TODO? instead unpack any RAE as per "loose" style?
-    ) as da_nursery:
+    async with trio.open_nursery() as da_nursery:
         try:
             # This is the inner level "run in actor" nursery. It is
             # awaited first since actors spawned in this way (using
-            # `ActorNursery.run_in_actor()`) are expected to only
+            # ``ActorNursery.run_in_actor()``) are expected to only
             # return a single result and then complete (i.e. be cancelled
             # gracefully). Errors collected from these actors are
             # immediately raised for handling by a supervisor strategy.
             # As such if the strategy propagates any error(s) upwards
             # the above "daemon actor" nursery will be notified.
-            async with trio.open_nursery(
-                strict_exception_groups=False,
-                # ^XXX^ TODO? instead unpack any RAE as per "loose" style?
-            ) as ria_nursery:
+            async with trio.open_nursery() as ria_nursery:

                 an = ActorNursery(
                     actor,

@@ -433,8 +387,7 @@ async def _open_and_supervise_one_cancels_all_nursery(
                 )
                 an._join_procs.set()

-            except BaseException as _inner_err:
-                inner_err = _inner_err
+            except BaseException as inner_err:
                 errors[actor.uid] = inner_err

                 # If we error in the root but the debugger is

@@ -478,8 +431,8 @@ async def _open_and_supervise_one_cancels_all_nursery(
                     ContextCancelled,
                 }:
                     log.cancel(
-                        'Actor-nursery caught remote cancellation\n'
-                        '\n'
+                        'Actor-nursery caught remote cancellation\n\n'
                         f'{inner_err.tb_str}'
                     )
                 else:

@@ -512,10 +465,8 @@ async def _open_and_supervise_one_cancels_all_nursery(
             Exception,
             BaseExceptionGroup,
             trio.Cancelled
-        ) as _outer_err:
-            outer_err = _outer_err
-
-            an._scope_error = outer_err or inner_err
+        ) as err:

             # XXX: yet another guard before allowing the cancel
             # sequence in case a (single) child is in debug.

@@ -530,7 +481,7 @@ async def _open_and_supervise_one_cancels_all_nursery(
             if an._children:
                 log.cancel(
                     'Actor-nursery cancelling due error type:\n'
-                    f'{outer_err}\n'
+                    f'{err}\n'
                 )
                 with trio.CancelScope(shield=True):
                     await an.cancel()

@@ -557,23 +508,13 @@ async def _open_and_supervise_one_cancels_all_nursery(
             else:
                 raise list(errors.values())[0]

-        # show frame on any (likely) internal error
-        if (
-            not an.cancelled
-            and an._scope_error
-        ):
-            __tracebackhide__: bool = False

     # da_nursery scope end - nursery checkpoint
     # final exit


 @acm
-# @api_frame
 async def open_nursery(
-    hide_tb: bool = True,
     **kwargs,
-    # ^TODO, paramspec for `open_root_actor()`

 ) -> typing.AsyncGenerator[ActorNursery, None]:
     '''

@@ -591,7 +532,6 @@ async def open_nursery(
     which cancellation scopes correspond to each spawned subactor set.

     '''
-    __tracebackhide__: bool = hide_tb
     implicit_runtime: bool = False
     actor: Actor = current_actor(err_on_no_runtime=False)
     an: ActorNursery|None = None

@@ -607,10 +547,7 @@ async def open_nursery(
             # mark us for teardown on exit
             implicit_runtime: bool = True

-            async with open_root_actor(
-                hide_tb=hide_tb,
-                **kwargs,
-            ) as actor:
+            async with open_root_actor(**kwargs) as actor:
                 assert actor is current_actor()

                 try:

@@ -645,27 +582,13 @@ async def open_nursery(
                     an.exited.set()

     finally:
-        # show frame on any internal runtime-scope error
-        if (
-            an
-            and
-            not an.cancelled
-            and
-            an._scope_error
-        ):
-            __tracebackhide__: bool = False

         msg: str = (
             'Actor-nursery exited\n'
-            f'|_{an}\n'
+            f'|_{an}\n\n'
         )

+        # shutdown runtime if it was started
         if implicit_runtime:
-            # shutdown runtime if it was started and report noisly
-            # that we're did so.
             msg += '=> Shutting down actor runtime <=\n'
-            log.info(msg)
-
-        else:
-            # keep noise low during std operation.
-            log.runtime(msg)
+
+        log.info(msg)
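
A hedged sketch of the implicit-runtime behaviour handled in `open_nursery()` above: with no prior `open_root_actor()` call the nursery bootstraps the root actor itself and tears it down on exit (the actor name here is arbitrary):

    import tractor
    import trio

    async def main() -> None:
        # no explicit `tractor.open_root_actor()` entered: the nursery
        # starts the runtime implicitly and shuts it down when the
        # block exits, per the `implicit_runtime` branch above.
        async with tractor.open_nursery() as an:
            portal = await an.start_actor(
                'worker',
                enable_modules=[__name__],
            )
            await portal.cancel_actor()

    trio.run(main)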
@@ -19,22 +19,13 @@ Various helpers/utils for auditing your `tractor` app and/or the
 core runtime.

 '''
-from contextlib import (
-    asynccontextmanager as acm,
-)
-import os
+from contextlib import asynccontextmanager as acm
 import pathlib

 import tractor
-from tractor.devx._debug import (
-    BoxedMaybeException,
-)
 from .pytest import (
     tractor_test as tractor_test
 )
-from .fault_simulation import (
-    break_ipc as break_ipc,
-)


 def repodir() -> pathlib.Path:

@@ -60,35 +51,6 @@ def examples_dir() -> pathlib.Path:
     return repodir() / 'examples'


-def mk_cmd(
-    ex_name: str,
-    exs_subpath: str = 'debugging',
-) -> str:
-    '''
-    Generate a shell command suitable to pass to `pexpect.spawn()`
-    which runs the script as a python program's entrypoint.
-
-    In particular ensure we disable the new tb coloring via unsetting
-    `$PYTHON_COLORS` so that `pexpect` can pattern match without
-    color-escape-codes.
-
-    '''
-    script_path: pathlib.Path = (
-        examples_dir()
-        / exs_subpath
-        / f'{ex_name}.py'
-    )
-    py_cmd: str = ' '.join([
-        'python',
-        str(script_path)
-    ])
-    # XXX, required for py 3.13+
-    # https://docs.python.org/3/using/cmdline.html#using-on-controlling-color
-    # https://docs.python.org/3/using/cmdline.html#envvar-PYTHON_COLORS
-    os.environ['PYTHON_COLORS'] = '0'
-    return py_cmd
-
-
 @acm
 async def expect_ctxc(
     yay: bool,

@@ -101,13 +63,12 @@ async def expect_ctxc(
     '''
     if yay:
         try:
-            yield (maybe_exc := BoxedMaybeException())
+            yield
             raise RuntimeError('Never raised ctxc?')
-        except tractor.ContextCancelled as ctxc:
-            maybe_exc.value = ctxc
+        except tractor.ContextCancelled:
             if reraise:
                 raise
             else:
                 return
     else:
-        yield (maybe_exc := BoxedMaybeException())
+        yield
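
A hedged usage sketch of the `expect_ctxc()` helper modified above, inside a hypothetical async test body (`trigger_cancel` is a stand-in for whatever operation is expected to raise `ContextCancelled`):

    from tractor._testing import expect_ctxc

    async def test_cancel_path(trigger_cancel) -> None:
        # `yay=True` asserts the block DOES raise a ctxc (and fails
        # with the RuntimeError above if it never does); `yay=False`
        # passes the block through untouched.
        async with expect_ctxc(yay=True, reraise=False):
            await trigger_cancel()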
@@ -1,92 +0,0 @@
-# tractor: structured concurrent "actors".
-# Copyright 2018-eternity Tyler Goodlet.
-
-# This program is free software: you can redistribute it and/or modify
-# it under the terms of the GNU Affero General Public License as published by
-# the Free Software Foundation, either version 3 of the License, or
-# (at your option) any later version.
-
-# This program is distributed in the hope that it will be useful,
-# but WITHOUT ANY WARRANTY; without even the implied warranty of
-# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
-# GNU Affero General Public License for more details.
-
-# You should have received a copy of the GNU Affero General Public License
-# along with this program.  If not, see <https://www.gnu.org/licenses/>.
-
-'''
-`pytest` utils helpers and plugins for testing `tractor`'s runtime
-and applications.
-
-'''
-
-from tractor import (
-    MsgStream,
-)
-
-async def break_ipc(
-    stream: MsgStream,
-    method: str|None = None,
-    pre_close: bool = False,
-
-    def_method: str = 'socket_close',
-
-) -> None:
-    '''
-    XXX: close the channel right after an error is raised
-    purposely breaking the IPC transport to make sure the parent
-    doesn't get stuck in debug or hang on the connection join.
-    this more or less simulates an infinite msg-receive hang on
-    the other end.
-
-    '''
-    # close channel via IPC prot msging before
-    # any transport breakage
-    if pre_close:
-        await stream.aclose()
-
-    method: str = method or def_method
-    print(
-        '#################################\n'
-        'Simulating CHILD-side IPC BREAK!\n'
-        f'method: {method}\n'
-        f'pre `.aclose()`: {pre_close}\n'
-        '#################################\n'
-    )
-
-    match method:
-        case 'socket_close':
-            await stream._ctx.chan.transport.stream.aclose()
-
-        case 'socket_eof':
-            # NOTE: `trio` does the following underneath this
-            # call in `src/trio/_highlevel_socket.py`:
-            # `Stream.socket.shutdown(tsocket.SHUT_WR)`
-            await stream._ctx.chan.transport.stream.send_eof()
-
-        # TODO: remove since now this will be invalid with our
-        # new typed msg spec?
-        # case 'msg':
-        #     await stream._ctx.chan.send(None)
-
-        # TODO: the actual real-world simulated cases like
-        # transport layer hangs and/or lower layer 2-gens type
-        # scenarios..
-        #
-        # -[ ] already have some issues for this general testing
-        #   area:
-        #  - https://github.com/goodboy/tractor/issues/97
-        #  - https://github.com/goodboy/tractor/issues/124
-        #  - PR from @guille:
-        #    https://github.com/goodboy/tractor/pull/149
-        # case 'hang':
-        # TODO: framework research:
-        #
-        # - https://github.com/GuoTengda1993/pynetem
-        # - https://github.com/shopify/toxiproxy
-        # - https://manpages.ubuntu.com/manpages/trusty/man1/wirefilter.1.html
-
-        case _:
-            raise RuntimeError(
-                f'IPC break method unsupported: {method}'
-            )
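
A hedged sketch of how the `break_ipc()` fault-injection helper deleted above would be driven from a child-side stream handler to simulate a transport failure mid-dialog (the `child_task` wrapper and its `stream` argument are illustrative assumptions):

    from tractor._testing import break_ipc

    async def child_task(stream) -> None:
        async for msg in stream:
            # hard-close the underlying socket after the first msg so
            # the parent side sees a broken transport rather than a
            # clean EoC.
            await break_ipc(stream, method='socket_close')
            break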
@@ -26,53 +26,12 @@ from ._debug import (
     breakpoint as breakpoint,
     pause as pause,
     pause_from_sync as pause_from_sync,
-    sigint_shield as sigint_shield,
+    shield_sigint_handler as shield_sigint_handler,
+    MultiActorPdb as MultiActorPdb,
     open_crash_handler as open_crash_handler,
     maybe_open_crash_handler as maybe_open_crash_handler,
-    maybe_init_greenback as maybe_init_greenback,
     post_mortem as post_mortem,
-    mk_pdb as mk_pdb,
 )
 from ._stackscope import (
     enable_stack_on_sig as enable_stack_on_sig,
 )
-from .pformat import (
-    add_div as add_div,
-    pformat_caller_frame as pformat_caller_frame,
-    pformat_boxed_tb as pformat_boxed_tb,
-)
-
-
-# TODO, move this to a new `.devx._pdbp` mod?
-def _enable_readline_feats() -> str:
-    '''
-    Handle `readline` when compiled with `libedit` to avoid breaking
-    tab completion in `pdbp` (and its dep `tabcompleter`)
-    particularly since `uv` cpython dists are compiled this way..
-
-    See docs for deats,
-    https://docs.python.org/3/library/readline.html#module-readline
-
-    Originally discovered soln via SO answer,
-    https://stackoverflow.com/q/49287102
-
-    '''
-    import readline
-    if (
-        # 3.13+ attr
-        # https://docs.python.org/3/library/readline.html#readline.backend
-        (getattr(readline, 'backend', False) == 'libedit')
-        or
-        'libedit' in readline.__doc__
-    ):
-        readline.parse_and_bind("python:bind -v")
-        readline.parse_and_bind("python:bind ^I rl_complete")
-        return 'libedit'
-    else:
-        readline.parse_and_bind("tab: complete")
-        readline.parse_and_bind("set editing-mode vi")
-        readline.parse_and_bind("set keymap vi")
-        return 'readline'
-
-
-_enable_readline_feats()
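
A hedged usage sketch of the `open_crash_handler()` helper re-exported above (present on both sides of this diff): wrap an entrypoint so an uncaught error drops into a post-mortem REPL instead of just printing a traceback.

    from tractor.devx import open_crash_handler

    def main() -> None:
        # any uncaught exception inside the block triggers an
        # interactive pdb post-mortem session.
        with open_crash_handler():
            raise RuntimeError('this lands you in the debugger')

    main()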
(File diff suppressed because it is too large.)
@@ -1,303 +0,0 @@
# tractor: structured concurrent "actors".
# Copyright 2018-eternity Tyler Goodlet.

# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.

# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU Affero General Public License for more details.

# You should have received a copy of the GNU Affero General Public License
# along with this program.  If not, see <https://www.gnu.org/licenses/>.

'''
Tools for code-object annotation, introspection and mutation
as it pertains to improving the grok-ability of our runtime!

'''
from __future__ import annotations
from functools import partial
import inspect
from types import (
    FrameType,
    FunctionType,
    MethodType,
    # CodeType,
)
from typing import (
    Any,
    Callable,
    Type,
)

from tractor.msg import (
    pretty_struct,
    NamespacePath,
)
import wrapt
# TODO: yeah, i don't love this and we should prolly just
# write a decorator that actually keeps a stupid ref to the func
# obj..
def get_class_from_frame(fr: FrameType) -> (
    FunctionType
    |MethodType
):
    '''
    Attempt to get the function (or method) reference
    from a given `FrameType`.

    Verbatim from an SO:
    https://stackoverflow.com/a/2220759

    '''
    args, _, _, value_dict = inspect.getargvalues(fr)

    # check whether the frame function's first parameter
    # is named 'self'
    if (
        len(args)
        and
        # TODO: other cases for `@classmethod` etc..?)
        args[0] == 'self'
    ):
        # in that case, 'self' will be referenced in value_dict
        instance: object = value_dict.get('self')
        if instance:
            # return its class
            return getattr(
                instance,
                '__class__',
                None,
            )

    # return None otherwise
    return None


def get_ns_and_func_from_frame(
    frame: FrameType,
) -> Callable:
    '''
    Return the corresponding function object reference from
    a `FrameType`, and return it and its parent namespace `dict`.

    '''
    ns: dict[str, Any]

    # for a method, go up a frame and lookup the name in locals()
    if '.' in (qualname := frame.f_code.co_qualname):
        cls_name, _, func_name = qualname.partition('.')
        ns = frame.f_back.f_locals[cls_name].__dict__

    else:
        func_name: str = frame.f_code.co_name
        ns = frame.f_globals

    return (
        ns,
        ns[func_name],
    )


def func_ref_from_frame(
    frame: FrameType,
) -> Callable:
    func_name: str = frame.f_code.co_name
    try:
        return frame.f_globals[func_name]
    except KeyError:
        cls: Type|None = get_class_from_frame(frame)
        if cls:
            return getattr(
                cls,
                func_name,
            )
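A minimal usage sketch of the helpers above, showing how a method's defining class can be recovered from its live call frame via `inspect.currentframe()` (the `Pair` type here is purely illustrative):

    import inspect

    class Pair:
        def where_am_i(self):
            # grab this method's own frame and ask the helper above
            # which class it was invoked on.
            frame = inspect.currentframe()
            return get_class_from_frame(frame)

    assert Pair().where_am_i() is Pair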
class CallerInfo(pretty_struct.Struct):
    # https://docs.python.org/dev/reference/datamodel.html#frame-objects
    # https://docs.python.org/dev/library/inspect.html#the-interpreter-stack
    _api_frame: FrameType

    @property
    def api_frame(self) -> FrameType:
        try:
            self._api_frame.clear()
        except RuntimeError:
            # log.warning(
            print(
                f'Frame {self._api_frame} for {self.api_func} is still active!'
            )

        return self._api_frame

    _api_func: Callable

    @property
    def api_func(self) -> Callable:
        return self._api_func

    _caller_frames_up: int|None = 1
    _caller_frame: FrameType|None = None  # cached after first stack scan

    @property
    def api_nsp(self) -> NamespacePath|None:
        func: FunctionType = self.api_func
        if func:
            return NamespacePath.from_ref(func)

        return '<unknown>'

    @property
    def caller_frame(self) -> FrameType:

        # if not already cached, scan up stack explicitly by
        # configured count.
        if not self._caller_frame:
            if self._caller_frames_up:
                for _ in range(self._caller_frames_up):
                    caller_frame: FrameType|None = self.api_frame.f_back

                if not caller_frame:
                    raise ValueError(
                        f'No frame exists {self._caller_frames_up} up from\n'
                        f'{self.api_frame} @ {self.api_nsp}\n'
                    )

            self._caller_frame = caller_frame

        return self._caller_frame

    @property
    def caller_nsp(self) -> NamespacePath|None:
        func: FunctionType = self.api_func
        if func:
            return NamespacePath.from_ref(func)

        return '<unknown>'


def find_caller_info(
    dunder_var: str = '__runtimeframe__',
    iframes: int = 1,
    check_frame_depth: bool = True,

) -> CallerInfo|None:
    '''
    Scan up the callstack for a frame with a `dunder_var: str` variable
    and return the `iframes` frames above it.

    By default we scan for a `__runtimeframe__` scope var which
    denotes a `tractor` API above which (one frame up) is "user
    app code" which "called into" the `tractor` method or func.

    TODO: ex with `Portal.open_context()`

    '''
    # TODO: use this instead?
    # https://docs.python.org/3/library/inspect.html#inspect.getouterframes
    frames: list[inspect.FrameInfo] = inspect.stack()
    for fi in frames:
        assert (
            fi.function
            ==
            fi.frame.f_code.co_name
        )
        this_frame: FrameType = fi.frame
        dunder_val: int|None = this_frame.f_locals.get(dunder_var)
        if dunder_val:
            go_up_iframes: int = (
                dunder_val  # could be 0 or `True` i guess?
                or
                iframes
            )
            rt_frame: FrameType = fi.frame
            call_frame = rt_frame
            for i in range(go_up_iframes):
                call_frame = call_frame.f_back

            return CallerInfo(
                _api_frame=rt_frame,
                _api_func=func_ref_from_frame(rt_frame),
                _caller_frames_up=go_up_iframes,
            )

    return None
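To make the `__runtimeframe__` convention concrete, here is a hypothetical runtime-API func (not from the real codebase) which marks its own frame so that `find_caller_info()` above reports the user frame that called into it:

    def fake_runtime_api() -> CallerInfo|None:
        # mark this frame as "runtime internal"; the value is how
        # many frames up the "user code" frame lives.
        __runtimeframe__: int = 1
        return find_caller_info()

    def user_code():
        return fake_runtime_api()

    info = user_code()
    # `info.api_func` should resolve back to `fake_runtime_api`.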
_frame2callerinfo_cache: dict[FrameType, CallerInfo] = {}


# TODO: -[x] move all this into new `.devx._frame_stack`!
#   -[ ] consider rename to _callstack?
#   -[ ] prolly create a `@runtime_api` dec?
#     |_ @api_frame seems better?
#   -[ ] ^- make it capture and/or accept buncha optional
#       meta-data like a fancier version of `@pdbp.hideframe`.
#
def api_frame(
    wrapped: Callable|None = None,
    *,
    caller_frames_up: int = 1,

) -> Callable:

    # handle the decorator called WITHOUT () case,
    # i.e. just @api_frame, NOT @api_frame(extra=<blah>)
    if wrapped is None:
        return partial(
            api_frame,
            caller_frames_up=caller_frames_up,
        )

    @wrapt.decorator
    async def wrapper(
        wrapped: Callable,
        instance: object,
        args: tuple,
        kwargs: dict,
    ):
        # maybe cache the API frame for this call
        global _frame2callerinfo_cache
        this_frame: FrameType = inspect.currentframe()
        api_frame: FrameType = this_frame.f_back

        if not _frame2callerinfo_cache.get(api_frame):
            _frame2callerinfo_cache[api_frame] = CallerInfo(
                _api_frame=api_frame,
                _api_func=wrapped,
                _caller_frames_up=caller_frames_up,
            )

        return wrapped(*args, **kwargs)

    # annotate the function as an "api function", meaning it is
    # a function for which the function above it in the call stack should be
    # non-`tractor` code aka "user code".
    #
    # in the global frame cache for easy lookup from a given
    # func-instance
    wrapped._call_infos: dict[FrameType, CallerInfo] = _frame2callerinfo_cache
    wrapped.__api_func__: bool = True
    return wrapper(wrapped)


# TODO: something like this instead of the adhoc frame-unhiding
# blocks all over the runtime!! XD
# -[ ] ideally we can expect a certain error (set) and if something
#     else is raised then all frames below the wrapped one will be
#     un-hidden via `__tracebackhide__: bool = False`.
#    |_ might need to dynamically mutate the code objs like
#       `pdbp.hideframe()` does?
# -[ ] use this as a `@acm` decorator as introed in 3.10?
# @acm
# async def unhide_frame_when_not(
#     error_set: set[BaseException],
# ) -> TracebackType:
#     ...
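As background for the `__tracebackhide__` mention in the TODO above: `pytest` (and `pdbp`) already honor a truthy `__tracebackhide__` frame-local when rendering tracebacks, so the effect being proposed can be sketched as:

    def _internal_helper() -> None:
        # pytest skips this frame when rendering the failure tb
        __tracebackhide__: bool = True
        raise RuntimeError('kaboom')

    def user_facing_api() -> None:
        # the rendered traceback points here, not at the helper
        _internal_helper()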
@@ -23,109 +23,29 @@ into each ``trio.Nursery`` except it links the lifetimes of memory space
 disjoint, parallel executing tasks in separate actors.

 '''
-from __future__ import annotations
-# from functools import partial
-from threading import (
-    current_thread,
-    Thread,
-    RLock,
-)
-import multiprocessing as mp
 from signal import (
     signal,
-    getsignal,
     SIGUSR1,
-    SIGINT,
-)
-# import traceback
-from types import ModuleType
-from typing import (
-    Callable,
-    TYPE_CHECKING,
 )

 import trio
-from tractor import (
-    _state,
-    log as logmod,
-)
-from tractor.devx import _debug
-
-log = logmod.get_logger(__name__)
-
-
-if TYPE_CHECKING:
-    from tractor._spawn import ProcessType
-    from tractor import (
-        Actor,
-        ActorNursery,
-    )
-
-
 @trio.lowlevel.disable_ki_protection
 def dump_task_tree() -> None:
-    '''
-    Do a classic `stackscope.extract()` task-tree dump to console at
-    `.devx()` level.
-
-    '''
     import stackscope
+    from tractor.log import get_console_log

     tree_str: str = str(
         stackscope.extract(
             trio.lowlevel.current_root_task(),
             recurse_child_tasks=True
         )
     )
-    actor: Actor = _state.current_actor()
-    thr: Thread = current_thread()
-    current_sigint_handler: Callable = getsignal(SIGINT)
-    if (
-        current_sigint_handler
-        is not
-        _debug.DebugStatus._trio_handler
-    ):
-        sigint_handler_report: str = (
-            'The default `trio` SIGINT handler was replaced?!'
-        )
-    else:
-        sigint_handler_report: str = (
-            'The default `trio` SIGINT handler is in use?!'
-        )
-
-    # sclang symbology
-    # |_<object>
-    # |_(Task/Thread/Process/Actor
-    # |_{Supervisor/Scope
-    # |_[Storage/Memory/IPC-Stream/Data-Struct
-
-    log.devx(
-        f'Dumping `stackscope` tree for actor\n'
-        f'(>: {actor.uid!r}\n'
-        f' |_{mp.current_process()}\n'
-        f' |_{thr}\n'
-        f' |_{actor}\n'
-        f'\n'
-        f'{sigint_handler_report}\n'
-        f'signal.getsignal(SIGINT) -> {current_sigint_handler!r}\n'
-        # f'\n'
-        # start-of-trace-tree delimiter (mostly for testing)
-        # f'------ {actor.uid!r} ------\n'
-        f'\n'
-        f'------ start-of-{actor.uid!r} ------\n'
-        f'|\n'
-        f'{tree_str}'
-        # end-of-trace-tree delimiter (mostly for testing)
-        f'|\n'
-        f'|_____ end-of-{actor.uid!r} ______\n'
+    log = get_console_log('cancel')
+    log.pdb(
+        f'Dumping `stackscope` tree:\n\n'
+        f'{tree_str}\n'
     )
-    # TODO: can remove this right?
-    # -[ ] was original code from author
-    #
-    # print(
-    #     'DUMPING FROM PRINT\n'
-    #     +
-    #     content
-    # )
     # import logging
     # try:
     #     with open("/dev/tty", "w") as tty:
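A standalone sketch of the same `stackscope` call used by `dump_task_tree()` above; run under a `trio` scheduler it renders the task tree from the root task down (the toy `sleeper` task is just to give the tree some depth):

    import trio
    import stackscope

    async def sleeper() -> None:
        await trio.sleep_forever()

    async def main() -> None:
        async with trio.open_nursery() as n:
            n.start_soon(sleeper)
            await trio.sleep(0.1)
            # same call as in `dump_task_tree()` above
            tree = stackscope.extract(
                trio.lowlevel.current_root_task(),
                recurse_child_tasks=True,
            )
            print(str(tree))
            n.cancel_scope.cancel()

    trio.run(main)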
@@ -135,130 +55,30 @@ def dump_task_tree() -> None:
     #         "task_tree"
     #     ).exception("Error printing task tree")

-_handler_lock = RLock()
-_tree_dumped: bool = False
-
-
-def dump_tree_on_sig(
-    sig: int,
-    frame: object,
-
-    relay_to_subs: bool = True,
-
-) -> None:
-    global _tree_dumped, _handler_lock
-    with _handler_lock:
-        # if _tree_dumped:
-        #     log.warning(
-        #         'Already dumped for this actor...??'
-        #     )
-        #     return
-
-        _tree_dumped = True
-
-        # actor: Actor = _state.current_actor()
-        log.devx(
-            'Trying to dump `stackscope` tree..\n'
-        )
-        try:
-            dump_task_tree()
-            # await actor._service_n.start_soon(
-            #     partial(
-            #         trio.to_thread.run_sync,
-            #         dump_task_tree,
-            #     )
-            # )
-            # trio.lowlevel.current_trio_token().run_sync_soon(
-            #     dump_task_tree
-            # )
-
-        except RuntimeError:
-            log.exception(
-                'Failed to dump `stackscope` tree..\n'
-            )
-            # not in async context -- print a normal traceback
-            # traceback.print_stack()
-            raise
-
-        except BaseException:
-            log.exception(
-                'Failed to dump `stackscope` tree..\n'
-            )
-            raise
-
-        # log.devx(
-        #     'Supposedly we dumped just fine..?'
-        # )
-
-    if not relay_to_subs:
-        return
-
-    an: ActorNursery
-    for an in _state.current_actor()._actoruid2nursery.values():
-        subproc: ProcessType
-        subactor: Actor
-        for subactor, subproc, _ in an._children.values():
-            log.warning(
-                f'Relaying `SIGUSR1`[{sig}] to sub-actor\n'
-                f'{subactor}\n'
-                f' |_{subproc}\n'
-            )
-
-            # bc of course stdlib can't have a std API.. XD
-            match subproc:
-                case trio.Process():
-                    subproc.send_signal(sig)
-
-                case mp.Process():
-                    subproc._send_signal(sig)
+def signal_handler(sig: int, frame: object) -> None:
+    import traceback
+    try:
+        trio.lowlevel.current_trio_token(
+        ).run_sync_soon(dump_task_tree)
+    except RuntimeError:
+        # not in async context -- print a normal traceback
+        traceback.print_stack()


 def enable_stack_on_sig(
-    sig: int = SIGUSR1,
-) -> ModuleType:
+    sig: int = SIGUSR1
+) -> None:
     '''
     Enable `stackscope` tracing on reception of a signal; by
     default this is SIGUSR1.
-
-    HOT TIP: a task/ctx-tree dump can be triggered from a shell with
-    fancy cmds.
-
-    For ex. from `bash` using `pgrep` and cmd-substitution
-    (https://www.gnu.org/software/bash/manual/bash.html#Command-Substitution)
-    you could use:
-
-    >> kill -SIGUSR1 $(pgrep -f <part-of-cmd: str>)
-
-    OR without a sub-shell,
-
-    >> pkill --signal SIGUSR1 -f <part-of-cmd: str>
-
     '''
-    try:
-        import stackscope
-    except ImportError:
-        log.warning(
-            '`stackscope` not installed for use in debug mode!'
-        )
-        return None
-
-    handler: Callable|int = getsignal(sig)
-    if handler is dump_tree_on_sig:
-        log.devx(
-            'A `SIGUSR1` handler already exists?\n'
-            f'|_ {handler!r}\n'
-        )
-        return
-
     signal(
         sig,
-        dump_tree_on_sig,
+        signal_handler,
     )
-    log.devx(
-        'Enabling trace-trees on `SIGUSR1` '
-        'since `stackscope` is installed @ \n'
-        f'{stackscope!r}\n\n'
-        f'With `SIGUSR1` handler\n'
-        f'|_{dump_tree_on_sig}\n'
-    )
-    return stackscope
+    # NOTE: the above can be triggered from
+    # a (xonsh) shell using:
+    # kill -SIGUSR1 @$(pgrep -f '<cmd>')
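The `signal_handler()` added on this branch is an instance of the standard "relay a unix signal into the `trio` scheduler" pattern; a self-contained version of the same idea, mirroring the branch code above:

    import signal
    import traceback
    import trio

    def on_sigusr1(signum: int, frame: object) -> None:
        try:
            # schedule the callback inside the active `trio` run so
            # it can safely inspect scheduler/task state.
            trio.lowlevel.current_trio_token().run_sync_soon(
                lambda: print('dump requested!')
            )
        except RuntimeError:
            # no active `trio` run in this thread; just print a stack.
            traceback.print_stack()

    signal.signal(signal.SIGUSR1, on_sigusr1)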
@@ -1,169 +0,0 @@
# tractor: structured concurrent "actors".
# Copyright 2018-eternity Tyler Goodlet.

# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.

# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU Affero General Public License for more details.

# You should have received a copy of the GNU Affero General Public License
# along with this program.  If not, see <https://www.gnu.org/licenses/>.

'''
Pretty formatters for use throughout the code base.
Mostly handy for logging and exception message content.

'''
import textwrap
import traceback

from trio import CancelScope


def add_div(
    message: str,
    div_str: str = '------ - ------',

) -> str:
    '''
    Add a "divider string" to the input `message` with
    a little math to center it underneath.

    '''
    div_offset: int = (
        round(len(message)/2)+1
        -
        round(len(div_str)/2)+1
    )
    div_str: str = (
        '\n' + ' '*div_offset + f'{div_str}\n'
    )
    return div_str
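For a feel of the output, `add_div()` centers the divider under the message using the offset math above; a hypothetical call:

    msg: str = 'actor runtime crashed!'
    print(msg + add_div(message=msg))
    # roughly renders as:
    #
    # actor runtime crashed!
    #    ------ - ------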
def pformat_boxed_tb(
    tb_str: str,
    fields_str: str|None = None,
    field_prefix: str = ' |_',

    tb_box_indent: int|None = None,
    tb_body_indent: int = 1,
    boxer_header: str = '-'

) -> str:
    '''
    Create a "boxed" looking traceback string.

    Useful for emphasizing traceback text content as being an
    embedded attribute of some other object (like
    a `RemoteActorError` or other boxing remote error shuttle
    container).

    Any other parent/container "fields" can be passed in the
    `fields_str` input along with other prefix/indent settings.

    '''
    if (
        fields_str
        and
        field_prefix
    ):
        fields: str = textwrap.indent(
            fields_str,
            prefix=field_prefix,
        )
    else:
        fields = fields_str or ''

    tb_body = tb_str
    if tb_body_indent:
        tb_body: str = textwrap.indent(
            tb_str,
            prefix=tb_body_indent * ' ',
        )

    tb_box: str = (
        f'|\n'
        f' ------ {boxer_header} ------\n'
        f'{tb_body}'
        f' ------ {boxer_header}- ------\n'
        f'_|'
    )
    tb_box_indent: str = (
        tb_box_indent
        or
        1

        # (len(field_prefix))
        # ? ^-TODO-^ ? if you wanted another indent level
    )
    if tb_box_indent > 0:
        tb_box: str = textwrap.indent(
            tb_box,
            prefix=tb_box_indent * ' ',
        )

    return (
        fields
        +
        tb_box
    )


def pformat_caller_frame(
    stack_limit: int = 1,
    box_tb: bool = True,
) -> str:
    '''
    Capture and return the traceback text content from
    `stack_limit` call frames up.

    '''
    tb_str: str = (
        '\n'.join(
            traceback.format_stack(limit=stack_limit)
        )
    )
    if box_tb:
        tb_str: str = pformat_boxed_tb(
            tb_str=tb_str,
            field_prefix='  ',
        )
    return tb_str


def pformat_cs(
    cs: CancelScope,
    var_name: str = 'cs',
    field_prefix: str = ' |_',
) -> str:
    '''
    Pretty format info about a `trio.CancelScope` including most
    of its public state and `._cancel_status`.

    The output can be modified to show a "var name" for the
    instance as a field prefix, just a simple str before each
    line more or less.

    '''

    fields: str = textwrap.indent(
        (
            f'cancel_called = {cs.cancel_called}\n'
            f'cancelled_caught = {cs.cancelled_caught}\n'
            f'_cancel_status = {cs._cancel_status}\n'
            f'shield = {cs.shield}\n'
        ),
        prefix=field_prefix,
    )
    return (
        f'{var_name}: {cs}\n'
        +
        fields
    )
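A usage sketch of the boxing formatter above, feeding it a live traceback string (the error and `fields_str` content here are fabricated just for the demo):

    import traceback

    try:
        raise ValueError('nope')
    except ValueError:
        tb_str: str = traceback.format_exc()

    print(
        pformat_boxed_tb(
            tb_str=tb_str,
            fields_str='src_uid: ("root", "1234")\n',
            boxer_header='RemoteActorError',
        )
    )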
tractor/log.py (146 lines changed)
@@ -21,11 +21,6 @@ Log like a forester!
 from collections.abc import Mapping
 import sys
 import logging
-from logging import (
-    LoggerAdapter,
-    Logger,
-    StreamHandler,
-)
 import colorlog  # type: ignore

 import trio

@@ -53,20 +48,20 @@ LOG_FORMAT = (

 DATE_FORMAT = '%b %d %H:%M:%S'

-# FYI, ERROR is 40
-# TODO: use a `bidict` to avoid the :155 check?
-CUSTOM_LEVELS: dict[str, int] = {
+LEVELS: dict[str, int] = {
     'TRANSPORT': 5,
     'RUNTIME': 15,
-    'DEVX': 17,
-    'CANCEL': 22,
+    'CANCEL': 16,
     'PDB': 500,
 }
+# _custom_levels: set[str] = {
+#     lvlname.lower for lvlname in LEVELS.keys()
+# }

 STD_PALETTE = {
     'CRITICAL': 'red',
     'ERROR': 'red',
     'PDB': 'white',
-    'DEVX': 'cyan',
     'WARNING': 'yellow',
     'INFO': 'green',
     'CANCEL': 'yellow',
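The level tables above feed the stdlib's `logging.addLevelName()` registry; the general pattern for wiring one custom level end-to-end (using made-up names here, not `tractor`'s own) looks like:

    import logging

    TRACE: int = 5
    logging.addLevelName(TRACE, 'TRACE')
    logging.basicConfig(level=TRACE)

    class TraceAdapter(logging.LoggerAdapter):
        def trace(self, msg: str) -> None:
            # delegate through `.log()` so `isEnabledFor()` filtering
            # stays consistent with stdlib semantics.
            self.log(TRACE, msg)

    log = TraceAdapter(logging.getLogger('demo'), {})
    log.trace('lowest level msg')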
@@ -83,7 +78,7 @@ BOLD_PALETTE = {

 # TODO: this isn't showing the correct '{filename}'
 # as it did before..
-class StackLevelAdapter(LoggerAdapter):
+class StackLevelAdapter(logging.LoggerAdapter):

     def transport(
         self,

@@ -91,8 +86,7 @@ class StackLevelAdapter(LoggerAdapter):

     ) -> None:
         '''
-        IPC transport level msg IO; generally anything below
-        `._ipc.Channel` and friends.
+        IPC level msg-ing.

         '''
         return self.log(5, msg)

@@ -108,11 +102,11 @@ class StackLevelAdapter(LoggerAdapter):
         msg: str,
     ) -> None:
         '''
-        Cancellation sequencing, mostly for runtime reporting.
+        Cancellation logging, mostly for runtime reporting.

         '''
         return self.log(
-            level=22,
+            level=16,
             msg=msg,
             # stacklevel=4,
         )

@@ -122,21 +116,11 @@ class StackLevelAdapter(LoggerAdapter):
         msg: str,
     ) -> None:
         '''
-        `pdb`-REPL (debugger) related statuses.
+        Debugger logging.

         '''
         return self.log(500, msg)

-    def devx(
-        self,
-        msg: str,
-    ) -> None:
-        '''
-        "Developer experience" sub-sys statuses.
-
-        '''
-        return self.log(17, msg)
-
     def log(
         self,
         level,

@@ -148,13 +132,12 @@ class StackLevelAdapter(LoggerAdapter):
         Delegate a log call to the underlying logger, after adding
         contextual information from this adapter instance.

-        NOTE: all custom level methods (above) delegate to this!
-
         '''
         if self.isEnabledFor(level):
             stacklevel: int = 3
             if (
-                level in CUSTOM_LEVELS.values()
+                level in LEVELS.values()
+                # or level in _custom_levels
             ):
                 stacklevel: int = 4

@@ -201,30 +184,8 @@ class StackLevelAdapter(LoggerAdapter):
         )


-# TODO IDEAs:
-#  -[ ] move to `.devx.pformat`?
-#  -[ ] do per task-name and actor-name color coding
-#  -[ ] unique color per task-id and actor-uuid
-def pformat_task_uid(
-    id_part: str = 'tail'
-):
-    '''
-    Return a `str`-ified unique id for a `trio.Task` via a combo of its
-    `.name: str` and `id()` truncated output.
-
-    '''
-    task: trio.Task = trio.lowlevel.current_task()
-    tid: str = str(id(task))
-    if id_part == 'tail':
-        tid_part: str = tid[-6:]
-    else:
-        tid_part: str = tid[:6]
-
-    return f'{task.name}[{tid_part}]'
-
-
 _conc_name_getters = {
-    'task': pformat_task_uid,
+    'task': lambda: trio.lowlevel.current_task().name,
     'actor': lambda: current_actor(),
     'actor_name': lambda: current_actor().name,
     'actor_uid': lambda: current_actor().uid[1][:6],

@@ -232,10 +193,7 @@ _conc_name_getters = {

 class ActorContextInfo(Mapping):
-    '''
-    Dynamic lookup for local actor and task names.
-
-    '''
+    "Dynamic lookup for local actor and task names"
     _context_keys = (
         'task',
         'actor',

@@ -258,28 +216,19 @@ class ActorContextInfo(Mapping):

 def get_logger(
-    name: str|None = None,
+    name: str | None = None,
     _root_name: str = _proj_name,

-    logger: Logger|None = None,
-
-    # TODO, using `.config.dictConfig()` api?
-    # -[ ] SO answer with docs links
-    #  |_https://stackoverflow.com/questions/7507825/where-is-a-complete-example-of-logging-config-dictconfig
-    #  |_https://docs.python.org/3/library/logging.config.html#configuration-dictionary-schema
-    subsys_spec: str|None = None,
-
 ) -> StackLevelAdapter:
     '''Return the package log or a sub-logger for ``name`` if provided.

     '''
-    log: Logger
-    log = rlog = logger or logging.getLogger(_root_name)
+    log = rlog = logging.getLogger(_root_name)

     if (
         name
-        and
-        name != _proj_name
+        and name != _proj_name
     ):

         # NOTE: handling for modules that use ``get_logger(__name__)``

@@ -291,7 +240,7 @@ def get_logger(
         # since in python the {filename} is always this same
         # module-file.

-        sub_name: None|str = None
+        sub_name: None | str = None
         rname, _, sub_name = name.partition('.')
         pkgpath, _, modfilename = sub_name.rpartition('.')

@@ -314,13 +263,10 @@ def get_logger(

     # add our actor-task aware adapter which will dynamically look up
     # the actor and task names at each log emit
-    logger = StackLevelAdapter(
-        log,
-        ActorContextInfo(),
-    )
+    logger = StackLevelAdapter(log, ActorContextInfo())

     # additional levels
-    for name, val in CUSTOM_LEVELS.items():
+    for name, val in LEVELS.items():
         logging.addLevelName(val, name)

     # ensure custom levels exist as methods

@@ -330,25 +276,15 @@ def get_logger(

 def get_console_log(
-    level: str|None = None,
-    logger: Logger|None = None,
+    level: str | None = None,
     **kwargs,
-
-) -> LoggerAdapter:
-    '''
-    Get a `tractor`-style logging instance: a `Logger` wrapped in
-    a `StackLevelAdapter` which injects various concurrency-primitive
-    (process, thread, task) fields and enables a `StreamHandler` that
-    writes on stderr using `colorlog` formatting.
-
-    Yeah yeah, i know we can use `logging.config.dictConfig()`. You do it.
-
-    '''
-    log = get_logger(
-        logger=logger,
-        **kwargs
-    )  # set a root logger
-    logger: Logger = log.logger
+) -> logging.LoggerAdapter:
+    '''Get the package logger and enable a handler which writes to stderr.
+
+    Yeah yeah, i know we can use ``DictConfig``. You do it.
+    '''
+    log = get_logger(**kwargs)  # our root logger
+    logger = log.logger

     if not level:
         return log

@@ -367,13 +303,9 @@ def get_console_log(
             None,
         )
     ):
-        fmt = LOG_FORMAT
-        # if logger:
-        #     fmt = None
-
-        handler = StreamHandler()
+        handler = logging.StreamHandler()
         formatter = colorlog.ColoredFormatter(
-            fmt=fmt,
+            LOG_FORMAT,
             datefmt=DATE_FORMAT,
             log_colors=STD_PALETTE,
             secondary_log_colors=BOLD_PALETTE,

@@ -390,20 +322,4 @@ def get_loglevel() -> str:

 # global module logger for tractor itself
-log: StackLevelAdapter = get_logger('tractor')
-
-
-def at_least_level(
-    log: Logger|LoggerAdapter,
-    level: int|str,
-) -> bool:
-    '''
-    Predicate to test if a given level is active.
-
-    '''
-    if isinstance(level, str):
-        level: int = CUSTOM_LEVELS[level.upper()]
-
-    if log.getEffectiveLevel() <= level:
-        return True
-    return False
+log = get_logger('tractor')
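Putting the module to use, the typical pattern (visible elsewhere in this diff, e.g. the `get_console_log('cancel')` call inside `dump_task_tree()` above) is a sketch like:

    from tractor.log import get_console_log

    # enable stderr emission at the custom 'cancel' level and use
    # the adapter's per-level methods defined above.
    log = get_console_log('cancel')
    log.cancel('graceful cancel sequence starting..')
    log.pdb('entering debugger REPL..')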
@@ -18,57 +18,9 @@
 Built-in messaging patterns, types, APIs and helpers.

 '''
-from typing import (
-    TypeAlias,
-)
 from .ptr import (
     NamespacePath as NamespacePath,
 )
-from .pretty_struct import (
+from .types import (
     Struct as Struct,
 )
-from ._codec import (
-    _def_msgspec_codec as _def_msgspec_codec,
-    _ctxvar_MsgCodec as _ctxvar_MsgCodec,
-
-    apply_codec as apply_codec,
-    mk_codec as mk_codec,
-    mk_dec as mk_dec,
-    MsgCodec as MsgCodec,
-    MsgDec as MsgDec,
-    current_codec as current_codec,
-)
-# currently can't bc circular with `._context`
-# from ._ops import (
-#     PldRx as PldRx,
-#     _drain_to_final_msg as _drain_to_final_msg,
-# )
-
-from .types import (
-    PayloadMsg as PayloadMsg,
-
-    Aid as Aid,
-    SpawnSpec as SpawnSpec,
-
-    Start as Start,
-    StartAck as StartAck,
-
-    Started as Started,
-    Yield as Yield,
-    Stop as Stop,
-    Return as Return,
-    CancelAck as CancelAck,
-
-    Error as Error,
-
-    # type-var for `.pld` field
-    PayloadT as PayloadT,
-
-    # full msg class set from above as list
-    __msg_types__ as __msg_types__,
-
-    # type-alias for union of all msgs
-    MsgType as MsgType,
-)
-
-__msg_spec__: TypeAlias = MsgType
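Of the exports above, `NamespacePath` is the "serializable function pointer" used across the IPC layer; its `.from_ref()` constructor is the same call used by `CallerInfo.api_nsp` earlier in this diff, and assuming the `.load_ref()` inverse provided by `tractor.msg.ptr`, a round-trip sketch looks like:

    from tractor.msg import NamespacePath

    def my_target() -> None:
        ...

    # encode a func ref as a str-subtype path, then load it back.
    nsp = NamespacePath.from_ref(my_target)
    assert ':' in str(nsp)  # e.g. '__main__:my_target'
    assert nsp.load_ref() is my_target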
@@ -1,886 +0,0 @@
# tractor: structured concurrent "actors".
# Copyright 2018-eternity Tyler Goodlet.

# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.

# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU Affero General Public License for more details.

# You should have received a copy of the GNU Affero General Public License
# along with this program.  If not, see <https://www.gnu.org/licenses/>.

'''
IPC msg interchange codec management.

Supported backend libs:
- `msgspec.msgpack`

ToDo: backends we prolly should offer:

- see project/lib list throughout GH issue discussion comments:
  https://github.com/goodboy/tractor/issues/196

- `capnproto`: https://capnproto.org/rpc.html
   - https://capnproto.org/language.html#language-reference

'''
from __future__ import annotations
from contextlib import (
    contextmanager as cm,
)
from contextvars import (
    ContextVar,
    Token,
)
import textwrap
from typing import (
    Any,
    Callable,
    Protocol,
    Type,
    TYPE_CHECKING,
    TypeVar,
    Union,
)
from types import ModuleType

import msgspec
from msgspec import (
    msgpack,
    Raw,
)
# TODO: see notes below from @mikenerone..
# from tricycle import TreeVar

from tractor.msg.pretty_struct import Struct
from tractor.msg.types import (
    mk_msg_spec,
    MsgType,
    PayloadMsg,
)
from tractor.log import get_logger

if TYPE_CHECKING:
    from tractor._context import Context

log = get_logger(__name__)
# TODO: unify with `MsgCodec` by making `._dec` part this?
class MsgDec(Struct):
    '''
    An IPC msg (payload) decoder.

    Normally used to decode only a payload: `MsgType.pld:
    PayloadT` field before delivery to IPC consumer code.

    '''
    _dec: msgpack.Decoder
    # _ext_types_box: Struct|None = None

    @property
    def dec(self) -> msgpack.Decoder:
        return self._dec

    def __repr__(self) -> str:

        speclines: str = self.spec_str

        # in multi-typed spec case we stick the list
        # all on newlines after the |__pld_spec__:,
        # OW it's prolly single type spec-value
        # so just leave it on same line.
        if '\n' in speclines:
            speclines: str = '\n' + textwrap.indent(
                speclines,
                prefix=' '*3,
            )

        body: str = textwrap.indent(
            f'|_dec_hook: {self.dec.dec_hook}\n'
            f'|__pld_spec__: {speclines}\n',
            prefix=' '*2,
        )
        return (
            f'<{type(self).__name__}(\n'
            f'{body}'
            ')>'
        )

    # struct type unions
    # https://jcristharif.com/msgspec/structs.html#tagged-unions
    #
    # ^-TODO-^: make a wrapper type for this such that alt
    # backends can be represented easily without a `Union` needed,
    # AND so that we have better support for wire transport.
    #
    # -[ ] maybe `FieldSpec` is a good name since msg-spec
    #   better applies to a `MsgType[FieldSpec]`?
    #
    # -[ ] both as part of the `.open_context()` call AND as part of the
    #     immediate ack-response (see similar below)
    #     we should do spec matching and fail if anything is awry?
    #
    # -[ ] eventually spec should be generated/parsed from the
    #     type-annots as # desired in GH issue:
    #     https://github.com/goodboy/tractor/issues/365
    #
    # -[ ] semantics of the mismatch case
    #   - when caller-callee specs we should raise
    #     a `MsgTypeError` or `MsgSpecError` or similar?
    #
    # -[ ] wrapper types for both spec types such that we can easily
    #     IPC transport them?
    #   - `TypeSpec: Union[Type]`
    #    * also a `.__contains__()` for doing `None in
    #      TypeSpec[None|int]` since rn you need to do it on
    #      `.__args__` for unions..
    #   - `MsgSpec: Union[MsgType]
    #
    # -[ ] auto-genning this from new (in 3.12) type parameter lists Bo
    # |_ https://docs.python.org/3/reference/compound_stmts.html#type-params
    # |_ historical pep 695: https://peps.python.org/pep-0695/
    # |_ full lang spec: https://typing.readthedocs.io/en/latest/spec/
    # |_ on annotation scopes:
    #    https://docs.python.org/3/reference/executionmodel.html#annotation-scopes
    # |_ 3.13 will have subscriptable funcs Bo
    #    https://peps.python.org/pep-0718/
    @property
    def spec(self) -> Union[Type[Struct]]:
        # NOTE: defined and applied inside `mk_codec()`
        return self._dec.type

    # no difference, as compared to a `MsgCodec` which defines the
    # `MsgType.pld: PayloadT` part of its spec separately
    pld_spec = spec

    # TODO: would get moved into `FieldSpec.__str__()` right?
    @property
    def spec_str(self) -> str:
        return pformat_msgspec(
            codec=self,
            join_char='|',
        )

    pld_spec_str = spec_str

    def decode(
        self,
        raw: Raw|bytes,
    ) -> Any:
        return self._dec.decode(raw)

    @property
    def hook(self) -> Callable|None:
        return self._dec.dec_hook
def mk_dec(
    spec: Union[Type[Struct]]|Type|None,

    # NOTE, required for ad-hoc type extensions to the underlying
    # serialization proto (which is default `msgpack`),
    # https://jcristharif.com/msgspec/extending.html#mapping-to-from-native-types
    dec_hook: Callable|None = None,
    ext_types: list[Type]|None = None,

) -> MsgDec:
    '''
    Create an IPC msg decoder, a slightly higher level wrapper around
    a `msgspec.msgpack.Decoder` which provides,

    - easier introspection of the underlying type spec via
      the `.spec` and `.spec_str` attrs,
    - `.hook` access to the `Decoder.dec_hook()`,
    - automatic custom extension-types decode support when
      `dec_hook()` is provided such that any `PayloadMsg.pld` tagged
      as a type from `ext_types` (presuming the `MsgCodec.encode()` also used
      a `.enc_hook()`) is processed and constructed by a `PldRx` implicitly.

    NOTE, as mentioned a `MsgDec` is normally used for `PayloadMsg.pld: PayloadT` field
    decoding inside an IPC-ctx-oriented `PldRx`.

    '''
    if (
        spec is None
        and
        ext_types is None
    ):
        raise TypeError(
            f'Missing type-`spec` for msg decoder!\n'
            f'\n'
            f'`spec=None` is **only** permitted if custom extension types '
            f'are provided via `ext_types`, meaning it must be non-`None`.\n'
            f'\n'
            f'In this case it is presumed that only the `ext_types`, '
            f'which must be handled by a paired `dec_hook()`, '
            f'will be permitted within the payload type-`spec`!\n'
            f'\n'
            f'spec = {spec!r}\n'
            f'dec_hook = {dec_hook!r}\n'
            f'ext_types = {ext_types!r}\n'
        )

    if dec_hook:
        if ext_types is None:
            raise TypeError(
                f'If extending the serializable types with a custom decode hook (`dec_hook()`), '
                f'you must also provide the expected type set that the hook will handle '
                f'via a `ext_types: Union[Type]|None = None` argument!\n'
                f'\n'
                f'dec_hook = {dec_hook!r}\n'
                f'ext_types = {ext_types!r}\n'
            )

        # XXX, i *thought* we would require a boxing struct as per docs,
        # https://jcristharif.com/msgspec/extending.html#mapping-to-from-native-types
        # |_ see comment,
        #  > Note that typed deserialization is required for
        #  > successful roundtripping here, so we pass `MyMessage` to
        #  > `Decoder`.
        #
        # BUT, turns out as long as you spec a union with `Raw` it
        # will work? kk B)
        #
        # maybe_box_struct = mk_boxed_ext_struct(ext_types)
        spec = Raw | Union[*ext_types]

    return MsgDec(
        _dec=msgpack.Decoder(
            type=spec,  # like `MsgType[Any]`
            dec_hook=dec_hook,
        ),
    )


# TODO? remove since didn't end up needing this?
def mk_boxed_ext_struct(
    ext_types: list[Type],
) -> Struct:
    # NOTE, originally was to wrap non-msgpack-supported "extension
    # types" in a field-typed boxing struct, see notes around the
    # `dec_hook()` branch in `mk_dec()`.
    ext_types_union = Union[*ext_types]
    repr_ext_types_union: str = (
        str(ext_types_union)
        or
        "|".join(ext_types)
    )
    BoxedExtType = msgspec.defstruct(
        f'BoxedExts[{repr_ext_types_union}]',
        fields=[
            ('boxed', ext_types_union),
        ],
    )
    return BoxedExtType


def unpack_spec_types(
    spec: Union[Type]|Type,
) -> set[Type]:
    '''
    Given an input type-`spec`, either a lone type
    or a `Union` of types (like `str|int|MyThing`),
    return a set of individual types.

    When `spec` is not a type-union returns `{spec,}`.

    '''
    spec_subtypes: set[Union[Type]] = set(
        getattr(
            spec,
            '__args__',
            {spec,},
        )
    )
    return spec_subtypes
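A usage sketch of `mk_dec()` for the plain (non-extension) case described in its docstring; the `str|int` spec is arbitrary demo input:

    import msgspec

    # decode only `str|int` payloads; anything else raises
    # `msgspec.ValidationError` on `.decode()`.
    dec: MsgDec = mk_dec(spec=str|int)
    assert dec.decode(msgspec.msgpack.encode(42)) == 42
    print(dec.spec_str)  # pretty 'str|int' style repr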
def mk_msgspec_table(
    dec: msgpack.Decoder,
    msg: MsgType|None = None,

) -> dict[str, MsgType]|str:
    '''
    Fill out a `dict` of `MsgType`s keyed by name
    for a given input `msgspec.msgpack.Decoder`
    as defined by its `.type: Union[Type]` setting.

    If `msg` is provided, only deliver a `dict` with a single
    entry for that type.

    '''
    msgspec: Union[Type]|Type = dec.type

    if not (msgtypes := getattr(msgspec, '__args__', False)):
        msgtypes = [msgspec]

    msgt_table: dict[str, MsgType] = {
        msgt: str(msgt.__name__)
        for msgt in msgtypes
    }
    if msg:
        msgt: MsgType = type(msg)
        str_repr: str = msgt_table[msgt]
        return {msgt: str_repr}

    return msgt_table


def pformat_msgspec(
    codec: MsgCodec|MsgDec,
    msg: MsgType|None = None,
    join_char: str = '\n',

) -> str:
    '''
    Pretty `str` format the `msgspec.msgpack.Decoder.type` attribute
    for display in (console) log messages as a nice (maybe multiline)
    presentation of all supported `Struct`s (subtypes) available for
    typed decoding.

    '''
    dec: msgpack.Decoder = getattr(codec, 'dec', codec)
    return join_char.join(
        mk_msgspec_table(
            dec=dec,
            msg=msg,
        ).values()
    )
# TODO: overall IPC msg-spec features (i.e. in this mod)!
#
# -[ ] API changes towards being interchange lib agnostic!
#   -[ ] capnproto has pre-compiled schema for eg..
#    * https://capnproto.org/language.html
#    * http://capnproto.github.io/pycapnp/quickstart.html
#    * https://github.com/capnproto/pycapnp/blob/master/examples/addressbook.capnp
#
# -[ ] struct aware messaging coders as per:
#   -[x] https://github.com/goodboy/tractor/issues/36
#   -[ ] https://github.com/goodboy/tractor/issues/196
#   -[ ] https://github.com/goodboy/tractor/issues/365
#
class MsgCodec(Struct):
    '''
    A IPC msg interchange format lib's encoder + decoder pair.

    Pretty much nothing more than delegation to underlying
    `msgspec.<interchange-protocol>.Encoder/Decoder`s for now.

    '''
    _enc: msgpack.Encoder
    _dec: msgpack.Decoder
    _pld_spec: Type[Struct]|Raw|Any

    # _ext_types_box: Struct|None = None

    def __repr__(self) -> str:
        speclines: str = textwrap.indent(
            pformat_msgspec(codec=self),
            prefix=' '*3,
        )
        body: str = textwrap.indent(
            f'|_lib = {self.lib.__name__!r}\n'
            f'|_enc_hook: {self.enc.enc_hook}\n'
            f'|_dec_hook: {self.dec.dec_hook}\n'
            f'|_pld_spec: {self.pld_spec_str}\n'
            # f'|\n'
            f'|__msg_spec__:\n'
            f'{speclines}\n',
            prefix=' '*2,
        )
        return (
            f'<{type(self).__name__}(\n'
            f'{body}'
            ')>'
        )

    @property
    def pld_spec(self) -> Type[Struct]|Raw|Any:
        return self._pld_spec

    @property
    def pld_spec_str(self) -> str:

        # TODO: could also use match: instead?
        spec: Union[Type]|Type = self.pld_spec

        # `typing.Union` case
        if getattr(spec, '__args__', False):
            return str(spec)

        # just a single type
        else:
            return spec.__name__

    # struct type unions
    # https://jcristharif.com/msgspec/structs.html#tagged-unions
    @property
    def msg_spec(self) -> Union[Type[Struct]]:
        # NOTE: defined and applied inside `mk_codec()`
        return self._dec.type

    # TODO: some way to make `pretty_struct.Struct` use this
    # wrapped field over the `.msg_spec` one?
    @property
    def msg_spec_str(self) -> str:
        return pformat_msgspec(self.msg_spec)

    lib: ModuleType = msgspec

    # TODO: use `functools.cached_property` for these ?
    # https://docs.python.org/3/library/functools.html#functools.cached_property
    @property
    def enc(self) -> msgpack.Encoder:
        return self._enc

    # TODO: reusing encode buffer for perf?
    # https://jcristharif.com/msgspec/perf-tips.html#reusing-an-output-buffer
    _buf: bytearray = bytearray()

    def encode(
        self,
        py_obj: Any|PayloadMsg,

        use_buf: bool = False,
        # ^-XXX-^ uhh why am i getting this?
        # |_BufferError: Existing exports of data: object cannot be re-sized

        as_ext_type: bool = False,
        hide_tb: bool = True,

    ) -> bytes:
        '''
        Encode input python objects to `msgpack` bytes for
        transfer on a transport protocol connection.

        When `use_buf == True` use the output buffer optimization:
        https://jcristharif.com/msgspec/perf-tips.html#reusing-an-output-buffer

        '''
        __tracebackhide__: bool = hide_tb
        if use_buf:
            self._enc.encode_into(py_obj, self._buf)
            return self._buf

        return self._enc.encode(py_obj)
        # try:
        #     return self._enc.encode(py_obj)
        # except TypeError as typerr:
        #     typerr.add_note(
        #         '|_src error from `msgspec`'
        #         # f'|_{self._enc.encode!r}'
        #     )
        #     raise typerr

        # TODO! REMOVE once i'm confident we won't ever need it!
        #
        # box: Struct = self._ext_types_box
        # if (
        #     as_ext_type
        #     or
        #     (
        #         # XXX NOTE, auto-detect if the input type
        #         box
        #         and
        #         (ext_types := unpack_spec_types(
        #             spec=box.__annotations__['boxed'])
        #         )
        #     )
        # ):
        #     match py_obj:
        #         # case PayloadMsg(pld=pld) if (
        #         #     type(pld) in ext_types
        #         # ):
        #         #     py_obj.pld = box(boxed=py_obj)
        #         #     breakpoint()
        #         case _ if (
        #             type(py_obj) in ext_types
        #         ):
        #             py_obj = box(boxed=py_obj)

    @property
    def dec(self) -> msgpack.Decoder:
        return self._dec

    def decode(
        self,
        msg: bytes,
    ) -> Any:
        '''
        Decode received `msgpack` bytes into a local python object
        with special `msgspec.Struct` (or other type) handling
        determined by the decoder's configured type-spec.

        '''
        # https://jcristharif.com/msgspec/usage.html#typed-decoding
        return self._dec.decode(msg)


# ?TODO? time to remove this finally?
#
# -[x] TODO: a sub-decoder system as well?
# => No! already re-architected to include a "payload-receiver"
#   now found in `._ops`.
#
# -[x] do we still want to try and support the sub-decoder with
# `.Raw` technique in the case that the `Generic` approach gives
# future grief?
# => well YES but NO, since we went with the `PldRx` approach
#   instead!
#
# IF however you want to see the code that was staged for this
# from wayyy back, see the pure removal commit.
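Since `MsgCodec` above is mostly delegation, the underlying `msgspec.msgpack` roundtrip it wraps is just the following (the `Ping` struct is a toy type for the demo):

    import msgspec
    from msgspec import msgpack

    class Ping(msgspec.Struct, tag=True):
        sent: float

    enc = msgpack.Encoder()
    dec = msgpack.Decoder(type=Ping)

    # encode to wire bytes, then typed-decode back to the struct.
    wire: bytes = enc.encode(Ping(sent=1234.5))
    assert dec.decode(wire) == Ping(sent=1234.5)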
def mk_codec(
    ipc_pld_spec: Union[Type[Struct]]|Any|Raw = Raw,
    # tagged-struct-types-union set for `Decoder`ing of payloads, as
    # per https://jcristharif.com/msgspec/structs.html#tagged-unions.
    # NOTE that the default `Raw` here **is very intentional** since
    # the `PldRx._pld_dec: MsgDec` is responsible for per ipc-ctx-task
    # decoding of msg-specs defined by the user as part of **their**
    # `tractor` "app's" type-limited IPC msg-spec.

    # TODO: offering a per-msg(-field) type-spec such that
    # the fields can be dynamically NOT decoded and left as `Raw`
    # values which are later loaded by a sub-decoder specified
    # by `tag_field: str` value key?
    # payload_msg_specs: dict[
    #     str,  # tag_field value as sub-decoder key
    #     Union[Type[Struct]]  # `MsgType.pld` type spec
    # ]|None = None,

    libname: str = 'msgspec',

    # settings for encoding-to-send extension-types,
    # https://jcristharif.com/msgspec/extending.html#mapping-to-from-native-types
    # dec_hook: Callable|None = None,
    enc_hook: Callable|None = None,
    ext_types: list[Type]|None = None,

    # optionally provided msg-decoder from which we pull its,
    # |_.dec_hook()
    # |_.type
    ext_dec: MsgDec|None = None
    #
    # ?TODO? other params we might want to support
    # Encoder:
    # write_buffer_size=write_buffer_size,
    #
    # Decoder:
    # ext_hook: ext_hook_sig

) -> MsgCodec:
    '''
    Convenience factory for creating codecs eventually meant
    to be interchange lib agnostic (i.e. once we support more than
    just `msgspec` ;).

    '''
    pld_spec = ipc_pld_spec
    if enc_hook:
        if not ext_types:
            raise TypeError(
                f'If extending the serializable types with a custom encode hook (`enc_hook()`), '
                f'you must also provide the expected type set that the hook will handle '
                f'via a `ext_types: Union[Type]|None = None` argument!\n'
                f'\n'
                f'enc_hook = {enc_hook!r}\n'
                f'ext_types = {ext_types!r}\n'
            )

    dec_hook: Callable|None = None
    if ext_dec:
        dec: msgspec.Decoder = ext_dec.dec
        dec_hook = dec.dec_hook
        pld_spec |= dec.type
        if ext_types:
            pld_spec |= Union[*ext_types]

    # (manually) generate a msg-spec (how appropes) for all relevant
    # payload-boxing-struct-msg-types, parameterizing the
    # `PayloadMsg.pld: PayloadT` for the decoder such that all msgs
    # in our SC-RPC-protocol will automatically decode to
    # a type-"limited" payload (`Struct`) object (set).
    (
        ipc_msg_spec,
        msg_types,
    ) = mk_msg_spec(
        payload_type_union=pld_spec,
    )

    msg_spec_types: set[Type] = unpack_spec_types(ipc_msg_spec)
    assert (
        len(ipc_msg_spec.__args__) == len(msg_types)
        and
        len(msg_spec_types) == len(msg_types)
    )

    dec = msgpack.Decoder(
        type=ipc_msg_spec,
        dec_hook=dec_hook,
    )
    enc = msgpack.Encoder(
        enc_hook=enc_hook,
    )
    codec = MsgCodec(
        _enc=enc,
        _dec=dec,
        _pld_spec=pld_spec,
    )
    # sanity on expected backend support
    assert codec.lib.__name__ == libname
    return codec
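# A hedged usage sketch (editor-added, not from the diff): building
# a codec which extends the wire-spec with a custom type via the
# `enc_hook()` + `ext_types` pairing enforced by the guard in
# `mk_codec()` above; the `Decimal`-as-`str` mapping is illustrative
# only, not a mapping the library ships.
def _mk_custom_codec_sketch() -> MsgCodec:
    from decimal import Decimal

    def _enc_hook(obj: Any) -> Any:
        # map non-native types to a `msgspec`-builtin wire form
        if isinstance(obj, Decimal):
            return str(obj)
        raise NotImplementedError(f'No wire-mapping for {type(obj)}')

    return mk_codec(
        ipc_pld_spec=Any,
        enc_hook=_enc_hook,
        ext_types=[Decimal],
    )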
# instance of the default `msgspec.msgpack` codec settings, i.e.
# no custom structs, hooks or other special types.
#
# XXX NOTE XXX, this will break our `Context.start()` call!
#
# * by default we roundtrip the started pld-`value` and if you apply
#   this codec (globally anyway with `apply_codec()`) then the
#   `roundtripped` value will include a non-`.pld: Raw` which will
#   then type-error on the consequent `._ops.validate_payload_msg()`..
#
_def_msgspec_codec: MsgCodec = mk_codec(
    ipc_pld_spec=Any,
)

# The built-in IPC `Msg` spec.
# Our composing "shuttle" protocol which allows `tractor`-app code
# to use any `msgspec` supported type as the `PayloadMsg.pld` payload,
# https://jcristharif.com/msgspec/supported-types.html
#
_def_tractor_codec: MsgCodec = mk_codec(
    ipc_pld_spec=Raw,  # XXX should be the default, right!?
)
# -[x] TODO, IDEALLY provides for per-`trio.Task` specificity of the
# IPC msging codec used by the transport layer when doing
# `Channel.send()/.recv()` of wire data.
# => impled as our `PldRx` which is `Context` scoped B)

# ContextVar-TODO: DIDN'T WORK, kept resetting in every new task to default!?
# _ctxvar_MsgCodec: ContextVar[MsgCodec] = ContextVar(

# TreeVar-TODO: DIDN'T WORK, kept resetting in every new embedded nursery
# even though it's supposed to inherit from a parent context ???
#
# _ctxvar_MsgCodec: TreeVar[MsgCodec] = TreeVar(
#
# ^-NOTE-^: for this to work see the mods by @mikenerone from `trio` gitter:
#
# 22:02:54 <mikenerone> even for regular contextvars, all you have to do is:
#    `task: Task = trio.lowlevel.current_task()`
#    `task.parent_nursery.parent_task.context.run(my_ctx_var.set, new_value)`
#
# From a comment in his prop code he couldn't share outright:
# 1. For every TreeVar set in the current task (which covers what
#    we need from SynchronizerFacade), walk up the tree until the
#    root or finding one where the TreeVar is already set, setting
#    it in all of the contexts along the way.
# 2. For each of those, we also forcibly set the values that are
#    pending for child nurseries that have not yet accessed the
#    TreeVar.
# 3. We similarly set the pending values for the child nurseries
#    of the *current* task.
#
_ctxvar_MsgCodec: ContextVar[MsgCodec] = ContextVar(
    'msgspec_codec',
    default=_def_tractor_codec,
)
@cm
def apply_codec(
    codec: MsgCodec,

    ctx: Context|None = None,

) -> MsgCodec:
    '''
    Dynamically apply a `MsgCodec` to the current task's runtime
    context such that all (of a certain class of payload
    containing, i.e. `MsgType.pld: PayloadT`) IPC msgs are
    processed with it for that task.

    Uses a `contextvars.ContextVar` to ensure the scope of any
    codec setting matches the current `Context` or
    `._rpc.process_messages()` feeder task's prior setting without
    mutating any surrounding scope.

    When a `ctx` is supplied, only mod its `Context.pld_codec`;
    the applied setting's scope then matches the `@cm` block and
    DOES NOT change to the original (default) value in new tasks
    (as it does for `ContextVar`).

    '''
    __tracebackhide__: bool = True

    if ctx is not None:
        var: ContextVar = ctx._var_pld_codec
    else:
        # use IPC channel-connection "global" codec
        var: ContextVar = _ctxvar_MsgCodec

    orig: MsgCodec = var.get()

    assert orig is not codec
    if codec.pld_spec is None:
        breakpoint()

    log.info(
        'Applying new msg-spec codec\n\n'
        f'{codec}\n'
    )
    token: Token = var.set(codec)

    try:
        yield var.get()
    finally:
        var.reset(token)
        log.info(
            'Reverted to last msg-spec codec\n\n'
            f'{orig}\n'
        )
        assert var.get() is orig
    # ?TODO? for TreeVar approach which copies from the
    # cancel-scope of the prior value, NOT the prior task
    #
    # See the docs:
    # - https://tricycle.readthedocs.io/en/latest/reference.html#tree-variables
    # - https://github.com/oremanj/tricycle/blob/master/tricycle/_tests/test_tree_var.py
    #   ^- see docs for @cm `.being()` API
    #
    # with _ctxvar_MsgCodec.being(codec):
    #     new = _ctxvar_MsgCodec.get()
    #     assert new is codec
    #     yield codec
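# A hedged usage sketch (editor-added, not from the diff): apply a
# codec for the duration of a block; on exit the prior task-scoped
# codec is restored via the `Token.reset()` in the `finally:` above.
# Reuses the hypothetical `_mk_custom_codec_sketch()` from earlier.
def _apply_codec_sketch() -> None:
    codec: MsgCodec = _mk_custom_codec_sketch()
    with apply_codec(codec) as applied:
        assert applied is codec
        # IPC sends/recvs in this task now use `codec`..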
def current_codec() -> MsgCodec:
    '''
    Return the current `trio.Task.context`'s value
    for `msgspec_codec` used by `Channel.send/.recv()`
    for wire serialization.

    '''
    return _ctxvar_MsgCodec.get()
@cm
def limit_msg_spec(
    payload_spec: Union[Type[Struct]],

    # TODO: don't need this approach right?
    # -> related to the `MsgCodec._payload_decs` stuff above..
    # tagged_structs: list[Struct]|None = None,

    hide_tb: bool = True,
    **codec_kwargs,

) -> MsgCodec:
    '''
    Apply a `MsgCodec` that will natively decode the SC-msg set's
    `PayloadMsg.pld: Union[Type[Struct]]` payload fields using
    tagged-unions of `msgspec.Struct`s from the input `payload_spec`
    for all IPC contexts in use by the current `trio.Task`.

    '''
    __tracebackhide__: bool = hide_tb
    curr_codec: MsgCodec = current_codec()
    msgspec_codec: MsgCodec = mk_codec(
        ipc_pld_spec=payload_spec,
        **codec_kwargs,
    )
    with apply_codec(msgspec_codec) as applied_codec:
        assert applied_codec is msgspec_codec
        yield msgspec_codec

    assert curr_codec is current_codec()
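# A hedged usage sketch (editor-added, not from the diff): type-limit
# the payload-spec for the calling task; `_PointMsg` is a
# hypothetical user struct shown only for illustration.
class _PointMsg(Struct, tag=True):
    x: int
    y: int

def _limit_msg_spec_sketch() -> None:
    with limit_msg_spec(payload_spec=_PointMsg):
        # recvs in this task should now only validate `_PointMsg`
        # payloads (boxed in the std msg-set); anything else raises.
        ...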
# XXX: msgspec won't allow this with non-struct custom types
# like `NamespacePath`!@!
# @cm
# def extend_msg_spec(
#     payload_spec: Union[Type[Struct]],
#
# ) -> MsgCodec:
#     '''
#     Extend the current `MsgCodec.pld_spec` (type set) by extending
#     the payload spec to **include** the types specified by
#     `payload_spec`.
#
#     '''
#     codec: MsgCodec = current_codec()
#     pld_spec: Union[Type] = codec.pld_spec
#     extended_spec: Union[Type] = pld_spec|payload_spec
#
#     with limit_msg_spec(payload_types=extended_spec) as ext_codec:
#         # import pdbp; pdbp.set_trace()
#         assert ext_codec.pld_spec == extended_spec
#         yield ext_codec
#
# ^-TODO-^ is it impossible to make something like this orr!?
# TODO: make an auto-custom hook generator from a set of input custom
# types?
# -[ ] below is a proto design using a `TypeCodec` idea?
#
# type var for the expected interchange-lib's
# IPC-transport type when not available as a built-in
# serialization output.
WireT = TypeVar('WireT')


# TODO: some kinda (decorator) API for built-in subtypes
# that builds this implicitly by inspecting the `mro()`?
class TypeCodec(Protocol):
    '''
    A per-custom-type wire-transport serialization translator
    description type.

    '''
    src_type: Type
    wire_type: WireT

    def encode(obj: Type) -> WireT:
        ...

    def decode(
        obj_type: Type[WireT],
        obj: WireT,
    ) -> Type:
        ...


class MsgpackTypeCodec(TypeCodec):
    ...
def mk_codec_hooks(
    type_codecs: list[TypeCodec],

) -> tuple[Callable, Callable]:
    '''
    Deliver an `enc_hook()`/`dec_hook()` pair which handle
    manual conversion from an input `Type` set such that whenever
    the `TypeCodec.filter()` predicate matches the
    `TypeCodec.decode()` is called on the input native object by
    the `dec_hook()` and whenever the
    `isinstance(obj, TypeCodec.type)` matches against an
    `enc_hook(obj=obj)` the return value is taken from a
    `TypeCodec.encode(obj)` callback.

    '''
    ...
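# A hedged sketch (editor-added, not from the diff) of one plausible
# body for the unimplemented `mk_codec_hooks()` proto above, assuming
# each `TypeCodec` exposes `src_type` plus `encode()`/`decode()`:
def _mk_codec_hooks_sketch(
    type_codecs: list[TypeCodec],
) -> tuple[Callable, Callable]:

    def enc_hook(obj: Any) -> Any:
        # match-by-instance then delegate to the codec's encoder
        for tc in type_codecs:
            if isinstance(obj, tc.src_type):
                return tc.encode(obj)
        raise NotImplementedError(type(obj))

    def dec_hook(obj_type: Type, obj: Any) -> Any:
        # match-by-target-type then delegate to the codec's decoder
        for tc in type_codecs:
            if obj_type is tc.src_type:
                return tc.decode(obj_type, obj)
        return obj

    return enc_hook, dec_hook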
@@ -1,94 +0,0 @@
# tractor: structured concurrent "actors".
# Copyright 2018-eternity Tyler Goodlet.

# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.

# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU Affero General Public License for more details.

# You should have received a copy of the GNU Affero General Public License
# along with this program.  If not, see <https://www.gnu.org/licenses/>.

'''
Type-extension-utils for codec-ing (python) objects not
covered by the `msgspec.msgpack` protocol.

See the various API docs from `msgspec`.

extending from native types,
- https://jcristharif.com/msgspec/extending.html#mapping-to-from-native-types

converters,
- https://jcristharif.com/msgspec/converters.html
- https://jcristharif.com/msgspec/api.html#msgspec.convert

`Raw` fields,
- https://jcristharif.com/msgspec/api.html#raw
- support for `.convert()` and `Raw`,
  |_ https://jcristharif.com/msgspec/changelog.html

'''
from types import (
    ModuleType,
)
import typing
from typing import (
    Type,
    Union,
)


def dec_type_union(
    type_names: list[str],
    mods: list[ModuleType] = []
) -> Type|Union[Type]:
    '''
    Look up types by name, compile into a list and then create and
    return a `typing.Union` from the full set.

    '''
    # import importlib
    types: list[Type] = []
    for type_name in type_names:
        for mod in [
            typing,
            # importlib.import_module(__name__),
        ] + mods:
            if type_ref := getattr(
                mod,
                type_name,
                False,
            ):
                types.append(type_ref)

    # special case handling only..
    # ipc_pld_spec: Union[Type] = eval(
    #     pld_spec_str,
    #     {},  # globals
    #     {'typing': typing},  # locals
    # )

    return Union[*types]


def enc_type_union(
    union_or_type: Union[Type]|Type,
) -> list[str]:
    '''
    Encode a type-union or single type to a list of type-name-strings
    ready for IPC interchange.

    '''
    type_strs: list[str] = []
    for typ in getattr(
        union_or_type,
        '__args__',
        {union_or_type,},
    ):
        type_strs.append(typ.__qualname__)

    return type_strs
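# A hedged usage sketch (editor-added, not from the diff):
# roundtripping a type-union through its name-list form with the two
# helpers above. Note the default `mods` only scans `typing`, so a
# module which actually defines the names (eg. `builtins`) must be
# passed for builtin types.
import builtins

def _union_roundtrip_sketch() -> None:
    names: list[str] = enc_type_union(int|str)  # -> ['int', 'str']
    spec = dec_type_union(names, mods=[builtins])
    assert spec == Union[int, str]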
@@ -1,905 +0,0 @@
# tractor: structured concurrent "actors".
# Copyright 2018-eternity Tyler Goodlet.

# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.

# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU Affero General Public License for more details.

# You should have received a copy of the GNU Affero General Public License
# along with this program.  If not, see <https://www.gnu.org/licenses/>.

'''
Near-application abstractions for `MsgType.pld: PayloadT|Raw`
delivery, filtering and type checking as well as generic
operational helpers for processing transaction flows.

'''
from __future__ import annotations
from contextlib import (
    asynccontextmanager as acm,
    contextmanager as cm,
)
from typing import (
    Any,
    Callable,
    Type,
    TYPE_CHECKING,
    Union,
)
# ------ - ------
from msgspec import (
    msgpack,
    Raw,
    Struct,
    ValidationError,
)
import trio
# ------ - ------
from tractor.log import get_logger
from tractor._exceptions import (
    MessagingError,
    InternalError,
    _raise_from_unexpected_msg,
    MsgTypeError,
    _mk_recv_mte,
    pack_error,
)
from tractor._state import (
    current_ipc_ctx,
)
from ._codec import (
    mk_dec,
    MsgDec,
    MsgCodec,
    current_codec,
)
from .types import (
    CancelAck,
    Error,
    MsgType,
    PayloadT,
    Return,
    Started,
    Stop,
    Yield,
    pretty_struct,
)


if TYPE_CHECKING:
    from tractor._context import Context
    from tractor._streaming import MsgStream


log = get_logger(__name__)


_def_any_pldec: MsgDec[Any] = mk_dec(spec=Any)
class PldRx(Struct):
    '''
    A "msg payload receiver".

    The pairing of a "feeder" `trio.abc.ReceiveChannel` and an
    interchange-specific (eg. msgpack) payload field decoder. The
    validation/type-filtering rules are runtime mutable and allow
    type constraining the set of `MsgType.pld: Raw|PayloadT`
    values at runtime, per IPC task-context.

    This abstraction, being just below "user application code",
    allows for the equivalent of our `MsgCodec` (used for
    type-filtering IPC dialog protocol msgs against a msg-spec)
    but with granular control around payload delivery (i.e. the
    data-values user code actually sees and uses (the blobs that
    are "shuttled" by the wrapping dialog prot) such that invalid
    `.pld: Raw` can be decoded and handled by IPC-primitive user
    code (i.e. that operates on `Context` and `MsgStream` APIs)
    without knowledge of the lower level `Channel`/`MsgTransport`
    primitives nor the `MsgCodec` in use. Further, lazily decoding
    payload blobs allows for topical (and maybe intentionally
    "partial") encryption of msg field subsets.

    '''
    # TODO: better to bind it here?
    # _rx_mc: trio.MemoryReceiveChannel
    _pld_dec: MsgDec

    @property
    def pld_dec(self) -> MsgDec:
        return self._pld_dec
    @cm
    def limit_plds(
        self,
        spec: Union[Type[Struct]],
        **dec_kwargs,

    ) -> MsgDec:
        '''
        Type-limit the loadable msg payloads via an applied
        `MsgDec` given an input spec, revert to prior decoder on
        exit.

        '''
        # TODO, ensure we pull the current `MsgCodec`'s custom
        # dec/enc_hook settings as well ?
        # -[ ] see `._codec.mk_codec()` inputs
        #
        orig_dec: MsgDec = self._pld_dec
        limit_dec: MsgDec = mk_dec(
            spec=spec,
            **dec_kwargs,
        )
        try:
            self._pld_dec = limit_dec
            yield limit_dec
        finally:
            self._pld_dec = orig_dec

    @property
    def dec(self) -> msgpack.Decoder:
        return self._pld_dec.dec
    def recv_msg_nowait(
        self,
        # TODO: make this `MsgStream` compat as well, see above^
        # ipc_prim: Context|MsgStream,
        ipc: Context|MsgStream,

        ipc_msg: MsgType|None = None,
        expect_msg: Type[MsgType]|None = None,
        hide_tb: bool = False,
        **dec_pld_kwargs,

    ) -> tuple[
        MsgType[PayloadT],
        PayloadT,
    ]:
        '''
        Attempt a non-blocking receive of a message from the
        `._rx_chan` and unwrap its payload, delivering the pair to
        the caller.

        '''
        __tracebackhide__: bool = hide_tb

        msg: MsgType = (
            ipc_msg
            or
            # sync-rx msg from underlying IPC feeder (mem-)chan
            ipc._rx_chan.receive_nowait()
        )
        pld: PayloadT = self.decode_pld(
            msg,
            ipc=ipc,
            expect_msg=expect_msg,
            hide_tb=hide_tb,
            **dec_pld_kwargs,
        )
        return (
            msg,
            pld,
        )
    async def recv_msg(
        self,
        ipc: Context|MsgStream,
        expect_msg: MsgType,

        # NOTE: ONLY for handling `Stop`-msgs that arrive during
        # a call to `drain_to_final_msg()` above!
        passthrough_non_pld_msgs: bool = True,
        hide_tb: bool = True,

        **decode_pld_kwargs,

    ) -> tuple[MsgType, PayloadT]:
        '''
        Retrieve the next avail IPC msg, decode its payload, and
        return the (msg, pld) pair.

        '''
        __tracebackhide__: bool = hide_tb
        msg: MsgType = await ipc._rx_chan.receive()
        match msg:
            case Return()|Error():
                log.runtime(
                    f'Rxed final outcome msg\n'
                    f'{msg}\n'
                )
            case Stop():
                log.runtime(
                    f'Rxed stream stopped msg\n'
                    f'{msg}\n'
                )
                if passthrough_non_pld_msgs:
                    return msg, None

        # TODO: is there some way we can inject the decoded
        # payload into an existing output buffer for the original
        # msg instance?
        pld: PayloadT = self.decode_pld(
            msg,
            ipc=ipc,
            expect_msg=expect_msg,
            hide_tb=hide_tb,

            **decode_pld_kwargs,
        )
        return (
            msg,
            pld,
        )
    async def recv_pld(
        self,
        ipc: Context|MsgStream,
        ipc_msg: MsgType[PayloadT]|None = None,
        expect_msg: Type[MsgType]|None = None,
        hide_tb: bool = True,

        **dec_pld_kwargs,

    ) -> PayloadT:
        '''
        Receive a `MsgType`, then decode and return its `.pld` field.

        '''
        __tracebackhide__: bool = hide_tb
        msg: MsgType = (
            ipc_msg
            or
            # async-rx msg from underlying IPC feeder (mem-)chan
            await ipc._rx_chan.receive()
        )
        if (
            type(msg) is Return
        ):
            log.info(
                f'Rxed final result msg\n'
                f'{msg}\n'
            )
        return self.decode_pld(
            msg=msg,
            ipc=ipc,
            expect_msg=expect_msg,
            **dec_pld_kwargs,
        )
    def decode_pld(
        self,
        msg: MsgType,
        ipc: Context|MsgStream,
        expect_msg: Type[MsgType]|None,

        raise_error: bool = True,
        hide_tb: bool = True,

        # XXX for special (default?) case of send side call with
        # `Context.started(validate_pld_spec=True)`
        is_started_send_side: bool = False,

    ) -> PayloadT|Raw:
        '''
        Decode a msg's payload field: `MsgType.pld: PayloadT|Raw` and
        return the value or raise an appropriate error.

        '''
        __tracebackhide__: bool = hide_tb
        src_err: BaseException|None = None
        match msg:
            # payload-data shuttle msg; deliver the `.pld` value
            # directly to IPC (primitive) client-consumer code.
            case (
                Started(pld=pld)  # sync phase
                |Yield(pld=pld)  # streaming phase
                |Return(pld=pld)  # termination phase
            ):
                try:
                    pld: PayloadT = self._pld_dec.decode(pld)
                    log.runtime(
                        'Decoded msg payload\n\n'
                        f'{msg}\n'
                        f'where payload decoded as\n'
                        f'|_pld={pld!r}\n'
                    )
                    return pld
                except TypeError as typerr:
                    __tracebackhide__: bool = False
                    raise typerr

                # XXX pld-value type failure
                except ValidationError as valerr:
                    # pack the mte into an error-msg for
                    # reraise below; ensure remote-actor-err
                    # info is displayed nicely?
                    mte: MsgTypeError = _mk_recv_mte(
                        msg=msg,
                        codec=self.pld_dec,
                        src_validation_error=valerr,
                        is_invalid_payload=True,
                        expected_msg=expect_msg,
                    )
                    # NOTE: just raise the MTE inline instead of all
                    # the pack-unpack-repack non-sense when this is
                    # a "send side" validation error.
                    if is_started_send_side:
                        raise mte

                    # NOTE: the `.message` is automatically
                    # transferred into the message as long as we
                    # define it as a `Error.message` field.
                    err_msg: Error = pack_error(
                        exc=mte,
                        cid=msg.cid,
                        src_uid=(
                            ipc.chan.uid
                            if not is_started_send_side
                            else ipc._actor.uid
                        ),
                    )
                    mte._ipc_msg = err_msg

                    # XXX override the `msg` passed to
                    # `_raise_from_unexpected_msg()` (below) so
                    # that we're effectively able to use that same
                    # func to unpack and raise an "emulated remote
                    # `Error`" of this local MTE.
                    msg = err_msg
                    # XXX NOTE: so when the `_raise_from_unexpected_msg()`
                    # raises the boxed `err_msg` from above it raises
                    # it from the above caught interchange-lib
                    # validation error.
                    src_err = valerr

            # a runtime-internal RPC endpoint response.
            # always passthrough since (internal) runtime
            # responses are generally never exposed to consumer
            # code.
            case CancelAck(
                pld=bool(cancelled)
            ):
                return cancelled

            case Error():
                src_err = MessagingError(
                    'IPC ctx dialog terminated without `Return`-ing a result\n'
                    f'Instead it raised {msg.boxed_type_str!r}!'
                )
                # XXX NOTE XXX another super subtle runtime-y thing..
                #
                # - when user code (transitively) calls into this
                #   func (usually via a `Context/MsgStream` API) we
                #   generally want errors to propagate immediately
                #   and directly so that the user can define how it
                #   wants to handle them.
                #
                #  HOWEVER,
                #
                # - for certain runtime calling cases, we don't want to
                #   directly raise since the calling code might have
                #   special logic around whether to raise the error
                #   or suppress it silently (eg. a `ContextCancelled`
                #   received from the far end which was requested by
                #   this side, aka a self-cancel).
                #
                # SO, we offer a flag to control this.
                if not raise_error:
                    return src_err

            case Stop(cid=cid):
                ctx: Context = getattr(ipc, 'ctx', ipc)
                message: str = (
                    f'{ctx.side!r}-side of ctx received stream-`Stop` from '
                    f'{ctx.peer_side!r} peer ?\n'
                    f'|_cid: {cid}\n\n'

                    f'{pretty_struct.pformat(msg)}\n'
                )
                if ctx._stream is None:
                    explain: str = (
                        f'BUT, no `MsgStream` (was) open(ed) on this '
                        f'{ctx.side!r}-side of the IPC ctx?\n'
                        f'Maybe check your code for streaming phase race conditions?\n'
                    )
                    log.warning(
                        message
                        +
                        explain
                    )
                    # let caller decide what to do when only one
                    # side opened a stream, don't raise.
                    return msg

                else:
                    explain: str = (
                        'Received a `Stop` when it should NEVER be possible!?!?\n'
                    )
                    # TODO: this is constructed inside
                    # `_raise_from_unexpected_msg()` but maybe we
                    # should pass it in?
                    # src_err = trio.EndOfChannel(explain)
                    src_err = None

            case _:
                src_err = InternalError(
                    'Invalid IPC msg ??\n\n'
                    f'{msg}\n'
                )

        # TODO: maybe use the new `.add_note()` from 3.11?
        # |_https://docs.python.org/3.11/library/exceptions.html#BaseException.add_note
        #
        # fallthrough and raise from `src_err`
        try:
            _raise_from_unexpected_msg(
                ctx=getattr(ipc, 'ctx', ipc),
                msg=msg,
                src_err=src_err,
                log=log,
                expect_msg=expect_msg,
                hide_tb=hide_tb,
            )
        except UnboundLocalError:
            # XXX if there's an internal lookup error in the above
            # code (prolly on `src_err`) we want to show this frame
            # in the tb!
            __tracebackhide__: bool = False
            raise
@cm
def limit_plds(
    spec: Union[Type[Struct]],
    **dec_kwargs,

) -> MsgDec:
    '''
    Apply a `MsgCodec` that will natively decode the SC-msg set's
    `PayloadMsg.pld: Union[Type[Struct]]` payload fields using
    tagged-unions of `msgspec.Struct`s from the input `spec`
    for all IPC contexts in use by the current `trio.Task`.

    '''
    __tracebackhide__: bool = True
    curr_ctx: Context|None = current_ipc_ctx()
    if curr_ctx is None:
        raise RuntimeError(
            'No IPC `Context` is active !?\n'
            'Did you open `limit_plds()` from outside '
            'a `Portal.open_context()` scope-block?'
        )
    try:
        rx: PldRx = curr_ctx._pld_rx
        orig_pldec: MsgDec = rx.pld_dec
        with rx.limit_plds(
            spec=spec,
            **dec_kwargs,
        ) as pldec:
            log.runtime(
                'Applying payload-decoder\n\n'
                f'{pldec}\n'
            )
            yield pldec

    except BaseException:
        __tracebackhide__: bool = False
        raise

    finally:
        log.runtime(
            'Reverted to previous payload-decoder\n\n'
            f'{orig_pldec}\n'
        )
        # sanity on orig settings
        assert rx.pld_dec is orig_pldec
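# A hedged usage sketch (editor-added, not from the diff):
# type-limiting an IPC ctx's deliverable payloads; must be entered
# *inside* a `Portal.open_context()` scope per the runtime check
# above. The `ep` endpoint and `IpcPld` struct are hypothetical
# user code.
async def _limit_plds_sketch(portal, ep) -> None:
    class IpcPld(Struct, tag=True):
        field: str

    async with portal.open_context(ep) as (ctx, first):
        with limit_plds(IpcPld):
            # any non-`IpcPld` `.pld` rxed in this block should now
            # raise a `MsgTypeError`.
            ...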
@acm
async def maybe_limit_plds(
    ctx: Context,
    spec: Union[Type[Struct]]|None = None,
    dec_hook: Callable|None = None,
    **kwargs,

) -> MsgDec|None:
    '''
    Async compat maybe-payload type limiter.

    Mostly for use inside other internal `@acm`s such that a separate
    indent block isn't needed when an async one is already being
    used.

    '''
    if (
        spec is None
        and
        dec_hook is None
    ):
        yield None
        return

    # sanity check on IPC scoping
    curr_ctx: Context = current_ipc_ctx()
    assert ctx is curr_ctx

    with ctx._pld_rx.limit_plds(
        spec=spec,
        dec_hook=dec_hook,
        **kwargs,
    ) as msgdec:
        yield msgdec

    # when the applied spec is unwound/removed, the same IPC-ctx
    # should still be in scope.
    curr_ctx: Context = current_ipc_ctx()
    assert ctx is curr_ctx
async def drain_to_final_msg(
    ctx: Context,

    msg_limit: int = 6,
    hide_tb: bool = True,

) -> tuple[
    Return|None,
    list[MsgType]
]:
    '''
    Drain IPC msgs delivered to the underlying IPC context's
    rx-mem-chan (i.e. from `Context._rx_chan`) in search of a final
    `Return` or `Error` msg.

    Deliver the `Return` + preceding drained msgs (`list[MsgType]`)
    as a pair unless an `Error` is found, in which case unpack and
    raise it.

    The motivation here is to always capture any remote error relayed
    by the remote peer task during a ctxc condition.

    For eg. a ctxc-request may be sent to the peer as part of the
    local task's (request for) cancellation but then that same task
    **also errors** before executing the teardown in the
    `Portal.open_context().__aexit__()` block. In such error-on-exit
    cases we want to always capture and raise any delivered remote
    error (like an expected ctxc-ACK) as part of the final
    `ctx.wait_for_result()` teardown sequence such that the
    `Context.outcome` related state always reflects what transpired
    even after ctx closure and the `.open_context()` block exit.

    '''
    raise_overrun: bool = not ctx._allow_overruns
    parent_never_opened_stream: bool = ctx._stream is None

    # wait for a final context result by collecting (but
    # basically ignoring) any bi-dir-stream msgs still in transit
    # from the far end.
    pre_result_drained: list[MsgType] = []
    result_msg: Return|Error|None = None
    while not (
        ctx.maybe_error
        and
        not ctx._final_result_is_set()
    ):
        try:
            # receive all msgs, scanning for either a final result
            # or error; the underlying call should never raise any
            # remote error directly!
            msg, pld = await ctx._pld_rx.recv_msg(
                ipc=ctx,
                expect_msg=Return,
                raise_error=False,
                hide_tb=hide_tb,
            )
            # ^-TODO-^ some bad ideas?
            # -[ ] wrap final outcome .receive() in a scope so
            #     it can be cancelled out of band if needed?
            # |_with trio.CancelScope() as res_cs:
            #       ctx._res_scope = res_cs
            #       msg: dict = await ctx._rx_chan.receive()
            #   if res_cs.cancelled_caught:
            #
            # -[ ] make sure pause points work here for REPLing
            #   the runtime itself; i.e. ensure there's no hangs!
            # |_from tractor.devx._debug import pause
            #   await pause()

        # NOTE: we get here if the far end was
        # `ContextCancelled` in 2 cases:
        # 1. we requested the cancellation and thus
        #    SHOULD NOT raise that far end error,
        # 2. WE DID NOT REQUEST that cancel and thus
        #    SHOULD RAISE HERE!
        except trio.Cancelled as _taskc:
            taskc: trio.Cancelled = _taskc

            # report when the cancellation wasn't (ostensibly) due to
            # the RPC operation itself but rather some surrounding
            # parent cancel-scope.
            if not ctx._scope.cancel_called:
                task: trio.lowlevel.Task = trio.lowlevel.current_task()
                rent_n: trio.Nursery = task.parent_nursery
                if (
                    (local_cs := rent_n.cancel_scope).cancel_called
                ):
                    log.cancel(
                        'RPC-ctx cancelled by local-parent scope during drain!\n\n'
                        f'c}}>\n'
                        f' |_{rent_n}\n'
                        f'   |_.cancel_scope = {local_cs}\n'
                        f'   |_>c}}\n'
                        f'      |_{ctx.pformat(indent=" "*9)}'
                        # ^TODO, some (other) simpler repr here?
                    )
                    __tracebackhide__: bool = False

            else:
                log.cancel(
                    f'IPC ctx cancelled externally during result drain ?\n'
                    f'{ctx}'
                )
            # CASE 2: mask the local cancelled-error(s)
            # only when we are sure the remote error is
            # the source cause of this local task's
            # cancellation.
            ctx.maybe_raise(
                hide_tb=hide_tb,
                from_src_exc=taskc,
                # ?TODO? when *should* we use this?
            )

            # CASE 1: we DID request the cancel we simply
            # continue to bubble up as normal.
            raise taskc

        match msg:

            # final result arrived!
            case Return():
                log.runtime(
                    'Context delivered final draining msg:\n'
                    f'{pretty_struct.pformat(msg)}'
                )
                ctx._result: Any = pld
                result_msg = msg
                break

            # far end task is still streaming to us so discard
            # and report depending on local ctx state.
            case Yield():
                pre_result_drained.append(msg)
                if (
                    not parent_never_opened_stream
                    and (
                        (ctx._stream.closed
                         and
                         (reason := 'stream was already closed')
                        ) or
                        (ctx.cancel_acked
                         and
                         (reason := 'ctx cancelled other side')
                        )
                        or (ctx._cancel_called
                            and
                            (reason := 'ctx called `.cancel()`')
                        )
                        or (len(pre_result_drained) > msg_limit
                            and
                            (reason := f'"yield" limit={msg_limit}')
                        )
                    )
                ):
                    log.cancel(
                        'Cancelling `MsgStream` drain since '
                        f'{reason}\n\n'
                        f'<= {ctx.chan.uid}\n'
                        f'  |_{ctx._nsf}()\n\n'
                        f'=> {ctx._task}\n'
                        f'  |_{ctx._stream}\n\n'

                        f'{pretty_struct.pformat(msg)}\n'
                    )
                    break

                # drain up to the `msg_limit` hoping to get
                # a final result or error/ctxc.
                else:
                    report: str = (
                        'Ignoring "yield" msg during `ctx.result()` drain..\n'
                        f'<= {ctx.chan.uid}\n'
                        f'  |_{ctx._nsf}()\n\n'
                        f'=> {ctx._task}\n'
                        f'  |_{ctx._stream}\n\n'

                        f'{pretty_struct.pformat(msg)}\n'
                    )
                    if parent_never_opened_stream:
                        report = (
                            f'IPC ctx never opened stream on {ctx.side!r}-side!\n'
                            f'\n'
                            # f'{ctx}\n'
                        ) + report

                    log.warning(report)
                    continue

            # stream terminated, but no result yet..
            #
            # TODO: work out edge cases here where
            # a stream is open but the task also calls
            # this?
            # -[ ] should be a runtime error if a stream is open right?
            # Stop()
            case Stop():
                pre_result_drained.append(msg)
                log.runtime(  # normal/expected shutdown transaction
                    'Remote stream terminated due to "stop" msg:\n\n'
                    f'{pretty_struct.pformat(msg)}\n'
                )
                continue

            # remote error msg, likely already handled inside
            # `Context._deliver_msg()`
            case Error():
                # TODO: can we replace this with `ctx.maybe_raise()`?
                # -[ ] would this be handier for this case maybe?
                # |_async with maybe_raise_on_exit() as raises:
                #       if raises:
                #           log.error('some msg about raising..')
                #
                re: Exception|None = ctx._remote_error
                if re:
                    assert msg is ctx._cancel_msg
                    # NOTE: this solved a super duper edge case XD
                    # this was THE super duper edge case of:
                    # - local task opens a remote task,
                    # - requests remote cancellation of far end
                    #   ctx/tasks,
                    # - needs to wait for the cancel ack msg
                    #   (ctxc) or some result in the race case
                    #   where the other side's task returns
                    #   before the cancel request msg is ever
                    #   rxed and processed,
                    # - here this surrounding drain loop (which
                    #   iterates all ipc msgs until the ack or
                    #   an early result arrives) was NOT exiting
                    #   since we are the edge case: local task
                    #   does not re-raise any ctxc it receives
                    #   IFF **it** was the cancellation
                    #   requester..
                    #
                    # XXX will raise if necessary but ow break
                    # from loop presuming any suppressed error
                    # (ctxc) should terminate the context!
                    ctx._maybe_raise_remote_err(
                        re,
                        # NOTE: obvi we don't care if we
                        # overran the far end if we're already
                        # waiting on a final result (msg).
                        # raise_overrun_from_self=False,
                        raise_overrun_from_self=raise_overrun,
                    )
                    result_msg = msg
                    break  # OOOOOF, yeah obvi we need this..

                else:
                    # bubble the original src key error
                    raise

            # XXX should pretty much never get here unless someone
            # overrides the default `MsgType` spec.
            case _:
                pre_result_drained.append(msg)
                # It's definitely an internal error if any other
                # msg type without a `'cid'` field arrives here!
                report: str = (
                    f'Invalid or unknown msg type {type(msg)!r}!?\n'
                )
                if not msg.cid:
                    report += (
                        '\nWhich also has no `.cid` field?\n'
                    )

                raise MessagingError(
                    report
                    +
                    f'\n{msg}\n'
                )

    else:
        log.cancel(
            'Skipping `MsgStream` drain since final outcome is set\n\n'
            f'{ctx.outcome}\n'
        )

    __tracebackhide__: bool = hide_tb
    return (
        result_msg,
        pre_result_drained,
    )
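# A hedged usage sketch (editor-added, not from the diff) of the
# runtime-internal call pattern for the drain loop above, normally
# driven from `Context`'s result-waiting paths:
async def _drain_outcome_sketch(ctx: Context) -> MsgType|None:
    final_msg, drained = await drain_to_final_msg(ctx)
    # a boxed `Error` is either raised inside the loop or delivered
    # as the outcome msg; a `Return` (if any) carries the result pld.
    return final_msg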
def validate_payload_msg(
    pld_msg: Started|Yield|Return,
    pld_value: PayloadT,
    ipc: Context|MsgStream,

    raise_mte: bool = True,
    strict_pld_parity: bool = False,
    hide_tb: bool = True,

) -> MsgTypeError|None:
    '''
    Validate a `PayloadMsg.pld` value with the current
    IPC ctx's `PldRx` and raise an appropriate `MsgTypeError`
    on failure.

    '''
    __tracebackhide__: bool = hide_tb
    codec: MsgCodec = current_codec()
    msg_bytes: bytes = codec.encode(pld_msg)
    roundtripped: Started|None = None
    try:
        roundtripped: Started = codec.decode(msg_bytes)
    except TypeError as typerr:
        __tracebackhide__: bool = False
        raise typerr

    try:
        ctx: Context = getattr(ipc, 'ctx', ipc)
        pld: PayloadT = ctx.pld_rx.decode_pld(
            msg=roundtripped,
            ipc=ipc,
            expect_msg=Started,
            hide_tb=hide_tb,
            is_started_send_side=True,
        )
        if (
            strict_pld_parity
            and
            pld != pld_value
        ):
            # TODO: make that one a mod func too..
            diff = pretty_struct.Struct.__sub__(
                roundtripped,
                pld_msg,
            )
            complaint: str = (
                'Started value does not match after roundtrip?\n\n'
                f'{diff}'
            )
            raise ValidationError(complaint)

    # usually due to `.decode()` input type
    except TypeError as typerr:
        __tracebackhide__: bool = False
        raise typerr

    # raise any msg type error NO MATTER WHAT!
    except ValidationError as verr:
        try:
            mte: MsgTypeError = _mk_recv_mte(
                msg=roundtripped,
                codec=codec,
                src_validation_error=verr,
                verb_header='Trying to send ',
                is_invalid_payload=True,
            )
        except BaseException as _be:
            if not roundtripped:
                raise verr

            be = _be
            __tracebackhide__: bool = False
            raise be

        if not raise_mte:
            return mte

        raise mte from verr
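# A hedged usage sketch (editor-added, not from the diff) of the
# send-side pre-validation flow above, as reportedly driven by
# `Context.started(..., validate_pld_spec=True)`; the `Started` field
# names used here are assumptions for illustration only.
def _validate_started_sketch(ctx: Context, value: Any) -> None:
    validate_payload_msg(
        pld_msg=Started(cid=ctx.cid, pld=value),
        pld_value=value,
        ipc=ctx,
        strict_pld_parity=True,
    )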
@@ -1,342 +0,0 @@
# tractor: structured concurrent "actors".
# Copyright 2018-eternity Tyler Goodlet.

# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.

# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU Affero General Public License for more details.

# You should have received a copy of the GNU Affero General Public License
# along with this program.  If not, see <https://www.gnu.org/licenses/>.

'''
Prettified version of `msgspec.Struct` for easier console grokin.

'''
from __future__ import annotations
from collections import UserList
from typing import (
    Any,
    Iterator,
)

from msgspec import (
    msgpack,
    Struct as _Struct,
    structs,
)
# from pprint import (
#     saferepr,
# )

from tractor.log import get_logger

log = get_logger()
# TODO: auto-gen type sig for input func both for
# type-msgs and logging of RPC tasks?
# taken and modified from:
# https://stackoverflow.com/a/57110117
# import inspect
# from typing import List

# def my_function(input_1: str, input_2: int) -> list[int]:
#     pass

# def types_of(func):
#     specs = inspect.getfullargspec(func)
#     return_type = specs.annotations['return']
#     input_types = [t.__name__ for s, t in specs.annotations.items() if s != 'return']
#     return f'{func.__name__}({": ".join(input_types)}) -> {return_type}'

# types_of(my_function)
class DiffDump(UserList):
    '''
    Very simple list delegator that repr() dumps (presumed) tuple
    elements of the form `tuple[str, Any, Any]` in a nice
    multi-line readable form for analyzing `Struct` diffs.

    '''
    def __repr__(self) -> str:
        if not len(self):
            return super().__repr__()

        # format by displaying item pair's ``repr()`` on multiple,
        # indented lines such that they are more easily visually
        # comparable when printed to console.
        repstr: str = '[\n'
        for k, left, right in self:
            repstr += (
                f'({k},\n'
                f' |_{repr(left)},\n'
                f' |_{repr(right)},\n'
                ')\n'
            )
        repstr += ']\n'
        return repstr
def iter_fields(struct: Struct) -> Iterator[
    tuple[
        structs.FieldInfo,
        str,
        Any,
    ]
]:
    '''
    Iterate over all non-@property fields of this struct.

    '''
    fi: structs.FieldInfo
    for fi in structs.fields(struct):
        key: str = fi.name
        val: Any = getattr(struct, key)
        yield (
            fi,
            key,
            val,
        )
def pformat(
    struct: Struct,
    field_indent: int = 2,
    indent: int = 0,

) -> str:
    '''
    Recursion-safe `pprint.pformat()` style formatting of
    a `msgspec.Struct` for sane reading by a human using a REPL.

    '''
    # global whitespace indent
    ws: str = ' '*indent

    # field whitespace indent
    field_ws: str = ' '*(field_indent + indent)

    # qtn: str = ws + struct.__class__.__qualname__
    qtn: str = struct.__class__.__qualname__

    obj_str: str = ''  # accumulator
    fi: structs.FieldInfo
    k: str
    v: Any
    for fi, k, v in iter_fields(struct):

        # TODO: how can we prefer `Literal['option1', 'option2,
        # ..]` over .__name__ == `Literal` but still get only the
        # latter for simple types like `str | int | None` etc..?
        ft: type = fi.type
        typ_name: str = getattr(ft, '__name__', str(ft))

        # recurse to get sub-struct's `.pformat()` output Bo
        if isinstance(v, Struct):
            val_str: str = v.pformat(
                indent=field_indent + indent,
                field_indent=indent + field_indent,
            )

        else:
            val_str: str = repr(v)

            # XXX LOL, below just seems to be f#$%in causing
            # recursion errs..
            #
            # the `pprint` recursion-safe format:
            # https://docs.python.org/3.11/library/pprint.html#pprint.saferepr
            # try:
            #     val_str: str = saferepr(v)
            # except Exception:
            #     log.exception(
            #         'Failed to `saferepr({type(struct)})` !?\n'
            #     )
            #     raise
            # return _Struct.__repr__(struct)

        # TODO: LOLOL use `textwrap.indent()` instead dawwwwwg!
        obj_str += (field_ws + f'{k}: {typ_name} = {val_str},\n')

    return (
        f'{qtn}(\n'
        f'{obj_str}'
        f'{ws})'
    )
class Struct(
    _Struct,

    # https://jcristharif.com/msgspec/structs.html#tagged-unions
    # tag='pikerstruct',
    # tag=True,
):
    '''
    A "human friendlier" (aka repl buddy) struct subtype.

    '''
    def to_dict(
        self,
        include_non_members: bool = True,

    ) -> dict:
        '''
        Like it sounds.. direct delegation to:
        https://jcristharif.com/msgspec/api.html#msgspec.structs.asdict

        BUT, by default we pop all non-member (aka not defined as
        struct fields) fields by default.

        '''
        asdict: dict = structs.asdict(self)
        if include_non_members:
            return asdict

        # only return a dict of the struct members
        # which were provided as input, NOT anything
        # added as type-defined `@property` methods!
        sin_props: dict = {}
        fi: structs.FieldInfo
        for fi, k, v in iter_fields(self):
            sin_props[k] = asdict[k]

        return sin_props

    pformat = pformat

    def __repr__(self) -> str:
        try:
            return pformat(self)
        except Exception:
            log.exception(
                f'Failed to `pformat({type(self)})` !?\n'
            )
            return _Struct.__repr__(self)

    # __repr__ = pformat
    # __str__ = __repr__ = pformat
    # TODO: use a pprint.PrettyPrinter instance around ONLY rendering
    # inside a known tty?
    # def __repr__(self) -> str:
    #     ...

    def copy(
        self,
        update: dict | None = None,

    ) -> Struct:
        '''
        Validate-typecast all self defined fields, return a copy of
        us with all such fields.

        NOTE: This is kinda like the default behaviour in
        `pydantic.BaseModel` except a copy of the object is
        returned making it compat with `frozen=True`.

        '''
        if update:
            for k, v in update.items():
                setattr(self, k, v)

        # NOTE: roundtrip serialize to validate
        # - encode to msgpack binary format,
        # - decode that back to a struct.
        return msgpack.Decoder(type=type(self)).decode(
            msgpack.Encoder().encode(self)
        )

    def typecast(
        self,

        # TODO: allow only casting a named subset?
        # fields: set[str] | None = None,

    ) -> None:
        '''
        Cast all fields using their declared type annotations
        (kinda like what `pydantic` does by default).

        NOTE: this of course won't work on frozen types, use
        ``.copy()`` above in such cases.

        '''
        # https://jcristharif.com/msgspec/api.html#msgspec.structs.fields
        fi: structs.FieldInfo
        for fi in structs.fields(self):
            setattr(
                self,
                fi.name,
                fi.type(getattr(self, fi.name)),
            )

    # TODO: make a mod func instead and just point to it here for
    # method impl?
    def __sub__(
        self,
        other: Struct,

    ) -> DiffDump[tuple[str, Any, Any]]:
        '''
        Compare fields/items key-wise and return a `DiffDump`
        for easy visual REPL comparison B)

        '''
        diffs: DiffDump[tuple[str, Any, Any]] = DiffDump()
        for fi in structs.fields(self):
            attr_name: str = fi.name
            ours: Any = getattr(self, attr_name)
            theirs: Any = getattr(other, attr_name)
            if ours != theirs:
                diffs.append((
                    attr_name,
                    ours,
                    theirs,
                ))

        return diffs

    @classmethod
    def fields_diff(
        cls,
        other: dict|Struct,

    ) -> DiffDump[tuple[str, Any, Any]]:
        '''
        Very similar to `PrettyStruct.__sub__()` except accepts an
        input `other: dict` (presumably that would normally be called
        like `Struct(**other)`) which returns a `DiffDump` of the
        fields of the struct and the `dict`'s fields.

        '''
        nullish = object()
        consumed: dict = other.copy()
        diffs: DiffDump[tuple[str, Any, Any]] = DiffDump()
        for fi in structs.fields(cls):
            field_name: str = fi.name
            # ours: Any = getattr(self, field_name)
            theirs: Any = consumed.pop(field_name, nullish)
            if theirs is nullish:
                diffs.append((
                    field_name,
                    f'{fi.type!r}',
                    'NOT-DEFINED in `other: dict`',
                ))

        # when there are lingering fields in `other` that this struct
        # DOES NOT define we also append those.
        if consumed:
            for k, v in consumed.items():
                diffs.append((
                    k,
                    f'NOT-DEFINED for `{cls.__name__}`',
                    f'`other: dict` has value = {v!r}',
                ))

        return diffs
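# A hedged usage sketch (editor-added, not from the diff): field-wise
# diffing two struct instances via `Struct.__sub__()` above, with the
# result rendered by `DiffDump.__repr__()`.
class _Point(Struct):
    x: int
    y: int

def _struct_diff_sketch() -> None:
    diffs: DiffDump = _Point(x=1, y=2) - _Point(x=1, y=3)
    print(diffs)  # one `(y, |_2, |_3,)` entry expected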
@ -76,11 +76,9 @@ class NamespacePath(str):
         return self._ref

     @staticmethod
-    def _mk_fqnp(
-        ref: type|object,
-    ) -> tuple[str, str]:
+    def _mk_fqnp(ref: type | object) -> tuple[str, str]:
         '''
-        Generate a minial `str` pair which describes a python
+        Generate a minial ``str`` pair which describes a python
         object's namespace path and object/type name.

         In more precise terms something like:
@ -89,9 +87,10 @@ class NamespacePath(str):
           of THIS type XD

         '''
-        if isfunction(ref):
+        if (
+            isfunction(ref)
+        ):
             name: str = getattr(ref, '__name__')
-            mod_name: str = ref.__module__

         elif ismethod(ref):
             # build out the path manually i guess..?
@ -100,19 +99,15 @@ class NamespacePath(str):
                 type(ref.__self__).__name__,
                 ref.__func__.__name__,
             ])
-            mod_name: str = ref.__self__.__module__

         else:  # object or other?
             # isinstance(ref, object)
             # and not isfunction(ref)
             name: str = type(ref).__name__
-            mod_name: str = ref.__module__

-        # TODO: return static value direactly?
-        #
         # fully qualified namespace path, tuple.
         fqnp: tuple[str, str] = (
-            mod_name,
+            ref.__module__,
             name,
         )
         return fqnp
@ -120,7 +115,7 @@ class NamespacePath(str):
     @classmethod
     def from_ref(
         cls,
-        ref: type|object,
+        ref: type | object,

     ) -> NamespacePath:
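Reduced to its essentials, `_mk_fqnp()` implements a few stdlib-introspection rules; a standalone illustrative sketch (not the class's actual code):

    from inspect import isfunction, ismethod

    def mk_fqnp(ref) -> tuple[str, str]:
        # functions: the module holding the def + the func's name
        if isfunction(ref):
            return (ref.__module__, ref.__name__)

        # bound methods: owner type's module + a 'Type.method' path
        if ismethod(ref):
            return (
                ref.__self__.__module__,
                '.'.join([
                    type(ref.__self__).__name__,
                    ref.__func__.__name__,
                ]),
            )

        # any other object: its type's name
        return (ref.__module__, type(ref).__name__)

    # when run as a script this function lives in `__main__`
    assert mk_fqnp(mk_fqnp) == ('__main__', 'mk_fqnp')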
@ -15,713 +15,256 @@
 # along with this program. If not, see <https://www.gnu.org/licenses/>.

-'''
-Define our strictly typed IPC message spec for the SCIPP:
-
-that is,
-
-the "Structurred-Concurrency-Inter-Process-(dialog)-(un)Protocol".
-
-'''
-from __future__ import annotations
-import types
-from typing import (
-    Any,
-    Generic,
-    Literal,
-    Type,
-    TypeVar,
-    TypeAlias,
-    Union,
-)
-
-from msgspec import (
-    defstruct,
-    # field,
-    Raw,
-    Struct,
-    # UNSET,
-    # UnsetType,
-)
-
-from tractor.msg import (
-    pretty_struct,
-)
-from tractor.log import get_logger
-
-
-log = get_logger('tractor.msgspec')
-
-# type variable for the boxed payload field `.pld`
-PayloadT = TypeVar('PayloadT')
-
-
-class PayloadMsg(
-    Struct,
-    Generic[PayloadT],
-
-    # https://jcristharif.com/msgspec/structs.html#tagged-unions
-    tag=True,
-    tag_field='msg_type',
-
-    # https://jcristharif.com/msgspec/structs.html#field-ordering
-    # kw_only=True,
-
-    # https://jcristharif.com/msgspec/structs.html#equality-and-order
-    # order=True,
-
-    # https://jcristharif.com/msgspec/structs.html#encoding-decoding-as-arrays
-    # as_array=True,
-):
-    '''
-    An abstract payload boxing/shuttling IPC msg type.
-
-    Boxes data-values passed to/from user code
-
-    (i.e. any values passed by `tractor` application code using any of
-
-      |_ `._streaming.MsgStream.send/receive()`
-      |_ `._context.Context.started/result()`
-      |_ `._ipc.Channel.send/recv()`
-
-     aka our "IPC primitive APIs")
-
-    as message "payloads" set to the `.pld` field and uses
-    `msgspec`'s "tagged unions" feature to support a subset of our
-    "SC-transitive shuttle protocol" specification with
-    a `msgspec.Struct` inheritance tree.
-
-    '''
-    cid: str  # call/context-id
-    # ^-TODO-^: more explicit type?
-    # -[ ] use UNSET here?
-    #  https://jcristharif.com/msgspec/supported-types.html#unset
-    #
-    # -[ ] `uuid.UUID` which has multi-protocol support
-    #  https://jcristharif.com/msgspec/supported-types.html#uuid
-
-    # The msg's "payload" (spelled without vowels):
-    # https://en.wikipedia.org/wiki/Payload_(computing)
-    pld: Raw
-    # ^-NOTE-^ inherited from any `PayloadMsg` (and maybe type
-    # overriden via the `._ops.limit_plds()` API), but by default is
-    # parameterized to be `Any`.
-    #
-    # XXX this `Union` must strictly NOT contain `Any` if
-    # a limited msg-type-spec is intended, such that when
-    # creating and applying a new `MsgCodec` its
-    # `.decoder: Decoder` is configured with a `Union[Type[Struct]]` which
-    # restricts the allowed payload content (this `.pld` field)
-    # by type system defined loading constraints B)
-    #
-    # TODO: could also be set to `msgspec.Raw` if the sub-decoders
-    # approach is preferred over the generic parameterization
-    # approach as take by `mk_msg_spec()` below.
-
-
-# TODO: complete rename
-Msg = PayloadMsg
-
-
-class Aid(
-    Struct,
-    tag=True,
-    tag_field='msg_type',
-):
-    '''
-    Actor-identity msg.
-
-    Initial contact exchange enabling an actor "mailbox handshake"
-    delivering the peer identity (and maybe eventually contact)
-    info.
-
-    Used by discovery protocol to register actors as well as
-    conduct the initial comms (capability) filtering.
-
-    '''
-    name: str
-    uuid: str
-    # TODO: use built-in support for UUIDs?
-    # -[ ] `uuid.UUID` which has multi-protocol support
-    #  https://jcristharif.com/msgspec/supported-types.html#uuid
-
-
-class SpawnSpec(
-    pretty_struct.Struct,
-    tag=True,
-    tag_field='msg_type',
-):
-    '''
-    Initial runtime spec handed down from a spawning parent to its
-    child subactor immediately following first contact via an
-    `Aid` msg.
-
-    '''
-    # TODO: similar to the `Start` kwargs spec needed below, we need
-    # a hard `Struct` def for all of these fields!
-    _parent_main_data: dict
-    _runtime_vars: dict[str, Any]
-
-    # module import capability
-    enable_modules: dict[str, str]
-
-    # TODO: not just sockaddr pairs?
-    # -[ ] abstract into a `TransportAddr` type?
-    reg_addrs: list[tuple[str, int]]
-    bind_addrs: list[tuple[str, int]]
-
-
-# TODO: caps based RPC support in the payload?
-#
-# -[ ] integration with our ``enable_modules: list[str]`` caps sys.
-#   ``pkgutil.resolve_name()`` internally uses
-#   ``importlib.import_module()`` which can be filtered by
-#   inserting a ``MetaPathFinder`` into ``sys.meta_path`` (which
-#   we could do before entering the ``Actor._process_messages()``
-#   loop)?
-#  - https://github.com/python/cpython/blob/main/Lib/pkgutil.py#L645
-#  - https://stackoverflow.com/questions/1350466/preventing-python-code-from-importing-certain-modules
-#  - https://stackoverflow.com/a/63320902
-#  - https://docs.python.org/3/library/sys.html#sys.meta_path
-#
-# -[ ] can we combine .ns + .func into a native `NamespacePath` field?
-#
-# -[ ] better name, like `Call/TaskInput`?
-#
-# -[ ] XXX a debugger lock msg transaction with payloads like,
-#   child -> `.pld: DebugLock` -> root
-#   child <- `.pld: DebugLocked` <- root
-#   child -> `.pld: DebugRelease` -> root
-#
-#   WHY => when a pld spec is provided it might not allow for
-#   debug mode msgs as they currently are (using plain old `pld.
-#   str` payloads) so we only when debug_mode=True we need to
-#   union in this debugger payload set?
-#
-#   mk_msg_spec(
-#       MyPldSpec,
-#       debug_mode=True,
-#   ) -> (
-#       Union[MyPldSpec]
-#       | Union[DebugLock, DebugLocked, DebugRelease]
-#   )
-
-# class Params(
-#     Struct,
-#     Generic[PayloadT],
-# ):
-#     spec: PayloadT|ParamSpec
-#     inputs: InputsT|dict[str, Any]
-
-# TODO: for eg. we could stringently check the target
-# task-func's type sig and enforce it?
-# as an example for an IPTC,
-# @tractor.context
-# async def send_back_nsp(
-#     ctx: Context,
-#     expect_debug: bool,
-#     pld_spec_str: str,
-#     add_hooks: bool,
-#     started_msg_dict: dict,
-# ) -> <WhatHere!>:
-
-# TODO: figure out which of the `typing` feats we want to
-# support:
-# - plain ol `ParamSpec`:
-#   https://docs.python.org/3/library/typing.html#typing.ParamSpec
-# - new in 3.12 type parameter lists Bo
-# |_ https://docs.python.org/3/reference/compound_stmts.html#type-params
-# |_ historical pep 695: https://peps.python.org/pep-0695/
-# |_ full lang spec: https://typing.readthedocs.io/en/latest/spec/
-# |_ on annotation scopes:
-#    https://docs.python.org/3/reference/executionmodel.html#annotation-scopes
-# spec: ParamSpec[
-#     expect_debug: bool,
-#     pld_spec_str: str,
-#     add_hooks: bool,
-#     started_msg_dict: dict,
-# ]

-# TODO: possibly sub-type for runtime method requests?
-# -[ ] `Runtime(Start)` with a `.ns: str = 'self' or
-#     we can just enforce any such method as having a strict
-#     ns for calling funcs, namely the `Actor` instance?
-class Start(
-    Struct,
-    tag=True,
-    tag_field='msg_type',
-):
-    '''
-    Initial request to remotely schedule an RPC `trio.Task` via
-    `Actor.start_remote_task()`.
-
-    It is called by all the following public APIs:
-
-    - `ActorNursery.run_in_actor()`
-
-    - `Portal.run()`
-          `|_.run_from_ns()`
-          `|_.open_stream_from()`
-          `|_._submit_for_result()`
-
-    - `Context.open_context()`
-
-    '''
-    cid: str
-
-    ns: str
-    func: str
-
-    # TODO: make this a sub-struct which can be further
-    # type-limited, maybe `Inputs`?
-    # => SEE ABOVE <=
-    kwargs: dict[str, Any]
-    uid: tuple[str, str]  # (calling) actor-id
-
-    # TODO: enforcing a msg-spec in terms `Msg.pld`
-    # parameterizable msgs to be used in the appls IPC dialog.
-    # => SEE `._codec.MsgDec` for more <=
-    pld_spec: str = str(Any)
-
-
-class StartAck(
-    Struct,
-    tag=True,
-    tag_field='msg_type',
-):
-    '''
-    Init response to a `Cmd` request indicating the far
-    end's RPC spec, namely its callable "type".
-
-    '''
-    cid: str
-    # TODO: maybe better names for all these?
-    # -[ ] obvi ^ would need sync with `._rpc`
-    functype: Literal[
-        'asyncfunc',
-        'asyncgen',
-        'context',  # TODO: the only one eventually?
-    ]
-
-    # import typing
-    # eval(str(Any), {}, {'typing': typing})
-    # started_spec: str = str(Any)
-    # return_spec
-
-
-class Started(
-    PayloadMsg,
-    Generic[PayloadT],
-):
-    '''
-    Packet to shuttle the "first value" delivered by
-    `Context.started(value: Any)` from a `@tractor.context`
-    decorated IPC endpoint.
-
-    '''
-    pld: PayloadT|Raw
-
-
-# TODO: cancel request dedicated msg?
-# -[ ] instead of using our existing `Start`?
-#
-# class Cancel:
-#     cid: str
-
-
-class Yield(
-    PayloadMsg,
-    Generic[PayloadT],
-):
-    '''
-    Per IPC transmission of a value from `await MsgStream.send(<value>)`.
-
-    '''
-    pld: PayloadT|Raw
-
-
-class Stop(
-    Struct,
-    tag=True,
-    tag_field='msg_type',
-):
-    '''
-    Stream termination signal much like an IPC version
-    of `StopAsyncIteration`.
-
-    '''
-    cid: str
-    # TODO: do we want to support a payload on stop?
-    # pld: UnsetType = UNSET
-
-
-# TODO: is `Result` or `Out[come]` a better name?
-class Return(
-    PayloadMsg,
-    Generic[PayloadT],
-):
-    '''
-    Final `return <value>` from a remotely scheduled
-    func-as-`trio.Task`.
-
-    '''
-    pld: PayloadT|Raw
-
-
-class CancelAck(
-    PayloadMsg,
-    Generic[PayloadT],
-):
-    '''
-    Deliver the `bool` return-value from a cancellation `Actor`
-    method scheduled via and prior RPC request.
-
-    - `Actor.cancel()`
-       `|_.cancel_soon()`
-       `|_.cancel_rpc_tasks()`
-       `|_._cancel_task()`
-       `|_.cancel_server()`
-
-    RPCs to these methods must **always** be able to deliver a result
-    despite the currently configured IPC msg spec such that graceful
-    cancellation is always functional in the runtime.
-
-    '''
-    pld: bool
-
-
-# TODO: unify this with `._exceptions.RemoteActorError`
-# such that we can have a msg which is both raisable and
-# IPC-wire ready?
-# B~o
-class Error(
-    Struct,
-    tag=True,
-    tag_field='msg_type',
-
-    # TODO may omit defaults?
-    # https://jcristharif.com/msgspec/structs.html#omitting-default-values
-    # omit_defaults=True,
-):
-    '''
-    A pkt that wraps `RemoteActorError`s for relay and raising.
-
-    Fields are 1-to-1 meta-data as needed originally by
-    `RemoteActorError.msgdata: dict` but now are defined here.
-
-    Note: this msg shuttles `ContextCancelled` and `StreamOverrun`
-    as well is used to rewrap any `MsgTypeError` for relay-reponse
-    to bad `Yield.pld` senders during an IPC ctx's streaming dialog
-    phase.
-
-    '''
-    src_uid: tuple[str, str]
-    src_type_str: str
-    boxed_type_str: str
-    relay_path: list[tuple[str, str]]
-
-    # normally either both are provided or just
-    # a message for certain special cases where
-    # we pack a message for a locally raised
-    # mte or ctxc.
-    message: str|None = None
-    tb_str: str = ''
-
-    # TODO: only optionally include sub-type specfic fields?
-    # -[ ] use UNSET or don't include them via `omit_defaults` (see
-    #     inheritance-line options above)
-    #
-    # `ContextCancelled` reports the src cancelling `Actor.uid`
-    canceller: tuple[str, str]|None = None
-
-    # `StreamOverrun`-specific src `Actor.uid`
-    sender: tuple[str, str]|None = None
-
-    # `MsgTypeError` meta-data
-    cid: str|None = None
-    # when the receiver side fails to decode a delivered
-    # `PayloadMsg`-subtype; one and/or both the msg-struct instance
-    # and `Any`-decoded to `dict` of the msg are set and relayed
-    # (back to the sender) for introspection.
-    _bad_msg: Started|Yield|Return|None = None
-    _bad_msg_as_dict: dict|None = None
-
-
-def from_dict_msg(
-    dict_msg: dict,
-
-    msgT: MsgType|None = None,
-    tag_field: str = 'msg_type',
-    use_pretty: bool = False,
-
-) -> MsgType:
-    '''
-    Helper to build a specific `MsgType` struct from a "vanilla"
-    decoded `dict`-ified equivalent of the msg: i.e. if the
-    `msgpack.Decoder.type == Any`, the default when using
-    `msgspec.msgpack` and not "typed decoding" using
-    `msgspec.Struct`.
-
-    '''
-    msg_type_tag_field: str = (
-        msgT.__struct_config__.tag_field
-        if msgT is not None
-        else tag_field
-    )
-    # XXX ensure tag field is removed
-    msgT_name: str = dict_msg.pop(msg_type_tag_field)
-    msgT: MsgType = _msg_table[msgT_name]
-    if use_pretty:
-        msgT = defstruct(
-            name=msgT_name,
-            fields=[
-                (key, fi.type)
-                for fi, key, _
-                in pretty_struct.iter_fields(msgT)
-            ],
-            bases=(
-                pretty_struct.Struct,
-                msgT,
-            ),
-        )
-    return msgT(**dict_msg)
-
-# TODO: should be make a set of cancel msgs?
-# -[ ] a version of `ContextCancelled`?
-#     |_ and/or with a scope field?
-# -[ ] or, a full `ActorCancelled`?
-#
-# class Cancelled(MsgType):
-#     cid: str
-#
-# -[ ] what about overruns?
-#
-# class Overrun(MsgType):
-#     cid: str
-
-_runtime_msgs: list[Struct] = [
-
-    # identity handshake on first IPC `Channel` contact.
-    Aid,
-
-    # parent-to-child spawn specification passed as 2nd msg after
-    # handshake ONLY after child connects back to parent.
-    SpawnSpec,
-
-    # inter-actor RPC initiation
-    Start,  # schedule remote task-as-func
-    StartAck,  # ack the schedule request
-
-    # emission from `MsgStream.aclose()`
-    Stop,
-
-    # `Return` sub-type that we always accept from
-    # runtime-internal cancel endpoints
-    CancelAck,
-
-    # box remote errors, normally subtypes
-    # of `RemoteActorError`.
-    Error,
-]
-
-# the no-outcome-yet IAC (inter-actor-communication) sub-set which
-# can be `PayloadMsg.pld` payload field type-limited by application code
-# using `apply_codec()` and `limit_msg_spec()`.
-_payload_msgs: list[PayloadMsg] = [
-    # first <value> from `Context.started(<value>)`
-    Started,
-
-    # any <value> sent via `MsgStream.send(<value>)`
-    Yield,
-
-    # the final value returned from a `@context` decorated
-    # IPC endpoint.
-    Return,
-]
-
-# built-in SC shuttle protocol msg type set in
-# approx order of the IPC txn-state spaces.
-__msg_types__: list[MsgType] = (
-    _runtime_msgs
-    +
-    _payload_msgs
-)
-
-
-_msg_table: dict[str, MsgType] = {
-    msgT.__name__: msgT
-    for msgT in __msg_types__
-}
-
-# TODO: use new type declaration syntax for msg-type-spec
-# https://docs.python.org/3/library/typing.html#type-aliases
-# https://docs.python.org/3/reference/simple_stmts.html#type
-MsgType: TypeAlias = Union[*__msg_types__]
-
-
-def mk_msg_spec(
-    payload_type_union: Union[Type] = Any,
-
-    spec_build_method: Literal[
-        'indexed_generics',  # works
-        'defstruct',
-        'types_new_class',
-
-    ] = 'indexed_generics',
-
-) -> tuple[
-    Union[MsgType],
-    list[MsgType],
-]:
-    '''
-    Create a payload-(data-)type-parameterized IPC message specification.
-
-    Allows generating IPC msg types from the above builtin set
-    with a payload (field) restricted data-type, the `Msg.pld: PayloadT`.
-
-    This allows runtime-task contexts to use the python type system
-    to limit/filter payload values as determined by the input
-    `payload_type_union: Union[Type]`.
-
-    Notes: originally multiple approaches for constructing the
-    type-union passed to `msgspec` were attempted as selected via the
-    `spec_build_method`, but it turns out only the defaul method
-    'indexed_generics' seems to work reliably in all use cases. As
-    such, the others will likely be removed in the near future.
-
-    '''
-    submsg_types: list[MsgType] = Msg.__subclasses__()
-    bases: tuple = (
-        # XXX NOTE XXX the below generic-parameterization seems to
-        # be THE ONLY way to get this to work correctly in terms
-        # of getting ValidationError on a roundtrip?
-        Msg[payload_type_union],
-        Generic[PayloadT],
-    )
-    # defstruct_bases: tuple = (
-    #     Msg, # [payload_type_union],
-    #     # Generic[PayloadT],
-    #     # ^-XXX-^: not allowed? lul..
-    # )
-    ipc_msg_types: list[Msg] = []
-
-    idx_msg_types: list[Msg] = []
-    # defs_msg_types: list[Msg] = []
-    nc_msg_types: list[Msg] = []
-
-    for msgtype in __msg_types__:
-
-        # for the NON-payload (user api) type specify-able
-        # msgs types, we simply aggregate the def as is
-        # for inclusion in the output type `Union`.
-        if msgtype not in _payload_msgs:
-            ipc_msg_types.append(msgtype)
-            continue
-
-        # check inheritance sanity
-        assert msgtype in submsg_types
-
-        # TODO: wait why do we need the dynamic version here?
-        # XXX ANSWER XXX -> BC INHERITANCE.. don't work w generics..
-        #
-        # NOTE previously bc msgtypes WERE NOT inheriting
-        # directly the `Generic[PayloadT]` type, the manual method
-        # of generic-paraming with `.__class_getitem__()` wasn't
-        # working..
-        #
-        # XXX but bc i changed that to make every subtype inherit
-        # it, this manual "indexed parameterization" method seems
-        # to work?
-        #
-        # -[x] paraming the `PayloadT` values via `Generic[T]`
-        #     does work it seems but WITHOUT inheritance of generics
-        #
-        # -[-] is there a way to get it to work at module level
-        #     just using inheritance or maybe a metaclass?
-        #  => thot that `defstruct` might work, but NOPE, see
-        #     below..
-        #
-        idxed_msg_type: Msg = msgtype[payload_type_union]
-        idx_msg_types.append(idxed_msg_type)
-
-        # TODO: WHY do we need to dynamically generate the
-        # subtype-msgs here to ensure the `.pld` parameterization
-        # propagates as well as works at all in terms of the
-        # `msgpack.Decoder()`..?
-        #
-        # dynamically create the payload type-spec-limited msg set.
-        newclass_msgtype: Type = types.new_class(
-            name=msgtype.__name__,
-            bases=bases,
-            kwds={},
-        )
-        nc_msg_types.append(
-            newclass_msgtype[payload_type_union]
-        )
-
-        # with `msgspec.structs.defstruct`
-        # XXX ALSO DOESN'T WORK
-        # defstruct_msgtype = defstruct(
-        #     name=msgtype.__name__,
-        #     fields=[
-        #         ('cid', str),
-        #
-        #         # XXX doesn't seem to work..
-        #         # ('pld', PayloadT),
-        #
-        #         ('pld', payload_type_union),
-        #     ],
-        #     bases=defstruct_bases,
-        # )
-        # defs_msg_types.append(defstruct_msgtype)
-        # assert index_paramed_msg_type == manual_paramed_msg_subtype
-        # paramed_msg_type = manual_paramed_msg_subtype
-        # ipc_payload_msgs_type_union |= index_paramed_msg_type
-
-    idx_spec: Union[Type[Msg]] = Union[*idx_msg_types]
-    # def_spec: Union[Type[Msg]] = Union[*defs_msg_types]
-    nc_spec: Union[Type[Msg]] = Union[*nc_msg_types]
-
-    specs: dict[str, Union[Type[Msg]]] = {
-        'indexed_generics': idx_spec,
-        # 'defstruct': def_spec,
-        'types_new_class': nc_spec,
-    }
-    msgtypes_table: dict[str, list[Msg]] = {
-        'indexed_generics': idx_msg_types,
-        # 'defstruct': defs_msg_types,
-        'types_new_class': nc_msg_types,
-    }
-
-    # XXX lol apparently type unions can't ever
-    # be equal eh?
-    # TODO: grok the diff here better..
-    #
-    # assert (
-    #     idx_spec
-    #     ==
-    #     nc_spec
-    #     ==
-    #     def_spec
-    # )
-    # breakpoint()
-
-    pld_spec: Union[Type] = specs[spec_build_method]
-    runtime_spec: Union[Type] = Union[*ipc_msg_types]
-    ipc_spec = pld_spec | runtime_spec
-    log.runtime(
-        'Generating new IPC msg-spec\n'
-        f'{ipc_spec}\n'
-    )
-    assert (
-        ipc_spec
-        and
-        ipc_spec is not Any
-    )
-    return (
-        ipc_spec,
-        msgtypes_table[spec_build_method]
-        +
-        ipc_msg_types,
-    )
+'''
+Extensions to built-in or (heavily used but 3rd party) friend-lib
+types.
+
+'''
+from __future__ import annotations
+from collections import UserList
+from pprint import (
+    saferepr,
+)
+from typing import (
+    Any,
+    Iterator,
+)
+
+from msgspec import (
+    msgpack,
+    Struct as _Struct,
+    structs,
+)
+
+# TODO: auto-gen type sig for input func both for
+# type-msgs and logging of RPC tasks?
+# taken and modified from:
+# https://stackoverflow.com/a/57110117
+# import inspect
+# from typing import List
+
+# def my_function(input_1: str, input_2: int) -> list[int]:
+#     pass
+
+# def types_of(func):
+#     specs = inspect.getfullargspec(func)
+#     return_type = specs.annotations['return']
+#     input_types = [t.__name__ for s, t in specs.annotations.items() if s != 'return']
+#     return f'{func.__name__}({": ".join(input_types)}) -> {return_type}'
+
+# types_of(my_function)
+
+
+class DiffDump(UserList):
+    '''
+    Very simple list delegator that repr() dumps (presumed) tuple
+    elements of the form `tuple[str, Any, Any]` in a nice
+    multi-line readable form for analyzing `Struct` diffs.
+
+    '''
+    def __repr__(self) -> str:
+        if not len(self):
+            return super().__repr__()
+
+        # format by displaying item pair's ``repr()`` on multiple,
+        # indented lines such that they are more easily visually
+        # comparable when printed to console when printed to
+        # console.
+        repstr: str = '[\n'
+        for k, left, right in self:
+            repstr += (
+                f'({k},\n'
+                f'\t{repr(left)},\n'
+                f'\t{repr(right)},\n'
+                ')\n'
+            )
+        repstr += ']\n'
+        return repstr
+
+
+class Struct(
+    _Struct,
+
+    # https://jcristharif.com/msgspec/structs.html#tagged-unions
+    # tag='pikerstruct',
+    # tag=True,
+):
+    '''
+    A "human friendlier" (aka repl buddy) struct subtype.
+
+    '''
+    def _sin_props(self) -> Iterator[
+        tuple[
+            structs.FieldIinfo,
+            str,
+            Any,
+        ]
+    ]:
+        '''
+        Iterate over all non-@property fields of this struct.
+
+        '''
+        fi: structs.FieldInfo
+        for fi in structs.fields(self):
+            key: str = fi.name
+            val: Any = getattr(self, key)
+            yield fi, key, val
+
+    def to_dict(
+        self,
+        include_non_members: bool = True,
+
+    ) -> dict:
+        '''
+        Like it sounds.. direct delegation to:
+        https://jcristharif.com/msgspec/api.html#msgspec.structs.asdict
+
+        BUT, by default we pop all non-member (aka not defined as
+        struct fields) fields by default.
+
+        '''
+        asdict: dict = structs.asdict(self)
+        if include_non_members:
+            return asdict
+
+        # only return a dict of the struct members
+        # which were provided as input, NOT anything
+        # added as type-defined `@property` methods!
+        sin_props: dict = {}
+        fi: structs.FieldInfo
+        for fi, k, v in self._sin_props():
+            sin_props[k] = asdict[k]
+
+        return sin_props
+
+    def pformat(
+        self,
+        field_indent: int = 2,
+        indent: int = 0,
+
+    ) -> str:
+        '''
+        Recursion-safe `pprint.pformat()` style formatting of
+        a `msgspec.Struct` for sane reading by a human using a REPL.
+
+        '''
+        # global whitespace indent
+        ws: str = ' '*indent
+
+        # field whitespace indent
+        field_ws: str = ' '*(field_indent + indent)
+
+        # qtn: str = ws + self.__class__.__qualname__
+        qtn: str = self.__class__.__qualname__
+
+        obj_str: str = ''  # accumulator
+        fi: structs.FieldInfo
+        k: str
+        v: Any
+        for fi, k, v in self._sin_props():
+
+            # TODO: how can we prefer `Literal['option1', 'option2,
+            # ..]` over .__name__ == `Literal` but still get only the
+            # latter for simple types like `str | int | None` etc..?
+            ft: type = fi.type
+            typ_name: str = getattr(ft, '__name__', str(ft))
+
+            # recurse to get sub-struct's `.pformat()` output Bo
+            if isinstance(v, Struct):
+                val_str: str = v.pformat(
+                    indent=field_indent + indent,
+                    field_indent=indent + field_indent,
+                )
+
+            else:  # the `pprint` recursion-safe format:
+                # https://docs.python.org/3.11/library/pprint.html#pprint.saferepr
+                val_str: str = saferepr(v)
+
+            # TODO: LOLOL use `textwrap.indent()` instead dawwwwwg!
+            obj_str += (field_ws + f'{k}: {typ_name} = {val_str},\n')
+
+        return (
+            f'{qtn}(\n'
+            f'{obj_str}'
+            f'{ws})'
+        )
+
+    # TODO: use a pprint.PrettyPrinter instance around ONLY rendering
+    # inside a known tty?
+    # def __repr__(self) -> str:
+    #     ...
+
+    # __str__ = __repr__ = pformat
+    __repr__ = pformat
+
+    def copy(
+        self,
+        update: dict | None = None,
+
+    ) -> Struct:
+        '''
+        Validate-typecast all self defined fields, return a copy of
+        us with all such fields.
+
+        NOTE: This is kinda like the default behaviour in
+        `pydantic.BaseModel` except a copy of the object is
+        returned making it compat with `frozen=True`.
+
+        '''
+        if update:
+            for k, v in update.items():
+                setattr(self, k, v)
+
+        # NOTE: roundtrip serialize to validate
+        # - enode to msgpack binary format,
+        # - decode that back to a struct.
+        return msgpack.Decoder(type=type(self)).decode(
+            msgpack.Encoder().encode(self)
+        )
+
+    def typecast(
+        self,
+
+        # TODO: allow only casting a named subset?
+        # fields: set[str] | None = None,
+
+    ) -> None:
+        '''
+        Cast all fields using their declared type annotations
+        (kinda like what `pydantic` does by default).
+
+        NOTE: this of course won't work on frozen types, use
+        ``.copy()`` above in such cases.
+
+        '''
+        # https://jcristharif.com/msgspec/api.html#msgspec.structs.fields
+        fi: structs.FieldInfo
+        for fi in structs.fields(self):
+            setattr(
+                self,
+                fi.name,
+                fi.type(getattr(self, fi.name)),
+            )
+
+    def __sub__(
+        self,
+        other: Struct,
+
+    ) -> DiffDump[tuple[str, Any, Any]]:
+        '''
+        Compare fields/items key-wise and return a ``DiffDump``
+        for easy visual REPL comparison B)
+
+        '''
+        diffs: DiffDump[tuple[str, Any, Any]] = DiffDump()
+        for fi in structs.fields(self):
+            attr_name: str = fi.name
+            ours: Any = getattr(self, attr_name)
+            theirs: Any = getattr(other, attr_name)
+            if ours != theirs:
+                diffs.append((
+                    attr_name,
+                    ours,
+                    theirs,
+                ))
+
+        return diffs
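The core trick both versions rely on, msgspec "tagged union" decoding plus the encode/decode roundtrip that `Struct.copy()` uses for validation, fits in a tiny standalone sketch (the `Ping`/`Pong` types are invented for illustration):

    from typing import Union

    import msgspec
    from msgspec import msgpack

    class Ping(msgspec.Struct, tag=True, tag_field='msg_type'):
        cid: str

    class Pong(msgspec.Struct, tag=True, tag_field='msg_type'):
        cid: str
        ok: bool

    # a decoder parameterized with the union only accepts those msg
    # types; anything else raises a `msgspec.ValidationError`.
    dec = msgpack.Decoder(type=Union[Ping, Pong])

    # roundtrip: encode to msgpack wire-bytes, decode back to a struct.
    wire: bytes = msgpack.Encoder().encode(Ping(cid='42'))
    assert dec.decode(wire) == Ping(cid='42')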
File diff suppressed because it is too large
@ -29,6 +29,3 @@ from ._broadcast import (
     BroadcastReceiver as BroadcastReceiver,
     Lagged as Lagged,
 )
-from ._beg import (
-    collapse_eg as collapse_eg,
-)
@ -1,58 +0,0 @@
-# tractor: structured concurrent "actors".
-# Copyright 2018-eternity Tyler Goodlet.
-
-# This program is free software: you can redistribute it and/or modify
-# it under the terms of the GNU Affero General Public License as published by
-# the Free Software Foundation, either version 3 of the License, or
-# (at your option) any later version.
-
-# This program is distributed in the hope that it will be useful,
-# but WITHOUT ANY WARRANTY; without even the implied warranty of
-# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
-# GNU Affero General Public License for more details.
-
-# You should have received a copy of the GNU Affero General Public License
-# along with this program. If not, see <https://www.gnu.org/licenses/>.
-
-'''
-`BaseExceptionGroup` related utils and helpers pertaining to
-first-class-`trio` from a historical perspective B)
-
-'''
-from contextlib import (
-    asynccontextmanager as acm,
-)
-
-
-def maybe_collapse_eg(
-    beg: BaseExceptionGroup,
-) -> BaseException:
-    '''
-    If the input beg can collapse to a single non-eg sub-exception,
-    return it instead.
-
-    '''
-    if len(excs := beg.exceptions) == 1:
-        return excs[0]
-
-    return beg
-
-
-@acm
-async def collapse_eg():
-    '''
-    If `BaseExceptionGroup` raised in the body scope is
-    "collapse-able" (in the same way that
-    `trio.open_nursery(strict_exception_groups=False)` works) then
-    only raise the lone emedded non-eg in in place.
-
-    '''
-    try:
-        yield
-    except* BaseException as beg:
-        if (
-            exc := maybe_collapse_eg(beg)
-        ) is not beg:
-            raise exc
-
-        raise beg
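For reference, here is how the (removed here, still on `main`) `collapse_eg` acm reads at a use site; a sketch, not part of this diff:

    import trio
    from tractor.trionics import collapse_eg  # `main`-side export removed here

    async def main():
        async with collapse_eg(), trio.open_nursery() as tn:
            tn.start_soon(trio.sleep_forever)
            raise RuntimeError('boom')

    # the lone `RuntimeError` propagates out of `trio.run()` instead of
    # a 1-element `BaseExceptionGroup` wrapping it (so this script
    # intentionally crashes with the naked error).
    trio.run(main)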
@ -15,7 +15,7 @@
 # along with this program. If not, see <https://www.gnu.org/licenses/>.

 '''
-`tokio` style broadcast channel.
+``tokio`` style broadcast channel.
 https://docs.rs/tokio/1.11.0/tokio/sync/broadcast/index.html

 '''
@ -156,12 +156,11 @@ class BroadcastState(Struct):

 class BroadcastReceiver(ReceiveChannel):
     '''
-    A memory receive channel broadcaster which is non-lossy for
-    the fastest consumer.
+    A memory receive channel broadcaster which is non-lossy for the
+    fastest consumer.

-    Additional consumer tasks can receive all produced values by
-    registering with ``.subscribe()`` and receiving from the new
-    instance it delivers.
+    Additional consumer tasks can receive all produced values by registering
+    with ``.subscribe()`` and receiving from the new instance it delivers.

     '''
     def __init__(
@ -382,7 +381,7 @@ class BroadcastReceiver(ReceiveChannel):
             # likely it makes sense to unwind back to the
             # underlying?
             # import tractor
-            # await tractor.pause()
+            # await tractor.breakpoint()
             log.warning(
                 f'Only one sub left for {self}?\n'
                 'We can probably unwind from breceiver?'
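The `.subscribe()` pattern from the docstring above in a use-site sketch; the `broadcast_receiver()` factory name and its `max_buffer_size` kwarg are assumed from the `._broadcast` module, not shown in this diff:

    import trio
    from tractor.trionics import broadcast_receiver  # assumed factory export

    async def main():
        tx, rx = trio.open_memory_channel(16)
        # wrap the underlying receiver; each `.subscribe()`-ed clone then
        # sees every produced value, lossless up to the fastest consumer.
        bcast = broadcast_receiver(rx, max_buffer_size=16)

        async def consume(name: str):
            async with bcast.subscribe() as sub:
                async for value in sub:
                    print(f'{name} got {value}')

        async with trio.open_nursery() as tn:
            tn.start_soon(consume, 'sub-1')
            tn.start_soon(consume, 'sub-2')
            for i in range(3):
                await tx.send(i)
            await trio.sleep(0.1)
            tn.cancel_scope.cancel()

    trio.run(main)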
@ -18,12 +18,8 @@
 Async context manager primitives with hard ``trio``-aware semantics

 '''
-from __future__ import annotations
-from contextlib import (
-    asynccontextmanager as acm,
-)
+from contextlib import asynccontextmanager as acm
 import inspect
-from types import ModuleType
 from typing import (
     Any,
     AsyncContextManager,
@ -34,16 +30,13 @@ from typing import (
     Optional,
     Sequence,
     TypeVar,
-    TYPE_CHECKING,
 )

 import trio

 from tractor._state import current_actor
 from tractor.log import get_logger

-if TYPE_CHECKING:
-    from tractor import ActorNursery


 log = get_logger(__name__)

@ -53,12 +46,8 @@ T = TypeVar("T")

 @acm
 async def maybe_open_nursery(
-    nursery: trio.Nursery|ActorNursery|None = None,
+    nursery: trio.Nursery | None = None,
     shield: bool = False,
-    lib: ModuleType = trio,
-
-    **kwargs, # proxy thru
-
 ) -> AsyncGenerator[trio.Nursery, Any]:
     '''
     Create a new nursery if None provided.
@ -69,12 +58,13 @@ async def maybe_open_nursery(
     if nursery is not None:
         yield nursery
     else:
-        async with lib.open_nursery(**kwargs) as nursery:
+        async with trio.open_nursery() as nursery:
             nursery.cancel_scope.shield = shield
             yield nursery


 async def _enter_and_wait(

     mngr: AsyncContextManager[T],
     unwrapped: dict[int, T],
     all_entered: trio.Event,
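The typical call-site shape for `maybe_open_nursery()`; a sketch using only the basic form shared by both sides of the diff (the `main` version additionally accepts `lib`/`**kwargs`):

    import trio
    from tractor.trionics import maybe_open_nursery  # assumed re-export

    async def serve(tn: trio.Nursery | None = None):
        # use the caller's nursery when given, else open (and own) one
        async with maybe_open_nursery(tn) as tn:
            tn.start_soon(trio.sleep, 0)

    async def main():
        # 1) callee opens its own nursery internally
        await serve()
        # 2) caller passes one in; its lifetime stays with the caller
        async with trio.open_nursery() as tn:
            await serve(tn)

    trio.run(main)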
@ -101,6 +91,7 @@ async def _enter_and_wait(

 @acm
 async def gather_contexts(

     mngrs: Sequence[AsyncContextManager[T]],

 ) -> AsyncGenerator[
@ -111,17 +102,15 @@ async def gather_contexts(
     None,
 ]:
     '''
-    Concurrently enter a sequence of async context managers (acms),
-    each from a separate `trio` task and deliver the unwrapped
-    `yield`-ed values in the same order once all managers have entered.
-
-    On exit, all acms are subsequently and concurrently exited.
-
-    This function is somewhat similar to a batch of non-blocking
-    calls to `contextlib.AsyncExitStack.enter_async_context()`
-    (inside a loop) *in combo with* a `asyncio.gather()` to get the
-    `.__aenter__()`-ed values, except the managers are both
-    concurrently entered and exited and *cancellation just works*(R).
+    Concurrently enter a sequence of async context managers, each in
+    a separate ``trio`` task and deliver the unwrapped values in the
+    same order once all managers have entered. On exit all contexts are
+    subsequently and concurrently exited.
+
+    This function is somewhat similar to common usage of
+    ``contextlib.AsyncExitStack.enter_async_context()`` (in a loop) in
+    combo with ``asyncio.gather()`` except the managers are concurrently
+    entered and exited, and cancellation just works.

     '''
     seed: int = id(mngrs)
@ -145,14 +134,9 @@ async def gather_contexts(
         'Use a non-lazy iterator or sequence type intead!'
     )

-    async with trio.open_nursery(
-        strict_exception_groups=False,
-        # ^XXX^ TODO? soo roll our own then ??
-        # -> since we kinda want the "if only one `.exception` then
-        # just raise that" interface?
-    ) as tn:
+    async with trio.open_nursery() as n:
         for mngr in mngrs:
-            tn.start_soon(
+            n.start_soon(
                 _enter_and_wait,
                 mngr,
                 unwrapped,
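A minimal use-site sketch for `gather_contexts()` (assuming the `tractor.trionics` re-export; the `numbered()` acm is made up):

    from contextlib import asynccontextmanager as acm

    import trio
    from tractor.trionics import gather_contexts  # assumed re-export

    @acm
    async def numbered(i: int):
        yield i

    async def main():
        # all 3 acms are entered concurrently, each in its own task,
        # and the yielded values are delivered in input order.
        async with gather_contexts([numbered(i) for i in range(3)]) as vals:
            assert tuple(vals) == (0, 1, 2)

    trio.run(main)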
@ -226,10 +210,9 @@ async def maybe_open_context(

 ) -> AsyncIterator[tuple[bool, T]]:
     '''
-    Maybe open an async-context-manager (acm) if there is not already
-    a `_Cached` version for the provided (input) `key` for *this* actor.
-
-    Return the `_Cached` instance on a _Cache hit.
+    Maybe open a context manager if there is not already a _Cached
+    version for the provided ``key`` for *this* actor. Return the
+    _Cached instance on a _Cache hit.

     '''
     fid = id(acm_func)
@ -288,16 +271,8 @@ async def maybe_open_context(
         yield False, yielded

     else:
+        log.info(f'Reusing _Cached resource for {ctx_key}')
         _Cache.users += 1
-        log.runtime(
-            f'Re-using cached resource for user {_Cache.users}\n\n'
-            f'{ctx_key!r} -> {type(yielded)}\n'
-
-            # TODO: make this work with values but without
-            # `msgspec.Struct` causing frickin crashes on field-type
-            # lookups..
-            # f'{ctx_key!r} -> {yielded!r}\n'
-        )
         lock.release()
         yield True, yielded
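The cache-hit semantics in a sketch; `acm_func` is confirmed by the code above while the `kwargs` parameter name and `tractor.trionics` export path are assumptions:

    from contextlib import asynccontextmanager as acm

    import trio
    from tractor.trionics import maybe_open_context  # assumed export

    @acm
    async def open_conn(port: int):
        # stand-in for some expensive-to-acquire resource
        yield f'conn-on-{port}'

    async def task(name: str):
        # all callers passing the same acm-func + key-args share one
        # already-entered value for the lifetime of the first entry.
        async with maybe_open_context(
            acm_func=open_conn,
            kwargs={'port': 1616},
        ) as (cache_hit, conn):
            print(f'{name}: hit={cache_hit} -> {conn}')
            await trio.sleep(0.1)

    async def main():
        async with trio.open_nursery() as tn:
            tn.start_soon(task, 'first')
            tn.start_soon(task, 'second')  # should report a cache hit

    trio.run(main)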
466
uv.lock
466
uv.lock
|
@ -1,466 +0,0 @@
|
||||||
version = 1
|
|
||||||
revision = 1
|
|
||||||
requires-python = ">=3.11"
|
|
||||||
|
|
||||||
[[package]]
|
|
||||||
name = "attrs"
|
|
||||||
version = "24.3.0"
|
|
||||||
source = { registry = "https://pypi.org/simple" }
|
|
||||||
sdist = { url = "https://files.pythonhosted.org/packages/48/c8/6260f8ccc11f0917360fc0da435c5c9c7504e3db174d5a12a1494887b045/attrs-24.3.0.tar.gz", hash = "sha256:8f5c07333d543103541ba7be0e2ce16eeee8130cb0b3f9238ab904ce1e85baff", size = 805984 }
|
|
||||||
wheels = [
|
|
||||||
{ url = "https://files.pythonhosted.org/packages/89/aa/ab0f7891a01eeb2d2e338ae8fecbe57fcebea1a24dbb64d45801bfab481d/attrs-24.3.0-py3-none-any.whl", hash = "sha256:ac96cd038792094f438ad1f6ff80837353805ac950cd2aa0e0625ef19850c308", size = 63397 },
|
|
||||||
]
|
|
||||||
|
|
||||||
[[package]]
|
|
||||||
name = "cffi"
|
|
||||||
version = "1.17.1"
|
|
||||||
source = { registry = "https://pypi.org/simple" }
|
|
||||||
dependencies = [
|
|
||||||
{ name = "pycparser" },
|
|
||||||
]
|
|
||||||
sdist = { url = "https://files.pythonhosted.org/packages/fc/97/c783634659c2920c3fc70419e3af40972dbaf758daa229a7d6ea6135c90d/cffi-1.17.1.tar.gz", hash = "sha256:1c39c6016c32bc48dd54561950ebd6836e1670f2ae46128f67cf49e789c52824", size = 516621 }
|
|
||||||
wheels = [
|
|
||||||
{ url = "https://files.pythonhosted.org/packages/34/33/e1b8a1ba29025adbdcda5fb3a36f94c03d771c1b7b12f726ff7fef2ebe36/cffi-1.17.1-cp311-cp311-win32.whl", hash = "sha256:85a950a4ac9c359340d5963966e3e0a94a676bd6245a4b55bc43949eee26a655", size = 171727 },
|
|
||||||
{ url = "https://files.pythonhosted.org/packages/3d/97/50228be003bb2802627d28ec0627837ac0bf35c90cf769812056f235b2d1/cffi-1.17.1-cp311-cp311-win_amd64.whl", hash = "sha256:caaf0640ef5f5517f49bc275eca1406b0ffa6aa184892812030f04c2abf589a0", size = 181400 },
|
|
||||||
{ url = "https://files.pythonhosted.org/packages/86/c5/28b2d6f799ec0bdecf44dced2ec5ed43e0eb63097b0f58c293583b406582/cffi-1.17.1-cp312-cp312-win32.whl", hash = "sha256:a08d7e755f8ed21095a310a693525137cfe756ce62d066e53f502a83dc550f65", size = 172448 },
|
|
||||||
{ url = "https://files.pythonhosted.org/packages/50/b9/db34c4755a7bd1cb2d1603ac3863f22bcecbd1ba29e5ee841a4bc510b294/cffi-1.17.1-cp312-cp312-win_amd64.whl", hash = "sha256:51392eae71afec0d0c8fb1a53b204dbb3bcabcb3c9b807eedf3e1e6ccf2de903", size = 181976 },
|
|
||||||
{ url = "https://files.pythonhosted.org/packages/bf/ee/f94057fa6426481d663b88637a9a10e859e492c73d0384514a17d78ee205/cffi-1.17.1-cp313-cp313-win32.whl", hash = "sha256:e03eab0a8677fa80d646b5ddece1cbeaf556c313dcfac435ba11f107ba117b5d", size = 172475 },
|
|
||||||
{ url = "https://files.pythonhosted.org/packages/7c/fc/6a8cb64e5f0324877d503c854da15d76c1e50eb722e320b15345c4d0c6de/cffi-1.17.1-cp313-cp313-win_amd64.whl", hash = "sha256:f6a16c31041f09ead72d69f583767292f750d24913dadacf5756b966aacb3f1a", size = 182009 },
|
|
||||||
]
|
|
||||||
|
|
||||||
[[package]]
|
|
||||||
name = "colorama"
|
|
||||||
version = "0.4.6"
|
|
||||||
source = { registry = "https://pypi.org/simple" }
|
|
||||||
sdist = { url = "https://files.pythonhosted.org/packages/d8/53/6f443c9a4a8358a93a6792e2acffb9d9d5cb0a5cfd8802644b7b1c9a02e4/colorama-0.4.6.tar.gz", hash = "sha256:08695f5cb7ed6e0531a20572697297273c47b8cae5a63ffc6d6ed5c201be6e44", size = 27697 }
|
|
||||||
wheels = [
|
|
||||||
{ url = "https://files.pythonhosted.org/packages/d1/d6/3965ed04c63042e047cb6a3e6ed1a63a35087b6a609aa3a15ed8ac56c221/colorama-0.4.6-py2.py3-none-any.whl", hash = "sha256:4f1d9991f5acc0ca119f9d443620b77f9d6b33703e51011c16baf57afb285fc6", size = 25335 },
|
|
||||||
]
|
|
||||||
|
|
||||||
[[package]]
|
|
||||||
name = "colorlog"
|
|
||||||
version = "6.9.0"
|
|
||||||
source = { registry = "https://pypi.org/simple" }
|
|
||||||
dependencies = [
|
|
||||||
{ name = "colorama", marker = "sys_platform == 'win32'" },
|
|
||||||
]
|
|
||||||
sdist = { url = "https://files.pythonhosted.org/packages/d3/7a/359f4d5df2353f26172b3cc39ea32daa39af8de522205f512f458923e677/colorlog-6.9.0.tar.gz", hash = "sha256:bfba54a1b93b94f54e1f4fe48395725a3d92fd2a4af702f6bd70946bdc0c6ac2", size = 16624 }
|
|
||||||
wheels = [
|
|
||||||
{ url = "https://files.pythonhosted.org/packages/e3/51/9b208e85196941db2f0654ad0357ca6388ab3ed67efdbfc799f35d1f83aa/colorlog-6.9.0-py3-none-any.whl", hash = "sha256:5906e71acd67cb07a71e779c47c4bcb45fb8c2993eebe9e5adcd6a6f1b283eff", size = 11424 },
|
|
||||||
]
|
|
||||||
|
|
||||||
[[package]]
|
|
||||||
name = "greenback"
|
|
||||||
version = "1.2.1"
|
|
||||||
source = { registry = "https://pypi.org/simple" }
|
|
||||||
dependencies = [
|
|
||||||
{ name = "greenlet" },
|
|
||||||
{ name = "outcome" },
|
|
||||||
{ name = "sniffio" },
|
|
||||||
]
|
|
||||||
sdist = { url = "https://files.pythonhosted.org/packages/dc/c1/ab3a42c0f3ed56df9cd33de1539b3198d98c6ccbaf88a73d6be0b72d85e0/greenback-1.2.1.tar.gz", hash = "sha256:de3ca656885c03b96dab36079f3de74bb5ba061da9bfe3bb69dccc866ef95ea3", size = 42597 }
|
|
||||||
wheels = [
|
|
||||||
{ url = "https://files.pythonhosted.org/packages/71/d0/b8dc79d5ecfffacad9c844b6ae76b9c6259935796d3c561deccbf8fa421d/greenback-1.2.1-py3-none-any.whl", hash = "sha256:98768edbbe4340091a9730cf64a683fcbaa3f2cb81e4ac41d7ed28d3b6f74b79", size = 28062 },
|
|
||||||
]
|
|
||||||
|
|
||||||
[[package]]
|
|
||||||
name = "greenlet"
|
|
||||||
version = "3.1.1"
|
|
||||||
source = { registry = "https://pypi.org/simple" }
|
|
||||||
sdist = { url = "https://files.pythonhosted.org/packages/2f/ff/df5fede753cc10f6a5be0931204ea30c35fa2f2ea7a35b25bdaf4fe40e46/greenlet-3.1.1.tar.gz", hash = "sha256:4ce3ac6cdb6adf7946475d7ef31777c26d94bccc377e070a7986bd2d5c515467", size = 186022 }
|
|
||||||
wheels = [
|
|
||||||
{ url = "https://files.pythonhosted.org/packages/28/62/1c2665558618553c42922ed47a4e6d6527e2fa3516a8256c2f431c5d0441/greenlet-3.1.1-cp311-cp311-macosx_11_0_universal2.whl", hash = "sha256:e4d333e558953648ca09d64f13e6d8f0523fa705f51cae3f03b5983489958c70", size = 272479 },
|
|
||||||
{ url = "https://files.pythonhosted.org/packages/76/9d/421e2d5f07285b6e4e3a676b016ca781f63cfe4a0cd8eaecf3fd6f7a71ae/greenlet-3.1.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:09fc016b73c94e98e29af67ab7b9a879c307c6731a2c9da0db5a7d9b7edd1159", size = 640404 },
|
|
||||||
{ url = "https://files.pythonhosted.org/packages/e5/de/6e05f5c59262a584e502dd3d261bbdd2c97ab5416cc9c0b91ea38932a901/greenlet-3.1.1-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d5e975ca70269d66d17dd995dafc06f1b06e8cb1ec1e9ed54c1d1e4a7c4cf26e", size = 652813 },
|
|
||||||
{ url = "https://files.pythonhosted.org/packages/49/93/d5f93c84241acdea15a8fd329362c2c71c79e1a507c3f142a5d67ea435ae/greenlet-3.1.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3b2813dc3de8c1ee3f924e4d4227999285fd335d1bcc0d2be6dc3f1f6a318ec1", size = 648517 },
|
|
||||||
{ url = "https://files.pythonhosted.org/packages/15/85/72f77fc02d00470c86a5c982b8daafdf65d38aefbbe441cebff3bf7037fc/greenlet-3.1.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e347b3bfcf985a05e8c0b7d462ba6f15b1ee1c909e2dcad795e49e91b152c383", size = 647831 },
|
|
||||||
{ url = "https://files.pythonhosted.org/packages/f7/4b/1c9695aa24f808e156c8f4813f685d975ca73c000c2a5056c514c64980f6/greenlet-3.1.1-cp311-cp311-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:9e8f8c9cb53cdac7ba9793c276acd90168f416b9ce36799b9b885790f8ad6c0a", size = 602413 },
|
|
||||||
{ url = "https://files.pythonhosted.org/packages/76/70/ad6e5b31ef330f03b12559d19fda2606a522d3849cde46b24f223d6d1619/greenlet-3.1.1-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:62ee94988d6b4722ce0028644418d93a52429e977d742ca2ccbe1c4f4a792511", size = 1129619 },
|
|
||||||
{ url = "https://files.pythonhosted.org/packages/f4/fb/201e1b932e584066e0f0658b538e73c459b34d44b4bd4034f682423bc801/greenlet-3.1.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:1776fd7f989fc6b8d8c8cb8da1f6b82c5814957264d1f6cf818d475ec2bf6395", size = 1155198 },
|
|
||||||
{ url = "https://files.pythonhosted.org/packages/12/da/b9ed5e310bb8b89661b80cbcd4db5a067903bbcd7fc854923f5ebb4144f0/greenlet-3.1.1-cp311-cp311-win_amd64.whl", hash = "sha256:48ca08c771c268a768087b408658e216133aecd835c0ded47ce955381105ba39", size = 298930 },
|
|
||||||
{ url = "https://files.pythonhosted.org/packages/7d/ec/bad1ac26764d26aa1353216fcbfa4670050f66d445448aafa227f8b16e80/greenlet-3.1.1-cp312-cp312-macosx_11_0_universal2.whl", hash = "sha256:4afe7ea89de619adc868e087b4d2359282058479d7cfb94970adf4b55284574d", size = 274260 },
|
|
||||||
{ url = "https://files.pythonhosted.org/packages/66/d4/c8c04958870f482459ab5956c2942c4ec35cac7fe245527f1039837c17a9/greenlet-3.1.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f406b22b7c9a9b4f8aa9d2ab13d6ae0ac3e85c9a809bd590ad53fed2bf70dc79", size = 649064 },
|
|
||||||
{ url = "https://files.pythonhosted.org/packages/51/41/467b12a8c7c1303d20abcca145db2be4e6cd50a951fa30af48b6ec607581/greenlet-3.1.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:c3a701fe5a9695b238503ce5bbe8218e03c3bcccf7e204e455e7462d770268aa", size = 663420 },
|
|
||||||
{ url = "https://files.pythonhosted.org/packages/27/8f/2a93cd9b1e7107d5c7b3b7816eeadcac2ebcaf6d6513df9abaf0334777f6/greenlet-3.1.1-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2846930c65b47d70b9d178e89c7e1a69c95c1f68ea5aa0a58646b7a96df12441", size = 658035 },
|
|
||||||
{ url = "https://files.pythonhosted.org/packages/57/5c/7c6f50cb12be092e1dccb2599be5a942c3416dbcfb76efcf54b3f8be4d8d/greenlet-3.1.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:99cfaa2110534e2cf3ba31a7abcac9d328d1d9f1b95beede58294a60348fba36", size = 660105 },
|
|
||||||
{ url = "https://files.pythonhosted.org/packages/f1/66/033e58a50fd9ec9df00a8671c74f1f3a320564c6415a4ed82a1c651654ba/greenlet-3.1.1-cp312-cp312-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:1443279c19fca463fc33e65ef2a935a5b09bb90f978beab37729e1c3c6c25fe9", size = 613077 },
|
|
||||||
{ url = "https://files.pythonhosted.org/packages/19/c5/36384a06f748044d06bdd8776e231fadf92fc896bd12cb1c9f5a1bda9578/greenlet-3.1.1-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:b7cede291382a78f7bb5f04a529cb18e068dd29e0fb27376074b6d0317bf4dd0", size = 1135975 },
|
|
||||||
{ url = "https://files.pythonhosted.org/packages/38/f9/c0a0eb61bdf808d23266ecf1d63309f0e1471f284300ce6dac0ae1231881/greenlet-3.1.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:23f20bb60ae298d7d8656c6ec6db134bca379ecefadb0b19ce6f19d1f232a942", size = 1163955 },
|
|
||||||
{ url = "https://files.pythonhosted.org/packages/43/21/a5d9df1d21514883333fc86584c07c2b49ba7c602e670b174bd73cfc9c7f/greenlet-3.1.1-cp312-cp312-win_amd64.whl", hash = "sha256:7124e16b4c55d417577c2077be379514321916d5790fa287c9ed6f23bd2ffd01", size = 299655 },
|
|
||||||
{ url = "https://files.pythonhosted.org/packages/f3/57/0db4940cd7bb461365ca8d6fd53e68254c9dbbcc2b452e69d0d41f10a85e/greenlet-3.1.1-cp313-cp313-macosx_11_0_universal2.whl", hash = "sha256:05175c27cb459dcfc05d026c4232f9de8913ed006d42713cb8a5137bd49375f1", size = 272990 },
|
|
||||||
{ url = "https://files.pythonhosted.org/packages/1c/ec/423d113c9f74e5e402e175b157203e9102feeb7088cee844d735b28ef963/greenlet-3.1.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:935e943ec47c4afab8965954bf49bfa639c05d4ccf9ef6e924188f762145c0ff", size = 649175 },
|
|
||||||
{ url = "https://files.pythonhosted.org/packages/a9/46/ddbd2db9ff209186b7b7c621d1432e2f21714adc988703dbdd0e65155c77/greenlet-3.1.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:667a9706c970cb552ede35aee17339a18e8f2a87a51fba2ed39ceeeb1004798a", size = 663425 },
|
|
||||||
{ url = "https://files.pythonhosted.org/packages/bc/f9/9c82d6b2b04aa37e38e74f0c429aece5eeb02bab6e3b98e7db89b23d94c6/greenlet-3.1.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b8a678974d1f3aa55f6cc34dc480169d58f2e6d8958895d68845fa4ab566509e", size = 657736 },
|
|
||||||
{ url = "https://files.pythonhosted.org/packages/d9/42/b87bc2a81e3a62c3de2b0d550bf91a86939442b7ff85abb94eec3fc0e6aa/greenlet-3.1.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:efc0f674aa41b92da8c49e0346318c6075d734994c3c4e4430b1c3f853e498e4", size = 660347 },
|
|
||||||
{ url = "https://files.pythonhosted.org/packages/37/fa/71599c3fd06336cdc3eac52e6871cfebab4d9d70674a9a9e7a482c318e99/greenlet-3.1.1-cp313-cp313-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:0153404a4bb921f0ff1abeb5ce8a5131da56b953eda6e14b88dc6bbc04d2049e", size = 615583 },
|
|
||||||
{ url = "https://files.pythonhosted.org/packages/4e/96/e9ef85de031703ee7a4483489b40cf307f93c1824a02e903106f2ea315fe/greenlet-3.1.1-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:275f72decf9932639c1c6dd1013a1bc266438eb32710016a1c742df5da6e60a1", size = 1133039 },
|
|
||||||
{ url = "https://files.pythonhosted.org/packages/87/76/b2b6362accd69f2d1889db61a18c94bc743e961e3cab344c2effaa4b4a25/greenlet-3.1.1-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:c4aab7f6381f38a4b42f269057aee279ab0fc7bf2e929e3d4abfae97b682a12c", size = 1160716 },
|
|
||||||
{ url = "https://files.pythonhosted.org/packages/1f/1b/54336d876186920e185066d8c3024ad55f21d7cc3683c856127ddb7b13ce/greenlet-3.1.1-cp313-cp313-win_amd64.whl", hash = "sha256:b42703b1cf69f2aa1df7d1030b9d77d3e584a70755674d60e710f0af570f3761", size = 299490 },
|
|
||||||
{ url = "https://files.pythonhosted.org/packages/5f/17/bea55bf36990e1638a2af5ba10c1640273ef20f627962cf97107f1e5d637/greenlet-3.1.1-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f1695e76146579f8c06c1509c7ce4dfe0706f49c6831a817ac04eebb2fd02011", size = 643731 },
|
|
||||||
{ url = "https://files.pythonhosted.org/packages/78/d2/aa3d2157f9ab742a08e0fd8f77d4699f37c22adfbfeb0c610a186b5f75e0/greenlet-3.1.1-cp313-cp313t-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:7876452af029456b3f3549b696bb36a06db7c90747740c5302f74a9e9fa14b13", size = 649304 },
|
|
||||||
{ url = "https://files.pythonhosted.org/packages/f1/8e/d0aeffe69e53ccff5a28fa86f07ad1d2d2d6537a9506229431a2a02e2f15/greenlet-3.1.1-cp313-cp313t-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:4ead44c85f8ab905852d3de8d86f6f8baf77109f9da589cb4fa142bd3b57b475", size = 646537 },
|
|
||||||
{ url = "https://files.pythonhosted.org/packages/05/79/e15408220bbb989469c8871062c97c6c9136770657ba779711b90870d867/greenlet-3.1.1-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8320f64b777d00dd7ccdade271eaf0cad6636343293a25074cc5566160e4de7b", size = 642506 },
|
|
||||||
{ url = "https://files.pythonhosted.org/packages/18/87/470e01a940307796f1d25f8167b551a968540fbe0551c0ebb853cb527dd6/greenlet-3.1.1-cp313-cp313t-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:6510bf84a6b643dabba74d3049ead221257603a253d0a9873f55f6a59a65f822", size = 602753 },
|
|
||||||
{ url = "https://files.pythonhosted.org/packages/e2/72/576815ba674eddc3c25028238f74d7b8068902b3968cbe456771b166455e/greenlet-3.1.1-cp313-cp313t-musllinux_1_1_aarch64.whl", hash = "sha256:04b013dc07c96f83134b1e99888e7a79979f1a247e2a9f59697fa14b5862ed01", size = 1122731 },
|
|
||||||
{ url = "https://files.pythonhosted.org/packages/ac/38/08cc303ddddc4b3d7c628c3039a61a3aae36c241ed01393d00c2fd663473/greenlet-3.1.1-cp313-cp313t-musllinux_1_1_x86_64.whl", hash = "sha256:411f015496fec93c1c8cd4e5238da364e1da7a124bcb293f085bf2860c32c6f6", size = 1142112 },
|
|
||||||
]

[[package]]
name = "idna"
version = "3.10"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/f1/70/7703c29685631f5a7590aa73f1f1d3fa9a380e654b86af429e0934a32f7d/idna-3.10.tar.gz", hash = "sha256:12f65c9b470abda6dc35cf8e63cc574b1c52b11df2c86030af0ac09b01b13ea9", size = 190490 }
wheels = [
    { url = "https://files.pythonhosted.org/packages/76/c6/c88e154df9c4e1a2a66ccf0005a88dfb2650c1dffb6f5ce603dfbd452ce3/idna-3.10-py3-none-any.whl", hash = "sha256:946d195a0d259cbba61165e88e65941f16e9b36ea6ddb97f00452bae8b1287d3", size = 70442 },
]

[[package]]
name = "iniconfig"
version = "2.0.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/d7/4b/cbd8e699e64a6f16ca3a8220661b5f83792b3017d0f79807cb8708d33913/iniconfig-2.0.0.tar.gz", hash = "sha256:2d91e135bf72d31a410b17c16da610a82cb55f6b0477d1a902134b24a455b8b3", size = 4646 }
wheels = [
    { url = "https://files.pythonhosted.org/packages/ef/a6/62565a6e1cf69e10f5727360368e451d4b7f58beeac6173dc9db836a5b46/iniconfig-2.0.0-py3-none-any.whl", hash = "sha256:b6a85871a79d2e3b22d2d1b94ac2824226a63c6b741c88f7ae975f18b6778374", size = 5892 },
]

[[package]]
name = "msgspec"
version = "0.19.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/cf/9b/95d8ce458462b8b71b8a70fa94563b2498b89933689f3a7b8911edfae3d7/msgspec-0.19.0.tar.gz", hash = "sha256:604037e7cd475345848116e89c553aa9a233259733ab51986ac924ab1b976f8e", size = 216934 }
wheels = [
    { url = "https://files.pythonhosted.org/packages/24/d4/2ec2567ac30dab072cce3e91fb17803c52f0a37aab6b0c24375d2b20a581/msgspec-0.19.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:aa77046904db764b0462036bc63ef71f02b75b8f72e9c9dd4c447d6da1ed8f8e", size = 187939 },
    { url = "https://files.pythonhosted.org/packages/2b/c0/18226e4328897f4f19875cb62bb9259fe47e901eade9d9376ab5f251a929/msgspec-0.19.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:047cfa8675eb3bad68722cfe95c60e7afabf84d1bd8938979dd2b92e9e4a9551", size = 182202 },
    { url = "https://files.pythonhosted.org/packages/81/25/3a4b24d468203d8af90d1d351b77ea3cffb96b29492855cf83078f16bfe4/msgspec-0.19.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e78f46ff39a427e10b4a61614a2777ad69559cc8d603a7c05681f5a595ea98f7", size = 209029 },
    { url = "https://files.pythonhosted.org/packages/85/2e/db7e189b57901955239f7689b5dcd6ae9458637a9c66747326726c650523/msgspec-0.19.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6c7adf191e4bd3be0e9231c3b6dc20cf1199ada2af523885efc2ed218eafd011", size = 210682 },
    { url = "https://files.pythonhosted.org/packages/03/97/7c8895c9074a97052d7e4a1cc1230b7b6e2ca2486714eb12c3f08bb9d284/msgspec-0.19.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:f04cad4385e20be7c7176bb8ae3dca54a08e9756cfc97bcdb4f18560c3042063", size = 214003 },
    { url = "https://files.pythonhosted.org/packages/61/61/e892997bcaa289559b4d5869f066a8021b79f4bf8e955f831b095f47a4cd/msgspec-0.19.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:45c8fb410670b3b7eb884d44a75589377c341ec1392b778311acdbfa55187716", size = 216833 },
    { url = "https://files.pythonhosted.org/packages/ce/3d/71b2dffd3a1c743ffe13296ff701ee503feaebc3f04d0e75613b6563c374/msgspec-0.19.0-cp311-cp311-win_amd64.whl", hash = "sha256:70eaef4934b87193a27d802534dc466778ad8d536e296ae2f9334e182ac27b6c", size = 186184 },
    { url = "https://files.pythonhosted.org/packages/b2/5f/a70c24f075e3e7af2fae5414c7048b0e11389685b7f717bb55ba282a34a7/msgspec-0.19.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:f98bd8962ad549c27d63845b50af3f53ec468b6318400c9f1adfe8b092d7b62f", size = 190485 },
    { url = "https://files.pythonhosted.org/packages/89/b0/1b9763938cfae12acf14b682fcf05c92855974d921a5a985ecc197d1c672/msgspec-0.19.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:43bbb237feab761b815ed9df43b266114203f53596f9b6e6f00ebd79d178cdf2", size = 183910 },
    { url = "https://files.pythonhosted.org/packages/87/81/0c8c93f0b92c97e326b279795f9c5b956c5a97af28ca0fbb9fd86c83737a/msgspec-0.19.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4cfc033c02c3e0aec52b71710d7f84cb3ca5eb407ab2ad23d75631153fdb1f12", size = 210633 },
    { url = "https://files.pythonhosted.org/packages/d0/ef/c5422ce8af73928d194a6606f8ae36e93a52fd5e8df5abd366903a5ca8da/msgspec-0.19.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d911c442571605e17658ca2b416fd8579c5050ac9adc5e00c2cb3126c97f73bc", size = 213594 },
    { url = "https://files.pythonhosted.org/packages/19/2b/4137bc2ed45660444842d042be2cf5b18aa06efd2cda107cff18253b9653/msgspec-0.19.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:757b501fa57e24896cf40a831442b19a864f56d253679f34f260dcb002524a6c", size = 214053 },
    { url = "https://files.pythonhosted.org/packages/9d/e6/8ad51bdc806aac1dc501e8fe43f759f9ed7284043d722b53323ea421c360/msgspec-0.19.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:5f0f65f29b45e2816d8bded36e6b837a4bf5fb60ec4bc3c625fa2c6da4124537", size = 219081 },
    { url = "https://files.pythonhosted.org/packages/b1/ef/27dd35a7049c9a4f4211c6cd6a8c9db0a50647546f003a5867827ec45391/msgspec-0.19.0-cp312-cp312-win_amd64.whl", hash = "sha256:067f0de1c33cfa0b6a8206562efdf6be5985b988b53dd244a8e06f993f27c8c0", size = 187467 },
    { url = "https://files.pythonhosted.org/packages/3c/cb/2842c312bbe618d8fefc8b9cedce37f773cdc8fa453306546dba2c21fd98/msgspec-0.19.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:f12d30dd6266557aaaf0aa0f9580a9a8fbeadfa83699c487713e355ec5f0bd86", size = 190498 },
    { url = "https://files.pythonhosted.org/packages/58/95/c40b01b93465e1a5f3b6c7d91b10fb574818163740cc3acbe722d1e0e7e4/msgspec-0.19.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:82b2c42c1b9ebc89e822e7e13bbe9d17ede0c23c187469fdd9505afd5a481314", size = 183950 },
    { url = "https://files.pythonhosted.org/packages/e8/f0/5b764e066ce9aba4b70d1db8b087ea66098c7c27d59b9dd8a3532774d48f/msgspec-0.19.0-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:19746b50be214a54239aab822964f2ac81e38b0055cca94808359d779338c10e", size = 210647 },
    { url = "https://files.pythonhosted.org/packages/9d/87/bc14f49bc95c4cb0dd0a8c56028a67c014ee7e6818ccdce74a4862af259b/msgspec-0.19.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:60ef4bdb0ec8e4ad62e5a1f95230c08efb1f64f32e6e8dd2ced685bcc73858b5", size = 213563 },
    { url = "https://files.pythonhosted.org/packages/53/2f/2b1c2b056894fbaa975f68f81e3014bb447516a8b010f1bed3fb0e016ed7/msgspec-0.19.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:ac7f7c377c122b649f7545810c6cd1b47586e3aa3059126ce3516ac7ccc6a6a9", size = 213996 },
    { url = "https://files.pythonhosted.org/packages/aa/5a/4cd408d90d1417e8d2ce6a22b98a6853c1b4d7cb7669153e4424d60087f6/msgspec-0.19.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:a5bc1472223a643f5ffb5bf46ccdede7f9795078194f14edd69e3aab7020d327", size = 219087 },
    { url = "https://files.pythonhosted.org/packages/23/d8/f15b40611c2d5753d1abb0ca0da0c75348daf1252220e5dda2867bd81062/msgspec-0.19.0-cp313-cp313-win_amd64.whl", hash = "sha256:317050bc0f7739cb30d257ff09152ca309bf5a369854bbf1e57dffc310c1f20f", size = 187432 },
]

[[package]]
name = "outcome"
version = "1.3.0.post0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "attrs" },
]
sdist = { url = "https://files.pythonhosted.org/packages/98/df/77698abfac98571e65ffeb0c1fba8ffd692ab8458d617a0eed7d9a8d38f2/outcome-1.3.0.post0.tar.gz", hash = "sha256:9dcf02e65f2971b80047b377468e72a268e15c0af3cf1238e6ff14f7f91143b8", size = 21060 }
wheels = [
    { url = "https://files.pythonhosted.org/packages/55/8b/5ab7257531a5d830fc8000c476e63c935488d74609b50f9384a643ec0a62/outcome-1.3.0.post0-py2.py3-none-any.whl", hash = "sha256:e771c5ce06d1415e356078d3bdd68523f284b4ce5419828922b6871e65eda82b", size = 10692 },
]

[[package]]
name = "packaging"
version = "24.2"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/d0/63/68dbb6eb2de9cb10ee4c9c14a0148804425e13c4fb20d61cce69f53106da/packaging-24.2.tar.gz", hash = "sha256:c228a6dc5e932d346bc5739379109d49e8853dd8223571c7c5b55260edc0b97f", size = 163950 }
wheels = [
    { url = "https://files.pythonhosted.org/packages/88/ef/eb23f262cca3c0c4eb7ab1933c3b1f03d021f2c48f54763065b6f0e321be/packaging-24.2-py3-none-any.whl", hash = "sha256:09abb1bccd265c01f4a3aa3f7a7db064b36514d2cba19a2f694fe6150451a759", size = 65451 },
]

[[package]]
name = "pdbp"
version = "1.6.1"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "colorama", marker = "sys_platform == 'win32'" },
    { name = "pygments" },
    { name = "tabcompleter" },
]
sdist = { url = "https://files.pythonhosted.org/packages/69/13/80da03638f62facbee76312ca9ee5941c017b080f2e4c6919fd4e87e16e3/pdbp-1.6.1.tar.gz", hash = "sha256:f4041642952a05df89664e166d5bd379607a0866ddd753c06874f65552bdf40b", size = 25322 }
wheels = [
    { url = "https://files.pythonhosted.org/packages/29/93/d56fb9ba5569dc29d8263c72e46d21a2fd38741339ebf03f54cf7561828c/pdbp-1.6.1-py3-none-any.whl", hash = "sha256:f10bad2ee044c0e5c168cb0825abfdbdc01c50013e9755df5261b060bdd35c22", size = 21495 },
]

[[package]]
name = "pexpect"
version = "4.9.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "ptyprocess" },
]
sdist = { url = "https://files.pythonhosted.org/packages/42/92/cc564bf6381ff43ce1f4d06852fc19a2f11d180f23dc32d9588bee2f149d/pexpect-4.9.0.tar.gz", hash = "sha256:ee7d41123f3c9911050ea2c2dac107568dc43b2d3b0c7557a33212c398ead30f", size = 166450 }
wheels = [
    { url = "https://files.pythonhosted.org/packages/9e/c3/059298687310d527a58bb01f3b1965787ee3b40dce76752eda8b44e9a2c5/pexpect-4.9.0-py2.py3-none-any.whl", hash = "sha256:7236d1e080e4936be2dc3e326cec0af72acf9212a7e1d060210e70a47e253523", size = 63772 },
]

[[package]]
name = "pluggy"
version = "1.5.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/96/2d/02d4312c973c6050a18b314a5ad0b3210edb65a906f868e31c111dede4a6/pluggy-1.5.0.tar.gz", hash = "sha256:2cffa88e94fdc978c4c574f15f9e59b7f4201d439195c3715ca9e2486f1d0cf1", size = 67955 }
wheels = [
    { url = "https://files.pythonhosted.org/packages/88/5f/e351af9a41f866ac3f1fac4ca0613908d9a41741cfcf2228f4ad853b697d/pluggy-1.5.0-py3-none-any.whl", hash = "sha256:44e1ad92c8ca002de6377e165f3e0f1be63266ab4d554740532335b9d75ea669", size = 20556 },
]

[[package]]
name = "prompt-toolkit"
version = "3.0.50"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "wcwidth" },
]
sdist = { url = "https://files.pythonhosted.org/packages/a1/e1/bd15cb8ffdcfeeb2bdc215de3c3cffca11408d829e4b8416dcfe71ba8854/prompt_toolkit-3.0.50.tar.gz", hash = "sha256:544748f3860a2623ca5cd6d2795e7a14f3d0e1c3c9728359013f79877fc89bab", size = 429087 }
wheels = [
    { url = "https://files.pythonhosted.org/packages/e4/ea/d836f008d33151c7a1f62caf3d8dd782e4d15f6a43897f64480c2b8de2ad/prompt_toolkit-3.0.50-py3-none-any.whl", hash = "sha256:9b6427eb19e479d98acff65196a307c555eb567989e6d88ebbb1b509d9779198", size = 387816 },
]

[[package]]
name = "ptyprocess"
version = "0.7.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/20/e5/16ff212c1e452235a90aeb09066144d0c5a6a8c0834397e03f5224495c4e/ptyprocess-0.7.0.tar.gz", hash = "sha256:5c5d0a3b48ceee0b48485e0c26037c0acd7d29765ca3fbb5cb3831d347423220", size = 70762 }
wheels = [
    { url = "https://files.pythonhosted.org/packages/22/a6/858897256d0deac81a172289110f31629fc4cee19b6f01283303e18c8db3/ptyprocess-0.7.0-py2.py3-none-any.whl", hash = "sha256:4b41f3967fce3af57cc7e94b888626c18bf37a083e3651ca8feeb66d492fef35", size = 13993 },
]

[[package]]
name = "pycparser"
version = "2.22"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/1d/b2/31537cf4b1ca988837256c910a668b553fceb8f069bedc4b1c826024b52c/pycparser-2.22.tar.gz", hash = "sha256:491c8be9c040f5390f5bf44a5b07752bd07f56edf992381b05c701439eec10f6", size = 172736 }
wheels = [
    { url = "https://files.pythonhosted.org/packages/13/a3/a812df4e2dd5696d1f351d58b8fe16a405b234ad2886a0dab9183fb78109/pycparser-2.22-py3-none-any.whl", hash = "sha256:c3702b6d3dd8c7abc1afa565d7e63d53a1d0bd86cdc24edd75470f4de499cfcc", size = 117552 },
]

[[package]]
name = "pygments"
version = "2.19.1"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/7c/2d/c3338d48ea6cc0feb8446d8e6937e1408088a72a39937982cc6111d17f84/pygments-2.19.1.tar.gz", hash = "sha256:61c16d2a8576dc0649d9f39e089b5f02bcd27fba10d8fb4dcc28173f7a45151f", size = 4968581 }
wheels = [
    { url = "https://files.pythonhosted.org/packages/8a/0b/9fcc47d19c48b59121088dd6da2488a49d5f72dacf8262e2790a1d2c7d15/pygments-2.19.1-py3-none-any.whl", hash = "sha256:9ea1544ad55cecf4b8242fab6dd35a93bbce657034b0611ee383099054ab6d8c", size = 1225293 },
]

[[package]]
name = "pyperclip"
version = "1.9.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/30/23/2f0a3efc4d6a32f3b63cdff36cd398d9701d26cda58e3ab97ac79fb5e60d/pyperclip-1.9.0.tar.gz", hash = "sha256:b7de0142ddc81bfc5c7507eea19da920b92252b548b96186caf94a5e2527d310", size = 20961 }

[[package]]
name = "pyreadline3"
version = "3.5.4"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/0f/49/4cea918a08f02817aabae639e3d0ac046fef9f9180518a3ad394e22da148/pyreadline3-3.5.4.tar.gz", hash = "sha256:8d57d53039a1c75adba8e50dd3d992b28143480816187ea5efbd5c78e6c885b7", size = 99839 }
wheels = [
    { url = "https://files.pythonhosted.org/packages/5a/dc/491b7661614ab97483abf2056be1deee4dc2490ecbf7bff9ab5cdbac86e1/pyreadline3-3.5.4-py3-none-any.whl", hash = "sha256:eaf8e6cc3c49bcccf145fc6067ba8643d1df34d604a1ec0eccbf7a18e6d3fae6", size = 83178 },
]

[[package]]
name = "pytest"
version = "8.3.5"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "colorama", marker = "sys_platform == 'win32'" },
    { name = "iniconfig" },
    { name = "packaging" },
    { name = "pluggy" },
]
sdist = { url = "https://files.pythonhosted.org/packages/ae/3c/c9d525a414d506893f0cd8a8d0de7706446213181570cdbd766691164e40/pytest-8.3.5.tar.gz", hash = "sha256:f4efe70cc14e511565ac476b57c279e12a855b11f48f212af1080ef2263d3845", size = 1450891 }
wheels = [
    { url = "https://files.pythonhosted.org/packages/30/3d/64ad57c803f1fa1e963a7946b6e0fea4a70df53c1a7fed304586539c2bac/pytest-8.3.5-py3-none-any.whl", hash = "sha256:c69214aa47deac29fad6c2a4f590b9c4a9fdb16a403176fe154b79c0b4d4d820", size = 343634 },
]

[[package]]
name = "sniffio"
version = "1.3.1"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/a2/87/a6771e1546d97e7e041b6ae58d80074f81b7d5121207425c964ddf5cfdbd/sniffio-1.3.1.tar.gz", hash = "sha256:f4324edc670a0f49750a81b895f35c3adb843cca46f0530f79fc1babb23789dc", size = 20372 }
wheels = [
    { url = "https://files.pythonhosted.org/packages/e9/44/75a9c9421471a6c4805dbf2356f7c181a29c1879239abab1ea2cc8f38b40/sniffio-1.3.1-py3-none-any.whl", hash = "sha256:2f6da418d1f1e0fddd844478f41680e794e6051915791a034ff65e5f100525a2", size = 10235 },
]

[[package]]
name = "sortedcontainers"
version = "2.4.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/e8/c4/ba2f8066cceb6f23394729afe52f3bf7adec04bf9ed2c820b39e19299111/sortedcontainers-2.4.0.tar.gz", hash = "sha256:25caa5a06cc30b6b83d11423433f65d1f9d76c4c6a0c90e3379eaa43b9bfdb88", size = 30594 }
wheels = [
    { url = "https://files.pythonhosted.org/packages/32/46/9cb0e58b2deb7f82b84065f37f3bffeb12413f947f9388e4cac22c4621ce/sortedcontainers-2.4.0-py2.py3-none-any.whl", hash = "sha256:a163dcaede0f1c021485e957a39245190e74249897e2ae4b2aa38595db237ee0", size = 29575 },
]

[[package]]
name = "stackscope"
version = "0.2.2"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/4a/fc/20dbb993353f31230138f3c63f3f0c881d1853e70d7a30cd68d2ba4cf1e2/stackscope-0.2.2.tar.gz", hash = "sha256:f508c93eb4861ada466dd3ff613ca203962ceb7587ad013759f15394e6a4e619", size = 90479 }
wheels = [
    { url = "https://files.pythonhosted.org/packages/f1/5f/0a674fcafa03528089badb46419413f342537b5b57d2fefc9900fb8ee4e4/stackscope-0.2.2-py3-none-any.whl", hash = "sha256:c199b0cda738d39c993ee04eb01961b06b7e9aeb43ebf9fd6226cdd72ea9faf6", size = 80807 },
]

[[package]]
name = "tabcompleter"
version = "1.4.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "pyreadline3", marker = "sys_platform == 'win32'" },
]
sdist = { url = "https://files.pythonhosted.org/packages/73/1a/ed3544579628c5709bae6fae2255e94c6982a9ff77d42d8ba59fd2f3b21a/tabcompleter-1.4.0.tar.gz", hash = "sha256:7562a9938e62f8e7c3be612c3ac4e14c5ec4307b58ba9031c148260e866e8814", size = 10431 }
wheels = [
    { url = "https://files.pythonhosted.org/packages/65/44/bb509c3d2c0b5a87e7a5af1d5917a402a32ff026f777a6d7cb6990746cbb/tabcompleter-1.4.0-py3-none-any.whl", hash = "sha256:d744aa735b49c0a6cc2fb8fcd40077fec47425e4388301010b14e6ce3311368b", size = 6725 },
]

[[package]]
name = "tractor"
version = "0.1.0a6.dev0"
source = { editable = "." }
dependencies = [
    { name = "colorlog" },
    { name = "msgspec" },
    { name = "pdbp" },
    { name = "tricycle" },
    { name = "trio" },
    { name = "wrapt" },
]

[package.dev-dependencies]
dev = [
    { name = "greenback" },
    { name = "pexpect" },
    { name = "prompt-toolkit" },
    { name = "pyperclip" },
    { name = "pytest" },
    { name = "stackscope" },
    { name = "xonsh" },
]

[package.metadata]
requires-dist = [
    { name = "colorlog", specifier = ">=6.8.2,<7" },
    { name = "msgspec", specifier = ">=0.19.0" },
    { name = "pdbp", specifier = ">=1.6,<2" },
    { name = "tricycle", specifier = ">=0.4.1,<0.5" },
    { name = "trio", specifier = ">0.27" },
    { name = "wrapt", specifier = ">=1.16.0,<2" },
]

[package.metadata.requires-dev]
dev = [
    { name = "greenback", specifier = ">=1.2.1,<2" },
    { name = "pexpect", specifier = ">=4.9.0,<5" },
    { name = "prompt-toolkit", specifier = ">=3.0.50" },
    { name = "pyperclip", specifier = ">=1.9.0" },
    { name = "pytest", specifier = ">=8.3.5" },
    { name = "stackscope", specifier = ">=0.2.2,<0.3" },
    { name = "xonsh", specifier = ">=0.19.2" },
]

[[package]]
name = "tricycle"
version = "0.4.1"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "trio" },
]
sdist = { url = "https://files.pythonhosted.org/packages/f8/8e/fdd7bc467b40eedd0a5f2ed36b0d692c6e6f2473be00c8160e2e9f53adc1/tricycle-0.4.1.tar.gz", hash = "sha256:f56edb4b3e1bed3e2552b1b499b24a2dab47741e92e9b4d806acc5c35c9e6066", size = 41551 }
wheels = [
    { url = "https://files.pythonhosted.org/packages/d7/c6/7cc05d60e21c683df99167db071ce5d848f5063c2a63971a8443466f603e/tricycle-0.4.1-py3-none-any.whl", hash = "sha256:67900995a73e7445e2c70250cdca04a778d9c3923dd960a97ad4569085e0fb3f", size = 35316 },
]

[[package]]
name = "trio"
version = "0.29.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "attrs" },
    { name = "cffi", marker = "implementation_name != 'pypy' and os_name == 'nt'" },
    { name = "idna" },
    { name = "outcome" },
    { name = "sniffio" },
    { name = "sortedcontainers" },
]
sdist = { url = "https://files.pythonhosted.org/packages/a1/47/f62e62a1a6f37909aed0bf8f5d5411e06fa03846cfcb64540cd1180ccc9f/trio-0.29.0.tar.gz", hash = "sha256:ea0d3967159fc130acb6939a0be0e558e364fee26b5deeecc893a6b08c361bdf", size = 588952 }
wheels = [
    { url = "https://files.pythonhosted.org/packages/c9/55/c4d9bea8b3d7937901958f65124123512419ab0eb73695e5f382521abbfb/trio-0.29.0-py3-none-any.whl", hash = "sha256:d8c463f1a9cc776ff63e331aba44c125f423a5a13c684307e828d930e625ba66", size = 492920 },
]

[[package]]
name = "wcwidth"
version = "0.2.13"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/6c/63/53559446a878410fc5a5974feb13d31d78d752eb18aeba59c7fef1af7598/wcwidth-0.2.13.tar.gz", hash = "sha256:72ea0c06399eb286d978fdedb6923a9eb47e1c486ce63e9b4e64fc18303972b5", size = 101301 }
wheels = [
    { url = "https://files.pythonhosted.org/packages/fd/84/fd2ba7aafacbad3c4201d395674fc6348826569da3c0937e75505ead3528/wcwidth-0.2.13-py2.py3-none-any.whl", hash = "sha256:3da69048e4540d84af32131829ff948f1e022c1c6bdb8d6102117aac784f6859", size = 34166 },
]

[[package]]
name = "wrapt"
version = "1.17.2"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/c3/fc/e91cc220803d7bc4db93fb02facd8461c37364151b8494762cc88b0fbcef/wrapt-1.17.2.tar.gz", hash = "sha256:41388e9d4d1522446fe79d3213196bd9e3b301a336965b9e27ca2788ebd122f3", size = 55531 }
wheels = [
    { url = "https://files.pythonhosted.org/packages/cd/f7/a2aab2cbc7a665efab072344a8949a71081eed1d2f451f7f7d2b966594a2/wrapt-1.17.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:ff04ef6eec3eee8a5efef2401495967a916feaa353643defcc03fc74fe213b58", size = 53308 },
    { url = "https://files.pythonhosted.org/packages/50/ff/149aba8365fdacef52b31a258c4dc1c57c79759c335eff0b3316a2664a64/wrapt-1.17.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:4db983e7bca53819efdbd64590ee96c9213894272c776966ca6306b73e4affda", size = 38488 },
    { url = "https://files.pythonhosted.org/packages/65/46/5a917ce85b5c3b490d35c02bf71aedaa9f2f63f2d15d9949cc4ba56e8ba9/wrapt-1.17.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:9abc77a4ce4c6f2a3168ff34b1da9b0f311a8f1cfd694ec96b0603dff1c79438", size = 38776 },
    { url = "https://files.pythonhosted.org/packages/ca/74/336c918d2915a4943501c77566db41d1bd6e9f4dbc317f356b9a244dfe83/wrapt-1.17.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0b929ac182f5ace000d459c59c2c9c33047e20e935f8e39371fa6e3b85d56f4a", size = 83776 },
    { url = "https://files.pythonhosted.org/packages/09/99/c0c844a5ccde0fe5761d4305485297f91d67cf2a1a824c5f282e661ec7ff/wrapt-1.17.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f09b286faeff3c750a879d336fb6d8713206fc97af3adc14def0cdd349df6000", size = 75420 },
    { url = "https://files.pythonhosted.org/packages/b4/b0/9fc566b0fe08b282c850063591a756057c3247b2362b9286429ec5bf1721/wrapt-1.17.2-cp311-cp311-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1a7ed2d9d039bd41e889f6fb9364554052ca21ce823580f6a07c4ec245c1f5d6", size = 83199 },
    { url = "https://files.pythonhosted.org/packages/9d/4b/71996e62d543b0a0bd95dda485219856def3347e3e9380cc0d6cf10cfb2f/wrapt-1.17.2-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:129a150f5c445165ff941fc02ee27df65940fcb8a22a61828b1853c98763a64b", size = 82307 },
    { url = "https://files.pythonhosted.org/packages/39/35/0282c0d8789c0dc9bcc738911776c762a701f95cfe113fb8f0b40e45c2b9/wrapt-1.17.2-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:1fb5699e4464afe5c7e65fa51d4f99e0b2eadcc176e4aa33600a3df7801d6662", size = 75025 },
    { url = "https://files.pythonhosted.org/packages/4f/6d/90c9fd2c3c6fee181feecb620d95105370198b6b98a0770cba090441a828/wrapt-1.17.2-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:9a2bce789a5ea90e51a02dfcc39e31b7f1e662bc3317979aa7e5538e3a034f72", size = 81879 },
    { url = "https://files.pythonhosted.org/packages/8f/fa/9fb6e594f2ce03ef03eddbdb5f4f90acb1452221a5351116c7c4708ac865/wrapt-1.17.2-cp311-cp311-win32.whl", hash = "sha256:4afd5814270fdf6380616b321fd31435a462019d834f83c8611a0ce7484c7317", size = 36419 },
    { url = "https://files.pythonhosted.org/packages/47/f8/fb1773491a253cbc123c5d5dc15c86041f746ed30416535f2a8df1f4a392/wrapt-1.17.2-cp311-cp311-win_amd64.whl", hash = "sha256:acc130bc0375999da18e3d19e5a86403667ac0c4042a094fefb7eec8ebac7cf3", size = 38773 },
    { url = "https://files.pythonhosted.org/packages/a1/bd/ab55f849fd1f9a58ed7ea47f5559ff09741b25f00c191231f9f059c83949/wrapt-1.17.2-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:d5e2439eecc762cd85e7bd37161d4714aa03a33c5ba884e26c81559817ca0925", size = 53799 },
    { url = "https://files.pythonhosted.org/packages/53/18/75ddc64c3f63988f5a1d7e10fb204ffe5762bc663f8023f18ecaf31a332e/wrapt-1.17.2-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:3fc7cb4c1c744f8c05cd5f9438a3caa6ab94ce8344e952d7c45a8ed59dd88392", size = 38821 },
    { url = "https://files.pythonhosted.org/packages/48/2a/97928387d6ed1c1ebbfd4efc4133a0633546bec8481a2dd5ec961313a1c7/wrapt-1.17.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:8fdbdb757d5390f7c675e558fd3186d590973244fab0c5fe63d373ade3e99d40", size = 38919 },
    { url = "https://files.pythonhosted.org/packages/73/54/3bfe5a1febbbccb7a2f77de47b989c0b85ed3a6a41614b104204a788c20e/wrapt-1.17.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5bb1d0dbf99411f3d871deb6faa9aabb9d4e744d67dcaaa05399af89d847a91d", size = 88721 },
    { url = "https://files.pythonhosted.org/packages/25/cb/7262bc1b0300b4b64af50c2720ef958c2c1917525238d661c3e9a2b71b7b/wrapt-1.17.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d18a4865f46b8579d44e4fe1e2bcbc6472ad83d98e22a26c963d46e4c125ef0b", size = 80899 },
    { url = "https://files.pythonhosted.org/packages/2a/5a/04cde32b07a7431d4ed0553a76fdb7a61270e78c5fd5a603e190ac389f14/wrapt-1.17.2-cp312-cp312-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bc570b5f14a79734437cb7b0500376b6b791153314986074486e0b0fa8d71d98", size = 89222 },
    { url = "https://files.pythonhosted.org/packages/09/28/2e45a4f4771fcfb109e244d5dbe54259e970362a311b67a965555ba65026/wrapt-1.17.2-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:6d9187b01bebc3875bac9b087948a2bccefe464a7d8f627cf6e48b1bbae30f82", size = 86707 },
    { url = "https://files.pythonhosted.org/packages/c6/d2/dcb56bf5f32fcd4bd9aacc77b50a539abdd5b6536872413fd3f428b21bed/wrapt-1.17.2-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:9e8659775f1adf02eb1e6f109751268e493c73716ca5761f8acb695e52a756ae", size = 79685 },
    { url = "https://files.pythonhosted.org/packages/80/4e/eb8b353e36711347893f502ce91c770b0b0929f8f0bed2670a6856e667a9/wrapt-1.17.2-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:e8b2816ebef96d83657b56306152a93909a83f23994f4b30ad4573b00bd11bb9", size = 87567 },
    { url = "https://files.pythonhosted.org/packages/17/27/4fe749a54e7fae6e7146f1c7d914d28ef599dacd4416566c055564080fe2/wrapt-1.17.2-cp312-cp312-win32.whl", hash = "sha256:468090021f391fe0056ad3e807e3d9034e0fd01adcd3bdfba977b6fdf4213ea9", size = 36672 },
    { url = "https://files.pythonhosted.org/packages/15/06/1dbf478ea45c03e78a6a8c4be4fdc3c3bddea5c8de8a93bc971415e47f0f/wrapt-1.17.2-cp312-cp312-win_amd64.whl", hash = "sha256:ec89ed91f2fa8e3f52ae53cd3cf640d6feff92ba90d62236a81e4e563ac0e991", size = 38865 },
    { url = "https://files.pythonhosted.org/packages/ce/b9/0ffd557a92f3b11d4c5d5e0c5e4ad057bd9eb8586615cdaf901409920b14/wrapt-1.17.2-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:6ed6ffac43aecfe6d86ec5b74b06a5be33d5bb9243d055141e8cabb12aa08125", size = 53800 },
    { url = "https://files.pythonhosted.org/packages/c0/ef/8be90a0b7e73c32e550c73cfb2fa09db62234227ece47b0e80a05073b375/wrapt-1.17.2-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:35621ae4c00e056adb0009f8e86e28eb4a41a4bfa8f9bfa9fca7d343fe94f998", size = 38824 },
    { url = "https://files.pythonhosted.org/packages/36/89/0aae34c10fe524cce30fe5fc433210376bce94cf74d05b0d68344c8ba46e/wrapt-1.17.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:a604bf7a053f8362d27eb9fefd2097f82600b856d5abe996d623babd067b1ab5", size = 38920 },
    { url = "https://files.pythonhosted.org/packages/3b/24/11c4510de906d77e0cfb5197f1b1445d4fec42c9a39ea853d482698ac681/wrapt-1.17.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5cbabee4f083b6b4cd282f5b817a867cf0b1028c54d445b7ec7cfe6505057cf8", size = 88690 },
    { url = "https://files.pythonhosted.org/packages/71/d7/cfcf842291267bf455b3e266c0c29dcb675b5540ee8b50ba1699abf3af45/wrapt-1.17.2-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:49703ce2ddc220df165bd2962f8e03b84c89fee2d65e1c24a7defff6f988f4d6", size = 80861 },
    { url = "https://files.pythonhosted.org/packages/d5/66/5d973e9f3e7370fd686fb47a9af3319418ed925c27d72ce16b791231576d/wrapt-1.17.2-cp313-cp313-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8112e52c5822fc4253f3901b676c55ddf288614dc7011634e2719718eaa187dc", size = 89174 },
    { url = "https://files.pythonhosted.org/packages/a7/d3/8e17bb70f6ae25dabc1aaf990f86824e4fd98ee9cadf197054e068500d27/wrapt-1.17.2-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:9fee687dce376205d9a494e9c121e27183b2a3df18037f89d69bd7b35bcf59e2", size = 86721 },
    { url = "https://files.pythonhosted.org/packages/6f/54/f170dfb278fe1c30d0ff864513cff526d624ab8de3254b20abb9cffedc24/wrapt-1.17.2-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:18983c537e04d11cf027fbb60a1e8dfd5190e2b60cc27bc0808e653e7b218d1b", size = 79763 },
    { url = "https://files.pythonhosted.org/packages/4a/98/de07243751f1c4a9b15c76019250210dd3486ce098c3d80d5f729cba029c/wrapt-1.17.2-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:703919b1633412ab54bcf920ab388735832fdcb9f9a00ae49387f0fe67dad504", size = 87585 },
    { url = "https://files.pythonhosted.org/packages/f9/f0/13925f4bd6548013038cdeb11ee2cbd4e37c30f8bfd5db9e5a2a370d6e20/wrapt-1.17.2-cp313-cp313-win32.whl", hash = "sha256:abbb9e76177c35d4e8568e58650aa6926040d6a9f6f03435b7a522bf1c487f9a", size = 36676 },
    { url = "https://files.pythonhosted.org/packages/bf/ae/743f16ef8c2e3628df3ddfd652b7d4c555d12c84b53f3d8218498f4ade9b/wrapt-1.17.2-cp313-cp313-win_amd64.whl", hash = "sha256:69606d7bb691b50a4240ce6b22ebb319c1cfb164e5f6569835058196e0f3a845", size = 38871 },
    { url = "https://files.pythonhosted.org/packages/3d/bc/30f903f891a82d402ffb5fda27ec1d621cc97cb74c16fea0b6141f1d4e87/wrapt-1.17.2-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:4a721d3c943dae44f8e243b380cb645a709ba5bd35d3ad27bc2ed947e9c68192", size = 56312 },
    { url = "https://files.pythonhosted.org/packages/8a/04/c97273eb491b5f1c918857cd26f314b74fc9b29224521f5b83f872253725/wrapt-1.17.2-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:766d8bbefcb9e00c3ac3b000d9acc51f1b399513f44d77dfe0eb026ad7c9a19b", size = 40062 },
    { url = "https://files.pythonhosted.org/packages/4e/ca/3b7afa1eae3a9e7fefe499db9b96813f41828b9fdb016ee836c4c379dadb/wrapt-1.17.2-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:e496a8ce2c256da1eb98bd15803a79bee00fc351f5dfb9ea82594a3f058309e0", size = 40155 },
    { url = "https://files.pythonhosted.org/packages/89/be/7c1baed43290775cb9030c774bc53c860db140397047cc49aedaf0a15477/wrapt-1.17.2-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:40d615e4fe22f4ad3528448c193b218e077656ca9ccb22ce2cb20db730f8d306", size = 113471 },
    { url = "https://files.pythonhosted.org/packages/32/98/4ed894cf012b6d6aae5f5cc974006bdeb92f0241775addad3f8cd6ab71c8/wrapt-1.17.2-cp313-cp313t-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a5aaeff38654462bc4b09023918b7f21790efb807f54c000a39d41d69cf552cb", size = 101208 },
    { url = "https://files.pythonhosted.org/packages/ea/fd/0c30f2301ca94e655e5e057012e83284ce8c545df7661a78d8bfca2fac7a/wrapt-1.17.2-cp313-cp313t-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9a7d15bbd2bc99e92e39f49a04653062ee6085c0e18b3b7512a4f2fe91f2d681", size = 109339 },
    { url = "https://files.pythonhosted.org/packages/75/56/05d000de894c4cfcb84bcd6b1df6214297b8089a7bd324c21a4765e49b14/wrapt-1.17.2-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:e3890b508a23299083e065f435a492b5435eba6e304a7114d2f919d400888cc6", size = 110232 },
    { url = "https://files.pythonhosted.org/packages/53/f8/c3f6b2cf9b9277fb0813418e1503e68414cd036b3b099c823379c9575e6d/wrapt-1.17.2-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:8c8b293cd65ad716d13d8dd3624e42e5a19cc2a2f1acc74b30c2c13f15cb61a6", size = 100476 },
    { url = "https://files.pythonhosted.org/packages/a7/b1/0bb11e29aa5139d90b770ebbfa167267b1fc548d2302c30c8f7572851738/wrapt-1.17.2-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:4c82b8785d98cdd9fed4cac84d765d234ed3251bd6afe34cb7ac523cb93e8b4f", size = 106377 },
    { url = "https://files.pythonhosted.org/packages/6a/e1/0122853035b40b3f333bbb25f1939fc1045e21dd518f7f0922b60c156f7c/wrapt-1.17.2-cp313-cp313t-win32.whl", hash = "sha256:13e6afb7fe71fe7485a4550a8844cc9ffbe263c0f1a1eea569bc7091d4898555", size = 37986 },
    { url = "https://files.pythonhosted.org/packages/09/5e/1655cf481e079c1f22d0cabdd4e51733679932718dc23bf2db175f329b76/wrapt-1.17.2-cp313-cp313t-win_amd64.whl", hash = "sha256:eaf675418ed6b3b31c7a989fd007fa7c3be66ce14e5c3b27336383604c9da85c", size = 40750 },
    { url = "https://files.pythonhosted.org/packages/2d/82/f56956041adef78f849db6b289b282e72b55ab8045a75abad81898c28d19/wrapt-1.17.2-py3-none-any.whl", hash = "sha256:b18f2d1533a71f069c7f82d524a52599053d4c7166e9dd374ae2136b7f40f7c8", size = 23594 },
]

[[package]]
name = "xonsh"
version = "0.19.2"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/68/4e/56e95a5e607eb3b0da37396f87cde70588efc8ef819ab16f02d5b8378dc4/xonsh-0.19.2.tar.gz", hash = "sha256:cfdd0680d954a2c3aefd6caddcc7143a3d06aa417ed18365a08219bb71b960b0", size = 799960 }
wheels = [
    { url = "https://files.pythonhosted.org/packages/6c/13/281094759df87b23b3c02dc4a16603ab08ea54d7f6acfeb69f3341137c7a/xonsh-0.19.2-py310-none-any.whl", hash = "sha256:ec7f163fd3a4943782aa34069d4e72793328c916a5975949dbec8536cbfc089b", size = 642301 },
    { url = "https://files.pythonhosted.org/packages/29/41/a51e4c3918fe9a293b150cb949b1b8c6d45eb17dfed480dcb76ea43df4e7/xonsh-0.19.2-py311-none-any.whl", hash = "sha256:53c45f7a767901f2f518f9b8dd60fc653e0498e56e89825e1710bb0859985049", size = 642286 },
    { url = "https://files.pythonhosted.org/packages/0a/93/9a77b731f492fac27c577dea2afb5a2bcc2a6a1c79be0c86c95498060270/xonsh-0.19.2-py312-none-any.whl", hash = "sha256:b24c619aa52b59eae4d35c4195dba9b19a2c548fb5c42c6f85f2b8ccb96807b5", size = 642386 },
    { url = "https://files.pythonhosted.org/packages/be/75/070324769c1ff88d971ce040f4f486339be98e0a365c8dd9991eb654265b/xonsh-0.19.2-py313-none-any.whl", hash = "sha256:c53ef6c19f781fbc399ed1b382b5c2aac2125010679a3b61d643978273c27df0", size = 642873 },
    { url = "https://files.pythonhosted.org/packages/fa/cb/2c7ccec54f5b0e73fdf7650e8336582ff0347d9001c5ef8271dc00c034fe/xonsh-0.19.2-py39-none-any.whl", hash = "sha256:bcc0225dc3847f1ed2f175dac6122fbcc54cea67d9c2dc2753d9615e2a5ff284", size = 634602 },
]