Compare commits

...

80 Commits

Author SHA1 Message Date
Tyler Goodlet 428c111fdc Add `debug_mode: bool` control to task mngr
Allows dynamically importing `pdbp` when enabled and provides a way to
eventually link with `tractor`'s own debug-mode flag.
2025-03-10 12:01:10 -04:00
Tyler Goodlet bf8b862338 Go all in on "task manager" naming 2025-03-10 12:01:10 -04:00
Tyler Goodlet 48cf129362 More refinements and proper typing
- drop unneeded (and commented) internal cs allocating bits.
- bypass all task manager stuff if no generator is provided by the
  caller; i.e. just call `.start_soon()` as normal.
- fix `Generator` typing.
- add some prints around task manager.
- expose the spawned lowlevel task via `TaskOutcome.lowlevel_task: Task`.
2025-03-10 12:01:10 -04:00
Tyler Goodlet c3534b1a26 Ensure user-allocated cancel scope just works!
Turns out the nursery doesn't have to care about allocating a per-task
`CancelScope` since the user can just do that in the
`@task_scope_manager` if desired B) So just mask all the nursery's
cs-allocating code with the intention of removing it.

Also add a test for per-task cancellation: start the crash task as
a `trio.sleep_forever()`, then cancel it via the user-allocated cs
and ensure the crash propagates as expected 💥
2025-03-10 12:01:10 -04:00
Tyler Goodlet 3664345083 Facepalm, don't pass in unnecessary cancel scope 2025-03-10 12:01:10 -04:00
Tyler Goodlet e418017298 Do renaming, implement lowlevel `Outcome` sending
As was listed in the many todos, this changes the `.start_soon()` impl
to instead (manually) `.send()` an `Outcome` from the spawned task into
the user-defined `@task_scope_manager`. In this case the task manager
wraps that in a user-defined (and renamed) `TaskOutcome` and delivers
that + a containing `trio.CancelScope` to the `.start_soon()` caller.
Here the user-defined `TaskOutcome` defines a `.wait_for_result()`
method that can be used to await the task's exit and handle its
underlying returned value or raised error; the implementation could be
different and subject to the user's own whims.

Note that by default, if this were added to `trio`'s core, the
`@task_scope_manager` would simply be implemented as either a
`None`-yielding single-yield generator or, more likely, entirely
ignored by the runtime (i.e. no manual task-outcome collecting,
generator calling or sending at all) when the user does not provide
a `task_scope_manager` to the nursery at open time.
2025-03-10 12:01:10 -04:00
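
For reference, a minimal sketch of the shape such a user-provided `@task_scope_manager` could take under the design described above; only the single-yield send-an-`Outcome` protocol comes from the commit, the `TaskOutcome` body and field names here are assumptions:

```python
# hypothetical sketch of the single-yield generator protocol; the
# nursery wrapper is assumed to `.send()` the spawned task's `Outcome`.
from collections.abc import Generator

import trio
from outcome import Outcome  # `trio`'s own result-capture lib


class TaskOutcome:
    def __init__(self) -> None:
        self._outcome: Outcome | None = None
        self._done = trio.Event()

    async def wait_for_result(self):
        # wait for task exit, then unwrap (re-raising any captured error)
        await self._done.wait()
        return self._outcome.unwrap()


def task_scope_manager(
    nursery: trio.Nursery,
) -> Generator[TaskOutcome, Outcome, None]:
    task_outcome = TaskOutcome()
    # the single yield: hand the `TaskOutcome` to the `.start_soon()`
    # caller; the runtime later `.send()`s the task's final `Outcome`.
    outcome: Outcome = yield task_outcome
    task_outcome._outcome = outcome
    task_outcome._done.set()
```
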
Tyler Goodlet e553b30e86 Alias to `@acm` in broadcaster mod 2025-03-10 12:01:10 -04:00
Tyler Goodlet b758a50135 Initial prototype for a one-cancels-one style supervisor, nursery thing.. 2025-03-10 12:01:10 -04:00
Tyler Goodlet 1c9056aced Avoid attr-err when `._ipc_msg==None`
Seems this can happen in particular when we raise a `MessageTypeError`
on the sender side of a `Context`, since there isn't any msg relayed
from the other side (though i'm wondering if MTE should derive from RAE
then considering this case?).

Means `RemoteActorError.boxed_type = None` in such cases instead of
raising an attr-error for the `None.boxed_type_str`.
2025-03-10 11:54:04 -04:00
Tyler Goodlet 5fd92dce06 Facepalm, fix logic misstep on child side
Namely that `add_hooks: bool` should be the same as on the rent side..
Also, just drop the now unused `iter_maybe_sends`.

This makes the entire suite greeeeen btw, including the new sub-suite
which i hadn't run before Bo
2025-03-10 11:54:04 -04:00
Tyler Goodlet d5ce51f4d9 Rework IPC-using `test_caps_based_msging` tests
Namely renaming and massively simplifying it to a new
`test_ext_types_over_ipc` which avoids all the wacky "parent dictates
what sender should be able to send beforehand"..

Instead keep it simple and just always try to send the same small set of
types over the wire with expect-logic to handle each case,

- use the new `dec_hook`/`ext_types` args to `mk_[co]dec()` routines for
  pld-spec ipc transport.
- always try to stream a small set of types from the child with logic to
  handle the cases expected to error.

Other,
- draft a `test_pld_limiting_usage` to check runtime raising of bad API
  usage; haven't run it yet tho.
- move `test_custom_extension_types` to top of mod so that the
  `enc/dec_nsp()` hooks can be reffed from test parametrizations.
- comment out (and maybe remove) the old routines for
  `iter_maybe_sends`, `test_limit_msgspec`, `chk_pld_type`.

XXX TODO, turns out the 2 failing cases from this suite have exposed
an actual bug with `MsgTypeError` unpacking where the `ipc_msg=` input
is being set to `None` ?? -> see the comment at the bottom of
`._exceptions._mk_recv_mte()` which seems to describe the likely
culprit?
2025-03-10 11:54:04 -04:00
Tyler Goodlet 6a2ea0e875 Raise RTE from `limit_plds()` on no `curr_ctx`
Since it should only be used from within a `Portal.open_context()`
scope, make sure the caller knows that!

Also don't hide the frame in tb if the immediate function errors..
2025-03-10 11:54:04 -04:00
Tyler Goodlet d67c025cd3 Offer a `mods: list` to `dec_type_union()`; drop importing this-mod 2025-03-10 11:54:04 -04:00
Tyler Goodlet a9cc293d00 Tweak type-error messages for when `ext_types` is missing 2025-03-10 11:54:04 -04:00
Tyler Goodlet e67db03b7a Move `Union` serializers to new `msg.` mod
Namely moving `enc/dec_type_union()` from the test mod to a new
`tractor.msg._exts` for general use outside the test suite.
2025-03-10 11:54:04 -04:00
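
A rough sketch of how such union-type codec helpers might work; the function and param names come from the commits above but the bodies are assumptions:

```python
# hypothetical impls of `enc/dec_type_union()`: encode a `Union` as its
# member type names, then re-resolve them against caller-provided mods.
from types import ModuleType
from typing import Union, get_args


def enc_type_union(union: type) -> list[str]:
    return [typ.__name__ for typ in get_args(union)]


def dec_type_union(
    type_names: list[str],
    mods: list[ModuleType],
) -> type:
    types: list[type] = []
    for name in type_names:
        for mod in mods:
            if (typ := getattr(mod, name, None)) is not None:
                types.append(typ)
                break

    # re-union the resolved member types
    return Union[tuple(types)]
```
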
Tyler Goodlet 604a95f5bf Finally get type-extended `msgspec` fields workinn
By using our new `PldRx` design we can,
- pass through the pld-spec & a `dec_hook()` to our `MsgDec` which is
  used to configure the underlying `.dec: msgspec.msgpack.Decoder`
- pass through a `enc_hook()` to `mk_codec()` and use it to conf the
  equiv `MsgCodec.enc` such that sent msg-plds are converted prior
  to transport.

The trick ended up being just to always union the `mk_dec()`
extension-types spec with the normal `msgspec.Raw` pld-spec
such that the `dec_hook()` is only invoked for payload types tagged
by the encoder/sender side B)

A variety of impl tweaks to make it all happen as well as various
cleanups in the `.msg._codec` mod include,

- `mk_dec()`: no default `spec` arg, better doc string, accept the new
  `ext_types` arg, doing the union of that with `msgspec.Raw`.
- proto-ed a now unused `mk_boxed_ext_struct()` which will likely get
  removed since it ended up that our `PayloadMsg` structs already cover
  the ext-type-hook requirement that the decoder is passed
  a `.type=msgspec.Struct` of some sort in order for `.dec_hook` to be
  used.
- add a `unpack_spec_types()` util fn for getting the `set[Type]` from
  a `Union[Type]` annotation instance.
- mk the default `mk_codec(pc_pld_spec = Raw,)` since the `PldRx` design
  was already passing/overriding it and it doesn't make much sense to
  use `Any` anymore for the same reason; it will cause various `Context`
  apis to now break.
  |_ also accept a `enc_hook()` and `ext_types` which are used to maybe
     config the `.msgpack.Encoder`
- generally tweak a bunch of comments-as-docs and todos namely the ones
  that are completed after the pld-rx design was implemented.

Also,
- mask the non-functioning `'defstruct'` approach inside
  `.msg.types.mk_msg_spec()` to prep for its removal.

Adjust the test suite (rn called `test_caps_based_msging`),
- add a new suite `test_custom_extension_types` and move and
  use the `enc/dec_nsp()` hooks to the mod level for its use.
- prolly planning to drop the `test_limit_msgspec` suite since it's
  mostly replaced by the `test_pldrx_limiting` mod's version?
- originally was tweaking a bunch in `test_codec_hooks_mod` but likely
  it will get mostly rewritten to be simpler and simply verify that
  ext-typed fields can be used over IPC `Context`s between actors (as
  originally intended for this sub-suite).
2025-03-10 11:54:04 -04:00
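
The underlying `msgspec` hook mechanics described above can be shown standalone; this sketch uses plain `msgspec` APIs with an illustrative custom type (tractor's `mk_[co]dec()` wrappers additionally union the spec with `msgspec.Raw`, per the commit):

```python
# standalone demo of `enc_hook()`/`dec_hook()` type extension.
import msgspec


class Vec2:
    def __init__(self, x: float, y: float) -> None:
        self.x, self.y = x, y

    def __eq__(self, other: object) -> bool:
        return (
            isinstance(other, Vec2)
            and (self.x, self.y) == (other.x, other.y)
        )


def enc_hook(obj: object) -> object:
    # convert non-native objs to a msgpack-shippable form pre-transport
    if isinstance(obj, Vec2):
        return (obj.x, obj.y)
    raise NotImplementedError(f'No encoding for {type(obj)}')


def dec_hook(typ: type, obj: object) -> object:
    # only invoked for annotated types the decoder can't natively build
    if typ is Vec2:
        return Vec2(*obj)
    raise NotImplementedError(f'No decoding for {typ}')


enc = msgspec.msgpack.Encoder(enc_hook=enc_hook)
dec = msgspec.msgpack.Decoder(
    type=Vec2,
    dec_hook=dec_hook,
)

wire: bytes = enc.encode(Vec2(1, 2))
assert dec.decode(wire) == Vec2(1, 2)
```
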
Tyler Goodlet 0fd43c3448 Bump to `msgspec>=0.19.0` for py 3.13 support! 2025-03-10 11:53:41 -04:00
Tyler Goodlet 81349f7c09 Bind another `_bexc` for debuggin 2025-03-05 12:39:16 -05:00
Tyler Goodlet cdd9171de6 Mask ctlc borked REPL tests
Namely the `tractor.pause_from_sync()` examples using both bg threads
and `asyncio` which seem to go into bad states where SIGINT is ignored..

Deats,
- add `maybe_expect_timeout()` cm to ensure the EOF hangs get
  `.xfail()`ed instead.
- `@pytest.mark.ctlcs_bish`-mark `test_pause_from_sync` and don't expect the
  greenback prompt msg.
- also mark `test_sync_pause_from_aio_task`.
2025-03-05 11:58:03 -05:00
Tyler Goodlet b1bcbf5693 Repair/update `stackscope` test
Seems that on 3.13 it's not showing our script code in the output now?
Gotta get an example for @oremanj to see what's up but really it'd be
nice to just custom format stuff above `trio`'s runtime by def..

Anyway, update the `.devx._stackscope`,
- log formatting to be a little more "sclangy" lookin.
- change the per-actor "delimiter" lines style.
- report the `signal.getsignal(SIGINT)` which i needed in the
  `sync_bp.py` with ctl-c causing a hang..
- mask the `_tree_dumped` duplicator log report as well as the "dumped
  fine" one.
- add an example `pkill --signal SIGUSR1` cmdline.

Tweak the test to cope with,
- not showing our script lines now.. which i've commented in the
  `assert_before()` patts..
- to expect the newly formatted delimiter (ascii) lines to separate the
  root vs. hanger sub-actor sections.
2025-03-05 11:34:36 -05:00
Tyler Goodlet c648885eed Comment-tag pause points in `asyncio_bp.py`
Thought i already did this but, obvi needed these to make the expect
matches pass in our test.
2025-03-05 09:54:56 -05:00
Tyler Goodlet 56dfb8dc99 Unpack errors from `pdb.bdb`
Like any `bdb.BdbQuit` that might be relayed from a remote context after
a REPL exit with the `quit` cmd. This fixes various issues while
debugging where it may not be clear to the parent task that the child
was terminated with a purposefully unrecoverable error.
2025-03-05 09:49:13 -05:00
Tyler Goodlet 470640d084 Add a mark to `pytest.xfail()` questionably conc py stuff (ur mam `.xfail()`s bish!) 2025-03-04 19:53:24 -05:00
Tyler Goodlet bc9856aaf4 Show frames when decode is handed bad input 2025-03-04 13:54:46 -05:00
Tyler Goodlet e1bacaf4f4 Be extra sure to re-raise EoCs from translator
That is whenever `trio.EndOfChannel` is raised (presumably from the
`._to_trio.receive()` call inside `LinkedTaskChannel.receive()`) we need
to be extra certain that we let it bubble upward transparently DESPITE
special exc-as-signal handling that is normally suppressed from the aio
side; REPEAT we want to ALWAYS bubble any `trio_err ==
trio.EndOfChannel` in the `finally:` handler of `translate_aio_errors()`
despite `chan._trio_to_raise == AsyncioTaskExited` such that the
caller's iterable machinery will operate as normal when the inter-task
stream is stopped (again, presumably by the aio side task terminating
the inter-task stream).

Main impl deats for this,
- in the EoC handler block ensure we assign both `chan._trio_err` and
  the local `trio_err` as well as continue to re-raise.
- add a case to the match block in the `finally:` handler which FOR SURE
  re-raises any `type(trio_err) is EndOfChannel`!

Additionally fix a bad bug,
- a ref bug where we were NOT using the
  `except BaseException as _trio_err` to assign to `chan._trio_err` (by
  accident was missing the leading `_`..)

Unrelated impl tweak,
- move all `maybe_raise_aio_side_err()` content back to inline with its
  parent func - makes it easier to use `tractor.pause()` mostly Bp
- go back to trying to use `aio_task.set_exception(aio_taskc)` for now
  even though i'm pretty sure we're going to move to a try-fute-first
  style helper for this in the future.

Adjust some tests to match/mk-them-green,
- break from `aio_echo_server()` recv loop on
  `to_asyncio.TrioTaskExited` much like how you'd expect to (implicitly
  with a `for`) with a `trio.EndOfChannel`.
- toss in a masked `value is None` pause point i needed for debugging
  inf looping caused by not re-raising EoCs per the main patch
  description.
- add a debug-mode sized delay to root-infected test.
2025-03-03 21:50:51 -05:00
Tyler Goodlet d46e94d70e Fix an `aio_err` ref bug 2025-03-03 19:45:29 -05:00
Tyler Goodlet d32dc4e889 Another loosie in the trioisms suite 2025-03-03 18:55:02 -05:00
Tyler Goodlet c15b9c8460 Match `maybe_open_crash_handler()` to non-maybe version
Such that it will deliver a `BoxedMaybeException` to the caller
regardless whether `pdb` is set, and proxy through all `**kwargs`.
2025-03-03 18:53:13 -05:00
Tyler Goodlet 55809dac51 Use `collapse_eg()` in broadcaster suite
Around the test-embedded `trio.open_nursery()` calls as expected. Also
tidy up the various nursery var names.
2025-03-03 18:41:22 -05:00
Tyler Goodlet 749c43b034 Draft some eg collapsing helpers
Inside a new `.trionics._beg` and exposed from the subpkg ns in
anticipation of the `strict_exception_groups=False` being removed by
`trio` in py 3.15.

Notes,
- mk an embedded single-exc "extractor" using a `BaseExceptionGroup.exceptions` length
  check: when it's 1, return the lone child.
- use the above in a new `@acm` (async bc it's most likely to be composed in an
  `async with` tuple-style sequence block) called `collapse_eg()` which
  acts as a one-line "absorber" for when the above-mentioned flag is no
  longer supported by `trio.open_nursery()`.

All untested atm fwiw.. but soon to be used in our test suite(s) likely!
2025-03-03 18:35:44 -05:00
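
As a concrete reference, a sketch of what these helpers plausibly look like (names from the commit, bodies assumed):

```python
# assumed impls of the single-exc "extractor" + `collapse_eg()` absorber.
from contextlib import asynccontextmanager as acm


def maybe_collapse_eg(
    beg: BaseExceptionGroup,
) -> BaseException:
    # when the group wraps exactly 1 child, deliver just that child
    if len(beg.exceptions) == 1:
        return beg.exceptions[0]
    return beg


@acm
async def collapse_eg():
    try:
        yield
    except BaseExceptionGroup as beg:
        if (exc := maybe_collapse_eg(beg)) is not beg:
            raise exc from beg
        raise
```

The intended composition being something like `async with collapse_eg(), trio.open_nursery() as tn:` once the strict-eg flag is gone.
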
Tyler Goodlet ad56d10c51 Fix docs tests with yet another loosie-goosie
So the KBI propagates up to the actor nursery scope and also avoid
running any `examples/multihost/` subdir scripts.
2025-03-03 17:55:07 -05:00
Tyler Goodlet 375c00da6e Another couple loose-ifies for discovery and advanced fault suites 2025-03-03 13:57:54 -05:00
Tyler Goodlet 0631a5b30b Add (masked) meta-debug-fixture for determining if `debug_mode` is set in harness.. 2025-03-03 12:32:25 -05:00
Tyler Goodlet eb910bb83f Various test tweaks related to 3.13 egs
Including changes like,
- loose eg flagging in various test-embedded `trio.open_nursery()`s.
- changes to eg handling (like using `except*`).
- added `debug_mode` integration to tests that needed some REPLin
  in order to figure out appropriate updates.
2025-03-03 12:31:29 -05:00
Tyler Goodlet 4721c81d23 Go to loose egs in `Actor` root & service nurseries (for now..) 2025-03-03 12:20:33 -05:00
Tyler Goodlet 591b4640e1 Fix `roundtripped` ref error in `validate_payload_msg()` 2025-03-03 12:19:11 -05:00
Tyler Goodlet b44482b1c1 Hide `open_nursery()` frame by def 2025-03-03 12:18:10 -05:00
Tyler Goodlet 03f8864fd4 Moar sclang log fmting tweaks 2025-03-03 12:17:51 -05:00
Tyler Goodlet 7fac170f8d Add equiv of `AsyncioCancelled` for aio side
Such that a `TrioCancelled` is raised in the aio task via
`.set_exception()` to explicitly indicate and allow that task to handle
a taskc request from the parent `trio.Task`.
2025-03-03 12:13:25 -05:00
Tyler Goodlet 44281268a8 More `debug_mode` test support, better nursery var names 2025-03-03 12:11:50 -05:00
Tyler Goodlet 064c5ce034 Add per-side graceful-exit/cancel excs-as-signals
Such that any combination of task terminations/exits can be explicitly
handled and "dual side independent" crash cases re-raised in egs.

The main error-or-exit impl changes include,

- use of new per-side "signaling exceptions":
  - TrioTaskExited|TrioCancelled for signalling aio.
  - AsyncioTaskExited|AsyncioCancelled for signalling trio.

- NOT overloading the `LinkedTaskChannel._trio/aio_err` fields for
  err-as-signal relay and instead add a new pair of
  `._trio/aio_to_raise` maybe-exc-attrs which allow each side's
  task to specify what it would want the other side to raise to signal
  its/a termination outcome:
  - `._trio_to_raise: AsyncioTaskExited|AsyncioCancelled` to signal,
    |_ the aio task having returned while the trio side was still reading
       from the `asyncio.Queue` or is just not `.done()`.
    |_ the aio task being self- or trio-request cancelled where
       an `asyncio.CancelledError` is raised and caught but NOT relayed
       as-is back to trio; instead signal a "more explicit" exc type.
  - `._aio_to_raise: TrioTaskExited|TrioCancelled` to signal,
    |_ the trio task having returned while the aio side was still reading
       from the mem chan and indicating that the trio side might not
       care any more about future streamed values (like the
       `Stop/EndOfChannel` equivs for ipc `Context`s).
    |_ when the trio task is cancelled we do
        an `asyncio.Future.set_exception(TrioTaskExited())` to indicate
        verbosely to the aio side that it should cancel due to the trio
        parent.
  - `_aio/trio_err` are now left to only capturing the **actual**
    per-side task excs for introspection / other side's handling logic.

- supporting "graceful exits" depending on API in use from
  `translate_aio_errors()` such that if either side exits but the other
  side isn't expected to consume the final `return`ed value, we just exit
  silently, which required:
  - adding a `suppress_graceful_exits: bool` flag.
  - adjusting the `maybe_raise_aio_side_err()` logic to use that flag
    and suppress only on certain combos of `._trio_to_raise/._trio_err`.
  - prefer to raise `._trio_to_raise` when the aio-side is the src and
    vice versa.

- filling out pedantic logging for cancellation cases indicating which
  side is the cause.

- add a `LinkedTaskChannel._aio_result` modelled after our
  `Context._result` and a similar `.wait_for_result()` interface which
  allows maybe accessing the aio task's final return value if desired
  when using the `open_channel_from()` API.

- rename `cancel_trio()` done handler -> `signal_trio_when_done()`

Also some fairly major test suite updates,
- add a `delay: int` producing fixture which delivers a much larger
  timeout whenever `debug_mode` is set so that the REPL can be used
  without a surrounding cancel firing.
- add a new `test_aio_exits_early_relays_AsyncioTaskExited` including
  a paired `exit_early: bool` flag to `push_from_aio_task()`.
- adjust `test_trio_closes_early_causes_aio_checkpoint_raise` to expect
  a `to_asyncio.TrioTaskExited`.
2025-03-03 12:00:14 -05:00
Tyler Goodlet 6637473b54 Expose `._state.debug_mode()` predicate at top level 2025-03-03 11:16:47 -05:00
Tyler Goodlet 71562e0af7 Another loose-egs flag in `test_child_manages_service_nursery` 2025-02-26 18:37:43 -05:00
Tyler Goodlet 61e9c7d4db Handle egs on failed `request_root_stdio_lock()`
Namely when the subactor fails to lock the root, in which case we
try to be very verbose about how/what failed in logging as well
as ensure we cancel the employed IPC ctx.

Implement the outer `BaseException` handler to handle both styles,
- match on an eg (or the prior std cancel excs), only raising a lone
  sub-exc for the former.
- always `as _req_err:` and assign to a new func-global `req_err`
  to enable the above matching.

Other,
- raise `DebugStateError` on `status.subactor_uid != actor_uid`.
- fix a `_repl_fail_report` ref error due to making silly assumptions
  about the `_repl_fail_msg` global; now copy from global as default.
- various log-fmt and logic expression styling tweaks.
- ignore `trio.Cancelled` by default in `open_crash_handler()`.
2025-02-26 18:21:19 -05:00
Tyler Goodlet 8e7e2ab458 A couple more loose-egs flag flips
Namely inside,
- `ActorNursery.open_portal()` which uses
  `.trionics.maybe_open_nursery()` and is now adjusted to
  pass-through `**kwargs` for at least this flag.
- inside the `.trionics.gather_contexts()`.
2025-02-26 18:06:06 -05:00
Tyler Goodlet cd1761e44d Disable tb colors in `._testing.mk_cmd()`
Unset the appropriate cpython osenv var so that our `pexpect`-driven
script runs in the test suite maintain the original matching logic.
2025-02-26 15:35:00 -05:00
Tyler Goodlet b12fff99c3 Log format tweaks for sclang reprs
A space here, a newline there..
2025-02-26 15:35:00 -05:00
Tyler Goodlet 89e7a807f4 Expose `hide_tb: bool` from `.open_nursery()`
Such that it gets passed through to `.open_root_actor()` in the
`implicit_runtime==True` case - useful for debugging cases where
`.devx._debug` APIs might be used to avoid REPL clobbering in subactors.
2025-02-26 15:35:00 -05:00
Tyler Goodlet 44f19be7cb Another `is` fix.. 2025-02-26 15:35:00 -05:00
Tyler Goodlet 5e7e6b4085 Unset `$PYTHON_COLORS` for test debugger suite..
Since obvi all our `pexpect` patterns aren't going to match with
a heck-ton of terminal color escape sequences in the output XD
2025-02-26 15:35:00 -05:00
Tyler Goodlet d43019f19c Flip to `strict_exception_groups=False` in core tns
Since it'll likely need a bit of detailing to get the test suite running
identically with strict egs (exception groups), i've opted to just flip
the switch on a few core nursery scopes for now until such a time as
i can focus enough to port the matching internals.. Xp
2025-02-26 15:35:00 -05:00
Tyler Goodlet cfd23788f1 Clean up some imports in `._clustering` 2025-02-26 15:34:59 -05:00
Tyler Goodlet c0774628bf Drop `asyncio`-canc error from `._exceptions` 2025-02-26 15:34:59 -05:00
Tyler Goodlet a44a6dca06 Tweak some test asserts to better `is` style 2025-02-26 15:34:59 -05:00
Tyler Goodlet 4fdae0fcdf Bump various (dev) deps and prefer sys python
Since it turns out there's a few gotchas moving to python 3.13,
- we need to pin to new(er) `trio` which now flips to strict exception
  groups (something to be handled in a follow up patch).
- since we're now using `uv` we should (at least for now) prefer the
  system `python` (over astral's distros) since they compile for
  `libedit` in terms of what the (new) `readline.backend: str` will read
  as; this will break our tab-completion and vi-mode settings in
  the `pdbp` REPL without a user configuring a `~/.editrc`
  appropriately.
- go back to using latest `pdbp` (not a local dev version) since it
  should work fine presuming the previous bullet is addressed.

Lock bumps,
- for now use latest `trio==0.29.0` (which i gotta feeling might have
  broken some existing attempts at strict-eg handling i've tried..)
- update to latest `xonsh`, `pdbp` and its dep `tabcompleter`

Other cleaning,
- put back in various deps "comments" from `poetry` content.
- drop the `xonsh-vox` and `xontrib-vox` dev deps; no `vox` support with
  `uv` rn anyway..
2025-02-26 15:34:59 -05:00
Tyler Goodlet ee88e3c1e8 Draft test-doc for "out-of-band" `asyncio.Task`..
Since there's no way to activate `greenback`'s portal in such cases, we
should at least have a test verifying our very loud error about the
inability to support this usage..
2025-02-26 15:33:55 -05:00
Tyler Goodlet b348c58238 Raise "independent" task errors in an eg
The (rare) condition is heavily detailed in new comments in
the `cancel_trio()` callback but, more or less the idea here is to be
extra pedantic in raising an `ExceptionGroup` of errors from each task
(both `asyncio` and `trio`) whenever the 2 tasks raise "independently"
- in the sense that it's not obviously one side's task causing an error
(or cancellation) in the other. In this case we set the error for each
side on the `LinkedTaskChannel` (via new attrs described later).

As a synopsis, most of this work was refined out of supporting
`infected_aio=True` mode in the **root actor** and in particular as part
of getting that to work inside the `modden` daemon which at the time of
writing was still using the `i3ipc` lib and thus `asyncio`.

Impl deats,
- extend the `LinkedTaskChannel` field/API set (and type it),
  - `._trio_task: trio.Task` for test/user introspection.
- also "stage" some ideas for a more refined interface,
  - `.started()` to deliver the value yielded to the `trio.Task` parent.
   |_ also includes some todos for how to implement this design
      underneath.
  - `._aio_first: Any|None = None` to hold that value ^.
  - `.wait_aio_complete()` for syncing to the asyncio task.
- some detailed logging around "asyncio cancelled trio" case.
- Move `AsyncioCancelled` into this module.

Styling changes,
- generally more explicit var naming.
- some todos for getting modern and fancy with typing..

NB, Let it be known this commit msg was written on a friday with the
help of various "mr. white" solns.
2025-02-26 15:33:55 -05:00
Tyler Goodlet ce537bc010 Handle cpython builds with `libedit` for `readline`
Since `uv`'s cpython distributions are built this way, `pdbp`'s tab
completion was breaking (as was vi-mode). This adds a new
`.devx._enable_readline_feats()` import hook which checks for the
appropriate library and applies settings accordingly.
2025-02-25 11:08:04 -05:00
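
The check presumably boils down to something like the following (the hook name is from the commit, the body is an assumption):

```python
# assumed body for a `.devx._enable_readline_feats()`-style detection.
import readline

if (
    # py3.13+ exposes the backing C lib directly
    getattr(readline, 'backend', None) == 'editline'
    # older cpythons note libedit in the module doc string
    or 'libedit' in (readline.__doc__ or '')
):
    # libedit wants its own bind syntax for tab-completion
    readline.parse_and_bind('bind ^I rl_complete')
else:
    readline.parse_and_bind('tab: complete')
```
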
Tyler Goodlet c103a5c136 Add in some dev deps for @goodboy
Namely since i use `xonsh` for a main shell, this includes adding it as
well as related tooling. Obvi bump the `uv.lock`.

Some other stuff retained from `poetry` days,
- add usage-comments around various (optional) deps.
- add toml section separator lines.
- go with 2-space indent.
- add comment on `trio>0.27` needed for py3.13+
2025-02-24 12:45:47 -05:00
Tyler Goodlet 3febc61e62 Disable invalid line in `ruff` config? 2025-02-24 12:15:47 -05:00
Tyler Goodlet f2758a62d9 Add a `ruff.toml` with ignore set taken from old `pyproject.toml` content 2025-02-14 13:25:04 -05:00
Guillermo Rodriguez a163769ea6
Migrate to uv using "uvx migrate-to-uv", use msgspec from git due to python 3.13 compat 2025-01-22 14:48:00 -03:00
Tyler Goodlet bd3668e2bf Add a `tests/test_root_infect_asyncio`
Might as well break apart the specific test set since there are some
(minor) subtleties and the orig test mod is already getting pretty big
XD

Includes both the new "independent"-event-loops test as well as the std
usage base case suite.
2025-01-10 18:02:06 -05:00
Tyler Goodlet d74dbab1be Impl a proto "unmasker" `@acm` alongside our test
Such that the suite verifies the wip `maybe_raise_from_masking_exc()`
will raise from a `trio.Cancelled.__context__` since I can't think of
any reason a `Cancelled` should ever be raised in-place of
a non-`Cancelled` XD

Not sure what should be raised instead (or maybe just a `log.warning()`
emitted?) but this starts a draft for refinement at the least. Use the
new `@pytest.mark.parametrize` explicit tuple-of-params form with a
`pytest.param` + `.mark.xfail()` for the default-behaviour case.
2025-01-10 17:29:11 -05:00
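
The wip unmasker presumably reduces to something like this (name and default from the commit, body assumed):

```python
# assumed sketch of `maybe_raise_from_masking_exc()`.
from contextlib import asynccontextmanager as acm

import trio


@acm
async def maybe_raise_from_masking_exc(
    unmask_from: type[BaseException] = trio.Cancelled,
):
    try:
        yield
    except unmask_from as masker_exc:
        # a non-cancel `.__context__` generally means some original
        # error was suppressed in-place by a checkpoint's `Cancelled`
        masked: BaseException | None = masker_exc.__context__
        if (
            masked is not None
            and not isinstance(masked, unmask_from)
        ):
            raise masked from masker_exc

        raise
```
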
Tyler Goodlet 9be457fcf3 Add a "raise-from-`finally:`" example test
Since i wasted 2 days just to find an example of this inside an `@acm`,
figured I better reproduce for the purposes of maybe implementing
a warning sys (inside our wip proto `open_taskman()`) when a nursery
detects a single `Cancelled` in an eg where the `.__context__` is set to
some non-cancel error (which likely means a cancel-causing source
exception was suppressed by accident).

Left in a buncha commented code using `maybe_open_nursery()` which
i thought might be part of the issue but didn't end up being required;
will likely remove on a follow up refinement.
2025-01-10 15:46:00 -05:00
Tyler Goodlet e6f3f187b6 Yield a boxed-maybe-error from `open_crash_handler()`
Along the lines of something like `pytest.raises()` where the handled
exception can be inspected from the `pdbp` REPL using its `.value` field
B)

This is super handy in particular for understanding
`BaseException[Group]`s without manually adding surrounding handler code
to assign the `except[*] Exception as exc_var:` particularly when trying
to understand multi-cancelled eg trees.
2025-01-10 12:42:23 -05:00
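
The pattern in miniature, as a hedged stand-in (tractor's real `open_crash_handler()` drops into the `pdbp` REPL and has richer options):

```python
# minimal sketch of a crash handler yielding a boxed-maybe-error.
from contextlib import contextmanager
from dataclasses import dataclass
import pdb


@dataclass
class BoxedMaybeException:
    value: BaseException | None = None


@contextmanager
def open_crash_handler(
    catch: tuple[type[BaseException], ...] = (Exception,),
):
    boxed = BoxedMaybeException()
    try:
        yield boxed
    except catch as err:
        # box it for post-hoc inspection a la `pytest.raises()`
        boxed.value = err
        pdb.post_mortem(err.__traceback__)
        raise
```
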
Tyler Goodlet 924eff2985 Add an inter-leaved-task error test
Trying to replicate cases where errors are raised in both `trio` and
`asyncio` tasks independently (at least in `.to_asyncio` API terms) with
a new `test_trio_prestarted_task_bubbles` that generates 3 cases inside
an `@acm` call stack composing a `trio.Nursery` with
a `to_asyncio.open_channel_from()` call where a set of `trio` tasks are
started in a loop using `.start()` with various exc raising sequences,
- the aio task raising *before* the last `trio` task spawns.
- the aio task raising just after the last trio task spawns, but before
  it starts.
- after the last trio task `.start()` call returns control to the
  parent - but (for now) did not error.

TODO, still more cases to discover as i'm still fighting a `modden` bug
of this sort atm..

Other,
- tweak some other tests to have timeouts since some recent hangs were
  found..
- started mucking with py3.13 and thus adjustments for strict egs in
  some tests; full patchset to test suite likely coming soon!
2025-01-09 08:59:30 -05:00
Tyler Goodlet 89fc072ca0 Hm, `asyncio.Task._fut_waiter.set_exception()`?
Since we can't use `Task.set_exception()` (that task method never
seems to work.. XD) and setting the private/internal `._fut_waiter`
always seems to do the desired raising in the task? I realize it's an
internal `asyncio` runtime field but i'd rather take the risk of it
breaking than having to rely on our own equivalent hack..

Also, it seems like in the case where the task's associated (and internal)
future-waiter field is null, we won't run into the (same?) prior hanging
issues (maybe since there's nothing for `asyncio` internals to use to
wait XD ??) when `Task.cancel()` is used..??

Main deats,
- add a new signal-exception `class TrioTaskExited(AsyncioCancelled):`
  and `Future.set_exception()` it whenever the trio-task exits
  gracefully while the asyncio-side task is still doing blocking work (of
  some sort) which *seems to* be predicated by a check that
  `._fut_waiter is not None`.
- always call `asyncio.Queue.shutdown()` for the same^ as well as
  whenever we decide to call `Task.cancel()`; in that case the shutdown
  relays correctly?

Some further refinements,
- only warn about `Task.cancel()` usage when actually used ;)
- more local scope vars setting in the exit phase of
  `translate_aio_errors()`.
- also in ^ use explicit caught-exc var names for each error-type.
2025-01-02 15:35:36 -05:00
Tyler Goodlet 7b8a8dcc7c Much more limited `asyncio.Task.cancel()` use
Since it can not only cause the guest-mode run to abandon but also in
some edge cases prevent `trio`-errors from propagating (at least on
py3.12-13?) as discovered as part of supporting this mode officially
in the *root actor*.

As such try to avoid that method as much as possible instead opting to
pass the `trio`-side error via the iter-task channel ref.

Deats,
- add a `LinkedTaskChannel._trio_err: BaseException|None` which gets set
  whenever the `trio.Task` error is caught; ONLY set `AsyncioCancelled`
  when the `trio` task was for sure the cause, whether itself cancelled
  or errored.
- always check for this error when exiting the `asyncio` side (even when
  terminated via a call to `asyncio.Task.cancel()` or during any other
  `CancelledError` handling) such that the `asyncio`-task can expect to
  handle `AsyncioCancelled` due to the above^^ cases.
- only `cs.cancel()` the `trio` side when that cancel scope has not yet
  been `.cancel_called`; it's a noop otherwise anyway.
- only raise any exc from `asyncio.Task.result()` when `chan._aio_err`
  does not already match it since the existence of the pre-existing
  `task_err` means `asyncio` prolly intends (or has already) raised and
  interrupted the task elsewhere.

Various supporting tweaks,
- don't bother maybe-init-ing `greenback` from the actor entrypoint
  since we already need to (and do) bestow the portals to each `asyncio`
  task spawned using the `run_task()`/`open_channel_from()` API; further
  the init-ing should be done already by client code that enables
  infected mode (even in the root actor).
 |_we should prolly also codify it from any
   `run_daemon(infected_aio=True, debug_mode=True)` usage we offer.
- pass all the `_<field>`s to `LinkedTaskChannel` explicitly in named
  kwarg style.
- better sclang-style log reports throughout, particularly on teardowns.
- generally more/better comments and docs around (not well understood)
  edge cases.
- prep to just inline `maybe_raise_aio_side_err()` closure..
2024-12-31 18:10:09 -05:00
Tyler Goodlet c63b94f61f Expose `debug_filter` from `open_root_actor()` also
Such that actor-runtime graceful cancel handling can be used throughout
any process tree.
2024-12-28 14:35:05 -05:00
Tyler Goodlet 0e39b3902f Drop extra nl from boxed error fmt 2024-12-28 14:34:24 -05:00
Tyler Goodlet bf9689e10a Raise explicitly on missing `greenback` portal
When `.pause_from_sync()` is called from an `asyncio.Task` which was
never bestowed a portal we want to be mega pedantic about it; indicate
that the task was NOT spawned from our `.to_asyncio` API and likely by
some out-of-our-control code (normally using
`asyncio.ensure_future()/.create_task()`). Though `greenback` already
errors on such usage, it's not always clear why no portal exists;
explaining the situation of a 3rd-party-bg-spawned-task should avoid
dev confusion for most cases.

Impl deats,
- distinguish between an actor in infected mode versus the actual caller
  of `.pause_from_sync()` being an `asyncio.Task` with more explicit
  `asyncio_task` and `is_infected_aio` vars.
- ONLY in the case of being both an infected-mode-actor AND detecting
  that the caller is an `asyncio.Task`, check `greenback.has_portal()`
  such that when not bestowed we presume the aforementioned
  3rd-party-bg-task case above and raise a new explicit RTE with
  a detailed explanatory message.
- add some masked draft code for handling the special case of a root
  actor `asyncio.Task` caller which could (in theory) not actually
  require gb portal use since the `Lock` can be acquired directly
  without IPC.
 |_this will likely require factoring of various pause machinery funcs
   into a `_pause_from_root_task()` to mk the impl sane XD

Other,
- expose a new `debug_filter: Callable` which can be provided by the
  caller of `_maybe_enter_pm()` to predicate whether to enter the
  debugger REPL based on the caught `BaseException|BaseExceptionGroup`;
  this is handy for customizing the meaning of "graceful cancellations"
  so as to avoid crash handling on expected egs of more than
  `trio.Cancelled`.
|_ make the default as it was implemented: `not is_multi_cancelled(err)`
- pass-through a new `ignore: set[BaseException]` as
  `open_crash_handler(ignore_nested=ignore)` to allow for the same
  silent-cancellation-egs-swallowing as desired from outside the actor
  runtime.
2024-12-28 14:07:01 -05:00
Tyler Goodlet 350a94f39e Accept err-type override in `is_multi_cancelled()`
Such that equivalents of `trio.Cancelled` from other runtimes such as
`asyncio.CancelledError` and `subprocess.CalledProcessError` (with
a `.returncode == -2`) can be gracefully ignored as needed by the
caller.

For example this is handy if you want to avoid debug-mode REPL entry on
an exception-group full of only some subset of exception types since you
expect certain tasks to raise such errors after having been cancelled by
a request from some parent supervision sys (some "higher up"
`trio.CancelScope`, a remote triggered `ContextCancelled` or just from
an OS SIGINT).

Impl deats,
- offer a new `ignore_nested: set[BaseException]` param which by
  default we add `trio.Cancelled` to when no other types are provided.
- use `ExceptionGroup.subgroup(tuple(ignore_nested))` to filter to egs of
  the "ignored sub-errors set" and return any such match (instead of
  `True`).
- detail a comment on exclusion case.
2024-12-27 14:07:50 -05:00
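
Putting the described pieces together, the predicate plausibly looks like this (signature assumed beyond what the commit states):

```python
# assumed sketch of the extended `is_multi_cancelled()`.
import trio


def is_multi_cancelled(
    exc: BaseException,
    ignore_nested: set[type[BaseException]] | None = None,
) -> bool | BaseExceptionGroup:
    # default to ignoring only the std `trio` cancel exc
    ignore_nested = ignore_nested or {trio.Cancelled}

    if isinstance(exc, BaseExceptionGroup):
        # `.subgroup()` returns the matching sub-eg (or `None`) which
        # we deliver as the "match" instead of a bare `True`
        return exc.subgroup(tuple(ignore_nested)) or False

    return False
```
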
Tyler Goodlet 0945631629 Support passing pre-conf-ed `Logger`
Such that we can hook into 3rd-party-libs more easily to monkey them and
use our (prettier/hipper) console logging with something like (an
example from the client project `modden`),

```python
    connection_mod = i3ipc.connection
    tractor_style_i3ipc_logger: logging.LoggerAdapter = tractor.log.get_console_log(
        _root_name=connection_mod.__name__,
        logger=connection_mod.logger,
        level='info',
    )
    # monkey the instance-ref in the 3rd-party module
    connection_mod.logger = tractor_style_i3ipc_logger
```

Impl deats,
- expose as `get_console_log(logger: logging.Logger)` and add default
  failover logic.
- toss in more typing, also for mod-global instance.
2024-12-18 12:30:17 -05:00
Tyler Goodlet 0a0d30d108 Support and test infected-`asyncio`-mode for root
Such that you can use,

```python

    tractor.to_asyncio.run_as_asyncio_guest(
        trio_main=_trio_main,
    )
```

to bootstrap the root actor (and thus main parent process) to embed
the actor-runtime into an `asyncio` loop. Prove it all works with a
subactor-free version of the aio echo-server test suite B)
2024-12-11 22:23:17 -05:00
Tyler Goodlet dcb6706489 Support `ctx: UnionType` annots for `@tractor.context` eps 2024-12-11 22:22:26 -05:00
Tyler Goodlet 170e198683 Use shorthand nursery var-names per convention in codebase 2024-12-11 20:26:13 -05:00
Tyler Goodlet 840c328f19 Better separate service tasks vs. ctxs via methods
Namely splitting the handles for each in 2 separate tables and adding
a `.cancel_service_task()`.

Also,
- move `_open_and_supervise_service_ctx()` to mod level.
- rename `target` -> `ctx_fn` params throughout.
- fill out method doc strings.
2024-12-11 14:24:49 -05:00
Tyler Goodlet 46dbe6d2fc Mv over `ServiceMngr` from `piker` with mods
Namely distinguishing service "IPC contexts" (opened in a
subactor via a `Portal`) from just local `trio.Task`s started
and managed under the `.service_n` (more or less wrapped in the
interface of a "task-manager" style nursery - aka a one-cancels-one
supervision strategy).

API changes from original (`piker`) impl,
- mk `.start_service_task()` do ONLY that, start a task with a wrapping
  cancel-scope and completion event.
  |_ ideally this gets factored-out/re-implemented using the
    task-manager/OCO-style-nursery from GH #363.
- change what was the impl of `.start_service_task()` to `.start_service_ctx()`
  since it more explicitly defines the functionality of entering
  `Portal.open_context()` with a wrapping cs and completion event inside
  a bg task (which syncs the ctx's lifetime with termination of the
  remote actor runtime).
- factor out what was a `.start_service_ctx()` closure to a new
  `_open_and_supervise_service_ctx()` mod-func holding the meat of
  the supervision logic.

`ServiceMngr` API brief,
- use `open_service_mngr()` and `get_service_mngr()` to acquire the
  actor-global singleton.
- `ServiceMngr.start_service()` and `.cancel_service()` which allow for
  straightforward mgmt of "service subactor daemons".
2024-12-11 12:38:35 -05:00
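
The core "wrapping cancel-scope + completion event" mechanics that `.start_service_task()` builds on can be sketched in plain `trio`; this illustrates the pattern only, not the `ServiceMngr` impl:

```python
# OCO-style service task: cancellable alone, syncable on completion.
import trio


async def start_service_task(
    tn: trio.Nursery,
    fn,
    *args,
) -> tuple[trio.CancelScope, trio.Event]:
    cs = trio.CancelScope()
    complete = trio.Event()

    async def _runner(
        task_status=trio.TASK_STATUS_IGNORED,
    ):
        with cs:
            task_status.started()
            try:
                await fn(*args)
            finally:
                complete.set()

    await tn.start(_runner)
    # the caller may `cs.cancel()` to stop JUST this service task and/or
    # `await complete.wait()` to sync on its termination.
    return cs, complete
```
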
Tyler Goodlet f08e888138 Initial idea-notes dump and @singleton factory idea from `trio`-gitter 2024-12-10 14:44:09 -05:00
60 changed files with 5240 additions and 1475 deletions

View File

@@ -62,7 +62,9 @@ async def recv_and_spawn_net_killers(
     await ctx.started()
     async with (
         ctx.open_stream() as stream,
-        trio.open_nursery() as n,
+        trio.open_nursery(
+            strict_exception_groups=False,
+        ) as tn,
     ):
         async for i in stream:
             print(f'child echoing {i}')
@@ -77,11 +79,11 @@ async def recv_and_spawn_net_killers(
                 i >= break_ipc_after
             ):
                 broke_ipc = True
-                n.start_soon(
+                tn.start_soon(
                     iter_ipc_stream,
                     stream,
                 )
-                n.start_soon(
+                tn.start_soon(
                     partial(
                         break_ipc_then_error,
                         stream=stream,

View File

@@ -25,7 +25,7 @@ async def bp_then_error(
 ) -> None:
 
-    # sync with ``trio``-side (caller) task
+    # sync with `trio`-side (caller) task
     to_trio.send_nowait('start')
 
     # NOTE: what happens here inside the hook needs some refinement..
@@ -33,8 +33,7 @@ async def bp_then_error(
     # we set `Lock.local_task_in_debug = 'sync'`, we probably want
     # some further, at least, meta-data about the task/actor in debug
     # in terms of making it clear it's `asyncio` mucking about.
-    breakpoint()
-
+    breakpoint()  # asyncio-side
 
     # short checkpoint / delay
     await asyncio.sleep(0.5)  # asyncio-side
@@ -58,7 +57,6 @@ async def trio_ctx(
     # this will block until the ``asyncio`` task sends a "first"
     # message, see first line in above func.
     async with (
-
         to_asyncio.open_channel_from(
             bp_then_error,
             # raise_after_bp=not bp_before_started,
@@ -69,7 +67,7 @@ async def trio_ctx(
         assert first == 'start'
 
         if bp_before_started:
-            await tractor.pause()
+            await tractor.pause()  # trio-side
 
         await ctx.started(first)  # trio-side
@@ -111,7 +109,7 @@ async def main(
 
         # pause in parent to ensure no cross-actor
         # locking problems exist!
-        await tractor.pause()
+        await tractor.pause()  # trio-root
 
         if cancel_from_root:
             await ctx.cancel()

View File

@@ -21,11 +21,13 @@ async def name_error():
 async def main():
-    """Test breakpoint in a streaming actor.
-    """
+    '''
+    Test breakpoint in a streaming actor.
+    '''
     async with tractor.open_nursery(
         debug_mode=True,
-        # loglevel='cancel',
+        loglevel='cancel',
         # loglevel='devx',
     ) as n:

View File

@@ -40,7 +40,7 @@ async def main():
     """
     async with tractor.open_nursery(
         debug_mode=True,
-        # loglevel='cancel',
+        loglevel='devx',
     ) as n:
 
         # spawn both actors

View File

@@ -39,7 +39,6 @@ async def main(
         loglevel='devx',
     ) as an,
 ):
-
     ptl: tractor.Portal = await an.start_actor(
         'hanger',
         enable_modules=[__name__],
@@ -54,13 +53,16 @@ async def main(
     print(
         'Yo my child hanging..?\n'
-        'Sending SIGUSR1 to see a tree-trace!\n'
+        # "i'm a user who wants to see a `stackscope` tree!\n"
     )
 
     # XXX simulate the wrapping test's "user actions"
     # (i.e. if a human didn't run this manually but wants to
     # know what they should do to reproduce test behaviour)
     if from_test:
+        print(
+            f'Sending SIGUSR1 to {cpid!r}!\n'
+        )
         os.kill(
             cpid,
             signal.SIGUSR1,

View File

@@ -91,7 +91,7 @@ async def main() -> list[int]:
     an: ActorNursery
     async with tractor.open_nursery(
         loglevel='cancel',
-        debug_mode=True,
+        # debug_mode=True,
     ) as an:
 
         seed = int(1e3)

View File

@@ -3,20 +3,18 @@ import trio
 import tractor
 
 
-async def sleepy_jane():
-    uid = tractor.current_actor().uid
+async def sleepy_jane() -> None:
+    uid: tuple = tractor.current_actor().uid
     print(f'Yo i am actor {uid}')
     await trio.sleep_forever()
 
 
 async def main():
     '''
-    Spawn a flat actor cluster, with one process per
-    detected core.
+    Spawn a flat actor cluster, with one process per detected core.
 
     '''
     portal_map: dict[str, tractor.Portal]
-    results: dict[str, str]
 
     # look at this hip new syntax!
     async with (
@@ -25,11 +23,16 @@ async def main():
             modules=[__name__]
         ) as portal_map,
-        trio.open_nursery() as n,
+        trio.open_nursery(
+            strict_exception_groups=False,
+        ) as tn,
     ):
 
         for (name, portal) in portal_map.items():
-            n.start_soon(portal.run, sleepy_jane)
+            tn.start_soon(
+                portal.run,
+                sleepy_jane,
+            )
 
         await trio.sleep(0.5)
@@ -41,4 +44,4 @@ if __name__ == '__main__':
     try:
         trio.run(main)
     except KeyboardInterrupt:
-        pass
+        print('trio cancelled by KBI')

View File

@@ -1,71 +1,83 @@
 [build-system]
-requires = ["poetry-core"]
-build-backend = "poetry.core.masonry.api"
+requires = ["hatchling"]
+build-backend = "hatchling.build"
 
-# ------ - ------
+# ------ build-system ------
 
-[tool.poetry]
+[project]
 name = "tractor"
 version = "0.1.0a6dev0"
-description='structured concurrent `trio`-"actors"'
-authors = ["Tyler Goodlet <goodboy_foss@protonmail.com>"]
-license = "AGPlv3"
+description = 'structured concurrent `trio`-"actors"'
+authors = [{ name = "Tyler Goodlet", email = "goodboy_foss@protonmail.com" }]
+requires-python = ">= 3.11"
 readme = "docs/README.rst"
-
-# TODO: do we need this xontrib loader at all given pep420
-# and xonsh's xontrib global-autoload-via-setuptools?
-# https://xon.sh/tutorial_xontrib.html#authoring-xontribs
-packages = [
-    {include = 'tractor' },
-    # {include = 'tractor.experimental' },
-    # {include = 'tractor.trionics' },
-    # {include = 'tractor.msg' },
-    # {include = 'tractor.devx' },
+license = "AGPL-3.0-or-later"
+keywords = [
+  "trio",
+  "async",
+  "concurrency",
+  "structured concurrency",
+  "actor model",
+  "distributed",
+  "multiprocessing",
+]
+classifiers = [
+  "Development Status :: 3 - Alpha",
+  "Operating System :: POSIX :: Linux",
+  "Framework :: Trio",
+  "License :: OSI Approved :: GNU Affero General Public License v3 or later (AGPLv3+)",
+  "Programming Language :: Python :: Implementation :: CPython",
+  "Programming Language :: Python :: 3 :: Only",
+  "Programming Language :: Python :: 3.11",
+  "Topic :: System :: Distributed Computing",
+]
+dependencies = [
+  # trio runtime and friends
+  # (poetry) proper range specs,
+  # https://packaging.python.org/en/latest/discussions/install-requires-vs-requirements/#id5
+  # TODO, for 3.13 we must go go `0.27` which means we have to
+  # disable strict egs or port to handling them internally!
+  "trio>0.27",
+  "tricycle>=0.4.1,<0.5",
+  "trio-typing>=0.10.0,<0.11",
+  "wrapt>=1.16.0,<2",
+  "colorlog>=6.8.2,<7",
+  # built-in multi-actor `pdb` REPL
+  "pdbp>=1.6,<2", # windows only (from `pdbp`)
+  "tabcompleter>=1.4.0",
+  # typed IPC msging
+  # TODO, get back on release once 3.13 support is out!
+  "msgspec>=0.19.0",
 ]
 
-# ------ - ------
+# ------ project ------
 
-[tool.poetry.dependencies]
-python = "^3.11"
+[dependency-groups]
+dev = [
+  # test suite
+  # TODO: maybe some of these layout choices?
+  # https://docs.pytest.org/en/8.0.x/explanation/goodpractices.html#choosing-a-test-layout-import-rules
+  "pytest>=8.2.0,<9",
+  "pexpect>=4.9.0,<5",
+  # `tractor.devx` tooling
+  "greenback>=1.2.1,<2",
+  "stackscope>=0.2.2,<0.3",
+  "pyperclip>=1.9.0",
+  "prompt-toolkit>=3.0.50",
+  "xonsh>=0.19.2",
+]
 
-# trio runtime related
-# proper range spec:
-# https://packaging.python.org/en/latest/discussions/install-requires-vs-requirements/#id5
-trio='^0.24'
-tricycle = "^0.4.1"
-trio-typing = "^0.10.0"
-
-msgspec='^0.18.5'  # interchange
-wrapt = "^1.16.0"  # decorators
-colorlog = "^6.8.2"  # logging
-
-# built-in multi-actor `pdb` REPL
-pdbp = "^1.5.0"
-
-# TODO: distributed transport using
+# TODO, distributed (multi-host) extensions
 # linux kernel networking
 # 'pyroute2
 
-# ------ - ------
+[tool.hatch.build.targets.sdist]
+include = ["tractor"]
 
-[tool.poetry.group.dev]
-optional = false
-[tool.poetry.group.dev.dependencies]
-# testing
-pytest = "^8.2.0"
-pexpect = "^4.9.0"
+[tool.hatch.build.targets.wheel]
+include = ["tractor"]
 
-# .devx tooling
-greenback = "^1.2.1"
-stackscope = "^0.2.2"
+# ------ dependency-groups ------
 
-# (light) xonsh usage/integration
-xontrib-vox = "^0.0.1"
-prompt-toolkit = "^3.0.43"
-xonsh-vox-tabcomplete = "^0.5"
-
-# ------ - ------
-
 [tool.towncrier]
 package = "tractor"
@@ -76,28 +88,26 @@ title_format = "tractor {version} ({project_date})"
 template = "nooz/_template.rst"
 all_bullets = true
 
 [[tool.towncrier.type]]
 directory = "feature"
 name = "Features"
 showcontent = true
 
 [[tool.towncrier.type]]
 directory = "bugfix"
 name = "Bug Fixes"
 showcontent = true
 
 [[tool.towncrier.type]]
 directory = "doc"
 name = "Improved Documentation"
 showcontent = true
 
 [[tool.towncrier.type]]
 directory = "trivial"
 name = "Trivial/Internal Changes"
 showcontent = true
 
-# ------ - ------
-
 [tool.pytest.ini_options]
 minversion = '6.0'
 testpaths = [
@@ -113,29 +123,23 @@ addopts = [
 ]
 log_cli = false
 
-# TODO: maybe some of these layout choices?
-# https://docs.pytest.org/en/8.0.x/explanation/goodpractices.html#choosing-a-test-layout-import-rules
-# pythonpath = "src"
+# ------ tool.towncrier ------
 
-# ------ - ------
+[tool.uv]
+# XXX NOTE, prefer the sys python bc apparently the distis from
+# `astral` are built in a way that breaks `pdbp`+`tabcompleter`'s
+# likely due to linking against `libedit` over `readline`..
+# |_https://docs.astral.sh/uv/concepts/python-versions/#managed-python-distributions
+# |_https://gregoryszorc.com/docs/python-build-standalone/main/quirks.html#use-of-libedit-on-linux
+#
+# https://docs.astral.sh/uv/reference/settings/#python-preference
+python-preference = 'system'
 
-[project]
-keywords = [
-    'trio',
-    'async',
-    'concurrency',
-    'structured concurrency',
-    'actor model',
-    'distributed',
-    'multiprocessing'
-]
-classifiers = [
-    "Development Status :: 3 - Alpha",
-    "Operating System :: POSIX :: Linux",
-    "Framework :: Trio",
-    "License :: OSI Approved :: GNU Affero General Public License v3 or later (AGPLv3+)",
-    "Programming Language :: Python :: Implementation :: CPython",
-    "Programming Language :: Python :: 3 :: Only",
-    "Programming Language :: Python :: 3.11",
-    "Topic :: System :: Distributed Computing",
-]
+# ------ tool.uv ------
+
+[tool.uv.sources]
+# XXX NOTE, only for @goodboy's hacking on `pprint(sort_dicts=False)`
+# for the `pp` alias..
+# pdbp = { path = "../pdbp", editable = true }
+
+# ------ tool.uv.sources ------

ruff.toml 100644 (+82)
View File

@@ -0,0 +1,82 @@
# from default `ruff.toml` @
# https://docs.astral.sh/ruff/configuration/
# Exclude a variety of commonly ignored directories.
exclude = [
".bzr",
".direnv",
".eggs",
".git",
".git-rewrite",
".hg",
".ipynb_checkpoints",
".mypy_cache",
".nox",
".pants.d",
".pyenv",
".pytest_cache",
".pytype",
".ruff_cache",
".svn",
".tox",
".venv",
".vscode",
"__pypackages__",
"_build",
"buck-out",
"build",
"dist",
"node_modules",
"site-packages",
"venv",
]
# Same as Black.
line-length = 88
indent-width = 4
# Assume Python 3.9
target-version = "py311"
[lint]
# Enable Pyflakes (`F`) and a subset of the pycodestyle (`E`) codes by default.
# Unlike Flake8, Ruff doesn't enable pycodestyle warnings (`W`) or
# McCabe complexity (`C901`) by default.
select = ["E4", "E7", "E9", "F"]
ignore = [
'E402', # https://docs.astral.sh/ruff/rules/module-import-not-at-top-of-file/
]
# Allow fix for all enabled rules (when `--fix`) is provided.
fixable = ["ALL"]
unfixable = []
# Allow unused variables when underscore-prefixed.
# dummy-variable-rgx = "^(_+|(_+[a-zA-Z0-9_]*[a-zA-Z0-9]+?))$"
[format]
# Use single quotes in `ruff format`.
quote-style = "single"
# Like Black, indent with spaces, rather than tabs.
indent-style = "space"
# Like Black, respect magic trailing commas.
skip-magic-trailing-comma = false
# Like Black, automatically detect the appropriate line ending.
line-ending = "auto"
# Enable auto-formatting of code examples in docstrings. Markdown,
# reStructuredText code/literal blocks and doctests are all supported.
#
# This is currently disabled by default, but it is planned for this
# to be opt-out in the future.
docstring-code-format = false
# Set the line length limit used when formatting code snippets in
# docstrings.
#
# This only has an effect when the `docstring-code-format` setting is
# enabled.
docstring-code-line-length = "dynamic"

View File

@@ -75,7 +75,10 @@ def pytest_configure(config):
 
 @pytest.fixture(scope='session')
 def debug_mode(request):
-    return request.config.option.tractor_debug_mode
+    debug_mode: bool = request.config.option.tractor_debug_mode
+    # if debug_mode:
+    #     breakpoint()
+    return debug_mode
 
 
 @pytest.fixture(scope='session', autouse=True)
@@ -92,6 +95,12 @@ def spawn_backend(request) -> str:
     return request.config.option.spawn_backend
 
 
+# @pytest.fixture(scope='function', autouse=True)
+# def debug_enabled(request) -> str:
+#     from tractor import _state
+#     if _state._runtime_vars['_debug_mode']:
+#         breakpoint()
+
+
 _ci_env: bool = os.environ.get('CI', False)
@@ -150,6 +159,18 @@ def pytest_generate_tests(metafunc):
         metafunc.parametrize("start_method", [spawn_backend], scope='module')
 
 
+# TODO: a way to let test scripts (like from `examples/`)
+# guarantee they won't registry addr collide!
+# @pytest.fixture
+# def open_test_runtime(
+#     reg_addr: tuple,
+# ) -> AsyncContextManager:
+#     return partial(
+#         tractor.open_nursery,
+#         registry_addrs=[reg_addr],
+#     )
+
+
 def sig_prog(proc, sig):
     "Kill the actor-process with ``sig``."
     proc.send_signal(sig)

View File

@@ -30,7 +30,7 @@ from conftest import (
 @pytest.fixture
 def spawn(
     start_method,
-    testdir: pytest.Testdir,
+    testdir: pytest.Pytester,
     reg_addr: tuple[str, int],
 
 ) -> Callable[[str], None]:
@@ -44,16 +44,32 @@ def spawn(
             '`pexpect` based tests only supported on `trio` backend'
         )
 
+    def unset_colors():
+        '''
+        Python 3.13 introduced colored tracebacks that break patt
+        matching,
+
+        https://docs.python.org/3/using/cmdline.html#envvar-PYTHON_COLORS
+        https://docs.python.org/3/using/cmdline.html#using-on-controlling-color
+
+        '''
+        import os
+        os.environ['PYTHON_COLORS'] = '0'
+
     def _spawn(
         cmd: str,
         **mkcmd_kwargs,
     ):
+        unset_colors()
         return testdir.spawn(
             cmd=mk_cmd(
                 cmd,
                 **mkcmd_kwargs,
             ),
             expect_timeout=3,
+            # preexec_fn=unset_colors,
+            # ^TODO? get `pytest` core to expose underlying
+            # `pexpect.spawn()` stuff?
         )
 
     # such that test-dep can pass input script name.
@@ -83,6 +99,14 @@ def ctlc(
                 'https://github.com/goodboy/tractor/issues/320'
             )
 
+    if mark.name == 'ctlcs_bish':
+        pytest.skip(
+            f'Test {node} prolly uses something from the stdlib (namely `asyncio`..)\n'
+            f'The test and/or underlying example script can *sometimes* run fine '
+            f'locally but more then likely until the cpython peeps get their sh#$ together, '
+            f'this test will definitely not behave like `trio` under SIGINT..\n'
+        )
+
     if use_ctlc:
         # XXX: disable pygments highlighting for auto-tests
         # since some envs (like actions CI) will struggle

View File

@@ -309,10 +309,13 @@ def test_subactor_breakpoint(
     child.expect(EOF)
 
     assert in_prompt_msg(
-        child,
-        ['RemoteActorError:',
-         "('breakpoint_forever'",
-         'bdb.BdbQuit',]
+        child, [
+            'MessagingError:',
+            'RemoteActorError:',
+            "('breakpoint_forever'",
+            'bdb.BdbQuit',
+        ],
+        pause_on_false=True,
     )

View File

@ -6,6 +6,9 @@ All these tests can be understood (somewhat) by running the
equivalent `examples/debugging/` scripts manually. equivalent `examples/debugging/` scripts manually.
''' '''
from contextlib import (
contextmanager as cm,
)
# from functools import partial # from functools import partial
# import itertools # import itertools
import time import time
@ -15,7 +18,7 @@ import time
import pytest import pytest
from pexpect.exceptions import ( from pexpect.exceptions import (
# TIMEOUT, TIMEOUT,
EOF, EOF,
) )
@ -32,7 +35,23 @@ from .conftest import (
# _repl_fail_msg, # _repl_fail_msg,
) )
@cm
def maybe_expect_timeout(
ctlc: bool = False,
) -> None:
try:
yield
except TIMEOUT:
# breakpoint()
if ctlc:
pytest.xfail(
'Some kinda redic threading SIGINT bug i think?\n'
'See the notes in `examples/debugging/sync_bp.py`..\n'
)
raise
@pytest.mark.ctlcs_bish
def test_pause_from_sync( def test_pause_from_sync(
spawn, spawn,
ctlc: bool, ctlc: bool,
@ -67,10 +86,10 @@ def test_pause_from_sync(
child.expect(PROMPT) child.expect(PROMPT)
# XXX shouldn't see gb loaded message with PDB loglevel! # XXX shouldn't see gb loaded message with PDB loglevel!
assert not in_prompt_msg( # assert not in_prompt_msg(
child, # child,
['`greenback` portal opened!'], # ['`greenback` portal opened!'],
) # )
# should be same root task # should be same root task
assert_before( assert_before(
child, child,
@ -162,7 +181,14 @@ def test_pause_from_sync(
) )
child.sendline('c') child.sendline('c')
child.expect(EOF)
# XXX TODO, weird threading bug it seems despite the
# `abandon_on_cancel: bool` setting to
# `trio.to_thread.run_sync()`..
with maybe_expect_timeout(
ctlc=ctlc,
):
child.expect(EOF)
def expect_any_of( def expect_any_of(
@ -218,11 +244,12 @@ def expect_any_of(
) )
return expected_patts return expected_patts
# yield child
def test_pause_from_asyncio_task( @pytest.mark.ctlcs_bish
def test_sync_pause_from_aio_task(
spawn, spawn,
ctlc: bool ctlc: bool
# ^TODO, fix for `asyncio`!! # ^TODO, fix for `asyncio`!!
): ):
@ -271,10 +298,12 @@ def test_pause_from_asyncio_task(
# error raised in `asyncio.Task` # error raised in `asyncio.Task`
"raise ValueError('asyncio side error!')": [ "raise ValueError('asyncio side error!')": [
_crash_msg, _crash_msg,
'return await chan.receive()', # `.to_asyncio` impl internals in tb
"<Task 'trio_ctx'", "<Task 'trio_ctx'",
"@ ('aio_daemon'", "@ ('aio_daemon'",
"ValueError: asyncio side error!", "ValueError: asyncio side error!",
# XXX, we no longer show this frame by default!
# 'return await chan.receive()', # `.to_asyncio` impl internals in tb
], ],
# parent-side propagation via actor-nursery/portal # parent-side propagation via actor-nursery/portal
@ -326,4 +355,27 @@ def test_pause_from_asyncio_task(
) )
child.sendline('c') child.sendline('c')
# with maybe_expect_timeout():
child.expect(EOF) child.expect(EOF)
def test_sync_pause_from_non_greenbacked_aio_task():
'''
Where the `breakpoint()` caller task is NOT spawned by
`tractor.to_asyncio` and thus never activates
a `greenback.ensure_portal()` beforehand, presumably bc the task
was started by some lib/dep, as is often seen in the field.
Ensure sync pausing works when the pause is in,
- the root actor running in infected-mode?
|_ since we don't need any IPC to acquire the debug lock?
|_ is there some way to handle this like the non-main-thread case?
All other cases need to error out appropriately right?
- for any subactor we can't avoid needing the repl lock..
|_ is there a way to hook into `asyncio.ensure_future(obj)`?
'''
pass

View File

@ -15,6 +15,7 @@ TODO:
''' '''
import os import os
import signal import signal
import time
from .conftest import ( from .conftest import (
expect, expect,
@ -47,41 +48,39 @@ def test_shield_pause(
] ]
) )
script_pid: int = child.pid
print( print(
'Sending SIGUSR1 to see a tree-trace!', f'Sending SIGUSR1 to {script_pid}\n'
f'(kill -s SIGUSR1 {script_pid})\n'
) )
os.kill( os.kill(
child.pid, script_pid,
signal.SIGUSR1, signal.SIGUSR1,
) )
time.sleep(0.2)
expect( expect(
child, child,
# end-of-tree delimiter # end-of-tree delimiter
"------ \('root', ", "end-of-\('root'",
) )
assert_before( assert_before(
child, child,
[ [
'Trying to dump `stackscope` tree..', # 'Trying to dump `stackscope` tree..',
'Dumping `stackscope` tree for actor', # 'Dumping `stackscope` tree for actor',
"('root'", # uid line "('root'", # uid line
# TODO!? this used to show?
# -[ ] mk reproducable for @oremanj?
#
# parent block point (non-shielded) # parent block point (non-shielded)
'await trio.sleep_forever() # in root', # 'await trio.sleep_forever() # in root',
] ]
) )
# expect(
# child,
# # relay to the sub should be reported
# 'Relaying `SIGUSR1`[10] to sub-actor',
# )
expect( expect(
child, child,
# end-of-tree delimiter # end-of-tree delimiter
"------ \('hanger', ", "end-of-\('hanger'",
) )
assert_before( assert_before(
child, child,
@ -91,11 +90,11 @@ def test_shield_pause(
"('hanger'", # uid line "('hanger'", # uid line
# TODO!? SEE ABOVE
# hanger LOC where it's shield-halted # hanger LOC where it's shield-halted
'await trio.sleep_forever() # in subactor', # 'await trio.sleep_forever() # in subactor',
] ]
) )
# breakpoint()
# simulate the user sending a ctl-c to the hanging program. # simulate the user sending a ctl-c to the hanging program.
# this should result in the terminator kicking in since # this should result in the terminator kicking in since

View File

@ -3,7 +3,6 @@ Sketchy network blackoutz, ugly byzantine gens, puedes eschuchar la
cancelacion?.. cancelacion?..
''' '''
import itertools
from functools import partial from functools import partial
from types import ModuleType from types import ModuleType
@ -230,13 +229,10 @@ def test_ipc_channel_break_during_stream(
# get raw instance from pytest wrapper # get raw instance from pytest wrapper
value = excinfo.value value = excinfo.value
if isinstance(value, ExceptionGroup): if isinstance(value, ExceptionGroup):
value = next( excs = value.exceptions
itertools.dropwhile( assert len(excs) == 1
lambda exc: not isinstance(exc, expect_final_exc), final_exc = excs[0]
value.exceptions, assert isinstance(final_exc, expect_final_exc)
)
)
assert value
@tractor.context @tractor.context
@ -259,15 +255,16 @@ async def break_ipc_after_started(
def test_stream_closed_right_after_ipc_break_and_zombie_lord_engages(): def test_stream_closed_right_after_ipc_break_and_zombie_lord_engages():
''' '''
Verify that if a subactor's IPC goes down just after bringing up a stream Verify that if a subactor's IPC goes down just after bringing up
the parent can trigger a SIGINT and the child will be reaped out-of-IPC by a stream, the parent can trigger a SIGINT and the child will be
the localhost process supervision machinery: aka "zombie lord". reaped out-of-IPC by the localhost process supervision machinery:
aka "zombie lord".
''' '''
async def main(): async def main():
with trio.fail_after(3): with trio.fail_after(3):
async with tractor.open_nursery() as n: async with tractor.open_nursery() as an:
portal = await n.start_actor( portal = await an.start_actor(
'ipc_breaker', 'ipc_breaker',
enable_modules=[__name__], enable_modules=[__name__],
) )

View File

@ -307,7 +307,15 @@ async def inf_streamer(
async with ( async with (
ctx.open_stream() as stream, ctx.open_stream() as stream,
trio.open_nursery() as tn,
# XXX TODO, INTERESTING CASE!!
# - if we don't collapse the eg then the embedded
# `trio.EndOfChannel` doesn't propagate directly to the above
# .open_stream() parent, resulting in it also raising instead
# of gracefully absorbing as normal.. so how to handle?
trio.open_nursery(
strict_exception_groups=False,
) as tn,
): ):
async def close_stream_on_sentinel(): async def close_stream_on_sentinel():
async for msg in stream: async for msg in stream:
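A quick standalone sketch of the `trio` behavior difference that `strict_exception_groups=False` toggles (assuming a modern `trio` where strict grouping is the default):

    import trio

    async def main():
        # strict (the modern default): even a lone error is wrapped in
        # a group, so an embedded `trio.EndOfChannel` can't reach the
        # enclosing `.open_stream()` machinery directly..
        try:
            async with trio.open_nursery():
                raise trio.EndOfChannel
        except ExceptionGroup as eg:
            assert isinstance(eg.exceptions[0], trio.EndOfChannel)

        # loose: the lone error propagates bare and is absorbed as usual.
        try:
            async with trio.open_nursery(strict_exception_groups=False):
                raise trio.EndOfChannel
        except trio.EndOfChannel:
            pass

    trio.run(main)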

View File

@ -130,7 +130,7 @@ def test_multierror(
try: try:
await portal2.result() await portal2.result()
except tractor.RemoteActorError as err: except tractor.RemoteActorError as err:
assert err.boxed_type == AssertionError assert err.boxed_type is AssertionError
print("Look Maa that first actor failed hard, hehh") print("Look Maa that first actor failed hard, hehh")
raise raise
@ -182,7 +182,7 @@ def test_multierror_fast_nursery(reg_addr, start_method, num_subactors, delay):
for exc in exceptions: for exc in exceptions:
assert isinstance(exc, tractor.RemoteActorError) assert isinstance(exc, tractor.RemoteActorError)
assert exc.boxed_type == AssertionError assert exc.boxed_type is AssertionError
async def do_nothing(): async def do_nothing():
@ -504,7 +504,9 @@ def test_cancel_via_SIGINT_other_task(
if is_win(): # smh if is_win(): # smh
timeout += 1 timeout += 1
async def spawn_and_sleep_forever(task_status=trio.TASK_STATUS_IGNORED): async def spawn_and_sleep_forever(
task_status=trio.TASK_STATUS_IGNORED
):
async with tractor.open_nursery() as tn: async with tractor.open_nursery() as tn:
for i in range(3): for i in range(3):
await tn.run_in_actor( await tn.run_in_actor(
@ -517,7 +519,9 @@ def test_cancel_via_SIGINT_other_task(
async def main(): async def main():
# should never timeout since SIGINT should cancel the current program # should never timeout since SIGINT should cancel the current program
with trio.fail_after(timeout): with trio.fail_after(timeout):
async with trio.open_nursery() as n: async with trio.open_nursery(
strict_exception_groups=False,
) as n:
await n.start(spawn_and_sleep_forever) await n.start(spawn_and_sleep_forever)
if 'mp' in spawn_backend: if 'mp' in spawn_backend:
time.sleep(0.1) time.sleep(0.1)
@ -610,6 +614,12 @@ def test_fast_graceful_cancel_when_spawn_task_in_soft_proc_wait_for_daemon(
nurse.start_soon(delayed_kbi) nurse.start_soon(delayed_kbi)
await p.run(do_nuthin) await p.run(do_nuthin)
# need to explicitly re-raise the lone kbi..now
except* KeyboardInterrupt as kbi_eg:
assert (len(excs := kbi_eg.exceptions) == 1)
raise excs[0]
finally: finally:
duration = time.time() - start duration = time.time() - start
if duration > timeout: if duration > timeout:

File diff suppressed because it is too large.

View File

@ -95,8 +95,8 @@ async def trio_main(
# stash a "service nursery" as "actor local" (aka a Python global) # stash a "service nursery" as "actor local" (aka a Python global)
global _nursery global _nursery
n = _nursery tn = _nursery
assert n assert tn
async def consume_stream(): async def consume_stream():
async with wrapper_mngr() as stream: async with wrapper_mngr() as stream:
@ -104,10 +104,10 @@ async def trio_main(
print(msg) print(msg)
# run 2 tasks to ensure broadcaster chan use # run 2 tasks to ensure broadcaster chan use
n.start_soon(consume_stream) tn.start_soon(consume_stream)
n.start_soon(consume_stream) tn.start_soon(consume_stream)
n.start_soon(trio_sleep_and_err) tn.start_soon(trio_sleep_and_err)
await trio.sleep_forever() await trio.sleep_forever()
@ -117,8 +117,10 @@ async def open_actor_local_nursery(
ctx: tractor.Context, ctx: tractor.Context,
): ):
global _nursery global _nursery
async with trio.open_nursery() as n: async with trio.open_nursery(
_nursery = n strict_exception_groups=False,
) as tn:
_nursery = tn
await ctx.started() await ctx.started()
await trio.sleep(10) await trio.sleep(10)
# await trio.sleep(1) # await trio.sleep(1)
@ -132,7 +134,7 @@ async def open_actor_local_nursery(
# never yields back.. aka a scenario where the # never yields back.. aka a scenario where the
# ``tractor.context`` task IS NOT in the service n's cancel # ``tractor.context`` task IS NOT in the service n's cancel
# scope. # scope.
n.cancel_scope.cancel() tn.cancel_scope.cancel()
@pytest.mark.parametrize( @pytest.mark.parametrize(
@ -157,7 +159,7 @@ def test_actor_managed_trio_nursery_task_error_cancels_aio(
async with tractor.open_nursery() as n: async with tractor.open_nursery() as n:
p = await n.start_actor( p = await n.start_actor(
'nursery_mngr', 'nursery_mngr',
infect_asyncio=asyncio_mode, infect_asyncio=asyncio_mode, # TODO, is this enabling debug mode?
enable_modules=[__name__], enable_modules=[__name__],
) )
async with ( async with (

View File

@ -181,7 +181,9 @@ async def spawn_and_check_registry(
try: try:
async with tractor.open_nursery() as n: async with tractor.open_nursery() as n:
async with trio.open_nursery() as trion: async with trio.open_nursery(
strict_exception_groups=False,
) as trion:
portals = {} portals = {}
for i in range(3): for i in range(3):
@ -316,7 +318,9 @@ async def close_chans_before_nursery(
async with portal2.open_stream_from( async with portal2.open_stream_from(
stream_forever stream_forever
) as agen2: ) as agen2:
async with trio.open_nursery() as n: async with trio.open_nursery(
strict_exception_groups=False,
) as n:
n.start_soon(streamer, agen1) n.start_soon(streamer, agen1)
n.start_soon(cancel, use_signal, .5) n.start_soon(cancel, use_signal, .5)
try: try:

View File

@ -19,7 +19,7 @@ from tractor._testing import (
@pytest.fixture @pytest.fixture
def run_example_in_subproc( def run_example_in_subproc(
loglevel: str, loglevel: str,
testdir: pytest.Testdir, testdir: pytest.Pytester,
reg_addr: tuple[str, int], reg_addr: tuple[str, int],
): ):
@ -81,27 +81,37 @@ def run_example_in_subproc(
# walk yields: (dirpath, dirnames, filenames) # walk yields: (dirpath, dirnames, filenames)
[ [
(p[0], f) for p in os.walk(examples_dir()) for f in p[2] (p[0], f)
for p in os.walk(examples_dir())
for f in p[2]
if '__' not in f if (
and f[0] != '_' '__' not in f
and 'debugging' not in p[0] and f[0] != '_'
and 'integration' not in p[0] and 'debugging' not in p[0]
and 'advanced_faults' not in p[0] and 'integration' not in p[0]
and 'advanced_faults' not in p[0]
and 'multihost' not in p[0]
)
], ],
ids=lambda t: t[1], ids=lambda t: t[1],
) )
def test_example(run_example_in_subproc, example_script): def test_example(
"""Load and run scripts from this repo's ``examples/`` dir as a user run_example_in_subproc,
example_script,
):
'''
Load and run scripts from this repo's ``examples/`` dir as a user
would copy and paste them into their editor. would copy and paste them into their editor.
On windows a little more "finessing" is done to make On windows a little more "finessing" is done to make
``multiprocessing`` play nice: we copy the ``__main__.py`` into the ``multiprocessing`` play nice: we copy the ``__main__.py`` into the
test directory and invoke the script as a module with ``python -m test directory and invoke the script as a module with ``python -m
test_example``. test_example``.
"""
ex_file = os.path.join(*example_script) '''
ex_file: str = os.path.join(*example_script)
if 'rpc_bidir_streaming' in ex_file and sys.version_info < (3, 9): if 'rpc_bidir_streaming' in ex_file and sys.version_info < (3, 9):
pytest.skip("2-way streaming example requires py3.9 async with syntax") pytest.skip("2-way streaming example requires py3.9 async with syntax")
@ -127,7 +137,8 @@ def test_example(run_example_in_subproc, example_script):
# shouldn't eventually once we figure out what's # shouldn't eventually once we figure out what's
# a better way to be explicit about aio side # a better way to be explicit about aio side
# cancels? # cancels?
and 'asyncio.exceptions.CancelledError' not in last_error and
'asyncio.exceptions.CancelledError' not in last_error
): ):
raise Exception(errmsg) raise Exception(errmsg)

View File

@ -5,6 +5,7 @@ The hipster way to force SC onto the stdlib's "async": 'infection mode'.
import asyncio import asyncio
import builtins import builtins
from contextlib import ExitStack from contextlib import ExitStack
# from functools import partial
import itertools import itertools
import importlib import importlib
import os import os
@ -31,6 +32,16 @@ from tractor.trionics import BroadcastReceiver
from tractor._testing import expect_ctxc from tractor._testing import expect_ctxc
@pytest.fixture(
scope='module',
)
def delay(debug_mode: bool) -> int:
if debug_mode:
return 999
else:
return 1
async def sleep_and_err( async def sleep_and_err(
sleep_for: float = 0.1, sleep_for: float = 0.1,
@ -58,20 +69,26 @@ async def trio_cancels_single_aio_task():
await tractor.to_asyncio.run_task(aio_sleep_forever) await tractor.to_asyncio.run_task(aio_sleep_forever)
def test_trio_cancels_aio_on_actor_side(reg_addr): def test_trio_cancels_aio_on_actor_side(
reg_addr: tuple[str, int],
delay: int,
debug_mode: bool,
):
''' '''
Spawn an infected actor that is cancelled by the ``trio`` side Spawn an infected actor that is cancelled by the ``trio`` side
task using std cancel scope apis. task using std cancel scope apis.
''' '''
async def main(): async def main():
async with tractor.open_nursery( with trio.fail_after(1 + delay):
registry_addrs=[reg_addr] async with tractor.open_nursery(
) as n: registry_addrs=[reg_addr],
await n.run_in_actor( debug_mode=debug_mode,
trio_cancels_single_aio_task, ) as an:
infect_asyncio=True, await an.run_in_actor(
) trio_cancels_single_aio_task,
infect_asyncio=True,
)
trio.run(main) trio.run(main)
@ -108,12 +125,17 @@ async def asyncio_actor(
except BaseException as err: except BaseException as err:
if expect_err: if expect_err:
assert isinstance(err, error_type) assert isinstance(err, error_type), (
f'{type(err)} is not {error_type}?'
)
raise raise
def test_aio_simple_error(reg_addr): def test_aio_simple_error(
reg_addr: tuple[str, int],
debug_mode: bool,
):
''' '''
Verify a simple remote asyncio error propagates back through trio Verify a simple remote asyncio error propagates back through trio
to the parent actor. to the parent actor.
@ -122,9 +144,10 @@ def test_aio_simple_error(reg_addr):
''' '''
async def main(): async def main():
async with tractor.open_nursery( async with tractor.open_nursery(
registry_addrs=[reg_addr] registry_addrs=[reg_addr],
) as n: debug_mode=debug_mode,
await n.run_in_actor( ) as an:
await an.run_in_actor(
asyncio_actor, asyncio_actor,
target='sleep_and_err', target='sleep_and_err',
expect_err='AssertionError', expect_err='AssertionError',
@ -150,14 +173,19 @@ def test_aio_simple_error(reg_addr):
assert err.boxed_type is AssertionError assert err.boxed_type is AssertionError
def test_tractor_cancels_aio(reg_addr): def test_tractor_cancels_aio(
reg_addr: tuple[str, int],
debug_mode: bool,
):
''' '''
Verify we can cancel a spawned asyncio task gracefully. Verify we can cancel a spawned asyncio task gracefully.
''' '''
async def main(): async def main():
async with tractor.open_nursery() as n: async with tractor.open_nursery(
portal = await n.run_in_actor( debug_mode=debug_mode,
) as an:
portal = await an.run_in_actor(
asyncio_actor, asyncio_actor,
target='aio_sleep_forever', target='aio_sleep_forever',
expect_err='trio.Cancelled', expect_err='trio.Cancelled',
@ -169,7 +197,9 @@ def test_tractor_cancels_aio(reg_addr):
trio.run(main) trio.run(main)
def test_trio_cancels_aio(reg_addr): def test_trio_cancels_aio(
reg_addr: tuple[str, int],
):
''' '''
Much like the above test with ``tractor.Portal.cancel_actor()`` Much like the above test with ``tractor.Portal.cancel_actor()``
except we just use a standard ``trio`` cancellation api. except we just use a standard ``trio`` cancellation api.
@ -180,8 +210,8 @@ def test_trio_cancels_aio(reg_addr):
with trio.move_on_after(1): with trio.move_on_after(1):
# cancel the nursery shortly after boot # cancel the nursery shortly after boot
async with tractor.open_nursery() as n: async with tractor.open_nursery() as tn:
await n.run_in_actor( await tn.run_in_actor(
asyncio_actor, asyncio_actor,
target='aio_sleep_forever', target='aio_sleep_forever',
expect_err='trio.Cancelled', expect_err='trio.Cancelled',
@ -200,23 +230,35 @@ async def trio_ctx(
# this will block until the ``asyncio`` task sends a "first" # this will block until the ``asyncio`` task sends a "first"
# message. # message.
with trio.fail_after(2): delay: int = 999 if tractor.debug_mode() else 1
async with ( with trio.fail_after(1 + delay):
trio.open_nursery() as n, try:
async with (
trio.open_nursery(
# TODO, for new `trio` / py3.13
# strict_exception_groups=False,
) as tn,
tractor.to_asyncio.open_channel_from(
sleep_and_err,
) as (first, chan),
):
tractor.to_asyncio.open_channel_from( assert first == 'start'
sleep_and_err,
) as (first, chan),
):
assert first == 'start' # spawn another asyncio task for the heck of it.
tn.start_soon(
tractor.to_asyncio.run_task,
aio_sleep_forever,
)
await trio.sleep_forever()
# spawn another asyncio task for the heck of it. # TODO, factor this into a `trionics.collapse()`?
n.start_soon( except* BaseException as beg:
tractor.to_asyncio.run_task, # await tractor.pause(shield=True)
aio_sleep_forever, if len(excs := beg.exceptions) == 1:
) raise excs[0]
await trio.sleep_forever() else:
raise
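This single-member "collapse" dance recurs throughout the changeset (and a `collapse_eg` helper shows up imported from `tractor.trionics` in the broadcaster tests further below); a hedged sketch of what such a helper could look like:

    from contextlib import asynccontextmanager as acm

    @acm
    async def collapse_eg():
        '''
        Re-raise the lone sub-exception of a single-member exception
        group; a sketch only, the real `tractor.trionics.collapse_eg`
        may differ in edge-case handling.

        '''
        try:
            yield
        except BaseExceptionGroup as beg:
            if len(excs := beg.exceptions) == 1:
                # unpack the lone child so callers can handle it "loosely"
                raise excs[0]
            raise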
@pytest.mark.parametrize( @pytest.mark.parametrize(
@ -225,8 +267,10 @@ async def trio_ctx(
ids='parent_actor_cancels_child={}'.format ids='parent_actor_cancels_child={}'.format
) )
def test_context_spawns_aio_task_that_errors( def test_context_spawns_aio_task_that_errors(
reg_addr, reg_addr: tuple[str, int],
delay: int,
parent_cancels: bool, parent_cancels: bool,
debug_mode: bool,
): ):
''' '''
Verify that spawning a task via an intertask channel ctx mngr that Verify that spawning a task via an intertask channel ctx mngr that
@ -235,14 +279,13 @@ def test_context_spawns_aio_task_that_errors(
''' '''
async def main(): async def main():
with trio.fail_after(1 + delay):
with trio.fail_after(2): async with tractor.open_nursery() as an:
async with tractor.open_nursery() as n: p = await an.start_actor(
p = await n.start_actor(
'aio_daemon', 'aio_daemon',
enable_modules=[__name__], enable_modules=[__name__],
infect_asyncio=True, infect_asyncio=True,
# debug_mode=True, debug_mode=debug_mode,
loglevel='cancel', loglevel='cancel',
) )
async with ( async with (
@ -307,11 +350,14 @@ async def aio_cancel():
await aio_sleep_forever() await aio_sleep_forever()
def test_aio_cancelled_from_aio_causes_trio_cancelled(reg_addr): def test_aio_cancelled_from_aio_causes_trio_cancelled(
reg_addr: tuple,
delay: int,
):
''' '''
When the `asyncio.Task` cancels itself the `trio` side cshould When the `asyncio.Task` cancels itself the `trio` side should
also cancel and teardown and relay the cancellation cross-process also cancel and teardown and relay the cancellation cross-process
to the caller (parent). to the parent caller.
''' '''
async def main(): async def main():
@ -327,7 +373,7 @@ def test_aio_cancelled_from_aio_causes_trio_cancelled(reg_addr):
# NOTE: normally the `an.__aexit__()` waits on the # NOTE: normally the `an.__aexit__()` waits on the
# portal's result but we do it explicitly here # portal's result but we do it explicitly here
# to avoid indent levels. # to avoid indent levels.
with trio.fail_after(1): with trio.fail_after(1 + delay):
await p.wait_for_result() await p.wait_for_result()
with pytest.raises( with pytest.raises(
@ -338,11 +384,10 @@ def test_aio_cancelled_from_aio_causes_trio_cancelled(reg_addr):
# might get multiple `trio.Cancelled`s as well inside an inception # might get multiple `trio.Cancelled`s as well inside an inception
err: RemoteActorError|ExceptionGroup = excinfo.value err: RemoteActorError|ExceptionGroup = excinfo.value
if isinstance(err, ExceptionGroup): if isinstance(err, ExceptionGroup):
err = next(itertools.dropwhile( excs = err.exceptions
lambda exc: not isinstance(exc, tractor.RemoteActorError), assert len(excs) == 1
err.exceptions final_exc = excs[0]
)) assert isinstance(final_exc, tractor.RemoteActorError)
assert err
# relayed boxed error should be our `trio`-task's # relayed boxed error should be our `trio`-task's
# cancel-signal-proxy-equivalent of `asyncio.CancelledError`. # cancel-signal-proxy-equivalent of `asyncio.CancelledError`.
@ -355,15 +400,18 @@ async def no_to_trio_in_args():
async def push_from_aio_task( async def push_from_aio_task(
sequence: Iterable, sequence: Iterable,
to_trio: trio.abc.SendChannel, to_trio: trio.abc.SendChannel,
expect_cancel: False, expect_cancel: False,
fail_early: bool, fail_early: bool,
exit_early: bool,
) -> None: ) -> None:
try: try:
# print('trying breakpoint')
# breakpoint()
# sync caller ctx manager # sync caller ctx manager
to_trio.send_nowait(True) to_trio.send_nowait(True)
@ -372,10 +420,27 @@ async def push_from_aio_task(
to_trio.send_nowait(i) to_trio.send_nowait(i)
await asyncio.sleep(0.001) await asyncio.sleep(0.001)
if i == 50 and fail_early: if (
raise Exception i == 50
):
if fail_early:
print('Raising exc from aio side!')
raise Exception
print('asyncio streamer complete!') if exit_early:
# TODO? really you could enforce the same
# SC-proto we use for actors here with asyncio
# such that a Return[None] msg would be
# implicitly delivered to the trio side?
#
# XXX => this might be the end-all soln for
# converting any-inter-task system (regardless
# of maybe-remote runtime or language) to be
# SC-compat no?
print(f'asyncio breaking early @ {i!r}')
break
print('asyncio streaming complete!')
except asyncio.CancelledError: except asyncio.CancelledError:
if not expect_cancel: if not expect_cancel:
@ -387,9 +452,10 @@ async def push_from_aio_task(
async def stream_from_aio( async def stream_from_aio(
exit_early: bool = False, trio_exit_early: bool = False,
raise_err: bool = False, trio_raise_err: bool = False,
aio_raise_err: bool = False, aio_raise_err: bool = False,
aio_exit_early: bool = False,
fan_out: bool = False, fan_out: bool = False,
) -> None: ) -> None:
@ -402,8 +468,18 @@ async def stream_from_aio(
async with to_asyncio.open_channel_from( async with to_asyncio.open_channel_from(
push_from_aio_task, push_from_aio_task,
sequence=seq, sequence=seq,
expect_cancel=raise_err or exit_early, expect_cancel=trio_raise_err or trio_exit_early,
fail_early=aio_raise_err, fail_early=aio_raise_err,
exit_early=aio_exit_early,
# such that we can test exit early cases
# for each side explicitly.
suppress_graceful_exits=(not(
aio_exit_early
or
trio_exit_early
))
) as (first, chan): ) as (first, chan):
assert first is True assert first is True
@ -415,17 +491,28 @@ async def stream_from_aio(
], ],
): ):
async for value in chan: async for value in chan:
print(f'trio received {value}') print(f'trio received: {value!r}')
# XXX, debugging EoC not being handled correctly
# in `translate_aio_errors()`..
# if value is None:
# await tractor.pause(shield=True)
pulled.append(value) pulled.append(value)
if value == 50: if value == 50:
if raise_err: if trio_raise_err:
raise Exception raise Exception
elif exit_early: elif trio_exit_early:
print('`consume()` breaking early!\n')
break break
print('returning from `consume()`..\n')
# run 2 tasks each pulling from
# the inter-task-channel with the 2nd
# using a fan-out `BroadcastReceiver`.
if fan_out: if fan_out:
# start second task that get's the same stream value set.
async with ( async with (
# NOTE: this has to come first to avoid # NOTE: this has to come first to avoid
@ -433,19 +520,31 @@ async def stream_from_aio(
# tasks are joined.. # tasks are joined..
chan.subscribe() as br, chan.subscribe() as br,
trio.open_nursery() as n, trio.open_nursery() as tn,
): ):
n.start_soon(consume, br) # start 2nd task that gets broadcast the same
# value set.
tn.start_soon(consume, br)
await consume(chan) await consume(chan)
else: else:
await consume(chan) await consume(chan)
except BaseException as err:
import logging
log = logging.getLogger()
log.exception('aio-subactor errored!\n')
raise err
finally: finally:
if ( if not (
not raise_err and trio_raise_err
not exit_early and or
not aio_raise_err trio_exit_early
or
aio_raise_err
or
aio_exit_early
): ):
if fan_out: if fan_out:
# we get double the pulled values in the # we get double the pulled values in the
@ -455,22 +554,27 @@ async def stream_from_aio(
assert list(sorted(pulled)) == expect assert list(sorted(pulled)) == expect
else: else:
# await tractor.pause()
assert pulled == expect assert pulled == expect
else: else:
assert not fan_out assert not fan_out
assert pulled == expect[:51] assert pulled == expect[:51]
print('trio guest mode task completed!') print('trio guest-mode task completed!')
assert chan._aio_task.done()
@pytest.mark.parametrize( @pytest.mark.parametrize(
'fan_out', [False, True], 'fan_out', [False, True],
ids='fan_out_w_chan_subscribe={}'.format ids='fan_out_w_chan_subscribe={}'.format
) )
def test_basic_interloop_channel_stream(reg_addr, fan_out): def test_basic_interloop_channel_stream(
reg_addr: tuple[str, int],
fan_out: bool,
):
async def main(): async def main():
async with tractor.open_nursery() as n: async with tractor.open_nursery() as an:
portal = await n.run_in_actor( portal = await an.run_in_actor(
stream_from_aio, stream_from_aio,
infect_asyncio=True, infect_asyncio=True,
fan_out=fan_out, fan_out=fan_out,
@ -484,10 +588,10 @@ def test_basic_interloop_channel_stream(reg_addr, fan_out):
# TODO: parametrize the above test and avoid the duplication here? # TODO: parametrize the above test and avoid the duplication here?
def test_trio_error_cancels_intertask_chan(reg_addr): def test_trio_error_cancels_intertask_chan(reg_addr):
async def main(): async def main():
async with tractor.open_nursery() as n: async with tractor.open_nursery() as an:
portal = await n.run_in_actor( portal = await an.run_in_actor(
stream_from_aio, stream_from_aio,
raise_err=True, trio_raise_err=True,
infect_asyncio=True, infect_asyncio=True,
) )
# should trigger remote actor error # should trigger remote actor error
@ -500,25 +604,116 @@ def test_trio_error_cancels_intertask_chan(reg_addr):
assert excinfo.value.boxed_type is Exception assert excinfo.value.boxed_type is Exception
def test_trio_closes_early_and_channel_exits(reg_addr): def test_trio_closes_early_causes_aio_checkpoint_raise(
reg_addr: tuple[str, int],
delay: int,
debug_mode: bool,
):
'''
Check that if the `trio`-task "exits early and silently" (in this
case during `async for`-ing the inter-task-channel via
a `break`-from-loop), we raise `TrioTaskExited` on the
`asyncio`-side which also then bubbles up through the
`open_channel_from()` block indicating that the `asyncio.Task`
ran another checkpoint despite the `trio.Task` exit.
'''
async def main(): async def main():
async with tractor.open_nursery() as n: with trio.fail_after(1 + delay):
portal = await n.run_in_actor( async with tractor.open_nursery(
stream_from_aio, debug_mode=debug_mode,
exit_early=True, # enable_stack_on_sig=True,
infect_asyncio=True, ) as an:
) portal = await an.run_in_actor(
# should raise RAE directly stream_from_aio,
await portal.result() trio_exit_early=True,
infect_asyncio=True,
)
# should raise RAE directly
print('waiting on final infected subactor result..')
res: None = await portal.wait_for_result()
assert res is None
print(f'infected subactor returned result: {res!r}\n')
# should be a quiet exit on a simple channel exit # should be a quiet exit on a simple channel exit
trio.run(main) with pytest.raises(RemoteActorError) as excinfo:
trio.run(main)
# ensure remote error is an explicit `TrioTaskExited` sub-type
# which indicates to the aio task that the trio side exited
# silently WITHOUT raising a `trio.Cancelled` (which would
# normally be relayed instead as an `AsyncioCancelled`).
assert excinfo.value.boxed_type is to_asyncio.TrioTaskExited
def test_aio_errors_and_channel_propagates_and_closes(reg_addr): def test_aio_exits_early_relays_AsyncioTaskExited(
# TODO, parametrize the 3 possible trio side conditions:
# - trio blocking on receive, aio exits early
# - trio cancelled AND aio exits early on its next tick
# - trio errors AND aio exits early on its next tick
reg_addr: tuple[str, int],
debug_mode: bool,
delay: int,
):
'''
Check that if the `asyncio`-task "exits early and silently" (in this
case when `push_from_aio_task()` `break`s out of its push loop to the
`InterLoopTaskChannel`), we raise `AsyncioTaskExited` on the
`trio`-side which then DOES NOT BUBBLE up through the
`open_channel_from()` block UNLESS,
- the trio.Task also errored/cancelled, in which case we wrap
both errors in an eg
- the trio.Task was blocking on rxing a value from the
`InterLoopTaskChannel`.
'''
async def main(): async def main():
async with tractor.open_nursery() as n: with trio.fail_after(1 + delay):
portal = await n.run_in_actor( async with tractor.open_nursery(
debug_mode=debug_mode,
# enable_stack_on_sig=True,
) as an:
portal = await an.run_in_actor(
stream_from_aio,
infect_asyncio=True,
trio_exit_early=False,
aio_exit_early=True,
)
# should raise RAE directly
print('waiting on final infected subactor result..')
res: None = await portal.wait_for_result()
assert res is None
print(f'infected subactor returned result: {res!r}\n')
# should be a quiet exit on a simple channel exit
with pytest.raises(RemoteActorError) as excinfo:
trio.run(main)
exc = excinfo.value
# TODO, wow bug!
# -[ ] bp handler not replaced!?!?
# breakpoint()
# import pdbp; pdbp.set_trace()
# ensure remote error is an explicit `AsyncioTaskExited` sub-type
# which indicates to the trio side that the aio task exited
# early-but-silently WITHOUT raising an error (which would
# normally be relayed as a boxed remote exception).
assert exc.boxed_type is to_asyncio.AsyncioTaskExited
def test_aio_errors_and_channel_propagates_and_closes(
reg_addr: tuple[str, int],
debug_mode: bool,
):
async def main():
async with tractor.open_nursery(
debug_mode=debug_mode,
) as an:
portal = await an.run_in_actor(
stream_from_aio, stream_from_aio,
aio_raise_err=True, aio_raise_err=True,
infect_asyncio=True, infect_asyncio=True,
@ -536,41 +731,46 @@ def test_aio_errors_and_channel_propagates_and_closes(reg_addr):
assert excinfo.value.boxed_type is Exception assert excinfo.value.boxed_type is Exception
async def aio_echo_server(
to_trio: trio.MemorySendChannel,
from_trio: asyncio.Queue,
) -> None:
to_trio.send_nowait('start')
while True:
try:
msg = await from_trio.get()
except to_asyncio.TrioTaskExited:
print(
'breaking aio echo loop due to `trio` exit!'
)
break
# echo the msg back
to_trio.send_nowait(msg)
# if we get the terminate sentinel
# break the echo loop
if msg is None:
print('breaking aio echo loop')
break
print('exiting asyncio task')
@tractor.context @tractor.context
async def trio_to_aio_echo_server( async def trio_to_aio_echo_server(
ctx: tractor.Context, ctx: tractor.Context|None,
): ):
async def aio_echo_server(
to_trio: trio.MemorySendChannel,
from_trio: asyncio.Queue,
) -> None:
to_trio.send_nowait('start')
while True:
msg = await from_trio.get()
# echo the msg back
to_trio.send_nowait(msg)
# if we get the terminate sentinel
# break the echo loop
if msg is None:
print('breaking aio echo loop')
break
print('exiting asyncio task')
async with to_asyncio.open_channel_from( async with to_asyncio.open_channel_from(
aio_echo_server, aio_echo_server,
) as (first, chan): ) as (first, chan):
assert first == 'start' assert first == 'start'
await ctx.started(first) await ctx.started(first)
async with ctx.open_stream() as stream: async with ctx.open_stream() as stream:
async for msg in stream: async for msg in stream:
print(f'asyncio echoing {msg}') print(f'asyncio echoing {msg}')
await chan.send(msg) await chan.send(msg)
@ -594,13 +794,15 @@ async def trio_to_aio_echo_server(
ids='raise_error={}'.format, ids='raise_error={}'.format,
) )
def test_echoserver_detailed_mechanics( def test_echoserver_detailed_mechanics(
reg_addr, reg_addr: tuple[str, int],
debug_mode: bool,
raise_error_mid_stream, raise_error_mid_stream,
): ):
async def main(): async def main():
async with tractor.open_nursery() as n: async with tractor.open_nursery(
p = await n.start_actor( debug_mode=debug_mode,
) as an:
p = await an.start_actor(
'aio_server', 'aio_server',
enable_modules=[__name__], enable_modules=[__name__],
infect_asyncio=True, infect_asyncio=True,
@ -649,7 +851,6 @@ def test_echoserver_detailed_mechanics(
trio.run(main) trio.run(main)
@tractor.context @tractor.context
async def manage_file( async def manage_file(
ctx: tractor.Context, ctx: tractor.Context,
@ -806,6 +1007,8 @@ def test_sigint_closes_lifetime_stack(
''' '''
async def main(): async def main():
delay = 999 if tractor.debug_mode() else 1
try: try:
an: tractor.ActorNursery an: tractor.ActorNursery
async with tractor.open_nursery( async with tractor.open_nursery(
@ -856,7 +1059,7 @@ def test_sigint_closes_lifetime_stack(
if wait_for_ctx: if wait_for_ctx:
print('waiting for ctx outcome in parent..') print('waiting for ctx outcome in parent..')
try: try:
with trio.fail_after(1): with trio.fail_after(1 + delay):
await ctx.wait_for_result() await ctx.wait_for_result()
except tractor.ContextCancelled as ctxc: except tractor.ContextCancelled as ctxc:
assert ctxc.canceller == ctx.chan.uid assert ctxc.canceller == ctx.chan.uid

View File

@ -170,7 +170,7 @@ def test_do_not_swallow_error_before_started_by_remote_contextcancelled(
trio.run(main) trio.run(main)
rae = excinfo.value rae = excinfo.value
assert rae.boxed_type == TypeError assert rae.boxed_type is TypeError
@tractor.context @tractor.context

View File

@ -0,0 +1,248 @@
'''
Special attention cases for using "infect `asyncio`" mode from a root
actor; i.e. not using a std `trio.run()` bootstrap.
'''
import asyncio
from functools import partial
import pytest
import trio
import tractor
from tractor import (
to_asyncio,
)
from tests.test_infected_asyncio import (
aio_echo_server,
)
@pytest.mark.parametrize(
'raise_error_mid_stream',
[
False,
Exception,
KeyboardInterrupt,
],
ids='raise_error={}'.format,
)
def test_infected_root_actor(
raise_error_mid_stream: bool|Exception,
# conftest wide
loglevel: str,
debug_mode: bool,
):
'''
Verify you can run the `tractor` runtime with `Actor.is_infected_aio() == True`
in the root actor.
'''
async def _trio_main():
with trio.fail_after(2 if not debug_mode else 999):
first: str
chan: to_asyncio.LinkedTaskChannel
async with (
tractor.open_root_actor(
debug_mode=debug_mode,
loglevel=loglevel,
),
to_asyncio.open_channel_from(
aio_echo_server,
) as (first, chan),
):
assert first == 'start'
for i in range(1000):
await chan.send(i)
out = await chan.receive()
assert out == i
print(f'asyncio echoing {i}')
if (
raise_error_mid_stream
and
i == 500
):
raise raise_error_mid_stream
if out is None:
try:
out = await chan.receive()
except trio.EndOfChannel:
break
else:
raise RuntimeError(
'aio channel never stopped?'
)
if raise_error_mid_stream:
with pytest.raises(raise_error_mid_stream):
tractor.to_asyncio.run_as_asyncio_guest(
trio_main=_trio_main,
)
else:
tractor.to_asyncio.run_as_asyncio_guest(
trio_main=_trio_main,
)
async def sync_and_err(
# just signature placeholders for compat with
# ``to_asyncio.open_channel_from()``
to_trio: trio.MemorySendChannel,
from_trio: asyncio.Queue,
ev: asyncio.Event,
):
if to_trio:
to_trio.send_nowait('start')
await ev.wait()
raise RuntimeError('asyncio-side')
@pytest.mark.parametrize(
'aio_err_trigger',
[
'before_start_point',
'after_trio_task_starts',
'after_start_point',
],
ids='aio_err_triggered={}'.format
)
def test_trio_prestarted_task_bubbles(
aio_err_trigger: str,
# conftest wide
loglevel: str,
debug_mode: bool,
):
async def pre_started_err(
raise_err: bool = False,
pre_sleep: float|None = None,
aio_trigger: asyncio.Event|None = None,
task_status=trio.TASK_STATUS_IGNORED,
):
'''
Maybe pre-started error then sleep.
'''
if pre_sleep is not None:
print(f'Sleeping from trio for {pre_sleep!r}s !')
await trio.sleep(pre_sleep)
# signal aio-task to raise JUST AFTER this task
# starts but has not yet `.started()`
if aio_trigger:
print('Signalling aio-task to raise from `trio`!!')
aio_trigger.set()
if raise_err:
print('Raising from trio!')
raise TypeError('trio-side')
task_status.started()
await trio.sleep_forever()
async def _trio_main():
# with trio.fail_after(2):
with trio.fail_after(999):
first: str
chan: to_asyncio.LinkedTaskChannel
aio_ev = asyncio.Event()
async with (
tractor.open_root_actor(
debug_mode=False,
loglevel=loglevel,
),
):
# TODO, tests for this with 3.13 egs?
# from tractor.devx import open_crash_handler
# with open_crash_handler():
async with (
# where we'll start a sub-task that errors BEFORE
# calling `.started()` such that the error should
# bubble before the guest run terminates!
trio.open_nursery() as tn,
# THEN start an infect task which should error just
# after the trio-side's task does.
to_asyncio.open_channel_from(
partial(
sync_and_err,
ev=aio_ev,
)
) as (first, chan),
):
for i in range(5):
pre_sleep: float|None = None
last_iter: bool = (i == 4)
# TODO, missing cases?
# -[ ] error as well on
# 'after_start_point' case as well for
# another case?
raise_err: bool = False
if last_iter:
raise_err: bool = True
# trigger aio task to error on next loop
# tick/checkpoint
if aio_err_trigger == 'before_start_point':
aio_ev.set()
pre_sleep: float = 0
await tn.start(
pre_started_err,
raise_err,
pre_sleep,
(aio_ev if (
aio_err_trigger == 'after_trio_task_starts'
and
last_iter
) else None
),
)
if (
aio_err_trigger == 'after_start_point'
and
last_iter
):
aio_ev.set()
with pytest.raises(
expected_exception=ExceptionGroup,
) as excinfo:
tractor.to_asyncio.run_as_asyncio_guest(
trio_main=_trio_main,
)
eg = excinfo.value
rte_eg, rest_eg = eg.split(RuntimeError)
# ensure the trio-task's error bubbled despite the aio-side
# having (maybe) errored first.
if aio_err_trigger in (
'after_trio_task_starts',
'after_start_point',
):
assert len(errs := rest_eg.exceptions) == 1
typerr = errs[0]
assert (
type(typerr) is TypeError
and
'trio-side' in typerr.args
)
# when aio errors BEFORE (last) trio task is scheduled, we should
# never see anything but the aio-side.
else:
assert len(rtes := rte_eg.exceptions) == 1
assert 'asyncio-side' in rtes[0].args[0]

View File

@ -2,7 +2,9 @@
Broadcast channels for fan-out to local tasks. Broadcast channels for fan-out to local tasks.
""" """
from contextlib import asynccontextmanager from contextlib import (
asynccontextmanager as acm,
)
from functools import partial from functools import partial
from itertools import cycle from itertools import cycle
import time import time
@ -15,6 +17,7 @@ import tractor
from tractor.trionics import ( from tractor.trionics import (
broadcast_receiver, broadcast_receiver,
Lagged, Lagged,
collapse_eg,
) )
@ -62,7 +65,7 @@ async def ensure_sequence(
break break
@asynccontextmanager @acm
async def open_sequence_streamer( async def open_sequence_streamer(
sequence: list[int], sequence: list[int],
@ -74,9 +77,9 @@ async def open_sequence_streamer(
async with tractor.open_nursery( async with tractor.open_nursery(
arbiter_addr=reg_addr, arbiter_addr=reg_addr,
start_method=start_method, start_method=start_method,
) as tn: ) as an:
portal = await tn.start_actor( portal = await an.start_actor(
'sequence_echoer', 'sequence_echoer',
enable_modules=[__name__], enable_modules=[__name__],
) )
@ -155,9 +158,12 @@ def test_consumer_and_parent_maybe_lag(
) as stream: ) as stream:
try: try:
async with trio.open_nursery() as n: async with (
collapse_eg(),
trio.open_nursery() as tn,
):
n.start_soon( tn.start_soon(
ensure_sequence, ensure_sequence,
stream, stream,
sequence.copy(), sequence.copy(),
@ -230,8 +236,8 @@ def test_faster_task_to_recv_is_cancelled_by_slower(
) as stream: ) as stream:
async with trio.open_nursery() as n: async with trio.open_nursery() as tn:
n.start_soon( tn.start_soon(
ensure_sequence, ensure_sequence,
stream, stream,
sequence.copy(), sequence.copy(),
@ -253,7 +259,7 @@ def test_faster_task_to_recv_is_cancelled_by_slower(
continue continue
print('cancelling faster subtask') print('cancelling faster subtask')
n.cancel_scope.cancel() tn.cancel_scope.cancel()
try: try:
value = await stream.receive() value = await stream.receive()
@ -371,13 +377,13 @@ def test_ensure_slow_consumers_lag_out(
f'on {lags}:{value}') f'on {lags}:{value}')
return return
async with trio.open_nursery() as nursery: async with trio.open_nursery() as tn:
for i in range(1, num_laggers): for i in range(1, num_laggers):
task_name = f'sub_{i}' task_name = f'sub_{i}'
laggers[task_name] = 0 laggers[task_name] = 0
nursery.start_soon( tn.start_soon(
partial( partial(
sub_and_print, sub_and_print,
delay=i*0.001, delay=i*0.001,
@ -497,6 +503,7 @@ def test_no_raise_on_lag():
# internals when the no raise flag is set. # internals when the no raise flag is set.
loglevel='warning', loglevel='warning',
), ),
collapse_eg(),
trio.open_nursery() as n, trio.open_nursery() as n,
): ):
n.start_soon(slow) n.start_soon(slow)

View File

@ -3,6 +3,10 @@ Reminders for oddities in `trio` that we need to stay aware of and/or
want to see changed. want to see changed.
''' '''
from contextlib import (
asynccontextmanager as acm,
)
import pytest import pytest
import trio import trio
from trio import TaskStatus from trio import TaskStatus
@ -60,7 +64,9 @@ def test_stashed_child_nursery(use_start_soon):
async def main(): async def main():
async with ( async with (
trio.open_nursery() as pn, trio.open_nursery(
strict_exception_groups=False,
) as pn,
): ):
cn = await pn.start(mk_child_nursery) cn = await pn.start(mk_child_nursery)
assert cn assert cn
@ -80,3 +86,118 @@ def test_stashed_child_nursery(use_start_soon):
with pytest.raises(NameError): with pytest.raises(NameError):
trio.run(main) trio.run(main)
@pytest.mark.parametrize(
('unmask_from_canc', 'canc_from_finally'),
[
(True, False),
(True, True),
pytest.param(False, True,
marks=pytest.mark.xfail(reason="never raises!")
),
],
# TODO, ask ronny how to impl this .. XD
# ids='unmask_from_canc={0}, canc_from_finally={1}',#.format,
)
def test_acm_embedded_nursery_propagates_enter_err(
canc_from_finally: bool,
unmask_from_canc: bool,
debug_mode: bool,
):
'''
Demo how a masking `trio.Cancelled` could be handled by unmasking from the
`.__context__` field when a user (by accident) re-raises from a `finally:`.
'''
import tractor
@acm
async def maybe_raise_from_masking_exc(
tn: trio.Nursery,
unmask_from: BaseException|None = trio.Cancelled
# TODO, maybe offer a collection?
# unmask_from: set[BaseException] = {
# trio.Cancelled,
# },
):
if not unmask_from:
yield
return
try:
yield
except* unmask_from as be_eg:
# TODO, if we offer `unmask_from: set`
# for masker_exc_type in unmask_from:
matches, rest = be_eg.split(unmask_from)
if not matches:
raise
for exc_match in be_eg.exceptions:
if (
(exc_ctx := exc_match.__context__)
and
type(exc_ctx) not in {
# trio.Cancelled, # always by default?
unmask_from,
}
):
exc_ctx.add_note(
f'\n'
f'WARNING: the above error was masked by a {unmask_from!r} !?!\n'
f'Are you always cancelling? Say from a `finally:` ?\n\n'
f'{tn!r}'
)
raise exc_ctx from exc_match
@acm
async def wraps_tn_that_always_cancels():
async with (
trio.open_nursery() as tn,
maybe_raise_from_masking_exc(
tn=tn,
unmask_from=(
trio.Cancelled
if unmask_from_canc
else None
),
)
):
try:
yield tn
finally:
if canc_from_finally:
tn.cancel_scope.cancel()
await trio.lowlevel.checkpoint()
async def _main():
with tractor.devx.maybe_open_crash_handler(
pdb=debug_mode,
) as bxerr:
assert not bxerr.value
async with (
wraps_tn_that_always_cancels() as tn,
):
assert not tn.cancel_scope.cancel_called
assert 0
assert (
(err := bxerr.value)
and
type(err) is AssertionError
)
with pytest.raises(ExceptionGroup) as excinfo:
trio.run(_main)
eg: ExceptionGroup = excinfo.value
assert_eg, rest_eg = eg.split(AssertionError)
assert len(assert_eg.exceptions) == 1
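The masking pitfall this test exercises, in miniature (a standalone sketch, not part of this diff):

    import trio

    async def main():
        with trio.CancelScope() as cs:
            try:
                assert 0  # the "real" error..
            finally:
                cs.cancel()
                # raises `Cancelled` whose `.__context__` is the
                # in-flight `AssertionError`, thereby masking it!
                await trio.lowlevel.checkpoint()

        # reached! the scope absorbs its own `Cancelled` and the
        # assert never propagates..
        print('the assert was masked!')

    trio.run(main)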

View File

@ -43,6 +43,7 @@ from ._state import (
current_actor as current_actor, current_actor as current_actor,
is_root_process as is_root_process, is_root_process as is_root_process,
current_ipc_ctx as current_ipc_ctx, current_ipc_ctx as current_ipc_ctx,
debug_mode as debug_mode
) )
from ._exceptions import ( from ._exceptions import (
ContextCancelled as ContextCancelled, ContextCancelled as ContextCancelled,
@ -65,3 +66,4 @@ from ._root import (
from ._ipc import Channel as Channel from ._ipc import Channel as Channel
from ._portal import Portal as Portal from ._portal import Portal as Portal
from ._runtime import Actor as Actor from ._runtime import Actor as Actor
from . import hilevel as hilevel

View File

@ -19,10 +19,13 @@ Actor cluster helpers.
''' '''
from __future__ import annotations from __future__ import annotations
from contextlib import (
from contextlib import asynccontextmanager as acm asynccontextmanager as acm,
)
from multiprocessing import cpu_count from multiprocessing import cpu_count
from typing import AsyncGenerator, Optional from typing import (
AsyncGenerator,
)
import trio import trio
import tractor import tractor

View File

@ -47,6 +47,9 @@ from functools import partial
import inspect import inspect
from pprint import pformat from pprint import pformat
import textwrap import textwrap
from types import (
UnionType,
)
from typing import ( from typing import (
Any, Any,
AsyncGenerator, AsyncGenerator,
@ -950,7 +953,7 @@ class Context:
# f'Context.cancel() => {self.chan.uid}\n' # f'Context.cancel() => {self.chan.uid}\n'
f'c)=> {self.chan.uid}\n' f'c)=> {self.chan.uid}\n'
# f'{self.chan.uid}\n' # f'{self.chan.uid}\n'
f' |_ @{self.dst_maddr}\n' f' |_ @{self.dst_maddr}\n'
f' >> {self.repr_rpc}\n' f' >> {self.repr_rpc}\n'
# f' >> {self._nsf}() -> {codec}[dict]:\n\n' # f' >> {self._nsf}() -> {codec}[dict]:\n\n'
# TODO: pull msg-type from spec re #320 # TODO: pull msg-type from spec re #320
@ -1003,7 +1006,8 @@ class Context:
) )
else: else:
log.cancel( log.cancel(
'Timed out on cancel request of remote task?\n' f'Timed out on cancel request of remote task?\n'
f'\n'
f'{reminfo}' f'{reminfo}'
) )
@ -1560,12 +1564,12 @@ class Context:
strict_pld_parity=strict_pld_parity, strict_pld_parity=strict_pld_parity,
hide_tb=hide_tb, hide_tb=hide_tb,
) )
except BaseException as err: except BaseException as _bexc:
err = _bexc
if not isinstance(err, MsgTypeError): if not isinstance(err, MsgTypeError):
__tracebackhide__: bool = False __tracebackhide__: bool = False
raise raise err
# TODO: maybe a flag to by-pass encode op if already done # TODO: maybe a flag to by-pass encode op if already done
# here in caller? # here in caller?
@ -1982,7 +1986,10 @@ async def open_context_from_portal(
ctxc_from_callee: ContextCancelled|None = None ctxc_from_callee: ContextCancelled|None = None
try: try:
async with ( async with (
trio.open_nursery() as tn, trio.open_nursery(
strict_exception_groups=False,
) as tn,
msgops.maybe_limit_plds( msgops.maybe_limit_plds(
ctx=ctx, ctx=ctx,
spec=ctx_meta.get('pld_spec'), spec=ctx_meta.get('pld_spec'),
@ -2544,7 +2551,14 @@ def context(
name: str name: str
param: Type param: Type
for name, param in annots.items(): for name, param in annots.items():
if param is Context: if (
param is Context
or (
isinstance(param, UnionType)
and
Context in param.__args__
)
):
ctx_var_name: str = name ctx_var_name: str = name
break break
else: else:
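That is, with this change both of the following (hypothetical) endpoint signatures should resolve the ctx var, the second now matched via `UnionType`:

    from tractor import (
        Context,
        context,
    )

    @context
    async def ep(ctx: Context) -> None:  # plain annotation, as before
        await ctx.started()

    @context
    async def ep2(ctx: Context|None) -> None:  # union form, newly matched
        await ctx.started()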

View File

@ -238,7 +238,7 @@ def _trio_main(
nest_from_op( nest_from_op(
input_op='>(', # see syntax ideas above input_op='>(', # see syntax ideas above
tree_str=actor_info, tree_str=actor_info,
back_from_op=1, back_from_op=2, # since "complete"
) )
) )
logmeth = log.info logmeth = log.info

View File

@ -22,6 +22,7 @@ from __future__ import annotations
import builtins import builtins
import importlib import importlib
from pprint import pformat from pprint import pformat
from pdb import bdb
import sys import sys
from types import ( from types import (
TracebackType, TracebackType,
@ -82,6 +83,48 @@ class InternalError(RuntimeError):
''' '''
class AsyncioCancelled(Exception):
'''
Asyncio cancelled translation (non-base) error
for use with the ``to_asyncio`` module
to be raised in the ``trio`` side task
NOTE: this should NOT inherit from `asyncio.CancelledError` or
tests should break!
'''
class AsyncioTaskExited(Exception):
'''
asyncio.Task "exited" translation error for use with the
`to_asyncio` APIs to be raised in the `trio` side task indicating
on `.run_task()`/`.open_channel_from()` exit that the aio side
exited early/silently.
'''
class TrioCancelled(Exception):
'''
Trio cancelled translation (non-base) error
for use with the `to_asyncio` module
to be raised in the `asyncio.Task` to indicate
that the `trio` side raised `Cancelled` or an error.
'''
class TrioTaskExited(Exception):
'''
The `trio`-side task exited without explicitly cancelling the
`asyncio.Task` peer.
This is very similar to how `trio.ClosedResource` acts as
a "clean shutdown" signal to the consumer side of a mem-chan,
https://trio.readthedocs.io/en/stable/reference-core.html#clean-shutdown-with-channels
'''
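Together with `AsyncioCancelled`/`TrioCancelled` these form a small which-side-exited-and-how taxonomy; a hedged sketch of `trio`-side handling, assuming the `tractor.to_asyncio` re-exports used by the test suite above:

    from tractor import to_asyncio

    try:
        ...  # e.g. some `open_channel_from()` consumer block
    except to_asyncio.AsyncioTaskExited:
        # aio side finished early-but-silently; often a graceful EOF
        pass
    except to_asyncio.AsyncioCancelled:
        # aio side was cancelled out from under us; likely propagate
        raise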
# NOTE: more or less should be close to these: # NOTE: more or less should be close to these:
# 'boxed_type', # 'boxed_type',
@ -127,8 +170,8 @@ _body_fields: list[str] = list(
def get_err_type(type_name: str) -> BaseException|None: def get_err_type(type_name: str) -> BaseException|None:
''' '''
Look up an exception type by name from the set of locally Look up an exception type by name from the set of locally known
known namespaces: namespaces:
- `builtins` - `builtins`
- `tractor._exceptions` - `tractor._exceptions`
@ -139,6 +182,7 @@ def get_err_type(type_name: str) -> BaseException|None:
builtins, builtins,
_this_mod, _this_mod,
trio, trio,
bdb,
]: ]:
if type_ref := getattr( if type_ref := getattr(
ns, ns,
@ -358,6 +402,13 @@ class RemoteActorError(Exception):
self._ipc_msg.src_type_str self._ipc_msg.src_type_str
) )
if not self._src_type:
raise TypeError(
f'Failed to lookup src error type with '
f'`tractor._exceptions.get_err_type()` :\n'
f'{self.src_type_str}'
)
return self._src_type return self._src_type
@property @property
@ -366,6 +417,9 @@ class RemoteActorError(Exception):
String-name of the (last hop's) boxed error type. String-name of the (last hop's) boxed error type.
''' '''
# TODO, maybe support also serializing the
# `ExceptionGroup.exeptions: list[BaseException]` set under
# certain conditions?
bt: Type[BaseException] = self.boxed_type bt: Type[BaseException] = self.boxed_type
if bt: if bt:
return str(bt.__name__) return str(bt.__name__)
@ -378,9 +432,13 @@ class RemoteActorError(Exception):
Error type boxed by last actor IPC hop. Error type boxed by last actor IPC hop.
''' '''
if self._boxed_type is None: if (
self._boxed_type is None
and
(ipc_msg := self._ipc_msg)
):
self._boxed_type = get_err_type( self._boxed_type = get_err_type(
self._ipc_msg.boxed_type_str ipc_msg.boxed_type_str
) )
return self._boxed_type return self._boxed_type
@ -652,16 +710,10 @@ class RemoteActorError(Exception):
failing actor's remote env. failing actor's remote env.
''' '''
src_type_ref: Type[BaseException] = self.src_type
if not src_type_ref:
raise TypeError(
'Failed to lookup src error type:\n'
f'{self.src_type_str}'
)
# TODO: better tb insertion and all the fancier dunder # TODO: better tb insertion and all the fancier dunder
# metadata stuff as per `.__context__` etc. and friends: # metadata stuff as per `.__context__` etc. and friends:
# https://github.com/python-trio/trio/issues/611 # https://github.com/python-trio/trio/issues/611
src_type_ref: Type[BaseException] = self.src_type
return src_type_ref(self.tb_str) return src_type_ref(self.tb_str)
# TODO: local reconstruction of nested inception for a given
@ -787,8 +839,11 @@ class MsgTypeError(
''' '''
if ( if (
(_bad_msg := self.msgdata.get('_bad_msg')) (_bad_msg := self.msgdata.get('_bad_msg'))
and and (
isinstance(_bad_msg, PayloadMsg) isinstance(_bad_msg, PayloadMsg)
or
isinstance(_bad_msg, msgtypes.Start)
)
): ):
return _bad_msg return _bad_msg
@ -981,18 +1036,6 @@ class MessagingError(Exception):
''' '''
class AsyncioCancelled(Exception):
'''
Asyncio cancelled translation (non-base) error
for use with the ``to_asyncio`` module
to be raised in the ``trio`` side task
NOTE: this should NOT inherit from `asyncio.CancelledError` or
tests should break!
'''
def pack_error( def pack_error(
exc: BaseException|RemoteActorError, exc: BaseException|RemoteActorError,
@ -1104,6 +1147,8 @@ def unpack_error(
which is the responsibility of the caller. which is the responsibility of the caller.
''' '''
# XXX, apparently we pass all sorts of msgs here?
# kinda odd but seems like maybe they shouldn't be?
if not isinstance(msg, Error): if not isinstance(msg, Error):
return None return None
@ -1146,19 +1191,51 @@ def unpack_error(
def is_multi_cancelled( def is_multi_cancelled(
exc: BaseException|BaseExceptionGroup exc: BaseException|BaseExceptionGroup,
) -> bool:
ignore_nested: set[BaseException] = set(),
) -> bool|BaseExceptionGroup:
''' '''
Predicate to determine if a possible ``BaseExceptionGroup`` contains Predicate to determine if a `BaseExceptionGroup` only contains
only ``trio.Cancelled`` sub-exceptions (and is likely the result of some (maybe nested) set of sub-grouped exceptions (like only
cancelling a collection of subtasks. `trio.Cancelled`s which get swallowed silently by default) and is
thus the result of "gracefully cancelling" a collection of
sub-tasks (or other conc primitives) and receiving a "cancelled
ACK" from each after termination.
Docs:
----
- https://docs.python.org/3/library/exceptions.html#exception-groups
- https://docs.python.org/3/library/exceptions.html#BaseExceptionGroup.subgroup
''' '''
if (
not ignore_nested
or
trio.Cancelled in ignore_nested
# XXX always count-in `trio`'s native signal
):
ignore_nested.update({trio.Cancelled})
if isinstance(exc, BaseExceptionGroup): if isinstance(exc, BaseExceptionGroup):
return exc.subgroup( matched_exc: BaseExceptionGroup|None = exc.subgroup(
lambda exc: isinstance(exc, trio.Cancelled) tuple(ignore_nested),
) is not None
# TODO, complain about why not allowed XD
# condition=tuple(ignore_nested),
)
if matched_exc is not None:
return matched_exc
# NOTE, IFF no excs types match (throughout the error-tree)
# -> return `False`, OW return the matched sub-eg.
#
# IOW, for the inverse of ^ for the purpose of
# maybe-enter-REPL--logic: "only debug when the err-tree contains
# at least one exc-type NOT in `ignore_nested`" ; i.e. the case where
# we fallthrough and return `False` here.
return False return False
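A minimal usage sketch of the reworked predicate as diffed above;
the `WorkerShutdown` type is a hypothetical stand-in for any
"graceful shutdown" signal you might want counted via `ignore_nested`:

.. code:: python

    from tractor._exceptions import is_multi_cancelled

    class WorkerShutdown(Exception):
        '''
        Hypothetical "graceful shutdown" signal.
        '''

    # an error-tree of only "cancel ACK"-style excs: the matched
    # (truthy) sub-group is returned.
    eg = BaseExceptionGroup(
        'graceful-cancel',
        [
            WorkerShutdown(),
            BaseExceptionGroup('nested', [WorkerShutdown()]),
        ],
    )
    assert is_multi_cancelled(
        eg,
        ignore_nested={WorkerShutdown},
    )

    # no exc-type in the tree matches the ignore-set (nor
    # `trio.Cancelled`) -> fallthrough to `False`, the
    # "maybe-enter-REPL" case.
    assert is_multi_cancelled(
        BaseExceptionGroup('crash', [ValueError()]),
    ) is False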

View File

@@ -255,8 +255,8 @@ class MsgpackTCPStream(MsgTransport):
                 raise TransportClosed(
                     message=(
                         f'IPC transport already closed by peer\n'
-                        f'x)> {type(trans_err)}\n'
+                        f'x]> {type(trans_err)}\n'
                         f' |_{self}\n'
                     ),
                     loglevel=loglevel,
                 ) from trans_err
@@ -273,8 +273,8 @@ class MsgpackTCPStream(MsgTransport):
                 raise TransportClosed(
                     message=(
                         f'IPC transport already manually closed locally?\n'
-                        f'x)> {type(closure_err)} \n'
+                        f'x]> {type(closure_err)} \n'
                         f' |_{self}\n'
                     ),
                     loglevel='error',
                     raise_on_report=(
@@ -289,8 +289,8 @@ class MsgpackTCPStream(MsgTransport):
             raise TransportClosed(
                 message=(
                     f'IPC transport already gracefully closed\n'
-                    f')>\n'
-                    f'|_{self}\n'
+                    f']>\n'
+                    f' |_{self}\n'
                 ),
                 loglevel='transport',
                 # cause=???  # handy or no?

View File

@@ -533,6 +533,10 @@ async def open_portal(
     async with maybe_open_nursery(
         tn,
         shield=shield,
+        strict_exception_groups=False,
+        # ^XXX^ TODO? soo roll our own then ??
+        # -> since we kinda want the "if only one `.exception` then
+        # just raise that" interface?
     ) as tn:

         if not channel.connected():

View File

@@ -95,6 +95,13 @@ async def open_root_actor(
     hide_tb: bool = True,

+    # XXX, proxied directly to `.devx._debug._maybe_enter_pm()`
+    # for REPL-entry logic.
+    debug_filter: Callable[
+        [BaseException|BaseExceptionGroup],
+        bool,
+    ] = lambda err: not is_multi_cancelled(err),
+
     # TODO, a way for actors to augment passing derived
     # read-only state to sublayers?
     # extra_rt_vars: dict|None = None,
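A short sketch of overriding the new `debug_filter` input so the
root actor only REPL-enters on a specific error type (assuming the
usual `debug_mode` flag is also set):

.. code:: python

    import tractor
    import trio

    async def main() -> None:
        async with tractor.open_root_actor(
            debug_mode=True,
            # only crash-handle `RuntimeError`s; the default
            # instead filters out any (nested) cancellation
            # groups via `not is_multi_cancelled(err)`.
            debug_filter=lambda err: isinstance(err, RuntimeError),
        ):
            raise RuntimeError('enter the REPL for me!')

    if __name__ == '__main__':
        trio.run(main)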
@@ -104,8 +111,8 @@ async def open_root_actor(
     Runtime init entry point for ``tractor``.

     '''
-    __tracebackhide__: bool = hide_tb
     _debug.hide_runtime_frames()
+    __tracebackhide__: bool = hide_tb

     # TODO: stick this in a `@cm` defined in `devx._debug`?
     #
@@ -334,6 +341,10 @@ async def open_root_actor(
             loglevel=loglevel,
             enable_modules=enable_modules,
         )
+        # XXX, in case the root actor runtime was actually run from
+        # `tractor.to_asyncio.run_as_asyncio_guest()` and NOT
+        # `.trio.run()`.
+        actor._infected_aio = _state._runtime_vars['_is_infected_aio']

         # Start up main task set via core actor-runtime nurseries.
         try:
@@ -351,7 +362,10 @@ async def open_root_actor(
             )

             # start the actor runtime in a new task
-            async with trio.open_nursery() as nursery:
+            async with trio.open_nursery(
+                strict_exception_groups=False,
+                # ^XXX^ TODO? instead unpack any RAE as per "loose" style?
+            ) as nursery:

                 # ``_runtime.async_main()`` creates an internal nursery
                 # and blocks here until any underlying actor(-process)
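The `strict_exception_groups=False` input (repeated in several hunks
below) opts back into `trio`'s legacy "loose" semantics where a lone
task failure propagates bare instead of wrapped in an
`ExceptionGroup`; a tiny demo of the difference, assuming a `trio`
version where the flag still exists:

.. code:: python

    import trio

    async def boom() -> None:
        raise ValueError('lone failure')

    async def main() -> None:
        # strict (the modern default): even a single error
        # arrives wrapped in an eg.
        try:
            async with trio.open_nursery() as tn:
                tn.start_soon(boom)
        except ExceptionGroup as eg:
            assert isinstance(eg.exceptions[0], ValueError)

        # loose: a single error propagates bare, the interface
        # this runtime code still relies on.
        try:
            async with trio.open_nursery(
                strict_exception_groups=False,
            ) as tn:
                tn.start_soon(boom)
        except ValueError:
            pass

    trio.run(main)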
@@ -375,6 +389,13 @@ async def open_root_actor(
             Exception,
             BaseExceptionGroup,
         ) as err:
+            # TODO, in beginning to handle the subsubactor with
+            # crashed grandparent cases..
+            #
+            # was_locked: bool = await _debug.maybe_wait_for_debugger(
+            #     child_in_debug=True,
+            # )

             # XXX NOTE XXX see equiv note inside
             # `._runtime.Actor._stream_handler()` where in the
             # non-root or root-that-opened-this-manually case we
@@ -383,11 +404,15 @@ async def open_root_actor(
             entered: bool = await _debug._maybe_enter_pm(
                 err,
                 api_frame=inspect.currentframe(),
+                debug_filter=debug_filter,
             )

             if (
                 not entered
                 and
-                not is_multi_cancelled(err)
+                not is_multi_cancelled(
+                    err,
+                )
             ):
                 logger.exception('Root actor crashed\n')
@@ -441,12 +466,19 @@ def run_daemon(
     start_method: str | None = None,
     debug_mode: bool = False,

+    # TODO, support `infected_aio=True` mode by,
+    # - calling the appropriate entrypoint-func from `.to_asyncio`
+    # - maybe init-ing `greenback` as done above in
+    #   `open_root_actor()`.
+
     **kwargs

 ) -> None:
     '''
-    Spawn daemon actor which will respond to RPC; the main task simply
-    starts the runtime and then sleeps forever.
+    Spawn a root (daemon) actor which will respond to RPC; the main
+    task simply starts the runtime and then blocks via embedded
+    `trio.sleep_forever()`.

     This is a very minimal convenience wrapper around starting
     a "run-until-cancelled" root actor which can be started with a set
@@ -459,7 +491,6 @@ def run_daemon(
         importlib.import_module(path)

     async def _main():
-
         async with open_root_actor(
             registry_addrs=registry_addrs,
             name=name,
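A minimal usage sketch, assuming the usual `enable_modules` input
whose paths are imported by the loop above (`mylib.workers` is
a hypothetical stand-in for your own RPC-serving module):

.. code:: python

    import tractor

    if __name__ == '__main__':
        # block serving RPC from `mylib.workers` until cancelled,
        # e.g. via SIGINT.
        tractor.run_daemon(
            enable_modules=['mylib.workers'],
            name='my_daemon',
            loglevel='info',
        )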

View File

@@ -621,7 +621,11 @@ async def _invoke(
     tn: trio.Nursery
     rpc_ctx_cs: CancelScope
     async with (
-        trio.open_nursery() as tn,
+        trio.open_nursery(
+            strict_exception_groups=False,
+            # ^XXX^ TODO? instead unpack any RAE as per "loose" style?
+        ) as tn,
         msgops.maybe_limit_plds(
             ctx=ctx,
             spec=ctx_meta.get('pld_spec'),
@@ -734,8 +738,8 @@ async def _invoke(
     # XXX: do we ever trigger this block any more?
     except (
         BaseExceptionGroup,
-        trio.Cancelled,
         BaseException,
+        trio.Cancelled,
     ) as scope_error:

         if (
@@ -848,8 +852,8 @@ async def try_ship_error_to_remote(
         log.critical(
             'IPC transport failure -> '
             f'failed to ship error to {remote_descr}!\n\n'
-            f'X=> {channel.uid}\n\n'
+            f'{type(msg)!r}[{msg.boxed_type_str}] X=> {channel.uid}\n'
+            f'\n'
             # TODO: use `.msg.pretty_struct` for this!
             f'{msg}\n'
         )

View File

@@ -1287,7 +1287,8 @@ class Actor:
         msg: str = (
             f'Actor-runtime cancel request from {requester_type}\n\n'
             f'<=c) {requesting_uid}\n'
             f' |_{self}\n'
+            f'\n'
         )

         # TODO: what happens here when we self-cancel tho?
@@ -1307,13 +1308,15 @@ class Actor:
             lock_req_ctx.has_outcome
         ):
             msg += (
-                '-> Cancelling active debugger request..\n'
+                f'\n'
+                f'-> Cancelling active debugger request..\n'
                 f'|_{_debug.Lock.repr()}\n\n'
                 f'|_{lock_req_ctx}\n\n'
             )
             # lock_req_ctx._scope.cancel()
             # TODO: wrap this in a method-API..
             debug_req.req_cs.cancel()
+            # if lock_req_ctx:

         # self-cancel **all** ongoing RPC tasks
         await self.cancel_rpc_tasks(
@@ -1722,11 +1725,15 @@ async def async_main(
         # parent is kept alive as a resilient service until
         # cancellation steps have (mostly) occurred in
         # a deterministic way.
-        async with trio.open_nursery() as root_nursery:
+        async with trio.open_nursery(
+            strict_exception_groups=False,
+        ) as root_nursery:
             actor._root_n = root_nursery
             assert actor._root_n

-            async with trio.open_nursery() as service_nursery:
+            async with trio.open_nursery(
+                strict_exception_groups=False,
+            ) as service_nursery:
                 # This nursery is used to handle all inbound
                 # connections to us such that if the TCP server
                 # is killed, connections can continue to process

View File

@@ -323,9 +323,10 @@ async def soft_kill(
     uid: tuple[str, str] = portal.channel.uid
     try:
         log.cancel(
-            'Soft killing sub-actor via portal request\n'
-            f'c)> {portal.chan.uid}\n'
-            f' |_{proc}\n'
+            f'Soft killing sub-actor via portal request\n'
+            f'\n'
+            f'(c=> {portal.chan.uid}\n'
+            f'  |_{proc}\n'
         )
         # wait on sub-proc to signal termination
         await wait_func(proc)

View File

@@ -108,6 +108,7 @@ def is_main_process() -> bool:
     return mp.current_process().name == 'MainProcess'

+# TODO, more verby name?
 def debug_mode() -> bool:
     '''
     Bool determining if "debug mode" is on which enables

View File

@@ -376,7 +376,7 @@ class MsgStream(trio.abc.Channel):
             f'Stream self-closed by {self._ctx.side!r}-side before EoC\n'
             # } bc a stream is a "scope"/msging-phase inside an IPC
             f'x}}>\n'
-            f'|_{self}\n'
+            f' |_{self}\n'
         )
         log.cancel(message)
         self._eoc = trio.EndOfChannel(message)

View File

@@ -395,17 +395,23 @@ async def _open_and_supervise_one_cancels_all_nursery(
     # `ActorNursery.start_actor()`).

     # errors from this daemon actor nursery bubble up to caller
-    async with trio.open_nursery() as da_nursery:
+    async with trio.open_nursery(
+        strict_exception_groups=False,
+        # ^XXX^ TODO? instead unpack any RAE as per "loose" style?
+    ) as da_nursery:
         try:
             # This is the inner level "run in actor" nursery. It is
             # awaited first since actors spawned in this way (using
-            # ``ActorNursery.run_in_actor()``) are expected to only
+            # `ActorNursery.run_in_actor()`) are expected to only
             # return a single result and then complete (i.e. be cancelled
             # gracefully). Errors collected from these actors are
             # immediately raised for handling by a supervisor strategy.
             # As such if the strategy propagates any error(s) upwards
             # the above "daemon actor" nursery will be notified.
-            async with trio.open_nursery() as ria_nursery:
+            async with trio.open_nursery(
+                strict_exception_groups=False,
+                # ^XXX^ TODO? instead unpack any RAE as per "loose" style?
+            ) as ria_nursery:

                 an = ActorNursery(
                     actor,
@@ -472,8 +478,8 @@ async def _open_and_supervise_one_cancels_all_nursery(
                     ContextCancelled,
                 }:
                     log.cancel(
-                        'Actor-nursery caught remote cancellation\n\n'
+                        'Actor-nursery caught remote cancellation\n'
+                        '\n'
                         f'{inner_err.tb_str}'
                     )
                 else:
@@ -565,7 +571,9 @@ async def _open_and_supervise_one_cancels_all_nursery(
 @acm
 # @api_frame
 async def open_nursery(
+    hide_tb: bool = True,
     **kwargs,
+    # ^TODO, paramspec for `open_root_actor()`

 ) -> typing.AsyncGenerator[ActorNursery, None]:
     '''
@@ -583,7 +591,7 @@ async def open_nursery(
     which cancellation scopes correspond to each spawned subactor set.

     '''
-    __tracebackhide__: bool = True
+    __tracebackhide__: bool = hide_tb
     implicit_runtime: bool = False
     actor: Actor = current_actor(err_on_no_runtime=False)
     an: ActorNursery|None = None
@@ -599,7 +607,10 @@ async def open_nursery(
             # mark us for teardown on exit
             implicit_runtime: bool = True

-            async with open_root_actor(**kwargs) as actor:
+            async with open_root_actor(
+                hide_tb=hide_tb,
+                **kwargs,
+            ) as actor:
                 assert actor is current_actor()

                 try:
@@ -637,8 +648,10 @@ async def open_nursery(
         # show frame on any internal runtime-scope error
         if (
             an
-            and not an.cancelled
-            and an._scope_error
+            and
+            not an.cancelled
+            and
+            an._scope_error
         ):
             __tracebackhide__: bool = False

View File

@@ -19,7 +19,10 @@ Various helpers/utils for auditing your `tractor` app and/or the
 core runtime.

 '''
-from contextlib import asynccontextmanager as acm
+from contextlib import (
+    asynccontextmanager as acm,
+)
+import os
 import pathlib

 import tractor
@@ -59,7 +62,12 @@ def mk_cmd(
     exs_subpath: str = 'debugging',
 ) -> str:
     '''
-    Generate a shell command suitable to pass to ``pexpect.spawn()``.
+    Generate a shell command suitable to pass to `pexpect.spawn()`
+    which runs the script as a python program's entrypoint.
+
+    In particular ensure we disable the new tb coloring via unsetting
+    `$PYTHON_COLORS` so that `pexpect` can pattern match without
+    color-escape-codes.

     '''
     script_path: pathlib.Path = (
@@ -67,10 +75,15 @@ def mk_cmd(
         / exs_subpath
         / f'{ex_name}.py'
     )
-    return ' '.join([
+    py_cmd: str = ' '.join([
         'python',
         str(script_path)
     ])
+    # XXX, required for py 3.13+
+    # https://docs.python.org/3/using/cmdline.html#using-on-controlling-color
+    # https://docs.python.org/3/using/cmdline.html#envvar-PYTHON_COLORS
+    os.environ['PYTHON_COLORS'] = '0'
+    return py_cmd

 @acm
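Why this matters for the test suite: colorized tracebacks embed ANSI
escape codes between the characters `pexpect` tries to match. A tiny
sketch of the failure mode being avoided (the inline child script is
just an illustrative stand-in):

.. code:: python

    import os
    import pexpect

    # ensure the child emits colorless tracebacks so literal
    # pattern matches can't be broken up by ANSI escapes
    # (honored by py 3.13+ per the docs linked above).
    os.environ['PYTHON_COLORS'] = '0'

    child = pexpect.spawn(
        'python',
        ['-c', "raise ValueError('boom')"],
    )
    child.expect('ValueError')  # flaky when the tb is colorized
    child.close()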

View File

@ -41,3 +41,38 @@ from .pformat import (
pformat_caller_frame as pformat_caller_frame, pformat_caller_frame as pformat_caller_frame,
pformat_boxed_tb as pformat_boxed_tb, pformat_boxed_tb as pformat_boxed_tb,
) )
def _enable_readline_feats() -> str:
'''
Handle `readline` when compiled with `libedit` to avoid breaking
tab completion in `pdbp` (and its dep `tabcompleter`)
particularly since `uv` cpython distros are compiled this way..
See docs for deats,
https://docs.python.org/3/library/readline.html#module-readline
Originally discovered soln via SO answer,
https://stackoverflow.com/q/49287102
'''
import readline
if (
# 3.13+ attr
# https://docs.python.org/3/library/readline.html#readline.backend
(getattr(readline, 'backend', False) == 'libedit')
or
'libedit' in readline.__doc__
):
readline.parse_and_bind("python:bind -v")
readline.parse_and_bind("python:bind ^I rl_complete")
return 'libedit'
else:
readline.parse_and_bind("tab: complete")
readline.parse_and_bind("set editing-mode vi")
readline.parse_and_bind("set keymap vi")
return 'readline'
# TODO, move this to a new `.devx._pdbp` mod?
_enable_readline_feats()

View File

@@ -75,6 +75,7 @@ from tractor import _state
 from tractor._exceptions import (
     InternalError,
     NoRuntime,
+    is_multi_cancelled,
 )
 from tractor._state import (
     current_actor,
@@ -316,7 +317,6 @@ class Lock:
         we_released: bool = False
         ctx_in_debug: Context|None = cls.ctx_in_debug
         repl_task: Task|Thread|None = DebugStatus.repl_task
-
         try:
             if not DebugStatus.is_main_trio_thread():
                 thread: threading.Thread = threading.current_thread()
@@ -331,6 +331,10 @@ class Lock:
                     return False

             task: Task = current_task()
+            message: str = (
+                'TTY NOT RELEASED on behalf of caller\n'
+                f'|_{task}\n'
+            )

             # sanity check that if we're the root actor
             # the lock is marked as such.
@@ -345,11 +349,6 @@ class Lock:
             else:
                 assert DebugStatus.repl_task is not task

-            message: str = (
-                'TTY lock was NOT released on behalf of caller\n'
-                f'|_{task}\n'
-            )
-
             lock: trio.StrictFIFOLock = cls._debug_lock
             owner: Task = lock.statistics().owner
             if (
@@ -364,23 +363,21 @@ class Lock:
                 # correct task, greenback-spawned-task and/or thread
                 # being set to the `.repl_task` such that the above
                 # condition matches and we actually release the lock.
-                #
+
                 # This is particular of note from `.pause_from_sync()`!
             ):
                 cls._debug_lock.release()
                 we_released: bool = True
                 if repl_task:
                     message: str = (
-                        'Lock released on behalf of root-actor-local REPL owner\n'
+                        'TTY released on behalf of root-actor-local REPL owner\n'
                         f'|_{repl_task}\n'
                     )
                 else:
                     message: str = (
-                        'TTY lock released by us on behalf of remote peer?\n'
-                        f'|_ctx_in_debug: {ctx_in_debug}\n\n'
+                        'TTY released by us on behalf of remote peer?\n'
+                        f'{ctx_in_debug}\n'
                     )
-                    # mk_pdb().set_trace()
-                    # elif owner:

         except RuntimeError as rte:
             log.exception(
@@ -398,7 +395,8 @@ class Lock:
         req_handler_finished: trio.Event|None = Lock.req_handler_finished
         if (
             not lock_stats.owner
-            and req_handler_finished is None
+            and
+            req_handler_finished is None
         ):
             message += (
                 '-> No new task holds the TTY lock!\n\n'
@@ -416,8 +414,8 @@ class Lock:
                 repl_task
             )
             message += (
-                f'A non-caller task still owns this lock on behalf of '
-                f'`{behalf_of_task}`\n'
+                f'A non-caller task still owns this lock on behalf of\n'
+                f'{behalf_of_task}\n'
                 f'lock owner task: {lock_stats.owner}\n'
             )
@@ -443,7 +441,8 @@ class Lock:
                 f'|_{repl_task}\n'
             )

-        log.devx(message)
+        if message:
+            log.devx(message)

         return we_released
@@ -663,10 +662,11 @@ async def lock_stdio_for_peer(
         fail_reason: str = (
             f'on behalf of peer\n\n'
             f'x)<=\n'
-            f' |_{subactor_task_uid!r}@{ctx.chan.uid!r}\n\n'
+            f' |_{subactor_task_uid!r}@{ctx.chan.uid!r}\n'
+            f'\n'
             'Forcing `Lock.release()` due to acquire failure!\n\n'
-            f'x)=> {ctx}\n'
+            f'x)=>\n'
+            f' {ctx}'
         )
         if isinstance(req_err, trio.Cancelled):
             fail_reason = (
@@ -1174,7 +1174,7 @@ async def request_root_stdio_lock(
     log.devx(
         'Initing stdio-lock request task with root actor'
     )

-    # TODO: likely we can implement this mutex more generally as
+    # TODO: can we implement this mutex more generally as
     #       a `._sync.Lock`?
     # -[ ] simply add the wrapping needed for the debugger specifics?
     #      - the `__pld_spec__` impl and maybe better APIs for the client
@@ -1185,6 +1185,7 @@ async def request_root_stdio_lock(
     # - https://docs.python.org/3.8/library/multiprocessing.html#multiprocessing.RLock
     DebugStatus.req_finished = trio.Event()
     DebugStatus.req_task = current_task()
+    req_err: BaseException|None = None
     try:
         from tractor._discovery import get_root
         # NOTE: we need this to ensure that this task exits
@@ -1207,6 +1208,7 @@ async def request_root_stdio_lock(
         # )
         DebugStatus.req_cs = req_cs
         req_ctx: Context|None = None
+        ctx_eg: BaseExceptionGroup|None = None
         try:
             # TODO: merge into single async with ?
             async with get_root() as portal:
@@ -1237,7 +1239,12 @@ async def request_root_stdio_lock(
                 )

                 # try:
-                assert status.subactor_uid == actor_uid
+                if (locker := status.subactor_uid) != actor_uid:
+                    raise DebugStateError(
+                        f'Root actor locked by another peer !?\n'
+                        f'locker: {locker!r}\n'
+                        f'actor_uid: {actor_uid}\n'
+                    )
                 assert status.cid
                 # except AttributeError:
                 #   log.exception('failed pldspec asserts!')
@@ -1274,10 +1281,11 @@ async def request_root_stdio_lock(
                     f'Exiting {req_ctx.side!r}-side of locking req_ctx\n'
                 )

-        except (
+        except* (
             tractor.ContextCancelled,
             trio.Cancelled,
-        ):
+        ) as _taskc_eg:
+            ctx_eg = _taskc_eg
             log.cancel(
                 'Debug lock request was CANCELLED?\n\n'
                 f'<=c) {req_ctx}\n'
@@ -1286,21 +1294,23 @@ async def request_root_stdio_lock(
             )
             raise

-        except (
+        except* (
             BaseException,
-        ) as ctx_err:
+        ) as _ctx_eg:
+            ctx_eg = _ctx_eg
             message: str = (
-                'Failed during debug request dialog with root actor?\n\n'
+                'Failed during debug request dialog with root actor?\n'
             )
             if (req_ctx := DebugStatus.req_ctx):
                 message += (
-                    f'<=x) {req_ctx}\n\n'
+                    f'<=x)\n'
+                    f' |_{req_ctx}\n'
                     f'Cancelling IPC ctx!\n'
                 )
                 try:
                     await req_ctx.cancel()
                 except trio.ClosedResourceError as terr:
-                    ctx_err.add_note(
+                    ctx_eg.add_note(
                         # f'Failed with {type(terr)!r} x)> `req_ctx.cancel()` '
                         f'Failed with `req_ctx.cancel()` <x) {type(terr)!r} '
                     )
@@ -1309,21 +1319,45 @@ async def request_root_stdio_lock(
                 message += 'Failed in `Portal.open_context()` call ??\n'

             log.exception(message)
-            ctx_err.add_note(message)
-            raise ctx_err
+            ctx_eg.add_note(message)
+            raise ctx_eg

-    except (
-        tractor.ContextCancelled,
-        trio.Cancelled,
-    ):
-        log.cancel(
-            'Debug lock request CANCELLED?\n'
-            f'{req_ctx}\n'
-        )
-        raise
-
-    except BaseException as req_err:
-        # log.error('Failed to request root stdio-lock?')
+    except BaseException as _req_err:
+        req_err = _req_err
+
+        # XXX NOTE, since new `trio` enforces strict egs by default
+        # we have to always handle the eg explicitly given the
+        # `Portal.open_context()` call above (which implicitly opens
+        # a nursery).
+        match req_err:
+            case BaseExceptionGroup():
+                # for an eg of just one taskc, just unpack and raise
+                # since we want to propagate a plain ol' `Cancelled`
+                # up from the `.pause()` call.
+                excs: list[BaseException] = req_err.exceptions
+                if (
+                    len(excs) == 1
+                    and
+                    type(exc := excs[0]) in (
+                        tractor.ContextCancelled,
+                        trio.Cancelled,
+                    )
+                ):
+                    log.cancel(
+                        'Debug lock request CANCELLED?\n'
+                        f'{req_ctx}\n'
+                    )
+                    raise exc
+
+            case (
+                tractor.ContextCancelled(),
+                trio.Cancelled(),
+            ):
+                log.cancel(
+                    'Debug lock request CANCELLED?\n'
+                    f'{req_ctx}\n'
+                )
+                raise exc

         DebugStatus.req_err = req_err
         DebugStatus.release()
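The `match`-based unpacking above is needed because
`Portal.open_context()` opens a nursery internally, so under strict
egs even a lone cancellation arrives wrapped; a toy sketch of the
same unpack-a-single-exc pattern (`maybe_unpack_single()` is
a hypothetical helper, not part of this diff):

.. code:: python

    def maybe_unpack_single(
        beg: BaseExceptionGroup,
        types: tuple[type[BaseException], ...],
    ) -> BaseException|BaseExceptionGroup:
        # mirror the logic above: a group of exactly one exc
        # of the given types "unpacks" to the bare exc.
        excs: list[BaseException] = list(beg.exceptions)
        if (
            len(excs) == 1
            and
            type(excs[0]) in types
        ):
            return excs[0]
        return beg

    eg = ExceptionGroup('eg', [ValueError('one')])
    assert isinstance(
        maybe_unpack_single(eg, (ValueError,)),
        ValueError,
    )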
@@ -1338,7 +1372,7 @@ async def request_root_stdio_lock(
             'Failed during stdio-locking dialog from root actor\n\n'
             f'<=x)\n'
-            f'|_{DebugStatus.req_ctx}\n'
+            f' |_{DebugStatus.req_ctx}\n'
         ) from req_err

     finally:
@@ -1401,7 +1435,7 @@ def any_connected_locker_child() -> bool:
     actor: Actor = current_actor()

     if not is_root_process():
-        raise RuntimeError('This is a root-actor only API!')
+        raise InternalError('This is a root-actor only API!')

     if (
         (ctx := Lock.ctx_in_debug)
@@ -1743,7 +1777,7 @@ async def _pause(
     ] = trio.TASK_STATUS_IGNORED,
     **debug_func_kwargs,

-) -> tuple[PdbREPL, Task]|None:
+) -> tuple[Task, PdbREPL]|None:
     '''
     Inner impl for `pause()` to avoid the `trio.CancelScope.__exit__()`
     stack frame when not shielded (since apparently i can't figure out
@@ -1929,7 +1963,7 @@ async def _pause(
             )
             with trio.CancelScope(shield=shield):
                 await trio.lowlevel.checkpoint()
-            return repl, task
+            return (repl, task)

         # elif repl_task:
         #     log.warning(
@@ -2138,11 +2172,12 @@ async def _pause(
     # `_enter_repl_sync()` into a common @cm?
     except BaseException as _pause_err:
         pause_err: BaseException = _pause_err
+        _repl_fail_report: str|None = _repl_fail_msg
         if isinstance(pause_err, bdb.BdbQuit):
             log.devx(
                 'REPL for pdb was explicitly quit!\n'
             )
-            _repl_fail_msg = None
+            _repl_fail_report = None

         # when the actor is mid-runtime cancellation the
         # `Actor._service_n` might get closed before we can spawn
@@ -2162,16 +2197,16 @@ async def _pause(
             return

         elif isinstance(pause_err, trio.Cancelled):
-            _repl_fail_msg = (
+            _repl_fail_report += (
                 'You called `tractor.pause()` from an already cancelled scope!\n\n'
                 'Consider `await tractor.pause(shield=True)` to make it work B)\n'
             )

         else:
-            _repl_fail_msg += f'on behalf of {repl_task} ??\n'
+            _repl_fail_report += f'on behalf of {repl_task} ??\n'

-        if _repl_fail_msg:
-            log.exception(_repl_fail_msg)
+        if _repl_fail_report:
+            log.exception(_repl_fail_report)

         if not actor.is_infected_aio():
             DebugStatus.release(cancel_req_task=True)
@@ -2252,6 +2287,13 @@ def _set_trace(
     repl.set_trace(frame=caller_frame)

+# XXX TODO! XXX, ensure `pytest -s` doesn't just
+# hang on this being called in a test.. XD
+# -[ ] maybe something in our test suite or is there
+#     some way we can detect output capture is enabled
+#     from the process itself?
+#     |_ronny: ?
+#
 async def pause(
     *,
     hide_tb: bool = True,
@@ -2530,26 +2572,17 @@ def pause_from_sync(
         f'{actor.uid} task called `tractor.pause_from_sync()`\n'
     )

-    # TODO: once supported, remove this AND the one
-    # inside `._pause()`!
-    # outstanding impl fixes:
-    # -[ ] need to make `.shield_sigint()` below work here!
-    # -[ ] how to handle `asyncio`'s new SIGINT-handler
-    #     injection?
-    # -[ ] should `breakpoint()` work and what does it normally
-    #     do in `asyncio` ctxs?
-    # if actor.is_infected_aio():
-    #     raise RuntimeError(
-    #         '`tractor.pause[_from_sync]()` not yet supported '
-    #         'for infected `asyncio` mode!'
-    #     )
-
     repl: PdbREPL = mk_pdb()
     # message += f'-> created local REPL {repl}\n'
     is_trio_thread: bool = DebugStatus.is_main_trio_thread()
     is_root: bool = is_root_process()
-    is_aio: bool = actor.is_infected_aio()
+    is_infected_aio: bool = actor.is_infected_aio()
+    thread: Thread = threading.current_thread()
+    asyncio_task: asyncio.Task|None = None
+    if is_infected_aio:
+        asyncio_task = asyncio.current_task()

     # TODO: we could also check for a non-`.to_thread` context
     # using `trio.from_thread.check_cancelled()` (says
@@ -2565,24 +2598,18 @@ def pause_from_sync(
     if (
         not is_trio_thread
         and
-        not is_aio  # see below for this usage
+        not asyncio_task
     ):
         # TODO: `threading.Lock()` this so we don't get races in
         # multi-thr cases where they're acquiring/releasing the
         # REPL and setting request/`Lock` state, etc..
-        thread: threading.Thread = threading.current_thread()
-        repl_owner = thread
+        repl_owner: Thread = thread

         # TODO: make root-actor bg thread usage work!
-        if (
-            is_root
-            # or
-            # is_aio
-        ):
-            if is_root:
-                message += (
-                    f'-> called from a root-actor bg {thread}\n'
-                )
+        if is_root:
+            message += (
+                f'-> called from a root-actor bg {thread}\n'
+            )

         message += (
             '-> scheduling `._pause_from_bg_root_thread()`..\n'
@@ -2637,34 +2664,95 @@ def pause_from_sync(
             DebugStatus.shield_sigint()
             assert bg_task is not DebugStatus.repl_task

+        # TODO: once supported, remove this AND the one
+        # inside `._pause()`!
+        # outstanding impl fixes:
+        # -[ ] need to make `.shield_sigint()` below work here!
+        # -[ ] how to handle `asyncio`'s new SIGINT-handler
+        #     injection?
+        # -[ ] should `breakpoint()` work and what does it normally
+        #     do in `asyncio` ctxs?
+        # if actor.is_infected_aio():
+        #     raise RuntimeError(
+        #         '`tractor.pause[_from_sync]()` not yet supported '
+        #         'for infected `asyncio` mode!'
+        #     )
         elif (
             not is_trio_thread
             and
-            is_aio
+            is_infected_aio  # as in, the special actor-runtime mode
+            # ^NOTE XXX, that doesn't mean the caller is necessarily
+            # an `asyncio.Task` just that `trio` has been embedded on
+            # the `asyncio` event loop!
+            and
+            asyncio_task  # transitive caller is an actual `asyncio.Task`
         ):
             greenback: ModuleType = maybe_import_greenback()
-            repl_owner: Task = asyncio.current_task()
-            DebugStatus.shield_sigint()
-            fute: asyncio.Future = run_trio_task_in_future(
-                partial(
-                    _pause,
-                    debug_func=None,
-                    repl=repl,
-                    hide_tb=hide_tb,
-                    # XXX to prevent `._pause()` for setting
-                    # `DebugStatus.repl_task` to the gb task!
-                    called_from_sync=True,
-                    called_from_bg_thread=True,
-                    **_pause_kwargs
-                )
-            )
-            # TODO: for async version -> `.pause_from_aio()`?
-            # bg_task, _ = await fute
-            bg_task, _ = greenback.await_(fute)
-            bg_task: asyncio.Task = asyncio.current_task()
+
+            if greenback.has_portal():
+                DebugStatus.shield_sigint()
+                fute: asyncio.Future = run_trio_task_in_future(
+                    partial(
+                        _pause,
+                        debug_func=None,
+                        repl=repl,
+                        hide_tb=hide_tb,
+                        # XXX to prevent `._pause()` for setting
+                        # `DebugStatus.repl_task` to the gb task!
+                        called_from_sync=True,
+                        called_from_bg_thread=True,
+                        **_pause_kwargs
+                    )
+                )
+                repl_owner = asyncio_task
+                bg_task, _ = greenback.await_(fute)
+                # TODO: ASYNC version -> `.pause_from_aio()`?
+                # bg_task, _ = await fute
+
+            # handle the case where an `asyncio` task has been
+            # spawned WITHOUT enabling a `greenback` portal..
+            # => can often happen in 3rd party libs.
+            else:
+                bg_task = repl_owner
+
+                # TODO, ostensibly we can just acquire the
+                # debug lock directly presuming we're the
+                # root actor running in infected asyncio
+                # mode?
+                #
+                # TODO, this would be a special case where
+                # a `_pause_from_root()` would come in very
+                # handy!
+                # if is_root:
+                #     import pdbp; pdbp.set_trace()
+                #     log.warning(
+                #         'Allowing `asyncio` task to acquire debug-lock in root-actor..\n'
+                #         'This is not fully implemented yet; there may be teardown hangs!\n\n'
+                #     )
+                # else:
+
+                # simply unsupported, since there exists no hack (i
+                # can think of) to workaround this in a subactor
+                # which needs to lock the root's REPL ow we're sure
+                # to get prompt stdstreams clobbering..
+                cf_repr: str = ''
+                if api_frame:
+                    caller_frame: FrameType = api_frame.f_back
+                    cf_repr: str = f'caller_frame: {caller_frame!r}\n'
+
+                raise RuntimeError(
+                    f"CAN'T USE `greenback._await()` without a portal !?\n\n"
+                    f'Likely this task was NOT spawned via the `tractor.to_asyncio` API..\n'
+                    f'{asyncio_task}\n'
+                    f'{cf_repr}\n'
+                    f'Prolly the task was started out-of-band (from some lib?)\n'
+                    f'AND one of the below was never called ??\n'
+                    f'- greenback.ensure_portal()\n'
+                    f'- greenback.bestow_portal(<task>)\n'
+                )

         else:  # we are presumably the `trio.run()` + main thread
             # raises on not-found by default
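For the `asyncio`-task branch above to reach the REPL, the task must
have a `greenback` portal; a minimal sketch of enabling one from an
out-of-band `asyncio` task (assuming an infected-`asyncio` actor
runtime is already up):

.. code:: python

    import greenback
    import tractor

    async def my_aio_task() -> None:
        # grant THIS `asyncio` task a portal so that the
        # `greenback.await_()` call above can re-enter `trio`;
        # a spawner can equivalently call
        # `greenback.bestow_portal(task)` on its behalf.
        await greenback.ensure_portal()
        tractor.pause_from_sync()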
@@ -2915,8 +3003,14 @@ async def _maybe_enter_pm(
     tb: TracebackType|None = None,
     api_frame: FrameType|None = None,
     hide_tb: bool = False,

+    # only enter debugger REPL when returns `True`
+    debug_filter: Callable[
+        [BaseException|BaseExceptionGroup],
+        bool,
+    ] = lambda err: not is_multi_cancelled(err),
 ):
-    from tractor._exceptions import is_multi_cancelled
-
     if (
         debug_mode()
@@ -2933,7 +3027,8 @@ async def _maybe_enter_pm(
         # Really we just want to mostly avoid catching KBIs here so there
         # might be a simpler check we can do?
-        and not is_multi_cancelled(err)
+        and
+        debug_filter(err)
     ):
         api_frame: FrameType = api_frame or inspect.currentframe()
         tb: TracebackType = tb or sys.exc_info()[2]
@@ -2993,7 +3088,8 @@ async def maybe_wait_for_debugger(
     if (
         not debug_mode()
-        and not child_in_debug
+        and
+        not child_in_debug
     ):
         return False
@@ -3051,7 +3147,7 @@ async def maybe_wait_for_debugger(
                 logmeth(
                     msg
                     +
-                    '\nRoot is waiting on tty lock to release from\n\n'
+                    '\n^^ Root is waiting on tty lock release.. ^^\n'
                     # f'{caller_frame_info}\n'
                 )
@@ -3105,6 +3201,15 @@ async def maybe_wait_for_debugger(
     return False

+class BoxedMaybeException(Struct):
+    '''
+    Box a maybe-exception for post-crash introspection usage
+    from the body of a `open_crash_handler()` scope.
+
+    '''
+    value: BaseException|None = None
+
 # TODO: better naming and what additionals?
 # - [ ] optional runtime plugging?
 # - [ ] detection for sync vs. async code?
@@ -3114,11 +3219,11 @@ async def maybe_wait_for_debugger(
 @cm
 def open_crash_handler(
     catch: set[BaseException] = {
-        Exception,
         BaseException,
     },
     ignore: set[BaseException] = {
         KeyboardInterrupt,
+        trio.Cancelled,
     },
     tb_hide: bool = True,
 ):
@@ -3135,14 +3240,27 @@ def open_crash_handler(
     '''
     __tracebackhide__: bool = tb_hide

+    # TODO, yield a `outcome.Error`-like boxed type?
+    # -[~] use `outcome.Value/Error` X-> frozen!
+    # -[x] write our own..?
+    # -[ ] consider just wtv is used by `pytest.raises()`?
+    #
+    boxed_maybe_exc = BoxedMaybeException()
     err: BaseException
     try:
-        yield
+        yield boxed_maybe_exc
     except tuple(catch) as err:
-        if type(err) not in ignore:
-            # use our re-impl-ed version
+        boxed_maybe_exc.value = err
+        if (
+            type(err) not in ignore
+            and
+            not is_multi_cancelled(
+                err,
+                ignore_nested=ignore
+            )
+        ):
             try:
+                # use our re-impl-ed version
                 _post_mortem(
                     repl=mk_pdb(),
                     tb=sys.exc_info()[2],
@@ -3150,19 +3268,21 @@ def open_crash_handler(
                 )
             except bdb.BdbQuit:
                 __tracebackhide__: bool = False
-                raise
+                raise err

             # XXX NOTE, `pdbp`'s version seems to lose the up-stack
             # tb-info?
             # pdbp.xpm()

-        raise
+        raise err

 @cm
 def maybe_open_crash_handler(
     pdb: bool = False,
     tb_hide: bool = True,
+    **kwargs,
 ):
     '''
     Same as `open_crash_handler()` but with bool input flag
@@ -3173,9 +3293,11 @@ def maybe_open_crash_handler(
     '''
     __tracebackhide__: bool = tb_hide

-    rtctx = nullcontext
+    rtctx = nullcontext(
+        enter_result=BoxedMaybeException()
+    )
     if pdb:
-        rtctx = open_crash_handler
+        rtctx = open_crash_handler(**kwargs)

-    with rtctx():
-        yield
+    with rtctx as boxed_maybe_exc:
+        yield boxed_maybe_exc
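Calling code can now uniformly introspect the boxed error whether or
not a REPL was enabled; a minimal sketch using the diffed file's own
names:

.. code:: python

    from tractor.devx._debug import maybe_open_crash_handler

    # with `pdb=False` a `nullcontext` stands in, but the same
    # `BoxedMaybeException` is still yielded; with `pdb=True`
    # a crash would enter the post-mortem REPL and the exc is
    # boxed at `.value` before being re-raised.
    with maybe_open_crash_handler(pdb=False) as boxed:
        print('doing work..')

    assert boxed.value is None  # nothing crashed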

View File

@@ -35,6 +35,7 @@ from signal import (
     signal,
     getsignal,
     SIGUSR1,
+    SIGINT,
 )
 # import traceback
 from types import ModuleType
@@ -48,6 +49,7 @@ from tractor import (
     _state,
     log as logmod,
 )
+from tractor.devx import _debug

 log = logmod.get_logger(__name__)
@@ -76,22 +78,45 @@ def dump_task_tree() -> None:
     )
     actor: Actor = _state.current_actor()
     thr: Thread = current_thread()
+    current_sigint_handler: Callable = getsignal(SIGINT)
+    if (
+        current_sigint_handler
+        is not
+        _debug.DebugStatus._trio_handler
+    ):
+        sigint_handler_report: str = (
+            'The default `trio` SIGINT handler was replaced?!'
+        )
+    else:
+        sigint_handler_report: str = (
+            'The default `trio` SIGINT handler is in use?!'
+        )
+
+    # sclang symbology
+    # |_<object>
+    # |_(Task/Thread/Process/Actor
+    # |_{Supervisor/Scope
+    # |_[Storage/Memory/IPC-Stream/Data-Struct
     log.devx(
         f'Dumping `stackscope` tree for actor\n'
-        f'{actor.uid}:\n'
-        f'|_{mp.current_process()}\n'
-        f' |_{thr}\n'
-        f' |_{actor}\n\n'
-
-        # start-of-trace-tree delimiter (mostly for testing)
-        '------ - ------\n'
-        '\n'
-        +
-        f'{tree_str}\n'
-        +
-        # end-of-trace-tree delimiter (mostly for testing)
-        f'\n'
-        f'------ {actor.uid!r} ------\n'
+        f'(>: {actor.uid!r}\n'
+        f' |_{mp.current_process()}\n'
+        f'  |_{thr}\n'
+        f'   |_{actor}\n'
+        f'\n'
+        f'{sigint_handler_report}\n'
+        f'signal.getsignal(SIGINT) -> {current_sigint_handler!r}\n'
+        # f'\n'
+        # start-of-trace-tree delimiter (mostly for testing)
+        # f'------ {actor.uid!r} ------\n'
+        f'\n'
+        f'------ start-of-{actor.uid!r} ------\n'
+        f'|\n'
+        f'{tree_str}'
+        # end-of-trace-tree delimiter (mostly for testing)
+        f'|\n'
+        f'|_____ end-of-{actor.uid!r} ______\n'
     )
     # TODO: can remove this right?
     # -[ ] was original code from author
@@ -123,11 +148,11 @@ def dump_tree_on_sig(
 ) -> None:
     global _tree_dumped, _handler_lock
     with _handler_lock:
-        if _tree_dumped:
-            log.warning(
-                'Already dumped for this actor...??'
-            )
-            return
+        # if _tree_dumped:
+        #     log.warning(
+        #         'Already dumped for this actor...??'
+        #     )
+        #     return

         _tree_dumped = True
@@ -161,9 +186,9 @@ def dump_tree_on_sig(
         )
         raise

-    log.devx(
-        'Supposedly we dumped just fine..?'
-    )
+    # log.devx(
+    #     'Supposedly we dumped just fine..?'
+    # )

     if not relay_to_subs:
         return
@@ -202,11 +227,11 @@ def enable_stack_on_sig(
     (https://www.gnu.org/software/bash/manual/bash.html#Command-Substitution)
     you could use:

-    >> kill -SIGUSR1 $(pgrep -f '<cmd>')
+    >> kill -SIGUSR1 $(pgrep -f <part-of-cmd: str>)

-    Or with with `xonsh` (which has diff capture-from-subproc syntax)
+    OR without a sub-shell,

-    >> kill -SIGUSR1 @$(pgrep -f '<cmd>')
+    >> pkill --signal SIGUSR1 -f <part-of-cmd: str>

     '''
     try:

View File

@@ -92,7 +92,7 @@ def pformat_boxed_tb(
         f' ------ {boxer_header} ------\n'
         f'{tb_body}'
         f' ------ {boxer_header}- ------\n'
-        f'_|\n'
+        f'_|'
     )
     tb_box_indent: str = (
         tb_box_indent

View File

@@ -0,0 +1,26 @@
# tractor: structured concurrent "actors".
# Copyright 2024-eternity Tyler Goodlet.
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
'''
High level design patterns, APIs and runtime extensions built on top
of the `tractor` runtime core.
'''
from ._service import (
open_service_mngr as open_service_mngr,
get_service_mngr as get_service_mngr,
ServiceMngr as ServiceMngr,
)

View File

@@ -0,0 +1,592 @@
# tractor: structured concurrent "actors".
# Copyright 2024-eternity Tyler Goodlet.
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
'''
Daemon subactor as service(s) management and supervision primitives
and API.
'''
from __future__ import annotations
from contextlib import (
asynccontextmanager as acm,
# contextmanager as cm,
)
from collections import defaultdict
from dataclasses import (
dataclass,
field,
)
import functools
import inspect
from typing import (
Callable,
Any,
)
import tractor
import trio
from trio import TaskStatus
from tractor import (
log,
ActorNursery,
current_actor,
ContextCancelled,
Context,
Portal,
)
log = log.get_logger('tractor')
# TODO: implement a `@singleton` deco-API for wrapping the below
# factory's impl for general actor-singleton use?
#
# -[ ] go through the options peeps on SO did?
# * https://stackoverflow.com/questions/6760685/what-is-the-best-way-of-implementing-singleton-in-python
# * including @mikenerone's answer
# |_https://stackoverflow.com/questions/6760685/what-is-the-best-way-of-implementing-singleton-in-python/39186313#39186313
#
# -[ ] put it in `tractor.lowlevel._globals` ?
# * fits with our outstanding actor-local/global feat req?
# |_ https://github.com/goodboy/tractor/issues/55
# * how can it relate to the `Actor.lifetime_stack` that was
# silently patched in?
# |_ we could implicitly call both of these in the same
# spot in the runtime using the lifetime stack?
# - `open_singleton_cm().__exit__()`
# -`del_singleton()`
# |_ gives SC fixtue semantics to sync code oriented around
# sub-process lifetime?
# * what about with `trio.RunVar`?
# |_https://trio.readthedocs.io/en/stable/reference-lowlevel.html#trio.lowlevel.RunVar
# - which we'll need for no-GIL cpython (right?) presuming
# multiple `trio.run()` calls in process?
#
#
# @singleton
# async def open_service_mngr(
# **init_kwargs,
# ) -> ServiceMngr:
# '''
# Note this function body is invoked IFF no existing singleton instance already
# exists in this proc's memory.
# '''
# # setup
# yield ServiceMngr(**init_kwargs)
# # teardown
# a deletion API for explicit instance de-allocation?
# @open_service_mngr.deleter
# def del_service_mngr() -> None:
# mngr = open_service_mngr._singleton[0]
# open_service_mngr._singleton[0] = None
# del mngr
# TODO: singleton factory API instead of a class API
@acm
async def open_service_mngr(
*,
debug_mode: bool = False,
# NOTE; since default values for keyword-args are effectively
# module-vars/globals as per the note from,
# https://docs.python.org/3/tutorial/controlflow.html#default-argument-values
#
# > "The default value is evaluated only once. This makes
# a difference when the default is a mutable object such as
# a list, dictionary, or instances of most classes"
#
_singleton: list[ServiceMngr|None] = [None],
**init_kwargs,
) -> ServiceMngr:
'''
Open an actor-global "service-manager" for supervising a tree
of subactors and/or actor-global tasks.
The delivered `ServiceMngr` is a singleton instance for each
actor-process, that is, allocated on first open and never
de-allocated unless explicitly deleted by a call to
`del_service_mngr()`.
'''
# TODO: factor this an allocation into
# a `._mngr.open_service_mngr()` and put in the
# once-n-only-once setup/`.__aenter__()` part!
# -[ ] how to make this only happen on the `mngr == None` case?
# |_ use `.trionics.maybe_open_context()` (for generic
# async-with-style-only-once of the factory impl, though
# what do we do for the allocation case?
# / `.maybe_open_nursery()` (since for this specific case
# it's simpler?) to activate
async with (
tractor.open_nursery() as an,
trio.open_nursery() as tn,
):
# impl specific obvi..
init_kwargs.update({
'an': an,
'tn': tn,
})
mngr: ServiceMngr|None
if (mngr := _singleton[0]) is None:
log.info('Allocating a new service mngr!')
mngr = _singleton[0] = ServiceMngr(**init_kwargs)
# TODO: put into `.__aenter__()` section of
# eventual `@singleton_acm` API wrapper.
#
# assign globally for future daemon/task creation
mngr.an = an
mngr.tn = tn
else:
assert (mngr.an and mngr.tn)
log.info(
'Using extant service mngr!\n\n'
f'{mngr!r}\n' # it has a nice `.__repr__()` of services state
)
try:
# NOTE: this is a singleton factory impl specific detail
# which should be supported in the condensed
# `@singleton_acm` API?
mngr.debug_mode = debug_mode
yield mngr
finally:
# TODO: is this more clever/efficient?
# if 'samplerd' in mngr.service_ctxs:
# await mngr.cancel_service('samplerd')
tn.cancel_scope.cancel()
def get_service_mngr() -> ServiceMngr:
'''
Try to get the singleton service-mngr for this actor presuming it
has already been allocated using,
.. code:: python
async with open_<@singleton_acm(func)>() as mngr`
... this block kept open ...
If not yet allocated raise a `ServiceError`.
'''
# https://stackoverflow.com/a/12627202
# https://docs.python.org/3/library/inspect.html#inspect.Signature
maybe_mngr: ServiceMngr|None = inspect.signature(
open_service_mngr
).parameters['_singleton'].default[0]
if maybe_mngr is None:
raise RuntimeError(
'Someone must allocate a `ServiceMngr` using\n\n'
'`async with open_service_mngr()` beforehand!!\n'
)
return maybe_mngr
async def _open_and_supervise_service_ctx(
serman: ServiceMngr,
name: str,
ctx_fn: Callable, # TODO, type for `@tractor.context` requirement
portal: Portal,
allow_overruns: bool = False,
task_status: TaskStatus[
tuple[
trio.CancelScope,
Context,
trio.Event,
Any,
]
] = trio.TASK_STATUS_IGNORED,
**ctx_kwargs,
) -> Any:
'''
Open a remote IPC-context defined by `ctx_fn` in the
(service) actor accessed via `portal` and supervise the
(local) parent task to termination at which point the remote
actor runtime is cancelled alongside it.
The main application is for allocating long-running
"sub-services" in a main daemon and explicitly controlling
their lifetimes from an actor-global singleton.
'''
# TODO: use the ctx._scope directly here instead?
# -[ ] actually what semantics do we expect for this
# usage!?
with trio.CancelScope() as cs:
try:
async with portal.open_context(
ctx_fn,
allow_overruns=allow_overruns,
**ctx_kwargs,
) as (ctx, started):
# unblock once the remote context has started
complete = trio.Event()
task_status.started((
cs,
ctx,
complete,
started,
))
log.info(
f'`pikerd` service {name} started with value {started}'
)
# wait on any context's return value
# and any final portal result from the
# sub-actor.
ctx_res: Any = await ctx.wait_for_result()
# NOTE: blocks indefinitely until cancelled
# either by error from the target context
# function or by being cancelled here by the
# surrounding cancel scope.
return (
await portal.wait_for_result(),
ctx_res,
)
except ContextCancelled as ctxe:
canceller: tuple[str, str] = ctxe.canceller
our_uid: tuple[str, str] = current_actor().uid
if (
canceller != portal.chan.uid
and
canceller != our_uid
):
log.cancel(
f'Actor-service `{name}` was remotely cancelled by a peer?\n'
# TODO: this would be a good spot to use
# a respawn feature Bo
f'-> Keeping `pikerd` service manager alive despite this inter-peer cancel\n\n'
f'cancellee: {portal.chan.uid}\n'
f'canceller: {canceller}\n'
)
else:
raise
finally:
# NOTE: the ctx MUST be cancelled first if we
# don't want the above `ctx.wait_for_result()` to
# raise a self-ctxc. WHY, well since from the ctx's
# perspective the cancel request will have
# arrived out-of-band at the `Actor.cancel()`
# level, thus `Context.cancel_called == False`,
# meaning `ctx._is_self_cancelled() == False`.
# with trio.CancelScope(shield=True):
# await ctx.cancel()
await portal.cancel_actor() # terminate (remote) sub-actor
complete.set() # signal caller this task is done
serman.service_ctxs.pop(name) # remove mngr entry
# TODO: we need remote wrapping and a general soln:
# - factor this into a ``tractor.highlevel`` extension # pack for the
# library.
# - wrap a "remote api" wherein you can get a method proxy
# to the pikerd actor for starting services remotely!
# - prolly rename this to ActorServicesNursery since it spawns
# new actors and supervises them to completion?
@dataclass
class ServiceMngr:
'''
A multi-subactor-as-service manager.
Spawn, supervise and monitor service/daemon subactors in a SC
process tree.
'''
an: ActorNursery
tn: trio.Nursery
debug_mode: bool = False # tractor sub-actor debug mode flag
service_tasks: dict[
str,
tuple[
trio.CancelScope,
trio.Event,
]
] = field(default_factory=dict)
service_ctxs: dict[
str,
tuple[
trio.CancelScope,
Context,
Portal,
trio.Event,
]
] = field(default_factory=dict)
# internal per-service task mutexs
_locks = defaultdict(trio.Lock)
# TODO, unify this interface with our `TaskManager` PR!
#
#
async def start_service_task(
self,
name: str,
# TODO: typevar for the return type of the target and then
# use it below for `ctx_res`?
fn: Callable,
allow_overruns: bool = False,
**ctx_kwargs,
) -> tuple[
trio.CancelScope,
Any,
trio.Event,
]:
async def _task_manager_start(
task_status: TaskStatus[
tuple[
trio.CancelScope,
trio.Event,
]
] = trio.TASK_STATUS_IGNORED,
) -> Any:
task_cs = trio.CancelScope()
task_complete = trio.Event()
with task_cs as cs:
task_status.started((
cs,
task_complete,
))
try:
await fn()
except trio.Cancelled as taskc:
log.cancel(
f'Service task for `{name}` was cancelled!\n'
# TODO: this would be a good spot to use
# a respawn feature Bo
)
raise taskc
finally:
task_complete.set()
(
cs,
complete,
) = await self.tn.start(_task_manager_start)
# store the cancel scope and portal for later cancellation or
# restart if needed.
self.service_tasks[name] = (
cs,
complete,
)
return (
cs,
complete,
)
async def cancel_service_task(
self,
name: str,
) -> Any:
log.info(f'Cancelling `pikerd` service {name}')
cs, complete = self.service_tasks[name]
cs.cancel()
await complete.wait()
# TODO, if we use the `TaskMngr` from #346
# we can also get the return value from the task!
if name in self.service_tasks:
# TODO: custom err?
# raise ServiceError(
raise RuntimeError(
f'Service task {name!r} not terminated!?\n'
)
async def start_service_ctx(
self,
name: str,
portal: Portal,
# TODO: typevar for the return type of the target and then
# use it below for `ctx_res`?
ctx_fn: Callable,
**ctx_kwargs,
) -> tuple[
trio.CancelScope,
Context,
Any,
]:
'''
Start a remote IPC-context defined by `ctx_fn` in a background
task and immediately return supervision primitives to manage it:
- a `cs: CancelScope` for the newly allocated bg task
- the `ipc_ctx: Context` to manage the remotely scheduled
`trio.Task`.
- the `started: Any` value returned by the remote endpoint
task's `Context.started(<value>)` call.
The bg task supervises the ctx such that when it terminates the supporting
actor runtime is also cancelled, see `_open_and_supervise_service_ctx()`
for details.
'''
cs, ipc_ctx, complete, started = await self.tn.start(
functools.partial(
_open_and_supervise_service_ctx,
serman=self,
name=name,
ctx_fn=ctx_fn,
portal=portal,
**ctx_kwargs,
)
)
# store the cancel scope and portal for later cancellation or
# restart if needed.
self.service_ctxs[name] = (cs, ipc_ctx, portal, complete)
return (
cs,
ipc_ctx,
started,
)
async def start_service(
self,
daemon_name: str,
ctx_ep: Callable, # kwargs must `partial`-ed in!
# ^TODO, type for `@tractor.context` deco-ed funcs!
debug_mode: bool = False,
**start_actor_kwargs,
) -> Context:
'''
Start new subactor and schedule a supervising "service task"
in it which explicitly defines the sub's lifetime.
"Service daemon subactors" are cancelled (and thus
terminated) using the paired `.cancel_service()`.
Effectively this API can be used to manage "service daemons"
spawned under a single parent actor with supervision
semantics equivalent to a one-cancels-one style actor-nursery
or "(subactor) task manager" where each subprocess's (and
thus its embedded actor runtime) lifetime is synced to that
of the remotely spawned task defined by `ctx_ep`.
The functionality can be likened to a "daemonized" version of
`.hilevel.worker.run_in_actor()` but with supervision
controls offered by `tractor.Context` where the main/root
remotely scheduled `trio.Task` invoking `ctx_ep` determines
the underlying subactor's lifetime.
'''
entry: tuple|None = self.service_ctxs.get(daemon_name)
if entry:
(cs, sub_ctx, portal, complete) = entry
return sub_ctx
if daemon_name not in self.service_ctxs:
portal: Portal = await self.an.start_actor(
daemon_name,
debug_mode=( # maybe set globally during allocate
debug_mode
or
self.debug_mode
),
**start_actor_kwargs,
)
ctx_kwargs: dict[str, Any] = {}
if isinstance(ctx_ep, functools.partial):
ctx_kwargs: dict[str, Any] = ctx_ep.keywords
ctx_ep: Callable = ctx_ep.func
(
cs,
sub_ctx,
started,
) = await self.start_service_ctx(
name=daemon_name,
portal=portal,
ctx_fn=ctx_ep,
**ctx_kwargs,
)
return sub_ctx
async def cancel_service(
self,
name: str,
) -> Any:
'''
Cancel the service task and actor for the given ``name``.
'''
log.info(f'Cancelling `pikerd` service {name}')
cs, sub_ctx, portal, complete = self.service_ctxs[name]
# cs.cancel()
await sub_ctx.cancel()
await complete.wait()
if name in self.service_ctxs:
# TODO: custom err?
# raise ServiceError(
raise RuntimeError(
f'Service actor for {name} not terminated and/or unknown?'
)
# assert name not in self.service_ctxs, \
# f'Serice task for {name} not terminated?'
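
For orientation, a minimal usage sketch of the above service-mngr API; the `open_service_mngr()` allocator acm and the endpoint fn here are illustrative assumptions, not part of this changeset:

    import tractor
    import trio

    @tractor.context
    async def sleeper(
        ctx: tractor.Context,
    ):
        # the remotely scheduled "service task" whose lifetime
        # defines that of the daemon subactor.
        await ctx.started('service up')
        await trio.sleep_forever()

    async def main():
        # hypothetical acm which allocates the mngr + its
        # actor/task nurseries.
        async with open_service_mngr() as mngr:
            sub_ctx: tractor.Context = await mngr.start_service(
                daemon_name='sleeper-daemon',
                ctx_ep=sleeper,
            )
            # ... interact with the daemon via `sub_ctx` ...
            await mngr.cancel_service('sleeper-daemon')

    trio.run(main)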

View File

@@ -258,20 +258,28 @@ class ActorContextInfo(Mapping):
 def get_logger(
-    name: str | None = None,
+    name: str|None = None,
     _root_name: str = _proj_name,
+
+    logger: Logger|None = None,
+
+    # TODO, using `.config.dictConfig()` api?
+    # -[ ] SO answer with docs links
+    # |_https://stackoverflow.com/questions/7507825/where-is-a-complete-example-of-logging-config-dictconfig
+    # |_https://docs.python.org/3/library/logging.config.html#configuration-dictionary-schema
+    subsys_spec: str|None = None,
+
 ) -> StackLevelAdapter:
     '''Return the package log or a sub-logger for ``name`` if provided.
 
     '''
     log: Logger
-    log = rlog = logging.getLogger(_root_name)
+    log = rlog = logger or logging.getLogger(_root_name)
 
     if (
         name
-        and name != _proj_name
+        and
+        name != _proj_name
     ):
         # NOTE: for handling for modules that use ``get_logger(__name__)``
@@ -283,7 +291,7 @@ def get_logger(
         # since in python the {filename} is always this same
         # module-file.
-        sub_name: None | str = None
+        sub_name: None|str = None
         rname, _, sub_name = name.partition('.')
         pkgpath, _, modfilename = sub_name.rpartition('.')
@@ -306,7 +314,10 @@ def get_logger(
     # add our actor-task aware adapter which will dynamically look up
     # the actor and task names at each log emit
-    logger = StackLevelAdapter(log, ActorContextInfo())
+    logger = StackLevelAdapter(
+        log,
+        ActorContextInfo(),
+    )
 
     # additional levels
     for name, val in CUSTOM_LEVELS.items():
@@ -319,15 +330,25 @@ def get_logger(
 def get_console_log(
-    level: str | None = None,
+    level: str|None = None,
+    logger: Logger|None = None,
     **kwargs,
-) -> LoggerAdapter:
-    '''Get the package logger and enable a handler which writes to stderr.
 
-    Yeah yeah, i know we can use ``DictConfig``. You do it.
+) -> LoggerAdapter:
+    '''
+    Get a `tractor`-style logging instance: a `Logger` wrapped in
+    a `StackLevelAdapter` which injects various concurrency-primitive
+    (process, thread, task) fields and enables a `StreamHandler` that
+    writes on stderr using `colorlog` formatting.
+
+    Yeah yeah, i know we can use `logging.config.dictConfig()`. You do it.
     '''
-    log = get_logger(**kwargs)  # our root logger
-    logger = log.logger
+    log = get_logger(
+        logger=logger,
+        **kwargs
+    )  # set a root logger
+    logger: Logger = log.logger
 
     if not level:
         return log
@@ -346,9 +367,13 @@ def get_console_log(
             None,
         )
     ):
+        fmt = LOG_FORMAT
+        # if logger:
+        #     fmt = None
+
         handler = StreamHandler()
         formatter = colorlog.ColoredFormatter(
-            LOG_FORMAT,
+            fmt=fmt,
             datefmt=DATE_FORMAT,
             log_colors=STD_PALETTE,
             secondary_log_colors=BOLD_PALETTE,
@@ -365,7 +390,7 @@ def get_loglevel() -> str:
 
 # global module logger for tractor itself
-log = get_logger('tractor')
+log: StackLevelAdapter = get_logger('tractor')
 
 def at_least_level(
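
A small usage sketch of the new `logger` passthrough (the `'my-subsys'` name is made up for illustration):

    import logging
    from tractor.log import get_console_log

    # wrap an externally-allocated stdlib `Logger` in the
    # `StackLevelAdapter` instead of the default pkg-root logger,
    ext_log: logging.Logger = logging.getLogger('my-subsys')
    log = get_console_log(
        level='info',
        logger=ext_log,
    )
    log.info('stderr `StreamHandler` + `colorlog` formatting enabled')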

View File

@@ -33,6 +33,7 @@ from ._codec import (
     apply_codec as apply_codec,
     mk_codec as mk_codec,
+    mk_dec as mk_dec,
     MsgCodec as MsgCodec,
     MsgDec as MsgDec,
     current_codec as current_codec,

View File

@@ -61,6 +61,7 @@ from tractor.msg.pretty_struct import Struct
 from tractor.msg.types import (
     mk_msg_spec,
     MsgType,
+    PayloadMsg,
 )
 from tractor.log import get_logger
@@ -80,6 +81,7 @@ class MsgDec(Struct):
 
     '''
     _dec: msgpack.Decoder
+    # _ext_types_box: Struct|None = None
 
     @property
     def dec(self) -> msgpack.Decoder:
@@ -179,23 +181,126 @@ class MsgDec(Struct):
 def mk_dec(
-    spec: Union[Type[Struct]]|Any = Any,
+    spec: Union[Type[Struct]]|Type|None,
+
+    # NOTE, required for ad-hoc type extensions to the underlying
+    # serialization proto (which is default `msgpack`),
+    # https://jcristharif.com/msgspec/extending.html#mapping-to-from-native-types
     dec_hook: Callable|None = None,
+    ext_types: list[Type]|None = None,
 
 ) -> MsgDec:
     '''
-    Create an IPC msg decoder, normally used as the
-    `PayloadMsg.pld: PayloadT` field decoder inside a `PldRx`.
+    Create an IPC msg decoder, a slightly higher level wrapper around
+    a `msgspec.msgpack.Decoder` which provides,
+
+    - easier introspection of the underlying type spec via
+      the `.spec` and `.spec_str` attrs,
+    - `.hook` access to the `Decoder.dec_hook()`,
+    - automatic custom extension-types decode support when
+      a `dec_hook()` is provided such that any `PayloadMsg.pld` tagged
+      as a type from `ext_types` (presuming the `MsgCodec.encode()`
+      also used a `.enc_hook()`) is processed and constructed by
+      a `PldRx` implicitly.
+
+    NOTE, as mentioned a `MsgDec` is normally used for
+    `PayloadMsg.pld: PayloadT` field decoding inside an
+    IPC-ctx-oriented `PldRx`.
 
     '''
+    if (
+        spec is None
+        and
+        ext_types is None
+    ):
+        raise TypeError(
+            f'Missing type-`spec` for msg decoder!\n'
+            f'\n'
+            f'`spec=None` is **only** permitted if custom extension types '
+            f'are provided via `ext_types`, meaning it must be non-`None`.\n'
+            f'\n'
+            f'In this case it is presumed that only the `ext_types`, '
+            f'which must be handled by a paired `dec_hook()`, '
+            f'will be permitted within the payload type-`spec`!\n'
+            f'\n'
+            f'spec = {spec!r}\n'
+            f'dec_hook = {dec_hook!r}\n'
+            f'ext_types = {ext_types!r}\n'
+        )
+
+    if dec_hook:
+        if ext_types is None:
+            raise TypeError(
+                f'If extending the serializable types with a custom decode hook (`dec_hook()`), '
+                f'you must also provide the expected type set that the hook will handle '
+                f'via a `ext_types: Union[Type]|None = None` argument!\n'
+                f'\n'
+                f'dec_hook = {dec_hook!r}\n'
+                f'ext_types = {ext_types!r}\n'
+            )
+
+        # XXX, i *thought* we would require a boxing struct as per docs,
+        # https://jcristharif.com/msgspec/extending.html#mapping-to-from-native-types
+        # |_ see comment,
+        #  > Note that typed deserialization is required for
+        #  > successful roundtripping here, so we pass `MyMessage` to
+        #  > `Decoder`.
+        #
+        # BUT, turns out as long as you spec a union with `Raw` it
+        # will work? kk B)
+        #
+        # maybe_box_struct = mk_boxed_ext_struct(ext_types)
+        spec = Raw | Union[*ext_types]
+
     return MsgDec(
         _dec=msgpack.Decoder(
             type=spec,  # like `MsgType[Any]`
             dec_hook=dec_hook,
-        )
+        ),
     )
+
+
+# TODO? remove since didn't end up needing this?
+def mk_boxed_ext_struct(
+    ext_types: list[Type],
+) -> Struct:
+    # NOTE, originally was to wrap non-msgpack-supported "extension
+    # types" in a field-typed boxing struct, see notes around the
+    # `dec_hook()` branch in `mk_dec()`.
+    ext_types_union = Union[*ext_types]
+    repr_ext_types_union: str = (
+        str(ext_types_union)
+        or
+        "|".join(ext_types)
+    )
+
+    BoxedExtType = msgspec.defstruct(
+        f'BoxedExts[{repr_ext_types_union}]',
+        fields=[
+            ('boxed', ext_types_union),
+        ],
+    )
+    return BoxedExtType
+
+
+def unpack_spec_types(
+    spec: Union[Type]|Type,
+) -> set[Type]:
+    '''
+    Given an input type-`spec`, either a lone type
+    or a `Union` of types (like `str|int|MyThing`),
+    return a set of individual types.
+
+    When `spec` is not a type-union returns `{spec,}`.
+
+    '''
+    spec_subtypes: set[Union[Type]] = set(
+        getattr(
+            spec,
+            '__args__',
+            {spec,},
+        )
+    )
+    return spec_subtypes
 
 
 def mk_msgspec_table(
     dec: msgpack.Decoder,
     msg: MsgType|None = None,
@@ -273,6 +378,8 @@ class MsgCodec(Struct):
     _dec: msgpack.Decoder
     _pld_spec: Type[Struct]|Raw|Any
 
+    # _ext_types_box: Struct|None = None
+
     def __repr__(self) -> str:
         speclines: str = textwrap.indent(
             pformat_msgspec(codec=self),
@@ -339,12 +446,15 @@ class MsgCodec(Struct):
     def encode(
         self,
-        py_obj: Any,
+        py_obj: Any|PayloadMsg,
 
         use_buf: bool = False,
         # ^-XXX-^ uhh why am i getting this?
         # |_BufferError: Existing exports of data: object cannot be re-sized
 
+        as_ext_type: bool = False,
+        hide_tb: bool = True,
+
     ) -> bytes:
         '''
         Encode input python objects to `msgpack` bytes for
@@ -354,11 +464,46 @@ class MsgCodec(Struct):
         https://jcristharif.com/msgspec/perf-tips.html#reusing-an-output-buffer
 
         '''
+        __tracebackhide__: bool = hide_tb
         if use_buf:
             self._enc.encode_into(py_obj, self._buf)
             return self._buf
-        else:
-            return self._enc.encode(py_obj)
+
+        return self._enc.encode(py_obj)
+
+        # try:
+        #     return self._enc.encode(py_obj)
+        # except TypeError as typerr:
+        #     typerr.add_note(
+        #         '|_src error from `msgspec`'
+        #         # f'|_{self._enc.encode!r}'
+        #     )
+        #     raise typerr
+
+        # TODO! REMOVE once i'm confident we won't ever need it!
+        #
+        # box: Struct = self._ext_types_box
+        # if (
+        #     as_ext_type
+        #     or
+        #     (
+        #         # XXX NOTE, auto-detect if the input type
+        #         box
+        #         and
+        #         (ext_types := unpack_spec_types(
+        #             spec=box.__annotations__['boxed'])
+        #         )
+        #     )
+        # ):
+        #     match py_obj:
+        #         # case PayloadMsg(pld=pld) if (
+        #         #     type(pld) in ext_types
+        #         # ):
+        #         #     py_obj.pld = box(boxed=py_obj)
+        #         # breakpoint()
+        #         case _ if (
+        #             type(py_obj) in ext_types
+        #         ):
+        #             py_obj = box(boxed=py_obj)
 
     @property
     def dec(self) -> msgpack.Decoder:
@@ -378,21 +523,30 @@ class MsgCodec(Struct):
         return self._dec.decode(msg)
 
-    # [x] TODO: a sub-decoder system as well?
+    # ?TODO? time to remove this finally?
+    #
+    # -[x] TODO: a sub-decoder system as well?
+    # => No! already re-architected to include a "payload-receiver"
+    #   now found in `._ops`.
     #
     # -[x] do we still want to try and support the sub-decoder with
     # `.Raw` technique in the case that the `Generic` approach gives
     # future grief?
-    # => NO, since we went with the `PldRx` approach instead B)
+    # => well YES but NO, since we went with the `PldRx` approach
+    #   instead!
     #
     # IF however you want to see the code that was staged for this
     # from wayyy back, see the pure removal commit.
 
 
 def mk_codec(
-    # struct type unions set for `Decoder`
-    # https://jcristharif.com/msgspec/structs.html#tagged-unions
-    ipc_pld_spec: Union[Type[Struct]]|Any = Any,
+    ipc_pld_spec: Union[Type[Struct]]|Any|Raw = Raw,
+    # tagged-struct-types-union set for `Decoder`ing of payloads, as
+    # per https://jcristharif.com/msgspec/structs.html#tagged-unions.
+    # NOTE that the default `Raw` here **is very intentional** since
+    # the `PldRx._pld_dec: MsgDec` is responsible for per ipc-ctx-task
+    # decoding of msg-specs defined by the user as part of **their**
+    # `tractor` "app's" type-limited IPC msg-spec.
 
     # TODO: offering a per-msg(-field) type-spec such that
     # the fields can be dynamically NOT decoded and left as `Raw`
@@ -405,13 +559,18 @@ def mk_codec(
     libname: str = 'msgspec',
 
-    # proxy as `Struct(**kwargs)` for ad-hoc type extensions
+    # settings for encoding-to-send extension-types,
     # https://jcristharif.com/msgspec/extending.html#mapping-to-from-native-types
-    # ------ - ------
-    dec_hook: Callable|None = None,
+    # dec_hook: Callable|None = None,
     enc_hook: Callable|None = None,
-    # ------ - ------
+    ext_types: list[Type]|None = None,
+
+    # optionally provided msg-decoder from which we pull its,
+    # |_.dec_hook()
+    # |_.type
+    ext_dec: MsgDec|None = None
     #
+    # ?TODO? other params we might want to support
     # Encoder:
     # write_buffer_size=write_buffer_size,
     #
@@ -425,26 +584,44 @@ def mk_codec(
     `msgspec` ;).
 
     '''
-    # (manually) generate a msg-payload-spec for all relevant
-    # god-boxing-msg subtypes, parameterizing the `PayloadMsg.pld: PayloadT`
-    # for the decoder such that all sub-type msgs in our SCIPP
-    # will automatically decode to a type-"limited" payload (`Struct`)
-    # object (set).
+    pld_spec = ipc_pld_spec
+    if enc_hook:
+        if not ext_types:
+            raise TypeError(
+                f'If extending the serializable types with a custom encode hook (`enc_hook()`), '
+                f'you must also provide the expected type set that the hook will handle '
+                f'via a `ext_types: Union[Type]|None = None` argument!\n'
+                f'\n'
+                f'enc_hook = {enc_hook!r}\n'
+                f'ext_types = {ext_types!r}\n'
+            )
+
+    dec_hook: Callable|None = None
+    if ext_dec:
+        dec: msgspec.Decoder = ext_dec.dec
+        dec_hook = dec.dec_hook
+        pld_spec |= dec.type
+        if ext_types:
+            pld_spec |= Union[*ext_types]
+
+    # (manually) generate a msg-spec (how appropes) for all relevant
+    # payload-boxing-struct-msg-types, parameterizing the
+    # `PayloadMsg.pld: PayloadT` for the decoder such that all msgs
+    # in our SC-RPC-protocol will automatically decode to
+    # a type-"limited" payload (`Struct`) object (set).
     (
         ipc_msg_spec,
         msg_types,
     ) = mk_msg_spec(
-        payload_type_union=ipc_pld_spec,
+        payload_type_union=pld_spec,
     )
-    assert len(ipc_msg_spec.__args__) == len(msg_types)
-    assert ipc_msg_spec
 
-    # TODO: use this shim instead?
-    # bc.. unification, err somethin?
-    # dec: MsgDec = mk_dec(
-    #     spec=ipc_msg_spec,
-    #     dec_hook=dec_hook,
-    # )
+    msg_spec_types: set[Type] = unpack_spec_types(ipc_msg_spec)
+    assert (
+        len(ipc_msg_spec.__args__) == len(msg_types)
+        and
+        len(msg_spec_types) == len(msg_types)
+    )
 
     dec = msgpack.Decoder(
         type=ipc_msg_spec,
@@ -453,22 +630,29 @@ def mk_codec(
     enc = msgpack.Encoder(
         enc_hook=enc_hook,
     )
 
     codec = MsgCodec(
         _enc=enc,
         _dec=dec,
-        _pld_spec=ipc_pld_spec,
+        _pld_spec=pld_spec,
     )
 
     # sanity on expected backend support
     assert codec.lib.__name__ == libname
 
     return codec
 
 
 # instance of the default `msgspec.msgpack` codec settings, i.e.
 # no custom structs, hooks or other special types.
-_def_msgspec_codec: MsgCodec = mk_codec(ipc_pld_spec=Any)
+#
+# XXX NOTE XXX, this will break our `Context.start()` call!
+#
+# * by default we roundtrip the started pld-`value` and if you apply
+#   this codec (globally anyway with `apply_codec()`) then the
+#   `roundtripped` value will include a non-`.pld: Raw` which will
+#   then type-error on the consequent `._ops.validate_payload_msg()`..
+#
+_def_msgspec_codec: MsgCodec = mk_codec(
+    ipc_pld_spec=Any,
+)
@@ -476,13 +660,13 @@ _def_msgspec_codec: MsgCodec = mk_codec(ipc_pld_spec=Any)
 
 # The built-in IPC `Msg` spec.
 # Our composing "shuttle" protocol which allows `tractor`-app code
 # https://jcristharif.com/msgspec/supported-types.html
 #
 _def_tractor_codec: MsgCodec = mk_codec(
-    # TODO: use this for debug mode locking prot?
-    # ipc_pld_spec=Any,
-    ipc_pld_spec=Raw,
+    ipc_pld_spec=Raw,  # XXX should be default right!?
 )
 
-# TODO: IDEALLY provides for per-`trio.Task` specificity of the
+# -[x] TODO, IDEALLY provides for per-`trio.Task` specificity of the
 # IPC msging codec used by the transport layer when doing
 # `Channel.send()/.recv()` of wire data.
+# => impled as our `PldRx` which is `Context` scoped B)
 
 # ContextVar-TODO: DIDN'T WORK, kept resetting in every new task to default!?
 # _ctxvar_MsgCodec: ContextVar[MsgCodec] = ContextVar(
@@ -559,17 +743,6 @@ def apply_codec(
     )
     token: Token = var.set(codec)
 
-    # ?TODO? for TreeVar approach which copies from the
-    # cancel-scope of the prior value, NOT the prior task
-    # See the docs:
-    # - https://tricycle.readthedocs.io/en/latest/reference.html#tree-variables
-    # - https://github.com/oremanj/tricycle/blob/master/tricycle/_tests/test_tree_var.py
-    #   ^- see docs for @cm `.being()` API
-    # with _ctxvar_MsgCodec.being(codec):
-    #     new = _ctxvar_MsgCodec.get()
-    #     assert new is codec
-    #     yield codec
-
     try:
         yield var.get()
     finally:
@@ -580,6 +753,19 @@ def apply_codec(
         )
         assert var.get() is orig
 
+    # ?TODO? for TreeVar approach which copies from the
+    # cancel-scope of the prior value, NOT the prior task
+    #
+    # See the docs:
+    # - https://tricycle.readthedocs.io/en/latest/reference.html#tree-variables
+    # - https://github.com/oremanj/tricycle/blob/master/tricycle/_tests/test_tree_var.py
+    #   ^- see docs for @cm `.being()` API
+    #
+    # with _ctxvar_MsgCodec.being(codec):
+    #     new = _ctxvar_MsgCodec.get()
+    #     assert new is codec
+    #     yield codec
 
 
 def current_codec() -> MsgCodec:
     '''
@@ -599,6 +785,7 @@ def limit_msg_spec(
     # -> related to the `MsgCodec._payload_decs` stuff above..
     # tagged_structs: list[Struct]|None = None,
 
+    hide_tb: bool = True,
     **codec_kwargs,
 
 ) -> MsgCodec:
@@ -609,7 +796,7 @@ def limit_msg_spec(
     for all IPC contexts in use by the current `trio.Task`.
 
     '''
-    __tracebackhide__: bool = True
+    __tracebackhide__: bool = hide_tb
     curr_codec: MsgCodec = current_codec()
     msgspec_codec: MsgCodec = mk_codec(
         ipc_pld_spec=payload_spec,
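
To illustrate the new extension-type inputs to `mk_codec()`/`mk_dec()`, a sketch of wiring up a type `msgpack` can't natively handle; the `Decimal` choice (and str-on-the-wire encoding) is illustrative, not from this diff:

    from decimal import Decimal
    from typing import Any, Type
    from tractor.msg._codec import mk_codec, mk_dec

    def enc_hook(obj: Any) -> Any:
        # send `Decimal` as a plain str on the wire,
        if isinstance(obj, Decimal):
            return str(obj)
        raise NotImplementedError(f'no ext-type support for {type(obj)}')

    def dec_hook(type_: Type, obj: Any) -> Any:
        # rebuild the `Decimal` when the spec tags it as such,
        if type_ is Decimal:
            return Decimal(obj)
        raise NotImplementedError(f'no ext-type support for {type_}')

    # decoder which owns the `dec_hook()` + ext-type set; `spec=None`
    # is allowed here precisely bc `ext_types` is non-`None` as per
    # the guard in `mk_dec()`.
    ext_dec = mk_dec(
        spec=None,
        dec_hook=dec_hook,
        ext_types=[Decimal],
    )
    # codec pulling its `.dec_hook()`/`.type` from the provided
    # `ext_dec` as per the new `mk_codec()` signature.
    codec = mk_codec(
        enc_hook=enc_hook,
        ext_types=[Decimal],
        ext_dec=ext_dec,
    )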

View File

@ -0,0 +1,94 @@
# tractor: structured concurrent "actors".
# Copyright 2018-eternity Tyler Goodlet.
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
'''
Type-extension-utils for codec-ing (python) objects not
covered by the `msgspec.msgpack` protocol.
See the various API docs from `msgspec`.
extending from native types,
- https://jcristharif.com/msgspec/extending.html#mapping-to-from-native-types
converters,
- https://jcristharif.com/msgspec/converters.html
- https://jcristharif.com/msgspec/api.html#msgspec.convert
`Raw` fields,
- https://jcristharif.com/msgspec/api.html#raw
- support for `.convert()` and `Raw`,
|_ https://jcristharif.com/msgspec/changelog.html
'''
from types import (
ModuleType,
)
import typing
from typing import (
Type,
Union,
)
def dec_type_union(
type_names: list[str],
mods: list[ModuleType] = []
) -> Type|Union[Type]:
'''
Look up types by name, compile into a list and then create and
return a `typing.Union` from the full set.
'''
# import importlib
types: list[Type] = []
for type_name in type_names:
for mod in [
typing,
# importlib.import_module(__name__),
] + mods:
if type_ref := getattr(
mod,
type_name,
False,
):
types.append(type_ref)
# special case handling only..
# ipc_pld_spec: Union[Type] = eval(
# pld_spec_str,
# {}, # globals
# {'typing': typing}, # locals
# )
return Union[*types]
def enc_type_union(
union_or_type: Union[Type]|Type,
) -> list[str]:
'''
Encode a type-union or single type to a list of type-name-strings
ready for IPC interchange.
'''
type_strs: list[str] = []
for typ in getattr(
union_or_type,
'__args__',
{union_or_type,},
):
type_strs.append(typ.__qualname__)
return type_strs
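
A quick sketch of the intended roundtrip; note the receiver must pass any module (here stdlib `builtins`) that can resolve the type names, since only `typing` is searched by default:

    import builtins
    from tractor.msg._exts import (
        dec_type_union,
        enc_type_union,
    )

    # flatten a union into wire-friendly type-name strs,
    names: list[str] = enc_type_union(int|str)
    assert set(names) == {'int', 'str'}

    # ..then rebuild the union on the other side by name lookup,
    spec = dec_type_union(names, mods=[builtins])
    assert set(spec.__args__) == {int, str}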

View File

@@ -50,7 +50,9 @@ from tractor._exceptions import (
     _mk_recv_mte,
     pack_error,
 )
-from tractor._state import current_ipc_ctx
+from tractor._state import (
+    current_ipc_ctx,
+)
 from ._codec import (
     mk_dec,
     MsgDec,
@@ -78,7 +80,7 @@ if TYPE_CHECKING:
 
 log = get_logger(__name__)
 
-_def_any_pldec: MsgDec[Any] = mk_dec()
+_def_any_pldec: MsgDec[Any] = mk_dec(spec=Any)
 
 
 class PldRx(Struct):
@@ -148,6 +150,10 @@ class PldRx(Struct):
         exit.
 
         '''
+        # TODO, ensure we pull the current `MsgCodec`'s custom
+        # dec/enc_hook settings as well ?
+        # -[ ] see `._codec.mk_codec()` inputs
+        #
         orig_dec: MsgDec = self._pld_dec
         limit_dec: MsgDec = mk_dec(
             spec=spec,
@@ -258,6 +264,9 @@ class PldRx(Struct):
                 f'|_pld={pld!r}\n'
             )
             return pld
+        except TypeError as typerr:
+            __tracebackhide__: bool = False
+            raise typerr
 
         # XXX pld-value type failure
         except ValidationError as valerr:
@@ -452,11 +461,16 @@ def limit_plds(
 
     '''
     __tracebackhide__: bool = True
+    curr_ctx: Context|None = current_ipc_ctx()
+    if curr_ctx is None:
+        raise RuntimeError(
+            'No IPC `Context` is active !?\n'
+            'Did you open `limit_plds()` from outside '
+            'a `Portal.open_context()` scope-block?'
+        )
     try:
-        curr_ctx: Context = current_ipc_ctx()
         rx: PldRx = curr_ctx._pld_rx
         orig_pldec: MsgDec = rx.pld_dec
 
         with rx.limit_plds(
             spec=spec,
             **dec_kwargs,
@@ -466,6 +480,11 @@ def limit_plds(
             f'{pldec}\n'
         )
         yield pldec
+
+    except BaseException:
+        __tracebackhide__: bool = False
+        raise
+
     finally:
         log.runtime(
             'Reverted to previous payload-decoder\n\n'
@@ -796,8 +815,14 @@ def validate_payload_msg(
     __tracebackhide__: bool = hide_tb
     codec: MsgCodec = current_codec()
     msg_bytes: bytes = codec.encode(pld_msg)
+    roundtripped: Started|None = None
     try:
         roundtripped: Started = codec.decode(msg_bytes)
+    except TypeError as typerr:
+        __tracebackhide__: bool = False
+        raise typerr
+
+    try:
         ctx: Context = getattr(ipc, 'ctx', ipc)
         pld: PayloadT = ctx.pld_rx.decode_pld(
             msg=roundtripped,
@@ -822,6 +847,11 @@ def validate_payload_msg(
             )
             raise ValidationError(complaint)
 
+    # usually due to `.decode()` input type
+    except TypeError as typerr:
+        __tracebackhide__: bool = False
+        raise typerr
+
     # raise any msg type error NO MATTER WHAT!
     except ValidationError as verr:
         try:
@@ -832,9 +862,13 @@ def validate_payload_msg(
                 verb_header='Trying to send ',
                 is_invalid_payload=True,
             )
-        except BaseException:
+        except BaseException as _be:
+            if not roundtripped:
+                raise verr
+
+            be = _be
             __tracebackhide__: bool = False
-            raise
+            raise be
 
     if not raise_mte:
         return mte
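
The new guard above makes misuse loud; for reference, a hedged sketch of correct usage from inside an IPC-ctx scope (the endpoint shape and `tractor.msg.limit_plds` import path are assumed from `tractor`'s `@tractor.context` conventions):

    import tractor
    from tractor.msg import limit_plds

    @tractor.context
    async def my_ep(ctx: tractor.Context):
        # an IPC `Context` is active here so the pld-rx decoder
        # can be narrowed for the remainder of this scope,
        with limit_plds(int|str) as pldec:
            await ctx.started('ready')
            # ..rx-ed msgs' `.pld` must now decode as `int|str`..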

View File

@@ -599,15 +599,15 @@ def mk_msg_spec(
         Msg[payload_type_union],
         Generic[PayloadT],
     )
-    defstruct_bases: tuple = (
-        Msg, # [payload_type_union],
-        # Generic[PayloadT],
-        # ^-XXX-^: not allowed? lul..
-    )
+    # defstruct_bases: tuple = (
+    #     Msg, # [payload_type_union],
+    #     # Generic[PayloadT],
+    #     # ^-XXX-^: not allowed? lul..
+    # )
     ipc_msg_types: list[Msg] = []
 
     idx_msg_types: list[Msg] = []
-    defs_msg_types: list[Msg] = []
+    # defs_msg_types: list[Msg] = []
     nc_msg_types: list[Msg] = []
 
     for msgtype in __msg_types__:
@@ -625,7 +625,7 @@ def mk_msg_spec(
         # TODO: wait why do we need the dynamic version here?
         # XXX ANSWER XXX -> BC INHERITANCE.. don't work w generics..
         #
-        # NOTE previously bc msgtypes WERE NOT inheritting
+        # NOTE previously bc msgtypes WERE NOT inheriting
         # directly the `Generic[PayloadT]` type, the manual method
         # of generic-paraming with `.__class_getitem__()` wasn't
         # working..
@@ -662,38 +662,35 @@ def mk_msg_spec(
         # with `msgspec.structs.defstruct`
         # XXX ALSO DOESN'T WORK
-        defstruct_msgtype = defstruct(
-            name=msgtype.__name__,
-            fields=[
-                ('cid', str),
-
-                # XXX doesn't seem to work..
-                # ('pld', PayloadT),
-
-                ('pld', payload_type_union),
-            ],
-            bases=defstruct_bases,
-        )
-        defs_msg_types.append(defstruct_msgtype)
+        # defstruct_msgtype = defstruct(
+        #     name=msgtype.__name__,
+        #     fields=[
+        #         ('cid', str),
+
+        #         # XXX doesn't seem to work..
+        #         # ('pld', PayloadT),
+
+        #         ('pld', payload_type_union),
+        #     ],
+        #     bases=defstruct_bases,
+        # )
+        # defs_msg_types.append(defstruct_msgtype)
 
         # assert index_paramed_msg_type == manual_paramed_msg_subtype
 
         # paramed_msg_type = manual_paramed_msg_subtype
 
         # ipc_payload_msgs_type_union |= index_paramed_msg_type
 
     idx_spec: Union[Type[Msg]] = Union[*idx_msg_types]
-    def_spec: Union[Type[Msg]] = Union[*defs_msg_types]
+    # def_spec: Union[Type[Msg]] = Union[*defs_msg_types]
     nc_spec: Union[Type[Msg]] = Union[*nc_msg_types]
 
     specs: dict[str, Union[Type[Msg]]] = {
         'indexed_generics': idx_spec,
-        'defstruct': def_spec,
+        # 'defstruct': def_spec,
         'types_new_class': nc_spec,
     }
     msgtypes_table: dict[str, list[Msg]] = {
         'indexed_generics': idx_msg_types,
-        'defstruct': defs_msg_types,
+        # 'defstruct': defs_msg_types,
         'types_new_class': nc_msg_types,
     }

File diff suppressed because it is too large

View File

@@ -29,3 +29,6 @@ from ._broadcast import (
     BroadcastReceiver as BroadcastReceiver,
     Lagged as Lagged,
 )
+from ._beg import (
+    collapse_eg as collapse_eg,
+)

View File

@ -0,0 +1,58 @@
# tractor: structured concurrent "actors".
# Copyright 2018-eternity Tyler Goodlet.
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
'''
`BaseExceptionGroup` related utils and helpers pertaining to
first-class-`trio` from a historical perspective B)
'''
from contextlib import (
asynccontextmanager as acm,
)
def maybe_collapse_eg(
beg: BaseExceptionGroup,
) -> BaseException:
'''
If the input beg can collapse to a single non-eg sub-exception,
return it instead.
'''
if len(excs := beg.exceptions) == 1:
return excs[0]
return beg
@acm
async def collapse_eg():
'''
If `BaseExceptionGroup` raised in the body scope is
"collapse-able" (in the same way that
`trio.open_nursery(strict_exception_groups=False)` works) then
only raise the lone emedded non-eg in in place.
'''
try:
yield
except* BaseException as beg:
if (
exc := maybe_collapse_eg(beg)
) is not beg:
raise exc
raise beg
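
A minimal demo of the collapse behavior (the failing task fn is made up for illustration):

    import trio
    from tractor.trionics import collapse_eg

    async def boom():
        raise ValueError('lone failure')

    async def main():
        try:
            # a strict-eg nursery (the modern `trio` default) would
            # normally raise `ExceptionGroup([ValueError])`; the
            # wrapper re-raises the lone sub-exc bare instead.
            async with (
                collapse_eg(),
                trio.open_nursery() as tn,
            ):
                tn.start_soon(boom)
        except ValueError as exc:
            print(f'caught bare {exc!r}, no eg-unwrapping needed')

    trio.run(main)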

View File

@@ -15,14 +15,14 @@
 # along with this program.  If not, see <https://www.gnu.org/licenses/>.
 
 '''
-``tokio`` style broadcast channel.
+`tokio` style broadcast channel.
 
 https://docs.rs/tokio/1.11.0/tokio/sync/broadcast/index.html
 
 '''
 from __future__ import annotations
 from abc import abstractmethod
 from collections import deque
-from contextlib import asynccontextmanager
+from contextlib import asynccontextmanager as acm
 from functools import partial
 from operator import ne
 from typing import (
@@ -398,7 +398,7 @@ class BroadcastReceiver(ReceiveChannel):
 
         return await self._receive_from_underlying(key, state)
 
-    @asynccontextmanager
+    @acm
     async def subscribe(
         self,
         raise_on_lag: bool = True,

View File

@@ -57,6 +57,8 @@ async def maybe_open_nursery(
     shield: bool = False,
     lib: ModuleType = trio,
 
+    **kwargs,  # proxy thru
+
 ) -> AsyncGenerator[trio.Nursery, Any]:
     '''
     Create a new nursery if None provided.
@@ -67,7 +69,7 @@ async def maybe_open_nursery(
     if nursery is not None:
         yield nursery
     else:
-        async with lib.open_nursery() as nursery:
+        async with lib.open_nursery(**kwargs) as nursery:
             nursery.cancel_scope.shield = shield
             yield nursery
@@ -143,9 +145,14 @@ async def gather_contexts(
             'Use a non-lazy iterator or sequence type instead!'
         )
 
-    async with trio.open_nursery() as n:
+    async with trio.open_nursery(
+        strict_exception_groups=False,
+        # ^XXX^ TODO? soo roll our own then ??
+        # -> since we kinda want the "if only one `.exception` then
+        # just raise that" interface?
+    ) as tn:
         for mngr in mngrs:
-            n.start_soon(
+            tn.start_soon(
                 _enter_and_wait,
                 mngr,
                 unwrapped,
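
With the new `**kwargs` proxy, callers can tune the allocated nursery when none was passed in; a sketch (assuming the leading `nursery` kwarg and the `tractor.trionics` export path):

    import trio
    from tractor.trionics import maybe_open_nursery

    async def main():
        # no nursery provided => one is allocated and the extra
        # kwargs are proxied thru to `trio.open_nursery()`.
        async with maybe_open_nursery(
            nursery=None,
            strict_exception_groups=False,
        ) as tn:
            tn.start_soon(trio.sleep, 0.1)

    trio.run(main)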

View File

@ -0,0 +1,322 @@
# tractor: structured concurrent "actors".
# Copyright 2018-eternity Tyler Goodlet.
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
'''
Erlang-style (ish) "one-cancels-one" nursery.
'''
from __future__ import annotations
from contextlib import (
asynccontextmanager as acm,
contextmanager as cm,
)
from functools import partial
from typing import (
Generator,
Any,
)
from outcome import (
Outcome,
acapture,
)
from msgspec import Struct
import trio
from trio._core._run import (
Task,
CancelScope,
Nursery,
)
class TaskOutcome(Struct):
'''
The outcome of a scheduled ``trio`` task which includes an interface
for synchronizing to the completion of the task's runtime and access
to the eventual boxed result/value or raised exception.
'''
lowlevel_task: Task
_exited = trio.Event() # as per `trio.Runner.task_exited()`
_outcome: Outcome | None = None # as per `outcome.Outcome`
_result: Any | None = None # the eventual maybe-returned-value
@property
def result(self) -> Any:
'''
Either Any or None depending on whether the Outcome has compeleted.
'''
if self._outcome is None:
raise RuntimeError(
f'Task {self.lowlevel_task.name} is not complete.\n'
'First wait on `await TaskOutcome.wait_for_result()`!'
)
return self._result
def _set_outcome(
self,
outcome: Outcome,
):
'''
Set the ``Outcome`` for this task.
This method should only ever be called by the task's supervising
nursery implemenation.
'''
self._outcome = outcome
self._result = outcome.unwrap()
self._exited.set()
async def wait_for_result(self) -> Any:
'''
Unwind the underlying task's ``Outcome`` by async waiting for
the task to first complete and then unwrap it's result-value.
'''
if self._exited.is_set():
return self._result
await self._exited.wait()
out = self._outcome
if out is None:
raise ValueError(f'{out} is not an outcome!?')
return self.result
class TaskManagerNursery(Struct):
_n: Nursery
_scopes: dict[
Task,
tuple[CancelScope, Outcome]
] = {}
task_manager: Generator[Any, Outcome, None] | None = None
async def start_soon(
self,
async_fn,
*args,
name=None,
task_manager: Generator[Any, Outcome, None] | None = None
) -> tuple[CancelScope, Task]:
# NOTE: internals of a nursery don't let you know what
# the most recently spawned task is by order.. so we'd
# have to either change that or do set ops.
# pre_start_tasks: set[Task] = n._children.copy()
# new_tasks = n._children - pre_start_Tasks
# assert len(new_tasks) == 1
# task = new_tasks.pop()
n: Nursery = self._n
sm = self.task_manager
# we do default behavior of a scope-per-nursery
# if the user did not provide a task manager.
if sm is None:
return n.start_soon(async_fn, *args, name=None)
new_task: Task | None = None
to_return: tuple[Any] | None = None
# NOTE: what do we enforce as a signature for the
# `@task_scope_manager` here?
mngr = sm(nursery=n)
async def _start_wrapped_in_scope(
task_status: TaskStatus[
tuple[CancelScope, Task]
] = trio.TASK_STATUS_IGNORED,
) -> None:
# TODO: this was working before?! and, do we need something
# like it to implement `.start()`?
# nonlocal to_return
# execute up to the first yield
try:
to_return: tuple[Any] = next(mngr)
except StopIteration:
raise RuntimeError("task manager didn't yield") from None
# TODO: how do we support `.start()` style?
# - relay through whatever the
# started task passes back via `.started()` ?
# seems like that won't work with also returning
# a "task handle"?
# - we were previously binding-out this `to_return` to
# the parent's lexical scope, why isn't that working
# now?
task_status.started(to_return)
# invoke underlying func now that cs is entered.
outcome = await acapture(async_fn, *args)
# execute from the 1st yield to return and expect
# generator-mngr `@task_scope_manager` thinger to
# terminate!
try:
mngr.send(outcome)
# I would presume it's better to have a handle to
# the `Outcome` entirely? This method sends *into*
# the mngr this `Outcome.value`; seems like kinda
# weird semantics for our purposes?
# outcome.send(mngr)
except StopIteration:
return
else:
raise RuntimeError(f"{mngr} didn't stop!")
to_return = await n.start(_start_wrapped_in_scope)
assert to_return is not None
# TODO: use the fancy type-check-time type signature stuff from
# mypy i guess..to like, relay the type of whatever the
# generator yielded through? betcha that'll be un-grokable XD
return to_return
# TODO: define a decorator to runtime type check that this a generator
# with a single yield that also delivers a value (of some std type) from
# the yield expression?
# @trio.task_manager
def add_task_handle_and_crash_handling(
nursery: Nursery,
debug_mode: bool = False,
) -> Generator[
Any,
Outcome,
None,
]:
'''
A customizable, user defined "task scope manager".
With this specially crafted single-yield generator function you can
add more granular controls around every task spawned by `trio` B)
'''
# if you need it you can ask trio for the task obj
task: Task = trio.lowlevel.current_task()
print(f'Spawning task: {task.name}')
# User defined "task handle" for more granular supervision
# of each spawned task as needed for their particular usage.
task_outcome = TaskOutcome(task)
# NOTE: if wanted the user could wrap the output task handle however
# they want!
# class TaskHandle(Struct):
# task: Task
# cs: CancelScope
# outcome: TaskOutcome
# this yields back when the task is terminated, cancelled or returns.
try:
with CancelScope() as cs:
# the yielded value(s) here are what are returned to the
# nursery's `.start_soon()` caller B)
lowlevel_outcome: Outcome = yield (task_outcome, cs)
task_outcome._set_outcome(lowlevel_outcome)
# Adds "crash handling" from `pdbp` by entering
# a REPL on std errors.
except Exception as err:
print(f'{task.name} crashed, entering debugger!')
if debug_mode:
import pdbp
pdbp.xpm()
raise
finally:
print(f'{task.name} Exitted')
@acm
async def open_nursery(
task_manager: Generator[Any, Outcome, None] | None = None,
**lowlevel_nursery_kwargs,
):
async with trio.open_nursery(**lowlevel_nursery_kwargs) as nurse:
yield TaskManagerNursery(
nurse,
task_manager=task_manager,
)
async def sleep_then_return_val(val: str):
await trio.sleep(0.2)
return val
async def ensure_cancelled():
try:
await trio.sleep_forever()
except trio.Cancelled:
task = trio.lowlevel.current_task()
print(f'heyyo ONLY {task.name} was cancelled as expected B)')
assert 0
except BaseException:
raise RuntimeError("woa woa woa this ain't right!")
if __name__ == '__main__':
async def main():
async with open_nursery(
task_manager=partial(
add_task_handle_and_crash_handling,
debug_mode=True,
),
) as sn:
for _ in range(3):
outcome, _ = await sn.start_soon(trio.sleep_forever)
# extra task we want to engage in debugger post mortem.
err_outcome, cs = await sn.start_soon(ensure_cancelled)
val: str = 'yoyoyo'
val_outcome, _ = await sn.start_soon(
sleep_then_return_val,
val,
)
res = await val_outcome.wait_for_result()
assert res == val
print(f'{res} -> GOT EXPECTED TASK VALUE')
await trio.sleep(0.6)
print(
f'Cancelling and waiting on {err_outcome.lowlevel_task} '
'to CRASH..'
)
cs.cancel()
trio.run(main)

uv.lock 100644 (new file, +535 lines)
View File

@ -0,0 +1,535 @@
version = 1
revision = 1
requires-python = ">=3.11"
[[package]]
name = "async-generator"
version = "1.10"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/ce/b6/6fa6b3b598a03cba5e80f829e0dadbb49d7645f523d209b2fb7ea0bbb02a/async_generator-1.10.tar.gz", hash = "sha256:6ebb3d106c12920aaae42ccb6f787ef5eefdcdd166ea3d628fa8476abe712144", size = 29870 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/71/52/39d20e03abd0ac9159c162ec24b93fbcaa111e8400308f2465432495ca2b/async_generator-1.10-py3-none-any.whl", hash = "sha256:01c7bf666359b4967d2cda0000cc2e4af16a0ae098cbffcb8472fb9e8ad6585b", size = 18857 },
]
[[package]]
name = "attrs"
version = "24.3.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/48/c8/6260f8ccc11f0917360fc0da435c5c9c7504e3db174d5a12a1494887b045/attrs-24.3.0.tar.gz", hash = "sha256:8f5c07333d543103541ba7be0e2ce16eeee8130cb0b3f9238ab904ce1e85baff", size = 805984 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/89/aa/ab0f7891a01eeb2d2e338ae8fecbe57fcebea1a24dbb64d45801bfab481d/attrs-24.3.0-py3-none-any.whl", hash = "sha256:ac96cd038792094f438ad1f6ff80837353805ac950cd2aa0e0625ef19850c308", size = 63397 },
]
[[package]]
name = "cffi"
version = "1.17.1"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "pycparser" },
]
sdist = { url = "https://files.pythonhosted.org/packages/fc/97/c783634659c2920c3fc70419e3af40972dbaf758daa229a7d6ea6135c90d/cffi-1.17.1.tar.gz", hash = "sha256:1c39c6016c32bc48dd54561950ebd6836e1670f2ae46128f67cf49e789c52824", size = 516621 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/34/33/e1b8a1ba29025adbdcda5fb3a36f94c03d771c1b7b12f726ff7fef2ebe36/cffi-1.17.1-cp311-cp311-win32.whl", hash = "sha256:85a950a4ac9c359340d5963966e3e0a94a676bd6245a4b55bc43949eee26a655", size = 171727 },
{ url = "https://files.pythonhosted.org/packages/3d/97/50228be003bb2802627d28ec0627837ac0bf35c90cf769812056f235b2d1/cffi-1.17.1-cp311-cp311-win_amd64.whl", hash = "sha256:caaf0640ef5f5517f49bc275eca1406b0ffa6aa184892812030f04c2abf589a0", size = 181400 },
{ url = "https://files.pythonhosted.org/packages/86/c5/28b2d6f799ec0bdecf44dced2ec5ed43e0eb63097b0f58c293583b406582/cffi-1.17.1-cp312-cp312-win32.whl", hash = "sha256:a08d7e755f8ed21095a310a693525137cfe756ce62d066e53f502a83dc550f65", size = 172448 },
{ url = "https://files.pythonhosted.org/packages/50/b9/db34c4755a7bd1cb2d1603ac3863f22bcecbd1ba29e5ee841a4bc510b294/cffi-1.17.1-cp312-cp312-win_amd64.whl", hash = "sha256:51392eae71afec0d0c8fb1a53b204dbb3bcabcb3c9b807eedf3e1e6ccf2de903", size = 181976 },
{ url = "https://files.pythonhosted.org/packages/bf/ee/f94057fa6426481d663b88637a9a10e859e492c73d0384514a17d78ee205/cffi-1.17.1-cp313-cp313-win32.whl", hash = "sha256:e03eab0a8677fa80d646b5ddece1cbeaf556c313dcfac435ba11f107ba117b5d", size = 172475 },
{ url = "https://files.pythonhosted.org/packages/7c/fc/6a8cb64e5f0324877d503c854da15d76c1e50eb722e320b15345c4d0c6de/cffi-1.17.1-cp313-cp313-win_amd64.whl", hash = "sha256:f6a16c31041f09ead72d69f583767292f750d24913dadacf5756b966aacb3f1a", size = 182009 },
]
[[package]]
name = "colorama"
version = "0.4.6"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/d8/53/6f443c9a4a8358a93a6792e2acffb9d9d5cb0a5cfd8802644b7b1c9a02e4/colorama-0.4.6.tar.gz", hash = "sha256:08695f5cb7ed6e0531a20572697297273c47b8cae5a63ffc6d6ed5c201be6e44", size = 27697 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/d1/d6/3965ed04c63042e047cb6a3e6ed1a63a35087b6a609aa3a15ed8ac56c221/colorama-0.4.6-py2.py3-none-any.whl", hash = "sha256:4f1d9991f5acc0ca119f9d443620b77f9d6b33703e51011c16baf57afb285fc6", size = 25335 },
]
[[package]]
name = "colorlog"
version = "6.9.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "colorama", marker = "sys_platform == 'win32'" },
]
sdist = { url = "https://files.pythonhosted.org/packages/d3/7a/359f4d5df2353f26172b3cc39ea32daa39af8de522205f512f458923e677/colorlog-6.9.0.tar.gz", hash = "sha256:bfba54a1b93b94f54e1f4fe48395725a3d92fd2a4af702f6bd70946bdc0c6ac2", size = 16624 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/e3/51/9b208e85196941db2f0654ad0357ca6388ab3ed67efdbfc799f35d1f83aa/colorlog-6.9.0-py3-none-any.whl", hash = "sha256:5906e71acd67cb07a71e779c47c4bcb45fb8c2993eebe9e5adcd6a6f1b283eff", size = 11424 },
]
[[package]]
name = "greenback"
version = "1.2.1"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "greenlet" },
{ name = "outcome" },
{ name = "sniffio" },
]
sdist = { url = "https://files.pythonhosted.org/packages/dc/c1/ab3a42c0f3ed56df9cd33de1539b3198d98c6ccbaf88a73d6be0b72d85e0/greenback-1.2.1.tar.gz", hash = "sha256:de3ca656885c03b96dab36079f3de74bb5ba061da9bfe3bb69dccc866ef95ea3", size = 42597 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/71/d0/b8dc79d5ecfffacad9c844b6ae76b9c6259935796d3c561deccbf8fa421d/greenback-1.2.1-py3-none-any.whl", hash = "sha256:98768edbbe4340091a9730cf64a683fcbaa3f2cb81e4ac41d7ed28d3b6f74b79", size = 28062 },
]
[[package]]
name = "greenlet"
version = "3.1.1"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/2f/ff/df5fede753cc10f6a5be0931204ea30c35fa2f2ea7a35b25bdaf4fe40e46/greenlet-3.1.1.tar.gz", hash = "sha256:4ce3ac6cdb6adf7946475d7ef31777c26d94bccc377e070a7986bd2d5c515467", size = 186022 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/28/62/1c2665558618553c42922ed47a4e6d6527e2fa3516a8256c2f431c5d0441/greenlet-3.1.1-cp311-cp311-macosx_11_0_universal2.whl", hash = "sha256:e4d333e558953648ca09d64f13e6d8f0523fa705f51cae3f03b5983489958c70", size = 272479 },
{ url = "https://files.pythonhosted.org/packages/76/9d/421e2d5f07285b6e4e3a676b016ca781f63cfe4a0cd8eaecf3fd6f7a71ae/greenlet-3.1.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:09fc016b73c94e98e29af67ab7b9a879c307c6731a2c9da0db5a7d9b7edd1159", size = 640404 },
{ url = "https://files.pythonhosted.org/packages/e5/de/6e05f5c59262a584e502dd3d261bbdd2c97ab5416cc9c0b91ea38932a901/greenlet-3.1.1-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d5e975ca70269d66d17dd995dafc06f1b06e8cb1ec1e9ed54c1d1e4a7c4cf26e", size = 652813 },
{ url = "https://files.pythonhosted.org/packages/49/93/d5f93c84241acdea15a8fd329362c2c71c79e1a507c3f142a5d67ea435ae/greenlet-3.1.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3b2813dc3de8c1ee3f924e4d4227999285fd335d1bcc0d2be6dc3f1f6a318ec1", size = 648517 },
{ url = "https://files.pythonhosted.org/packages/15/85/72f77fc02d00470c86a5c982b8daafdf65d38aefbbe441cebff3bf7037fc/greenlet-3.1.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e347b3bfcf985a05e8c0b7d462ba6f15b1ee1c909e2dcad795e49e91b152c383", size = 647831 },
{ url = "https://files.pythonhosted.org/packages/f7/4b/1c9695aa24f808e156c8f4813f685d975ca73c000c2a5056c514c64980f6/greenlet-3.1.1-cp311-cp311-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:9e8f8c9cb53cdac7ba9793c276acd90168f416b9ce36799b9b885790f8ad6c0a", size = 602413 },
{ url = "https://files.pythonhosted.org/packages/76/70/ad6e5b31ef330f03b12559d19fda2606a522d3849cde46b24f223d6d1619/greenlet-3.1.1-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:62ee94988d6b4722ce0028644418d93a52429e977d742ca2ccbe1c4f4a792511", size = 1129619 },
{ url = "https://files.pythonhosted.org/packages/f4/fb/201e1b932e584066e0f0658b538e73c459b34d44b4bd4034f682423bc801/greenlet-3.1.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:1776fd7f989fc6b8d8c8cb8da1f6b82c5814957264d1f6cf818d475ec2bf6395", size = 1155198 },
{ url = "https://files.pythonhosted.org/packages/12/da/b9ed5e310bb8b89661b80cbcd4db5a067903bbcd7fc854923f5ebb4144f0/greenlet-3.1.1-cp311-cp311-win_amd64.whl", hash = "sha256:48ca08c771c268a768087b408658e216133aecd835c0ded47ce955381105ba39", size = 298930 },
{ url = "https://files.pythonhosted.org/packages/7d/ec/bad1ac26764d26aa1353216fcbfa4670050f66d445448aafa227f8b16e80/greenlet-3.1.1-cp312-cp312-macosx_11_0_universal2.whl", hash = "sha256:4afe7ea89de619adc868e087b4d2359282058479d7cfb94970adf4b55284574d", size = 274260 },
{ url = "https://files.pythonhosted.org/packages/66/d4/c8c04958870f482459ab5956c2942c4ec35cac7fe245527f1039837c17a9/greenlet-3.1.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f406b22b7c9a9b4f8aa9d2ab13d6ae0ac3e85c9a809bd590ad53fed2bf70dc79", size = 649064 },
{ url = "https://files.pythonhosted.org/packages/51/41/467b12a8c7c1303d20abcca145db2be4e6cd50a951fa30af48b6ec607581/greenlet-3.1.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:c3a701fe5a9695b238503ce5bbe8218e03c3bcccf7e204e455e7462d770268aa", size = 663420 },
{ url = "https://files.pythonhosted.org/packages/27/8f/2a93cd9b1e7107d5c7b3b7816eeadcac2ebcaf6d6513df9abaf0334777f6/greenlet-3.1.1-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2846930c65b47d70b9d178e89c7e1a69c95c1f68ea5aa0a58646b7a96df12441", size = 658035 },
{ url = "https://files.pythonhosted.org/packages/57/5c/7c6f50cb12be092e1dccb2599be5a942c3416dbcfb76efcf54b3f8be4d8d/greenlet-3.1.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:99cfaa2110534e2cf3ba31a7abcac9d328d1d9f1b95beede58294a60348fba36", size = 660105 },
{ url = "https://files.pythonhosted.org/packages/f1/66/033e58a50fd9ec9df00a8671c74f1f3a320564c6415a4ed82a1c651654ba/greenlet-3.1.1-cp312-cp312-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:1443279c19fca463fc33e65ef2a935a5b09bb90f978beab37729e1c3c6c25fe9", size = 613077 },
{ url = "https://files.pythonhosted.org/packages/19/c5/36384a06f748044d06bdd8776e231fadf92fc896bd12cb1c9f5a1bda9578/greenlet-3.1.1-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:b7cede291382a78f7bb5f04a529cb18e068dd29e0fb27376074b6d0317bf4dd0", size = 1135975 },
{ url = "https://files.pythonhosted.org/packages/38/f9/c0a0eb61bdf808d23266ecf1d63309f0e1471f284300ce6dac0ae1231881/greenlet-3.1.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:23f20bb60ae298d7d8656c6ec6db134bca379ecefadb0b19ce6f19d1f232a942", size = 1163955 },
{ url = "https://files.pythonhosted.org/packages/43/21/a5d9df1d21514883333fc86584c07c2b49ba7c602e670b174bd73cfc9c7f/greenlet-3.1.1-cp312-cp312-win_amd64.whl", hash = "sha256:7124e16b4c55d417577c2077be379514321916d5790fa287c9ed6f23bd2ffd01", size = 299655 },
{ url = "https://files.pythonhosted.org/packages/f3/57/0db4940cd7bb461365ca8d6fd53e68254c9dbbcc2b452e69d0d41f10a85e/greenlet-3.1.1-cp313-cp313-macosx_11_0_universal2.whl", hash = "sha256:05175c27cb459dcfc05d026c4232f9de8913ed006d42713cb8a5137bd49375f1", size = 272990 },
{ url = "https://files.pythonhosted.org/packages/1c/ec/423d113c9f74e5e402e175b157203e9102feeb7088cee844d735b28ef963/greenlet-3.1.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:935e943ec47c4afab8965954bf49bfa639c05d4ccf9ef6e924188f762145c0ff", size = 649175 },
{ url = "https://files.pythonhosted.org/packages/a9/46/ddbd2db9ff209186b7b7c621d1432e2f21714adc988703dbdd0e65155c77/greenlet-3.1.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:667a9706c970cb552ede35aee17339a18e8f2a87a51fba2ed39ceeeb1004798a", size = 663425 },
{ url = "https://files.pythonhosted.org/packages/bc/f9/9c82d6b2b04aa37e38e74f0c429aece5eeb02bab6e3b98e7db89b23d94c6/greenlet-3.1.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b8a678974d1f3aa55f6cc34dc480169d58f2e6d8958895d68845fa4ab566509e", size = 657736 },
{ url = "https://files.pythonhosted.org/packages/d9/42/b87bc2a81e3a62c3de2b0d550bf91a86939442b7ff85abb94eec3fc0e6aa/greenlet-3.1.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:efc0f674aa41b92da8c49e0346318c6075d734994c3c4e4430b1c3f853e498e4", size = 660347 },
{ url = "https://files.pythonhosted.org/packages/37/fa/71599c3fd06336cdc3eac52e6871cfebab4d9d70674a9a9e7a482c318e99/greenlet-3.1.1-cp313-cp313-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:0153404a4bb921f0ff1abeb5ce8a5131da56b953eda6e14b88dc6bbc04d2049e", size = 615583 },
{ url = "https://files.pythonhosted.org/packages/4e/96/e9ef85de031703ee7a4483489b40cf307f93c1824a02e903106f2ea315fe/greenlet-3.1.1-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:275f72decf9932639c1c6dd1013a1bc266438eb32710016a1c742df5da6e60a1", size = 1133039 },
{ url = "https://files.pythonhosted.org/packages/87/76/b2b6362accd69f2d1889db61a18c94bc743e961e3cab344c2effaa4b4a25/greenlet-3.1.1-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:c4aab7f6381f38a4b42f269057aee279ab0fc7bf2e929e3d4abfae97b682a12c", size = 1160716 },
{ url = "https://files.pythonhosted.org/packages/1f/1b/54336d876186920e185066d8c3024ad55f21d7cc3683c856127ddb7b13ce/greenlet-3.1.1-cp313-cp313-win_amd64.whl", hash = "sha256:b42703b1cf69f2aa1df7d1030b9d77d3e584a70755674d60e710f0af570f3761", size = 299490 },
{ url = "https://files.pythonhosted.org/packages/5f/17/bea55bf36990e1638a2af5ba10c1640273ef20f627962cf97107f1e5d637/greenlet-3.1.1-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f1695e76146579f8c06c1509c7ce4dfe0706f49c6831a817ac04eebb2fd02011", size = 643731 },
{ url = "https://files.pythonhosted.org/packages/78/d2/aa3d2157f9ab742a08e0fd8f77d4699f37c22adfbfeb0c610a186b5f75e0/greenlet-3.1.1-cp313-cp313t-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:7876452af029456b3f3549b696bb36a06db7c90747740c5302f74a9e9fa14b13", size = 649304 },
{ url = "https://files.pythonhosted.org/packages/f1/8e/d0aeffe69e53ccff5a28fa86f07ad1d2d2d6537a9506229431a2a02e2f15/greenlet-3.1.1-cp313-cp313t-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:4ead44c85f8ab905852d3de8d86f6f8baf77109f9da589cb4fa142bd3b57b475", size = 646537 },
{ url = "https://files.pythonhosted.org/packages/05/79/e15408220bbb989469c8871062c97c6c9136770657ba779711b90870d867/greenlet-3.1.1-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8320f64b777d00dd7ccdade271eaf0cad6636343293a25074cc5566160e4de7b", size = 642506 },
{ url = "https://files.pythonhosted.org/packages/18/87/470e01a940307796f1d25f8167b551a968540fbe0551c0ebb853cb527dd6/greenlet-3.1.1-cp313-cp313t-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:6510bf84a6b643dabba74d3049ead221257603a253d0a9873f55f6a59a65f822", size = 602753 },
{ url = "https://files.pythonhosted.org/packages/e2/72/576815ba674eddc3c25028238f74d7b8068902b3968cbe456771b166455e/greenlet-3.1.1-cp313-cp313t-musllinux_1_1_aarch64.whl", hash = "sha256:04b013dc07c96f83134b1e99888e7a79979f1a247e2a9f59697fa14b5862ed01", size = 1122731 },
{ url = "https://files.pythonhosted.org/packages/ac/38/08cc303ddddc4b3d7c628c3039a61a3aae36c241ed01393d00c2fd663473/greenlet-3.1.1-cp313-cp313t-musllinux_1_1_x86_64.whl", hash = "sha256:411f015496fec93c1c8cd4e5238da364e1da7a124bcb293f085bf2860c32c6f6", size = 1142112 },
]
[[package]]
name = "idna"
version = "3.10"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/f1/70/7703c29685631f5a7590aa73f1f1d3fa9a380e654b86af429e0934a32f7d/idna-3.10.tar.gz", hash = "sha256:12f65c9b470abda6dc35cf8e63cc574b1c52b11df2c86030af0ac09b01b13ea9", size = 190490 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/76/c6/c88e154df9c4e1a2a66ccf0005a88dfb2650c1dffb6f5ce603dfbd452ce3/idna-3.10-py3-none-any.whl", hash = "sha256:946d195a0d259cbba61165e88e65941f16e9b36ea6ddb97f00452bae8b1287d3", size = 70442 },
]
[[package]]
name = "importlib-metadata"
version = "8.6.1"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "zipp" },
]
sdist = { url = "https://files.pythonhosted.org/packages/33/08/c1395a292bb23fd03bdf572a1357c5a733d3eecbab877641ceacab23db6e/importlib_metadata-8.6.1.tar.gz", hash = "sha256:310b41d755445d74569f993ccfc22838295d9fe005425094fad953d7f15c8580", size = 55767 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/79/9d/0fb148dc4d6fa4a7dd1d8378168d9b4cd8d4560a6fbf6f0121c5fc34eb68/importlib_metadata-8.6.1-py3-none-any.whl", hash = "sha256:02a89390c1e15fdfdc0d7c6b25cb3e62650d0494005c97d6f148bf5b9787525e", size = 26971 },
]
[[package]]
name = "iniconfig"
version = "2.0.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/d7/4b/cbd8e699e64a6f16ca3a8220661b5f83792b3017d0f79807cb8708d33913/iniconfig-2.0.0.tar.gz", hash = "sha256:2d91e135bf72d31a410b17c16da610a82cb55f6b0477d1a902134b24a455b8b3", size = 4646 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/ef/a6/62565a6e1cf69e10f5727360368e451d4b7f58beeac6173dc9db836a5b46/iniconfig-2.0.0-py3-none-any.whl", hash = "sha256:b6a85871a79d2e3b22d2d1b94ac2824226a63c6b741c88f7ae975f18b6778374", size = 5892 },
]
[[package]]
name = "msgspec"
version = "0.19.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/cf/9b/95d8ce458462b8b71b8a70fa94563b2498b89933689f3a7b8911edfae3d7/msgspec-0.19.0.tar.gz", hash = "sha256:604037e7cd475345848116e89c553aa9a233259733ab51986ac924ab1b976f8e", size = 216934 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/24/d4/2ec2567ac30dab072cce3e91fb17803c52f0a37aab6b0c24375d2b20a581/msgspec-0.19.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:aa77046904db764b0462036bc63ef71f02b75b8f72e9c9dd4c447d6da1ed8f8e", size = 187939 },
{ url = "https://files.pythonhosted.org/packages/2b/c0/18226e4328897f4f19875cb62bb9259fe47e901eade9d9376ab5f251a929/msgspec-0.19.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:047cfa8675eb3bad68722cfe95c60e7afabf84d1bd8938979dd2b92e9e4a9551", size = 182202 },
{ url = "https://files.pythonhosted.org/packages/81/25/3a4b24d468203d8af90d1d351b77ea3cffb96b29492855cf83078f16bfe4/msgspec-0.19.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e78f46ff39a427e10b4a61614a2777ad69559cc8d603a7c05681f5a595ea98f7", size = 209029 },
{ url = "https://files.pythonhosted.org/packages/85/2e/db7e189b57901955239f7689b5dcd6ae9458637a9c66747326726c650523/msgspec-0.19.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6c7adf191e4bd3be0e9231c3b6dc20cf1199ada2af523885efc2ed218eafd011", size = 210682 },
{ url = "https://files.pythonhosted.org/packages/03/97/7c8895c9074a97052d7e4a1cc1230b7b6e2ca2486714eb12c3f08bb9d284/msgspec-0.19.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:f04cad4385e20be7c7176bb8ae3dca54a08e9756cfc97bcdb4f18560c3042063", size = 214003 },
{ url = "https://files.pythonhosted.org/packages/61/61/e892997bcaa289559b4d5869f066a8021b79f4bf8e955f831b095f47a4cd/msgspec-0.19.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:45c8fb410670b3b7eb884d44a75589377c341ec1392b778311acdbfa55187716", size = 216833 },
{ url = "https://files.pythonhosted.org/packages/ce/3d/71b2dffd3a1c743ffe13296ff701ee503feaebc3f04d0e75613b6563c374/msgspec-0.19.0-cp311-cp311-win_amd64.whl", hash = "sha256:70eaef4934b87193a27d802534dc466778ad8d536e296ae2f9334e182ac27b6c", size = 186184 },
{ url = "https://files.pythonhosted.org/packages/b2/5f/a70c24f075e3e7af2fae5414c7048b0e11389685b7f717bb55ba282a34a7/msgspec-0.19.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:f98bd8962ad549c27d63845b50af3f53ec468b6318400c9f1adfe8b092d7b62f", size = 190485 },
{ url = "https://files.pythonhosted.org/packages/89/b0/1b9763938cfae12acf14b682fcf05c92855974d921a5a985ecc197d1c672/msgspec-0.19.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:43bbb237feab761b815ed9df43b266114203f53596f9b6e6f00ebd79d178cdf2", size = 183910 },
{ url = "https://files.pythonhosted.org/packages/87/81/0c8c93f0b92c97e326b279795f9c5b956c5a97af28ca0fbb9fd86c83737a/msgspec-0.19.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4cfc033c02c3e0aec52b71710d7f84cb3ca5eb407ab2ad23d75631153fdb1f12", size = 210633 },
{ url = "https://files.pythonhosted.org/packages/d0/ef/c5422ce8af73928d194a6606f8ae36e93a52fd5e8df5abd366903a5ca8da/msgspec-0.19.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d911c442571605e17658ca2b416fd8579c5050ac9adc5e00c2cb3126c97f73bc", size = 213594 },
{ url = "https://files.pythonhosted.org/packages/19/2b/4137bc2ed45660444842d042be2cf5b18aa06efd2cda107cff18253b9653/msgspec-0.19.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:757b501fa57e24896cf40a831442b19a864f56d253679f34f260dcb002524a6c", size = 214053 },
{ url = "https://files.pythonhosted.org/packages/9d/e6/8ad51bdc806aac1dc501e8fe43f759f9ed7284043d722b53323ea421c360/msgspec-0.19.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:5f0f65f29b45e2816d8bded36e6b837a4bf5fb60ec4bc3c625fa2c6da4124537", size = 219081 },
{ url = "https://files.pythonhosted.org/packages/b1/ef/27dd35a7049c9a4f4211c6cd6a8c9db0a50647546f003a5867827ec45391/msgspec-0.19.0-cp312-cp312-win_amd64.whl", hash = "sha256:067f0de1c33cfa0b6a8206562efdf6be5985b988b53dd244a8e06f993f27c8c0", size = 187467 },
{ url = "https://files.pythonhosted.org/packages/3c/cb/2842c312bbe618d8fefc8b9cedce37f773cdc8fa453306546dba2c21fd98/msgspec-0.19.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:f12d30dd6266557aaaf0aa0f9580a9a8fbeadfa83699c487713e355ec5f0bd86", size = 190498 },
{ url = "https://files.pythonhosted.org/packages/58/95/c40b01b93465e1a5f3b6c7d91b10fb574818163740cc3acbe722d1e0e7e4/msgspec-0.19.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:82b2c42c1b9ebc89e822e7e13bbe9d17ede0c23c187469fdd9505afd5a481314", size = 183950 },
{ url = "https://files.pythonhosted.org/packages/e8/f0/5b764e066ce9aba4b70d1db8b087ea66098c7c27d59b9dd8a3532774d48f/msgspec-0.19.0-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:19746b50be214a54239aab822964f2ac81e38b0055cca94808359d779338c10e", size = 210647 },
{ url = "https://files.pythonhosted.org/packages/9d/87/bc14f49bc95c4cb0dd0a8c56028a67c014ee7e6818ccdce74a4862af259b/msgspec-0.19.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:60ef4bdb0ec8e4ad62e5a1f95230c08efb1f64f32e6e8dd2ced685bcc73858b5", size = 213563 },
{ url = "https://files.pythonhosted.org/packages/53/2f/2b1c2b056894fbaa975f68f81e3014bb447516a8b010f1bed3fb0e016ed7/msgspec-0.19.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:ac7f7c377c122b649f7545810c6cd1b47586e3aa3059126ce3516ac7ccc6a6a9", size = 213996 },
{ url = "https://files.pythonhosted.org/packages/aa/5a/4cd408d90d1417e8d2ce6a22b98a6853c1b4d7cb7669153e4424d60087f6/msgspec-0.19.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:a5bc1472223a643f5ffb5bf46ccdede7f9795078194f14edd69e3aab7020d327", size = 219087 },
{ url = "https://files.pythonhosted.org/packages/23/d8/f15b40611c2d5753d1abb0ca0da0c75348daf1252220e5dda2867bd81062/msgspec-0.19.0-cp313-cp313-win_amd64.whl", hash = "sha256:317050bc0f7739cb30d257ff09152ca309bf5a369854bbf1e57dffc310c1f20f", size = 187432 },
]

[[package]]
name = "mypy-extensions"
version = "1.0.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/98/a4/1ab47638b92648243faf97a5aeb6ea83059cc3624972ab6b8d2316078d3f/mypy_extensions-1.0.0.tar.gz", hash = "sha256:75dbf8955dc00442a438fc4d0666508a9a97b6bd41aa2f0ffe9d2f2725af0782", size = 4433 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/2a/e2/5d3f6ada4297caebe1a2add3b126fe800c96f56dbe5d1988a2cbe0b267aa/mypy_extensions-1.0.0-py3-none-any.whl", hash = "sha256:4392f6c0eb8a5668a69e23d168ffa70f0be9ccfd32b5cc2d26a34ae5b844552d", size = 4695 },
]

[[package]]
name = "outcome"
version = "1.3.0.post0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "attrs" },
]
sdist = { url = "https://files.pythonhosted.org/packages/98/df/77698abfac98571e65ffeb0c1fba8ffd692ab8458d617a0eed7d9a8d38f2/outcome-1.3.0.post0.tar.gz", hash = "sha256:9dcf02e65f2971b80047b377468e72a268e15c0af3cf1238e6ff14f7f91143b8", size = 21060 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/55/8b/5ab7257531a5d830fc8000c476e63c935488d74609b50f9384a643ec0a62/outcome-1.3.0.post0-py2.py3-none-any.whl", hash = "sha256:e771c5ce06d1415e356078d3bdd68523f284b4ce5419828922b6871e65eda82b", size = 10692 },
]

[[package]]
name = "packaging"
version = "24.2"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/d0/63/68dbb6eb2de9cb10ee4c9c14a0148804425e13c4fb20d61cce69f53106da/packaging-24.2.tar.gz", hash = "sha256:c228a6dc5e932d346bc5739379109d49e8853dd8223571c7c5b55260edc0b97f", size = 163950 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/88/ef/eb23f262cca3c0c4eb7ab1933c3b1f03d021f2c48f54763065b6f0e321be/packaging-24.2-py3-none-any.whl", hash = "sha256:09abb1bccd265c01f4a3aa3f7a7db064b36514d2cba19a2f694fe6150451a759", size = 65451 },
]

[[package]]
name = "pdbp"
version = "1.6.1"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "colorama", marker = "sys_platform == 'win32'" },
{ name = "pygments" },
{ name = "tabcompleter" },
]
sdist = { url = "https://files.pythonhosted.org/packages/69/13/80da03638f62facbee76312ca9ee5941c017b080f2e4c6919fd4e87e16e3/pdbp-1.6.1.tar.gz", hash = "sha256:f4041642952a05df89664e166d5bd379607a0866ddd753c06874f65552bdf40b", size = 25322 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/29/93/d56fb9ba5569dc29d8263c72e46d21a2fd38741339ebf03f54cf7561828c/pdbp-1.6.1-py3-none-any.whl", hash = "sha256:f10bad2ee044c0e5c168cb0825abfdbdc01c50013e9755df5261b060bdd35c22", size = 21495 },
]

[[package]]
name = "pexpect"
version = "4.9.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "ptyprocess" },
]
sdist = { url = "https://files.pythonhosted.org/packages/42/92/cc564bf6381ff43ce1f4d06852fc19a2f11d180f23dc32d9588bee2f149d/pexpect-4.9.0.tar.gz", hash = "sha256:ee7d41123f3c9911050ea2c2dac107568dc43b2d3b0c7557a33212c398ead30f", size = 166450 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/9e/c3/059298687310d527a58bb01f3b1965787ee3b40dce76752eda8b44e9a2c5/pexpect-4.9.0-py2.py3-none-any.whl", hash = "sha256:7236d1e080e4936be2dc3e326cec0af72acf9212a7e1d060210e70a47e253523", size = 63772 },
]

[[package]]
name = "pluggy"
version = "1.5.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/96/2d/02d4312c973c6050a18b314a5ad0b3210edb65a906f868e31c111dede4a6/pluggy-1.5.0.tar.gz", hash = "sha256:2cffa88e94fdc978c4c574f15f9e59b7f4201d439195c3715ca9e2486f1d0cf1", size = 67955 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/88/5f/e351af9a41f866ac3f1fac4ca0613908d9a41741cfcf2228f4ad853b697d/pluggy-1.5.0-py3-none-any.whl", hash = "sha256:44e1ad92c8ca002de6377e165f3e0f1be63266ab4d554740532335b9d75ea669", size = 20556 },
]

[[package]]
name = "prompt-toolkit"
version = "3.0.50"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "wcwidth" },
]
sdist = { url = "https://files.pythonhosted.org/packages/a1/e1/bd15cb8ffdcfeeb2bdc215de3c3cffca11408d829e4b8416dcfe71ba8854/prompt_toolkit-3.0.50.tar.gz", hash = "sha256:544748f3860a2623ca5cd6d2795e7a14f3d0e1c3c9728359013f79877fc89bab", size = 429087 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/e4/ea/d836f008d33151c7a1f62caf3d8dd782e4d15f6a43897f64480c2b8de2ad/prompt_toolkit-3.0.50-py3-none-any.whl", hash = "sha256:9b6427eb19e479d98acff65196a307c555eb567989e6d88ebbb1b509d9779198", size = 387816 },
]

[[package]]
name = "ptyprocess"
version = "0.7.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/20/e5/16ff212c1e452235a90aeb09066144d0c5a6a8c0834397e03f5224495c4e/ptyprocess-0.7.0.tar.gz", hash = "sha256:5c5d0a3b48ceee0b48485e0c26037c0acd7d29765ca3fbb5cb3831d347423220", size = 70762 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/22/a6/858897256d0deac81a172289110f31629fc4cee19b6f01283303e18c8db3/ptyprocess-0.7.0-py2.py3-none-any.whl", hash = "sha256:4b41f3967fce3af57cc7e94b888626c18bf37a083e3651ca8feeb66d492fef35", size = 13993 },
]

[[package]]
name = "pycparser"
version = "2.22"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/1d/b2/31537cf4b1ca988837256c910a668b553fceb8f069bedc4b1c826024b52c/pycparser-2.22.tar.gz", hash = "sha256:491c8be9c040f5390f5bf44a5b07752bd07f56edf992381b05c701439eec10f6", size = 172736 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/13/a3/a812df4e2dd5696d1f351d58b8fe16a405b234ad2886a0dab9183fb78109/pycparser-2.22-py3-none-any.whl", hash = "sha256:c3702b6d3dd8c7abc1afa565d7e63d53a1d0bd86cdc24edd75470f4de499cfcc", size = 117552 },
]

[[package]]
name = "pygments"
version = "2.19.1"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/7c/2d/c3338d48ea6cc0feb8446d8e6937e1408088a72a39937982cc6111d17f84/pygments-2.19.1.tar.gz", hash = "sha256:61c16d2a8576dc0649d9f39e089b5f02bcd27fba10d8fb4dcc28173f7a45151f", size = 4968581 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/8a/0b/9fcc47d19c48b59121088dd6da2488a49d5f72dacf8262e2790a1d2c7d15/pygments-2.19.1-py3-none-any.whl", hash = "sha256:9ea1544ad55cecf4b8242fab6dd35a93bbce657034b0611ee383099054ab6d8c", size = 1225293 },
]

[[package]]
name = "pyperclip"
version = "1.9.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/30/23/2f0a3efc4d6a32f3b63cdff36cd398d9701d26cda58e3ab97ac79fb5e60d/pyperclip-1.9.0.tar.gz", hash = "sha256:b7de0142ddc81bfc5c7507eea19da920b92252b548b96186caf94a5e2527d310", size = 20961 }

[[package]]
name = "pyreadline3"
version = "3.5.4"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/0f/49/4cea918a08f02817aabae639e3d0ac046fef9f9180518a3ad394e22da148/pyreadline3-3.5.4.tar.gz", hash = "sha256:8d57d53039a1c75adba8e50dd3d992b28143480816187ea5efbd5c78e6c885b7", size = 99839 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/5a/dc/491b7661614ab97483abf2056be1deee4dc2490ecbf7bff9ab5cdbac86e1/pyreadline3-3.5.4-py3-none-any.whl", hash = "sha256:eaf8e6cc3c49bcccf145fc6067ba8643d1df34d604a1ec0eccbf7a18e6d3fae6", size = 83178 },
]

[[package]]
name = "pytest"
version = "8.3.4"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "colorama", marker = "sys_platform == 'win32'" },
{ name = "iniconfig" },
{ name = "packaging" },
{ name = "pluggy" },
]
sdist = { url = "https://files.pythonhosted.org/packages/05/35/30e0d83068951d90a01852cb1cef56e5d8a09d20c7f511634cc2f7e0372a/pytest-8.3.4.tar.gz", hash = "sha256:965370d062bce11e73868e0335abac31b4d3de0e82f4007408d242b4f8610761", size = 1445919 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/11/92/76a1c94d3afee238333bc0a42b82935dd8f9cf8ce9e336ff87ee14d9e1cf/pytest-8.3.4-py3-none-any.whl", hash = "sha256:50e16d954148559c9a74109af1eaf0c945ba2d8f30f0a3d3335edde19788b6f6", size = 343083 },
]

[[package]]
name = "sniffio"
version = "1.3.1"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/a2/87/a6771e1546d97e7e041b6ae58d80074f81b7d5121207425c964ddf5cfdbd/sniffio-1.3.1.tar.gz", hash = "sha256:f4324edc670a0f49750a81b895f35c3adb843cca46f0530f79fc1babb23789dc", size = 20372 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/e9/44/75a9c9421471a6c4805dbf2356f7c181a29c1879239abab1ea2cc8f38b40/sniffio-1.3.1-py3-none-any.whl", hash = "sha256:2f6da418d1f1e0fddd844478f41680e794e6051915791a034ff65e5f100525a2", size = 10235 },
]

[[package]]
name = "sortedcontainers"
version = "2.4.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/e8/c4/ba2f8066cceb6f23394729afe52f3bf7adec04bf9ed2c820b39e19299111/sortedcontainers-2.4.0.tar.gz", hash = "sha256:25caa5a06cc30b6b83d11423433f65d1f9d76c4c6a0c90e3379eaa43b9bfdb88", size = 30594 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/32/46/9cb0e58b2deb7f82b84065f37f3bffeb12413f947f9388e4cac22c4621ce/sortedcontainers-2.4.0-py2.py3-none-any.whl", hash = "sha256:a163dcaede0f1c021485e957a39245190e74249897e2ae4b2aa38595db237ee0", size = 29575 },
]

[[package]]
name = "stackscope"
version = "0.2.2"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/4a/fc/20dbb993353f31230138f3c63f3f0c881d1853e70d7a30cd68d2ba4cf1e2/stackscope-0.2.2.tar.gz", hash = "sha256:f508c93eb4861ada466dd3ff613ca203962ceb7587ad013759f15394e6a4e619", size = 90479 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/f1/5f/0a674fcafa03528089badb46419413f342537b5b57d2fefc9900fb8ee4e4/stackscope-0.2.2-py3-none-any.whl", hash = "sha256:c199b0cda738d39c993ee04eb01961b06b7e9aeb43ebf9fd6226cdd72ea9faf6", size = 80807 },
]

[[package]]
name = "tabcompleter"
version = "1.4.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "pyreadline3", marker = "sys_platform == 'win32'" },
]
sdist = { url = "https://files.pythonhosted.org/packages/73/1a/ed3544579628c5709bae6fae2255e94c6982a9ff77d42d8ba59fd2f3b21a/tabcompleter-1.4.0.tar.gz", hash = "sha256:7562a9938e62f8e7c3be612c3ac4e14c5ec4307b58ba9031c148260e866e8814", size = 10431 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/65/44/bb509c3d2c0b5a87e7a5af1d5917a402a32ff026f777a6d7cb6990746cbb/tabcompleter-1.4.0-py3-none-any.whl", hash = "sha256:d744aa735b49c0a6cc2fb8fcd40077fec47425e4388301010b14e6ce3311368b", size = 6725 },
]

[[package]]
name = "tractor"
version = "0.1.0a6.dev0"
source = { editable = "." }
dependencies = [
{ name = "colorlog" },
{ name = "msgspec" },
{ name = "pdbp" },
{ name = "tabcompleter" },
{ name = "tricycle" },
{ name = "trio" },
{ name = "trio-typing" },
{ name = "wrapt" },
]

[package.dev-dependencies]
dev = [
{ name = "greenback" },
{ name = "pexpect" },
{ name = "prompt-toolkit" },
{ name = "pyperclip" },
{ name = "pytest" },
{ name = "stackscope" },
{ name = "xonsh" },
]

[package.metadata]
requires-dist = [
{ name = "colorlog", specifier = ">=6.8.2,<7" },
{ name = "msgspec", specifier = ">=0.19.0" },
{ name = "pdbp", specifier = ">=1.6,<2" },
{ name = "tabcompleter", specifier = ">=1.4.0" },
{ name = "tricycle", specifier = ">=0.4.1,<0.5" },
{ name = "trio", specifier = ">0.27" },
{ name = "trio-typing", specifier = ">=0.10.0,<0.11" },
{ name = "wrapt", specifier = ">=1.16.0,<2" },
]

[package.metadata.requires-dev]
dev = [
{ name = "greenback", specifier = ">=1.2.1,<2" },
{ name = "pexpect", specifier = ">=4.9.0,<5" },
{ name = "prompt-toolkit", specifier = ">=3.0.50" },
{ name = "pyperclip", specifier = ">=1.9.0" },
{ name = "pytest", specifier = ">=8.2.0,<9" },
{ name = "stackscope", specifier = ">=0.2.2,<0.3" },
{ name = "xonsh", specifier = ">=0.19.2" },
]

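# NOTE (editor sketch, not emitted by uv): the `[package.metadata]` tables
# above record the root project's own requirement specifiers as declared in
# its pyproject.toml. A minimal, hypothetical way to read them back out of
# this lockfile with only the stdlib -- kept commented so the file stays
# valid TOML:
#
#     import tomllib  # stdlib TOML parser, Python >= 3.11
#
#     with open("uv.lock", "rb") as f:
#         lock = tomllib.load(f)
#
#     # the editable root package is the project itself
#     root = next(p for p in lock["package"] if p.get("source") == {"editable": "."})
#     for req in root["metadata"]["requires-dist"]:
#         print(req["name"], req.get("specifier", "(unpinned)"))
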
[[package]]
name = "tricycle"
version = "0.4.1"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "trio" },
]
sdist = { url = "https://files.pythonhosted.org/packages/f8/8e/fdd7bc467b40eedd0a5f2ed36b0d692c6e6f2473be00c8160e2e9f53adc1/tricycle-0.4.1.tar.gz", hash = "sha256:f56edb4b3e1bed3e2552b1b499b24a2dab47741e92e9b4d806acc5c35c9e6066", size = 41551 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/d7/c6/7cc05d60e21c683df99167db071ce5d848f5063c2a63971a8443466f603e/tricycle-0.4.1-py3-none-any.whl", hash = "sha256:67900995a73e7445e2c70250cdca04a778d9c3923dd960a97ad4569085e0fb3f", size = 35316 },
]

[[package]]
name = "trio"
version = "0.29.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "attrs" },
{ name = "cffi", marker = "implementation_name != 'pypy' and os_name == 'nt'" },
{ name = "idna" },
{ name = "outcome" },
{ name = "sniffio" },
{ name = "sortedcontainers" },
]
sdist = { url = "https://files.pythonhosted.org/packages/a1/47/f62e62a1a6f37909aed0bf8f5d5411e06fa03846cfcb64540cd1180ccc9f/trio-0.29.0.tar.gz", hash = "sha256:ea0d3967159fc130acb6939a0be0e558e364fee26b5deeecc893a6b08c361bdf", size = 588952 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/c9/55/c4d9bea8b3d7937901958f65124123512419ab0eb73695e5f382521abbfb/trio-0.29.0-py3-none-any.whl", hash = "sha256:d8c463f1a9cc776ff63e331aba44c125f423a5a13c684307e828d930e625ba66", size = 492920 },
]

[[package]]
name = "trio-typing"
version = "0.10.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "async-generator" },
{ name = "importlib-metadata" },
{ name = "mypy-extensions" },
{ name = "packaging" },
{ name = "trio" },
{ name = "typing-extensions" },
]
sdist = { url = "https://files.pythonhosted.org/packages/b5/74/a87aafa40ec3a37089148b859892cbe2eef08d132c816d58a60459be5337/trio-typing-0.10.0.tar.gz", hash = "sha256:065ee684296d52a8ab0e2374666301aec36ee5747ac0e7a61f230250f8907ac3", size = 38747 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/89/ff/9bd795273eb14fac7f6a59d16cc8c4d0948a619a1193d375437c7f50f3eb/trio_typing-0.10.0-py3-none-any.whl", hash = "sha256:6d0e7ec9d837a2fe03591031a172533fbf4a1a95baf369edebfc51d5a49f0264", size = 42224 },
]

[[package]]
name = "typing-extensions"
version = "4.12.2"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/df/db/f35a00659bc03fec321ba8bce9420de607a1d37f8342eee1863174c69557/typing_extensions-4.12.2.tar.gz", hash = "sha256:1a7ead55c7e559dd4dee8856e3a88b41225abfe1ce8df57b7c13915fe121ffb8", size = 85321 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/26/9f/ad63fc0248c5379346306f8668cda6e2e2e9c95e01216d2b8ffd9ff037d0/typing_extensions-4.12.2-py3-none-any.whl", hash = "sha256:04e5ca0351e0f3f85c6853954072df659d0d13fac324d0072316b67d7794700d", size = 37438 },
]

[[package]]
name = "wcwidth"
version = "0.2.13"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/6c/63/53559446a878410fc5a5974feb13d31d78d752eb18aeba59c7fef1af7598/wcwidth-0.2.13.tar.gz", hash = "sha256:72ea0c06399eb286d978fdedb6923a9eb47e1c486ce63e9b4e64fc18303972b5", size = 101301 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/fd/84/fd2ba7aafacbad3c4201d395674fc6348826569da3c0937e75505ead3528/wcwidth-0.2.13-py2.py3-none-any.whl", hash = "sha256:3da69048e4540d84af32131829ff948f1e022c1c6bdb8d6102117aac784f6859", size = 34166 },
]

[[package]]
name = "wrapt"
version = "1.17.2"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/c3/fc/e91cc220803d7bc4db93fb02facd8461c37364151b8494762cc88b0fbcef/wrapt-1.17.2.tar.gz", hash = "sha256:41388e9d4d1522446fe79d3213196bd9e3b301a336965b9e27ca2788ebd122f3", size = 55531 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/cd/f7/a2aab2cbc7a665efab072344a8949a71081eed1d2f451f7f7d2b966594a2/wrapt-1.17.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:ff04ef6eec3eee8a5efef2401495967a916feaa353643defcc03fc74fe213b58", size = 53308 },
{ url = "https://files.pythonhosted.org/packages/50/ff/149aba8365fdacef52b31a258c4dc1c57c79759c335eff0b3316a2664a64/wrapt-1.17.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:4db983e7bca53819efdbd64590ee96c9213894272c776966ca6306b73e4affda", size = 38488 },
{ url = "https://files.pythonhosted.org/packages/65/46/5a917ce85b5c3b490d35c02bf71aedaa9f2f63f2d15d9949cc4ba56e8ba9/wrapt-1.17.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:9abc77a4ce4c6f2a3168ff34b1da9b0f311a8f1cfd694ec96b0603dff1c79438", size = 38776 },
{ url = "https://files.pythonhosted.org/packages/ca/74/336c918d2915a4943501c77566db41d1bd6e9f4dbc317f356b9a244dfe83/wrapt-1.17.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0b929ac182f5ace000d459c59c2c9c33047e20e935f8e39371fa6e3b85d56f4a", size = 83776 },
{ url = "https://files.pythonhosted.org/packages/09/99/c0c844a5ccde0fe5761d4305485297f91d67cf2a1a824c5f282e661ec7ff/wrapt-1.17.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f09b286faeff3c750a879d336fb6d8713206fc97af3adc14def0cdd349df6000", size = 75420 },
{ url = "https://files.pythonhosted.org/packages/b4/b0/9fc566b0fe08b282c850063591a756057c3247b2362b9286429ec5bf1721/wrapt-1.17.2-cp311-cp311-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1a7ed2d9d039bd41e889f6fb9364554052ca21ce823580f6a07c4ec245c1f5d6", size = 83199 },
{ url = "https://files.pythonhosted.org/packages/9d/4b/71996e62d543b0a0bd95dda485219856def3347e3e9380cc0d6cf10cfb2f/wrapt-1.17.2-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:129a150f5c445165ff941fc02ee27df65940fcb8a22a61828b1853c98763a64b", size = 82307 },
{ url = "https://files.pythonhosted.org/packages/39/35/0282c0d8789c0dc9bcc738911776c762a701f95cfe113fb8f0b40e45c2b9/wrapt-1.17.2-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:1fb5699e4464afe5c7e65fa51d4f99e0b2eadcc176e4aa33600a3df7801d6662", size = 75025 },
{ url = "https://files.pythonhosted.org/packages/4f/6d/90c9fd2c3c6fee181feecb620d95105370198b6b98a0770cba090441a828/wrapt-1.17.2-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:9a2bce789a5ea90e51a02dfcc39e31b7f1e662bc3317979aa7e5538e3a034f72", size = 81879 },
{ url = "https://files.pythonhosted.org/packages/8f/fa/9fb6e594f2ce03ef03eddbdb5f4f90acb1452221a5351116c7c4708ac865/wrapt-1.17.2-cp311-cp311-win32.whl", hash = "sha256:4afd5814270fdf6380616b321fd31435a462019d834f83c8611a0ce7484c7317", size = 36419 },
{ url = "https://files.pythonhosted.org/packages/47/f8/fb1773491a253cbc123c5d5dc15c86041f746ed30416535f2a8df1f4a392/wrapt-1.17.2-cp311-cp311-win_amd64.whl", hash = "sha256:acc130bc0375999da18e3d19e5a86403667ac0c4042a094fefb7eec8ebac7cf3", size = 38773 },
{ url = "https://files.pythonhosted.org/packages/a1/bd/ab55f849fd1f9a58ed7ea47f5559ff09741b25f00c191231f9f059c83949/wrapt-1.17.2-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:d5e2439eecc762cd85e7bd37161d4714aa03a33c5ba884e26c81559817ca0925", size = 53799 },
{ url = "https://files.pythonhosted.org/packages/53/18/75ddc64c3f63988f5a1d7e10fb204ffe5762bc663f8023f18ecaf31a332e/wrapt-1.17.2-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:3fc7cb4c1c744f8c05cd5f9438a3caa6ab94ce8344e952d7c45a8ed59dd88392", size = 38821 },
{ url = "https://files.pythonhosted.org/packages/48/2a/97928387d6ed1c1ebbfd4efc4133a0633546bec8481a2dd5ec961313a1c7/wrapt-1.17.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:8fdbdb757d5390f7c675e558fd3186d590973244fab0c5fe63d373ade3e99d40", size = 38919 },
{ url = "https://files.pythonhosted.org/packages/73/54/3bfe5a1febbbccb7a2f77de47b989c0b85ed3a6a41614b104204a788c20e/wrapt-1.17.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5bb1d0dbf99411f3d871deb6faa9aabb9d4e744d67dcaaa05399af89d847a91d", size = 88721 },
{ url = "https://files.pythonhosted.org/packages/25/cb/7262bc1b0300b4b64af50c2720ef958c2c1917525238d661c3e9a2b71b7b/wrapt-1.17.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d18a4865f46b8579d44e4fe1e2bcbc6472ad83d98e22a26c963d46e4c125ef0b", size = 80899 },
{ url = "https://files.pythonhosted.org/packages/2a/5a/04cde32b07a7431d4ed0553a76fdb7a61270e78c5fd5a603e190ac389f14/wrapt-1.17.2-cp312-cp312-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bc570b5f14a79734437cb7b0500376b6b791153314986074486e0b0fa8d71d98", size = 89222 },
{ url = "https://files.pythonhosted.org/packages/09/28/2e45a4f4771fcfb109e244d5dbe54259e970362a311b67a965555ba65026/wrapt-1.17.2-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:6d9187b01bebc3875bac9b087948a2bccefe464a7d8f627cf6e48b1bbae30f82", size = 86707 },
{ url = "https://files.pythonhosted.org/packages/c6/d2/dcb56bf5f32fcd4bd9aacc77b50a539abdd5b6536872413fd3f428b21bed/wrapt-1.17.2-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:9e8659775f1adf02eb1e6f109751268e493c73716ca5761f8acb695e52a756ae", size = 79685 },
{ url = "https://files.pythonhosted.org/packages/80/4e/eb8b353e36711347893f502ce91c770b0b0929f8f0bed2670a6856e667a9/wrapt-1.17.2-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:e8b2816ebef96d83657b56306152a93909a83f23994f4b30ad4573b00bd11bb9", size = 87567 },
{ url = "https://files.pythonhosted.org/packages/17/27/4fe749a54e7fae6e7146f1c7d914d28ef599dacd4416566c055564080fe2/wrapt-1.17.2-cp312-cp312-win32.whl", hash = "sha256:468090021f391fe0056ad3e807e3d9034e0fd01adcd3bdfba977b6fdf4213ea9", size = 36672 },
{ url = "https://files.pythonhosted.org/packages/15/06/1dbf478ea45c03e78a6a8c4be4fdc3c3bddea5c8de8a93bc971415e47f0f/wrapt-1.17.2-cp312-cp312-win_amd64.whl", hash = "sha256:ec89ed91f2fa8e3f52ae53cd3cf640d6feff92ba90d62236a81e4e563ac0e991", size = 38865 },
{ url = "https://files.pythonhosted.org/packages/ce/b9/0ffd557a92f3b11d4c5d5e0c5e4ad057bd9eb8586615cdaf901409920b14/wrapt-1.17.2-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:6ed6ffac43aecfe6d86ec5b74b06a5be33d5bb9243d055141e8cabb12aa08125", size = 53800 },
{ url = "https://files.pythonhosted.org/packages/c0/ef/8be90a0b7e73c32e550c73cfb2fa09db62234227ece47b0e80a05073b375/wrapt-1.17.2-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:35621ae4c00e056adb0009f8e86e28eb4a41a4bfa8f9bfa9fca7d343fe94f998", size = 38824 },
{ url = "https://files.pythonhosted.org/packages/36/89/0aae34c10fe524cce30fe5fc433210376bce94cf74d05b0d68344c8ba46e/wrapt-1.17.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:a604bf7a053f8362d27eb9fefd2097f82600b856d5abe996d623babd067b1ab5", size = 38920 },
{ url = "https://files.pythonhosted.org/packages/3b/24/11c4510de906d77e0cfb5197f1b1445d4fec42c9a39ea853d482698ac681/wrapt-1.17.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5cbabee4f083b6b4cd282f5b817a867cf0b1028c54d445b7ec7cfe6505057cf8", size = 88690 },
{ url = "https://files.pythonhosted.org/packages/71/d7/cfcf842291267bf455b3e266c0c29dcb675b5540ee8b50ba1699abf3af45/wrapt-1.17.2-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:49703ce2ddc220df165bd2962f8e03b84c89fee2d65e1c24a7defff6f988f4d6", size = 80861 },
{ url = "https://files.pythonhosted.org/packages/d5/66/5d973e9f3e7370fd686fb47a9af3319418ed925c27d72ce16b791231576d/wrapt-1.17.2-cp313-cp313-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8112e52c5822fc4253f3901b676c55ddf288614dc7011634e2719718eaa187dc", size = 89174 },
{ url = "https://files.pythonhosted.org/packages/a7/d3/8e17bb70f6ae25dabc1aaf990f86824e4fd98ee9cadf197054e068500d27/wrapt-1.17.2-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:9fee687dce376205d9a494e9c121e27183b2a3df18037f89d69bd7b35bcf59e2", size = 86721 },
{ url = "https://files.pythonhosted.org/packages/6f/54/f170dfb278fe1c30d0ff864513cff526d624ab8de3254b20abb9cffedc24/wrapt-1.17.2-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:18983c537e04d11cf027fbb60a1e8dfd5190e2b60cc27bc0808e653e7b218d1b", size = 79763 },
{ url = "https://files.pythonhosted.org/packages/4a/98/de07243751f1c4a9b15c76019250210dd3486ce098c3d80d5f729cba029c/wrapt-1.17.2-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:703919b1633412ab54bcf920ab388735832fdcb9f9a00ae49387f0fe67dad504", size = 87585 },
{ url = "https://files.pythonhosted.org/packages/f9/f0/13925f4bd6548013038cdeb11ee2cbd4e37c30f8bfd5db9e5a2a370d6e20/wrapt-1.17.2-cp313-cp313-win32.whl", hash = "sha256:abbb9e76177c35d4e8568e58650aa6926040d6a9f6f03435b7a522bf1c487f9a", size = 36676 },
{ url = "https://files.pythonhosted.org/packages/bf/ae/743f16ef8c2e3628df3ddfd652b7d4c555d12c84b53f3d8218498f4ade9b/wrapt-1.17.2-cp313-cp313-win_amd64.whl", hash = "sha256:69606d7bb691b50a4240ce6b22ebb319c1cfb164e5f6569835058196e0f3a845", size = 38871 },
{ url = "https://files.pythonhosted.org/packages/3d/bc/30f903f891a82d402ffb5fda27ec1d621cc97cb74c16fea0b6141f1d4e87/wrapt-1.17.2-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:4a721d3c943dae44f8e243b380cb645a709ba5bd35d3ad27bc2ed947e9c68192", size = 56312 },
{ url = "https://files.pythonhosted.org/packages/8a/04/c97273eb491b5f1c918857cd26f314b74fc9b29224521f5b83f872253725/wrapt-1.17.2-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:766d8bbefcb9e00c3ac3b000d9acc51f1b399513f44d77dfe0eb026ad7c9a19b", size = 40062 },
{ url = "https://files.pythonhosted.org/packages/4e/ca/3b7afa1eae3a9e7fefe499db9b96813f41828b9fdb016ee836c4c379dadb/wrapt-1.17.2-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:e496a8ce2c256da1eb98bd15803a79bee00fc351f5dfb9ea82594a3f058309e0", size = 40155 },
{ url = "https://files.pythonhosted.org/packages/89/be/7c1baed43290775cb9030c774bc53c860db140397047cc49aedaf0a15477/wrapt-1.17.2-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:40d615e4fe22f4ad3528448c193b218e077656ca9ccb22ce2cb20db730f8d306", size = 113471 },
{ url = "https://files.pythonhosted.org/packages/32/98/4ed894cf012b6d6aae5f5cc974006bdeb92f0241775addad3f8cd6ab71c8/wrapt-1.17.2-cp313-cp313t-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a5aaeff38654462bc4b09023918b7f21790efb807f54c000a39d41d69cf552cb", size = 101208 },
{ url = "https://files.pythonhosted.org/packages/ea/fd/0c30f2301ca94e655e5e057012e83284ce8c545df7661a78d8bfca2fac7a/wrapt-1.17.2-cp313-cp313t-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9a7d15bbd2bc99e92e39f49a04653062ee6085c0e18b3b7512a4f2fe91f2d681", size = 109339 },
{ url = "https://files.pythonhosted.org/packages/75/56/05d000de894c4cfcb84bcd6b1df6214297b8089a7bd324c21a4765e49b14/wrapt-1.17.2-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:e3890b508a23299083e065f435a492b5435eba6e304a7114d2f919d400888cc6", size = 110232 },
{ url = "https://files.pythonhosted.org/packages/53/f8/c3f6b2cf9b9277fb0813418e1503e68414cd036b3b099c823379c9575e6d/wrapt-1.17.2-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:8c8b293cd65ad716d13d8dd3624e42e5a19cc2a2f1acc74b30c2c13f15cb61a6", size = 100476 },
{ url = "https://files.pythonhosted.org/packages/a7/b1/0bb11e29aa5139d90b770ebbfa167267b1fc548d2302c30c8f7572851738/wrapt-1.17.2-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:4c82b8785d98cdd9fed4cac84d765d234ed3251bd6afe34cb7ac523cb93e8b4f", size = 106377 },
{ url = "https://files.pythonhosted.org/packages/6a/e1/0122853035b40b3f333bbb25f1939fc1045e21dd518f7f0922b60c156f7c/wrapt-1.17.2-cp313-cp313t-win32.whl", hash = "sha256:13e6afb7fe71fe7485a4550a8844cc9ffbe263c0f1a1eea569bc7091d4898555", size = 37986 },
{ url = "https://files.pythonhosted.org/packages/09/5e/1655cf481e079c1f22d0cabdd4e51733679932718dc23bf2db175f329b76/wrapt-1.17.2-cp313-cp313t-win_amd64.whl", hash = "sha256:eaf675418ed6b3b31c7a989fd007fa7c3be66ce14e5c3b27336383604c9da85c", size = 40750 },
{ url = "https://files.pythonhosted.org/packages/2d/82/f56956041adef78f849db6b289b282e72b55ab8045a75abad81898c28d19/wrapt-1.17.2-py3-none-any.whl", hash = "sha256:b18f2d1533a71f069c7f82d524a52599053d4c7166e9dd374ae2136b7f40f7c8", size = 23594 },
]

[[package]]
name = "xonsh"
version = "0.19.2"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/68/4e/56e95a5e607eb3b0da37396f87cde70588efc8ef819ab16f02d5b8378dc4/xonsh-0.19.2.tar.gz", hash = "sha256:cfdd0680d954a2c3aefd6caddcc7143a3d06aa417ed18365a08219bb71b960b0", size = 799960 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/6c/13/281094759df87b23b3c02dc4a16603ab08ea54d7f6acfeb69f3341137c7a/xonsh-0.19.2-py310-none-any.whl", hash = "sha256:ec7f163fd3a4943782aa34069d4e72793328c916a5975949dbec8536cbfc089b", size = 642301 },
{ url = "https://files.pythonhosted.org/packages/29/41/a51e4c3918fe9a293b150cb949b1b8c6d45eb17dfed480dcb76ea43df4e7/xonsh-0.19.2-py311-none-any.whl", hash = "sha256:53c45f7a767901f2f518f9b8dd60fc653e0498e56e89825e1710bb0859985049", size = 642286 },
{ url = "https://files.pythonhosted.org/packages/0a/93/9a77b731f492fac27c577dea2afb5a2bcc2a6a1c79be0c86c95498060270/xonsh-0.19.2-py312-none-any.whl", hash = "sha256:b24c619aa52b59eae4d35c4195dba9b19a2c548fb5c42c6f85f2b8ccb96807b5", size = 642386 },
{ url = "https://files.pythonhosted.org/packages/be/75/070324769c1ff88d971ce040f4f486339be98e0a365c8dd9991eb654265b/xonsh-0.19.2-py313-none-any.whl", hash = "sha256:c53ef6c19f781fbc399ed1b382b5c2aac2125010679a3b61d643978273c27df0", size = 642873 },
{ url = "https://files.pythonhosted.org/packages/fa/cb/2c7ccec54f5b0e73fdf7650e8336582ff0347d9001c5ef8271dc00c034fe/xonsh-0.19.2-py39-none-any.whl", hash = "sha256:bcc0225dc3847f1ed2f175dac6122fbcc54cea67d9c2dc2753d9615e2a5ff284", size = 634602 },
]

[[package]]
name = "zipp"
version = "3.21.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/3f/50/bad581df71744867e9468ebd0bcd6505de3b275e06f202c2cb016e3ff56f/zipp-3.21.0.tar.gz", hash = "sha256:2c9958f6430a2040341a52eb608ed6dd93ef4392e02ffe219417c1b28b5dd1f4", size = 24545 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/b7/1a/7e4798e9339adc931158c9d69ecc34f5e6791489d469f5e50ec15e35f458/zipp-3.21.0-py3-none-any.whl", hash = "sha256:ac1bbe05fd2991f160ebce24ffbac5f6d11d83dc90891255885223d42b3cd931", size = 9630 },
]
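
# NOTE (editor sketch, not emitted by uv): every artifact above carries a
# `hash` field so installers can verify what they download. A minimal
# hashlib sketch of that check, assuming the zipp wheel listed above has
# been downloaded into the working directory -- kept commented so the file
# stays valid TOML:
#
#     import hashlib
#
#     def sha256_of(path: str) -> str:
#         h = hashlib.sha256()
#         with open(path, "rb") as f:
#             for chunk in iter(lambda: f.read(1 << 20), b""):
#                 h.update(chunk)
#         return "sha256:" + h.hexdigest()
#
#     expected = "sha256:ac1bbe05fd2991f160ebce24ffbac5f6d11d83dc90891255885223d42b3cd931"
#     assert sha256_of("zipp-3.21.0-py3-none-any.whl") == expected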