Compare commits

..

15 Commits

Author SHA1 Message Date
wygud 481eb98cfd Add macOS compatibility for Unix socket credential passing
Make socket credential imports platform-conditional in `.ipc._uds`.
- Linux: use `SO_PASSCRED`/`SO_PEERCRED` from socket module
- macOS: use `LOCAL_PEERCRED` (0x0001) instead, no need for `SO_PASSCRED`
- Conditionally call `setsockopt(SO_PASSCRED)` only on Linux

Fixes AttributeError on macOS where SO_PASSCRED doesn't exist.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2026-01-25 17:10:36 -05:00
Tyler Goodlet 3befe0940e Use `platformdirs` for `.config.get_rt_dir()`
Thanks to the `tox`-dev community for such a lovely pkg which seems to
solve all the current cross-platform user-dir problems B)

Also this,
- now passes `platformdirs.user_runtime_dir(appname='tractor')`
  and allows the caller to pass an optional `subdir` under `tractor/`
  if desired (sketched below).
- drops the `.config._rtdir: Path` mod var.
- bumps the lock file with the new dep.
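
A rough sketch of the new lookup, mirroring the `get_rt_dir()` diff
further below; the 'ipc' subdir here is purely an illustrative name,

  from pathlib import Path
  import platformdirs

  # cross-platform equiv of `${XDG_RUNTIME_DIR}/tractor/` on linux
  rt_dir = Path(platformdirs.user_runtime_dir(appname='tractor'))
  rt_dir = rt_dir / 'ipc'  # optional caller-passed `subdir` (example name only)
  if not rt_dir.is_dir():
      rt_dir.mkdir()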
2026-01-23 20:48:11 -05:00
Tyler Goodlet 13c8e753d5 Mv `load_module_from_path()` to a new `._code_load` submod 2026-01-23 20:46:13 -05:00
Tyler Goodlet f6d3415b53 Extend `.to_asyncio.LinkedTaskChannel` for aio side
With comms methods similar to those that exist for the `trio` side
(usage sketched at the end of this entry),
- `.get()` which proxies verbatim to the `._to_aio: asyncio.Queue`,
- `.send_nowait()` which thin-wraps `._to_trio: trio.MemorySendChannel`.

Obviously the more correct design is to break up the channel type into
a pair of handle types, one for each "side's" task in each event-loop,
that's hopefully coming shortly in a follow up patch B)

Also,
- fill in some missing doc strings, tweak some explanation comments and
  update todos.
- adjust the `test_aio_errors_and_channel_propagates_and_closes()` suite
  to use the new `chan` fn-sig-API with `.open_channel_from()` including
  the new methods for msg comms; ensures everything added here works e2e.
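
Roughly how both sides wire up with the new methods, condensed from the
updated echo-server suite in the diff below (`aio_task`/`trio_side` are
illustrative names only),

  from tractor import to_asyncio

  async def aio_task(chan: to_asyncio.LinkedTaskChannel) -> None:
      chan.started_nowait('start')    # same semantics as `trio.TaskStatus.started()`
      msg = await chan.get()          # asyncio-task <- trio-task
      chan.send_nowait(msg)           # asyncio-task -> trio-task

  async def trio_side() -> None:
      # presumes running in an actor spawned with `infect_asyncio=True`
      async with to_asyncio.open_channel_from(aio_task) as (first, chan):
          assert first == 'start'     # value from `chan.started_nowait()`
          await chan.send('echo-me')  # trio-task -> asyncio-task
          assert await chan.receive() == 'echo-me'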
2026-01-23 20:46:13 -05:00
Tyler Goodlet f34537aa70 Hide `._rpc._invoke()` frame, again.. 2026-01-23 20:46:13 -05:00
Tyler Goodlet ac9228a101 Explain the `infect_asyncio: bool` param to pass in RTE msg 2026-01-23 20:46:13 -05:00
Tyler Goodlet b5765d8221 Toss in masked `.set_trace()` for unshielded `.pause()` debug 2026-01-23 20:46:13 -05:00
Tyler Goodlet 3ef6bc769a Mask tpt-closed handling of `chan.send(return_msg)`
A partial revert of commit c05d08e426 since it seems we already
suppress tpt-closed errors lower down in `.ipc.Channel.send()`; given
that, i'm pretty sure this new handler code should basically never run?

Left in a todo to remove the masked content once i'm done testing more
thoroughly under `piker`.
2026-01-23 20:46:13 -05:00
Tyler Goodlet bfc6caf2e7 More `TransportClosed`-handling around IPC-IO
For IPC-disconnects-during-teardown edge cases, augment some `._rpc`
machinery,
- in `._invoke()` around the `await chan.send(return_msg)` where we
  suppress if the underlying `Channel` already disconnected.
- add a disjoint handler in `_errors_relayed_via_ipc()` which just
  reports-n-reraises the exc (same as prior behaviour).
  * originally i thought it needed to be handled specially (to avoid
    being crash-handled) but it turns out that isn't necessary?
  * hence the also-added-but-masked-out `debug_filter` / guard expression
    around the `await debug._maybe_enter_pm()` line.
- show the `._invoke()` frame for the moment.
2026-01-23 20:46:13 -05:00
Tyler Goodlet 2e0ba81f98 Use new `pkg_name` in log-sys test suites 2026-01-23 20:46:07 -05:00
Tyler Goodlet 7ccd7aa227 Implicitly name sub-logs by caller's mod
That is, when no `name` is passed to `get_logger()`, try to introspect
the caller's `module.__name__` and use it to infer/get the "namespace
path" to that module, the same as if using `name=__name__` as in the most
common usage.

Further, change the `_root_name` to be `pkg_name: str`, a public and
more obvious param name, and deprecate the former. This obviously adds
the necessary impl to make the new
`test_log_sys::test_implicit_mod_name_applied_for_child` test pass.

Impl detalles for `get_logger()`,
- add `pkg_name` and deprecate `_root_name`, include failover logic
  and a warning.
- implement calling-module introspection using
  `inspect.stack()`/`getmodule()` to get both the `.__name__` and
  `.__package__` info, alongside adjusted logic to set the `name`
  when not provided, but only when a new `mk_sublog: bool` is set
  (see the sketch below).
- tweak the `name` processing for the implicitly-set case,
  - rename `sub_name` -> `pkg_path: str` which is the path
    to the calling module minus that module's name component.
  - only partition `name` if `pkg_name` is `in` it.
  - use the `_root_log` for `pkg_name` duplication warnings.

Other/related,
- add types to various public mod vars missing them.
- rename `.log.log` -> `.log._root_log`.
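
The introspection core is roughly the following, simplified from the
`get_logger()` diff further below,

  from inspect import FrameInfo, getmodule, stack
  from types import ModuleType

  callstack: list[FrameInfo] = stack()
  caller_fi: FrameInfo = callstack[1]               # the caller's frame-info
  caller_mod: ModuleType|None = getmodule(caller_fi.frame)
  if caller_mod:
      name: str = caller_mod.__name__               # same as passing `name=__name__`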
2026-01-23 20:46:07 -05:00
Tyler Goodlet 6d16c6347f Add an implicit-pkg-path-as-logger-name test
A bit of test-driven dev to anticipate support of `.log.get_logger()`
usage such that it can be called from arbitrary sub-modules, themselves
embedded in arbitrary sub-pkgs, of some project; when `name` is not
provided, the `sub_name` passed to `Logger.getChild(<sub_name>)` will be
set as the sub-pkg path "down to" the calling module.

IOW if you call something like,

`log = tractor.log.get_logger(pkg_name='mypylib')`

from some `submod.py` in a project-dir that looks like,

mypylib/
  mod.py
  subpkg/
    submod.py  <- calling module

the `log: StackLevelAdapter` child-`Logger` instance will have a
`.name: str = 'mypylib.subpkg'`, excluding the `submod` part since that
is already rendered as the `{filename}` header in `log.LOG_FORMAT`.

Previously similar behaviour would be obtained by passing
`get_logger(name=__name__)` in the calling module, so much so that it
motivated me to make this the default, presuming we can introspect for
the info.

Impl deats,
- duplicated a `load_module_from_path()` from `modden` to load the
  `testdir`-rendered py project dir from its path.
 |_should prolly factor it down to this lib anyway bc we're going to
   need it for hot code reload? (well that and `watchfiles` Bp)
- in each of `mod.py` and `submod.py` render the `get_logger()` code
  without a `name`, expecting the (coming shortly) implicit introspection
  feat to do this.
- do `.name` and `.parent` checks against expected sub-logger values
  from `StackLevelAdapter.logger.getChildren()` (roughly as sketched
  below).
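
In plain stdlib `logging` terms the expected hierarchy is roughly,

  import logging

  root_log = logging.getLogger('mypylib')   # pkg-root logger
  sub_log = root_log.getChild('subpkg')     # .name == 'mypylib.subpkg'
  assert sub_log.parent is root_log
  assert sub_log.name == 'mypylib.subpkg'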
2026-01-23 20:46:07 -05:00
Tyler Goodlet f8b24082b9 Start a logging-sys unit-test module
To start ensuring that when `name=__name__` is passed we try to
de-duplicate the `_root_name` and any `leaf_mod: str` since the latter
is already included in the headers as `{filename}`.

Deats,
- heavily document the de-duplication `str.partition()`s in
  `.log.get_logger()` and provide the end fix by changing the predicate,
  `if rname == 'tractor':` -> `if rname == _root_name`.
  * also toss in some warnings for when we still detect duplicates.
- add todo comments around logging "filters" (vs. our "adapter").
- create the new `test_log_sys.test_root_pkg_not_duplicated()` which
  runs green with the fixes from ^ (behaviour sketched below).
- add a ton of test-suite todos both for existing and anticipated
  logging sys feats in the new mod.
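
ie. roughly the behaviour the new suite asserts, using the same names as
the test in its final form shown in the diff below,

  import tractor

  proj_log = tractor.log.get_logger(pkg_name='pylib', mk_sublog=False)
  sublog = tractor.log.get_logger(pkg_name='pylib', name='pylib.subpkg.mod')
  assert sublog.name.count(proj_log.name) == 1  # no 'pylib.pylib...' dup
  assert 'mod' not in sublog.name               # leaf-mod left to the {filename} header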
2026-01-23 20:46:07 -05:00
Gud Boi 1dc27c5161 Add a dev-overlay nix flake
Based on the impure template from `pyproject.nix` and providing
a dev-shell for easy bypass-n-hack on nix(os) using `uv`.

Deats,
- include bash completion pkgs for devx/happiness.
- pull `ruff` from <nixpkgs> to avoid wheel (build) issues.
- pin to py313 `cpython` for now.
2026-01-23 16:27:19 -05:00
Gud Boi 14aefa4b11 Reorg dev deps into nested groups
Namely (resulting group layout sketched below),
- `devx` for console debugging extras used in `tractor.devx`.
- `repl` for @goodboy's `xonsh` hackin utils.
- `testing` for harness stuffs.
- `lint` for whenever we start doing that; it requires special
  separation on nixos in order to pull `ruff` from pkgs.

Oh and bump the lock file.
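
The resulting `[dependency-groups]` shape, abridged from the
`pyproject.toml` diff below,

  [dependency-groups]
  dev = [
    {include-group = 'devx'},
    {include-group = 'testing'},
    {include-group = 'repl'},
  ]
  devx = ["greenback>=1.2.1,<2", "stackscope>=0.2.2,<0.3", "typing-extensions>=4.14.1"]
  testing = ["pytest>=8.3.5", "pexpect>=4.9.0,<5"]
  repl = ["pyperclip>=1.9.0", "prompt-toolkit>=3.0.50", "xonsh>=0.19.2", "psutil>=7.0.0"]
  lint = ["ruff>=0.9.6"]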
2026-01-23 16:24:24 -05:00
13 changed files with 713 additions and 80 deletions

27
flake.lock 100644
View File

@ -0,0 +1,27 @@
{
"nodes": {
"nixpkgs": {
"locked": {
"lastModified": 1769018530,
"narHash": "sha256-MJ27Cy2NtBEV5tsK+YraYr2g851f3Fl1LpNHDzDX15c=",
"owner": "nixos",
"repo": "nixpkgs",
"rev": "88d3861acdd3d2f0e361767018218e51810df8a1",
"type": "github"
},
"original": {
"owner": "nixos",
"ref": "nixos-unstable",
"repo": "nixpkgs",
"type": "github"
}
},
"root": {
"inputs": {
"nixpkgs": "nixpkgs"
}
}
},
"root": "root",
"version": 7
}

70
flake.nix 100644
View File

@ -0,0 +1,70 @@
# An "impure" template thx to `pyproject.nix`,
# https://pyproject-nix.github.io/pyproject.nix/templates.html#impure
# https://github.com/pyproject-nix/pyproject.nix/blob/master/templates/impure/flake.nix
{
description = "An impure overlay using `uv` with Nix(OS)";
inputs = {
nixpkgs.url = "github:nixos/nixpkgs/nixos-unstable";
};
outputs =
{ nixpkgs, ... }:
let
inherit (nixpkgs) lib;
forAllSystems = lib.genAttrs lib.systems.flakeExposed;
in
{
devShells = forAllSystems (
system:
let
pkgs = nixpkgs.legacyPackages.${system};
# XXX NOTE XXX, for now we overlay specific pkgs via
# a major-version-pinned-`cpython`
cpython = "python313";
venv_dir = "py313";
pypkgs = pkgs."${cpython}Packages";
in
{
default = pkgs.mkShell {
packages = with pkgs; [
# XXX, ensure sh completions activate!
bashInteractive
bash-completion
# on nixos, use pkg(s)
ruff
pypkgs.ruff
uv
python313 # ?TODO^ how to set from `cpython` above?
];
shellHook = ''
# unmask to debug **this** dev-shell-hook
# set -e
# link-in c++ stdlib for various AOT-ext-pkgs (numpy, etc.)
LD_LIBRARY_PATH="${pkgs.stdenv.cc.cc.lib}/lib:$LD_LIBRARY_PATH"
export LD_LIBRARY_PATH
# RUNTIME-SETTINGS
# ------ uv ------
# - always use the ./py313/ venv-subdir
# - sync env with all extras
export UV_PROJECT_ENVIRONMENT=${venv_dir}
uv sync --dev --all-extras --no-group lint
# ------ TIPS ------
# NOTE, to launch the py-venv installed `xonsh` (like @goodboy)
# run the `nix develop` cmd with,
# >> nix develop -c uv run xonsh
'';
};
}
);
};
}

View File

@ -47,28 +47,40 @@ dependencies = [
"msgspec>=0.19.0", "msgspec>=0.19.0",
"cffi>=1.17.1", "cffi>=1.17.1",
"bidict>=0.23.1", "bidict>=0.23.1",
"platformdirs>=4.4.0",
] ]
# ------ project ------ # ------ project ------
[dependency-groups] [dependency-groups]
dev = [ dev = [
# test suite {include-group = 'devx'},
# TODO: maybe some of these layout choices? {include-group = 'testing'},
# https://docs.pytest.org/en/8.0.x/explanation/goodpractices.html#choosing-a-test-layout-import-rules {include-group = 'repl'},
"pytest>=8.3.5", ]
"pexpect>=4.9.0,<5", devx = [
# `tractor.devx` tooling # `tractor.devx` tooling
"greenback>=1.2.1,<2", "greenback>=1.2.1,<2",
"stackscope>=0.2.2,<0.3", "stackscope>=0.2.2,<0.3",
# ^ requires this? # ^ requires this?
"typing-extensions>=4.14.1", "typing-extensions>=4.14.1",
]
testing = [
# test suite
# TODO: maybe some of these layout choices?
# https://docs.pytest.org/en/8.0.x/explanation/goodpractices.html#choosing-a-test-layout-import-rules
"pytest>=8.3.5",
"pexpect>=4.9.0,<5",
]
repl = [
"pyperclip>=1.9.0", "pyperclip>=1.9.0",
"prompt-toolkit>=3.0.50", "prompt-toolkit>=3.0.50",
"xonsh>=0.19.2", "xonsh>=0.19.2",
"psutil>=7.0.0", "psutil>=7.0.0",
] ]
lint = [
"ruff>=0.9.6"
]
# TODO, add these with sane versions; were originally in # TODO, add these with sane versions; were originally in
# `requirements-docs.txt`.. # `requirements-docs.txt`..
# docs = [ # docs = [

View File

@ -732,15 +732,21 @@ def test_aio_errors_and_channel_propagates_and_closes(
async def aio_echo_server( async def aio_echo_server(
to_trio: trio.MemorySendChannel, chan: to_asyncio.LinkedTaskChannel,
from_trio: asyncio.Queue,
) -> None: ) -> None:
'''
An IPC-msg "echo server" with msgs received and relayed by
a parent `trio.Task` into a child `asyncio.Task`
and then repeated back to that local parent (`trio.Task`)
and sent again back to the original calling remote actor.
to_trio.send_nowait('start') '''
# same semantics as `trio.TaskStatus.started()`
chan.started_nowait('start')
while True: while True:
try: try:
msg = await from_trio.get() msg = await chan.get()
except to_asyncio.TrioTaskExited: except to_asyncio.TrioTaskExited:
print( print(
'breaking aio echo loop due to `trio` exit!' 'breaking aio echo loop due to `trio` exit!'
@ -748,7 +754,7 @@ async def aio_echo_server(
break break
# echo the msg back # echo the msg back
to_trio.send_nowait(msg) chan.send_nowait(msg)
# if we get the terminate sentinel # if we get the terminate sentinel
# break the echo loop # break the echo loop
@ -765,7 +771,10 @@ async def trio_to_aio_echo_server(
): ):
async with to_asyncio.open_channel_from( async with to_asyncio.open_channel_from(
aio_echo_server, aio_echo_server,
) as (first, chan): ) as (
first, # value from `chan.started_nowait()` above
chan,
):
assert first == 'start' assert first == 'start'
await ctx.started(first) await ctx.started(first)
@ -776,7 +785,8 @@ async def trio_to_aio_echo_server(
await chan.send(msg) await chan.send(msg)
out = await chan.receive() out = await chan.receive()
# echo back to parent actor-task
# echo back to parent-actor's remote parent-ctx-task!
await stream.send(out) await stream.send(out)
if out is None: if out is None:
@ -1090,14 +1100,12 @@ def test_sigint_closes_lifetime_stack(
# ?TODO asyncio.Task fn-deco? # ?TODO asyncio.Task fn-deco?
# -[ ] do sig checkingat import time like @context?
# -[ ] maybe name it @aio_task ??
# -[ ] chan: to_asyncio.InterloopChannel ?? # -[ ] chan: to_asyncio.InterloopChannel ??
# -[ ] do fn-sig checking at import time like @context?
# |_[ ] maybe name it @a(sync)io_task ??
# @asyncio_task <- not bad ??
async def raise_before_started( async def raise_before_started(
# from_trio: asyncio.Queue,
# to_trio: trio.abc.SendChannel,
chan: to_asyncio.LinkedTaskChannel, chan: to_asyncio.LinkedTaskChannel,
) -> None: ) -> None:
''' '''
`asyncio.Task` entry point which RTEs before calling `asyncio.Task` entry point which RTEs before calling

View File

@ -0,0 +1,150 @@
'''
`tractor.log`-wrapping unit tests.
'''
from pathlib import Path
import shutil
import pytest
import tractor
from tractor import _code_load
def test_root_pkg_not_duplicated_in_logger_name():
'''
When both `pkg_name` and `name` are passed and they have
a common `<root_name>.< >` prefix, ensure that it is not
duplicated in the child's `StackLevelAdapter.name: str`.
'''
project_name: str = 'pylib'
pkg_path: str = 'pylib.subpkg.mod'
proj_log = tractor.log.get_logger(
pkg_name=project_name,
mk_sublog=False,
)
sublog = tractor.log.get_logger(
pkg_name=project_name,
name=pkg_path,
)
assert proj_log is not sublog
assert sublog.name.count(proj_log.name) == 1
assert 'mod' not in sublog.name
def test_implicit_mod_name_applied_for_child(
testdir: pytest.Pytester,
loglevel: str,
):
'''
Verify that when `.log.get_logger(pkg_name='pylib')` is called
from a given sub-mod from within the `pylib` pkg-path, we
implicitly set the equiv of `name=__name__` from the caller's
module.
'''
# tractor.log.get_console_log(level=loglevel)
proj_name: str = 'snakelib'
mod_code: str = (
f'import tractor\n'
f'\n'
f'log = tractor.log.get_logger(pkg_name="{proj_name}")\n'
)
# create a sub-module for each pkg layer
_lib = testdir.mkpydir(proj_name)
pkg: Path = Path(_lib)
subpkg: Path = pkg / 'subpkg'
subpkg.mkdir()
pkgmod: Path = subpkg / "__init__.py"
pkgmod.touch()
_submod: Path = testdir.makepyfile(
_mod=mod_code,
)
pkg_mod = pkg / 'mod.py'
pkg_subpkg_submod = subpkg / 'submod.py'
shutil.copyfile(
_submod,
pkg_mod,
)
shutil.copyfile(
_submod,
pkg_subpkg_submod,
)
testdir.chdir()
# XXX NOTE, once the "top level" pkg mod has been
# imported, we can then use `import` syntax to
# import it's sub-pkgs and modules.
pkgmod = _code_load.load_module_from_path(
Path(pkg / '__init__.py'),
module_name=proj_name,
)
pkg_root_log = tractor.log.get_logger(
pkg_name=proj_name,
mk_sublog=False,
)
assert pkg_root_log.name == proj_name
assert not pkg_root_log.logger.getChildren()
from snakelib import mod
assert mod.log.name == proj_name
from snakelib.subpkg import submod
assert (
submod.log.name
==
submod.__package__ # ?TODO, use this in `.get_logger()` instead?
==
f'{proj_name}.subpkg'
)
sub_logs = pkg_root_log.logger.getChildren()
assert len(sub_logs) == 1 # only one nested sub-pkg module
assert submod.log.logger in sub_logs
# breakpoint()
# TODO, moar tests against existing feats:
# ------ - ------
# - [ ] color settings?
# - [ ] header contents like,
# - actor + thread + task names from various conc-primitives,
# - [ ] `StackLevelAdapter` extensions,
# - our custom levels/methods: `transport|runtime|cance|pdb|devx`
# - [ ] custom-headers support?
#
# TODO, test driven dev of new-ideas/long-wanted feats,
# ------ - ------
# - [ ] https://github.com/goodboy/tractor/issues/244
# - [ ] @catern mentioned using a sync / deterministic sys
# and in particular `svlogd`?
# |_ https://smarden.org/runit/svlogd.8
# - [ ] using adapter vs. filters?
# - https://stackoverflow.com/questions/60691759/add-information-to-every-log-message-in-python-logging/61830838#61830838
# - [ ] `.at_least_level()` optimization which short circuits wtv
# `logging` is doing behind the scenes when the level filters
# the emission..?
# - [ ] use of `.log.get_console_log()` in subactors and the
# subtleties of ensuring it actually emits from a subproc.
# - [ ] this idea of activating per-subsys emissions with some
# kind of `.name` filter passed to the runtime or maybe configured
# via the root `StackLevelAdapter`?
# - [ ] use of `logging.dict.dictConfig()` to simplify the impl
# of any of ^^ ??
# - https://stackoverflow.com/questions/7507825/where-is-a-complete-example-of-logging-config-dictconfig
# - https://docs.python.org/3/library/logging.config.html#configuration-dictionary-schema
# - https://docs.python.org/3/library/logging.config.html#logging.config.dictConfig

View File

@ -0,0 +1,48 @@
# tractor: structured concurrent "actors".
# Copyright 2018-eternity Tyler Goodlet.
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
'''
(Hot) code (re-)load utils for python.
'''
import importlib
from pathlib import Path
import sys
from types import ModuleType
# ?TODO, move this into internal libs?
# -[ ] we already use it in `modden.config._pymod` as well
def load_module_from_path(
path: Path,
module_name: str|None = None,
) -> ModuleType:
'''
Taken from SO,
https://stackoverflow.com/a/67208147
which is based on stdlib docs,
https://docs.python.org/3/library/importlib.html#importing-a-source-file-directly
'''
module_name = module_name or path.stem
spec = importlib.util.spec_from_file_location(
module_name,
str(path),
)
module = importlib.util.module_from_spec(spec)
sys.modules[module_name] = module
spec.loader.exec_module(module)
return module

View File

@ -284,6 +284,10 @@ async def _errors_relayed_via_ipc(
try: try:
yield # run RPC invoke body yield # run RPC invoke body
except TransportClosed:
log.exception('Tpt disconnect during remote-exc relay?')
raise
# box and ship RPC errors for wire-transit via # box and ship RPC errors for wire-transit via
# the task's requesting parent IPC-channel. # the task's requesting parent IPC-channel.
except ( except (
@ -319,6 +323,9 @@ async def _errors_relayed_via_ipc(
and debug_kbis and debug_kbis
) )
) )
# TODO? better then `debug_filter` below?
and
not isinstance(err, TransportClosed)
): ):
# XXX QUESTION XXX: is there any case where we'll # XXX QUESTION XXX: is there any case where we'll
# want to debug IPC disconnects as a default? # want to debug IPC disconnects as a default?
@ -327,13 +334,25 @@ async def _errors_relayed_via_ipc(
# recovery logic - the only case is some kind of # recovery logic - the only case is some kind of
# strange bug in our transport layer itself? Going # strange bug in our transport layer itself? Going
# to keep this open ended for now. # to keep this open ended for now.
log.debug(
'RPC task crashed, attempting to enter debugger\n' if _state.debug_mode():
f'|_{ctx}' log.exception(
) f'RPC task crashed!\n'
f'Attempting to enter debugger\n'
f'\n'
f'{ctx}'
)
entered_debug = await debug._maybe_enter_pm( entered_debug = await debug._maybe_enter_pm(
err, err,
api_frame=inspect.currentframe(), api_frame=inspect.currentframe(),
# don't REPL any psuedo-expected tpt-disconnect
# debug_filter=lambda exc: (
# type (exc) not in {
# TransportClosed,
# }
# ),
) )
if not entered_debug: if not entered_debug:
# if we prolly should have entered the REPL but # if we prolly should have entered the REPL but
@ -675,6 +694,22 @@ async def _invoke(
f'{pretty_struct.pformat(return_msg)}\n' f'{pretty_struct.pformat(return_msg)}\n'
) )
await chan.send(return_msg) await chan.send(return_msg)
# ?TODO, remove the below since .send() already
# doesn't raise on tpt-closed?
# try:
# await chan.send(return_msg)
# except TransportClosed:
# log.exception(
# f"Failed send final result to 'parent'-side of IPC-ctx!\n"
# f'\n'
# f'{chan}\n'
# f'Channel already disconnected ??\n'
# f'\n'
# f'{pretty_struct.pformat(return_msg)}'
# )
# # ?TODO? will this ever be true though?
# if chan.connected():
# raise
# NOTE: this happens IFF `ctx._scope.cancel()` is # NOTE: this happens IFF `ctx._scope.cancel()` is
# called by any of, # called by any of,

View File

@ -22,7 +22,6 @@ from __future__ import annotations
from contextvars import ( from contextvars import (
ContextVar, ContextVar,
) )
import os
from pathlib import Path from pathlib import Path
from typing import ( from typing import (
Any, Any,
@ -30,6 +29,7 @@ from typing import (
TYPE_CHECKING, TYPE_CHECKING,
) )
import platformdirs
from trio.lowlevel import current_task from trio.lowlevel import current_task
if TYPE_CHECKING: if TYPE_CHECKING:
@ -172,23 +172,32 @@ def current_ipc_ctx(
return ctx return ctx
# std ODE (mutable) app state location
_rtdir: Path = Path(os.environ['XDG_RUNTIME_DIR'])
def get_rt_dir( def get_rt_dir(
subdir: str = 'tractor' subdir: str|Path|None = None,
) -> Path: ) -> Path:
''' '''
Return the user "runtime dir" where most userspace apps stick Return the user "runtime dir", the file-sys location where most
their IPC and cache related system util-files; we take hold userspace apps stick their IPC and cache related system
of a `'XDG_RUNTIME_DIR'/tractor/` subdir by default. util-files.
On linux we take use a `'${XDG_RUNTIME_DIR}/tractor/` subdir by
default but equivalents are mapped for each platform using
the lovely `platformdirs`.
''' '''
rtdir: Path = _rtdir / subdir rt_dir: Path = Path(
if not rtdir.is_dir(): platformdirs.user_runtime_dir(
rtdir.mkdir() appname='tractor',
return rtdir ),
)
if subdir:
rt_dir: Path = rt_dir / subdir
if not rt_dir.is_dir():
rt_dir.mkdir()
return rt_dir
def current_ipc_protos() -> list[str]: def current_ipc_protos() -> list[str]:

View File

@ -561,6 +561,9 @@ async def _pause(
return return
elif isinstance(pause_err, trio.Cancelled): elif isinstance(pause_err, trio.Cancelled):
__tracebackhide__: bool = False
# XXX, unmask to REPL it.
# mk_pdb().set_trace(frame=inspect.currentframe())
_repl_fail_report += ( _repl_fail_report += (
'You called `tractor.pause()` from an already cancelled scope!\n\n' 'You called `tractor.pause()` from an already cancelled scope!\n\n'
'Consider `await tractor.pause(shield=True)` to make it work B)\n' 'Consider `await tractor.pause(shield=True)` to make it work B)\n'

View File

@ -23,14 +23,29 @@ from contextlib import (
) )
from pathlib import Path from pathlib import Path
import os import os
import sys
from socket import ( from socket import (
AF_UNIX, AF_UNIX,
SOCK_STREAM, SOCK_STREAM,
SO_PASSCRED,
SO_PEERCRED,
SOL_SOCKET, SOL_SOCKET,
) )
import struct import struct
# Platform-specific credential passing constants
# See: https://stackoverflow.com/a/7982749
if sys.platform == 'linux':
from socket import SO_PASSCRED, SO_PEERCRED
elif sys.platform == 'darwin': # macOS
# macOS uses LOCAL_PEERCRED instead of SO_PEERCRED
# and doesn't need SO_PASSCRED (credential passing is always enabled)
# Value from <sys/un.h>: #define LOCAL_PEERCRED 0x001
LOCAL_PEERCRED = 0x0001
SO_PEERCRED = LOCAL_PEERCRED # Alias for compatibility
SO_PASSCRED = None # Not needed/available on macOS
else:
# Other Unix platforms - may need additional handling
SO_PASSCRED = None
SO_PEERCRED = None
from typing import ( from typing import (
Type, Type,
TYPE_CHECKING, TYPE_CHECKING,
@ -306,7 +321,11 @@ async def open_unix_socket_w_passcred(
# much more simplified logic vs tcp sockets - one socket type and only one # much more simplified logic vs tcp sockets - one socket type and only one
# possible location to connect to # possible location to connect to
sock = trio.socket.socket(AF_UNIX, SOCK_STREAM) sock = trio.socket.socket(AF_UNIX, SOCK_STREAM)
sock.setsockopt(SOL_SOCKET, SO_PASSCRED, 1)
# Only set SO_PASSCRED on Linux (not needed/available on macOS)
if SO_PASSCRED is not None:
sock.setsockopt(SOL_SOCKET, SO_PASSCRED, 1)
with close_on_error(sock): with close_on_error(sock):
await sock.connect(os.fspath(filename)) await sock.connect(os.fspath(filename))
@ -320,7 +339,10 @@ def get_peer_info(sock: trio.socket.socket) -> tuple[
]: ]:
''' '''
Deliver the connecting peer's "credentials"-info as defined in Deliver the connecting peer's "credentials"-info as defined in
a very Linux specific way.. a platform-specific way.
On Linux, uses SO_PEERCRED.
On macOS, uses LOCAL_PEERCRED.
For more deats see, For more deats see,
- `man accept`, - `man accept`,
@ -333,6 +355,11 @@ def get_peer_info(sock: trio.socket.socket) -> tuple[
- https://stackoverflow.com/a/7982749 - https://stackoverflow.com/a/7982749
''' '''
if SO_PEERCRED is None:
raise RuntimeError(
f'Peer credential retrieval not supported on {sys.platform}!'
)
creds: bytes = sock.getsockopt( creds: bytes = sock.getsockopt(
SOL_SOCKET, SOL_SOCKET,
SO_PEERCRED, SO_PEERCRED,
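
On linux the `creds` buffer returned by that `getsockopt()` call is
a `struct ucred` (three C ints: pid, uid, gid), so per the linked SO
answer the decode is presumably along the lines of,

  import struct
  # linux `struct ucred` layout: (pid, uid, gid) as 3 C ints
  pid, uid, gid = struct.unpack('3i', creds)

(macOS's `LOCAL_PEERCRED` fills a differently-shaped `xucred` struct,
so the same unpack likely doesn't apply there.)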

View File

@ -14,11 +14,22 @@
# You should have received a copy of the GNU Affero General Public License # You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>. # along with this program. If not, see <https://www.gnu.org/licenses/>.
""" '''
Log like a forester! An enhanced logging subsys.
""" An extended logging layer using (for now) the stdlib's `logging`
+ `colorlog` which embeds concurrency-primitive/runtime info into
records (headers) to help you better grok your distributed systems
built on `tractor`.
'''
from collections.abc import Mapping from collections.abc import Mapping
from inspect import (
FrameInfo,
getmodule,
stack,
)
import sys import sys
import logging import logging
from logging import ( from logging import (
@ -26,8 +37,10 @@ from logging import (
Logger, Logger,
StreamHandler, StreamHandler,
) )
import colorlog # type: ignore from types import ModuleType
import warnings
import colorlog # type: ignore
import trio import trio
from ._state import current_actor from ._state import current_actor
@ -39,7 +52,7 @@ _default_loglevel: str = 'ERROR'
# Super sexy formatting thanks to ``colorlog``. # Super sexy formatting thanks to ``colorlog``.
# (NOTE: we use the '{' format style) # (NOTE: we use the '{' format style)
# Here, `thin_white` is just the layperson's gray. # Here, `thin_white` is just the layperson's gray.
LOG_FORMAT = ( LOG_FORMAT: str = (
# "{bold_white}{log_color}{asctime}{reset}" # "{bold_white}{log_color}{asctime}{reset}"
"{log_color}{asctime}{reset}" "{log_color}{asctime}{reset}"
" {bold_white}{thin_white}({reset}" " {bold_white}{thin_white}({reset}"
@ -51,7 +64,7 @@ LOG_FORMAT = (
" {reset}{bold_white}{thin_white}{message}" " {reset}{bold_white}{thin_white}{message}"
) )
DATE_FORMAT = '%b %d %H:%M:%S' DATE_FORMAT: str = '%b %d %H:%M:%S'
# FYI, ERROR is 40 # FYI, ERROR is 40
# TODO: use a `bidict` to avoid the :155 check? # TODO: use a `bidict` to avoid the :155 check?
@ -75,7 +88,10 @@ STD_PALETTE = {
'TRANSPORT': 'cyan', 'TRANSPORT': 'cyan',
} }
BOLD_PALETTE = { BOLD_PALETTE: dict[
str,
dict[int, str],
] = {
'bold': { 'bold': {
level: f"bold_{color}" for level, color in STD_PALETTE.items()} level: f"bold_{color}" for level, color in STD_PALETTE.items()}
} }
@ -97,10 +113,17 @@ def at_least_level(
return False return False
# TODO: this isn't showing the correct '{filename}' # TODO, compare with using a "filter" instead?
# as it did before.. # - https://stackoverflow.com/questions/60691759/add-information-to-every-log-message-in-python-logging/61830838#61830838
# |_corresponding dict-config,
# https://stackoverflow.com/questions/7507825/where-is-a-complete-example-of-logging-config-dictconfig/7507842#7507842
# - [ ] what's the benefit/tradeoffs?
#
class StackLevelAdapter(LoggerAdapter): class StackLevelAdapter(LoggerAdapter):
'''
A (software) stack oriented logger "adapter".
'''
def at_least_level( def at_least_level(
self, self,
level: str, level: str,
@ -284,7 +307,9 @@ class ActorContextInfo(Mapping):
def get_logger( def get_logger(
name: str|None = None, name: str|None = None,
_root_name: str = _proj_name, pkg_name: str = _proj_name,
# XXX, deprecated, use ^
_root_name: str|None = None,
logger: Logger|None = None, logger: Logger|None = None,
@ -293,22 +318,89 @@ def get_logger(
# |_https://stackoverflow.com/questions/7507825/where-is-a-complete-example-of-logging-config-dictconfig # |_https://stackoverflow.com/questions/7507825/where-is-a-complete-example-of-logging-config-dictconfig
# |_https://docs.python.org/3/library/logging.config.html#configuration-dictionary-schema # |_https://docs.python.org/3/library/logging.config.html#configuration-dictionary-schema
subsys_spec: str|None = None, subsys_spec: str|None = None,
mk_sublog: bool = True,
) -> StackLevelAdapter: ) -> StackLevelAdapter:
''' '''
Return the `tractor`-library root logger or a sub-logger for Return the `tractor`-library root logger or a sub-logger for
`name` if provided. `name` if provided.
When `name` is left null we try to auto-detect the caller's
`mod.__name__` and use that as a the sub-logger key.
This allows for example creating a module level instance like,
.. code:: python
log = tractor.log.get_logger(_root_name='mylib')
and by default all console record headers will show the caller's
(of any `log.<level>()`-method) correct sub-pkg's
+ py-module-file.
''' '''
if _root_name:
msg: str = (
'The `_root_name: str` param of `get_logger()` is now deprecated.\n'
'Use the new `pkg_name: str` instead, it is the same usage.\n'
)
warnings.warn(
msg,
DeprecationWarning,
stacklevel=2,
)
pkg_name: str = _root_name or pkg_name
log: Logger log: Logger
log = rlog = logger or logging.getLogger(_root_name) log = rlog = logger or logging.getLogger(pkg_name)
# Implicitly introspect the caller's module-name whenever `name`
# if left as the null default.
#
# When the `pkg_name` is `in` in the `mod.__name__` we presume
# this instance can be created as a sub-`StackLevelAdapter` and
# that the intention is get free module-path tracing and
# filtering (well once we implement that) oriented around the
# py-module code hierarchy of the consuming project.
if (
pkg_name != _proj_name
and
name is None
and
mk_sublog
):
callstack: list[FrameInfo] = stack()
caller_fi: FrameInfo = callstack[1]
caller_mod: ModuleType = getmodule(caller_fi.frame)
if caller_mod:
# ?how is this `mod.__name__` defined?
# -> well by how the mod is imported..
# |_https://stackoverflow.com/a/15883682
mod_name: str = caller_mod.__name__
mod_pkg: str = caller_mod.__package__
log.info(
f'Generating sub-logger name,\n'
f'{mod_pkg}.{mod_name}\n'
)
# if pkg_name in caller_mod.__package__:
# from tractor.devx.debug import mk_pdb
# mk_pdb().set_trace()
if (
pkg_name
# and
# pkg_name in mod_name
):
name = mod_name
# XXX, lowlevel debuggin..
# if pkg_name != _proj_name:
# from tractor.devx.debug import mk_pdb
# mk_pdb().set_trace()
if ( if (
name
and
name != _proj_name name != _proj_name
and
name
): ):
# NOTE: for handling for modules that use `get_logger(__name__)` # NOTE: for handling for modules that use `get_logger(__name__)`
# we make the following stylistic choice: # we make the following stylistic choice:
# - always avoid duplicate project-package token # - always avoid duplicate project-package token
@ -318,24 +410,63 @@ def get_logger(
# since in python the {filename} is always this same # since in python the {filename} is always this same
# module-file. # module-file.
sub_name: None|str = None rname: str = pkg_name
rname, _, sub_name = name.partition('.') pkg_path: str = name
pkgpath, _, modfilename = sub_name.rpartition('.')
# NOTE: for tractor itself never include the last level # ex. modden.runtime.progman
# module key in the name such that something like: eg. # -> rname='modden', _, pkg_path='runtime.progman'
# 'tractor.trionics._broadcast` only includes the first if pkg_name in name:
# 2 tokens in the (coloured) name part. rname, _, pkg_path = name.partition('.')
if rname == 'tractor':
sub_name = pkgpath
if _root_name in sub_name: # ex. modden.runtime.progman
duplicate, _, sub_name = sub_name.partition('.') # -> pkgpath='runtime', _, leaf_mod='progman'
subpkg_path, _, leaf_mod = pkg_path.rpartition('.')
if not sub_name: # NOTE: special usage for passing `name=__name__`,
#
# - remove duplication of any root-pkg-name in the
# (sub/child-)logger name; i.e. never include the
# `pkg_name` *twice* in the top-most-pkg-name/level
#
# -> this happens normally since it is added to `.getChild()`
# and as the name of its root-logger.
#
# => So for ex. (module key in the name) something like
# `name='tractor.trionics._broadcast` is passed,
# only includes the first 2 sub-pkg name-tokens in the
# child-logger's name; the colored "pkg-namespace" header
# will then correctly show the same value as `name`.
if rname == pkg_name:
pkg_path = subpkg_path
# XXX, do some double-checks for duplication of,
# - root-pkg-name, already in root logger
# - leaf-module-name already in `{filename}` header-field
if pkg_name in pkg_path:
_duplicate, _, pkg_path = pkg_path.partition('.')
if _duplicate:
# assert _duplicate == rname
_root_log.warning(
f'Duplicate pkg-name in sub-logger key?\n'
f'pkg_name = {pkg_name!r}\n'
f'pkg_path = {pkg_path!r}\n'
)
if (
leaf_mod
and
leaf_mod in pkg_path
):
_root_log.warning(
f'Duplicate leaf-module-name in sub-logger key?\n'
f'leaf_mod = {leaf_mod!r}\n'
f'pkg_path = {pkg_path!r}\n'
)
if not pkg_path:
log = rlog log = rlog
else: elif mk_sublog:
log = rlog.getChild(sub_name) log = rlog.getChild(pkg_path)
log.level = rlog.level log.level = rlog.level
@ -350,8 +481,13 @@ def get_logger(
for name, val in CUSTOM_LEVELS.items(): for name, val in CUSTOM_LEVELS.items():
logging.addLevelName(val, name) logging.addLevelName(val, name)
# ensure customs levels exist as methods # ensure our custom adapter levels exist as methods
assert getattr(logger, name.lower()), f'Logger does not define {name}' assert getattr(
logger,
name.lower()
), (
f'Logger does not define {name}'
)
return logger return logger
@ -425,4 +561,4 @@ def get_loglevel() -> str:
# global module logger for tractor itself # global module logger for tractor itself
log: StackLevelAdapter = get_logger('tractor') _root_log: StackLevelAdapter = get_logger('tractor')

View File

@ -94,10 +94,14 @@ else:
QueueShutDown = False QueueShutDown = False
# TODO, generally speaking we can generalize this abstraction, a "SC linked # TODO, generally speaking we can generalize this abstraction as,
# parent->child task pair", as the same "supervision scope primitive" #
# **that is** our `._context.Context` with the only difference being # > A "SC linked, inter-event-loop" channel for comms between
# in how the tasks conduct msg-passing comms. # > a `parent: trio.Task` -> `child: asyncio.Task` pair.
#
# It is **very similar** in terms of its operation as a "supervision
# scope primitive" to that of our `._context.Context` with the only
# difference being in how the tasks conduct msg-passing comms.
# #
# For `LinkedTaskChannel` we are passing the equivalent of (once you # For `LinkedTaskChannel` we are passing the equivalent of (once you
# include all the recently added `._trio/aio_to_raise` # include all the recently added `._trio/aio_to_raise`
@ -122,6 +126,7 @@ class LinkedTaskChannel(
task scheduled in the host loop. task scheduled in the host loop.
''' '''
# ?TODO, rename as `._aio_q` since it's 2-way?
_to_aio: asyncio.Queue _to_aio: asyncio.Queue
_from_aio: trio.MemoryReceiveChannel _from_aio: trio.MemoryReceiveChannel
@ -235,9 +240,11 @@ class LinkedTaskChannel(
# #
async def receive(self) -> Any: async def receive(self) -> Any:
''' '''
Receive a value from the paired `asyncio.Task` with Receive a value `trio.Task` <- `asyncio.Task`.
Note the tasks in each loop are "SC linked" as a pair with
exception/cancel handling to teardown both sides on any exception/cancel handling to teardown both sides on any
unexpected error. unexpected error or cancellation.
''' '''
try: try:
@ -261,15 +268,42 @@ class LinkedTaskChannel(
): ):
raise err raise err
async def get(self) -> Any:
'''
Receive a value `asyncio.Task` <- `trio.Task`.
This is equiv to `await self._from_trio.get()`.
'''
return await self._to_aio.get()
async def send(self, item: Any) -> None: async def send(self, item: Any) -> None:
''' '''
Send a value through to the asyncio task presuming Send a value through `trio.Task` -> `asyncio.Task`
it defines a ``from_trio`` argument, if it does not presuming
it defines a `from_trio` argument or makes calls
to `chan.get()` , if it does not
this method will raise an error. this method will raise an error.
''' '''
self._to_aio.put_nowait(item) self._to_aio.put_nowait(item)
# TODO? could we only compile-in this method on an instance
# handed to the `asyncio`-side, i.e. the fn invoked with
# `.open_channel_from()`.
def send_nowait(
self,
item: Any,
) -> None:
'''
Send a value through FROM the `asyncio.Task` to
the `trio.Task` NON-BLOCKING.
This is equiv to `self._to_trio.send_nowait()`.
'''
self._to_trio.send_nowait(item)
# TODO? needed? # TODO? needed?
# async def wait_aio_complete(self) -> None: # async def wait_aio_complete(self) -> None:
# await self._aio_task_complete.wait() # await self._aio_task_complete.wait()
@ -337,9 +371,12 @@ def _run_asyncio_task(
''' '''
__tracebackhide__: bool = hide_tb __tracebackhide__: bool = hide_tb
if not tractor.current_actor().is_infected_aio(): if not (actor := tractor.current_actor()).is_infected_aio():
raise RuntimeError( raise RuntimeError(
"`infect_asyncio` mode is not enabled!?" f'`infect_asyncio: bool` mode is not enabled ??\n'
f'Ensure you pass `ActorNursery.start_actor(infect_asyncio=True)`\n'
f'\n'
f'{actor}\n'
) )
# ITC (inter task comms), these channel/queue names are mostly from # ITC (inter task comms), these channel/queue names are mostly from

73
uv.lock
View File

@ -1,5 +1,5 @@
version = 1 version = 1
revision = 2 revision = 3
requires-python = ">=3.11" requires-python = ">=3.11"
[[package]] [[package]]
@ -236,6 +236,15 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/9e/c3/059298687310d527a58bb01f3b1965787ee3b40dce76752eda8b44e9a2c5/pexpect-4.9.0-py2.py3-none-any.whl", hash = "sha256:7236d1e080e4936be2dc3e326cec0af72acf9212a7e1d060210e70a47e253523", size = 63772, upload-time = "2023-11-25T06:56:14.81Z" }, { url = "https://files.pythonhosted.org/packages/9e/c3/059298687310d527a58bb01f3b1965787ee3b40dce76752eda8b44e9a2c5/pexpect-4.9.0-py2.py3-none-any.whl", hash = "sha256:7236d1e080e4936be2dc3e326cec0af72acf9212a7e1d060210e70a47e253523", size = 63772, upload-time = "2023-11-25T06:56:14.81Z" },
] ]
[[package]]
name = "platformdirs"
version = "4.4.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/23/e8/21db9c9987b0e728855bd57bff6984f67952bea55d6f75e055c46b5383e8/platformdirs-4.4.0.tar.gz", hash = "sha256:ca753cf4d81dc309bc67b0ea38fd15dc97bc30ce419a7f58d13eb3bf14c4febf", size = 21634, upload-time = "2025-08-26T14:32:04.268Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/40/4b/2028861e724d3bd36227adfa20d3fd24c3fc6d52032f4a93c133be5d17ce/platformdirs-4.4.0-py3-none-any.whl", hash = "sha256:abd01743f24e5287cd7a5db3752faf1a2d65353f38ec26d98e25a6db65958c85", size = 18654, upload-time = "2025-08-26T14:32:02.735Z" },
]
[[package]] [[package]]
name = "pluggy" name = "pluggy"
version = "1.5.0" version = "1.5.0"
@ -329,6 +338,32 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/30/3d/64ad57c803f1fa1e963a7946b6e0fea4a70df53c1a7fed304586539c2bac/pytest-8.3.5-py3-none-any.whl", hash = "sha256:c69214aa47deac29fad6c2a4f590b9c4a9fdb16a403176fe154b79c0b4d4d820", size = 343634, upload-time = "2025-03-02T12:54:52.069Z" }, { url = "https://files.pythonhosted.org/packages/30/3d/64ad57c803f1fa1e963a7946b6e0fea4a70df53c1a7fed304586539c2bac/pytest-8.3.5-py3-none-any.whl", hash = "sha256:c69214aa47deac29fad6c2a4f590b9c4a9fdb16a403176fe154b79c0b4d4d820", size = 343634, upload-time = "2025-03-02T12:54:52.069Z" },
] ]
[[package]]
name = "ruff"
version = "0.14.14"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/2e/06/f71e3a86b2df0dfa2d2f72195941cd09b44f87711cb7fa5193732cb9a5fc/ruff-0.14.14.tar.gz", hash = "sha256:2d0f819c9a90205f3a867dbbd0be083bee9912e170fd7d9704cc8ae45824896b", size = 4515732, upload-time = "2026-01-22T22:30:17.527Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/d2/89/20a12e97bc6b9f9f68343952da08a8099c57237aef953a56b82711d55edd/ruff-0.14.14-py3-none-linux_armv6l.whl", hash = "sha256:7cfe36b56e8489dee8fbc777c61959f60ec0f1f11817e8f2415f429552846aed", size = 10467650, upload-time = "2026-01-22T22:30:08.578Z" },
{ url = "https://files.pythonhosted.org/packages/a3/b1/c5de3fd2d5a831fcae21beda5e3589c0ba67eec8202e992388e4b17a6040/ruff-0.14.14-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:6006a0082336e7920b9573ef8a7f52eec837add1265cc74e04ea8a4368cd704c", size = 10883245, upload-time = "2026-01-22T22:30:04.155Z" },
{ url = "https://files.pythonhosted.org/packages/b8/7c/3c1db59a10e7490f8f6f8559d1db8636cbb13dccebf18686f4e3c9d7c772/ruff-0.14.14-py3-none-macosx_11_0_arm64.whl", hash = "sha256:026c1d25996818f0bf498636686199d9bd0d9d6341c9c2c3b62e2a0198b758de", size = 10231273, upload-time = "2026-01-22T22:30:34.642Z" },
{ url = "https://files.pythonhosted.org/packages/a1/6e/5e0e0d9674be0f8581d1f5e0f0a04761203affce3232c1a1189d0e3b4dad/ruff-0.14.14-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f666445819d31210b71e0a6d1c01e24447a20b85458eea25a25fe8142210ae0e", size = 10585753, upload-time = "2026-01-22T22:30:31.781Z" },
{ url = "https://files.pythonhosted.org/packages/23/09/754ab09f46ff1884d422dc26d59ba18b4e5d355be147721bb2518aa2a014/ruff-0.14.14-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:3c0f18b922c6d2ff9a5e6c3ee16259adc513ca775bcf82c67ebab7cbd9da5bc8", size = 10286052, upload-time = "2026-01-22T22:30:24.827Z" },
{ url = "https://files.pythonhosted.org/packages/c8/cc/e71f88dd2a12afb5f50733851729d6b571a7c3a35bfdb16c3035132675a0/ruff-0.14.14-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1629e67489c2dea43e8658c3dba659edbfd87361624b4040d1df04c9740ae906", size = 11043637, upload-time = "2026-01-22T22:30:13.239Z" },
{ url = "https://files.pythonhosted.org/packages/67/b2/397245026352494497dac935d7f00f1468c03a23a0c5db6ad8fc49ca3fb2/ruff-0.14.14-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:27493a2131ea0f899057d49d303e4292b2cae2bb57253c1ed1f256fbcd1da480", size = 12194761, upload-time = "2026-01-22T22:30:22.542Z" },
{ url = "https://files.pythonhosted.org/packages/5b/06/06ef271459f778323112c51b7587ce85230785cd64e91772034ddb88f200/ruff-0.14.14-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:01ff589aab3f5b539e35db38425da31a57521efd1e4ad1ae08fc34dbe30bd7df", size = 12005701, upload-time = "2026-01-22T22:30:20.499Z" },
{ url = "https://files.pythonhosted.org/packages/41/d6/99364514541cf811ccc5ac44362f88df66373e9fec1b9d1c4cc830593fe7/ruff-0.14.14-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1cc12d74eef0f29f51775f5b755913eb523546b88e2d733e1d701fe65144e89b", size = 11282455, upload-time = "2026-01-22T22:29:59.679Z" },
{ url = "https://files.pythonhosted.org/packages/ca/71/37daa46f89475f8582b7762ecd2722492df26421714a33e72ccc9a84d7a5/ruff-0.14.14-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bb8481604b7a9e75eff53772496201690ce2687067e038b3cc31aaf16aa0b974", size = 11215882, upload-time = "2026-01-22T22:29:57.032Z" },
{ url = "https://files.pythonhosted.org/packages/2c/10/a31f86169ec91c0705e618443ee74ede0bdd94da0a57b28e72db68b2dbac/ruff-0.14.14-py3-none-manylinux_2_31_riscv64.whl", hash = "sha256:14649acb1cf7b5d2d283ebd2f58d56b75836ed8c6f329664fa91cdea19e76e66", size = 11180549, upload-time = "2026-01-22T22:30:27.175Z" },
{ url = "https://files.pythonhosted.org/packages/fd/1e/c723f20536b5163adf79bdd10c5f093414293cdf567eed9bdb7b83940f3f/ruff-0.14.14-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:e8058d2145566510790eab4e2fad186002e288dec5e0d343a92fe7b0bc1b3e13", size = 10543416, upload-time = "2026-01-22T22:30:01.964Z" },
{ url = "https://files.pythonhosted.org/packages/3e/34/8a84cea7e42c2d94ba5bde1d7a4fae164d6318f13f933d92da6d7c2041ff/ruff-0.14.14-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:e651e977a79e4c758eb807f0481d673a67ffe53cfa92209781dfa3a996cf8412", size = 10285491, upload-time = "2026-01-22T22:30:29.51Z" },
{ url = "https://files.pythonhosted.org/packages/55/ef/b7c5ea0be82518906c978e365e56a77f8de7678c8bb6651ccfbdc178c29f/ruff-0.14.14-py3-none-musllinux_1_2_i686.whl", hash = "sha256:cc8b22da8d9d6fdd844a68ae937e2a0adf9b16514e9a97cc60355e2d4b219fc3", size = 10733525, upload-time = "2026-01-22T22:30:06.499Z" },
{ url = "https://files.pythonhosted.org/packages/6a/5b/aaf1dfbcc53a2811f6cc0a1759de24e4b03e02ba8762daabd9b6bd8c59e3/ruff-0.14.14-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:16bc890fb4cc9781bb05beb5ab4cd51be9e7cb376bf1dd3580512b24eb3fda2b", size = 11315626, upload-time = "2026-01-22T22:30:36.848Z" },
{ url = "https://files.pythonhosted.org/packages/2c/aa/9f89c719c467dfaf8ad799b9bae0df494513fb21d31a6059cb5870e57e74/ruff-0.14.14-py3-none-win32.whl", hash = "sha256:b530c191970b143375b6a68e6f743800b2b786bbcf03a7965b06c4bf04568167", size = 10502442, upload-time = "2026-01-22T22:30:38.93Z" },
{ url = "https://files.pythonhosted.org/packages/87/44/90fa543014c45560cae1fffc63ea059fb3575ee6e1cb654562197e5d16fb/ruff-0.14.14-py3-none-win_amd64.whl", hash = "sha256:3dde1435e6b6fe5b66506c1dff67a421d0b7f6488d466f651c07f4cab3bf20fd", size = 11630486, upload-time = "2026-01-22T22:30:10.852Z" },
{ url = "https://files.pythonhosted.org/packages/9e/6a/40fee331a52339926a92e17ae748827270b288a35ef4a15c9c8f2ec54715/ruff-0.14.14-py3-none-win_arm64.whl", hash = "sha256:56e6981a98b13a32236a72a8da421d7839221fa308b223b9283312312e5ac76c", size = 10920448, upload-time = "2026-01-22T22:30:15.417Z" },
]
[[package]] [[package]]
name = "sniffio" name = "sniffio"
version = "1.3.1" version = "1.3.1"
@ -378,6 +413,7 @@ dependencies = [
{ name = "colorlog" }, { name = "colorlog" },
{ name = "msgspec" }, { name = "msgspec" },
{ name = "pdbp" }, { name = "pdbp" },
{ name = "platformdirs" },
{ name = "tricycle" }, { name = "tricycle" },
{ name = "trio" }, { name = "trio" },
{ name = "wrapt" }, { name = "wrapt" },
@ -395,6 +431,24 @@ dev = [
{ name = "typing-extensions" }, { name = "typing-extensions" },
{ name = "xonsh" }, { name = "xonsh" },
] ]
devx = [
{ name = "greenback" },
{ name = "stackscope" },
{ name = "typing-extensions" },
]
lint = [
{ name = "ruff" },
]
repl = [
{ name = "prompt-toolkit" },
{ name = "psutil" },
{ name = "pyperclip" },
{ name = "xonsh" },
]
testing = [
{ name = "pexpect" },
{ name = "pytest" },
]
[package.metadata] [package.metadata]
requires-dist = [ requires-dist = [
@ -403,6 +457,7 @@ requires-dist = [
{ name = "colorlog", specifier = ">=6.8.2,<7" }, { name = "colorlog", specifier = ">=6.8.2,<7" },
{ name = "msgspec", specifier = ">=0.19.0" }, { name = "msgspec", specifier = ">=0.19.0" },
{ name = "pdbp", specifier = ">=1.6,<2" }, { name = "pdbp", specifier = ">=1.6,<2" },
{ name = "platformdirs", specifier = ">=4.4.0" },
{ name = "tricycle", specifier = ">=0.4.1,<0.5" }, { name = "tricycle", specifier = ">=0.4.1,<0.5" },
{ name = "trio", specifier = ">0.27" }, { name = "trio", specifier = ">0.27" },
{ name = "wrapt", specifier = ">=1.16.0,<2" }, { name = "wrapt", specifier = ">=1.16.0,<2" },
@ -420,6 +475,22 @@ dev = [
{ name = "typing-extensions", specifier = ">=4.14.1" }, { name = "typing-extensions", specifier = ">=4.14.1" },
{ name = "xonsh", specifier = ">=0.19.2" }, { name = "xonsh", specifier = ">=0.19.2" },
] ]
devx = [
{ name = "greenback", specifier = ">=1.2.1,<2" },
{ name = "stackscope", specifier = ">=0.2.2,<0.3" },
{ name = "typing-extensions", specifier = ">=4.14.1" },
]
lint = [{ name = "ruff", specifier = ">=0.9.6" }]
repl = [
{ name = "prompt-toolkit", specifier = ">=3.0.50" },
{ name = "psutil", specifier = ">=7.0.0" },
{ name = "pyperclip", specifier = ">=1.9.0" },
{ name = "xonsh", specifier = ">=0.19.2" },
]
testing = [
{ name = "pexpect", specifier = ">=4.9.0,<5" },
{ name = "pytest", specifier = ">=8.3.5" },
]
[[package]] [[package]]
name = "tricycle" name = "tricycle"