Compare commits

86 Commits

Author SHA1 Message Date
Gud Boi 9a720f8e21 Merge pull request 'ib_venue_closures: gap detection for "legacy mkts"' (#71)
Reviewed-on: https://www.pikers.dev/pikers/piker/pulls/71
2026-02-24 17:31:13 +00:00
Gud Boi d17e6ab5d9 Add holiday-gap detection via `exchange_calendars`
Integrate `exchange_calendars` lib to detect market holidays in
gap-checking logic via new `.ib.venues.has_holiday()` helper!

The `.ib.venues` impl deats,
- add a new `has_holiday()` using `xcals.get_calendar()` and friends
  for sanity checking a venue's holiday closure-gaps.
  * final holiday detection-check is basically,
   `(cash_gap := (next_open - prev_close)) > period`
- include `time_step_s` param to `is_venue_closure()` for boundary
  tolerance checks.
  * lets us expand closure-time checks to include `+/-time_step_s`
    "off-by-one-`timeframe`-sample" edge case ranges.
- add real docstring to `has_weekend()`.

In `.ib.api` refine usage for ^ changes,
- move `is_venue_open()` call + tz-convert outside gap check
- use a walrus to capture `has_closure_gap` from `is_venue_closure()`
- add a `not has_closure_gap` condition to the
  mismatched-duration/short-frame warning block to avoid needless warns.
- keep duration-based "short-frame" log as `.error()` but toss in a bp
  so (someone can) unmask to figure out wtf is going on..
  * we should **never** really hit this path unless there's a valid bug
    or data issue with IB/GFIS!
  * keep recursion path masked-out just leave a `breakpoint()` for now.

Also some logger updates,
- import `get_logger()` from top-level `piker.log` vs `.ib._util` which
  was always kinda wrong..
- change `NonShittyIB._logger` to use `__name__` vs literal.

(this commit msg was generated in some part by [`claude-code`][claude-code-gh])
[claude-code-gh]: https://github.com/anthropics/claude-code
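
Fwiw, a rough sketch of that closure-gap check; the `has_holiday()`
signature and calendar-code handling below are guesses, not the actual
`.ib.venues` impl:

```python
import exchange_calendars as xcals
import pandas as pd

def has_holiday(
    xcal_code: str,  # eg. 'XNYS' for NYSE (assumed mapping)
    prev_close: pd.Timestamp,  # tz-aware close of the bar before the gap
    period: pd.Timedelta,  # expected sample step, eg. 1min
) -> bool:
    '''
    True when the venue's next open is more than one sample-period
    past the previous close, ie. a (holiday/weekend) closure gap.

    '''
    cal = xcals.get_calendar(xcal_code)
    next_open: pd.Timestamp = cal.next_open(prev_close)
    return (cash_gap := (next_open - prev_close)) > period
```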
2026-02-23 13:41:00 -05:00
Gud Boi cd0c780d04 Add `exchange_calendars` dep for venue-closure gap checking 2026-02-23 13:41:00 -05:00
Gud Boi 1417c49051 Adjust type annots in binance and IB symbol mods
Namely, again switching `|`-union syntax to rm adjacent white space.

Also, flip to multiline style for threshold comparison in
`.binance.feed` and change gap-check threshold to `timeframe` (vs
a hardcoded `60`s) in the `get_ohlc()` closure.

(this commit msg was generated in some part by [`claude-code`][claude-code-gh])
[claude-code-gh]: https://github.com/anthropics/claude-code
2026-02-23 13:41:00 -05:00
Gud Boi 044afb0f6e Use `.ib.venues.is_venue_open()` in `.feed`
In `.ib.feed.stream_quotes()` specifically that is since time-range
checking code was moved to the new sub-mod.

Deats,
- drop import of old `is_current_time_in_range()` from `._util`
- change `get_bars()` sig: `end_dt`/`start_dt` to `datetime|None`
- comment-out `breakpoint()` in `open_history_client()`

Styling,
- add multiline style to conditionals and tuple unpacks
- fix type annotation: `Contract|None` vs `Contract | None`
- fix backticks in comment: `ib_insync` vs `ib_async`

(this commit msg was generated in some part by [`claude-code`][claude-code-gh])
[claude-code-gh]: https://github.com/anthropics/claude-code
2026-02-23 13:41:00 -05:00
Gud Boi c96ecdab75 Add venue-closure gap-detection in `.ib.api.Client.bars()`
With all detection logic coming from our new `.ib.venues` helpers
allowing us to verify IB's OHLC bar frames don't contain unexpected
time-gaps.

`Client.bars()` new checking deats,
- add `is_venue_open()`, `has_weekend()`, `sesh_times()`, and
  `is_venue_closure()` checks when `last_dt < end_dt`
- always calc gap-period in local tz via `ContractDetails.timeZoneId`.
- log warnings on invalid non-closure gaps, debug on closures for now.
- change recursion case to just `log.error()` + `breakpoint()`; we might end
  up tossing it since i don't think i could ever get it to be reliable..
  * mask-out recursive `.bars()` call (likely unnecessary).
- flip `start_dt`/`end_dt` param defaults to `None` vs epoch `str`.
- update docstring to clarify no `start_dt` support by IB
- add mod level `_iso8601_epoch_in_est` const to keep track of orig
  param default value.
- add multiline style to return type-annot, type all `pendulum` objects.

Also,
- uppercase `Crypto.symbol` for PAXOS contracts in `.find_contracts()`,
  tho now we're getting a weird new API error i left in a todo-comment..

(this commit msg was generated in some part by [`claude-code`][claude-code-gh])
[claude-code-gh]: https://github.com/anthropics/claude-code
2026-02-23 13:41:00 -05:00
Gud Boi e1e59453a9 Mv `parse_trading_hours()` from `._util` to `.venues`
It was an AI-model draft that we can prolly toss but figured might as
well org it appropriately.

(this commit msg was generated in some part by [`claude-code`][claude-code-gh])
[claude-code-gh]: https://github.com/anthropics/claude-code
2026-02-23 13:41:00 -05:00
Gud Boi d784af9df9 Add `.ib.venues` for mkt-venue-closure checkers
Introduce set of helper-fns for detecting venue open/close status,
session start/end times, and related time-gap detection using
`pendulum`.

Deats,
- add `iter_sessions()` to yield `pendulum.Interval`s from
  a `ContractDetails` instance.
- add `is_venue_open()` to check if active at a given time.
- add `is_venue_closure()` to detect valid closure gaps.
- add `sesh_times()` to extract weekday-agnostic open/close times.
- add `has_weekend()` to check for Sat/Sun in interval.
- move in lowlevel `is_current_time_in_range()` for checking a
  datetime within a `sesh: pendulum.Interval`.

(this commit msg was generated in some part by [`claude-code`][claude-code-gh])
[claude-code-gh]: https://github.com/anthropics/claude-code
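
A hedged sketch of what the lowest-level range check might look like
(name taken from the msg above, body is a guess):

```python
import pendulum

def is_current_time_in_range(
    sesh: pendulum.Interval,  # a session's open..close span
    now: pendulum.DateTime|None = None,
) -> bool:
    # sketch only: is `now` inside the session interval?
    now = now or pendulum.now('UTC')
    return sesh.start <= now <= sesh.end
```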
2026-02-23 13:41:00 -05:00
Gud Boi cabd3fde92 Merge pull request 'fix_tractor_logging: porting to latest `tractor.log` API(s)' (#76)
Reviewed-on: #76
2026-02-23 18:40:35 +00:00
Gud Boi 2d0005ce48 Drop info-level `.pause()`-es used while devving 2026-02-23 13:26:46 -05:00
Gud Boi d0add050b7 Better doc-strings n styling in `piker.cli` eps
Add comprehensive docstrings to the top-level CLI endpoints and helpers,
explaining the purpose and structure of each (sub)command.

Deats,
- add detailed docstring to `pikerd()` explaining its role as the
  root service-actor/daemon supervisor.
- add docstring to `cli()` noting it's the root endpoint generally
  requiring a sub-cmd input.
- add extensive docstring to `services()` explaining the daemon naming
  conventions and listing a few current/common service actors.
- add docstring to `_load_clis()` explaining dynamic CLI loading.

Stylin,
- add multiline style to `and not maddrs` conditional in
  `load_trans_eps()`.
- drop commented-out `--tsdb` and `--es` click options from
  `pikerd()`, they're more or less obsolete given `nativedb`.
- add type annots where obviously handy.
- add TODO comment about UDS support in `services()`.

(this commit msg was generated in some part by [`claude-code`][claude-code-gh])
[claude-code-gh]: https://github.com/anthropics/claude-code
2026-02-23 12:22:48 -05:00
Gud Boi 709bc8a5be Bump `platformdirs` version in lock file 2026-02-23 12:22:48 -05:00
Gud Boi c7979d0100 Enable console logging in `.accounting` on import
Enable `get_console_log()` at `.accounting.__init__` import-time
to ensure console output is available whenever the subsystem is
used by `.clearing` or other code.

Deats,
- uncomment and complete `get_console_log()` call in
  `.accounting.__init__` with default `level='warning'` and
  `name=__name__`.
- update comment explaining rationale: better to enable on import
  since namely used by `.clearing` subsystem.

Also,
- change `piker.calc` import to relative `.calc` in
  `.accounting.__init__`.
- drop unused `get_console_log` import from `.accounting._pos`.
- add `log = get_logger(name=__name__)` to `.accounting.cli`.
- change `get_logger(loglevel)` -> `get_console_log()` in
  `.accounting.cli.sync()` with proper kwargs.
- add `get_console_log` import to `.accounting.cli`.

(this commit msg was generated in some part by [`claude-code`][claude-code-gh])
[claude-code-gh]: https://github.com/anthropics/claude-code
2026-02-23 12:22:48 -05:00
Gud Boi 9a97c477e2 Use `name=__name__` for logs throughout `.service`
Change all `.service` sub-modules to use `get_logger(name=__name__)`
for per-submod instances vs a shared `._util.log`.

Deats,
- import `get_logger()` and `get_console_log()` from top-level
  `piker.log` instead of `._util` for all.
- drop `log` and `get_console_log()` partial from `._util`.
- add `name=subsys` kwarg to `get_console_log()` call in
  `_actor_runtime.maybe_open_pikerd()`.
- add `name='piker.service'` to `get_console_log()` in
  `_ahab.open_ahabd()`.
- change default `loglevel` from `None` to `'cancel'` in
  `_ahab.open_ahabd()`.
- add sanity check: `assert log.name == 'piker.service'` in
  `_daemon.maybe_spawn_daemon()`.
- change `print()` -> `log.info()` in `_registry.find_service()`.
- drop stray `from piker.service._util import log` import in
  `brokers._daemon.spawn_brokerd()`.

Styling/cleanups,
- drop blank lines from various fn sigs.
- do more sin-ws union type annots.
- add more multiline style to `or` expressions in `_actor_runtime` and
  `_registry`.
- update `._util` docstring with TODO about `import`-time console
  log setup.
- add TODO comments in `_registry` about UDS registry support.
- use `.aid.uid` from actor in `_registry.open_registry()`.
- add intermediate var `reg_addrs` in `_registry.open_registry()` (bc
  i was tracing rtvs value issues in `tractor`).
- add `pformat` import to `.elastic` (code path is currently
  not used but figured might as well appease the linter..)

(this commit msg was generated in some part by [`claude-code`][claude-code-gh])
[claude-code-gh]: https://github.com/anthropics/claude-code
2026-02-23 12:22:48 -05:00
Gud Boi 2516d97fe4 Pass `loglevel` down through `.ui` graphics tasks
Add `loglevel` propagation to UI graphics tasks and sampler stream
opens to enable proper console logging in chart update loops. This
ensures the graphics and FSP subsystems receive the same loglevel
as their parent and/or sibling UI-actor tasks.

Deats,
- add `loglevel` param to `graphics_update_loop()` and
  `increment_history_view()` with default `'warning'`.
- pass `loglevel` to `open_sample_stream()` calls in both fns.
- use `partial()` to pass `loglevel` through to `nurse.start_soon()`
  calls in `display_symbol_data()` and `graphics_update_loop()`.

Also logging, doc-strs, and code-style tweaks,
- change `print()` -> `log.debug()` for hidden-chart and
  interaction-pause msgs in graphics loop.
- change `log.info()` -> `log.debug()` for resize events in
  `GodWidget` and `MainWindow`.
- add multiline style to resize log msg in `GodWidget`.
- add docstring to `MainWindow.on_focus_change()`.
- moar union type annot adjustments.
- switch to explicit kwarg `period_s=` for `open_sample_stream()`
  in `increment_history_view()`.
- multiline style for `names` list in `open_fsp_actor_cluster()`.
- change `count=2` -> `count=len(names)` in
  `open_fsp_actor_cluster()`.
- add TODO about using `.experimental` for cluster import (once that
  gets patched into upstream `tractor`).
- multiline style for `or` in `FspAdmin.start_engine_task()`.
- comment-out unused `brokernames` in `ui.cli.chart()`.
- add commented breakpoint in `ui.cli.chart()`.
- fix docstring style in `OrderMode.on_submit()`.

(this commit msg was generated in some part by [`claude-code`][claude-code-gh])
[claude-code-gh]: https://github.com/anthropics/claude-code
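
The `partial()` usage matters bc `trio`'s `nursery.start_soon()` only
forwards positional args; a tiny standalone illustration (names made up):

```python
from functools import partial
import trio

async def graphics_update_loop(loglevel: str = 'warning') -> None:
    print(f'running with {loglevel=}')

async def main() -> None:
    async with trio.open_nursery() as nurse:
        # kwargs like `loglevel` can't be passed via `.start_soon()`
        # directly, so bind them with `partial()` first.
        nurse.start_soon(
            partial(graphics_update_loop, loglevel='info')
        )

trio.run(main)
```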
2026-02-23 12:22:48 -05:00
Gud Boi 5bfc9d46e1 Pass `loglevel` to `cascade()` feed/sampler opens
Add `loglevel` param to both `maybe_open_feed()` and
`open_sample_stream()` calls in FSP engine's `cascade()` task to
ensure proper console log setup in downstream sampling tasks.

Deats,
- pass `loglevel=loglevel` to `maybe_open_feed()` call.
- pass `loglevel=loglevel` to `open_sample_stream()` call.

Also,
- switch to explicit kwargs: `fqmes=[fqme]` and `period_s=` for
  clarity and consistency with other callsites.

(this commit msg was generated in some part by [`claude-code`][claude-code-gh])
[claude-code-gh]: https://github.com/anthropics/claude-code
2026-02-23 12:22:48 -05:00
Gud Boi aa403bd390 Pass `loglevel` down through `.data` callstack
Add `loglevel` param propagation across the data feed and sampling
subsystems to enable proper console log setup in downstream (distributed)
subactor tasks. This ensures sampler and history-mgmt tasks receive the
same loglevel as their parent `.data.feed` tasks.

Deats,
- add `loglevel: str|None` param to `register_with_sampler()`,
  `maybe_open_samplerd()`, and `open_sample_stream()`.
- pass `loglevel` through to `get_console_log()` in
  `register_with_sampler()` with fallback to actor `loglevel`.
- use `partial()` in `allocate_persistent_feed()` to pass
  `loglevel` to `manage_history()` at task-start.
- add `loglevel` param to `manage_history()` with default
  `'warning'` and pass through to `open_sample_stream()` from there.
- capture `loglevel` var in `brokers.cli.search()` and pass to
  `symbol_search()` call.

Also,
- drop blank lines in fn sigs for consistency with piker style.
- add debug bp in `open_feed()` when `loglevel != 'info'`.

(this commit msg was generated in some part by [`claude-code`][claude-code-gh])
[claude-code-gh]: https://github.com/anthropics/claude-code
2026-02-23 12:22:48 -05:00
Gud Boi c1530c7a37 Enable console via `.clearing._ems.open_brokerd_dialog()`
Enable console logs for both `.clearing` and `.accounting` in
`open_brokerd_dialog()` and pass `loglevel` to all broker-backend
trade-dialog endpoints. This ensures all `open_trade_dialog()` will
receive the same level passed to the EMS, including the paper engine.

Deats,
- add `loglevel` param to `mk_paper_ep()` closure.
- pass `loglevel=loglevel` to all trade endpoint `open_context()`
  calls and `mk_paper_ep()` invocations.
- change default `loglevel` in `open_ems()` from `'error'` to
  `'warning'`.
- add `get_console_log()` calls for `'clearing'` and
  `'piker.accounting'` at top of `open_brokerd_dialog()` to ensure those
  dependent subsystems are console enabled given they're namely used by
  the `brokerd` trade-dialog ep tasks.

(this commit msg was generated in some part by [`claude-code`][claude-code-gh])
[claude-code-gh]: https://github.com/anthropics/claude-code
2026-02-23 12:22:48 -05:00
Gud Boi 50ffc1095b Use `__name__` for loggers across most sub-mods
Change most sub-modules to use `get_logger(name=__name__)` for
per-leaf-module `log` instances vs previous subpkg-level/shared refs.

Primary changes,
- import `get_logger()` / `get_console_log()` from top-level `piker.log`
  across leaf mods.
- change any `<subsys>._util.log` logger-instances as well (though this
  approach should no longer be used since it masks the endpoint module's
  emissions).

Also,
- add a defaulted `loglevel: str` param to all `open_trade_dialog()`
  endpoints, anticipating it being passed in by `.clearing`-engine.
- call `get_console_log(level=loglevel, name=__name__)` in each trade
  dialog ep to enable per-`brokerd`-backend console writing.
- drop `get_logger` from `.brokers.__all__` exports
- fix type annotations: `str|None` vs `str | None`
- add TODOs for,
  * comments in `._util` about multi-subsys logging
  * `.accounting.__init__` about console log setup

(this commit msg was generated in some part by [`claude-code`][claude-code-gh])
[claude-code-gh]: https://github.com/anthropics/claude-code
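
The per-backend ep pattern roughly looks like the following (a sketch,
mirroring the diff further below):

```python
import tractor
from piker.log import get_console_log

@tractor.context
async def open_trade_dialog(
    ctx: tractor.Context,
    loglevel: str = 'warning',  # passed in by the `.clearing` engine
) -> None:
    # enable console emissions for *this* `brokerd` subactor
    get_console_log(
        level=loglevel,
        name=__name__,
    )
    ...
```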
2026-02-23 12:22:48 -05:00
Gud Boi 437d87ab5f Use `__name__` for loggers across `.ib` sub-mods
Change all `.ib` sub-modules to use `get_logger(name=__name__)`
for per-module logger instances vs shared `._util.log`.

Deats,
- change `._util` to use `__name__` vs literal string.
- change `.broker`, `.feed`, `.ledger`, `.symbols` to import
  `get_logger()` from top-level `.log` and call with `__name__`.
- drop `log` imports from `._util` in all affected mods.

Also,
- drop trailing comma in `.cli.services()` conditional for `loglevel`
  passthrough -> fixes an actual kwargs bug!!

(this commit msg was generated in some part by [`claude-code`][claude-code-gh])
[claude-code-gh]: https://github.com/anthropics/claude-code
2026-02-23 12:22:48 -05:00
Gud Boi 0087cc8876 .data.feed: move `Flume` import to avoid cycle
Move `Flume` to `TYPE_CHECKING` and add runtime imports in
`allocate_persistent_feed()` + `open_feed()` to avoid cycle
with `.flows` mod.

(this commit msg was generated in some part by [`claude-code`][claude-code-gh])
[claude-code-gh]: https://github.com/anthropics/claude-code
2026-02-23 12:22:48 -05:00
Gud Boi 034fa19372 .fsp._engine: enable console logging in `cascade()`
Add console log setup with module name + multiline style for
desync warning msg.

Also,
- fix import: `Flume` from `.data.flows` vs `.data.feed`
- move `Feed` to `TYPE_CHECKING` block
- add TODO comment about `tractor._state` dict issue

(this commit msg was generated in some part by [`claude-code`][claude-code-gh])
[claude-code-gh]: https://github.com/anthropics/claude-code
2026-02-23 12:22:48 -05:00
Gud Boi 0f0bbd1cda Add order-cancel debugging and multiline kbd logs
Add verbose logging + error handling for order cancellation
hotkey path and multiline style for view-mode kb msgs.

Deats,
- add `Cursor.is_hovered()` to check hover state
- log warnings when no orders cancelled via <c> hotkey
- add try-except around `.cancel_orders_under_cursor()`
- log `cur._hovered` state in `.ui._lines` hover handlers
- change `Dialog.cancel_orders()` to return `list[Dialog]`
- fix import: `Flume` from `.data.flows` vs `.data.feed`
- comment-out multi-status msgs in order submit/cancel

Also,
- convert all multiline kbd `if` conditionals to use `and`
  on separate lines for consistency
- move `import tractor` to top of `._interaction`
- change `print()` to `log.debug()` in `LevelLine`
- fix type annotation spacing: `Callable|None` vs `Callable | None`

(this commit msg was generated in some part by [`claude-code`][claude-code-gh])
[claude-code-gh]: https://github.com/anthropics/claude-code
2026-02-23 12:22:48 -05:00
Gud Boi 3b6484c340 .ui._app: enable console logging in `_async_main()`
Now we're actually emitting colored-logs (again?), not sure how this got
borked but maybe it's due to `tractor.log`'s new changes?

(this commit msg was generated in some part by [`claude-code`][claude-code-gh])
[claude-code-gh]: https://github.com/anthropics/claude-code
2026-02-23 12:22:48 -05:00
Gud Boi 6d896eeed2 .brokers._daemon: enable `tractor` log in `brokerd`
Also,
- capture `Actor.loglevel` in `tll` var for reuse (when `loglevel` is
  null) and pass `bool`-ed as new `with_tractor_log`-flag.
- add `with_tractor_log=bool(tll)` to `.get_console_log()`
- add assertion check for logger name.
- comment-out `tractor.trionics.collapse_eg()` context for now, pretty
  sure we don't need it and it just ends up adding extra logging
  overhead for no good reason (warnings on various `trio` internal
  cancelled-maskings, etc).
- change type annotation: `str|None` vs `str | None`.

(this commit msg was generated in some part by [`claude-code`][claude-code-gh])
[claude-code-gh]: https://github.com/anthropics/claude-code
2026-02-23 12:22:48 -05:00
Gud Boi bdedb16cdc Auto-enable `tractor` logging when runtime active
Check for active `tractor` runtime via `.current_actor()` and use its
`.loglevel` to auto-enable `tractor`'s internal console logging when
`with_tractor_log` is not explicitly set.

Deats,
- add `tll` (tractor log level) var to capture level
- check `with_tractor_log is not False` first
- fallback to `maybe_actor.loglevel` if runtime exists
- only call `tractor.log.get_console_log()` if `tll` set
- add TODO comment about "log-spec" style config support

(this commit msg was generated in some part by [`claude-code`][claude-code-gh])
[claude-code-gh]: https://github.com/anthropics/claude-code
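
A rough sketch of the detection logic (the actual `piker.log` code may
differ):

```python
import tractor

def maybe_tractor_loglevel(
    with_tractor_log: bool|str|None = None,
) -> str|None:
    # sketch: explicit opt-out wins
    if with_tractor_log is False:
        return None
    try:
        # reuse the active runtime's level when one exists
        return tractor.current_actor().loglevel
    except RuntimeError:
        # no `tractor` runtime in this process
        return None
```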
2026-02-23 12:22:48 -05:00
Gud Boi d8bfdd775c Adjust `tractor.log` API compat
Update logging helpers to use new `tractor.log` API with `pkg_name=`
kwarg and add optional `tractor` "root logger" enabling.

Deats,
- change `piker.log.get_logger()` to use `pkg_name=` vs `_root_name=`.
- add `**tractor_log_kwargs` passthrough to both wrapper fns.
- add `with_tractor_log: bool` toggle to `.get_console_log()`.
- strip `'piker.'` prefix from logger names when present to avoid
  newly added `tractor.get_logger()` warnings.

Surroundingly,
- add `subsys` import to `.clearing._ems` for log name
- update all `get_console_log()` calls to use `level=` kwarg
- add assertion checks for logger names in `_setup_persistent_emsd()`

Additionally,
- fix all type annotations: `str|None` vs `str | None`.
- add multiline style to conditional in `.cli.services()`.
- drop unused `Optional` import from `._actor_runtime`.
- drop a few "blank lines" in various function sigs.

Warning: this patch will require an equivalent dev-commit at the time of
writing in `tractor` itself, for now the `piker_pin` branch should be
sufficient to avoid breakage 🙏!

(this commit msg was generated in some part by [`claude-code`][claude-code-gh])
[claude-code-gh]: https://github.com/anthropics/claude-code
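
Rough shape of the wrapper (kwarg names taken from this msg, so treat as
approximate):

```python
import tractor

def get_logger(
    name: str|None = None,
    **tractor_log_kwargs,
):
    # strip the 'piker.' prefix to dodge the newly added
    # `tractor.log.get_logger()` warnings mentioned above.
    if name and name.startswith('piker.'):
        name = name.removeprefix('piker.')
    return tractor.log.get_logger(
        name=name,
        pkg_name='piker',
        **tractor_log_kwargs,
    )
```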

2026-02-23 12:22:48 -05:00
Gud Boi 73369fb1ef Merge pull request 'hist_backfill_fixes: working-around (some) conc issues in the tsdb backfiller' (#62)
Reviewed-on: #62
Reviewed-by: momo <dilarayalniz1@gmail.com>
2026-02-23 17:22:24 +00:00
Gud Boi 8dd969e85f Pin to min `xonsh` release for @goodboy needs 2026-02-23 12:17:04 -05:00
Gud Boi 90fce9fcd4 Woops, use `piker_pin` from GH for `tractor`
Also, install the `'repl'` deps-group by default to ensure we get the
extras required by `tractor` for non-`trio` task debug REPLin..
Bump lock file to match.
2026-02-22 23:37:32 -05:00
Gud Boi a97f6c8dcf Flip `.tsp._history` logger to explicit mod-name (again) 2026-02-22 22:08:35 -05:00
Gud Boi a940018721 Adjust binance stale-bar detection to 2x tolerance
Change the stale-bar check in `.binance.feed` from `timeframe` to
`timeframe * 2` tolerance to avoid false-positive pauses when bars
are slightly delayed but still within acceptable bounds.

Styling,
- add walrus operator to capture `_time_step` for debugger
  inspection.
- add comment explaining the debug purpose of this check.

(this commit msg was generated in some part by [`claude-code`][claude-code-gh])
[claude-code-gh]: https://github.com/anthropics/claude-code
2026-02-22 22:08:35 -05:00
Gud Boi 964f207150 Replace assert with warn for no-gaps in `.storage.cli`
Change `assert aids` to a warning log when no history gaps are found
during `ldshm` gap detection; it is the **ideal case** OBVI. This avoids
crashing the CLI when gap detection finds no issues, which is actually
good news!

Bp

(this commit msg was generated in some part by [`claude-code`][claude-code-gh])
[claude-code-gh]: https://github.com/anthropics/claude-code
2026-02-22 22:08:35 -05:00
Gud Boi fdb3999902 .tsp._history: add gap detection in backfill loop
Add frame-gap detection when `frame_last_dt < end_dt_param` to
warn about potential venue closures or missing data during the
backfill loop in `start_backfill()`.

Deats,
- add `frame_last_dt < end_dt_param` check after frame recv
- log warnings with EST-converted timestamps for clarity
- add `await tractor.pause()` for REPL-investigation on gaps
- add TODO comment about venue closure hour checking
- capture `_until_was_none` walrus var for null-check clarity
- add `last_time` assertion for `time[-1] == next_end_dt`
- rename `_daterr` to `nodata` with `_nodata` capture

Also,
- import `pendulum.timezone` and create `est` tz instance
- change `get_logger()` import from `.data._util` to `.log`
- add parens around `(next_prepend_index - ln) < 0` check

(this commit msg was generated in some part by [`claude-code`][claude-code-gh])
[claude-code-gh]: https://github.com/anthropics/claude-code
2026-02-22 22:08:35 -05:00
Gud Boi d1991b3300 Guard against `None` chart in `ArrowEditor.remove()`
Add null check for `linked.chart` before calling
`.plotItem.removeItem()` to prevent `AttributeError` when chart
is `None`.

(this commit msg was generated in some part by [`claude-code`][claude-code-gh])
[claude-code-gh]: https://github.com/anthropics/claude-code
2026-02-22 22:08:35 -05:00
Gud Boi b90edf95a7 .ib.feed: only set `feed_is_live` after first quote
Move `feed_is_live.set()` to after receiving the first valid
quote instead of setting early on venue-closed path. Prevents
sampler registration when no live data expected.

Also,
- drop redundant `.set()` call in quote iteration loop
- add TODO note about sleeping until venue opens vs forever
- init `first_quote: dict` early for consistency

(this commit msg was generated in some part by [`claude-code`][claude-code-gh])
[claude-code-gh]: https://github.com/anthropics/claude-code
2026-02-22 22:08:35 -05:00
Gud Boi ce3d8e7a1e Only register shms w sampler when `feed_is_live`
Add timeout-gated wait for `feed_is_live: trio.Event` before passing shm
tokens to `open_sample_stream()`; skip registering shm-buffers with the
sampler if the feed doesn't "go live" within a new timeout.

The main motivation here is to avoid the sampler incrementing shm-array
bufs when the mkt-venue is closed so that a trailing "same price"
line/bars isn't updated/rendered in the chart's view when unnecessary.

Deats,
- add `wait_for_live_timeout: float = 0.5` param to `manage_history()`
- warn-log the fqme when timeout triggers
- add error log for invalid `frame_start_dt` comparisons to
  `maybe_fill_null_segments()`.

(this commit msg was generated in some part by [`claude-code`][claude-code-gh])
[claude-code-gh]: https://github.com/anthropics/claude-code
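
The timeout-gate is presumably just a `trio.move_on_after()` around the
event wait; a sketch:

```python
import trio

async def wait_for_live(
    feed_is_live: trio.Event,
    wait_for_live_timeout: float = 0.5,
) -> bool:
    # sketch: don't register shm bufs with the sampler unless the feed
    # "goes live" within the timeout window.
    with trio.move_on_after(wait_for_live_timeout) as cs:
        await feed_is_live.wait()
    return not cs.cancelled_caught
```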
2026-02-22 22:08:35 -05:00
Gud Boi f2b04c4071 Clarify `register_with_sampler()` started type and vars
Markup `ctx.started()` type-sig as `set[int]`, rename binding var
`first` to `shm_periods` and add type hints for clarity on context mgr
unpacking.

Also,
- whitespace cleanup: `Type | None` -> `Type|None` throughout
- format long lines: `.setdefault()`, `await ctx.started()`
- fix backtick style in docstrings for consistency
- add placeholder TODO comment for `feed_is_live` check; it might be
  more rigorous to pass the syncing state down thru all this?

(this commit msg was generated in some part by [`claude-code`][claude-code-gh])
[claude-code-gh]: https://github.com/anthropics/claude-code
2026-02-22 22:08:35 -05:00
Gud Boi e75c3d8a34 Ignore single-zero-sample trace on no runtime.. 2026-02-22 22:08:35 -05:00
Gud Boi be4adfc202 ib.feed: drop legacy "quote-with-vlm" polling
Since we now explicitly check each mkt's venue hours we don't need
this mega hacky "waiting on a quote with real vlm" stuff to determine
whether historical data should be loaded immediately. This approach also
had the added complexity that we needed to handle edge cases for tickers
(like xauusd.cmdty) which never have vlm.. so it's nice to be rid of it
all ;p
2026-02-22 22:08:35 -05:00
Gud Boi 763faa0cc1 Always overwrite tsdb duplicates found during backfill
Enable the previously commented-out dedupe-and-write logic in
`start_backfill()` to ensure tsdb stays clean of duplicate
entries.

(this patch was generated in some part by [`claude-code`][claude-code-gh])
[claude-code-gh]: https://github.com/anthropics/claude-code
2026-02-22 22:08:35 -05:00
Gud Boi 0f1f2e263d For claude, ignore no runtime for offline shm reading 2026-02-22 22:08:35 -05:00
Gud Boi fd92cd99c2 .ib._util: ignore attr err on click-hack twm wakeups? 2026-02-22 22:08:35 -05:00
Gud Boi 8c2fd7c780 Use `get_fonts()`, add `show_txt` flag to gap annots
Switch `.tsp._annotate.markup_gaps()` to use new
`.ui._style.get_fonts()` API for font size calc on client side and add
optional `show_txt: bool` flag to toggle gap duration labels (with
default `False`).

Also,
- replace `sgn` checks with named bools: `up_gap`, `down_gap`
- use `small_font.px_size - 1` for gap label font sizing
- wrap text creation in `if show_txt:` block
- update IPC handler to use `get_fonts()` vs direct `_font` import

(this commit msg was generated in some part by [`claude-code`][claude-code-gh])
[claude-code-gh]: https://github.com/anthropics/claude-code
2026-02-22 22:08:35 -05:00
Gud Boi 1016f54c98 Add `get_fonts()` API and fix `.px_size` for non-Qt ctxs
Add a public `.ui._style.get_fonts()` helper to retrieve the
`_font[_small]: DpiAwareFont` singleton pair. Adjust
`DpiAwareFont.px_size` to return `conf.toml` value when Qt returns `-1`
(no active Qt app).

Also,
- raise `ValueError` with detailed msg if both Qt and a conf-lookup fail
- add some more type union whitespace cleanups: `int | None` -> `int|None`

(this commit-msg was generated in some part by [`claude-code`][claude-code-gh])
[claude-code-gh]: https://github.com/anthropics/claude-code
2026-02-22 22:08:35 -05:00
Gud Boi b4944916c9 Relay annot creation failures with err-dict resps
Change annot-ctl APIs to return `None` on failure instead of invalid
`aid`s. Server now sends `{'error': msg}` dict on failures, client
match-blocks handle gracefully.

Also,
- update return types: `.add_rect()`, `.add_arrow()`, `.add_text()`
  now return `int|None`
- match on `{'error': str(msg)}` in client IPC receive blocks
- send error dicts from server on timestamp lookup failures
- add failure handling in `markup_gaps()` to skip bad rects

(this commit msg was generated in some part by [`claude-code`][claude-code-gh])
[claude-code-gh]: https://github.com/anthropics/claude-code
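
Client side this presumably becomes a `match` along these lines
(`ipc`/`log` stand in for the real objects):

```python
import logging

log = logging.getLogger(__name__)

async def recv_aid(ipc) -> int|None:
    # sketch of the error-dict vs. valid-aid handling described above
    match await ipc.receive():
        case {'error': str(msg)}:
            log.warning(f'annot creation failed,\n{msg}')
            return None
        case int(aid):
            return aid
```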
2026-02-22 22:08:35 -05:00
Gud Boi 3f001cc1f6 Do time-based shm-index lookup for annots on server
Fix annotation misalignment during backfill by switching from
client-computed indices to server-side timestamp lookups against
current shm state. Store absolute coords on annotations and
reposition on viz redraws.

Lowlevel impl deats,
- add `time` param to `.add_arrow()`, `.add_text()`, `.add_rect()`
- lookup indices from shm via timestamp matching in IPC handlers
- force chart redraw before `markup_gaps()` annotation creation
- wrap IPC send/receive in `trio.fail_after(3)` for timeout when
  server fails to respond, likely hangs on no-case-match/error.
- cache `_meth`/`_kwargs` on rects, `_abs_x`/`_abs_y` on arrows
- auto-reposition all annotations after viz reset in redraw cmd

Also,
- handle `KeyError` for missing timeframes in chart lookup
- return `-1` aid on annotation creation failures (lol oh `claude`..)
- reconstruct rect positions from timestamps + BGM offset logic
- log repositioned annotation counts on viz redraw

(this patch was generated in some part by [`claude-code`][claude-code-gh])
[claude-code-gh]: https://github.com/anthropics/claude-code
2026-02-22 22:08:35 -05:00
Gud Boi 0845b257d9 Add buffer capacity checks to backfill loop
Prevent `ValueError` from negative prepend index in
`start_backfill()` by checking buffer space before push
attempts. Truncate incoming frame if needed and stop gracefully
when buffer full.

Also,
- add pre-push capacity check with frame truncation logic
- stop backfill when `next_prepend_index <= 0`
- log warnings for capacity exceeded and buffer-full conditions

(this patch was generated in some part by [`claude-code`][claude-code-gh])
[claude-code-gh]: https://github.com/anthropics/claude-code
2026-02-22 22:08:35 -05:00
Gud Boi 7964cc3cf4 Drop decimal points for whole-number durations
Adjust `humanize_duration()` to show "3h" instead of "3.0h" when the
duration value is a whole number, making labels cleaner.

(this patch was generated in some part by [`claude-code`][claude-code-gh])
[claude-code-gh]: https://github.com/anthropics/claude-code
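
A sketch of the intended output behaviour (the unit table is an
assumption):

```python
def humanize_duration(seconds: float) -> str:
    # pick the largest fitting unit and drop the decimal when the
    # value is whole, eg. 10800 -> '3h' (not '3.0h').
    for unit, step in (('d', 86400), ('h', 3600), ('m', 60), ('s', 1)):
        if seconds >= step:
            val = seconds / step
            if val == int(val):
                return f'{int(val)}{unit}'
            return f'{val:.1f}{unit}'
    return '0s'
```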
2026-02-22 22:08:35 -05:00
Gud Boi 4e7e4a7a1b Add `font_size` param to `AnnotCtl.add_text()` API
Expose font sizing control for `pg.TextItem` annotations thru the
annot-ctl API. Default to `_font.font.pixelSize() - 3` when no
size provided.

Also,
- thread `font_size` param thru IPC handler in `serve_rc_annots()`
- apply font via `QFont.setPixelSize()` on text item creation
- add `?TODO` note in `markup_gaps()` re using `conf.toml` value
- update `add_text()` docstring with font_size param desc

(this patch was generated in some part by [`claude-code`][claude-code-gh])
[claude-code-gh]: https://github.com/anthropics/claude-code
2026-02-22 22:08:35 -05:00
Gud Boi fe11f79f21 Add humanized duration labels to gap annotations
Introduce `humanize_duration()` helper in `.tsp._annotate` to
convert seconds to short human-readable format (d/h/m/s). Extend
annot-ctl API with `add_text()` method for placing `pg.TextItem`
labels on charts.

Also,
- add duration labels on RHS of gap arrows in `markup_gaps()`
- handle text item removal in `rm_annot()` match block
- expose `TextItem` cmd in `serve_rc_annots()` IPC handler
- use `hcolor()` for named-to-hex color conversion
- set anchor positioning for up vs down gaps

(this patch was generated in some part by [`claude-code`][claude-code-gh])
[claude-code-gh]: https://github.com/anthropics/claude-code
2026-02-22 22:08:35 -05:00
Gud Boi 40cbc8546d .ib.feed: trim bars frame to `start_dt` 2026-02-22 22:08:35 -05:00
Gud Boi e3d7077f18 ib._util: ignore timeout-errs when crash-handling `pyvnc` connects 2026-02-22 22:08:35 -05:00
Gud Boi 7ddcf5893e Lul, woops compare against first-dt in `.ib.feed` bars frame.. 2026-02-22 22:08:35 -05:00
Gud Boi b1d6c595ec Expose more `pg.ArrowItem` params thru annot-ctl API 2026-02-22 22:08:35 -05:00
Gud Boi 33ec37a83f Add `pexpect`, `xonsh`@github:main to deps
The former bc `claude` needs it for its new "offline" REPL simulator
script `snippets/claude_debug_helper.py` and pin to `xonsh` git mainline
to get the fancy new next cmd/suggestion prompt feats (which @goodboy is
using from `modden` already). Bump lock file to match.

Ah right, and for now while hackin pin to a local `tractor` Bp
2026-02-22 22:08:35 -05:00
Gud Boi e6c7834a01 Add break for single bar null segments 2026-02-22 22:08:35 -05:00
Gud Boi 4ef5a5beb8 Space gap rect-annots "between" start-end bars 2026-02-22 22:08:35 -05:00
Gud Boi 11e95d9cbf Catch too-early ib hist frames
For now by REPLing them and raising an RTE inside `.ib.feed` as well as
tracing any such cases that make it (from other providers) up to the
`.tsp._history` layer during null-segment backfilling.
2026-02-22 22:08:34 -05:00
Gud Boi 51ca9cd4d9 Add arrow indicators to time gaps
Such that they're easier to spot when zoomed out, a similar color to the
`RectItem`s and also remote-controlled via the `AnnotCtl` api.

Deats,
- request an arrow per gap from `markup_gaps()` using a new
  `.add_arrow()` meth, set the color, direction and alpha with
  position always as the `iend`/close of the last valid bar.
- extend the `.ui._remote_ctl` subsys to support the above,
  * add a new `AnnotCtl.add_arrow()`.
  * add the service-side IPC endpoint for a 'cmd': 'ArrowEditor'.
- add a new `rm_annot()` helper to ensure the right graphics removal
  API is used by annotation type:
  * `pg.ArrowItem` looks up the `ArrowEditor` and calls `.remove(annot)`.
  * `pg.SelectRect` keeps with calling `.delete()`.
- global-ize an `_editors` table to enable the prior.
- add an explicit RTE for races on the chart-actor's `_dss` init.
2026-02-22 22:08:34 -05:00
Gud Boi 27d077ade5 Arrow editor refinements in prep for gap checker
Namely exposing `ArrowEditor.add()` params to provide access to
coloring/transparency settings over the remote-ctl annotation API and
also adding a new `.remove_all()` to easily clear all arrows from
a single call. Also add `.remove()` compat methods to the other editors
(i.e. for lines, rects).
2026-02-22 22:08:34 -05:00
Gud Boi 9257af02b9 Mv `markup_gaps()` to new `.tsp._annotate` mod 2026-02-22 22:08:34 -05:00
Gud Boi 582f9be02f Enable tracing back insert backfills
Namely insertion writes which over-fill the shm buffer past the latest
tsdb sample via `.tsp._history.shm_push_in_between()`.

Deats,
- check earliest `to_push` timestamp and enter pause point if it's
  earlier than the tsdb's `backfill_until_dt` stamp.
- requires actually passing the `backfill_until_dt: datetime` thru,
  * `get_null_segs()`
  * `maybe_fill_null_segments()`
  * `shm_push_in_between()` (obvi XD)
2026-02-22 22:08:34 -05:00
Gud Boi 5f6e24f55c Tolerate various "bad data" cases in `markup_gaps()`
Namely such that when the previous-df-row by our shm-abs-'index' doesn't
exist we ignore certain cases which are likely due to borked-but-benign
samples written to the tsdb or rt shm buffers prior.

Particularly we now ignore,
- any `dt`/`prev_dt` values which are UNIX-epoch timestamped (val of 0).
- any row-is-first-row in the df; there is no previous.
- any missing previous datum by 'index', in which case we lookup the
  `wdts` prior row and use that instead.
  * this would indicate a missing sample for the time-step but we can
    still detect a "gap" by looking at the prior row, by df-abs-index
    `i`, and use its timestamp to determine the period/size of missing
    samples (which need to likely still be retrieved).
  * in this case i'm leaving in a pause-point for introspecting these
    rarer cases when `--pdb` is passed via CLI.

Relatedly in the `piker store` CLI ep,
- add `--pdb` flag to `piker store`, pass it verbatim as `debug_mode`.
- when `times` has only a single row, don't calc a `period_s` median.
- only trace `null_segs` when in debug mode.
- always markup/dedupe gaps for `period_s==60`
2026-02-22 22:08:34 -05:00
Gud Boi bd418078ca ib: up API timeout default for remote host conns 2026-02-22 22:08:34 -05:00
Gud Boi d5af471192 Add vlm-based "smart" OHLCV de-duping & bar validation
Using `claude`, add a `.tsp._dedupe_smart` module that attempts
"smarter" duplicate-bar handling by distinguishing between erroneous
bars partially written during concurrent backfill race conditions vs.
**actual** data quality issues from historical providers.

Problem:
--------
Concurrent writes (live updates vs. backfilling) can result in create
duplicate timestamped ohlcv vars with different values. Some
potential scenarios include,

- a market live feed is cancelled during live update resulting in the
  "last" datum being partially updated with all the ticks for the
  time step.
- when the feed is rebooted during charting, the backfiller will not
  finalize this bar since rn it presumes it should only fill data for
  time steps not already in the tsdb storage.

Our current naive `.unique()` approach obvi keeps the incomplete bar
and a "smarter" approach is to compare the provider's final vlm
amount vs. the maybe-cancelled tsdb's bar; a higher vlm value from
the provider likely indicates the cancelled-during-live-write and
**not** a datum discrepancy from said data provider.

Analysis (with `claude`) of `zecusdt` data revealed:
- 1000 duplicate timestamps
- 999 identical bars (pure duplicates from 2022 backfill overlap)
- 1 volume-monotonic conflict (live partial vs backfill complete)

A soln from `claude` -> `tsp._dedupe_smart.dedupe_ohlcv_smart()`
which:
- sorts by vlm **before** deduplication and keep the most complete
  bar based on vlm monotonicity as well as the following OHLCV
  validation assumptions:
  * volume should always increase
  * high should be non-decreasing,
  * low should be non-increasing
  * open should be identical
- Separates valid race conditions from provider data quality issues
  and reports and returns both dfs.

Change summary by `claude`:
- `.tsp._dedupe_smart`: new module with validation logic
- `.tsp.__init__`: expose `dedupe_ohlcv_smart()`
- `.storage.cli`: integrate smart dedupe, add logging for:
  * duplicate counts (identical vs monotonic races)
  * data quality violations (non-monotonic, invalid OHLC ranges)
  * warnings for provider data issues
- Remove `assert not diff` (duplicates are valid now)

Verified on `zecusdt`: correctly keeps index 3143645
(volume=287.777) over 3143644 (volume=140.299) for
conflicting 2026-01-16 18:54 UTC bar.

`claude`'s Summary of reasoning
-------------------------------
- volume monotonicity is critical: a bar's volume only increases
  during its time window.
- a backfilled bar should always have volume >= live updated.
- violations indicate any of:
  * Provider data corruption
  * Non-OHLCV aggregation semantics
  * Timestamp misalignment

(this patch was generated in some part by [`claude-code`][claude-code-gh])
[claude-code-gh]: https://github.com/anthropics/claude-code
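
The core of the dedupe (ignoring the validation/reporting split) is
likely just a vlm-ordered `unique()`; a sketch with assumed column names:

```python
import polars as pl

def dedupe_ohlcv_smart(df: pl.DataFrame) -> pl.DataFrame:
    # for duplicate timestamps keep the highest-volume bar, presuming
    # a completed/backfilled bar's vlm >= a partially-written live one.
    return (
        df
        .sort(['time', 'volume'])
        .unique(subset=['time'], keep='last')
        .sort('time')
    )
```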
2026-02-22 22:08:34 -05:00
Gud Boi 6c28b1cbbc Add `pexpect`-based `pdbp`-REPL offline helper
Add a new `snippets/claude_debug_helper.py` to
provide a programmatic interface to `tractor.pause()` debugger
sessions for incremental data inspection matching the interactive UX
but able to be run by `claude` "offline" since it can't seem to feed
stdin (so it claims) to the `pdb` instance due to lack of ability to
allocate a tty internally.

The script-wrapper is based on `tractor`'s `tests/devx/` suite's use of
`pexpect` patterns for driving `pdbp` prompts and thus enables
automated-offline execution of REPL-inspection commands **without**
using incremental-realtime output capture (like a human would use it).

Features:
- `run_pdb_commands()`: batch command execution
- `InteractivePdbSession`: context manager for step-by-step REPL interaction
- `expect()` wrapper: timeout handling with buffer display
- Proper stdin/stdout handling via `pexpect.spawn()`

Example usage:
```python
from debug_helper import InteractivePdbSession

with InteractivePdbSession(
    cmd='piker store ldshm zecusdt.usdtm.perp.binance'
) as session:
    session.run('deduped.shape')
    session.run('step_gaps.shape')
```

(this patch was generated in some part by [`claude-code`][claude-code-gh])
[claude-code-gh]: https://github.com/anthropics/claude-code
2026-02-22 22:08:34 -05:00
Gud Boi 8af8ac4f7b Fix polars 1.36.0 duration API
Polars tightened type safety for `.dt` accessor methods requiring
`total_*` methods for duration types vs datetime component accessors
like `day()` which now only work on datetime dtypes.

`detect_time_gaps()` in `.tsp._anal` was calling `.dt.day()`
on `dt_diff` column (a duration from `.diff()`) which throws
`InvalidOperationError` on modern polars.

Changes:
- use f-string to add pluralization to map time unit strings to
  `total_<unit>s` form for the new duration API.
- Handle singular/plural forms: 'day' -> 'days' -> 'total_days'
- Ensure trailing 's' before applying 'total_' prefix

Also updates inline comments explaining the polars type distinction
between datetime components vs duration totals.

Fixes `piker store ldshm` crashes on datasets with time gaps.

(this patch was generated in some part by [`claude-code`][claude-code-gh])
[claude-code-gh]: https://github.com/anthropics/claude-code
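
The unit-string mapping presumably ends up something like this (column
and unit names are placeholders):

```python
import polars as pl

def gap_duration_expr(unit: str) -> pl.Expr:
    # 'day' -> 'days' -> `.dt.total_days()`, matching the duration-only
    # `total_*` accessors in modern polars.
    unit = unit if unit.endswith('s') else f'{unit}s'
    return getattr(pl.col('dt_diff').dt, f'total_{unit}')()
```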
2026-02-22 22:08:34 -05:00
Tyler Goodlet f4d9090d6d `.storage.__init__`: code styling updates 2026-02-22 22:08:34 -05:00
Tyler Goodlet b0953ecbee `.tsp._history`: drop `feed_is_live` syncing, another seg flag
The `await feed_is_live.wait()` is more or less pointless and would only
cause slower startup afaig (as-far-as-i-grok) so i'm masking it here.
This also removes the final `strict_exception_groups=False` use from the
non-tests code base, flipping to the `tractor.trionics` collapser once
and for all!
2026-02-22 22:08:34 -05:00
Tyler Goodlet b5e4c83341 Woops, keep `np2pl` exposed from `.tsp` 2026-02-22 22:08:34 -05:00
Tyler Goodlet 0ef98ba25c Factor to a new `.tsp._history` sub-mod
Cleaning out the `piker.tsp` pkg-mod to be only the (re)exports needed
for `._anal`/`._history` refs-use elsewhere!
2026-02-22 22:08:34 -05:00
Gud Boi 6b70fea5d4 Merge pull request 'Tpt-tolerance adjustments for latest `tractor`' (#73)
Reviewed-on: #73
2026-02-23 03:08:18 +00:00
Gud Boi 4e24cb1bff Adjust sampler's "IPC-dropped" log msg styling
Refmt the "connection-dropped" error-log in `Sampler`'s broadcast loop
to show error type first, then the IPC context details; mks it all
easier to grok/less-noisy on console imo.

(this commit msg was generated in some part by [`claude-code`][claude-code-gh])
[claude-code-gh]: https://github.com/anthropics/claude-code
2026-02-22 20:08:20 -05:00
Gud Boi 3d83b61f3f Wrap `open_autorecon_ws()` body for comms failures
Add outer `try/except` around the nursery block in
`open_autorecon_ws()` to catch any `NoBsWs.recon_errors` that
escape the inner reconnect loop, logging a warning instead of
propagating.

Also,
- correct `NoBsWs.recon_errors` typing to `tuple[Type[Exception]]`.

(this commit msg was generated in some part by [`claude-code`][claude-code-gh])
[claude-code-gh]: https://github.com/anthropics/claude-code
2026-02-22 20:08:20 -05:00
Gud Boi 6f390dc88c Add timeout + shielding to `NoBsWs` reconnect logic
Add timeout param to `.reset()` and `.send_msg()` to prevent
indefinite blocking on reconnect attempts. Shield reconnect
sleeps from cancellation to ensure we avoid any "finally footgun" type
scenarios where `trio.Cancelled` masks an underlying exc per,
- https://github.com/goodboy/tractor/pull/387
- https://github.com/goodboy/tractor/pull/391

Deats,
- add `timeout` param to `.reset()`, return `bool` for success
- add `timeout=3` default to `.send_msg()` for reconnect wait
- shield `.reset()` call in `.send_msg()` error handler
- log warning when reconnect timeout exceeded
- shield throttled sleeps in `_reconnect_forever()` error paths

(this commit msg was generated in some part by [`claude-code`][claude-code-gh])
[claude-code-gh]: https://github.com/anthropics/claude-code
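
Sketch of the two pieces (shielded backoff + bounded reset wait); the
real `NoBsWs` internals will differ:

```python
import trio

async def shielded_backoff(delay: float) -> None:
    # don't let an external cancel interrupt the reconnect sleep and
    # mask the original transport error ("finally footgun").
    with trio.CancelScope(shield=True):
        await trio.sleep(delay)

async def reset_with_timeout(reset_fn, timeout: float = 3) -> bool:
    # bound the reconnect wait; report success vs. timeout to the caller
    with trio.move_on_after(timeout) as cs:
        await reset_fn()
    return not cs.cancelled_caught
```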
2026-02-22 20:08:20 -05:00
Gud Boi e1f3d7c3f8 Handle `tractor.TransportClosed` as "stream-closed"
In both the ems and sampler since on new `tractor` this is the
"wrapping" exception raised when the transport layer terminates early
but in a psuedo-"graceful" way, expected when a peer actors disconnect.
Previously we were crashing in this case since old `tractor` just raised
the underlying `trio`-source-exceptions verbatim.

Also,
- use `Aid.reprol()` in log msgs vs old `.chan.uid` refs

(this commit msg was generated in some part by [`claude-code`][claude-code-gh])
[claude-code-gh]: https://github.com/anthropics/claude-code
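
Handling-wise this amounts to catching the new wrapper exc; a sketch:

```python
import logging
import tractor

log = logging.getLogger(__name__)

async def broadcast_loop(stream) -> None:
    # treat a closed transport as a (pseudo-)graceful peer-disconnect
    # instead of crashing the broadcast task.
    try:
        async for msg in stream:
            ...  # relay to subscribers
    except tractor.TransportClosed:
        log.warning('IPC transport closed by peer, dropping sub')
```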
2026-02-22 20:08:20 -05:00
Gud Boi 600636784c Merge pull request 'tractor_struct_and_godw_mod' (#72)
Reviewed-on: #72
2026-02-22 23:39:10 +00:00
Gud Boi 0b63a73954 Move `GodWidget` to new `._widget` mod
Extract root-most widget to resolve (various) `.ui` import cycles
when the type is declared on `Struct`s..

Deats,
- flip to `from ._widget import GodWidget`.
- move `Feed` + `Flume` imports to TYPE_CHECKING in `._chart`
- drop unused `trio` import from `._chart`
- fix docstring typo: "datums```" -> "`datums``"
- change `print()` to `log.warning()` for global step msg

(this commit msg was generated in some part by [`claude-code`][claude-code-gh])
[claude-code-gh]: https://github.com/anthropics/claude-code
2026-02-22 16:08:07 -05:00
Gud Boi 8fb47f761a Point `.types.Struct` to `tractor.msg.pretty_struct`
Drop the local (and original) `Struct` impl from `piker.types` in favour
of `tractor`'s version now that it's been upstreamed.

(this commit msg was generated in some part by [`claude-code`][claude-code-gh])
[claude-code-gh]: https://github.com/anthropics/claude-code
2026-02-22 16:07:53 -05:00
Gud Boi ad37ebabb2 Cleanups and doc tweaks to `.ui._fsp`
Expand read-race warning log for clarity, add TODO for reading
`tractor` transport config from `conf.toml`, and reflow docstring
in `open_vlm_displays()`.

Also,
- whitespace cleanup: `Type | None` -> `Type|None`
- clarify "Volume" -> "Vlm (volume)" in docstr

(this commit msg was generated in some part by [`claude-code`][claude-code-gh])
[claude-code-gh]: https://github.com/anthropics/claude-code
2026-02-22 16:06:54 -05:00
Gud Boi 5020266bd5 Add `get_godw()` singleton getter for `GodWidget`
Expose `get_godw()` helper to retrieve the central `GodWidget`
instance from anywhere in the UI code. Set the singleton in
`_async_main()` on startup.

Also,
- add docstring to `run_qtractor()` explaining trio guest mode
- type annotate `instance: GodWidget` in `run_qtractor()`
- import reorg in `._app` for cleaner grouping
- whitespace cleanup: `Type | None` -> `Type|None` throughout
- fix bitwise-or alignment: `Flag | Other` -> `Flag|Other`

(this commit-msg was generated in some part by [`claude-code`][claude-code-gh])
[claude-code-gh]: https://github.com/anthropics/claude-code
2026-02-22 16:06:28 -05:00
Gud Boi d6a56d87bf Rm unused import in `.ui._curve` 2026-02-22 16:05:52 -05:00
Gud Boi e8152b8534 Add a couple cooler "cooler"/"muted" red and greens 2026-02-22 16:04:00 -05:00
Gud Boi bb81c74353 .ui.order_mode: multiline import styling 2026-02-22 16:03:42 -05:00
Gud Boi 7eaf28479c Fix `Qt6` types for new sub-namespaces 2026-02-22 16:03:19 -05:00
51 changed files with 607 additions and 1627 deletions


@ -19,8 +19,10 @@
for tendiez.
'''
from ..log import get_logger
from piker.log import (
get_console_log,
get_logger,
)
from .calc import (
iter_by_dt,
)
@ -51,7 +53,17 @@ from ._allocate import (
log = get_logger(__name__)
# ?TODO, enable console on import
# [ ] necessary? or `open_brokerd_dialog()` doing it is sufficient?
#
# bc might as well enable whenev imported by
# other sub-sys code (namely `.clearing`).
get_console_log(
level='warning',
name=__name__,
)
# TODO, the `as <samename>` style?
__all__ = [
'Account',
'Allocator',


@ -60,12 +60,16 @@ from ..clearing._messages import (
BrokerdPosition,
)
from piker.types import Struct
from piker.log import get_logger
from piker.log import (
get_logger,
)
if TYPE_CHECKING:
from piker.data._symcache import SymbologyCache
log = get_logger(__name__)
log = get_logger(
name=__name__,
)
class Position(Struct):


@ -21,7 +21,6 @@ CLI front end for trades ledger and position tracking management.
from __future__ import annotations
from pprint import pformat
from rich.console import Console
from rich.markdown import Markdown
import polars as pl
@ -29,7 +28,10 @@ import tractor
import trio
import typer
from ..log import get_logger
from piker.log import (
get_console_log,
get_logger,
)
from ..service import (
open_piker_runtime,
)
@ -45,6 +47,7 @@ from .calc import (
open_ledger_dfs,
)
log = get_logger(name=__name__)
ledger = typer.Typer()
@ -79,7 +82,10 @@ def sync(
"-l",
),
):
log = get_logger(loglevel)
log = get_console_log(
level=loglevel,
name=__name__,
)
console = Console()
pair: tuple[str, str]


@ -25,15 +25,16 @@ from types import ModuleType
from tractor.trionics import maybe_open_context
from piker.log import (
get_logger,
)
from ._util import (
log,
BrokerError,
SymbolNotFound,
NoData,
DataUnavailable,
DataThrottle,
resproc,
get_logger,
)
__all__: list[str] = [
@ -43,7 +44,6 @@ __all__: list[str] = [
'DataUnavailable',
'DataThrottle',
'resproc',
'get_logger',
]
__brokers__: list[str] = [
@ -65,6 +65,10 @@ __brokers__: list[str] = [
# bitso
]
log = get_logger(
name=__name__,
)
def get_brokermod(brokername: str) -> ModuleType:
'''


@ -33,12 +33,18 @@ import exceptiongroup as eg
import tractor
import trio
from piker.log import (
get_logger,
get_console_log,
)
from . import _util
from . import get_brokermod
if TYPE_CHECKING:
from ..data import _FeedsBus
log = get_logger(name=__name__)
# `brokerd` enabled modules
# TODO: move this def to the `.data` subpkg..
# NOTE: keeping this list as small as possible is part of our caps-sec
@ -74,16 +80,13 @@ async def _setup_persistent_brokerd(
# any further (level) configuration on their own B)
actor: tractor.Actor = tractor.current_actor()
tll: str = actor.loglevel
log = _util.get_console_log(
log = get_console_log(
level=loglevel or tll,
name=f'{_util.subsys}.{brokername}',
with_tractor_log=bool(tll),
)
assert log.name == _util.subsys
# set global for this actor to this new process-wide instance B)
_util.log = log
# further, set the log level on any broker broker specific
# logger instance.
@ -197,7 +200,6 @@ def broker_init(
async def spawn_brokerd(
brokername: str,
loglevel: str | None = None,
@ -205,8 +207,10 @@ async def spawn_brokerd(
) -> bool:
from piker.service._util import log # use service mngr log
log.info(f'Spawning {brokername} broker daemon')
log.info(
f'Spawning broker-daemon,\n'
f'backend: {brokername!r}'
)
(
brokermode,
@ -269,8 +273,7 @@ async def maybe_spawn_brokerd(
from piker.service import maybe_spawn_daemon
async with maybe_spawn_daemon(
f'brokerd.{brokername}',
service_name=f'brokerd.{brokername}',
service_task_target=spawn_brokerd,
spawn_args={
'brokername': brokername,


@ -19,15 +19,13 @@ Handy cross-broker utils.
"""
from __future__ import annotations
from functools import partial
# from functools import partial
import json
import httpx
import logging
from ..log import (
get_logger,
get_console_log,
from piker.log import (
colorize_json,
)
subsys: str = 'piker.brokers'
@ -35,12 +33,22 @@ subsys: str = 'piker.brokers'
# NOTE: level should be reset by any actor that is spawned
# as well as given a (more) explicit name/key such
# as `piker.brokers.binance` matching the subpkg.
log = get_logger(subsys)
# log = get_logger(subsys)
get_console_log = partial(
get_console_log,
name=subsys,
)
# ?TODO?? we could use this approach, but we need to be able
# to pass multiple `name=` values so for example we can include the
# emissions in `.accounting._pos` and others!
# [ ] maybe we could do the `log = get_logger()` above,
# then cycle through the list of subsys mods we depend on
# and then get all their loggers and pass them to
# `get_console_log(logger=)`??
# [ ] OR just write THIS `get_console_log()` as a hook which does
# that based on who calls it?.. i dunno
#
# get_console_log = partial(
# get_console_log,
# name=subsys,
# )
class BrokerError(Exception):


@ -37,8 +37,9 @@ import trio
from piker.accounting import (
Asset,
)
from piker.brokers._util import (
from piker.log import (
get_logger,
get_console_log,
)
from piker.data._web_bs import (
open_autorecon_ws,
@ -69,7 +70,9 @@ from .venues import (
)
from .api import Client
log = get_logger('piker.brokers.binance')
log = get_logger(
name=__name__,
)
# Fee schedule template, mostly for paper engine fees modelling.
@ -245,9 +248,16 @@ async def handle_order_requests(
@tractor.context
async def open_trade_dialog(
ctx: tractor.Context,
loglevel: str = 'warning',
) -> AsyncIterator[dict[str, Any]]:
# enable piker.clearing console log for *this* `brokerd` subactor
get_console_log(
level=loglevel,
name=__name__,
)
# TODO: how do we set this from the EMS such that
# positions are loaded from the correct venue on the user
# stream at startup? (that is in an attempt to support both


@ -64,9 +64,9 @@ from piker.data._web_bs import (
open_autorecon_ws,
NoBsWs,
)
from piker.log import get_logger
from piker.brokers._util import (
DataUnavailable,
get_logger,
)
from .api import (
@ -78,7 +78,7 @@ from .venues import (
get_api_eps,
)
log = get_logger('piker.brokers.binance')
log = get_logger(name=__name__)
class L1(Struct):
@ -275,12 +275,14 @@ async def open_history_client(
f'{times}'
)
# XXX, debug any case where the latest 1m bar we get is
# already another "sample's-step-old"..
if end_dt is None:
inow: int = round(time.time())
if (
(inow - times[-1])
_time_step := (inow - times[-1])
>
timeframe
timeframe * 2
):
await tractor.pause()


@ -27,14 +27,12 @@ import click
import trio
import tractor
from ..cli import cli
from .. import watchlists as wl
from ..log import (
from piker.cli import cli
from piker import watchlists as wl
from piker.log import (
colorize_json,
)
from ._util import (
log,
get_console_log,
get_logger,
)
from ..service import (
maybe_spawn_brokerd,
@ -45,12 +43,15 @@ from ..brokers import (
get_brokermod,
data,
)
DEFAULT_BROKER = 'binance'
log = get_logger(
name=__name__,
)
DEFAULT_BROKER = 'binance'
_config_dir = click.get_app_dir('piker')
_watchlists_data_path = os.path.join(_config_dir, 'watchlists.json')
OK = '\033[92m'
WARNING = '\033[93m'
FAIL = '\033[91m'
@ -345,7 +346,10 @@ def contracts(ctx, loglevel, broker, symbol, ids):
'''
brokermod = get_brokermod(broker)
get_console_log(loglevel)
get_console_log(
level=loglevel,
name=__name__,
)
contracts = trio.run(partial(core.contracts, brokermod, symbol))
if not ids:
@ -477,11 +481,12 @@ def search(
# the `piker --pdb` XD ..
# -[ ] pull from the parent click ctx's values..dumdum
# assert pdb
loglevel: str = config['loglevel']
# define tractor entrypoint
async def main(func):
async with maybe_open_pikerd(
loglevel=config['loglevel'],
loglevel=loglevel,
debug_mode=pdb,
):
return await func()
@ -494,6 +499,7 @@ def search(
core.symbol_search,
brokermods,
pattern,
loglevel=loglevel,
),
)


@ -28,12 +28,14 @@ from typing import (
import trio
from ._util import log
from piker.log import get_logger
from . import get_brokermod
from ..service import maybe_spawn_brokerd
from . import open_cached_client
from ..accounting import MktPair
log = get_logger(name=__name__)
async def api(brokername: str, methname: str, **kwargs) -> dict:
'''
@ -147,6 +149,7 @@ async def search_w_brokerd(
async def symbol_search(
brokermods: list[ModuleType],
pattern: str,
loglevel: str = 'warning',
**kwargs,
) -> dict[str, dict[str, dict[str, Any]]]:
@ -176,6 +179,7 @@ async def symbol_search(
'_infect_asyncio',
False,
),
loglevel=loglevel
) as portal:
results.append((

View File

@ -41,12 +41,15 @@ import tractor
from tractor.experimental import msgpub
from async_generator import asynccontextmanager
from ._util import (
log,
from piker.log import(
get_logger,
get_console_log,
)
from . import get_brokermod
log = get_logger(
name='piker.brokers.binance',
)
async def wait_for_network(
net_func: Callable,
@ -243,7 +246,10 @@ async def start_quote_stream(
'''
# XXX: why do we need this again?
get_console_log(tractor.current_actor().loglevel)
get_console_log(
level=tractor.current_actor().loglevel,
name=__name__,
)
# pull global vars from local actor
symbols = list(symbols)

View File

@ -34,13 +34,13 @@ import subprocess
import tractor
from piker.brokers._util import get_logger
from piker.log import get_logger
if TYPE_CHECKING:
from .api import Client
import i3ipc
log = get_logger('piker.brokers.ib')
log = get_logger(name=__name__)
_reset_tech: Literal[
'vnc',

View File

@ -50,6 +50,10 @@ from ib_insync.objects import (
)
from piker import config
from piker.log import (
get_logger,
get_console_log,
)
from piker.types import Struct
from piker.accounting import (
Position,
@ -77,7 +81,6 @@ from piker.clearing._messages import (
BrokerdFill,
BrokerdError,
)
from ._util import log
from .api import (
_accounts2clients,
get_config,
@ -95,6 +98,10 @@ from .ledger import (
update_ledger_from_api_trades,
)
log = get_logger(
name=__name__,
)
def pack_position(
pos: IbPosition,
@ -536,9 +543,15 @@ class IbAcnt(Struct):
@tractor.context
async def open_trade_dialog(
ctx: tractor.Context,
loglevel: str = 'warning',
) -> AsyncIterator[dict[str, Any]]:
get_console_log(
level=loglevel,
name=__name__,
)
# task local msg dialog tracking
flows = OrderDialogs()
accounts_def = config.load_accounts(['ib'])

View File

@ -56,11 +56,11 @@ from piker.brokers._util import (
NoData,
DataUnavailable,
)
from piker.log import get_logger
from .api import (
# _adhoc_futes_set,
Client,
con2fqme,
log,
load_aio_clients,
MethodProxy,
open_client_proxies,
@ -78,6 +78,9 @@ from .symbols import get_mkt_info
if TYPE_CHECKING:
from trio._core._run import Task
log = get_logger(
name=__name__,
)
# XXX NOTE: See available types table docs:
# https://interactivebrokers.github.io/tws-api/tick_types.html

View File

@ -44,6 +44,7 @@ from ib_insync import (
CommissionReport,
)
from piker.log import get_logger
from piker.types import Struct
from piker.data import (
SymbologyCache,
@ -57,7 +58,6 @@ from piker.accounting import (
iter_by_dt,
)
from ._flex_reports import parse_flex_dt
from ._util import log
if TYPE_CHECKING:
from .api import (
@ -65,6 +65,9 @@ if TYPE_CHECKING:
MethodProxy,
)
log = get_logger(
name=__name__,
)
tx_sort: Callable = partial(
iter_by_dt,

View File

@ -42,10 +42,7 @@ from piker.accounting import (
from piker._cacheables import (
async_lifo_cache,
)
from ._util import (
log,
)
from piker.log import get_logger
if TYPE_CHECKING:
from .api import (
@ -53,6 +50,10 @@ if TYPE_CHECKING:
Client,
)
log = get_logger(
name=__name__,
)
_futes_venues = (
'GLOBEX',
'NYMEX',

View File

@ -62,9 +62,12 @@ from piker.clearing._messages import (
from piker.brokers import (
open_cached_client,
)
from piker.log import (
get_console_log,
get_logger,
)
from piker.data import open_symcache
from .api import (
log,
Client,
BrokerError,
)
@ -78,6 +81,8 @@ from .ledger import (
verify_balances,
)
log = get_logger(name=__name__)
MsgUnion = Union[
BrokerdCancel,
BrokerdError,
@ -431,9 +436,15 @@ def trades2pps(
@tractor.context
async def open_trade_dialog(
ctx: tractor.Context,
loglevel: str = 'warning',
) -> AsyncIterator[dict[str, Any]]:
get_console_log(
level=loglevel,
name=__name__,
)
async with (
# TODO: maybe bind these together and deliver
# a tuple from `.open_cached_client()`?

View File

@ -50,13 +50,19 @@ from . import open_cached_client
from piker._cacheables import async_lifo_cache
from .. import config
from ._util import resproc, BrokerError, SymbolNotFound
from ..log import (
from piker.log import (
colorize_json,
)
from ._util import (
log,
get_console_log,
)
from piker.log import (
get_logger,
)
log = get_logger(
name=__name__,
)
_use_practice_account = False
_refresh_token_ep = 'https://{}login.questrade.com/oauth2/'
@ -1205,7 +1211,10 @@ async def stream_quotes(
# feed_type: str = 'stock',
) -> AsyncGenerator[str, Dict[str, Any]]:
# XXX: required to propagate ``tractor`` loglevel to piker logging
get_console_log(loglevel)
get_console_log(
level=loglevel,
name=__name__,
)
async with open_cached_client('questrade') as client:
if feed_type == 'stock':

View File

@ -30,9 +30,16 @@ import asks
from ._util import (
resproc,
BrokerError,
log,
)
from ..calc import percent_change
from piker.calc import percent_change
from piker.log import (
get_logger,
)
log = get_logger(
name=__name__,
)
_service_ep = 'https://api.robinhood.com'

View File

@ -215,7 +215,7 @@ async def relay_orders_from_sync_code(
async def open_ems(
fqme: str,
mode: str = 'live',
loglevel: str = 'error',
loglevel: str = 'warning',
) -> tuple[
OrderClient, # client

View File

@ -352,9 +352,21 @@ async def open_brokerd_dialog(
broker backend, configuration, or client code usage.
'''
get_console_log(
level=loglevel,
name='clearing',
)
# enable `.accounting` console since normally used by
# each `brokerd`.
get_console_log(
level=loglevel,
name='piker.accounting',
)
broker: str = brokermod.name
def mk_paper_ep():
def mk_paper_ep(
loglevel: str,
):
from . import _paper_engine as paper_mod
nonlocal brokermod, exec_mode
@ -406,17 +418,21 @@ async def open_brokerd_dialog(
if (
trades_endpoint is not None
or exec_mode != 'paper'
or
exec_mode != 'paper'
):
# open live brokerd trades endpoint
open_trades_endpoint = portal.open_context(
trades_endpoint,
loglevel=loglevel,
)
@acm
async def maybe_open_paper_ep():
if exec_mode == 'paper':
async with mk_paper_ep() as msg:
async with mk_paper_ep(
loglevel=loglevel,
) as msg:
yield msg
return
@ -427,7 +443,9 @@ async def open_brokerd_dialog(
# runtime indication that the backend can't support live
# order ctrl yet, so boot the paperboi B0
if first == 'paper':
async with mk_paper_ep() as msg:
async with mk_paper_ep(
loglevel=loglevel,
) as msg:
yield msg
return
else:

View File

@ -59,9 +59,9 @@ from piker.data import (
open_symcache,
)
from piker.types import Struct
from ._util import (
log, # sub-sys logger
from piker.log import (
get_console_log,
get_logger,
)
from ._messages import (
BrokerdCancel,
@ -73,6 +73,8 @@ from ._messages import (
BrokerdError,
)
log = get_logger(name=__name__)
class PaperBoi(Struct):
'''
@ -550,7 +552,6 @@ _sells: defaultdict[
@tractor.context
async def open_trade_dialog(
ctx: tractor.Context,
broker: str,
fqme: str|None = None, # if empty, we only boot broker mode
@ -558,8 +559,11 @@ async def open_trade_dialog(
) -> None:
# enable piker.clearing console log for *this* subactor
get_console_log(loglevel)
# enable piker.clearing console log for *this* `brokerd` subactor
get_console_log(
level=loglevel,
name=__name__,
)
symcache: SymbologyCache
async with open_symcache(get_brokermod(broker)) as symcache:

View File

@ -61,7 +61,8 @@ def load_trans_eps(
if (
network
and not maddrs
and
not maddrs
):
# load network section and (attempt to) connect all endpoints
# which are reachable B)
@ -112,26 +113,19 @@ def load_trans_eps(
default=None,
help='Multiaddrs to bind or contact',
)
# @click.option(
# '--tsdb',
# is_flag=True,
# help='Enable local ``marketstore`` instance'
# )
# @click.option(
# '--es',
# is_flag=True,
# help='Enable local ``elasticsearch`` instance'
# )
def pikerd(
maddr: list[str] | None,
loglevel: str,
tl: bool,
pdb: bool,
# tsdb: bool,
# es: bool,
):
'''
Spawn the piker broker-daemon.
Start the "root service actor", `pikerd`, run it until
cancellation.
This "root daemon" operates as the top most service-mngr and
subsys-as-subactor supervisor, think of it as the "init proc" of
any of any `piker` application or daemon-process tree.
'''
# from tractor.devx import maybe_open_crash_handler
@ -240,6 +234,14 @@ def cli(
regaddr: str,
) -> None:
'''
The "root" `piker`-cmd CLI endpoint.
NOTE, this def generally relies on and requires a sub-cmd to be
provided by the user, OW only a `--help` msg (listing said
subcmds) will be dumped to console.
'''
if configdir is not None:
assert os.path.isdir(configdir), f"`{configdir}` is not a valid path"
config._override_config_dir(configdir)
@ -301,18 +303,47 @@ def cli(
def services(
config,
tl: bool,
ports,
ports: list[int],
):
'''
List all `piker` "service daemons" to the console in
a `json`-table which maps each actor's UID in the form,
from ..service import (
`{service_name}.{subservice_name}.{UUID}`
to its (primary) IPC server address.
(^TODO, should be its multiaddr form once we support it)
Note that by convention actors which operate as "headless"
processes (those without GUIs/graphics, and which generally
parent some noteworthy subsystem) are normally suffixed by
a "d" such as,
- pikerd: the root runtime supervisor
- brokerd: a broker-backend order ctl daemon
- emsd: the internal dark-clearing and order routing daemon
- datad: a data-provider-backend data feed daemon
- samplerd: the real-time data sampling and clock-syncing daemon
"Headed units" are normally just given an obvious app-like name
with subactors indexed by `.` such as,
- chart: the primary modal charting iface, a Qt app
- chart.fsp_0: a financial-sig-proc cascade instance which
delivers graphics to a parent `chart` app.
- polars_boi: some (presumably) `polars` using console app.
'''
from piker.service import (
open_piker_runtime,
_default_registry_port,
_default_registry_host,
)
host = _default_registry_host
# !TODO, mk this to work with UDS!
host: str = _default_registry_host
if not ports:
ports = [_default_registry_port]
ports: list[int] = [_default_registry_port]
addr = tractor._addr.wrap_address(
addr=(host, ports[0])
@ -326,7 +357,7 @@ def services(
loglevel=(
config['loglevel']
if tl
else None,
else None
),
),
tractor.get_registry(
@ -347,7 +378,15 @@ def services(
def _load_clis() -> None:
# from ..service import elastic # noqa
'''
Dynamically load and register all subsys CLI endpoints (at call
time).
NOTE, obviously this is normally expected to be called at
`import` time and implicitly relies on our use of various
`click`/`typer` decorator APIs.
'''
from ..brokers import cli # noqa
from ..ui import cli # noqa
from ..watchlists import cli # noqa
@ -357,5 +396,5 @@ def _load_clis() -> None:
from ..accounting import cli # noqa
# load downstream cli modules
# load all subsystem cli eps
_load_clis()
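
As a purely hypothetical illustration of the `{service_name}.{subservice_name}.{UUID}` naming convention described in the `services` docstring above, the rendered json-table might contain entries shaped like the following (uuids and addresses are placeholders, not real output):

    # hypothetical output shape only; real addresses come from the registry
    {
        'pikerd.<uuid4>': '<ipc-server-addr>',
        'brokerd.binance.<uuid4>': '<ipc-server-addr>',
        'emsd.<uuid4>': '<ipc-server-addr>',
        'samplerd.<uuid4>': '<ipc-server-addr>',
    }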

View File

@ -292,9 +292,10 @@ class Sampler:
except self.bcast_errors as err:
log.error(
f'Connection dropped for IPC ctx\n'
f'{stream._ctx}\n\n'
f'Due to {type(err)}'
f'Connection dropped for IPC ctx due to,\n'
f'{type(err)!r}\n'
f'\n'
f'{stream._ctx}'
)
borked.add(stream)
else:
@ -335,10 +336,18 @@ async def register_with_sampler(
open_index_stream: bool = True, # open a 2way stream for sample step msgs?
sub_for_broadcasts: bool = True, # sampler side to send step updates?
loglevel: str|None = None,
) -> set[int]:
get_console_log(tractor.current_actor().loglevel)
get_console_log(
level=(
loglevel
or
tractor.current_actor().loglevel
),
name=__name__,
)
incr_was_started: bool = False
try:
@ -475,6 +484,7 @@ async def spawn_samplerd(
register_with_sampler,
period_s=1,
sub_for_broadcasts=False,
loglevel=loglevel,
)
return True
@ -483,7 +493,6 @@ async def spawn_samplerd(
@acm
async def maybe_open_samplerd(
loglevel: str|None = None,
**pikerd_kwargs,
@ -512,10 +521,10 @@ async def open_sample_stream(
shms_by_period: dict[float, dict]|None = None,
open_index_stream: bool = True,
sub_for_broadcasts: bool = True,
loglevel: str|None = None,
cache_key: str|None = None,
allow_new_sampler: bool = True,
# cache_key: str|None = None,
# allow_new_sampler: bool = True,
ensure_is_active: bool = False,
) -> AsyncIterator[dict[str, float]]:
@ -550,7 +559,9 @@ async def open_sample_stream(
# XXX: this should be singleton on a host,
# a lone broker-daemon per provider should be
# created for all practical purposes
maybe_open_samplerd() as portal,
maybe_open_samplerd(
loglevel=loglevel,
) as portal,
portal.open_context(
register_with_sampler,
@ -559,6 +570,7 @@ async def open_sample_stream(
'shms_by_period': shms_by_period,
'open_index_stream': open_index_stream,
'sub_for_broadcasts': sub_for_broadcasts,
'loglevel': loglevel,
},
) as (ctx, shm_periods)
):
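
A minimal consumer-side sketch of the `loglevel` plumbing added above, assuming the `open_sample_stream()` keyword parameters (`period_s`, `loglevel`) and the import path shown in these hunks.

    from piker.data._sampling import open_sample_stream

    async def watch_steps(loglevel: str = 'info'):
        # subscribe to the global `samplerd` clock at a 1s period,
        # forwarding the requested log level to the spawned daemon
        async with open_sample_stream(
            period_s=1.0,
            loglevel=loglevel,
        ) as istream:
            async for msg in istream:
                # each msg is a broadcast "sample step" from `samplerd`
                print(msg)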

View File

@ -31,6 +31,7 @@ from typing import (
AsyncContextManager,
AsyncGenerator,
Iterable,
Type,
)
import json
@ -67,7 +68,7 @@ class NoBsWs:
'''
# apparently we can QoS for all sorts of reasons..so catch em.
recon_errors = (
recon_errors: tuple[Type[Exception]] = (
ConnectionClosed,
DisconnectionTimeout,
ConnectionRejected,
@ -370,6 +371,7 @@ async def open_autorecon_ws(
rcv: trio.MemoryReceiveChannel
snd, rcv = trio.open_memory_channel(616)
try:
async with (
tractor.trionics.collapse_eg(),
trio.open_nursery() as tn
@ -397,6 +399,12 @@ async def open_autorecon_ws(
finally:
tn.cancel_scope.cancel()
except NoBsWs.recon_errors as con_err:
log.warning(
f'Entire ws-channel disconnect due to,\n'
f'con_err: {con_err!r}\n'
)
'''
JSONRPC response-request style machinery for transparent multiplexing

View File

@ -239,7 +239,6 @@ async def allocate_persistent_feed(
brokername: str,
symstr: str,
loglevel: str,
start_stream: bool = True,
init_timeout: float = 616,
@ -348,11 +347,14 @@ async def allocate_persistent_feed(
izero_rt,
rt_shm,
) = await bus.nursery.start(
partial(
manage_history,
mod,
mkt,
some_data_ready,
feed_is_live,
mod=mod,
mkt=mkt,
some_data_ready=some_data_ready,
feed_is_live=feed_is_live,
loglevel=loglevel,
)
)
# yield back control to starting nursery once we receive either
@ -460,7 +462,6 @@ async def allocate_persistent_feed(
@tractor.context
async def open_feed_bus(
ctx: tractor.Context,
brokername: str,
symbols: list[str], # normally expected to the broker-specific fqme
@ -481,13 +482,16 @@ async def open_feed_bus(
'''
if loglevel is None:
loglevel = tractor.current_actor().loglevel
loglevel: str = tractor.current_actor().loglevel
# XXX: required to propagate ``tractor`` loglevel to piker
# logging
get_console_log(
loglevel
or tractor.current_actor().loglevel
level=(loglevel
or
tractor.current_actor().loglevel
),
name=__name__,
)
# local state sanity checks
@ -883,7 +887,6 @@ async def open_feed(
# one actor per brokerd for now
brokerd_ctxs = []
for brokermod, bfqmes in providers.items():
# if no `brokerd` for this backend exists yet we spawn

View File

@ -484,7 +484,8 @@ async def cascade(
# open a data feed stream with requested broker
feed: Feed
async with data.feed.maybe_open_feed(
[fqme],
fqmes=[fqme],
loglevel=loglevel,
# TODO throttle tick outputs from *this* daemon since
# it'll emit tons of ticks due to the throttle only
@ -582,7 +583,8 @@ async def cascade(
# on every step msg received from the global `samplerd`
# service.
async with open_sample_stream(
float(delay_s)
period_s=float(delay_s),
loglevel=loglevel,
) as istream:
profiler(f'{func_name}: sample stream up')

View File

@ -31,9 +31,12 @@ from contextlib import (
import tractor
import trio
from ._util import (
from piker.log import (
get_console_log,
)
from ._util import (
subsys,
)
from ._mngr import (
Services,
)
@ -96,7 +99,8 @@ async def open_piker_runtime(
# setting it as the root actor on localhost.
registry_addrs = (
registry_addrs
or [_default_reg_addr]
or
[_default_reg_addr]
)
if ems := tractor_kwargs.pop('enable_modules', None):
@ -271,6 +275,7 @@ async def maybe_open_pikerd(
'''
if loglevel:
get_console_log(
name=subsys,
level=loglevel
)

View File

@ -49,13 +49,15 @@ from requests.exceptions import (
ReadTimeout,
)
from ._mngr import Services
from ._util import (
log, # sub-sys logger
from piker.log import (
get_console_log,
get_logger,
)
from ._mngr import Services
from .. import config
log = get_logger(name=__name__)
class DockerNotStarted(Exception):
'Prolly you dint start da daemon bruh'
@ -336,13 +338,16 @@ class Container:
async def open_ahabd(
ctx: tractor.Context,
endpoint: str, # ns-pointer str-msg-type
loglevel: str | None = None,
loglevel: str = 'cancel',
**ep_kwargs,
) -> None:
log = get_console_log(loglevel or 'cancel')
log = get_console_log(
level=loglevel,
name='piker.service',
)
async with open_docker() as client:

View File

@ -30,8 +30,9 @@ from contextlib import (
import tractor
from trio.lowlevel import current_task
from ._util import (
log, # sub-sys logger
from piker.log import (
get_console_log,
get_logger,
)
from ._mngr import (
Services,
@ -39,10 +40,11 @@ from ._mngr import (
from ._actor_runtime import maybe_open_pikerd
from ._registry import find_service
log = get_logger(name=__name__)
@acm
async def maybe_spawn_daemon(
service_name: str,
service_task_target: Callable,
@ -66,6 +68,12 @@ async def maybe_spawn_daemon(
clients.
'''
log = get_console_log(
level=loglevel,
name=__name__,
)
assert log.name == 'piker.service'
# serialize access to this section to avoid
# 2 or more tasks racing to create a daemon
lock = Services.locks[service_name]
@ -152,7 +160,6 @@ async def maybe_spawn_daemon(
async def spawn_emsd(
loglevel: str|None = None,
**extra_tractor_kwargs
@ -190,7 +197,6 @@ async def spawn_emsd(
@acm
async def maybe_open_emsd(
brokername: str,
loglevel: str|None = None,

View File

@ -34,9 +34,9 @@ from tractor import (
Portal,
)
from ._util import (
log, # sub-sys logger
)
from piker.log import get_logger
log = get_logger(name=__name__)
# TODO: we need remote wrapping and a general soln:

View File

@ -27,15 +27,29 @@ from typing import (
)
import tractor
from tractor import Portal
from ._util import (
log, # sub-sys logger
from tractor import (
msg,
Actor,
Portal,
)
from piker.log import get_logger
log = get_logger(name=__name__)
# TODO? default path-space for UDS registry?
# [ ] needs to be Xplatform tho!
# _default_registry_path: Path = (
# Path(os.environ['XDG_RUNTIME_DIR'])
# /'piker'
# )
_default_registry_host: str = '127.0.0.1'
_default_registry_port: int = 6116
_default_reg_addr: tuple[str, int] = (
_default_reg_addr: tuple[
str,
int, # |str TODO, once we support UDS, see above.
] = (
_default_registry_host,
_default_registry_port,
)
@ -75,16 +89,22 @@ async def open_registry(
'''
global _tractor_kwargs
actor = tractor.current_actor()
uid = actor.uid
preset_reg_addrs: list[tuple[str, int]] = Registry.addrs
actor: Actor = tractor.current_actor()
aid: msg.Aid = actor.aid
uid: tuple[str, str] = aid.uid
preset_reg_addrs: list[
tuple[str, int]
] = Registry.addrs
if (
preset_reg_addrs
and addrs
and
addrs
):
if preset_reg_addrs != addrs:
# if any(addr in preset_reg_addrs for addr in addrs):
diff: set[tuple[str, int]] = set(preset_reg_addrs) - set(addrs)
diff: set[
tuple[str, int]
] = set(preset_reg_addrs) - set(addrs)
if diff:
log.warning(
f'`{uid}` requested only subset of registrars: {addrs}\n'
@ -98,7 +118,6 @@ async def open_registry(
)
was_set: bool = False
if (
not tractor.is_root_process()
and
@ -115,16 +134,23 @@ async def open_registry(
f"`{uid}` registry should already exist but doesn't?"
)
if (
not Registry.addrs
):
if not Registry.addrs:
was_set = True
Registry.addrs = addrs or [_default_reg_addr]
Registry.addrs = (
addrs
or
[_default_reg_addr]
)
# NOTE: only spot this seems currently used is inside
# `.ui._exec` which is the (eventual qtloops) bootstrapping
# with guest mode.
_tractor_kwargs['registry_addrs'] = Registry.addrs
reg_addrs: list[tuple[str, str|int]] = Registry.addrs
# !TODO, a struct-API to stringently allow this only in special
# cases?
# -> better would be to have some way to (atomically) rewrite
# an entire `RuntimeVars`?? ideas welcome obvi..
_tractor_kwargs['registry_addrs'] = reg_addrs
try:
yield Registry.addrs
@ -149,7 +175,7 @@ async def find_service(
| None
):
# try:
reg_addrs: list[tuple[str, int]]
reg_addrs: list[tuple[str, int|str]]
async with open_registry(
addrs=(
registry_addrs
@ -172,15 +198,13 @@ async def find_service(
only_first=first_only, # if set only returns single ref
) as maybe_portals:
if not maybe_portals:
# log.info(
print(
log.info(
f'Could NOT find service {service_name!r} -> {maybe_portals!r}'
)
yield None
return
# log.info(
print(
log.info(
f'Found service {service_name!r} -> {maybe_portals}'
)
yield maybe_portals
@ -195,7 +219,6 @@ async def find_service(
async def check_for_service(
service_name: str,
) -> None|tuple[str, int]:
'''
Service daemon "liveness" predicate.

View File

@ -14,20 +14,12 @@
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""
Sub-sys module commons.
Sub-sys module commons (if any ?? Bp).
"""
from functools import partial
from ..log import (
get_logger,
get_console_log,
)
subsys: str = 'piker.service'
log = get_logger(subsys)
get_console_log = partial(
get_console_log,
name=subsys,
)
# ?TODO, if we were going to keep a `get_console_log()` in here to be
# invoked at `import`-time, how do we dynamically hand in the
# `level=` value? seems too early in the runtime to be injected
# right?

View File

@ -16,6 +16,7 @@
from __future__ import annotations
from contextlib import asynccontextmanager as acm
from pprint import pformat
from typing import (
Any,
TYPE_CHECKING,
@ -26,12 +27,17 @@ import asks
if TYPE_CHECKING:
import docker
from ._ahab import DockerContainer
from ._util import log # sub-sys logger
from ._util import (
get_console_log,
from . import (
Services,
)
from piker.log import (
get_console_log,
get_logger,
)
log = get_logger(name=__name__)
# container level config
_config = {
@ -67,7 +73,10 @@ def start_elasticsearch(
elastic
'''
get_console_log('info', name=__name__)
get_console_log(
level='info',
name=__name__,
)
dcntr: DockerContainer = client.containers.run(
'piker:elastic',

View File

@ -52,17 +52,18 @@ import pendulum
# TODO: import this for specific error set expected by mkts client
# import purerpc
from ..data.feed import maybe_open_feed
from piker.data.feed import maybe_open_feed
from . import Services
from ._util import (
log, # sub-sys logger
from piker.log import (
get_console_log,
get_logger,
)
if TYPE_CHECKING:
import docker
from ._ahab import DockerContainer
log = get_logger(name=__name__)
# ahabd-supervisor and container level config

View File

@ -294,11 +294,6 @@ def ldshm(
f'Something is wrong with time period for {shm}:\n{times}'
)
period_s: float = float(max(d1, d2, med))
log.info(
f'Processing shm buffer:\n'
f' file: {shmfile.name}\n'
f' period: {period_s}s\n'
)
null_segs: tuple = tsp.get_null_segs(
frame=shm.array,
@ -452,7 +447,13 @@ def ldshm(
)
# last chance manual overwrites in REPL
# await tractor.pause()
assert aids
if not aids:
log.warning(
f'No gaps were found !?\n'
f'fqme: {fqme!r}\n'
f'timeframe: {period_s!r}\n'
f"WELL THAT'S GOOD NOOZ!\n"
)
tf2aids[period_s] = aids
else:

View File

@ -54,10 +54,10 @@ from ..log import (
# for "time series processing"
subsys: str = 'piker.tsp'
log = get_logger(subsys)
log = get_logger(name=__name__)
get_console_log = partial(
get_console_log,
name=subsys,
name=subsys, # activate for subsys-pkg "downward"
)
# NOTE: union type-defs to handle generic `numpy` and `polars` types

View File

@ -30,11 +30,6 @@ import tractor
from piker.data._formatters import BGM
from piker.storage import log
from piker.toolz.profile import (
Profiler,
pg_profile_enabled,
ms_slower_then,
)
from piker.ui._style import get_fonts
if TYPE_CHECKING:
@ -97,22 +92,12 @@ async def markup_gaps(
# gap's duration.
show_txt: bool = False,
# A/B comparison: render individual arrows alongside batch
# for visual comparison
show_individual_arrows: bool = False,
) -> dict[int, dict]:
'''
Remote annotate time-gaps in a dt-fielded ts (normally OHLC)
with rectangles.
'''
profiler = Profiler(
msg=f'markup_gaps() for {gaps.height} gaps',
disabled=False,
ms_threshold=0.0,
)
# XXX: force chart redraw FIRST to ensure PlotItem coordinate
# system is properly initialized before we position annotations!
# Without this, annotations may be misaligned on first creation
@ -121,19 +106,6 @@ async def markup_gaps(
fqme=fqme,
timeframe=timeframe,
)
profiler('first `.redraw()` before annot creation')
log.info(
f'markup_gaps() called:\n'
f' fqme: {fqme}\n'
f' timeframe: {timeframe}s\n'
f' gaps.height: {gaps.height}\n'
)
# collect all annotation specs for batch submission
rect_specs: list[dict] = []
arrow_specs: list[dict] = []
text_specs: list[dict] = []
aids: dict[int] = {}
for i in range(gaps.height):
@ -245,38 +217,56 @@ async def markup_gaps(
# 1: 'wine', # down-gap
# }[sgn]
# collect rect spec (no fqme/timeframe, added by batch
# API)
rect_spec: dict[str, Any] = dict(
meth='set_view_pos',
rect_kwargs: dict[str, Any] = dict(
fqme=fqme,
timeframe=timeframe,
start_pos=lc,
end_pos=ro,
color=color,
update_label=False,
start_time=start_time,
end_time=end_time,
)
rect_specs.append(rect_spec)
# add up/down rects
aid: int|None = await actl.add_rect(**rect_kwargs)
if aid is None:
log.error(
f'Failed to add rect for,\n'
f'{rect_kwargs!r}\n'
f'\n'
f'Skipping to next gap!\n'
)
continue
assert aid
aids[aid] = rect_kwargs
direction: str = (
'down' if down_gap
else 'up'
)
# collect arrow spec
# TODO! mk this a `msgspec.Struct` which we deserialize
# on the server side!
# XXX: send timestamp for server-side index lookup
# to ensure alignment with current shm state
gap_time: float = row['time'][0]
arrow_spec: dict[str, Any] = dict(
arrow_kwargs: dict[str, Any] = dict(
fqme=fqme,
timeframe=timeframe,
x=iend, # fallback if timestamp lookup fails
y=cls,
time=gap_time, # for server-side index lookup
color=color,
alpha=169,
pointing=direction,
# TODO: expose these as params to markup_gaps()?
headLen=10,
headWidth=2.222,
pxMode=True,
)
arrow_specs.append(arrow_spec)
aid: int = await actl.add_arrow(
**arrow_kwargs
)
# add duration label to RHS of arrow
if up_gap:
@ -288,12 +278,15 @@ async def markup_gaps(
assert flat
anchor = (0, 0) # up from bottom
# collect text spec if enabled
if show_txt:
# use a slightly smaller font for gap label txt.
font, small_font = get_fonts()
font_size: int = small_font.px_size - 1
assert isinstance(font_size, int)
text_spec: dict[str, Any] = dict(
if show_txt:
text_aid: int = await actl.add_text(
fqme=fqme,
timeframe=timeframe,
text=gap_label,
x=iend + 1, # fallback if timestamp lookup fails
y=cls,
@ -302,46 +295,12 @@ async def markup_gaps(
anchor=anchor,
font_size=font_size,
)
text_specs.append(text_spec)
aids[text_aid] = {'text': gap_label}
# submit all annotations in single batch IPC msg
log.info(
f'Submitting batch annotations:\n'
f' rects: {len(rect_specs)}\n'
f' arrows: {len(arrow_specs)}\n'
f' texts: {len(text_specs)}\n'
)
profiler('built all annotation specs')
result: dict[str, list[int]] = await actl.add_batch(
fqme=fqme,
timeframe=timeframe,
rects=rect_specs,
arrows=arrow_specs,
texts=text_specs,
show_individual_arrows=show_individual_arrows,
)
profiler('batch `.add_batch()` IPC call complete')
# build aids dict from batch results
for aid in result['rects']:
aids[aid] = {'type': 'rect'}
for aid in result['arrows']:
aids[aid] = {'type': 'arrow'}
for aid in result['texts']:
aids[aid] = {'type': 'text'}
log.info(
f'Batch submission complete: {len(aids)} annotation(s) '
f'created'
)
profiler('built aids result dict')
# tell chart to redraw all its graphics view layers
# tell chart to redraw all its
# graphics view layers Bo
await actl.redraw(
fqme=fqme,
timeframe=timeframe,
)
profiler('final `.redraw()` after annot creation')
return aids

View File

@ -63,8 +63,10 @@ from ..data._sharedmem import (
maybe_open_shm_array,
ShmArray,
)
from ..data._source import def_iohlcv_fields
from ..data._sampling import (
from piker.data._source import (
def_iohlcv_fields,
)
from piker.data._sampling import (
open_sample_stream,
)
@ -96,7 +98,9 @@ if TYPE_CHECKING:
# from .feed import _FeedsBus
log = get_logger(__name__)
log = get_logger(
name=__name__,
)
# `ShmArray` buffer sizing configuration:
@ -550,7 +554,7 @@ async def start_backfill(
)
# ?TODO, check against venue closure hours
# if/when provided by backend?
await tractor.pause()
# await tractor.pause()
expected_dur: Interval = (
last_start_dt.subtract(
@ -1320,6 +1324,7 @@ async def manage_history(
mkt: MktPair,
some_data_ready: trio.Event,
feed_is_live: trio.Event,
loglevel: str = 'warning',
timeframe: float = 60, # in seconds
wait_for_live_timeout: float = 0.5,
@ -1497,6 +1502,7 @@ async def manage_history(
# data feed layer that needs to consume it).
open_index_stream=True,
sub_for_broadcasts=False,
loglevel=loglevel,
) as sample_stream:
# register 1s and 1m buffers with the global

View File

@ -24,11 +24,8 @@ from pyqtgraph import (
Point,
functions as fn,
Color,
GraphicsObject,
)
from pyqtgraph.Qt import internals
import numpy as np
import pyqtgraph as pg
from piker.ui.qt import (
QtCore,
@ -38,10 +35,6 @@ from piker.ui.qt import (
QRectF,
QGraphicsPathItem,
)
from piker.ui._style import hcolor
from piker.log import get_logger
log = get_logger(__name__)
def mk_marker_path(
@ -111,7 +104,7 @@ def mk_marker_path(
class LevelMarker(QGraphicsPathItem):
'''
An arrow marker path graphic which redraws itself
An arrow marker path graphich which redraws itself
to the specified view coordinate level on each paint cycle.
'''
@ -258,9 +251,9 @@ def qgo_draw_markers(
) -> float:
'''
Paint markers in ``pg.GraphicsItem`` style by first removing the
view transform for the painter, drawing the markers in scene
coords, then restoring the view coords.
Paint markers in ``pg.GraphicsItem`` style by first
removing the view transform for the painter, drawing the markers
in scene coords, then restoring the view coords.
'''
# paint markers in native coordinate system
@ -302,449 +295,3 @@ def qgo_draw_markers(
p.setTransform(orig_tr)
return max(sizes)
class GapAnnotations(GraphicsObject):
'''
Batch-rendered gap annotations using Qt's efficient drawing
APIs.
Instead of creating individual `QGraphicsItem` instances per
gap (which is very slow for 1000+ gaps), this class stores all
gap rectangles and arrows in numpy-backed arrays and renders
them in single batch paint calls.
Performance: ~1000x faster than individual items for large gap
counts.
Based on patterns from:
- `pyqtgraph.BarGraphItem` (batch rect rendering)
- `pyqtgraph.ScatterPlotItem` (fragment rendering)
- `piker.ui._curve.FlowGraphic` (single path pattern)
'''
def __init__(
self,
gap_specs: list[dict],
array: np.ndarray|None = None,
color: str = 'dad_blue',
alpha: int = 169,
arrow_size: float = 10.0,
fqme: str|None = None,
timeframe: float|None = None,
) -> None:
'''
gap_specs: list of dicts with keys:
- start_pos: (x, y) tuple for left corner of rect
- end_pos: (x, y) tuple for right corner of rect
- arrow_x: x position for arrow
- arrow_y: y position for arrow
- pointing: 'up' or 'down' for arrow direction
- start_time: (optional) timestamp for repositioning
- end_time: (optional) timestamp for repositioning
array: optional OHLC numpy array for repositioning on
backfill updates (when abs-index changes)
fqme: symbol name for these gaps (for logging/debugging)
timeframe: period in seconds that these gaps were
detected on (used to skip reposition when
called with wrong timeframe's array)
'''
super().__init__()
self._gap_specs = gap_specs
self._array = array
self._fqme = fqme
self._timeframe = timeframe
n_gaps = len(gap_specs)
# shared pen/brush matching original SelectRect/ArrowItem style
base_color = pg.mkColor(hcolor(color))
# rect pen: base color, fully opaque for outline
self._rect_pen = pg.mkPen(base_color, width=1)
# rect brush: base color with alpha=66 (SelectRect default)
rect_fill = pg.mkColor(hcolor(color))
rect_fill.setAlpha(66)
self._rect_brush = pg.functions.mkBrush(rect_fill)
# arrow pen: same as rects
self._arrow_pen = pg.mkPen(base_color, width=1)
# arrow brush: base color with user-specified alpha (default 169)
arrow_fill = pg.mkColor(hcolor(color))
arrow_fill.setAlpha(alpha)
self._arrow_brush = pg.functions.mkBrush(arrow_fill)
# allocate rect array using Qt's efficient storage
self._rectarray = internals.PrimitiveArray(
QtCore.QRectF,
4,
)
self._rectarray.resize(n_gaps)
rect_memory = self._rectarray.ndarray()
# fill rect array from gap specs
for (
i,
spec,
) in enumerate(gap_specs):
(
start_x,
start_y,
) = spec['start_pos']
(
end_x,
end_y,
) = spec['end_pos']
# QRectF expects (x, y, width, height)
rect_memory[i, 0] = start_x
rect_memory[i, 1] = min(start_y, end_y)
rect_memory[i, 2] = end_x - start_x
rect_memory[i, 3] = abs(end_y - start_y)
# build single QPainterPath for all arrows
self._arrow_path = QtGui.QPainterPath()
self._arrow_size = arrow_size
for spec in gap_specs:
arrow_x = spec['arrow_x']
arrow_y = spec['arrow_y']
pointing = spec['pointing']
# create arrow polygon
if pointing == 'down':
# arrow points downward
arrow_poly = QtGui.QPolygonF([
QPointF(arrow_x, arrow_y), # tip
QPointF(
arrow_x - arrow_size/2,
arrow_y - arrow_size,
), # left
QPointF(
arrow_x + arrow_size/2,
arrow_y - arrow_size,
), # right
])
else: # up
# arrow points upward
arrow_poly = QtGui.QPolygonF([
QPointF(arrow_x, arrow_y), # tip
QPointF(
arrow_x - arrow_size/2,
arrow_y + arrow_size,
), # left
QPointF(
arrow_x + arrow_size/2,
arrow_y + arrow_size,
), # right
])
self._arrow_path.addPolygon(arrow_poly)
self._arrow_path.closeSubpath()
# cache bounding rect
self._br: QRectF|None = None
def boundingRect(self) -> QRectF:
'''
Compute bounding rect from rect array and arrow path.
'''
if self._br is not None:
return self._br
# get rect bounds
rect_memory = self._rectarray.ndarray()
if len(rect_memory) == 0:
self._br = QRectF()
return self._br
x_min = rect_memory[:, 0].min()
y_min = rect_memory[:, 1].min()
x_max = (rect_memory[:, 0] + rect_memory[:, 2]).max()
y_max = (rect_memory[:, 1] + rect_memory[:, 3]).max()
# expand for arrow path
arrow_br = self._arrow_path.boundingRect()
x_min = min(x_min, arrow_br.left())
y_min = min(y_min, arrow_br.top())
x_max = max(x_max, arrow_br.right())
y_max = max(y_max, arrow_br.bottom())
self._br = QRectF(
x_min,
y_min,
x_max - x_min,
y_max - y_min,
)
return self._br
def paint(
self,
p: QtGui.QPainter,
opt: QtWidgets.QStyleOptionGraphicsItem,
w: QtWidgets.QWidget,
) -> None:
'''
Batch render all rects and arrows in minimal paint calls.
'''
# draw all rects in single batch call (data coordinates)
p.setPen(self._rect_pen)
p.setBrush(self._rect_brush)
drawargs = self._rectarray.drawargs()
p.drawRects(*drawargs)
# draw arrows in scene/pixel coordinates so they maintain
# size regardless of zoom level
orig_tr = p.transform()
p.resetTransform()
# rebuild arrow path in scene coordinates
arrow_path_scene = QtGui.QPainterPath()
# arrow geometry matching pg.ArrowItem defaults
# headLen=10, headWidth=2.222
# headWidth is the half-width (center to edge distance)
head_len = self._arrow_size
head_width = head_len * 0.2222 # 2.222 at size=10
for spec in self._gap_specs:
if 'arrow_x' not in spec:
continue
arrow_x = spec['arrow_x']
arrow_y = spec['arrow_y']
pointing = spec['pointing']
# transform data coords to scene coords
scene_pt = orig_tr.map(QPointF(arrow_x, arrow_y))
sx = scene_pt.x()
sy = scene_pt.y()
# create arrow polygon in scene/pixel coords
# matching pg.ArrowItem geometry but rotated for up/down
if pointing == 'down':
# tip points downward (negative y direction)
arrow_poly = QtGui.QPolygonF([
QPointF(sx, sy), # tip
QPointF(
sx - head_width,
sy - head_len,
), # left base
QPointF(
sx + head_width,
sy - head_len,
), # right base
])
else: # up
# tip points upward (positive y direction)
arrow_poly = QtGui.QPolygonF([
QPointF(sx, sy), # tip
QPointF(
sx - head_width,
sy + head_len,
), # left base
QPointF(
sx + head_width,
sy + head_len,
), # right base
])
arrow_path_scene.addPolygon(arrow_poly)
arrow_path_scene.closeSubpath()
p.setPen(self._arrow_pen)
p.setBrush(self._arrow_brush)
p.drawPath(arrow_path_scene)
# restore original transform
p.setTransform(orig_tr)
def reposition(
self,
array: np.ndarray|None = None,
fqme: str|None = None,
timeframe: float|None = None,
) -> None:
'''
Reposition all annotations based on timestamps.
Used when viz is updated (eg during backfill) and abs-index
range changes - we need to lookup new indices from timestamps.
'''
# skip reposition if timeframe doesn't match
# (e.g., 1s gaps being repositioned with 60s array)
if (
timeframe is not None
and
self._timeframe is not None
and
timeframe != self._timeframe
):
log.debug(
f'Skipping reposition for {self._fqme} gaps:\n'
f' gap timeframe: {self._timeframe}s\n'
f' array timeframe: {timeframe}s\n'
)
return
if array is None:
array = self._array
if array is None:
log.warning(
'GapAnnotations.reposition() called but no array '
'provided'
)
return
# collect all unique timestamps we need to lookup
timestamps: set[float] = set()
for spec in self._gap_specs:
if spec.get('start_time') is not None:
timestamps.add(spec['start_time'])
if spec.get('end_time') is not None:
timestamps.add(spec['end_time'])
if spec.get('time') is not None:
timestamps.add(spec['time'])
# vectorized timestamp -> row lookup using binary search
time_to_row: dict[float, dict] = {}
if timestamps:
import numpy as np
time_arr = array['time']
ts_array = np.array(list(timestamps))
search_indices = np.searchsorted(
time_arr,
ts_array,
)
# vectorized bounds check and exact match verification
valid_mask = (
(search_indices < len(array))
& (time_arr[search_indices] == ts_array)
)
valid_indices = search_indices[valid_mask]
valid_timestamps = ts_array[valid_mask]
matched_rows = array[valid_indices]
time_to_row = {
float(ts): {
'index': float(row['index']),
'open': float(row['open']),
'close': float(row['close']),
}
for ts, row in zip(
valid_timestamps,
matched_rows,
)
}
# rebuild rect array from gap specs with new indices
rect_memory = self._rectarray.ndarray()
for (
i,
spec,
) in enumerate(self._gap_specs):
start_time = spec.get('start_time')
end_time = spec.get('end_time')
if (
start_time is None
or end_time is None
):
continue
start_row = time_to_row.get(start_time)
end_row = time_to_row.get(end_time)
if (
start_row is None
or end_row is None
):
log.warning(
f'Timestamp lookup failed for gap[{i}] during '
f'reposition:\n'
f' fqme: {fqme}\n'
f' timeframe: {timeframe}s\n'
f' start_time: {start_time}\n'
f' end_time: {end_time}\n'
f' array time range: '
f'{array["time"][0]} -> {array["time"][-1]}\n'
)
continue
start_idx = start_row['index']
end_idx = end_row['index']
start_close = start_row['close']
end_open = end_row['open']
from_idx: float = 0.16 - 0.06
start_x = start_idx + 1 - from_idx
end_x = end_idx + from_idx
# update rect in array
rect_memory[i, 0] = start_x
rect_memory[i, 1] = min(start_close, end_open)
rect_memory[i, 2] = end_x - start_x
rect_memory[i, 3] = abs(end_open - start_close)
# rebuild arrow path with new indices
self._arrow_path.clear()
for spec in self._gap_specs:
time_val = spec.get('time')
if time_val is None:
continue
arrow_row = time_to_row.get(time_val)
if arrow_row is None:
continue
arrow_x = arrow_row['index']
arrow_y = arrow_row['close']
pointing = spec['pointing']
# create arrow polygon
if pointing == 'down':
arrow_poly = QtGui.QPolygonF([
QPointF(arrow_x, arrow_y),
QPointF(
arrow_x - self._arrow_size/2,
arrow_y - self._arrow_size,
),
QPointF(
arrow_x + self._arrow_size/2,
arrow_y - self._arrow_size,
),
])
else: # up
arrow_poly = QtGui.QPolygonF([
QPointF(arrow_x, arrow_y),
QPointF(
arrow_x - self._arrow_size/2,
arrow_y + self._arrow_size,
),
QPointF(
arrow_x + self._arrow_size/2,
arrow_y + self._arrow_size,
),
])
self._arrow_path.addPolygon(arrow_poly)
self._arrow_path.closeSubpath()
# invalidate bounding rect cache
self._br = None
self.prepareGeometryChange()
self.update()
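
The `.reposition()` method removed above centers on a vectorized timestamp-to-row lookup over the (sorted) `time` column. Stripped of the Qt plumbing, the underlying technique is a plain `np.searchsorted()` binary search; a standalone sketch follows, with field names assumed to match the OHLC shm layout used in the diff.

    import numpy as np

    def lookup_rows_by_time(
        arr: np.ndarray,          # struct array with a sorted 'time' field
        timestamps: list[float],
    ) -> dict[float, np.void]:
        '''
        Map each exactly-matching timestamp to its row via binary
        search, skipping any ts not present in `arr['time']`.
        '''
        time_arr = arr['time']
        ts_array = np.array(timestamps)
        idxs = np.searchsorted(time_arr, ts_array)
        # bounds check + exact-match verification, both vectorized;
        # clip avoids indexing past the end when a ts sorts after all rows
        in_bounds = idxs < len(arr)
        safe_idxs = np.clip(idxs, 0, len(arr) - 1)
        valid = in_bounds & (time_arr[safe_idxs] == ts_array)
        return {
            float(ts): arr[i]
            for ts, i in zip(ts_array[valid], idxs[valid])
        }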

View File

@ -21,6 +21,7 @@ this module ties together quote and computational (fsp) streams with
graphics update methods via our custom ``pyqtgraph`` charting api.
'''
from functools import partial
import itertools
from math import floor
import time
@ -208,6 +209,7 @@ class DisplayState(Struct):
async def increment_history_view(
# min_istream: tractor.MsgStream,
ds: DisplayState,
loglevel: str = 'warning',
):
hist_chart: ChartPlotWidget = ds.hist_chart
hist_viz: Viz = ds.hist_viz
@ -229,7 +231,10 @@ async def increment_history_view(
hist_viz.reset_graphics()
# hist_viz.update_graphics(force_redraw=True)
async with open_sample_stream(1.) as min_istream:
async with open_sample_stream(
period_s=1.,
loglevel=loglevel,
) as min_istream:
async for msg in min_istream:
profiler = Profiler(
@ -310,7 +315,6 @@ async def increment_history_view(
async def graphics_update_loop(
dss: dict[str, DisplayState],
nurse: trio.Nursery,
godwidget: GodWidget,
@ -319,6 +323,7 @@ async def graphics_update_loop(
pis: dict[str, list[pgo.PlotItem, pgo.PlotItem]] = {},
vlm_charts: dict[str, ChartPlotWidget] = {},
loglevel: str = 'warning',
) -> None:
'''
@ -462,9 +467,12 @@ async def graphics_update_loop(
# })
nurse.start_soon(
partial(
increment_history_view,
# min_istream,
ds,
ds=ds,
loglevel=loglevel,
),
)
await trio.sleep(0)
@ -511,14 +519,19 @@ async def graphics_update_loop(
fast_chart.linked.isHidden()
or not rt_pi.isVisible()
):
print(f'{fqme} skipping update for HIDDEN CHART')
log.debug(
f'{fqme} skipping update for HIDDEN CHART'
)
fast_chart.pause_all_feeds()
continue
ic = fast_chart.view._in_interact
if ic:
fast_chart.pause_all_feeds()
print(f'{fqme} PAUSING DURING INTERACTION')
log.debug(
f'Pausing chart updates during interaction\n'
f'fqme: {fqme!r}'
)
await ic.wait()
fast_chart.resume_all_feeds()
@ -1591,15 +1604,18 @@ async def display_symbol_data(
# start update loop task
dss: dict[str, DisplayState] = {}
ln.start_soon(
partial(
graphics_update_loop,
dss,
ln,
godwidget,
feed,
dss=dss,
nurse=ln,
godwidget=godwidget,
feed=feed,
# min_istream,
pis,
vlm_charts,
pis=pis,
vlm_charts=vlm_charts,
loglevel=loglevel,
)
)
# boot order-mode
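
The switch to `functools.partial()` wrapping in the task spawns above follows from `trio`'s nursery API: `nursery.start_soon()` forwards only positional args to the target task, so keyword args must be bound beforehand. A minimal sketch of the pattern (the function name and argument values are placeholders, not piker APIs):

    from functools import partial
    import trio

    async def render_loop(
        *,
        fqme: str,
        loglevel: str = 'warning',
    ) -> None:
        ...

    async def main() -> None:
        async with trio.open_nursery() as nursery:
            # bind kwargs up front since `nursery.start_soon()` only
            # accepts positional args for the target task
            nursery.start_soon(
                partial(
                    render_loop,
                    fqme='<some.fqme>',
                    loglevel='info',
                ),
            )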

View File

@ -168,7 +168,7 @@ class ArrowEditor(Struct):
'''
uid: str = arrow._uid
arrows: list[pg.ArrowItem] = self._arrows[uid]
log.debug(
log.info(
f'Removing arrow from views\n'
f'uid: {uid!r}\n'
f'{arrow!r}\n'
@ -286,9 +286,7 @@ class LineEditor(Struct):
for line in lines:
line.show_labels()
line.hide_markers()
log.debug(
f'Line active @ level: {line.value()!r}'
)
log.debug(f'Level active for level: {line.value()}')
# TODO: other flashy things to indicate the order is active
return lines
@ -331,11 +329,7 @@ class LineEditor(Struct):
if line in hovered:
hovered.remove(line)
log.debug(
f'Deleting level-line\n'
f'line: {line!r}\n'
f'oid: {uuid!r}\n'
)
log.debug(f'deleting {line} with oid: {uuid}')
line.delete()
# make sure the xhair doesn't get left off
@ -343,11 +337,7 @@ class LineEditor(Struct):
cursor.show_xhair()
else:
log.warning(
f'Could not find line for removal ??\n'
f'\n'
f'{line!r}\n'
)
log.warning(f'Could not find line for {line}')
return lines
@ -579,11 +569,11 @@ class SelectRect(QtWidgets.QGraphicsRectItem):
if update_label:
self.init_label(view_rect)
log.debug(
f'SelectRect modify,\n'
print(
'SelectRect modify:\n'
f'QRectF: {view_rect}\n'
f'start_pos: {start_pos!r}\n'
f'end_pos: {end_pos!r}\n'
f'start_pos: {start_pos}\n'
f'end_pos: {end_pos}\n'
)
self.show()
@ -650,11 +640,8 @@ class SelectRect(QtWidgets.QGraphicsRectItem):
dmn=dmn,
))
# tracing
# log.info(
# f'x2, y2: {(x2, y2)}\n'
# f'xmn, ymn: {(xmn, ymx)}\n'
# )
# print(f'x2, y2: {(x2, y2)}')
# print(f'xmn, ymn: {(xmn, ymx)}')
label_anchor = Point(
xmx + 2,

View File

@ -183,13 +183,17 @@ async def open_fsp_sidepane(
@acm
async def open_fsp_actor_cluster(
names: list[str] = ['fsp_0', 'fsp_1'],
names: list[str] = [
'fsp_0',
'fsp_1',
],
) -> AsyncGenerator[
int,
dict[str, tractor.Portal]
]:
# TODO! change to .experimental!
from tractor._clustering import open_actor_cluster
# profiler = Profiler(
@ -197,7 +201,7 @@ async def open_fsp_actor_cluster(
# disabled=False
# )
async with open_actor_cluster(
count=2,
count=len(names),
names=names,
modules=['piker.fsp._engine'],
@ -497,7 +501,8 @@ class FspAdmin:
portal: tractor.Portal = (
self.cluster.get(worker_name)
or self.rr_next_portal()
or
self.rr_next_portal()
)
# TODO: this should probably be turned into a

View File

@ -38,6 +38,7 @@ from piker.ui.qt import (
QtGui,
QGraphicsPathItem,
QStyleOptionGraphicsItem,
QGraphicsItem,
QGraphicsScene,
QWidget,
QPointF,

View File

@ -22,7 +22,6 @@ a chart from some other actor.
from __future__ import annotations
from contextlib import (
asynccontextmanager as acm,
contextmanager as cm,
AsyncExitStack,
)
from functools import partial
@ -47,7 +46,6 @@ from piker.log import get_logger
from piker.types import Struct
from piker.service import find_service
from piker.brokers import SymbolNotFound
from piker.toolz import Profiler
from piker.ui.qt import (
QGraphicsItem,
)
@ -100,8 +98,6 @@ def rm_annot(
annot: ArrowEditor|SelectRect|pg.TextItem
) -> bool:
global _editors
from piker.ui._annotate import GapAnnotations
match annot:
case pg.ArrowItem():
editor = _editors[annot._uid]
@ -126,35 +122,9 @@ def rm_annot(
scene.removeItem(annot)
return True
case GapAnnotations():
scene = annot.scene()
if scene:
scene.removeItem(annot)
return True
return False
@cm
def no_qt_updates(*items):
'''
Disable Qt widget/item updates during context to batch
render operations and only trigger single repaint on exit.
Accepts both QWidgets and QGraphicsItems.
'''
for item in items:
if hasattr(item, 'setUpdatesEnabled'):
item.setUpdatesEnabled(False)
try:
yield
finally:
for item in items:
if hasattr(item, 'setUpdatesEnabled'):
item.setUpdatesEnabled(True)
async def serve_rc_annots(
ipc_key: str,
annot_req_stream: MsgStream,
@ -459,333 +429,6 @@ async def serve_rc_annots(
aids.add(aid)
await annot_req_stream.send(aid)
case {
'cmd': 'batch',
'fqme': fqme,
'timeframe': timeframe,
'rects': list(rect_specs),
'arrows': list(arrow_specs),
'texts': list(text_specs),
'show_individual_arrows': bool(show_individual_arrows),
}:
# batch submission handler - process multiple
# annotations in single IPC round-trip
ds: DisplayState = _dss[fqme]
try:
chart: ChartPlotWidget = {
60: ds.hist_chart,
1: ds.chart,
}[timeframe]
except KeyError:
msg: str = (
f'No chart for timeframe={timeframe}s, '
f'skipping batch annotation'
)
log.error(msg)
await annot_req_stream.send({'error': msg})
continue
cv: ChartView = chart.cv
viz: Viz = chart.get_viz(fqme)
shm = viz.shm
arr = shm.array
result: dict[str, list[int]] = {
'rects': [],
'arrows': [],
'texts': [],
}
profiler = Profiler(
msg=(
f'Batch annotate {len(rect_specs)} gaps '
f'on {fqme}@{timeframe}s'
),
disabled=False,
delayed=False,
)
aids_set: set[int] = ctxs[ipc_key][1]
# build unified gap_specs for GapAnnotations class
from piker.ui._annotate import GapAnnotations
gap_specs: list[dict] = []
n_gaps: int = max(
len(rect_specs),
len(arrow_specs),
)
profiler('setup batch annot creation')
# collect all unique timestamps for vectorized lookup
timestamps: list[float] = []
for rect_spec in rect_specs:
if start_time := rect_spec.get('start_time'):
timestamps.append(start_time)
if end_time := rect_spec.get('end_time'):
timestamps.append(end_time)
for arrow_spec in arrow_specs:
if time_val := arrow_spec.get('time'):
timestamps.append(time_val)
profiler('collect `timestamps: list` complet!')
# build timestamp -> row mapping using binary search
# O(m log n) instead of O(n*m) with np.isin
time_to_row: dict[float, dict] = {}
if timestamps:
import numpy as np
time_arr = arr['time']
ts_array = np.array(timestamps)
# binary search for each timestamp in sorted time array
search_indices = np.searchsorted(
time_arr,
ts_array,
)
profiler('`np.searchsorted()` complete!')
# vectorized bounds check and exact match verification
valid_mask = (
(search_indices < len(arr))
& (time_arr[search_indices] == ts_array)
)
# get all valid indices and timestamps
valid_indices = search_indices[valid_mask]
valid_timestamps = ts_array[valid_mask]
# use fancy indexing to get all rows at once
matched_rows = arr[valid_indices]
# extract fields to plain arrays BEFORE dict building
indices_arr = matched_rows['index'].astype(float)
opens_arr = matched_rows['open'].astype(float)
closes_arr = matched_rows['close'].astype(float)
profiler('extracted field arrays')
# build dict from plain arrays (much faster)
time_to_row: dict[float, dict] = {
float(ts): {
'index': idx,
'open': opn,
'close': cls,
}
for (
ts,
idx,
opn,
cls,
) in zip(
valid_timestamps,
indices_arr,
opens_arr,
closes_arr,
)
}
profiler('`time_to_row` creation complete!')
profiler(f'built timestamp lookup for {len(timestamps)} times')
# build gap_specs from rect+arrow specs
for i in range(n_gaps):
gap_spec: dict = {}
# get rect spec for this gap
if i < len(rect_specs):
rect_spec: dict = rect_specs[i].copy()
start_time = rect_spec.get('start_time')
end_time = rect_spec.get('end_time')
if (
start_time is not None
and end_time is not None
):
# lookup from pre-built mapping
start_row = time_to_row.get(start_time)
end_row = time_to_row.get(end_time)
if (
start_row is None
or end_row is None
):
log.warning(
f'Timestamp lookup failed for '
f'gap[{i}], skipping'
)
continue
start_idx = start_row['index']
end_idx = end_row['index']
start_close = start_row['close']
end_open = end_row['open']
from_idx: float = 0.16 - 0.06
gap_spec['start_pos'] = (
start_idx + 1 - from_idx,
start_close,
)
gap_spec['end_pos'] = (
end_idx + from_idx,
end_open,
)
gap_spec['start_time'] = start_time
gap_spec['end_time'] = end_time
gap_spec['color'] = rect_spec.get(
'color',
'dad_blue',
)
# get arrow spec for this gap
if i < len(arrow_specs):
arrow_spec: dict = arrow_specs[i].copy()
x: float = float(arrow_spec.get('x', 0))
y: float = float(arrow_spec.get('y', 0))
time_val: float|None = arrow_spec.get('time')
# timestamp-based index lookup (only for x, NOT y!)
# y is already set to the PREVIOUS bar's close
if time_val is not None:
arrow_row = time_to_row.get(time_val)
if arrow_row is not None:
x = arrow_row['index']
# NOTE: do NOT update y! it's the
# previous bar's close, not current
else:
log.warning(
f'Arrow timestamp {time_val} not '
f'found for gap[{i}], using x={x}'
)
gap_spec['arrow_x'] = x
gap_spec['arrow_y'] = y
gap_spec['time'] = time_val
gap_spec['pointing'] = arrow_spec.get(
'pointing',
'down',
)
gap_spec['alpha'] = arrow_spec.get('alpha', 169)
gap_specs.append(gap_spec)
profiler(f'built {len(gap_specs)} gap_specs')
# create single GapAnnotations item for all gaps
if gap_specs:
gaps_item = GapAnnotations(
gap_specs=gap_specs,
array=arr,
color=gap_specs[0].get('color', 'dad_blue'),
alpha=gap_specs[0].get('alpha', 169),
arrow_size=10.0,
fqme=fqme,
timeframe=timeframe,
)
chart.plotItem.addItem(gaps_item)
# register single item for repositioning
aid: int = id(gaps_item)
annots[aid] = gaps_item
aids_set.add(aid)
result['rects'].append(aid)
profiler(
f'created GapAnnotations item for {len(gap_specs)} '
f'gaps'
)
# A/B comparison: optionally create individual arrows
# alongside batch for visual comparison
if show_individual_arrows:
godw = chart.linked.godwidget
arrows: ArrowEditor = ArrowEditor(godw=godw)
for i, spec in enumerate(gap_specs):
if 'arrow_x' not in spec:
continue
aid_str: str = str(uuid4())
arrow: pg.ArrowItem = arrows.add(
plot=chart.plotItem,
uid=aid_str,
x=spec['arrow_x'],
y=spec['arrow_y'],
pointing=spec['pointing'],
color='bracket', # different color
alpha=spec.get('alpha', 169),
headLen=10.0,
headWidth=2.222,
pxMode=True,
)
arrow._abs_x = spec['arrow_x']
arrow._abs_y = spec['arrow_y']
annots[aid_str] = arrow
_editors[aid_str] = arrows
aids_set.add(aid_str)
result['arrows'].append(aid_str)
profiler(
f'created {len(gap_specs)} individual arrows '
f'for comparison'
)
# handle text items separately (less common, keep
# individual items)
n_texts: int = 0
for text_spec in text_specs:
kwargs: dict = text_spec.copy()
text: str = kwargs.pop('text')
x: float = float(kwargs.pop('x'))
y: float = float(kwargs.pop('y'))
time_val: float|None = kwargs.pop('time', None)
# timestamp-based index lookup
if time_val is not None:
matches = arr[arr['time'] == time_val]
if len(matches) > 0:
x = float(matches[0]['index'])
y = float(matches[0]['close'])
color = kwargs.pop('color', 'dad_blue')
anchor = kwargs.pop('anchor', (0, 1))
font_size = kwargs.pop('font_size', None)
text_item: pg.TextItem = pg.TextItem(
text,
color=hcolor(color),
anchor=anchor,
)
if font_size is None:
from ._style import get_fonts
font, font_small = get_fonts()
font_size = font_small.px_size - 1
qfont: QFont = text_item.textItem.font()
qfont.setPixelSize(font_size)
text_item.setFont(qfont)
text_item.setPos(float(x), float(y))
chart.plotItem.addItem(text_item)
text_item._abs_x = float(x)
text_item._abs_y = float(y)
aid: str = str(uuid4())
annots[aid] = text_item
aids_set.add(aid)
result['texts'].append(aid)
n_texts += 1
profiler(
f'created text annotations: {n_texts} texts'
)
profiler.finish()
await annot_req_stream.send(result)
case {
'cmd': 'remove',
'aid': int(aid)|str(aid),
@ -828,26 +471,10 @@ async def serve_rc_annots(
# XXX: reposition all annotations to ensure they
# stay aligned with viz data after reset (eg during
# backfill when abs-index range changes)
chart: ChartPlotWidget = {
60: ds.hist_chart,
1: ds.chart,
}[timeframe]
viz: Viz = chart.get_viz(fqme)
arr = viz.shm.array
n_repositioned: int = 0
for aid, annot in annots.items():
# GapAnnotations batch items have .reposition()
if hasattr(annot, 'reposition'):
annot.reposition(
array=arr,
fqme=fqme,
timeframe=timeframe,
)
n_repositioned += 1
# arrows and text items use abs x,y coords
elif (
if (
hasattr(annot, '_abs_x')
and
hasattr(annot, '_abs_y')
@ -912,21 +539,12 @@ async def remote_annotate(
finally:
# ensure all annots for this connection are deleted
# on any final teardown
profiler = Profiler(
msg=f'Annotation teardown for ctx {ctx.cid}',
disabled=False,
ms_threshold=0.0,
)
(_ctx, aids) = _ctxs[ctx.cid]
assert _ctx is ctx
profiler(f'got {len(aids)} aids to remove')
for aid in aids:
annot: QGraphicsItem = _annots[aid]
assert rm_annot(annot)
profiler(f'removed all {len(aids)} annotations')
class AnnotCtl(Struct):
'''
@ -1128,64 +746,6 @@ class AnnotCtl(Struct):
)
return aid
async def add_batch(
self,
fqme: str,
timeframe: float,
rects: list[dict]|None = None,
arrows: list[dict]|None = None,
texts: list[dict]|None = None,
show_individual_arrows: bool = False,
from_acm: bool = False,
) -> dict[str, list[int]]:
'''
Batch submit multiple annotations in single IPC msg for
much faster remote annotation vs. per-annot round-trips.
Returns dict of annotation IDs:
{
'rects': [aid1, aid2, ...],
'arrows': [aid3, aid4, ...],
'texts': [aid5, aid6, ...],
}
'''
ipc: MsgStream = self._get_ipc(fqme)
with trio.fail_after(10):
await ipc.send({
'fqme': fqme,
'cmd': 'batch',
'timeframe': timeframe,
'rects': rects or [],
'arrows': arrows or [],
'texts': texts or [],
'show_individual_arrows': show_individual_arrows,
})
result: dict = await ipc.receive()
match result:
case {'error': str(msg)}:
log.error(msg)
return {
'rects': [],
'arrows': [],
'texts': [],
}
# register all AIDs with their IPC streams
for aid_list in result.values():
for aid in aid_list:
self._ipcs[aid] = ipc
if not from_acm:
self._annot_stack.push_async_callback(
partial(
self.remove,
aid,
)
)
return result
async def add_text(
self,
fqme: str,
@ -1321,14 +881,3 @@ async def open_annot_ctl(
_annot_stack=annots_stack,
)
yield client
# client exited, measure teardown time
teardown_profiler = Profiler(
msg='Client AnnotCtl teardown',
disabled=False,
ms_threshold=0.0,
)
teardown_profiler('exiting annots_stack')
teardown_profiler('annots_stack exited')
teardown_profiler('exiting gather_contexts')

View File

@ -300,7 +300,10 @@ class GodWidget(QWidget):
getattr(widget, 'on_resize')
self._widgets[widget.mode_name] = widget
def on_win_resize(self, event: QtCore.QEvent) -> None:
def on_win_resize(
self,
event: QtCore.QEvent,
) -> None:
'''
Top level god widget handler from window (the real yaweh) resize
events such that any registered widgets which wish to be
@ -315,7 +318,10 @@ class GodWidget(QWidget):
self._resizing = True
log.info('God widget resize')
log.debug(
f'God widget resize\n'
f'{event}\n'
)
for name, widget in self._widgets.items():
widget.on_resize()

View File

@ -255,8 +255,16 @@ class MainWindow(QMainWindow):
current: QWidget,
) -> None:
'''
Focus handler.
log.info(f'widget focus changed from {last} -> {current}')
For now updates the "current mode" name.
'''
log.debug(
f'widget focus changed from,\n'
f'{last} -> {current}'
)
if current is not None:
# cursor left window?

View File

@ -177,7 +177,7 @@ def chart(
return
# global opts
brokernames = config['brokers']
# brokernames: list[str] = config['brokers']
brokermods = config['brokermods']
assert brokermods
tractorloglevel = config['tractorloglevel']
@ -216,6 +216,7 @@ def chart(
layers['tcp']['port'],
))
# breakpoint()
from tractor.devx import maybe_open_crash_handler
pdb: bool = config['pdb']
with maybe_open_crash_handler(pdb=pdb):

View File

@ -519,7 +519,8 @@ class OrderMode:
'''
Order submitted status event handler.
Commit the order line and registered order uuid, store ack time stamp.
Commit the order line and registered order uuid, store ack
time stamp.
'''
lines = self.lines.commit_line(uuid)

View File

@ -99,6 +99,7 @@ python-downloads = 'manual'
# https://docs.astral.sh/uv/concepts/projects/dependencies/#default-groups
default-groups = [
'uis',
'repl',
]
# ------ tool.uv ------
@ -130,7 +131,7 @@ repl = [
"greenback >=1.1.1, <2.0.0",
# @goodboy's preferred console toolz
"xonsh",
"xonsh>=0.22.2",
"prompt-toolkit ==3.0.40",
"pyperclip>=1.9.0",
@ -191,23 +192,19 @@ include = ["piker"]
[tool.uv.sources]
pyqtgraph = { git = "https://github.com/pikers/pyqtgraph.git" }
tomlkit = { git = "https://github.com/pikers/tomlkit.git", branch ="piker_pin" }
pyvnc = { git = "https://github.com/regulad/pyvnc.git" }
pyqtgraph = { git = "https://github.com/pyqtgraph/pyqtgraph.git", branch = 'master' }
# pyqtgraph = { path = '../pyqtgraph', editable = true }
# ?TODO, resync our fork?
# pyqtgraph = { git = "https://github.com/pikers/pyqtgraph.git" }
# to get fancy next-cmd/suggestion feats prior to 0.22.2 B)
# https://github.com/xonsh/xonsh/pull/6037
# https://github.com/xonsh/xonsh/pull/6048
xonsh = { git = 'https://github.com/xonsh/xonsh.git', branch = 'main' }
# xonsh = { git = 'https://github.com/xonsh/xonsh.git', branch = 'main' }
# XXX since, we're like, always hacking new shite all-the-time. Bp
# tractor = { git = "https://github.com/goodboy/tractor.git", branch ="piker_pin" }
tractor = { git = "https://github.com/goodboy/tractor.git", branch ="piker_pin" }
# tractor = { git = "https://pikers.dev/goodboy/tractor", branch = "piker_pin" }
# tractor = { git = "https://pikers.dev/goodboy/tractor", branch = "main" }
# ------ goodboy ------
# hackin dev-envs, usually there's something new he's hackin in..
tractor = { path = "../tractor", editable = true }
# tractor = { path = "../tractor", editable = true }
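After re-pinning sources like this (git branch vs. local editable path), a quick way to confirm which build of `tractor`/`pyqtgraph`/`xonsh` actually resolved into the active env; a stdlib-only sketch, not part of this change:

# sanity-check resolved package versions after a re-lock/sync
from importlib.metadata import PackageNotFoundError, version

for pkg in ('tractor', 'pyqtgraph', 'xonsh'):
    try:
        print(f'{pkg}: {version(pkg)}')
    except PackageNotFoundError:
        print(f'{pkg}: not installed')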

381
uv.lock
View File

@ -108,15 +108,6 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/e0/44/827b2a91a5816512fcaf3cc4ebc465ccd5d598c45cefa6703fcf4a79018f/attrs-23.2.0-py3-none-any.whl", hash = "sha256:99b87a485a5820b23b879f04c2305b44b951b502fd64be915879d77a7e8fc6f1", size = 60752, upload-time = "2023-12-31T06:30:30.772Z" },
]
[[package]]
name = "base58"
version = "2.1.1"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/7f/45/8ae61209bb9015f516102fa559a2914178da1d5868428bd86a1b4421141d/base58-2.1.1.tar.gz", hash = "sha256:c5d0cb3f5b6e81e8e35da5754388ddcc6d0d14b6c6a132cb93d69ed580a7278c", size = 6528, upload-time = "2021-10-30T22:12:17.858Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/4a/45/ec96b29162a402fc4c1c5512d114d7b3787b9d1c2ec241d9568b4816ee23/base58-2.1.1-py3-none-any.whl", hash = "sha256:11a36f4d3ce51dfc1043f3218591ac4eb1ceb172919cebe05b52a5bcc8d245c2", size = 5621, upload-time = "2021-10-30T22:12:16.658Z" },
]
[[package]]
name = "bidict"
version = "0.23.1"
@ -126,74 +117,6 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/99/37/e8730c3587a65eb5645d4aba2d27aae48e8003614d6aaf15dda67f702f1f/bidict-0.23.1-py3-none-any.whl", hash = "sha256:5dae8d4d79b552a71cbabc7deb25dfe8ce710b17ff41711e13010ead2abfc3e5", size = 32764, upload-time = "2024-02-18T19:09:04.156Z" },
]
[[package]]
name = "blake3"
version = "1.0.8"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/75/aa/abcd75e9600987a0bc6cfe9b6b2ff3f0e2cb08c170addc6e76035b5c4cb3/blake3-1.0.8.tar.gz", hash = "sha256:513cc7f0f5a7c035812604c2c852a0c1468311345573de647e310aca4ab165ba", size = 117308, upload-time = "2025-10-14T06:47:48.83Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/ed/a0/b7b6dff04012cfd6e665c09ee446f749bd8ea161b00f730fe1bdecd0f033/blake3-1.0.8-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:d8da4233984d51471bd4e4366feda1d90d781e712e0a504ea54b1f2b3577557b", size = 347983, upload-time = "2025-10-14T06:45:47.214Z" },
{ url = "https://files.pythonhosted.org/packages/5b/a2/264091cac31d7ae913f1f296abc20b8da578b958ffb86100a7ce80e8bf5c/blake3-1.0.8-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:1257be19f2d381c868a34cc822fc7f12f817ddc49681b6d1a2790bfbda1a9865", size = 325415, upload-time = "2025-10-14T06:45:48.482Z" },
{ url = "https://files.pythonhosted.org/packages/ee/7d/85a4c0782f613de23d114a7a78fcce270f75b193b3ff3493a0de24ba104a/blake3-1.0.8-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:269f255b110840e52b6ce9db02217e39660ebad3e34ddd5bca8b8d378a77e4e1", size = 371296, upload-time = "2025-10-14T06:45:49.674Z" },
{ url = "https://files.pythonhosted.org/packages/e3/20/488475254976ed93fab57c67aa80d3b40df77f7d9db6528c9274bff53e08/blake3-1.0.8-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:66ca28a673025c40db3eba21a9cac52f559f83637efa675b3f6bd8683f0415f3", size = 374516, upload-time = "2025-10-14T06:45:51.23Z" },
{ url = "https://files.pythonhosted.org/packages/7b/21/2a1c47fedb77fb396512677ec6d46caf42ac6e9a897db77edd0a2a46f7bb/blake3-1.0.8-cp312-cp312-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:bcb04966537777af56c1f399b35525aa70a1225816e121ff95071c33c0f7abca", size = 447911, upload-time = "2025-10-14T06:45:52.637Z" },
{ url = "https://files.pythonhosted.org/packages/cb/7d/db0626df16029713e7e61b67314c4835e85c296d82bd907c21c6ea271da2/blake3-1.0.8-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e5b5da177d62cc4b7edf0cea08fe4dec960c9ac27f916131efa890a01f747b93", size = 505420, upload-time = "2025-10-14T06:45:54.445Z" },
{ url = "https://files.pythonhosted.org/packages/5b/55/6e737850c2d58a6d9de8a76dad2ae0f75b852a23eb4ecb07a0b165e6e436/blake3-1.0.8-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:38209b10482c97e151681ea3e91cc7141f56adbbf4820a7d701a923124b41e6a", size = 394189, upload-time = "2025-10-14T06:45:55.719Z" },
{ url = "https://files.pythonhosted.org/packages/5b/94/eafaa5cdddadc0c9c603a6a6d8339433475e1a9f60c8bb9c2eed2d8736b6/blake3-1.0.8-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:504d1399b7fb91dfe5c25722d2807990493185faa1917456455480c36867adb5", size = 388001, upload-time = "2025-10-14T06:45:57.067Z" },
{ url = "https://files.pythonhosted.org/packages/17/81/735fa00d13de7f68b25e1b9cb36ff08c6f165e688d85d8ec2cbfcdedccc5/blake3-1.0.8-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:c84af132aa09abeadf9a0118c8fb26f4528f3f42c10ef8be0fcf31c478774ec4", size = 550302, upload-time = "2025-10-14T06:45:58.657Z" },
{ url = "https://files.pythonhosted.org/packages/0e/c6/d1fe8bdea4a6088bd54b5a58bc40aed89a4e784cd796af7722a06f74bae7/blake3-1.0.8-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:a25db3d36b55f5ed6a86470155cc749fc9c5b91c949b8d14f48658f9d960d9ec", size = 554211, upload-time = "2025-10-14T06:46:00.269Z" },
{ url = "https://files.pythonhosted.org/packages/55/d1/ca74aa450cbe10e396e061f26f7a043891ffa1485537d6b30d3757e20995/blake3-1.0.8-cp312-cp312-win32.whl", hash = "sha256:e0fee93d5adcd44378b008c147e84f181f23715307a64f7b3db432394bbfce8b", size = 228343, upload-time = "2025-10-14T06:46:01.533Z" },
{ url = "https://files.pythonhosted.org/packages/4d/42/bbd02647169e3fbed27558555653ac2578c6f17ccacf7d1956c58ef1d214/blake3-1.0.8-cp312-cp312-win_amd64.whl", hash = "sha256:6a6eafc29e4f478d365a87d2f25782a521870c8514bb43734ac85ae9be71caf7", size = 215704, upload-time = "2025-10-14T06:46:02.79Z" },
{ url = "https://files.pythonhosted.org/packages/55/b8/11de9528c257f7f1633f957ccaff253b706838d22c5d2908e4735798ec01/blake3-1.0.8-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:46dc20976bd6c235959ef0246ec73420d1063c3da2839a9c87ca395cf1fd7943", size = 347771, upload-time = "2025-10-14T06:46:04.248Z" },
{ url = "https://files.pythonhosted.org/packages/50/26/f7668be55c909678b001ecacff11ad7016cd9b4e9c7cc87b5971d638c5a9/blake3-1.0.8-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:d17eb6382634b3a5bc0c0e0454d5265b0becaeeadb6801ed25150b39a999d0cc", size = 325431, upload-time = "2025-10-14T06:46:06.136Z" },
{ url = "https://files.pythonhosted.org/packages/77/57/e8a85fa261894bf7ce7af928ff3408aab60287ab8d58b55d13a3f700b619/blake3-1.0.8-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:19fc6f2b7edab8acff6895fc6e38c19bd79f4c089e21153020c75dfc7397d52d", size = 370994, upload-time = "2025-10-14T06:46:07.398Z" },
{ url = "https://files.pythonhosted.org/packages/62/cd/765b76bb48b8b294fea94c9008b0d82b4cfa0fa2f3c6008d840d01a597e4/blake3-1.0.8-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:4f54cff7f15d91dc78a63a2dd02a3dccdc932946f271e2adb4130e0b4cf608ba", size = 374372, upload-time = "2025-10-14T06:46:08.698Z" },
{ url = "https://files.pythonhosted.org/packages/36/7a/32084eadbb28592bb07298f0de316d2da586c62f31500a6b1339a7e7b29b/blake3-1.0.8-cp313-cp313-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e7e12a777f6b798eb8d06f875d6e108e3008bd658d274d8c676dcf98e0f10537", size = 447627, upload-time = "2025-10-14T06:46:10.002Z" },
{ url = "https://files.pythonhosted.org/packages/a7/f4/3788a1d86e17425eea147e28d7195d7053565fc279236a9fd278c2ec495e/blake3-1.0.8-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ddfc59b0176fb31168f08d5dd536e69b1f4f13b5a0f4b0c3be1003efd47f9308", size = 507536, upload-time = "2025-10-14T06:46:11.614Z" },
{ url = "https://files.pythonhosted.org/packages/fe/01/4639cba48513b94192681b4da472cdec843d3001c5344d7051ee5eaef606/blake3-1.0.8-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:a2336d5b2a801a7256da21150348f41610a6c21dae885a3acb1ebbd7333d88d8", size = 394105, upload-time = "2025-10-14T06:46:12.808Z" },
{ url = "https://files.pythonhosted.org/packages/21/ae/6e55c19c8460fada86cd1306a390a09b0c5a2e2e424f9317d2edacea439f/blake3-1.0.8-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e4072196547484c95a5a09adbb952e9bb501949f03f9e2a85e7249ef85faaba8", size = 386928, upload-time = "2025-10-14T06:46:16.284Z" },
{ url = "https://files.pythonhosted.org/packages/ee/6c/05b7a5a907df1be53a8f19e7828986fc6b608a44119641ef9c0804fbef15/blake3-1.0.8-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:0eab3318ec02f8e16fe549244791ace2ada2c259332f0c77ab22cf94dfff7130", size = 550003, upload-time = "2025-10-14T06:46:17.791Z" },
{ url = "https://files.pythonhosted.org/packages/b4/03/f0ea4adfedc1717623be6460b3710fcb725ca38082c14274369803f727e1/blake3-1.0.8-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:a33b9a1fb6d1d559a8e0d04b041e99419a6bb771311c774f6ff57ed7119c70ed", size = 553857, upload-time = "2025-10-14T06:46:19.088Z" },
{ url = "https://files.pythonhosted.org/packages/cc/6f/e5410d2e2a30c8aba8389ffc1c0061356916bf5ecd0a210344e7b69b62ab/blake3-1.0.8-cp313-cp313-win32.whl", hash = "sha256:e171b169cb7ea618e362a4dddb7a4d4c173bbc08b9ba41ea3086dd1265530d4f", size = 228315, upload-time = "2025-10-14T06:46:20.391Z" },
{ url = "https://files.pythonhosted.org/packages/79/ef/d9c297956dfecd893f29f59e7b22445aba5b47b7f6815d9ba5dcd73fcae6/blake3-1.0.8-cp313-cp313-win_amd64.whl", hash = "sha256:3168c457255b5d2a2fc356ba696996fcaff5d38284f968210d54376312107662", size = 215477, upload-time = "2025-10-14T06:46:21.542Z" },
{ url = "https://files.pythonhosted.org/packages/20/ba/eaa7723d66dd8ab762a3e85e139bb9c46167b751df6e950ad287adb8fb61/blake3-1.0.8-cp313-cp313t-macosx_10_12_x86_64.whl", hash = "sha256:b4d672c24dc15ec617d212a338a4ca14b449829b6072d09c96c63b6e6b621aed", size = 347289, upload-time = "2025-10-14T06:46:22.772Z" },
{ url = "https://files.pythonhosted.org/packages/47/b3/6957f6ee27f0d5b8c4efdfda68a1298926a88c099f4dd89c711049d16526/blake3-1.0.8-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:1af0e5a29aa56d4fba904452ae784740997440afd477a15e583c38338e641f41", size = 324444, upload-time = "2025-10-14T06:46:24.729Z" },
{ url = "https://files.pythonhosted.org/packages/13/da/722cebca11238f3b24d3cefd2361c9c9ea47cfa0ad9288eeb4d1e0b7cf93/blake3-1.0.8-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ef153c5860d5bf1cc71aece69b28097d2a392913eb323d6b52555c875d0439fc", size = 370441, upload-time = "2025-10-14T06:46:26.29Z" },
{ url = "https://files.pythonhosted.org/packages/2e/d5/2f7440c8e41c0af995bad3a159e042af0f4ed1994710af5b4766ca918f65/blake3-1.0.8-cp313-cp313t-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:e8ae3689f0c7bfa6ce6ae45cab110e4c3442125c4c23b28f1f097856de26e4d1", size = 374312, upload-time = "2025-10-14T06:46:27.451Z" },
{ url = "https://files.pythonhosted.org/packages/a6/6c/fb6a7812e60ce3e110bcbbb11f167caf3e975c589572c41e1271f35f2c41/blake3-1.0.8-cp313-cp313t-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:3fb83532f7456ddeb68dae1b36e1f7c52f9cb72852ac01159bbcb1a12b0f8be0", size = 447007, upload-time = "2025-10-14T06:46:29.056Z" },
{ url = "https://files.pythonhosted.org/packages/13/3b/c99b43fae5047276ea9d944077c190fc1e5f22f57528b9794e21f7adedc6/blake3-1.0.8-cp313-cp313t-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:6ae7754c7d96e92a70a52e07c732d594cf9924d780f49fffd3a1e9235e0f5ba7", size = 507323, upload-time = "2025-10-14T06:46:30.661Z" },
{ url = "https://files.pythonhosted.org/packages/fc/bb/ba90eddd592f8c074a0694cb0a744b6bd76bfe67a14c2b490c8bdfca3119/blake3-1.0.8-cp313-cp313t-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:4bacaae75e98dee3b7da6c5ee3b81ee21a3352dd2477d6f1d1dbfd38cdbf158a", size = 393449, upload-time = "2025-10-14T06:46:31.805Z" },
{ url = "https://files.pythonhosted.org/packages/25/ed/58a2acd0b9e14459cdaef4344db414d4a36e329b9720921b442a454dd443/blake3-1.0.8-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9456c829601d72852d8ba0af8dae0610f7def1d59f5942efde1e2ef93e8a8b57", size = 386844, upload-time = "2025-10-14T06:46:33.195Z" },
{ url = "https://files.pythonhosted.org/packages/4a/04/fed09845b18d90862100c8e48308261e2f663aab25d3c71a6a0bdda6618b/blake3-1.0.8-cp313-cp313t-musllinux_1_1_aarch64.whl", hash = "sha256:497ef8096ec4ac1ffba9a66152cee3992337cebf8ea434331d8fd9ce5423d227", size = 549550, upload-time = "2025-10-14T06:46:35.23Z" },
{ url = "https://files.pythonhosted.org/packages/d6/65/1859fddfabc1cc72548c2269d988819aad96d854e25eae00531517925901/blake3-1.0.8-cp313-cp313t-musllinux_1_1_x86_64.whl", hash = "sha256:511133bab85ff60ed143424ce484d08c60894ff7323f685d7a6095f43f0c85c3", size = 553805, upload-time = "2025-10-14T06:46:36.532Z" },
{ url = "https://files.pythonhosted.org/packages/c1/c7/2969352017f62378e388bb07bb2191bc9a953f818dc1cd6b9dd5c24916e1/blake3-1.0.8-cp313-cp313t-win32.whl", hash = "sha256:9c9fbdacfdeb68f7ca53bb5a7a5a593ec996eaf21155ad5b08d35e6f97e60877", size = 228068, upload-time = "2025-10-14T06:46:37.826Z" },
{ url = "https://files.pythonhosted.org/packages/d8/fc/923e25ac9cadfff1cd20038bcc0854d0f98061eb6bc78e42c43615f5982d/blake3-1.0.8-cp313-cp313t-win_amd64.whl", hash = "sha256:3cec94ed5676821cf371e9c9d25a41b4f3ebdb5724719b31b2749653b7cc1dfa", size = 215369, upload-time = "2025-10-14T06:46:39.054Z" },
{ url = "https://files.pythonhosted.org/packages/2e/2a/9f13ea01b03b1b4751a1cc2b6c1ef4b782e19433a59cf35b59cafb2a2696/blake3-1.0.8-cp314-cp314-macosx_10_12_x86_64.whl", hash = "sha256:2c33dac2c6112bc23f961a7ca305c7e34702c8177040eb98d0389d13a347b9e1", size = 347016, upload-time = "2025-10-14T06:46:40.318Z" },
{ url = "https://files.pythonhosted.org/packages/06/8e/8458c4285fbc5de76414f243e4e0fcab795d71a8b75324e14959aee699da/blake3-1.0.8-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:c445eff665d21c3b3b44f864f849a2225b1164c08654beb23224a02f087b7ff1", size = 324496, upload-time = "2025-10-14T06:46:42.355Z" },
{ url = "https://files.pythonhosted.org/packages/49/fa/b913eb9cc4af708c03e01e6b88a8bb3a74833ba4ae4b16b87e2829198e06/blake3-1.0.8-cp314-cp314-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a47939f04b89c5c6ff1e51e883e5efab1ea1bf01a02f4d208d216dddd63d0dd8", size = 370654, upload-time = "2025-10-14T06:46:43.907Z" },
{ url = "https://files.pythonhosted.org/packages/7f/4f/245e0800c33b99c8f2b570d9a7199b51803694913ee4897f339648502933/blake3-1.0.8-cp314-cp314-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:73e0b4fa25f6e3078526a592fb38fca85ef204fd02eced6731e1cdd9396552d4", size = 374693, upload-time = "2025-10-14T06:46:45.186Z" },
{ url = "https://files.pythonhosted.org/packages/a2/a6/8cb182c8e482071dbdfcc6ec0048271fd48bcb78782d346119ff54993700/blake3-1.0.8-cp314-cp314-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4b0543c57eb9d6dac9d4bced63e9f7f7b546886ac04cec8da3c3d9c8f30cbbb7", size = 447673, upload-time = "2025-10-14T06:46:46.358Z" },
{ url = "https://files.pythonhosted.org/packages/06/b7/1cbbb5574d2a9436d1b15e7eb5b9d82e178adcaca71a97b0fddaca4bfe3a/blake3-1.0.8-cp314-cp314-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ed972ebd553c0c25363459e9fc71a38c045d8419e365b59acd8cd791eff13981", size = 507233, upload-time = "2025-10-14T06:46:48.109Z" },
{ url = "https://files.pythonhosted.org/packages/9c/45/b55825d90af353b3e26c653bab278da9d6563afcf66736677f9397e465be/blake3-1.0.8-cp314-cp314-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3bafdec95dfffa3f6571e529644744e280337df15ddd9728f224ba70c5779b23", size = 393852, upload-time = "2025-10-14T06:46:49.511Z" },
{ url = "https://files.pythonhosted.org/packages/34/73/9058a1a457dd20491d1b37de53d6876eff125e1520d9b2dd7d0acbc88de2/blake3-1.0.8-cp314-cp314-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2d78f06f3fb838b34c330e2987090376145cbe5944d8608a0c4779c779618f7b", size = 386442, upload-time = "2025-10-14T06:46:51.205Z" },
{ url = "https://files.pythonhosted.org/packages/30/6d/561d537ffc17985e276e08bf4513f1c106f1fdbef571e782604dc4e44070/blake3-1.0.8-cp314-cp314-musllinux_1_1_aarch64.whl", hash = "sha256:dd03ff08d1b6e4fdda1cd03826f971ae8966ef6f683a8c68aa27fb21904b5aa9", size = 549929, upload-time = "2025-10-14T06:46:52.494Z" },
{ url = "https://files.pythonhosted.org/packages/03/2f/dbe20d2c57f1a67c63be4ba310bcebc707b945c902a0bde075d2a8f5cd5c/blake3-1.0.8-cp314-cp314-musllinux_1_1_x86_64.whl", hash = "sha256:4e02a3c499e35bf51fc15b2738aca1a76410804c877bcd914752cac4f71f052a", size = 553750, upload-time = "2025-10-14T06:46:54.194Z" },
{ url = "https://files.pythonhosted.org/packages/6b/da/c6cb712663c869b2814870c2798e57289c4268c5ac5fb12d467fce244860/blake3-1.0.8-cp314-cp314-win32.whl", hash = "sha256:a585357d5d8774aad9ffc12435de457f9e35cde55e0dc8bc43ab590a6929e59f", size = 228404, upload-time = "2025-10-14T06:46:56.807Z" },
{ url = "https://files.pythonhosted.org/packages/dc/b6/c7dcd8bc3094bba1c4274e432f9e77a7df703532ca000eaa550bd066b870/blake3-1.0.8-cp314-cp314-win_amd64.whl", hash = "sha256:9ab5998e2abd9754819753bc2f1cf3edf82d95402bff46aeef45ed392a5468bf", size = 215460, upload-time = "2025-10-14T06:46:58.15Z" },
{ url = "https://files.pythonhosted.org/packages/75/3c/6c8afd856c353176836daa5cc33a7989e8f54569e9d53eb1c53fc8f80c34/blake3-1.0.8-cp314-cp314t-macosx_10_12_x86_64.whl", hash = "sha256:e2df12f295f95a804338bd300e8fad4a6f54fd49bd4d9c5893855a230b5188a8", size = 347482, upload-time = "2025-10-14T06:47:00.189Z" },
{ url = "https://files.pythonhosted.org/packages/6a/35/92cd5501ce8e1f5cabdc0c3ac62d69fdb13ff0b60b62abbb2b6d0a53a790/blake3-1.0.8-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:63379be58438878eeb76ebe4f0efbeaabf42b79f2cff23b6126b7991588ced67", size = 324376, upload-time = "2025-10-14T06:47:01.413Z" },
{ url = "https://files.pythonhosted.org/packages/11/33/503b37220a3e2e31917ef13722efd00055af51c5e88ae30974c733d7ece6/blake3-1.0.8-cp314-cp314t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:88d527c247f9609dc1d45a08fd243e39f0d5300d54c57e048de24d4fa9240ebb", size = 370220, upload-time = "2025-10-14T06:47:02.573Z" },
{ url = "https://files.pythonhosted.org/packages/3e/df/fe817843adf59516c04d44387bd643b422a3b0400ea95c6ede6a49920737/blake3-1.0.8-cp314-cp314t-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:506a47897a11ebe8f3cdeb52f1365d6a2f83959e98ccb0c830f8f73277d4d358", size = 373454, upload-time = "2025-10-14T06:47:03.784Z" },
{ url = "https://files.pythonhosted.org/packages/d1/4d/90a2a623575373dfc9b683f1bad1bf017feafa5a6d65d94fb09543050740/blake3-1.0.8-cp314-cp314t-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e5122a61b3b004bbbd979bdf83a3aaab432da3e2a842d7ddf1c273f2503b4884", size = 447102, upload-time = "2025-10-14T06:47:04.958Z" },
{ url = "https://files.pythonhosted.org/packages/93/ff/4e8ce314f60115c4c657b1fdbe9225b991da4f5bcc5d1c1f1d151e2f39d6/blake3-1.0.8-cp314-cp314t-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:0171e85d56dec1219abdae5f49a0ed12cb3f86a454c29160a64fd8a8166bba37", size = 506791, upload-time = "2025-10-14T06:47:06.82Z" },
{ url = "https://files.pythonhosted.org/packages/44/88/2963a1f18aab52bdcf35379b2b48c34bbc462320c37e76960636b8602c36/blake3-1.0.8-cp314-cp314t-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:003f61e8c41dd9931edddf1cc6a1bb680fb2ac0ad15493ef4a1df9adc59ce9df", size = 393717, upload-time = "2025-10-14T06:47:09.085Z" },
{ url = "https://files.pythonhosted.org/packages/45/d1/a848ed8e8d4e236b9b16381768c9ae99d92890c24886bb4505aa9c3d2033/blake3-1.0.8-cp314-cp314t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d2c3151955efb09ba58cd3e1263521e15e9e3866a40d6bd3556d86fc968e8f95", size = 386150, upload-time = "2025-10-14T06:47:10.363Z" },
{ url = "https://files.pythonhosted.org/packages/96/09/e3eb5d60f97c01de23d9f434e6e1fc117efb466eaa1f6ddbbbcb62580d6e/blake3-1.0.8-cp314-cp314t-musllinux_1_1_aarch64.whl", hash = "sha256:5eb25bca3cee2e0dd746a214784fb36be6a43640c01c55b6b4e26196e72d076c", size = 549120, upload-time = "2025-10-14T06:47:11.713Z" },
{ url = "https://files.pythonhosted.org/packages/14/ad/3d9661c710febb8957dd685fdb3e5a861aa0ac918eda3031365ce45789e2/blake3-1.0.8-cp314-cp314t-musllinux_1_1_x86_64.whl", hash = "sha256:ab4e1dea4fa857944944db78e8f20d99ee2e16b2dea5a14f514fb0607753ac83", size = 553264, upload-time = "2025-10-14T06:47:13.317Z" },
{ url = "https://files.pythonhosted.org/packages/11/55/e332a5b49edf377d0690e95951cca21a00c568f6e37315f9749efee52617/blake3-1.0.8-cp314-cp314t-win32.whl", hash = "sha256:67f1bc11bf59464ef092488c707b13dd4e872db36e25c453dfb6e0c7498df9f1", size = 228116, upload-time = "2025-10-14T06:47:14.516Z" },
{ url = "https://files.pythonhosted.org/packages/b0/5c/dbd00727a3dd165d7e0e8af40e630cd7e45d77b525a3218afaff8a87358e/blake3-1.0.8-cp314-cp314t-win_amd64.whl", hash = "sha256:421b99cdf1ff2d1bf703bc56c454f4b286fce68454dd8711abbcb5a0df90c19a", size = 215133, upload-time = "2025-10-14T06:47:16.069Z" },
]
[[package]]
name = "caio"
version = "0.9.24"
@ -446,15 +369,6 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/76/f2/98fd8d0b514622a789fd2824b59bd6041b799aaeeba14a8d92d52f6654dd/cython-3.2.2-py3-none-any.whl", hash = "sha256:13b99ecb9482aff6a6c12d1ca6feef6940c507af909914b49f568de74fa965fb", size = 1255106, upload-time = "2025-11-30T12:48:18.454Z" },
]
[[package]]
name = "dnspython"
version = "2.8.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/8c/8b/57666417c0f90f08bcafa776861060426765fdb422eb10212086fb811d26/dnspython-2.8.0.tar.gz", hash = "sha256:181d3c6996452cb1189c4046c61599b84a5a86e099562ffde77d26984ff26d0f", size = 368251, upload-time = "2025-09-07T18:58:00.022Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/ba/5a/18ad964b0086c6e62e2e7500f7edc89e3faa45033c71c1893d34eed2b2de/dnspython-2.8.0-py3-none-any.whl", hash = "sha256:01d9bbc4a2d76bf0db7c1f729812ded6d912bd318d3b1cf81d30c0f845dbf3af", size = 331094, upload-time = "2025-09-07T18:57:58.071Z" },
]
[[package]]
name = "elastic-transport"
version = "8.17.1"
@ -814,94 +728,6 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/b3/38/89ba8ad64ae25be8de66a6d463314cf1eb366222074cfda9ee839c56a4b4/mdurl-0.1.2-py3-none-any.whl", hash = "sha256:84008a41e51615a49fc9966191ff91509e3c40b939176e643fd50a5c2196b8f8", size = 9979, upload-time = "2022-08-14T12:40:09.779Z" },
]
[[package]]
name = "mmh3"
version = "5.2.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/a7/af/f28c2c2f51f31abb4725f9a64bc7863d5f491f6539bd26aee2a1d21a649e/mmh3-5.2.0.tar.gz", hash = "sha256:1efc8fec8478e9243a78bb993422cf79f8ff85cb4cf6b79647480a31e0d950a8", size = 33582, upload-time = "2025-07-29T07:43:48.49Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/bf/6a/d5aa7edb5c08e0bd24286c7d08341a0446f9a2fbbb97d96a8a6dd81935ee/mmh3-5.2.0-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:384eda9361a7bf83a85e09447e1feafe081034af9dd428893701b959230d84be", size = 56141, upload-time = "2025-07-29T07:42:13.456Z" },
{ url = "https://files.pythonhosted.org/packages/08/49/131d0fae6447bc4a7299ebdb1a6fb9d08c9f8dcf97d75ea93e8152ddf7ab/mmh3-5.2.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:2c9da0d568569cc87315cb063486d761e38458b8ad513fedd3dc9263e1b81bcd", size = 40681, upload-time = "2025-07-29T07:42:14.306Z" },
{ url = "https://files.pythonhosted.org/packages/8f/6f/9221445a6bcc962b7f5ff3ba18ad55bba624bacdc7aa3fc0a518db7da8ec/mmh3-5.2.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:86d1be5d63232e6eb93c50881aea55ff06eb86d8e08f9b5417c8c9b10db9db96", size = 40062, upload-time = "2025-07-29T07:42:15.08Z" },
{ url = "https://files.pythonhosted.org/packages/1e/d4/6bb2d0fef81401e0bb4c297d1eb568b767de4ce6fc00890bc14d7b51ecc4/mmh3-5.2.0-cp312-cp312-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:bf7bee43e17e81671c447e9c83499f53d99bf440bc6d9dc26a841e21acfbe094", size = 97333, upload-time = "2025-07-29T07:42:16.436Z" },
{ url = "https://files.pythonhosted.org/packages/44/e0/ccf0daff8134efbb4fbc10a945ab53302e358c4b016ada9bf97a6bdd50c1/mmh3-5.2.0-cp312-cp312-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:7aa18cdb58983ee660c9c400b46272e14fa253c675ed963d3812487f8ca42037", size = 103310, upload-time = "2025-07-29T07:42:17.796Z" },
{ url = "https://files.pythonhosted.org/packages/02/63/1965cb08a46533faca0e420e06aff8bbaf9690a6f0ac6ae6e5b2e4544687/mmh3-5.2.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:ae9d032488fcec32d22be6542d1a836f00247f40f320844dbb361393b5b22773", size = 106178, upload-time = "2025-07-29T07:42:19.281Z" },
{ url = "https://files.pythonhosted.org/packages/c2/41/c883ad8e2c234013f27f92061200afc11554ea55edd1bcf5e1accd803a85/mmh3-5.2.0-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:e1861fb6b1d0453ed7293200139c0a9011eeb1376632e048e3766945b13313c5", size = 113035, upload-time = "2025-07-29T07:42:20.356Z" },
{ url = "https://files.pythonhosted.org/packages/df/b5/1ccade8b1fa625d634a18bab7bf08a87457e09d5ec8cf83ca07cbea9d400/mmh3-5.2.0-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:99bb6a4d809aa4e528ddfe2c85dd5239b78b9dd14be62cca0329db78505e7b50", size = 120784, upload-time = "2025-07-29T07:42:21.377Z" },
{ url = "https://files.pythonhosted.org/packages/77/1c/919d9171fcbdcdab242e06394464ccf546f7d0f3b31e0d1e3a630398782e/mmh3-5.2.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:1f8d8b627799f4e2fcc7c034fed8f5f24dc7724ff52f69838a3d6d15f1ad4765", size = 99137, upload-time = "2025-07-29T07:42:22.344Z" },
{ url = "https://files.pythonhosted.org/packages/66/8a/1eebef5bd6633d36281d9fc83cf2e9ba1ba0e1a77dff92aacab83001cee4/mmh3-5.2.0-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:b5995088dd7023d2d9f310a0c67de5a2b2e06a570ecfd00f9ff4ab94a67cde43", size = 98664, upload-time = "2025-07-29T07:42:23.269Z" },
{ url = "https://files.pythonhosted.org/packages/13/41/a5d981563e2ee682b21fb65e29cc0f517a6734a02b581359edd67f9d0360/mmh3-5.2.0-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:1a5f4d2e59d6bba8ef01b013c472741835ad961e7c28f50c82b27c57748744a4", size = 106459, upload-time = "2025-07-29T07:42:24.238Z" },
{ url = "https://files.pythonhosted.org/packages/24/31/342494cd6ab792d81e083680875a2c50fa0c5df475ebf0b67784f13e4647/mmh3-5.2.0-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:fd6e6c3d90660d085f7e73710eab6f5545d4854b81b0135a3526e797009dbda3", size = 110038, upload-time = "2025-07-29T07:42:25.629Z" },
{ url = "https://files.pythonhosted.org/packages/28/44/efda282170a46bb4f19c3e2b90536513b1d821c414c28469a227ca5a1789/mmh3-5.2.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:c4a2f3d83879e3de2eb8cbf562e71563a8ed15ee9b9c2e77ca5d9f73072ac15c", size = 97545, upload-time = "2025-07-29T07:42:27.04Z" },
{ url = "https://files.pythonhosted.org/packages/68/8f/534ae319c6e05d714f437e7206f78c17e66daca88164dff70286b0e8ea0c/mmh3-5.2.0-cp312-cp312-win32.whl", hash = "sha256:2421b9d665a0b1ad724ec7332fb5a98d075f50bc51a6ff854f3a1882bd650d49", size = 40805, upload-time = "2025-07-29T07:42:28.032Z" },
{ url = "https://files.pythonhosted.org/packages/b8/f6/f6abdcfefcedab3c964868048cfe472764ed358c2bf6819a70dd4ed4ed3a/mmh3-5.2.0-cp312-cp312-win_amd64.whl", hash = "sha256:72d80005b7634a3a2220f81fbeb94775ebd12794623bb2e1451701ea732b4aa3", size = 41597, upload-time = "2025-07-29T07:42:28.894Z" },
{ url = "https://files.pythonhosted.org/packages/15/fd/f7420e8cbce45c259c770cac5718badf907b302d3a99ec587ba5ce030237/mmh3-5.2.0-cp312-cp312-win_arm64.whl", hash = "sha256:3d6bfd9662a20c054bc216f861fa330c2dac7c81e7fb8307b5e32ab5b9b4d2e0", size = 39350, upload-time = "2025-07-29T07:42:29.794Z" },
{ url = "https://files.pythonhosted.org/packages/d8/fa/27f6ab93995ef6ad9f940e96593c5dd24744d61a7389532b0fec03745607/mmh3-5.2.0-cp313-cp313-android_21_arm64_v8a.whl", hash = "sha256:e79c00eba78f7258e5b354eccd4d7907d60317ced924ea4a5f2e9d83f5453065", size = 40874, upload-time = "2025-07-29T07:42:30.662Z" },
{ url = "https://files.pythonhosted.org/packages/11/9c/03d13bcb6a03438bc8cac3d2e50f80908d159b31a4367c2e1a7a077ded32/mmh3-5.2.0-cp313-cp313-android_21_x86_64.whl", hash = "sha256:956127e663d05edbeec54df38885d943dfa27406594c411139690485128525de", size = 42012, upload-time = "2025-07-29T07:42:31.539Z" },
{ url = "https://files.pythonhosted.org/packages/4e/78/0865d9765408a7d504f1789944e678f74e0888b96a766d578cb80b040999/mmh3-5.2.0-cp313-cp313-ios_13_0_arm64_iphoneos.whl", hash = "sha256:c3dca4cb5b946ee91b3d6bb700d137b1cd85c20827f89fdf9c16258253489044", size = 39197, upload-time = "2025-07-29T07:42:32.374Z" },
{ url = "https://files.pythonhosted.org/packages/3e/12/76c3207bd186f98b908b6706c2317abb73756d23a4e68ea2bc94825b9015/mmh3-5.2.0-cp313-cp313-ios_13_0_arm64_iphonesimulator.whl", hash = "sha256:e651e17bfde5840e9e4174b01e9e080ce49277b70d424308b36a7969d0d1af73", size = 39840, upload-time = "2025-07-29T07:42:33.227Z" },
{ url = "https://files.pythonhosted.org/packages/5d/0d/574b6cce5555c9f2b31ea189ad44986755eb14e8862db28c8b834b8b64dc/mmh3-5.2.0-cp313-cp313-ios_13_0_x86_64_iphonesimulator.whl", hash = "sha256:9f64bf06f4bf623325fda3a6d02d36cd69199b9ace99b04bb2d7fd9f89688504", size = 40644, upload-time = "2025-07-29T07:42:34.099Z" },
{ url = "https://files.pythonhosted.org/packages/52/82/3731f8640b79c46707f53ed72034a58baad400be908c87b0088f1f89f986/mmh3-5.2.0-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:ddc63328889bcaee77b743309e5c7d2d52cee0d7d577837c91b6e7cc9e755e0b", size = 56153, upload-time = "2025-07-29T07:42:35.031Z" },
{ url = "https://files.pythonhosted.org/packages/4f/34/e02dca1d4727fd9fdeaff9e2ad6983e1552804ce1d92cc796e5b052159bb/mmh3-5.2.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:bb0fdc451fb6d86d81ab8f23d881b8d6e37fc373a2deae1c02d27002d2ad7a05", size = 40684, upload-time = "2025-07-29T07:42:35.914Z" },
{ url = "https://files.pythonhosted.org/packages/8f/36/3dee40767356e104967e6ed6d102ba47b0b1ce2a89432239b95a94de1b89/mmh3-5.2.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:b29044e1ffdb84fe164d0a7ea05c7316afea93c00f8ed9449cf357c36fc4f814", size = 40057, upload-time = "2025-07-29T07:42:36.755Z" },
{ url = "https://files.pythonhosted.org/packages/31/58/228c402fccf76eb39a0a01b8fc470fecf21965584e66453b477050ee0e99/mmh3-5.2.0-cp313-cp313-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:58981d6ea9646dbbf9e59a30890cbf9f610df0e4a57dbfe09215116fd90b0093", size = 97344, upload-time = "2025-07-29T07:42:37.675Z" },
{ url = "https://files.pythonhosted.org/packages/34/82/fc5ce89006389a6426ef28e326fc065b0fbaaed230373b62d14c889f47ea/mmh3-5.2.0-cp313-cp313-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:7e5634565367b6d98dc4aa2983703526ef556b3688ba3065edb4b9b90ede1c54", size = 103325, upload-time = "2025-07-29T07:42:38.591Z" },
{ url = "https://files.pythonhosted.org/packages/09/8c/261e85777c6aee1ebd53f2f17e210e7481d5b0846cd0b4a5c45f1e3761b8/mmh3-5.2.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:b0271ac12415afd3171ab9a3c7cbfc71dee2c68760a7dc9d05bf8ed6ddfa3a7a", size = 106240, upload-time = "2025-07-29T07:42:39.563Z" },
{ url = "https://files.pythonhosted.org/packages/70/73/2f76b3ad8a3d431824e9934403df36c0ddacc7831acf82114bce3c4309c8/mmh3-5.2.0-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:45b590e31bc552c6f8e2150ff1ad0c28dd151e9f87589e7eaf508fbdd8e8e908", size = 113060, upload-time = "2025-07-29T07:42:40.585Z" },
{ url = "https://files.pythonhosted.org/packages/9f/b9/7ea61a34e90e50a79a9d87aa1c0b8139a7eaf4125782b34b7d7383472633/mmh3-5.2.0-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:bdde97310d59604f2a9119322f61b31546748499a21b44f6715e8ced9308a6c5", size = 120781, upload-time = "2025-07-29T07:42:41.618Z" },
{ url = "https://files.pythonhosted.org/packages/0f/5b/ae1a717db98c7894a37aeedbd94b3f99e6472a836488f36b6849d003485b/mmh3-5.2.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:fc9c5f280438cf1c1a8f9abb87dc8ce9630a964120cfb5dd50d1e7ce79690c7a", size = 99174, upload-time = "2025-07-29T07:42:42.587Z" },
{ url = "https://files.pythonhosted.org/packages/e3/de/000cce1d799fceebb6d4487ae29175dd8e81b48e314cba7b4da90bcf55d7/mmh3-5.2.0-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:c903e71fd8debb35ad2a4184c1316b3cb22f64ce517b4e6747f25b0a34e41266", size = 98734, upload-time = "2025-07-29T07:42:43.996Z" },
{ url = "https://files.pythonhosted.org/packages/79/19/0dc364391a792b72fbb22becfdeacc5add85cc043cd16986e82152141883/mmh3-5.2.0-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:eed4bba7ff8a0d37106ba931ab03bdd3915fbb025bcf4e1f0aa02bc8114960c5", size = 106493, upload-time = "2025-07-29T07:42:45.07Z" },
{ url = "https://files.pythonhosted.org/packages/3c/b1/bc8c28e4d6e807bbb051fefe78e1156d7f104b89948742ad310612ce240d/mmh3-5.2.0-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:1fdb36b940e9261aff0b5177c5b74a36936b902f473180f6c15bde26143681a9", size = 110089, upload-time = "2025-07-29T07:42:46.122Z" },
{ url = "https://files.pythonhosted.org/packages/3b/a2/d20f3f5c95e9c511806686c70d0a15479cc3941c5f322061697af1c1ff70/mmh3-5.2.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:7303aab41e97adcf010a09efd8f1403e719e59b7705d5e3cfed3dd7571589290", size = 97571, upload-time = "2025-07-29T07:42:47.18Z" },
{ url = "https://files.pythonhosted.org/packages/7b/23/665296fce4f33488deec39a750ffd245cfc07aafb0e3ef37835f91775d14/mmh3-5.2.0-cp313-cp313-win32.whl", hash = "sha256:03e08c6ebaf666ec1e3d6ea657a2d363bb01effd1a9acfe41f9197decaef0051", size = 40806, upload-time = "2025-07-29T07:42:48.166Z" },
{ url = "https://files.pythonhosted.org/packages/59/b0/92e7103f3b20646e255b699e2d0327ce53a3f250e44367a99dc8be0b7c7a/mmh3-5.2.0-cp313-cp313-win_amd64.whl", hash = "sha256:7fddccd4113e7b736706e17a239a696332360cbaddf25ae75b57ba1acce65081", size = 41600, upload-time = "2025-07-29T07:42:49.371Z" },
{ url = "https://files.pythonhosted.org/packages/99/22/0b2bd679a84574647de538c5b07ccaa435dbccc37815067fe15b90fe8dad/mmh3-5.2.0-cp313-cp313-win_arm64.whl", hash = "sha256:fa0c966ee727aad5406d516375593c5f058c766b21236ab8985693934bb5085b", size = 39349, upload-time = "2025-07-29T07:42:50.268Z" },
{ url = "https://files.pythonhosted.org/packages/f7/ca/a20db059a8a47048aaf550da14a145b56e9c7386fb8280d3ce2962dcebf7/mmh3-5.2.0-cp314-cp314-ios_13_0_arm64_iphoneos.whl", hash = "sha256:e5015f0bb6eb50008bed2d4b1ce0f2a294698a926111e4bb202c0987b4f89078", size = 39209, upload-time = "2025-07-29T07:42:51.559Z" },
{ url = "https://files.pythonhosted.org/packages/98/dd/e5094799d55c7482d814b979a0fd608027d0af1b274bfb4c3ea3e950bfd5/mmh3-5.2.0-cp314-cp314-ios_13_0_arm64_iphonesimulator.whl", hash = "sha256:e0f3ed828d709f5b82d8bfe14f8856120718ec4bd44a5b26102c3030a1e12501", size = 39843, upload-time = "2025-07-29T07:42:52.536Z" },
{ url = "https://files.pythonhosted.org/packages/f4/6b/7844d7f832c85400e7cc89a1348e4e1fdd38c5a38415bb5726bbb8fcdb6c/mmh3-5.2.0-cp314-cp314-ios_13_0_x86_64_iphonesimulator.whl", hash = "sha256:f35727c5118aba95f0397e18a1a5b8405425581bfe53e821f0fb444cbdc2bc9b", size = 40648, upload-time = "2025-07-29T07:42:53.392Z" },
{ url = "https://files.pythonhosted.org/packages/1f/bf/71f791f48a21ff3190ba5225807cbe4f7223360e96862c376e6e3fb7efa7/mmh3-5.2.0-cp314-cp314-macosx_10_13_universal2.whl", hash = "sha256:3bc244802ccab5220008cb712ca1508cb6a12f0eb64ad62997156410579a1770", size = 56164, upload-time = "2025-07-29T07:42:54.267Z" },
{ url = "https://files.pythonhosted.org/packages/70/1f/f87e3d34d83032b4f3f0f528c6d95a98290fcacf019da61343a49dccfd51/mmh3-5.2.0-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:ff3d50dc3fe8a98059f99b445dfb62792b5d006c5e0b8f03c6de2813b8376110", size = 40692, upload-time = "2025-07-29T07:42:55.234Z" },
{ url = "https://files.pythonhosted.org/packages/a6/e2/db849eaed07117086f3452feca8c839d30d38b830ac59fe1ce65af8be5ad/mmh3-5.2.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:37a358cc881fe796e099c1db6ce07ff757f088827b4e8467ac52b7a7ffdca647", size = 40068, upload-time = "2025-07-29T07:42:56.158Z" },
{ url = "https://files.pythonhosted.org/packages/df/6b/209af927207af77425b044e32f77f49105a0b05d82ff88af6971d8da4e19/mmh3-5.2.0-cp314-cp314-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:b9a87025121d1c448f24f27ff53a5fe7b6ef980574b4a4f11acaabe702420d63", size = 97367, upload-time = "2025-07-29T07:42:57.037Z" },
{ url = "https://files.pythonhosted.org/packages/ca/e0/78adf4104c425606a9ce33fb351f790c76a6c2314969c4a517d1ffc92196/mmh3-5.2.0-cp314-cp314-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:1ba55d6ca32eeef8b2625e1e4bfc3b3db52bc63014bd7e5df8cc11bf2b036b12", size = 103306, upload-time = "2025-07-29T07:42:58.522Z" },
{ url = "https://files.pythonhosted.org/packages/a3/79/c2b89f91b962658b890104745b1b6c9ce38d50a889f000b469b91eeb1b9e/mmh3-5.2.0-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:c9ff37ba9f15637e424c2ab57a1a590c52897c845b768e4e0a4958084ec87f22", size = 106312, upload-time = "2025-07-29T07:42:59.552Z" },
{ url = "https://files.pythonhosted.org/packages/4b/14/659d4095528b1a209be90934778c5ffe312177d51e365ddcbca2cac2ec7c/mmh3-5.2.0-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:a094319ec0db52a04af9fdc391b4d39a1bc72bc8424b47c4411afb05413a44b5", size = 113135, upload-time = "2025-07-29T07:43:00.745Z" },
{ url = "https://files.pythonhosted.org/packages/8d/6f/cd7734a779389a8a467b5c89a48ff476d6f2576e78216a37551a97e9e42a/mmh3-5.2.0-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:c5584061fd3da584659b13587f26c6cad25a096246a481636d64375d0c1f6c07", size = 120775, upload-time = "2025-07-29T07:43:02.124Z" },
{ url = "https://files.pythonhosted.org/packages/1d/ca/8256e3b96944408940de3f9291d7e38a283b5761fe9614d4808fcf27bd62/mmh3-5.2.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:ecbfc0437ddfdced5e7822d1ce4855c9c64f46819d0fdc4482c53f56c707b935", size = 99178, upload-time = "2025-07-29T07:43:03.182Z" },
{ url = "https://files.pythonhosted.org/packages/8a/32/39e2b3cf06b6e2eb042c984dab8680841ac2a0d3ca6e0bea30db1f27b565/mmh3-5.2.0-cp314-cp314-musllinux_1_2_i686.whl", hash = "sha256:7b986d506a8e8ea345791897ba5d8ba0d9d8820cd4fc3e52dbe6de19388de2e7", size = 98738, upload-time = "2025-07-29T07:43:04.207Z" },
{ url = "https://files.pythonhosted.org/packages/61/d3/7bbc8e0e8cf65ebbe1b893ffa0467b7ecd1bd07c3bbf6c9db4308ada22ec/mmh3-5.2.0-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:38d899a156549da8ef6a9f1d6f7ef231228d29f8f69bce2ee12f5fba6d6fd7c5", size = 106510, upload-time = "2025-07-29T07:43:05.656Z" },
{ url = "https://files.pythonhosted.org/packages/10/99/b97e53724b52374e2f3859046f0eb2425192da356cb19784d64bc17bb1cf/mmh3-5.2.0-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:d86651fa45799530885ba4dab3d21144486ed15285e8784181a0ab37a4552384", size = 110053, upload-time = "2025-07-29T07:43:07.204Z" },
{ url = "https://files.pythonhosted.org/packages/ac/62/3688c7d975ed195155671df68788c83fed6f7909b6ec4951724c6860cb97/mmh3-5.2.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:c463d7c1c4cfc9d751efeaadd936bbba07b5b0ed81a012b3a9f5a12f0872bd6e", size = 97546, upload-time = "2025-07-29T07:43:08.226Z" },
{ url = "https://files.pythonhosted.org/packages/ca/3b/c6153250f03f71a8b7634cded82939546cdfba02e32f124ff51d52c6f991/mmh3-5.2.0-cp314-cp314-win32.whl", hash = "sha256:bb4fe46bdc6104fbc28db7a6bacb115ee6368ff993366bbd8a2a7f0076e6f0c0", size = 41422, upload-time = "2025-07-29T07:43:09.216Z" },
{ url = "https://files.pythonhosted.org/packages/74/01/a27d98bab083a435c4c07e9d1d720d4c8a578bf4c270bae373760b1022be/mmh3-5.2.0-cp314-cp314-win_amd64.whl", hash = "sha256:7c7f0b342fd06044bedd0b6e72177ddc0076f54fd89ee239447f8b271d919d9b", size = 42135, upload-time = "2025-07-29T07:43:10.183Z" },
{ url = "https://files.pythonhosted.org/packages/cb/c9/dbba5507e95429b8b380e2ba091eff5c20a70a59560934dff0ad8392b8c8/mmh3-5.2.0-cp314-cp314-win_arm64.whl", hash = "sha256:3193752fc05ea72366c2b63ff24b9a190f422e32d75fdeae71087c08fff26115", size = 39879, upload-time = "2025-07-29T07:43:11.106Z" },
{ url = "https://files.pythonhosted.org/packages/b5/d1/c8c0ef839c17258b9de41b84f663574fabcf8ac2007b7416575e0f65ff6e/mmh3-5.2.0-cp314-cp314t-macosx_10_13_universal2.whl", hash = "sha256:69fc339d7202bea69ef9bd7c39bfdf9fdabc8e6822a01eba62fb43233c1b3932", size = 57696, upload-time = "2025-07-29T07:43:11.989Z" },
{ url = "https://files.pythonhosted.org/packages/2f/55/95e2b9ff201e89f9fe37036037ab61a6c941942b25cdb7b6a9df9b931993/mmh3-5.2.0-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:12da42c0a55c9d86ab566395324213c319c73ecb0c239fad4726324212b9441c", size = 41421, upload-time = "2025-07-29T07:43:13.269Z" },
{ url = "https://files.pythonhosted.org/packages/77/79/9be23ad0b7001a4b22752e7693be232428ecc0a35068a4ff5c2f14ef8b20/mmh3-5.2.0-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:f7f9034c7cf05ddfaac8d7a2e63a3c97a840d4615d0a0e65ba8bdf6f8576e3be", size = 40853, upload-time = "2025-07-29T07:43:14.888Z" },
{ url = "https://files.pythonhosted.org/packages/ac/1b/96b32058eda1c1dee8264900c37c359a7325c1f11f5ff14fd2be8e24eff9/mmh3-5.2.0-cp314-cp314t-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:11730eeb16dfcf9674fdea9bb6b8e6dd9b40813b7eb839bc35113649eef38aeb", size = 109694, upload-time = "2025-07-29T07:43:15.816Z" },
{ url = "https://files.pythonhosted.org/packages/8d/6f/a2ae44cd7dad697b6dea48390cbc977b1e5ca58fda09628cbcb2275af064/mmh3-5.2.0-cp314-cp314t-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:932a6eec1d2e2c3c9e630d10f7128d80e70e2d47fe6b8c7ea5e1afbd98733e65", size = 117438, upload-time = "2025-07-29T07:43:16.865Z" },
{ url = "https://files.pythonhosted.org/packages/a0/08/bfb75451c83f05224a28afeaf3950c7b793c0b71440d571f8e819cfb149a/mmh3-5.2.0-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:3ca975c51c5028947bbcfc24966517aac06a01d6c921e30f7c5383c195f87991", size = 120409, upload-time = "2025-07-29T07:43:18.207Z" },
{ url = "https://files.pythonhosted.org/packages/9f/ea/8b118b69b2ff8df568f742387d1a159bc654a0f78741b31437dd047ea28e/mmh3-5.2.0-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:5b0b58215befe0f0e120b828f7645e97719bbba9f23b69e268ed0ac7adde8645", size = 125909, upload-time = "2025-07-29T07:43:19.39Z" },
{ url = "https://files.pythonhosted.org/packages/3e/11/168cc0b6a30650032e351a3b89b8a47382da541993a03af91e1ba2501234/mmh3-5.2.0-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:29c2b9ce61886809d0492a274a5a53047742dea0f703f9c4d5d223c3ea6377d3", size = 135331, upload-time = "2025-07-29T07:43:20.435Z" },
{ url = "https://files.pythonhosted.org/packages/31/05/e3a9849b1c18a7934c64e831492c99e67daebe84a8c2f2c39a7096a830e3/mmh3-5.2.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:a367d4741ac0103f8198c82f429bccb9359f543ca542b06a51f4f0332e8de279", size = 110085, upload-time = "2025-07-29T07:43:21.92Z" },
{ url = "https://files.pythonhosted.org/packages/d9/d5/a96bcc306e3404601418b2a9a370baec92af84204528ba659fdfe34c242f/mmh3-5.2.0-cp314-cp314t-musllinux_1_2_i686.whl", hash = "sha256:5a5dba98e514fb26241868f6eb90a7f7ca0e039aed779342965ce24ea32ba513", size = 111195, upload-time = "2025-07-29T07:43:23.066Z" },
{ url = "https://files.pythonhosted.org/packages/af/29/0fd49801fec5bff37198684e0849b58e0dab3a2a68382a357cfffb0fafc3/mmh3-5.2.0-cp314-cp314t-musllinux_1_2_ppc64le.whl", hash = "sha256:941603bfd75a46023807511c1ac2f1b0f39cccc393c15039969806063b27e6db", size = 116919, upload-time = "2025-07-29T07:43:24.178Z" },
{ url = "https://files.pythonhosted.org/packages/2d/04/4f3c32b0a2ed762edca45d8b46568fc3668e34f00fb1e0a3b5451ec1281c/mmh3-5.2.0-cp314-cp314t-musllinux_1_2_s390x.whl", hash = "sha256:132dd943451a7c7546978863d2f5a64977928410782e1a87d583cb60eb89e667", size = 123160, upload-time = "2025-07-29T07:43:25.26Z" },
{ url = "https://files.pythonhosted.org/packages/91/76/3d29eaa38821730633d6a240d36fa8ad2807e9dfd432c12e1a472ed211eb/mmh3-5.2.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:f698733a8a494466432d611a8f0d1e026f5286dee051beea4b3c3146817e35d5", size = 110206, upload-time = "2025-07-29T07:43:26.699Z" },
{ url = "https://files.pythonhosted.org/packages/44/1c/ccf35892684d3a408202e296e56843743e0b4fb1629e59432ea88cdb3909/mmh3-5.2.0-cp314-cp314t-win32.whl", hash = "sha256:6d541038b3fc360ec538fc116de87462627944765a6750308118f8b509a8eec7", size = 41970, upload-time = "2025-07-29T07:43:27.666Z" },
{ url = "https://files.pythonhosted.org/packages/75/b2/b9e4f1e5adb5e21eb104588fcee2cd1eaa8308255173481427d5ecc4284e/mmh3-5.2.0-cp314-cp314t-win_amd64.whl", hash = "sha256:e912b19cf2378f2967d0c08e86ff4c6c360129887f678e27e4dde970d21b3f4d", size = 43063, upload-time = "2025-07-29T07:43:28.582Z" },
{ url = "https://files.pythonhosted.org/packages/6a/fc/0e61d9a4e29c8679356795a40e48f647b4aad58d71bfc969f0f8f56fb912/mmh3-5.2.0-cp314-cp314t-win_arm64.whl", hash = "sha256:e7884931fe5e788163e7b3c511614130c2c59feffdc21112290a194487efb2e9", size = 40455, upload-time = "2025-07-29T07:43:29.563Z" },
]
[[package]]
name = "morphys"
version = "1.0"
source = { registry = "https://pypi.org/simple" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/f9/4f/cb781d0ac5d079adabc77dc4f0bc99fc81c390029bd33c6e70552139e762/morphys-1.0-py2.py3-none-any.whl", hash = "sha256:76d6dbaa4d65f597e59d332c81da786d83e4669387b9b2a750cfec74e7beec20", size = 5618, upload-time = "2017-01-10T20:08:56.872Z" },
]
[[package]]
name = "msgspec"
version = "0.19.0"
@ -924,29 +750,6 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/23/d8/f15b40611c2d5753d1abb0ca0da0c75348daf1252220e5dda2867bd81062/msgspec-0.19.0-cp313-cp313-win_amd64.whl", hash = "sha256:317050bc0f7739cb30d257ff09152ca309bf5a369854bbf1e57dffc310c1f20f", size = 187432, upload-time = "2024-12-27T17:40:16.256Z" },
]
[[package]]
name = "multiaddr"
version = "0.1.1"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "base58" },
{ name = "dnspython" },
{ name = "idna" },
{ name = "netaddr" },
{ name = "psutil" },
{ name = "py-cid" },
{ name = "py-multibase" },
{ name = "py-multicodec" },
{ name = "py-multihash" },
{ name = "trio" },
{ name = "trio-typing" },
{ name = "varint" },
]
sdist = { url = "https://files.pythonhosted.org/packages/cf/5c/6d27f2b04c54e7c85b4a05760ac7dfaa7c68214b917e0d4a5043c01cf231/multiaddr-0.1.1.tar.gz", hash = "sha256:04da0afd2097625569073776526eb9733db9d9713286bada44632a9d7275b9bb", size = 54186, upload-time = "2025-12-07T17:19:07.996Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/07/ad/bd8a1748953a8f7a8167ec7d02eeefbed469595089d75e3fcb53047e7e20/multiaddr-0.1.1-py3-none-any.whl", hash = "sha256:d95333effddbd372009dbfce4d2ec922dd633697b6cc89a5af90ae082c4e68d4", size = 38374, upload-time = "2025-12-07T17:19:05.849Z" },
]
[[package]]
name = "multidict"
version = "6.7.0"
@ -1064,15 +867,6 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/a0/c4/c2971a3ba4c6103a3d10c4b0f24f461ddc027f0f09763220cf35ca1401b3/nest_asyncio-1.6.0-py3-none-any.whl", hash = "sha256:87af6efd6b5e897c81050477ef65c62e2b2f35d51703cae01aff2905b1852e1c", size = 5195, upload-time = "2024-01-21T14:25:17.223Z" },
]
[[package]]
name = "netaddr"
version = "1.3.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/54/90/188b2a69654f27b221fba92fda7217778208532c962509e959a9cee5229d/netaddr-1.3.0.tar.gz", hash = "sha256:5c3c3d9895b551b763779ba7db7a03487dc1f8e3b385af819af341ae9ef6e48a", size = 2260504, upload-time = "2024-05-28T21:30:37.743Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/12/cc/f4fe2c7ce68b92cbf5b2d379ca366e1edae38cccaad00f69f529b460c3ef/netaddr-1.3.0-py3-none-any.whl", hash = "sha256:c2c6a8ebe5554ce33b7d5b3a306b71bbb373e000bbbf2350dd5213cc56e3dbbe", size = 2262023, upload-time = "2024-05-28T21:30:34.191Z" },
]
[[package]]
name = "numba"
version = "0.62.1"
@ -1403,7 +1197,7 @@ requires-dist = [
{ name = "tomli", specifier = ">=2.0.1,<3.0.0" },
{ name = "tomli-w", specifier = ">=1.0.0,<2.0.0" },
{ name = "tomlkit", git = "https://github.com/pikers/tomlkit.git?branch=piker_pin" },
{ name = "tractor", editable = "../tractor" },
{ name = "tractor", git = "https://github.com/goodboy/tractor.git?branch=piker_pin" },
{ name = "trio", specifier = ">=0.27" },
{ name = "trio-typing", specifier = ">=0.10.0" },
{ name = "trio-util", specifier = ">=0.7.0,<0.8.0" },
@ -1424,11 +1218,11 @@ dev = [
{ name = "prompt-toolkit", specifier = "==3.0.40" },
{ name = "pyperclip", specifier = ">=1.9.0" },
{ name = "pyqt6", specifier = ">=6.7.0,<7.0.0" },
{ name = "pyqtgraph", git = "https://github.com/pyqtgraph/pyqtgraph.git?branch=master" },
{ name = "pyqtgraph", git = "https://github.com/pikers/pyqtgraph.git" },
{ name = "pytest" },
{ name = "qdarkstyle", specifier = ">=3.0.2,<4.0.0" },
{ name = "rapidfuzz", specifier = ">=3.2.0,<4.0.0" },
{ name = "xonsh", git = "https://github.com/xonsh/xonsh.git?branch=main" },
{ name = "xonsh", specifier = ">=0.22.2" },
]
lint = [{ name = "ruff", specifier = ">=0.9.6" }]
repl = [
@ -1437,23 +1231,23 @@ repl = [
{ name = "pexpect", specifier = ">=4.9.0" },
{ name = "prompt-toolkit", specifier = "==3.0.40" },
{ name = "pyperclip", specifier = ">=1.9.0" },
{ name = "xonsh", git = "https://github.com/xonsh/xonsh.git?branch=main" },
{ name = "xonsh", specifier = ">=0.22.2" },
]
testing = [{ name = "pytest" }]
uis = [
{ name = "pyqt6", specifier = ">=6.7.0,<7.0.0" },
{ name = "pyqtgraph", git = "https://github.com/pyqtgraph/pyqtgraph.git?branch=master" },
{ name = "pyqtgraph", git = "https://github.com/pikers/pyqtgraph.git" },
{ name = "qdarkstyle", specifier = ">=3.0.2,<4.0.0" },
{ name = "rapidfuzz", specifier = ">=3.2.0,<4.0.0" },
]
[[package]]
name = "platformdirs"
version = "4.5.1"
version = "4.6.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/cf/86/0248f086a84f01b37aaec0fa567b397df1a119f73c16f6c7a9aac73ea309/platformdirs-4.5.1.tar.gz", hash = "sha256:61d5cdcc6065745cdd94f0f878977f8de9437be93de97c1c12f853c9c0cdcbda", size = 21715, upload-time = "2025-12-05T13:52:58.638Z" }
sdist = { url = "https://files.pythonhosted.org/packages/20/e5/474d0a8508029286b905622e6929470fb84337cfa08f9d09fbb624515249/platformdirs-4.6.0.tar.gz", hash = "sha256:4a13c2db1071e5846c3b3e04e5b095c0de36b2a24be9a3bc0145ca66fce4e328", size = 23433, upload-time = "2026-02-12T14:36:21.288Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/cb/28/3bfe2fa5a7b9c46fe7e13c97bda14c895fb10fa2ebf1d0abb90e0cea7ee1/platformdirs-4.5.1-py3-none-any.whl", hash = "sha256:d03afa3963c806a9bed9d5125c8f4cb2fdaf74a55ab60e5d59b3fde758104d31", size = 18731, upload-time = "2025-12-05T13:52:56.823Z" },
{ url = "https://files.pythonhosted.org/packages/da/10/1b0dcf51427326f70e50d98df21b18c228117a743a1fc515a42f8dc7d342/platformdirs-4.6.0-py3-none-any.whl", hash = "sha256:dd7f808d828e1764a22ebff09e60f175ee3c41876606a6132a688d809c7c9c73", size = 19549, upload-time = "2026-02-12T14:36:19.743Z" },
]
[[package]]
@ -1603,34 +1397,6 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/5b/5a/bc7b4a4ef808fa59a816c17b20c4bef6884daebbdf627ff2a161da67da19/propcache-0.4.1-py3-none-any.whl", hash = "sha256:af2a6052aeb6cf17d3e46ee169099044fd8224cbaf75c76a2ef596e8163e2237", size = 13305, upload-time = "2025-10-08T19:49:00.792Z" },
]
[[package]]
name = "psutil"
version = "7.2.2"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/aa/c6/d1ddf4abb55e93cebc4f2ed8b5d6dbad109ecb8d63748dd2b20ab5e57ebe/psutil-7.2.2.tar.gz", hash = "sha256:0746f5f8d406af344fd547f1c8daa5f5c33dbc293bb8d6a16d80b4bb88f59372", size = 493740, upload-time = "2026-01-28T18:14:54.428Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/51/08/510cbdb69c25a96f4ae523f733cdc963ae654904e8db864c07585ef99875/psutil-7.2.2-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:2edccc433cbfa046b980b0df0171cd25bcaeb3a68fe9022db0979e7aa74a826b", size = 130595, upload-time = "2026-01-28T18:14:57.293Z" },
{ url = "https://files.pythonhosted.org/packages/d6/f5/97baea3fe7a5a9af7436301f85490905379b1c6f2dd51fe3ecf24b4c5fbf/psutil-7.2.2-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:e78c8603dcd9a04c7364f1a3e670cea95d51ee865e4efb3556a3a63adef958ea", size = 131082, upload-time = "2026-01-28T18:14:59.732Z" },
{ url = "https://files.pythonhosted.org/packages/37/d6/246513fbf9fa174af531f28412297dd05241d97a75911ac8febefa1a53c6/psutil-7.2.2-cp313-cp313t-manylinux2010_x86_64.manylinux_2_12_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:1a571f2330c966c62aeda00dd24620425d4b0cc86881c89861fbc04549e5dc63", size = 181476, upload-time = "2026-01-28T18:15:01.884Z" },
{ url = "https://files.pythonhosted.org/packages/b8/b5/9182c9af3836cca61696dabe4fd1304e17bc56cb62f17439e1154f225dd3/psutil-7.2.2-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:917e891983ca3c1887b4ef36447b1e0873e70c933afc831c6b6da078ba474312", size = 184062, upload-time = "2026-01-28T18:15:04.436Z" },
{ url = "https://files.pythonhosted.org/packages/16/ba/0756dca669f5a9300d0cbcbfae9a4c30e446dfc7440ffe43ded5724bfd93/psutil-7.2.2-cp313-cp313t-win_amd64.whl", hash = "sha256:ab486563df44c17f5173621c7b198955bd6b613fb87c71c161f827d3fb149a9b", size = 139893, upload-time = "2026-01-28T18:15:06.378Z" },
{ url = "https://files.pythonhosted.org/packages/1c/61/8fa0e26f33623b49949346de05ec1ddaad02ed8ba64af45f40a147dbfa97/psutil-7.2.2-cp313-cp313t-win_arm64.whl", hash = "sha256:ae0aefdd8796a7737eccea863f80f81e468a1e4cf14d926bd9b6f5f2d5f90ca9", size = 135589, upload-time = "2026-01-28T18:15:08.03Z" },
{ url = "https://files.pythonhosted.org/packages/81/69/ef179ab5ca24f32acc1dac0c247fd6a13b501fd5534dbae0e05a1c48b66d/psutil-7.2.2-cp314-cp314t-macosx_10_15_x86_64.whl", hash = "sha256:eed63d3b4d62449571547b60578c5b2c4bcccc5387148db46e0c2313dad0ee00", size = 130664, upload-time = "2026-01-28T18:15:09.469Z" },
{ url = "https://files.pythonhosted.org/packages/7b/64/665248b557a236d3fa9efc378d60d95ef56dd0a490c2cd37dafc7660d4a9/psutil-7.2.2-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:7b6d09433a10592ce39b13d7be5a54fbac1d1228ed29abc880fb23df7cb694c9", size = 131087, upload-time = "2026-01-28T18:15:11.724Z" },
{ url = "https://files.pythonhosted.org/packages/d5/2e/e6782744700d6759ebce3043dcfa661fb61e2fb752b91cdeae9af12c2178/psutil-7.2.2-cp314-cp314t-manylinux2010_x86_64.manylinux_2_12_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:1fa4ecf83bcdf6e6c8f4449aff98eefb5d0604bf88cb883d7da3d8d2d909546a", size = 182383, upload-time = "2026-01-28T18:15:13.445Z" },
{ url = "https://files.pythonhosted.org/packages/57/49/0a41cefd10cb7505cdc04dab3eacf24c0c2cb158a998b8c7b1d27ee2c1f5/psutil-7.2.2-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:e452c464a02e7dc7822a05d25db4cde564444a67e58539a00f929c51eddda0cf", size = 185210, upload-time = "2026-01-28T18:15:16.002Z" },
{ url = "https://files.pythonhosted.org/packages/dd/2c/ff9bfb544f283ba5f83ba725a3c5fec6d6b10b8f27ac1dc641c473dc390d/psutil-7.2.2-cp314-cp314t-win_amd64.whl", hash = "sha256:c7663d4e37f13e884d13994247449e9f8f574bc4655d509c3b95e9ec9e2b9dc1", size = 141228, upload-time = "2026-01-28T18:15:18.385Z" },
{ url = "https://files.pythonhosted.org/packages/f2/fc/f8d9c31db14fcec13748d373e668bc3bed94d9077dbc17fb0eebc073233c/psutil-7.2.2-cp314-cp314t-win_arm64.whl", hash = "sha256:11fe5a4f613759764e79c65cf11ebdf26e33d6dd34336f8a337aa2996d71c841", size = 136284, upload-time = "2026-01-28T18:15:19.912Z" },
{ url = "https://files.pythonhosted.org/packages/e7/36/5ee6e05c9bd427237b11b3937ad82bb8ad2752d72c6969314590dd0c2f6e/psutil-7.2.2-cp36-abi3-macosx_10_9_x86_64.whl", hash = "sha256:ed0cace939114f62738d808fdcecd4c869222507e266e574799e9c0faa17d486", size = 129090, upload-time = "2026-01-28T18:15:22.168Z" },
{ url = "https://files.pythonhosted.org/packages/80/c4/f5af4c1ca8c1eeb2e92ccca14ce8effdeec651d5ab6053c589b074eda6e1/psutil-7.2.2-cp36-abi3-macosx_11_0_arm64.whl", hash = "sha256:1a7b04c10f32cc88ab39cbf606e117fd74721c831c98a27dc04578deb0c16979", size = 129859, upload-time = "2026-01-28T18:15:23.795Z" },
{ url = "https://files.pythonhosted.org/packages/b5/70/5d8df3b09e25bce090399cf48e452d25c935ab72dad19406c77f4e828045/psutil-7.2.2-cp36-abi3-manylinux2010_x86_64.manylinux_2_12_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:076a2d2f923fd4821644f5ba89f059523da90dc9014e85f8e45a5774ca5bc6f9", size = 155560, upload-time = "2026-01-28T18:15:25.976Z" },
{ url = "https://files.pythonhosted.org/packages/63/65/37648c0c158dc222aba51c089eb3bdfa238e621674dc42d48706e639204f/psutil-7.2.2-cp36-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:b0726cecd84f9474419d67252add4ac0cd9811b04d61123054b9fb6f57df6e9e", size = 156997, upload-time = "2026-01-28T18:15:27.794Z" },
{ url = "https://files.pythonhosted.org/packages/8e/13/125093eadae863ce03c6ffdbae9929430d116a246ef69866dad94da3bfbc/psutil-7.2.2-cp36-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:fd04ef36b4a6d599bbdb225dd1d3f51e00105f6d48a28f006da7f9822f2606d8", size = 148972, upload-time = "2026-01-28T18:15:29.342Z" },
{ url = "https://files.pythonhosted.org/packages/04/78/0acd37ca84ce3ddffaa92ef0f571e073faa6d8ff1f0559ab1272188ea2be/psutil-7.2.2-cp36-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:b58fabe35e80b264a4e3bb23e6b96f9e45a3df7fb7eed419ac0e5947c61e47cc", size = 148266, upload-time = "2026-01-28T18:15:31.597Z" },
{ url = "https://files.pythonhosted.org/packages/b4/90/e2159492b5426be0c1fef7acba807a03511f97c5f86b3caeda6ad92351a7/psutil-7.2.2-cp37-abi3-win_amd64.whl", hash = "sha256:eb7e81434c8d223ec4a219b5fc1c47d0417b12be7ea866e24fb5ad6e84b3d988", size = 137737, upload-time = "2026-01-28T18:15:33.849Z" },
{ url = "https://files.pythonhosted.org/packages/8c/c7/7bb2e321574b10df20cbde462a94e2b71d05f9bbda251ef27d104668306a/psutil-7.2.2-cp37-abi3-win_arm64.whl", hash = "sha256:8c233660f575a5a89e6d4cb65d9f938126312bca76d8fe087b947b3a1aaac9ee", size = 134617, upload-time = "2026-01-28T18:15:36.514Z" },
]
[[package]]
name = "ptyprocess"
version = "0.7.0"
@@ -1640,64 +1406,6 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/22/a6/858897256d0deac81a172289110f31629fc4cee19b6f01283303e18c8db3/ptyprocess-0.7.0-py2.py3-none-any.whl", hash = "sha256:4b41f3967fce3af57cc7e94b888626c18bf37a083e3651ca8feeb66d492fef35", size = 13993, upload-time = "2020-12-28T15:15:28.35Z" },
]
[[package]]
name = "py-cid"
version = "0.4.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "morphys" },
{ name = "py-multibase" },
{ name = "py-multicodec" },
{ name = "py-multihash" },
]
sdist = { url = "https://files.pythonhosted.org/packages/e7/09/c0ca25eac91c62f6f22f5ac6accd0bfa957e77adfdffd0eccc0700f2ea07/py_cid-0.4.0.tar.gz", hash = "sha256:7c15d6a83f59c3a4c7fbff793f1d4cbfc831e90355fd0e2c5cfe927c21733cc3", size = 25970, upload-time = "2025-12-19T16:55:01.057Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/80/39/d5c1828e79526002f1bf87b9daba01c7db445960daf341e1dd84a5ff0469/py_cid-0.4.0-py3-none-any.whl", hash = "sha256:6a3183a3088b219dbf3cb37eec7d47a644be3f3ebabdf38347c2e9312621d6cc", size = 8833, upload-time = "2025-12-19T16:54:59.233Z" },
]
[[package]]
name = "py-multibase"
version = "2.0.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "morphys" },
{ name = "python-baseconv" },
{ name = "six" },
]
sdist = { url = "https://files.pythonhosted.org/packages/bc/52/5ed393ab49df7e3b03995d3c4e53bae1e8c2ca40909cf25a41b346c09a38/py_multibase-2.0.0.tar.gz", hash = "sha256:58c1a264195fa1ae29ea707c6fc8196446f4bdb92e0f9a0f131e0f280b238839", size = 26857, upload-time = "2025-12-18T02:24:49.132Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/36/c7/38035079d9978b32b962f996f1cccaa166ecfe38723ab4349ab32166c037/py_multibase-2.0.0-py3-none-any.whl", hash = "sha256:b29ce489b556134e73998a11712c406b70950812955df64084754e0774e40900", size = 10608, upload-time = "2025-12-18T02:24:47.827Z" },
]
[[package]]
name = "py-multicodec"
version = "1.0.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "varint" },
]
sdist = { url = "https://files.pythonhosted.org/packages/5e/26/ef24db0fbfec080b72c5ac4a1000da3a4d696a1e31862c695d683097a1b5/py_multicodec-1.0.0.tar.gz", hash = "sha256:78e4e3e47b6288cf635c3ca987152e6cb5510bdcdab307e7690c76ec3d5bbfeb", size = 44668, upload-time = "2025-12-18T20:41:37.976Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/76/da/768d07490faeae88ac361184164be9c262fececc3c6241b5fc471be4f659/py_multicodec-1.0.0-py3-none-any.whl", hash = "sha256:ae2e687bac8fdf54e3f5b3feded36b61a304d5e3c3af9438f7481f543ec15b8d", size = 26200, upload-time = "2025-12-18T20:41:37.055Z" },
]
[[package]]
name = "py-multihash"
version = "3.0.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "base58" },
{ name = "blake3" },
{ name = "mmh3" },
{ name = "morphys" },
{ name = "six" },
{ name = "varint" },
]
sdist = { url = "https://files.pythonhosted.org/packages/11/3d/ed68b0eccd0654f7f3c163d9b3d428f903e5e3e884ab1f0d0a16ba6a4f11/py_multihash-3.0.0.tar.gz", hash = "sha256:2e848941de5ef0533ca26b81940e2ffcf7b4322a3f803e8c97f4f0eca8767aa7", size = 41630, upload-time = "2025-12-17T19:30:00.596Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/24/e2/d65606db8369916fb5a9b4fe14df7e6072970d919300f3fb1c989a1d8e7d/py_multihash-3.0.0-py3-none-any.whl", hash = "sha256:3863ec1313b4eac1e5169137c143d40bf77456e57388f839441deba089f87326", size = 21215, upload-time = "2025-12-17T19:29:59.322Z" },
]
[[package]]
name = "pyarrow"
version = "22.0.0"
@@ -1895,10 +1603,9 @@ wheels = [
[[package]]
name = "pyqtgraph"
version = "0.15.0.dev0"
source = { git = "https://github.com/pyqtgraph/pyqtgraph.git?branch=master#d588dd3ec30915e61c8496a0fe1db21fc25e4da5" }
version = "0.12.3"
source = { git = "https://github.com/pikers/pyqtgraph.git#373f9561ea8ec4fef9b4e8bdcdd4bbf372dd6512" }
dependencies = [
{ name = "colorama" },
{ name = "numpy" },
]
@@ -1927,12 +1634,6 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/3b/ab/b3226f0bd7cdcf710fbede2b3548584366da3b19b5021e74f5bde2a8fa3f/pytest-9.0.2-py3-none-any.whl", hash = "sha256:711ffd45bf766d5264d487b917733b453d917afd2b0ad65223959f59089f875b", size = 374801, upload-time = "2025-12-06T21:30:49.154Z" },
]
[[package]]
name = "python-baseconv"
version = "1.2.2"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/33/d0/9297d7d8dd74767b4d5560d834b30b2fff17d39987c23ed8656f476e0d9b/python-baseconv-1.2.2.tar.gz", hash = "sha256:0539f8bd0464013b05ad62e0a1673f0ac9086c76b43ebf9f833053527cd9931b", size = 4929, upload-time = "2019-04-04T19:28:57.17Z" }
[[package]]
name = "python-dateutil"
version = "2.9.0.post0"
@@ -2269,13 +1970,12 @@ wheels = [
[[package]]
name = "tractor"
version = "0.1.0a6.dev0"
source = { editable = "../tractor" }
source = { git = "https://github.com/goodboy/tractor.git?branch=piker_pin#36307c59175a1d04fecc77ef2c28f5c943b5f3d1" }
dependencies = [
{ name = "bidict" },
{ name = "cffi" },
{ name = "colorlog" },
{ name = "msgspec" },
{ name = "multiaddr" },
{ name = "pdbp" },
{ name = "platformdirs" },
{ name = "tricycle" },
@@ -2283,49 +1983,6 @@ dependencies = [
{ name = "wrapt" },
]
[package.metadata]
requires-dist = [
{ name = "bidict", specifier = ">=0.23.1" },
{ name = "cffi", specifier = ">=1.17.1" },
{ name = "colorlog", specifier = ">=6.8.2,<7" },
{ name = "msgspec", specifier = ">=0.19.0" },
{ name = "multiaddr", specifier = ">=0.1.1" },
{ name = "pdbp", specifier = ">=1.8.2,<2" },
{ name = "platformdirs", specifier = ">=4.4.0" },
{ name = "tricycle", specifier = ">=0.4.1,<0.5" },
{ name = "trio", specifier = ">0.27" },
{ name = "wrapt", specifier = ">=1.16.0,<2" },
]
[package.metadata.requires-dev]
dev = [
{ name = "greenback", specifier = ">=1.2.1,<2" },
{ name = "pexpect", specifier = ">=4.9.0,<5" },
{ name = "prompt-toolkit", specifier = ">=3.0.50" },
{ name = "psutil", specifier = ">=7.0.0" },
{ name = "pyperclip", specifier = ">=1.9.0" },
{ name = "pytest", specifier = ">=8.3.5" },
{ name = "stackscope", specifier = ">=0.2.2,<0.3" },
{ name = "typing-extensions", specifier = ">=4.14.1" },
{ name = "xonsh", specifier = ">=0.22.2" },
]
devx = [
{ name = "greenback", specifier = ">=1.2.1,<2" },
{ name = "stackscope", specifier = ">=0.2.2,<0.3" },
{ name = "typing-extensions", specifier = ">=4.14.1" },
]
lint = [{ name = "ruff", specifier = ">=0.9.6" }]
repl = [
{ name = "prompt-toolkit", specifier = ">=3.0.50" },
{ name = "psutil", specifier = ">=7.0.0" },
{ name = "pyperclip", specifier = ">=1.9.0" },
{ name = "xonsh", specifier = ">=0.22.2" },
]
testing = [
{ name = "pexpect", specifier = ">=4.9.0,<5" },
{ name = "pytest", specifier = ">=8.3.5" },
]
[[package]]
name = "tricycle"
version = "0.4.1"
@@ -2473,12 +2130,6 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/e4/16/c1fd27e9549f3c4baf1dc9c20c456cd2f822dbf8de9f463824b0c0357e06/uvloop-0.22.1-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:6cde23eeda1a25c75b2e07d39970f3374105d5eafbaab2a4482be82f272d5a5e", size = 4296730, upload-time = "2025-10-16T22:17:00.744Z" },
]
[[package]]
name = "varint"
version = "1.0.2"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/a8/fe/1ea0ba0896dfa47186692655b86db3214c4b7c9e0e76c7b1dc257d101ab1/varint-1.0.2.tar.gz", hash = "sha256:a6ecc02377ac5ee9d65a6a8ad45c9ff1dac8ccee19400a5950fb51d594214ca5", size = 1886, upload-time = "2016-02-24T20:42:38.5Z" }
[[package]]
name = "wcwidth"
version = "0.2.14"
@@ -2571,8 +2222,14 @@ wheels = [
[[package]]
name = "xonsh"
version = "0.22.3"
source = { git = "https://github.com/xonsh/xonsh.git?branch=main#b446946fd94c3913e002318db1d1b41ee4fa1f9a" }
version = "0.22.4"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/48/df/1fc9ed62b3d7c14612e1713e9eb7bd41d54f6ad1028a8fbb6b7cddebc345/xonsh-0.22.4.tar.gz", hash = "sha256:6be346563fec2db75778ba5d2caee155525e634e99d9cc8cc347626025c0b3fa", size = 826665, upload-time = "2026-02-17T07:53:39.424Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/2e/00/7cbc0c1fb64365a0a317c54ce3a151c9644eea5a509d9cbaae61c9fd1426/xonsh-0.22.4-py311-none-any.whl", hash = "sha256:38b29b29fa85aa756462d9d9bbcaa1d85478c2108da3de6cc590a69a4bcd1a01", size = 654375, upload-time = "2026-02-17T07:53:37.702Z" },
{ url = "https://files.pythonhosted.org/packages/2e/c2/3dd498dc28d8f89cdd52e39950c5e591499ae423f61694c0bb4d03ed1d82/xonsh-0.22.4-py312-none-any.whl", hash = "sha256:4e538fac9f4c3d866ddbdeca068f0c0515469c997ed58d3bfee963878c6df5a5", size = 654300, upload-time = "2026-02-17T07:53:35.813Z" },
{ url = "https://files.pythonhosted.org/packages/82/7d/1f9c7147518e9f03f6ce081b5bfc4f1aceb6ec5caba849024d005e41d3be/xonsh-0.22.4-py313-none-any.whl", hash = "sha256:cc5fabf0ad0c56a2a11bed1e6a43c4ec6416a5b30f24f126b8e768547c3793e2", size = 654818, upload-time = "2026-02-17T07:53:33.477Z" },
]
[[package]]
name = "yapic-json"