Compare commits

..

39 Commits

Author SHA1 Message Date
Gud Boi fbb0fc6517 Add `multiaddr` as dep with bumped lock-file 2026-01-30 14:57:37 -05:00
Gud Boi 2ebb1e1019 Muckin around with multiaddr unpacking in .cli; need to use actual lib not our stuff! Bp 2026-01-30 14:56:40 -05:00
Gud Boi 88353ffef8 Ignore single-zero-sample trace on no runtime.. 2026-01-30 14:53:00 -05:00
Gud Boi ec4e6ec742 ib.feed: drop legacy "quote-with-vlm" polling
Since we now explicitly check each mkt's venue hours we don't need
this mega hacky "waiting on a quote with real vlm" stuff to determine
whether historical data should be loaded immediately. This approach also
had the added complexity that we needed to handle edge cases for tickers
(like xauusd.cmdty) which never have vlm.. so it's nice to be rid of it
all ;p
2026-01-30 14:47:11 -05:00
Gud Boi 205058de21 Always overwrite tsdb duplicates found during backfill
Enable the previously commented-out dedupe-and-write logic in
`start_backfill()` to ensure tsdb stays clean of duplicate
entries.

(this patch was generated in some part by [`claude-code`][claude-code-gh])
[claude-code-gh]: https://github.com/anthropics/claude-code
2026-01-30 14:46:23 -05:00
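A rough sketch of the kind of dedupe-and-write step this enables, assuming the `tsp.dedupe()`/`tsp.np2pl()` helpers seen elsewhere in this changeset; the `storage.write_ohlcv()` call and surrounding names are assumptions since the actual `start_backfill()` diff is suppressed further down.
```python
# hedged sketch only, not the real `start_backfill()` code.
shm_df = tsp.np2pl(shm.array)
(
    wdts,
    deduped,
    diff,  # count of duplicate rows dropped
) = tsp.dedupe(
    shm_df,
    period=timeframe,
)
if diff:
    # always overwrite the tsdb copy so dupes never persist
    await storage.write_ohlcv(  # hypothetical storage-client method
        fqme,
        ohlcv=deduped,
        timeframe=timeframe,
    )
```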
Gud Boi f11ab5f0aa For claude, ignore no runtime for offline shm reading 2026-01-29 02:49:25 -05:00
Gud Boi 8718ad4874 .ib._util: ignore attr err on click-hack twm wakeups? 2026-01-29 02:48:41 -05:00
Gud Boi 3a515afccd Use `get_fonts()`, add `show_txt` flag to gap annots
Switch `.tsp._annotate.markup_gaps()` to use new
`.ui._style.get_fonts()` API for font size calc on client side and add
optional `show_txt: bool` flag to toggle gap duration labels (with
default `False`).

Also,
- replace `sgn` checks with named bools: `up_gap`, `down_gap`
- use `small_font.px_size - 1` for gap label font sizing
- wrap text creation in `if show_txt:` block
- update IPC handler to use `get_fonts()` vs direct `_font` import

(this commit msg was generated in some part by [`claude-code`][claude-code-gh])
[claude-code-gh]: https://github.com/anthropics/claude-code
2026-01-28 16:30:41 -05:00
Gud Boi 88732a67d5 Add `get_fonts()` API and fix `.px_size` for non-Qt ctxs
Add a public `.ui._style.get_fonts()` helper to retrieve the
`_font[_small]: DpiAwareFont` singleton pair. Adjust
`DpiAwareFont.px_size` to return `conf.toml` value when Qt returns `-1`
(no active Qt app).

Also,
- raise `ValueError` with detailed msg if both Qt and a conf-lookup fail
- add some more type union whitespace cleanups: `int | None` -> `int|None`

(this commit-msg was generated in some part by [`claude-code`][claude-code-gh])
[claude-code-gh]: https://github.com/anthropics/claude-code
2026-01-28 15:34:57 -05:00
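The `.ui._style` diff isn't shown on this page, so here's a minimal sketch of the helper and the `px_size` fallback described above; the conf-lookup internals are assumptions.
```python
# sketch only; `_font`/`_font_small` are the module-level singletons
# mentioned in the commit msg, the conf-lookup attr name is assumed.
def get_fonts() -> tuple[DpiAwareFont, DpiAwareFont]:
    '''
    Return the `(_font, _font_small)` `DpiAwareFont` singleton pair.

    '''
    return _font, _font_small


# inside `DpiAwareFont`
@property
def px_size(self) -> int:
    size: int = self.font.pixelSize()
    if size == -1:
        # no active Qt app; fall back to the `conf.toml` value
        size = self._user_conf_size  # hypothetical conf-loaded attr
        if size is None:
            raise ValueError(
                'No Qt app running and no font size set in conf.toml !?'
            )
    return size
```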
Gud Boi 858cfce958 Relay annot creation failures with err-dict resps
Change annot-ctl APIs to return `None` on failure instead of invalid
`aid`s. Server now sends `{'error': msg}` dict on failures, client
match-blocks handle gracefully.

Also,
- update return types: `.add_rect()`, `.add_arrow()`, `.add_text()`
  now return `int|None`
- match on `{'error': str(msg)}` in client IPC receive blocks
- send error dicts from server on timestamp lookup failures
- add failure handling in `markup_gaps()` to skip bad rects

(this commit msg was generated in some part by [`claude-code`][claude-code-gh])
[claude-code-gh]: https://github.com/anthropics/claude-code
2026-01-28 14:43:52 -05:00
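Hedged sketch of the client-side handling described above; the `AnnotCtl` stream attribute and request-building helper are assumed names, only the `{'error': msg}` response shape comes from the server-side diff further down.
```python
# sketch only; `self._ipc` and `_mk_rect_msg()` are hypothetical names.
async def add_rect(self, **rect_kwargs) -> int|None:
    await self._ipc.send(
        self._mk_rect_msg(**rect_kwargs)
    )
    match await self._ipc.receive():
        case {'error': str(msg)}:
            # server failed (eg. bad timestamp lookup), no valid `aid`
            log.error(f'Rect annot creation failed,\n{msg}\n')
            return None
        case int(aid):
            return aid
```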
Gud Boi 51d109f7e7 Do time-based shm-index lookup for annots on server
Fix annotation misalignment during backfill by switching from
client-computed indices to server-side timestamp lookups against
current shm state. Store absolute coords on annotations and
reposition on viz redraws.

Lowlevel impl deats,
- add `time` param to `.add_arrow()`, `.add_text()`, `.add_rect()`
- lookup indices from shm via timestamp matching in IPC handlers
- force chart redraw before `markup_gaps()` annotation creation
- wrap IPC send/receive in `trio.fail_after(3)` for timeout when
  server fails to respond, likely hangs on no-case-match/error.
- cache `_meth`/`_kwargs` on rects, `_abs_x`/`_abs_y` on arrows
- auto-reposition all annotations after viz reset in redraw cmd

Also,
- handle `KeyError` for missing timeframes in chart lookup
- return `-1` aid on annotation creation failures (lol oh `claude`..)
- reconstruct rect positions from timestamps + BGM offset logic
- log repositioned annotation counts on viz redraw

(this patch was generated in some part by [`claude-code`][claude-code-gh])
[claude-code-gh]: https://github.com/anthropics/claude-code
2026-01-28 12:48:26 -05:00
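The server-side timestamp lookups appear in the `.ui._remote_ctl` diff below; the client-side timeout guard from the bullet list above is roughly (names assumed):
```python
import trio

async def _req_with_timeout(ipc, req_msg: dict):
    # hedged sketch: if the server hits a no-case-match or errors
    # without replying, raise `trio.TooSlowError` instead of hanging.
    with trio.fail_after(3):
        await ipc.send(req_msg)
        return await ipc.receive()
```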
Gud Boi 76f199df3b Add buffer capacity checks to backfill loop
Prevent `ValueError` from negative prepend index in
`start_backfill()` by checking buffer space before push
attempts. Truncate incoming frame if needed and stop gracefully
when buffer full.

Also,
- add pre-push capacity check with frame truncation logic
- stop backfill when `next_prepend_index <= 0`
- log warnings for capacity exceeded and buffer-full conditions

(this patch was generated in some part by [`claude-code`][claude-code-gh])
[claude-code-gh]: https://github.com/anthropics/claude-code
2026-01-27 23:52:00 -05:00
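The `start_backfill()` diff itself is suppressed below for size; the capacity guard described here amounts to something like the following (variable names are illustrative assumptions):
```python
# hedged sketch, not the actual `start_backfill()` loop.
while backfilling:
    frame, next_prepend_index = await get_next_frame()  # hypothetical

    space_left: int = next_prepend_index
    if space_left <= 0:
        log.warning('Shm buffer is full, stopping backfill!')
        break

    if len(frame) > space_left:
        log.warning(
            f'Frame of {len(frame)} rows exceeds remaining capacity '
            f'{space_left}, truncating!'
        )
        # keep only the newest rows which still fit at the buffer front
        frame = frame[-space_left:]

    shm.push(frame, prepend=True)
```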
Gud Boi 4e3cd7f986 Drop decimal points for whole-number durations
Adjust `humanize_duration()` to show "3h" instead of "3.0h" when the
duration value is a whole number, making labels cleaner.

(this patch was generated in some part by [`claude-code`][claude-code-gh])
[claude-code-gh]: https://github.com/anthropics/claude-code
2026-01-27 21:09:49 -05:00
Gud Boi 1fb0fe3a04 Add `font_size` param to `AnnotCtl.add_text()` API
Expose font sizing control for `pg.TextItem` annotations thru the
annot-ctl API. Default to `_font.font.pixelSize() - 3` when no
size provided.

Also,
- thread `font_size` param thru IPC handler in `serve_rc_annots()`
- apply font via `QFont.setPixelSize()` on text item creation
- add `?TODO` note in `markup_gaps()` re using `conf.toml` value
- update `add_text()` docstring with font_size param desc

(this patch was generated in some part by [`claude-code`][claude-code-gh])
[claude-code-gh]: https://github.com/anthropics/claude-code
2026-01-27 20:53:10 -05:00
Gud Boi de5b1737b4 Add humanized duration labels to gap annotations
Introduce `humanize_duration()` helper in `.tsp._annotate` to
convert seconds to short human-readable format (d/h/m/s). Extend
annot-ctl API with `add_text()` method for placing `pg.TextItem`
labels on charts.

Also,
- add duration labels on RHS of gap arrows in `markup_gaps()`
- handle text item removal in `rm_annot()` match block
- expose `TextItem` cmd in `serve_rc_annots()` IPC handler
- use `hcolor()` for named-to-hex color conversion
- set anchor positioning for up vs down gaps

(this patch was generated in some part by [`claude-code`][claude-code-gh])
[claude-code-gh]: https://github.com/anthropics/claude-code
2026-01-27 20:51:21 -05:00
Gud Boi 1776242413 .ib.feed: trim bars frame to `start_dt` 2026-01-27 17:37:25 -05:00
Gud Boi 848c8ae533 ib._util: ignore timeout-errs when crash-handling `pyvnc` connects 2026-01-27 17:36:33 -05:00
Gud Boi fdea8556d7 Lul, woops compare against first-dt in `.ib.feed` bars frame.. 2026-01-27 16:21:19 -05:00
Gud Boi be28d083e4 Expose more `pg.ArrowItem` params thru annot-ctl API 2026-01-27 16:20:23 -05:00
Gud Boi 8701b517e7 Add `pexpect`, `xonsh`@github:main to deps
The former bc `claude` needs it for its new "offline" REPL simulator
script `snippets/claude_debug_helper.py` and pin to `xonsh` git mainline
to get the fancy new next cmd/suggestion prompt feats (which @goodboy is
using from `modden` already). Bump lock file to match.

Ah right, and for now while hackin pin to a local `tractor` Bp
2026-01-27 14:16:51 -05:00
Gud Boi f39b362bc4 Add a couple cooler "cooler"/"muted" red and greens 2026-01-27 13:34:52 -05:00
Gud Boi d2e1d6ce91 Add break for single bar null segments 2026-01-27 13:33:46 -05:00
Gud Boi d0966e0363 Space gap rect-annots "between" start-end bars 2026-01-27 13:33:13 -05:00
Gud Boi 4081336bd3 Catch too-early ib hist frames
For now by REPLing them and raising an RTE inside `.ib.feed` as well as
tracing any such cases that make it (from other providers) up to the
`.tsp._history` layer during null-segment backfilling.
2026-01-27 13:17:28 -05:00
Gud Boi ff502b62bf .ui.order_mode: multiline import styling 2026-01-25 22:19:39 -05:00
Gud Boi e77bec203d Add arrow indicators to time gaps
Such that they're easier to spot when zoomed out, with a similar color to the
`RectItem`s and also remote-controlled via the `AnnotCtl` api.

Deats,
- request an arrow per gap from `markup_gaps()` using a new
  `.add_arrow()` meth, set the color, direction and alpha with
  position always as the `iend`/close of the last valid bar.
- extend the `.ui._remote_ctl` subsys to support the above,
  * add a new `AnnotCtl.add_arrow()`.
  * add the service-side IPC endpoint for a 'cmd': 'ArrowEditor'.
- add a new `rm_annot()` helper to ensure the right graphics removal
  API is used by annotation type:
  * `pg.ArrowItem` looks up the `ArrowEditor` and calls `.remove(annot)`.
  * `pg.SelectRect` keeps with calling `.delete()`.
- global-ize an `_editors` table to enable the prior.
- add an explicit RTE for races on the chart-actor's `_dss` init.
2026-01-25 22:06:59 -05:00
Gud Boi 809ec6accb Arrow editor refinements in prep for gap checker
Namely exposing `ArrowEditor.add()` params to provide access to
coloring/transparency settings over the remote-ctl annotation API and
also adding a new `.remove_all()` to easily clear all arrows from
a single call. Also add `.remove()` compat methods to the other editors
(i.e. for lines, rects).
2026-01-25 14:14:42 -05:00
Gud Boi ad299789db Mv `markup_gaps()` to new `.tsp._annotate` mod 2026-01-21 23:52:12 -05:00
Gud Boi cd6bc105de Enable tracing back insert backfills
Namely insertion writes which over-fill the shm buffer past the latest
tsdb sample via `.tsp._history.shm_push_in_between()`.

Deats,
- check earliest `to_push` timestamp and enter pause point if it's
  earlier than the tsdb's `backfill_until_dt` stamp.
- requires actually passing the `backfill_until_dt: datetime` thru,
  * `get_null_segs()`
  * `maybe_fill_null_segments()`
  * `shm_push_in_between()` (obvi XD)
2026-01-21 22:38:42 -05:00
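The `.tsp._history` diff is suppressed below, but the described check amounts to something like this sketch (names assumed, presumed to live inside the async `shm_push_in_between()` body):
```python
# hedged sketch of the trace-back guard.
first_to_push_t: float = to_push['time'][0]
if (
    backfill_until_dt
    and first_to_push_t < backfill_until_dt.timestamp()
):
    # this insert-write would land earlier than the tsdb's
    # `backfill_until_dt` boundary: drop into the REPL to introspect.
    await tractor.pause()
```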
Gud Boi a8e4e1b2c5 Tolerate various "bad data" cases in `markup_gaps()`
Namely such that when the previous-df-row by our shm-abs-'index' doesn't
exist we ignore certain cases which are likely due to borked-but-benign
samples written to the tsdb or rt shm buffers prior.

Particularly we now ignore,
- any `dt`/`prev_dt` values which are UNIX-epoch timestamped (val of 0).
- any row-is-first-row in the df; there is no previous.
- any missing previous datum by 'index', in which case we lookup the
  `wdts` prior row and use that instead.
  * this would indicate a missing sample for the time-step but we can
    still detect a "gap" by looking at the prior row, by df-abs-index
    `i`, and use its timestamp to determine the period/size of missing
    samples (which need to likely still be retrieved).
  * in this case i'm leaving in a pause-point for introspecting these
    rarer cases when `--pdb` is passed via CLI.

Relatedly in the `piker store` CLI ep,
- add `--pdb` flag to `piker store`, pass it verbatim as `debug_mode`.
- when `times` has only a single row, don't calc a `period_s` median.
- only trace `null_segs` when in debug mode.
- always markup/dedupe gaps for `period_s==60`
2026-01-21 22:20:43 -05:00
Gud Boi caf2cc5a5b ib: up API timeout default for remote host conns 2026-01-21 22:20:43 -05:00
Gud Boi d4b46e0eda Fix `Qt6` types for new sub-namespaces 2026-01-21 22:20:43 -05:00
Gud Boi a1048c847b Add vlm-based "smart" OHLCV de-duping & bar validation
Using `claude`, add a `.tsp._dedupe_smart` module that attempts "smarter"
handling of duplicate bars by distinguishing between erroneous bars
partially written during concurrent backfill race conditions vs.
**actual** data quality issues from historical providers.

Problem:
--------
Concurrent writes (live updates vs. backfilling) can create
duplicate timestamped ohlcv bars with different values. Some
potential scenarios include,

- a market live feed is cancelled during live update resulting in the
  "last" datum being partially updated with all the ticks for the
  time step.
- when the feed is rebooted during charting, the backfiller will not
  finalize this bar since rn it presumes it should only fill data for
  time steps not already in the tsdb storage.

Our current naive `.unique()` approach obvi keeps the incomplete bar
and a "smarter" approach is to compare the provider's final vlm
amount vs. the maybe-cancelled tsdb bar's; a higher vlm value from
the provider likely indicates the cancelled-during-live-write case and
**not** a datum discrepancy from said data provider.

Analysis (with `claude`) of `zecusdt` data revealed:
- 1000 duplicate timestamps
- 999 identical bars (pure duplicates from 2022 backfill overlap)
- 1 volume-monotonic conflict (live partial vs backfill complete)

A soln from `claude` -> `tsp._dedupe_smart.dedupe_ohlcv_smart()`
which:
- sorts by vlm **before** deduplication and keeps the most complete
  bar based on vlm monotonicity as well as the following OHLCV
  validation assumptions:
  * volume should always increase
  * high should be non-decreasing,
  * low should be non-increasing
  * open should be identical
- Separates valid race conditions from provider data quality issues
  and reports and returns both dfs.

Change summary by `claude`:
- `.tsp._dedupe_smart`: new module with validation logic
- `.tsp.__init__`: expose `dedupe_ohlcv_smart()`
- `.storage.cli`: integrate smart dedupe, add logging for:
  * duplicate counts (identical vs monotonic races)
  * data quality violations (non-monotonic, invalid OHLC ranges)
  * warnings for provider data issues
- Remove `assert not diff` (duplicates are valid now)

Verified on `zecusdt`: correctly keeps index 3143645
(volume=287.777) over 3143644 (volume=140.299) for
conflicting 2026-01-16 18:54 UTC bar.

`claude`'s Summary of reasoning
-------------------------------
- volume monotonicity is critical: a bar's volume only increases
  during its time window.
- a backfilled bar should always have volume >= live updated.
- violations indicate any of:
  * Provider data corruption
  * Non-OHLCV aggregation semantics
  * Timestamp misalignment

(this patch was generated in some part by [`claude-code`][claude-code-gh])
[claude-code-gh]: https://github.com/anthropics/claude-code
2026-01-21 22:20:43 -05:00
Gud Boi 192fe0dc73 Add `pexpect`-based `pdbp`-REPL offline helper
Add a new `snippets/claude_debug_helper.py` to
provide a programmatic interface to `tractor.pause()` debugger
sessions for incremental data inspection matching the interactive UX
but able to be run by `claude` "offline" since it can't seem to feed
stdin (so it claims) to the `pdb` instance due to lack of ability to
allocate a tty internally.

The script-wrapper is based on `tractor`'s `tests/devx/` suite's use of
`pexpect` patterns for driving `pdbp` prompts and thus enables
automated-offline execution of REPL-inspection commands **without**
using incremental-realtime output capture (like a human would use it).

Features:
- `run_pdb_commands()`: batch command execution
- `InteractivePdbSession`: context manager for step-by-step REPL interaction
- `expect()` wrapper: timeout handling with buffer display
- Proper stdin/stdout handling via `pexpect.spawn()`

Example usage:
```python
from debug_helper import InteractivePdbSession

with InteractivePdbSession(
    cmd='piker store ldshm zecusdt.usdtm.perp.binance'
) as session:
    session.run('deduped.shape')
    session.run('step_gaps.shape')
```

(this patch was generated in some part by [`claude-code`][claude-code-gh])
[claude-code-gh]: https://github.com/anthropics/claude-code
2026-01-21 22:20:43 -05:00
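The helper script isn't part of the diffs shown below, so here's a rough sketch of what the `InteractivePdbSession` wrapper might look like based on the features listed above; the prompt pattern and ctor args are assumptions.
```python
# hedged sketch only, not the actual `snippets/claude_debug_helper.py`.
import pexpect


class InteractivePdbSession:
    def __init__(
        self,
        cmd: str,
        prompt: str = r'\(Pdb\+*\) ',  # pdbp/pdb prompt pattern (assumed)
        timeout: int = 30,
    ):
        self.cmd = cmd
        self.prompt = prompt
        self.timeout = timeout
        self.child: pexpect.spawn|None = None

    def __enter__(self) -> 'InteractivePdbSession':
        self.child = pexpect.spawn(
            self.cmd,
            encoding='utf-8',
            timeout=self.timeout,
        )
        self.child.expect(self.prompt)
        return self

    def run(self, pdb_cmd: str) -> str:
        # send one REPL command, return output up to the next prompt
        self.child.sendline(pdb_cmd)
        self.child.expect(self.prompt)
        return self.child.before

    def __exit__(self, *exc) -> None:
        self.child.sendline('continue')
        self.child.close()
```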
Gud Boi 4bfdd388bb Fix polars 1.36.0 duration API
Polars tightened type safety for `.dt` accessor methods: duration
types now require the `total_*` methods, while datetime-component
accessors like `day()` only work on datetime dtypes.

`detect_time_gaps()` in `.tsp._anal` was calling `.dt.day()`
on `dt_diff` column (a duration from `.diff()`) which throws
`InvalidOperationError` on modern polars.

Changes:
- use an f-string to pluralize time-unit strings into the
  `total_<unit>s` form required by the new duration API.
- Handle singular/plural forms: 'day' -> 'days' -> 'total_days'
- Ensure trailing 's' before applying 'total_' prefix

Also updates inline comments explaining the polars type distinction
between datetime components vs duration totals.

Fixes `piker store ldshm` crashes on datasets with time gaps.

(this patch was generated in some part by [`claude-code`][claude-code-gh])
[claude-code-gh]: https://github.com/anthropics/claude-code
2026-01-21 22:20:43 -05:00
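For illustration only (a standalone example, not repo code), the polars behavior the fix works around:
```python
import polars as pl
from datetime import datetime

df = pl.DataFrame({'dt': [datetime(2026, 1, 1), datetime(2026, 1, 4)]})
diffs = df.with_columns(pl.col('dt').diff().alias('dt_diff'))

# `dt_diff` is a duration dtype -> must use the `total_*` accessors..
days = diffs.select(pl.col('dt_diff').dt.total_days())

# ..whereas `.dt.day()` is a datetime-component accessor and raises
# `InvalidOperationError` when called on a duration column.
```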
Tyler Goodlet 534b13f755 `.storage.__init__`: code styling updates 2026-01-21 22:20:43 -05:00
Tyler Goodlet 108646fdfb `.tsp._history`: drop `feed_is_live` syncing, another seg flag
The `await feed_is_live.wait()` is more or less pointless and would only
cause slower startup afaig (as-far-as-i-grok) so i'm masking it here.
This also removes the final `strict_exception_groups=False` use from the
non-tests code base, flipping to the `tractor.trionics` collapser once
and for all!
2026-01-21 22:20:43 -05:00
Tyler Goodlet d6d4fec666 Woops, keep `np2pl` exposed from `.tsp` 2026-01-21 22:20:43 -05:00
Tyler Goodlet 14ac351a65 Factor to a new `.tsp._history` sub-mod
Cleaning out the `piker.tsp` pkg-mod to be only the (re)exports needed
for `._anal`/`._history` refs-use elsewhere!
2026-01-21 22:20:43 -05:00
20 changed files with 3564 additions and 1722 deletions

View File

@ -250,7 +250,9 @@ async def vnc_click_hack(
'connection': 'r' 'connection': 'r'
}[reset_type] }[reset_type]
with tractor.devx.open_crash_handler(): with tractor.devx.open_crash_handler(
ignore={TimeoutError,},
):
client = await AsyncVNCClient.connect( client = await AsyncVNCClient.connect(
VNCConfig( VNCConfig(
host=host, host=host,
@ -331,7 +333,14 @@ def i3ipc_xdotool_manual_click_hack() -> None:
''' '''
focussed, matches = i3ipc_fin_wins_titled() focussed, matches = i3ipc_fin_wins_titled()
try:
orig_win_id = focussed.window orig_win_id = focussed.window
except AttributeError:
# XXX if .window cucks we prolly aren't intending to
# use this and/or just woke up from suspend..
log.exception('xdotool invalid usage ya ??\n')
return
try: try:
for name, con in matches: for name, con in matches:
print(f'Resetting data feed for {name}') print(f'Resetting data feed for {name}')

View File

@ -1187,7 +1187,7 @@ async def load_aio_clients(
# the API TCP in `ib_insync` connection can be flaky af so instead # the API TCP in `ib_insync` connection can be flaky af so instead
# retry a few times to get the client going.. # retry a few times to get the client going..
connect_retries: int = 3, connect_retries: int = 3,
connect_timeout: float = 10, connect_timeout: float = 30, # in case a remote-host
disconnect_on_exit: bool = True, disconnect_on_exit: bool = True,
) -> dict[str, Client]: ) -> dict[str, Client]:

View File

@ -262,7 +262,38 @@ async def open_history_client(
vlm = bars_array['volume'] vlm = bars_array['volume']
vlm[vlm < 0] = 0 vlm[vlm < 0] = 0
return bars_array, first_dt, last_dt # XXX, if a start-limit was passed ensure we only
# return history that far back!
if (
start_dt
and
first_dt < start_dt
):
trimmed_bars = bars_array[
bars_array['time'] >= start_dt.timestamp()
]
if (
trimmed_first_dt := from_timestamp(trimmed_bars['time'][0])
!=
start_dt
):
# TODO! rm this once we're more confident it never hits!
breakpoint()
raise RuntimeError(
f'OHLC-bars array start is gt `start_dt` limit !!\n'
f'start_dt: {start_dt}\n'
f'first_dt: {first_dt}\n'
f'trimmed_first_dt: {trimmed_first_dt}\n'
)
# XXX, overwrite with start_dt-limited frame
bars_array = trimmed_bars
return (
bars_array,
first_dt,
last_dt,
)
# TODO: it seems like we can do async queries for ohlc # TODO: it seems like we can do async queries for ohlc
# but getting the order right still isn't working and I'm not # but getting the order right still isn't working and I'm not
@ -451,6 +482,8 @@ async def get_bars(
dt_duration, dt_duration,
) = await proxy.bars( ) = await proxy.bars(
fqme=fqme, fqme=fqme,
# XXX TODO! lol we're not using this..
# start_dt=start_dt,
end_dt=end_dt, end_dt=end_dt,
sample_period_s=timeframe, sample_period_s=timeframe,
@ -1213,54 +1246,12 @@ async def stream_quotes(
tn.start_soon(reset_on_feed) tn.start_soon(reset_on_feed)
async with aclosing(iter_quotes): async with aclosing(iter_quotes):
# if syminfo.get('no_vlm', False):
if not init_msg.shm_write_opts['has_vlm']:
# generally speaking these feeds don't
# include vlm data.
atype: str = mkt.dst.atype
log.info(
f'No-vlm {mkt.fqme}@{atype}, skipping quote poll'
)
else:
# wait for real volume on feed (trading might be
# closed)
while True:
ticker = await iter_quotes.receive()
# for a real volume contract we rait for
# the first "real" trade to take place
if (
# not calc_price
# and not ticker.rtTime
False
# not ticker.rtTime
):
# spin consuming tickers until we
# get a real market datum
log.debug(f"New unsent ticker: {ticker}")
continue
else:
log.debug("Received first volume tick")
# ugh, clear ticks since we've
# consumed them (ahem, ib_insync is
# truly stateful trash)
# ticker.ticks = []
# XXX: this works because we don't use
# ``aclosing()`` above?
break
quote = normalize(ticker)
log.debug(f"First ticker received {quote}")
# tell data-layer spawner-caller that live # tell data-layer spawner-caller that live
# quotes are now active desptie not having # quotes are now active desptie not having
# necessarily received a first vlm/clearing # necessarily received a first vlm/clearing
# tick. # tick.
ticker = await iter_quotes.receive() ticker = await iter_quotes.receive()
quote = normalize(ticker)
feed_is_live.set() feed_is_live.set()
fqme: str = quote['fqme'] fqme: str = quote['fqme']
await send_chan.send({fqme: quote}) await send_chan.send({fqme: quote})

View File

@ -61,15 +61,15 @@ def load_trans_eps(
if ( if (
network network
and not maddrs and
not maddrs
): ):
# load network section and (attempt to) connect all endpoints # load network section and (attempt to) connect all endpoints
# which are reachable B) # which are reachable B)
for key, maddrs in network.items(): for key, maddrs in network.items():
match key: match key:
# TODO: resolve table across multiple discov # TODO: resolve table across multiple discov prots Bo
# prots Bo
case 'resolv': case 'resolv':
pass pass
@ -156,7 +156,8 @@ def pikerd(
network: dict = conf.get('network') network: dict = conf.get('network')
if ( if (
network is None network is None
and not maddr and
not maddr
): ):
regaddrs = [( regaddrs = [(
_default_registry_host, _default_registry_host,
@ -168,10 +169,40 @@ def pikerd(
network, network,
maddr, maddr,
) )
match eps:
case {
'pikerd': [
{
'ipv4': {
'layer': 3,
'addr': host,
'proto': l3_proto,
},
'tcp': {
'layer': 4,
'port': int(port),
'proto': l4_proto,
},
},
# | {
# 'uds': {
# 'layer': 4,
# 'port': int(port),
# },
# },
],
}:
assert l3_proto == 'ipv4'
assert l4_proto == 'tcp'
# breakpoint()
# XXX, for now always setup a localhost TCP registry addr.
for layers in eps['pikerd']: for layers in eps['pikerd']:
host: str = layers['ipv4']['addr']
port: int = layers['tcp']['port']
regaddrs.append(( regaddrs.append((
layers['ipv4']['addr'], host,
layers['tcp']['port'], port,
)) ))
from .. import service from .. import service
@ -183,8 +214,8 @@ def pikerd(
registry_addrs=regaddrs, registry_addrs=regaddrs,
loglevel=loglevel, loglevel=loglevel,
debug_mode=pdb, debug_mode=pdb,
# enable_transports=['uds'], enable_transports=[l4_proto],
enable_transports=['tcp'], # enable_transports=['tcp'],
) as service_mngr, ) as service_mngr,
): ):
assert service_mngr assert service_mngr

View File

@ -520,7 +520,10 @@ def open_shm_array(
# "unlink" created shm on process teardown by # "unlink" created shm on process teardown by
# pushing teardown calls onto actor context stack # pushing teardown calls onto actor context stack
stack = tractor.current_actor().lifetime_stack stack = tractor.current_actor(
err_on_no_runtime=False,
).lifetime_stack
if stack:
stack.callback(shmarr.close) stack.callback(shmarr.close)
stack.callback(shmarr.destroy) stack.callback(shmarr.destroy)
@ -607,7 +610,10 @@ def attach_shm_array(
_known_tokens[key] = token _known_tokens[key] = token
# "close" attached shm on actor teardown # "close" attached shm on actor teardown
tractor.current_actor().lifetime_stack.callback(sha.close) if (actor := tractor.current_actor(
err_on_no_runtime=False,
)):
actor.lifetime_stack.callback(sha.close)
return sha return sha

View File

@ -43,7 +43,6 @@ from typing import (
import numpy as np import numpy as np
from .. import config from .. import config
from ..service import ( from ..service import (
check_for_service, check_for_service,
@ -152,7 +151,10 @@ class StorageConnectionError(ConnectionError):
''' '''
def get_storagemod(name: str) -> ModuleType: def get_storagemod(
name: str,
) -> ModuleType:
mod: ModuleType = import_module( mod: ModuleType = import_module(
'.' + name, '.' + name,
'piker.storage', 'piker.storage',
@ -167,7 +169,10 @@ def get_storagemod(name: str) -> ModuleType:
async def open_storage_client( async def open_storage_client(
backend: str|None = None, backend: str|None = None,
) -> tuple[ModuleType, StorageClient]: ) -> tuple[
ModuleType,
StorageClient,
]:
''' '''
Load the ``StorageClient`` for named backend. Load the ``StorageClient`` for named backend.
@ -267,7 +272,10 @@ async def open_tsdb_client(
from ..data.feed import maybe_open_feed from ..data.feed import maybe_open_feed
async with ( async with (
open_storage_client() as (_, storage), open_storage_client() as (
_,
storage,
),
maybe_open_feed( maybe_open_feed(
[fqme], [fqme],
@ -275,7 +283,7 @@ async def open_tsdb_client(
) as feed, ) as feed,
): ):
profiler(f'opened feed for {fqme}') profiler(f'opened feed for {fqme!r}')
# to_append = feed.hist_shm.array # to_append = feed.hist_shm.array
# to_prepend = None # to_prepend = None

View File

@ -19,16 +19,10 @@ Storage middle-ware CLIs.
""" """
from __future__ import annotations from __future__ import annotations
# from datetime import datetime
# from contextlib import (
# AsyncExitStack,
# )
from pathlib import Path from pathlib import Path
from math import copysign
import time import time
from types import ModuleType from types import ModuleType
from typing import ( from typing import (
Any,
TYPE_CHECKING, TYPE_CHECKING,
) )
@ -47,7 +41,6 @@ from piker.data import (
ShmArray, ShmArray,
) )
from piker import tsp from piker import tsp
from piker.data._formatters import BGM
from . import log from . import log
from . import ( from . import (
__tsdbs__, __tsdbs__,
@ -242,122 +235,12 @@ def anal(
trio.run(main) trio.run(main)
async def markup_gaps(
fqme: str,
timeframe: float,
actl: AnnotCtl,
wdts: pl.DataFrame,
gaps: pl.DataFrame,
) -> dict[int, dict]:
'''
Remote annotate time-gaps in a dt-fielded ts (normally OHLC)
with rectangles.
'''
aids: dict[int] = {}
for i in range(gaps.height):
row: pl.DataFrame = gaps[i]
# the gap's RIGHT-most bar's OPEN value
# at that time (sample) step.
iend: int = row['index'][0]
# dt: datetime = row['dt'][0]
# dt_prev: datetime = row['dt_prev'][0]
# dt_end_t: float = dt.timestamp()
# TODO: can we eventually remove this
# once we figure out why the epoch cols
# don't match?
# TODO: FIX HOW/WHY these aren't matching
# and are instead off by 4hours (EST
# vs. UTC?!?!)
# end_t: float = row['time']
# assert (
# dt.timestamp()
# ==
# end_t
# )
# the gap's LEFT-most bar's CLOSE value
# at that time (sample) step.
prev_r: pl.DataFrame = wdts.filter(
pl.col('index') == iend - 1
)
# XXX: probably a gap in the (newly sorted or de-duplicated)
# dt-df, so we might need to re-index first..
if prev_r.is_empty():
await tractor.pause()
istart: int = prev_r['index'][0]
# dt_start_t: float = dt_prev.timestamp()
# start_t: float = prev_r['time']
# assert (
# dt_start_t
# ==
# start_t
# )
# TODO: implement px-col width measure
# and ensure at least as many px-cols
# shown per rect as configured by user.
# gap_w: float = abs((iend - istart))
# if gap_w < 6:
# margin: float = 6
# iend += margin
# istart -= margin
rect_gap: float = BGM*3/8
opn: float = row['open'][0]
ro: tuple[float, float] = (
# dt_end_t,
iend + rect_gap + 1,
opn,
)
cls: float = prev_r['close'][0]
lc: tuple[float, float] = (
# dt_start_t,
istart - rect_gap, # + 1 ,
cls,
)
color: str = 'dad_blue'
diff: float = cls - opn
sgn: float = copysign(1, diff)
color: str = {
-1: 'buy_green',
1: 'sell_red',
}[sgn]
rect_kwargs: dict[str, Any] = dict(
fqme=fqme,
timeframe=timeframe,
start_pos=lc,
end_pos=ro,
color=color,
)
aid: int = await actl.add_rect(**rect_kwargs)
assert aid
aids[aid] = rect_kwargs
# tell chart to redraw all its
# graphics view layers Bo
await actl.redraw(
fqme=fqme,
timeframe=timeframe,
)
return aids
@store.command() @store.command()
def ldshm( def ldshm(
fqme: str, fqme: str,
write_parquet: bool = True, write_parquet: bool = True,
reload_parquet_to_shm: bool = True, reload_parquet_to_shm: bool = True,
pdb: bool = False, # --pdb passed?
) -> None: ) -> None:
''' '''
@ -377,7 +260,7 @@ def ldshm(
open_piker_runtime( open_piker_runtime(
'polars_boi', 'polars_boi',
enable_modules=['piker.data._sharedmem'], enable_modules=['piker.data._sharedmem'],
debug_mode=True, debug_mode=pdb,
), ),
open_storage_client() as ( open_storage_client() as (
mod, mod,
@ -397,6 +280,9 @@ def ldshm(
times: np.ndarray = shm.array['time'] times: np.ndarray = shm.array['time']
d1: float = float(times[-1] - times[-2]) d1: float = float(times[-1] - times[-2])
d2: float = 0
# XXX, take a median sample rate if sufficient data
if times.size > 2:
d2: float = float(times[-2] - times[-3]) d2: float = float(times[-2] - times[-3])
med: float = np.median(np.diff(times)) med: float = np.median(np.diff(times))
if ( if (
@ -407,7 +293,6 @@ def ldshm(
raise ValueError( raise ValueError(
f'Something is wrong with time period for {shm}:\n{times}' f'Something is wrong with time period for {shm}:\n{times}'
) )
period_s: float = float(max(d1, d2, med)) period_s: float = float(max(d1, d2, med))
null_segs: tuple = tsp.get_null_segs( null_segs: tuple = tsp.get_null_segs(
@ -417,6 +302,8 @@ def ldshm(
# TODO: call null-seg fixer somehow? # TODO: call null-seg fixer somehow?
if null_segs: if null_segs:
if tractor._state.is_debug_mode():
await tractor.pause() await tractor.pause()
# async with ( # async with (
# trio.open_nursery() as tn, # trio.open_nursery() as tn,
@ -441,9 +328,35 @@ def ldshm(
wdts, wdts,
deduped, deduped,
diff, diff,
) = tsp.dedupe( valid_races,
dq_issues,
) = tsp.dedupe_ohlcv_smart(
shm_df, shm_df,
period=period_s, )
# Report duplicate analysis
if diff > 0:
log.info(
f'Removed {diff} duplicate timestamp(s)\n'
)
if valid_races is not None:
identical: int = (
valid_races
.filter(pl.col('identical_bars'))
.height
)
monotonic: int = valid_races.height - identical
log.info(
f'Valid race conditions: {valid_races.height}\n'
f' - Identical bars: {identical}\n'
f' - Volume monotonic: {monotonic}\n'
)
if dq_issues is not None:
log.warning(
f'DATA QUALITY ISSUES from provider: '
f'{dq_issues.height} timestamp(s)\n'
f'{dq_issues}\n'
) )
# detect gaps from in expected (uniform OHLC) sample period # detect gaps from in expected (uniform OHLC) sample period
@ -460,7 +373,8 @@ def ldshm(
# TODO: actually pull the exact duration # TODO: actually pull the exact duration
# expected for each venue operational period? # expected for each venue operational period?
gap_dt_unit='days', # gap_dt_unit='day',
gap_dt_unit='day',
gap_thresh=1, gap_thresh=1,
) )
@ -471,8 +385,11 @@ def ldshm(
if ( if (
not venue_gaps.is_empty() not venue_gaps.is_empty()
or ( or (
period_s < 60 not step_gaps.is_empty()
and not step_gaps.is_empty() # XXX, i presume i put this bc i was guarding
# for ib venue gaps?
# and
# period_s < 60
) )
): ):
# write repaired ts to parquet-file? # write repaired ts to parquet-file?
@ -521,7 +438,7 @@ def ldshm(
do_markup_gaps: bool = True do_markup_gaps: bool = True
if do_markup_gaps: if do_markup_gaps:
new_df: pl.DataFrame = tsp.np2pl(new) new_df: pl.DataFrame = tsp.np2pl(new)
aids: dict = await markup_gaps( aids: dict = await tsp._annotate.markup_gaps(
fqme, fqme,
period_s, period_s,
actl, actl,
@ -534,8 +451,13 @@ def ldshm(
tf2aids[period_s] = aids tf2aids[period_s] = aids
else: else:
# allow interaction even when no ts problems. # No significant gaps to handle, but may have had
assert not diff # duplicates removed (valid race conditions are ok)
if diff > 0 and dq_issues is not None:
log.warning(
'Found duplicates with data quality issues '
'but no significant time gaps!\n'
)
await tractor.pause() await tractor.pause()
log.info('Exiting TSP shm anal-izer!') log.info('Exiting TSP shm anal-izer!')

File diff suppressed because it is too large

View File

@ -275,6 +275,18 @@ def get_null_segs(
# diff of abs index steps between each zeroed row # diff of abs index steps between each zeroed row
absi_zdiff: np.ndarray = np.diff(absi_zeros) absi_zdiff: np.ndarray = np.diff(absi_zeros)
if zero_t.size < 2:
try:
breakpoint()
except RuntimeError:
# XXX, if greenback not active from
# piker store ldshm cmd..
log.exception(
"Can't debug single-sample null!\n"
)
return None
# scan for all frame-indices where the # scan for all frame-indices where the
# zeroed-row-abs-index-step-diff is greater then the # zeroed-row-abs-index-step-diff is greater then the
# expected increment of 1. # expected increment of 1.
@ -487,7 +499,8 @@ def iter_null_segs(
start_dt = None start_dt = None
if ( if (
absi_start is not None absi_start is not None
and start_t != 0 and
start_t != 0
): ):
fi_start: int = absi_start - absi_first fi_start: int = absi_start - absi_first
start_row: Seq = frame[fi_start] start_row: Seq = frame[fi_start]
@ -501,8 +514,8 @@ def iter_null_segs(
yield ( yield (
absi_start, absi_end, # abs indices absi_start, absi_end, # abs indices
fi_start, fi_end, # relative "frame" indices fi_start, fi_end, # relative "frame" indices
start_t, end_t, start_t, end_t, # epoch times
start_dt, end_dt, start_dt, end_dt, # dts
) )
@ -578,11 +591,22 @@ def detect_time_gaps(
# NOTE: this flag is to indicate that on this (sampling) time # NOTE: this flag is to indicate that on this (sampling) time
# scale we expect to only be filtering against larger venue # scale we expect to only be filtering against larger venue
# closures-scale time gaps. # closures-scale time gaps.
#
# Map to total_ method since `dt_diff` is a duration type,
# not datetime - modern polars requires `total_*` methods
# for duration types (e.g. `total_days()` not `day()`)
# Ensure plural form for polars API (e.g. 'day' -> 'days')
unit_plural: str = (
gap_dt_unit
if gap_dt_unit.endswith('s')
else f'{gap_dt_unit}s'
)
duration_method: str = f'total_{unit_plural}'
return step_gaps.filter( return step_gaps.filter(
# Second by an arbitrary dt-unit step size # Second by an arbitrary dt-unit step size
getattr( getattr(
pl.col('dt_diff').dt, pl.col('dt_diff').dt,
gap_dt_unit, duration_method,
)().abs() > gap_thresh )().abs() > gap_thresh
) )

View File

@ -0,0 +1,306 @@
# piker: trading gear for hackers
# Copyright (C) 2018-present Tyler Goodlet (in stewardship of pikers)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""
Time-series (remote) annotation APIs.
"""
from __future__ import annotations
from math import copysign
from typing import (
Any,
TYPE_CHECKING,
)
import polars as pl
import tractor
from piker.data._formatters import BGM
from piker.storage import log
from piker.ui._style import get_fonts
if TYPE_CHECKING:
from piker.ui._remote_ctl import AnnotCtl
def humanize_duration(
seconds: float,
) -> str:
'''
Convert duration in seconds to short human-readable form.
Uses smallest appropriate time unit:
- d: days
- h: hours
- m: minutes
- s: seconds
Examples:
- 86400 -> "1d"
- 28800 -> "8h"
- 180 -> "3m"
- 45 -> "45s"
'''
abs_secs: float = abs(seconds)
if abs_secs >= 86400:
days: float = abs_secs / 86400
if days >= 10 or days == int(days):
return f'{int(days)}d'
return f'{days:.1f}d'
elif abs_secs >= 3600:
hours: float = abs_secs / 3600
if hours >= 10 or hours == int(hours):
return f'{int(hours)}h'
return f'{hours:.1f}h'
elif abs_secs >= 60:
mins: float = abs_secs / 60
if mins >= 10 or mins == int(mins):
return f'{int(mins)}m'
return f'{mins:.1f}m'
else:
if abs_secs >= 10 or abs_secs == int(abs_secs):
return f'{int(abs_secs)}s'
return f'{abs_secs:.1f}s'
async def markup_gaps(
fqme: str,
timeframe: float,
actl: AnnotCtl,
wdts: pl.DataFrame,
gaps: pl.DataFrame,
# XXX, switch on to see txt showing a "humanized" label of each
# gap's duration.
show_txt: bool = False,
) -> dict[int, dict]:
'''
Remote annotate time-gaps in a dt-fielded ts (normally OHLC)
with rectangles.
'''
# XXX: force chart redraw FIRST to ensure PlotItem coordinate
# system is properly initialized before we position annotations!
# Without this, annotations may be misaligned on first creation
# due to Qt/pyqtgraph initialization race conditions.
await actl.redraw(
fqme=fqme,
timeframe=timeframe,
)
aids: dict[int] = {}
for i in range(gaps.height):
row: pl.DataFrame = gaps[i]
# the gap's RIGHT-most bar's OPEN value
# at that time (sample) step.
iend: int = row['index'][0]
# dt: datetime = row['dt'][0]
# dt_prev: datetime = row['dt_prev'][0]
# dt_end_t: float = dt.timestamp()
# TODO: can we eventually remove this
# once we figure out why the epoch cols
# don't match?
# TODO: FIX HOW/WHY these aren't matching
# and are instead off by 4hours (EST
# vs. UTC?!?!)
# end_t: float = row['time']
# assert (
# dt.timestamp()
# ==
# end_t
# )
# the gap's LEFT-most bar's CLOSE value
# at that time (sample) step.
prev_r: pl.DataFrame = wdts.filter(
pl.col('index') == iend - 1
)
# XXX: probably a gap in the (newly sorted or de-duplicated)
# dt-df, so we might need to re-index first..
dt: pl.Series = row['dt']
dt_prev: pl.Series = row['dt_prev']
if prev_r.is_empty():
# XXX, filter out any special ignore cases,
# - UNIX-epoch stamped datums
# - first row
if (
dt_prev.dt.epoch()[0] == 0
or
dt.dt.epoch()[0] == 0
):
log.warning('Skipping row with UNIX epoch timestamp ??')
continue
if wdts[0]['index'][0] == iend: # first row
log.warning('Skipping first-row (has no previous obvi) !!')
continue
# XXX, if the previous-row by shm-index is missing,
# meaning there is a missing sample (set), get the prior
# row by df index and attempt to use it?
i_wdts: pl.DataFrame = wdts.with_row_index(name='i')
i_row: int = i_wdts.filter(pl.col('index') == iend)['i'][0]
prev_row_by_i = wdts[i_row]
prev_r: pl.DataFrame = prev_row_by_i
# debug any missing pre-row
if tractor._state.is_debug_mode():
await tractor.pause()
istart: int = prev_r['index'][0]
# TODO: implement px-col width measure
# and ensure at least as many px-cols
# shown per rect as configured by user.
# gap_w: float = abs((iend - istart))
# if gap_w < 6:
# margin: float = 6
# iend += margin
# istart -= margin
opn: float = row['open'][0]
cls: float = prev_r['close'][0]
# get gap duration for humanized label
gap_dur_s: float = row['s_diff'][0]
gap_label: str = humanize_duration(gap_dur_s)
# XXX: get timestamps for server-side index lookup
start_time: float = prev_r['time'][0]
end_time: float = row['time'][0]
# BGM=0.16 is the normal diff from overlap between bars, SO
# just go slightly "in" from that "between them".
from_idx: int = BGM - .06 # = .10
lc: tuple[float, float] = (
istart + 1 - from_idx,
cls,
)
ro: tuple[float, float] = (
iend + from_idx,
opn,
)
diff: float = cls - opn
sgn: float = copysign(1, diff)
up_gap: bool = sgn == -1
down_gap: bool = sgn == 1
flat: bool = sgn == 0
color: str = 'dad_blue'
# TODO? mks more sense to have up/down coloring?
# color: str = {
# -1: 'lilypad_green', # up-gap
# 1: 'wine', # down-gap
# }[sgn]
rect_kwargs: dict[str, Any] = dict(
fqme=fqme,
timeframe=timeframe,
start_pos=lc,
end_pos=ro,
color=color,
start_time=start_time,
end_time=end_time,
)
# add up/down rects
aid: int|None = await actl.add_rect(**rect_kwargs)
if aid is None:
log.error(
f'Failed to add rect for,\n'
f'{rect_kwargs!r}\n'
f'\n'
f'Skipping to next gap!\n'
)
continue
assert aid
aids[aid] = rect_kwargs
direction: str = (
'down' if down_gap
else 'up'
)
# TODO! mk this a `msgspec.Struct` which we deserialize
# on the server side!
# XXX: send timestamp for server-side index lookup
# to ensure alignment with current shm state
gap_time: float = row['time'][0]
arrow_kwargs: dict[str, Any] = dict(
fqme=fqme,
timeframe=timeframe,
x=iend, # fallback if timestamp lookup fails
y=cls,
time=gap_time, # for server-side index lookup
color=color,
alpha=169,
pointing=direction,
# TODO: expose these as params to markup_gaps()?
headLen=10,
headWidth=2.222,
pxMode=True,
)
aid: int = await actl.add_arrow(
**arrow_kwargs
)
# add duration label to RHS of arrow
if up_gap:
anchor = (0, 0)
# ^XXX? i dun get dese dims.. XD
elif down_gap:
anchor = (0, 1) # XXX y, x?
else: # no-gap?
assert flat
anchor = (0, 0) # up from bottom
# use a slightly smaller font for gap label txt.
font, small_font = get_fonts()
font_size: int = small_font.px_size - 1
assert isinstance(font_size, int)
if show_txt:
text_aid: int = await actl.add_text(
fqme=fqme,
timeframe=timeframe,
text=gap_label,
x=iend + 1, # fallback if timestamp lookup fails
y=cls,
time=gap_time, # server-side index lookup
color=color,
anchor=anchor,
font_size=font_size,
)
aids[text_aid] = {'text': gap_label}
# tell chart to redraw all its
# graphics view layers Bo
await actl.redraw(
fqme=fqme,
timeframe=timeframe,
)
return aids

View File

@ -0,0 +1,206 @@
'''
Smart OHLCV deduplication with data quality validation.
Handles concurrent write conflicts by keeping the most complete bar
(highest volume) while detecting data quality anomalies.
'''
import polars as pl
from ._anal import with_dts
def dedupe_ohlcv_smart(
src_df: pl.DataFrame,
time_col: str = 'time',
volume_col: str = 'volume',
sort: bool = True,
) -> tuple[
pl.DataFrame, # with dts
pl.DataFrame, # deduped (keeping higher volume bars)
int, # count of dupes removed
pl.DataFrame|None, # valid race conditions
pl.DataFrame|None, # data quality violations
]:
'''
Smart OHLCV deduplication keeping most complete bars.
For duplicate timestamps, keeps bar with highest volume under
the assumption that higher volume indicates more complete/final
data from backfill vs partial live updates.
Returns
-------
Tuple of:
- wdts: original dataframe with datetime columns added
- deduped: deduplicated frame keeping highest-volume bars
- diff: number of duplicate rows removed
- valid_races: duplicates meeting expected race condition pattern
(volume monotonic, OHLC ranges valid)
- data_quality_issues: duplicates violating expected relationships
indicating provider data problems
'''
wdts: pl.DataFrame = with_dts(src_df)
# Find duplicate timestamps
dupes: pl.DataFrame = wdts.filter(
pl.col(time_col).is_duplicated()
)
if dupes.is_empty():
# No duplicates, return as-is
return (wdts, wdts, 0, None, None)
# Analyze duplicate groups for validation
dupe_analysis: pl.DataFrame = (
dupes
.sort([time_col, 'index'])
.group_by(time_col, maintain_order=True)
.agg([
pl.col('index').alias('indices'),
pl.col('volume').alias('volumes'),
pl.col('high').alias('highs'),
pl.col('low').alias('lows'),
pl.col('open').alias('opens'),
pl.col('close').alias('closes'),
pl.col('dt').first().alias('dt'),
pl.len().alias('count'),
])
)
# Validate OHLCV monotonicity for each duplicate group
def check_ohlcv_validity(row) -> dict[str, bool]:
'''
Check if duplicate bars follow expected race condition pattern.
For a valid live-update backfill race:
- volume should be monotonically increasing
- high should be monotonically non-decreasing
- low should be monotonically non-increasing
- open should be identical (fixed at bar start)
Returns dict of violation flags.
'''
vols: list = row['volumes']
highs: list = row['highs']
lows: list = row['lows']
opens: list = row['opens']
violations: dict[str, bool] = {
'volume_non_monotonic': False,
'high_decreased': False,
'low_increased': False,
'open_mismatch': False,
'identical_bars': False,
}
# Check if all bars are identical (pure duplicate)
if (
len(set(vols)) == 1
and len(set(highs)) == 1
and len(set(lows)) == 1
and len(set(opens)) == 1
):
violations['identical_bars'] = True
return violations
# Check volume monotonicity
for i in range(1, len(vols)):
if vols[i] < vols[i-1]:
violations['volume_non_monotonic'] = True
break
# Check high monotonicity (can only increase or stay same)
for i in range(1, len(highs)):
if highs[i] < highs[i-1]:
violations['high_decreased'] = True
break
# Check low monotonicity (can only decrease or stay same)
for i in range(1, len(lows)):
if lows[i] > lows[i-1]:
violations['low_increased'] = True
break
# Check open consistency (should be fixed)
if len(set(opens)) > 1:
violations['open_mismatch'] = True
return violations
# Apply validation
dupe_analysis = dupe_analysis.with_columns([
pl.struct(['volumes', 'highs', 'lows', 'opens'])
.map_elements(
check_ohlcv_validity,
return_dtype=pl.Struct([
pl.Field('volume_non_monotonic', pl.Boolean),
pl.Field('high_decreased', pl.Boolean),
pl.Field('low_increased', pl.Boolean),
pl.Field('open_mismatch', pl.Boolean),
pl.Field('identical_bars', pl.Boolean),
])
)
.alias('validity')
])
# Unnest validity struct
dupe_analysis = dupe_analysis.unnest('validity')
# Separate valid races from data quality issues
valid_races: pl.DataFrame|None = (
dupe_analysis
.filter(
# Valid if no violations OR just identical bars
~pl.col('volume_non_monotonic')
& ~pl.col('high_decreased')
& ~pl.col('low_increased')
& ~pl.col('open_mismatch')
)
)
if valid_races.is_empty():
valid_races = None
data_quality_issues: pl.DataFrame|None = (
dupe_analysis
.filter(
# Issues if any non-identical violation exists
(
pl.col('volume_non_monotonic')
| pl.col('high_decreased')
| pl.col('low_increased')
| pl.col('open_mismatch')
)
& ~pl.col('identical_bars')
)
)
if data_quality_issues.is_empty():
data_quality_issues = None
# Deduplicate: keep highest volume bar for each timestamp
deduped: pl.DataFrame = (
wdts
.sort([time_col, volume_col])
.unique(
subset=[time_col],
keep='last',
maintain_order=False,
)
)
# Re-sort by time or index
if sort:
deduped = deduped.sort(by=time_col)
diff: int = wdts.height - deduped.height
return (
wdts,
deduped,
diff,
valid_races,
data_quality_issues,
)

piker/tsp/_history.py — 1548 lines added (new file, mode 100644)

File diff suppressed because it is too large

View File

@ -21,6 +21,7 @@ Higher level annotation editors.
from __future__ import annotations from __future__ import annotations
from collections import defaultdict from collections import defaultdict
from typing import ( from typing import (
Literal,
Sequence, Sequence,
TYPE_CHECKING, TYPE_CHECKING,
) )
@ -66,9 +67,18 @@ log = get_logger(__name__)
class ArrowEditor(Struct): class ArrowEditor(Struct):
'''
Annotate a chart-view with arrows most often used for indicating,
- order txns/clears,
- positions directions,
- general points-of-interest like nooz events.
'''
godw: GodWidget = None # type: ignore # noqa godw: GodWidget = None # type: ignore # noqa
_arrows: dict[str, list[pg.ArrowItem]] = {} _arrows: dict[
str,
list[pg.ArrowItem]
] = {}
def add( def add(
self, self,
@ -76,8 +86,19 @@ class ArrowEditor(Struct):
uid: str, uid: str,
x: float, x: float,
y: float, y: float,
color: str = 'default', color: str|None = None,
pointing: str | None = None, pointing: Literal[
'up',
'down',
None,
] = None,
alpha: int = 255,
zval: float = 1e9,
headLen: float|None = None,
headWidth: float|None = None,
tailLen: float|None = None,
tailWidth: float|None = None,
pxMode: bool = True,
) -> pg.ArrowItem: ) -> pg.ArrowItem:
''' '''
@ -93,29 +114,80 @@ class ArrowEditor(Struct):
# scale arrow sizing to dpi-aware font # scale arrow sizing to dpi-aware font
size = _font.font.pixelSize() * 0.8 size = _font.font.pixelSize() * 0.8
# allow caller override of head dimensions
if headLen is None:
headLen = size
if headWidth is None:
headWidth = size/2
# tail params default to None (no tail)
if tailWidth is None:
tailWidth = 3
color = color or 'default'
color = QColor(hcolor(color))
color.setAlpha(alpha)
pen = fn.mkPen(color, width=1)
brush = fn.mkBrush(color)
arrow = pg.ArrowItem( arrow = pg.ArrowItem(
angle=angle, angle=angle,
baseAngle=0, baseAngle=0,
headLen=size, headLen=headLen,
headWidth=size/2, headWidth=headWidth,
tailLen=None, tailLen=tailLen,
pxMode=True, tailWidth=tailWidth,
pxMode=pxMode,
# coloring # coloring
pen=pg.mkPen(hcolor('papas_special')), pen=pen,
brush=pg.mkBrush(hcolor(color)), brush=brush,
) )
arrow.setZValue(zval)
arrow.setPos(x, y) arrow.setPos(x, y)
self._arrows.setdefault(uid, []).append(arrow) plot.addItem(arrow) # render to view
# render to view # register for removal
plot.addItem(arrow) arrow._uid = uid
self._arrows.setdefault(
uid, []
).append(arrow)
return arrow return arrow
def remove(self, arrow) -> bool: def remove(
self,
arrow: pg.ArrowItem,
) -> None:
'''
Remove a *single arrow* from all chart views to which it was
added.
'''
uid: str = arrow._uid
arrows: list[pg.ArrowItem] = self._arrows[uid]
log.info(
f'Removing arrow from views\n'
f'uid: {uid!r}\n'
f'{arrow!r}\n'
)
for linked in self.godw.iter_linked(): for linked in self.godw.iter_linked():
linked.chart.plotItem.removeItem(arrow) linked.chart.plotItem.removeItem(arrow)
try:
arrows.remove(arrow)
except ValueError:
log.warning(
f'Arrow was already removed?\n'
f'uid: {uid!r}\n'
f'{arrow!r}\n'
)
def remove_all(self) -> set[pg.ArrowItem]:
'''
Remove all arrows added by this editor from all
chart-views.
'''
for uid, arrows in self._arrows.items():
for arrow in arrows:
self.remove(arrow)
class LineEditor(Struct): class LineEditor(Struct):
@ -261,6 +333,9 @@ class LineEditor(Struct):
return lines return lines
# compat with ArrowEditor
remove = remove_line
def as_point( def as_point(
pair: Sequence[float, float] | QPointF, pair: Sequence[float, float] | QPointF,
@ -609,3 +684,6 @@ class SelectRect(QtWidgets.QGraphicsRectItem):
): ):
scen.removeItem(self._label_proxy) scen.removeItem(self._label_proxy)
# compat with ArrowEditor
remove = delete

View File

@ -237,8 +237,8 @@ class LevelLabel(YAxisLabel):
class L1Label(LevelLabel): class L1Label(LevelLabel):
text_flags = ( text_flags = (
QtCore.Qt.TextDontClip QtCore.Qt.TextFlag.TextDontClip
| QtCore.Qt.AlignLeft | QtCore.Qt.AlignmentFlag.AlignLeft
) )
def set_label_str( def set_label_str(

View File

@ -27,10 +27,12 @@ from contextlib import (
from functools import partial from functools import partial
from pprint import pformat from pprint import pformat
from typing import ( from typing import (
# Any,
AsyncContextManager, AsyncContextManager,
Literal,
) )
from uuid import uuid4
import pyqtgraph as pg
import tractor import tractor
import trio import trio
from tractor import trionics from tractor import trionics
@ -47,12 +49,16 @@ from piker.brokers import SymbolNotFound
from piker.ui.qt import ( from piker.ui.qt import (
QGraphicsItem, QGraphicsItem,
) )
from PyQt6.QtGui import QFont
from ._display import DisplayState from ._display import DisplayState
from ._interaction import ChartView from ._interaction import ChartView
from ._editors import SelectRect from ._editors import (
SelectRect,
ArrowEditor,
)
from ._chart import ChartPlotWidget from ._chart import ChartPlotWidget
from ._dataviz import Viz from ._dataviz import Viz
from ._style import hcolor
log = get_logger(__name__) log = get_logger(__name__)
@ -83,8 +89,40 @@ _ctxs: IpcCtxTable = {}
# the "annotations server" which actually renders to a Qt canvas). # the "annotations server" which actually renders to a Qt canvas).
# type AnnotsTable = dict[int, QGraphicsItem] # type AnnotsTable = dict[int, QGraphicsItem]
AnnotsTable = dict[int, QGraphicsItem] AnnotsTable = dict[int, QGraphicsItem]
EditorsTable = dict[int, ArrowEditor]
_annots: AnnotsTable = {} _annots: AnnotsTable = {}
_editors: EditorsTable = {}
def rm_annot(
annot: ArrowEditor|SelectRect|pg.TextItem
) -> bool:
global _editors
match annot:
case pg.ArrowItem():
editor = _editors[annot._uid]
editor.remove(annot)
# ^TODO? only remove each arrow or all?
# if editor._arrows:
# editor.remove_all()
# else:
# log.warning(
# f'Annot already removed!\n'
# f'{annot!r}\n'
# )
return True
case SelectRect():
annot.delete()
return True
case pg.TextItem():
scene = annot.scene()
if scene:
scene.removeItem(annot)
return True
return False
async def serve_rc_annots( async def serve_rc_annots(
@ -95,6 +133,12 @@ async def serve_rc_annots(
annots: AnnotsTable, annots: AnnotsTable,
) -> None: ) -> None:
'''
A small viz(ualization) server for remote ctl of chart
annotations.
'''
global _editors
async for msg in annot_req_stream: async for msg in annot_req_stream:
match msg: match msg:
case { case {
@ -104,14 +148,77 @@ async def serve_rc_annots(
'meth': str(meth), 'meth': str(meth),
'kwargs': dict(kwargs), 'kwargs': dict(kwargs),
}: }:
ds: DisplayState = _dss[fqme] ds: DisplayState = _dss[fqme]
try:
chart: ChartPlotWidget = { chart: ChartPlotWidget = {
60: ds.hist_chart, 60: ds.hist_chart,
1: ds.chart, 1: ds.chart,
}[timeframe] }[timeframe]
except KeyError:
msg: str = (
f'No chart for timeframe={timeframe}s, '
f'skipping rect annotation'
)
log.exeception(msg)
await annot_req_stream.send({'error': msg})
continue
cv: ChartView = chart.cv cv: ChartView = chart.cv
# NEW: if timestamps provided, lookup current indices
# from shm to ensure alignment with current buffer
# state
start_time = kwargs.pop('start_time', None)
end_time = kwargs.pop('end_time', None)
if (
start_time is not None
and end_time is not None
):
viz: Viz = chart.get_viz(fqme)
shm = viz.shm
arr = shm.array
# lookup start index
start_matches = arr[arr['time'] == start_time]
if len(start_matches) == 0:
msg: str = (
f'No shm entry for start_time={start_time}, '
f'skipping rect'
)
log.error(msg)
await annot_req_stream.send({'error': msg})
continue
# lookup end index
end_matches = arr[arr['time'] == end_time]
if len(end_matches) == 0:
msg: str = (
f'No shm entry for end_time={end_time}, '
f'skipping rect'
)
log.error(msg)
await annot_req_stream.send({'error': msg})
continue
# get close price from start bar, open from end
# bar
start_idx = float(start_matches[0]['index'])
end_idx = float(end_matches[0]['index'])
start_close = float(start_matches[0]['close'])
end_open = float(end_matches[0]['open'])
# reconstruct start_pos and end_pos with
# looked-up indices
from_idx: float = 0.16 - 0.06 # BGM offset
kwargs['start_pos'] = (
start_idx + 1 - from_idx,
start_close,
)
kwargs['end_pos'] = (
end_idx + from_idx,
end_open,
)
# annot type lookup from cmd # annot type lookup from cmd
rect = SelectRect( rect = SelectRect(
viewbox=cv, viewbox=cv,
@ -130,21 +237,207 @@ async def serve_rc_annots(
# delegate generically to the requested method # delegate generically to the requested method
getattr(rect, meth)(**kwargs) getattr(rect, meth)(**kwargs)
rect.show() rect.show()
# XXX: store absolute coords for repositioning
# during viz redraws (eg backfill updates)
rect._meth = meth
rect._kwargs = kwargs
aid: int = id(rect) aid: int = id(rect)
annots[aid] = rect annots[aid] = rect
aids: set[int] = ctxs[ipc_key][1] aids: set[int] = ctxs[ipc_key][1]
aids.add(aid) aids.add(aid)
await annot_req_stream.send(aid) await annot_req_stream.send(aid)
case {
'cmd': 'ArrowEditor',
'fqme': fqme,
'timeframe': timeframe,
'meth': 'add'|'remove' as meth,
'kwargs': {
'x': float(x),
'y': float(y),
'pointing': pointing,
'color': color,
'aid': str()|None as aid,
'alpha': int(alpha),
'headLen': int()|float()|None as headLen,
'headWidth': int()|float()|None as headWidth,
'tailLen': int()|float()|None as tailLen,
'tailWidth': int()|float()|None as tailWidth,
'pxMode': bool(pxMode),
'time': int()|float()|None as timestamp,
},
# ?TODO? split based on method fn-sigs?
# 'pointing',
}:
ds: DisplayState = _dss[fqme]
try:
chart: ChartPlotWidget = {
60: ds.hist_chart,
1: ds.chart,
}[timeframe]
except KeyError:
log.warning(
f'No chart for timeframe={timeframe}s, '
f'skipping arrow annotation'
)
# return -1 to indicate failure
await annot_req_stream.send(-1)
continue
cv: ChartView = chart.cv
godw = chart.linked.godwidget
# NEW: if timestamp provided, lookup current index
# from shm to ensure alignment with current buffer
# state
if timestamp is not None:
viz: Viz = chart.get_viz(fqme)
shm = viz.shm
arr = shm.array
# find index where time matches timestamp
matches = arr[arr['time'] == timestamp]
if len(matches) == 0:
log.error(
f'No shm entry for timestamp={timestamp}, '
f'skipping arrow annotation'
)
await annot_req_stream.send(-1)
continue
# use the matched row's index as x
x = float(matches[0]['index'])
arrows = ArrowEditor(godw=godw)
# `.add/.remove()` API
if meth != 'add':
# await tractor.pause()
raise ValueError(
f'Invalid arrow-edit request ?\n'
f'{msg!r}\n'
)
aid: str = str(uuid4())
arrow: pg.ArrowItem = arrows.add(
plot=chart.plotItem,
uid=aid,
x=x,
y=y,
pointing=pointing,
color=color,
alpha=alpha,
headLen=headLen,
headWidth=headWidth,
tailLen=tailLen,
tailWidth=tailWidth,
pxMode=pxMode,
)
# XXX: store absolute coords for repositioning
# during viz redraws (eg backfill updates)
arrow._abs_x = x
arrow._abs_y = y
annots[aid] = arrow
_editors[aid] = arrows
aids: set[int] = ctxs[ipc_key][1]
aids.add(aid)
await annot_req_stream.send(aid)
case {
'cmd': 'TextItem',
'fqme': fqme,
'timeframe': timeframe,
'kwargs': {
'text': str(text),
'x': int()|float() as x,
'y': int()|float() as y,
'color': color,
'anchor': list(anchor),
'font_size': int()|None as font_size,
'time': int()|float()|None as timestamp,
},
}:
ds: DisplayState = _dss[fqme]
try:
chart: ChartPlotWidget = {
60: ds.hist_chart,
1: ds.chart,
}[timeframe]
except KeyError:
log.warning(
f'No chart for timeframe={timeframe}s, '
f'skipping text annotation'
)
await annot_req_stream.send(-1)
continue
# NEW: if timestamp provided, lookup current index
# from shm to ensure alignment with current buffer
# state
if timestamp is not None:
viz: Viz = chart.get_viz(fqme)
shm = viz.shm
arr = shm.array
# find index where time matches timestamp
matches = arr[arr['time'] == timestamp]
if len(matches) == 0:
log.error(
f'No shm entry for timestamp={timestamp}, '
f'skipping text annotation'
)
await annot_req_stream.send(-1)
continue
# use the matched row's index as x, +1 for text
# offset
x = float(matches[0]['index']) + 1
# convert named color to hex
color_hex: str = hcolor(color)
# create text item
text_item: pg.TextItem = pg.TextItem(
text=text,
color=color_hex,
anchor=anchor,
# ?TODO, pin to github:main for this?
# legacy, can have scaling ish?
# ensureInBounds=True,
)
# apply font size (default to DpiAwareFont if not
# provided)
if font_size is None:
from ._style import get_fonts
font, font_small = get_fonts()
font_size = font_small.px_size - 1
qfont: QFont = text_item.textItem.font()
qfont.setPixelSize(font_size)
text_item.setFont(qfont)
text_item.setPos(x, y)
chart.plotItem.addItem(text_item)
# XXX: store absolute coords for repositioning
# during viz redraws (eg backfill updates)
text_item._abs_x = x
text_item._abs_y = y
aid: str = str(uuid4())
annots[aid] = text_item
aids: set[int] = ctxs[ipc_key][1]
aids.add(aid)
await annot_req_stream.send(aid)
case {
'cmd': 'remove',
'aid': int(aid)|str(aid),
}:
# NOTE: this is normally entered on
# a client's annotation de-alloc normally
# prior to detach or modify.
annot: QGraphicsItem = annots[aid]
assert rm_annot(annot)
# respond to client indicating annot
# was indeed deleted.
@ -175,6 +468,38 @@ async def serve_rc_annots(
)
viz.reset_graphics()
# XXX: reposition all annotations to ensure they
# stay aligned with viz data after reset (eg during
# backfill when abs-index range changes)
n_repositioned: int = 0
for aid, annot in annots.items():
# arrows and text items use abs x,y coords
if (
hasattr(annot, '_abs_x')
and
hasattr(annot, '_abs_y')
):
annot.setPos(
annot._abs_x,
annot._abs_y,
)
n_repositioned += 1
# rects use method + kwargs
elif (
hasattr(annot, '_meth')
and
hasattr(annot, '_kwargs')
):
getattr(annot, annot._meth)(**annot._kwargs)
n_repositioned += 1
if n_repositioned:
log.info(
f'Repositioned {n_repositioned} annotation(s) '
f'after viz redraw'
)
case _:
log.error(
'Unknown remote annotation cmd:\n'
@ -188,6 +513,12 @@ async def remote_annotate(
) -> None:
global _dss, _ctxs
if not _dss:
raise RuntimeError(
'Race condition on chart-init state ??\n'
'Another actor is trying to annotate this chart '
'before it has fully spawned.\n'
)
assert _dss
_ctxs[ctx.cid] = (ctx, set())
@ -212,7 +543,7 @@ async def remote_annotate(
assert _ctx is ctx
for aid in aids:
annot: QGraphicsItem = _annots[aid]
assert rm_annot(annot)
class AnnotCtl(Struct):
@ -257,13 +588,18 @@ class AnnotCtl(Struct):
from_acm: bool = False,
# NEW: optional timestamps for server-side index lookup
start_time: float|None = None,
end_time: float|None = None,
) -> int|None:
'''
Add a `SelectRect` annotation to the target view, return
the instances `id(obj)` from the remote UI actor.
'''
ipc: MsgStream = self._get_ipc(fqme)
with trio.fail_after(3):
await ipc.send({
'fqme': fqme,
'cmd': 'SelectRect',
@ -275,9 +611,15 @@ class AnnotCtl(Struct):
'end_pos': tuple(end_pos),
'color': color,
'update_label': False,
'start_time': start_time,
'end_time': end_time,
},
})
aid: int|dict = await ipc.receive()
match aid:
case {'error': str(msg)}:
log.error(msg)
return None
self._ipcs[aid] = ipc
if not from_acm:
self._annot_stack.push_async_callback(
@ -334,20 +676,130 @@ class AnnotCtl(Struct):
'timeframe': timeframe,
})
async def add_arrow(
self,
fqme: str,
timeframe: float,
x: float,
y: float,
pointing: Literal[
'up',
'down',
],
# TODO: a `Literal['view', 'scene']` for this?
# domain: str = 'view', # or 'scene'
color: str = 'dad_blue',
alpha: int = 116,
headLen: float|None = None,
headWidth: float|None = None,
tailLen: float|None = None,
tailWidth: float|None = None,
pxMode: bool = True,
from_acm: bool = False,
# NEW: optional timestamp for server-side index lookup
time: float|None = None,
) -> int|None:
'''
Add a `pg.ArrowItem` annotation to the target view, return
the instance's `aid` from the remote UI actor.
'''
ipc: MsgStream = self._get_ipc(fqme)
with trio.fail_after(3):
await ipc.send({
'fqme': fqme,
'cmd': 'ArrowEditor',
'timeframe': timeframe,
# 'meth': str(meth),
'meth': 'add',
'kwargs': {
'x': float(x),
'y': float(y),
'color': color,
'pointing': pointing, # up|down
'alpha': alpha,
'aid': None,
'headLen': headLen,
'headWidth': headWidth,
'tailLen': tailLen,
'tailWidth': tailWidth,
'pxMode': pxMode,
'time': time, # for server-side index lookup
},
})
aid: int|dict = await ipc.receive()
match aid:
case {'error': str(msg)}:
log.error(msg)
return None
self._ipcs[aid] = ipc
if not from_acm:
self._annot_stack.push_async_callback(
partial(
self.remove,
aid,
)
)
return aid
async def add_text(
self,
fqme: str,
timeframe: float,
text: str,
x: float,
y: float,
color: str|tuple = 'dad_blue',
anchor: tuple[float, float] = (0, 1),
font_size: int|None = None,
from_acm: bool = False,
# NEW: optional timestamp for server-side index lookup
time: float|None = None,
) -> int|None:
'''
Add a `pg.TextItem` annotation to the target view.
anchor: (x, y) where (0,0) is upper-left, (1,1) is lower-right
font_size: pixel size for font, defaults to the small `DpiAwareFont` size
'''
ipc: MsgStream = self._get_ipc(fqme)
with trio.fail_after(3):
await ipc.send({
'fqme': fqme,
'cmd': 'TextItem',
'timeframe': timeframe,
'kwargs': {
'text': text,
'x': float(x),
'y': float(y),
'color': color,
'anchor': tuple(anchor),
'font_size': font_size,
'time': time, # for server-side index lookup
},
})
aid: int|dict = await ipc.receive()
match aid:
case {'error': str(msg)}:
log.error(msg)
return None
self._ipcs[aid] = ipc
if not from_acm:
self._annot_stack.push_async_callback(
partial(
self.remove,
aid,
)
)
return aid
@acm @acm
@ -374,7 +826,9 @@ async def open_annot_ctl(
# TODO: print the current discoverable actor UID set
# here as well?
if not maybe_portals:
raise RuntimeError(
'No chart actors found in service domain?'
)
for portal in maybe_portals:
ctx_mngrs.append(

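A minimal client-side usage sketch (not part of the diff) of the timestamp-based placement shown above: `open_annot_ctl()` is the `@acm` from this module, but the fqme, prices, epoch times and the exact positional/keyword argument set are assumptions inferred from the hunks, not verified call signatures.

async def _mark_gap_example() -> None:
    async with open_annot_ctl() as actl:
        fqme: str = 'btcusdt.spot.binance'  # placeholder symbol
        # rect located by epoch timestamps; the server resolves the
        # shm indices so the rect stays aligned after backfills.
        aid = await actl.add_rect(
            fqme=fqme,
            timeframe=60,
            start_pos=(0, 0),  # overridden when *_time kwargs are given
            end_pos=(0, 0),
            start_time=1706620800.0,
            end_time=1706620860.0,
        )
        if aid is None:
            # server replied with an {'error': ...} dict
            return
        # arrow + text label anchored by timestamp as well
        await actl.add_arrow(
            fqme=fqme,
            timeframe=60,
            x=0, y=42_000.0,  # x is replaced by the server-side time lookup
            pointing='up',
            time=1706620860.0,
        )
        await actl.add_text(
            fqme=fqme,
            timeframe=60,
            text='1m gap',
            x=0, y=42_000.0,
            time=1706620860.0,
        )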
View File

@ -107,7 +107,22 @@ class DpiAwareFont:
@property
def px_size(self) -> int:
size: int = self._qfont.pixelSize()
# XXX, when no Qt app has been spawned this will always be
# invalid..
# SO, just return any conf.toml value.
if size == -1:
if (conf_size := self._font_size) is None:
raise ValueError(
f'No valid `{type(_font).__name__}.px_size` set?\n'
f'\n'
f'-> `ui.font_size` is NOT set in `conf.toml`\n'
f'-> no Qt app is active ??\n'
)
return conf_size
return size
def configure_to_dpi(self, screen: QtGui.QScreen | None = None):
'''
@ -221,6 +236,20 @@ def _config_fonts_to_screen() -> None:
_font_small.configure_to_dpi()
def get_fonts() -> tuple[
DpiAwareFont,
DpiAwareFont,
]:
'''
Get the singleton pair of font instances around which all other
UI/UX sizing should be scaled.
See `DpiAwareFont` for (internal) deats.
'''
return _font, _font_small
# TODO: re-compute font size when main widget switches screens?
# https://forum.qt.io/topic/54136/how-do-i-get-the-qscreen-my-widget-is-on-qapplication-desktop-screen-returns-a-qwidget-and-qobject_cast-qscreen-returns-null/3
@ -308,6 +337,7 @@ def hcolor(name: str) -> str:
'cool_green': '#33b864',
'dull_green': '#74a662',
'hedge_green': '#518360',
'lilypad_green': '#839c84',
# orders and alerts
'alert_yellow': '#e2d083',
@ -335,6 +365,7 @@ def hcolor(name: str) -> str:
'sell_red': '#b6003f',
# 'sell_red': '#d00048',
'sell_red_light': '#f85462',
'wine': '#69212d',
# 'sell_red': '#f85462',
# 'sell_red_light': '#ff4d5c',

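A short usage sketch (not part of the diff) combining the new `get_fonts()` helper with the `px_size` conf.toml fallback; the `piker.ui._style` import path is assumed from the relative imports above, and the `-1` label sizing mirrors the gap-label logic in the annotation server.

from piker.ui._style import get_fonts

font, font_small = get_fonts()
# even with no active Qt app, `px_size` falls back to the `ui.font_size`
# value from `conf.toml` (and raises ValueError if that is unset too)
gap_label_px: int = font_small.px_size - 1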
View File

@ -59,8 +59,14 @@ from piker.data import (
from piker.types import Struct
from piker.log import get_logger
from piker.ui.qt import Qt
from ._editors import (
LineEditor,
ArrowEditor,
)
from ._lines import (
order_line,
LevelLine,
)
from ._position import (
PositionTracker,
SettingsPane,

View File

@ -75,6 +75,7 @@ dependencies = [
"trio-typing>=0.10.0", "trio-typing>=0.10.0",
"numba>=0.61.0", "numba>=0.61.0",
"pyvnc", "pyvnc",
"multiaddr>=0.1.1",
]
# ------ dependencies ------
# NOTE, by default we ship only a "headless" deps set bc
@ -116,7 +117,6 @@ uis = [
dev = [
# https://docs.astral.sh/uv/concepts/projects/dependencies/#development-dependencies
"cython >=3.0.0, <4.0.0",
# nested deps-groups
# https://docs.astral.sh/uv/concepts/projects/dependencies/#nesting-groups
{include-group = 'uis'},
@ -134,6 +134,10 @@ repl = [
"prompt-toolkit ==3.0.40", "prompt-toolkit ==3.0.40",
"pyperclip>=1.9.0", "pyperclip>=1.9.0",
# for @claude's `snippets/claude_debug_helper.py` which it uses to do
# "offline" debug/crash REPL-ing alongside a dev.
"pexpect>=4.9.0",
# ?TODO, new stuff to consider..
# "visidata" # console numerics
# "xxh" # for remote `xonsh`-ing
@ -191,10 +195,15 @@ pyqtgraph = { git = "https://github.com/pikers/pyqtgraph.git" }
tomlkit = { git = "https://github.com/pikers/tomlkit.git", branch ="piker_pin" }
pyvnc = { git = "https://github.com/regulad/pyvnc.git" }
# to get fancy next-cmd/suggestion feats prior to 0.22.2 B)
# https://github.com/xonsh/xonsh/pull/6037
# https://github.com/xonsh/xonsh/pull/6048
xonsh = { git = 'https://github.com/xonsh/xonsh.git', branch = 'main' }
# XXX since, we're like, always hacking new shite all-the-time. Bp
# tractor = { git = "https://github.com/goodboy/tractor.git", branch ="piker_pin" }
# tractor = { git = "https://pikers.dev/goodboy/tractor", branch = "piker_pin" }
# tractor = { git = "https://pikers.dev/goodboy/tractor", branch = "main" }
# ------ goodboy ------
# hackin dev-envs, usually there's something new he's hackin in..
tractor = { path = "../tractor", editable = true }

View File

@ -0,0 +1,256 @@
#!/usr/bin/env python
'''
Programmatic debugging helper for human-like `pdbp` REPL
interaction, built to allow `claude` to interact with
crashes and `tractor.pause()` breakpoints alongside a human dev.
Originally written by `clauded` during a backfiller inspection
session with @goodboy trying to resolve duplicate/gappy ohlcv ts
issues discovered while testing the new `nativedb` tsdb.
Allows `claude` to run `pdb` commands and capture output in an "offline"
manner while generating similar output as if it were interacting with
the debug REPL.
The use of `pexpect` is heavily based on tractor's REPL UX test
suite(s), namely various `tests/devx/test_debugger.py` patterns.
'''
import sys
import os
import time
import pexpect
from pexpect.exceptions import (
TIMEOUT,
EOF,
)
PROMPT: str = r'\(Pdb\+\)'
def expect(
child: pexpect.spawn,
patt: str,
**kwargs,
) -> None:
'''
Expect wrapper that prints last console data before failing.
'''
try:
child.expect(
patt,
**kwargs,
)
except TIMEOUT:
before: str = (
str(child.before.decode())
if isinstance(child.before, bytes)
else str(child.before)
)
print(
f'TIMEOUT waiting for pattern: {patt}\n'
f'Last seen output:\n{before}'
)
raise
def run_pdb_commands(
commands: list[str],
initial_cmd: str = 'piker store ldshm xmrusdt.usdtm.perp.binance',
timeout: int = 30,
print_output: bool = True,
) -> dict[str, str]:
'''
Spawn piker process, wait for pdb prompt, execute commands.
Returns dict mapping command -> output.
'''
results: dict[str, str] = {}
# Disable colored output for easier parsing
os.environ['PYTHON_COLORS'] = '0'
# Spawn the process
if print_output:
print(f'Spawning: {initial_cmd}')
child: pexpect.spawn = pexpect.spawn(
initial_cmd,
timeout=timeout,
encoding='utf-8',
echo=False,
)
# Wait for pdb prompt
try:
expect(child, PROMPT, timeout=timeout)
if print_output:
print('Reached pdb prompt!')
# Execute each command
for cmd in commands:
if print_output:
print(f'\n>>> {cmd}')
child.sendline(cmd)
time.sleep(0.1)
# Wait for next prompt
expect(child, PROMPT, timeout=timeout)
# Capture output (everything before the prompt)
output: str = (
str(child.before.decode())
if isinstance(child.before, bytes)
else str(child.before)
)
results[cmd] = output
if print_output:
print(output)
# Quit debugger gracefully
child.sendline('quit')
try:
child.expect(EOF, timeout=5)
except (TIMEOUT, EOF):
pass
except TIMEOUT as e:
print(f'Timeout: {e}')
if child.before:
before: str = (
str(child.before.decode())
if isinstance(child.before, bytes)
else str(child.before)
)
print(f'Buffer:\n{before}')
results['_error'] = str(e)
finally:
if child.isalive():
child.close(force=True)
return results
class InteractivePdbSession:
'''
Interactive pdb session manager for incremental debugging.
'''
def __init__(
self,
cmd: str = 'piker store ldshm xmrusdt.usdtm.perp.binance',
timeout: int = 30,
):
self.cmd: str = cmd
self.timeout: int = timeout
self.child: pexpect.spawn|None = None
self.history: list[tuple[str, str]] = []
def start(self) -> None:
'''
Start the piker process and wait for first prompt.
'''
os.environ['PYTHON_COLORS'] = '0'
print(f'Starting: {self.cmd}')
self.child = pexpect.spawn(
self.cmd,
timeout=self.timeout,
encoding='utf-8',
echo=False,
)
# Wait for initial prompt
expect(self.child, PROMPT, timeout=self.timeout)
print('Ready at pdb prompt!')
def run(
self,
cmd: str,
print_output: bool = True,
) -> str:
'''
Execute a single pdb command and return output.
'''
if not self.child or not self.child.isalive():
raise RuntimeError('Session not started or dead')
if print_output:
print(f'\n>>> {cmd}')
self.child.sendline(cmd)
time.sleep(0.1)
# Wait for next prompt
expect(self.child, PROMPT, timeout=self.timeout)
output: str = (
str(self.child.before.decode())
if isinstance(self.child.before, bytes)
else str(self.child.before)
)
self.history.append((cmd, output))
if print_output:
print(output)
return output
def quit(self) -> None:
'''
Exit the debugger and cleanup.
'''
if self.child and self.child.isalive():
self.child.sendline('quit')
try:
self.child.expect(EOF, timeout=5)
except (TIMEOUT, EOF):
pass
self.child.close(force=True)
def __enter__(self):
self.start()
return self
def __exit__(self, *args):
self.quit()
if __name__ == '__main__':
# Example inspection commands
inspect_cmds: list[str] = [
'locals().keys()',
'type(deduped)',
'deduped.shape',
(
'step_gaps.shape '
'if "step_gaps" in locals() '
'else "N/A"'
),
(
'venue_gaps.shape '
'if "venue_gaps" in locals() '
'else "N/A"'
),
]
# Allow commands from CLI args
if len(sys.argv) > 1:
inspect_cmds = sys.argv[1:]
# Interactive session example
with InteractivePdbSession() as session:
for cmd in inspect_cmds:
session.run(cmd)
print('\n=== Session Complete ===')

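A hedged usage example (not part of the new file): the batch entrypoint `run_pdb_commands()` is not exercised by the `__main__` block above, so a minimal call might look like the following; the command list and timeout are arbitrary placeholder values.

results: dict[str, str] = run_pdb_commands(
    commands=[
        'type(deduped)',
        'deduped.shape',
    ],
    timeout=60,
)
for cmd, output in results.items():
    # echo each pdb command alongside whatever the REPL printed
    print(f'>>> {cmd}\n{output}')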
391
uv.lock
View File

@ -104,6 +104,15 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/e0/44/827b2a91a5816512fcaf3cc4ebc465ccd5d598c45cefa6703fcf4a79018f/attrs-23.2.0-py3-none-any.whl", hash = "sha256:99b87a485a5820b23b879f04c2305b44b951b502fd64be915879d77a7e8fc6f1", size = 60752, upload-time = "2023-12-31T06:30:30.772Z" }, { url = "https://files.pythonhosted.org/packages/e0/44/827b2a91a5816512fcaf3cc4ebc465ccd5d598c45cefa6703fcf4a79018f/attrs-23.2.0-py3-none-any.whl", hash = "sha256:99b87a485a5820b23b879f04c2305b44b951b502fd64be915879d77a7e8fc6f1", size = 60752, upload-time = "2023-12-31T06:30:30.772Z" },
] ]
[[package]]
name = "base58"
version = "2.1.1"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/7f/45/8ae61209bb9015f516102fa559a2914178da1d5868428bd86a1b4421141d/base58-2.1.1.tar.gz", hash = "sha256:c5d0cb3f5b6e81e8e35da5754388ddcc6d0d14b6c6a132cb93d69ed580a7278c", size = 6528, upload-time = "2021-10-30T22:12:17.858Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/4a/45/ec96b29162a402fc4c1c5512d114d7b3787b9d1c2ec241d9568b4816ee23/base58-2.1.1-py3-none-any.whl", hash = "sha256:11a36f4d3ce51dfc1043f3218591ac4eb1ceb172919cebe05b52a5bcc8d245c2", size = 5621, upload-time = "2021-10-30T22:12:16.658Z" },
]
[[package]]
name = "bidict"
version = "0.23.1"
@ -113,6 +122,74 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/99/37/e8730c3587a65eb5645d4aba2d27aae48e8003614d6aaf15dda67f702f1f/bidict-0.23.1-py3-none-any.whl", hash = "sha256:5dae8d4d79b552a71cbabc7deb25dfe8ce710b17ff41711e13010ead2abfc3e5", size = 32764, upload-time = "2024-02-18T19:09:04.156Z" }, { url = "https://files.pythonhosted.org/packages/99/37/e8730c3587a65eb5645d4aba2d27aae48e8003614d6aaf15dda67f702f1f/bidict-0.23.1-py3-none-any.whl", hash = "sha256:5dae8d4d79b552a71cbabc7deb25dfe8ce710b17ff41711e13010ead2abfc3e5", size = 32764, upload-time = "2024-02-18T19:09:04.156Z" },
] ]
[[package]]
name = "blake3"
version = "1.0.8"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/75/aa/abcd75e9600987a0bc6cfe9b6b2ff3f0e2cb08c170addc6e76035b5c4cb3/blake3-1.0.8.tar.gz", hash = "sha256:513cc7f0f5a7c035812604c2c852a0c1468311345573de647e310aca4ab165ba", size = 117308, upload-time = "2025-10-14T06:47:48.83Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/ed/a0/b7b6dff04012cfd6e665c09ee446f749bd8ea161b00f730fe1bdecd0f033/blake3-1.0.8-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:d8da4233984d51471bd4e4366feda1d90d781e712e0a504ea54b1f2b3577557b", size = 347983, upload-time = "2025-10-14T06:45:47.214Z" },
{ url = "https://files.pythonhosted.org/packages/5b/a2/264091cac31d7ae913f1f296abc20b8da578b958ffb86100a7ce80e8bf5c/blake3-1.0.8-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:1257be19f2d381c868a34cc822fc7f12f817ddc49681b6d1a2790bfbda1a9865", size = 325415, upload-time = "2025-10-14T06:45:48.482Z" },
{ url = "https://files.pythonhosted.org/packages/ee/7d/85a4c0782f613de23d114a7a78fcce270f75b193b3ff3493a0de24ba104a/blake3-1.0.8-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:269f255b110840e52b6ce9db02217e39660ebad3e34ddd5bca8b8d378a77e4e1", size = 371296, upload-time = "2025-10-14T06:45:49.674Z" },
{ url = "https://files.pythonhosted.org/packages/e3/20/488475254976ed93fab57c67aa80d3b40df77f7d9db6528c9274bff53e08/blake3-1.0.8-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:66ca28a673025c40db3eba21a9cac52f559f83637efa675b3f6bd8683f0415f3", size = 374516, upload-time = "2025-10-14T06:45:51.23Z" },
{ url = "https://files.pythonhosted.org/packages/7b/21/2a1c47fedb77fb396512677ec6d46caf42ac6e9a897db77edd0a2a46f7bb/blake3-1.0.8-cp312-cp312-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:bcb04966537777af56c1f399b35525aa70a1225816e121ff95071c33c0f7abca", size = 447911, upload-time = "2025-10-14T06:45:52.637Z" },
{ url = "https://files.pythonhosted.org/packages/cb/7d/db0626df16029713e7e61b67314c4835e85c296d82bd907c21c6ea271da2/blake3-1.0.8-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e5b5da177d62cc4b7edf0cea08fe4dec960c9ac27f916131efa890a01f747b93", size = 505420, upload-time = "2025-10-14T06:45:54.445Z" },
{ url = "https://files.pythonhosted.org/packages/5b/55/6e737850c2d58a6d9de8a76dad2ae0f75b852a23eb4ecb07a0b165e6e436/blake3-1.0.8-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:38209b10482c97e151681ea3e91cc7141f56adbbf4820a7d701a923124b41e6a", size = 394189, upload-time = "2025-10-14T06:45:55.719Z" },
{ url = "https://files.pythonhosted.org/packages/5b/94/eafaa5cdddadc0c9c603a6a6d8339433475e1a9f60c8bb9c2eed2d8736b6/blake3-1.0.8-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:504d1399b7fb91dfe5c25722d2807990493185faa1917456455480c36867adb5", size = 388001, upload-time = "2025-10-14T06:45:57.067Z" },
{ url = "https://files.pythonhosted.org/packages/17/81/735fa00d13de7f68b25e1b9cb36ff08c6f165e688d85d8ec2cbfcdedccc5/blake3-1.0.8-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:c84af132aa09abeadf9a0118c8fb26f4528f3f42c10ef8be0fcf31c478774ec4", size = 550302, upload-time = "2025-10-14T06:45:58.657Z" },
{ url = "https://files.pythonhosted.org/packages/0e/c6/d1fe8bdea4a6088bd54b5a58bc40aed89a4e784cd796af7722a06f74bae7/blake3-1.0.8-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:a25db3d36b55f5ed6a86470155cc749fc9c5b91c949b8d14f48658f9d960d9ec", size = 554211, upload-time = "2025-10-14T06:46:00.269Z" },
{ url = "https://files.pythonhosted.org/packages/55/d1/ca74aa450cbe10e396e061f26f7a043891ffa1485537d6b30d3757e20995/blake3-1.0.8-cp312-cp312-win32.whl", hash = "sha256:e0fee93d5adcd44378b008c147e84f181f23715307a64f7b3db432394bbfce8b", size = 228343, upload-time = "2025-10-14T06:46:01.533Z" },
{ url = "https://files.pythonhosted.org/packages/4d/42/bbd02647169e3fbed27558555653ac2578c6f17ccacf7d1956c58ef1d214/blake3-1.0.8-cp312-cp312-win_amd64.whl", hash = "sha256:6a6eafc29e4f478d365a87d2f25782a521870c8514bb43734ac85ae9be71caf7", size = 215704, upload-time = "2025-10-14T06:46:02.79Z" },
{ url = "https://files.pythonhosted.org/packages/55/b8/11de9528c257f7f1633f957ccaff253b706838d22c5d2908e4735798ec01/blake3-1.0.8-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:46dc20976bd6c235959ef0246ec73420d1063c3da2839a9c87ca395cf1fd7943", size = 347771, upload-time = "2025-10-14T06:46:04.248Z" },
{ url = "https://files.pythonhosted.org/packages/50/26/f7668be55c909678b001ecacff11ad7016cd9b4e9c7cc87b5971d638c5a9/blake3-1.0.8-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:d17eb6382634b3a5bc0c0e0454d5265b0becaeeadb6801ed25150b39a999d0cc", size = 325431, upload-time = "2025-10-14T06:46:06.136Z" },
{ url = "https://files.pythonhosted.org/packages/77/57/e8a85fa261894bf7ce7af928ff3408aab60287ab8d58b55d13a3f700b619/blake3-1.0.8-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:19fc6f2b7edab8acff6895fc6e38c19bd79f4c089e21153020c75dfc7397d52d", size = 370994, upload-time = "2025-10-14T06:46:07.398Z" },
{ url = "https://files.pythonhosted.org/packages/62/cd/765b76bb48b8b294fea94c9008b0d82b4cfa0fa2f3c6008d840d01a597e4/blake3-1.0.8-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:4f54cff7f15d91dc78a63a2dd02a3dccdc932946f271e2adb4130e0b4cf608ba", size = 374372, upload-time = "2025-10-14T06:46:08.698Z" },
{ url = "https://files.pythonhosted.org/packages/36/7a/32084eadbb28592bb07298f0de316d2da586c62f31500a6b1339a7e7b29b/blake3-1.0.8-cp313-cp313-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e7e12a777f6b798eb8d06f875d6e108e3008bd658d274d8c676dcf98e0f10537", size = 447627, upload-time = "2025-10-14T06:46:10.002Z" },
{ url = "https://files.pythonhosted.org/packages/a7/f4/3788a1d86e17425eea147e28d7195d7053565fc279236a9fd278c2ec495e/blake3-1.0.8-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ddfc59b0176fb31168f08d5dd536e69b1f4f13b5a0f4b0c3be1003efd47f9308", size = 507536, upload-time = "2025-10-14T06:46:11.614Z" },
{ url = "https://files.pythonhosted.org/packages/fe/01/4639cba48513b94192681b4da472cdec843d3001c5344d7051ee5eaef606/blake3-1.0.8-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:a2336d5b2a801a7256da21150348f41610a6c21dae885a3acb1ebbd7333d88d8", size = 394105, upload-time = "2025-10-14T06:46:12.808Z" },
{ url = "https://files.pythonhosted.org/packages/21/ae/6e55c19c8460fada86cd1306a390a09b0c5a2e2e424f9317d2edacea439f/blake3-1.0.8-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e4072196547484c95a5a09adbb952e9bb501949f03f9e2a85e7249ef85faaba8", size = 386928, upload-time = "2025-10-14T06:46:16.284Z" },
{ url = "https://files.pythonhosted.org/packages/ee/6c/05b7a5a907df1be53a8f19e7828986fc6b608a44119641ef9c0804fbef15/blake3-1.0.8-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:0eab3318ec02f8e16fe549244791ace2ada2c259332f0c77ab22cf94dfff7130", size = 550003, upload-time = "2025-10-14T06:46:17.791Z" },
{ url = "https://files.pythonhosted.org/packages/b4/03/f0ea4adfedc1717623be6460b3710fcb725ca38082c14274369803f727e1/blake3-1.0.8-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:a33b9a1fb6d1d559a8e0d04b041e99419a6bb771311c774f6ff57ed7119c70ed", size = 553857, upload-time = "2025-10-14T06:46:19.088Z" },
{ url = "https://files.pythonhosted.org/packages/cc/6f/e5410d2e2a30c8aba8389ffc1c0061356916bf5ecd0a210344e7b69b62ab/blake3-1.0.8-cp313-cp313-win32.whl", hash = "sha256:e171b169cb7ea618e362a4dddb7a4d4c173bbc08b9ba41ea3086dd1265530d4f", size = 228315, upload-time = "2025-10-14T06:46:20.391Z" },
{ url = "https://files.pythonhosted.org/packages/79/ef/d9c297956dfecd893f29f59e7b22445aba5b47b7f6815d9ba5dcd73fcae6/blake3-1.0.8-cp313-cp313-win_amd64.whl", hash = "sha256:3168c457255b5d2a2fc356ba696996fcaff5d38284f968210d54376312107662", size = 215477, upload-time = "2025-10-14T06:46:21.542Z" },
{ url = "https://files.pythonhosted.org/packages/20/ba/eaa7723d66dd8ab762a3e85e139bb9c46167b751df6e950ad287adb8fb61/blake3-1.0.8-cp313-cp313t-macosx_10_12_x86_64.whl", hash = "sha256:b4d672c24dc15ec617d212a338a4ca14b449829b6072d09c96c63b6e6b621aed", size = 347289, upload-time = "2025-10-14T06:46:22.772Z" },
{ url = "https://files.pythonhosted.org/packages/47/b3/6957f6ee27f0d5b8c4efdfda68a1298926a88c099f4dd89c711049d16526/blake3-1.0.8-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:1af0e5a29aa56d4fba904452ae784740997440afd477a15e583c38338e641f41", size = 324444, upload-time = "2025-10-14T06:46:24.729Z" },
{ url = "https://files.pythonhosted.org/packages/13/da/722cebca11238f3b24d3cefd2361c9c9ea47cfa0ad9288eeb4d1e0b7cf93/blake3-1.0.8-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ef153c5860d5bf1cc71aece69b28097d2a392913eb323d6b52555c875d0439fc", size = 370441, upload-time = "2025-10-14T06:46:26.29Z" },
{ url = "https://files.pythonhosted.org/packages/2e/d5/2f7440c8e41c0af995bad3a159e042af0f4ed1994710af5b4766ca918f65/blake3-1.0.8-cp313-cp313t-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:e8ae3689f0c7bfa6ce6ae45cab110e4c3442125c4c23b28f1f097856de26e4d1", size = 374312, upload-time = "2025-10-14T06:46:27.451Z" },
{ url = "https://files.pythonhosted.org/packages/a6/6c/fb6a7812e60ce3e110bcbbb11f167caf3e975c589572c41e1271f35f2c41/blake3-1.0.8-cp313-cp313t-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:3fb83532f7456ddeb68dae1b36e1f7c52f9cb72852ac01159bbcb1a12b0f8be0", size = 447007, upload-time = "2025-10-14T06:46:29.056Z" },
{ url = "https://files.pythonhosted.org/packages/13/3b/c99b43fae5047276ea9d944077c190fc1e5f22f57528b9794e21f7adedc6/blake3-1.0.8-cp313-cp313t-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:6ae7754c7d96e92a70a52e07c732d594cf9924d780f49fffd3a1e9235e0f5ba7", size = 507323, upload-time = "2025-10-14T06:46:30.661Z" },
{ url = "https://files.pythonhosted.org/packages/fc/bb/ba90eddd592f8c074a0694cb0a744b6bd76bfe67a14c2b490c8bdfca3119/blake3-1.0.8-cp313-cp313t-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:4bacaae75e98dee3b7da6c5ee3b81ee21a3352dd2477d6f1d1dbfd38cdbf158a", size = 393449, upload-time = "2025-10-14T06:46:31.805Z" },
{ url = "https://files.pythonhosted.org/packages/25/ed/58a2acd0b9e14459cdaef4344db414d4a36e329b9720921b442a454dd443/blake3-1.0.8-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9456c829601d72852d8ba0af8dae0610f7def1d59f5942efde1e2ef93e8a8b57", size = 386844, upload-time = "2025-10-14T06:46:33.195Z" },
{ url = "https://files.pythonhosted.org/packages/4a/04/fed09845b18d90862100c8e48308261e2f663aab25d3c71a6a0bdda6618b/blake3-1.0.8-cp313-cp313t-musllinux_1_1_aarch64.whl", hash = "sha256:497ef8096ec4ac1ffba9a66152cee3992337cebf8ea434331d8fd9ce5423d227", size = 549550, upload-time = "2025-10-14T06:46:35.23Z" },
{ url = "https://files.pythonhosted.org/packages/d6/65/1859fddfabc1cc72548c2269d988819aad96d854e25eae00531517925901/blake3-1.0.8-cp313-cp313t-musllinux_1_1_x86_64.whl", hash = "sha256:511133bab85ff60ed143424ce484d08c60894ff7323f685d7a6095f43f0c85c3", size = 553805, upload-time = "2025-10-14T06:46:36.532Z" },
{ url = "https://files.pythonhosted.org/packages/c1/c7/2969352017f62378e388bb07bb2191bc9a953f818dc1cd6b9dd5c24916e1/blake3-1.0.8-cp313-cp313t-win32.whl", hash = "sha256:9c9fbdacfdeb68f7ca53bb5a7a5a593ec996eaf21155ad5b08d35e6f97e60877", size = 228068, upload-time = "2025-10-14T06:46:37.826Z" },
{ url = "https://files.pythonhosted.org/packages/d8/fc/923e25ac9cadfff1cd20038bcc0854d0f98061eb6bc78e42c43615f5982d/blake3-1.0.8-cp313-cp313t-win_amd64.whl", hash = "sha256:3cec94ed5676821cf371e9c9d25a41b4f3ebdb5724719b31b2749653b7cc1dfa", size = 215369, upload-time = "2025-10-14T06:46:39.054Z" },
{ url = "https://files.pythonhosted.org/packages/2e/2a/9f13ea01b03b1b4751a1cc2b6c1ef4b782e19433a59cf35b59cafb2a2696/blake3-1.0.8-cp314-cp314-macosx_10_12_x86_64.whl", hash = "sha256:2c33dac2c6112bc23f961a7ca305c7e34702c8177040eb98d0389d13a347b9e1", size = 347016, upload-time = "2025-10-14T06:46:40.318Z" },
{ url = "https://files.pythonhosted.org/packages/06/8e/8458c4285fbc5de76414f243e4e0fcab795d71a8b75324e14959aee699da/blake3-1.0.8-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:c445eff665d21c3b3b44f864f849a2225b1164c08654beb23224a02f087b7ff1", size = 324496, upload-time = "2025-10-14T06:46:42.355Z" },
{ url = "https://files.pythonhosted.org/packages/49/fa/b913eb9cc4af708c03e01e6b88a8bb3a74833ba4ae4b16b87e2829198e06/blake3-1.0.8-cp314-cp314-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a47939f04b89c5c6ff1e51e883e5efab1ea1bf01a02f4d208d216dddd63d0dd8", size = 370654, upload-time = "2025-10-14T06:46:43.907Z" },
{ url = "https://files.pythonhosted.org/packages/7f/4f/245e0800c33b99c8f2b570d9a7199b51803694913ee4897f339648502933/blake3-1.0.8-cp314-cp314-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:73e0b4fa25f6e3078526a592fb38fca85ef204fd02eced6731e1cdd9396552d4", size = 374693, upload-time = "2025-10-14T06:46:45.186Z" },
{ url = "https://files.pythonhosted.org/packages/a2/a6/8cb182c8e482071dbdfcc6ec0048271fd48bcb78782d346119ff54993700/blake3-1.0.8-cp314-cp314-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4b0543c57eb9d6dac9d4bced63e9f7f7b546886ac04cec8da3c3d9c8f30cbbb7", size = 447673, upload-time = "2025-10-14T06:46:46.358Z" },
{ url = "https://files.pythonhosted.org/packages/06/b7/1cbbb5574d2a9436d1b15e7eb5b9d82e178adcaca71a97b0fddaca4bfe3a/blake3-1.0.8-cp314-cp314-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ed972ebd553c0c25363459e9fc71a38c045d8419e365b59acd8cd791eff13981", size = 507233, upload-time = "2025-10-14T06:46:48.109Z" },
{ url = "https://files.pythonhosted.org/packages/9c/45/b55825d90af353b3e26c653bab278da9d6563afcf66736677f9397e465be/blake3-1.0.8-cp314-cp314-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3bafdec95dfffa3f6571e529644744e280337df15ddd9728f224ba70c5779b23", size = 393852, upload-time = "2025-10-14T06:46:49.511Z" },
{ url = "https://files.pythonhosted.org/packages/34/73/9058a1a457dd20491d1b37de53d6876eff125e1520d9b2dd7d0acbc88de2/blake3-1.0.8-cp314-cp314-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2d78f06f3fb838b34c330e2987090376145cbe5944d8608a0c4779c779618f7b", size = 386442, upload-time = "2025-10-14T06:46:51.205Z" },
{ url = "https://files.pythonhosted.org/packages/30/6d/561d537ffc17985e276e08bf4513f1c106f1fdbef571e782604dc4e44070/blake3-1.0.8-cp314-cp314-musllinux_1_1_aarch64.whl", hash = "sha256:dd03ff08d1b6e4fdda1cd03826f971ae8966ef6f683a8c68aa27fb21904b5aa9", size = 549929, upload-time = "2025-10-14T06:46:52.494Z" },
{ url = "https://files.pythonhosted.org/packages/03/2f/dbe20d2c57f1a67c63be4ba310bcebc707b945c902a0bde075d2a8f5cd5c/blake3-1.0.8-cp314-cp314-musllinux_1_1_x86_64.whl", hash = "sha256:4e02a3c499e35bf51fc15b2738aca1a76410804c877bcd914752cac4f71f052a", size = 553750, upload-time = "2025-10-14T06:46:54.194Z" },
{ url = "https://files.pythonhosted.org/packages/6b/da/c6cb712663c869b2814870c2798e57289c4268c5ac5fb12d467fce244860/blake3-1.0.8-cp314-cp314-win32.whl", hash = "sha256:a585357d5d8774aad9ffc12435de457f9e35cde55e0dc8bc43ab590a6929e59f", size = 228404, upload-time = "2025-10-14T06:46:56.807Z" },
{ url = "https://files.pythonhosted.org/packages/dc/b6/c7dcd8bc3094bba1c4274e432f9e77a7df703532ca000eaa550bd066b870/blake3-1.0.8-cp314-cp314-win_amd64.whl", hash = "sha256:9ab5998e2abd9754819753bc2f1cf3edf82d95402bff46aeef45ed392a5468bf", size = 215460, upload-time = "2025-10-14T06:46:58.15Z" },
{ url = "https://files.pythonhosted.org/packages/75/3c/6c8afd856c353176836daa5cc33a7989e8f54569e9d53eb1c53fc8f80c34/blake3-1.0.8-cp314-cp314t-macosx_10_12_x86_64.whl", hash = "sha256:e2df12f295f95a804338bd300e8fad4a6f54fd49bd4d9c5893855a230b5188a8", size = 347482, upload-time = "2025-10-14T06:47:00.189Z" },
{ url = "https://files.pythonhosted.org/packages/6a/35/92cd5501ce8e1f5cabdc0c3ac62d69fdb13ff0b60b62abbb2b6d0a53a790/blake3-1.0.8-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:63379be58438878eeb76ebe4f0efbeaabf42b79f2cff23b6126b7991588ced67", size = 324376, upload-time = "2025-10-14T06:47:01.413Z" },
{ url = "https://files.pythonhosted.org/packages/11/33/503b37220a3e2e31917ef13722efd00055af51c5e88ae30974c733d7ece6/blake3-1.0.8-cp314-cp314t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:88d527c247f9609dc1d45a08fd243e39f0d5300d54c57e048de24d4fa9240ebb", size = 370220, upload-time = "2025-10-14T06:47:02.573Z" },
{ url = "https://files.pythonhosted.org/packages/3e/df/fe817843adf59516c04d44387bd643b422a3b0400ea95c6ede6a49920737/blake3-1.0.8-cp314-cp314t-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:506a47897a11ebe8f3cdeb52f1365d6a2f83959e98ccb0c830f8f73277d4d358", size = 373454, upload-time = "2025-10-14T06:47:03.784Z" },
{ url = "https://files.pythonhosted.org/packages/d1/4d/90a2a623575373dfc9b683f1bad1bf017feafa5a6d65d94fb09543050740/blake3-1.0.8-cp314-cp314t-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e5122a61b3b004bbbd979bdf83a3aaab432da3e2a842d7ddf1c273f2503b4884", size = 447102, upload-time = "2025-10-14T06:47:04.958Z" },
{ url = "https://files.pythonhosted.org/packages/93/ff/4e8ce314f60115c4c657b1fdbe9225b991da4f5bcc5d1c1f1d151e2f39d6/blake3-1.0.8-cp314-cp314t-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:0171e85d56dec1219abdae5f49a0ed12cb3f86a454c29160a64fd8a8166bba37", size = 506791, upload-time = "2025-10-14T06:47:06.82Z" },
{ url = "https://files.pythonhosted.org/packages/44/88/2963a1f18aab52bdcf35379b2b48c34bbc462320c37e76960636b8602c36/blake3-1.0.8-cp314-cp314t-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:003f61e8c41dd9931edddf1cc6a1bb680fb2ac0ad15493ef4a1df9adc59ce9df", size = 393717, upload-time = "2025-10-14T06:47:09.085Z" },
{ url = "https://files.pythonhosted.org/packages/45/d1/a848ed8e8d4e236b9b16381768c9ae99d92890c24886bb4505aa9c3d2033/blake3-1.0.8-cp314-cp314t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d2c3151955efb09ba58cd3e1263521e15e9e3866a40d6bd3556d86fc968e8f95", size = 386150, upload-time = "2025-10-14T06:47:10.363Z" },
{ url = "https://files.pythonhosted.org/packages/96/09/e3eb5d60f97c01de23d9f434e6e1fc117efb466eaa1f6ddbbbcb62580d6e/blake3-1.0.8-cp314-cp314t-musllinux_1_1_aarch64.whl", hash = "sha256:5eb25bca3cee2e0dd746a214784fb36be6a43640c01c55b6b4e26196e72d076c", size = 549120, upload-time = "2025-10-14T06:47:11.713Z" },
{ url = "https://files.pythonhosted.org/packages/14/ad/3d9661c710febb8957dd685fdb3e5a861aa0ac918eda3031365ce45789e2/blake3-1.0.8-cp314-cp314t-musllinux_1_1_x86_64.whl", hash = "sha256:ab4e1dea4fa857944944db78e8f20d99ee2e16b2dea5a14f514fb0607753ac83", size = 553264, upload-time = "2025-10-14T06:47:13.317Z" },
{ url = "https://files.pythonhosted.org/packages/11/55/e332a5b49edf377d0690e95951cca21a00c568f6e37315f9749efee52617/blake3-1.0.8-cp314-cp314t-win32.whl", hash = "sha256:67f1bc11bf59464ef092488c707b13dd4e872db36e25c453dfb6e0c7498df9f1", size = 228116, upload-time = "2025-10-14T06:47:14.516Z" },
{ url = "https://files.pythonhosted.org/packages/b0/5c/dbd00727a3dd165d7e0e8af40e630cd7e45d77b525a3218afaff8a87358e/blake3-1.0.8-cp314-cp314t-win_amd64.whl", hash = "sha256:421b99cdf1ff2d1bf703bc56c454f4b286fce68454dd8711abbcb5a0df90c19a", size = 215133, upload-time = "2025-10-14T06:47:16.069Z" },
]
[[package]]
name = "caio"
version = "0.9.24"
@ -365,6 +442,15 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/76/f2/98fd8d0b514622a789fd2824b59bd6041b799aaeeba14a8d92d52f6654dd/cython-3.2.2-py3-none-any.whl", hash = "sha256:13b99ecb9482aff6a6c12d1ca6feef6940c507af909914b49f568de74fa965fb", size = 1255106, upload-time = "2025-11-30T12:48:18.454Z" }, { url = "https://files.pythonhosted.org/packages/76/f2/98fd8d0b514622a789fd2824b59bd6041b799aaeeba14a8d92d52f6654dd/cython-3.2.2-py3-none-any.whl", hash = "sha256:13b99ecb9482aff6a6c12d1ca6feef6940c507af909914b49f568de74fa965fb", size = 1255106, upload-time = "2025-11-30T12:48:18.454Z" },
] ]
[[package]]
name = "dnspython"
version = "2.8.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/8c/8b/57666417c0f90f08bcafa776861060426765fdb422eb10212086fb811d26/dnspython-2.8.0.tar.gz", hash = "sha256:181d3c6996452cb1189c4046c61599b84a5a86e099562ffde77d26984ff26d0f", size = 368251, upload-time = "2025-09-07T18:58:00.022Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/ba/5a/18ad964b0086c6e62e2e7500f7edc89e3faa45033c71c1893d34eed2b2de/dnspython-2.8.0-py3-none-any.whl", hash = "sha256:01d9bbc4a2d76bf0db7c1f729812ded6d912bd318d3b1cf81d30c0f845dbf3af", size = 331094, upload-time = "2025-09-07T18:57:58.071Z" },
]
[[package]]
name = "elastic-transport"
version = "8.17.1"
@ -698,6 +784,94 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/b3/38/89ba8ad64ae25be8de66a6d463314cf1eb366222074cfda9ee839c56a4b4/mdurl-0.1.2-py3-none-any.whl", hash = "sha256:84008a41e51615a49fc9966191ff91509e3c40b939176e643fd50a5c2196b8f8", size = 9979, upload-time = "2022-08-14T12:40:09.779Z" }, { url = "https://files.pythonhosted.org/packages/b3/38/89ba8ad64ae25be8de66a6d463314cf1eb366222074cfda9ee839c56a4b4/mdurl-0.1.2-py3-none-any.whl", hash = "sha256:84008a41e51615a49fc9966191ff91509e3c40b939176e643fd50a5c2196b8f8", size = 9979, upload-time = "2022-08-14T12:40:09.779Z" },
] ]
[[package]]
name = "mmh3"
version = "5.2.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/a7/af/f28c2c2f51f31abb4725f9a64bc7863d5f491f6539bd26aee2a1d21a649e/mmh3-5.2.0.tar.gz", hash = "sha256:1efc8fec8478e9243a78bb993422cf79f8ff85cb4cf6b79647480a31e0d950a8", size = 33582, upload-time = "2025-07-29T07:43:48.49Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/bf/6a/d5aa7edb5c08e0bd24286c7d08341a0446f9a2fbbb97d96a8a6dd81935ee/mmh3-5.2.0-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:384eda9361a7bf83a85e09447e1feafe081034af9dd428893701b959230d84be", size = 56141, upload-time = "2025-07-29T07:42:13.456Z" },
{ url = "https://files.pythonhosted.org/packages/08/49/131d0fae6447bc4a7299ebdb1a6fb9d08c9f8dcf97d75ea93e8152ddf7ab/mmh3-5.2.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:2c9da0d568569cc87315cb063486d761e38458b8ad513fedd3dc9263e1b81bcd", size = 40681, upload-time = "2025-07-29T07:42:14.306Z" },
{ url = "https://files.pythonhosted.org/packages/8f/6f/9221445a6bcc962b7f5ff3ba18ad55bba624bacdc7aa3fc0a518db7da8ec/mmh3-5.2.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:86d1be5d63232e6eb93c50881aea55ff06eb86d8e08f9b5417c8c9b10db9db96", size = 40062, upload-time = "2025-07-29T07:42:15.08Z" },
{ url = "https://files.pythonhosted.org/packages/1e/d4/6bb2d0fef81401e0bb4c297d1eb568b767de4ce6fc00890bc14d7b51ecc4/mmh3-5.2.0-cp312-cp312-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:bf7bee43e17e81671c447e9c83499f53d99bf440bc6d9dc26a841e21acfbe094", size = 97333, upload-time = "2025-07-29T07:42:16.436Z" },
{ url = "https://files.pythonhosted.org/packages/44/e0/ccf0daff8134efbb4fbc10a945ab53302e358c4b016ada9bf97a6bdd50c1/mmh3-5.2.0-cp312-cp312-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:7aa18cdb58983ee660c9c400b46272e14fa253c675ed963d3812487f8ca42037", size = 103310, upload-time = "2025-07-29T07:42:17.796Z" },
{ url = "https://files.pythonhosted.org/packages/02/63/1965cb08a46533faca0e420e06aff8bbaf9690a6f0ac6ae6e5b2e4544687/mmh3-5.2.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:ae9d032488fcec32d22be6542d1a836f00247f40f320844dbb361393b5b22773", size = 106178, upload-time = "2025-07-29T07:42:19.281Z" },
{ url = "https://files.pythonhosted.org/packages/c2/41/c883ad8e2c234013f27f92061200afc11554ea55edd1bcf5e1accd803a85/mmh3-5.2.0-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:e1861fb6b1d0453ed7293200139c0a9011eeb1376632e048e3766945b13313c5", size = 113035, upload-time = "2025-07-29T07:42:20.356Z" },
{ url = "https://files.pythonhosted.org/packages/df/b5/1ccade8b1fa625d634a18bab7bf08a87457e09d5ec8cf83ca07cbea9d400/mmh3-5.2.0-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:99bb6a4d809aa4e528ddfe2c85dd5239b78b9dd14be62cca0329db78505e7b50", size = 120784, upload-time = "2025-07-29T07:42:21.377Z" },
{ url = "https://files.pythonhosted.org/packages/77/1c/919d9171fcbdcdab242e06394464ccf546f7d0f3b31e0d1e3a630398782e/mmh3-5.2.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:1f8d8b627799f4e2fcc7c034fed8f5f24dc7724ff52f69838a3d6d15f1ad4765", size = 99137, upload-time = "2025-07-29T07:42:22.344Z" },
{ url = "https://files.pythonhosted.org/packages/66/8a/1eebef5bd6633d36281d9fc83cf2e9ba1ba0e1a77dff92aacab83001cee4/mmh3-5.2.0-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:b5995088dd7023d2d9f310a0c67de5a2b2e06a570ecfd00f9ff4ab94a67cde43", size = 98664, upload-time = "2025-07-29T07:42:23.269Z" },
{ url = "https://files.pythonhosted.org/packages/13/41/a5d981563e2ee682b21fb65e29cc0f517a6734a02b581359edd67f9d0360/mmh3-5.2.0-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:1a5f4d2e59d6bba8ef01b013c472741835ad961e7c28f50c82b27c57748744a4", size = 106459, upload-time = "2025-07-29T07:42:24.238Z" },
{ url = "https://files.pythonhosted.org/packages/24/31/342494cd6ab792d81e083680875a2c50fa0c5df475ebf0b67784f13e4647/mmh3-5.2.0-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:fd6e6c3d90660d085f7e73710eab6f5545d4854b81b0135a3526e797009dbda3", size = 110038, upload-time = "2025-07-29T07:42:25.629Z" },
{ url = "https://files.pythonhosted.org/packages/28/44/efda282170a46bb4f19c3e2b90536513b1d821c414c28469a227ca5a1789/mmh3-5.2.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:c4a2f3d83879e3de2eb8cbf562e71563a8ed15ee9b9c2e77ca5d9f73072ac15c", size = 97545, upload-time = "2025-07-29T07:42:27.04Z" },
{ url = "https://files.pythonhosted.org/packages/68/8f/534ae319c6e05d714f437e7206f78c17e66daca88164dff70286b0e8ea0c/mmh3-5.2.0-cp312-cp312-win32.whl", hash = "sha256:2421b9d665a0b1ad724ec7332fb5a98d075f50bc51a6ff854f3a1882bd650d49", size = 40805, upload-time = "2025-07-29T07:42:28.032Z" },
{ url = "https://files.pythonhosted.org/packages/b8/f6/f6abdcfefcedab3c964868048cfe472764ed358c2bf6819a70dd4ed4ed3a/mmh3-5.2.0-cp312-cp312-win_amd64.whl", hash = "sha256:72d80005b7634a3a2220f81fbeb94775ebd12794623bb2e1451701ea732b4aa3", size = 41597, upload-time = "2025-07-29T07:42:28.894Z" },
{ url = "https://files.pythonhosted.org/packages/15/fd/f7420e8cbce45c259c770cac5718badf907b302d3a99ec587ba5ce030237/mmh3-5.2.0-cp312-cp312-win_arm64.whl", hash = "sha256:3d6bfd9662a20c054bc216f861fa330c2dac7c81e7fb8307b5e32ab5b9b4d2e0", size = 39350, upload-time = "2025-07-29T07:42:29.794Z" },
{ url = "https://files.pythonhosted.org/packages/d8/fa/27f6ab93995ef6ad9f940e96593c5dd24744d61a7389532b0fec03745607/mmh3-5.2.0-cp313-cp313-android_21_arm64_v8a.whl", hash = "sha256:e79c00eba78f7258e5b354eccd4d7907d60317ced924ea4a5f2e9d83f5453065", size = 40874, upload-time = "2025-07-29T07:42:30.662Z" },
{ url = "https://files.pythonhosted.org/packages/11/9c/03d13bcb6a03438bc8cac3d2e50f80908d159b31a4367c2e1a7a077ded32/mmh3-5.2.0-cp313-cp313-android_21_x86_64.whl", hash = "sha256:956127e663d05edbeec54df38885d943dfa27406594c411139690485128525de", size = 42012, upload-time = "2025-07-29T07:42:31.539Z" },
{ url = "https://files.pythonhosted.org/packages/4e/78/0865d9765408a7d504f1789944e678f74e0888b96a766d578cb80b040999/mmh3-5.2.0-cp313-cp313-ios_13_0_arm64_iphoneos.whl", hash = "sha256:c3dca4cb5b946ee91b3d6bb700d137b1cd85c20827f89fdf9c16258253489044", size = 39197, upload-time = "2025-07-29T07:42:32.374Z" },
{ url = "https://files.pythonhosted.org/packages/3e/12/76c3207bd186f98b908b6706c2317abb73756d23a4e68ea2bc94825b9015/mmh3-5.2.0-cp313-cp313-ios_13_0_arm64_iphonesimulator.whl", hash = "sha256:e651e17bfde5840e9e4174b01e9e080ce49277b70d424308b36a7969d0d1af73", size = 39840, upload-time = "2025-07-29T07:42:33.227Z" },
{ url = "https://files.pythonhosted.org/packages/5d/0d/574b6cce5555c9f2b31ea189ad44986755eb14e8862db28c8b834b8b64dc/mmh3-5.2.0-cp313-cp313-ios_13_0_x86_64_iphonesimulator.whl", hash = "sha256:9f64bf06f4bf623325fda3a6d02d36cd69199b9ace99b04bb2d7fd9f89688504", size = 40644, upload-time = "2025-07-29T07:42:34.099Z" },
{ url = "https://files.pythonhosted.org/packages/52/82/3731f8640b79c46707f53ed72034a58baad400be908c87b0088f1f89f986/mmh3-5.2.0-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:ddc63328889bcaee77b743309e5c7d2d52cee0d7d577837c91b6e7cc9e755e0b", size = 56153, upload-time = "2025-07-29T07:42:35.031Z" },
{ url = "https://files.pythonhosted.org/packages/4f/34/e02dca1d4727fd9fdeaff9e2ad6983e1552804ce1d92cc796e5b052159bb/mmh3-5.2.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:bb0fdc451fb6d86d81ab8f23d881b8d6e37fc373a2deae1c02d27002d2ad7a05", size = 40684, upload-time = "2025-07-29T07:42:35.914Z" },
{ url = "https://files.pythonhosted.org/packages/8f/36/3dee40767356e104967e6ed6d102ba47b0b1ce2a89432239b95a94de1b89/mmh3-5.2.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:b29044e1ffdb84fe164d0a7ea05c7316afea93c00f8ed9449cf357c36fc4f814", size = 40057, upload-time = "2025-07-29T07:42:36.755Z" },
{ url = "https://files.pythonhosted.org/packages/31/58/228c402fccf76eb39a0a01b8fc470fecf21965584e66453b477050ee0e99/mmh3-5.2.0-cp313-cp313-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:58981d6ea9646dbbf9e59a30890cbf9f610df0e4a57dbfe09215116fd90b0093", size = 97344, upload-time = "2025-07-29T07:42:37.675Z" },
{ url = "https://files.pythonhosted.org/packages/34/82/fc5ce89006389a6426ef28e326fc065b0fbaaed230373b62d14c889f47ea/mmh3-5.2.0-cp313-cp313-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:7e5634565367b6d98dc4aa2983703526ef556b3688ba3065edb4b9b90ede1c54", size = 103325, upload-time = "2025-07-29T07:42:38.591Z" },
{ url = "https://files.pythonhosted.org/packages/09/8c/261e85777c6aee1ebd53f2f17e210e7481d5b0846cd0b4a5c45f1e3761b8/mmh3-5.2.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:b0271ac12415afd3171ab9a3c7cbfc71dee2c68760a7dc9d05bf8ed6ddfa3a7a", size = 106240, upload-time = "2025-07-29T07:42:39.563Z" },
{ url = "https://files.pythonhosted.org/packages/70/73/2f76b3ad8a3d431824e9934403df36c0ddacc7831acf82114bce3c4309c8/mmh3-5.2.0-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:45b590e31bc552c6f8e2150ff1ad0c28dd151e9f87589e7eaf508fbdd8e8e908", size = 113060, upload-time = "2025-07-29T07:42:40.585Z" },
{ url = "https://files.pythonhosted.org/packages/9f/b9/7ea61a34e90e50a79a9d87aa1c0b8139a7eaf4125782b34b7d7383472633/mmh3-5.2.0-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:bdde97310d59604f2a9119322f61b31546748499a21b44f6715e8ced9308a6c5", size = 120781, upload-time = "2025-07-29T07:42:41.618Z" },
{ url = "https://files.pythonhosted.org/packages/0f/5b/ae1a717db98c7894a37aeedbd94b3f99e6472a836488f36b6849d003485b/mmh3-5.2.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:fc9c5f280438cf1c1a8f9abb87dc8ce9630a964120cfb5dd50d1e7ce79690c7a", size = 99174, upload-time = "2025-07-29T07:42:42.587Z" },
{ url = "https://files.pythonhosted.org/packages/e3/de/000cce1d799fceebb6d4487ae29175dd8e81b48e314cba7b4da90bcf55d7/mmh3-5.2.0-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:c903e71fd8debb35ad2a4184c1316b3cb22f64ce517b4e6747f25b0a34e41266", size = 98734, upload-time = "2025-07-29T07:42:43.996Z" },
{ url = "https://files.pythonhosted.org/packages/79/19/0dc364391a792b72fbb22becfdeacc5add85cc043cd16986e82152141883/mmh3-5.2.0-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:eed4bba7ff8a0d37106ba931ab03bdd3915fbb025bcf4e1f0aa02bc8114960c5", size = 106493, upload-time = "2025-07-29T07:42:45.07Z" },
{ url = "https://files.pythonhosted.org/packages/3c/b1/bc8c28e4d6e807bbb051fefe78e1156d7f104b89948742ad310612ce240d/mmh3-5.2.0-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:1fdb36b940e9261aff0b5177c5b74a36936b902f473180f6c15bde26143681a9", size = 110089, upload-time = "2025-07-29T07:42:46.122Z" },
{ url = "https://files.pythonhosted.org/packages/3b/a2/d20f3f5c95e9c511806686c70d0a15479cc3941c5f322061697af1c1ff70/mmh3-5.2.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:7303aab41e97adcf010a09efd8f1403e719e59b7705d5e3cfed3dd7571589290", size = 97571, upload-time = "2025-07-29T07:42:47.18Z" },
{ url = "https://files.pythonhosted.org/packages/7b/23/665296fce4f33488deec39a750ffd245cfc07aafb0e3ef37835f91775d14/mmh3-5.2.0-cp313-cp313-win32.whl", hash = "sha256:03e08c6ebaf666ec1e3d6ea657a2d363bb01effd1a9acfe41f9197decaef0051", size = 40806, upload-time = "2025-07-29T07:42:48.166Z" },
{ url = "https://files.pythonhosted.org/packages/59/b0/92e7103f3b20646e255b699e2d0327ce53a3f250e44367a99dc8be0b7c7a/mmh3-5.2.0-cp313-cp313-win_amd64.whl", hash = "sha256:7fddccd4113e7b736706e17a239a696332360cbaddf25ae75b57ba1acce65081", size = 41600, upload-time = "2025-07-29T07:42:49.371Z" },
{ url = "https://files.pythonhosted.org/packages/99/22/0b2bd679a84574647de538c5b07ccaa435dbccc37815067fe15b90fe8dad/mmh3-5.2.0-cp313-cp313-win_arm64.whl", hash = "sha256:fa0c966ee727aad5406d516375593c5f058c766b21236ab8985693934bb5085b", size = 39349, upload-time = "2025-07-29T07:42:50.268Z" },
{ url = "https://files.pythonhosted.org/packages/f7/ca/a20db059a8a47048aaf550da14a145b56e9c7386fb8280d3ce2962dcebf7/mmh3-5.2.0-cp314-cp314-ios_13_0_arm64_iphoneos.whl", hash = "sha256:e5015f0bb6eb50008bed2d4b1ce0f2a294698a926111e4bb202c0987b4f89078", size = 39209, upload-time = "2025-07-29T07:42:51.559Z" },
{ url = "https://files.pythonhosted.org/packages/98/dd/e5094799d55c7482d814b979a0fd608027d0af1b274bfb4c3ea3e950bfd5/mmh3-5.2.0-cp314-cp314-ios_13_0_arm64_iphonesimulator.whl", hash = "sha256:e0f3ed828d709f5b82d8bfe14f8856120718ec4bd44a5b26102c3030a1e12501", size = 39843, upload-time = "2025-07-29T07:42:52.536Z" },
{ url = "https://files.pythonhosted.org/packages/f4/6b/7844d7f832c85400e7cc89a1348e4e1fdd38c5a38415bb5726bbb8fcdb6c/mmh3-5.2.0-cp314-cp314-ios_13_0_x86_64_iphonesimulator.whl", hash = "sha256:f35727c5118aba95f0397e18a1a5b8405425581bfe53e821f0fb444cbdc2bc9b", size = 40648, upload-time = "2025-07-29T07:42:53.392Z" },
{ url = "https://files.pythonhosted.org/packages/1f/bf/71f791f48a21ff3190ba5225807cbe4f7223360e96862c376e6e3fb7efa7/mmh3-5.2.0-cp314-cp314-macosx_10_13_universal2.whl", hash = "sha256:3bc244802ccab5220008cb712ca1508cb6a12f0eb64ad62997156410579a1770", size = 56164, upload-time = "2025-07-29T07:42:54.267Z" },
{ url = "https://files.pythonhosted.org/packages/70/1f/f87e3d34d83032b4f3f0f528c6d95a98290fcacf019da61343a49dccfd51/mmh3-5.2.0-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:ff3d50dc3fe8a98059f99b445dfb62792b5d006c5e0b8f03c6de2813b8376110", size = 40692, upload-time = "2025-07-29T07:42:55.234Z" },
{ url = "https://files.pythonhosted.org/packages/a6/e2/db849eaed07117086f3452feca8c839d30d38b830ac59fe1ce65af8be5ad/mmh3-5.2.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:37a358cc881fe796e099c1db6ce07ff757f088827b4e8467ac52b7a7ffdca647", size = 40068, upload-time = "2025-07-29T07:42:56.158Z" },
{ url = "https://files.pythonhosted.org/packages/df/6b/209af927207af77425b044e32f77f49105a0b05d82ff88af6971d8da4e19/mmh3-5.2.0-cp314-cp314-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:b9a87025121d1c448f24f27ff53a5fe7b6ef980574b4a4f11acaabe702420d63", size = 97367, upload-time = "2025-07-29T07:42:57.037Z" },
{ url = "https://files.pythonhosted.org/packages/ca/e0/78adf4104c425606a9ce33fb351f790c76a6c2314969c4a517d1ffc92196/mmh3-5.2.0-cp314-cp314-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:1ba55d6ca32eeef8b2625e1e4bfc3b3db52bc63014bd7e5df8cc11bf2b036b12", size = 103306, upload-time = "2025-07-29T07:42:58.522Z" },
{ url = "https://files.pythonhosted.org/packages/a3/79/c2b89f91b962658b890104745b1b6c9ce38d50a889f000b469b91eeb1b9e/mmh3-5.2.0-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:c9ff37ba9f15637e424c2ab57a1a590c52897c845b768e4e0a4958084ec87f22", size = 106312, upload-time = "2025-07-29T07:42:59.552Z" },
{ url = "https://files.pythonhosted.org/packages/4b/14/659d4095528b1a209be90934778c5ffe312177d51e365ddcbca2cac2ec7c/mmh3-5.2.0-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:a094319ec0db52a04af9fdc391b4d39a1bc72bc8424b47c4411afb05413a44b5", size = 113135, upload-time = "2025-07-29T07:43:00.745Z" },
{ url = "https://files.pythonhosted.org/packages/8d/6f/cd7734a779389a8a467b5c89a48ff476d6f2576e78216a37551a97e9e42a/mmh3-5.2.0-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:c5584061fd3da584659b13587f26c6cad25a096246a481636d64375d0c1f6c07", size = 120775, upload-time = "2025-07-29T07:43:02.124Z" },
{ url = "https://files.pythonhosted.org/packages/1d/ca/8256e3b96944408940de3f9291d7e38a283b5761fe9614d4808fcf27bd62/mmh3-5.2.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:ecbfc0437ddfdced5e7822d1ce4855c9c64f46819d0fdc4482c53f56c707b935", size = 99178, upload-time = "2025-07-29T07:43:03.182Z" },
{ url = "https://files.pythonhosted.org/packages/8a/32/39e2b3cf06b6e2eb042c984dab8680841ac2a0d3ca6e0bea30db1f27b565/mmh3-5.2.0-cp314-cp314-musllinux_1_2_i686.whl", hash = "sha256:7b986d506a8e8ea345791897ba5d8ba0d9d8820cd4fc3e52dbe6de19388de2e7", size = 98738, upload-time = "2025-07-29T07:43:04.207Z" },
{ url = "https://files.pythonhosted.org/packages/61/d3/7bbc8e0e8cf65ebbe1b893ffa0467b7ecd1bd07c3bbf6c9db4308ada22ec/mmh3-5.2.0-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:38d899a156549da8ef6a9f1d6f7ef231228d29f8f69bce2ee12f5fba6d6fd7c5", size = 106510, upload-time = "2025-07-29T07:43:05.656Z" },
{ url = "https://files.pythonhosted.org/packages/10/99/b97e53724b52374e2f3859046f0eb2425192da356cb19784d64bc17bb1cf/mmh3-5.2.0-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:d86651fa45799530885ba4dab3d21144486ed15285e8784181a0ab37a4552384", size = 110053, upload-time = "2025-07-29T07:43:07.204Z" },
{ url = "https://files.pythonhosted.org/packages/ac/62/3688c7d975ed195155671df68788c83fed6f7909b6ec4951724c6860cb97/mmh3-5.2.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:c463d7c1c4cfc9d751efeaadd936bbba07b5b0ed81a012b3a9f5a12f0872bd6e", size = 97546, upload-time = "2025-07-29T07:43:08.226Z" },
{ url = "https://files.pythonhosted.org/packages/ca/3b/c6153250f03f71a8b7634cded82939546cdfba02e32f124ff51d52c6f991/mmh3-5.2.0-cp314-cp314-win32.whl", hash = "sha256:bb4fe46bdc6104fbc28db7a6bacb115ee6368ff993366bbd8a2a7f0076e6f0c0", size = 41422, upload-time = "2025-07-29T07:43:09.216Z" },
{ url = "https://files.pythonhosted.org/packages/74/01/a27d98bab083a435c4c07e9d1d720d4c8a578bf4c270bae373760b1022be/mmh3-5.2.0-cp314-cp314-win_amd64.whl", hash = "sha256:7c7f0b342fd06044bedd0b6e72177ddc0076f54fd89ee239447f8b271d919d9b", size = 42135, upload-time = "2025-07-29T07:43:10.183Z" },
{ url = "https://files.pythonhosted.org/packages/cb/c9/dbba5507e95429b8b380e2ba091eff5c20a70a59560934dff0ad8392b8c8/mmh3-5.2.0-cp314-cp314-win_arm64.whl", hash = "sha256:3193752fc05ea72366c2b63ff24b9a190f422e32d75fdeae71087c08fff26115", size = 39879, upload-time = "2025-07-29T07:43:11.106Z" },
{ url = "https://files.pythonhosted.org/packages/b5/d1/c8c0ef839c17258b9de41b84f663574fabcf8ac2007b7416575e0f65ff6e/mmh3-5.2.0-cp314-cp314t-macosx_10_13_universal2.whl", hash = "sha256:69fc339d7202bea69ef9bd7c39bfdf9fdabc8e6822a01eba62fb43233c1b3932", size = 57696, upload-time = "2025-07-29T07:43:11.989Z" },
{ url = "https://files.pythonhosted.org/packages/2f/55/95e2b9ff201e89f9fe37036037ab61a6c941942b25cdb7b6a9df9b931993/mmh3-5.2.0-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:12da42c0a55c9d86ab566395324213c319c73ecb0c239fad4726324212b9441c", size = 41421, upload-time = "2025-07-29T07:43:13.269Z" },
{ url = "https://files.pythonhosted.org/packages/77/79/9be23ad0b7001a4b22752e7693be232428ecc0a35068a4ff5c2f14ef8b20/mmh3-5.2.0-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:f7f9034c7cf05ddfaac8d7a2e63a3c97a840d4615d0a0e65ba8bdf6f8576e3be", size = 40853, upload-time = "2025-07-29T07:43:14.888Z" },
{ url = "https://files.pythonhosted.org/packages/ac/1b/96b32058eda1c1dee8264900c37c359a7325c1f11f5ff14fd2be8e24eff9/mmh3-5.2.0-cp314-cp314t-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:11730eeb16dfcf9674fdea9bb6b8e6dd9b40813b7eb839bc35113649eef38aeb", size = 109694, upload-time = "2025-07-29T07:43:15.816Z" },
{ url = "https://files.pythonhosted.org/packages/8d/6f/a2ae44cd7dad697b6dea48390cbc977b1e5ca58fda09628cbcb2275af064/mmh3-5.2.0-cp314-cp314t-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:932a6eec1d2e2c3c9e630d10f7128d80e70e2d47fe6b8c7ea5e1afbd98733e65", size = 117438, upload-time = "2025-07-29T07:43:16.865Z" },
{ url = "https://files.pythonhosted.org/packages/a0/08/bfb75451c83f05224a28afeaf3950c7b793c0b71440d571f8e819cfb149a/mmh3-5.2.0-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:3ca975c51c5028947bbcfc24966517aac06a01d6c921e30f7c5383c195f87991", size = 120409, upload-time = "2025-07-29T07:43:18.207Z" },
{ url = "https://files.pythonhosted.org/packages/9f/ea/8b118b69b2ff8df568f742387d1a159bc654a0f78741b31437dd047ea28e/mmh3-5.2.0-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:5b0b58215befe0f0e120b828f7645e97719bbba9f23b69e268ed0ac7adde8645", size = 125909, upload-time = "2025-07-29T07:43:19.39Z" },
{ url = "https://files.pythonhosted.org/packages/3e/11/168cc0b6a30650032e351a3b89b8a47382da541993a03af91e1ba2501234/mmh3-5.2.0-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:29c2b9ce61886809d0492a274a5a53047742dea0f703f9c4d5d223c3ea6377d3", size = 135331, upload-time = "2025-07-29T07:43:20.435Z" },
{ url = "https://files.pythonhosted.org/packages/31/05/e3a9849b1c18a7934c64e831492c99e67daebe84a8c2f2c39a7096a830e3/mmh3-5.2.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:a367d4741ac0103f8198c82f429bccb9359f543ca542b06a51f4f0332e8de279", size = 110085, upload-time = "2025-07-29T07:43:21.92Z" },
{ url = "https://files.pythonhosted.org/packages/d9/d5/a96bcc306e3404601418b2a9a370baec92af84204528ba659fdfe34c242f/mmh3-5.2.0-cp314-cp314t-musllinux_1_2_i686.whl", hash = "sha256:5a5dba98e514fb26241868f6eb90a7f7ca0e039aed779342965ce24ea32ba513", size = 111195, upload-time = "2025-07-29T07:43:23.066Z" },
{ url = "https://files.pythonhosted.org/packages/af/29/0fd49801fec5bff37198684e0849b58e0dab3a2a68382a357cfffb0fafc3/mmh3-5.2.0-cp314-cp314t-musllinux_1_2_ppc64le.whl", hash = "sha256:941603bfd75a46023807511c1ac2f1b0f39cccc393c15039969806063b27e6db", size = 116919, upload-time = "2025-07-29T07:43:24.178Z" },
{ url = "https://files.pythonhosted.org/packages/2d/04/4f3c32b0a2ed762edca45d8b46568fc3668e34f00fb1e0a3b5451ec1281c/mmh3-5.2.0-cp314-cp314t-musllinux_1_2_s390x.whl", hash = "sha256:132dd943451a7c7546978863d2f5a64977928410782e1a87d583cb60eb89e667", size = 123160, upload-time = "2025-07-29T07:43:25.26Z" },
{ url = "https://files.pythonhosted.org/packages/91/76/3d29eaa38821730633d6a240d36fa8ad2807e9dfd432c12e1a472ed211eb/mmh3-5.2.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:f698733a8a494466432d611a8f0d1e026f5286dee051beea4b3c3146817e35d5", size = 110206, upload-time = "2025-07-29T07:43:26.699Z" },
{ url = "https://files.pythonhosted.org/packages/44/1c/ccf35892684d3a408202e296e56843743e0b4fb1629e59432ea88cdb3909/mmh3-5.2.0-cp314-cp314t-win32.whl", hash = "sha256:6d541038b3fc360ec538fc116de87462627944765a6750308118f8b509a8eec7", size = 41970, upload-time = "2025-07-29T07:43:27.666Z" },
{ url = "https://files.pythonhosted.org/packages/75/b2/b9e4f1e5adb5e21eb104588fcee2cd1eaa8308255173481427d5ecc4284e/mmh3-5.2.0-cp314-cp314t-win_amd64.whl", hash = "sha256:e912b19cf2378f2967d0c08e86ff4c6c360129887f678e27e4dde970d21b3f4d", size = 43063, upload-time = "2025-07-29T07:43:28.582Z" },
{ url = "https://files.pythonhosted.org/packages/6a/fc/0e61d9a4e29c8679356795a40e48f647b4aad58d71bfc969f0f8f56fb912/mmh3-5.2.0-cp314-cp314t-win_arm64.whl", hash = "sha256:e7884931fe5e788163e7b3c511614130c2c59feffdc21112290a194487efb2e9", size = 40455, upload-time = "2025-07-29T07:43:29.563Z" },
]
[[package]]
name = "morphys"
version = "1.0"
source = { registry = "https://pypi.org/simple" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/f9/4f/cb781d0ac5d079adabc77dc4f0bc99fc81c390029bd33c6e70552139e762/morphys-1.0-py2.py3-none-any.whl", hash = "sha256:76d6dbaa4d65f597e59d332c81da786d83e4669387b9b2a750cfec74e7beec20", size = 5618, upload-time = "2017-01-10T20:08:56.872Z" },
]
[[package]]
name = "msgspec"
version = "0.19.0"
@@ -720,6 +894,29 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/23/d8/f15b40611c2d5753d1abb0ca0da0c75348daf1252220e5dda2867bd81062/msgspec-0.19.0-cp313-cp313-win_amd64.whl", hash = "sha256:317050bc0f7739cb30d257ff09152ca309bf5a369854bbf1e57dffc310c1f20f", size = 187432, upload-time = "2024-12-27T17:40:16.256Z" },
]
[[package]]
name = "multiaddr"
version = "0.1.1"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "base58" },
{ name = "dnspython" },
{ name = "idna" },
{ name = "netaddr" },
{ name = "psutil" },
{ name = "py-cid" },
{ name = "py-multibase" },
{ name = "py-multicodec" },
{ name = "py-multihash" },
{ name = "trio" },
{ name = "trio-typing" },
{ name = "varint" },
]
sdist = { url = "https://files.pythonhosted.org/packages/cf/5c/6d27f2b04c54e7c85b4a05760ac7dfaa7c68214b917e0d4a5043c01cf231/multiaddr-0.1.1.tar.gz", hash = "sha256:04da0afd2097625569073776526eb9733db9d9713286bada44632a9d7275b9bb", size = 54186, upload-time = "2025-12-07T17:19:07.996Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/07/ad/bd8a1748953a8f7a8167ec7d02eeefbed469595089d75e3fcb53047e7e20/multiaddr-0.1.1-py3-none-any.whl", hash = "sha256:d95333effddbd372009dbfce4d2ec922dd633697b6cc89a5af90ae082c4e68d4", size = 38374, upload-time = "2025-12-07T17:19:05.849Z" },
]
[[package]]
name = "multidict"
version = "6.7.0"
@@ -837,6 +1034,15 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/a0/c4/c2971a3ba4c6103a3d10c4b0f24f461ddc027f0f09763220cf35ca1401b3/nest_asyncio-1.6.0-py3-none-any.whl", hash = "sha256:87af6efd6b5e897c81050477ef65c62e2b2f35d51703cae01aff2905b1852e1c", size = 5195, upload-time = "2024-01-21T14:25:17.223Z" },
]
[[package]]
name = "netaddr"
version = "1.3.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/54/90/188b2a69654f27b221fba92fda7217778208532c962509e959a9cee5229d/netaddr-1.3.0.tar.gz", hash = "sha256:5c3c3d9895b551b763779ba7db7a03487dc1f8e3b385af819af341ae9ef6e48a", size = 2260504, upload-time = "2024-05-28T21:30:37.743Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/12/cc/f4fe2c7ce68b92cbf5b2d379ca366e1edae38cccaad00f69f529b460c3ef/netaddr-1.3.0-py3-none-any.whl", hash = "sha256:c2c6a8ebe5554ce33b7d5b3a306b71bbb373e000bbbf2350dd5213cc56e3dbbe", size = 2262023, upload-time = "2024-05-28T21:30:34.191Z" },
]
[[package]]
name = "numba"
version = "0.62.1"
@@ -1000,6 +1206,18 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/6e/23/e98758924d1b3aac11a626268eabf7f3cf177e7837c28d47bf84c64532d0/pendulum-3.1.0-py3-none-any.whl", hash = "sha256:f9178c2a8e291758ade1e8dd6371b1d26d08371b4c7730a6e9a3ef8b16ebae0f", size = 111799, upload-time = "2025-04-19T14:02:34.739Z" },
]
[[package]]
name = "pexpect"
version = "4.9.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "ptyprocess" },
]
sdist = { url = "https://files.pythonhosted.org/packages/42/92/cc564bf6381ff43ce1f4d06852fc19a2f11d180f23dc32d9588bee2f149d/pexpect-4.9.0.tar.gz", hash = "sha256:ee7d41123f3c9911050ea2c2dac107568dc43b2d3b0c7557a33212c398ead30f", size = 166450, upload-time = "2023-11-25T09:07:26.339Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/9e/c3/059298687310d527a58bb01f3b1965787ee3b40dce76752eda8b44e9a2c5/pexpect-4.9.0-py2.py3-none-any.whl", hash = "sha256:7236d1e080e4936be2dc3e326cec0af72acf9212a7e1d060210e70a47e253523", size = 63772, upload-time = "2023-11-25T06:56:14.81Z" },
]
[[package]]
name = "piker"
version = "0.1.0a0.dev0"
@@ -1014,6 +1232,7 @@ dependencies = [
{ name = "httpx" },
{ name = "ib-insync" },
{ name = "msgspec" },
{ name = "multiaddr" },
{ name = "numba" }, { name = "numba" },
{ name = "numpy" }, { name = "numpy" },
{ name = "pendulum" }, { name = "pendulum" },
@ -1047,6 +1266,7 @@ dev = [
{ name = "greenback" }, { name = "greenback" },
{ name = "i3ipc" }, { name = "i3ipc" },
{ name = "pdbp" }, { name = "pdbp" },
{ name = "pexpect" },
{ name = "prompt-toolkit" }, { name = "prompt-toolkit" },
{ name = "pyperclip" }, { name = "pyperclip" },
{ name = "pyqt6" }, { name = "pyqt6" },
@ -1062,6 +1282,7 @@ lint = [
repl = [ repl = [
{ name = "greenback" }, { name = "greenback" },
{ name = "pdbp" }, { name = "pdbp" },
{ name = "pexpect" },
{ name = "prompt-toolkit" }, { name = "prompt-toolkit" },
{ name = "pyperclip" }, { name = "pyperclip" },
{ name = "xonsh" }, { name = "xonsh" },
@ -1087,6 +1308,7 @@ requires-dist = [
{ name = "httpx", specifier = ">=0.27.0,<0.28.0" }, { name = "httpx", specifier = ">=0.27.0,<0.28.0" },
{ name = "ib-insync", specifier = ">=0.9.86,<0.10.0" }, { name = "ib-insync", specifier = ">=0.9.86,<0.10.0" },
{ name = "msgspec", specifier = ">=0.19.0,<0.20" }, { name = "msgspec", specifier = ">=0.19.0,<0.20" },
{ name = "multiaddr", specifier = ">=0.1.1" },
{ name = "numba", specifier = ">=0.61.0" }, { name = "numba", specifier = ">=0.61.0" },
{ name = "numpy", specifier = ">=2.0" }, { name = "numpy", specifier = ">=2.0" },
{ name = "pendulum" }, { name = "pendulum" },
@ -1099,7 +1321,7 @@ requires-dist = [
{ name = "tomli", specifier = ">=2.0.1,<3.0.0" }, { name = "tomli", specifier = ">=2.0.1,<3.0.0" },
{ name = "tomli-w", specifier = ">=1.0.0,<2.0.0" }, { name = "tomli-w", specifier = ">=1.0.0,<2.0.0" },
{ name = "tomlkit", git = "https://github.com/pikers/tomlkit.git?branch=piker_pin" }, { name = "tomlkit", git = "https://github.com/pikers/tomlkit.git?branch=piker_pin" },
{ name = "tractor", git = "https://github.com/goodboy/tractor.git?branch=piker_pin" }, { name = "tractor", editable = "../tractor" },
{ name = "trio", specifier = ">=0.27" }, { name = "trio", specifier = ">=0.27" },
{ name = "trio-typing", specifier = ">=0.10.0" }, { name = "trio-typing", specifier = ">=0.10.0" },
{ name = "trio-util", specifier = ">=0.7.0,<0.8.0" }, { name = "trio-util", specifier = ">=0.7.0,<0.8.0" },
@@ -1116,6 +1338,7 @@ dev = [
{ name = "greenback", specifier = ">=1.1.1,<2.0.0" },
{ name = "i3ipc", specifier = ">=2.2.1" },
{ name = "pdbp", specifier = ">=1.8.2,<2.0.0" },
{ name = "pexpect", specifier = ">=4.9.0" },
{ name = "prompt-toolkit", specifier = "==3.0.40" }, { name = "prompt-toolkit", specifier = "==3.0.40" },
{ name = "pyperclip", specifier = ">=1.9.0" }, { name = "pyperclip", specifier = ">=1.9.0" },
{ name = "pyqt6", specifier = ">=6.7.0,<7.0.0" }, { name = "pyqt6", specifier = ">=6.7.0,<7.0.0" },
@@ -1123,15 +1346,16 @@ dev = [
{ name = "pytest" },
{ name = "qdarkstyle", specifier = ">=3.0.2,<4.0.0" },
{ name = "rapidfuzz", specifier = ">=3.2.0,<4.0.0" },
{ name = "xonsh" },
{ name = "xonsh", git = "https://github.com/xonsh/xonsh.git?branch=main" },
]
lint = [{ name = "ruff", specifier = ">=0.9.6" }]
repl = [
{ name = "greenback", specifier = ">=1.1.1,<2.0.0" },
{ name = "pdbp", specifier = ">=1.8.2,<2.0.0" },
{ name = "pexpect", specifier = ">=4.9.0" },
{ name = "prompt-toolkit", specifier = "==3.0.40" }, { name = "prompt-toolkit", specifier = "==3.0.40" },
{ name = "pyperclip", specifier = ">=1.9.0" }, { name = "pyperclip", specifier = ">=1.9.0" },
{ name = "xonsh" }, { name = "xonsh", git = "https://github.com/xonsh/xonsh.git?branch=main" },
] ]
testing = [{ name = "pytest" }] testing = [{ name = "pytest" }]
uis = [ uis = [
@ -1297,6 +1521,101 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/5b/5a/bc7b4a4ef808fa59a816c17b20c4bef6884daebbdf627ff2a161da67da19/propcache-0.4.1-py3-none-any.whl", hash = "sha256:af2a6052aeb6cf17d3e46ee169099044fd8224cbaf75c76a2ef596e8163e2237", size = 13305, upload-time = "2025-10-08T19:49:00.792Z" }, { url = "https://files.pythonhosted.org/packages/5b/5a/bc7b4a4ef808fa59a816c17b20c4bef6884daebbdf627ff2a161da67da19/propcache-0.4.1-py3-none-any.whl", hash = "sha256:af2a6052aeb6cf17d3e46ee169099044fd8224cbaf75c76a2ef596e8163e2237", size = 13305, upload-time = "2025-10-08T19:49:00.792Z" },
] ]
[[package]]
name = "psutil"
version = "7.2.2"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/aa/c6/d1ddf4abb55e93cebc4f2ed8b5d6dbad109ecb8d63748dd2b20ab5e57ebe/psutil-7.2.2.tar.gz", hash = "sha256:0746f5f8d406af344fd547f1c8daa5f5c33dbc293bb8d6a16d80b4bb88f59372", size = 493740, upload-time = "2026-01-28T18:14:54.428Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/51/08/510cbdb69c25a96f4ae523f733cdc963ae654904e8db864c07585ef99875/psutil-7.2.2-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:2edccc433cbfa046b980b0df0171cd25bcaeb3a68fe9022db0979e7aa74a826b", size = 130595, upload-time = "2026-01-28T18:14:57.293Z" },
{ url = "https://files.pythonhosted.org/packages/d6/f5/97baea3fe7a5a9af7436301f85490905379b1c6f2dd51fe3ecf24b4c5fbf/psutil-7.2.2-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:e78c8603dcd9a04c7364f1a3e670cea95d51ee865e4efb3556a3a63adef958ea", size = 131082, upload-time = "2026-01-28T18:14:59.732Z" },
{ url = "https://files.pythonhosted.org/packages/37/d6/246513fbf9fa174af531f28412297dd05241d97a75911ac8febefa1a53c6/psutil-7.2.2-cp313-cp313t-manylinux2010_x86_64.manylinux_2_12_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:1a571f2330c966c62aeda00dd24620425d4b0cc86881c89861fbc04549e5dc63", size = 181476, upload-time = "2026-01-28T18:15:01.884Z" },
{ url = "https://files.pythonhosted.org/packages/b8/b5/9182c9af3836cca61696dabe4fd1304e17bc56cb62f17439e1154f225dd3/psutil-7.2.2-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:917e891983ca3c1887b4ef36447b1e0873e70c933afc831c6b6da078ba474312", size = 184062, upload-time = "2026-01-28T18:15:04.436Z" },
{ url = "https://files.pythonhosted.org/packages/16/ba/0756dca669f5a9300d0cbcbfae9a4c30e446dfc7440ffe43ded5724bfd93/psutil-7.2.2-cp313-cp313t-win_amd64.whl", hash = "sha256:ab486563df44c17f5173621c7b198955bd6b613fb87c71c161f827d3fb149a9b", size = 139893, upload-time = "2026-01-28T18:15:06.378Z" },
{ url = "https://files.pythonhosted.org/packages/1c/61/8fa0e26f33623b49949346de05ec1ddaad02ed8ba64af45f40a147dbfa97/psutil-7.2.2-cp313-cp313t-win_arm64.whl", hash = "sha256:ae0aefdd8796a7737eccea863f80f81e468a1e4cf14d926bd9b6f5f2d5f90ca9", size = 135589, upload-time = "2026-01-28T18:15:08.03Z" },
{ url = "https://files.pythonhosted.org/packages/81/69/ef179ab5ca24f32acc1dac0c247fd6a13b501fd5534dbae0e05a1c48b66d/psutil-7.2.2-cp314-cp314t-macosx_10_15_x86_64.whl", hash = "sha256:eed63d3b4d62449571547b60578c5b2c4bcccc5387148db46e0c2313dad0ee00", size = 130664, upload-time = "2026-01-28T18:15:09.469Z" },
{ url = "https://files.pythonhosted.org/packages/7b/64/665248b557a236d3fa9efc378d60d95ef56dd0a490c2cd37dafc7660d4a9/psutil-7.2.2-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:7b6d09433a10592ce39b13d7be5a54fbac1d1228ed29abc880fb23df7cb694c9", size = 131087, upload-time = "2026-01-28T18:15:11.724Z" },
{ url = "https://files.pythonhosted.org/packages/d5/2e/e6782744700d6759ebce3043dcfa661fb61e2fb752b91cdeae9af12c2178/psutil-7.2.2-cp314-cp314t-manylinux2010_x86_64.manylinux_2_12_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:1fa4ecf83bcdf6e6c8f4449aff98eefb5d0604bf88cb883d7da3d8d2d909546a", size = 182383, upload-time = "2026-01-28T18:15:13.445Z" },
{ url = "https://files.pythonhosted.org/packages/57/49/0a41cefd10cb7505cdc04dab3eacf24c0c2cb158a998b8c7b1d27ee2c1f5/psutil-7.2.2-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:e452c464a02e7dc7822a05d25db4cde564444a67e58539a00f929c51eddda0cf", size = 185210, upload-time = "2026-01-28T18:15:16.002Z" },
{ url = "https://files.pythonhosted.org/packages/dd/2c/ff9bfb544f283ba5f83ba725a3c5fec6d6b10b8f27ac1dc641c473dc390d/psutil-7.2.2-cp314-cp314t-win_amd64.whl", hash = "sha256:c7663d4e37f13e884d13994247449e9f8f574bc4655d509c3b95e9ec9e2b9dc1", size = 141228, upload-time = "2026-01-28T18:15:18.385Z" },
{ url = "https://files.pythonhosted.org/packages/f2/fc/f8d9c31db14fcec13748d373e668bc3bed94d9077dbc17fb0eebc073233c/psutil-7.2.2-cp314-cp314t-win_arm64.whl", hash = "sha256:11fe5a4f613759764e79c65cf11ebdf26e33d6dd34336f8a337aa2996d71c841", size = 136284, upload-time = "2026-01-28T18:15:19.912Z" },
{ url = "https://files.pythonhosted.org/packages/e7/36/5ee6e05c9bd427237b11b3937ad82bb8ad2752d72c6969314590dd0c2f6e/psutil-7.2.2-cp36-abi3-macosx_10_9_x86_64.whl", hash = "sha256:ed0cace939114f62738d808fdcecd4c869222507e266e574799e9c0faa17d486", size = 129090, upload-time = "2026-01-28T18:15:22.168Z" },
{ url = "https://files.pythonhosted.org/packages/80/c4/f5af4c1ca8c1eeb2e92ccca14ce8effdeec651d5ab6053c589b074eda6e1/psutil-7.2.2-cp36-abi3-macosx_11_0_arm64.whl", hash = "sha256:1a7b04c10f32cc88ab39cbf606e117fd74721c831c98a27dc04578deb0c16979", size = 129859, upload-time = "2026-01-28T18:15:23.795Z" },
{ url = "https://files.pythonhosted.org/packages/b5/70/5d8df3b09e25bce090399cf48e452d25c935ab72dad19406c77f4e828045/psutil-7.2.2-cp36-abi3-manylinux2010_x86_64.manylinux_2_12_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:076a2d2f923fd4821644f5ba89f059523da90dc9014e85f8e45a5774ca5bc6f9", size = 155560, upload-time = "2026-01-28T18:15:25.976Z" },
{ url = "https://files.pythonhosted.org/packages/63/65/37648c0c158dc222aba51c089eb3bdfa238e621674dc42d48706e639204f/psutil-7.2.2-cp36-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:b0726cecd84f9474419d67252add4ac0cd9811b04d61123054b9fb6f57df6e9e", size = 156997, upload-time = "2026-01-28T18:15:27.794Z" },
{ url = "https://files.pythonhosted.org/packages/8e/13/125093eadae863ce03c6ffdbae9929430d116a246ef69866dad94da3bfbc/psutil-7.2.2-cp36-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:fd04ef36b4a6d599bbdb225dd1d3f51e00105f6d48a28f006da7f9822f2606d8", size = 148972, upload-time = "2026-01-28T18:15:29.342Z" },
{ url = "https://files.pythonhosted.org/packages/04/78/0acd37ca84ce3ddffaa92ef0f571e073faa6d8ff1f0559ab1272188ea2be/psutil-7.2.2-cp36-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:b58fabe35e80b264a4e3bb23e6b96f9e45a3df7fb7eed419ac0e5947c61e47cc", size = 148266, upload-time = "2026-01-28T18:15:31.597Z" },
{ url = "https://files.pythonhosted.org/packages/b4/90/e2159492b5426be0c1fef7acba807a03511f97c5f86b3caeda6ad92351a7/psutil-7.2.2-cp37-abi3-win_amd64.whl", hash = "sha256:eb7e81434c8d223ec4a219b5fc1c47d0417b12be7ea866e24fb5ad6e84b3d988", size = 137737, upload-time = "2026-01-28T18:15:33.849Z" },
{ url = "https://files.pythonhosted.org/packages/8c/c7/7bb2e321574b10df20cbde462a94e2b71d05f9bbda251ef27d104668306a/psutil-7.2.2-cp37-abi3-win_arm64.whl", hash = "sha256:8c233660f575a5a89e6d4cb65d9f938126312bca76d8fe087b947b3a1aaac9ee", size = 134617, upload-time = "2026-01-28T18:15:36.514Z" },
]
[[package]]
name = "ptyprocess"
version = "0.7.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/20/e5/16ff212c1e452235a90aeb09066144d0c5a6a8c0834397e03f5224495c4e/ptyprocess-0.7.0.tar.gz", hash = "sha256:5c5d0a3b48ceee0b48485e0c26037c0acd7d29765ca3fbb5cb3831d347423220", size = 70762, upload-time = "2020-12-28T15:15:30.155Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/22/a6/858897256d0deac81a172289110f31629fc4cee19b6f01283303e18c8db3/ptyprocess-0.7.0-py2.py3-none-any.whl", hash = "sha256:4b41f3967fce3af57cc7e94b888626c18bf37a083e3651ca8feeb66d492fef35", size = 13993, upload-time = "2020-12-28T15:15:28.35Z" },
]
[[package]]
name = "py-cid"
version = "0.4.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "morphys" },
{ name = "py-multibase" },
{ name = "py-multicodec" },
{ name = "py-multihash" },
]
sdist = { url = "https://files.pythonhosted.org/packages/e7/09/c0ca25eac91c62f6f22f5ac6accd0bfa957e77adfdffd0eccc0700f2ea07/py_cid-0.4.0.tar.gz", hash = "sha256:7c15d6a83f59c3a4c7fbff793f1d4cbfc831e90355fd0e2c5cfe927c21733cc3", size = 25970, upload-time = "2025-12-19T16:55:01.057Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/80/39/d5c1828e79526002f1bf87b9daba01c7db445960daf341e1dd84a5ff0469/py_cid-0.4.0-py3-none-any.whl", hash = "sha256:6a3183a3088b219dbf3cb37eec7d47a644be3f3ebabdf38347c2e9312621d6cc", size = 8833, upload-time = "2025-12-19T16:54:59.233Z" },
]
[[package]]
name = "py-multibase"
version = "2.0.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "morphys" },
{ name = "python-baseconv" },
{ name = "six" },
]
sdist = { url = "https://files.pythonhosted.org/packages/bc/52/5ed393ab49df7e3b03995d3c4e53bae1e8c2ca40909cf25a41b346c09a38/py_multibase-2.0.0.tar.gz", hash = "sha256:58c1a264195fa1ae29ea707c6fc8196446f4bdb92e0f9a0f131e0f280b238839", size = 26857, upload-time = "2025-12-18T02:24:49.132Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/36/c7/38035079d9978b32b962f996f1cccaa166ecfe38723ab4349ab32166c037/py_multibase-2.0.0-py3-none-any.whl", hash = "sha256:b29ce489b556134e73998a11712c406b70950812955df64084754e0774e40900", size = 10608, upload-time = "2025-12-18T02:24:47.827Z" },
]
[[package]]
name = "py-multicodec"
version = "1.0.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "varint" },
]
sdist = { url = "https://files.pythonhosted.org/packages/5e/26/ef24db0fbfec080b72c5ac4a1000da3a4d696a1e31862c695d683097a1b5/py_multicodec-1.0.0.tar.gz", hash = "sha256:78e4e3e47b6288cf635c3ca987152e6cb5510bdcdab307e7690c76ec3d5bbfeb", size = 44668, upload-time = "2025-12-18T20:41:37.976Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/76/da/768d07490faeae88ac361184164be9c262fececc3c6241b5fc471be4f659/py_multicodec-1.0.0-py3-none-any.whl", hash = "sha256:ae2e687bac8fdf54e3f5b3feded36b61a304d5e3c3af9438f7481f543ec15b8d", size = 26200, upload-time = "2025-12-18T20:41:37.055Z" },
]
[[package]]
name = "py-multihash"
version = "3.0.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "base58" },
{ name = "blake3" },
{ name = "mmh3" },
{ name = "morphys" },
{ name = "six" },
{ name = "varint" },
]
sdist = { url = "https://files.pythonhosted.org/packages/11/3d/ed68b0eccd0654f7f3c163d9b3d428f903e5e3e884ab1f0d0a16ba6a4f11/py_multihash-3.0.0.tar.gz", hash = "sha256:2e848941de5ef0533ca26b81940e2ffcf7b4322a3f803e8c97f4f0eca8767aa7", size = 41630, upload-time = "2025-12-17T19:30:00.596Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/24/e2/d65606db8369916fb5a9b4fe14df7e6072970d919300f3fb1c989a1d8e7d/py_multihash-3.0.0-py3-none-any.whl", hash = "sha256:3863ec1313b4eac1e5169137c143d40bf77456e57388f839441deba089f87326", size = 21215, upload-time = "2025-12-17T19:29:59.322Z" },
]
[[package]]
name = "pyarrow"
version = "22.0.0"
@@ -1516,6 +1835,12 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/3b/ab/b3226f0bd7cdcf710fbede2b3548584366da3b19b5021e74f5bde2a8fa3f/pytest-9.0.2-py3-none-any.whl", hash = "sha256:711ffd45bf766d5264d487b917733b453d917afd2b0ad65223959f59089f875b", size = 374801, upload-time = "2025-12-06T21:30:49.154Z" },
]
[[package]]
name = "python-baseconv"
version = "1.2.2"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/33/d0/9297d7d8dd74767b4d5560d834b30b2fff17d39987c23ed8656f476e0d9b/python-baseconv-1.2.2.tar.gz", hash = "sha256:0539f8bd0464013b05ad62e0a1673f0ac9086c76b43ebf9f833053527cd9931b", size = 4929, upload-time = "2019-04-04T19:28:57.17Z" }
[[package]]
name = "python-dateutil"
version = "2.9.0.post0"
@@ -1843,7 +2168,7 @@ source = { git = "https://github.com/pikers/tomlkit.git?branch=piker_pin#8e0239a
[[package]]
name = "tractor"
version = "0.1.0a6.dev0"
source = { git = "https://github.com/goodboy/tractor.git?branch=piker_pin#e232d9dd06f41b8dca997f0647f2083d27cc34f2" }
source = { editable = "../tractor" }
dependencies = [
{ name = "bidict" },
{ name = "cffi" },
@@ -1856,6 +2181,48 @@ dependencies = [
{ name = "wrapt" },
]
[package.metadata]
requires-dist = [
{ name = "bidict", specifier = ">=0.23.1" },
{ name = "cffi", specifier = ">=1.17.1" },
{ name = "colorlog", specifier = ">=6.8.2,<7" },
{ name = "msgspec", specifier = ">=0.19.0" },
{ name = "pdbp", specifier = ">=1.8.2,<2" },
{ name = "platformdirs", specifier = ">=4.4.0" },
{ name = "tricycle", specifier = ">=0.4.1,<0.5" },
{ name = "trio", specifier = ">0.27" },
{ name = "wrapt", specifier = ">=1.16.0,<2" },
]
[package.metadata.requires-dev]
dev = [
{ name = "greenback", specifier = ">=1.2.1,<2" },
{ name = "pexpect", specifier = ">=4.9.0,<5" },
{ name = "prompt-toolkit", specifier = ">=3.0.50" },
{ name = "psutil", specifier = ">=7.0.0" },
{ name = "pyperclip", specifier = ">=1.9.0" },
{ name = "pytest", specifier = ">=8.3.5" },
{ name = "stackscope", specifier = ">=0.2.2,<0.3" },
{ name = "typing-extensions", specifier = ">=4.14.1" },
{ name = "xonsh", specifier = ">=0.19.2" },
]
devx = [
{ name = "greenback", specifier = ">=1.2.1,<2" },
{ name = "stackscope", specifier = ">=0.2.2,<0.3" },
{ name = "typing-extensions", specifier = ">=4.14.1" },
]
lint = [{ name = "ruff", specifier = ">=0.9.6" }]
repl = [
{ name = "prompt-toolkit", specifier = ">=3.0.50" },
{ name = "psutil", specifier = ">=7.0.0" },
{ name = "pyperclip", specifier = ">=1.9.0" },
{ name = "xonsh", specifier = ">=0.19.2" },
]
testing = [
{ name = "pexpect", specifier = ">=4.9.0,<5" },
{ name = "pytest", specifier = ">=8.3.5" },
]
[[package]]
name = "tricycle"
version = "0.4.1"
@@ -2003,6 +2370,12 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/e4/16/c1fd27e9549f3c4baf1dc9c20c456cd2f822dbf8de9f463824b0c0357e06/uvloop-0.22.1-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:6cde23eeda1a25c75b2e07d39970f3374105d5eafbaab2a4482be82f272d5a5e", size = 4296730, upload-time = "2025-10-16T22:17:00.744Z" },
]
[[package]]
name = "varint"
version = "1.0.2"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/a8/fe/1ea0ba0896dfa47186692655b86db3214c4b7c9e0e76c7b1dc257d101ab1/varint-1.0.2.tar.gz", hash = "sha256:a6ecc02377ac5ee9d65a6a8ad45c9ff1dac8ccee19400a5950fb51d594214ca5", size = 1886, upload-time = "2016-02-24T20:42:38.5Z" }
[[package]]
name = "wcwidth"
version = "0.2.14"
@@ -2095,14 +2468,8 @@ wheels = [
[[package]]
name = "xonsh"
version = "0.20.0"
version = "0.22.1"
source = { registry = "https://pypi.org/simple" }
source = { git = "https://github.com/xonsh/xonsh.git?branch=main#336658ff0919f8d7bb96d581136d37d470a8fe99" }
sdist = { url = "https://files.pythonhosted.org/packages/56/af/7e2ba3885da44cbe03c7ff46f90ea917ba10d91dc74d68604001ea28055f/xonsh-0.20.0.tar.gz", hash = "sha256:d44a50ee9f288ff96bd0456f0a38988ef6d4985637140ea793beeef5ec5d2d38", size = 811907, upload-time = "2025-11-24T07:50:50.847Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/e8/db/1c5c057c0b2a89b8919477726558685720ae0849ea1a98a3803e93550824/xonsh-0.20.0-py311-none-any.whl", hash = "sha256:65d27ba31d558f79010d6c652751449fd3ed4df1f1eda78040a6427fa0a0f03e", size = 646312, upload-time = "2025-11-24T07:50:49.488Z" },
{ url = "https://files.pythonhosted.org/packages/d2/a2/d6f7534f31489a4b8b54bd2a2496248f86f7c21a6a6ce9bfdcdd389fe4e7/xonsh-0.20.0-py312-none-any.whl", hash = "sha256:3148900e67b9c2796bef6f2eda003b0a64d4c6f50a0db23324f786d9e1af9353", size = 646323, upload-time = "2025-11-24T07:50:43.028Z" },
{ url = "https://files.pythonhosted.org/packages/bd/48/bcb1e4d329c3d522bc29b066b0b6ee86938ec392376a29c36fac0ad1c586/xonsh-0.20.0-py313-none-any.whl", hash = "sha256:c83daaf6eb2960180fc5a507459dbdf6c0d6d63e1733c43f4e43db77255c7278", size = 646830, upload-time = "2025-11-24T07:50:45.078Z" },
]
[[package]]
name = "yapic-json"