Compare commits


102 Commits

Author SHA1 Message Date
Tyler Goodlet ee353099e8 Uhh ok, apparently this osenv var is just idiotic?
See https://www.pikers.dev/pikers/piker/issues/50#issuecomment-1041 for
deatz.
2026-01-01 20:29:22 -05:00
momo 6ee0d86548 Merge pull request 'dpi_scaling_round2: a bit of research on `Qt6`'s hiDPI support' (#49) from dpi_scaling_round2 into dpi-font-auto-calc
Reviewed-on: #49
2025-12-22 21:47:57 +00:00
Tyler Goodlet e63cffaf53 Add `.xsh` script mentioned in gitea #50
Note since it's actually `xonsh` code, run it with either,
- most pedantically: `xonsh ./snippets/calc_ppi.xsh`
- or relying on the shebang: `./snippets/calc_ppi.xsh`
  * a sheboom.
2025-12-19 19:29:15 -05:00
Tyler Goodlet 49bedb4912 Reorder imports in `qt_screen_info.py` ??
For wtv reason on nixos importing `pyqtgraph` first is causing `numpy`
to fail import?? No idea, but likely something to do with the recent
`flake.nix` ld-lib-linking with `<nixpkgs>` malarkey?
2025-12-18 18:24:19 -05:00
Tyler Goodlet 35744f666f Add some Qt DPI extras to `qt_screen_info.py`
- set `QT_USE_PHYSICAL_DPI='1'` env var for Qt6 high-DPI
  * we likely want to do this in `piker.ui` as well!
- move `pxr` calc from widget to per-screen in loop.
- add `unscaled_size` calc using `pxr * size`.
- switch from `.availableGeometry()` to `.geometry()` for full
  bounds.
- shorten output labels, add `!r` repr formatting.
- add a Qt6 DPI rounding-policy TODO with doc links.

(this patch was generated in some part by [`claude-code`][claude-code-gh])
[claude-code-gh]: https://github.com/anthropics/claude-code
2025-12-18 18:19:50 -05:00
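A minimal sketch of the per-screen calcs described above; the env var name comes from the commit message, the rest is standard Qt6 API, so treat this as an approximation rather than the actual `qt_screen_info.py`::

    import os
    # must be set before the Qt libs initialize
    os.environ['QT_USE_PHYSICAL_DPI'] = '1'

    from PyQt6.QtWidgets import QApplication

    app = QApplication([])
    for screen in app.screens():
        pxr: float = screen.devicePixelRatio()
        size = screen.geometry().size()  # full bounds vs. `.availableGeometry()`
        unscaled_size = size * pxr
        print(f'{screen.name()!r}: pxr={pxr!r} unscaled={unscaled_size!r}')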
Tyler Goodlet 6f9b7320e5 Re-fmt and `.info()` the `.configure_to_dpi()` DPI calcs for now 2025-12-18 18:19:50 -05:00
di1ara 47dbed1229 fixed spacing 2025-12-08 22:22:31 -05:00
di1ara 0563c91b33 fixed pytest test for dpi font auto calculation 2025-12-08 22:16:59 -05:00
di1ara 7ff4f5d059 added pytest, moved dependencies 2025-12-08 15:29:56 -05:00
di1ara c6c768eb77 fix DpiAwareFont default size calculation 2025-12-03 16:50:10 -05:00
Tyler Goodlet 3751140fca ib: bump `docker/ib/README.rst`
For the new github image, a high-level look at its basic
features/usage/docs and prosing around our expected default usage with
the `piker.brokers.ib` backend.
2025-10-02 22:12:56 -04:00
Tyler Goodlet 588569edb3 ib.feed: better no-bars error-log message format 2025-10-02 20:52:01 -04:00
Tyler Goodlet 8a5bb688af binance: set `Pair.pegInstructionsAllowed = False`
Lol, a cheeky unforeseen bug due to TOML's lack of a null type and
thinking i can render an `Optional` field on a `msgspec.Struct`
(defaulted to `None`) into the `binance.symcache.toml` cache file..

I didn't catch this when i first updated to the 3.1 API in f7caa75228
because i never did a cache-files flush.. lesson learned and we **really
need tests for this**!!
2025-10-02 20:08:56 -04:00
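For context, a hedged sketch of the gotcha (not piker's actual `Pair` definition; `tomlkit` is an assumed TOML writer, any impl hits the same limitation): TOML has no null type, so a `None`-defaulted `msgspec.Struct` field can't round-trip through the cache file::

    import msgspec
    import tomlkit  # assumed writer for illustration

    class Pair(msgspec.Struct):
        symbol: str
        # an `Optional[bool] = None` default can't be rendered to TOML..
        pegInstructionsAllowed: bool = False  # ..so default to a real bool

    doc = {'mkt': msgspec.structs.asdict(Pair(symbol='btcusdt'))}
    print(tomlkit.dumps(doc))  # every field now has a TOML-representable value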
Tyler Goodlet 513ced6a70 Wow, update root `conf.toml` to new multiaddr style
I don't know how this wasn't already committed but.. drops the legacy
`marketstore` tsdb socket info vars since we're going all in on
`nativedb` BP
2025-10-02 20:07:23 -04:00
Tyler Goodlet f2ae3b0e2e `accounting.calc`: enable crash handlers on `debug_mode` input (via test harness) 2025-09-29 15:14:35 -04:00
Tyler Goodlet 56b660fe34 Draft a gt-one-`.fqme`-in-txns/account-file test
To start this is just a shell for the test, there's no checking logic
yet.. put it as `test_accounting.test_ib_account_with_duplicated_mktids()`.
The test is composed for now to be completely runtime-free using only
the offline txn-ledger / symcache / account loading APIs, ideally we
fill in the activated symbology-data-runtime cases once we figure a sane
way to handle incremental symcache updates for backends like IB..

To actually fill the test out with real checks we still need to,
- extract the problem account file from my ib.algopape into the test
  harness data.
- pick some contracts with multiple fqmes despite a single bs_mktid and
  ensure they're aggregated as a single `Position` as well as,
  * ideally de-duplicating txns from the account file section for the
    mkt..
  * warning appropriately about greater-than-one fqme for the bs_mktid
    and providing a way for the ledger re-writing to choose the
    appropriate `<venue>` as the "primary" when the
    data-symbology-runtime is up and possibly use it to incrementally
    update the IB symcache and store offline for next use?
2025-09-29 15:02:50 -04:00
Tyler Goodlet 6eced8ca67 `data._symcache`, impl a summary `.__repr__()`, avoids `Asset` causality issues 2025-09-29 15:00:14 -04:00
Tyler Goodlet 3eb1bf8248 Use `pytest` plugin now exposed by `tractor` 2025-09-29 14:36:55 -04:00
Tyler Goodlet e007163816 Avoid `msgspec` eval-err on `Asset` in symcache? 2025-09-29 13:44:57 -04:00
Tyler Goodlet e14008701c Drop `open_pps()` from ems tests 2025-09-29 13:33:03 -04:00
Tyler Goodlet 8bb5c1bf96 `ui._remote_ctl`: shield remote rect removals
Since under `trio`-cancellation the `.remove()` is a checkpoint and will
be masked by a taskc AND we **always want to remove the rect** despite
the surrounding teardown conditions.
2025-09-29 13:26:11 -04:00
Tyler Goodlet 0462415491 `_ems`: tolerate and warn on already popped execs
In the `translate_and_relay_brokerd_events()` loop task that is, such
that we never crash on a `status_msg = book._active.pop(oid)` in the
'closed' status handler whenever a double removal happens.

Turns out there were unforeseen races here when a benign backend error
would cause an order-mode dialog to be cancelled (incorrectly) and then
a UI side `.on_cancel()` would trigger too-early removal from the
`book._active` table despite the backend sending an actual 'closed'
event (much) later, this would crash on the now missing entry..

So instead we now,
- obviously use `book._active.pop(oid, None)`
- emit a `log.warning()` (not info lol) on a null-read and with a less
  "one-line-y" message explaining the double removal and maybe *why*.
2025-09-29 13:21:11 -04:00
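A minimal sketch of the tolerant-pop described above; `book`, `oid` and `log` stand in for the real `_ems` names::

    status_msg = book._active.pop(oid, None)
    if status_msg is None:
        # double removal: the UI-side `.on_cancel()` likely raced a benign
        # backend error before the (much later) real 'closed' event..
        log.warning(
            f'Status-msg for order {oid!r} was already removed!\n'
            'Presuming a prior UI-side cancel raced this "closed" event?\n'
        )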
Tyler Goodlet 62f27bf509 `polars.cumsum()` is now `.cum_sum()` 2025-09-27 12:24:11 -04:00
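The rename in practice, assuming a recent `polars`::

    import polars as pl

    df = pl.DataFrame({'x': [1, 2, 3]})
    out = df.select(pl.col('x').cum_sum())  # previously `.cumsum()`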
Tyler Goodlet 3f48098c55 ui.order_mode: prioritize mkt-match on `.bs_mktid`
For backends which opt to set the new `BrokerdPosition.bs_mktid` field,
give (matching logic) priority to it such that even if the `.symbol`
field doesn't match the mkt currently focussed on chart, it will
always match on a provider's own internal asset-mapping-id. The original
fallback logic for `.fqme` matching is left as is.

As an example with IB, a qqq.nasdaq.ib txn may have been filled on
a non-primary venue as qqq.directedea.ib, in this case if the mkt is
displayed and focused on chart we want the **entire position info** to
be overlayed by the `OrderMode` UX without discrepancy.

Other refinements,
- improve logging and add a detailed edge-case-comment around the
  `.on_fill()` handler to clarify where if a benign 'error' msg is
  relayed from a backend it will cause the UI to operate as though the
  order **was not-cleared/cancelled** since the `.on_cancel()` handler
  will have likely been called just before, popping the `.dialogs`
  entry. Return `bool` to indicate whether the UI removed-lines
  / added-fill-arrows.
- invert the `return` branching logic in `.on_cancel()` to reduce
  indent.
- add a very loud `log.error()` in `Status(resp='error')` case-block
  ensuring the console yells about the order being cancelled, also
  a todo for the weird msg-field recursion nonsense..
2025-09-27 11:55:35 -04:00
Tyler Goodlet ad3fe65bd9 Set `.bs_mktid` on all IB position-msg emissions.. 2025-09-26 17:44:06 -04:00
Tyler Goodlet 9ea857298c Add an optional `BrokerdPosition.bs_mktid` field
Such that backends can deliver their own internal unique
`MktPair.bs_mktid` when they can't seem to get it right via the
`.fqme: str` export.. (COUGH ib, you piece of sh#$).

Also add todo for possibly replacing the msg with a `Position.summary()`
"snapshot" as a better and more rigorously generated wire-ready msg.
2025-09-26 17:38:22 -04:00
Tyler Goodlet b0f273f091 Don't override `Account.pps: dict` entries..
Despite a `.bs_mktid` ideally being a bijection with `MktPair.fqme`
values, apparently some backends (cough IB) will switch the `.<venue>`
part in txn records, resulting in multiple account-conf-file sections for
the same dst asset. Obviously that means we can't allocate new
`Position` entries keyed by that `bs_mktid`; instead be sure to **update
them**!

Deats,
- add case logic to avoid pp overwrites using a `pp_objs.get()` check.
- warn on duplicated pos entries whenever the current account-file
  entry's `mkt` doesn't match the pre-existing position's.
- mk `Position.add_clear()` return a `bool` indicating if the record was
  newly added, warn when it was already existing/added prior.

Also,
- drop the already deprecated `open_pps()`, also from sub-pkg exports.
- draft TODO for `Position.summary()` idea as a replacement for
  `BrokerdPosition`-msgs.
2025-09-26 15:17:41 -04:00
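A hedged sketch of the update-not-overwrite flow; the names (`pp_objs`, `bs_mktid`, `mkt`, `txns`, `Position`) mirror the message, not necessarily the exact impl::

    pp = pp_objs.get(bs_mktid)
    if pp is None:
        pp = pp_objs[bs_mktid] = Position(mkt=mkt)
    elif pp.mkt != mkt:
        log.warning(
            f'Duplicate pos entry for {bs_mktid!r}:\n'
            f'{pp.mkt.fqme!r} vs. this account-file section {mkt.fqme!r}\n'
        )

    for txn in txns:
        if not pp.add_clear(txn):  # now returns a `bool`: newly added?
            log.warning(f'Txn was already added:\n{txn}')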
Tyler Goodlet 6cc3518143 Bump lock file after vnc client change 2025-09-26 13:25:49 -04:00
Tyler Goodlet e265a98456 Switch to `pyvnc` for IB reset hackz
It actually works for vncAuth(2) (thank god!) which the previous
`asyncvnc` **did not**, and seems to be mostly based on the work
from the `asyncvnc` author anyway (so all my past efforts don't seem to
have been in vain XD).

Deats,
- switch to `pyvnc` async API (using `asyncio` again obvi) in
  `.ib._util._vnc_click_hack()`.
- add `pyvnc` as src installed dep from GH.
- drop `asyncvnc` as dep.

Other,
- update `pytest` version range to avoid weird auto-load plugin exposed
  by `xonsh`?
- add a `tool.pytest.ini_options` to project file with vars to,
  - disable that^ `xonsh` plug using `addopts = '-p no:xonsh'`.
  - set a `testpaths` to avoid running anything but that subdir.
  - try out the `'progress'` style console output (does it work?).
2025-09-26 13:25:19 -04:00
Tyler Goodlet 4f8dc7693b Convert remaining `.to_asyncio.open_channel_from()` to `chan` fn-sig usage 2025-09-22 12:58:23 -04:00
Tyler Goodlet 40dca34fde Flip screen-info script to qt6, refine it to heck.
Buncha updates and improvements,
- adjust sub-namespace imports according to console warnings.
- iterate all detected screens in a loop and instead report which is the
  primary and the current.
- type annotate all vars where non-obvious, particularly the `Qt` refs.
2025-09-22 09:37:32 -04:00
Tyler Goodlet db77d7ab29 Use gitea for `tractor` repo endpoint 2025-09-22 06:50:58 -04:00
Tyler Goodlet 8c274efd18 `ib.feed`: finally solve `push()` exc propagation
Such that if/when the `push()` ticker callback (closure) errors
internally, we actually eventually bubble the error out-and-up from the
`asyncio.Task` and from there out the `.to_asyncio.open_channel_from()` to
the parent `trio.Task`..

It ended up being much more subtle to solve than i would have liked
thanks to,

- whatever `Ticker.updateEvent.connect()` does behind the scenes in
  terms of (clearly) swallowing with only log reporting any exc raised
  in the registered callback (in our case `push()`),

- `asyncio.Task.set_exception()` never working and instead needing to
  resort to `Task.cancel()`, catching `CancelledError` and re-raising
  the stashed `maybe_exc` from `push()` when set..

Further this ports `.to_asyncio.open_channel_from()` usage to use
the new `chan: tractor.to_asyncio.LinkedTaskChannel` fn-sig API, namely
for `_setup_quote_stream()` task. Requires the latest `tractor` updates
to the inter-eventloop-chan iface providing a `.set_nowait()` and
`.get()` for the `asyncio`-side.

Impl deats within `_setup_quote_stream()`,
- implement `push()` error-bubbling by adding a `maybe_exc` which can be
  set by that callback itself or by its registering task; when set it is
  both,
  * reported on by the `teardown()` cb,
  * re-raised by the terminated (via `.cancel()`) `asyncio.Task` after
    woken from its sleep, aka "cancelled" (since that's apparently one
    of the only options.. see big rant further todo comments).
- add explicit error-tolerance-tuning via a `handler_tries: int` counter
  and `tries_before_raise: int` limit such that we only bubble
  a `push()` raised exc once enough tries have consecutively failed.
- as mentioned, use the new `chan` fn-sig support and thus the new
  method API for `asyncio` -> `trio` comms.
- a big TODO XXX around the need to use a better sys for terminating
  `asyncio.Task`s whether it's by delegating to some `.to_asyncio`
  internals after a factor-out OR by potentially going full bore `anyio`
  throughout `.to_asyncio`'s impl in general..
- mk `teardown()` use appropriate `log.<level>()`s based on outcome.

Surroundingly,
- add a ton of doc-strings to mod fns previously missing them.
- improved / added-new comments to `wait_on_data_reset()` internals and
  anything changed per ^above.
2025-09-21 22:38:05 -04:00
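A stripped-down sketch of the `maybe_exc` bubbling pattern under the stated assumptions; the names follow the message but this is not the actual `_setup_quote_stream()` body::

    import asyncio

    async def relay_until_cancelled() -> None:
        maybe_exc: BaseException|None = None
        task = asyncio.current_task()

        def push(ticker) -> None:
            nonlocal maybe_exc
            try:
                ...  # normalize and relay the quote to the `trio` side
            except BaseException as exc:
                # the event-cb machinery swallows raises, so stash the
                # exc and cancel ourselves to wake the parked task below
                maybe_exc = exc
                task.cancel()

        # ticker.updateEvent.connect(push)  # registration per `ib_insync`
        try:
            await asyncio.sleep(float('inf'))  # park until cancelled
        except asyncio.CancelledError:
            if maybe_exc is not None:
                raise maybe_exc  # bubble instead of swallowing the taskc
            raise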
Tyler Goodlet 0b123c9af9 `ib`: various type-annot, multiline styling and todos updates 2025-09-21 16:05:50 -04:00
Tyler Goodlet d17160519e `.ui._search`: collapse EGs as needed, use `tn` naming. 2025-09-21 12:02:04 -04:00
Tyler Goodlet 5bc7e4c9b6 Bump lock file with `tractor` piker pinned branch 2025-09-21 11:26:49 -04:00
Tyler Goodlet d35e1e5c67 Port `.data._web_bs` stuff to strict-EGs
Using `tractor.trionics.collapse_eg()` as needed and doing
some renames, in similar style as elsewhere:
- `pcs` -> `rent_cs`,
- `n` -> `tn` for nursery handles,

Also,
- tweak the `._reconnect_forever()` while loop to use the
  (also) `trio`-internal
  `mc_state: trio._channel.MemoryChannelState = snd._state` instead
  of `snd._closed` to poll for open send/receive consumer task counts
  since,
    1. it seems more reliable than using `snd._closed`,
    2. there's no other way to access the info.. afaik?

- handle `ConnectionRejected` explicitly alongside handshake-errs as
  a retry case.
- add a base-exc handler which `.exception()` reports the reconnect
  attempt failure explicitly.
- drop some lingering `Optional` usage.
2025-09-21 11:25:10 -04:00
Tyler Goodlet d4c10b2b0f Use `tractor`'s updated `piker_pin` branch (again)
Instead of the insignificantly named dev branch from recent `trio`
/ py3.13 updates work; it makes more sense to keep a dedicated pin (as
we have prior) for the moment. Also re-org the masked @goodboy dev-env
lines + comments to bottom of file.
2025-09-21 10:59:42 -04:00
Tyler Goodlet 46285a601e Port `.cli` & `.service` to latest `tractor` registry APIs
Namely changes for the `registry_addrs: list`, `enable_transports: list`
and related `tractor._addr` primitive requirements.

Other updates include,
- passing `maybe_enable_greenback=True`,
- additional exc logging around `pikerd` syncing/booting,
- changing to newer `Context.wait_for_result()`,
- dropping (unnecessary?) `maybe_open_crash_handler()` around `pikerd` ep.
2025-09-20 22:38:47 -04:00
Tyler Goodlet f9610c9e26 Bump to WIP "piker pin" `tractor` dev branch, with lock file 2025-09-20 22:36:53 -04:00
Tyler Goodlet 9d5e405903 binance; unmask around send-chan @acm usage 2025-09-20 22:32:05 -04:00
Tyler Goodlet e19a724037 ib: add venue-hours checking
Such that we can avoid other (pretty unreliable) "alternative" checks to
determine whether a real-time quote should be waited on or (when venue
is closed) we should just signal that historical backfilling can
commence immediately.

This has been a todo for a very long time and it turned out to be much
easier to accomplish than anticipated..

Deats,
- add a new `is_current_time_in_range()` dt range checker to predicate
  whether an input range contains `datetime.now(start_dt.tzinfo)`.
- in `.ib.feed.stream_quotes()` add a `venue_is_open: bool` which uses
  all of the new ^^ to determine whether to branch for the
  short-circuit-and-do-history-now case or the std
  real-time-quotes-should-be-awaited-since-venue-is-open case; drop all
  the old hacks trying to work around not figuring out that venue state
  stuff..

Other,
- also add a gpt5-composed parser to `._util` for the
  `ib_insync.ContractDetails.tradingHours: str`, from before i realized
  there was a `.tradingSessions` property XD
- in `.ib.feed`,
  * add various EG-collapsings per recent tractor/trio updates.
  * better logging / exc-handling around ticker quote pushes.
  * stop clearing `Ticker.ticks` each quote iteration; not sure if this
    is needed/correct tho?
  * add masked `Ticker.ticks` poll loop that logs.
- fix some `str.format()` usage in `._util.try_xdo_manual()`
2025-09-20 22:13:59 -04:00
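A plausible shape for the new predicate given the description above (the real impl may differ)::

    from datetime import datetime

    def is_current_time_in_range(
        start_dt: datetime,
        end_dt: datetime,
    ) -> bool:
        # use the range's own tz so "now" is directly comparable
        now = datetime.now(start_dt.tzinfo)
        return start_dt <= now <= end_dt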
Tyler Goodlet 390a57c96d ib: never relay "Warning:" errors to EMS..
You'd think they could be bothered to make either a "log" or "warning"
msg type instead of a `type='error'`.. but alas, this attempts to detect
all such "warning"-errors and never proxy them to the clearing engine
thus avoiding the cancellation of any associated (by `reqid`)
pre-existing orders (control dialogs).

Also update all surrounding log messages to a more multiline style.
2025-09-17 18:54:47 -04:00
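A sketch of the warning-error guard; the `dict`-msg field names here are assumptions, only the `type='error'` plus "Warning:" detail comes from the message::

    def is_warning_error(err: dict) -> bool:
        # IB delivers warnings as `type='error'` msgs whose text
        # carries a "Warning:" prefix..
        return (
            err.get('type') == 'error'
            and 'Warning:' in err.get('message', '')
        )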
Tyler Goodlet 69eac7bb15 Spurious first-draft of EG collapsing
Topically, throughout various (seemingly) console-UX-affecting or benign
spots in the code base; nothing that required intervention beyond the
superficial. A few spots also include `trio.Nursery` ref renames
(always to something with a `tn` in it) and log-level reductions to
quiet (benign) console noise oriented around issues meant to be solved
long..

Note there's still a couple spots i left with the loose-ify flag because
i haven't fully tested them without using the latest version of
`tractor.trionics.collapse_eg()`, but more than likely they should flip
over fine.
2025-09-15 19:27:56 -04:00
Tyler Goodlet a45de0b710 ib-related: cope with invalid txn timestamps
That is, inside the embedded `.accounting.calc.dyn_parse_to_dt()` closure,
add an optional `_invalid: list` param to which we can report
bad-timestamped records; these we instead override and return as
`from_timestamp(0.)` (when the parser loop falls through) and report
later (in summary) from the `.accounting.calc.iter_by_dt()` caller.
Also add some logging and an optional debug block for future tracing.
2025-09-15 18:29:19 -04:00
Tyler Goodlet 9df1988aa6 ib: jig `.data_reset_hack()` with vnc-client failover
Since apparently porting to the new docker container enforces using
a vnc password and `asyncvnc` seems to have a bug/mis-config whenever
i've tried a pw over a wg tunnel..?

Soo, this tries out the old `i3ipc`-win-focus + `xdo` click hack when
the above fails.

Deats,
- add a mod-level `try_xdo_manual()` to wrap calling
  `i3ipc_xdotool_manual_click_hack()` with an oserr handler, ensure we
  don't bother trying if `i3ipc` import fails beforehand tho.
- call ^ from both the orig case block and the failover from the
  vnc-client case.
- factor the `no_setup_msg: str` out to mod level and expect it to be
  `.format()`-ed.
- refresh todo around `asyncvnc` pw ish..
- add a new `i3ipc_fin_wins_titled()` window-title scanner which
  predicates input `titles` and delivers any matches alongside the orig
  focused win at call time.
- tweak `i3ipc_xdotool_manual_click_hack()` to call ^ and remove prior
  unfactored window scanning logic.
2025-09-15 16:53:25 -04:00
Tyler Goodlet f7caa75228 Add fix for binance API 3.1 rollout..
See https://developers.binance.com/docs/binance-spot-api-docs#2025-08-26
2025-08-27 23:00:25 -04:00
Tyler Goodlet e9613e46f6 Mk a `notes_to_self/`, move orig file `ideas.rst` 2025-07-21 21:26:39 -04:00
Tyler Goodlet 6637ca9e4f Drop old/masked ahab-docker daemon starting 2025-07-18 19:35:54 -04:00
Tyler Goodlet 7e139e6a8e Add `pyperclip` dep for goodboy's xonsh-clipboard needs Bp 2025-06-26 11:40:28 -04:00
Tyler Goodlet c2d9283db4 Try running daemons on UDS tpt
The root daemon, pikerd, needs to be adjusted to use diff default
registry addrs to also utilize non-TCP, but for now this gets us started
testing; so far so good B)
2025-06-26 11:38:04 -04:00
Tyler Goodlet 28ba1392bb Adjust feed status fields/display-pane to new actor-ID
That is to use the new `tractor.msg.types.Aid` struct to pull the
`brokerd` info from the `tractor.Channel.aid: Aid` attr as well as more
generally handling the new `Channel.raddr.proto_key: str` and no longer
assuming a TCP IPC transport; this per the recent `tractor.ipc`
subsys which adds multi-IPC-transports!

Downstream tweaks to match,
- use an "opt-in" field set to display in the `brokerd` info pane in
  `.ui._feedstatus.mk_feed_label()`.
 |_ also add some todos and drop some seemingly unneeded form sizing
    calcs?
- tweak `.ui._label` to allow not using markdown, though ended up not
  doing that since it looked too plain..
2025-06-26 11:24:32 -04:00
Tyler Goodlet f50202a6af Adjust to `trio`'s strict eg nurseries throughout!
Using `tractor.trionics.collapse_eg()` as needed to avoid, at the least,
crash-worthy (in debug-mode REPL-ing terms) nested cancellation egs that
exhibit on SIGINT/ctl-c of each "app" (chart & daemon).

Also a bit of renaming of all `trio.Nursery`s to `tn`, the new "task
nursery" shorthand-var-name being used in all our other `tractor`
related projects.
2025-06-26 11:07:56 -04:00
Tyler Goodlet baff466ee0 kraken: add crash-handling around `Pair()` init
Since it can otherwise be difficult to debug due to nursery cancellation
(we need that taskman yo!).
2025-06-26 10:51:03 -04:00
Tyler Goodlet b01edcf65a kraken: `Pair.costmin` is now optional?
Some pairs don't seem to define it but it's not listed as deprecated on
the official API page (a new one is now linked in the type def's doc string).
2025-06-26 10:49:39 -04:00
Tyler Goodlet 2545def7bb Start a manual `tags` file for internal refs 2025-06-20 16:00:14 -04:00
Tyler Goodlet 1b74417688 Flip to non-git `msgspec`, update `bidict`, link to "sdof" `tractor` dev branch 2025-06-10 14:25:21 -04:00
Tyler Goodlet 4d4f5d0af5 Fix readme to `uv sync`.. link to astral docs 2025-06-10 14:22:58 -04:00
Tyler Goodlet 7e82bf0729 Support python 3.13 !!
Luckily all core deps are already ported so this was pretty easy!

B)

I've opted (via `tool.uv` settings) to prefer the user's system
(installed) python distro and disable auto-download of astral's
distros for now since I recently hit some strange silent core dumping
(`brokerd` actors just disappearing..)
with their binaries; an introspect showed it seemingly to do with
pthreading in cpython internals? We can figure out how to
better accommodate users with the opposite pref later, presumably
non-opinionated-linux hackers?

Core pkg upgrades of note,
- manually re-pinned most numerics libs including `numpy`, `numba`,
  `pyarrow`.
- for AOT ext-libs (thanks to `uv.lock` being so detailed), new
  `cython`, `llvmlite`, `cffi`, `rapidfuzz`, `uvloop`, `wrapt` and
  `PyQt6` wheels pulled in.
- `cryptofeed` did a required bump to `2.4.0` it looks like, which also
  required the above (and notable?) `cffi` update.
2025-06-10 13:12:38 -04:00
Tyler Goodlet f1b4550483 Flip to latest `tractor` @ `branch = main` deps
Namely requiring a `trio` that supports py3.13, so "trio >=0.27".
Unfortunately this brings in strict egs and drops various `trio`-related
sub-deps we also import in `piker`, like `trio-typing`. So there's a few
"rough edges", mostly todo with the REPL activating on graceful cancels
(SIGINT) of `piker` CLIs atm - due to the new strict-egs in recent
`trio`, but nothing we can't work out pretty quickly i'd imagine with
the new `tractor.collapse_eg()` stacker.

Note that we're pinning to `tractor`'s main branch for the moment since
it should be "stable" vs. the `repl_fixture` branch i'm likely running locally Bp
2025-06-09 20:13:01 -04:00
Tyler Goodlet bdaf74a19a Add a couple new grays to the palette 2025-06-09 10:43:52 -04:00
Tyler Goodlet b87ca76700 Bump to (latest) `polars`, the `0.20.6x` series B)
Since I was trying out the neat lookin `polars-fuzzy-match` (also added
for now as a core dep here) which requires the new plugin sys, plus it's
about time we synced with upstream!

Adjust some column syntax to the new `.name` sub-field-space and the
`uv` lock-file to match.

Other,
- add back `trio-typing` bc i guess something else needs it (debug
  tooling stuff in new `tractor`?)
- flip back to the `tractor` pre-main pin since the new `main`-branch
  requires new `trio` stuff we haven't ported yet..
2025-06-09 10:40:24 -04:00
Tyler Goodlet 94caa248e7 TO-CHERRY: another sampler EoC suppression case?
Not sure whether this should get cherried onto `stop_is_eoc` or
`brokers_refinery` (need to do a diff between the two first) but seems like
this might be the final resiliency update? 🙏
2025-06-09 10:27:01 -04:00
Tyler Goodlet da953b6b0c Port to newer `tractor.get_registry()` 2025-06-09 10:18:08 -04:00
Tyler Goodlet fb8375f608 deribit: fill out docstr for `.api.get_values_from_cb_normalized_date()` 2025-06-09 10:17:36 -04:00
Tyler Goodlet d5faf4f59d binance: add new `permissionSets` to base `Pair` 2025-06-09 10:16:41 -04:00
Tyler Goodlet df5e72f7ae max_pain-script: bit of multi-line fmting 2025-06-09 10:11:10 -04:00
Tyler Goodlet bf33cb93b1 Fix type-check assertion in ems test to use `is` 2025-04-24 12:53:32 -04:00
Tyler Goodlet d655e81290 max_pain: add piker logging, tweak var names, notes and todos 2025-04-24 12:15:26 -04:00
Tyler Goodlet bc72e3d206 Drop unused `assets: dict` 2025-04-24 11:34:32 -04:00
Tyler Goodlet 35cb538a69 Update `binance` spot pairs with `amendAllowed`
As per API updates,
https://developers.binance.com/docs/binance-spot-api-docs
https://developers.binance.com/docs/binance-spot-api-docs/faqs/order_amend_keep_priority

I also slightly tweaked the field-mismatch exception note to include the
`repr(pair_type)` so the dev can know which pair types should be
changed.
2025-04-24 10:37:52 -04:00
Tyler Goodlet 8a768af5bb Update legacy type to `tractor.MsgStream` 2025-04-24 10:37:33 -04:00
Tyler Goodlet 8b0fac3b6c TOSQUASH: 84ad34f51, one more `float` cast for paperboi.. 2025-04-22 22:29:12 -04:00
Tyler Goodlet 36cc0cf750 TOSQUASH: 84ad34f51, lingering `float` casts.. 2025-04-22 00:20:48 -04:00
Tyler Goodlet 3ff0a86741 Gracefully close on EoCs thrown in quote throttler
Since `tractor.MsgStream.send()` now also raises `trio.EndOfChannel`
when the stream was gracefully `Stop`ped by the peer side, also handle
that case in `.data._sampling.uniform_rate_send()`.
2025-04-21 21:31:13 -04:00
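The handling described above, sketched as a simplification of `uniform_rate_send()`::

    import trio
    import tractor

    async def throttled_send(
        stream: tractor.MsgStream,
        quotes,
    ) -> None:
        async for quote in quotes:
            try:
                await stream.send(quote)
            except trio.EndOfChannel:
                # the peer gracefully `Stop`ped the stream; exit quietly
                return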
Tyler Goodlet 705f0e86ac Drop variable regex from `ruff.toml`
Same as in other projects, seems to be not parsing and causing `ruff` to
crash?!?
2025-04-21 21:22:34 -04:00
Tyler Goodlet 2a24d1d50c `.kraken`: add masked pauses for order req debug
Such that the next time i inevitably must debug some order-request
error status or precision discrepancy, i have the mkt-symbol branch
ready to go. Also, switch to `'action': 'buy'|'sell' as action,` style
`case` matching instead of the post-`if` predicate style.
2025-04-21 21:21:03 -04:00
Tyler Goodlet 84ad34f51e Cast to `float` as needed from order-mode and ems
Since we're not quite yet using automatic typed msging from
`tractor`/`msgspec` (i.e. still manually decoding order ctl msgs from
built-in types..`dict`s still not `msgspec.Struct`) this adds the
appropriate typecasting ops to ensure the required precision is attained
prior to processing and/or submission to a brokerd backend service.

For the `.clearing._ems`,
- flip all `trigger_price` previously presumed to be `float` to just
  the field-identical `price: Decimal` and ensure we cast to `float`
  for any `trigger_price` usage, like before passing to `mk_check()`.

For `.ui.order_mode.OrderMode`,
- add a new `.curr_mkt: MktPair` convenience property to get the
  chart-active value.
- ensure we always use the `.curr_mkt.quantize() -> Decimal` before
  setting any IPC-msg's `.price` field!
- always cast `float(Order.price)` before use in setting line-levels.
- don't bother setting `Order.symbol` to a (now fully removed) `Symbol`
  instance since it's not really required-for-use anywhere; leaving it
  a `str` (per the type-annot) is fine for now?
2025-04-21 20:36:28 -04:00
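The cast discipline in miniature; `tick_size` and the helper name are hypothetical, only the `Decimal`-on-the-wire / `float`-for-graphics split is per the message::

    from decimal import Decimal

    def quantize_for_wire(
        ui_price: float,
        tick_size: Decimal,  # e.g. Decimal('0.01')
    ) -> tuple[Decimal, float]:
        # quantize to mkt precision for the IPC/order msg..
        wire_price: Decimal = Decimal(str(ui_price)).quantize(tick_size)
        # ..but cast back to `float` for line-graphics levels.
        return wire_price, float(wire_price)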
Tyler Goodlet cbbf674737 Finally drop `Symbol`
It was replaced by `MktPair` long ago in,
https://github.com/pikers/piker/pull/489

with follow up for final removal in,
https://github.com/pikers/piker/issues/517

Resolves #517
2025-04-21 13:34:12 -04:00
Tyler Goodlet ec71dc2018 Mk `Brokerd[Order].price` avoid `float`-errs
By re-typing to a `.price: Decimal` field on both legs of the EMS.

It seems we must do it ourselves since,
- these msg's (fields) are relayed through the clearing engine to each
  `brokerd` backend and,
- bc many (if not all) of those backends `.broker`-clients (nor their
  encapsulated "brokerage services") **are not** doing any
  precision-truncation themselves.

So, for now, instead we opt to expect rounding at the source. This means
we will explicitly require casting to/from `float` at the line-graphics
interface to the order-clearing-engine (as implemented throughout
`.ui.order_mode.OrderMode`); and this is coming shortly.
2025-04-21 13:24:41 -04:00
Tyler Goodlet 17aebf44a9 Add note to `.brokers.ib.api` about removing `_bar_load_dtype` 2025-03-28 18:33:38 -04:00
Tyler Goodlet 5f347c9f6a Show in readme how to install GUIs with `--extra` flag 2025-03-28 18:32:53 -04:00
Tyler Goodlet cdb41e4881 Add some notes about using multi-line strings instead of `print()`s 2025-03-07 19:23:22 -05:00
Tyler Goodlet 289b63bb2a Line limit tweaks for reading in slim `vsplit`s Bp 2025-03-07 19:22:26 -05:00
Nelson Torres 8f1e082c91 Add write_oi for open interest
The `storage.nativedb` mod manages the `write_parquet` file; this is
a rudimentary way to write to file using parquet, meant just for
development purposes.

Add comments in max_pain to track the changes.
2025-03-04 19:30:11 -03:00
Nelson Torres b9321dbb49 Add Plot
Here is the `plot_graph()` that is in charge of the bar, scatter and vertical-line plot items.

Also all the necessary code for the graph to be shown.
2025-02-24 17:33:32 -03:00
Nelson Torres 21d051b05f Extract logic from get_max_pain()
All the max pain math is now in these two functions:

- `get_total_intrinsic_values()`: calculates the total value for all strike_prices and stores them in a `dict[str, Decimal]`.

- `get_intrinsic_value_and_max_pain()`: given the `intrinsic_values` dict, returns the `max_pain` strike price and the `total_intrinsic_value` for that `strike_price`.
2025-02-24 17:33:32 -03:00
Nelson Torres 3118d0f140 Max pain daemon:
- To calculate the `max_pain` we first need an expiration date;
`get_expiration_dates()` retrieves them, the user then enters one of
those shown, and using the selected expiry_date with `get_instruments()`
we are good to build the (important!) `oi_by_strikes`.

- Add `update_oi_by_strikes()`.

- Add `check_if_complete()`.

- `get_max_pain()`: here's where all the action takes place; the
`oi_by_strikes` must be complete to start the calculations.

- Use `maybe_open_oi_feed` to open an oi_feed.

- Add `max_pain_readme.rst`
2025-02-24 17:32:24 -03:00
Nelson Torres 4278d8e2f1 Deribit api key changes introduce:
- `get_timestamp_int`: added; this is the hack so we can avoid using the custom deribit date format.

- `get_currencies`: added so we can get all of deribit's available currencies.

- `get_instruments`: for a specific expiration date, returns a list of `cryptofeed.Symbol`s.

- `get_expiration_dates`: expiration dates available for btc's option contracts.

- `get_strikes_dict`: all the strike prices for a specific expiration date.

- `aio_open_interest_feed_relay`, `open_oi_feed`, `maybe_open_oi_feed`: these three handle all the portal stuff and the cryptofeed callbacks for the open interest and trades; for some reason it needs both to work, i need to check that out at some point.

- Also a couple of format fixes.
2025-02-24 17:32:24 -03:00
Tyler Goodlet b209512eb6 Add `.log.mk_repr()` to create `reprlib.Repr`s 2025-02-24 17:20:50 -03:00
Nelson Torres 8a9d21468a Deribit api key changes introduce:
- `get_timestamp_int`: added; this is the hack so we can avoid using the custom deribit date format.

- `get_currencies`: added so we can get all of deribit's available currencies.

- Also a couple of format fixes.
2025-02-24 17:20:36 -03:00
Tyler Goodlet 75ddba09f7 `deribit.feed`: fix "trade" event streaming
The main change needed to make `piker.data.feed._FeedsBus` work was
to correctly format the `'trade'` msgs with the (new schema) expected
`'ticks': list[dict]` field which,
- we compute the `piker` quote-msg-`dict` from the (now directly proxied through)
  `cryptofeed.types.Trade`'s fields inside the body of `stream_quotes()`.
- similarly, move the `'l1'` msg processing, **out of** the `asyncio`-side
  `_l1()` callback (defined as a closure in `.api.aio_price_feed_relay()`
  and passed to the `cryptofeed.FeedHandler`) and instead mod the
  callback to simply pass through the `.types.L1Book` ref directly to
  the `piker`/`trio` side task for conversion.

In support of all that,
- mask-to-drop the alt-branch to wait on a first rt event when the
  `cryptofeed.LastTradesResult.trades: list[Trade]` is empty; doesn't
  seem like this ever even happens?
- add a buncha typing, comments and doc-strs to the routines in
  `.deribit.api` including notes on where we can choose to mod the
  `.bs_fqme` for our eventually preferred `piker` style format.
- simplify some nested `@acm` enters to the new single `async with
  <tuple>)` style.
- be particularly pedantic about typing
  `tractor.to_asyncio.LinkedTaskChannel`
- bit of pep8 line-spacing fixes in `.venues`.
2025-02-24 17:20:36 -03:00
Tyler Goodlet dae17bb043 `.deribit.feed`: get live quotes workin (again)
The quote-msg `'topic'` field was being set and sent as the
`OptionPair.symbol: str` value instead of as the `MktPair.bs_fqme: str`
as is required for matching on the `piker.data.feed` side. So change to
that and simplify the actual `.bs_fqme: str` value to NOT include the
ISO-format time (for now) since it's a bit ugly, and longer term we need
a `piker`-fqme friendly-on-ze-eyes format/style anyway..
2025-02-24 17:20:36 -03:00
Tyler Goodlet 8bd0a182cf Bit more `cryptofeed` adapter formatting and typing for clarity.. 2025-02-24 17:20:36 -03:00
Tyler Goodlet 04421e5ad2 .deribit.venues: add todo for an ideal `OptionPair.expiry` fmt/value 2025-02-24 17:20:36 -03:00
Tyler Goodlet 1e0c3da32d Report the closest (via fuzzy match) pairs on unmatched input 2025-02-24 17:20:36 -03:00
Tyler Goodlet 5b87b3c2a6 Signal hist start using `OptionPair.creation_timestamp`
Such that the `get_hist()` query func raises `DataUnavailable` with an
explicit message regarding the start of the (option) contract's
lifetime.

Other,
- mask some unused imports (for now?)
- drop a duplicate `tractor.get_console_log()` call which was causing
  duplicate console emits (it's already setup by brokerd init now).
- comment various unused code bits i found.
- add an info log around live quotes so we can see, for the moment, when
  they actually occur.. XD
2025-02-24 17:20:36 -03:00
Tyler Goodlet 438e69e42c `.deribit.api` bit of tidying/typing
There were some imports missing or unused as well as a variety of spots
that had grokability issues due to missing type hints.

Other tweaks as part some more thorough manual testing:
- always raise when there's no `brokers.toml` section since the API can
  never work (no free data without keys).
- inline the `Asset.atype='crypto_currency'` field despite it maybe not
  being the best value for `OptionPair` instruments..
- tossed in a now-masked pause block for debugging history queries in
  `Client.bars()`.
- commented out all the live order ctl (internal) endpoints for now
  since they're unused.
2025-02-24 17:20:36 -03:00
Tyler Goodlet ec6dd7cafc Fix `Optional` and use `'linear/reverse'` in `OptionPair.venue` 2025-02-24 17:20:36 -03:00
Nelson Torres f1436c93db Deribit's feed fix
- `FeedInit` for init_msgs in `stream_quotes`.

- the new cache is `client_pairs`, replacing the old `client.cache_symbols`.

- `get_mkt_info` added.

- `get_ohlc` fixed to comply with the new ways of the feed.
2025-02-24 17:20:36 -03:00
Nelson Torres 1061103f76 Deribit's api fix
key changes:

- Resolved the issue with the expiration dates from deribit; now we use an int instead of the crazy custom deribit format.

- The client now has a new `_json_rpc_auth_wrapper` that acquires a first access token and then refreshes it when it expires.

- `get_assets` fixed; now we use the public endpoint to check the available assets. In the future this will probably change, but for now it's working just fine.

- `get_mkt_pairs` added.

- `exch_info` added.

- `cache_symbols` fixed.

- Also a lot of reformatting done in api.
2025-02-24 17:20:36 -03:00
Nelson Torres 3aea296caa Venues
Moved all the msgspec structs from api to venues; also added critical imports in the api, feed and __init__ mods.
2025-02-24 17:20:36 -03:00
50 changed files with 2395 additions and 1289 deletions


@ -93,13 +93,13 @@ bc why install with `python` when you can faster with `rust` ::
# ^ astral's docs,
# https://docs.astral.sh/uv/concepts/projects/sync/
include all GUIs (ex. for charting)::
include all GUIs ::
uv sync --extra uis
AND with all our hacking tools and WIP integrations::
AND with all our hacking tools::
uv sync --dev --all-extras
uv sync --dev --extra uis
Ensure you can run the root-daemon::
@ -107,21 +107,13 @@ Ensure you can run the root-daemon::
uv run pikerd [-l info --pdb]
install on nix(os)
******************
hacky install on nixos
**********************
``NixOS`` is our core devs' distro of choice for which we offer
a stringently defined development shell environment that can currently
be applied in one of 2 ways::
a stringently defined development shell environment that can be loaded with::
# ONLY if running on X11
nix-shell default.nix
Or if you prefer flakes style and a modern DE::
# ONLY if also running on Wayland
nix develop # for default bash
nix develop -c uv run xonsh # for @goodboy's preferred sh B)
start a chart
*************


@ -1,5 +1,6 @@
################
# ---- CEXY ----
################
[binance]
accounts.paper = 'paper'
@ -12,41 +13,28 @@ accounts.spot = 'spot'
spot.use_testnet = false
spot.api_key = ''
spot.api_secret = ''
# ------ binance ------
[deribit]
# std assets
key_id = ''
key_secret = ''
# options
accounts.option = 'option'
option.use_testnet = false
option.key_id = ''
option.key_secret = ''
# aux logging from `cryptofeed`
option.log.filename = 'cryptofeed.log'
option.log.level = 'DEBUG'
option.log.disabled = true
# ------ deribit ------
[kraken]
key_descr = ''
api_key = ''
secret = ''
# ------ kraken ------
[kucoin]
key_id = ''
key_secret = ''
key_passphrase = ''
# ------ kucoin ------
################
# -- BROKERZ ---
################
[questrade]
refresh_token = ''
access_token = ''
@ -54,55 +42,44 @@ api_server = 'https://api06.iq.questrade.com/'
expires_in = 1800
token_type = 'Bearer'
expires_at = 1616095326.355846
# ------ questrade ------
[ib]
# define the (set of) host-port socketaddrs that
# brokerd.ib will scan to connect to an API endpoint
# (ib-gw or ib-tws listening instances)
hosts = [
'127.0.0.1',
]
# XXX: the order in which ports will be scanned
# (by the `brokerd` daemon-actor)
# is determined # by the line order here.
# TODO: when we eventually spawn gateways in our
# container, we can just dynamically allocate these
# using IBC.
ports = [
4002, # gw
7497, # tws
]
# When API endpoints are being scanned during startup, the order
# of user-defined-account "names" (as defined below) here
# determines which py-client connection is given priority to be
# used for data-feed-requests according to whichever client
# connected to an API endpoint which reported the equivalent
# account number for that name.
# XXX: for a paper account the flex web query service
# is not supported so you have to manually download
# and XML report and put it in a location that can be
# accessed by the ``brokerd.ib`` backend code for parsing.
flex_token = ''
flex_trades_query_id = '' # live account
# when clients are being scanned this determines
# which clients are preferred to be used for data
# feeds based on the order of account names, if
# detected as active on an API client.
prefer_data_account = [
'paper',
'margin',
'ira',
]
# For long-term trades txn (transaction) history
# processing (i.e your txn ledger with IB) you can
# (automatically for live accounts) query the FLEX
# report system for past history.
#
# (For paper accounts the web query service
# is not supported so you have to manually download
# an XML report and put it in a location that can be
# accessed by our `brokerd.ib` backend code for parsing).
#
flex_token = ''
flex_trades_query_id = '' # live account
# define "aliases" (names) for each account number
# such that the names can be reffed and logged throughout
# `piker.accounting` subsys and more easily
# referred to by the user.
#
# These keys will be the set exposed through the order-mode
# account-selection UI so that numbers are never shown.
[ib.accounts]
paper = 'DU0000000' # <- literal account #
margin = 'U0000000'
ira = 'U0000000'
# ------ ib ------
# the order in which accounts will be selectable
# in the order mode UI (if found via clients during
# API-app scanning)when a new symbol is loaded.
paper = 'XX0000000'
margin = 'X0000000'
ira = 'X0000000'


@ -11,12 +11,11 @@ let
libxkbcommonStorePath = lib.getLib libxkbcommon;
xcbutilcursorStorePath = lib.getLib xcb-util-cursor;
pypkgs = python313Packages;
qtpyStorePath = lib.getLib pypkgs.qtpy;
pyqt6StorePath = lib.getLib pypkgs.pyqt6;
pyqt6SipStorePath = lib.getLib pypkgs.pyqt6-sip;
rapidfuzzStorePath = lib.getLib pypkgs.rapidfuzz;
qdarkstyleStorePath = lib.getLib pypkgs.qdarkstyle;
qtpyStorePath = lib.getLib python312Packages.qtpy;
pyqt6StorePath = lib.getLib python312Packages.pyqt6;
pyqt6SipStorePath = lib.getLib python312Packages.pyqt6-sip;
rapidfuzzStorePath = lib.getLib python312Packages.rapidfuzz;
qdarkstyleStorePath = lib.getLib python312Packages.qdarkstyle;
xorgLibX11StorePath = lib.getLib xorg.libX11;
xorgLibxcbStorePath = lib.getLib xorg.libxcb;
@ -52,12 +51,12 @@ stdenv.mkDerivation {
xorg.xcbutilrenderutil
# Python requirements.
python313
uv
pypkgs.qdarkstyle
pypkgs.rapidfuzz
pypkgs.pyqt6
pypkgs.qtpy
python312Full
python312Packages.uv
python312Packages.qdarkstyle
python312Packages.rapidfuzz
python312Packages.pyqt6
python312Packages.qtpy
];
src = null;
shellHook = ''
@ -114,11 +113,11 @@ stdenv.mkDerivation {
export LD_LIBRARY_PATH
RPDFUZZ_PATH="${rapidfuzzStorePath}/lib/python3.13/site-packages"
QDRKSTYLE_PATH="${qdarkstyleStorePath}/lib/python3.13/site-packages"
QTPY_PATH="${qtpyStorePath}/lib/python3.13/site-packages"
PYQT6_PATH="${pyqt6StorePath}/lib/python3.13/site-packages"
PYQT6_SIP_PATH="${pyqt6SipStorePath}/lib/python3.13/site-packages"
RPDFUZZ_PATH="${rapidfuzzStorePath}/lib/python3.12/site-packages"
QDRKSTYLE_PATH="${qdarkstyleStorePath}/lib/python3.12/site-packages"
QTPY_PATH="${qtpyStorePath}/lib/python3.12/site-packages"
PYQT6_PATH="${pyqt6StorePath}/lib/python3.12/site-packages"
PYQT6_SIP_PATH="${pyqt6SipStorePath}/lib/python3.12/site-packages"
PATCH="$PATCH:$RPDFUZZ_PATH"
PATCH="$PATCH:$QDRKSTYLE_PATH"
@ -128,8 +127,8 @@ stdenv.mkDerivation {
export PATCH
# install all dev and extras
uv sync --dev --all-extras
# Install deps
uv lock
'';
}


@ -24,8 +24,9 @@ here is an example using ``vncclient`` on ``linux``::
vncviewer localhost:5900
now enter the pw (password) you set via an (see second code blob)
`.env file`_ or pw-file according to the `credentials section`_.
now enter the pw you set via an (see second code blob) `.env file`_
or pw-file according to the `credentials section`_.
If you want to change away from their default config see the example
`docker-compose.yml`-config issue and config-section of the readme,
@ -38,74 +39,6 @@ If you want to change away from their default config see the example
.. _credentials section: https://github.com/gnzsnz/ib-gateway-docker?tab=readme-ov-file#credentials
Connecting to the API from `piker`
---------------------------------
In order to expose the container's API endpoint to the
`brokerd/datad/ib` actor, we need to add a section to the user's
`brokers.toml` config (note the below is similar to the repo-shipped
template file),
.. code:: toml
[ib]
# define the (set of) host-port socketaddrs that
# brokerd.ib will scan to connect to an API endpoint
# (ib-gw or ib-tws listening instances)
hosts = [
'127.0.0.1',
]
ports = [
4002, # gw
7497, # tws
]
# When API endpoints are being scanned during startup, the order
# of user-defined-account "names" (as defined below) here
# determines which py-client connection is given priority to be
# used for data-feed-requests according to whichever client
# connected to an API endpoint which reported the equivalent
# account number for that name.
prefer_data_account = [
'paper',
'margin',
'ira',
]
# define "aliases" (names) for each account number
# such that the names can be reffed and logged throughout
# `piker.accounting` subsys and more easily
# referred to by the user.
#
# These keys will be the set exposed through the order-mode
# account-selection UI so that numbers are never shown.
[ib.accounts]
paper = 'XX0000000'
margin = 'X0000000'
ira = 'X0000000'
the broker daemon can also connect to the container's VNC server for
added functionalities including,
- viewing the API endpoint program's GUI for manual interventions,
- workarounds for historical data throttling using hotkey hacks,
Add a further section to `brokers.toml` which maps each API-ep's
port to a table of VNC server connection info like,
.. code:: toml
[ib.vnc_addrs]
4002 = {host = 'localhost', port = 5900, pw = 'doggy'}
The `pw = 'doggy'` here ^ should be the same value as the particular
container instance's `.env` file setting (when it was run),
.. code:: ini
VNC_SERVER_PASSWORD='doggy'
IF you also want to run ``TWS``
-------------------------------
You can also run it containerized,


@ -1,15 +1,10 @@
# a community maintained IB API container!
#
# https://github.com/gnzsnz/ib-gateway-docker
#
# For piker we (currently) include some minor deviations
# for some config files in the `volumes` section.
#
# See full configuration settings @
# - https://github.com/gnzsnz/ib-gateway-docker?tab=readme-ov-file#configuration
# - https://github.com/gnzsnz/ib-gateway-docker/discussions/103
# rework from the original @
# https://github.com/waytrade/ib-gateway-docker/blob/master/docker-compose.yml
version: "3.5"
services:
ib_gw_paper:
# apparently java is a mega cukc:
@ -55,22 +50,16 @@ services:
target: /root/scripts/run_x11_vnc.sh
read_only: true
# NOTE: an alt method to fill these out is to
# define an `.env` file in the same dir as
# this compose file.
# NOTE:to fill these out, define an `.env` file in the same dir as
# this compose file which looks something like:
# TWS_USERID='myuser'
# TWS_PASSWORD='guest'
environment:
TWS_USERID: ${TWS_USERID}
# TWS_USERID: 'myuser'
TWS_PASSWORD: ${TWS_PASSWORD}
# TWS_PASSWORD: 'guest'
TRADING_MODE: ${TRADING_MODE}
# TRADING_MODE: 'paper'
VNC_SERVER_PASSWORD: ${VNC_SERVER_PASSWORD}
# VNC_SERVER_PASSWORD: 'doggy'
# TODO, see if we can get this supported like it
# was on the old `waytrade` image?
# VNC_SERVER_PORT: '3003'
TRADING_MODE: 'paper'
VNC_SERVER_PASSWORD: 'doggy'
VNC_SERVER_PORT: '3003'
# ports:
# - target: 4002
@ -87,9 +76,6 @@ services:
# - "127.0.0.1:4002:4002"
# - "127.0.0.1:5900:5900"
# TODO, a masked but working example of dual paper + live
# ib-gw instances running in a single app run!
#
# ib_gw_live:
# image: waytrade/ib-gateway:1012.2i
# restart: no


@ -0,0 +1,338 @@
#!/usr/bin/env python
from decimal import (
    Decimal,
)
from pathlib import Path
import numpy as np
# import polars as pl
import trio
import tractor
from datetime import datetime
# from pprint import pformat
from piker.brokers.deribit.api import (
    get_client,
    maybe_open_oi_feed,
)
from piker.storage import open_storage_client, StorageClient
from piker.log import get_logger
import sys
import pyqtgraph as pg
from PyQt6 import QtCore
from pyqtgraph import ScatterPlotItem, InfiniteLine
from PyQt6.QtWidgets import QApplication
from cryptofeed.symbols import Symbol

log = get_logger(__name__)


# XXX, use 2 newlines between top level LOC (even between these
# imports and the next function line ;)
def check_if_complete(
    oi: dict[str, dict[str, Decimal | None]]
) -> bool:
    return all(
        oi[strike]['C'] is not None
        and
        oi[strike]['P'] is not None for strike in oi
    )


async def max_pain_daemon(
) -> None:
    oi_by_strikes: dict[str, dict[str, Decimal | None]]
    instruments: list[Symbol] = []
    expiry_dates: list[str]
    expiry_date: str
    currency: str = 'btc'
    kind: str = 'option'

    async with get_client(
    ) as client:
        expiry_dates: list[str] = await client.get_expiration_dates(
            currency=currency,
            kind=kind
        )
        log.info(
            f'Available expiries for {currency!r}-{kind}:\n'
            f'{expiry_dates}\n'
        )
        expiry_date: str = input(
            'Please enter a valid expiration date: '
        ).upper()
        print('Starting little daemon...')

        # maybe move this type annot down to the assignment line?
        oi_by_strikes: dict[str, dict[str, Decimal]]
        instruments = await client.get_instruments(
            expiry_date=expiry_date,
        )
        oi_by_strikes = client.get_strikes_dict(instruments)

    def get_total_intrinsic_values(
        oi_by_strikes: dict[str, dict[str, Decimal]]
    ) -> dict[str, dict[str, Decimal]]:
        call_cash: Decimal = Decimal(0)
        put_cash: Decimal = Decimal(0)
        intrinsic_values: dict[str, dict[str, Decimal]] = {}
        closes: list = sorted(Decimal(close) for close in oi_by_strikes)

        for strike, oi in oi_by_strikes.items():
            s = Decimal(strike)
            call_cash = sum(max(0, (s - c) * oi_by_strikes[str(c)]['C']) for c in closes)
            put_cash = sum(max(0, (c - s) * oi_by_strikes[str(c)]['P']) for c in closes)
            intrinsic_values[strike] = {
                'C': call_cash,
                'P': put_cash,
                'total': call_cash + put_cash,
            }

        return intrinsic_values

    def get_intrinsic_value_and_max_pain(
        intrinsic_values: dict[str, dict[str, Decimal]]
    ):
        # We need to find the lowest value, so we start at
        # infinity to ensure that, and the max_pain must be
        # an amount greater than zero.
        total_intrinsic_value: Decimal = Decimal('Infinity')
        max_pain: Decimal = Decimal(0)

        for strike, oi in oi_by_strikes.items():
            s = Decimal(strike)
            if intrinsic_values[strike]['total'] < total_intrinsic_value:
                total_intrinsic_value = intrinsic_values[strike]['total']
                max_pain = s

        return total_intrinsic_value, max_pain

    def plot_graph(
        oi_by_strikes: dict[str, dict[str, Decimal]],
        plot,
    ):
        """Update the bar graph with new open interest data."""
        plot.clear()
        intrinsic_values = get_total_intrinsic_values(oi_by_strikes)

        for strike_str in sorted(oi_by_strikes, key=lambda x: int(x)):
            strike = int(strike_str)
            calls_val = float(oi_by_strikes[strike_str]['C'])
            puts_val = float(oi_by_strikes[strike_str]['P'])

            bar_c = pg.BarGraphItem(
                x=[strike - 100],
                height=[calls_val],
                width=200,
                pen='w',
                brush=(0, 0, 255, 150)
            )
            plot.addItem(bar_c)

            bar_p = pg.BarGraphItem(
                x=[strike + 100],
                height=[puts_val],
                width=200,
                pen='w',
                brush=(255, 0, 0, 150)
            )
            plot.addItem(bar_p)

            total_val = float(intrinsic_values[strike_str]['total']) / 100000

            scatter_iv = ScatterPlotItem(
                x=[strike],
                y=[total_val],
                pen=pg.mkPen(color=(0, 255, 0), width=2),
                brush=pg.mkBrush(0, 255, 0, 150),
                size=3,
                symbol='o'
            )
            plot.addItem(scatter_iv)

        _, max_pain = get_intrinsic_value_and_max_pain(intrinsic_values)
        vertical_line = InfiniteLine(
            pos=max_pain,
            angle=90,
            pen=pg.mkPen(color='yellow', width=1, style=QtCore.Qt.PenStyle.DotLine),
            label=f'Max pain: {max_pain:,.0f}',
            labelOpts={
                'position': 0.85,
                'color': 'yellow',
                'movable': True
            }
        )
        plot.addItem(vertical_line)

    def update_oi_by_strikes(msg: tuple):
        nonlocal oi_by_strikes
        if 'oi' == msg[0]:
            strike_price = msg[1]['strike_price']
            option_type = msg[1]['option_type']
            open_interest = msg[1]['open_interest']
            oi_by_strikes.setdefault(
                strike_price, {}
            ).update(
                {option_type: open_interest}
            )

    # Define the structured dtype
    dtype = np.dtype([
        ('time', int),
        ('oi', float),
        ('oi_calc', float),
    ])

    async def write_open_interest_on_file(msg: tuple, client: StorageClient):
        if 'oi' == msg[0]:
            nonlocal expiry_date
            timestamp = msg[1]['timestamp']
            strike_price = msg[1]["strike_price"]
            option_type = msg[1]['option_type'].lower()
            col_sym_key = f'btc-{expiry_date.lower()}-{strike_price}-{option_type}'

            # Create the numpy array with sample data
            data = np.array([
                (
                    int(timestamp),
                    float(msg[1]['open_interest']),
                    np.nan,
                ),
            ], dtype=dtype)

            path: Path = await client.write_oi(
                col_sym_key,
                data,
            )
            # TODO, use std logging like this throughout for status
            # emissions on console!
            log.info(f'Wrote OI history to {path}')

    def get_max_pain(
        oi_by_strikes: dict[str, dict[str, Decimal]]
    ) -> dict[str, str | Decimal]:
        '''
        This method requires only the strike_prices and oi for calls
        and puts; the closes list is the same as the strike_prices.
        The idea is to sum all the call and put cash for each strike
        plus the ITM strikes from that strike; the lowest value is the
        intrinsic value we are looking for.

        '''
        nonlocal timestamp
        intrinsic_values = get_total_intrinsic_values(oi_by_strikes)
        total_intrinsic_value, max_pain = get_intrinsic_value_and_max_pain(intrinsic_values)

        return {
            'timestamp': timestamp,
            'expiry_date': expiry_date,
            'total_intrinsic_value': total_intrinsic_value,
            'max_pain': max_pain,
        }

    async with (
        open_storage_client() as (_, storage),
        maybe_open_oi_feed(
            instruments,
        ) as oi_feed,
    ):
        # Initialize QApplication
        app = QApplication(sys.argv)
        win = pg.GraphicsLayoutWidget(show=True)
        win.setWindowTitle('Calls (blue) vs Puts (red)')
        plot = win.addPlot(title='OI by Strikes')
        plot.showGrid(x=True, y=True)
        print('Plot initialized...')

        async for msg in oi_feed:
            # In-memory oi_by_strikes dict; all messages are filtered
            # here and the dict is updated with the open interest data
            update_oi_by_strikes(msg)

            # Write to file using the storage client
            await write_open_interest_on_file(msg, storage)

            # Max pain calcs; before starting we must gather all the
            # open interest for all the strike prices and option types
            # available for an expiration date
            if check_if_complete(oi_by_strikes):
                if 'oi' == msg[0]:
                    # Here we must read from the filesystem the latest
                    # open interest value for each instrument for this
                    # specific expiration date, i.e. look up the last
                    # update for the instrument
                    # btc-{expity_date}-*oi1s.parquet (1s because it is
                    # hardcoded to something, sorry.)
                    timestamp = msg[1]['timestamp']
                    max_pain = get_max_pain(oi_by_strikes)
                    # intrinsic_values = get_total_intrinsic_values(oi_by_strikes)

                    # graph here
                    plot_graph(oi_by_strikes, plot)

                    # TODO, use a single multiline string with `()`
                    # and drop the multiple `print()` calls (this
                    # should be done elsewhere in this file as well!
                    #
                    # As per the docs,
                    # https://docs.python.org/3/reference/lexical_analysis.html#string-literal-concatenation
                    # you could instead do,
                    # print(
                    #     '-----------------------------------------------\n'
                    #     f'timestamp: {datetime.fromtimestamp(max_pain['timestamp'])}\n'
                    # )
                    # WHY?
                    # |_ less ctx-switches/calls to `print()`
                    # |_ the `str` can then be modified / passed
                    #    around as a variable more easily if needed in
                    #    the future ;)
                    #
                    # ALSO, i believe there already is a stdlib
                    # module to do "alignment" of text which you
                    # could try for doing the right-side alignment,
                    # https://docs.python.org/3/library/textwrap.html#textwrap.indent
                    #
                    print('-----------------------------------------------')
                    print(f'timestamp: {datetime.fromtimestamp(max_pain['timestamp'])}')
                    print(f'expiry_date: {max_pain['expiry_date']}')
                    print(f'max_pain: {max_pain['max_pain']:,.0f}')
                    print(f'total intrinsic value: {max_pain['total_intrinsic_value']:,.0f}')
                    print('-----------------------------------------------')

                    # Process GUI events to keep the window responsive
                    app.processEvents()


async def main():
    async with tractor.open_nursery(
        debug_mode=True,
        loglevel='info',
    ) as an:
        from tractor import log
        log.get_console_log(level='info')
        ptl: tractor.Portal = await an.start_actor(
            'max_pain_daemon',
            enable_modules=[__name__],
            infect_asyncio=True,
            # ^TODO, we can actually run this in the root-actor now
            # if needed as per 2nd "section" in,
            # https://pikers.dev/goodboy/tractor/pulls/2
            #
            # NOTE, will first require us porting to modern
            # `tractor:main` though ofc!
        )
        await ptl.run(max_pain_daemon)


if __name__ == '__main__':
    trio.run(main)


@ -0,0 +1,29 @@
## Max Pain Calculation for Deribit Options
This feature calculates the max pain point for options traded
on the Deribit exchange using the cryptofeed library.
- Functions in the api module for fetching options data from Deribit.
[commit](https://pikers.dev/pikers/piker/commit/da55856dd2876291f55a06eb0561438a912d8241)
- Compute the max pain point based on open interest data using
deribit's api.
[commit](https://pikers.dev/pikers/piker/commit/0d9d6e15ba0edeb662ec97f7599dd66af3046b94)
### How to test it?
**Before starting:** in order to get this working with `uv`, you
**must** use my [`tractor` fork](https://pikers.dev/ntorres/tractor/src/branch/aio_abandons)
and this branch: `aio_abandons`. The reason is that I cherry-picked the
`uv_migration` that guille made; for some reason I didn't dive
into, on my system tractor needs to use `uv` too. Quite hacky
I guess.
1. `uv lock`
2. `uv run --no-dev python examples/max_pain.py`
3. A prompt should be displayed; enter one of the available
expiration dates.
4. The script should be up and running.
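For reference, below is a minimal sketch of the max pain computation
itself; the `oi_by_strikes` table shape is an assumption for
illustration (the real script's helpers may key it differently). The
principle: pick the candidate settlement price (strike) which
minimizes the total intrinsic value paid out to option holders.
```python
# illustrative sketch only, NOT the exact impl from the example script
# assumed shape: {strike: {'C': call_oi, 'P': put_oi}, ...}
def max_pain_strike(
    oi_by_strikes: dict[float, dict[str, float]],
) -> float:

    def total_payout(expiry_price: float) -> float:
        payout: float = 0.0
        for strike, oi in oi_by_strikes.items():
            # calls pay out when price settles above their strike,
            payout += oi['C'] * max(0.0, expiry_price - strike)
            # puts when it settles below.
            payout += oi['P'] * max(0.0, strike - expiry_price)
        return payout

    # the strike with minimal aggregate payout "hurts" option
    # buyers the most -> max pain.
    return min(oi_by_strikes, key=total_payout)


assert max_pain_strike({
    90.: {'C': 10, 'P': 0},
    100.: {'C': 5, 'P': 5},
    110.: {'C': 0, 'P': 10},
}) == 100.
```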

View File

@ -1,24 +1,135 @@
{
"nodes": {
"nixpkgs": {
"flake-utils": {
"inputs": {
"systems": "systems"
},
"locked": {
"lastModified": 1765779637,
"narHash": "sha256-KJ2wa/BLSrTqDjbfyNx70ov/HdgNBCBBSQP3BIzKnv4=",
"owner": "nixos",
"repo": "nixpkgs",
"rev": "1306659b587dc277866c7b69eb97e5f07864d8c4",
"lastModified": 1689068808,
"narHash": "sha256-6ixXo3wt24N/melDWjq70UuHQLxGV8jZvooRanIHXw0=",
"owner": "numtide",
"repo": "flake-utils",
"rev": "919d646de7be200f3bf08cb76ae1f09402b6f9b4",
"type": "github"
},
"original": {
"owner": "nixos",
"owner": "numtide",
"repo": "flake-utils",
"type": "github"
}
},
"flake-utils_2": {
"inputs": {
"systems": "systems_2"
},
"locked": {
"lastModified": 1689068808,
"narHash": "sha256-6ixXo3wt24N/melDWjq70UuHQLxGV8jZvooRanIHXw0=",
"owner": "numtide",
"repo": "flake-utils",
"rev": "919d646de7be200f3bf08cb76ae1f09402b6f9b4",
"type": "github"
},
"original": {
"owner": "numtide",
"repo": "flake-utils",
"type": "github"
}
},
"nix-github-actions": {
"inputs": {
"nixpkgs": [
"poetry2nix",
"nixpkgs"
]
},
"locked": {
"lastModified": 1688870561,
"narHash": "sha256-4UYkifnPEw1nAzqqPOTL2MvWtm3sNGw1UTYTalkTcGY=",
"owner": "nix-community",
"repo": "nix-github-actions",
"rev": "165b1650b753316aa7f1787f3005a8d2da0f5301",
"type": "github"
},
"original": {
"owner": "nix-community",
"repo": "nix-github-actions",
"type": "github"
}
},
"nixpkgs": {
"locked": {
"lastModified": 1692174805,
"narHash": "sha256-xmNPFDi/AUMIxwgOH/IVom55Dks34u1g7sFKKebxUm0=",
"owner": "NixOS",
"repo": "nixpkgs",
"rev": "caac0eb6bdcad0b32cb2522e03e4002c8975c62e",
"type": "github"
},
"original": {
"owner": "NixOS",
"ref": "nixos-unstable",
"repo": "nixpkgs",
"type": "github"
}
},
"poetry2nix": {
"inputs": {
"flake-utils": "flake-utils_2",
"nix-github-actions": "nix-github-actions",
"nixpkgs": [
"nixpkgs"
]
},
"locked": {
"lastModified": 1692048894,
"narHash": "sha256-cDw03rso2V4CDc3Mll0cHN+ztzysAvdI8pJ7ybbz714=",
"ref": "refs/heads/pyqt6",
"rev": "b059ad4c3051f45d6c912e17747aae37a9ec1544",
"revCount": 2276,
"type": "git",
"url": "file:///home/lord_fomo/repos/poetry2nix"
},
"original": {
"type": "git",
"url": "file:///home/lord_fomo/repos/poetry2nix"
}
},
"root": {
"inputs": {
"nixpkgs": "nixpkgs"
"flake-utils": "flake-utils",
"nixpkgs": "nixpkgs",
"poetry2nix": "poetry2nix"
}
},
"systems": {
"locked": {
"lastModified": 1681028828,
"narHash": "sha256-Vy1rq5AaRuLzOxct8nz4T6wlgyUR7zLU309k9mBC768=",
"owner": "nix-systems",
"repo": "default",
"rev": "da67096a3b9bf56a91d16901293e51ba5b49a27e",
"type": "github"
},
"original": {
"owner": "nix-systems",
"repo": "default",
"type": "github"
}
},
"systems_2": {
"locked": {
"lastModified": 1681028828,
"narHash": "sha256-Vy1rq5AaRuLzOxct8nz4T6wlgyUR7zLU309k9mBC768=",
"owner": "nix-systems",
"repo": "default",
"rev": "da67096a3b9bf56a91d16901293e51ba5b49a27e",
"type": "github"
},
"original": {
"owner": "nix-systems",
"repo": "default",
"type": "github"
}
}
},

251
flake.nix
View File

@ -1,103 +1,180 @@
# An "impure" template thx to `pyproject.nix`,
# https://pyproject-nix.github.io/pyproject.nix/templates.html#impure
# https://github.com/pyproject-nix/pyproject.nix/blob/master/templates/impure/flake.nix
{
description = "An impure `piker` overlay using `uv` with Nix(OS)";
# NOTE: to convert to a poetry2nix env like this here are the
# steps:
# - install poetry in your system nix config
# - convert the repo to use poetry using `poetry init`:
# https://python-poetry.org/docs/basic-usage/#initialising-a-pre-existing-project
# - then manually ensuring all deps are converted over:
# - add this file to the repo and commit it
# -
inputs = {
nixpkgs.url = "github:nixos/nixpkgs/nixos-unstable";
# GROKin tips:
# - CLI eps are (ostensibly) added via an `entry_points.txt`:
# - https://packaging.python.org/en/latest/specifications/entry-points/#file-format
# - https://github.com/nix-community/poetry2nix/blob/master/editable.nix#L49
{
description = "piker: trading gear for hackers (pkged with poetry2nix)";
inputs.flake-utils.url = "github:numtide/flake-utils";
inputs.nixpkgs.url = "github:NixOS/nixpkgs/nixos-unstable";
# see https://github.com/nix-community/poetry2nix/tree/master#api
inputs.poetry2nix = {
# url = "github:nix-community/poetry2nix";
# url = "github:K900/poetry2nix/qt5-explicit-deps";
url = "/home/lord_fomo/repos/poetry2nix";
inputs.nixpkgs.follows = "nixpkgs";
};
outputs =
{ nixpkgs, ... }:
let
inherit (nixpkgs) lib;
forAllSystems = lib.genAttrs lib.systems.flakeExposed;
in
{
devShells = forAllSystems (
system:
let
pkgs = nixpkgs.legacyPackages.${system};
outputs = {
self,
nixpkgs,
flake-utils,
poetry2nix,
}:
# TODO: build cross-OS and use the `${system}` var thingy..
flake-utils.lib.eachDefaultSystem (system:
let
# use PWD as sources
projectDir = ./.;
pyproject = ./pyproject.toml;
poetrylock = ./poetry.lock;
# do store-path extractions
qt6baseStorePath = lib.getLib pkgs.qt6.qtbase;
# ?TODO? can remove below since manual linking not needed?
# qt6QtWaylandStorePath = lib.getLib pkgs.qt6.qtwayland;
# TODO: port to 3.11 and support both versions?
python = "python3.10";
# XXX NOTE XXX, for now we overlay specific pkgs via
# a major-version-pinned-`cpython`
cpython = "python313";
pypkgs = pkgs."${cpython}Packages";
in
{
default = pkgs.mkShell {
# for more functions and examples.
# inherit
# (poetry2nix.legacyPackages.${system})
# mkPoetryApplication;
# pkgs = nixpkgs.legacyPackages.${system};
packages = with pkgs; [
# XXX, ensure sh completions active!
bashInteractive
bash-completion
pkgs = nixpkgs.legacyPackages.x86_64-linux;
lib = pkgs.lib;
p2npkgs = poetry2nix.legacyPackages.x86_64-linux;
# dev utils
ruff
pypkgs.ruff
# define all pkg overrides per dep, see edgecases.md:
# https://github.com/nix-community/poetry2nix/blob/master/docs/edgecases.md
# TODO: add these into the json file:
# https://github.com/nix-community/poetry2nix/blob/master/overrides/build-systems.json
pypkgs-build-requirements = {
asyncvnc = [ "setuptools" ];
eventkit = [ "setuptools" ];
ib-insync = [ "setuptools" "flake8" ];
msgspec = [ "setuptools"];
pdbp = [ "setuptools" ];
pyqt6-sip = [ "setuptools" ];
tabcompleter = [ "setuptools" ];
tractor = [ "setuptools" ];
tricycle = [ "setuptools" ];
trio-typing = [ "setuptools" ];
trio-util = [ "setuptools" ];
xonsh = [ "setuptools" ];
};
qt6.qtwayland
qt6.qtbase
# auto-generate override entries
p2n-overrides = p2npkgs.defaultPoetryOverrides.extend (self: super:
builtins.mapAttrs (package: build-requirements:
(builtins.getAttr package super).overridePythonAttrs (old: {
buildInputs = (
old.buildInputs or [ ]
) ++ (
builtins.map (
pkg: if builtins.isString pkg then builtins.getAttr pkg super else pkg
) build-requirements
);
})
) pypkgs-build-requirements
);
uv
python313 # ?TODO^ how to set from `cpython` above?
pypkgs.pyqt6
pypkgs.pyqt6-sip
pypkgs.qtpy
pypkgs.qdarkstyle
pypkgs.rapidfuzz
];
# override some ahead-of-time compiled extensions
# to be built with their wheels.
ahot_overrides = p2n-overrides.extend(
final: prev: {
shellHook = ''
# unmask to debug **this** dev-shell-hook
# set -e
# llvmlite = prev.llvmlite.override {
# preferWheel = false;
# };
# set qt-base/plugin path(s)
QTBASE_PATH="${qt6baseStorePath}/lib"
QT_PLUGIN_PATH="${qt6baseStorePath}/lib/qt-6/plugins"
QT_QPA_PLATFORM_PLUGIN_PATH="$QT_PLUGIN_PATH/platforms"
# TODO: get this workin with p2n and nixpkgs..
# pyqt6 = prev.pyqt6.override {
# preferWheel = true;
# };
# link in Qt cc lib paths from <nixpkgs>
LD_LIBRARY_PATH="$LD_LIBRARY_PATH:$QTBASE_PATH"
LD_LIBRARY_PATH="$LD_LIBRARY_PATH:$QT_PLUGIN_PATH"
LD_LIBRARY_PATH="$LD_LIBRARY_PATH:$QT_QPA_PLATFORM_PLUGIN_PATH"
# NOTE: this DOESN'T work atm but after a fix
# to poetry2nix, it will and actually this line
# won't be needed - thanks @k900:
# https://github.com/nix-community/poetry2nix/pull/1257
pyqt5 = prev.pyqt5.override {
# withWebkit = false;
preferWheel = true;
};
# link-in c++ stdlib for various AOT-ext-pkgs (numpy, etc.)
LD_LIBRARY_PATH="${pkgs.stdenv.cc.cc.lib}/lib:$LD_LIBRARY_PATH"
# see PR from @k900:
# https://github.com/nix-community/poetry2nix/pull/1257
# pyqt5-qt5 = prev.pyqt5-qt5.override {
# withWebkit = false;
# preferWheel = true;
# };
export LD_LIBRARY_PATH
# RUNTIME-SETTINGS
#
# ------ Qt ------
# XXX, unmask to debug qt .so linking/loading deats
# export QT_DEBUG_PLUGINS=1
#
# ALSO, for *modern linux* DEs,
# - maybe set wayland-mode (TODO, parametrize this!)
# * a chosen wayland-mode shell-integration
export QT_QPA_PLATFORM="wayland"
export QT_WAYLAND_SHELL_INTEGRATION="xdg-shell"
# ------ uv ------
# - always use the ./py313/ venv-subdir
export UV_PROJECT_ENVIRONMENT="py313"
# sync project-env with all extras
uv sync --dev --all-extras --no-group lint
# ------ TIPS ------
# NOTE, to launch the py-venv installed `xonsh` (like @goodboy)
# run the `nix develop` cmd with,
# >> nix develop -c uv run xonsh
'';
};
}
# TODO: patch in an override for polars to build
# from src! See the details likely needed from
# the cryptography entry:
# https://github.com/nix-community/poetry2nix/blob/master/overrides/default.nix#L426-L435
polars = prev.polars.override {
preferWheel = true;
};
}
);
};
# WHY!? -> output-attrs that `nix develop` scans for:
# https://nixos.org/manual/nix/stable/command-ref/new-cli/nix3-develop.html#flake-output-attributes
in
rec {
packages = {
# piker = poetry2nix.legacyPackages.x86_64-linux.mkPoetryEditablePackage {
# editablePackageSources = { piker = ./piker; };
piker = p2npkgs.mkPoetryApplication {
projectDir = projectDir;
# SEE ABOVE for auto-genned input set, override
# buncha deps with extras.. like `setuptools` mostly.
# TODO: maybe propose a patch to p2n to show that you
# can even do this in the edgecases docs?
overrides = ahot_overrides;
# XXX: won't work on llvmlite..
# preferWheels = true;
};
};
# devShells.default = pkgs.mkShell {
# projectDir = projectDir;
# python = "python3.10";
# overrides = ahot_overrides;
# inputsFrom = [ self.packages.x86_64-linux.piker ];
# packages = packages;
# # packages = [ poetry2nix.packages.${system}.poetry ];
# };
# TODO: grok the difference here..
# - avoid re-cloning git repos on every develop entry..
# - ideally allow hacking on the src code of some deps
# (tractor, pyqtgraph, tomlkit, etc.) WITHOUT having to
# re-install them every time a change is made.
# - boot a usable xonsh inside the poetry virtualenv when
# defined via a custom entry point?
devShells.default = p2npkgs.mkPoetryEnv {
# env = p2npkgs.mkPoetryEnv {
projectDir = projectDir;
python = pkgs.python310;
overrides = ahot_overrides;
editablePackageSources = packages;
# piker = "./";
# tractor = "../tractor/";
# }; # wut?
};
}
); # end of .outputs scope
}

View File

@ -40,7 +40,7 @@ import tomli_w # for fast ledger writing
from piker.types import Struct
from piker import config
from piker.log import get_logger
from ..log import get_logger
from .calc import (
iter_by_dt,
)
@ -239,9 +239,7 @@ class TransactionLedger(UserDict):
symcache: SymbologyCache = self._symcache
towrite: dict[str, Any] = {}
for tid, txdict in self.tx_sort(
self.data.copy()
):
for tid, txdict in self.tx_sort(self.data.copy()):
# write blank-str expiry for non-expiring assets
if (
'expiry' in txdict
@ -379,7 +377,7 @@ def open_trade_ledger(
account,
dirpath=_fp,
)
cpy: dict = ledger_dict.copy()
cpy = ledger_dict.copy()
# XXX NOTE: if not provided presume we are being called from
# sync code and need to maybe run `trio` to generate..
@ -408,13 +406,7 @@ def open_trade_ledger(
account=account,
mod=mod,
symcache=symcache,
# NOTE: allow backends to provide custom ledger sorting
tx_sort=getattr(
mod,
'tx_sort',
tx_sort,
),
tx_sort=getattr(mod, 'tx_sort', tx_sort),
)
try:
yield ledger

View File

@ -305,8 +305,8 @@ class MktPair(Struct, frozen=True):
# config right?
# src_type: AssetTypeName
# for derivs, info describing contract, egs. strike price, call
# or put, swap type, exercise model, etc.
# for derivs, info describing contract, egs.
# strike price, call or put, swap type, exercise model, etc.
contract_info: list[str] | None = None
# TODO: rename to sectype since all of these can

View File

@ -30,8 +30,7 @@ from types import ModuleType
from typing import (
Any,
Iterator,
Generator,
TYPE_CHECKING,
Generator
)
import pendulum
@ -60,10 +59,8 @@ from ..clearing._messages import (
BrokerdPosition,
)
from piker.types import Struct
from piker.log import get_logger
if TYPE_CHECKING:
from piker.data._symcache import SymbologyCache
from piker.data._symcache import SymbologyCache
from ..log import get_logger
log = get_logger(__name__)
@ -505,17 +502,6 @@ class Account(Struct):
_mktmap_table: dict[str, MktPair] | None = None,
only_require: list[str]|True = True,
# ^list of fqmes that are "required" to be processed from
# this ledger pass; we often don't care about others and
# definitely shouldn't always error in such cases.
# (eg. broker backend loaded that doesn't yet support the
# symcache but also, inside the paper engine we don't ad-hoc
# request `get_mkt_info()` for every symbol in the ledger,
# only the one for which we're simulating against).
# TODO, not sure if there's a better soln for this, ideally
# all backends get symcache support afap i guess..
) -> dict[str, Position]:
'''
Update the internal `.pps[str, Position]` table from input
@ -558,32 +544,11 @@ class Account(Struct):
if _mktmap_table is None:
raise
required: bool = (
only_require is True
or (
only_require is not True
and
fqme in only_require
)
)
# XXX: caller is allowed to provide a fallback
# mktmap table for the case where a new position is
# being added and the preloaded symcache didn't
# have this entry prior (eg. with frickin IB..)
if (
not (mkt := _mktmap_table.get(fqme))
and
required
):
raise
elif not required:
continue
else:
# should be an entry retrieved somewhere
assert mkt
mkt = _mktmap_table[fqme]
if not (pos := pps.get(bs_mktid)):
@ -700,7 +665,7 @@ class Account(Struct):
def write_config(self) -> None:
'''
Write the current account state to the user's account TOML file, normally
something like `pps.toml`.
something like ``pps.toml``.
'''
# TODO: show diff output?

View File

@ -268,6 +268,9 @@ def iter_by_dt(
(v := tx.get(k))
)
):
# TODO? remove yah?
# v = tx[k] if isdict else tx.dt
# only call parser on the value if not None from
# the `parsers` table above (when NOT using
# `.get()`), otherwise pass through the value and
@ -284,50 +287,24 @@ def iter_by_dt(
return ret
else:
log.debug(
f'Parser-field not found in txn\n'
f'\n'
f'parser-field: {k!r}\n'
f'txn: {tx!r}\n'
f'\n'
f'Trying next..\n'
)
continue
# XXX: we should never really get here bc it means some kinda
# bad txn-record (field) data..
#
# -> set the `debug_mode = True` if you want to trace such
# cases from REPL ;)
# XXX: should never get here..
else:
# XXX: we should really never get here..
# only if a ledger record has no expected sort(able)
# field will we likely hit this.. like with ze IB.
# if no sortable field just deliver epoch?
log.warning(
'No (time) sortable field for TXN:\n'
f'{tx!r}\n'
)
report: str = (
f'No supported time-field found in txn !?\n'
f'\n'
f'supported-time-fields: {parsers!r}\n'
f'\n'
f'txn: {tx!r}\n'
)
if debug:
with maybe_open_crash_handler(
pdb=debug,
raise_on_exit=False,
):
raise ValueError(report)
else:
log.error(report)
with maybe_open_crash_handler(pdb=True):
raise ValueError(
f'Invalid txn time ??\n'
f'txn-id: {k!r}\n'
f'{k!r}: {v!r}\n'
)
# assert v is not None, f'No valid value for `{k}`!?'
if _invalid is not None:
_invalid.append(tx)
return from_timestamp(0.)
# breakpoint()
entry: tuple[str, dict]|Transaction
invalid: list = []
for entry in sorted(
@ -341,6 +318,8 @@ def iter_by_dt(
log.warning(
f'Ignoring txn w invalid timestamp ??\n'
f'{pformat(entry)}\n'
# f'txn-id: {k!r}\n'
# f'{k!r}: {v!r}\n'
)
continue
@ -421,10 +400,7 @@ def open_ledger_dfs(
can update the ledger on exit.
'''
with maybe_open_crash_handler(
pdb=debug_mode,
# raise_on_exit=False,
):
with maybe_open_crash_handler(pdb=debug_mode):
if not ledger:
import time
from ._ledger import open_trade_ledger

View File

@ -300,8 +300,7 @@ def disect(
assert not df.is_empty()
# muck around in pdbp REPL
# tractor.devx.mk_pdb().set_trace()
# breakpoint()
breakpoint()
# TODO: we REALLY need a better console REPL for this
# kinda thing..

View File

@ -51,6 +51,7 @@ __brokers__: list[str] = [
'ib',
'kraken',
'kucoin',
'deribit',
# broken but used to work
# 'questrade',
@ -61,7 +62,6 @@ __brokers__: list[str] = [
# wstrade
# iex
# deribit
# bitso
]
@ -98,14 +98,13 @@ async def open_cached_client(
If one has not been setup do it and cache it.
'''
brokermod: ModuleType = get_brokermod(brokername)
# TODO: make abstract or `typing.Protocol`
# client: Client
brokermod = get_brokermod(brokername)
async with maybe_open_context(
acm_func=brokermod.get_client,
kwargs=kwargs,
) as (cache_hit, client):
if cache_hit:
log.runtime(f'Reusing existing {client}')

View File

@ -94,15 +94,13 @@ class L1(Struct):
# validation type
# https://developers.binance.com/docs/derivatives/usds-margined-futures/websocket-market-streams/Aggregate-Trade-Streams#response-example
class AggTrade(Struct, frozen=True):
e: str # Event type
E: int # Event time
s: str # Symbol
a: int # Aggregate trade ID
p: float # Price
q: float # Quantity with all the market trades
nq: float # Normal quantity without the trades involving RPI orders
q: float # Quantity
f: int # First trade ID
l: int # noqa Last trade ID
T: int # Trade time

View File

@ -104,9 +104,6 @@ class Pair(Struct, frozen=True, kw_only=True):
# https://developers.binance.com/docs/binance-spot-api-docs#future-changes
pegInstructionsAllowed: bool = False
# https://developers.binance.com/docs/binance-spot-api-docs#2025-12-02
opoAllowed: bool = False
filters: dict[
str,
str | int | float,
@ -223,10 +220,7 @@ class FutesPair(Pair):
assert pair == self.pair # sanity
return f'{expiry}'
case (
'PERPETUAL'
| 'TRADIFI_PERPETUAL'
):
case 'PERPETUAL':
return 'PERP'
case '':
@ -255,10 +249,7 @@ class FutesPair(Pair):
margin: str = self.marginAsset
match ctype:
case (
'PERPETUAL'
| 'TRADIFI_PERPETUAL'
):
case 'PERPETUAL':
return f'{margin}M'
case (

View File

@ -471,15 +471,11 @@ def search(
'''
# global opts
brokermods: list[ModuleType] = list(config['brokermods'].values())
# TODO: this is coming from the `search --pdb` NOT from
# the `piker --pdb` XD ..
# -[ ] pull from the parent click ctx's values..dumdum
# assert pdb
brokermods = list(config['brokermods'].values())
# define tractor entrypoint
async def main(func):
async with maybe_open_pikerd(
loglevel=config['loglevel'],
debug_mode=pdb,

View File

@ -22,9 +22,7 @@ routines should be primitive data types where possible.
"""
import inspect
from types import ModuleType
from typing import (
Any,
)
from typing import List, Dict, Any, Optional
import trio
@ -36,10 +34,8 @@ from ..accounting import MktPair
async def api(brokername: str, methname: str, **kwargs) -> dict:
'''
Make (proxy through) a broker API call by name and return its result.
'''
"""Make (proxy through) a broker API call by name and return its result.
"""
brokermod = get_brokermod(brokername)
async with brokermod.get_client() as client:
meth = getattr(client, methname, None)
@ -66,14 +62,10 @@ async def api(brokername: str, methname: str, **kwargs) -> dict:
async def stocks_quote(
brokermod: ModuleType,
tickers: list[str]
) -> dict[str, dict[str, Any]]:
'''
Return a `dict` of snapshot quotes for the provided input
`tickers`: a `list` of fqmes.
'''
tickers: List[str]
) -> Dict[str, Dict[str, Any]]:
"""Return quotes dict for ``tickers``.
"""
async with brokermod.get_client() as client:
return await client.quote(tickers)
@ -82,15 +74,13 @@ async def stocks_quote(
async def option_chain(
brokermod: ModuleType,
symbol: str,
date: str|None = None,
) -> dict[str, dict[str, dict[str, Any]]]:
'''
Return option chain for ``symbol`` for ``date``.
date: Optional[str] = None,
) -> Dict[str, Dict[str, Dict[str, Any]]]:
"""Return option chain for ``symbol`` for ``date``.
By default all expiries are returned. If ``date`` is provided
then contract quotes for that single expiry are returned.
'''
"""
async with brokermod.get_client() as client:
if date:
id = int((await client.tickers2ids([symbol]))[symbol])
@ -108,7 +98,7 @@ async def option_chain(
# async def contracts(
# brokermod: ModuleType,
# symbol: str,
# ) -> dict[str, dict[str, dict[str, Any]]]:
# ) -> Dict[str, Dict[str, Dict[str, Any]]]:
# """Return option contracts (all expiries) for ``symbol``.
# """
# async with brokermod.get_client() as client:
@ -120,24 +110,15 @@ async def bars(
brokermod: ModuleType,
symbol: str,
**kwargs,
) -> dict[str, dict[str, dict[str, Any]]]:
'''
Return historical bars for ``symbol``.
'''
) -> Dict[str, Dict[str, Dict[str, Any]]]:
"""Return option contracts (all expiries) for ``symbol``.
"""
async with brokermod.get_client() as client:
return await client.bars(symbol, **kwargs)
async def search_w_brokerd(
name: str,
pattern: str,
) -> dict:
async def search_w_brokerd(name: str, pattern: str) -> dict:
# TODO: WHY NOT WORK!?!
# when we `step` through the next block?
# import tractor
# await tractor.pause()
async with open_cached_client(name) as client:
# TODO: support multiple asset type concurrent searches.
@ -149,12 +130,12 @@ async def symbol_search(
pattern: str,
**kwargs,
) -> dict[str, dict[str, dict[str, Any]]]:
) -> Dict[str, Dict[str, Dict[str, Any]]]:
'''
Return symbol info from broker.
'''
results: list[str] = []
results = []
async def search_backend(
brokermod: ModuleType
@ -162,13 +143,6 @@ async def symbol_search(
brokername: str = mod.name
# TODO: figure this the FUCK OUT
# -> ok so obvi in the root actor any async task that's
# spawned outside the main tractor-root-actor task needs to
# call this..
# await tractor.devx._debug.maybe_init_greenback()
# tractor.pause_from_sync()
async with maybe_spawn_brokerd(
mod.name,
infect_asyncio=getattr(
@ -188,6 +162,7 @@ async def symbol_search(
))
async with trio.open_nursery() as n:
for mod in brokermods:
n.start_soon(search_backend, mod.name)
@ -197,13 +172,11 @@ async def symbol_search(
async def mkt_info(
brokermod: ModuleType,
fqme: str,
**kwargs,
) -> MktPair:
'''
Return the `piker.accounting.MktPair` info struct from a given
backend broker tradable src/dst asset pair.
Return MktPair info from broker including src and dst assets.
'''
async with open_cached_client(brokermod.name) as client:

View File

@ -25,6 +25,7 @@ from .api import (
get_client,
)
from .feed import (
get_mkt_info,
open_history_client,
open_symbol_search,
stream_quotes,
@ -34,15 +35,20 @@ from .feed import (
# open_trade_dialog,
# norm_trade_records,
# )
from .venues import (
OptionPair,
)
log = get_logger(__name__)
__all__ = [
'get_client',
# 'trades_dialogue',
'get_mkt_info',
'open_history_client',
'open_symbol_search',
'stream_quotes',
'OptionPair',
# 'norm_trade_records',
]

File diff suppressed because it is too large

View File

@ -18,38 +18,59 @@
Deribit backend.
'''
from __future__ import annotations
from contextlib import asynccontextmanager as acm
from datetime import datetime
from typing import Any, Optional, Callable
from typing import (
# Any,
# Optional,
Callable,
)
# from pprint import pformat
import time
import cryptofeed
import trio
from trio_typing import TaskStatus
import pendulum
from rapidfuzz import process as fuzzy
from pendulum import (
from_timestamp,
)
import numpy as np
import tractor
from piker.brokers import open_cached_client
from piker.log import get_logger, get_console_log
from piker.data import ShmArray
from piker.brokers._util import (
BrokerError,
from piker.accounting import (
Asset,
MktPair,
unpack_fqme,
)
from piker.brokers import (
open_cached_client,
NoData,
DataUnavailable,
)
from cryptofeed import FeedHandler
from cryptofeed.defines import (
DERIBIT, L1_BOOK, TRADES, OPTION, CALL, PUT
from piker._cacheables import (
async_lifo_cache,
)
from cryptofeed.symbols import Symbol
from piker.log import (
get_logger,
mk_repr,
)
from piker.data.validate import FeedInit
from .api import (
Client, Trade,
get_config,
str_to_cb_sym, piker_sym_to_cb_sym, cb_sym_to_deribit_inst,
Client,
# get_config,
piker_sym_to_cb_sym,
cb_sym_to_deribit_inst,
str_to_cb_sym,
maybe_open_price_feed
)
from .venues import (
Pair,
OptionPair,
Trade,
)
_spawn_kwargs = {
'infect_asyncio': True,
@ -64,90 +85,215 @@ async def open_history_client(
mkt: MktPair,
) -> tuple[Callable, int]:
instrument: str = mkt.bs_fqme
# TODO implement history getter for the new storage layer.
async with open_cached_client('deribit') as client:
pair: OptionPair = client._pairs[mkt.dst.name]
# XXX NOTE, the cuckers use ms !!!
creation_time_s: int = pair.creation_timestamp/1000
async def get_ohlc(
end_dt: Optional[datetime] = None,
start_dt: Optional[datetime] = None,
timeframe: float,
end_dt: datetime | None = None,
start_dt: datetime | None = None,
) -> tuple[
np.ndarray,
datetime, # start
datetime, # end
]:
if timeframe != 60:
raise DataUnavailable('Only 1m bars are supported')
array = await client.bars(
instrument,
array: np.ndarray = await client.bars(
mkt,
start_dt=start_dt,
end_dt=end_dt,
)
if len(array) == 0:
raise DataUnavailable
if (
end_dt is None
):
raise DataUnavailable(
'No history seems to exist yet?\n\n'
f'{mkt}'
)
elif (
end_dt
and
end_dt.timestamp() < creation_time_s
):
# the contract can't have history
# before it was created.
pair_type_str: str = type(pair).__name__
create_dt: datetime = from_timestamp(creation_time_s)
raise DataUnavailable(
f'No history prior to\n'
f'`{pair_type_str}.creation_timestamp: int = '
f'{pair.creation_timestamp}\n\n'
f'------ deribit sux ------\n'
f'WHICH IN "NORMAL PEOPLE WHO USE EPOCH TIME" form is,\n'
f'creation_time_s: {creation_time_s}\n'
f'create_dt: {create_dt}\n'
)
raise NoData(
f'No frame for {start_dt} -> {end_dt}\n'
)
start_dt = pendulum.from_timestamp(array[0]['time'])
end_dt = pendulum.from_timestamp(array[-1]['time'])
start_dt = from_timestamp(array[0]['time'])
end_dt = from_timestamp(array[-1]['time'])
times = array['time']
if not times.any():
raise ValueError(
'Bad frame with null-times?\n\n'
f'{times}'
)
if end_dt is None:
inow: int = round(time.time())
if (inow - times[-1]) > 60:
await tractor.pause()
return array, start_dt, end_dt
yield get_ohlc, {'erlangs': 3, 'rate': 3}
yield (
get_ohlc,
{ # backfill config
'erlangs': 3,
'rate': 3,
}
)
@async_lifo_cache()
async def get_mkt_info(
fqme: str,
) -> tuple[MktPair, Pair|OptionPair] | None:
# uppercase since kraken bs_mktid is always upper
if 'deribit' not in fqme.lower():
fqme += '.deribit'
mkt_mode: str = ''
broker, mkt_ep, venue, expiry = unpack_fqme(fqme)
# NOTE: we always upper case all tokens to be consistent with
# binance's symbology style for pairs, like `BTCUSDT`, but in
# theory we could also just keep things lower case; as long as
# we're consistent and the symcache matches whatever this func
# returns, always!
expiry: str = expiry.upper()
venue: str = venue.upper()
# venue_lower: str = venue.lower()
mkt_mode: str = 'option'
async with open_cached_client(
'deribit',
) as client:
assets: dict[str, Asset] = await client.get_assets()
pair_str: str = mkt_ep.lower()
pair: Pair = await client.exch_info(
sym=pair_str,
)
mkt_mode = pair.venue
client.mkt_mode = mkt_mode
dst: Asset | None = assets.get(pair.bs_dst_asset)
src: Asset | None = assets.get(pair.bs_src_asset)
mkt = MktPair(
dst=dst,
src=src,
price_tick=pair.price_tick,
size_tick=pair.size_tick,
bs_mktid=pair.symbol,
venue=mkt_mode,
broker='deribit',
_atype=mkt_mode,
_fqme_without_src=True,
# expiry=pair.expiry,
# XXX TODO, currently we don't use it since it's
# already "described" in the `OptionPair.symbol: str`
# and if we slap in the ISO repr it's kinda hideous..
# -[ ] figure out the best either std
)
return mkt, pair
async def stream_quotes(
send_chan: trio.abc.SendChannel,
symbols: list[str],
feed_is_live: trio.Event,
loglevel: str = None,
# startup sync
task_status: TaskStatus[tuple[dict, dict]] = trio.TASK_STATUS_IGNORED,
) -> None:
# XXX: required to propagate ``tractor`` loglevel to piker logging
get_console_log(loglevel or tractor.current_actor().loglevel)
'''
Open a live quote stream for the market set defined by `symbols`.
sym = symbols[0]
Internally this starts a `cryptofeed.FeedHandler` inside an `asyncio`-side
task and relays the L1 and `Trade` msgs from it through to our `trio.Task`.
'''
sym = symbols[0].split('.')[0]
init_msgs: list[FeedInit] = []
# multiline nested `dict` formatter (since rn quote-msgs are
# just that).
pfmt: Callable[[str], str] = mk_repr(
# so we can see `deribit`'s delightfully mega-long bs fields..
maxstring=100,
)
async with (
open_cached_client('deribit') as client,
send_chan as send_chan
):
mkt: MktPair
pair: Pair
mkt, pair = await get_mkt_info(sym)
init_msgs = {
# pass back token, and bool, signalling if we're the writer
# and that history has been written
sym: {
'symbol_info': {
'asset_type': 'option',
'price_tick_size': 0.0005
},
'shm_write_opts': {'sum_tick_vml': False},
'fqsn': sym,
},
}
# build out init msgs according to latest spec
init_msgs.append(
FeedInit(
mkt_info=mkt,
)
)
# build `cryptofeed` feed-handle
cf_sym: cryptofeed.Symbol = piker_sym_to_cb_sym(sym)
nsym = piker_sym_to_cb_sym(sym)
from_cf: tractor.to_asyncio.LinkedTaskChannel
async with maybe_open_price_feed(sym) as from_cf:
async with maybe_open_price_feed(sym) as stream:
# load the "last trades" summary
last_trades_res: cryptofeed.LastTradesResult = await client.last_trades(
cb_sym_to_deribit_inst(cf_sym),
count=1,
)
last_trades: list[Trade] = last_trades_res.trades
cache = await client.cache_symbols()
# TODO, do we even need this or will the above always
# work?
# if not last_trades:
# await tractor.pause()
# async for typ, quote in from_cf:
# if typ == 'trade':
# last_trade = Trade(**(quote['data']))
# break
last_trades = (await client.last_trades(
cb_sym_to_deribit_inst(nsym), count=1)).trades
# else:
last_trade = Trade(
**(last_trades[0])
)
if len(last_trades) == 0:
last_trade = None
async for typ, quote in stream:
if typ == 'trade':
last_trade = Trade(**(quote['data']))
break
else:
last_trade = Trade(**(last_trades[0]))
first_quote = {
first_quote: dict = {
'symbol': sym,
'last': last_trade.price,
'brokerd_ts': last_trade.timestamp,
@ -158,13 +304,84 @@ async def stream_quotes(
'broker_ts': last_trade.timestamp
}]
}
task_status.started((init_msgs, first_quote))
task_status.started((
init_msgs,
first_quote,
))
feed_is_live.set()
async for typ, quote in stream:
topic = quote['symbol']
await send_chan.send({topic: quote})
# NOTE XXX, static for now!
# => since this only handles ONE mkt feed at a time we
# don't need a lookup table to map interleaved quotes
# from multiple possible mkt-pairs
topic: str = mkt.bs_fqme
# deliver until cancelled
async for typ, ref in from_cf:
match typ:
case 'trade':
trade: cryptofeed.types.Trade = ref
# TODO, re-impl this according to the ideal
# fqme for opts that we choose!!
bs_fqme: str = cb_sym_to_deribit_inst(
str_to_cb_sym(trade.symbol)
).lower()
piker_quote: dict = {
'symbol': bs_fqme,
'last': trade.price,
'broker_ts': time.time(),
# ^TODO, name this `brokerd/datad_ts` and
# use `time.time_ns()` ??
'ticks': [{
'type': 'trade',
'price': float(trade.price),
'size': float(trade.amount),
'broker_ts': trade.timestamp,
}],
}
log.info(
f'deribit {typ!r} quote for {sym!r}\n\n'
f'{trade}\n\n'
f'{pfmt(piker_quote)}\n'
)
case 'l1':
book: cryptofeed.types.L1Book = ref
# TODO, so this is where we can possibly change things
# and instead lever the `MktPair.bs_fqme: str` output?
bs_fqme: str = cb_sym_to_deribit_inst(
str_to_cb_sym(book.symbol)
).lower()
piker_quote: dict = {
'symbol': bs_fqme,
'ticks': [
{'type': 'bid',
'price': float(book.bid_price),
'size': float(book.bid_size)},
{'type': 'bsize',
'price': float(book.bid_price),
'size': float(book.bid_size),},
{'type': 'ask',
'price': float(book.ask_price),
'size': float(book.ask_size),},
{'type': 'asize',
'price': float(book.ask_price),
'size': float(book.ask_size),}
]
}
await send_chan.send({
topic: piker_quote,
})
@tractor.context
@ -174,12 +391,21 @@ async def open_symbol_search(
async with open_cached_client('deribit') as client:
# load all symbols locally for fast search
cache = await client.cache_symbols()
# cache = client._pairs
await ctx.started()
async with ctx.open_stream() as stream:
pattern: str
async for pattern in stream:
# repack in dict form
await stream.send(
await client.search_symbols(pattern))
# NOTE: pattern fuzzy-matching is done within
# the method impl.
pairs: dict[str, Pair] = await client.search_symbols(
pattern,
)
# repack in fqme-keyed table
byfqme: dict[str, Pair] = {}
for pair in pairs.values():
byfqme[pair.bs_fqme] = pair
await stream.send(byfqme)

View File

@ -0,0 +1,196 @@
# piker: trading gear for hackers
# Copyright (C) Tyler Goodlet (in stewardship for pikers)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""
Per-market data-type definitions and schema types.
"""
from __future__ import annotations
import pendulum
from typing import (
Literal,
Optional,
)
from decimal import Decimal
from piker.types import Struct
# API endpoint paths by venue / sub-API
_domain: str = 'deribit.com'
_url = f'https://www.{_domain}'
# WEBsocketz
_ws_url: str = f'wss://www.{_domain}/ws/api/v2'
# test nets
_testnet_ws_url: str = f'wss://test.{_domain}/ws/api/v2'
MarketType = Literal[
'option'
]
def get_api_eps(venue: MarketType) -> tuple[str, str]:
'''
Return API ep root paths per venue.
'''
return {
'option': (
_ws_url,
),
}[venue]
class Pair(Struct, frozen=True, kw_only=True):
symbol: str
# src
quote_currency: str # 'BTC'
# dst
base_currency: str # "BTC",
tick_size: float # 0.0001
tick_size_steps: list[dict[str, float]] # [{'above_price': 0.005, 'tick_size': 0.0005}]
@property
def price_tick(self) -> Decimal:
return Decimal(str(self.tick_size_steps[0]['above_price']))
@property
def size_tick(self) -> Decimal:
return Decimal(str(self.tick_size))
@property
def bs_fqme(self) -> str:
return f'{self.symbol}'
@property
def bs_mktid(self) -> str:
return f'{self.symbol}.{self.venue}'
class OptionPair(Pair, frozen=True):
taker_commission: float # 0.0003
strike: float # 5000.0
settlement_period: str # 'day'
settlement_currency: str # "BTC",
rfq: bool # false
price_index: str # 'btc_usd'
option_type: str # 'call'
min_trade_amount: float # 0.1
maker_commission: float # 0.0003
kind: str # 'option'
is_active: bool # true
instrument_type: str # 'reversed'
instrument_name: str # 'BTC-1SEP24-55000-C'
instrument_id: int # 364671
expiration_timestamp: int # 1725177600000
creation_timestamp: int # 1724918461000
counter_currency: str # 'USD'
contract_size: float # '1.0'
block_trade_tick_size: float # '0.0001'
block_trade_min_trade_amount: int # '25'
block_trade_commission: float # '0.003'
# NOTE: see `.data._symcache.SymbologyCache.load()` for why
ns_path: str = 'piker.brokers.deribit:OptionPair'
# TODO, impl this without the MM:SS part of
# the `'THH:MM:SS..'` etc..
@property
def expiry(self) -> str:
iso_date = pendulum.from_timestamp(
self.expiration_timestamp / 1000
).isoformat()
return iso_date
@property
def venue(self) -> str:
return f'{self.instrument_type}_option'
@property
def bs_fqme(self) -> str:
return f'{self.symbol}'
@property
def bs_src_asset(self) -> str:
return f'{self.quote_currency}'
@property
def bs_dst_asset(self) -> str:
return f'{self.symbol}'
PAIRTYPES: dict[MarketType, Pair] = {
'option': OptionPair,
}
class JSONRPCResult(Struct):
id: int
usIn: int
usOut: int
usDiff: int
testnet: bool
jsonrpc: str = '2.0'
error: Optional[dict] = None
result: Optional[list[dict]] = None
class JSONRPCChannel(Struct):
method: str
params: dict
jsonrpc: str = '2.0'
class KLinesResult(Struct):
low: list[float]
cost: list[float]
high: list[float]
open: list[float]
close: list[float]
ticks: list[int]
status: str
volume: list[float]
class Trade(Struct):
iv: float
price: float
amount: float
trade_id: str
contracts: float
direction: str
trade_seq: int
timestamp: int
mark_price: float
index_price: float
tick_direction: int
instrument_name: str
combo_id: Optional[str] = ''
combo_trade_id: Optional[int] = 0
block_trade_id: Optional[str] = ''
block_trade_leg_count: Optional[int] = 0
class LastTradesResult(Struct):
trades: list[Trade]
has_more: bool
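As a quick aside on the tick-size properties above: passing the float
through `str()` before `Decimal()` avoids inheriting binary-float
representation error. A self-contained sketch (plain class instead of
`piker.types.Struct`, just for illustration):
```python
from decimal import Decimal

class PairSketch:
    tick_size: float = 0.0001

    @property
    def size_tick(self) -> Decimal:
        # str() first => an exact Decimal('0.0001');
        # Decimal(0.0001) would carry the float error instead.
        return Decimal(str(self.tick_size))

assert PairSketch().size_tick == Decimal('0.0001')
assert Decimal(0.0001) != Decimal('0.0001')
```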

View File

@ -38,6 +38,7 @@ from piker.brokers._util import get_logger
if TYPE_CHECKING:
from .api import Client
from ib_insync import IB
import i3ipc
log = get_logger('piker.brokers.ib')
@ -61,7 +62,7 @@ no_setup_msg:str = (
def try_xdo_manual(
client: Client,
vnc_sockaddr: str,
):
'''
Do the "manual" `xdo`-based screen switch + click
@ -78,7 +79,6 @@ def try_xdo_manual(
_reset_tech = 'i3ipc_xdotool'
return True
except OSError:
vnc_sockaddr: str = client.conf.vnc_addrs
log.exception(
no_setup_msg.format(vnc_sockaddr=vnc_sockaddr)
)
@ -86,6 +86,7 @@ def try_xdo_manual(
async def data_reset_hack(
# vnc_host: str,
client: Client,
reset_type: Literal['data', 'connection'],
@ -117,24 +118,36 @@ async def data_reset_hack(
that need to be wrangled.
'''
ib_client: IB = client.ib
# look up any user defined vnc socket address mapped from
# a particular API socket port.
vnc_addrs: tuple[str]|None = client.conf.get('vnc_addrs')
if not vnc_addrs:
api_port: str = str(ib_client.client.port)
vnc_host: str
vnc_port: int
vnc_sockaddr: tuple[str] | None = client.conf.get('vnc_addrs')
if not vnc_sockaddr:
log.warning(
no_setup_msg.format(vnc_sockaddr=client.conf)
no_setup_msg.format(vnc_sockaddr=vnc_sockaddr)
+
'REQUIRES A `vnc_addrs: array` ENTRY'
)
vnc_host, vnc_port = vnc_sockaddr.get(
api_port,
('localhost', 3003)
)
global _reset_tech
match _reset_tech:
case 'vnc':
try:
await tractor.to_asyncio.run_task(
partial(
vnc_click_hack,
client=client,
host=vnc_host,
port=vnc_port,
)
)
except (
@ -145,31 +158,29 @@ async def data_reset_hack(
import i3ipc # noqa (since a deps dynamic check)
except ModuleNotFoundError:
log.warning(
no_setup_msg.format(vnc_sockaddr=client.conf)
no_setup_msg.format(vnc_sockaddr=vnc_sockaddr)
)
return False
# XXX, Xorg only workaround..
# TODO? remove now that we have `pyvnc`?
# if vnc_host not in {
# 'localhost',
# '127.0.0.1',
# }:
# focussed, matches = i3ipc_fin_wins_titled()
# if not matches:
# log.warning(
# no_setup_msg.format(vnc_sockaddr=vnc_sockaddr)
# )
# return False
# else:
# try_xdo_manual(vnc_sockaddr)
if vnc_host not in {
'localhost',
'127.0.0.1',
}:
focussed, matches = i3ipc_fin_wins_titled()
if not matches:
log.warning(
no_setup_msg.format(vnc_sockaddr=vnc_sockaddr)
)
return False
else:
try_xdo_manual(vnc_sockaddr)
# localhost but no vnc-client or it borked..
else:
try_xdo_manual(client)
try_xdo_manual(vnc_sockaddr)
case 'i3ipc_xdotool':
try_xdo_manual(client)
try_xdo_manual(vnc_sockaddr)
# i3ipc_xdotool_manual_click_hack()
case _ as tech:
@ -180,55 +191,15 @@ async def data_reset_hack(
async def vnc_click_hack(
client: Client,
reset_type: str = 'data',
pw: str|None = None,
host: str,
port: int,
reset_type: str = 'data'
) -> None:
'''
Reset the data or network connection for the VNC attached
ib-gateway using a (magic) keybinding combo.
A vnc-server password can be set either via the input `pw` param or
in the client's config, with the latter loaded from the user's
`brokers.toml` in a vnc-addrs port-mapping section,
.. code:: toml
[ib.vnc_addrs]
4002 = {host = 'localhost', port = 5900, pw = 'doggy'}
'''
api_port: str = str(client.ib.client.port)
conf: dict = client.conf
vnc_addrs: dict[int, tuple] = conf.get('vnc_addrs')
if not vnc_addrs:
return None
addr_entry: dict|tuple = vnc_addrs.get(
api_port,
('localhost', 5900) # a typical default
)
if pw is None:
match addr_entry:
case (
host,
port,
):
pass
case {
'host': host,
'port': port,
'pw': pw
}:
pass
case _:
raise ValueError(
f'Invalid `ib.vnc_addrs` entry ?\n'
f'{addr_entry!r}\n'
)
try:
from pyvnc import (
AsyncVNCClient,
@ -255,7 +226,7 @@ async def vnc_click_hack(
VNCConfig(
host=host,
port=port,
password=pw,
password='doggy',
)
)
async with client:

View File

@ -97,6 +97,10 @@ from ._util import (
get_logger,
)
# ?TODO? this can now be removed since it was originally to extend
# with a `bar_vwap` field that we removed from the default ohlcv
# dtype since it's better calculated in an FSP func
#
_bar_load_dtype: list[tuple[str, type]] = [
# NOTE XXX: only part that's diff
# from our default fields where
@ -944,7 +948,6 @@ class Client:
)
if tkr:
break
except TimeoutError as err:
timeouterr = err
await asyncio.sleep(0.01)
@ -953,9 +956,7 @@ class Client:
else:
if not warnset:
log.warning(
f'Quote req timed out..\n'
f'Maybe the venue is closed?\n'
f'\n'
f'Quote req timed out..maybe venue is closed?\n'
f'{asdict(contract)}'
)
warnset = True
@ -967,11 +968,9 @@ class Client:
)
break
else:
if (
timeouterr
and
raise_on_timeout
):
if timeouterr and raise_on_timeout:
import pdbp
pdbp.set_trace()
raise timeouterr
if not warnset:

View File

@ -117,11 +117,7 @@ def pack_position(
symbol=fqme,
currency=con.currency,
size=float(pos.position),
avg_price=(
float(pos.avgCost)
/
float(con.multiplier or 1.0)
),
avg_price=float(pos.avgCost) / float(con.multiplier or 1.0),
),
)
@ -567,7 +563,7 @@ async def open_trade_dialog(
ledgers: dict[str, TransactionLedger] = {}
tables: dict[str, Account] = {}
order_msgs: list[Status] = []
conf: dict = get_config()
conf = get_config()
accounts_def_inv: bidict[str, str] = bidict(
conf['accounts']
).inverse

View File

@ -613,7 +613,7 @@ async def get_bars(
data_cs.cancel()
# spawn new data reset task
data_cs, reset_done = await tn.start(
data_cs, reset_done = await nurse.start(
partial(
wait_on_data_reset,
proxy,
@ -635,12 +635,12 @@ async def get_bars(
unset_resetter: bool = False
async with (
tractor.trionics.collapse_eg(),
trio.open_nursery() as tn
trio.open_nursery() as nurse
):
# start history request that we allow
# to run indefinitely until a result is acquired
tn.start_soon(query)
nurse.start_soon(query)
# start history reset loop which waits up to the timeout
# for a result before triggering a data feed reset.
@ -660,7 +660,7 @@ async def get_bars(
unset_resetter: bool = True
# spawn new data reset task
data_cs, reset_done = await tn.start(
data_cs, reset_done = await nurse.start(
partial(
wait_on_data_reset,
proxy,
@ -896,10 +896,7 @@ async def open_aio_quote_stream(
symbol: str,
contract: Contract|None = None,
) -> (
trio.abc.Channel| # iface
tractor.to_asyncio.LinkedTaskChannel # actually
):
) -> trio.abc.ReceiveStream:
'''
Open a real-time `Ticker` quote stream from an `asyncio.Task`
spawned via `tractor.to_asyncio.open_channel_from()`, deliver the
@ -922,7 +919,6 @@ async def open_aio_quote_stream(
yield from_aio
return
from_aio: tractor.to_asyncio.LinkedTaskChannel
async with tractor.to_asyncio.open_channel_from(
_setup_quote_stream,
symbol=symbol,
@ -1083,8 +1079,7 @@ async def stream_quotes(
con: Contract = details.contract
first_ticker: Ticker|None = None
timeout: float = 1.6
with trio.move_on_after(timeout) as quote_cs:
with trio.move_on_after(1.6) as quote_cs:
first_ticker: Ticker = await proxy.get_quote(
contract=con,
raise_on_timeout=False,
@ -1093,9 +1088,7 @@ async def stream_quotes(
# XXX should never happen with this ep right?
# but if so then, more than likely the mkt is closed?
if quote_cs.cancelled_caught:
log.warning(
f'First quote req timed out after {timeout!r}s'
)
await tractor.pause()
if first_ticker:
first_quote: dict = normalize(first_ticker)
@ -1168,7 +1161,6 @@ async def stream_quotes(
)
cs: trio.CancelScope|None = None
startup: bool = True
iter_quotes: trio.abc.Channel
while (
startup
or
@ -1177,11 +1169,11 @@ async def stream_quotes(
with trio.CancelScope() as cs:
async with (
tractor.trionics.collapse_eg(),
trio.open_nursery() as tn,
trio.open_nursery() as nurse,
open_aio_quote_stream(
symbol=sym,
contract=con,
) as iter_quotes,
) as stream,
):
# ?TODO? can we rm this - particularly for `ib_async`?
# ugh, clear ticks since we've consumed them
@ -1210,9 +1202,9 @@ async def stream_quotes(
await rt_ev.wait()
cs.cancel() # cancel called should now be set
tn.start_soon(reset_on_feed)
nurse.start_soon(reset_on_feed)
async with aclosing(iter_quotes):
async with aclosing(stream):
# if syminfo.get('no_vlm', False):
if not init_msg.shm_write_opts['has_vlm']:
@ -1227,21 +1219,19 @@ async def stream_quotes(
# wait for real volume on feed (trading might be
# closed)
while True:
ticker = await iter_quotes.receive()
ticker = await stream.receive()
# for a real volume contract we wait for
# the first "real" trade to take place
if (
# not calc_price
# and not ticker.rtTime
False
# not ticker.rtTime
not ticker.rtTime
):
# spin consuming tickers until we
# get a real market datum
log.debug(f"New unsent ticker: {ticker}")
continue
else:
log.debug("Received first volume tick")
# ugh, clear ticks since we've
@ -1257,18 +1247,13 @@ async def stream_quotes(
log.debug(f"First ticker received {quote}")
# tell data-layer spawner-caller that live
# quotes are now active despite not having
# necessarily received a first vlm/clearing
# tick.
ticker = await iter_quotes.receive()
# quotes are now streaming.
feed_is_live.set()
fqme: str = quote['fqme']
await send_chan.send({fqme: quote})
# last = time.time()
async for ticker in iter_quotes:
async for ticker in stream:
quote = normalize(ticker)
fqme: str = quote['fqme']
fqme = quote['fqme']
log.debug(
f'Sending quote\n'
f'{quote}'

View File

@ -549,7 +549,7 @@ async def open_trade_dialog(
# to be reloaded.
balances: dict[str, float] = await client.get_balances()
await verify_balances(
verify_balances(
acnt,
src_fiat,
balances,

View File

@ -37,12 +37,6 @@ import tractor
from async_generator import asynccontextmanager
import numpy as np
import wrapt
# TODO, port to `httpx`/`trio-websocket` whenever i get back to
# writing a proper ws-api streamer for this backend (since the data
# feeds are free now) as per GH feat-req:
# https://github.com/pikers/piker/issues/509
#
import asks
from ..calc import humanize, percent_change

View File

@ -171,6 +171,7 @@ class OrderClient(Struct):
async def relay_orders_from_sync_code(
client: OrderClient,
symbol_key: str,
to_ems_stream: tractor.MsgStream,
@ -244,11 +245,6 @@ async def open_ems(
async with maybe_open_emsd(
broker,
# XXX NOTE, LOL so this determines the daemon `emsd` loglevel
# then FYI.. that's kinda wrong no?
# -[ ] shouldn't it be set by `pikerd -l` or no?
# -[ ] would make a lot more sense to have a subsys ctl for
# levels.. like `-l emsd.info` or something?
loglevel=loglevel,
) as portal:

View File

@ -655,11 +655,7 @@ class Router(Struct):
flume = feed.flumes[fqme]
first_quote: dict = flume.first_quote
book: DarkBook = self.get_dark_book(broker)
if not (last := first_quote.get('last')):
last: float = flume.rt_shm.array[-1]['close']
book.lasts[fqme]: float = float(last)
book.lasts[fqme]: float = float(first_quote['last'])
async with self.maybe_open_brokerd_dialog(
brokermod=brokermod,
@ -722,7 +718,7 @@ class Router(Struct):
subs = self.subscribers[sub_key]
sent_some: bool = False
for client_stream in subs.copy():
for client_stream in subs:
try:
await client_stream.send(msg)
sent_some = True
@ -1018,14 +1014,10 @@ async def translate_and_relay_brokerd_events(
status_msg.brokerd_msg = msg
status_msg.src = msg.broker_details['name']
if not status_msg.req:
# likely some order change state?
await tractor.pause()
else:
await router.client_broadcast(
status_msg.req.symbol,
status_msg,
)
await router.client_broadcast(
status_msg.req.symbol,
status_msg,
)
if status == 'closed':
log.info(

View File

@ -297,8 +297,6 @@ class PaperBoi(Struct):
# transmit pp msg to ems
pp: Position = self.acnt.pps[bs_mktid]
# TODO, this will break if `only_require=True` was passed to
# `.update_from_ledger()`
pp_msg = BrokerdPosition(
broker=self.broker,
@ -655,7 +653,6 @@ async def open_trade_dialog(
# in) use manually constructed table from calling
# the `.get_mkt_info()` provider EP above.
_mktmap_table=mkt_by_fqme,
only_require=list(mkt_by_fqme),
)
pp_msgs: list[BrokerdPosition] = []

View File

@ -30,7 +30,6 @@ subsys: str = 'piker.clearing'
log = get_logger(subsys)
# TODO, oof doesn't this ignore the `loglevel` then???
get_console_log = partial(
get_console_log,
name=subsys,

View File

@ -183,8 +183,8 @@ def pikerd(
registry_addrs=regaddrs,
loglevel=loglevel,
debug_mode=pdb,
# enable_transports=['uds'],
enable_transports=['tcp'],
enable_transports=['uds'],
# enable_transports=['tcp'],
) as service_mngr,
):
assert service_mngr

View File

@ -41,13 +41,10 @@ from .log import get_logger
log = get_logger('broker-config')
# XXX NOTE: taken from `click`
# |_https://github.com/pallets/click/blob/main/src/click/utils.py#L449
#
# (since apparently they have some super weirdness with SIGINT and
# sudo.. no clue; we're probably going to slowly just modify it to our
# own version over time..)
#
# XXX NOTE: taken from ``click`` since apparently they have some
# super weirdness with sigint and sudo..no clue
# we're probably going to slowly just modify it to our own version over
# time..
def get_app_dir(
app_name: str,
roaming: bool = True,
@ -264,7 +261,7 @@ def load(
MutableMapping,
] = tomllib.loads,
touch_if_dne: bool = True,
touch_if_dne: bool = False,
**tomlkws,
@ -273,7 +270,7 @@ def load(
Load config file by name.
If desired config is not in the top level piker-user config path then
pass the `path: Path` explicitly.
pass the ``path: Path`` explicitly.
'''
# create the $HOME/.config/piker dir if dne
@ -288,8 +285,7 @@ def load(
if (
not path.is_file()
and
touch_if_dne
and touch_if_dne
):
# only do a template if no path provided,
# just touch an empty file with same name.

View File

@ -95,12 +95,6 @@ class Sampler:
# history loading.
incr_task_cs: trio.CancelScope | None = None
bcast_errors: tuple[Exception] = (
trio.BrokenResourceError,
trio.ClosedResourceError,
trio.EndOfChannel,
)
# holds all the ``tractor.Context`` remote subscriptions for
# a particular sample period increment event: all subscribers are
# notified on a step.
@ -264,15 +258,14 @@ class Sampler:
subs: set
last_ts, subs = pair
# NOTE, for debugging pub-sub issues
# task = trio.lowlevel.current_task()
# log.debug(
# f'AlL-SUBS@{period_s!r}: {self.subscribers}\n'
# f'PAIR: {pair}\n'
# f'TASK: {task}: {id(task)}\n'
# f'broadcasting {period_s} -> {last_ts}\n'
# f'consumers: {subs}'
# )
task = trio.lowlevel.current_task()
log.debug(
f'SUBS {self.subscribers}\n'
f'PAIR {pair}\n'
f'TASK: {task}: {id(task)}\n'
f'broadcasting {period_s} -> {last_ts}\n'
# f'consumers: {subs}'
)
borked: set[MsgStream] = set()
sent: set[MsgStream] = set()
while True:
@ -289,11 +282,13 @@ class Sampler:
await stream.send(msg)
sent.add(stream)
except self.bcast_errors as err:
except (
trio.BrokenResourceError,
trio.ClosedResourceError,
trio.EndOfChannel,
):
log.error(
f'Connection dropped for IPC ctx\n'
f'{stream._ctx}\n\n'
f'Due to {type(err)}'
f'{stream._ctx.chan.uid} dropped connection'
)
borked.add(stream)
else:
@ -400,8 +395,7 @@ async def register_with_sampler(
finally:
if (
sub_for_broadcasts
and
subs
and subs
):
try:
subs.remove(stream)
@ -568,7 +562,8 @@ async def open_sample_stream(
async def sample_and_broadcast(
bus: _FeedsBus,
bus: _FeedsBus, # noqa
rt_shm: ShmArray,
hist_shm: ShmArray,
quote_stream: trio.abc.ReceiveChannel,
@ -588,33 +583,11 @@ async def sample_and_broadcast(
overruns = Counter()
# NOTE, only used for debugging live-data-feed issues, though
# this should be resolved more correctly in the future using the
# new typed-msgspec feats of `tractor`!
#
# XXX, a multiline nested `dict` formatter (since rn quote-msgs
# are just that).
# pfmt: Callable[[str], str] = mk_repr()
# iterate stream delivered by broker
async for quotes in quote_stream:
# print(quotes)
# XXX WARNING XXX only enable for debugging bc ow can cost
# ALOT of perf with HF-feedz!!!
#
# log.info(
# 'Rx live quotes:\n'
# f'{pfmt(quotes)}'
# )
# TODO,
# -[ ] `numba` or `cython`-nize this loop possibly?
# |_alternatively could we do it in rust somehow by unpacking
# arrow msgs instead of using `msgspec`?
# -[ ] use `msgspec.Struct` support in new typed-msging from
# `tractor` to ensure only allowed msgs are transmitted?
#
# TODO: ``numba`` this!
for broker_symbol, quote in quotes.items():
# TODO: in theory you can send the IPC msg *before* writing
# to the sharedmem array to decrease latency, however, that
@ -687,21 +660,6 @@ async def sample_and_broadcast(
sub_key: str = broker_symbol.lower()
subs: set[Sub] = bus.get_subs(sub_key)
# TODO, figure out how to make this useful whilst
# incorporating feed "pausing" ..
#
# if not subs:
# all_bs_fqmes: list[str] = list(
# bus._subscribers.keys()
# )
# log.warning(
# f'No subscribers for {brokername!r} live-quote ??\n'
# f'broker_symbol: {broker_symbol}\n\n'
# f'Maybe the backend-sys symbol does not match one of,\n'
# f'{pfmt(all_bs_fqmes)}\n'
# )
# NOTE: by default the broker backend doesn't append
# it's own "name" into the fqme schema (but maybe it
# should?) so we have to manually generate the correct
@ -771,14 +729,18 @@ async def sample_and_broadcast(
if lags > 10:
await tractor.pause()
except Sampler.bcast_errors as ipc_err:
except (
trio.BrokenResourceError,
trio.ClosedResourceError,
trio.EndOfChannel,
):
ctx: Context = ipc._ctx
chan: Channel = ctx.chan
if ctx:
log.warning(
f'Dropped `brokerd`-feed for {broker_symbol!r} due to,\n'
f'x>) {ctx.cid}@{chan.uid}'
f'|_{ipc_err!r}\n\n'
'Dropped `brokerd`-quotes-feed connection:\n'
f'{broker_symbol}:'
f'{ctx.cid}@{chan.uid}'
)
if sub.throttle_rate:
assert ipc._closed
@ -795,11 +757,12 @@ async def sample_and_broadcast(
async def uniform_rate_send(
rate: float,
quote_stream: trio.abc.ReceiveChannel,
stream: MsgStream,
task_status: TaskStatus[None] = trio.TASK_STATUS_IGNORED,
task_status: TaskStatus = trio.TASK_STATUS_IGNORED,
) -> None:
'''
@ -817,16 +780,13 @@ async def uniform_rate_send(
https://gist.github.com/njsmith/7ea44ec07e901cb78ebe1dd8dd846cb9
'''
# ?TODO? dynamically compute the **actual** approx overhead latency per cycle
# instead of this magic # bidinezz?
throttle_period: float = 1/rate - 0.000616
left_to_sleep: float = throttle_period
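# eg. with rate=60 (Hz): 1/60 - 0.000616 ~= 0.01605s per send cycle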
# TODO: compute the approx overhead latency per cycle
left_to_sleep = throttle_period = 1/rate - 0.000616
# send cycle state
first_quote: dict|None
first_quote = last_quote = None
last_send: float = time.time()
diff: float = 0
last_send = time.time()
diff = 0
task_status.started()
ticks_by_type: dict[
@ -837,28 +797,22 @@ async def uniform_rate_send(
clear_types = _tick_groups['clears']
while True:
# compute the remaining time to sleep for this throttled cycle
left_to_sleep: float = throttle_period - diff
left_to_sleep = throttle_period - diff
if left_to_sleep > 0:
cs: trio.CancelScope
with trio.move_on_after(left_to_sleep) as cs:
sym: str
last_quote: dict
try:
sym, last_quote = await quote_stream.receive()
except trio.EndOfChannel:
log.exception(
f'Live stream for feed ended?\n'
f'<=c\n'
f' |_{stream!r}\n'
)
log.exception(f"feed for {stream} ended?")
break
diff: float = time.time() - last_send
diff = time.time() - last_send
if not first_quote:
first_quote: float = last_quote
first_quote = last_quote
# first_quote['tbt'] = ticks_by_type
if (throttle_period - diff) > 0:
@ -919,12 +873,11 @@ async def uniform_rate_send(
# TODO: now if only we could sync this to the display
# rate timing exactly lul
try:
await stream.send({
sym: first_quote
})
await stream.send({sym: first_quote})
except tractor.RemoteActorError as rme:
if rme.type is not tractor._exceptions.StreamOverrun:
raise
ctx = stream._ctx
chan = ctx.chan
log.warning(
@ -932,28 +885,20 @@ async def uniform_rate_send(
f'{sym}:{ctx.cid}@{chan.uid}'
)
# NOTE: any of these can be raised by `tractor`'s IPC
# transport-layer and we want to be highly resilient
# to consumers which crash or lose network connection.
# I.e. we **DO NOT** want to crash and propagate up to
# ``pikerd`` these kinds of errors!
except (
# NOTE: any of these can be raised by ``tractor``'s IPC
# transport-layer and we want to be highly resilient
# to consumers which crash or lose network connection.
# I.e. we **DO NOT** want to crash and propagate up to
# ``pikerd`` these kinds of errors!
trio.ClosedResourceError,
trio.BrokenResourceError,
ConnectionResetError,
) + Sampler.bcast_errors as ipc_err:
match ipc_err:
case trio.EndOfChannel():
log.info(
f'{stream} terminated by peer,\n'
f'{ipc_err!r}'
)
case _:
# if the feed consumer goes down then drop
# out of this rate limiter
log.warning(
f'{stream} closed due to,\n'
f'{ipc_err!r}'
)
trio.EndOfChannel,
):
# if the feed consumer goes down then drop
# out of this rate limiter
log.warning(f'{stream} closed')
await stream.aclose()
return
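The timing core of `uniform_rate_send()` above, compressed into a standalone sketch; the overhead fudge constant is copied from the code, the `(sym, quote)` tuple shape matches `quote_stream`, and the `print()` stands in for `stream.send()`:

import time
import trio

async def throttled_relay(
    rate: float,  # Hz
    rx: trio.MemoryReceiveChannel,
) -> None:
    # per-cycle time budget minus a small (empirically measured)
    # per-iteration overhead fudge, as in the code above.
    throttle_period: float = 1/rate - 0.000616
    pending: dict|None = None
    last_send: float = time.time()
    diff: float = 0.
    while True:
        left_to_sleep: float = throttle_period - diff
        if left_to_sleep > 0:
            # sleep out the cycle but wake early on a fresh quote
            with trio.move_on_after(left_to_sleep):
                try:
                    _, pending = await rx.receive()
                except trio.EndOfChannel:
                    return
        diff = time.time() - last_send
        if pending and (throttle_period - diff) <= 0:
            # NOTE: the real loop merges ticks-by-type into the
            # pending quote here instead of just overwriting it.
            print(f'send: {pending}')  # stand-in for `stream.send()`
            pending = None
            last_send = time.time()
            diff = 0.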

View File

@@ -31,7 +31,6 @@ from pathlib import Path
from pprint import pformat
from typing import (
Any,
Callable,
Sequence,
Hashable,
TYPE_CHECKING,
@@ -57,7 +56,7 @@ from piker.brokers import (
)
if TYPE_CHECKING:
from piker.accounting import (
from ..accounting import (
Asset,
MktPair,
)
@@ -162,68 +161,57 @@ class SymbologyCache(Struct):
'Implement `Client.get_assets()`!'
)
get_mkt_pairs: Callable|None = getattr(
client,
'get_mkt_pairs',
None,
)
if not get_mkt_pairs:
if get_mkt_pairs := getattr(client, 'get_mkt_pairs', None):
pairs: dict[str, Struct] = await get_mkt_pairs()
for bs_fqme, pair in pairs.items():
# NOTE: every backend defined pair should
# declare its ns path for roundtrip
# serialization lookup.
if not getattr(pair, 'ns_path', None):
raise TypeError(
f'Pair-struct for {self.mod.name} MUST define a '
'`.ns_path: str`!\n'
f'{pair}'
)
entry = await self.mod.get_mkt_info(pair.bs_fqme)
if not entry:
continue
mkt: MktPair
pair: Struct
mkt, _pair = entry
assert _pair is pair, (
f'`{self.mod.name}` backend probably has a '
'keying-symmetry problem between the pair-`Struct` '
'returned from `Client.get_mkt_pairs()` and the '
'module level endpoint: `.get_mkt_info()`\n\n'
"Here's the struct diff:\n"
f'{_pair - pair}'
)
# NOTE XXX: this means backends MUST implement
# a `Struct.bs_mktid: str` field to provide
# a native-keyed map to their own symbol
# set(s).
self.pairs[pair.bs_mktid] = pair
# NOTE: `MktPair`s are keyed here using piker's
# internal FQME schema so that search,
# accounting and feed init can be accomplished
# on a sane, uniform, normalized basis.
self.mktmaps[mkt.fqme] = mkt
self.pair_ns_path: str = tractor.msg.NamespacePath.from_ref(
pair,
)
else:
log.warning(
f'No symbology cache `Pair` support for `{provider}`..\n'
'Implement `Client.get_mkt_pairs()`!'
)
return self
pairs: dict[str, Struct] = await get_mkt_pairs()
if not pairs:
log.warning(
f'No pairs from initial {provider!r} sym-cache request?\n\n'
f'`Client.get_mkt_pairs()` -> {pairs!r} ?'
)
return self
for bs_fqme, pair in pairs.items():
if not getattr(pair, 'ns_path', None):
# XXX: every backend defined pair must declare
# a `.ns_path: tractor.NamespacePath` to enable
# roundtrip serialization lookup from a local
# cache file.
raise TypeError(
f'Pair-struct for {self.mod.name} MUST define a '
'`.ns_path: str`!\n\n'
f'{pair!r}'
)
entry = await self.mod.get_mkt_info(pair.bs_fqme)
if not entry:
continue
mkt: MktPair
pair: Struct
mkt, _pair = entry
assert _pair is pair, (
f'`{self.mod.name}` backend probably has a '
'keying-symmetry problem between the pair-`Struct` '
'returned from `Client.get_mkt_pairs()` and the '
'module level endpoint: `.get_mkt_info()`\n\n'
"Here's the struct diff:\n"
f'{_pair - pair}'
)
# NOTE XXX: this means backends MUST implement
# a `Struct.bs_mktid: str` field to provide
# a native-keyed map to their own symbol
# set(s).
self.pairs[pair.bs_mktid] = pair
# NOTE: `MktPair`s are keyed here using piker's
# internal FQME schema so that search,
# accounting and feed init can be accomplished
# a sane, uniform, normalized basis.
self.mktmaps[mkt.fqme] = mkt
self.pair_ns_path: str = tractor.msg.NamespacePath.from_ref(
pair,
)
return self
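For backend authors, the contract this loader enforces boils down to a pair-struct like the following minimal (hypothetical) example; only the `bs_mktid`/`bs_fqme`/`ns_path` fields are required by the checks above, everything else is backend-specific:

from msgspec import Struct

class Pair(Struct, frozen=True):
    # native-keyed market id; used to key `SymbologyCache.pairs`
    bs_mktid: str
    # backend-scoped fqme handed to the module's `.get_mkt_info()`
    bs_fqme: str
    # ns-path enabling roundtrip (de)serialization from the local
    # symcache file; normally derived via `tractor.msg.NamespacePath`
    ns_path: str = 'piker.brokers.example:Pair'  # hypothetical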

View File

@@ -357,9 +357,7 @@ async def allocate_persistent_feed(
# yield back control to starting nursery once we receive either
# some history or a real-time quote.
log.info(
f'loading OHLCV history: {fqme!r}\n'
)
log.info(f'loading OHLCV history: {fqme}')
await some_data_ready.wait()
flume = Flume(
@@ -796,6 +794,7 @@ async def install_brokerd_search(
@acm
async def maybe_open_feed(
fqmes: list[str],
loglevel: str | None = None,
@@ -849,12 +848,13 @@ async def maybe_open_feed(
@acm
async def open_feed(
fqmes: list[str],
loglevel: str|None = None,
loglevel: str | None = None,
allow_overruns: bool = True,
start_stream: bool = True,
tick_throttle: float|None = None, # Hz
tick_throttle: float | None = None, # Hz
allow_remote_ctl_ui: bool = False,

View File

@@ -36,10 +36,10 @@ from ._sharedmem import (
ShmArray,
_Token,
)
from piker.accounting import MktPair
if TYPE_CHECKING:
from piker.data.feed import Feed
from ..accounting import MktPair
from .feed import Feed
class Flume(Struct):
@@ -82,7 +82,7 @@ class Flume(Struct):
# TODO: do we need this really if we can pull the `Portal` from
# ``tractor``'s internals?
feed: Feed|None = None
feed: Feed | None = None
@property
def rt_shm(self) -> ShmArray:

View File

@@ -113,9 +113,9 @@ def validate_backend(
)
if ep is None:
log.warning(
f'Provider backend {mod.name!r} is missing '
f'{daemon_name!r} support?\n'
f'|_module endpoint-func missing: {name!r}\n'
f'Provider backend {mod.name} is missing '
f'{daemon_name} support :(\n'
f'The following endpoint is missing: {name}'
)
inits: list[

View File

@@ -18,8 +18,8 @@
Log like a forester!
"""
import logging
import json
import reprlib
import json
from typing import (
Callable,
)
@@ -90,8 +90,6 @@ def colorize_json(
)
# TODO, eventually defer to the version in `modden` once
# it becomes a dep!
def mk_repr(
**repr_kws,
) -> Callable[[str], str]:

View File

@@ -138,6 +138,16 @@ class StorageClient(
) -> None:
...
async def write_oi(
self,
fqme: str,
oi: np.ndarray,
append_and_duplicate: bool = True,
limit: int = int(800e3),
) -> None:
...
class TimeseriesNotFound(Exception):
'''

View File

@@ -111,6 +111,24 @@ def mk_ohlcv_shm_keyed_filepath(
return path
def mk_oi_shm_keyed_filepath(
fqme: str,
period: float | int,
datadir: Path,
) -> Path:
if period < 1.:
raise ValueError('Sample period should be >= 1.!?')
path: Path = (
datadir
/
f'{fqme}.oi{int(period)}s.parquet'
)
return path
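A quick usage check of the new path helper (the fqme and datadir are made up; assumes the func above is in scope):

from pathlib import Path

path: Path = mk_oi_shm_keyed_filepath(
    fqme='btcusdt.spot.binance',  # hypothetical fqme
    period=1,
    datadir=Path('/tmp/piker/nativedb'),
)
assert path.name == 'btcusdt.spot.binance.oi1s.parquet'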
def unpack_fqme_from_parquet_filepath(path: Path) -> str:
filename: str = str(path.name)
@@ -172,7 +190,11 @@ class NativeStorageClient:
key: str = path.name.rstrip('.parquet')
fqme, _, descr = key.rpartition('.')
prefix, _, suffix = descr.partition('ohlcv')
if 'ohlcv' in descr:
prefix, _, suffix = descr.partition('ohlcv')
elif 'oi' in descr:
prefix, _, suffix = descr.partition('oi')
period: int = int(suffix.strip('s'))
# cache description data
@@ -369,6 +391,61 @@ class NativeStorageClient:
timeframe,
)
def _write_oi(
self,
fqme: str,
oi: np.ndarray,
) -> Path:
'''
Sync version of the public interface meth, since we don't
currently actually need or support an async impl.
'''
path: Path = mk_oi_shm_keyed_filepath(
fqme=fqme,
period=1,
datadir=self._datadir,
)
if isinstance(oi, np.ndarray):
new_df: pl.DataFrame = tsp.np2pl(oi)
else:
new_df = oi
if path.exists():
old_df = pl.read_parquet(path)
df = pl.concat([old_df, new_df])
else:
df = new_df
start = time.time()
df.write_parquet(path)
delay: float = round(
time.time() - start,
ndigits=6,
)
log.info(
f'parquet write took {delay} secs\n'
f'file path: {path}'
)
return path
async def write_oi(
self,
fqme: str,
oi: np.ndarray,
) -> Path:
'''
Write input oi time series for fqme and sampling period
to (local) disk.
'''
return self._write_oi(
fqme,
oi,
)
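The parquet append strategy `_write_oi()` uses, isolated as a sketch in plain `polars` (skipping piker's `tsp.np2pl()` numpy-to-df shim):

from pathlib import Path
import polars as pl

def append_parquet(
    path: Path,
    new_df: pl.DataFrame,
) -> Path:
    # concat onto any prior data then rewrite the whole file;
    # simple, but O(file-size) per write just like the above.
    if path.exists():
        df: pl.DataFrame = pl.concat([pl.read_parquet(path), new_df])
    else:
        df = new_df
    df.write_parquet(path)
    return path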
async def delete_ts(
self,
key: str,

View File

@@ -182,24 +182,27 @@ class DpiAwareFont:
# always going to hit that error in range mapping from inches:
# float to px size: int.
self._font_inches = inches
font_size = math.floor(inches * dpi)
font_size = math.floor(inches * pdpi)
log.debug(
f"screen:{screen.name()}\n"
f"pDPI: {pdpi}, lDPI: {ldpi}, scale: {scale}\n"
f"\nOur best guess font size is {font_size}\n"
ftype: str = f'{type(self)!r}'
log.info(
f'screen: {screen.name()}\n'
f'pDPI: {pdpi!r}\n'
f'lDPI: {ldpi!r}\n'
f'scale: {scale!r}\n'
f'{ftype}._font_inches={self._font_inches!r}\n'
f'\n'
f"Our best guess for an auto-font-size is,\n"
f'font_size: {font_size!r}\n'
)
# apply the size
self._set_qfont_px_size(font_size)
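For intuition, the bare `floor(inches * dpi)` step above at a few DPIs (the 0.09375in value is made up; the full method may also fold in ldpi/scale handling, so these needn't match the test-file expectations further below exactly):

import math

inches: float = 0.09375  # hypothetical `._font_inches` value
for pdpi in (96, 120, 169):
    print(pdpi, math.floor(inches * pdpi))
# -> (96, 9), (120, 11), (169, 15)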
def boundingRect(self, value: str) -> QtCore.QRectF:
screen = self.screen
if screen is None:
if (screen := self.screen) is None:
raise RuntimeError("You must call .configure_to_dpi() first!")
unscaled_br = self._qfm.boundingRect(value)
unscaled_br: QtCore.QRectF = self._qfm.boundingRect(value)
return QtCore.QRectF(
0,
0,

View File

@@ -63,8 +63,10 @@ dependencies = [
"trio-util >=0.7.0, <0.8.0",
"trio-websocket >=0.10.3, <0.11.0",
"typer >=0.9.0, <1.0.0",
"rapidfuzz >=3.5.2, <4.0.0",
"pdbp >=1.5.0, <2.0.0",
"trio >=0.27",
"pendulum",
"pendulum >=3.0.0, <4.0.0",
"httpx >=0.27.0, <0.28.0",
"cryptofeed >=2.4.0, <3.0.0",
"pyarrow>=18.0.0",
@@ -76,31 +78,23 @@ dependencies = [
"numba>=0.61.0",
"pyvnc",
]
# ------ dependencies ------
# TODO: add an `--only daemon` group for running non-ui / pikerd
# service tree in distributed mode B)
# https://docs.astral.sh/uv/concepts/projects/dependencies/#optional-dependencies
[dependency-groups]
[project.optional-dependencies]
uis = [
# https://docs.astral.sh/uv/concepts/projects/dependencies/#optional-dependencies
# TODO: make sure the levenshtein shit compiles on nix..
# rapidfuzz = {extras = ["speedup"], version = "^0.18.0"}
"rapidfuzz >=3.2.0, <4.0.0",
"qdarkstyle >=3.0.2, <4.0.0",
"pyqt6 >=6.7.0, <7.0.0",
"pyqtgraph",
# for consideration,
# - 'visidata'
"qdarkstyle >=3.0.2, <4.0.0",
"pyqt6 >=6.7.0, <7.0.0",
"pyqtgraph",
# TODO: add an `--only daemon` group for running non-ui / pikerd
# service tree in distributed mode B)
# https://docs.astral.sh/uv/concepts/projects/dependencies/#optional-dependencies
]
[dependency-groups]
# TODO: a toolset that makes debugging a `pikerd` service (tree) easy
# to hack on directly using more or less the local env:
# - xonsh + xxh
@@ -109,46 +103,23 @@ uis = [
#
# console enhancements and eventually remote debugging extras/helpers.
# use `uv --dev` to enable
repl = [
# debug
"pdbp >=1.5.0, <2.0.0",
"greenback >=1.1.1, <2.0.0",
"xonsh",
"prompt-toolkit ==3.0.40",
"pyperclip>=1.9.0",
]
testing = [
"pytest",
]
de = [
# DE-specific
"i3ipc>=2.2.1",
]
dev = [
# https://docs.astral.sh/uv/concepts/projects/dependencies/#development-dependencies
"cython >=3.0.0, <4.0.0",
"pytest",
"elasticsearch >=8.9.0, <9.0.0",
"prompt-toolkit ==3.0.40",
"cython >=3.0.0, <4.0.0",
"greenback >=1.1.1, <2.0.0",
"ruff>=0.9.6",
"pyperclip>=1.9.0",
"i3ipc>=2.2.1",
# nested deps-groups
# https://docs.astral.sh/uv/concepts/projects/dependencies/#nesting-groups
{include-group = 'uis'},
{include-group = 'repl'},
{include-group = 'testing'},
{include-group = 'de'},
]
lint = [
# XXX, with flake.nix needs to be from nixpkgs
"ruff>=0.9.6"
#
# ^TODO? these markers don't work; use deps-flags for now?
# ; os_name != 'nixos' and platform_system != 'NixOS'",
# ; defined('IN_NIX_SHELL')",
]
dbs = [
"elasticsearch >=8.9.0, <9.0.0",
]
# ------ dependency-groups ------
# ?from git, see below.
"xonsh",
"qdarkstyle >=3.0.2, <4.0.0",
"pyqt6 >=6.7.0, <7.0.0",
"pyqtgraph",
]
[tool.pytest.ini_options]
# https://docs.pytest.org/en/stable/reference/reference.html#configuration-options
@@ -161,29 +132,24 @@ console_output_style = 'progress'
# https://docs.pytest.org/en/stable/how-to/plugins.html#disabling-plugins-from-autoloading
# https://docs.pytest.org/en/stable/how-to/plugins.html#deactivating-unregistering-a-plugin-by-name
addopts = '-p no:xonsh'
# ------ tool.pytest ------
[project.scripts]
piker = "piker.cli:cli"
pikerd = "piker.cli:pikerd"
ledger = "piker.accounting.cli:ledger"
# ------ project.scripts ------
[tool.hatch.build.targets.sdist]
include = ["piker"]
[tool.hatch.build.targets.wheel]
include = ["piker"]
# ------ tool.hatch ------
# TODO? move to a `uv.toml`?
[tool.uv]
python-preference = 'system'
python-downloads = 'manual'
# https://docs.astral.sh/uv/concepts/projects/dependencies/#default-groups
default-groups = ['uis', 'dev']
# ------ tool.uv ------
[tool.uv.sources]
@@ -191,10 +157,12 @@ pyqtgraph = { git = "https://github.com/pikers/pyqtgraph.git" }
tomlkit = { git = "https://github.com/pikers/tomlkit.git", branch ="piker_pin" }
pyvnc = { git = "https://github.com/regulad/pyvnc.git" }
# XXX since, we're like, always hacking new shite all-the-time. Bp
tractor = { git = "https://github.com/goodboy/tractor.git", branch ="piker_pin" }
# tractor = { git = "https://pikers.dev/goodboy/tractor", branch = "piker_pin" }
# tractor = { git = "https://pikers.dev/goodboy/tractor", branch = "main" }
# ------ goodboy ------
# hackin dev-envs, usually there's something new he's hackin in..
# tractor = { path = "../tractor", editable = true }
# TODO, long term we should be synced to upstream `main` branch!
# tractor = { git = "https://github.com/goodboy/tractor.git", branch ="piker_pin" }
tractor = { git = "https://pikers.dev/goodboy/tractor", branch = "piker_pin" }
# goodboy's dev-env
# XXX for @goodboy's hackin dev env, usually there's something new in
# the runtime being seriously tested here Bp
# tractor = { path = "../tractor/", editable = true }
# xonsh = { path = "../xonsh", editable = true }
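To sanity-check that the regrouped `[dependency-groups]` layout parses as intended, a throwaway snippet (run from the repo root; `tomllib` is stdlib on py3.11+):

import tomllib

with open('pyproject.toml', 'rb') as f:
    cfg: dict = tomllib.load(f)

groups: dict = cfg.get('dependency-groups', {})
print(sorted(groups))
# `dev` nests the other groups uv-style via `include-group` tables,
print([
    spec['include-group']
    for spec in groups.get('dev', [])
    if isinstance(spec, dict)
])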

View File

@@ -0,0 +1,64 @@
#!env xonsh
'''
Compute the pixels-per-inch (PPI) naively for the local DE.
NOTE, currently this only supports the `sway`-TWM on wayland.
!TODO!
- [ ] support Xorg (and possibly other OSs as well)?
- [ ] convert this to pure py code, dropping the `.xsh` specifics
in favor of `subprocess` API calls?
- [ ] possibly unify all this with `./qt_screen_info.py` as part of
a "PPI config wizard" or something, but more than likely we'll
have a lib-ified version inside modden/piker by then?
'''
import math
import json
# XXX, xonsh part using "subprocess mode"
disp_infos: list[dict] = json.loads($(wlr-randr --json))
lappy: dict = disp_infos[0]
dims: dict[str, int] = lappy['physical_size']
w_cm: int = dims['width']
h_cm: int = dims['height']
# mm per inch (wlr-randr's `physical_size` is in mm, despite the `_cm` names)
cpi: float = 25.4
# compute "diagonal" size (aka hypot)
diag_inches: float = math.sqrt((h_cm/cpi)**2 + (w_cm/cpi)**2)
# compute reso-hypot / inches-hypot
hi_res: dict[str, float|bool] = lappy['modes'][0]
w_px: int = hi_res['width']
h_px: int = hi_res['height']
diag_pxs: float = math.sqrt(h_px**2 + w_px**2)
unscaled_ppi: float = diag_pxs/diag_inches
# retrieve TWM info on the display (including scaling info)
sway_disp_info: dict = json.loads($(swaymsg -r -t get_outputs))[0]
scale: float = sway_disp_info['scale']
print(
f'output: {sway_disp_info["name"]!r}\n'
f'--- DIMENSIONS ---\n'
f'w_cm: {w_cm!r}\n'
f'h_cm: {h_cm!r}\n'
f'w_px: {w_px!r}\n'
f'h_px: {h_px!r}\n'
f'\n'
f'--- DIAGONALS ---\n'
f'diag_inches: {diag_inches!r}\n'
f'diag_pxs: {diag_pxs!r}\n'
f'\n'
f'--- PPI-related-info ---\n'
f'(DE reported) scale: {scale!r}\n'
f'unscaled PPI: {unscaled_ppi!r}\n'
f'|_ =sqrt(h_px**2 + w_px**2) / sqrt(h_in**2 + w_in**2)\n'
f'scaled PPI: {unscaled_ppi/scale!r}\n'
f'|_ =unscaled_ppi/scale\n'
)
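A first pass at the "pure py" TODO above, trading xonsh's subprocess-mode for stdlib `subprocess` calls (same `wlr-randr`/`swaymsg` + sway-on-wayland assumptions):

import json
import math
import subprocess

def json_cmd(*argv: str) -> list:
    # run a CLI tool and decode its json stdout
    return json.loads(subprocess.check_output(argv))

disp: dict = json_cmd('wlr-randr', '--json')[0]
w_mm: int = disp['physical_size']['width']
h_mm: int = disp['physical_size']['height']
mode: dict = disp['modes'][0]
# diagonal size in inches vs. pixels
diag_in: float = math.hypot(w_mm, h_mm) / 25.4
diag_px: float = math.hypot(mode['width'], mode['height'])
scale: float = json_cmd('swaymsg', '-r', '-t', 'get_outputs')[0]['scale']
print(f'unscaled PPI: {diag_px/diag_in}')
print(f'scaled PPI: {diag_px/diag_in/scale}')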

View File

@@ -31,8 +31,8 @@ Resource list for mucking with DPIs on multiple screens:
- https://doc.qt.io/qt-5/qguiapplication.html#screenAt
'''
import os
from pyqtgraph import QtGui
from PyQt6 import (
QtCore,
QtWidgets,
@@ -43,6 +43,13 @@ from PyQt6.QtCore import (
QSize,
QRect,
)
from pyqtgraph import QtGui
# https://doc.qt.io/qt-6/highdpi.html#environment-variable-reference
os.environ['QT_USE_PHYSICAL_DPI'] = '0'
# TODO? i don't get how this is useful, when i set it it seems
# the physical-dpi gets computed incorrectly??
# Proper high DPI scaling is available in Qt >= 5.6.0. This attribute
# must be set before creating the application
@@ -58,13 +65,22 @@ if hasattr(Qt, 'AA_UseHighDpiPixmaps'):
True,
)
# NOTE, inherits `QGuiApplication`
# https://doc.qt.io/qt-6/qapplication.html
# https://doc.qt.io/qt-6/qguiapplication.html
app = QtWidgets.QApplication([])
#
# ^TODO? various global DPI settings?
# [ ] DPI rounding policy,
# - https://doc.qt.io/qt-6/qt.html#HighDpiScaleFactorRoundingPolicy-enum
# - https://doc.qt.io/qt-6/qguiapplication.html#setHighDpiScaleFactorRoundingPolicy
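# ^a sketch for that rounding-policy TODO; NOTE the policy must be
# set *before* the `QApplication` above is constructed, so it's
# left commented-out here for reference only,
#
# from PyQt6.QtGui import QGuiApplication
# QGuiApplication.setHighDpiScaleFactorRoundingPolicy(
#     QtCore.Qt.HighDpiScaleFactorRoundingPolicy.PassThrough,
# )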
window = QtWidgets.QMainWindow()
main_widget = QtWidgets.QWidget()
window.setCentralWidget(main_widget)
window.show()
pxr: float = main_widget.devicePixelRatioF()
_main_pxr: float = main_widget.devicePixelRatioF()
# explicitly get main widget and primary displays
current_screen: QtGui.QScreen = app.screenAt(
@@ -77,7 +93,13 @@ for screen in app.screens():
name: str = screen.name()
model: str = screen.model().rstrip()
size: QSize = screen.size()
geo: QRect = screen.availableGeometry()
geo: QRect = screen.geometry()
# device-pixel-ratio
# https://doc.qt.io/qt-6/highdpi.html
pxr: float = screen.devicePixelRatio()
unscaled_size: QSize = pxr * size
phydpi: float = screen.physicalDotsPerInch()
logdpi: float = screen.logicalDotsPerInch()
is_primary: bool = screen is primary_screen
@@ -88,11 +110,12 @@ for screen in app.screens():
f'|_primary: {is_primary}\n'
f' _current: {is_current}\n'
f' _model: {model}\n'
f' _screen size: {size}\n'
f' _screen geometry: {geo}\n'
f' _devicePixelRationF(): {pxr}\n'
f' _physical dpi: {phydpi}\n'
f' _logical dpi: {logdpi}\n'
f' _size: {size}\n'
f' _geometry: {geo}\n'
f' _devicePixelRatio(): {pxr}\n'
f' _unscaled-size: {unscaled_size!r}\n'
f' _physical-dpi: {phydpi}\n'
f' _logical-dpi: {logdpi}\n'
)
# app-wide font info
@@ -110,8 +133,8 @@ str_w: int = str_br.width()
print(
f'------ global font settings ------\n'
f'font dpi: {fontdpi}\n'
f'font height: {font_h}\n'
f'string bounding rect: {str_br}\n'
f'string width : {str_w}\n'
f'font dpi: {fontdpi!r}\n'
f'font height: {font_h!r}\n'
f'string bounding rect: {str_br!r}\n'
f'string width: {str_w!r}\n'
)

View File

@@ -0,0 +1,36 @@
import pytest
from piker.ui._style import DpiAwareFont
class MockScreen:
def __init__(self, pdpi, ldpi, name="MockScreen"):
self._pdpi = pdpi
self._ldpi = ldpi
self._name = name
def physicalDotsPerInch(self):
return self._pdpi
def logicalDotsPerInch(self):
return self._ldpi
def name(self):
return self._name
@pytest.mark.parametrize(
"pdpi, ldpi, expected_px",
[
(96, 96, 9), # normal DPI
(169, 96, 15), # HiDPI
(120, 96, 10), # mid-DPI
]
)
def test_font_px_size(pdpi, ldpi, expected_px):
font = DpiAwareFont()
font.configure_to_dpi(screen=MockScreen(pdpi, ldpi))
px = font.px_size
print(f"{pdpi}x{ldpi} DPI -> Computed pixel size: {px}")
assert px == expected_px

uv.lock
View File

@@ -1016,12 +1016,14 @@ dependencies = [
{ name = "msgspec" },
{ name = "numba" },
{ name = "numpy" },
{ name = "pdbp" },
{ name = "pendulum" },
{ name = "polars" },
{ name = "polars-fuzzy-match" },
{ name = "pyarrow" },
{ name = "pygments" },
{ name = "pyvnc" },
{ name = "rapidfuzz" },
{ name = "rich" },
{ name = "tomli" },
{ name = "tomli-w" },
@@ -1035,46 +1037,26 @@ dependencies = [
{ name = "websockets" },
]
[package.optional-dependencies]
uis = [
{ name = "rapidfuzz" },
]
[package.dev-dependencies]
dbs = [
{ name = "elasticsearch" },
]
de = [
{ name = "i3ipc" },
]
dev = [
{ name = "cython" },
{ name = "elasticsearch" },
{ name = "greenback" },
{ name = "i3ipc" },
{ name = "pdbp" },
{ name = "prompt-toolkit" },
{ name = "pyperclip" },
{ name = "pyqt6" },
{ name = "pyqtgraph" },
{ name = "pytest" },
{ name = "qdarkstyle" },
{ name = "rapidfuzz" },
{ name = "xonsh" },
]
lint = [
{ name = "ruff" },
]
repl = [
{ name = "greenback" },
{ name = "pdbp" },
{ name = "prompt-toolkit" },
{ name = "pyperclip" },
{ name = "xonsh" },
]
testing = [
{ name = "pytest" },
]
uis = [
{ name = "pyqt6" },
{ name = "pyqtgraph" },
{ name = "qdarkstyle" },
{ name = "rapidfuzz" },
]
[package.metadata]
requires-dist = [
@@ -1089,17 +1071,20 @@ requires-dist = [
{ name = "msgspec", specifier = ">=0.19.0,<0.20" },
{ name = "numba", specifier = ">=0.61.0" },
{ name = "numpy", specifier = ">=2.0" },
{ name = "pendulum" },
{ name = "pdbp", specifier = ">=1.5.0,<2.0.0" },
{ name = "pendulum", specifier = ">=3.0.0,<4.0.0" },
{ name = "polars", specifier = ">=0.20.6" },
{ name = "polars-fuzzy-match", specifier = ">=0.1.5" },
{ name = "pyarrow", specifier = ">=18.0.0" },
{ name = "pygments", specifier = ">=2.16.1,<3.0.0" },
{ name = "pyvnc", git = "https://github.com/regulad/pyvnc.git" },
{ name = "rapidfuzz", specifier = ">=3.5.2,<4.0.0" },
{ name = "rapidfuzz", marker = "extra == 'uis'", specifier = ">=3.2.0,<4.0.0" },
{ name = "rich", specifier = ">=13.5.2,<14.0.0" },
{ name = "tomli", specifier = ">=2.0.1,<3.0.0" },
{ name = "tomli-w", specifier = ">=1.0.0,<2.0.0" },
{ name = "tomlkit", git = "https://github.com/pikers/tomlkit.git?branch=piker_pin" },
{ name = "tractor", git = "https://github.com/goodboy/tractor.git?branch=piker_pin" },
{ name = "tractor", git = "https://pikers.dev/goodboy/tractor?branch=piker_pin" },
{ name = "trio", specifier = ">=0.27" },
{ name = "trio-typing", specifier = ">=0.10.0" },
{ name = "trio-util", specifier = ">=0.7.0,<0.8.0" },
@@ -1107,39 +1092,23 @@ requires-dist = [
{ name = "typer", specifier = ">=0.9.0,<1.0.0" },
{ name = "websockets", specifier = "==12.0" },
]
provides-extras = ["uis"]
[package.metadata.requires-dev]
dbs = [{ name = "elasticsearch", specifier = ">=8.9.0,<9.0.0" }]
de = [{ name = "i3ipc", specifier = ">=2.2.1" }]
dev = [
{ name = "cython", specifier = ">=3.0.0,<4.0.0" },
{ name = "elasticsearch", specifier = ">=8.9.0,<9.0.0" },
{ name = "greenback", specifier = ">=1.1.1,<2.0.0" },
{ name = "i3ipc", specifier = ">=2.2.1" },
{ name = "pdbp", specifier = ">=1.5.0,<2.0.0" },
{ name = "prompt-toolkit", specifier = "==3.0.40" },
{ name = "pyperclip", specifier = ">=1.9.0" },
{ name = "pyqt6", specifier = ">=6.7.0,<7.0.0" },
{ name = "pyqtgraph", git = "https://github.com/pikers/pyqtgraph.git" },
{ name = "pytest" },
{ name = "qdarkstyle", specifier = ">=3.0.2,<4.0.0" },
{ name = "rapidfuzz", specifier = ">=3.2.0,<4.0.0" },
{ name = "ruff", specifier = ">=0.9.6" },
{ name = "xonsh" },
]
lint = [{ name = "ruff", specifier = ">=0.9.6" }]
repl = [
{ name = "greenback", specifier = ">=1.1.1,<2.0.0" },
{ name = "pdbp", specifier = ">=1.5.0,<2.0.0" },
{ name = "prompt-toolkit", specifier = "==3.0.40" },
{ name = "pyperclip", specifier = ">=1.9.0" },
{ name = "xonsh" },
]
testing = [{ name = "pytest" }]
uis = [
{ name = "pyqt6", specifier = ">=6.7.0,<7.0.0" },
{ name = "pyqtgraph", git = "https://github.com/pikers/pyqtgraph.git" },
{ name = "qdarkstyle", specifier = ">=3.0.2,<4.0.0" },
{ name = "rapidfuzz", specifier = ">=3.2.0,<4.0.0" },
]
[[package]]
name = "platformdirs"
@@ -1713,28 +1682,28 @@ wheels = [
[[package]]
name = "ruff"
version = "0.14.10"
version = "0.14.8"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/57/08/52232a877978dd8f9cf2aeddce3e611b40a63287dfca29b6b8da791f5e8d/ruff-0.14.10.tar.gz", hash = "sha256:9a2e830f075d1a42cd28420d7809ace390832a490ed0966fe373ba288e77aaf4", size = 5859763, upload-time = "2025-12-18T19:28:57.98Z" }
sdist = { url = "https://files.pythonhosted.org/packages/ed/d9/f7a0c4b3a2bf2556cd5d99b05372c29980249ef71e8e32669ba77428c82c/ruff-0.14.8.tar.gz", hash = "sha256:774ed0dd87d6ce925e3b8496feb3a00ac564bea52b9feb551ecd17e0a23d1eed", size = 5765385, upload-time = "2025-12-04T15:06:17.669Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/60/01/933704d69f3f05ee16ef11406b78881733c186fe14b6a46b05cfcaf6d3b2/ruff-0.14.10-py3-none-linux_armv6l.whl", hash = "sha256:7a3ce585f2ade3e1f29ec1b92df13e3da262178df8c8bdf876f48fa0e8316c49", size = 13527080, upload-time = "2025-12-18T19:29:25.642Z" },
{ url = "https://files.pythonhosted.org/packages/df/58/a0349197a7dfa603ffb7f5b0470391efa79ddc327c1e29c4851e85b09cc5/ruff-0.14.10-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:674f9be9372907f7257c51f1d4fc902cb7cf014b9980152b802794317941f08f", size = 13797320, upload-time = "2025-12-18T19:29:02.571Z" },
{ url = "https://files.pythonhosted.org/packages/7b/82/36be59f00a6082e38c23536df4e71cdbc6af8d7c707eade97fcad5c98235/ruff-0.14.10-py3-none-macosx_11_0_arm64.whl", hash = "sha256:d85713d522348837ef9df8efca33ccb8bd6fcfc86a2cde3ccb4bc9d28a18003d", size = 12918434, upload-time = "2025-12-18T19:28:51.202Z" },
{ url = "https://files.pythonhosted.org/packages/a6/00/45c62a7f7e34da92a25804f813ebe05c88aa9e0c25e5cb5a7d23dd7450e3/ruff-0.14.10-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6987ebe0501ae4f4308d7d24e2d0fe3d7a98430f5adfd0f1fead050a740a3a77", size = 13371961, upload-time = "2025-12-18T19:29:04.991Z" },
{ url = "https://files.pythonhosted.org/packages/40/31/a5906d60f0405f7e57045a70f2d57084a93ca7425f22e1d66904769d1628/ruff-0.14.10-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:16a01dfb7b9e4eee556fbfd5392806b1b8550c9b4a9f6acd3dbe6812b193c70a", size = 13275629, upload-time = "2025-12-18T19:29:21.381Z" },
{ url = "https://files.pythonhosted.org/packages/3e/60/61c0087df21894cf9d928dc04bcd4fb10e8b2e8dca7b1a276ba2155b2002/ruff-0.14.10-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:7165d31a925b7a294465fa81be8c12a0e9b60fb02bf177e79067c867e71f8b1f", size = 14029234, upload-time = "2025-12-18T19:29:00.132Z" },
{ url = "https://files.pythonhosted.org/packages/44/84/77d911bee3b92348b6e5dab5a0c898d87084ea03ac5dc708f46d88407def/ruff-0.14.10-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:c561695675b972effb0c0a45db233f2c816ff3da8dcfbe7dfc7eed625f218935", size = 15449890, upload-time = "2025-12-18T19:28:53.573Z" },
{ url = "https://files.pythonhosted.org/packages/e9/36/480206eaefa24a7ec321582dda580443a8f0671fdbf6b1c80e9c3e93a16a/ruff-0.14.10-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:4bb98fcbbc61725968893682fd4df8966a34611239c9fd07a1f6a07e7103d08e", size = 15123172, upload-time = "2025-12-18T19:29:23.453Z" },
{ url = "https://files.pythonhosted.org/packages/5c/38/68e414156015ba80cef5473d57919d27dfb62ec804b96180bafdeaf0e090/ruff-0.14.10-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f24b47993a9d8cb858429e97bdf8544c78029f09b520af615c1d261bf827001d", size = 14460260, upload-time = "2025-12-18T19:29:27.808Z" },
{ url = "https://files.pythonhosted.org/packages/b3/19/9e050c0dca8aba824d67cc0db69fb459c28d8cd3f6855b1405b3f29cc91d/ruff-0.14.10-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:59aabd2e2c4fd614d2862e7939c34a532c04f1084476d6833dddef4afab87e9f", size = 14229978, upload-time = "2025-12-18T19:29:11.32Z" },
{ url = "https://files.pythonhosted.org/packages/51/eb/e8dd1dd6e05b9e695aa9dd420f4577debdd0f87a5ff2fedda33c09e9be8c/ruff-0.14.10-py3-none-manylinux_2_31_riscv64.whl", hash = "sha256:213db2b2e44be8625002dbea33bb9c60c66ea2c07c084a00d55732689d697a7f", size = 14338036, upload-time = "2025-12-18T19:29:09.184Z" },
{ url = "https://files.pythonhosted.org/packages/6a/12/f3e3a505db7c19303b70af370d137795fcfec136d670d5de5391e295c134/ruff-0.14.10-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:b914c40ab64865a17a9a5b67911d14df72346a634527240039eb3bd650e5979d", size = 13264051, upload-time = "2025-12-18T19:29:13.431Z" },
{ url = "https://files.pythonhosted.org/packages/08/64/8c3a47eaccfef8ac20e0484e68e0772013eb85802f8a9f7603ca751eb166/ruff-0.14.10-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:1484983559f026788e3a5c07c81ef7d1e97c1c78ed03041a18f75df104c45405", size = 13283998, upload-time = "2025-12-18T19:29:06.994Z" },
{ url = "https://files.pythonhosted.org/packages/12/84/534a5506f4074e5cc0529e5cd96cfc01bb480e460c7edf5af70d2bcae55e/ruff-0.14.10-py3-none-musllinux_1_2_i686.whl", hash = "sha256:c70427132db492d25f982fffc8d6c7535cc2fd2c83fc8888f05caaa248521e60", size = 13601891, upload-time = "2025-12-18T19:28:55.811Z" },
{ url = "https://files.pythonhosted.org/packages/0d/1e/14c916087d8598917dbad9b2921d340f7884824ad6e9c55de948a93b106d/ruff-0.14.10-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:5bcf45b681e9f1ee6445d317ce1fa9d6cba9a6049542d1c3d5b5958986be8830", size = 14336660, upload-time = "2025-12-18T19:29:16.531Z" },
{ url = "https://files.pythonhosted.org/packages/f2/1c/d7b67ab43f30013b47c12b42d1acd354c195351a3f7a1d67f59e54227ede/ruff-0.14.10-py3-none-win32.whl", hash = "sha256:104c49fc7ab73f3f3a758039adea978869a918f31b73280db175b43a2d9b51d6", size = 13196187, upload-time = "2025-12-18T19:29:19.006Z" },
{ url = "https://files.pythonhosted.org/packages/fb/9c/896c862e13886fae2af961bef3e6312db9ebc6adc2b156fe95e615dee8c1/ruff-0.14.10-py3-none-win_amd64.whl", hash = "sha256:466297bd73638c6bdf06485683e812db1c00c7ac96d4ddd0294a338c62fdc154", size = 14661283, upload-time = "2025-12-18T19:29:30.16Z" },
{ url = "https://files.pythonhosted.org/packages/74/31/b0e29d572670dca3674eeee78e418f20bdf97fa8aa9ea71380885e175ca0/ruff-0.14.10-py3-none-win_arm64.whl", hash = "sha256:e51d046cf6dda98a4633b8a8a771451107413b0f07183b2bef03f075599e44e6", size = 13729839, upload-time = "2025-12-18T19:28:48.636Z" },
{ url = "https://files.pythonhosted.org/packages/48/b8/9537b52010134b1d2b72870cc3f92d5fb759394094741b09ceccae183fbe/ruff-0.14.8-py3-none-linux_armv6l.whl", hash = "sha256:ec071e9c82eca417f6111fd39f7043acb53cd3fde9b1f95bbed745962e345afb", size = 13441540, upload-time = "2025-12-04T15:06:14.896Z" },
{ url = "https://files.pythonhosted.org/packages/24/00/99031684efb025829713682012b6dd37279b1f695ed1b01725f85fd94b38/ruff-0.14.8-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:8cdb162a7159f4ca36ce980a18c43d8f036966e7f73f866ac8f493b75e0c27e9", size = 13669384, upload-time = "2025-12-04T15:06:51.809Z" },
{ url = "https://files.pythonhosted.org/packages/72/64/3eb5949169fc19c50c04f28ece2c189d3b6edd57e5b533649dae6ca484fe/ruff-0.14.8-py3-none-macosx_11_0_arm64.whl", hash = "sha256:2e2fcbefe91f9fad0916850edf0854530c15bd1926b6b779de47e9ab619ea38f", size = 12806917, upload-time = "2025-12-04T15:06:08.925Z" },
{ url = "https://files.pythonhosted.org/packages/c4/08/5250babb0b1b11910f470370ec0cbc67470231f7cdc033cee57d4976f941/ruff-0.14.8-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a9d70721066a296f45786ec31916dc287b44040f553da21564de0ab4d45a869b", size = 13256112, upload-time = "2025-12-04T15:06:23.498Z" },
{ url = "https://files.pythonhosted.org/packages/78/4c/6c588e97a8e8c2d4b522c31a579e1df2b4d003eddfbe23d1f262b1a431ff/ruff-0.14.8-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:2c87e09b3cd9d126fc67a9ecd3b5b1d3ded2b9c7fce3f16e315346b9d05cfb52", size = 13227559, upload-time = "2025-12-04T15:06:33.432Z" },
{ url = "https://files.pythonhosted.org/packages/23/ce/5f78cea13eda8eceac71b5f6fa6e9223df9b87bb2c1891c166d1f0dce9f1/ruff-0.14.8-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1d62cb310c4fbcb9ee4ac023fe17f984ae1e12b8a4a02e3d21489f9a2a5f730c", size = 13896379, upload-time = "2025-12-04T15:06:02.687Z" },
{ url = "https://files.pythonhosted.org/packages/cf/79/13de4517c4dadce9218a20035b21212a4c180e009507731f0d3b3f5df85a/ruff-0.14.8-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:1af35c2d62633d4da0521178e8a2641c636d2a7153da0bac1b30cfd4ccd91344", size = 15372786, upload-time = "2025-12-04T15:06:29.828Z" },
{ url = "https://files.pythonhosted.org/packages/00/06/33df72b3bb42be8a1c3815fd4fae83fa2945fc725a25d87ba3e42d1cc108/ruff-0.14.8-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:25add4575ffecc53d60eed3f24b1e934493631b48ebbc6ebaf9d8517924aca4b", size = 14990029, upload-time = "2025-12-04T15:06:36.812Z" },
{ url = "https://files.pythonhosted.org/packages/64/61/0f34927bd90925880394de0e081ce1afab66d7b3525336f5771dcf0cb46c/ruff-0.14.8-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:4c943d847b7f02f7db4201a0600ea7d244d8a404fbb639b439e987edcf2baf9a", size = 14407037, upload-time = "2025-12-04T15:06:39.979Z" },
{ url = "https://files.pythonhosted.org/packages/96/bc/058fe0aefc0fbf0d19614cb6d1a3e2c048f7dc77ca64957f33b12cfdc5ef/ruff-0.14.8-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:cb6e8bf7b4f627548daa1b69283dac5a296bfe9ce856703b03130732e20ddfe2", size = 14102390, upload-time = "2025-12-04T15:06:46.372Z" },
{ url = "https://files.pythonhosted.org/packages/af/a4/e4f77b02b804546f4c17e8b37a524c27012dd6ff05855d2243b49a7d3cb9/ruff-0.14.8-py3-none-manylinux_2_31_riscv64.whl", hash = "sha256:7aaf2974f378e6b01d1e257c6948207aec6a9b5ba53fab23d0182efb887a0e4a", size = 14230793, upload-time = "2025-12-04T15:06:20.497Z" },
{ url = "https://files.pythonhosted.org/packages/3f/52/bb8c02373f79552e8d087cedaffad76b8892033d2876c2498a2582f09dcf/ruff-0.14.8-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:e5758ca513c43ad8a4ef13f0f081f80f08008f410790f3611a21a92421ab045b", size = 13160039, upload-time = "2025-12-04T15:06:49.06Z" },
{ url = "https://files.pythonhosted.org/packages/1f/ad/b69d6962e477842e25c0b11622548df746290cc6d76f9e0f4ed7456c2c31/ruff-0.14.8-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:f74f7ba163b6e85a8d81a590363bf71618847e5078d90827749bfda1d88c9cdf", size = 13205158, upload-time = "2025-12-04T15:06:54.574Z" },
{ url = "https://files.pythonhosted.org/packages/06/63/54f23da1315c0b3dfc1bc03fbc34e10378918a20c0b0f086418734e57e74/ruff-0.14.8-py3-none-musllinux_1_2_i686.whl", hash = "sha256:eed28f6fafcc9591994c42254f5a5c5ca40e69a30721d2ab18bb0bb3baac3ab6", size = 13469550, upload-time = "2025-12-04T15:05:59.209Z" },
{ url = "https://files.pythonhosted.org/packages/70/7d/a4d7b1961e4903bc37fffb7ddcfaa7beb250f67d97cfd1ee1d5cddb1ec90/ruff-0.14.8-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:21d48fa744c9d1cb8d71eb0a740c4dd02751a5de9db9a730a8ef75ca34cf138e", size = 14211332, upload-time = "2025-12-04T15:06:06.027Z" },
{ url = "https://files.pythonhosted.org/packages/5d/93/2a5063341fa17054e5c86582136e9895db773e3c2ffb770dde50a09f35f0/ruff-0.14.8-py3-none-win32.whl", hash = "sha256:15f04cb45c051159baebb0f0037f404f1dc2f15a927418f29730f411a79bc4e7", size = 13151890, upload-time = "2025-12-04T15:06:11.668Z" },
{ url = "https://files.pythonhosted.org/packages/02/1c/65c61a0859c0add13a3e1cbb6024b42de587456a43006ca2d4fd3d1618fe/ruff-0.14.8-py3-none-win_amd64.whl", hash = "sha256:9eeb0b24242b5bbff3011409a739929f497f3fb5fe3b5698aba5e77e8c833097", size = 14537826, upload-time = "2025-12-04T15:06:26.409Z" },
{ url = "https://files.pythonhosted.org/packages/6d/63/8b41cea3afd7f58eb64ac9251668ee0073789a3bc9ac6f816c8c6fef986d/ruff-0.14.8-py3-none-win_arm64.whl", hash = "sha256:965a582c93c63fe715fd3e3f8aa37c4b776777203d8e1d8aa3cc0c14424a4b99", size = 13634522, upload-time = "2025-12-04T15:06:43.212Z" },
]
[[package]]
@@ -1843,7 +1812,7 @@ source = { git = "https://github.com/pikers/tomlkit.git?branch=piker_pin#8e0239a
[[package]]
name = "tractor"
version = "0.1.0a6.dev0"
source = { git = "https://github.com/goodboy/tractor.git?branch=piker_pin#e232d9dd06f41b8dca997f0647f2083d27cc34f2" }
source = { git = "https://pikers.dev/goodboy/tractor?branch=piker_pin#e232d9dd06f41b8dca997f0647f2083d27cc34f2" }
dependencies = [
{ name = "bidict" },
{ name = "cffi" },