Compare commits

...

50 Commits

Author SHA1 Message Date
goodboy 5e371f1d73 Merge pull request 'jsonrpc_err_in_rent' (#41) from jsonrpc_err_in_rent into gitea_feats
Reviewed-on: #41
2025-02-21 21:21:02 +00:00
goodboy 6c221bb348 Merge pull request 'tsp_gaps: fixes for fault-less OHLCV time-series loads' (#35) from tsp_gaps into gitea_feats
Reviewed-on: #35
2025-02-21 20:46:37 +00:00
Tyler Goodlet e391c896f8 Mk jsonrpc's underlying ws timeout `float('inf')`
Since currently we're only using this IPC subsys for `deribit`, and
generally speaking we're primarily supporting options markets (which are
fairly "slow moving"), flip to a default of NOT resetting the `NoBsWs`
on timeout since doing so normally breaks the json-rpc IPC session.
Without a proper `fixture` passed to `open_autorecon_ws()` (which we
should eventually implement!!) relying on a timeout-to-reset more or
less will just cause breakage issues - a proper reconnect sequence must
be implemented before using that feature.

Deats,
- expose and proxy through the `msg_recv_timeout` from
  `open_jsonrpc_session()` into the underlying `open_autorecon_ws()`
  call.
2025-02-19 17:05:13 -05:00
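To make the above concrete, a minimal sketch of the described proxying; `open_autorecon_ws()` is the real underlying factory per the commit, but the body below is illustrative only, not the actual `piker.data._web_bs` code (exact signatures assumed):

```python
from contextlib import asynccontextmanager as acm

from piker.data._web_bs import open_autorecon_ws  # real mod per this PR

@acm
async def open_jsonrpc_session(
    url: str,
    # default to NEVER resetting the ws on a recv-timeout since
    # a reset normally breaks the json-rpc IPC session (per above).
    msg_recv_timeout: float = float('inf'),
):
    # expose and proxy the timeout straight through to the
    # underlying ws factory.
    async with open_autorecon_ws(
        url,
        msg_recv_timeout=msg_recv_timeout,
    ) as ws:
        yield ws
```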
Tyler Goodlet 5633f5614d Doc-n-clean `.data._web_bs.open_jsonrpc_session()`
Add a doc-string reflecting recent refinements, drop all the old hook
params, rename `n: trio.Nursery` -> `tn` for "task nursery" fitting with
code base's naming style.
2025-02-19 17:05:13 -05:00
Tyler Goodlet 76735189de data._web_bs: try to raise jsonrpc errors in parent task 2025-02-19 17:05:13 -05:00
Tyler Goodlet d49608f74e Refine history gap/termination signalling
Namely handling backends which do not provide a default "frame
size-duration" in their init-config by making the backfiller guess the
value based on the first frame received.

Deats,
- adjust `start_backfill()` to take a more explicit
  `def_frame_duration: Duration` expected to be unpacked from any
  backend hist init-config by the `tsdb_backfill()` caller which now
  also computes a value from the first received frame when the config
  section isn't provided.
- in `start_backfill()` we now always expect the `def_frame_duration`
  input and always decrement the query range by this value whenever
  a `NoData` is raised by the provider-backend paired with an explicit
  `log.warning()` about the handling.
- also relay any `DataUnavailable.args[0]` message from the provider
  in the handler.
- repair "gap reporting" which checks for expected frame duration vs.
  that received with much better humanized logging on the missing
  segment using `pendulum.Interval/Duration.in_words()` output.
2025-02-19 17:01:24 -05:00
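The gap-stepping logic described above, as a hedged sketch (not the actual `tsp.start_backfill()` body; the helper name and return shape are illustrative):

```python
import logging

from pendulum import DateTime, Duration

log = logging.getLogger(__name__)

def next_query_end(
    last_start_dt: DateTime,
    def_frame_duration: Duration,
) -> DateTime:
    # when the provider-backend raises `NoData`, always decrement
    # the query range by the (possibly guessed) default frame
    # duration and warn explicitly about the skipped segment.
    log.warning(
        'Provider signalled NoData; decrementing query range by '
        f'{def_frame_duration.in_words()} from {last_start_dt}'
    )
    return last_start_dt - def_frame_duration
```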
Tyler Goodlet bf0ac93aa3 Only use `frame_types` if delivered during enter
The `open_history_client()` provider endpoint can *optionally*
deliver a `frame_types: dict[int, pendulum.Duration]` subsection in its
`config: dict[str, dict]` (as was implemented with the `ib` backend).
This allows the `tsp` backfilling machinery to use this "recommended
frame duration" to subtract from the `last_start_dt` any time a `NoData`
gap is signalled by the `get_hist()` call allowing gaps to be ignored
safely without missing history by knowing the next earliest dt we can
query from using the `end_dt`. However, currently none of the crypto$
providers have implemented this feat yet..

As such only try to use the `frame_types` feature if provided when
handling `NoData` conditions inside `tsp.start_backfill()` and otherwise
raise as normal.
2025-02-19 17:01:24 -05:00
Tyler Goodlet d7179d47b0 `.tsp._anal`: add (unused) `detect_vlm_gaps()` 2025-02-19 17:01:24 -05:00
Tyler Goodlet c390e87536 `.storage.cli`: collect gap-markup-aids into `tf2aids: dict` prior to pause for introspection 2025-02-19 17:01:24 -05:00
Tyler Goodlet 5e4a6d61c7 Ignore any non-`.parquet` files under `.config/piker/nativedb/` subdir 2025-02-19 17:01:24 -05:00
Tyler Goodlet 3caaa30b03 Mask no-data pause, add perps to no-`/src`-in-fqme asset set
Was orig for debugging an issue with `kucoin` i think but definitely
shouldn't be left in XD

Also add `'perpetual_future'` to the `.start_backfill()` input literal
set since we don't expect the `/src`-style fqme ('btc/usd.perp.binance') for now.
2025-02-19 17:01:24 -05:00
goodboy 1e3942fdc2 Merge pull request 'Add `ruff` to deps, bump `uv.lock`' (#32) from add_ruff_linter into gitea_feats
Reviewed-on: #32
2025-02-17 19:55:32 +00:00
Nelson Torres 49ea380503 Add new `ruff.toml` config file
Based on the default provided in their
[docs](https://docs.astral.sh/ruff/configuration/) and migrating
previous config from the prior `poetry`-version of our `pyproject.toml`
2025-02-17 14:48:10 -05:00
Tyler Goodlet 933f169938 Add/reorg back some content from `poetry` old config 2025-02-14 13:47:02 -05:00
Nelson Torres 51337052a4 Remove legacy `poetry` config content from pyproject.toml 2025-02-14 15:01:29 -03:00
Tyler Goodlet 8abe55dcc6 Add `ruff` to deps, bump `uv.lock`
Such that we start encouraging devs to lint code they touch and
hopefully we include a pass as part of our tests/CI eventually B)

Also, mk local `tractor` install `--editable` since without it being
a locally hackable repo it's kinda pointless to install from the local
fs Xp
2025-02-13 21:20:11 -05:00
goodboy c933f2ad56 Merge pull request 'kucoin_and_binance_fix' (#9) from kucoin_and_binance_fix into gitea_feats
Reviewed-on: #9
2025-02-13 19:40:50 +00:00
Tyler Goodlet 00108010c9 Mask `pytest` detection block in `piker.config`
Seems to be some kinda super weird env bug since we moved to using
`uv`? When it triggers it also seems to cause a pretty fundamental crash
that not only breaks `tractor.devx._debug` stuff but also seems to get
us in a perma-hang state where no SIGINT or other sys sig will be able
to kill the root proc!?!?

TODO, a `gitea` issue to track so we can fix the fundamental problem as
well as transitive fault in `tractor`'s core which seems to be due to
the error taking place during a sub-actor's module import phase which
prevents the runtime from booting fully and then the proc getting stuck
in a real gnarly SIG-state..
2025-02-13 13:32:11 -05:00
Tyler Goodlet 8a4901c517 `.binance.feed`: moar type fixes, drop `rapidfuzz` 2025-02-13 12:35:41 -05:00
Tyler Goodlet d7f6a5ab63 Update to latest `KucoinMktPair` spec 2025-02-12 18:08:40 -05:00
Tyler Goodlet e0fdabf651 Use `../tractor` srcs in editable mode? 2025-02-12 15:14:30 -05:00
Tyler Goodlet cb88dfc9da `kucoin`: repair live quotes streaming..
This must have broke at some point during the new `MktPair` and thus
`.fqme: str` updates; more-or-less the symbol key in the quote-msg-`dict`
was NOT set to the `MktPair.bs_fqme: str` value and thus wasn't being
processed by the downstream sampling and feed subsys.

So fix that as well as a few other refinements,
- set the `topic: mkt.bs_fqme` in quote msgs obvi.
- drop the "wait for first clearing vlm" quote poll loop; going to fix
  the sampler to handle a `first_quote` without a `'last'` key.
- add some typing around calls to `get_mkt_info()`.
- rename `stream_messages()` -> `iter_normed_quotes()`.
2025-02-12 15:05:22 -05:00
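The essence of the repair: key each normalized quote with the broker-specific fqme so the downstream sampler actually processes it. A sketch only; `normalize()` is a hypothetical raw-msg normalizer, not the actual backend code:

```python
async def iter_normed_quotes(ws, mkt):  # renamed from `stream_messages()`
    async for msg in ws:
        quote: dict = normalize(msg)  # hypothetical normalizer
        # the fix: the quote must be keyed by `MktPair.bs_fqme` or
        # the downstream sampling and feed subsys silently drops it.
        yield 'trade', {
            'topic': mkt.bs_fqme,
            **quote,
        }
```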
Nelson Torres bb41dd6d18 Deleted settlePlan field from binance FutesPair. 2025-02-12 15:05:22 -05:00
Nelson Torres 99e90129ad Added missing fields for kucoin.
feeCategory, makerFeeCoefficient, takerFeeCoefficient and st.
2025-02-12 15:05:22 -05:00
Tyler Goodlet cceb7a37b9 Lel, forgot to add a `SPOT` venue for `binance`.. 2025-02-12 15:05:22 -05:00
goodboy 5382815b2d Merge pull request 'uv migration and default.nix for qt6' (#17) from uv_migration into gitea_feats
Reviewed-on: #17
2025-02-12 20:04:02 +00:00
Tyler Goodlet cb1ba8a05f Further root readme bump, factor `.clearing` content
In line with our move to `uv` and recent `nix` support update a bunch of
the summary content and factor out the order-control section to a new
`.piker.clearing` readme file with embedded todos therein.
2025-02-12 15:01:51 -05:00
Nelson Torres 6c65ec4d3b Readme update:
- uv replaces poetry
- update nix-shell command
2025-02-12 13:05:25 -03:00
Nelson Torres 12e371b027 uv migration 2025-02-12 11:19:34 -03:00
goodboy 014bd58db4 Merge pull request 'go_httpx' (#2) from go_httpx into gitea_feats
Landed-in: #2
2025-02-12 13:01:19 +00:00
Tyler Goodlet 844544ed8e Port binance to `httpx`
Like other backends use the `AsyncClient` for all venue specific
client-sessions but change to allocating them inside `get_client()`
using an `AsyncExitStack` and inserting directly in the
`Client.venue_sesh: dict` table during init.

Supporting impl tweaks:
- remove most of the API client session building logic and instead make
  `Client.__init__()` take in a `venue_sessions: dict` (set it to
  `.venue_sesh`) and `conf: dict`, instead opting to do the http client
  configuration inside `get_client()` since all that code only needs to
  be run once.
 |_load config inside `get_client()` once.
 |_move session token creation into a new util func `init_api_keys()` and
  also call it from `get_client()` factory; toss in an ex. toml section
  config to the doc string.
- define `_venue_urls: dict[str, str]` (content taken from old static
  `.venue_sesh` dict) at module level and feed them as `base_url: str`
  inputs to the client create loop.
- adjust all call sigs in httpx-sesh-using methods, namely just
  `._api()`.
- do a `.exch_info()` call in `get_client()` to cache the symbology
  set.

Unrelated changes for various other outstanding buggers:
- to get futures feeds correctly loading when selected
  from search (like 'XMRUSDT.USDTM.PERP'), expect a `MktPair` input to
  `Client.bars()` such that the exact venue-key can be looked up (via
  a new `.pair2venuekey()` meth) and then passed to `._api()`.
- adjust `.broker.open_trade_dialog()` to failover to paper engine when
  there's no `api_key` key set for the `subconf` venue-key.
2025-02-11 16:27:28 -05:00
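A usage sketch of the reworked factory (assuming the `get_client()` signature shown in this PR's diff further below):

```python
import trio

from piker.brokers.binance import get_client

async def main():
    # venue sessions are now allocated inside `get_client()` via an
    # `AsyncExitStack`, so callers just enter the factory and go.
    async with get_client(mkt_mode='usdtm_futes') as client:
        pairs = await client.exch_info()  # symbology cached on entry
        print(f'cached {len(pairs)} pairs')

trio.run(main)
```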
Nelson Torres f479252d26 Added note to exception when missing field in SpotPair class 2025-02-11 16:27:28 -05:00
Nelson Torres 033ef2e35e Added new fields to SpotPair class in venues 2025-02-11 16:27:28 -05:00
Tyler Goodlet 2cdece244c binance: raise `NoData` on null hist arrays
Like we do with other history backends to indicate lack of a data set.
This avoids any raise that would bring down the backloader task with
some downstream error.

Raise a `ValueError` on no time index for now.
2025-02-11 16:27:28 -05:00
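Condensed from the `open_history_client()` hunk in this PR's diff, the two raises look roughly like:

```python
import numpy as np

from piker.brokers import NoData  # exported per this PR's diff

def check_frame(array: np.ndarray, start_dt, end_dt) -> None:
    # an empty frame means the backend has no data for the range:
    # signal with `NoData` so the backloader can handle the gap.
    if array.size == 0:
        raise NoData(f'No frame for {start_dt} -> {end_dt}\n')

    # a frame with all-null times is malformed: raise for now.
    if not array['time'].any():
        raise ValueError(f"Bad frame with null-times?\n\n{array['time']}")
```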
Tyler Goodlet 018694bbdb Woops, `data` can be an empty list XD 2025-02-11 16:27:28 -05:00
Tyler Goodlet 128a2d507f Woops, fix missing `api_url` ref in error log 2025-02-11 16:27:28 -05:00
Tyler Goodlet 430650a6a7 Change type-annots to use `httpx.Response` 2025-02-11 16:27:28 -05:00
Tyler Goodlet 1da3cf5698 Port `kucoin` backend to `httpx` 2025-02-11 16:27:28 -05:00
Tyler Goodlet a348603fc4 Port `kraken` backend to `httpx` 2025-02-11 16:27:28 -05:00
goodboy 86047824d8 Merge pull request '`.brokers.ib` random fixes-n-improvements from various other dev branches..' (#27) from ib_refinements into gitea_feats
Merged-in: #27
2025-02-11 21:26:20 +00:00
Tyler Goodlet cb92abbc38 ib: add connect status info emit 2025-02-11 14:56:17 -05:00
Tyler Goodlet 70332e375b ib: `.api` mod and log-fmt cleaning
About time we tidy'd a buncha status logging in this backend..
particularly for boot-up where there's lots of client-try-connect poll
looping with account detection from the user config.

`.api.Client` pprint and logging fmt improvements:
- add `Client.__repr__()` which shows the minimally useful set of info
  from the underlying `.ib: IB` as well as a new `.acnts: list[str]`
  of the account aliases defined in the user's `brokers.toml`.
- mk `.bars()` define a comprehensive `query_info: str` with all the
  request deats but only display if there's a problem with the response
  data.
- mk `.get_config()` report both the config file path and the acnt
  aliases (NOT the actual account #s).
- move all `.load_aio_clients()` client poll loop requests to
  `log.runtime()` statuses, only falling through to a `.warning()` when
  the loop fails to connect the client to the spec-ed API-gw addr, and
 |_ don't allow loading accounts for which the user has not defined an
    alias in `brokers.toml::[ib]`; raise a value-error in such cases
    with a message indicating how to mod the config.
 |_ only `log.info()` about acnts if some were loaded..

Other mod logging de-noising:
- better status fmting in `.symbols.open_symbol_search()` with
  `repr(Client)`.
- for `.feed.stream_quotes()` first quote reporting use `.runtime()`.
2025-02-11 14:56:17 -05:00
Tyler Goodlet 4940aabe05 ib: warn about mkt precision cuckups that `Contract`s clearly deliver wrong.. 2025-02-11 14:56:17 -05:00
Tyler Goodlet 4f9998e9fb ib: mask out trade and vlm rates for now 2025-02-11 14:56:17 -05:00
Tyler Goodlet c92a236196 ib: more trade record edge case handling
- timestamps came as `'date'`-keyed from 2022 and before but now are
  `'datetime'`..
- some symbols seem to have no commission field, so handle that..
- when no `'price'` field found return `None` from `norm_trade()`.
- add a warn log on mid-fill commission updates.
2025-02-11 14:56:17 -05:00
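A hedged sketch of those edge cases; `norm_trade()` is the real fn name per the commit but this body (and its return shape) is illustrative, not the ib backend's actual implementation:

```python
def norm_trade(record: dict) -> dict | None:
    # timestamps came `'date'`-keyed from 2022 and before but now
    # arrive as `'datetime'`..
    when = record.get('datetime') or record.get('date')

    # some symbols seem to have no commission field, so default it
    commission = record.get('commission', 0.0)

    # no `'price'` field -> nothing normalizable, bail with `None`
    if (price := record.get('price')) is None:
        return None

    return {'time': when, 'cost': commission, 'price': price}
```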
goodboy e4cd1f85f6 Merge pull request 'pyqt6' (#3) from pyqt6 into gitea_feats
Reviewed-on: #3 (well by fomo anyway..)
2025-02-11 17:25:03 +00:00
Tyler Goodlet 129cf58d41 Bump deps for Py3.12, go PyQt6, tweak ruff rules
Code base is already ported for `Qt6` so this removes the pyqt5 dep,
adds latest pyqt6 as well as buncha other updates:

- add `xonsh` and ptk as dev deps for those of us using wacky shells ;P
- bump compiled deps as needed for python 3.12 (`numpy`, `numba`)
- add `httpx` and drop `asks` since the latter is zombied and not compat
  with other libs on 3.12.
- add `ruff` linting ignore rules for the new `.ui.qt` shim mod layer.
- few other deps updates to latest versions.
- add in the `keywords` and `classifiers` sections from the old
  `setup.py`.
2024-05-20 11:07:27 -04:00
Tyler Goodlet 1fd8654ca5 Port all `.ui*` submods to new `.ui.qt` imports
This also officially moves the code base to using `PyQt6` including all
necessary reference changes and enum namespace path moves.

Also includes a small `.ui.order_mode` fix to cancel any
`Order.price <= 0` and a name error fix with logging using `msg`,
which is already used for the input order msg..
2024-05-01 14:33:10 -04:00
Tyler Goodlet d0170982bf Add `piker.ui.qt` as a `PyQt6` shim module
For the future, like if we ever get a `PyQt7` (or wtv else..), add
a module which allows changing Qt binding lib imports from one spot for
all other `.ui` submodules. In some sense this is like a shoddier, less
dynamic version of how `pyqtgraph.Qt.__init__.py` supports multiple
libs; it might actually make sense eventually to import from
their shim layer instead?

Included is a draft attempt at exposing a bunch of enums under
custom names:
- while the specific groupings of values seem to always stay consistent,
  the root enums seem to almost always get moved around in the `PyQtX`
  module namespace.
- groupings and/or each top level enum's ns location can more
  simply be changed/re-orged from one spot.
- allows `.ui` consumer code to use a name more relevant to `piker`'s
  usage of wtv UI component is being configured.
2024-05-01 14:30:18 -04:00
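The shim idea in miniature (alias names below are illustrative, not the actual `piker.ui.qt` namespace):

```python
# piker/ui/qt.py (sketch): swap the Qt binding import here, once,
# for every `.ui` submodule instead of in each consumer.
from PyQt6 import (  # change to `PyQt7` here if/when it exists..
    QtCore,
    QtGui,
    QtWidgets,
)

# re-export enums under piker-relevant names; the value groupings
# stay stable across PyQtX versions even when their ns locations move.
Qt = QtCore.Qt
align_flags = Qt.AlignmentFlag
keys = Qt.Key
```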
Tyler Goodlet 821e73a409 Use a `unit_prefix: str` (like u or $) on health bar 2024-05-01 14:09:39 -04:00
55 changed files with 3366 additions and 879 deletions

View File

@@ -1,162 +1,161 @@
 piker
 -----
-trading gear for hackers.
+trading gear for hackers

 |gh_actions|

 .. |gh_actions| image:: https://img.shields.io/endpoint.svg?url=https%3A%2F%2Factions-badge.atrox.dev%2Fpikers%2Fpiker%2Fbadge&style=popout-square
     :target: https://actions-badge.atrox.dev/piker/pikers/goto

-``piker`` is a broker agnostic, next-gen FOSS toolset for real-time
-computational trading targeted at `hardcore Linux users <comp_trader>`_ .
+``piker`` is a broker agnostic, next-gen FOSS toolset and runtime for
+real-time computational trading targeted at `hardcore Linux users
+<comp_trader>`_ .

-we use as much bleeding edge tech as possible including (but not limited to):
+we use much bleeding edge tech including (but not limited to):

 - latest python for glue_
-- trio_ & tractor_ for our distributed, multi-core, real-time streaming
-  `structured concurrency`_ runtime B)
-- Qt_ for pristine high performance UIs
-- pyqtgraph_ for real-time charting
-- ``polars`` ``numpy`` and ``numba`` for `fast numerics`_
-- `apache arrow and parquet`_ for time series history management
-  persistence and sharing
-- (prototyped) techtonicdb_ for L2 book storage
-
-.. |travis| image:: https://img.shields.io/travis/pikers/piker/master.svg
-    :target: https://travis-ci.org/pikers/piker
+- uv_ for packaging and distribution
+- trio_ & tractor_ for our distributed `structured concurrency`_ runtime
+- Qt_ for pristine low latency UIs
+- pyqtgraph_ (which we've extended) for real-time charting and graphics
+- ``polars`` ``numpy`` and ``numba`` for redic `fast numerics`_
+- `apache arrow and parquet`_ for time-series storage
+
+potential projects we might integrate with soon,
+
+- (already prototyped in ) techtonicdb_ for L2 book storage

+.. _comp_trader: https://jfaleiro.wordpress.com/2019/10/09/computational-trader/
+.. _glue: https://numpy.org/doc/stable/user/c-info.python-as-glue.html#using-python-as-glue
+.. _uv: https://docs.astral.sh/uv/
 .. _trio: https://github.com/python-trio/trio
 .. _tractor: https://github.com/goodboy/tractor
 .. _structured concurrency: https://trio.discourse.group/
-.. _marketstore: https://github.com/alpacahq/marketstore
-.. _techtonicdb: https://github.com/0b01/tectonicdb
 .. _Qt: https://www.qt.io/
 .. _pyqtgraph: https://github.com/pyqtgraph/pyqtgraph
-.. _glue: https://numpy.org/doc/stable/user/c-info.python-as-glue.html#using-python-as-glue
 .. _apache arrow and parquet: https://arrow.apache.org/faq/
 .. _fast numerics: https://zerowithdot.com/python-numpy-and-pandas-performance/
-.. _comp_trader: https://jfaleiro.wordpress.com/2019/10/09/computational-trader/
+.. _techtonicdb: https://github.com/0b01/tectonicdb

-focus and features:
-*******************
+focus and feats:
+****************
+fitting with these tenets, we're always open to new
+framework/lib/service interop suggestions and ideas!

-- 100% federated: your code, your hardware, your data feeds, your broker fills.
-- zero web: low latency, native software that doesn't try to re-invent the OS
-- maximal **privacy**: prevent brokers and mms from knowing your
-  planz; smack their spreads with dark volume.
-- zero clutter: modal, context oriented UIs that echew minimalism, reduce
-  thought noise and encourage un-emotion.
-- first class parallelism: built from the ground up on next-gen structured concurrency
-  primitives.
-- traders first: broker/exchange/asset-class agnostic
-- systems grounded: real-time financial signal processing that will
-  make any queuing or DSP eng juice their shorts.
-- non-tina UX: sleek, powerful keyboard driven interaction with expected use in tiling wms
-- data collaboration: every process and protocol is multi-host scalable.
-- fight club ready: zero interest in adoption by suits; no corporate friendly license, ever.
-
-fitting with these tenets, we're always open to new framework suggestions and ideas.
-
-building the best looking, most reliable, keyboard friendly trading
-platform is the dream; join the cause.
+- **100% federated**:
+  your code, your hardware, your data feeds, your broker fills.
+- **zero web**:
+  low latency as a prime objective, native UIs and modern IPC
+  protocols without trying to re-invent the "OS-as-an-app"..
+- **maximal privacy**:
+  prevent brokers and mms from knowing your planz; smack their
+  spreads with dark volume from a VPN tunnel.
+- **zero clutter**:
+  modal, context oriented UIs that echew minimalism, reduce thought
+  noise and encourage un-emotion.
+- **first class parallelism**:
+  built from the ground up on a next-gen structured concurrency
+  supervision sys.
+- **traders first**:
+  broker/exchange/venue/asset-class/money-sys agnostic
+- **systems grounded**:
+  real-time financial signal processing (fsp) that will make any
+  queuing or DSP eng juice their shorts.
+- **non-tina UX**:
+  sleek, powerful keyboard driven interaction with expected use in
+  tiling wms (or maybe even a DDE).
+- **data collab at scale**:
+  every actor-process and protocol is multi-host aware.
+- **fight club ready**:
+  zero interest in adoption by suits; no corporate friendly license,
+  ever.
+
+building the hottest looking, fastest, most reliable, keyboard
+friendly FOSS trading platform is the dream; join the cause.

-sane install with `poetry`
-**************************
-TODO!
+a sane install with `uv`
+************************
+bc why install with `python` when you can faster with `rust` ::
+
+    uv lock

-rigorous install on ``nixos`` using ``poetry2nix``
-**************************************************
-TODO!
-
 hacky install on nixos
 **********************
-`NixOS` is our core devs' distro of choice for which we offer
+``NixOS`` is our core devs' distro of choice for which we offer
 a stringently defined development shell envoirment that can be loaded with::

-    nix-shell develop.nix
-
-this will setup the required python environment to run piker, make sure to
-run::
-
-    pip install -r requirements.txt -e .
-
-once after loading the shell
+    nix-shell default.nix

-install wild-west style via `pip`
-*********************************
-``piker`` is currently under heavy pre-alpha development and as such
-should be cloned from this repo and hacked on directly.
-
-for a development install::
-
-    git clone git@github.com:pikers/piker.git
-    cd piker
-    virtualenv env
-    source ./env/bin/activate
-    pip install -r requirements.txt -e .
+start a chart
+*************
+run a realtime OHLCV chart stand-alone::
+
+    piker -l info chart btcusdt.spot.binance xmrusdt.spot.kraken
+
+this runs a chart UI (with 1m sampled OHLCV) and shows 2 spot markets from 2 diff cexes
+overlayed on the same graph. Use of `piker` without first starting
+a daemon (`pikerd` - see below) means there is an implicit spawning of the
+multi-actor-runtime (implemented as a `tractor` app).
+
+For additional subsystem feats available through our chart UI see the
+various sub-readmes:
+
+- order control using a mouse-n-keyboard UX B)
+- cross venue market-pair (what most call "symbol") search, select, overlay Bo
+- financial-signal-processing (`piker.fsp`) write-n-reload to sub-chart BO
+- src-asset derivatives scan for anal, like the infamous "max pain" XO

-check out our charts
-********************
-bet you weren't expecting this from the foss::
-
-    piker -l info -b kraken -b binance chart btcusdt.binance --pdb
-
-this runs the main chart (currently with 1m sampled OHLC) in in debug
-mode and you can practice paper trading using the following
-micro-manual:
-
-``order_mode`` (
-    edge triggered activation by any of the following keys,
-    ``mouse-click`` on y-level to submit at that price
-):
-
-- ``f``/ ``ctl-f`` to stage buy
-- ``d``/ ``ctl-d`` to stage sell
-- ``a`` to stage alert
-
-``search_mode`` (
-    ``ctl-l`` or ``ctl-space`` to open,
-    ``ctl-c`` or ``ctl-space`` to close
-) :
-
-- begin typing to have symbol search automatically lookup
-  symbols from all loaded backend (broker) providers
-- arrow keys and mouse click to navigate selection
-- vi-like ``ctl-[hjkl]`` for navigation
-
-you can also configure your position allocation limits from the
-sidepane.
-
-run in distributed mode
-***********************
-start the service manager and data feed daemon in the background and
-connect to it::
+spawn a daemon standalone
+*************************
+we call the root actor-process the ``pikerd``. it can be (and is
+recommended normally to be) started separately from the ``piker
+chart`` program::

     pikerd -l info --pdb

-connect your chart::
-
-    piker -l info -b kraken -b binance chart xmrusdt.binance --pdb
-
-enjoy persistent real-time data feeds tied to daemon lifetime. the next
-time you spawn a chart it will load much faster since the data feed has
-been cached and is now always running live in the background until you
-kill ``pikerd``.
+the daemon does nothing until a ``piker``-client (like ``piker
+chart``) connects and requests some particular sub-system. for
+a connecting chart ``pikerd`` will spawn and manage at least,
+
+- a data-feed daemon: ``datad`` which does all the work of comms with
+  the backend provider (in this case the ``binance`` cex).
+- a paper-trading engine instance, ``paperboi.binance``, (if no live
+  account has been configured) which allows for auto/manual order
+  control against the live quote stream.
+
+*using* an actor-service (aka micro-daemon) manager which dynamically
+supervises various sub-subsystems-as-services throughout the ``piker``
+runtime-stack.
+
+now you can (implicitly) connect your chart::
+
+    piker chart btcusdt.spot.binance
+
+since ``pikerd`` was started separately you can now enjoy a persistent
+real-time data stream tied to the daemon-tree's lifetime. i.e. the next
+time you spawn a chart it will obviously not only load much faster
+(since the underlying ``datad.binance`` is left running with its
+in-memory IPC data structures) but also the data-feed and any order
+mgmt states should be persistent until you finally cancel ``pikerd``.

 if anyone asks you what this project is about
 *********************************************
-you don't talk about it.
+you don't talk about it; just use it.

 how do i get involved?
@@ -166,6 +165,15 @@ enter the matrix.

 how come there ain't that many docs
 ***********************************
-suck it up, learn the code; no one is trying to sell you on anything.
-also, we need lotsa help so if you want to start somewhere and can't
-necessarily write serious code, this might be the place for you!
+i mean we want/need them but building the core right has been higher
+prio then marketting (and likely will stay that way Bp).
+
+soo, suck it up bc,
+
+- no one is trying to sell you on anything
+- learning the code base is prolly way more valuable
+- the UI/UXs are intended to be "intuitive" for any hacker..
+
+we obviously need tonz help so if you want to start somewhere and
+can't necessarily write "advanced" concurrent python/rust code,
+helping document literally anything might be the place for you!

default.nix 100644
View File

@@ -0,0 +1,134 @@
with (import <nixpkgs> {});
let
glibStorePath = lib.getLib glib;
zlibStorePath = lib.getLib zlib;
zstdStorePath = lib.getLib zstd;
dbusStorePath = lib.getLib dbus;
libGLStorePath = lib.getLib libGL;
freetypeStorePath = lib.getLib freetype;
qt6baseStorePath = lib.getLib qt6.qtbase;
fontconfigStorePath = lib.getLib fontconfig;
libxkbcommonStorePath = lib.getLib libxkbcommon;
xcbutilcursorStorePath = lib.getLib xcb-util-cursor;
qtpyStorePath = lib.getLib python312Packages.qtpy;
pyqt6StorePath = lib.getLib python312Packages.pyqt6;
pyqt6SipStorePath = lib.getLib python312Packages.pyqt6-sip;
rapidfuzzStorePath = lib.getLib python312Packages.rapidfuzz;
qdarkstyleStorePath = lib.getLib python312Packages.qdarkstyle;
xorgLibX11StorePath = lib.getLib xorg.libX11;
xorgLibxcbStorePath = lib.getLib xorg.libxcb;
xorgxcbutilwmStorePath = lib.getLib xorg.xcbutilwm;
xorgxcbutilimageStorePath = lib.getLib xorg.xcbutilimage;
xorgxcbutilerrorsStorePath = lib.getLib xorg.xcbutilerrors;
xorgxcbutilkeysymsStorePath = lib.getLib xorg.xcbutilkeysyms;
xorgxcbutilrenderutilStorePath = lib.getLib xorg.xcbutilrenderutil;
in
stdenv.mkDerivation {
name = "piker-qt6-uv";
buildInputs = [
# System requirements.
glib
zlib
dbus
zstd
libGL
freetype
qt6.qtbase
libgcc.lib
fontconfig
libxkbcommon
# Xorg requirements
xcb-util-cursor
xorg.libxcb
xorg.libX11
xorg.xcbutilwm
xorg.xcbutilimage
xorg.xcbutilerrors
xorg.xcbutilkeysyms
xorg.xcbutilrenderutil
# Python requirements.
python312Full
python312Packages.uv
python312Packages.qdarkstyle
python312Packages.rapidfuzz
python312Packages.pyqt6
python312Packages.qtpy
];
src = null;
shellHook = ''
set -e
# Set the Qt plugin path
# export QT_DEBUG_PLUGINS=1
QTBASE_PATH="${qt6baseStorePath}/lib"
QT_PLUGIN_PATH="$QTBASE_PATH/qt-6/plugins"
QT_QPA_PLATFORM_PLUGIN_PATH="$QT_PLUGIN_PATH/platforms"
LIB_GCC_PATH="${libgcc.lib}/lib"
GLIB_PATH="${glibStorePath}/lib"
ZSTD_PATH="${zstdStorePath}/lib"
ZLIB_PATH="${zlibStorePath}/lib"
DBUS_PATH="${dbusStorePath}/lib"
LIBGL_PATH="${libGLStorePath}/lib"
FREETYPE_PATH="${freetypeStorePath}/lib"
FONTCONFIG_PATH="${fontconfigStorePath}/lib"
LIB_XKB_COMMON_PATH="${libxkbcommonStorePath}/lib"
XCB_UTIL_CURSOR_PATH="${xcbutilcursorStorePath}/lib"
XORG_LIB_X11_PATH="${xorgLibX11StorePath}/lib"
XORG_LIB_XCB_PATH="${xorgLibxcbStorePath}/lib"
XORG_XCB_UTIL_IMAGE_PATH="${xorgxcbutilimageStorePath}/lib"
XORG_XCB_UTIL_WM_PATH="${xorgxcbutilwmStorePath}/lib"
XORG_XCB_UTIL_RENDER_UTIL_PATH="${xorgxcbutilrenderutilStorePath}/lib"
XORG_XCB_UTIL_KEYSYMS_PATH="${xorgxcbutilkeysymsStorePath}/lib"
XORG_XCB_UTIL_ERRORS_PATH="${xorgxcbutilerrorsStorePath}/lib"
LD_LIBRARY_PATH="$LD_LIBRARY_PATH:$QTBASE_PATH"
LD_LIBRARY_PATH="$LD_LIBRARY_PATH:$QT_PLUGIN_PATH"
LD_LIBRARY_PATH="$LD_LIBRARY_PATH:$QT_QPA_PLATFORM_PLUGIN_PATH"
LD_LIBRARY_PATH="$LD_LIBRARY_PATH:$LIB_GCC_PATH"
LD_LIBRARY_PATH="$LD_LIBRARY_PATH:$DBUS_PATH"
LD_LIBRARY_PATH="$LD_LIBRARY_PATH:$GLIB_PATH"
LD_LIBRARY_PATH="$LD_LIBRARY_PATH:$ZLIB_PATH"
LD_LIBRARY_PATH="$LD_LIBRARY_PATH:$ZSTD_PATH"
LD_LIBRARY_PATH="$LD_LIBRARY_PATH:$LIBGL_PATH"
LD_LIBRARY_PATH="$LD_LIBRARY_PATH:$FONTCONFIG_PATH"
LD_LIBRARY_PATH="$LD_LIBRARY_PATH:$FREETYPE_PATH"
LD_LIBRARY_PATH="$LD_LIBRARY_PATH:$LIB_XKB_COMMON_PATH"
LD_LIBRARY_PATH="$LD_LIBRARY_PATH:$XCB_UTIL_CURSOR_PATH"
LD_LIBRARY_PATH="$LD_LIBRARY_PATH:$XORG_LIB_X11_PATH"
LD_LIBRARY_PATH="$LD_LIBRARY_PATH:$XORG_LIB_XCB_PATH"
LD_LIBRARY_PATH="$LD_LIBRARY_PATH:$XORG_XCB_UTIL_IMAGE_PATH"
LD_LIBRARY_PATH="$LD_LIBRARY_PATH:$XORG_XCB_UTIL_WM_PATH"
LD_LIBRARY_PATH="$LD_LIBRARY_PATH:$XORG_XCB_UTIL_RENDER_UTIL_PATH"
LD_LIBRARY_PATH="$LD_LIBRARY_PATH:$XORG_XCB_UTIL_KEYSYMS_PATH"
LD_LIBRARY_PATH="$LD_LIBRARY_PATH:$XORG_XCB_UTIL_ERRORS_PATH"
export LD_LIBRARY_PATH
RPDFUZZ_PATH="${rapidfuzzStorePath}/lib/python3.12/site-packages"
QDRKSTYLE_PATH="${qdarkstyleStorePath}/lib/python3.12/site-packages"
QTPY_PATH="${qtpyStorePath}/lib/python3.12/site-packages"
PYQT6_PATH="${pyqt6StorePath}/lib/python3.12/site-packages"
PYQT6_SIP_PATH="${pyqt6SipStorePath}/lib/python3.12/site-packages"
PATCH="$PATCH:$RPDFUZZ_PATH"
PATCH="$PATCH:$QDRKSTYLE_PATH"
PATCH="$PATCH:$QTPY_PATH"
PATCH="$PATCH:$PYQT6_PATH"
PATCH="$PATCH:$PYQT6_SIP_PATH"
export PATCH
# Install deps
uv lock
'';
}

View File

@@ -50,7 +50,7 @@ __brokers__: list[str] = [
     'binance',
     'ib',
     'kraken',
-    'kucoin'
+    'kucoin',

     # broken but used to work
     # 'questrade',
@@ -71,7 +71,7 @@ def get_brokermod(brokername: str) -> ModuleType:
     Return the imported broker module by name.

     '''
-    module = import_module('.' + brokername, 'piker.brokers')
+    module: ModuleType = import_module('.' + brokername, 'piker.brokers')
    # we only allow monkeying because it's for internal keying
     module.name = module.__name__.split('.')[-1]
     return module

View File

@@ -18,10 +18,11 @@
 Handy cross-broker utils.

 """
+from __future__ import annotations
 from functools import partial

 import json
-import asks
+import httpx
 import logging

 from ..log import (
@@ -60,11 +61,11 @@ class NoData(BrokerError):
     def __init__(
         self,
         *args,
-        info: dict,
+        info: dict|None = None,

     ) -> None:
         super().__init__(*args)
-        self.info: dict = info
+        self.info: dict|None = info

         # when raised, machinery can check if the backend
         # set a "frame size" for doing datetime calcs.
@@ -90,16 +91,18 @@ class DataThrottle(BrokerError):

 def resproc(
-    resp: asks.response_objects.Response,
+    resp: httpx.Response,
     log: logging.Logger,
     return_json: bool = True,
     log_resp: bool = False,

-) -> asks.response_objects.Response:
-    """Process response and return its json content.
+) -> httpx.Response:
+    '''
+    Process response and return its json content.

     Raise the appropriate error on non-200 OK responses.
-    """
+
+    '''
     if not resp.status_code == 200:
         raise BrokerError(resp.body)
     try:

View File

@@ -1,8 +1,8 @@
 # piker: trading gear for hackers
 # Copyright (C)
 #   Guillermo Rodriguez (aka ze jefe)
 #   Tyler Goodlet
 #   (in stewardship for pikers)

 # This program is free software: you can redistribute it and/or modify
 # it under the terms of the GNU Affero General Public License as published by
@@ -25,14 +25,13 @@ from __future__ import annotations
 from collections import ChainMap
 from contextlib import (
     asynccontextmanager as acm,
+    AsyncExitStack,
 )
 from datetime import datetime
 from pprint import pformat
 from typing import (
     Any,
     Callable,
-    Hashable,
-    Sequence,
     Type,
 )
 import hmac
@@ -43,8 +42,7 @@ import trio
 from pendulum import (
     now,
 )
-import asks
-from rapidfuzz import process as fuzzy
+import httpx
 import numpy as np

 from piker import config
@@ -54,6 +52,7 @@ from piker.clearing._messages import (
 from piker.accounting import (
     Asset,
     digits_to_dec,
+    MktPair,
 )
 from piker.types import Struct
 from piker.data import (
@@ -69,7 +68,6 @@ from .venues import (
     PAIRTYPES,
     Pair,
     MarketType,
-
     _spot_url,
     _futes_url,
     _testnet_futes_url,
@@ -79,19 +77,18 @@ from .venues import (

 log = get_logger('piker.brokers.binance')


-def get_config() -> dict:
+def get_config() -> dict[str, Any]:

     conf: dict
     path: Path
     conf, path = config.load(
         conf_name='brokers',
         touch_if_dne=True,
     )
-
-    section = conf.get('binance')
-
+    section: dict = conf.get('binance')
     if not section:
-        log.warning(f'No config section found for binance in {path}')
+        log.warning(
+            f'No config section found for binance in {path}'
+        )
         return {}

     return section
@@ -147,7 +144,7 @@ def binance_timestamp(

 class Client:
     '''
-    Async ReST API client using ``trio`` + ``asks`` B)
+    Async ReST API client using `trio` + `httpx` B)

     Supports all of the spot, margin and futures endpoints depending
     on method.
@@ -156,10 +153,17 @@ class Client:

     def __init__(
         self,
+
+        venue_sessions: dict[
+            str,  # venue key
+            tuple[httpx.AsyncClient, str]  # session, eps path
+        ],
+        conf: dict[str, Any],

         # TODO: change this to `Client.[mkt_]venue: MarketType`?
         mkt_mode: MarketType = 'spot',

     ) -> None:
+        self.conf = conf

         # build out pair info tables for each market type
         # and wrap in a chain-map view for search / query.
         self._spot_pairs: dict[str, Pair] = {}  # spot info table
@@ -186,44 +190,13 @@ class Client:
         # market symbols for use by search. See `.exch_info()`.
         self._pairs: ChainMap[str, Pair] = ChainMap()

-        # spot EPs sesh
-        self._sesh = asks.Session(connections=4)
-        self._sesh.base_location: str = _spot_url
-        # spot testnet
-        self._test_sesh: asks.Session = asks.Session(connections=4)
-        self._test_sesh.base_location: str = _testnet_spot_url
-
-        # margin and extended spot endpoints session.
-        self._sapi_sesh = asks.Session(connections=4)
-        self._sapi_sesh.base_location: str = _spot_url
-
-        # futes EPs sesh
-        self._fapi_sesh = asks.Session(connections=4)
-        self._fapi_sesh.base_location: str = _futes_url
-        # futes testnet
-        self._test_fapi_sesh: asks.Session = asks.Session(connections=4)
-        self._test_fapi_sesh.base_location: str = _testnet_futes_url
-
         # global client "venue selection" mode.
         # set this when you want to switch venues and not have to
         # specify the venue for the next request.
         self.mkt_mode: MarketType = mkt_mode

-        # per 8
-        self.venue_sesh: dict[
-            str,  # venue key
-            tuple[asks.Session, str]  # session, eps path
-        ] = {
-            'spot': (self._sesh, '/api/v3/'),
-            'spot_testnet': (self._test_sesh, '/fapi/v1/'),
-            'margin': (self._sapi_sesh, '/sapi/v1/'),
-            'usdtm_futes': (self._fapi_sesh, '/fapi/v1/'),
-            'usdtm_futes_testnet': (self._test_fapi_sesh, '/fapi/v1/'),
-            # 'futes_coin': self._dapi,  # TODO
-        }
+        # per-mkt-venue API client table
+        self.venue_sesh = venue_sessions

         # lookup for going from `.mkt_mode: str` to the config
         # subsection `key: str`
@@ -238,40 +211,6 @@ class Client:
             'futes': ['usdtm_futes'],
         }

-        # for creating API keys see,
-        # https://www.binance.com/en/support/faq/how-to-create-api-keys-on-binance-360002502072
-        self.conf: dict = get_config()
-
-        for key, subconf in self.conf.items():
-            if api_key := subconf.get('api_key', ''):
-                venue_keys: list[str] = self.confkey2venuekeys[key]
-
-                venue_key: str
-                sesh: asks.Session
-                for venue_key in venue_keys:
-                    sesh, _ = self.venue_sesh[venue_key]
-
-                    api_key_header: dict = {
-                        # taken from official:
-                        # https://github.com/binance/binance-futures-connector-python/blob/main/binance/api.py#L47
-                        "Content-Type": "application/json;charset=utf-8",
-
-                        # TODO: prolly should just always query and copy
-                        # in the real latest ver?
-                        "User-Agent": "binance-connector/6.1.6smbz6",
-                        "X-MBX-APIKEY": api_key,
-                    }
-                    sesh.headers.update(api_key_header)
-
-                    # if `.use_tesnet = true` in the config then
-                    # also add headers for the testnet session which
-                    # will be used for all order control
-                    if subconf.get('use_testnet', False):
-                        testnet_sesh, _ = self.venue_sesh[
-                            venue_key + '_testnet'
-                        ]
-                        testnet_sesh.headers.update(api_key_header)
-
     def _mk_sig(
         self,
         data: dict,
@@ -290,7 +229,6 @@ class Client:
                 'to define the creds for auth-ed endpoints!?'
             )

-
         # XXX: Info on security and authentification
         # https://binance-docs.github.io/apidocs/#endpoint-security-type

         if not (api_secret := subconf.get('api_secret')):
@@ -319,7 +257,7 @@ class Client:
         params: dict,
         method: str = 'get',

-        venue: str | None = None,  # if None use `.mkt_mode` state
+        venue: str|None = None,  # if None use `.mkt_mode` state
         signed: bool = False,
         allow_testnet: bool = False,
@@ -330,8 +268,9 @@ class Client:
         - /fapi/v3/ USD-M FUTURES, or
         - /api/v3/ SPOT/MARGIN

-        account/market endpoint request depending on either passed in `venue: str`
-        or the current setting `.mkt_mode: str` setting, default `'spot'`.
+        account/market endpoint request depending on either passed in
+        `venue: str` or the current setting `.mkt_mode: str` setting,
+        default `'spot'`.

         Docs per venue API:
@@ -360,9 +299,6 @@ class Client:
                 venue=venue_key,
             )

-        sesh: asks.Session
-        path: str
-
         # Check if we're configured to route order requests to the
         # venue equivalent's testnet.
         use_testnet: bool = False
@@ -387,11 +323,12 @@ class Client:
             # ctl machinery B)
             venue_key += '_testnet'

-        sesh, path = self.venue_sesh[venue_key]
-
-        meth: Callable = getattr(sesh, method)
+        client: httpx.AsyncClient
+        path: str
+        client, path = self.venue_sesh[venue_key]
+
+        meth: Callable = getattr(client, method)
         resp = await meth(
-            path=path + endpoint,
+            url=path + endpoint,
             params=params,
             timeout=float('inf'),
         )
@@ -433,7 +370,15 @@ class Client:
                 item['filters'] = filters

             pair_type: Type = PAIRTYPES[venue]
-            pair: Pair = pair_type(**item)
+            try:
+                pair: Pair = pair_type(**item)
+            except Exception as e:
+                e.add_note(
+                    "\nDon't panic, prolly stupid binance changed their symbology schema again..\n"
+                    'Check out their API docs here:\n\n'
+                    'https://binance-docs.github.io/apidocs/spot/en/#exchange-information'
+                )
+                raise
             pair_table[pair.symbol.upper()] = pair

             # update an additional top-level-cross-venue-table
@@ -528,7 +473,9 @@ class Client:

         '''
         pair_table: dict[str, Pair] = self._venue2pairs[
-            venue or self.mkt_mode
+            venue
+            or
+            self.mkt_mode
         ]
         if (
             expiry
@@ -547,9 +494,9 @@ class Client:
             venues: list[str] = [venue]

         # batch per-venue download of all exchange infos
-        async with trio.open_nursery() as rn:
+        async with trio.open_nursery() as tn:
             for ven in venues:
-                rn.start_soon(
+                tn.start_soon(
                     self._cache_pairs,
                     ven,
                 )
@@ -602,11 +549,11 @@ class Client:

     ) -> dict[str, Any]:

-        fq_pairs: dict = await self.exch_info()
+        fq_pairs: dict[str, Pair] = await self.exch_info()

         # TODO: cache this list like we were in
         # `open_symbol_search()`?
-        keys: list[str] = list(fq_pairs)
+        # keys: list[str] = list(fq_pairs)

         return match_from_pairs(
             pairs=fq_pairs,
@@ -614,9 +561,20 @@ class Client:
             score_cutoff=50,
         )

+    def pair2venuekey(
+        self,
+        pair: Pair,
+    ) -> str:
+        return {
+            'USDTM': 'usdtm_futes',
+            'SPOT': 'spot',
+            # 'COINM': 'coin_futes',
+            # ^-TODO-^ bc someone might want it..?
+        }[pair.venue]
+
     async def bars(
         self,
-        symbol: str,
+        mkt: MktPair,

         start_dt: datetime | None = None,
         end_dt: datetime | None = None,
@@ -646,16 +604,20 @@ class Client:
         start_time = binance_timestamp(start_dt)
         end_time = binance_timestamp(end_dt)

+        bs_pair: Pair = self._pairs[mkt.bs_fqme.upper()]
+
         # https://binance-docs.github.io/apidocs/spot/en/#kline-candlestick-data
         bars = await self._api(
             'klines',
             params={
-                'symbol': symbol.upper(),
+                # NOTE: always query using their native symbology!
+                'symbol': mkt.bs_mktid.upper(),
                 'interval': '1m',
                 'startTime': start_time,
                 'endTime': end_time,
                 'limit': limit
             },
+            venue=self.pair2venuekey(bs_pair),
             allow_testnet=False,
         )
         new_bars: list[tuple] = []
@@ -972,17 +934,148 @@ class Client:
         await self.close_listen_key(key)


+_venue_urls: dict[str, str] = {
+    'spot': (
+        _spot_url,
+        '/api/v3/',
+    ),
+    'spot_testnet': (
+        _testnet_spot_url,
+        '/fapi/v1/'
+    ),
+    # margin and extended spot endpoints session.
+    # TODO: did this ever get implemented fully?
+    # 'margin': (
+    #     _spot_url,
+    #     '/sapi/v1/'
+    # ),
+    'usdtm_futes': (
+        _futes_url,
+        '/fapi/v1/',
+    ),
+    'usdtm_futes_testnet': (
+        _testnet_futes_url,
+        '/fapi/v1/',
+    ),
+    # TODO: for anyone who actually needs it ;P
+    # 'coin_futes': ()
+}
+
+
+def init_api_keys(
+    client: Client,
+    conf: dict[str, Any],
+) -> None:
+    '''
+    Set up per-venue API keys each http client according to the user's
+    `brokers.conf`.
+
+    For ex, to use spot-testnet and live usdt futures APIs:
+
+    ```toml
+    [binance]
+    # spot test net
+    spot.use_testnet = true
+    spot.api_key = '<spot_api_key_from_binance_account>'
+    spot.api_secret = '<spot_api_key_password>'
+
+    # futes live
+    futes.use_testnet = false
+    accounts.usdtm = 'futes'
+    futes.api_key = '<futes_api_key_from_binance>'
+    futes.api_secret = '<futes_api_key_password>''
+
+    # if uncommented will use the built-in paper engine and not
+    # connect to `binance` API servers for order ctl.
+    # accounts.paper = 'paper'
+    ```
+
+    '''
+    for key, subconf in conf.items():
+        if api_key := subconf.get('api_key', ''):
+            venue_keys: list[str] = client.confkey2venuekeys[key]
+
+            venue_key: str
+            client: httpx.AsyncClient
+            for venue_key in venue_keys:
+                client, _ = client.venue_sesh[venue_key]
+
+                api_key_header: dict = {
+                    # taken from official:
+                    # https://github.com/binance/binance-futures-connector-python/blob/main/binance/api.py#L47
+                    "Content-Type": "application/json;charset=utf-8",
+
+                    # TODO: prolly should just always query and copy
+                    # in the real latest ver?
+                    "User-Agent": "binance-connector/6.1.6smbz6",
+                    "X-MBX-APIKEY": api_key,
+                }
+                client.headers.update(api_key_header)
+
+                # if `.use_tesnet = true` in the config then
+                # also add headers for the testnet session which
+                # will be used for all order control
+                if subconf.get('use_testnet', False):
+                    testnet_sesh, _ = client.venue_sesh[
+                        venue_key + '_testnet'
+                    ]
+                    testnet_sesh.headers.update(api_key_header)
+
+
 @acm
-async def get_client() -> Client:
-
-    client = Client()
-    await client.exch_info()
-    log.info(
-        f'{client} in {client.mkt_mode} mode: caching exchange infos..\n'
-        'Cached multi-market pairs:\n'
-        f'spot: {len(client._spot_pairs)}\n'
-        f'usdtm_futes: {len(client._ufutes_pairs)}\n'
-        f'Total: {len(client._pairs)}\n'
-    )
-
-    yield client
+async def get_client(
+    mkt_mode: MarketType = 'spot',
+) -> Client:
+    '''
+    Construct an single `piker` client which composes multiple underlying venue
+    specific API clients both for live and test networks.
+
+    '''
+    venue_sessions: dict[
+        str,  # venue key
+        tuple[httpx.AsyncClient, str]  # session, eps path
+    ] = {}
+    async with AsyncExitStack() as client_stack:
+        for name, (base_url, path) in _venue_urls.items():
+            api: httpx.AsyncClient = await client_stack.enter_async_context(
+                httpx.AsyncClient(
+                    base_url=base_url,
+                    # headers={},
+
+                    # TODO: is there a way to numerate this?
+                    # https://www.python-httpx.org/advanced/clients/#why-use-a-client
+                    # connections=4
+                )
+            )
+            venue_sessions[name] = (
+                api,
+                path,
+            )
+
+        conf: dict[str, Any] = get_config()
+        # for creating API keys see,
+        # https://www.binance.com/en/support/faq/how-to-create-api-keys-on-binance-360002502072
+        client = Client(
+            venue_sessions=venue_sessions,
+            conf=conf,
+            mkt_mode=mkt_mode,
+        )
+        init_api_keys(
+            client=client,
+            conf=conf,
+        )
+        fq_pairs: dict[str, Pair] = await client.exch_info()
+        assert fq_pairs
+        log.info(
+            f'Loaded multi-venue `Client` in mkt_mode={client.mkt_mode!r}\n\n'
+            f'Symbology Summary:\n'
+            f'------ - ------\n'
+            f'spot: {len(client._spot_pairs)}\n'
+            f'usdtm_futes: {len(client._ufutes_pairs)}\n'
+            '------ - ------\n'
+            f'total: {len(client._pairs)}\n'
+        )
+        yield client

View File

@@ -264,15 +264,20 @@ async def open_trade_dialog(
     # do a open_symcache() call.. though maybe we can hide
     # this in a new async version of open_account()?
     async with open_cached_client('binance') as client:
-        subconf: dict = client.conf[venue_name]
-        use_testnet = subconf.get('use_testnet', False)
+        subconf: dict|None = client.conf.get(venue_name)

         # XXX: if no futes.api_key or spot.api_key has been set we
         # always fall back to the paper engine!
-        if not subconf.get('api_key'):
+        if (
+            not subconf
+            or
+            not subconf.get('api_key')
+        ):
             await ctx.started('paper')
             return

+        use_testnet: bool = subconf.get('use_testnet', False)
+
     async with (
         open_cached_client('binance') as client,
     ):

View File

@@ -42,12 +42,12 @@ from trio_typing import TaskStatus
 from pendulum import (
     from_timestamp,
 )
-from rapidfuzz import process as fuzzy
 import numpy as np
 import tractor

 from piker.brokers import (
     open_cached_client,
+    NoData,
 )
 from piker._cacheables import (
     async_lifo_cache,
@@ -110,6 +110,7 @@ class AggTrade(Struct, frozen=True):

 async def stream_messages(
     ws: NoBsWs,
+
 ) -> AsyncGenerator[NoBsWs, dict]:

     # TODO: match syntax here!
@@ -220,6 +221,8 @@ def make_sub(pairs: list[str], sub_name: str, uid: int) -> dict[str, str]:
     }


+# TODO, why aren't frame resp `log.info()`s showing in upstream
+# code?!
 @acm
 async def open_history_client(
     mkt: MktPair,
@@ -252,24 +255,30 @@ async def open_history_client(
             else:
                 client.mkt_mode = 'spot'

-            # NOTE: always query using their native symbology!
-            mktid: str = mkt.bs_mktid
-            array = await client.bars(
-                mktid,
+            array: np.ndarray = await client.bars(
+                mkt=mkt,
                 start_dt=start_dt,
                 end_dt=end_dt,
             )
+            if array.size == 0:
+                raise NoData(
+                    f'No frame for {start_dt} -> {end_dt}\n'
+                )
+
             times = array['time']
-            if (
-                end_dt is None
-            ):
-                inow = round(time.time())
+            if not times.any():
+                raise ValueError(
+                    'Bad frame with null-times?\n\n'
+                    f'{times}'
+                )
+
+            if end_dt is None:
+                inow: int = round(time.time())
                 if (inow - times[-1]) > 60:
                     await tractor.pause()

             start_dt = from_timestamp(times[0])
             end_dt = from_timestamp(times[-1])
+
             return array, start_dt, end_dt

         yield get_ohlc, {'erlangs': 3, 'rate': 3}
@@ -456,6 +465,8 @@ async def stream_quotes(
     ):
         init_msgs: list[FeedInit] = []
         for sym in symbols:
+            mkt: MktPair
+            pair: Pair
             mkt, pair = await get_mkt_info(sym)

             # build out init msgs according to latest spec
@@ -504,7 +515,6 @@ async def stream_quotes(

     # start streaming
     async for typ, quote in msg_gen:
-
         # period = time.time() - last
         # hz = 1/period if period else float('inf')
         # if hz > 60:
@@ -540,7 +550,7 @@ async def open_symbol_search(
             )

             # repack in fqme-keyed table
-            byfqme: dict[start, Pair] = {}
+            byfqme: dict[str, Pair] = {}
             for pair in pairs.values():
                 byfqme[pair.bs_fqme] = pair

View File

@@ -137,10 +137,12 @@ class SpotPair(Pair, frozen=True):
     quoteOrderQtyMarketAllowed: bool
     isSpotTradingAllowed: bool
     isMarginTradingAllowed: bool
+    otoAllowed: bool

     defaultSelfTradePreventionMode: str
     allowedSelfTradePreventionModes: list[str]
     permissions: list[str]
+    permissionSets: list[list[str]]

     # NOTE: see `.data._symcache.SymbologyCache.load()` for why
     ns_path: str = 'piker.brokers.binance:SpotPair'
@@ -179,7 +181,6 @@ class FutesPair(Pair):
     quoteAsset: str  # 'USDT',
     quotePrecision: int  # 8,
     requiredMarginPercent: float  # '5.0000',
-    settlePlan: int  # 0,
     timeInForce: list[str]  # ['GTC', 'IOC', 'FOK', 'GTX'],
     triggerProtect: float  # '0.0500',
     underlyingSubType: list[str]  # ['PoW'],

View File

@@ -100,7 +100,7 @@ async def data_reset_hack(
         log.warning(
             no_setup_msg
             +
-            f'REQUIRES A `vnc_addrs: array` ENTRY'
+            'REQUIRES A `vnc_addrs: array` ENTRY'
         )

     vnc_host, vnc_port = vnc_sockaddr.get(
@@ -259,7 +259,7 @@ def i3ipc_xdotool_manual_click_hack() -> None:
             timeout=timeout,
         )

-    # re-activate and focus original window
+        # re-activate and focus original window
         subprocess.call([
             'xdotool',
             'windowactivate', '--sync', str(orig_win_id),

View File

@ -287,9 +287,31 @@ class Client:
self.conf = config self.conf = config
# NOTE: the ib.client here is "throttled" to 45 rps by default # NOTE: the ib.client here is "throttled" to 45 rps by default
self.ib = ib self.ib: IB = ib
self.ib.RaiseRequestErrors: bool = True self.ib.RaiseRequestErrors: bool = True
# self._acnt_names: set[str] = {}
self._acnt_names: list[str] = []
@property
def acnts(self) -> list[str]:
# return list(self._acnt_names)
return self._acnt_names
def __repr__(self) -> str:
return (
f'<{type(self).__name__}('
f'ib={self.ib} '
f'acnts={self.acnts}'
# TODO: we need to mask out acnt-#s and other private
# infos if we're going to console this!
# f' |_.conf:\n'
# f' {pformat(self.conf)}\n'
')>'
)
async def get_fills(self) -> list[Fill]: async def get_fills(self) -> list[Fill]:
''' '''
Return list of rents `Fills` from trading session. Return list of rents `Fills` from trading session.
@@ -376,55 +398,63 @@ class Client:
            # whatToShow='MIDPOINT',
            # whatToShow='TRADES',
        )
-       log.info(
-           f'REQUESTING {ib_duration_str} worth {bar_size} BARS\n'
-           f'fqme: {fqme}\n'
-           f'global _enters: {_enters}\n'
-           f'kwargs: {pformat(kwargs)}\n'
-       )
        bars = await self.ib.reqHistoricalDataAsync(
            **kwargs,
        )
+       query_info: str = (
+           f'REQUESTING IB history BARS\n'
+           f' ------ - ------\n'
+           f'dt_duration: {dt_duration}\n'
+           f'ib_duration_str: {ib_duration_str}\n'
+           f'bar_size: {bar_size}\n'
+           f'fqme: {fqme}\n'
+           f'actor-global _enters: {_enters}\n'
+           f'kwargs: {pformat(kwargs)}\n'
+       )

        # tail case if no history for range or none prior.
+       # NOTE: there's actually 3 cases here to handle (and
+       # this should be read alongside the implementation of
+       # `.reqHistoricalDataAsync()`):
+       # - a timeout occurred in which case insync internals return
+       #   an empty list thing with bars.clear()...
+       # - no data exists for the period likely due to
+       #   a weekend, holiday or other non-trading period prior to
+       #   ``end_dt`` which exceeds the ``duration``,
+       # - LITERALLY this is the start of the mkt's history!
        if not bars:
-           # NOTE: there's actually 3 cases here to handle (and
-           # this should be read alongside the implementation of
-           # `.reqHistoricalDataAsync()`):
-           # - a timeout occurred in which case insync internals return
-           #   an empty list thing with bars.clear()...
-           # - no data exists for the period likely due to
-           #   a weekend, holiday or other non-trading period prior to
-           #   ``end_dt`` which exceeds the ``duration``,
-           # - LITERALLY this is the start of the mkt's history!
-
-           # sync requester for debugging empty frame cases
-           def get_hist():
-               return self.ib.reqHistoricalData(**kwargs)
-
-           assert get_hist
-           import pdbp
-           pdbp.set_trace()
+           # TODO: figure out wut's going on here.
+
+           # TODO: is this handy, a sync requester for tinkering
+           # with empty frame cases?
+           # def get_hist():
+           #     return self.ib.reqHistoricalData(**kwargs)
+           # import pdbp
+           # pdbp.set_trace()
+
+           log.critical(
+               'STUPID IB SAYS NO HISTORY\n\n'
+               + query_info
+           )

            # TODO: we could maybe raise ``NoData`` instead if we
-           # rewrite the method in the first case? right now there's no
-           # way to detect a timeout.
+           # rewrite the method in the first case?
+           # right now there's no way to detect a timeout..
            return [], np.empty(0), dt_duration

+       log.info(query_info)
-       # NOTE XXX: ensure minimum duration in bars B)
-       # => we recursively call this method until we get at least
-       # as many bars such that they sum in aggregate to the the
-       # desired total time (duration) at most.
-       # XXX XXX XXX
-       # WHY DID WE EVEN NEED THIS ORIGINALLY!?
-       # XXX XXX XXX
+       # NOTE XXX: ensure minimum duration in bars?
+       # => recursively call this method until we get at least as
+       # many bars such that they sum in aggregate to the the
+       # desired total time (duration) at most.
        # - if you query over a gap and get no data
        #   that may short circuit the history
        if (
-           end_dt
-           and False
+           # XXX XXX XXX
+           # => WHY DID WE EVEN NEED THIS ORIGINALLY!? <=
+           # XXX XXX XXX
+           False
+           and end_dt
        ):
            nparr: np.ndarray = bars_to_np(bars)
            times: np.ndarray = nparr['time']
@@ -927,7 +957,10 @@ class Client:
                    warnset = True

                else:
-                   log.info(f'Got first quote for {contract}')
+                   log.info(
+                       'Got first quote for contract\n'
+                       f'{contract}\n'
+                   )
                    break
        else:
            if timeouterr and raise_on_timeout:
@@ -991,8 +1024,12 @@ class Client:
            outsideRth=True,

            optOutSmartRouting=True,
+           # TODO: need to understand this setting better as
+           # it pertains to shit ass mms..
            routeMarketableToBbo=True,
            designatedLocation='SMART',
            # TODO: make all orders GTC?
            # https://interactivebrokers.github.io/tws-api/classIBApi_1_1Order.html#a95539081751afb9980f4c6bd1655a6ba
            # goodTillDate=f"yyyyMMdd-HH:mm:ss",
@@ -1120,8 +1157,8 @@ def get_config() -> dict[str, Any]:
    names = list(accounts.keys())
    accts = section['accounts'] = bidict(accounts)
    log.info(
-       f'brokers.toml defines {len(accts)} accounts: '
-       f'{pformat(names)}'
+       f'{path} defines {len(accts)} account aliases:\n'
+       f'{pformat(names)}\n'
    )

    if section is None:
@@ -1188,7 +1225,7 @@ async def load_aio_clients(
        try_ports = list(try_ports.values())

    _err = None
-   accounts_def = config.load_accounts(['ib'])
+   accounts_def: dict[str, str] = config.load_accounts(['ib'])
    ports = try_ports if port is None else [port]
    combos = list(itertools.product(hosts, ports))
    accounts_found: dict[str, Client] = {}
@@ -1213,6 +1250,12 @@ async def load_aio_clients(
        for i in range(connect_retries):
            try:
+               log.info(
+                   'Trying `ib_async` connect\n'
+                   f'{host}: {port}\n'
+                   f'clientId: {client_id}\n'
+                   f'timeout: {connect_timeout}\n'
+               )
                await ib.connectAsync(
                    host,
                    port,
@@ -1227,7 +1270,9 @@ async def load_aio_clients(
                client = Client(ib=ib, config=conf)

                # update all actor-global caches
-               log.info(f"Caching client for {sockaddr}")
+               log.runtime(
+                   f'Connected and caching `Client` @ {sockaddr!r}'
+               )
                _client_cache[sockaddr] = client
                break
@@ -1242,37 +1287,59 @@ async def load_aio_clients(
                OSError,
            ) as ce:
                _err = ce
-               log.warning(
-                   f'Failed to connect on {host}:{port} for {i} time with,\n'
-                   f'{ib.client.apiError.value()}\n'
-                   'retrying with a new client id..')
+               message: str = (
+                   f'Failed to connect on {host}:{port} after {i} tries with\n'
+                   f'{ib.client.apiError.value()!r}\n\n'
+                   'Retrying with a new client id..\n'
+               )
+               log.runtime(message)
+       else:
+           # XXX report loudly if we never established after all
+           # re-tries
+           log.warning(message)

        # Pre-collect all accounts available for this
        # connection and map account names to this client
        # instance.
        for value in ib.accountValues():
-           acct_number = value.account
-           entry = accounts_def.inverse.get(acct_number)
-           if not entry:
+           acct_number: str = value.account
+           acnt_alias: str = accounts_def.inverse.get(acct_number)
+           if not acnt_alias:
+               # TODO: should we constuct the below reco-ex from
+               # the existing config content?
+               _, path = config.load(
+                   conf_name='brokers',
+               )
                raise ValueError(
-                   'No section in brokers.toml for account:'
-                   f' {acct_number}\n'
-                   f'Please add entry to continue using this API client'
+                   'No alias in account section for account!\n'
+                   f'Please add an acnt alias entry to your {path}\n'
+                   'For example,\n\n'
+                   '[ib.accounts]\n'
+                   'margin = {accnt_number!r}\n'
+                   '^^^^^^ <- you need this part!\n\n'
+                   'This ensures `piker` will not leak private acnt info '
+                   'to console output by default!\n'
                )

            # surjection of account names to operating clients.
-           if acct_number not in accounts_found:
-               accounts_found[entry] = client
+           if acnt_alias not in accounts_found:
+               accounts_found[acnt_alias] = client
+               # client._acnt_names.add(acnt_alias)
+               client._acnt_names.append(acnt_alias)

-       log.info(
-           f'Loaded accounts for client @ {host}:{port}\n'
-           f'{pformat(accounts_found)}'
-       )
+       if accounts_found:
+           log.info(
+               f'Loaded accounts for api client\n\n'
+               f'{pformat(accounts_found)}\n'
+           )

        # XXX: why aren't we just updating this directy above
        # instead of using the intermediary `accounts_found`?
        _accounts2clients.update(accounts_found)

    # if we have no clients after the scan loop then error out.
    if not _client_cache:
@@ -1306,7 +1373,9 @@ async def load_clients_for_trio(
    a ``tractor.to_asyncio.open_channel_from()``.

    '''
-   async with load_aio_clients() as accts2clients:
+   async with load_aio_clients(
+       disconnect_on_exit=False,
+   ) as accts2clients:
        to_trio.send_nowait(accts2clients)
@@ -1472,7 +1541,7 @@ async def open_aio_client_method_relay(
        msg: tuple[str, dict] | dict | None = await from_trio.get()
        match msg:
            case None:  # termination sentinel
-               print('asyncio PROXY-RELAY SHUTDOWN')
+               log.info('asyncio `Client` method-proxy SHUTDOWN!')
                break

            case (meth_name, kwargs):

View File

@@ -1183,7 +1183,14 @@ async def deliver_trade_events(
                            pos
                            and fill
                        ):
-                           assert fill.commissionReport == cr
+                           now_cr: CommissionReport = fill.commissionReport
+                           if (now_cr != cr):
+                               log.warning(
+                                   'UhhHh ib updated the commission report mid-fill..?\n'
+                                   f'was: {pformat(cr)}\n'
+                                   f'now: {pformat(now_cr)}\n'
+                               )
                            await emit_pp_update(
                                ems_stream,
                                accounts_def,

View File

@@ -671,8 +671,8 @@ async def _setup_quote_stream(
        # making them mostly useless and explains why the scanner
        # is always slow XD
        # '293', # Trade count for day
-       '294',  # Trade rate / minute
-       '295',  # Vlm rate / minute
+       # '294',  # Trade rate / minute
+       # '295',  # Vlm rate / minute
    ),
    contract: Contract | None = None,
@@ -915,9 +915,13 @@ async def stream_quotes(
    if first_ticker:
        first_quote: dict = normalize(first_ticker)
-       log.info(
-           'Rxed init quote:\n'
-           f'{pformat(first_quote)}'
+
+       # TODO: we need a stack-oriented log levels filters for
+       # this!
+       # log.info(message, filter={'stack': 'live_feed'}) ?
+       log.runtime(
+           'Rxed init quote:\n\n'
+           f'{pformat(first_quote)}\n'
        )

    # NOTE: it might be outside regular trading hours for
@@ -969,7 +973,11 @@ async def stream_quotes(
            raise_on_timeout=True,
        )
        first_quote: dict = normalize(first_ticker)
-       log.info(
+
+       # TODO: we need a stack-oriented log levels filters for
+       # this!
+       # log.info(message, filter={'stack': 'live_feed'}) ?
+       log.runtime(
            'Rxed init quote:\n'
            f'{pformat(first_quote)}'
        )

View File

@@ -31,7 +31,11 @@ from typing import (
)
from bidict import bidict

-import pendulum
+from pendulum import (
+    DateTime,
+    parse,
+    from_timestamp,
+)
from ib_insync import (
    Contract,
    Commodity,
@@ -66,10 +70,11 @@ tx_sort: Callable = partial(
    iter_by_dt,
    parsers={
        'dateTime': parse_flex_dt,
-       'datetime': pendulum.parse,
-       # for some some fucking 2022 and
-       # back options records...fuck me.
-       'date': pendulum.parse,
+       'datetime': parse,
+
+       # XXX: for some some fucking 2022 and
+       # back options records.. f@#$ me..
+       'date': parse,
    }
)
@@ -89,15 +94,38 @@ def norm_trade(
    conid: int = str(record.get('conId') or record['conid'])
    bs_mktid: str = str(conid)

-   comms = record.get('commission')
-   if comms is None:
-       comms = -1*record['ibCommission']
-
-   price = record.get('price') or record['tradePrice']
+   # NOTE: sometimes weird records (like BTTX?)
+   # have no field for this?
+   comms: float = -1 * (
+       record.get('commission')
+       or record.get('ibCommission')
+       or 0
+   )
+   if not comms:
+       log.warning(
+           'No commissions found for record?\n'
+           f'{pformat(record)}\n'
+       )
+
+   price: float = (
+       record.get('price')
+       or record.get('tradePrice')
+   )
+   if price is None:
+       log.warning(
+           'No `price` field found in record?\n'
+           'Skipping normalization..\n'
+           f'{pformat(record)}\n'
+       )
+       return None

    # the api doesn't do the -/+ on the quantity for you but flex
    # records do.. are you fucking serious ib...!?
-   size = record.get('quantity') or record['shares'] * {
+   size: float|int = (
+       record.get('quantity')
+       or record['shares']
+   ) * {
        'BOT': 1,
        'SLD': -1,
    }[record['side']]
@@ -128,26 +156,31 @@ def norm_trade(
        # otype = tail[6]
        # strike = tail[7:]

-       print(f'skipping opts contract {symbol}')
+       log.warning(
+           f'Skipping option contract -> NO SUPPORT YET!\n'
+           f'{symbol}\n'
+       )
        return None

    # timestamping is way different in API records
-   dtstr = record.get('datetime')
-   date = record.get('date')
-   flex_dtstr = record.get('dateTime')
+   dtstr: str = record.get('datetime')
+   date: str = record.get('date')
+   flex_dtstr: str = record.get('dateTime')

    if dtstr or date:
-       dt = pendulum.parse(dtstr or date)
+       dt: DateTime = parse(dtstr or date)

    elif flex_dtstr:
        # probably a flex record with a wonky non-std timestamp..
-       dt = parse_flex_dt(record['dateTime'])
+       dt: DateTime = parse_flex_dt(record['dateTime'])

    # special handling of symbol extraction from
    # flex records using some ad-hoc schema parsing.
-   asset_type: str = record.get(
-       'assetCategory'
-   ) or record.get('secType', 'STK')
+   asset_type: str = (
+       record.get('assetCategory')
+       or record.get('secType')
+       or 'STK'
+   )

    if (expiry := (
        record.get('lastTradeDateOrContractMonth')
@@ -357,6 +390,7 @@ def norm_trade_records(
        if txn is None:
            continue

+       # inject txns sorted by datetime
        insort(
            records,
            txn,
@@ -405,7 +439,7 @@ def api_trades_to_ledger_entries(
            txn_dict[attr_name] = val

        tid = str(txn_dict['execId'])
-       dt = pendulum.from_timestamp(txn_dict['time'])
+       dt = from_timestamp(txn_dict['time'])
        txn_dict['datetime'] = str(dt)
        acctid = accounts[txn_dict['acctNumber']]
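For reference, a minimal sketch of the size-sign normalization that the `norm_trade()` changes above preserve; the sample records here are made up for illustration, not real IB payloads:

# sketch: flex records carry signed sizes, API records don't,
# so the trade side must be applied manually (per the diff above).
def signed_size(record: dict) -> float:
    size: float = (
        record.get('quantity')
        or record['shares']
    )
    # 'BOT' (bought) -> +, 'SLD' (sold) -> -
    return size * {'BOT': 1, 'SLD': -1}[record['side']]

assert signed_size({'quantity': 10, 'side': 'BOT'}) == 10
assert signed_size({'shares': 5, 'side': 'SLD'}) == -5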

View File

@@ -209,7 +209,10 @@ async def open_symbol_search(ctx: tractor.Context) -> None:
            break

    ib_client = proxy._aio_ns.ib
-   log.info(f'Using {ib_client} for symbol search')
+   log.info(
+       f'Using API client for symbol-search\n'
+       f'{ib_client}\n'
+   )

    last = time.time()
    async for pattern in stream:
@@ -294,7 +297,7 @@ async def open_symbol_search(ctx: tractor.Context) -> None:
                        elif stock_results:
                            break
                        # else:
-                           await tractor.pause()
+                        #    await tractor.pause()

                        # # match against our ad-hoc set immediately
                        # adhoc_matches = fuzzy.extract(
@@ -522,7 +525,21 @@ async def get_mkt_info(
        venue = con.primaryExchange or con.exchange

    price_tick: Decimal = Decimal(str(details.minTick))
-   # price_tick: Decimal = Decimal('0.01')
+   ib_min_tick_gt_2: Decimal = Decimal('0.01')
+   if (
+       price_tick < ib_min_tick_gt_2
+   ):
+       # TODO: we need to add some kinda dynamic rounding sys
+       # to our MktPair i guess?
+       # not sure where the logic should sit, but likely inside
+       # the `.clearing._ems` i suppose...
+       log.warning(
+           'IB seems to disallow a min price tick < 0.01 '
+           'when the price is > 2.0..?\n'
+           f'Decreasing min tick precision for {fqme} to 0.01'
+       )
+       # price_tick = ib_min_tick
+       # await tractor.pause()

    if atype == 'stock':
        # XXX: GRRRR they don't support fractional share sizes for

View File

@@ -27,8 +27,8 @@ from typing import (
)
import time

+import httpx
import pendulum
-import asks
import numpy as np
import urllib.parse
import hashlib
@@ -60,6 +60,11 @@ log = get_logger('piker.brokers.kraken')
# <uri>/<version>/
_url = 'https://api.kraken.com/0'

+_headers: dict[str, str] = {
+    'User-Agent': 'krakenex/2.1.0 (+https://github.com/veox/python3-krakenex)'
+}
+
# TODO: this is the only backend providing this right?
# in which case we should drop it from the defaults and
# instead make a custom fields descr in this module!
@@ -135,16 +140,15 @@ class Client:
    def __init__(
        self,
        config: dict[str, str],
+       httpx_client: httpx.AsyncClient,
        name: str = '',
        api_key: str = '',
        secret: str = ''
    ) -> None:
-       self._sesh = asks.Session(connections=4)
-       self._sesh.base_location = _url
-       self._sesh.headers.update({
-           'User-Agent':
-               'krakenex/2.1.0 (+https://github.com/veox/python3-krakenex)'
-       })
+       self._sesh: httpx.AsyncClient = httpx_client
        self._name = name
        self._api_key = api_key
        self._secret = secret
@@ -166,10 +170,9 @@ class Client:
        method: str,
        data: dict,
    ) -> dict[str, Any]:
-       resp = await self._sesh.post(
-           path=f'/public/{method}',
+       resp: httpx.Response = await self._sesh.post(
+           url=f'/public/{method}',
            json=data,
-           timeout=float('inf')
        )
        return resproc(resp, log)
@@ -180,18 +183,18 @@ class Client:
        uri_path: str
    ) -> dict[str, Any]:
        headers = {
-           'Content-Type':
-               'application/x-www-form-urlencoded',
-           'API-Key':
-               self._api_key,
-           'API-Sign':
-               get_kraken_signature(uri_path, data, self._secret)
+           'Content-Type': 'application/x-www-form-urlencoded',
+           'API-Key': self._api_key,
+           'API-Sign': get_kraken_signature(
+               uri_path,
+               data,
+               self._secret,
+           ),
        }
-       resp = await self._sesh.post(
-           path=f'/private/{method}',
+       resp: httpx.Response = await self._sesh.post(
+           url=f'/private/{method}',
            data=data,
            headers=headers,
-           timeout=float('inf')
        )
        return resproc(resp, log)
@@ -665,24 +668,36 @@ class Client:
@acm
async def get_client() -> Client:

-   conf = get_config()
-   if conf:
-       client = Client(
-           conf,
-           # TODO: don't break these up and just do internal
-           # conf lookups instead..
-           name=conf['key_descr'],
-           api_key=conf['api_key'],
-           secret=conf['secret']
-       )
-   else:
-       client = Client({})
+   conf: dict[str, Any] = get_config()
+   async with httpx.AsyncClient(
+       base_url=_url,
+       headers=_headers,
+
+       # TODO: is there a way to numerate this?
+       # https://www.python-httpx.org/advanced/clients/#why-use-a-client
+       # connections=4
+   ) as trio_client:
+       if conf:
+           client = Client(
+               conf,
+               httpx_client=trio_client,
+
+               # TODO: don't break these up and just do internal
+               # conf lookups instead..
+               name=conf['key_descr'],
+               api_key=conf['api_key'],
+               secret=conf['secret']
+           )
+       else:
+           client = Client(
+               conf={},
+               httpx_client=trio_client,
+           )

-   # at startup, load all symbols, and asset info in
-   # batch requests.
-   async with trio.open_nursery() as nurse:
-       nurse.start_soon(client.get_assets)
-       await client.get_mkt_pairs()
+       # at startup, load all symbols, and asset info in
+       # batch requests.
+       async with trio.open_nursery() as nurse:
+           nurse.start_soon(client.get_assets)
+           await client.get_mkt_pairs()

-   yield client
+       yield client
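The `asks` -> `httpx` pattern above in isolation, as a minimal sketch: one pooled `AsyncClient` per session, configured once with `base_url` and default headers, then relative-path requests against it (here hitting kraken's public `Time` endpoint with a json POST, mirroring the wrapper's call style; whether that exact payload shape is required is an assumption):

import httpx
import trio

async def main() -> None:
    # one client per session replaces the old `asks.Session` setup;
    # relative `url=` paths are resolved against `base_url`.
    async with httpx.AsyncClient(
        base_url='https://api.kraken.com/0',
        headers={
            'User-Agent':
                'krakenex/2.1.0 (+https://github.com/veox/python3-krakenex)',
        },
    ) as client:
        resp: httpx.Response = await client.post(
            url='/public/Time',
            json={},
        )
        print(resp.json())

trio.run(main)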

View File

@@ -612,18 +612,18 @@ async def open_trade_dialog(

        # enter relay loop
        await handle_order_updates(
-           client,
-           ws,
-           stream,
-           ems_stream,
-           apiflows,
-           ids,
-           reqids2txids,
-           acnt,
-           api_trans,
-           acctid,
-           acc_name,
-           token,
+           client=client,
+           ws=ws,
+           ws_stream=stream,
+           ems_stream=ems_stream,
+           apiflows=apiflows,
+           ids=ids,
+           reqids2txids=reqids2txids,
+           acnt=acnt,
+           ledger=ledger,
+           acctid=acctid,
+           acc_name=acc_name,
+           token=token,
        )
@@ -639,7 +639,8 @@ async def handle_order_updates(

    # transaction records which will be updated
    # on new trade clearing events (aka order "fills")
-   ledger_trans: dict[str, Transaction],
+   ledger: TransactionLedger,
+   # ledger_trans: dict[str, Transaction],
    acctid: str,
    acc_name: str,
    token: str,
@@ -699,7 +700,8 @@ async def handle_order_updates(
                    # if tid not in ledger_trans
                }
                for tid, trade in trades.items():
-                   assert tid not in ledger_trans
+                   # assert tid not in ledger_trans
+                   assert tid not in ledger
                    txid = trade['ordertxid']
                    reqid = trade.get('userref')
@@ -747,11 +749,17 @@ async def handle_order_updates(
                    client,
                    api_name_set='wsname',
                )
-               ppmsgs = trades2pps(
-                   acnt,
-                   acctid,
-                   new_trans,
+               ppmsgs: list[BrokerdPosition] = trades2pps(
+                   acnt=acnt,
+                   ledger=ledger,
+                   acctid=acctid,
+                   new_trans=new_trans,
                )
+               # ppmsgs = trades2pps(
+               #     acnt,
+               #     acctid,
+               #     new_trans,
+               # )
                for pp_msg in ppmsgs:
                    await ems_stream.send(pp_msg)

View File

@@ -16,10 +16,9 @@
# along with this program.  If not, see <https://www.gnu.org/licenses/>.

'''
-Kucoin broker backend
+Kucoin cex API backend.

'''
from contextlib import (
    asynccontextmanager as acm,
    aclosing,
@@ -42,7 +41,7 @@ import wsproto
from uuid import uuid4

from trio_typing import TaskStatus
-import asks
+import httpx
from bidict import bidict
import numpy as np
import pendulum
@@ -63,7 +62,7 @@ from piker._cacheables import (
)
from piker.log import get_logger
from piker.data.validate import FeedInit
-from piker.types import Struct
+from piker.types import Struct  # NOTE, this is already a `tractor.msg.Struct`
from piker.data import (
    def_iohlcv_fields,
    match_from_pairs,
@@ -99,9 +98,18 @@ class KucoinMktPair(Struct, frozen=True):
    def size_tick(self) -> Decimal:
        return Decimal(str(self.quoteMinSize))

+   callauctionFirstStageStartTime: None|float
+   callauctionIsEnabled: bool
+   callauctionPriceCeiling: float|None
+   callauctionPriceFloor: float|None
+   callauctionSecondStageStartTime: float|None
+   callauctionThirdStageStartTime: float|None
+
    enableTrading: bool
+   feeCategory: int
    feeCurrency: str
    isMarginEnabled: bool
+   makerFeeCoefficient: float
    market: str
    minFunds: float
    name: str
@@ -111,7 +119,10 @@ class KucoinMktPair(Struct, frozen=True):
    quoteIncrement: float
    quoteMaxSize: float
    quoteMinSize: float
+   st: bool
    symbol: str  # our bs_mktid, kucoin's internal id
+   takerFeeCoefficient: float
+   tradingStartTime: float|None


class AccountTrade(Struct, frozen=True):
@@ -212,8 +223,12 @@ def get_config() -> BrokerConfig | None:

class Client:

-   def __init__(self) -> None:
-       self._config: BrokerConfig | None = get_config()
+   def __init__(
+       self,
+       httpx_client: httpx.AsyncClient,
+   ) -> None:
+       self._http: httpx.AsyncClient = httpx_client
+       self._config: BrokerConfig|None = get_config()
        self._pairs: dict[str, KucoinMktPair] = {}
        self._fqmes2mktids: bidict[str, str] = bidict()
        self._bars: list[list[float]] = []
@@ -227,18 +242,24 @@ class Client:
    ) -> dict[str, str | bytes]:
        '''
-       Generate authenticated request headers
+       Generate authenticated request headers:
        https://docs.kucoin.com/#authentication
+       https://www.kucoin.com/docs/basic-info/connection-method/authentication/creating-a-request
+       https://www.kucoin.com/docs/basic-info/connection-method/authentication/signing-a-message

        '''
        if not self._config:
            raise ValueError(
-               'No config found when trying to send authenticated request')
+               'No config found when trying to send authenticated request'
+           )

        str_to_sign = (
            str(int(time.time() * 1000))
-           + action + f'/api/{api}/{endpoint.lstrip("/")}'
+           +
+           action
+           +
+           f'/api/{api}/{endpoint.lstrip("/")}'
        )

        signature = base64.b64encode(
@@ -249,6 +270,7 @@ class Client:
            ).digest()
        )

+       # TODO: can we cache this between calls?
        passphrase = base64.b64encode(
            hmac.new(
                self._config.key_secret.encode('utf-8'),
@@ -270,8 +292,10 @@ class Client:
        self,
        action: Literal['POST', 'GET'],
        endpoint: str,
+
        api: str = 'v2',
+
        headers: dict = {},
    ) -> Any:
        '''
        Generic request wrapper for Kucoin API
@@ -284,14 +308,19 @@ class Client:
            api,
        )

-       api_url = f'https://api.kucoin.com/api/{api}/{endpoint}'
-
-       res = await asks.request(action, api_url, headers=headers)
-
-       json = res.json()
-       if 'data' in json:
-           return json['data']
+       req_meth: Callable = getattr(
+           self._http,
+           action.lower(),
+       )
+       res = await req_meth(
+           url=f'/{api}/{endpoint}',
+           headers=headers,
+       )
+       json: dict = res.json()
+       if (data := json.get('data')) is not None:
+           return data
        else:
+           api_url: str = self._http.base_url
            log.error(
                f'Error making request to {api_url} ->\n'
                f'{pformat(res)}'
@@ -311,7 +340,7 @@ class Client:
        '''
        token_type = 'private' if private else 'public'
        try:
-           data: dict[str, Any] | None = await self._request(
+           data: dict[str, Any]|None = await self._request(
                'POST',
                endpoint=f'bullet-{token_type}',
                api='v1'
@@ -349,8 +378,8 @@ class Client:
        currencies: dict[str, Currency] = {}
        entries: list[dict] = await self._request(
            'GET',
+           api='v1',
            endpoint='currencies',
-           api='v1',
        )
        for entry in entries:
            curr = Currency(**entry).copy()
@@ -366,13 +395,22 @@ class Client:
        dict[str, KucoinMktPair],
        bidict[str, KucoinMktPair],
    ]:
-       entries = await self._request('GET', 'symbols')
+       entries = await self._request(
+           'GET',
+           endpoint='symbols',
+       )
        log.info(f' {len(entries)} Kucoin market pairs fetched')

        pairs: dict[str, KucoinMktPair] = {}
        fqmes2mktids: bidict[str, str] = bidict()
        for item in entries:
-           pair = pairs[item['name']] = KucoinMktPair(**item)
+           try:
+               pair = pairs[item['name']] = KucoinMktPair(**item)
+           except TypeError as te:
+               raise TypeError(
+                   '`KucoinMktPair` and reponse fields do not match ??\n'
+                   f'{KucoinMktPair.fields_diff(item)}\n'
+               ) from te
            fqmes2mktids[
                item['name'].lower().replace('-', '')
            ] = pair.name
@@ -567,13 +605,21 @@ def fqme_to_kucoin_sym(

@acm
async def get_client() -> AsyncGenerator[Client, None]:
-   client = Client()
+   '''
+   Load an API `Client` preconfigured from user settings
+
+   '''
+   async with (
+       httpx.AsyncClient(
+           base_url='https://api.kucoin.com/api',
+       ) as trio_client,
+   ):
+       client = Client(httpx_client=trio_client)

-   async with trio.open_nursery() as n:
-       n.start_soon(client.get_mkt_pairs)
-       await client.get_currencies()
+       async with trio.open_nursery() as tn:
+           tn.start_soon(client.get_mkt_pairs)
+           await client.get_currencies()

-   yield client
+       yield client


@tractor.context
@@ -609,7 +655,7 @@ async def open_ping_task(
            await trio.sleep((ping_interval - 1000) / 1000)
            await ws.send_msg({'id': connect_id, 'type': 'ping'})

-   log.info('Starting ping task for kucoin ws connection')
+   log.warning('Starting ping task for kucoin ws connection')
    n.start_soon(ping_server)

    yield
@@ -621,9 +667,14 @@
async def get_mkt_info(
    fqme: str,

-) -> tuple[MktPair, KucoinMktPair]:
+) -> tuple[
+    MktPair,
+    KucoinMktPair,
+]:
    '''
-   Query for and return a `MktPair` and `KucoinMktPair`.
+   Query for and return both a `piker.accounting.MktPair` and
+   `KucoinMktPair` from provided `fqme: str`
+   (fully-qualified-market-endpoint).

    '''
    async with open_cached_client('kucoin') as client:
@@ -698,6 +749,8 @@ async def stream_quotes(

    log.info(f'Starting up quote stream(s) for {symbols}')
    for sym_str in symbols:
+       mkt: MktPair
+       pair: KucoinMktPair
        mkt, pair = await get_mkt_info(sym_str)
        init_msgs.append(
            FeedInit(mkt_info=mkt)
@@ -705,7 +758,11 @@ async def stream_quotes(

    ws: NoBsWs
    token, ping_interval = await client._get_ws_token()
-   connect_id = str(uuid4())
+   log.info(f'API reported ping_interval: {ping_interval}\n')
+
+   connect_id: str = str(uuid4())
+   typ: str
+   quote: dict
    async with (
        open_autorecon_ws(
            (
@@ -719,20 +776,37 @@ async def stream_quotes(
            ),
        ) as ws,
        open_ping_task(ws, ping_interval, connect_id),
-       aclosing(stream_messages(ws, sym_str)) as msg_gen,
+       aclosing(
+           iter_normed_quotes(
+               ws, sym_str
+           )
+       ) as iter_quotes,
    ):
-       typ, quote = await anext(msg_gen)
-
-       while typ != 'trade':
-           # take care to not unblock here until we get a real
-           # trade quote
-           typ, quote = await anext(msg_gen)
+       typ, quote = await anext(iter_quotes)

+       # take care to not unblock here until we get a real
+       # trade quote?
+       # ^TODO, remove this right?
+       # -[ ] what often blocks chart boot/new-feed switching
+       #     since we'ere waiting for a live quote instead of just
+       #     loading history afap..
+       #  |_ XXX, not sure if we require a bit of rework to core
+       #     feed init logic or if backends justg gotta be
+       #     changed up.. feel like there was some causality
+       #     dilema prolly only seen with IB too..
+       # while typ != 'trade':
+       #     typ, quote = await anext(iter_quotes)

        task_status.started((init_msgs, quote))
        feed_is_live.set()

-       async for typ, msg in msg_gen:
-           await send_chan.send({sym_str: msg})
+       # XXX NOTE, DO NOT include the `.<backend>` suffix!
+       # OW the sampling loop will not broadcast correctly..
+       # since `bus._subscribers.setdefault(bs_fqme, set())`
+       # is used inside `.data.open_feed_bus()` !!!
+       topic: str = mkt.bs_fqme
+       async for typ, quote in iter_quotes:
+           await send_chan.send({topic: quote})
@acm

@@ -787,7 +861,7 @@ async def subscribe(
    )


-async def stream_messages(
+async def iter_normed_quotes(
    ws: NoBsWs,
    sym: str,
@@ -818,6 +892,9 @@ async def stream_messages(
                yield 'trade', {
                    'symbol': sym,
+                   # TODO, is 'last' even used elsewhere/a-good
+                   # semantic? can't we just read the ticks with our
+                   # `.data.ticktools.frame_ticks()`/
                    'last': trade_data.price,
                    'brokerd_ts': last_trade_ts,
                    'ticks': [
@@ -910,7 +987,7 @@ async def open_history_client(
            if end_dt is None:
                inow = round(time.time())
-               print(
+               log.debug(
                    f'difference in time between load and processing'
                    f'{inow - times[-1]}'
                )
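A tiny illustration of why the publish topic must be the bare `bs_fqme` per the `XXX NOTE` above; the subscriber registry here is an illustrative stand-in for the real `.data.open_feed_bus()` internals, not the actual API:

# sketch: the sampler looks up subscribers keyed by the exact
# fqme string, so a suffixed topic would miss them entirely.
subscribers: dict[str, set] = {}

def broadcast(topic: str, quote: dict) -> None:
    # a quote published under 'btcusdt.kucoin.kucoin' would
    # never reach the 'btcusdt.kucoin' subscriber set.
    for sub in subscribers.setdefault(topic, set()):
        sub(quote)

subscribers['btcusdt.kucoin'] = {print}
broadcast('btcusdt.kucoin', {'last': 42_000.0})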

View File

@@ -0,0 +1,49 @@
piker.clearing
______________
trade execution-n-control subsys for both live and paper trading as
well as algo-trading manual override/interaction across any backend
broker and data provider.
avail UIs
*********
order ctl
---------
the `piker.clearing` subsys is exposed mainly through
the `piker chart` GUI as a "chart trader" style UX and
is automatically enabled whenever a chart is opened.
.. ^TODO, more prose here!
the "manual" order control features are exposed via the
`piker.ui.order_mode` API and can pretty much always be
used (at least) in simulated-trading mode, aka "paper"-mode, and
the micro-manual is as follows:
``order_mode`` (
edge triggered activation by any of the following keys,
``mouse-click`` on y-level to submit at that price
):
- ``f``/ ``ctl-f`` to stage buy
- ``d``/ ``ctl-d`` to stage sell
- ``a`` to stage alert
``search_mode`` (
``ctl-l`` or ``ctl-space`` to open,
``ctl-c`` or ``ctl-space`` to close
) :
- begin typing to have symbol search automatically lookup
symbols from all loaded backend (broker) providers
- arrow keys and mouse click to navigate selection
- vi-like ``ctl-[hjkl]`` for navigation
position (pp) mgmt
------------------
you can also configure your position allocation limits from the
sidepane.
.. ^TODO, explain and provide tut once more refined!

View File

@@ -104,14 +104,15 @@ def get_app_dir(
    # `tractor`) with the testing dir and check for it whenever we
    # detect `pytest` is being used (which it isn't under normal
    # operation).
-   if "pytest" in sys.modules:
-       import tractor
-       actor = tractor.current_actor(err_on_no_runtime=False)
-       if actor:  # runtime is up
-           rvs = tractor._state._runtime_vars
-           testdirpath = Path(rvs['piker_vars']['piker_test_dir'])
-           assert testdirpath.exists(), 'piker test harness might be borked!?'
-           app_name = str(testdirpath)
+   # if "pytest" in sys.modules:
+   #     import tractor
+   #     actor = tractor.current_actor(err_on_no_runtime=False)
+   #     if actor:  # runtime is up
+   #         rvs = tractor._state._runtime_vars
+   #         import pdbp; pdbp.set_trace()
+   #         testdirpath = Path(rvs['piker_vars']['piker_test_dir'])
+   #         assert testdirpath.exists(), 'piker test harness might be borked!?'
+   #         app_name = str(testdirpath)

    if platform.system() == 'Windows':
        key = "APPDATA" if roaming else "LOCALAPPDATA"

View File

@@ -273,7 +273,7 @@ async def _reconnect_forever(
            nobsws._connected.set()
            await trio.sleep_forever()
        except HandshakeError:
-           log.exception(f'Retrying connection')
+           log.exception('Retrying connection')

    # ws & nursery block ends
@@ -359,8 +359,8 @@ async def open_autorecon_ws(
    '''
-   JSONRPC response-request style machinery for transparent multiplexing of msgs
-   over a NoBsWs.
+   JSONRPC response-request style machinery for transparent multiplexing
+   of msgs over a `NoBsWs`.

    '''
@@ -377,43 +377,82 @@ async def open_jsonrpc_session(
    url: str,
    start_id: int = 0,
    response_type: type = JSONRPCResult,
-   request_type: Optional[type] = None,
-   request_hook: Optional[Callable] = None,
-   error_hook: Optional[Callable] = None,
+   msg_recv_timeout: float = float('inf'),
+   # ^NOTE, since only `deribit` is using this jsonrpc stuff atm
+   # and options mkts are generally "slow moving"..
+   #
+   # FURTHER if we break the underlying ws connection then since we
+   # don't pass a `fixture` to the task that manages `NoBsWs`, i.e.
+   # `_reconnect_forever()`, the jsonrpc "transport pipe" get's
+   # broken and never restored with wtv init sequence is required to
+   # re-establish a working req-resp session.

) -> Callable[[str, dict], dict]:
+   '''
+   Init a json-RPC-over-websocket connection to the provided `url`.
+
+   A `json_rpc: Callable[[str, dict], dict` is delivered to the
+   caller for sending requests and a bg-`trio.Task` handles
+   processing of response msgs including error reporting/raising in
+   the parent/caller task.
+
+   '''
+   # NOTE, store all request msgs so we can raise errors on the
+   # caller side!
+   req_msgs: dict[int, dict] = {}

    async with (
-       trio.open_nursery() as n,
-       open_autorecon_ws(url) as ws
+       trio.open_nursery() as tn,
+       open_autorecon_ws(
+           url=url,
+           msg_recv_timeout=msg_recv_timeout,
+       ) as ws
    ):
-       rpc_id: Iterable = count(start_id)
+       rpc_id: Iterable[int] = count(start_id)
        rpc_results: dict[int, dict] = {}

-       async def json_rpc(method: str, params: dict) -> dict:
+       async def json_rpc(
+           method: str,
+           params: dict,
+       ) -> dict:
            '''
            perform a json rpc call and wait for the result, raise exception in
            case of error field present on response
            '''
+           nonlocal req_msgs
+
+           req_id: int = next(rpc_id)
            msg = {
                'jsonrpc': '2.0',
-               'id': next(rpc_id),
+               'id': req_id,
                'method': method,
                'params': params
            }
            _id = msg['id']

-           rpc_results[_id] = {
+           result = rpc_results[_id] = {
                'result': None,
-               'event': trio.Event()
+               'error': None,
+               'event': trio.Event(),  # signal caller resp arrived
            }
+           req_msgs[_id] = msg

            await ws.send_msg(msg)

+           # wait for reponse before unblocking requester code
            await rpc_results[_id]['event'].wait()

-           ret = rpc_results[_id]['result']
-
-           del rpc_results[_id]
+           if (maybe_result := result['result']):
+               ret = maybe_result
+               del rpc_results[_id]
+           else:
+               err = result['error']
+               raise Exception(
+                   f'JSONRPC request failed\n'
+                   f'req: {msg}\n'
+                   f'resp: {err}\n'
+               )

            if ret.error is not None:
                raise Exception(json.dumps(ret.error, indent=4))
@@ -428,6 +467,7 @@ async def open_jsonrpc_session(
        the server side.

        '''
+       nonlocal req_msgs
        async for msg in ws:
            match msg:
                case {
@@ -451,19 +491,28 @@ async def open_jsonrpc_session(
                    'params': _,
                }:
                    log.debug(f'Recieved\n{msg}')
-                   if request_hook:
-                       await request_hook(request_type(**msg))

                case {
                    'error': error
                }:
-                   log.warning(f'Recieved\n{error}')
-                   if error_hook:
-                       await error_hook(response_type(**msg))
+                   # retreive orig request msg, set error
+                   # response in original "result" msg,
+                   # THEN FINALLY set the event to signal caller
+                   # to raise the error in the parent task.
+                   req_id: int = error['id']
+                   req_msg: dict = req_msgs[req_id]
+                   result: dict = rpc_results[req_id]
+                   result['error'] = error
+                   result['event'].set()
+                   log.error(
+                       f'JSONRPC request failed\n'
+                       f'req: {req_msg}\n'
+                       f'resp: {error}\n'
+                   )

                case _:
                    log.warning(f'Unhandled JSON-RPC msg!?\n{msg}')

-       n.start_soon(recv_task)
+       tn.start_soon(recv_task)
        yield json_rpc
-       n.cancel_scope.cancel()
+       tn.cancel_scope.cancel()
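A usage sketch of the session API as it reads after this change; the websocket URL and RPC method name are placeholders, not a real endpoint:

import trio
from piker.data._web_bs import open_jsonrpc_session

async def main() -> None:
    async with open_jsonrpc_session(
        url='wss://example.com/ws/api/v2',  # placeholder endpoint
        msg_recv_timeout=float('inf'),  # don't reset the ws on idle
    ) as json_rpc:
        # server-side errors are now raised here in the caller's
        # task (via the stored `req_msgs` entry) instead of being
        # handed to the old `error_hook()` callback.
        result: dict = await json_rpc(
            'public/get_time',  # placeholder method
            params={},
        )
        print(result)

trio.run(main)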

View File

@@ -386,6 +386,8 @@ def ldshm(
        open_annot_ctl() as actl,
    ):
        shm_df: pl.DataFrame | None = None
+       tf2aids: dict[float, dict] = {}
+
        for (
            shmfile,
            shm,
@@ -526,16 +528,17 @@ def ldshm(
                        new_df,
                        step_gaps,
                    )
                    # last chance manual overwrites in REPL
-                   await tractor.pause()
+                   # await tractor.pause()
                    assert aids
+                   tf2aids[period_s] = aids

                else:
                    # allow interaction even when no ts problems.
-                   await tractor.pause()
-                   # assert not diff
+                   assert not diff

+       await tractor.pause()
+       log.info('Exiting TSP shm anal-izer!')

    if shm_df is None:
        log.error(

View File

@@ -161,7 +161,13 @@ class NativeStorageClient:
    def index_files(self):
        for path in self._datadir.iterdir():
-           if path.name in {'borked', 'expired',}:
+           if (
+               path.is_dir()
+               or
+               '.parquet' not in str(path)
+               # or
+               # path.name in {'borked', 'expired',}
+           ):
                continue

            key: str = path.name.rstrip('.parquet')
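The same filter in a standalone sketch (names illustrative); as an aside, `Path.stem`/`Path.suffix` are safer than the surrounding code's `str.rstrip('.parquet')` since `rstrip()` strips a *character set*, not a literal suffix:

from pathlib import Path

# sketch: skip dirs and anything that isn't a parquet file.
def iter_parquet_keys(datadir: Path):
    for path in datadir.iterdir():
        if (
            path.is_dir()
            or path.suffix != '.parquet'
        ):
            continue
        # NB: 'spy.parquet'.rstrip('.parquet') -> 'spy' happens to
        # work, but 'qqq.parquet'.rstrip('.parquet') -> '' since
        # every char is in the strip-set; `.stem` avoids that.
        yield path.stem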

View File

@@ -44,8 +44,10 @@ import trio
from trio_typing import TaskStatus
import tractor
from pendulum import (
+    Interval,
    DateTime,
    Duration,
+    duration as mk_duration,
    from_timestamp,
)
import numpy as np
@@ -214,7 +216,8 @@ async def maybe_fill_null_segments(
            # pair, immediately stop backfilling?
            if (
                start_dt
-               and end_dt < start_dt
+               and
+               end_dt < start_dt
            ):
                await tractor.pause()
                break
@@ -262,6 +265,7 @@ async def maybe_fill_null_segments(
    except tractor.ContextCancelled:
        # log.exception
        await tractor.pause()
+       raise

    null_segs_detected.set()
    # RECHECK for more null-gaps
@@ -349,7 +353,7 @@ async def maybe_fill_null_segments(

async def start_backfill(
    get_hist,
-   frame_types: dict[str, Duration] | None,
+   def_frame_duration: Duration,
    mod: ModuleType,
    mkt: MktPair,
    shm: ShmArray,
@@ -379,22 +383,23 @@ async def start_backfill(
    update_start_on_prepend: bool = False
    if backfill_until_dt is None:

-       # TODO: drop this right and just expose the backfill
-       # limits inside a [storage] section in conf.toml?
-       # when no tsdb "last datum" is provided, we just load
-       # some near-term history.
-       # periods = {
-       #     1: {'days': 1},
-       #     60: {'days': 14},
-       # }
-
-       # do a decently sized backfill and load it into storage.
+       # TODO: per-provider default history-durations?
+       # -[ ] inside the `open_history_client()` config allow
+       #    declaring the history duration limits instead of
+       #    guessing and/or applying the same limits to all?
+       #
+       # -[ ] allow declaring (default) per-provider backfill
+       #    limits inside a [storage] sub-section in conf.toml?
+       #
+       # NOTE, when no tsdb "last datum" is provided, we just
+       # load some near-term history by presuming a "decently
+       # large" 60s duration limit and a much shorter 1s range.
        periods = {
            1: {'days': 2},
            60: {'years': 6},
        }
        period_duration: int = periods[timeframe]
-       update_start_on_prepend = True
+       update_start_on_prepend: bool = True

    # NOTE: manually set the "latest" datetime which we intend to
    # backfill history "until" so as to adhere to the history
@@ -416,7 +421,6 @@ async def start_backfill(
            f'backfill_until_dt: {backfill_until_dt}\n'
            f'last_start_dt: {last_start_dt}\n'
        )
-
        try:
            (
                array,

@@ -426,71 +430,114 @@ async def start_backfill(
                timeframe,
                end_dt=last_start_dt,
            )

        except NoData as _daterr:
-               # 3 cases:
-               # - frame in the middle of a legit venue gap
-               # - history actually began at the `last_start_dt`
-               # - some other unknown error (ib blocking the
-               #   history bc they don't want you seeing how they
-               #   cucked all the tinas..)
-               if dur := frame_types.get(timeframe):
-                   # decrement by a frame's worth of duration and
-                   # retry a few times.
-                   last_start_dt.subtract(
-                       seconds=dur.total_seconds()
-                   )
-                   log.warning(
-                       f'{mod.name} -> EMPTY FRAME for end_dt?\n'
-                       f'tf@fqme: {timeframe}@{mkt.fqme}\n'
-                       'bf_until <- last_start_dt:\n'
-                       f'{backfill_until_dt} <- {last_start_dt}\n'
-                       f'Decrementing `end_dt` by {dur} and retry..\n'
-                   )
-                   continue
+               orig_last_start_dt: datetime = last_start_dt
+               gap_report: str = (
+                   f'EMPTY FRAME for `end_dt: {last_start_dt}`?\n'
+                   f'{mod.name} -> tf@fqme: {timeframe}@{mkt.fqme}\n'
+                   f'last_start_dt: {orig_last_start_dt}\n\n'
+                   f'bf_until: {backfill_until_dt}\n'
+               )
+               # EMPTY FRAME signal with 3 (likely) causes:
+               #
+               # 1. range contains legit gap in venue history
+               # 2. history actually (edge case) **began** at the
+               #    value `last_start_dt`
+               # 3. some other unknown error (ib blocking the
+               #    history-query bc they don't want you seeing how
+               #    they cucked all the tinas.. like with options
+               #    hist)
+               #
+               if def_frame_duration:
+                   # decrement by a duration's (frame) worth of time
+                   # as maybe indicated by the backend to see if we
+                   # can get older data before this possible
+                   # "history gap".
+                   last_start_dt: datetime = last_start_dt.subtract(
+                       seconds=def_frame_duration.total_seconds()
+                   )
+                   gap_report += (
+                       f'Decrementing `end_dt` and retrying with,\n'
+                       f'def_frame_duration: {def_frame_duration}\n'
+                       f'(new) last_start_dt: {last_start_dt}\n'
+                   )
+                   log.warning(gap_report)
+                   # skip writing to shm/tsdb and try the next
+                   # duration's worth of prior history.
+                   continue
+
+               else:
+                   # await tractor.pause()
+                   raise DataUnavailable(gap_report)

            # broker says there never was or is no more history to pull
-           except DataUnavailable:
-               log.warning(
-                   f'NO-MORE-DATA in range?\n'
-                   f'`{mod.name}` halted history:\n'
-                   f'tf@fqme: {timeframe}@{mkt.fqme}\n'
-                   'bf_until <- last_start_dt:\n'
-                   f'{backfill_until_dt} <- {last_start_dt}\n'
-               )
-
-               # ugh, what's a better way?
-               # TODO: fwiw, we probably want a way to signal a throttle
-               # condition (eg. with ib) so that we can halt the
-               # request loop until the condition is resolved?
-               if timeframe > 1:
-                   await tractor.pause()
+           except DataUnavailable as due:
+               message: str = due.args[0]
+               log.warning(
+                   f'Provider {mod.name!r} halted backfill due to,\n\n'
+
+                   f'{message}\n'
+
+                   f'fqme: {mkt.fqme}\n'
+                   f'timeframe: {timeframe}\n'
+                   f'last_start_dt: {last_start_dt}\n'
+                   f'bf_until: {backfill_until_dt}\n'
+               )
+               # UGH: what's a better way?
+               # TODO: backends are responsible for being correct on
+               # this right!?
+               # -[ ] in the `ib` case we could maybe offer some way
+               #    to halt the request loop until the condition is
+               #    resolved or should the backend be entirely in
+               #    charge of solving such faults? yes, right?
                return
+       time: np.ndarray = array['time']
        assert (
-           array['time'][0]
+           time[0]
            ==
            next_start_dt.timestamp()
        )
+       assert time[-1] == next_end_dt.timestamp()

-       diff = last_start_dt - next_start_dt
-       frame_time_diff_s = diff.seconds
+       expected_dur: Interval = last_start_dt - next_start_dt

        # frame's worth of sample-period-steps, in seconds
        frame_size_s: float = len(array) * timeframe
-       expected_frame_size_s: float = frame_size_s + timeframe
-       if frame_time_diff_s > expected_frame_size_s:
+       recv_frame_dur: Duration = (
+           from_timestamp(array[-1]['time'])
+           -
+           from_timestamp(array[0]['time'])
+       )
+       if (
+           (lt_frame := (recv_frame_dur < expected_dur))
+           or
+           (null_frame := (frame_size_s == 0))
+           # ^XXX, should NEVER hit now!
+       ):
            # XXX: query result includes a start point prior to our
            # expected "frame size" and thus is likely some kind of
            # history gap (eg. market closed period, outage, etc.)
            # so just report it to console for now.
+           if lt_frame:
+               reason = 'Possible GAP (or first-datum)'
+           else:
+               assert null_frame
+               reason = 'NULL-FRAME'
+
+           missing_dur: Interval = expected_dur.end - recv_frame_dur.end
            log.warning(
-               'GAP DETECTED:\n'
-               f'last_start_dt: {last_start_dt}\n'
-               f'diff: {diff}\n'
-               f'frame_time_diff_s: {frame_time_diff_s}\n'
+               f'{timeframe}s-series {reason} detected!\n'
+               f'fqme: {mkt.fqme}\n'
+               f'last_start_dt: {last_start_dt}\n\n'
+               f'recv interval: {recv_frame_dur}\n'
+               f'expected interval: {expected_dur}\n\n'
+               f'Missing duration of history of {missing_dur.in_words()!r}\n'
+               f'{missing_dur}\n'
            )
+           # await tractor.pause()

        to_push = diff_history(
            array,
@@ -565,22 +612,27 @@ async def start_backfill(
            # long-term storage.
            if (
                storage is not None
-               and write_tsdb
+               and
+               write_tsdb
            ):
                log.info(
                    f'Writing {ln} frame to storage:\n'
                    f'{next_start_dt} -> {last_start_dt}'
                )

-               # always drop the src asset token for
+               # NOTE, always drop the src asset token for
                # non-currency-pair like market types (for now)
+               #
+               # THAT IS, for now our table key schema is NOT
+               # including the dst[/src] source asset token. SO,
+               # 'tsla.nasdaq.ib' over 'tsla/usd.nasdaq.ib' for
+               # historical reasons ONLY.
                if mkt.dst.atype not in {
                    'crypto',
                    'crypto_currency',
                    'fiat',  # a "forex pair"
+                   'perpetual_future',  # stupid "perps" from cex land
                }:
-                   # for now, our table key schema is not including
-                   # the dst[/src] source asset token.
                    col_sym_key: str = mkt.get_fqme(
                        delim_char='',
                        without_src=True,
@@ -685,7 +737,7 @@ async def back_load_from_tsdb(
        last_tsdb_dt
        and latest_start_dt
    ):
-       backfilled_size_s = (
+       backfilled_size_s: Duration = (
            latest_start_dt - last_tsdb_dt
        ).seconds
        # if the shm buffer len is not large enough to contain
@@ -908,6 +960,8 @@ async def tsdb_backfill(
        f'{pformat(config)}\n'
    )

+   # concurrently load the provider's most-recent-frame AND any
+   # pre-existing tsdb history already saved in `piker` storage.
    dt_eps: list[DateTime, DateTime] = []
    async with trio.open_nursery() as tn:
        tn.start_soon(
@@ -918,7 +972,6 @@ async def tsdb_backfill(
            timeframe,
            config,
        )
-
        tsdb_entry: tuple = await load_tsdb_hist(
            storage,
            mkt,
@@ -947,6 +1000,25 @@ async def tsdb_backfill(
            mr_end_dt,
        ) = dt_eps

+       first_frame_dur_s: Duration = (mr_end_dt - mr_start_dt).seconds
+       calced_frame_size: Duration = mk_duration(
+           seconds=first_frame_dur_s,
+       )
+       # NOTE, attempt to use the backend declared default frame
+       # sizing (as allowed by their time-series query APIs) and
+       # if not provided try to construct a default from the
+       # first frame received above.
+       def_frame_durs: dict[
+           int,
+           Duration,
+       ]|None = config.get('frame_types', None)
+       if def_frame_durs:
+           def_frame_size: Duration = def_frame_durs[timeframe]
+           assert def_frame_size == calced_frame_size
+       else:
+           # use what we calced from first frame above.
+           def_frame_size = calced_frame_size
+
        # NOTE: when there's no offline data, there's 2 cases:
        # - data backend doesn't support timeframe/sample
        #   period (in which case `dt_eps` should be `None` and
@@ -977,7 +1049,7 @@ async def tsdb_backfill(
                partial(
                    start_backfill,
                    get_hist=get_hist,
-                   frame_types=config.get('frame_types', None),
+                   def_frame_duration=def_frame_size,
                    mod=mod,
                    mkt=mkt,
                    shm=shm,
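The retry arithmetic from the `NoData` handler above, shown in isolation; the dates and the two-day frame duration here are made up for illustration:

from pendulum import datetime, duration

# sketch: on an empty frame, walk `last_start_dt` back by one
# (default) frame's worth of time and re-query from there.
def_frame_duration = duration(days=2)
last_start_dt = datetime(2024, 1, 10)

last_start_dt = last_start_dt.subtract(
    seconds=def_frame_duration.total_seconds(),
)
assert last_start_dt == datetime(2024, 1, 8)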

View File

@@ -616,6 +616,18 @@ def detect_price_gaps(
    # ])
    ...

+# TODO: probably just use the null_segs impl above?
+def detect_vlm_gaps(
+    df: pl.DataFrame,
+    col: str = 'volume',
+
+) -> pl.DataFrame:
+
+    vnull: pl.DataFrame = df.filter(
+        pl.col(col) == 0
+    )
+    return vnull
+

def dedupe(
    src_df: pl.DataFrame,
@@ -626,7 +638,6 @@ def dedupe(
) -> tuple[
    pl.DataFrame,  # with dts
-   pl.DataFrame,  # gaps
    pl.DataFrame,  # with deduplicated dts (aka gap/repeat removal)
    int,  # len diff between input and deduped
]:
@@ -639,19 +650,22 @@ def dedupe(
    '''
    wdts: pl.DataFrame = with_dts(src_df)

-   # maybe sort on any time field
-   if sort:
-       wdts = wdts.sort(by='time')
-   # TODO: detect out-of-order segments which were corrected!
-   # -[ ] report in log msg
-   # -[ ] possibly return segment sections which were moved?
+   deduped = wdts

    # remove duplicated datetime samples/sections
    deduped: pl.DataFrame = wdts.unique(
-       subset=['dt'],
+       # subset=['dt'],
+       subset=['time'],
        maintain_order=True,
    )

+   # maybe sort on any time field
+   if sort:
+       deduped = deduped.sort(by='time')
+   # TODO: detect out-of-order segments which were corrected!
+   # -[ ] report in log msg
+   # -[ ] possibly return segment sections which were moved?
+
    diff: int = (
        wdts.height
        -
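The new dedupe ordering in isolation, on a tiny made-up frame: drop repeated sample times first (now keyed on the epoch `time` column instead of the derived `dt`), then optionally sort:

import polars as pl

df = pl.DataFrame({
    'time': [3, 1, 1, 2],
    'close': [30., 10., 10., 20.],
})
deduped: pl.DataFrame = df.unique(
    subset=['time'],
    maintain_order=True,
).sort(by='time')
assert deduped['time'].to_list() == [1, 2, 3]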

View File

@@ -14,9 +14,8 @@
# You should have received a copy of the GNU Affero General Public License
# along with this program.  If not, see <https://www.gnu.org/licenses/>.
-"""
-Stuff for your eyes, aka super hawt Qt UI components.
-
-Currently we only support PyQt5 due to this issue in Pyside2:
-https://bugreports.qt.io/projects/PYSIDE/issues/PYSIDE-1313
-"""
+'''
+UI components built using `Qt` with major versions swapped in via
+the import indirection in the `.qt` sub-mod.
+
+'''

View File

@@ -21,8 +21,10 @@ Anchor funtions for UI placement of annotions.
from __future__ import annotations
from typing import Callable, TYPE_CHECKING

-from PyQt5.QtCore import QPointF
-from PyQt5.QtWidgets import QGraphicsPathItem
+from piker.ui.qt import (
+    QPointF,
+    QGraphicsPathItem,
+)

if TYPE_CHECKING:
    from ._chart import ChartPlotWidget

View File

@@ -20,12 +20,22 @@ Annotations for ur faces.
"""
from typing import Callable

-from PyQt5 import QtCore, QtGui, QtWidgets
-from PyQt5.QtCore import QPointF, QRectF
-from PyQt5.QtWidgets import QGraphicsPathItem
-from pyqtgraph import Point, functions as fn, Color
+from pyqtgraph import (
+    Point,
+    functions as fn,
+    Color,
+)
import numpy as np

+from piker.ui.qt import (
+    QtCore,
+    QtGui,
+    QtWidgets,
+    QPointF,
+    QRectF,
+    QGraphicsPathItem,
+)

def mk_marker_path(

View File

@@ -21,9 +21,11 @@ Main app startup and run.
from functools import partial
from types import ModuleType

-from PyQt5.QtCore import QEvent
import trio

+from piker.ui.qt import (
+    QEvent,
+)
from ..service import maybe_spawn_brokerd
from . import _event
from ._exec import run_qtractor

View File

@@ -25,9 +25,16 @@ from math import floor

import polars as pl
import pyqtgraph as pg
-from PyQt5 import QtCore, QtGui, QtWidgets
-from PyQt5.QtCore import QPointF

+from piker.ui.qt import (
+    QtCore,
+    QtGui,
+    QtWidgets,
+    QPointF,
+    txt_flag,
+    align_flag,
+    px_cache_mode,
+)
from . import _pg_overrides as pgo
from ..accounting._mktinfo import float_digits
from ._label import Label
@@ -414,11 +421,15 @@ class AxisLabel(pg.GraphicsObject):
        super().__init__()
        self.setParentItem(parent)

-       self.setFlag(self.ItemIgnoresTransformations)
+       self.setFlag(
+           self.GraphicsItemFlag.ItemIgnoresTransformations
+       )
        self.setZValue(100)

        # XXX: pretty sure this is faster
-       self.setCacheMode(QtWidgets.QGraphicsItem.DeviceCoordinateCache)
+       self.setCacheMode(
+           px_cache_mode.DeviceCoordinateCache
+       )

        self._parent = parent
@@ -555,21 +566,14 @@ class AxisLabel(pg.GraphicsObject):
        return (self.rect.width(), self.rect.height())

-# _common_text_flags = (
-#     QtCore.Qt.TextDontClip |
-#     QtCore.Qt.AlignCenter |
-#     QtCore.Qt.AlignTop |
-#     QtCore.Qt.AlignHCenter |
-#     QtCore.Qt.AlignVCenter
-# )
-
class XAxisLabel(AxisLabel):
    _x_margin = 8

    text_flags = (
-       QtCore.Qt.TextDontClip
-       | QtCore.Qt.AlignCenter
+       align_flag.AlignCenter
+       | txt_flag.TextDontClip
    )

    def size_hint(self) -> tuple[float, float]:
@@ -626,10 +630,10 @@ class YAxisLabel(AxisLabel):
    _y_margin: int = 4

    text_flags = (
-       QtCore.Qt.AlignLeft
-       # QtCore.Qt.AlignHCenter
-       | QtCore.Qt.AlignVCenter
-       | QtCore.Qt.TextDontClip
+       align_flag.AlignLeft
+       | align_flag.AlignVCenter
+       # | align_flag.AlignHCenter
+       | txt_flag.TextDontClip
    )

    def __init__(

View File

@@ -28,22 +28,20 @@ from typing import (
    TYPE_CHECKING,
)

-from PyQt5 import QtCore, QtWidgets
-from PyQt5.QtCore import (
-    Qt,
-    QLineF,
-    # QPointF,
-)
-from PyQt5.QtWidgets import (
-    QFrame,
-    QWidget,
-    QHBoxLayout,
-    QVBoxLayout,
-    QSplitter,
-)
import pyqtgraph as pg
import trio

+from piker.ui.qt import (
+    QtCore,
+    QtWidgets,
+    Qt,
+    QLineF,
+    QFrame,
+    QWidget,
+    QHBoxLayout,
+    QVBoxLayout,
+    QSplitter,
+)
from ._axes import (
    DynamicDateAxis,
    PriceAxis,
@ -570,8 +568,8 @@ class LinkedSplits(QWidget):
# style? # style?
self.chart.setFrameStyle( self.chart.setFrameStyle(
QFrame.StyledPanel | QFrame.Shape.StyledPanel |
QFrame.Plain QFrame.Shadow.Plain
) )
return self.chart return self.chart
@ -689,8 +687,8 @@ class LinkedSplits(QWidget):
cpw.plotItem.vb.linked = self cpw.plotItem.vb.linked = self
cpw.setFrameStyle( cpw.setFrameStyle(
QtWidgets.QFrame.StyledPanel QFrame.Shape.StyledPanel
# | QtWidgets.QFrame.Plain # | QFrame.Shadow.Plain
) )
# don't show the little "autoscale" A label. # don't show the little "autoscale" A label.

View File

@ -28,9 +28,14 @@ from typing import (
import inspect import inspect
import numpy as np import numpy as np
import pyqtgraph as pg import pyqtgraph as pg
from PyQt5 import QtCore, QtWidgets
from PyQt5.QtCore import QPointF, QRectF
from piker.ui.qt import (
QPointF,
QRectF,
QtCore,
QtWidgets,
px_cache_mode,
)
from ._style import ( from ._style import (
_xaxis_at, _xaxis_at,
hcolor, hcolor,
@ -104,7 +109,9 @@ class LineDot(pg.CurvePoint):
dot.setParentItem(self) dot.setParentItem(self)
# keep a static size # keep a static size
self.setFlag(self.ItemIgnoresTransformations) self.setFlag(
self.GraphicsItemFlag.ItemIgnoresTransformations
)
def event( def event(
self, self,
@ -424,10 +431,10 @@ class Cursor(pg.GraphicsObject):
# vertical and horizontal lines and a y-axis label # vertical and horizontal lines and a y-axis label
vl = plot.addLine(x=0, pen=self.lines_pen, movable=False) vl = plot.addLine(x=0, pen=self.lines_pen, movable=False)
vl.setCacheMode(QtWidgets.QGraphicsItem.DeviceCoordinateCache) vl.setCacheMode(px_cache_mode.DeviceCoordinateCache)
hl = plot.addLine(y=0, pen=self.lines_pen, movable=False) hl = plot.addLine(y=0, pen=self.lines_pen, movable=False)
hl.setCacheMode(QtWidgets.QGraphicsItem.DeviceCoordinateCache) hl.setCacheMode(px_cache_mode.DeviceCoordinateCache)
hl.hide() hl.hide()
yl = YAxisLabel( yl = YAxisLabel(
@ -511,7 +518,10 @@ class Cursor(pg.GraphicsObject):
plot=chart plot=chart
) )
chart.addItem(cursor) chart.addItem(cursor)
self.graphics[chart].setdefault('cursors', []).append(cursor) self.graphics[chart].setdefault(
'cursors',
[],
).append(cursor)
return cursor return cursor
def mouseAction( def mouseAction(

View File

@ -19,20 +19,21 @@ Fast, smooth, sexy curves.
""" """
from contextlib import contextmanager as cm from contextlib import contextmanager as cm
from enum import EnumType
from typing import Callable from typing import Callable
import numpy as np import numpy as np
import pyqtgraph as pg import pyqtgraph as pg
from PyQt5 import QtWidgets
from PyQt5.QtWidgets import QGraphicsItem from piker.ui.qt import (
from PyQt5.QtCore import ( QtWidgets,
QGraphicsItem,
Qt, Qt,
QLineF, QLineF,
QRectF, QRectF,
)
from PyQt5.QtGui import (
QPainter, QPainter,
QPainterPath, QPainterPath,
px_cache_mode,
) )
from ._style import hcolor from ._style import hcolor
from ..log import get_logger from ..log import get_logger
@ -42,15 +43,16 @@ from ..toolz.profile import (
ms_slower_then, ms_slower_then,
) )
log = get_logger(__name__) log = get_logger(__name__)
pen_style: EnumType = Qt.PenStyle
_line_styles: dict[str, int] = { _line_styles: dict[str, int] = {
'solid': Qt.PenStyle.SolidLine, 'solid': pen_style.SolidLine,
'dash': Qt.PenStyle.DashLine, 'dash': pen_style.DashLine,
'dot': Qt.PenStyle.DotLine, 'dot': pen_style.DotLine,
'dashdot': Qt.PenStyle.DashDotLine, 'dashdot': pen_style.DashDotLine,
} }
@ -69,12 +71,12 @@ class FlowGraphic(pg.GraphicsObject):
# XXX-NOTE-XXX: graphics caching B) # XXX-NOTE-XXX: graphics caching B)
# see explanation for different caching modes: # see explanation for different caching modes:
# https://stackoverflow.com/a/39410081 # https://stackoverflow.com/a/39410081
cache_mode: int = QGraphicsItem.DeviceCoordinateCache cache_mode: int = px_cache_mode.DeviceCoordinateCache
# XXX: WARNING item caching seems to only be useful # XXX: WARNING item caching seems to only be useful
# if we don't re-generate the entire QPainterPath every time # if we don't re-generate the entire QPainterPath every time
# don't ever use this - it's a colossal nightmare of artefacts # don't ever use this - it's a colossal nightmare of artefacts
# and is disastrous for performance. # and is disastrous for performance.
# QGraphicsItem.ItemCoordinateCache # cache_mode.ItemCoordinateCache
# TODO: still questions todo with coord-caching that we should # TODO: still questions todo with coord-caching that we should
# probably talk to a core dev about: # probably talk to a core dev about:
# - if this makes transform interactions slower (such as zooming) # - if this makes transform interactions slower (such as zooming)
@ -176,7 +178,7 @@ class FlowGraphic(pg.GraphicsObject):
@cm @cm
def reset_cache(self) -> None: def reset_cache(self) -> None:
try: try:
none = QGraphicsItem.NoCache none = px_cache_mode.NoCache
log.debug( log.debug(
f'{self._name} -> CACHE DISABLE: {none}' f'{self._name} -> CACHE DISABLE: {none}'
) )
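
The `pen_style: EnumType = Qt.PenStyle` alias above is the same pattern the new `piker.ui.qt` shim applies throughout: bind the now fully-namespaced PyQt6 enum once at module scope so call-sites stay terse. A minimal standalone sketch, assuming only a stock PyQt6 install:

from enum import EnumType
from PyQt6.QtCore import Qt

# bind the enum namespace once; members are then one attr-lookup away
pen_style: EnumType = Qt.PenStyle

line_styles: dict[str, Qt.PenStyle] = {
    'solid': pen_style.SolidLine,
    'dash': pen_style.DashLine,
}
assert pen_style.SolidLine is Qt.PenStyle.SolidLine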

View File

@ -40,8 +40,8 @@ from numpy import (
ndarray, ndarray,
) )
import pyqtgraph as pg import pyqtgraph as pg
from PyQt5.QtCore import QLineF
from piker.ui.qt import QLineF
from ..data._sharedmem import ( from ..data._sharedmem import (
ShmArray, ShmArray,
) )

View File

@ -57,6 +57,7 @@ from piker.toolz import (
Profiler, Profiler,
) )
from piker.log import get_logger from piker.log import get_logger
from piker import config
# from ..data._source import tf_in_1s # from ..data._source import tf_in_1s
from ._axes import YAxisLabel from ._axes import YAxisLabel
from ._chart import ( from ._chart import (
@ -1231,6 +1232,8 @@ async def link_views_with_region(
# region.sigRegionChangeFinished.connect(update_pi_from_region) # region.sigRegionChangeFinished.connect(update_pi_from_region)
# NOTE: default is set to 60 FPS until the runtime delivers the
# discovered hw value below.
_quote_throttle_rate: int = 60 - 6 _quote_throttle_rate: int = 60 - 6
@ -1272,26 +1275,54 @@ async def display_symbol_data(
# TODO: ctl over update loop's maximum frequency. # TODO: ctl over update loop's maximum frequency.
# - load this from a config.toml! # - load this from a config.toml!
# - allow dynamic configuration from chart UI? # - allow dynamic configuration from chart UI?
(
conf,
path,
) = config.load()
ui_conf: dict = conf['ui']
global _quote_throttle_rate global _quote_throttle_rate
from ._window import main_window from ._window import main_window
display_rate = main_window().current_screen().refreshRate()
_quote_throttle_rate = floor(display_rate) - 6 display_rate: int = floor(
main_window().current_screen().refreshRate()
) - 6
mx_redraw_rate: int = ui_conf.get(
'max_redraw_rate',
_quote_throttle_rate,
)
if mx_redraw_rate < display_rate:
    log.info(
        'Down-throttling redraw rate to config setting\n'
        f'display FPS: {display_rate}\n'
        f'max_redraw_rate: {mx_redraw_rate}\n'
    )
    _quote_throttle_rate = mx_redraw_rate
else:
    _quote_throttle_rate = display_rate
# TODO: we should be able to increase this if we use some # TODO: we should be able to increase this if we use some
# `mypyc` speedups elsewhere? 22ish seems to be the sweet # `mypyc` speedups elsewhere? 22ish seems to be the sweet
# spot for single-feed chart. # spot for single-feed chart.
num_of_feeds = len(fqmes) num_of_feeds = len(fqmes)
mx: int = 22 # if num_of_feeds > 1:
if num_of_feeds > 1:
# there will be more ctx switches with more than 1 feed so we # there will be more ctx switches with more than 1 feed so we
# max throttle down a bit more. # max throttle down a bit more.
mx = 16 mx_per_feed: int = (
ui_conf.get(
'per_feed_redraw_rate',
mx_redraw_rate,
)
or 16
)
# limit to at most the display's FPS # limit to at most the display's FPS
# avoiding needless Qt-in-guest-mode context switches # avoiding needless Qt-in-guest-mode context switches
cycles_per_feed = min( cycles_per_feed = min(
round(_quote_throttle_rate/num_of_feeds), round(_quote_throttle_rate/num_of_feeds),
mx, mx_per_feed,
) )
feed: Feed feed: Feed

View File

@ -32,24 +32,21 @@ from pyqtgraph import (
QtCore, QtCore,
QtWidgets, QtWidgets,
) )
from PyQt5.QtCore import (
QPointF,
QRectF,
)
from PyQt5.QtGui import (
QColor,
QTransform,
)
from PyQt5.QtWidgets import (
QGraphicsProxyWidget,
QGraphicsScene,
QLabel,
)
from pyqtgraph import functions as fn from pyqtgraph import functions as fn
import numpy as np import numpy as np
from piker.types import Struct from piker.types import Struct
from piker.ui.qt import (
Qt,
QPointF,
QRectF,
QGraphicsProxyWidget,
QGraphicsScene,
QLabel,
QColor,
QTransform,
)
from ._style import ( from ._style import (
hcolor, hcolor,
_font, _font,
@ -316,7 +313,9 @@ class SelectRect(QtWidgets.QGraphicsRectItem):
self.setZValue(1e9) self.setZValue(1e9)
label = self._label = QLabel() label = self._label = QLabel()
label.setTextFormat(0) # markdown label.setTextFormat(
Qt.TextFormat.MarkdownText
)
label.setFont(_font.font) label.setFont(_font.font)
label.setMargin(0) label.setMargin(0)
label.setAlignment( label.setAlignment(

View File

@ -23,28 +23,29 @@ from typing import Callable
import trio import trio
from tractor.trionics import gather_contexts from tractor.trionics import gather_contexts
from PyQt5 import QtCore
from PyQt5.QtCore import QEvent, pyqtBoundSignal
from PyQt5.QtWidgets import QWidget
from PyQt5.QtWidgets import (
QGraphicsSceneMouseEvent as gs_mouse,
)
from piker.ui.qt import (
QtCore,
QWidget,
QEvent,
keys,
gs_keys,
pyqtBoundSignal,
)
from piker.types import Struct from piker.types import Struct
MOUSE_EVENTS = { MOUSE_EVENTS = {
gs_mouse.GraphicsSceneMousePress, gs_keys.GraphicsSceneMousePress,
gs_mouse.GraphicsSceneMouseRelease, gs_keys.GraphicsSceneMouseRelease,
QEvent.MouseButtonPress, keys.MouseButtonPress,
QEvent.MouseButtonRelease, keys.MouseButtonRelease,
# QtGui.QMouseEvent, # QtGui.QMouseEvent,
} }
# TODO: maybe consider some constrained ints down the road? # TODO: maybe consider some constrained ints down the road?
# https://pydantic-docs.helpmanual.io/usage/types/#constrained-types # https://pydantic-docs.helpmanual.io/usage/types/#constrained-types
class KeyboardMsg(Struct): class KeyboardMsg(Struct):
'''Unpacked Qt keyboard event data. '''Unpacked Qt keyboard event data.
@ -114,7 +115,10 @@ class EventRelay(QtCore.QObject):
# something to do with Qt internals and calling the # something to do with Qt internals and calling the
# parent handler? # parent handler?
if etype in {QEvent.KeyPress, QEvent.KeyRelease}: if etype in {
QEvent.Type.KeyPress,
QEvent.Type.KeyRelease,
}:
msg = KeyboardMsg( msg = KeyboardMsg(
event=ev, event=ev,
@ -160,7 +164,9 @@ class EventRelay(QtCore.QObject):
async def open_event_stream( async def open_event_stream(
source_widget: QWidget, source_widget: QWidget,
event_types: set[QEvent] = {QEvent.KeyPress}, event_types: set[QEvent] = {
QEvent.Type.KeyPress,
},
filter_auto_repeats: bool = True, filter_auto_repeats: bool = True,
) -> trio.abc.ReceiveChannel: ) -> trio.abc.ReceiveChannel:
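
A hedged usage sketch of the endpoint above: stream key-presses from a widget, assuming `open_event_stream()` is entered as an async context manager yielding the receive channel (its `gather_contexts` import and channel return type suggest so); the `kbmsg` field access mirrors the form-input handler shown later in this changeset.

from piker.ui.qt import QEvent, QWidget
from piker.ui._event import open_event_stream  # import path assumed

async def watch_keys(widget: QWidget) -> None:
    # only subscribe for key-press events on the given widget
    async with open_event_stream(
        widget,
        event_types={QEvent.Type.KeyPress},
    ) as recv_chan:
        async for kbmsg in recv_chan:
            print(f'key: {kbmsg.key}, mods: {kbmsg.mods}, txt: {kbmsg.txt}')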

View File

@ -30,25 +30,22 @@ from typing import (
import platform import platform
import traceback import traceback
# Qt specific
import PyQt5 # noqa
from PyQt5.QtWidgets import (
QWidget,
QMainWindow,
QApplication,
)
from PyQt5 import QtCore
from PyQt5.QtCore import (
pyqtRemoveInputHook,
Qt,
QCoreApplication,
)
import qdarkstyle import qdarkstyle
from qdarkstyle import DarkPalette from qdarkstyle import DarkPalette
# import qdarkgraystyle # TODO: play with it # import qdarkgraystyle # TODO: play with it
import trio import trio
from outcome import Error from outcome import Error
# Qt version-agnostic
from .qt import (
QWidget,
QMainWindow,
QApplication,
QtCore,
pyqtRemoveInputHook,
Qt,
QCoreApplication,
)
from ..service import ( from ..service import (
maybe_open_pikerd, maybe_open_pikerd,
get_runtime_vars, get_runtime_vars,
@ -150,7 +147,7 @@ def run_qtractor(
# load dark theme # load dark theme
stylesheet = qdarkstyle.load_stylesheet( stylesheet = qdarkstyle.load_stylesheet(
qt_api='pyqt5', qt_api='pyqt6',
palette=DarkPalette, palette=DarkPalette,
) )
app.setStyleSheet(stylesheet) app.setStyleSheet(stylesheet)

View File

@ -28,9 +28,15 @@ from typing import (
) )
import trio import trio
from PyQt5 import QtGui
from PyQt5.QtCore import QSize, QModelIndex, Qt, QEvent from piker.ui.qt import (
from PyQt5.QtWidgets import ( keys,
size_policy,
QtGui,
QSize,
QModelIndex,
Qt,
QEvent,
QWidget, QWidget,
QLabel, QLabel,
QComboBox, QComboBox,
@ -39,7 +45,6 @@ from PyQt5.QtWidgets import (
QVBoxLayout, QVBoxLayout,
QFormLayout, QFormLayout,
QProgressBar, QProgressBar,
QSizePolicy,
QStyledItemDelegate, QStyledItemDelegate,
QStyleOptionViewItem, QStyleOptionViewItem,
) )
@ -71,14 +76,14 @@ class Edit(QLineEdit):
if width_in_chars: if width_in_chars:
self._chars = int(width_in_chars) self._chars = int(width_in_chars)
x_size_policy = QSizePolicy.Fixed x_size_policy = size_policy.Fixed
else: else:
# char count which will be used to calculate # char count which will be used to calculate
# width of input field. # width of input field.
self._chars: int = 6 self._chars: int = 6
# fit to surrounding frame width # fit to surrounding frame width
x_size_policy = QSizePolicy.Expanding x_size_policy = size_policy.Expanding
super().__init__(parent) super().__init__(parent)
@ -86,7 +91,7 @@ class Edit(QLineEdit):
# https://doc.qt.io/qt-5/qsizepolicy.html#Policy-enum # https://doc.qt.io/qt-5/qsizepolicy.html#Policy-enum
self.setSizePolicy( self.setSizePolicy(
x_size_policy, x_size_policy,
QSizePolicy.Fixed, size_policy.Fixed,
) )
self.setFont(font.font) self.setFont(font.font)
@ -180,11 +185,13 @@ class Selection(QComboBox):
self._items: dict[str, int] = {} self._items: dict[str, int] = {}
super().__init__(parent=parent) super().__init__(parent=parent)
self.setSizeAdjustPolicy(QComboBox.AdjustToContents) self.setSizeAdjustPolicy(
QComboBox.SizeAdjustPolicy.AdjustToContents,
)
# make line edit expand to surrounding frame # make line edit expand to surrounding frame
self.setSizePolicy( self.setSizePolicy(
QSizePolicy.Expanding, size_policy.Expanding,
QSizePolicy.Fixed, size_policy.Fixed,
) )
view = self.view() view = self.view()
view.setUniformItemSizes(True) view.setUniformItemSizes(True)
@ -308,8 +315,8 @@ class FieldsForm(QWidget):
# size it as we specify # size it as we specify
self.setSizePolicy( self.setSizePolicy(
QSizePolicy.Expanding, size_policy.Expanding,
QSizePolicy.Expanding, size_policy.Expanding,
) )
# XXX: not sure why we have to create this here exactly # XXX: not sure why we have to create this here exactly
@ -416,8 +423,8 @@ class FieldsForm(QWidget):
select.set_items(values) select.set_items(values)
self.setSizePolicy( self.setSizePolicy(
QSizePolicy.Fixed, size_policy.Fixed,
QSizePolicy.Fixed, size_policy.Fixed,
) )
select.show() select.show()
self.form.addRow(label, select) self.form.addRow(label, select)
@ -437,7 +444,10 @@ async def handle_field_input(
async for kbmsg in recv_chan: async for kbmsg in recv_chan:
if kbmsg.etype in {QEvent.KeyPress, QEvent.KeyRelease}: if kbmsg.etype in {
keys.KeyPress,
keys.KeyRelease,
}:
event, etype, key, mods, txt = kbmsg.to_tuple() event, etype, key, mods, txt = kbmsg.to_tuple()
print(f'key: {kbmsg.key}, mods: {kbmsg.mods}, txt: {kbmsg.txt}') print(f'key: {kbmsg.key}, mods: {kbmsg.mods}, txt: {kbmsg.txt}')
@ -703,7 +713,8 @@ def mk_fill_status_bar(
) )
bottom_label = form.add_field_label( bottom_label = form.add_field_label(
'x: {step_size}', # 'x: {step_size}',
'{unit_prefix}: {step_size}',
font_size=bar_label_font_size, font_size=bar_label_font_size,
font_color='gunmetal', font_color='gunmetal',
) )

View File

@ -15,15 +15,18 @@
# along with this program. If not, see <https://www.gnu.org/licenses/>. # along with this program. If not, see <https://www.gnu.org/licenses/>.
''' '''
``QIcon`` hackery. `QIcon` hackery.
Mostly dynamically loading pixmaps for use with `QGraphicsScene`.
''' '''
from PyQt5.QtWidgets import QStyle from piker.ui.qt import (
from PyQt5.QtGui import ( QSize,
QIcon, QPixmap, QColor QStyle,
QIcon,
QPixmap,
QColor,
) )
from PyQt5.QtCore import QSize
from ._style import hcolor from ._style import hcolor
# https://www.pythonguis.com/faq/built-in-qicons-pyqt/ # https://www.pythonguis.com/faq/built-in-qicons-pyqt/
@ -44,7 +47,8 @@ def mk_icons(
size: QSize, size: QSize,
) -> dict[str, QIcon]: ) -> dict[str, QIcon]:
'''This helper is idempotent. '''
This helper is idempotent.
''' '''
global _icons, _icon_names global _icons, _icon_names
@ -56,7 +60,11 @@ def mk_icons(
# load account selection using current style # load account selection using current style
for name, icon_name in _icon_names.items(): for name, icon_name in _icon_names.items():
stdpixmap = getattr(QStyle, icon_name) stdpixmap = getattr(
# https://www.pythonguis.com/faq/built-in-qicons-pyqt/
QStyle.StandardPixmap, # pyqt/pyside6
icon_name,
)
stdicon = style.standardIcon(stdpixmap) stdicon = style.standardIcon(stdpixmap)
pixmap = stdicon.pixmap(size) pixmap = stdicon.pixmap(size)
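
The `getattr()` hop above is needed because PyQt6 moved the `SP_*` constants off `QStyle` and onto the nested `QStyle.StandardPixmap` enum; a standalone sketch (icon name chosen arbitrarily, needs a GUI session):

from PyQt6.QtWidgets import QApplication, QStyle

app = QApplication([])
style = app.style()

# PyQt5: getattr(QStyle, 'SP_MessageBoxWarning')
# PyQt6: the same name now lives on the nested enum
stdpixmap = getattr(QStyle.StandardPixmap, 'SP_MessageBoxWarning')
icon = style.standardIcon(stdpixmap)
assert not icon.isNull()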

View File

@ -36,23 +36,21 @@ import pyqtgraph as pg
# this down the road.. Bo # this down the road.. Bo
from pyqtgraph.GraphicsScene import mouseEvents as mevs from pyqtgraph.GraphicsScene import mouseEvents as mevs
# from pyqtgraph.GraphicsScene.mouseEvents import MouseDragEvent # from pyqtgraph.GraphicsScene.mouseEvents import MouseDragEvent
from PyQt5.QtWidgets import QGraphicsSceneMouseEvent as gs_mouse
from PyQt5.QtGui import (
QWheelEvent,
)
from PyQt5.QtCore import (
Qt,
QEvent,
)
from pyqtgraph import ( from pyqtgraph import (
ViewBox, ViewBox,
Point, Point,
QtCore, QtCore,
functions as fn,
) )
from pyqtgraph import functions as fn
import numpy as np import numpy as np
import trio import trio
from piker.ui.qt import (
QWheelEvent,
QGraphicsSceneMouseEvent as gs_mouse,
Qt,
QEvent,
)
from ..log import get_logger from ..log import get_logger
from ..toolz import ( from ..toolz import (
Profiler, Profiler,
@ -81,22 +79,22 @@ if TYPE_CHECKING:
log = get_logger(__name__) log = get_logger(__name__)
NUMBER_LINE = { NUMBER_LINE = {
Qt.Key_1, Qt.Key.Key_1,
Qt.Key_2, Qt.Key.Key_2,
Qt.Key_3, Qt.Key.Key_3,
Qt.Key_4, Qt.Key.Key_4,
Qt.Key_5, Qt.Key.Key_5,
Qt.Key_6, Qt.Key.Key_6,
Qt.Key_7, Qt.Key.Key_7,
Qt.Key_8, Qt.Key.Key_8,
Qt.Key_9, Qt.Key.Key_9,
Qt.Key_0, Qt.Key.Key_0,
} }
ORDER_MODE = { ORDER_MODE = {
Qt.Key_A, Qt.Key.Key_A,
Qt.Key_F, Qt.Key.Key_F,
Qt.Key_D, Qt.Key.Key_D,
} }

View File

@ -21,9 +21,12 @@ Double auction top-of-book (L1) graphics.
from typing import Tuple from typing import Tuple
import pyqtgraph as pg import pyqtgraph as pg
from PyQt5 import QtCore, QtGui
from PyQt5.QtCore import QPointF
from piker.ui.qt import (
QPointF,
QtCore,
QtGui,
)
from ._axes import YAxisLabel from ._axes import YAxisLabel
from ._style import hcolor from ._style import hcolor
from ._pg_overrides import PlotItem from ._pg_overrides import PlotItem

View File

@ -25,10 +25,17 @@ from typing import (
) )
import pyqtgraph as pg import pyqtgraph as pg
from PyQt5 import QtGui, QtWidgets
from PyQt5.QtWidgets import QLabel, QSizePolicy
from PyQt5.QtCore import QPointF, QRectF, Qt
from piker.ui.qt import (
px_cache_mode,
QtGui,
QtWidgets,
QLabel,
size_policy,
QPointF,
QRectF,
Qt,
)
from ._style import ( from ._style import (
DpiAwareFont, DpiAwareFont,
hcolor, hcolor,
@ -78,7 +85,7 @@ class Label:
self._x_offset = x_offset self._x_offset = x_offset
txt = self.txt = QtWidgets.QGraphicsTextItem(parent=parent) txt = self.txt = QtWidgets.QGraphicsTextItem(parent=parent)
txt.setCacheMode(QtWidgets.QGraphicsItem.DeviceCoordinateCache) txt.setCacheMode(px_cache_mode.DeviceCoordinateCache)
vb.scene().addItem(txt) vb.scene().addItem(txt)
@ -103,7 +110,7 @@ class Label:
self._anchor_func = self.txt.pos().x self._anchor_func = self.txt.pos().x
# not sure if this makes a diff # not sure if this makes a diff
self.txt.setCacheMode(QtWidgets.QGraphicsItem.DeviceCoordinateCache) self.txt.setCacheMode(px_cache_mode.DeviceCoordinateCache)
# TODO: edit and selection support # TODO: edit and selection support
# https://doc.qt.io/qt-5/qt.html#TextInteractionFlag-enum # https://doc.qt.io/qt-5/qt.html#TextInteractionFlag-enum
@ -299,12 +306,14 @@ class FormatLabel(QLabel):
""" """
) )
self.setFont(_font.font) self.setFont(_font.font)
self.setTextFormat(Qt.MarkdownText) # markdown self.setTextFormat(
Qt.TextFormat.MarkdownText
)
self.setMargin(0) self.setMargin(0)
self.setSizePolicy( self.setSizePolicy(
QSizePolicy.Expanding, size_policy.Expanding,
QSizePolicy.Expanding, size_policy.Expanding,
) )
self.setAlignment( self.setAlignment(
Qt.AlignVCenter | Qt.AlignLeft Qt.AlignVCenter | Qt.AlignLeft

View File

@ -27,20 +27,22 @@ from typing import (
) )
import pyqtgraph as pg import pyqtgraph as pg
from pyqtgraph import Point, functions as fn from pyqtgraph import (
from PyQt5 import ( Point,
functions as fn,
)
from piker.ui.qt import (
px_cache_mode,
QtCore, QtCore,
QtGui, QtGui,
)
from PyQt5.QtWidgets import (
QGraphicsPathItem, QGraphicsPathItem,
QStyleOptionGraphicsItem, QStyleOptionGraphicsItem,
QGraphicsItem, QGraphicsItem,
QGraphicsScene, QGraphicsScene,
QWidget, QWidget,
QPointF,
) )
from PyQt5.QtCore import QPointF
from ._annotate import LevelMarker from ._annotate import LevelMarker
from ._anchors import ( from ._anchors import (
vbr_left, vbr_left,
@ -140,7 +142,9 @@ class LevelLine(pg.InfiniteLine):
self._right_end_sc: float = 0 self._right_end_sc: float = 0
# use px caching # use px caching
self.setCacheMode(QGraphicsItem.DeviceCoordinateCache) self.setCacheMode(
px_cache_mode.DeviceCoordinateCache
)
def txt_offsets(self) -> tuple[int, int]: def txt_offsets(self) -> tuple[int, int]:
return 0, 0 return 0, 0
@ -211,7 +215,7 @@ class LevelLine(pg.InfiniteLine):
) -> None: ) -> None:
if not called_from_on_pos_change: if not called_from_on_pos_change:
last = self.value() last: float = self.value()
# if the position hasn't changed then ``.update_labels()`` # if the position hasn't changed then ``.update_labels()``
# will not be called by a non-triggered `.on_pos_change()`, # will not be called by a non-triggered `.on_pos_change()`,

View File

@ -20,16 +20,14 @@ Super fast OHLC sampling graphics types.
from __future__ import annotations from __future__ import annotations
import numpy as np import numpy as np
from PyQt5 import (
from piker.ui.qt import (
QtGui, QtGui,
QtWidgets, QtWidgets,
) QPainterPath,
from PyQt5.QtCore import (
QLineF, QLineF,
QRectF, QRectF,
) )
from PyQt5.QtGui import QPainterPath
from ._curve import FlowGraphic from ._curve import FlowGraphic
from ..toolz import ( from ..toolz import (
Profiler, Profiler,

View File

@ -344,7 +344,10 @@ class SettingsPane:
dsize = tracker.live_pp.dsize dsize = tracker.live_pp.dsize
# READ out settings and update the status UI / settings widgets # READ out settings and update the status UI / settings widgets
suffix = {'currency': ' $', 'units': ' u'}[alloc.size_unit] unit_char: str = {
'currency': '$',
'units': 'u',
}[alloc.size_unit]
size_unit, limit = alloc.limit_info() size_unit, limit = alloc.limit_info()
step_size, currency_per_slot = alloc.step_sizes() step_size, currency_per_slot = alloc.step_sizes()
@ -358,10 +361,11 @@ class SettingsPane:
self.apply_setting('limit', limit) self.apply_setting('limit', limit)
self.step_label.format( self.step_label.format(
step_size=str(humanize(step_size)) + suffix unit_prefix=unit_char,
step_size=str(humanize(step_size))
) )
self.limit_label.format( self.limit_label.format(
limit=str(humanize(limit)) + suffix limit=f'{unit_char}: {str(humanize(limit))}'
) )
# update size unit in UI # update size unit in UI

View File

@ -38,14 +38,14 @@ from tractor import (
Context, Context,
MsgStream, MsgStream,
) )
from PyQt5.QtWidgets import (
QGraphicsItem,
)
from piker.log import get_logger from piker.log import get_logger
from piker.types import Struct from piker.types import Struct
from piker.service import find_service from piker.service import find_service
from piker.brokers import SymbolNotFound from piker.brokers import SymbolNotFound
from piker.ui.qt import (
QGraphicsItem,
)
from ._display import DisplayState from ._display import DisplayState
from ._interaction import ChartView from ._interaction import ChartView
from ._editors import SelectRect from ._editors import SelectRect

View File

@ -30,8 +30,8 @@ from typing import (
import msgspec import msgspec
import numpy as np import numpy as np
import pyqtgraph as pg import pyqtgraph as pg
from PyQt5.QtGui import QPainterPath
from piker.ui.qt import QPainterPath
from ..data._formatters import ( from ..data._formatters import (
IncrementalFormatter, IncrementalFormatter,
) )

View File

@ -48,27 +48,24 @@ from pprint import pformat
from rapidfuzz import process as fuzzy from rapidfuzz import process as fuzzy
import trio import trio
from trio_typing import TaskStatus from trio_typing import TaskStatus
from PyQt5 import QtCore
from PyQt5 import QtWidgets from piker.ui.qt import (
from PyQt5.QtCore import ( size_policy,
align_flag,
Qt, Qt,
QtCore,
QtWidgets,
QModelIndex, QModelIndex,
QItemSelectionModel, QItemSelectionModel,
)
from PyQt5.QtGui import (
# QLayout, # QLayout,
QStandardItem, QStandardItem,
QStandardItemModel, QStandardItemModel,
)
from PyQt5.QtWidgets import (
QWidget, QWidget,
QTreeView, QTreeView,
# QListWidgetItem, # QListWidgetItem,
# QAbstractScrollArea, # QAbstractScrollArea,
# QStyledItemDelegate, # QStyledItemDelegate,
) )
from ..log import get_logger from ..log import get_logger
from ._style import ( from ._style import (
_font, _font,
@ -129,8 +126,8 @@ class CompleterView(QTreeView):
# ux settings # ux settings
self.setSizePolicy( self.setSizePolicy(
QtWidgets.QSizePolicy.Expanding, size_policy.Expanding,
QtWidgets.QSizePolicy.Expanding, size_policy.Expanding,
) )
self.setItemsExpandable(True) self.setItemsExpandable(True)
self.setExpandsOnDoubleClick(False) self.setExpandsOnDoubleClick(False)
@ -567,8 +564,8 @@ class SearchWidget(QtWidgets.QWidget):
# size it as we specify # size it as we specify
self.setSizePolicy( self.setSizePolicy(
QtWidgets.QSizePolicy.Fixed, size_policy.Fixed,
QtWidgets.QSizePolicy.Fixed, size_policy.Fixed,
) )
self.godwidget = godwidget self.godwidget = godwidget
@ -592,14 +589,16 @@ class SearchWidget(QtWidgets.QWidget):
}} }}
""" """
) )
label.setTextFormat(3) # markdown label.setTextFormat(
Qt.TextFormat.MarkdownText
)
label.setFont(_font.font) label.setFont(_font.font)
label.setMargin(4) label.setMargin(4)
label.setText("search:") label.setText("search:")
label.show() label.show()
label.setAlignment( label.setAlignment(
QtCore.Qt.AlignVCenter align_flag.AlignVCenter
| QtCore.Qt.AlignLeft | align_flag.AlignLeft
) )
self.bar_hbox.addWidget(label) self.bar_hbox.addWidget(label)
@ -617,9 +616,17 @@ class SearchWidget(QtWidgets.QWidget):
self.vbox.addLayout(self.bar_hbox) self.vbox.addLayout(self.bar_hbox)
self.vbox.setAlignment(self.bar, Qt.AlignTop | Qt.AlignRight) self.vbox.setAlignment(
self.bar,
align_flag.AlignTop
| align_flag.AlignRight,
)
self.vbox.addWidget(self.bar.view) self.vbox.addWidget(self.bar.view)
self.vbox.setAlignment(self.view, Qt.AlignTop | Qt.AlignLeft) self.vbox.setAlignment(
self.view,
align_flag.AlignTop
| align_flag.AlignLeft,
)
def focus(self) -> None: def focus(self) -> None:
self.show() self.show()

View File

@ -22,10 +22,14 @@ from typing import Dict
import math import math
import pyqtgraph as pg import pyqtgraph as pg
from PyQt5 import QtCore, QtGui
from PyQt5.QtCore import Qt, QCoreApplication
from qdarkstyle import DarkPalette from qdarkstyle import DarkPalette
from .qt import (
QtCore,
QtGui,
Qt,
QCoreApplication,
)
from ..log import get_logger from ..log import get_logger
from .. import config from .. import config

View File

@ -27,16 +27,14 @@ from typing import (
) )
import uuid import uuid
from PyQt5 import QtCore from piker.ui.qt import (
from PyQt5.QtWidgets import ( Qt,
QtCore,
QWidget, QWidget,
QMainWindow, QMainWindow,
QApplication, QApplication,
QLabel, QLabel,
QStatusBar, QStatusBar,
)
from PyQt5.QtGui import (
QScreen, QScreen,
QCloseEvent, QCloseEvent,
) )
@ -197,7 +195,9 @@ class MainWindow(QMainWindow):
""" """
# font-size : {font_size}px; # font-size : {font_size}px;
) )
label.setTextFormat(3) # markdown label.setTextFormat(
Qt.TextFormat.MarkdownText
)
label.setFont(_font_small.font) label.setFont(_font_small.font)
label.setMargin(2) label.setMargin(2)
label.setAlignment( label.setAlignment(

View File

@ -34,7 +34,6 @@ import uuid
from bidict import bidict from bidict import bidict
import tractor import tractor
import trio import trio
from PyQt5.QtCore import Qt
from piker import config from piker import config
from piker.accounting import ( from piker.accounting import (
@ -59,6 +58,7 @@ from piker.data import (
) )
from piker.types import Struct from piker.types import Struct
from piker.log import get_logger from piker.log import get_logger
from piker.ui.qt import Qt
from ._editors import LineEditor, ArrowEditor from ._editors import LineEditor, ArrowEditor
from ._lines import order_line, LevelLine from ._lines import order_line, LevelLine
from ._position import ( from ._position import (
@ -358,7 +358,7 @@ class OrderMode:
send_msg: bool = True, send_msg: bool = True,
order: Order | None = None, order: Order | None = None,
) -> Dialog | None: ) -> Dialog|None:
''' '''
Send execution order to EMS return a level line to Send execution order to EMS return a level line to
represent the order on a chart. represent the order on a chart.
@ -494,7 +494,7 @@ class OrderMode:
uuid: str, uuid: str,
order: Order | None = None, order: Order | None = None,
) -> Dialog: ) -> Dialog | None:
''' '''
Order submitted status event handler. Order submitted status event handler.
@ -515,6 +515,11 @@ class OrderMode:
# if an order msg is provided update the line # if an order msg is provided update the line
# **from** that msg. # **from** that msg.
if order: if order:
if order.price <= 0:
log.error(f'Order has 0 price, cancelling..\n{order}')
self.cancel_orders([order.oid])
return None
line.set_level(order.price) line.set_level(order.price)
self.on_level_change_update_next_order_info( self.on_level_change_update_next_order_info(
level=order.price, level=order.price,
@ -1013,8 +1018,13 @@ async def process_trade_msg(
) -> tuple[Dialog, Status]: ) -> tuple[Dialog, Status]:
fmsg = pformat(msg) # TODO: obvi once we're parsing to native struct instances we can
log.debug(f'Received order msg:\n{fmsg}') # drop the `pformat()` call Bo
fmtmsg: Struct | dict = msg
if not isinstance(msg, Struct):
fmtmsg: str = pformat(msg)
log.debug(f'Received order msg:\n{fmtmsg}')
name = msg['name'] name = msg['name']
if name in ( if name in (
@ -1030,7 +1040,7 @@ async def process_trade_msg(
): ):
log.info( log.info(
f'Loading position for `{fqme}`:\n' f'Loading position for `{fqme}`:\n'
f'{fmsg}' f'{fmtmsg}'
) )
tracker = mode.trackers[msg['account']] tracker = mode.trackers[msg['account']]
tracker.live_pp.update_from_msg(msg) tracker.live_pp.update_from_msg(msg)
@ -1072,7 +1082,7 @@ async def process_trade_msg(
elif order.action != 'cancel': elif order.action != 'cancel':
log.warning( log.warning(
f'received msg for untracked dialog:\n{fmsg}' f'received msg for untracked dialog:\n{fmtmsg}'
) )
assert msg.resp in ('open', 'dark_open'), f'Unknown msg: {msg}' assert msg.resp in ('open', 'dark_open'), f'Unknown msg: {msg}'
@ -1139,7 +1149,7 @@ async def process_trade_msg(
req={'exec_mode': 'dark'}, req={'exec_mode': 'dark'},
): ):
# TODO: UX for a "pending" clear/live order # TODO: UX for a "pending" clear/live order
log.info(f'Dark order triggered for {fmsg}') log.info(f'Dark order triggered for {fmtmsg}')
case Status( case Status(
resp='triggered', resp='triggered',

104
piker/ui/qt.py 100644
View File

@ -0,0 +1,104 @@
# piker: trading gear for hackers
# Copyright (C) Tyler Goodlet (in stewardship for pikers)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
'''
Qt UI framework version shimming.
Allow importing sub-pkgs from this module instead of worrying about
major-version specifics, enum moves or component renames.
Code in `piker.ui.*` should always explicitly import directly from
this module like `from piker.ui.qt import (..`
'''
from enum import EnumType
from PyQt6 import (
QtCore,
QtGui,
QtWidgets,
)
from PyQt6.QtCore import (
Qt,
QCoreApplication,
QLineF,
QRectF,
# NOTE: for enums use the `.Type` subattr-space
QEvent,
QPointF,
QSize,
QModelIndex,
QItemSelectionModel,
pyqtBoundSignal,
pyqtRemoveInputHook,
)
align_flag: EnumType = Qt.AlignmentFlag
txt_flag: EnumType = Qt.TextFlag
keys: EnumType = QEvent.Type
scrollbar_policy: EnumType = Qt.ScrollBarPolicy
# ^-NOTE-^: handy snippet to discover enums:
# import enum
# [attr for attr_name in dir(QFrame)
# if (attr := getattr(QFrame, attr_name))
# and isinstance(attr, enum.EnumType)]
from PyQt6.QtGui import (
QPainter,
QPainterPath,
QIcon,
QPixmap,
QColor,
QTransform,
QStandardItem,
QStandardItemModel,
QWheelEvent,
QScreen,
QCloseEvent,
)
from PyQt6.QtWidgets import (
QMainWindow,
QApplication,
QLabel,
QStatusBar,
QLineEdit,
QHBoxLayout,
QVBoxLayout,
QFormLayout,
QProgressBar,
QSizePolicy,
QStyledItemDelegate,
QStyleOptionViewItem,
QComboBox,
QWidget,
QFrame,
QSplitter,
QTreeView,
QStyle,
QGraphicsItem,
QGraphicsPathItem,
# QGraphicsView,
QStyleOptionGraphicsItem,
QGraphicsScene,
QGraphicsSceneMouseEvent,
QGraphicsProxyWidget,
)
gs_keys: EnumType = QGraphicsSceneMouseEvent.Type
size_policy: EnumType = QtWidgets.QSizePolicy.Policy
px_cache_mode: EnumType = QGraphicsItem.CacheMode
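
With the shim in place, downstream UI code stays Qt-version agnostic; a minimal sketch of the intended import pattern (the widget usage is illustrative only and needs a GUI session):

from piker.ui.qt import (
    QApplication,
    QLabel,
    Qt,
    align_flag,
    size_policy,
)

app = QApplication([])
label = QLabel('**piker**')
label.setTextFormat(Qt.TextFormat.MarkdownText)
label.setAlignment(
    align_flag.AlignVCenter
    | align_flag.AlignLeft
)
label.setSizePolicy(
    size_policy.Expanding,
    size_policy.Fixed,
)
label.show()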

View File

@ -15,128 +15,119 @@
# You should have received a copy of the GNU Affero General Public License # You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>. # along with this program. If not, see <https://www.gnu.org/licenses/>.
[build-system] [build-system]
requires = ["poetry-core"] requires = ["hatchling"]
build-backend = "poetry.core.masonry.api" build-backend = "hatchling.build"
# ------ - ------ [project]
[tool.poetry]
name = "piker" name = "piker"
version = "0.1.0.alpha0.dev0" version = "0.1.0a0dev0"
description = "trading gear for hackers" description = "trading gear for hackers"
authors = ["Tyler Goodlet <jgbt@protonmail.com>"] authors = [{ name = "Tyler Goodlet", email = "goodboy_foss@protonmail.com" }]
license = "AGPLv3" requires-python = ">=3.12, <3.13"
license = "AGPL-3.0-or-later"
readme = "README.rst" readme = "README.rst"
keywords = [
"async",
"trading",
"finance",
"quant",
"charting",
]
classifiers = [
"Development Status :: 3 - Alpha",
"License :: OSI Approved :: GNU Affero General Public License v3 or later (AGPLv3+)",
"Operating System :: POSIX :: Linux",
"Programming Language :: Python :: Implementation :: CPython",
"Programming Language :: Python :: 3 :: Only",
"Programming Language :: Python :: 3.11",
"Programming Language :: Python :: 3.12",
"Intended Audience :: Financial and Insurance Industry",
"Intended Audience :: Science/Research",
"Intended Audience :: Developers",
"Intended Audience :: Education",
]
dependencies = [
"async-generator >=1.10, <2.0.0",
"attrs >=23.1.0, <24.0.0",
"bidict >=0.22.1, <0.23.0",
"colorama >=0.4.6, <0.5.0",
"colorlog >=6.7.0, <7.0.0",
"ib-insync >=0.9.86, <0.10.0",
"numba >=0.59.0, <0.60.0",
"numpy >=1.25, <2.0",
"polars >=0.18.13, <0.19.0",
"pygments >=2.16.1, <3.0.0",
"rich >=13.5.2, <14.0.0",
"tomli >=2.0.1, <3.0.0",
"tomli-w >=1.0.0, <2.0.0",
"trio-util >=0.7.0, <0.8.0",
"trio-websocket >=0.10.3, <0.11.0",
"typer >=0.9.0, <1.0.0",
"rapidfuzz >=3.5.2, <4.0.0",
"pdbp >=1.5.0, <2.0.0",
"trio >=0.24, <0.25",
"pendulum >=3.0.0, <4.0.0",
"httpx >=0.27.0, <0.28.0",
"cryptofeed >=2.4.0, <3.0.0",
"pyarrow >=17.0.0, <18.0.0",
"websockets ==12.0",
"msgspec",
"tractor",
"asyncvnc",
"tomlkit",
]
# TODO: add meta-data from setup.py [project.optional-dependencies]
# keywords=[ uis = [
# "async", # https://docs.astral.sh/uv/concepts/projects/dependencies/#optional-dependencies
# "trading", # TODO: make sure the levenshtein shit compiles on nix..
# "finance", # rapidfuzz = {extras = ["speedup"], version = "^0.18.0"}
# "quant", "rapidfuzz >=3.2.0, <4.0.0",
# "charting", "qdarkstyle >=3.0.2, <4.0.0",
# ], "pyqt6 >=6.7.0, <7.0.0",
# classifiers=[ "pyqtgraph",
# 'Development Status :: 3 - Alpha',
# 'License :: OSI Approved :: ',
# 'Operating System :: POSIX :: Linux',
# "Programming Language :: Python :: Implementation :: CPython",
# "Programming Language :: Python :: 3 :: Only",
# "Programming Language :: Python :: 3.10",
# "Programming Language :: Python :: 3.11",
# 'Intended Audience :: Financial and Insurance Industry',
# 'Intended Audience :: Science/Research',
# 'Intended Audience :: Developers',
# 'Intended Audience :: Education',
# ],
# ------ - ------ # for consideration,
# - 'visidata'
[tool.poetry.dependencies] # TODO: add an `--only daemon` group for running non-ui / pikerd
asks = "^3.0.0" # service tree in distributed mode B)
async-generator = "^1.10" # https://docs.astral.sh/uv/concepts/projects/dependencies/#optional-dependencies
attrs = "^23.1.0" ]
bidict = "^0.22.1"
colorama = "^0.4.6"
colorlog = "^6.7.0"
cython = "^3.0.0"
greenback = "^1.1.1"
ib-insync = "^0.9.86"
msgspec = "^0.18.0"
numba = "^0.57.1"
numpy = "1.24"
pendulum = "^2.1.2"
polars = "^0.18.13"
pygments = "^2.16.1"
python = "^3.10"
rich = "^13.5.2"
# setuptools = "^68.0.0"
tomli = "^2.0.1"
tomli-w = "^1.0.0"
trio = "^0.22.2"
trio-util = "^0.7.0"
trio-websocket = "^0.10.3"
typer = "^0.9.0"
[dependency-groups]
[tool.poetry.dependencies.asyncvnc] # TODO: a toolset that makes debugging a `pikerd` service (tree) easy
git = 'https://github.com/pikers/asyncvnc.git' # to hack on directly using more or less the local env:
branch = 'main'
[tool.poetry.dependencies.tomlkit]
git = 'https://github.com/pikers/tomlkit.git'
branch = 'piker_pin'
develop = true
# path = "../tomlkit/"
[tool.poetry.dependencies.tractor]
git = 'https://github.com/goodboy/tractor.git'
branch = 'asyncio_debugger_support'
# branch = 'piker_pin'
develop = true
# path = '../tractor/'
# ------ - ------
[tool.poetry.group.uis]
optional = true
[tool.poetry.group.uis.dependencies]
# https://python-poetry.org/docs/managing-dependencies/#dependency-groups
# TODO: make sure the levenshtein shit compiles on nix..
# rapidfuzz = {extras = ["speedup"], version = "^0.18.0"}
rapidfuzz = "^3.2.0"
qdarkstyle = ">=3.0.2"
pyqt5 = "^5.15.9"
pyqtgraph = { git = 'https://github.com/pikers/pyqtgraph.git' }
pyqt6 = "^6.5.2"
# ------ - ------
[tool.poetry.group.dev]
optional = true
[tool.poetry.group.dev.dependencies]
# testing / CI
pytest = "^6.0.0"
elasticsearch = "^8.9.0"
# console enhancements and eventually remote debugging
# extras/helpers.
# TODO: add a toolset that makes debugging a `pikerd` service
# (tree) easy to hack on directly using more or less the local env:
# - xonsh + xxh # - xonsh + xxh
# - rsyscall + pdbp # - rsyscall + pdbp
# - actor runtime control console like BEAM/OTP # - actor runtime control console like BEAM/OTP
xonsh = "^0.14.0" # XXX: explicit env install for shell use w nix #
prompt-toolkit = "^3.0.39" # console enhancements and eventually remote debugging extras/helpers.
# use `uv --dev` to enable
dev = [
"pytest >=6.0.0, <7.0.0",
"elasticsearch >=8.9.0, <9.0.0",
"xonsh >=0.14.2, <0.15.0",
"prompt-toolkit ==3.0.40",
"cython >=3.0.0, <4.0.0",
"greenback >=1.1.1, <2.0.0",
"ruff>=0.9.6",
]
# ------ - ------ [project.scripts]
piker = "piker.cli:cli"
pikerd = "piker.cli:pikerd"
ledger = "piker.accounting.cli:ledger"
# TODO: add an `--only daemon` group for running non-ui / pikerd [tool.hatch.build.targets.sdist]
# service tree in distributed mode B) include = ["piker"]
# https://python-poetry.org/docs/managing-dependencies/#installing-group-dependencies
# [tool.poetry.group.daemon.dependencies]
[tool.poetry.scripts] [tool.hatch.build.targets.wheel]
piker = 'piker.cli:cli' include = ["piker"]
pikerd = 'piker.cli:pikerd'
ledger = 'piker.accounting.cli:ledger' [tool.uv.sources]
pyqtgraph = { git = "https://github.com/pikers/pyqtgraph.git" }
asyncvnc = { git = "https://github.com/pikers/asyncvnc.git", branch = "main" }
tomlkit = { git = "https://github.com/pikers/tomlkit.git", branch ="piker_pin" }
msgspec = { git = "https://github.com/jcrist/msgspec.git" }
tractor = { path = "../tractor", editable = true }

93
ruff.toml 100644
View File

@ -0,0 +1,93 @@
# from default `ruff.toml` @
# https://docs.astral.sh/ruff/configuration/
# Exclude a variety of commonly ignored directories.
exclude = [
".bzr",
".direnv",
".eggs",
".git",
".git-rewrite",
".hg",
".ipynb_checkpoints",
".mypy_cache",
".nox",
".pants.d",
".pyenv",
".pytest_cache",
".pytype",
".ruff_cache",
".svn",
".tox",
".venv",
".vscode",
"__pypackages__",
"_build",
"buck-out",
"build",
"dist",
"node_modules",
"site-packages",
"venv",
]
# Same as Black.
line-length = 88
indent-width = 4
# Assume Python 3.12
target-version = "py312"
# ------ - ------
# TODO, stop warnings around `anext()` builtin use?
# tool.ruff.target-version = "py310"
[lint]
# Enable Pyflakes (`F`) and a subset of the pycodestyle (`E`) codes by default.
# Unlike Flake8, Ruff doesn't enable pycodestyle warnings (`W`) or
# McCabe complexity (`C901`) by default.
select = ["E4", "E7", "E9", "F"]
ignore = []
ignore-init-module-imports = false
[lint.per-file-ignores]
"piker/ui/qt.py" = [
"E402",
'F401', # unused imports (without __all__ or blah as blah)
# "F841", # unused variable rules
]
# Allow fix for all enabled rules (when `--fix`) is provided.
fixable = ["ALL"]
unfixable = []
# Allow unused variables when underscore-prefixed.
dummy-variable-rgx = "^(_+|(_+[a-zA-Z0-9_]*[a-zA-Z0-9]+?))$"
[format]
# Use single quotes in `ruff format`.
quote-style = "single"
# Like Black, indent with spaces, rather than tabs.
indent-style = "space"
# Like Black, respect magic trailing commas.
skip-magic-trailing-comma = false
# Like Black, automatically detect the appropriate line ending.
line-ending = "auto"
# Enable auto-formatting of code examples in docstrings. Markdown,
# reStructuredText code/literal blocks and doctests are all supported.
#
# This is currently disabled by default, but it is planned for this
# to be opt-out in the future.
docstring-code-format = false
# Set the line length limit used when formatting code snippets in
# docstrings.
#
# This only has an effect when the `docstring-code-format` setting is
# enabled.
docstring-code-line-length = "dynamic"
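
The `piker/ui/qt.py` carve-out in `[lint.per-file-ignores]` above exists because the shim interleaves module-level alias assignments with its imports and re-exports everything; a minimal sketch of the pattern being exempted (rule codes per the config, file shape per the shim):

from PyQt6.QtCore import QEvent

keys = QEvent.Type  # alias assignment lands between imports

from PyQt6.QtGui import QIcon  # E402: import not at top of file
# ^- also F401: a pure re-export looks like an unused import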

1500
uv.lock 100644

File diff suppressed because it is too large.