Compare commits


26 Commits

Author SHA1 Message Date
Tyler Goodlet 793a454463 Adjust feed status fields/display-pane to new actor-ID
That is, use the new `tractor.msg.types.Aid` struct to pull the
`brokerd` info from the `tractor.Channel.aid: Aid` attr, as well as more
generally handle the new `Channel.raddr.proto_key: str` and no longer
assume a TCP IPC transport; this is per the recent `tractor.ipc`
subsys which adds multi-IPC-transports!

Downstream tweaks to match,
- use an "opt-in" field set to display in the `brokerd` info pane in
  `.ui._feedstatus.mk_feed_label()`.
 |_ also add some todos and drop some seemingly unneeded form sizing
    calcs?
- tweak `.ui._label` to allow not using markdown, though ended up not
  doing that since it looked too plain..
2026-01-02 17:36:13 -05:00
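
A minimal sketch of the resulting status population (a sketch only; the `Aid`/address attrs and status keys are taken from the `.data.feed.open_feed()` hunk further below):

```python
from tractor import Channel  # `Channel.aid: Aid` per the commit above

def mk_feed_status(chan: Channel, tick_throttle: float|None) -> dict:
    '''
    Sketch: build the UI-facing feed-status `dict` from the new
    actor-ID (`Aid`) struct and multi-transport address attrs.

    '''
    aid = chan.aid  # `tractor.msg.types.Aid`
    return {
        'actor_id': aid,
        'actor_short_id': f'{aid.name}@{aid.pid}',
        'ipc': chan.raddr.proto_key,  # 'tcp', 'uds', .. no TCP assumption
        'ipc_addr': chan.raddr,
        'hist_shm': 'NA',
        'rt_shm': 'NA',
        'throttle_hz': tick_throttle,
    }
```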
Tyler Goodlet b8b4f1b80f Adjust to `trio`'s strict eg nurseries throughout!
Using `tractor.trionics.collapse_eg()` as needed to avoid, at the least,
crash-worthy (in debug-mode REPL-ing terms) nested cancellation egs that
exhibit on SIGINT/ctrl-c of each "app" (chart & daemon).

Also a bit of renaming of all `trio.Nursery`s to `tn`, the new "task
nursery" shorthand-var-name being used in all our other `tractor`
related projects.
2026-01-02 17:36:13 -05:00
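
The pattern applied throughout (taken from the hunks below) is simply to stack the eg-"collapser" over each nursery:

```python
import trio
import tractor

async def main():
    # wrap each nursery with `collapse_eg()` so a lone inner
    # cancellation/error isn't re-raised as a nested ExceptionGroup
    # under `trio`'s strict-eg nurseries.
    async with (
        tractor.trionics.collapse_eg(),
        trio.open_nursery() as tn,  # the new "task nursery" shorthand
    ):
        tn.start_soon(trio.sleep, 0)

trio.run(main)
```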
Tyler Goodlet 1cf041d8e6 Start a manual `tags` file for internal refs 2026-01-02 17:36:13 -05:00
Tyler Goodlet 831b6cfb21 Add a couple new grays to the palette 2026-01-02 17:25:29 -05:00
Tyler Goodlet e5f7e8de9d Bump to (latest) `polars`, the `0.20.6x` series B)
Since I was trying out the neat lookin `polars-fuzzy-match` (also added
for now as a core dep here) which requires the new plugin sys, plus it's
about time we synced with upstream!

Adjust some column syntax to the new `.name` sub-field-space and the
`uv` lock-file to match.

Other,
- add back `trio-typing` bc i guess something else needs it (debug
  tooling stuff in new `tractor`?)
- flip back to the `tractor` pre-main pin since the new `main`-branch
  requires new `trio` stuff we haven't ported yet..
2026-01-02 17:24:17 -05:00
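
For reference, column-rename expressions now live under the `.name` namespace; a small example mirroring the `with_dts()` hunk below:

```python
import polars as pl

df = pl.DataFrame({'time': [1, 2, 4]})

# old: pl.col('time').shift(1).suffix('_prev')
# new `.name` sub-field-space in the `0.20.x` series:
df = df.with_columns([
    pl.col('time').shift(1).name.suffix('_prev'),
    pl.col('time').diff().alias('s_diff'),
])
print(df)
```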
Tyler Goodlet 871bb2620e Port to newer `tractor.get_registry()` 2026-01-02 17:00:23 -05:00
Tyler Goodlet de980a69e0 Update legacy type to `tractor.MsgStream` 2026-01-02 17:00:23 -05:00
Tyler Goodlet ab9f01caf2 Fix type-check assertion in ems test to use `is` 2026-01-02 17:00:23 -05:00
Tyler Goodlet d85632ba9b Cast to `float` as needed from order-mode and ems
Since we're not quite yet using automatic typed msging from
`tractor`/`msgspec` (i.e. still manually decoding order ctl msgs from
built-in types..`dict`s still not `msgspec.Struct`) this adds the
appropriate typecasting ops to ensure the required precision is attained
prior to processing and/or submission to a brokerd backend service.

For the `.clearing._ems`,
- flip all `trigger_price` previously presumed to be `float` to just
  the field-identical `price: Decimal` and ensure we cast to `float`
  for any `trigger_price` usage, like before passing to `mk_check()`.

For `.ui.order_mode.OrderMode`,
- add a new `.curr_mkt: MktPair` convenience property to get the
  chart-active value.
- ensure we always use the `.curr_mkt.quantize() -> Decimal` before
  setting any IPC-msg's `.price` field!
- always cast `float(Order.price)` before use in setting line-levels.
- don't bother setting `Order.symbol` to a (now fully removed) `Symbol`
  instance since it's not really required-for-use anywhere; leaving it
  a `str` (per the type-annot) is fine for now?
2026-01-02 17:00:23 -05:00
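
In rough form, the casting boundary described above (a sketch, assuming the `MktPair.quantize()` signature shown in the order-mode hunks below):

```python
from decimal import Decimal

def staged_price(mkt, level: float) -> Decimal:
    # quantize to the mkt's price-step precision *before* the value
    # lands in any IPC order msg's `.price` field..
    return mkt.quantize(size=level, quantity_type='price')

def line_level(order) -> float:
    # ..while the level-line graphics (and `mk_check()` in the EMS)
    # still take a plain `float`.
    return float(order.price)
```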
Tyler Goodlet 8294ca6487 Mk `Brokerd[Order].price` avoid `float`-errs
By re-typing to a `.price: Decimal` field on both legs of the EMS.

It seems we must do it ourselves since,
- these msg's (fields) are relayed through the clearing engine to each
  `brokerd` backend and,
- bc many (if not all) of those backends' `.broker`-clients (and their
  encapsulated "brokerage services") **are not** doing any
  precision-truncation themselves.

So, for now, instead we opt to expect rounding at the source. This means
we will explicitly require casting to/from `float` at the line-graphics
interface to the order-clearing-engine (as implemented throughout
`.ui.order_mode.OrderMode`); and this is coming shortly.
2026-01-02 17:00:23 -05:00
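
A trimmed sketch of the re-typed msg legs (full field sets in the clearing-messages hunk below):

```python
from decimal import Decimal
from msgspec import Struct

class Order(Struct):
    symbol: str
    account: str
    price: Decimal  # was `float`; rounding now expected at the source
    size: float     # -ve is "sell", +ve is "buy"

class BrokerdOrder(Struct):
    oid: str
    time_ns: int
    symbol: str     # fqme
    price: Decimal
    size: float
```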
Tyler Goodlet 87385a4e2d ib: never relay "Warning:" errors to EMS..
You'd think they could be bothered to make either a "log" or "warning"
msg type instead of a `type='error'`.. but alas, this attempts to detect
all such "warning"-errors and never proxy them to the clearing engine,
thus avoiding the cancellation of any associated (by `reqid`)
pre-existing orders (control dialogs).

Also update all surrounding log messages to a more multiline style.
2026-01-02 16:59:09 -05:00
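
The detection itself is just a substring check on the error `reason` before any relay, condensed here from the `deliver_trade_events()` hunk below:

```python
from pprint import pformat

def should_relay_to_ems(err: dict, log) -> bool:
    # IB tags these as `type='error'` but the reason text carries
    # 'Warning:' (eg. code 2109, 'Outside Regular Trading Hours').
    code: int = err['error_code']
    reason: str = err['reason']
    if 'Warning:' in reason:
        log.warning(
            f'Order-API-warning: {code!r}\n'
            f'\n'
            f'{pformat(err)}\n'
        )
        return False  # never proxy to the clearing engine
    return True
```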
Tyler Goodlet b3c5478017 ib: jig `.data_reset_hack()` with vnc-client failover
Since apparently porting to the new docker container enforces using
a vnc password and `asyncvnc` seems to have a bug/mis-config whenever
i've tried a pw over a wg tunnel..?

Soo, this tries out the old `i3ipc`-win-focus + `xdo` click hack when
the above fails.

Deats,
- add a mod-level `try_xdo_manual()` to wrap calling
  `i3ipc_xdotool_manual_click_hack()` with an oserr handler, ensure we
  don't bother trying if `i3ipc` import fails beforehand tho.
- call ^ from both the orig case block and the failover from the
  vnc-client case.
- factor the `no_setup_msg: str` out to mod level and expect it to be
  `.format()`-ed.
- refresh todo around `asyncvnc` pw ish..
- add a new `i3ipc_fin_wins_titled()` window-title scanner which
  matches against the input `titles` and delivers any matches alongside
  the originally focused win at call time.
- tweak `i3ipc_xdotool_manual_click_hack()` to call ^ and remove prior
  unfactored window scanning logic.
2026-01-02 16:59:09 -05:00
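
Condensed sketch of the new title scan the failover leans on (the full version, including the `xdo` click loop, is in the hunk below):

```python
import i3ipc

def find_wins_titled(titles: list[str]) -> tuple:
    # scan the i3 window tree for anything matching `titles` and also
    # return the currently focussed win so it can be restored after
    # the click hack runs.
    tree = i3ipc.Connection().get_tree()
    focussed = tree.find_focused()
    matches = [
        (name, results[0])
        for name in titles
        if (results := tree.find_titled(name))
    ]
    return focussed, matches

focussed, matches = find_wins_titled(['IB Gateway', 'TigerVNC'])
```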
Tyler Goodlet 6c9a78c5a0 Add fix for binance API 3.1 rollout..
See https://developers.binance.com/docs/binance-spot-api-docs#2025-08-26
2026-01-02 16:59:09 -05:00
Tyler Goodlet da223f7a55 kraken: add crash-handling around `Pair()` init
Since it can otherwise be difficult to debug due to nursery cancellation
(we need that taskman yo!).
2026-01-02 16:59:09 -05:00
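
Roughly the guard that was added (see the hunk below): pop `tractor`'s crash handler (a REPL in debug-mode) right at the struct init so schema mismatches aren't buried in nursery-cancellation noise:

```python
import tractor

def load_pairs(resp: dict, pair_type) -> dict:
    table: dict = {}
    for xkey, data in resp['result'].items():
        # enter the crash-handler ctx per-pair so a bad/new field
        # surfaces immediately with the offending payload in scope.
        with tractor.devx.maybe_open_crash_handler():
            table[xkey] = pair_type(xname=xkey, **data)
    return table
```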
Tyler Goodlet 49fe0a3398 kraken: `Pair.costmin` is now optional?
Some pairs don't seem to define it, but it's not listed as deprecated on
the official API page (the new one is now linked in the type def's doc string).
2026-01-02 16:59:09 -05:00
Tyler Goodlet 29fc3b8a8b binance: add new `permissionSets` to base `Pair` 2026-01-02 16:59:09 -05:00
Tyler Goodlet 1bfe777637 Update `binance` spot pairs with `amendAllowed`
As per API updates,
https://developers.binance.com/docs/binance-spot-api-docs
https://developers.binance.com/docs/binance-spot-api-docs/faqs/order_amend_keep_priority

I also slightly tweaked the field mismatch exception note to include the
`repr(pair_type)` so the dev can know which pair types should be
changed.
2026-01-02 16:59:09 -05:00
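
Trimmed sketch of where the new schema fields land (full defs in the binance pair-struct hunk below):

```python
from msgspec import Struct

class Pair(Struct, frozen=True, kw_only=True):
    symbol: str
    permissionSets: list[list[str]]
    # https://developers.binance.com/docs/binance-spot-api-docs#2025-08-26
    pegInstructionsAllowed: bool|None = None

class SpotPair(Pair, frozen=True):
    # https://developers.binance.com/docs/binance-spot-api-docs/faqs/order_amend_keep_priority
    amendAllowed: bool
```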
Tyler Goodlet c694d915f1 `.kraken`: add masked pauses for order req debug
Such that the next time i inevitably must debug some order-request
error status or precision discrepancy, i have the mkt-symbol branch
ready to go. Also, switch to `'action': 'buy'|'sell' as action,` style
`case` matching instead of the post-`if` predicate style.
2026-01-02 16:59:09 -05:00
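
The switch-style change in isolation (a minimal, standalone sketch; the real block matches full order-msg-shaped dicts):

```python
def classify(msg: dict) -> str:
    match msg:
        # new: literal-pattern on the action value directly..
        case {'account': 'kraken.spot', 'action': 'buy' | 'sell'}:
            return 'submit'
        # ..vs the old post-`if` predicate style:
        # case {'account': 'kraken.spot', 'action': action} if action in {'buy', 'sell'}:
        case {'action': 'cancel'}:
            return 'cancel'
        case _:
            return 'unknown'

assert classify({'account': 'kraken.spot', 'action': 'sell'}) == 'submit'
```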
Tyler Goodlet c120cb51a4 `.questrade`: link in ws-API issue! 2026-01-02 16:59:09 -05:00
Tyler Goodlet 7c20231f16 `.kraken.broker`: need to `await verify_balances()` .. 2026-01-02 16:59:09 -05:00
Tyler Goodlet d809c79788 `.brokers.ib.feed`: better `tractor.to_asyncio` typing and var naming throughout! 2026-01-02 16:59:09 -05:00
Tyler Goodlet 9f2f8a1664 `.brokers.cli`: module type and todo for `--pdb` flag to NOT src from sub-cmd 2026-01-02 16:59:09 -05:00
Tyler Goodlet 9f141635d1 Type loaded backend modules 2026-01-02 16:59:09 -05:00
Tyler Goodlet 0604ca7c82 Bump various `.brokers.core` doc string content/style 2026-01-02 16:59:09 -05:00
Tyler Goodlet 82c2256271 Add missing f-str prefix to log line 2026-01-02 16:55:15 -05:00
Tyler Goodlet a743fa28b5 Teensie `piker.data` styling tweaks
- use more compact optional value style with `|`-union
- fix `.flows` typing-only import since we need `MktPair` to be
  immediately defined for use on a `msgspec.Struct` field.
- more "tree-like" warning msg in `.validate()` reporting.
2026-01-02 16:55:15 -05:00
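
The first two bullets amount to roughly this import/annotation shape (paths per the `.data.flows` hunk below):

```python
from __future__ import annotations
from typing import TYPE_CHECKING

# a name used as a `msgspec.Struct` field annotation must resolve at
# runtime, so `MktPair` gets a real import (as it now does in `.flows`)..
from piker.accounting import MktPair  # noqa

# ..while annotation-only names stay behind the guard:
if TYPE_CHECKING:
    from piker.data.feed import Feed

# and the compact `|`-union style replaces `Optional[...]`:
feed: Feed|None = None
```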
36 changed files with 531 additions and 259 deletions

View File

@ -98,13 +98,14 @@ async def open_cached_client(
If one has not been setup do it and cache it. If one has not been setup do it and cache it.
''' '''
brokermod = get_brokermod(brokername) brokermod: ModuleType = get_brokermod(brokername)
# TODO: make abstract or `typing.Protocol`
# client: Client
async with maybe_open_context( async with maybe_open_context(
acm_func=brokermod.get_client, acm_func=brokermod.get_client,
kwargs=kwargs, kwargs=kwargs,
) as (cache_hit, client): ) as (cache_hit, client):
if cache_hit: if cache_hit:
log.runtime(f'Reusing existing {client}') log.runtime(f'Reusing existing {client}')

View File

@ -96,7 +96,10 @@ async def _setup_persistent_brokerd(
# - `open_symbol_search()` # - `open_symbol_search()`
# NOTE: see ep invocation details inside `.data.feed`. # NOTE: see ep invocation details inside `.data.feed`.
try: try:
async with trio.open_nursery() as service_nursery: async with (
tractor.trionics.collapse_eg(),
trio.open_nursery() as service_nursery
):
bus: _FeedsBus = feed.get_feed_bus( bus: _FeedsBus = feed.get_feed_bus(
brokername, brokername,
service_nursery, service_nursery,

View File

@ -374,9 +374,14 @@ class Client:
pair: Pair = pair_type(**item) pair: Pair = pair_type(**item)
except Exception as e: except Exception as e:
e.add_note( e.add_note(
"\nDon't panic, prolly stupid binance changed their symbology schema again..\n" f'\n'
'Check out their API docs here:\n\n' f'New or removed field we need to codify!\n'
'https://binance-docs.github.io/apidocs/spot/en/#exchange-information' f'pair-type: {pair_type!r}\n'
f'\n'
f"Don't panic, prolly stupid binance changed their symbology schema again..\n"
f'Check out their API docs here:\n'
f'\n'
f'https://binance-docs.github.io/apidocs/spot/en/#exchange-information\n'
) )
raise raise
pair_table[pair.symbol.upper()] = pair pair_table[pair.symbol.upper()] = pair

View File

@ -97,6 +97,13 @@ class Pair(Struct, frozen=True, kw_only=True):
baseAsset: str baseAsset: str
baseAssetPrecision: int baseAssetPrecision: int
permissionSets: list[list[str]]
# https://developers.binance.com/docs/binance-spot-api-docs#2025-08-26
# will become non-optional 2025-08-28?
# https://developers.binance.com/docs/binance-spot-api-docs#future-changes
pegInstructionsAllowed: bool|None = None
filters: dict[ filters: dict[
str, str,
str | int | float, str | int | float,
@ -142,7 +149,11 @@ class SpotPair(Pair, frozen=True):
defaultSelfTradePreventionMode: str defaultSelfTradePreventionMode: str
allowedSelfTradePreventionModes: list[str] allowedSelfTradePreventionModes: list[str]
permissions: list[str] permissions: list[str]
permissionSets: list[list[str]]
# can the paint botz creat liq gaps even easier on this asset?
# Bp
# https://developers.binance.com/docs/binance-spot-api-docs/faqs/order_amend_keep_priority
amendAllowed: bool
# NOTE: see `.data._symcache.SymbologyCache.load()` for why # NOTE: see `.data._symcache.SymbologyCache.load()` for why
ns_path: str = 'piker.brokers.binance:SpotPair' ns_path: str = 'piker.brokers.binance:SpotPair'

View File

@ -471,11 +471,15 @@ def search(
''' '''
# global opts # global opts
brokermods = list(config['brokermods'].values()) brokermods: list[ModuleType] = list(config['brokermods'].values())
# TODO: this is coming from the `search --pdb` NOT from
# the `piker --pdb` XD ..
# -[ ] pull from the parent click ctx's values..dumdum
# assert pdb
# define tractor entrypoint # define tractor entrypoint
async def main(func): async def main(func):
async with maybe_open_pikerd( async with maybe_open_pikerd(
loglevel=config['loglevel'], loglevel=config['loglevel'],
debug_mode=pdb, debug_mode=pdb,

View File

@ -22,7 +22,9 @@ routines should be primitive data types where possible.
""" """
import inspect import inspect
from types import ModuleType from types import ModuleType
from typing import List, Dict, Any, Optional from typing import (
Any,
)
import trio import trio
@ -34,8 +36,10 @@ from ..accounting import MktPair
async def api(brokername: str, methname: str, **kwargs) -> dict: async def api(brokername: str, methname: str, **kwargs) -> dict:
"""Make (proxy through) a broker API call by name and return its result. '''
""" Make (proxy through) a broker API call by name and return its result.
'''
brokermod = get_brokermod(brokername) brokermod = get_brokermod(brokername)
async with brokermod.get_client() as client: async with brokermod.get_client() as client:
meth = getattr(client, methname, None) meth = getattr(client, methname, None)
@ -62,10 +66,14 @@ async def api(brokername: str, methname: str, **kwargs) -> dict:
async def stocks_quote( async def stocks_quote(
brokermod: ModuleType, brokermod: ModuleType,
tickers: List[str] tickers: list[str]
) -> Dict[str, Dict[str, Any]]:
"""Return quotes dict for ``tickers``. ) -> dict[str, dict[str, Any]]:
""" '''
Return a `dict` of snapshot quotes for the provided input
`tickers`: a `list` of fqmes.
'''
async with brokermod.get_client() as client: async with brokermod.get_client() as client:
return await client.quote(tickers) return await client.quote(tickers)
@ -74,13 +82,15 @@ async def stocks_quote(
async def option_chain( async def option_chain(
brokermod: ModuleType, brokermod: ModuleType,
symbol: str, symbol: str,
date: Optional[str] = None, date: str|None = None,
) -> Dict[str, Dict[str, Dict[str, Any]]]: ) -> dict[str, dict[str, dict[str, Any]]]:
"""Return option chain for ``symbol`` for ``date``. '''
Return option chain for ``symbol`` for ``date``.
By default all expiries are returned. If ``date`` is provided By default all expiries are returned. If ``date`` is provided
then contract quotes for that single expiry are returned. then contract quotes for that single expiry are returned.
"""
'''
async with brokermod.get_client() as client: async with brokermod.get_client() as client:
if date: if date:
id = int((await client.tickers2ids([symbol]))[symbol]) id = int((await client.tickers2ids([symbol]))[symbol])
@ -98,7 +108,7 @@ async def option_chain(
# async def contracts( # async def contracts(
# brokermod: ModuleType, # brokermod: ModuleType,
# symbol: str, # symbol: str,
# ) -> Dict[str, Dict[str, Dict[str, Any]]]: # ) -> dict[str, dict[str, dict[str, Any]]]:
# """Return option contracts (all expiries) for ``symbol``. # """Return option contracts (all expiries) for ``symbol``.
# """ # """
# async with brokermod.get_client() as client: # async with brokermod.get_client() as client:
@ -110,15 +120,24 @@ async def bars(
brokermod: ModuleType, brokermod: ModuleType,
symbol: str, symbol: str,
**kwargs, **kwargs,
) -> Dict[str, Dict[str, Dict[str, Any]]]: ) -> dict[str, dict[str, dict[str, Any]]]:
"""Return option contracts (all expiries) for ``symbol``. '''
""" Return option contracts (all expiries) for ``symbol``.
'''
async with brokermod.get_client() as client: async with brokermod.get_client() as client:
return await client.bars(symbol, **kwargs) return await client.bars(symbol, **kwargs)
async def search_w_brokerd(name: str, pattern: str) -> dict: async def search_w_brokerd(
name: str,
pattern: str,
) -> dict:
# TODO: WHY NOT WORK!?!
# when we `step` through the next block?
# import tractor
# await tractor.pause()
async with open_cached_client(name) as client: async with open_cached_client(name) as client:
# TODO: support multiple asset type concurrent searches. # TODO: support multiple asset type concurrent searches.
@ -130,12 +149,12 @@ async def symbol_search(
pattern: str, pattern: str,
**kwargs, **kwargs,
) -> Dict[str, Dict[str, Dict[str, Any]]]: ) -> dict[str, dict[str, dict[str, Any]]]:
''' '''
Return symbol info from broker. Return symbol info from broker.
''' '''
results = [] results: list[str] = []
async def search_backend( async def search_backend(
brokermod: ModuleType brokermod: ModuleType
@ -143,6 +162,13 @@ async def symbol_search(
brokername: str = mod.name brokername: str = mod.name
# TODO: figure this the FUCK OUT
# -> ok so obvi in the root actor any async task that's
# spawned outside the main tractor-root-actor task needs to
# call this..
# await tractor.devx._debug.maybe_init_greenback()
# tractor.pause_from_sync()
async with maybe_spawn_brokerd( async with maybe_spawn_brokerd(
mod.name, mod.name,
infect_asyncio=getattr( infect_asyncio=getattr(
@ -162,7 +188,6 @@ async def symbol_search(
)) ))
async with trio.open_nursery() as n: async with trio.open_nursery() as n:
for mod in brokermods: for mod in brokermods:
n.start_soon(search_backend, mod.name) n.start_soon(search_backend, mod.name)
@ -172,11 +197,13 @@ async def symbol_search(
async def mkt_info( async def mkt_info(
brokermod: ModuleType, brokermod: ModuleType,
fqme: str, fqme: str,
**kwargs, **kwargs,
) -> MktPair: ) -> MktPair:
''' '''
Return MktPair info from broker including src and dst assets. Return the `piker.accounting.MktPair` info struct from a given
backend broker tradable src/dst asset pair.
''' '''
async with open_cached_client(brokermod.name) as client: async with open_cached_client(brokermod.name) as client:

View File

@ -34,6 +34,7 @@ from piker.brokers._util import get_logger
if TYPE_CHECKING: if TYPE_CHECKING:
from .api import Client from .api import Client
from ib_insync import IB from ib_insync import IB
import i3ipc
log = get_logger('piker.brokers.ib') log = get_logger('piker.brokers.ib')
@ -48,6 +49,37 @@ _reset_tech: Literal[
] = 'vnc' ] = 'vnc'
no_setup_msg:str = (
'No data reset hack test setup for {vnc_sockaddr}!\n'
'See config setup tips @\n'
'https://github.com/pikers/piker/tree/master/piker/brokers/ib'
)
def try_xdo_manual(
vnc_sockaddr: str,
):
'''
Do the "manual" `xdo`-based screen switch + click
combo since apparently the `asyncvnc` client ain't workin..
Note this is only meant as a backup method for Xorg users,
ideally you can use a real vnc client and the `vnc_click_hack()`
impl!
'''
global _reset_tech
try:
i3ipc_xdotool_manual_click_hack()
_reset_tech = 'i3ipc_xdotool'
return True
except OSError:
log.exception(
no_setup_msg.format(vnc_sockaddr)
)
return False
async def data_reset_hack( async def data_reset_hack(
# vnc_host: str, # vnc_host: str,
client: Client, client: Client,
@ -90,15 +122,9 @@ async def data_reset_hack(
vnc_port: int vnc_port: int
vnc_sockaddr: tuple[str] | None = client.conf.get('vnc_addrs') vnc_sockaddr: tuple[str] | None = client.conf.get('vnc_addrs')
no_setup_msg:str = (
f'No data reset hack test setup for {vnc_sockaddr}!\n'
'See config setup tips @\n'
'https://github.com/pikers/piker/tree/master/piker/brokers/ib'
)
if not vnc_sockaddr: if not vnc_sockaddr:
log.warning( log.warning(
no_setup_msg no_setup_msg.format(vnc_sockaddr)
+ +
'REQUIRES A `vnc_addrs: array` ENTRY' 'REQUIRES A `vnc_addrs: array` ENTRY'
) )
@ -119,27 +145,38 @@ async def data_reset_hack(
port=vnc_port, port=vnc_port,
) )
) )
except OSError: except (
if vnc_host != 'localhost': OSError, # no VNC server avail..
log.warning(no_setup_msg) PermissionError, # asyncvnc pw fail..
return False ):
try: try:
import i3ipc # noqa (since a deps dynamic check) import i3ipc # noqa (since a deps dynamic check)
except ModuleNotFoundError: except ModuleNotFoundError:
log.warning(no_setup_msg) log.warning(
no_setup_msg.format(vnc_sockaddr)
)
return False return False
try: if vnc_host not in {
i3ipc_xdotool_manual_click_hack() 'localhost',
_reset_tech = 'i3ipc_xdotool' '127.0.0.1',
return True }:
except OSError: focussed, matches = i3ipc_fin_wins_titled()
log.exception(no_setup_msg) if not matches:
return False log.warning(
no_setup_msg.format(vnc_sockaddr)
)
return False
else:
try_xdo_manual(vnc_sockaddr)
# localhost but no vnc-client or it borked..
else:
try_xdo_manual(vnc_sockaddr)
case 'i3ipc_xdotool': case 'i3ipc_xdotool':
i3ipc_xdotool_manual_click_hack() try_xdo_manual(vnc_sockaddr)
# i3ipc_xdotool_manual_click_hack()
case _ as tech: case _ as tech:
raise RuntimeError(f'{tech} is not supported for reset tech!?') raise RuntimeError(f'{tech} is not supported for reset tech!?')
@ -178,9 +215,9 @@ async def vnc_click_hack(
host, host,
port=port, port=port,
# TODO: doesn't work see: # TODO: doesn't work?
# https://github.com/barneygale/asyncvnc/issues/7 # see, https://github.com/barneygale/asyncvnc/issues/7
# password='ibcansmbz', password='doggy',
) as client: ) as client:
@ -194,70 +231,103 @@ async def vnc_click_hack(
client.keyboard.press('Ctrl', 'Alt', key) # keys are stacked client.keyboard.press('Ctrl', 'Alt', key) # keys are stacked
def i3ipc_fin_wins_titled(
titles: list[str] = [
'Interactive Brokers', # tws running in i3
'IB Gateway', # gw running in i3
# 'IB', # gw running in i3 (newer version?)
# !TODO, remote vnc instance
# -[ ] something in title (or other Con-props) that indicates
# this is explicitly for ibrk sw?
# |_[ ] !can use modden spawn eventually!
'TigerVNC',
# 'vncviewer', # the terminal..
],
) -> tuple[
i3ipc.Con, # orig focussed win
list[tuple[str, i3ipc.Con]], # matching wins by title
]:
'''
Attempt to find a local-DE window titled with an entry in
`titles`.
If found deliver the current focussed window and all matching
`i3ipc.Con`s in a list.
'''
import i3ipc
ipc = i3ipc.Connection()
# TODO: might be worth offering some kinda api for grabbing
# the window id from the pid?
# https://stackoverflow.com/a/2250879
tree = ipc.get_tree()
focussed: i3ipc.Con = tree.find_focused()
matches: list[i3ipc.Con] = []
for name in titles:
results = tree.find_titled(name)
print(f'results for {name}: {results}')
if results:
con = results[0]
matches.append((
name,
con,
))
return (
focussed,
matches,
)
def i3ipc_xdotool_manual_click_hack() -> None: def i3ipc_xdotool_manual_click_hack() -> None:
''' '''
Do the data reset hack but expecting a local X-window using `xdotool`. Do the data reset hack but expecting a local X-window using `xdotool`.
''' '''
import i3ipc focussed, matches = i3ipc_fin_wins_titled()
i3 = i3ipc.Connection() orig_win_id = focussed.window
# TODO: might be worth offering some kinda api for grabbing
# the window id from the pid?
# https://stackoverflow.com/a/2250879
t = i3.get_tree()
orig_win_id = t.find_focused().window
# for tws
win_names: list[str] = [
'Interactive Brokers', # tws running in i3
'IB Gateway', # gw running in i3
# 'IB', # gw running in i3 (newer version?)
]
try: try:
for name in win_names: for name, con in matches:
results = t.find_titled(name) print(f'Resetting data feed for {name}')
print(f'results for {name}: {results}') win_id = str(con.window)
if results: w, h = con.rect.width, con.rect.height
con = results[0]
print(f'Resetting data feed for {name}')
win_id = str(con.window)
w, h = con.rect.width, con.rect.height
# TODO: seems to be a few libs for python but not sure # TODO: seems to be a few libs for python but not sure
# if they support all the sub commands we need, order of # if they support all the sub commands we need, order of
# most recent commit history: # most recent commit history:
# https://github.com/rr-/pyxdotool # https://github.com/rr-/pyxdotool
# https://github.com/ShaneHutter/pyxdotool # https://github.com/ShaneHutter/pyxdotool
# https://github.com/cphyc/pyxdotool # https://github.com/cphyc/pyxdotool
# TODO: only run the reconnect (2nd) kc on a detected # TODO: only run the reconnect (2nd) kc on a detected
# disconnect? # disconnect?
for key_combo, timeout in [ for key_combo, timeout in [
# only required if we need a connection reset. # only required if we need a connection reset.
# ('ctrl+alt+r', 12), # ('ctrl+alt+r', 12),
# data feed reset. # data feed reset.
('ctrl+alt+f', 6) ('ctrl+alt+f', 6)
]: ]:
subprocess.call([ subprocess.call([
'xdotool', 'xdotool',
'windowactivate', '--sync', win_id, 'windowactivate', '--sync', win_id,
# move mouse to bottom left of window (where # move mouse to bottom left of window (where
# there should be nothing to click). # there should be nothing to click).
'mousemove_relative', '--sync', str(w-4), str(h-4), 'mousemove_relative', '--sync', str(w-4), str(h-4),
# NOTE: we may need to stick a `--retry 3` in here.. # NOTE: we may need to stick a `--retry 3` in here..
'click', '--window', win_id, 'click', '--window', win_id,
'--repeat', '3', '1', '--repeat', '3', '1',
# hackzorzes # hackzorzes
'key', key_combo, 'key', key_combo,
], ],
timeout=timeout, timeout=timeout,
) )
# re-activate and focus original window # re-activate and focus original window
subprocess.call([ subprocess.call([

View File

@ -1241,32 +1241,47 @@ async def deliver_trade_events(
# never relay errors for non-broker related issues # never relay errors for non-broker related issues
# https://interactivebrokers.github.io/tws-api/message_codes.html # https://interactivebrokers.github.io/tws-api/message_codes.html
code: int = err['error_code'] code: int = err['error_code']
if code in { reason: str = err['reason']
200, # uhh reqid: str = str(err['reqid'])
# "Warning:" msg codes,
# https://interactivebrokers.github.io/tws-api/message_codes.html#warning_codes
# - 2109: 'Outside Regular Trading Hours'
if 'Warning:' in reason:
log.warning(
f'Order-API-warning: {code!r}\n'
f'reqid: {reqid!r}\n'
f'\n'
f'{pformat(err)}\n'
# ^TODO? should we just print the `reason`
# not the full `err`-dict?
)
continue
# XXX known special (ignore) cases
elif code in {
200, # uhh.. ni idea
# hist pacing / connectivity # hist pacing / connectivity
162, 162,
165, 165,
# WARNING codes:
# https://interactivebrokers.github.io/tws-api/message_codes.html#warning_codes
# Attribute 'Outside Regular Trading Hours' is
# " 'ignored based on the order type and
# destination. PlaceOrder is now ' 'being
# processed.',
2109,
# XXX: lol this isn't even documented.. # XXX: lol this isn't even documented..
# 'No market data during competing live session' # 'No market data during competing live session'
1669, 1669,
}: }:
log.error(
f'Order-API-error which is non-cancel-causing ?!\n'
f'\n'
f'{pformat(err)}\n'
)
continue continue
reqid: str = str(err['reqid'])
reason: str = err['reason']
if err['reqid'] == -1: if err['reqid'] == -1:
log.error(f'TWS external order error:\n{pformat(err)}') log.error(
f'TWS external order error ??\n'
f'{pformat(err)}\n'
)
flow: dict = dict( flow: dict = dict(
flows.get(reqid) flows.get(reqid)

View File

@ -587,7 +587,7 @@ async def get_bars(
data_cs.cancel() data_cs.cancel()
# spawn new data reset task # spawn new data reset task
data_cs, reset_done = await nurse.start( data_cs, reset_done = await tn.start(
partial( partial(
wait_on_data_reset, wait_on_data_reset,
proxy, proxy,
@ -607,11 +607,11 @@ async def get_bars(
# such that simultaneous symbol queries don't try data resettingn # such that simultaneous symbol queries don't try data resettingn
# too fast.. # too fast..
unset_resetter: bool = False unset_resetter: bool = False
async with trio.open_nursery() as nurse: async with trio.open_nursery() as tn:
# start history request that we allow # start history request that we allow
# to run indefinitely until a result is acquired # to run indefinitely until a result is acquired
nurse.start_soon(query) tn.start_soon(query)
# start history reset loop which waits up to the timeout # start history reset loop which waits up to the timeout
# for a result before triggering a data feed reset. # for a result before triggering a data feed reset.
@ -631,7 +631,7 @@ async def get_bars(
unset_resetter: bool = True unset_resetter: bool = True
# spawn new data reset task # spawn new data reset task
data_cs, reset_done = await nurse.start( data_cs, reset_done = await tn.start(
partial( partial(
wait_on_data_reset, wait_on_data_reset,
proxy, proxy,
@ -705,7 +705,9 @@ async def _setup_quote_stream(
# to_trio, from_aio = trio.open_memory_channel(2**8) # type: ignore # to_trio, from_aio = trio.open_memory_channel(2**8) # type: ignore
def teardown(): def teardown():
ticker.updateEvent.disconnect(push) ticker.updateEvent.disconnect(push)
log.error(f"Disconnected stream for `{symbol}`") log.error(
f'Disconnected stream for `{symbol}`'
)
client.ib.cancelMktData(contract) client.ib.cancelMktData(contract)
# decouple broadcast mem chan # decouple broadcast mem chan
@ -761,7 +763,10 @@ async def open_aio_quote_stream(
symbol: str, symbol: str,
contract: Contract | None = None, contract: Contract | None = None,
) -> trio.abc.ReceiveStream: ) -> (
trio.abc.Channel| # iface
tractor.to_asyncio.LinkedTaskChannel # actually
):
from tractor.trionics import broadcast_receiver from tractor.trionics import broadcast_receiver
global _quote_streams global _quote_streams
@ -778,6 +783,7 @@ async def open_aio_quote_stream(
yield from_aio yield from_aio
return return
from_aio: tractor.to_asyncio.LinkedTaskChannel
async with tractor.to_asyncio.open_channel_from( async with tractor.to_asyncio.open_channel_from(
_setup_quote_stream, _setup_quote_stream,
symbol=symbol, symbol=symbol,
@ -983,17 +989,18 @@ async def stream_quotes(
) )
cs: trio.CancelScope | None = None cs: trio.CancelScope | None = None
startup: bool = True startup: bool = True
iter_quotes: trio.abc.Channel
while ( while (
startup startup
or cs.cancel_called or cs.cancel_called
): ):
with trio.CancelScope() as cs: with trio.CancelScope() as cs:
async with ( async with (
trio.open_nursery() as nurse, trio.open_nursery() as tn,
open_aio_quote_stream( open_aio_quote_stream(
symbol=sym, symbol=sym,
contract=con, contract=con,
) as stream, ) as iter_quotes,
): ):
# ugh, clear ticks since we've consumed them # ugh, clear ticks since we've consumed them
# (ahem, ib_insync is stateful trash) # (ahem, ib_insync is stateful trash)
@ -1021,9 +1028,9 @@ async def stream_quotes(
await rt_ev.wait() await rt_ev.wait()
cs.cancel() # cancel called should now be set cs.cancel() # cancel called should now be set
nurse.start_soon(reset_on_feed) tn.start_soon(reset_on_feed)
async with aclosing(stream): async with aclosing(iter_quotes):
# if syminfo.get('no_vlm', False): # if syminfo.get('no_vlm', False):
if not init_msg.shm_write_opts['has_vlm']: if not init_msg.shm_write_opts['has_vlm']:
@ -1038,19 +1045,21 @@ async def stream_quotes(
# wait for real volume on feed (trading might be # wait for real volume on feed (trading might be
# closed) # closed)
while True: while True:
ticker = await stream.receive() ticker = await iter_quotes.receive()
# for a real volume contract we rait for # for a real volume contract we rait for
# the first "real" trade to take place # the first "real" trade to take place
if ( if (
# not calc_price # not calc_price
# and not ticker.rtTime # and not ticker.rtTime
not ticker.rtTime False
# not ticker.rtTime
): ):
# spin consuming tickers until we # spin consuming tickers until we
# get a real market datum # get a real market datum
log.debug(f"New unsent ticker: {ticker}") log.debug(f"New unsent ticker: {ticker}")
continue continue
else: else:
log.debug("Received first volume tick") log.debug("Received first volume tick")
# ugh, clear ticks since we've # ugh, clear ticks since we've
@ -1066,13 +1075,18 @@ async def stream_quotes(
log.debug(f"First ticker received {quote}") log.debug(f"First ticker received {quote}")
# tell data-layer spawner-caller that live # tell data-layer spawner-caller that live
# quotes are now streaming. # quotes are now active desptie not having
# necessarily received a first vlm/clearing
# tick.
ticker = await iter_quotes.receive()
feed_is_live.set() feed_is_live.set()
fqme: str = quote['fqme']
await send_chan.send({fqme: quote})
# last = time.time() # last = time.time()
async for ticker in stream: async for ticker in iter_quotes:
quote = normalize(ticker) quote = normalize(ticker)
fqme = quote['fqme'] fqme: str = quote['fqme']
await send_chan.send({fqme: quote}) await send_chan.send({fqme: quote})
# ugh, clear ticks since we've consumed them # ugh, clear ticks since we've consumed them

View File

@ -34,6 +34,7 @@ import urllib.parse
import hashlib import hashlib
import hmac import hmac
import base64 import base64
import tractor
import trio import trio
from piker import config from piker import config
@ -372,8 +373,7 @@ class Client:
# 1658347714, 'status': 'Success'}]} # 1658347714, 'status': 'Success'}]}
if xfers: if xfers:
import tractor await tractor.pause()
await tractor.pp()
trans: dict[str, Transaction] = {} trans: dict[str, Transaction] = {}
for entry in xfers: for entry in xfers:
@ -501,7 +501,8 @@ class Client:
for xkey, data in resp['result'].items(): for xkey, data in resp['result'].items():
# NOTE: always cache in pairs tables for faster lookup # NOTE: always cache in pairs tables for faster lookup
pair = Pair(xname=xkey, **data) with tractor.devx.maybe_open_crash_handler(): # as bxerr:
pair = Pair(xname=xkey, **data)
# register the above `Pair` structs for all # register the above `Pair` structs for all
# key-sets/monikers: a set of 4 (frickin) tables # key-sets/monikers: a set of 4 (frickin) tables

View File

@ -175,9 +175,8 @@ async def handle_order_requests(
case { case {
'account': 'kraken.spot' as account, 'account': 'kraken.spot' as account,
'action': action, 'action': 'buy'|'sell',
} if action in {'buy', 'sell'}: }:
# validate # validate
order = BrokerdOrder(**msg) order = BrokerdOrder(**msg)
@ -262,6 +261,12 @@ async def handle_order_requests(
} | extra } | extra
log.info(f'Submitting WS order request:\n{pformat(req)}') log.info(f'Submitting WS order request:\n{pformat(req)}')
# NOTE HOWTO, debug order requests
#
# if 'XRP' in pair:
# await tractor.pause()
await ws.send_msg(req) await ws.send_msg(req)
# placehold for sanity checking in relay loop # placehold for sanity checking in relay loop
@ -544,7 +549,7 @@ async def open_trade_dialog(
# to be reloaded. # to be reloaded.
balances: dict[str, float] = await client.get_balances() balances: dict[str, float] = await client.get_balances()
verify_balances( await verify_balances(
acnt, acnt,
src_fiat, src_fiat,
balances, balances,
@ -1085,6 +1090,8 @@ async def handle_order_updates(
f'Failed to {action} order {reqid}:\n' f'Failed to {action} order {reqid}:\n'
f'{errmsg}' f'{errmsg}'
) )
# if tractor._state.debug_mode():
# await tractor.pause()
symbol: str = 'N/A' symbol: str = 'N/A'
if chain := apiflows.get(reqid): if chain := apiflows.get(reqid):

View File

@ -21,7 +21,6 @@ Symbology defs and search.
from decimal import Decimal from decimal import Decimal
import tractor import tractor
from rapidfuzz import process as fuzzy
from piker._cacheables import ( from piker._cacheables import (
async_lifo_cache, async_lifo_cache,
@ -41,8 +40,13 @@ from piker.accounting._mktinfo import (
) )
# https://www.kraken.com/features/api#get-tradable-pairs
class Pair(Struct): class Pair(Struct):
'''
A tradable asset pair as schema-defined by,
https://docs.kraken.com/api/docs/rest-api/get-tradable-asset-pairs
'''
xname: str # idiotic bs_mktid equiv i guess? xname: str # idiotic bs_mktid equiv i guess?
altname: str # alternate pair name altname: str # alternate pair name
wsname: str # WebSocket pair name (if available) wsname: str # WebSocket pair name (if available)
@ -53,7 +57,6 @@ class Pair(Struct):
lot: str # volume lot size lot: str # volume lot size
cost_decimals: int cost_decimals: int
costmin: float
pair_decimals: int # scaling decimal places for pair pair_decimals: int # scaling decimal places for pair
lot_decimals: int # scaling decimal places for volume lot_decimals: int # scaling decimal places for volume
@ -79,6 +82,7 @@ class Pair(Struct):
tick_size: float # min price step size tick_size: float # min price step size
status: str status: str
costmin: str|None = None # XXX, only some mktpairs?
short_position_limit: float = 0 short_position_limit: float = 0
long_position_limit: float = float('inf') long_position_limit: float = float('inf')

View File

@ -37,6 +37,12 @@ import tractor
from async_generator import asynccontextmanager from async_generator import asynccontextmanager
import numpy as np import numpy as np
import wrapt import wrapt
# TODO, port to `httpx`/`trio-websocket` whenver i get back to
# writing a proper ws-api streamer for this backend (since the data
# feeds are free now) as per GH feat-req:
# https://github.com/pikers/piker/issues/509
#
import asks import asks
from ..calc import humanize, percent_change from ..calc import humanize, percent_change

View File

@ -25,7 +25,10 @@ from typing import TYPE_CHECKING
import trio import trio
import tractor import tractor
from tractor.trionics import broadcast_receiver from tractor.trionics import (
broadcast_receiver,
collapse_eg,
)
from ._util import ( from ._util import (
log, # sub-sys logger log, # sub-sys logger
@ -285,8 +288,11 @@ async def open_ems(
client._ems_stream = trades_stream client._ems_stream = trades_stream
# start sync code order msg delivery task # start sync code order msg delivery task
async with trio.open_nursery() as n: async with (
n.start_soon( collapse_eg(),
trio.open_nursery() as tn,
):
tn.start_soon(
relay_orders_from_sync_code, relay_orders_from_sync_code,
client, client,
fqme, fqme,
@ -302,4 +308,4 @@ async def open_ems(
) )
# stop the sync-msg-relay task on exit. # stop the sync-msg-relay task on exit.
n.cancel_scope.cancel() tn.cancel_scope.cancel()

View File

@ -76,7 +76,6 @@ if TYPE_CHECKING:
# TODO: numba all of this # TODO: numba all of this
def mk_check( def mk_check(
trigger_price: float, trigger_price: float,
known_last: float, known_last: float,
action: str, action: str,
@ -162,7 +161,7 @@ async def clear_dark_triggers(
router: Router, router: Router,
brokerd_orders_stream: tractor.MsgStream, brokerd_orders_stream: tractor.MsgStream,
quote_stream: tractor.ReceiveMsgStream, # noqa quote_stream: tractor.MsgStream,
broker: str, broker: str,
fqme: str, fqme: str,
@ -178,6 +177,7 @@ async def clear_dark_triggers(
''' '''
# XXX: optimize this for speed! # XXX: optimize this for speed!
# TODO: # TODO:
# - port to the new ringbuf stuff in `tractor.ipc`!
# - numba all this! # - numba all this!
# - this stream may eventually contain multiple symbols # - this stream may eventually contain multiple symbols
quote_stream._raise_on_lag = False quote_stream._raise_on_lag = False
@ -1190,12 +1190,16 @@ async def process_client_order_cmds(
submitting live orders immediately if requested by the client. submitting live orders immediately if requested by the client.
''' '''
# cmd: dict # TODO, only allow `msgspec.Struct` form!
cmd: dict
async for cmd in client_order_stream: async for cmd in client_order_stream:
log.info(f'Received order cmd:\n{pformat(cmd)}') log.info(
f'Received order cmd:\n'
f'{pformat(cmd)}\n'
)
# CAWT DAMN we need struct support! # CAWT DAMN we need struct support!
oid = str(cmd['oid']) oid: str = str(cmd['oid'])
# register this stream as an active order dialog (msg flow) for # register this stream as an active order dialog (msg flow) for
# this order id such that translated message from the brokerd # this order id such that translated message from the brokerd
@ -1301,7 +1305,7 @@ async def process_client_order_cmds(
case { case {
'oid': oid, 'oid': oid,
'symbol': fqme, 'symbol': fqme,
'price': trigger_price, 'price': price,
'size': size, 'size': size,
'action': ('buy' | 'sell') as action, 'action': ('buy' | 'sell') as action,
'exec_mode': ('live' | 'paper'), 'exec_mode': ('live' | 'paper'),
@ -1333,7 +1337,7 @@ async def process_client_order_cmds(
symbol=sym, symbol=sym,
action=action, action=action,
price=trigger_price, price=price,
size=size, size=size,
account=req.account, account=req.account,
) )
@ -1355,7 +1359,11 @@ async def process_client_order_cmds(
# (``translate_and_relay_brokerd_events()`` above) will # (``translate_and_relay_brokerd_events()`` above) will
# handle relaying the ems side responses back to # handle relaying the ems side responses back to
# the client/cmd sender from this request # the client/cmd sender from this request
log.info(f'Sending live order to {broker}:\n{pformat(msg)}') log.info(
f'Sending live order to {broker}:\n'
f'{pformat(msg)}'
)
await brokerd_order_stream.send(msg) await brokerd_order_stream.send(msg)
# an immediate response should be ``BrokerdOrderAck`` # an immediate response should be ``BrokerdOrderAck``
@ -1371,7 +1379,7 @@ async def process_client_order_cmds(
case { case {
'oid': oid, 'oid': oid,
'symbol': fqme, 'symbol': fqme,
'price': trigger_price, 'price': price,
'size': size, 'size': size,
'exec_mode': exec_mode, 'exec_mode': exec_mode,
'action': action, 'action': action,
@ -1399,7 +1407,12 @@ async def process_client_order_cmds(
if isnan(last): if isnan(last):
last = flume.rt_shm.array[-1]['close'] last = flume.rt_shm.array[-1]['close']
pred = mk_check(trigger_price, last, action) trigger_price: float = float(price)
pred = mk_check(
trigger_price,
last,
action,
)
# NOTE: for dark orders currently we submit # NOTE: for dark orders currently we submit
# the triggered live order at a price 5 ticks # the triggered live order at a price 5 ticks
@ -1539,7 +1552,7 @@ async def _emsd_main(
ctx: tractor.Context, ctx: tractor.Context,
fqme: str, fqme: str,
exec_mode: str, # ('paper', 'live') exec_mode: str, # ('paper', 'live')
loglevel: str | None = None, loglevel: str|None = None,
) -> tuple[ ) -> tuple[
dict[ dict[

View File

@ -19,6 +19,7 @@ Clearing sub-system message and protocols.
""" """
from __future__ import annotations from __future__ import annotations
from decimal import Decimal
from typing import ( from typing import (
Literal, Literal,
) )
@ -71,7 +72,15 @@ class Order(Struct):
symbol: str # | MktPair symbol: str # | MktPair
account: str # should we set a default as '' ? account: str # should we set a default as '' ?
price: float # https://docs.python.org/3/library/decimal.html#decimal-objects
#
# ?TODO? decimal usage throughout?
# -[ ] possibly leverage the `Encoder(decimal_format='number')`
# bit?
# |_https://jcristharif.com/msgspec/supported-types.html#decimal
# -[ ] should we also use it for .size?
#
price: Decimal
size: float # -ve is "sell", +ve is "buy" size: float # -ve is "sell", +ve is "buy"
brokers: list[str] = [] brokers: list[str] = []
@ -178,7 +187,7 @@ class BrokerdOrder(Struct):
time_ns: int time_ns: int
symbol: str # fqme symbol: str # fqme
price: float price: Decimal
size: float size: float
# TODO: if we instead rely on a +ve/-ve size to determine # TODO: if we instead rely on a +ve/-ve size to determine

View File

@ -510,7 +510,7 @@ async def handle_order_requests(
reqid = await client.submit_limit( reqid = await client.submit_limit(
oid=order.oid, oid=order.oid,
symbol=f'{order.symbol}.{client.broker}', symbol=f'{order.symbol}.{client.broker}',
price=order.price, price=float(order.price),
action=order.action, action=order.action,
size=order.size, size=order.size,
# XXX: by default 0 tells ``ib_insync`` methods that # XXX: by default 0 tells ``ib_insync`` methods that

View File

@ -335,7 +335,7 @@ def services(config, tl, ports):
name='service_query', name='service_query',
loglevel=config['loglevel'] if tl else None, loglevel=config['loglevel'] if tl else None,
), ),
tractor.get_arbiter( tractor.get_registry(
host=host, host=host,
port=ports[0] port=ports[0]
) as portal ) as portal

View File

@ -740,7 +740,7 @@ async def sample_and_broadcast(
log.warning( log.warning(
f'Feed OVERRUN {sub_key}' f'Feed OVERRUN {sub_key}'
'@{bus.brokername} -> \n' f'@{bus.brokername} -> \n'
f'feed @ {chan.uid}\n' f'feed @ {chan.uid}\n'
f'throttle = {throttle} Hz' f'throttle = {throttle} Hz'
) )

View File

@ -39,6 +39,7 @@ from typing import (
AsyncContextManager, AsyncContextManager,
Awaitable, Awaitable,
Sequence, Sequence,
TYPE_CHECKING,
) )
import trio import trio
@ -75,6 +76,10 @@ from ._sampling import (
uniform_rate_send, uniform_rate_send,
) )
if TYPE_CHECKING:
from tractor._addr import Address
from tractor.msg.types import Aid
class Sub(Struct, frozen=True): class Sub(Struct, frozen=True):
''' '''
@ -786,7 +791,6 @@ async def install_brokerd_search(
@acm @acm
async def maybe_open_feed( async def maybe_open_feed(
fqmes: list[str], fqmes: list[str],
loglevel: str | None = None, loglevel: str | None = None,
@ -840,13 +844,12 @@ async def maybe_open_feed(
@acm @acm
async def open_feed( async def open_feed(
fqmes: list[str], fqmes: list[str],
loglevel: str | None = None, loglevel: str|None = None,
allow_overruns: bool = True, allow_overruns: bool = True,
start_stream: bool = True, start_stream: bool = True,
tick_throttle: float | None = None, # Hz tick_throttle: float|None = None, # Hz
allow_remote_ctl_ui: bool = False, allow_remote_ctl_ui: bool = False,
@ -899,19 +902,19 @@ async def open_feed(
feed.portals[brokermod] = portal feed.portals[brokermod] = portal
# fill out "status info" that the UI can show # fill out "status info" that the UI can show
host, port = portal.channel.raddr chan: tractor.Channel = portal.chan
if host == '127.0.0.1': raddr: Address = chan.raddr
host = 'localhost' aid: Aid = chan.aid
# TAG_feed_status_update
feed.status.update({ feed.status.update({
'actor_name': portal.channel.uid[0], 'actor_id': aid,
'host': host, 'actor_short_id': f'{aid.name}@{aid.pid}',
'port': port, 'ipc': chan.raddr.proto_key,
'ipc_addr': raddr,
'hist_shm': 'NA', 'hist_shm': 'NA',
'rt_shm': 'NA', 'rt_shm': 'NA',
'throttle_rate': tick_throttle, 'throttle_hz': tick_throttle,
}) })
# feed.status.update(init_msg.pop('status', {}))
# (allocate and) connect to any feed bus for this broker # (allocate and) connect to any feed bus for this broker
bus_ctxs.append( bus_ctxs.append(

View File

@ -36,10 +36,10 @@ from ._sharedmem import (
ShmArray, ShmArray,
_Token, _Token,
) )
from piker.accounting import MktPair
if TYPE_CHECKING: if TYPE_CHECKING:
from ..accounting import MktPair from piker.data.feed import Feed
from .feed import Feed
class Flume(Struct): class Flume(Struct):
@ -82,7 +82,7 @@ class Flume(Struct):
# TODO: do we need this really if we can pull the `Portal` from # TODO: do we need this really if we can pull the `Portal` from
# ``tractor``'s internals? # ``tractor``'s internals?
feed: Feed | None = None feed: Feed|None = None
@property @property
def rt_shm(self) -> ShmArray: def rt_shm(self) -> ShmArray:

View File

@ -113,9 +113,9 @@ def validate_backend(
) )
if ep is None: if ep is None:
log.warning( log.warning(
f'Provider backend {mod.name} is missing ' f'Provider backend {mod.name!r} is missing '
f'{daemon_name} support :(\n' f'{daemon_name!r} support?\n'
f'The following endpoint is missing: {name}' f'|_module endpoint-func missing: {name!r}\n'
) )
inits: list[ inits: list[

View File

@ -498,6 +498,7 @@ async def cascade(
func_name: str = func.__name__ func_name: str = func.__name__
async with ( async with (
tractor.trionics.collapse_eg(), # avoid multi-taskc tb in console
trio.open_nursery() as tn, trio.open_nursery() as tn,
): ):
# TODO: might be better to just make a "restart" method where # TODO: might be better to just make a "restart" method where

View File

@ -200,7 +200,8 @@ async def open_pikerd(
reg_addrs, reg_addrs,
), ),
tractor.open_nursery() as actor_nursery, tractor.open_nursery() as actor_nursery,
trio.open_nursery() as service_nursery, tractor.trionics.collapse_eg(),
trio.open_nursery() as service_tn,
): ):
for addr in reg_addrs: for addr in reg_addrs:
if addr not in root_actor.accept_addrs: if addr not in root_actor.accept_addrs:
@ -211,7 +212,7 @@ async def open_pikerd(
# assign globally for future daemon/task creation # assign globally for future daemon/task creation
Services.actor_n = actor_nursery Services.actor_n = actor_nursery
Services.service_n = service_nursery Services.service_n = service_tn
Services.debug_mode = debug_mode Services.debug_mode = debug_mode
try: try:
@ -221,7 +222,7 @@ async def open_pikerd(
# TODO: is this more clever/efficient? # TODO: is this more clever/efficient?
# if 'samplerd' in Services.service_tasks: # if 'samplerd' in Services.service_tasks:
# await Services.cancel_service('samplerd') # await Services.cancel_service('samplerd')
service_nursery.cancel_scope.cancel() service_tn.cancel_scope.cancel()
# TODO: do we even need this? # TODO: do we even need this?

View File

@ -517,7 +517,7 @@ def with_dts(
''' '''
return df.with_columns([ return df.with_columns([
pl.col(time_col).shift(1).suffix('_prev'), pl.col(time_col).shift(1).name.suffix('_prev'),
pl.col(time_col).diff().alias('s_diff'), pl.col(time_col).diff().alias('s_diff'),
pl.from_epoch(pl.col(time_col)).alias('dt'), pl.from_epoch(pl.col(time_col)).alias('dt'),
]).with_columns([ ]).with_columns([
@ -623,7 +623,7 @@ def detect_vlm_gaps(
) -> pl.DataFrame: ) -> pl.DataFrame:
vnull: pl.DataFrame = w_dts.filter( vnull: pl.DataFrame = df.filter(
pl.col(col) == 0 pl.col(col) == 0
) )
return vnull return vnull

View File

@ -21,6 +21,7 @@ Main app startup and run.
from functools import partial from functools import partial
from types import ModuleType from types import ModuleType
import tractor
import trio import trio
from piker.ui.qt import ( from piker.ui.qt import (
@ -116,6 +117,7 @@ async def _async_main(
needed_brokermods[brokername] = brokers[brokername] needed_brokermods[brokername] = brokers[brokername]
async with ( async with (
tractor.trionics.collapse_eg(),
trio.open_nursery() as root_n, trio.open_nursery() as root_n,
): ):
# set root nursery and task stack for spawning other charts/feeds # set root nursery and task stack for spawning other charts/feeds

View File

@ -33,7 +33,6 @@ import trio
from piker.ui.qt import ( from piker.ui.qt import (
QtCore, QtCore,
QtWidgets,
Qt, Qt,
QLineF, QLineF,
QFrame, QFrame,

View File

@ -1445,7 +1445,10 @@ async def display_symbol_data(
# for pause/resume on mouse interaction # for pause/resume on mouse interaction
rt_chart.feed = feed rt_chart.feed = feed
async with trio.open_nursery() as ln: async with (
tractor.trionics.collapse_eg(),
trio.open_nursery() as ln,
):
# if available load volume related built-in display(s) # if available load volume related built-in display(s)
vlm_charts: dict[ vlm_charts: dict[
str, str,

View File

@ -22,7 +22,10 @@ from contextlib import asynccontextmanager as acm
from typing import Callable from typing import Callable
import trio import trio
from tractor.trionics import gather_contexts from tractor.trionics import (
gather_contexts,
collapse_eg,
)
from piker.ui.qt import ( from piker.ui.qt import (
QtCore, QtCore,
@ -207,7 +210,10 @@ async def open_signal_handler(
async for args in recv: async for args in recv:
await async_handler(*args) await async_handler(*args)
async with trio.open_nursery() as tn: async with (
collapse_eg(),
trio.open_nursery() as tn
):
tn.start_soon(proxy_to_handler) tn.start_soon(proxy_to_handler)
async with send: async with send:
yield yield
@ -242,6 +248,7 @@ async def open_handlers(
widget: QWidget widget: QWidget
streams: list[trio.abc.ReceiveChannel] streams: list[trio.abc.ReceiveChannel]
async with ( async with (
collapse_eg(),
trio.open_nursery() as tn, trio.open_nursery() as tn,
gather_contexts([ gather_contexts([
open_event_stream( open_event_stream(

View File

@ -18,10 +18,11 @@
Feed status and controls widget(s) for embedding in a UI-pane. Feed status and controls widget(s) for embedding in a UI-pane.
""" """
from __future__ import annotations from __future__ import annotations
from textwrap import dedent from typing import (
from typing import TYPE_CHECKING Any,
TYPE_CHECKING,
)
# from PyQt5.QtCore import Qt # from PyQt5.QtCore import Qt
@ -49,35 +50,55 @@ def mk_feed_label(
a feed control protocol. a feed control protocol.
''' '''
status = feed.status status: dict[str, Any] = feed.status
assert status assert status
msg = dedent(""" # SO tips on ws/nls,
actor: **{actor_name}**\n # https://stackoverflow.com/a/15721400
|_ @**{host}:{port}**\n ws: str = ' '
""") # nl: str = '<br>' # dun work?
actor_info_repr: str = (
f')> **{status["actor_short_id"]}**\n'
'\n' # bc md?
)
for key, val in status.items(): # fields to select *IN* for display
if key in ('host', 'port', 'actor_name'): # (see `.data.feed.open_feed()` status
continue # update -> TAG_feed_status_update)
msg += f'\n|_ {key}: **{{{key}}}**\n' for key in [
'ipc',
'hist_shm',
'rt_shm',
'throttle_hz',
]:
# NOTE, the 2nd key is filled via `.format()` updates.
actor_info_repr += (
f'\n' # bc md?
f'{ws}|_{key}: **{{{key}}}**\n'
)
# ^TODO? formatting and content..
# -[ ] showing which fqme is "forward" on the
# chart/fsp/order-mode?
# '|_ flows: **{symbols}**\n'
#
# -[x] why isn't the indent working?
# => markdown, now solved..
feed_label = FormatLabel( feed_label = FormatLabel(
fmt_str=msg, fmt_str=actor_info_repr,
# |_ streams: **{symbols}**\n
font=_font.font, font=_font.font,
font_size=_font_small.px_size, font_size=_font_small.px_size,
font_color='default_lightest', font_color='default_lightest',
) )
# ?TODO, remove this?
# form.vbox.setAlignment(feed_label, Qt.AlignBottom) # form.vbox.setAlignment(feed_label, Qt.AlignBottom)
# form.vbox.setAlignment(Qt.AlignBottom) # form.vbox.setAlignment(Qt.AlignBottom)
_ = chart.height() - ( # _ = chart.height() - (
form.height() + # form.height() +
form.fill_bar.height() # form.fill_bar.height()
# feed_label.height() # # feed_label.height()
) # )
feed_label.format(**feed.status) feed_label.format(**feed.status)
return feed_label return feed_label

View File

@ -600,6 +600,7 @@ async def open_fsp_admin(
kwargs=kwargs, kwargs=kwargs,
) as (cache_hit, cluster_map), ) as (cache_hit, cluster_map),
tractor.trionics.collapse_eg(),
trio.open_nursery() as tn, trio.open_nursery() as tn,
): ):
if cache_hit: if cache_hit:
@ -613,6 +614,8 @@ async def open_fsp_admin(
) )
try: try:
yield admin yield admin
# ??TODO, does this *need* to be inside a finally?
finally: finally:
# terminate all tasks via signals # terminate all tasks via signals
for key, entry in admin._registry.items(): for key, entry in admin._registry.items():

View File

@ -285,18 +285,20 @@ class FormatLabel(QLabel):
font_size: int, font_size: int,
font_color: str, font_color: str,
use_md: bool = True,
parent=None, parent=None,
) -> None: ) -> None:
super().__init__(parent) super().__init__(parent)
# by default set the format string verbatim and expect user to # by default set the format string verbatim and expect user
# call ``.format()`` later (presumably they'll notice the # to call ``.format()`` later (presumably they'll notice the
# unformatted content if ``fmt_str`` isn't meant to be # unformatted content if ``fmt_str`` isn't meant to be
# unformatted). # unformatted).
self.fmt_str = fmt_str self.fmt_str = fmt_str
self.setText(fmt_str) # self.setText(fmt_str) # ?TODO, why here?
self.setStyleSheet( self.setStyleSheet(
f"""QLabel {{ f"""QLabel {{
@ -306,9 +308,10 @@ class FormatLabel(QLabel):
""" """
) )
self.setFont(_font.font) self.setFont(_font.font)
self.setTextFormat( if use_md:
Qt.TextFormat.MarkdownText self.setTextFormat(
) Qt.TextFormat.MarkdownText
)
self.setMargin(0) self.setMargin(0)
self.setSizePolicy( self.setSizePolicy(
@ -316,7 +319,10 @@ class FormatLabel(QLabel):
size_policy.Expanding, size_policy.Expanding,
) )
self.setAlignment( self.setAlignment(
Qt.AlignVCenter | Qt.AlignLeft Qt.AlignLeft
|
Qt.AlignBottom
# Qt.AlignVCenter
) )
self.setText(self.fmt_str) self.setText(self.fmt_str)

View File

@ -269,6 +269,8 @@ def hcolor(name: str) -> str:
# default ohlc-bars/curve gray # default ohlc-bars/curve gray
'bracket': '#666666', # like the logo 'bracket': '#666666', # like the logo
'pikers': '#616161', # a trader shade of..
'beast': '#161616', # in the dark alone.
# bluish # bluish
'charcoal': '#36454F', 'charcoal': '#36454F',

View File

@ -21,6 +21,7 @@ Chart trading, the only way to scalp.
from __future__ import annotations from __future__ import annotations
from contextlib import asynccontextmanager from contextlib import asynccontextmanager
from dataclasses import dataclass, field from dataclasses import dataclass, field
from decimal import Decimal
from functools import partial from functools import partial
from pprint import pformat from pprint import pformat
import time import time
@ -41,7 +42,6 @@ from piker.accounting import (
Position, Position,
mk_allocator, mk_allocator,
MktPair, MktPair,
Symbol,
) )
from piker.clearing import ( from piker.clearing import (
open_ems, open_ems,
@ -143,6 +143,15 @@ class OrderMode:
} }
_staged_order: Order | None = None _staged_order: Order | None = None
@property
def curr_mkt(self) -> MktPair:
'''
Deliver the currently selected `MktPair` according
chart state.
'''
return self.chart.linked.mkt
def on_level_change_update_next_order_info( def on_level_change_update_next_order_info(
self, self,
level: float, level: float,
@ -172,7 +181,11 @@ class OrderMode:
line.update_labels(order_info) line.update_labels(order_info)
# update bound-in staged order # update bound-in staged order
order.price = level mkt: MktPair = self.curr_mkt
order.price: Decimal = mkt.quantize(
size=level,
quantity_type='price',
)
order.size = order_info['size'] order.size = order_info['size']
# when an order is changed we flip the settings side-pane to # when an order is changed we flip the settings side-pane to
@ -187,7 +200,9 @@ class OrderMode:
) -> LevelLine: ) -> LevelLine:
level = order.price # TODO, if we instead just always decimalize at the ems layer
# we can avoid this back-n-forth casting?
level = float(order.price)
line = order_line( line = order_line(
chart or self.chart, chart or self.chart,
@ -224,7 +239,11 @@ class OrderMode:
# the order mode allocator but we still need to update the # the order mode allocator but we still need to update the
# "staged" order message we'll send to the ems # "staged" order message we'll send to the ems
def update_order_price(y: float) -> None: def update_order_price(y: float) -> None:
order.price = y mkt: MktPair = self.curr_mkt
order.price: Decimal = mkt.quantize(
size=y,
quantity_type='price',
)
line._on_level_change = update_order_price line._on_level_change = update_order_price
@ -275,34 +294,31 @@ class OrderMode:
chart = cursor.linked.chart chart = cursor.linked.chart
if ( if (
not chart not chart
and cursor and
and cursor.active_plot cursor
and
cursor.active_plot
): ):
return return
chart = cursor.active_plot chart = cursor.active_plot
price = cursor._datum_xy[1] price: float = cursor._datum_xy[1]
if not price: if not price:
# zero prices are not supported by any means # zero prices are not supported by any means
# since that's illogical / a no-op. # since that's illogical / a no-op.
return return
mkt: MktPair = self.chart.linked.mkt
# NOTE : we could also use instead,
# mkt.quantize(price, quantity_type='price')
# but it returns a Decimal and it's probably gonna
# be slower?
# TODO: should we be enforcing this precision # TODO: should we be enforcing this precision
# at a different layer in the stack? right now # at a different layer in the stack?
# any precision error will literally be relayed # |_ might require `MktPair` tracking in the EMS?
# all the way back from the backend. # |_ right now any precision error will be relayed
# all the way back from the backend and vice-versa..
price = round( #
price, mkt: MktPair = self.curr_mkt
ndigits=mkt.price_tick_digits, price: Decimal = mkt.quantize(
size=price,
quantity_type='price',
) )
order = self._staged_order = Order( order = self._staged_order = Order(
action=action, action=action,
price=price, price=price,
@ -378,7 +394,7 @@ class OrderMode:
'oid': oid, 'oid': oid,
}) })
if order.price <= 0: if float(order.price) <= 0:
log.error( log.error(
'*!? Invalid `Order.price <= 0` ?!*\n' '*!? Invalid `Order.price <= 0` ?!*\n'
# TODO: make this present multi-line in object form # TODO: make this present multi-line in object form
@ -515,14 +531,15 @@ class OrderMode:
# if an order msg is provided update the line # if an order msg is provided update the line
# **from** that msg. # **from** that msg.
if order: if order:
if order.price <= 0: price: float = float(order.price)
if price <= 0:
log.error(f'Order has 0 price, cancelling..\n{order}') log.error(f'Order has 0 price, cancelling..\n{order}')
self.cancel_orders([order.oid]) self.cancel_orders([order.oid])
return None return None
line.set_level(order.price) line.set_level(price)
self.on_level_change_update_next_order_info( self.on_level_change_update_next_order_info(
level=order.price, level=price,
line=line, line=line,
order=order, order=order,
# use the corresponding position tracker for the # use the corresponding position tracker for the
@ -681,9 +698,9 @@ class OrderMode:
) -> Dialog | None: ) -> Dialog | None:
# NOTE: the `.order` attr **must** be set with the # NOTE: the `.order` attr **must** be set with the
# equivalent order msg in order to be loaded. # equivalent order msg in order to be loaded.
order = msg.req order: Order = msg.req
oid = str(msg.oid) oid = str(msg.oid)
symbol = order.symbol symbol: str = order.symbol
# TODO: MEGA UGGG ZONEEEE! # TODO: MEGA UGGG ZONEEEE!
src = msg.src src = msg.src
@ -702,13 +719,22 @@ class OrderMode:
order.oid = str(order.oid) order.oid = str(order.oid)
order.brokers = [brokername] order.brokers = [brokername]
# TODO: change this over to `MktPair`, but it's # ?TODO? change this over to `MktPair`, but it's gonna be
# gonna be tough since we don't have any such data # tough since we don't have any such data really in our
# really in our clearing msg schema.. # clearing msg schema..
order.symbol = Symbol.from_fqme( # BUT WAIT! WHY do we even want/need this!?
fqsn=fqme, #
info={}, # order.symbol = self.curr_mkt
) #
# XXX, the old approach.. which i don't quire member why..
# -[ ] verify we for sure don't require this any more!
# |_https://github.com/pikers/piker/issues/517
#
# order.symbol = Symbol.from_fqme(
# fqsn=fqme,
# info={},
# )
maybe_dialog: Dialog | None = self.submit_order( maybe_dialog: Dialog | None = self.submit_order(
send_msg=False, send_msg=False,
order=order, order=order,
@ -766,6 +792,7 @@ async def open_order_mode(
brokerd_accounts, brokerd_accounts,
ems_dialog_msgs, ems_dialog_msgs,
), ),
tractor.trionics.collapse_eg(),
trio.open_nursery() as tn, trio.open_nursery() as tn,
): ):
@ -1101,7 +1128,7 @@ async def process_trade_msg(
) )
) )
): ):
msg.req = order msg.req: Order = order
dialog: ( dialog: (
Dialog Dialog
# NOTE: on an invalid order submission (eg. # NOTE: on an invalid order submission (eg.
@ -1166,7 +1193,7 @@ async def process_trade_msg(
tm = time.time() tm = time.time()
mode.on_fill( mode.on_fill(
oid, oid,
price=req.price, price=float(req.price),
time_s=tm, time_s=tm,
) )
mode.lines.remove_line(uuid=oid) mode.lines.remove_line(uuid=oid)
@ -1221,7 +1248,7 @@ async def process_trade_msg(
tm = details['broker_time'] tm = details['broker_time']
mode.on_fill( mode.on_fill(
oid, oid,
price=details['price'], price=float(details['price']),
time_s=tm, time_s=tm,
pointing='up' if action == 'buy' else 'down', pointing='up' if action == 'buy' else 'down',
) )

tags (new file, mode 100644, 1 addition)
View File

@ -0,0 +1 @@
TAG_feed_status_update ./piker/data/feed.py /TAG_feed_status_update/

View File

@ -179,7 +179,7 @@ def test_ems_err_on_bad_broker(
# NOTE: emsd should error on the actor's enabled modules # NOTE: emsd should error on the actor's enabled modules
# import phase, when looking for a backend named `doggy`. # import phase, when looking for a backend named `doggy`.
except tractor.RemoteActorError as re: except tractor.RemoteActorError as re:
assert re.type == ModuleNotFoundError assert re.type is ModuleNotFoundError
run_and_tollerate_cancels(load_bad_fqme) run_and_tollerate_cancels(load_bad_fqme)