We previously only offered a sync API (which was recently renamed to
`.<meth>_nowait()` style) since initially all order control was done from our
`OrderMode` Qt-driven UI/UX. This adds the equivalent async methods for
both testing and eventual auto-strat driven control B)
Also includes a bunch of renaming:
- `OrderBook` -> `OrderClient`.
- better internal renaming of the client's mem chan vars and add a ref
`._ems_stream: tractor.MsgStream`.
- drop `get_orders()` factory, just always check for the actor-global
instance and always set the ems stream on that client (in case the old
one was closed).
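A rough sketch of the resulting client shape (the channel/stream wiring and
method names here are illustrative, not the exact `piker.clearing` code):

```python
import trio
import tractor


class OrderClient:
    '''
    (Was `OrderBook`) order-control client holding the ems mem chan
    refs plus a direct `._ems_stream` which can always be re-set if
    the old stream was closed.

    '''
    def __init__(
        self,
        ems_stream: tractor.MsgStream,
        to_ems: trio.MemorySendChannel,
        from_ems: trio.MemoryReceiveChannel,
    ) -> None:
        self._ems_stream = ems_stream
        self._to_ems = to_ems
        self._from_ems = from_ems

    # legacy sync-style api, now suffixed `_nowait`
    def send_nowait(self, msg: dict) -> None:
        self._to_ems.send_nowait(msg)

    # async equivalent for tests and (eventual) auto-strat control
    async def send(self, msg: dict) -> None:
        await self._ems_stream.send(msg)
```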
This will end up being super handy for testing our accounting subsystems
as well as providing unified and simple cli utils for managing ledgers
and position tracking. Allows loading the paper boi without starting
a data feed, instead just triggering ledger and pps loading without
starting the entire clearing engine.
Deatz:
- only init `PaperBoi` and start the clearing loop (tasks) if a non-`None`
fqme is provided, otherwise just `Context.started()` the existing pps msgs
as loaded from the ledger (see the sketch after this list).
- always update both the ledger and pp table on startup and pass
a single instance of each obj to the `PaperBoi` for reuse (without
opening and closing backing config files since we now have
`.write_config()`).
- drop the global `_positions` dict, it's not needed anymore if we use
a `PaperBoi.ppt: PpTable` which persists for the engine actor's
lifetime.
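A sketch of the gating flow described above; `load_ledger_and_pps()`,
`.to_msgs()` and `.start_clearing()` are hypothetical helper names used only
to illustrate the idea, not the actual paper engine API:

```python
async def serve_paper_engine(ctx, fqme: str | None) -> None:
    # single shared ledger + pp-table instance, reused by the engine
    ledger, ppt = load_ledger_and_pps()   # hypothetical loader

    # always report the pps msgs as loaded from the ledger
    await ctx.started(ppt.to_msgs())

    if fqme is None:
        # no symbol requested: skip the clearing engine entirely and
        # just serve the loaded accounting state.
        return

    # otherwise spin up paper clearing reusing the same instances
    boi = PaperBoi(ledger=ledger, ppt=ppt)
    await boi.start_clearing(fqme)
```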
Add a new `class TransactionLedger(collections.UserDict)` for managing
ledger (files) from a `dict`-like API. The main motivations are easy
conversion between `dict` <-> `Transaction` obj forms as well as dynamic
(toml) file updates via a set of methods:
- `.write_config()` to render and write state to the local toml file.
- `.iter_trans()` to allow iterator style conversion to `Transaction`
form for each entry.
- `.to_trans()` for the dict output from the above.
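A rough, self-contained sketch of that `dict`-like API; the stand-in
`Transaction` fields and the use of the 3rd party `toml` pkg for writing are
assumptions, not the actual `piker.accounting` code:

```python
from collections import UserDict
from dataclasses import dataclass
from pathlib import Path
from typing import Iterator

import toml  # stand-in writer; the real toml handling may differ


@dataclass
class Transaction:
    # minimal stand-in for the real transaction msg type
    tid: str
    size: float
    price: float


class TransactionLedger(UserDict):
    def __init__(self, path: Path, data: dict | None = None) -> None:
        self.path = path
        super().__init__(data or {})

    def write_config(self) -> None:
        # render current state back to the local toml file
        self.path.write_text(toml.dumps(dict(self.data)))

    def iter_trans(self) -> Iterator[tuple[str, Transaction]]:
        # dict -> `Transaction` obj form, one entry at a time
        for tid, entry in self.data.items():
            yield tid, Transaction(tid=tid, **entry)

    def to_trans(self) -> dict[str, Transaction]:
        # full dict output from the above iterator
        return dict(self.iter_trans())
```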
Some adjustments to `Transaction`, namely making `.sym/.sys` optional for
now so that paper engine entries can be loaded (offline) without
connecting to the emulated broker backend. Move to using `pathlib.Path`
throughout for bootyful toml file mgmt B)
When loading a `Position` from a pps file we might not have the entire
`MktPair` field-set loaded (though going forward that shouldn't really
ever happen except in the case of a legacy `pps.toml`), in which case we
can check if the `.fqme: str` value loaded from the transaction is
longer and use that instead - presuming it must have more mkt meta-data
filled out.
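The heuristic in a nutshell (illustrative names, not the real field refs):

```python
def pick_fqme(pos_fqme: str, tx_fqme: str) -> str:
    # presume the longer fqme (eg. with venue/expiry suffixes) carries
    # more mkt meta-data and prefer it over a legacy-loaded value.
    return tx_fqme if len(tx_fqme) > len(pos_fqme) else pos_fqme
```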
Also includes some more `fqsn` -> `fqme` renames.
Been meaning to do this port for a while; it makes passing around file
handles (presumably alongside the in-mem obj form) a lot simpler/nicer
and the implementations of all the config file handling much more terse,
with fewer presumptions about the form of filename/dir `str` values all
over the place B)
moar technically, it lets us:
- drop remaining `.config` usage of `os.path`.
- return `Path`s from most routines.
- adds a special case to `get_conf_path()` such that if the input name
contains a `pps.` pattern, we avoid validating the name; this is going
to be used by new `.accounting.open_pps()` code which will instead
write a separate TOML file for each account B)
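A hedged sketch of what that looks like with `pathlib` (the config dir
location and the validation set here are assumptions):

```python
from pathlib import Path

_config_dir: Path = Path.home() / '.config' / 'piker'  # illustrative
_conf_names: set[str] = {'brokers', 'watchlists', 'paper_trades'}


def get_conf_path(name: str = 'brokers') -> Path:
    # special case: per-account files matching a `pps.` pattern
    # (eg. `pps.ib.paper`) skip name validation entirely.
    if 'pps.' not in name:
        assert name in _conf_names
    return _config_dir / f'{name}.toml'
```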
Previously we were re-processing all ledgers for every position msg
received from the API, per client.. Instead do that once in a first pass
and drop all key-miss lookups for `bs_mktid`s since a miss should never
happen.
Better typing for in-routine vars, convert pos msg/objects to `dict`
prior to logging so they're sane to read on the console. Skip processing
option contracts specifically for now.
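The "first pass" amounts to indexing ledger entries by `bs_mktid` once up
front so per-msg handling is pure dict lookups; a minimal sketch with
illustrative names:

```python
from collections import defaultdict


def index_by_mktid(ledger_entries: list[dict]) -> dict[str, list[dict]]:
    # build the bs_mktid -> entries map exactly once, not per pos msg
    by_mktid: dict[str, list[dict]] = defaultdict(list)
    for entry in ledger_entries:
        by_mktid[str(entry['bs_mktid'])].append(entry)
    return by_mktid
```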
Turns out `binance` is pretty great with their schema since they have
more or less the same data schema for their exchange info ep, which we
wrap in a `Pair` struct:
https://binance-docs.github.io/apidocs/spot/en/#exchange-information
That makes it super easy to provide the most general case for filling
out a `MktPair` with both `.src/dst: Asset` to maintain maximum
meta-data B)
Deatz:
- adjust `Pair` to have `.size/price_tick: Decimal` by parsing out
the values from the filters field (see the sketch after this list);
TODO: we should probably just rewrite the input `.filter` at init time
so we can keep the frozen style.
- rename `Client.mkt_info()` (was `.symbol_info()`) to `.exch_info()`,
better matching the ep name, and have it build, cache, and return
a `dict[str, Pair]`; allows dropping `.cache_symbols()`.
- only pass the `mkt_info: MktPair` field in the init msg!
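A sketch of the filters parsing; the filter/field names follow the public
binance docs linked above, but the `Pair` shape here is simplified (lazy
properties instead of rewritten init-time fields):

```python
from decimal import Decimal

import msgspec


class Pair(msgspec.Struct, frozen=True):
    symbol: str
    filters: list[dict]

    def _filter_value(self, ftype: str, key: str) -> Decimal:
        # scan the exchange-info "filters" array for the requested entry
        for f in self.filters:
            if f['filterType'] == ftype:
                return Decimal(f[key])
        raise KeyError(ftype)

    @property
    def price_tick(self) -> Decimal:
        return self._filter_value('PRICE_FILTER', 'tickSize')

    @property
    def size_tick(self) -> Decimal:
        return self._filter_value('LOT_SIZE', 'stepSize')
```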
Accept a msg with any of:
- `.src: Asset` and `.dst: Asset`
- `.src: str` and `.dst: str`
- `.src: Asset` and `.dst: str`
but not the final combo (`.src: str` with `.dst: Asset`) tho XD
Also, fix `.key` to properly cast any `.src: Asset` to string!
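A stand-in sketch (not the real `piker` types) of that `.key` fix, with the
union annotations loosely modeling the accepted msg combos:

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Asset:
    name: str

    def __str__(self) -> str:
        return self.name


@dataclass(frozen=True)
class MktPair:
    # either side may arrive as an `Asset` or a plain `str`
    dst: Asset | str
    src: Asset | str = ''

    @property
    def key(self) -> str:
        # explicit str-cast so an `Asset` valued `.src` renders properly
        return f'{str(self.dst)}{str(self.src)}'.lower()
```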
If the user has loaded from a flex report then we don't want the API records
from the same period to override those; instead just update with any
missing fields from the API schema.
Also, always `str`-ify the contract id (what is set for the `.bs_mktid`)
*before* packing into the transaction type to ensure there are no
discrepancies at the codec level when serialized to `pps.toml`.. smh
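A sketch of that merge rule with illustrative record names (flex-loaded
values win, API values only fill gaps, and the contract id gets str-ified
up front):

```python
def merge_trade_record(flex_rec: dict | None, api_rec: dict) -> dict:
    # always str-ify the contract id used as the `.bs_mktid`
    api_rec['conid'] = str(api_rec['conid'])

    if flex_rec is None:
        return api_rec

    # don't override flex-loaded fields, only add what's missing
    return {**api_rec, **flex_rec}
```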
Instead adjust `load_aio_clients()` to only reload clients detected as
non-loaded or disconnected (2 birds), and avoid use of the global module
table which could leave stale disconnected clients persisting across
multiple `brokerd` client reconnects, resulting in errors.
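The reload check boils down to something like this (assuming the backend
client wraps an `ib_insync.IB` instance at `.ib`):

```python
def needs_reload(client) -> bool:
    # rebuild only clients that were never loaded or whose underlying
    # connection has dropped; everything else is reused as-is.
    return (
        client is None
        or not client.ib.isConnected()
    )
```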
To make nested `msgspec.Struct`s work we need to tell the codec that the
`.symbol` is some struct def; since we don't really need to enforce that
(yet), we're just going to enc/dec it as `str` until we further formalize
and/or need something more complex.
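A minimal sketch of the interim approach using a hypothetical `PosMsg` struct
(not piker's actual msg types): the nested `.symbol` is simply declared as
`str` so the codec round-trips it verbatim:

```python
import msgspec


class PosMsg(msgspec.Struct):
    # really a mkt/symbol struct but enc/dec'd as a plain `str` for now
    symbol: str
    size: float
    avg_price: float


msg = PosMsg(symbol='xbtusdt.kraken', size=1.0, avg_price=20_000.0)
wire = msgspec.msgpack.encode(msg)
assert msgspec.msgpack.decode(wire, type=PosMsg) == msg
```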