Compare commits

...

147 Commits

Author SHA1 Message Date
Tyler Goodlet f30d866710 Rename fqsn -> fqme in feeds tests 2023-04-14 16:00:40 -04:00
Tyler Goodlet 1bd4fd80f8 `ib`: rejects their own fractional size tick..
Frickin ib, they give you the `0.001` (or wtv) in the
`ContractDetails.minSize: float` but won't accept fractional sizes
through the API.. Either way, it's probably not sane to support
fractional order sizes for legacy instruments by default, especially
since in theory it affects a lot of the clearing outcomes by having ib
do wtv magical junk behind the scenes to make it work..
2023-04-14 16:00:40 -04:00
Tyler Goodlet 65e42b79ac Rename ems test mod 2023-04-14 16:00:40 -04:00
Tyler Goodlet 2f5c456d4b More explicit test mod docstring 2023-04-14 16:00:40 -04:00
Tyler Goodlet 7786ffa889 Another fqsn -> fqme rename 2023-04-14 16:00:40 -04:00
Tyler Goodlet 3edc95dda0 Quantize order prices prior to `OrderClient.send()`
Order mode was previously just willy-nilly sending `float` prices
(particularly on order edits) which are generated from the associated
level line. This now uses the `MktPair.price_tick: Decimal` to
ensure the value is rounded correctly before submission to the ems..

Also adjusts the order mode init to expect a table of tables of startup
position messages, with the inner table being keyed by fqme per msg.
2023-04-14 16:00:40 -04:00
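
For reference, a minimal sketch (helper name and values are illustrative,
not the actual order-mode code) of the tick-quantization described above:

    from decimal import Decimal, ROUND_HALF_EVEN

    def quantize_price(price: float, price_tick: Decimal) -> Decimal:
        # snap a raw float onto the market's minimum price increment
        return Decimal(str(price)).quantize(
            price_tick,
            rounding=ROUND_HALF_EVEN,
        )

    assert quantize_price(100.12345, Decimal('0.01')) == Decimal('100.12')
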
Tyler Goodlet ecd7500ee6 Link `tractor` debug mode to `pytest` --pdb flag 2023-04-14 16:00:40 -04:00
Tyler Goodlet 114e7660aa Fix bad-fqme test, adjust prices based on buy/sell 2023-04-14 16:00:40 -04:00
Tyler Goodlet d29e9eeb31 Only flip size sign for sells if not already -ve 2023-04-14 16:00:40 -04:00
Tyler Goodlet 69d6e9bb4e Fix zero-pp entry to toml case for new file-per-account format 2023-04-14 16:00:40 -04:00
Tyler Goodlet 81e9a990bf Better EMS client-side msg formatting 2023-04-14 16:00:40 -04:00
Tyler Goodlet be90fb458f Rewrite order ctl tests as a parametrization
More or less a complete rework which allows passing a detailed
clearing/fills input and allows for *not* rebooting the runtime / ems
between each position check.

Some further enhancements:
- use (unit) fractional sizes to simulate both the more realistic and
  more "complex position calculation" case; since this is crypto.
- add a no-fqme-found test.
- factor cross-session/offline pos storage (pps.toml) checks into
  a `load_and_check_pos()` helper which does all entry loading directly
  from a provided `BrokerdPosition` msg.
- use the new `OrderClient.send()` async api.
2023-04-14 16:00:40 -04:00
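
A hedged sketch of the parametrized style described above (fixture and
case names are illustrative, not the actual test code):

    import pytest

    @pytest.mark.parametrize(
        'fqme, fills',
        [
            # fractional sizes simulate the "complex position calc" case
            ('xbtusdt.kraken', [('buy', 0.5), ('sell', 0.25)]),
            ('xdgusdt.kraken', [('buy', 1.5), ('sell', 1.5)]),
        ],
    )
    def test_pos_after_fills(fqme: str, fills: list):
        # one runtime/ems boot can now serve many clearing cases..
        for action, size in fills:
            ...
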
Tyler Goodlet 0c03434a15 `binance`: add startup caching info log msg 2023-04-14 16:00:40 -04:00
Tyler Goodlet 6163067b38 Pack startup pps into a table keyed by fqmes 2023-04-14 16:00:40 -04:00
Tyler Goodlet ac6eac046a `order_mode`: broad rename book -> client 2023-04-14 16:00:40 -04:00
Tyler Goodlet 13123b71ac Drop old blessings code, general cleanups 2023-04-14 16:00:40 -04:00
Tyler Goodlet 3937059cf0 paper-eng: close context and terminate actor on exit 2023-04-14 16:00:40 -04:00
Tyler Goodlet 49bd03c2b6 `ledger` cli: dump colored summary lines to console
Tried a couple libs and ended up sticking with `rich` (since it's the
sibling lib to `typer`) but also (initially) implemented a version with
`blessings` that I ended up commenting out (and will likely remove).

Adjusted the CLI I/O a slight bit as well:
- require a fully qualified account name of the form:
  `<brokername>.<accountname>` and error on non-matching input.
- dump position summary lines as humanized size, ppu and cost basis
  values per line.
2023-04-14 16:00:40 -04:00
Tyler Goodlet 4aad380267 paper: on no input fqme, load all mktinfos from pos table 2023-04-14 16:00:40 -04:00
Tyler Goodlet 0251225951 `pprint.pformat()` IB position mismatch log msgs 2023-04-14 16:00:40 -04:00
Tyler Goodlet a75aa5e461 Use `force_mkt` override in paper pps updates
When processing paper trades ledgers we normally won't have specific
`MktPair` info for the backend market we're simulating; as such we
need to look up this info when updating pps.toml files such that we
get precision info correct (particularly in the case of cryptos!) and
can also run paper ledger processing without running the simulated
clearing loop. To make that happen we look up any `get_mkt_info()`
ep on the backend and pass the output to the `force_mkt` input of the
`PpTable.update_from_trans()` method.
2023-04-14 16:00:40 -04:00
Tyler Goodlet a5748e1a67 `binance`: add `get_mkt_info()` ep 2023-04-14 16:00:40 -04:00
Tyler Goodlet 76222f85c8 `kraken`: add module level `get_mkt_info()`
This will (likely) act as a new backend query endpoint for other `piker`
(client) code to look up `MktPair` info from each backend. To start, it
also returns the backend-broker's local `Pair` (or wtv other type) as
well.

The main motivation for this is for our paper engine which can require
the mkt info when processing paper-trades ledgers which do not contain
appropriate info to compute position metrics.
2023-04-14 16:00:40 -04:00
Tyler Goodlet ba901e6ef5 kraken: drop console setup, now done during brokerd init 2023-04-14 16:00:40 -04:00
Tyler Goodlet d4af6589af kraken: rename `Client._atable` -> `_altnames` 2023-04-14 16:00:40 -04:00
Tyler Goodlet 89160e6a03 Only log about pps once in order mode code 2023-04-14 16:00:40 -04:00
Tyler Goodlet d5dacfc7bd Another `@acm` in `._cacheables` XD 2023-04-14 16:00:40 -04:00
Tyler Goodlet ffea96f1b5 Make default order size decimal 2023-04-14 16:00:40 -04:00
Tyler Goodlet adb11fb778 Drop `"<broker>.<account>.."` from pps.toml entries
Add special blocks to handle removing the broker account levels from
both writing and reading routines.
2023-04-14 16:00:06 -04:00
Tyler Goodlet 1408ba80aa paper: always sync pps.toml state on startup 2023-04-14 16:00:06 -04:00
Tyler Goodlet e98ef3c512 Tweak ems msg-received log msg 2023-04-14 16:00:06 -04:00
Tyler Goodlet a05beab414 Drop `loglevel` from `spawn_args` inputs to `maybe_spawn_daemon()` 2023-04-14 16:00:06 -04:00
Tyler Goodlet e2adfae54c Use `--pdb` flag to config `brokerd` debug mode 2023-04-14 16:00:06 -04:00
Tyler Goodlet 64ca507196 `kraken`: handle ws connection startup status msgs 2023-04-14 16:00:06 -04:00
Tyler Goodlet 1e6e8ddf2e Drop masked `MktPair.size_tick_digits()` cruft 2023-04-14 16:00:06 -04:00
Tyler Goodlet 44ff2fd60f paper engine: use the `fqme` for the `bs_mktid`
Instead of stripping the broker part just use the full fqme for all
`Transaction.bs_mktid: str` values since it makes indexing the `PpTable`
much easier with less key mangling..
2023-04-14 16:00:06 -04:00
Tyler Goodlet c44627ab52 Cancel the `OrderClient` sync-method relay task on exit 2023-04-14 16:00:06 -04:00
Tyler Goodlet 9c5d6d2592 Set `emsd` log level and clearly report startup pps
Change the root-service-task entrypoint to accept the level and
set up a console log as is now expected for all sub-services. Cast all
backend delivered startup `BrokerdPosition` msgs and log them to
console.
2023-04-14 15:59:04 -04:00
Tyler Goodlet 5235cb5bfe Expect `loglevel: str` in brokerd root task ep
Set the level right after spawn and once for the lifetime of the daemon.
2023-04-14 15:59:04 -04:00
Tyler Goodlet 72c98af1d1 Always pass `loglevel: str` to daemon root task eps
If you want a sub-actor to write console logs (with the right level) the
`get_console_log()` call has to be made somewhere during service task
startup. Previously this wasn't well formalized nor used (depending on
the daemon), so passing `loglevel` to the service's root-task-endpoint (eg.
`_setup_persistent_brokerd()`) ensures that the daemon's logging is
configured during init according to the spawning actor's requested
config. The previous `get_console_log()` call happening inside
`maybe_spawn_daemon()` wasn't actually doing anything in the target
daemon XD, so obviously remove that and instead pass `loglevel` through
to the ctx endpoints and service manager methods.
2023-04-14 15:59:04 -04:00
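
In sketch form (signatures are assumptions; only the names come from the
message above), the pattern is to configure logging inside the daemon's
own root task:

    from piker.log import get_console_log  # real helper; import path assumed

    async def _setup_persistent_brokerd(
        ctx,  # tractor context
        brokername: str,
        loglevel: str | None = None,
    ) -> None:
        # must run *inside* the target daemon for its logs to show!
        get_console_log(loglevel)
        ...
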
Tyler Goodlet 6e9d3fb66d Expose `piker.clearing.OrderClient` 2023-04-14 15:59:04 -04:00
Tyler Goodlet e0f502507d WIP complete rework of paper engine tests
More or less we need to be able to audit not only the simple "make
trades, check pps.toml files" tests (which btw were great to get
started!).

We also need more sophisticated and granular order mgmt and service
config scenarios:

- full e2e EMS msg flow verification
- multi-client (dis)connection scenarios and/or monitoring
- dark order clearing and offline storage
- accounting schema and position calcs detailing

As such, this is the beginning of "modularlizingz" the components needed
in the test harness to this end by breaking up the `OrderClient` control
flows vs. position checking logic so as to allow for more flexible test
scenario cases and likely `pytest` parametrizations over different
transaction sequences.
2023-04-14 15:59:04 -04:00
Tyler Goodlet 0de2101eb4 Ensure we set the test config dir in the root actor..
Not sure how this worked before but we need to also override the
`piker._config_dir: Path` in the root actor when running in `pytest`; my
guess is something in the old test suite was masking this problem after
the change to passing the dir path down through the runtime vars via
`tractor`?

Also this drops the ems related fixtures/factories since they're
specific enough to define in the clearing engine tests directly.
2023-04-14 15:59:04 -04:00
Tyler Goodlet 426f3ff38b ib: lul, fix oil (cl) venue to correctly be nymex.. 2023-04-14 15:59:04 -04:00
Tyler Goodlet c26dc35308 Adjust tests to `.clearing._client.OrderClient` type 2023-04-14 15:59:04 -04:00
Tyler Goodlet 8efb9ece61 ib: maybe incr client id; can't catch api errors..
Turns out we don't hook up our eventkit handler until after
`load_aio_clients()` is complete, which means we can't get
`ib_insync.Client.apiError` events unless inside the asyncio side task.
So I guess try to report any such errors during API scan (note the
duplicate client id case is a special one from ibis itself) even though
we're not going to catch them trio side. The hack to work around this is
to just increment the client id value with the `connect_retries` led `i`
value even though that will break on more than 3 clients attached to an
API endpoint lul ..

Further adjustments made while trying to fix this properly:
- add `remove_handler_on_err()` cm to disconnect a handler when the trio
  side of the channel closes.
- actually connect to client api errors in our `Client.inline_errors()`
- increase connect timeout to a sec.
- change the trio-asyncio proxy response-msg loop over to `match:`
  syntax and raise on unhandled msgs from eventkit handlers.
2023-04-14 15:59:04 -04:00
Tyler Goodlet ffa3e80503 Detail `pikerd` sock bind collision in error 2023-04-14 15:59:04 -04:00
Tyler Goodlet 1cc37673a4 `ib`: drop pp mismatch err block, we already do it in audit routine 2023-04-14 15:58:54 -04:00
Tyler Goodlet b8d7c05d74 Async-ify order client methods and some renaming
We previously only offered a sync API (which was recently renamed to
`.<meth>_nowait()` style) since initially all order control was from our
`OrderMode` Qt driven UI/UX. This adds the equivalent async methods for
both testing as well as eventual auto-strat driven control B)

Also includes a bunch of renaming:
- `OrderBook` -> `OrderClient`.
- better internal renaming of the client's mem chan vars and add a ref
  `._ems_stream: tractor.MsgStream`.
- drop `get_orders()` factory, just always check for the actor-global
  instance and always set the ems stream on that client (in case old one
  was closed).
2023-04-14 15:58:54 -04:00
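
In rough sketch form (exact signatures are assumptions), the client now
offers both styles:

    # sync, fire-and-forget style for the Qt UI
    client.send_nowait(order)

    # new async api for tests and eventual auto-strats
    await client.send(order)
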
Tyler Goodlet 5cb63a67e1 `kraken`: write ledger and pps files on startup 2023-04-14 15:58:54 -04:00
Tyler Goodlet fb90c2049f Rework paper engine for "offline" pp loading
This will end up being super handy for testing our accounting subsystems
as well as providing unified and simple cli utils for managing ledgers
and position tracking. Allows loading the paper boi without starting
a data feed; instead just trigger ledger and pps loading without
booting the entire clearing engine.

Deatz:
- only init `PaperBoi` and start the clearing loop (tasks) if a non-`None`
  fqme is provided, otherwise just `Context.started()` the existing pps msgs
  as loaded from the ledger.
- always update both the ledger and pp table on startup and pass
  a single instance of each obj to the `PaperBoi` for reuse (without
  opening and closing backing config files since we now have
  `.write_config()`).
- drop the global `_positions` dict, it's not needed any more if we use
  a `PaperBoi.ppt: PpTable` which persists with the engine actor's
  lifetime.
2023-04-14 15:58:54 -04:00
Tyler Goodlet f46a5337d5 Convert `Flume.MktPair.size_tick` to float for dark clearing 2023-04-14 15:58:54 -04:00
Tyler Goodlet 5c61055411 Add paper engine "offline loading" support to the ledger cli 2023-04-14 15:58:54 -04:00
Tyler Goodlet 2f31e40d3b Formalize a ledger type + API: `TransactionLedger`
Add a new `class TransactionLedger(collections.UserDict)` for managing
ledger (files) from a `dict`-like API. The main motivations being easy
conversion between `dict` <-> `Transaction` obj forms as well as dynamic
(toml) file updates via a set of methods:

- `.write_config()` to render and write state to the local toml file.
- `.iter_trans()` to allow iterator style conversion to `Transaction`
  form for each entry.
- `.to_trans()` for the dict output from the above.

Some adjustments to `Transaction`, namely making `.sym/.sys` optional for
now so that paper engine entries can be loaded (offline) without
connecting to the emulated broker backend. Move to using `pathlib.Path`
throughout for bootyful toml file mgmt B)
2023-04-14 15:58:54 -04:00
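
A hedged usage sketch of the dict-like ledger API described above (the
account name is illustrative):

    with open_trade_ledger('paper', 'default') as ledger:
        # iterate entries in `Transaction` form
        for tid, tx in ledger.iter_trans():
            print(tid, tx.fqme, tx.size, tx.price)

        # render and write current state to the backing TOML file
        ledger.write_config()
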
Tyler Goodlet 055025c64c Always use the "most resolved" `Position.symbol: MktPair`
When loading a `Position` from a pps file we might not have the entire
`MktPair` field-set loaded (though going forward that shouldn't really
ever happen except in the case of a legacy `pps.toml`), in which case we
can check if the `.fqme: str` value loaded from the transaction is
longer and use that instead - presuming it must have more mkt meta-data
filled out.

Also includes some more `fqsn` -> `fqme` renames.
2023-04-14 15:58:54 -04:00
Tyler Goodlet 4631af60e9 `ib`: keep broker name in `Transaction.fqsn` 2023-04-14 15:58:54 -04:00
Tyler Goodlet ba31f2dfb3 `ib`: move flex utils to new submod 2023-04-14 15:58:54 -04:00
Tyler Goodlet 926373dd33 `ib`: again, only *update* ledger records from API 2023-04-14 15:58:54 -04:00
Tyler Goodlet 0ea806c7bd Flatter format for pos/ledger mngr statements 2023-04-14 15:58:54 -04:00
Tyler Goodlet 68437fb5af Write a separate `pps.<brokername>.<accountname>.toml` file per account 2023-04-14 15:58:53 -04:00
Tyler Goodlet 14418b4c5c Rework `.config` routines to use `pathlib.Path`
Been meaning to do this port for a while: it makes passing
around file handles (presumably alongside the in mem obj form) a lot
simpler/nicer and the implementations of all the config file handling
much more terse, with fewer presumptions about the form of filename/dir
`str` values all over the place B)

moar technically, lets us:
- drop remaining `.config` usage of `os.path`.
- return `Path`s from most routines.
- adds a special case to `get_conf_path()` such that if the input name
  contains a `pps.` pattern, we avoid validating the name; this is going
  to be used by new `.accounting.open_pps()` code which will instead
  write a separate TOML file for each account B)
2023-04-14 15:58:53 -04:00
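
A minimal sketch of the `pathlib.Path` style now used by the config
routines (paths are illustrative, not the real platform-specific config
dir):

    from pathlib import Path

    config_dir: Path = Path('~/.config/piker').expanduser()
    ledgers: Path = config_dir / 'ledgers'
    if not ledgers.is_dir():
        ledgers.mkdir()

    # per-account pps file names of the new form
    conf_path: Path = config_dir / 'pps.kraken.default.toml'
    if not conf_path.is_file():
        conf_path.touch()
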
Tyler Goodlet 10ae9106e2 Move `.clearing._allocate` -> `accounting._allocate` 2023-04-14 15:58:53 -04:00
Tyler Goodlet d644436a3c Drop `Optional` use from daemon mod 2023-04-14 15:58:53 -04:00
Tyler Goodlet 7c3418def6 Use our `@acm` alias in paper eng 2023-04-14 15:58:53 -04:00
Tyler Goodlet c3686185c1 `ib`: only process ledger-txs once per client
Previously we were re-processing all ledgers for every position msg
received from the API, per client.. Instead do that once in a first pass
and drop all key-miss lookups for `bs_mktid`s; those should never happen.

Better typing for in-routine vars, convert pos msg/objects to `dict`
prior to logging so it's sane to read on console. Skip processing
specifically option contracts for now.
2023-04-14 15:58:53 -04:00
Tyler Goodlet f0d181e3f7 `ib`: break up data vs. broker enabled modules 2023-04-14 15:58:53 -04:00
Tyler Goodlet 312c4cdec7 First working `brokerd` -> `trades_dialogue()` ep loader 2023-04-14 15:58:53 -04:00
Tyler Goodlet f234483a1f `binance`: adjust search to expect `Pair`s 2023-04-14 15:58:53 -04:00
Tyler Goodlet 3c32c9b3c9 Drop `cryptofeed`, what a mess XD 2023-04-14 15:58:53 -04:00
Tyler Goodlet 2675039b16 WIP: trying out `typer` for ledger cli 2023-04-14 15:58:53 -04:00
Tyler Goodlet 5919d75e85 Drop weird extra line from license headers 2023-04-14 15:58:53 -04:00
Tyler Goodlet 62a40c57a0 `binance`: pre-process `Pair` filters at init
Allows us to keep the struct frozen as well as avoid complexity in the pure
data type. Also changes `.price/size_tick` to plain ol' properties.
2023-04-14 15:58:53 -04:00
Tyler Goodlet a50452dbfd `binance`: use `MktPair` in live feed setup
Turns out `binance` is pretty great with their schema  since they have
more or less the same data schema for their exchange info ep which we
wrap in a `Pair` struct:
https://binance-docs.github.io/apidocs/spot/en/#exchange-information

That makes it super easy to provide the most general case for filling
out a `MktPair` with both `.src/dst: Asset` to maintain maximum
meta-data B)

Deatz:
- adjust `Pair` to have `.size/price_tick: Decimal` by parsing out
  the values from the filters field; TODO: we should probably just rewrite
  the input `.filter` at init time so we can keep the frozen style.
- rename `Client.mkt_info()` (was `.symbol_info`) to `.exch_info()`,
  better matching the ep name, and have it build, cache, and return
  a `dict[str, Pair]`; allows dropping `.cache_symbols()`.
- only pass the `mkt_info: MktPair` field in the init msg!
2023-04-14 15:58:53 -04:00
Tyler Goodlet 2142c13228 Generalize `MktPair.from_msg()` handling
Accept a msg with any of:
- `.src: Asset` and `.dst: Asset`
- `.src: str` and `.dst: str`
- `.src: Asset` and `.dst: str`

but not the final combo tho XD
Also, fix `.key` to properly cast any `.src: Asset` to string!
2023-04-14 15:58:53 -04:00
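
A sketch of one accepted msg shape (field values are illustrative):

    MktPair.from_msg({
        'dst': {  # an `Asset`-able dict..
            'name': 'xbt',
            'atype': 'crypto',
            'tx_tick': '0.00000001',
        },
        'src': 'usd',  # ..while `.src` may stay a plain `str`
        'price_tick': '0.1',
        'size_tick': '0.00000001',
        'bs_mktid': 'XXBTZUSD',
        'broker': 'kraken',
    })
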
Tyler Goodlet 4236e5c3b1 `ib`: never override existing ledger records
If the user has loaded from a flex report then we don't want the API
records from the same period to override those; instead just update with
any missing fields from the API schema.

Also, always `str`-ify the contract id (which is what's set for the
`.bs_mktid`) *before* packing into the transaction type, to ensure that
when serialized to `pps.toml` there are no discrepancies at the codec
level.. smh
2023-04-14 15:58:53 -04:00
Tyler Goodlet 366de901df `ib`: drop use of `_account2clients` in `load_clients_for_trio()`
Instead adjust `load_aio_clients()` to only reload clients detected as
non-loaded or disconnected (2 birds), and avoid use of the global module
table, which could leave stale disconnected clients persisting across
multiple `brokerd` client reconnects, resulting in errors.
2023-04-14 15:58:53 -04:00
Tyler Goodlet 8ee3fc4aa5 Move toml table decoder to separate mod 2023-04-14 15:58:53 -04:00
Tyler Goodlet d9344aa468 `ib`: stick exc handler around client connection errors 2023-04-14 15:58:53 -04:00
Tyler Goodlet 49958e68ea `kraken`: heh, use `trio_util` for trades streamz tooo XD 2023-04-14 15:58:53 -04:00
Tyler Goodlet 116f7fd40f WIP: refactor ib pp load init 2023-04-14 15:58:53 -04:00
Tyler Goodlet 8d7b968c44 Cache contract lookups from `Client.get_con()` 2023-04-14 15:58:53 -04:00
Tyler Goodlet 5f89ec4feb Dump `Position`s as pformatted dicts for now.. 2023-04-14 15:58:53 -04:00
Tyler Goodlet f783d3eba3 Use common `.brokers` logger in most backends 2023-04-14 15:58:53 -04:00
Tyler Goodlet a72a9e76e9 Add common logger instance for `.brokers` 2023-04-14 15:58:53 -04:00
Tyler Goodlet c412197413 Use a single log for entire `.service` subsys 2023-04-14 15:58:53 -04:00
Tyler Goodlet 0930074e76 Use a single log for entire `.clearing` subsys 2023-04-14 15:58:53 -04:00
Tyler Goodlet ca752bea8c Use `MktPair` attr `.size_tick` in charting 2023-04-14 15:58:53 -04:00
Tyler Goodlet 963523359c Use `Struct.copy()` with update dict for `Order` from staged 2023-04-14 15:58:53 -04:00
Tyler Goodlet 3db9854271 Rename `Client.send_update()` -> `.update_nowait()` 2023-04-14 15:58:53 -04:00
Tyler Goodlet b1e241781e Use `str(cmd.symbol)` for fqme on cancels, add `_nowait()` method names 2023-04-14 15:58:53 -04:00
Tyler Goodlet 71019ad54b Add `.__str__()` to mktpair and symbol types, fix `MktPair.fqme` token order 2023-04-14 15:58:53 -04:00
Tyler Goodlet 2a1d485cd1 Rename `fqsn` -> `fqme` in paper engine 2023-04-14 15:58:53 -04:00
Tyler Goodlet f7f2d1247b Drop more `Optional` usage on our `Struct` 2023-04-14 15:58:53 -04:00
Tyler Goodlet 8594a39c61 `kraken`: finally, use new `MktPair` in `'mkt_info'` init msg field! 2023-04-14 15:58:53 -04:00
Tyler Goodlet 08b4d69a91 Drop use of legacy `Symbol.broker_info` in display startup 2023-04-14 15:58:53 -04:00
Tyler Goodlet 0aacb835dc Typecast `OrderMode.staged.symbol: str` before `.copy()`! 2023-04-14 15:58:53 -04:00
Tyler Goodlet 4447f45430 `kraken`: parse our source asset key and set on `MktPair.src: str` 2023-04-14 15:58:53 -04:00
Tyler Goodlet 0e0338e217 Always cast `Order.symbol: str` for now
To make nested `msgspec.Struct`s work we need to tell the codec that the
`.symbol` is some struct def; since we don't really need to enforce that
(yet), we're just going to enc/dec as `str` until we further formalize
and/or need something more complex.
2023-04-14 15:58:53 -04:00
Tyler Goodlet ef2d0c46d4 Expect new `MktPair.tick_size: Decimal` attr in ems 2023-04-14 15:58:53 -04:00
Tyler Goodlet 986bb4e7c8 Use `MktPair` for `Flume.symbol` when used by backend
Initial attempt at getting the sampling and shm layer to use the new mkt
info meta-data type. Draft out a potential `BackendInitMsg:
msgspec.Struct` for validating the init msg returned from the
`stream_quotes()` start value; obvs don't actually use it yet.
2023-04-14 15:58:53 -04:00
Tyler Goodlet 76fe9018cf `.clearing`: broad rename of `fqsn` -> `fqme` 2023-04-14 15:58:53 -04:00
Tyler Goodlet d7c1e5e188 Add `MktPair.suffix: str` read from contract info
To be compat with the `Symbol` (for now) and generally allow for reading
the (derivative) contract specific part of the fqme. Adjust
`contract_info: list[str]` and make `src: str = ''` by default.
2023-04-14 15:58:53 -04:00
Tyler Goodlet 52de60c7ee Optionally load `MktPair` in `Flume`s 2023-04-14 15:58:47 -04:00
Tyler Goodlet 8b0aead72e First stage port of `.data.feed` to `MktPair`
Add `MktPair` handling block for when a backend delivers
a `mkt_info`-field containing init msg. Adjust the original
`Symbol`-style `'symbol_info'` msg processing to do `Decimal` defaults
and convert to `MktPair` including slapping in a hacky `_atype: str`
field XD

General initial name changes to `bs_mktid` and `_fqme` throughout!
2023-04-14 15:58:47 -04:00
Tyler Goodlet da7371988a Comment about `Struct.typecast()` conflict with frozen instances 2023-04-14 15:58:47 -04:00
Tyler Goodlet 0866d46484 Default `pps.toml` precision fields to `Decimal`
For `price_tick` and `size_tick` we now read in `str`s and decimal-ize
them, correctly failing over to default values of the same type..
Also, always treat the `bs_mktid` field as a `str` in TOML form.

Drop the strange `clears: dict` var from the loading code (not sure why
that was left in smh) and better name `toml_clears_list` for the
TOML-loaded-pre-transaction sequence.
2023-04-14 15:58:47 -04:00
Tyler Goodlet 98d8b4a0e8 Implement `MktPair.from_msg()` constructor
Handle the case where the `'dst'` field is just a `str` (in which case
delegate to `.from_fqme()`) as well as do `Asset` loading, and use our
`Struct.copy()` to enforce type-casting (eg. to `Decimal`s) such
that we'll now capture typing errors despite the IPC transport.

Change `Symbol.tick_size` and `.lot_tick_size` defaults to decimal
for proper casting and type `MktPair.atype: str` since `msgspec` can't
cast to `AssetTypeName` without special handling..
2023-04-14 15:58:47 -04:00
Tyler Goodlet 901562cb6b `ib`: deliver mkt precision info as `Decimal` 2023-04-14 15:58:47 -04:00
Tyler Goodlet d628b732b7 `binance`: deliver mkt precision info as `Decimal` 2023-04-14 15:58:47 -04:00
Tyler Goodlet 2a5c13bcde Rename `float_digits()` -> `dec_digits()`, since decimal. 2023-04-14 15:58:47 -04:00
Tyler Goodlet c68f240376 Fix `Symbol.tick_size_digits`, add `.price/size_tick` props 2023-04-14 15:58:47 -04:00
Tyler Goodlet 141af47ec0 Cast to float from decimal for level line y-increment
Qt only accepts `float` to its APIs obvs..
2023-04-14 15:57:48 -04:00
Tyler Goodlet 904a73804d Add parity mapping from altnames back to themselves in `Client._ntable` 2023-04-14 15:57:48 -04:00
Tyler Goodlet 4cfe13756b Encode a `mktpair` field if passed in msg by caller 2023-04-14 15:57:48 -04:00
Tyler Goodlet bd73ec4ea4 Use `MktPair` building `Position` objects in `PpTable.update_from_trans()` 2023-04-14 15:57:48 -04:00
Tyler Goodlet 8891da2ff3 Explicitly decode tick sizes as decimal for symbol loading in `Flume` 2023-04-14 15:57:48 -04:00
Tyler Goodlet 068f5d8eab Cast back to float from decimal for cursor y-increment 2023-04-14 15:57:48 -04:00
Tyler Goodlet 4b57235bd0 Pass old fields in sym info init msg section 2023-04-14 15:57:48 -04:00
Tyler Goodlet 42f7aa994a Ensure `Symbol` tick sizes are decoded as `Decimal`.. 2023-04-14 15:57:48 -04:00
Tyler Goodlet 28f52dae93 `kraken`: use `Client.mkt_info()` in quotes feed init msg 2023-04-14 15:57:48 -04:00
Tyler Goodlet d6634e9b02 Add `MktPair._atype` for back-compat, always `str(.dst)` 2023-04-14 15:57:48 -04:00
Tyler Goodlet 94cb8fa1b1 `kraken`: use `MktPair` in transactions 2023-04-14 15:57:48 -04:00
Tyler Goodlet 204f9c49d2 `kraken`: add `Client.mkt_info()`
Allows building a `MktPair` from the backend specific `Pair` for
eventual use in the data feed layer. Also adds `Pair.price/tick_size` to
get to the expected tick precision info format.
2023-04-14 15:57:48 -04:00
Tyler Goodlet 3fccd8a67a Drop shm logging levels to debug over warning 2023-04-14 15:57:17 -04:00
Tyler Goodlet 3ead16c47c Flip to `.bs_mktid` in `ib` and `kraken` 2023-04-14 15:57:17 -04:00
Tyler Goodlet b4ab1675fc Handle read and write of `pps.toml` using `MktPair`
Add a logic branch for now that switches on an instance check.
Generally swap over all `Position.symbol` and `Transaction.sym` refs to
`MktPair`. Do a wholesale rename of all `.bsuid` var names to
`.bs_mktid`.
2023-04-14 15:57:17 -04:00
Tyler Goodlet 9ab196d778 Prep for dropping `Transaction.sym`
Instead let's name it `.sys` for "system", the thing we use to conduct
the "transactions" ..

Also rename `.bsuid` -> `.bs_mktid` for "backend system market id",
which is more explicit and easier to remember and read.
2023-04-14 15:57:17 -04:00
Tyler Goodlet e5eb317b47 Further refinement and shimming of `MktPair`
Prepping to entirely replace `Symbol`; this adds a buncha docs/comments
and a better implementation for representing and parsing the FQME: "fully
qualified market endpoint".

Deatz:
- make `.src` an optional field until we figure out how we're going
  to support loading source assets from all backends sensibly..
- implement `MktPair.fqme: str` (what was previously called `fqsn`)
  using a new util func: `maybe_cons_tokens()`.
- drop `Symbol.brokers` and expect only `.broker` usage.
- remap anything with `fqsn` in the name to `fqme` with aliases from the
  old name.
- implement `unpack_fqme()` with `match:` syntax B)
- add `MktPair.tick_size_digits`, `.lot_size_digits`, `.fqsn`, `.key` for
  backward compat.
- make all fqme generation related fields empty `str`s by default.
- add `MktPair.resolved: bool`, a flag indicating whether or not `.dst`
  is an `Asset` instance or just a string, and `.bs_mktid`, the field
  holding the "backend system market id" per broker.
2023-04-14 15:57:17 -04:00
Tyler Goodlet 2485bc803b Drop use of `mk_fqsn()` 2023-04-14 15:57:17 -04:00
Tyler Goodlet 9e336f0fc3 Drop use of `Symbol.brokers` everywhere 2023-04-14 15:57:17 -04:00
Tyler Goodlet ee4138ae01 Start to prep `Transaction` for `MktPair`.. 2023-04-14 15:56:48 -04:00
Tyler Goodlet ef915273ea Port `accounting._pos` to new `Symbol` simplifications 2023-04-14 15:56:48 -04:00
Tyler Goodlet b74c41cb77 Delegate to new `.accounting._mktinfo._derivs` from `ui._positioning` 2023-04-14 15:56:48 -04:00
Tyler Goodlet 786372618c `kraken`: write `pps.toml` on updates for now 2023-04-14 15:56:48 -04:00
Tyler Goodlet 4a2696f0ab `kraken`: pack `Asset` into local client cache
Try out using our new internal type for storing kraken's asset
info, now kept in the `Client.assets: dict[str, Asset]` table. Handle
a server error when requesting such info msgs.
2023-04-14 15:56:48 -04:00
Tyler Goodlet 8b7563488a `ib`: adjust to new simplified `Symbol`
Drop usage of removed methods and attrs and only pass in the
`.tick_size: Decimal` value during construction.
2023-04-14 15:56:48 -04:00
Tyler Goodlet 04a2ccc42c Drop `Symbol.front_fqsn()` usage from chart, fsp and clearing stuff 2023-04-14 15:56:48 -04:00
Tyler Goodlet 141d6ede9c Drop `Symbol.front_feed()` usage from order mode 2023-04-14 15:56:48 -04:00
Tyler Goodlet e9cedc6613 Simplify `Symbol` extend `MktPair`, add `Asset`
Drop everything we can in terms of methods and attrs from `Symbol`:
- kill `.tokens()`, `.front_feed()`, `.nearest_tick()`,
  `.front_fqsn()`, instead moving logic from these methods into
  dependents (and obviously removing any usage from rest of code base,
  coming in follow up commits).
- rename `.quantize_size()` -> `.quantize()`.
- re-implement `.brokers`, `.lot_size_digits`, `.tick_size_digits` as
  `@property` methods; for the latter two this lets us accept only min
  tick decimal values in the alternative constructor classmethods and
  drop the equivalent instance vars.
- map `_fqsn` related variable names to new and preferred `_fqme`.

We also juggle around some utility functions, moving limited precision
related `decimal.Decimal` routines to the top of the module and soon-to-be
legacy `fqsn` related routines to the bottom.

`MktPair` draft type extensions:
- drop requirements for `src_type`, and offer the optional `.dst_type`
  field as either a `str` or (new `typing.Literal`) `AssetTypeName`.
- define an equivalent `.quantize()` as (re)defined in `Symbol` but with
  `quantity_type: str` field which specifies whether to use the price or
  the size precision.
- add a lot more docs, a `.key` property for the "symbol" name, and a
  draft property for a `.fqme: str`.
- allow `.src` and `.dst` to be of type `str | Asset`

Add a new `Asset` to capture "things which can be used in markets and/or
transactions" XD
- defines a `.name`, an `.atype: AssetTypeName` (a financial category
  tag), a `tx_tick: Decimal` (the precision limit for transactions) and
  of course a `.quantize()` method for doing accounting arithmetic on
  a given tech stack.
- define the `atype: AssetTypeName` type as a finite set of `str`s
  expected to be used in various ways for default settings in other
  parts of the data and order control layers..
2023-04-14 15:56:48 -04:00
Tyler Goodlet c0a3c6dff7 Move all fqsn parsing and `Symbol` to new `accounting._mktinfo` 2023-04-14 15:56:48 -04:00
Tyler Goodlet 33ee647224 (u)Limit the fd allocation for java 8 runtime..
Can't believe this was actually the issue.. seriously i don't envy
jvm users.

See following issues:
- https://stackoverflow.com/a/56895801
- https://bugs.openjdk.org/browse/JDK-8150460
2023-04-14 15:56:48 -04:00
Tyler Goodlet 6a935344ca `ib`: (cukcit) just presume a stonk if we can read type from existing ledger.. 2023-04-14 15:56:48 -04:00
Tyler Goodlet a0a19a952f Break out old `.pp` components into submods: `._ledger` and `._pos` 2023-04-14 15:56:48 -04:00
Tyler Goodlet 6cb80abfc0 Start a new `.accounting` subpkg, move `.pp` contents there 2023-04-14 15:56:48 -04:00
Tyler Goodlet 3a5e788afc `kraken`: fix pos loading using `digits_to_dec()` on pair info
Our issue was not having the correct value set on each
`Symbol.lot_tick_size`.. and then doing PPU calcs with the default set
for legacy mkts..

Also,
- actually write `pps.toml` on broker mode exit.
- drop `get_likely_pair()` and import from pp module.
2023-04-14 15:56:48 -04:00
Tyler Goodlet 24fe44fb96 Add an inverse of `float_digits()`: `digits_to_dec()` 2023-04-14 15:56:48 -04:00
Tyler Goodlet b982505b43 Ensure clearing table entries are time-sorted..
Not sure how this worked before but the PPU calculation critically
requires that clearing transactions be processed in chronological
order! Fix this by sorting `trans: dict[str, Transaction]`
in the `PpTable.update_from_trans()` method.

Also, move the `get_likely_pair()` parser from the `kraken` backend here
for future use particularly when we revamp the asset-transaction
processing layer.
2023-04-14 15:56:48 -04:00
68 changed files with 3911 additions and 2197 deletions


@@ -2,8 +2,21 @@
 # https://github.com/waytrade/ib-gateway-docker/blob/master/docker-compose.yml
 version: "3.5"

 services:
   ib_gw_paper:

+    # apparently java is a mega cukc:
+    # https://stackoverflow.com/a/56895801
+    # https://bugs.openjdk.org/browse/JDK-8150460
+    ulimits:
+      # nproc: 65535
+      nproc: 6000
+      nofile:
+        soft: 2000
+        hard: 3000
+
     # other image tags available:
     # https://github.com/waytrade/ib-gateway-docker#supported-tags
     # image: waytrade/ib-gateway:981.3j


@@ -21,7 +21,7 @@ Cacheing apis and toolz.
 from collections import OrderedDict
 from contextlib import (
-    asynccontextmanager,
+    asynccontextmanager as acm,
 )

 from tractor.trionics import maybe_open_context

@@ -62,7 +62,7 @@ def async_lifo_cache(maxsize=128):
     return decorator


-@asynccontextmanager
+@acm
 async def open_cached_client(
     brokername: str,
 ) -> 'Client':  # noqa


@@ -0,0 +1,93 @@
# piker: trading gear for hackers
# Copyright (C) Tyler Goodlet (in stewardship for pikers)

# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.

# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.

# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.

'''
"Accounting for degens": count dem numberz that tracks how much you got
for tendiez.

'''
from ..log import get_logger

from ._ledger import (
    Transaction,
    TransactionLedger,
    open_trade_ledger,
)
from ._pos import (
    load_pps_from_ledger,
    open_pps,
    Position,
    PpTable,
)

log = get_logger(__name__)

__all__ = [
    'Transaction',
    'TransactionLedger',
    'open_trade_ledger',
    'PpTable',
    'open_pps',
    'load_pps_from_ledger',
    'Position',
]
def get_likely_pair(
    src: str,
    dst: str,
    bs_mktid: str,

) -> str:
    '''
    Attempt to get the likely trading pair matching a given destination
    asset `dst: str`.

    '''
    try:
        src_name_start = bs_mktid.rindex(src)
    except (
        ValueError,   # substr not found
    ):
        # TODO: handle nested positions..(i.e.
        # positions where the src fiat was used to
        # buy some other dst which was further used
        # to buy another dst..)
        log.warning(
            f'No src fiat {src} found in {bs_mktid}?'
        )
        return

    likely_dst = bs_mktid[:src_name_start]
    if likely_dst == dst:
        return bs_mktid
if __name__ == '__main__':
    import sys
    from pprint import pformat

    args = sys.argv
    assert len(args) > 1, 'Specify account(s) from `brokers.toml`'
    args = args[1:]
    for acctid in args:
        broker, name = acctid.split('.')
        trans, updated_pps = load_pps_from_ledger(broker, name)
        print(
            f'Processing transactions into pps for {broker}:{acctid}\n'
            f'{pformat(trans)}\n\n'
            f'{pformat(updated_pps)}'
        )


@@ -23,9 +23,9 @@ from typing import Optional
 from bidict import bidict

-from ..data._source import Symbol
+from ._pos import Position
+from ._mktinfo import Symbol
 from ..data.types import Struct
-from ..pp import Position

 _size_units = bidict({
@@ -206,7 +206,7 @@ class Allocator(Struct):
             symbol=sym,
             size=order_size,
             ppu=price,
-            bsuid=sym,
+            bs_mktid=sym,
         )
     )


@@ -0,0 +1,254 @@
# piker: trading gear for hackers
# Copyright (C) Tyler Goodlet (in stewardship for pikers)

# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.

# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.

# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.

'''
Trade and transaction ledger processing.

'''
from __future__ import annotations
from collections import UserDict
from contextlib import contextmanager as cm
from pathlib import Path
import time
from typing import (
    Any,
    Iterator,
    Union,
    Generator
)

from pendulum import (
    datetime,
    parse,
)
import tomli
import toml

from .. import config
from ..data.types import Struct
from ..log import get_logger
from ._mktinfo import (
    Symbol,  # legacy
    MktPair,
    Asset,
)

log = get_logger(__name__)
class Transaction(Struct, frozen=True):

    # TODO: unify this with the `MktPair`,
    # once we have that as a required field,
    # we don't really need the fqsn any more..
    fqsn: str

    tid: Union[str, int]  # unique transaction id
    size: float
    price: float
    cost: float  # commissions or other additional costs
    dt: datetime

    # TODO: we can drop this right since we
    # can instead expect the backend to provide this
    # via the `MktPair`?
    expiry: datetime | None = None

    # remap for back-compat
    @property
    def fqme(self) -> str:
        return self.fqsn

    # TODO: drop the Symbol type

    # the underlying "transaction system", normally one of a ``MktPair``
    # (a description of a tradable double auction) or a ledger-recorded
    # ("ledger" in any sense as long as you can record transfers) of any
    # sort) ``Asset``.
    sym: MktPair | Asset | Symbol | None = None

    @property
    def sys(self) -> Symbol:
        return self.sym

    # (optional) key-id defined by the broker-service backend which
    # ensures the instrument-symbol market key for this record is unique
    # in the "their backend/system" sense; i.e. this uid for the market
    # as defined (internally) in some namespace defined by the broker
    # service.
    bs_mktid: str | int | None = None

    def to_dict(self) -> dict:
        dct = super().to_dict()

        # ensure we use a pendulum formatted
        # ISO style str here!@
        dct['dt'] = str(self.dt)
        return dct
class TransactionLedger(UserDict):
    '''
    Very simple ``dict`` wrapper + ``pathlib.Path`` handle to
    a TOML formatted transaction file for enabling file writes
    dynamically whilst still looking exactly like a ``dict`` from the
    outside.

    '''
    def __init__(
        self,
        ledger_dict: dict,
        file_path: Path,

    ) -> None:
        self.file_path = file_path
        super().__init__(ledger_dict)

    def write_config(self) -> None:
        '''
        Render the self.data ledger dict to its TOML file form.

        '''
        with self.file_path.open(mode='w') as fp:
            toml.dump(self.data, fp)

    def iter_trans(
        self,
        broker: str = 'paper',

    ) -> Generator[
        tuple[str, Transaction],
        None,
        None,
    ]:
        '''
        Deliver trades records in ``(key: str, t: Transaction)``
        form via generator.

        '''
        if broker != 'paper':
            raise NotImplementedError('Per broker support not dun yet!')

        # TODO: lookup some standard normalizer
        # func in the backend?
        # from ..brokers import get_brokermod
        # mod = get_brokermod(broker)
        # trans_dict = mod.norm_trade_records(self.data)

        # NOTE: instead i propose the normalizer is
        # a one shot routine (that can be lru cached)
        # and instead call it for each entry incrementally:
        # normer = mod.norm_trade_record(txdict)

        for tid, txdict in self.data.items():
            # special field handling for datetimes
            # to ensure pendulum is used!
            fqme = txdict.get('fqme', txdict['fqsn'])
            dt = parse(txdict['dt'])
            expiry = txdict.get('expiry')

            yield (
                tid,
                Transaction(
                    fqsn=fqme,
                    tid=txdict['tid'],
                    dt=dt,
                    price=txdict['price'],
                    size=txdict['size'],
                    cost=txdict.get('cost', 0),
                    bs_mktid=txdict['bs_mktid'],

                    # optional
                    sym=None,
                    expiry=parse(expiry) if expiry else None,
                )
            )

    def to_trans(
        self,
        broker: str = 'paper',

    ) -> dict[str, Transaction]:
        '''
        Return the entire output from ``.iter_trans()`` in a ``dict``.

        '''
        return dict(self.iter_trans())
@cm
def open_trade_ledger(
    broker: str,
    account: str,

) -> Generator[dict, None, None]:
    '''
    Idempotently create and read in a trade log file from the
    ``<configuration_dir>/ledgers/`` directory.

    Files are named per broker account of the form
    ``<brokername>_<accountname>.toml``. The ``accountname`` here is the
    name as defined in the user's ``brokers.toml`` config.

    '''
    ldir: Path = config._config_dir / 'ledgers'
    if not ldir.is_dir():
        ldir.mkdir()

    fname = f'trades_{broker}_{account}.toml'
    tradesfile: Path = ldir / fname

    if not tradesfile.is_file():
        log.info(
            f'Creating new local trades ledger: {tradesfile}'
        )
        tradesfile.touch()

    with tradesfile.open(mode='rb') as cf:
        start = time.time()
        ledger_dict = tomli.load(cf)
        log.info(f'Ledger load took {time.time() - start}s')
        cpy = ledger_dict.copy()

    ledger = TransactionLedger(
        ledger_dict=cpy,
        file_path=tradesfile,
    )

    try:
        yield ledger
    finally:
        if ledger.data != ledger_dict:
            # TODO: show diff output?
            # https://stackoverflow.com/questions/12956957/print-diff-of-python-dictionaries
            log.info(f'Updating ledger for {tradesfile}:\n')
            ledger.write_config()
def iter_by_dt(
    clears: dict[str, Any],

) -> Iterator[tuple[str, dict]]:
    '''
    Iterate entries of a ``clears: dict`` table sorted by entry recorded
    datetime presumably set at the ``'dt'`` field in each entry.

    '''
    for tid, data in sorted(
        list(clears.items()),
        key=lambda item: item[1]['dt'],
    ):
        yield tid, data


@@ -0,0 +1,558 @@
# piker: trading gear for hackers
# Copyright (C) Tyler Goodlet (in stewardship for pikers)

# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.

# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.

# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.

'''
Market (pair) meta-info layer: sane addressing semantics and meta-data
for cross-provider marketplaces.

We introduce the concept of,

- a FQMA: fully qualified market address,
- a sane schema for FQMAs including derivatives,
- a msg-serializeable description of markets for
  easy sharing with other pikers B)

'''
from __future__ import annotations
from decimal import (
    Decimal,
    ROUND_HALF_EVEN,
)
from typing import (
    Any,
    Literal,
)

from ..data.types import Struct


_underlyings: list[str] = [
    'stock',
    'bond',
    'crypto',
    'fiat',
    'commodity',
]
_derivs: list[str] = [
    'swap',
    'future',
    'continuous_future',
    'option',
    'futures_option',
]

# NOTE: a tag for other subsystems to try
# and do default settings for certain things:
# - allocator does unit vs. dolla size limiting.
AssetTypeName: Literal[
    _underlyings
    +
    _derivs
]
# egs. stock, future, option, bond etc.
def dec_digits(
    value: float | str | Decimal,

) -> int:
    '''
    Return the number of precision digits read from a decimal or float
    value.

    '''
    if value == 0:
        return 0

    return int(
        -Decimal(str(value)).as_tuple().exponent
    )


float_digits = dec_digits
def digits_to_dec(
    ndigits: int,
) -> Decimal:
    '''
    Return the minimum float value for an input integer value.

    eg. 3 -> 0.001

    '''
    if ndigits == 0:
        return Decimal('0')

    return Decimal('0.' + '0'*(ndigits-1) + '1')
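
# NB: the two helpers above are inverses of each other, eg.
#   dec_digits('0.001') == 3
#   digits_to_dec(3) == Decimal('0.001')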
class Asset(Struct, frozen=True):
    '''
    Container type describing any transactable asset and its
    contract-like and/or underlying technology meta-info.

    '''
    name: str
    atype: str  # AssetTypeName

    # minimum transaction size / precision.
    # eg. for buttcoin this is a "satoshi".
    tx_tick: Decimal

    # NOTE: additional info optionally packed in by the backend, but
    # should not be explicitly required in our generic API.
    info: dict = {}  # make it frozen?

    # TODO?
    # _to_dict_skip = {'info'}

    def __str__(self) -> str:
        return self.name

    def quantize(
        self,
        size: float,
    ) -> Decimal:
        '''
        Truncate input ``size: float`` using ``Decimal``
        quantized form of the digit precision defined
        by ``self.tx_tick``.

        '''
        digits = float_digits(self.tx_tick)
        return Decimal(size).quantize(
            Decimal(f'1.{"0".ljust(digits, "0")}'),
            rounding=ROUND_HALF_EVEN
        )
def maybe_cons_tokens(
    tokens: list[Any],
    delim_char: str = '.',
) -> str:
    '''
    Construct `str` output from a maybe-concatenation of input
    sequence of elements in ``tokens``.

    '''
    return delim_char.join(filter(bool, tokens)).lower()
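
# eg. (illustrative) empty tokens are filtered out:
#   maybe_cons_tokens(['mnq/usd', '', '20230616', 'cme', 'ib'])
#     -> 'mnq/usd.20230616.cme.ib'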
class MktPair(Struct, frozen=True):
    '''
    Market description for a pair of assets which are tradeable:
    a market which enables transactions of the form,
        buy: source asset -> destination asset
        sell: destination asset -> source asset

    The main intention of this type is for a **simple** cross-asset
    venue/broker normalized description type from which all
    market-auctions can be mapped from FQME identifiers.

    TODO: our eventual target fqme format/schema is:
    <dst>/<src>.<expiry>.<con_info_1>.<con_info_2>. -> .<venue>.<broker>
                         ^ -- optional tokens ------------------------ ^

    '''
    dst: str | Asset
    # "destination asset" (name) used to buy *to*
    # (or used to sell *from*)

    price_tick: Decimal  # minimum price increment
    size_tick: Decimal   # minimum size (aka vlm) increment
    # the tick size is the number describing the smallest step in value
    # available in this market between the source and destination
    # assets.
    # https://en.wikipedia.org/wiki/Tick_size
    # https://en.wikipedia.org/wiki/Commodity_tick
    # https://en.wikipedia.org/wiki/Percentage_in_point

    # unique "broker id" since every market endpoint provider
    # has their own nomenclature and schema for market maps.
    bs_mktid: str
    broker: str  # the middle man giving access

    # NOTE: to start this field is optional but should eventually be
    # required; the reason is for backward compat since more positioning
    # calculations were not originally stored with a src asset..
    src: str | Asset = ''
    # "source asset" (name) used to buy *from*
    # (or used to sell *to*).

    venue: str = ''   # market venue provider name
    expiry: str = ''  # for derivs, expiry datetime parseable str

    # destination asset's financial type/classification name
    # NOTE: this is required for the order size allocator system,
    # since we use different default settings based on the type
    # of the destination asset, eg. futes use a units limits vs.
    # equities a $limit.
    # dst_type: AssetTypeName | None = None

    # source asset's financial type/classification name
    # TODO: is a src type required for trading?
    # there's no reason to need any more than the one-way alloc-limiter
    # config right?
    # src_type: AssetTypeName

    # for derivs, info describing contract, egs.
    # strike price, call or put, swap type, exercise model, etc.
    contract_info: list[str] | None = None

    _atype: str = ''

    # NOTE: when cast to `str` return fqme
    def __str__(self) -> str:
        return self.fqme
    @classmethod
    def from_msg(
        cls,
        msg: dict[str, Any],

    ) -> MktPair:
        '''
        Constructor for a received msg-dict normally received over IPC.

        '''
        dst_asset_msg = msg.pop('dst')
        src_asset_msg = msg.pop('src')

        if isinstance(dst_asset_msg, str):
            src: str = str(src_asset_msg)
            assert isinstance(src, str)
            return cls.from_fqme(
                dst_asset_msg,
                src=src,
                **msg,
            )

        else:
            # NOTE: we call `.copy()` here to ensure
            # type casting!
            dst = Asset(**dst_asset_msg).copy()
            if not isinstance(src_asset_msg, str):
                src = Asset(**src_asset_msg).copy()
            else:
                src = str(src_asset_msg)

        return cls(
            dst=dst,
            src=src,
            **msg,
        ).copy()
    @property
    def resolved(self) -> bool:
        return isinstance(self.dst, Asset)

    @classmethod
    def from_fqme(
        cls,
        fqme: str,
        price_tick: float | str,
        size_tick: float | str,
        bs_mktid: str,

        **kwargs,

    ) -> MktPair:

        broker, key, suffix = unpack_fqme(fqme)

        # XXX: loading from a fqme string will
        # leave this pair as "un resolved" meaning
        # we don't yet have `.dst` set as an `Asset`
        # which we expect to be filled in by some
        # backend client with access to that data-info.
        return cls(
            dst=key,  # not resolved
            price_tick=price_tick,
            size_tick=size_tick,
            bs_mktid=bs_mktid,
            broker=broker,
            **kwargs,
        ).copy()
    @property
    def key(self) -> str:
        '''
        The "endpoint key" for this market.

        Eg. mnq/usd or btc/usdt or xmr/btc

        In most other tina platforms this is referred to as the
        "symbol".

        '''
        return maybe_cons_tokens(
            [str(self.dst),
             str(self.src)],
            delim_char='',
        )

    @property
    def suffix(self) -> str:
        '''
        The "contract suffix" for this market.

        Eg. mnq/usd.20230616.cme.ib
                    ^ ----- ^
        or tsla/usd.20230324.200c.cboe.ib
                    ^ ---------- ^

        In most other tina platforms they only show you these details in
        some kinda "meta data" format, we have FQMEs so we do this up
        front and explicit.

        '''
        field_strs = [self.expiry]
        con_info = self.contract_info
        if con_info is not None:
            field_strs.extend(con_info)

        return maybe_cons_tokens(field_strs)
    # NOTE: the main idea behind an fqme is to map a "market address"
    # to some endpoint from a transaction provider (eg. a broker) such
    # that we build a table of `fqme: str -> bs_mktid: Any` where any
    # "piker market address" maps 1-to-1 to some broker trading endpoint.
    # @cached_property
    @property
    def fqme(self) -> str:
        '''
        Return the fully qualified market endpoint-address for the
        pair of transacting assets.

        fqme = "fully qualified market endpoint"

        And yes, you pronounce it colloquially as read..

        Basically the idea here is that all client code (consumers of
        piker's APIs which query the data/broker-provider agnostic
        layer(s)) should be able to tell which backend / venue /
        derivative each data feed/flow is from by an explicit string-key
        of the current form:

        <market-instrument-name>
            .<venue>
            .<expiry>
            .<derivative-suffix-info>
            .<brokerbackendname>

        eg. for an explicit daq mini futes contract: mnq.cme.20230317.ib

        TODO: I have thoughts that we should actually change this to be
        more like an "attr lookup" (like how the web should have done
        urls, but marketting peeps ruined it etc. etc.)

        <broker>.<venue>.<instrumentname>.<suffixwithmetadata>

        TODO:
        See community discussion on naming and nomenclature, order
        of addressing hierarchy, general schema, internal representation:

        https://github.com/pikers/piker/issues/467

        '''
        return maybe_cons_tokens([
            self.key,     # final "pair name" (eg. qqq[/usd], btcusdt)
            self.venue,
            self.suffix,  # includes expiry and other con info
            self.broker,
        ])

    @property
    def fqsn(self) -> str:
        return self.fqme
    def quantize(
        self,
        size: float,
        quantity_type: Literal['price', 'size'] = 'size',

    ) -> Decimal:
        '''
        Truncate input ``size: float`` using ``Decimal``
        and ``.size_tick``'s # of digits.

        '''
        match quantity_type:
            case 'price':
                digits = float_digits(self.price_tick)
            case 'size':
                digits = float_digits(self.size_tick)

        return Decimal(size).quantize(
            Decimal(f'1.{"0".ljust(digits, "0")}'),
            rounding=ROUND_HALF_EVEN
        )

    # TODO: BACKWARD COMPAT, TO REMOVE?
    @property
    def type_key(self) -> str:
        if isinstance(self.dst, Asset):
            return str(self.dst.atype)
        return self._atype

    @property
    def tick_size_digits(self) -> int:
        return float_digits(self.price_tick)

    @property
    def lot_size_digits(self) -> int:
        return float_digits(self.size_tick)
def unpack_fqme(
    fqme: str,
) -> tuple[str, str, str]:
    '''
    Unpack a fully-qualified-symbol-name to ``tuple``.

    '''
    venue = ''
    suffix = ''

    # TODO: probably reverse the order of all this XD
    tokens = fqme.split('.')
    match tokens:
        case [mkt_ep, broker]:
            # probably crypto
            # mkt_ep, broker = tokens
            return (
                broker,
                mkt_ep,
                '',
            )

        # TODO: swap venue and suffix/deriv-info here?
        case [mkt_ep, venue, suffix, broker]:
            pass

        case [mkt_ep, venue, broker]:
            suffix = ''

        case _:
            raise ValueError(f'Invalid fqme: {fqme}')

    return (
        broker,
        '.'.join([mkt_ep, venue]),
        suffix,
    )


unpack_fqsn = unpack_fqme
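
# eg. (illustrative) unpackings per the `match:` cases above:
#   unpack_fqme('mnq.cme.20230317.ib') -> ('ib', 'mnq.cme', '20230317')
#   unpack_fqme('xbtusdt.kraken') -> ('kraken', 'xbtusdt', '')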
class Symbol(Struct):
    '''
    I guess this is some kinda container thing for dealing with
    all the different meta-data formats from brokers?

    '''
    key: str

    # precision descriptors for price and vlm
    tick_size: Decimal = Decimal('0.01')
    lot_tick_size: Decimal = Decimal('0.0')

    suffix: str = ''
    broker_info: dict[str, dict[str, Any]] = {}

    @classmethod
    def from_fqsn(
        cls,
        fqsn: str,
        info: dict[str, Any],

    ) -> Symbol:
        broker, key, suffix = unpack_fqsn(fqsn)
        tick_size = info.get('price_tick_size', 0.01)
        lot_size = info.get('lot_tick_size', 0.0)

        return Symbol(
            key=key,
            tick_size=tick_size,
            lot_tick_size=lot_size,
            suffix=suffix,
            broker_info={broker: info},
        )

    # compat name mapping
    from_fqme = from_fqsn

    @property
    def type_key(self) -> str:
        return list(self.broker_info.values())[0]['asset_type']

    @property
    def tick_size_digits(self) -> int:
        return float_digits(self.tick_size)

    @property
    def lot_size_digits(self) -> int:
        return float_digits(self.lot_tick_size)

    @property
    def price_tick(self) -> Decimal:
        return Decimal(str(self.tick_size))

    @property
    def size_tick(self) -> Decimal:
        return Decimal(str(self.lot_tick_size))

    @property
    def broker(self) -> str:
        return list(self.broker_info.keys())[0]

    @property
    def fqsn(self) -> str:
        broker = self.broker
        key = self.key
        if self.suffix:
            tokens = (key, self.suffix, broker)
        else:
            tokens = (key, broker)

        return '.'.join(tokens).lower()

    fqme = fqsn

    def quantize(
        self,
        size: float,
    ) -> Decimal:
        digits = float_digits(self.lot_tick_size)
        return Decimal(size).quantize(
            Decimal(f'1.{"0".ljust(digits, "0")}'),
            rounding=ROUND_HALF_EVEN
        )

    # NOTE: when cast to `str` return fqme
    def __str__(self) -> str:
        return self.fqme


@@ -12,137 +12,62 @@
 # GNU Affero General Public License for more details.

 # You should have received a copy of the GNU Affero General Public License
 # along with this program. If not, see <https://www.gnu.org/licenses/>.

 '''
 Personal/Private position parsing, calculating, summarizing in a way
 that doesn't try to cuk most humans who prefer to not lose their moneys..
 (looking at you `ib` and dirt-bird friends)

 '''
 from __future__ import annotations
 from contextlib import contextmanager as cm
-from pprint import pformat
-import os
-from os import path
+from decimal import Decimal
 from math import copysign
 import re
-import time
+from pprint import pformat
+from pathlib import Path
 from typing import (
     Any,
     Iterator,
     Optional,
     Union,
     Generator
 )

 import pendulum
 from pendulum import datetime, now
-import tomli
-import toml

-from . import config
-from .brokers import get_brokermod
-from .clearing._messages import BrokerdPosition, Status
-from .data._source import Symbol, unpack_fqsn
-from .log import get_logger
-from .data.types import Struct
+from ._toml import (
+    toml,
+    PpsEncoder,
+)
+from ._ledger import (
+    Transaction,
+    iter_by_dt,
+    open_trade_ledger,
+)
+from ._mktinfo import (
+    Symbol,
+    MktPair,
+    Asset,
+    unpack_fqme,
+)
+from .. import config
+from ..brokers import get_brokermod
+from ..clearing._messages import BrokerdPosition, Status
+from ..data.types import Struct
+from ..log import get_logger

 log = get_logger(__name__)
-@cm
-def open_trade_ledger(
-    broker: str,
-    account: str,
-) -> Generator[dict, None, None]:
-    '''
-    Indempotently create and read in a trade log file from the
-    ``<configuration_dir>/ledgers/`` directory.
-
-    Files are named per broker account of the form
-    ``<brokername>_<accountname>.toml``. The ``accountname`` here is the
-    name as defined in the user's ``brokers.toml`` config.
-
-    '''
-    ldir = path.join(config._config_dir, 'ledgers')
-    if not path.isdir(ldir):
-        os.makedirs(ldir)
-
-    fname = f'trades_{broker}_{account}.toml'
-    tradesfile = path.join(ldir, fname)
-
-    if not path.isfile(tradesfile):
-        log.info(
-            f'Creating new local trades ledger: {tradesfile}'
-        )
-        with open(tradesfile, 'w') as cf:
-            pass  # touch
-
-    with open(tradesfile, 'rb') as cf:
-        start = time.time()
-        ledger = tomli.load(cf)
-        log.info(f'Ledger load took {time.time() - start}s')
-        cpy = ledger.copy()
-
-    try:
-        yield cpy
-    finally:
-        if cpy != ledger:
-            # TODO: show diff output?
-            # https://stackoverflow.com/questions/12956957/print-diff-of-python-dictionaries
-            log.info(f'Updating ledger for {tradesfile}:\n')
-            ledger.update(cpy)
-
-            # we write on close the mutated ledger data
-            with open(tradesfile, 'w') as cf:
-                toml.dump(ledger, cf)
class Transaction(Struct, frozen=True):
# TODO: should this be ``.to`` (see below)?
fqsn: str
sym: Symbol
tid: Union[str, int] # unique transaction id
size: float
price: float
cost: float # commissions or other additional costs
dt: datetime
expiry: datetime | None = None
# optional key normally derived from the broker
# backend which ensures the instrument-symbol this record
# is for is truly unique.
bsuid: Union[str, int] | None = None
# optional fqsn for the source "asset"/money symbol?
# from: Optional[str] = None
def iter_by_dt(
clears: dict[str, Any],
) -> Iterator[tuple[str, dict]]:
'''
Iterate entries of a ``clears: dict`` table sorted by entry recorded
datetime presumably set at the ``'dt'`` field in each entry.
'''
for tid, data in sorted(
list(clears.items()),
key=lambda item: item[1]['dt'],
):
yield tid, data
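Eg. with raw ISO timestamp strings (which sort the same as their parsed datetimes) entries come out oldest-first:
clears = {
    'tid-b': {'dt': '2022-01-02T00:00:00+00:00'},
    'tid-a': {'dt': '2022-01-01T00:00:00+00:00'},
}
assert [tid for tid, _ in iter_by_dt(clears)] == ['tid-a', 'tid-b']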
class Position(Struct):
'''
Basic pp (personal/piker position) model with attached clearing
transaction history.
'''
symbol: Symbol
symbol: Symbol | MktPair
# can be +ve or -ve for long/short
size: float
@ -151,19 +76,22 @@ class Position(Struct):
# zero for the entirety of the current "trade state".
ppu: float
# unique backend symbol id
bsuid: str
# unique "backend system market id"
bs_mktid: str
split_ratio: Optional[int] = None
split_ratio: int | None = None
# ordered record of known constituent trade messages
clears: dict[
Union[str, int, Status], # trade id
dict[str, Any], # transaction history summaries
] = {}
first_clear_dt: Optional[datetime] = None
first_clear_dt: datetime | None = None
expiry: Optional[datetime] = None
expiry: datetime | None = None
def __repr__(self) -> str:
return pformat(self.to_dict())
def to_dict(self) -> dict:
return {
@ -193,29 +121,43 @@ class Position(Struct):
# it via the trades ledger..
# drop symbol obj in serialized form
s = d.pop('symbol')
fqsn = s.front_fqsn()
fqme = s.fqme
broker, key, suffix = unpack_fqme(fqme)
broker, key, suffix = unpack_fqsn(fqsn)
sym_info = s.broker_info[broker]
if isinstance(s, Symbol):
sym_info = s.broker_info[broker]
d['asset_type'] = sym_info['asset_type']
d['price_tick'] = (
sym_info.get('price_tick_size')
or
s.tick_size
)
d['size_tick'] = (
sym_info.get('lot_tick_size')
or
s.lot_tick_size
)
d['asset_type'] = sym_info['asset_type']
d['price_tick_size'] = (
sym_info.get('price_tick_size')
or
s.tick_size
)
d['lot_tick_size'] = (
sym_info.get('lot_tick_size')
or
s.lot_tick_size
)
# the newwww wayyy B)
else:
mkt = s
assert isinstance(mkt, MktPair)
# an asset resolved mkt where we have ``Asset`` info about
# each tradeable asset in the market.
if mkt.resolved:
dst: Asset = mkt.dst
d['asset_type'] = dst.atype
d['price_tick'] = mkt.price_tick
d['size_tick'] = mkt.size_tick
if self.expiry is None:
d.pop('expiry', None)
elif expiry:
d['expiry'] = str(expiry)
toml_clears_list = []
toml_clears_list: list[dict[str, Any]] = []
# reverse sort so latest clears are at top of section?
for tid, data in iter_by_dt(clears):
@ -239,7 +181,7 @@ class Position(Struct):
d['clears'] = toml_clears_list
return fqsn, d
return fqme, d
def ensure_state(self) -> None:
'''
@ -296,20 +238,20 @@ class Position(Struct):
# XXX: better place to do this?
symbol = self.symbol
lot_size_digits = symbol.lot_size_digits
ppu, size = (
round(
msg['avg_price'],
ndigits=symbol.tick_size_digits
),
round(
msg['size'],
ndigits=lot_size_digits
),
)
# TODO: switch to new fields..?
# .size_tick_digits, .price_tick_digits
size_tick_digits = symbol.lot_size_digits
price_tick_digits = symbol.tick_size_digits
self.ppu = ppu
self.size = size
self.ppu = round(
# TODO: change this to ppu?
msg['avg_price'],
ndigits=price_tick_digits,
)
self.size = round(
msg['size'],
ndigits=size_tick_digits,
)
@property
def dsize(self) -> float:
@ -484,7 +426,9 @@ class Position(Struct):
if self.split_ratio is not None:
size = round(size * self.split_ratio)
return float(self.symbol.quantize_size(size))
return float(
self.symbol.quantize(size),
)
def minimize_clears(
self,
@ -543,8 +487,8 @@ class Position(Struct):
return clear
def sugest_split(self) -> float:
...
# def sugest_split(self) -> float:
# ...
class PpTable(Struct):
@ -552,36 +496,61 @@ class PpTable(Struct):
brokername: str
acctid: str
pps: dict[str, Position]
conf: Optional[dict] = {}
conf_path: Path
conf: dict | None = {}
def update_from_trans(
self,
trans: dict[str, Transaction],
cost_scalar: float = 2,
force_mkt: MktPair | None = None,
) -> dict[str, Position]:
pps = self.pps
updated: dict[str, Position] = {}
# lifo update all pps from records
for tid, t in trans.items():
# lifo update all pps from records, ensuring
# we compute the PPU and size sorted in time!
for t in sorted(
trans.values(),
key=lambda t: t.dt,
reverse=True,
):
fqme = t.fqme
bs_mktid = t.bs_mktid
pp = pps.setdefault(
t.bsuid,
# template the mkt-info presuming legacy market ticks
# if no info exists in the transactions..
mkt: MktPair | Symbol | None = force_mkt or t.sys
if not mkt:
mkt = MktPair.from_fqme(
fqme,
price_tick='0.01',
size_tick='0.0',
bs_mktid=bs_mktid,
)
pp = pps.get(bs_mktid)
if not pp:
# if no existing pp, allocate fresh one.
Position(
Symbol.from_fqsn(
t.fqsn,
info={},
) if not t.sym else t.sym,
pp = pps[bs_mktid] = Position(
mkt,
size=0.0,
ppu=0.0,
bsuid=t.bsuid,
bs_mktid=bs_mktid,
expiry=t.expiry,
)
)
else:
# NOTE: if for some reason a "less resolved" mkt pair
# info has been set (based on the `.fqme` being
# a shorter string), instead use the one from the
# transaction since it likely has (more) full
# information from the provider.
if len(pp.symbol.fqme) < len(fqme):
pp.symbol = mkt
clears = pp.clears
if clears:
first_clear_dt = pp.first_clear_dt
@ -590,7 +559,10 @@ class PpTable(Struct):
# included in the current pps state.
if (
t.tid in clears
or first_clear_dt and t.dt < first_clear_dt
or (
first_clear_dt
and t.dt < first_clear_dt
)
):
# NOTE: likely you'll see repeats of the same
# ``Transaction`` passed in here if/when you are restarting
@ -601,12 +573,14 @@ class PpTable(Struct):
# update clearing table
pp.add_clear(t)
updated[t.bsuid] = pp
updated[t.bs_mktid] = pp
# minimize clears tables and update sizing.
for bsuid, pp in updated.items():
for bs_mktid, pp in updated.items():
pp.ensure_state()
# deliver only the position entries that were actually updated
# (modified the state) from the input transaction set.
return updated
def dump_active(
@ -630,14 +604,8 @@ class PpTable(Struct):
open_pp_objs: dict[str, Position] = {}
pp_objs = self.pps
for bsuid in list(pp_objs):
pp = pp_objs[bsuid]
# XXX: debug hook for size mismatches
# qqqbsuid = 320227571
# if bsuid == qqqbsuid:
# breakpoint()
for bs_mktid in list(pp_objs):
pp = pp_objs[bs_mktid]
pp.ensure_state()
if (
@ -656,10 +624,10 @@ class PpTable(Struct):
# ignored; the closed positions won't be written to the
# ``pps.toml`` since ``pp_active_entries`` above is what's
# written.
closed_pp_objs[bsuid] = pp
closed_pp_objs[bs_mktid] = pp
else:
open_pp_objs[bsuid] = pp
open_pp_objs[bs_mktid] = pp
return open_pp_objs, closed_pp_objs
@ -669,11 +637,11 @@ class PpTable(Struct):
active, closed = self.dump_active()
# ONLY dict-serialize all active positions; those that are closed
# we don't store in the ``pps.toml``.
# ONLY dict-serialize all active positions; those that are
# closed we don't store in the ``pps.toml``.
to_toml_dict = {}
for bsuid, pos in active.items():
for bs_mktid, pos in active.items():
# keep the minimal amount of clears that make up this
# position since the last net-zero state.
@ -681,12 +649,12 @@ class PpTable(Struct):
pos.ensure_state()
# serialize to pre-toml form
fqsn, asdict = pos.to_pretoml()
log.info(f'Updating active pp: {fqsn}')
fqme, asdict = pos.to_pretoml()
log.info(f'Updating active pp: {fqme}')
# XXX: ugh, it's cuz we push the section under
# the broker name.. maybe we need to rethink this?
brokerless_key = fqsn.removeprefix(f'{self.brokername}.')
brokerless_key = fqme.removeprefix(f'{self.brokername}.')
to_toml_dict[brokerless_key] = asdict
return to_toml_dict
@ -701,17 +669,31 @@ class PpTable(Struct):
# active, closed_pp_objs = table.dump_active()
pp_entries = self.to_toml()
if pp_entries:
log.info(f'Updating ``pps.toml`` for {path}:\n')
log.info(f'Current positions:\n{pp_entries}')
self.conf[self.brokername][self.acctid] = pp_entries
log.info(
f'Updating positions in ``{self.conf_path}``:\n'
f'{pformat(pp_entries)}'
)
elif (
self.brokername in self.conf and
self.acctid in self.conf[self.brokername]
):
del self.conf[self.brokername][self.acctid]
if len(self.conf[self.brokername]) == 0:
del self.conf[self.brokername]
if self.brokername in self.conf:
log.warning(
f'Rewriting {self.conf_path} keys to drop <broker.acct>!'
)
# legacy key schema including <brokername.account>, so
# rewrite all entries to drop those tables since we now
# put that in the filename!
accounts = self.conf.pop(self.brokername)
assert len(accounts) == 1
entries = accounts.pop(self.acctid)
self.conf.update(entries)
self.conf.update(pp_entries)
# if there are no active position entries according
# to the toml dump output above, then clear the config
# file of all entries.
elif self.conf:
for entry in list(self.conf):
del self.conf[entry]
# TODO: why tf haven't they already done this for inline
# tables smh..
@ -722,8 +704,8 @@ class PpTable(Struct):
] = enc.dump_inline_table
config.write(
self.conf,
'pps',
config=self.conf,
path=self.conf_path,
encoder=enc,
fail_empty=False
)
@ -735,7 +717,7 @@ def load_pps_from_ledger(
acctname: str,
# post normalization filter on ledger entries to be processed
filter_by: Optional[list[dict]] = None,
filter_by: list[dict] | None = None,
) -> tuple[
dict[str, Transaction],
@ -745,7 +727,7 @@ def load_pps_from_ledger(
Open a ledger file by broker name and account and read in and
process any trade records into our normalized ``Transaction`` form
and then update the equivalent ``PpTable`` and deliver the two
bsuid-mapped dict-sets of the transactions and pps.
bs_mktid-mapped dict-sets of the transactions and pps.
'''
with (
@ -761,9 +743,9 @@ def load_pps_from_ledger(
if filter_by:
records = {}
bsuids = set(filter_by)
bs_mktids = set(filter_by)
for tid, r in src_records.items():
if r.bsuid in bsuids:
if r.bs_mktid in bs_mktids:
records[tid] = r
else:
records = src_records
@ -773,151 +755,35 @@ def load_pps_from_ledger(
return records, updated
# TODO: instead see if we can hack tomli and tomli-w to do the same:
# - https://github.com/hukkin/tomli
# - https://github.com/hukkin/tomli-w
class PpsEncoder(toml.TomlEncoder):
'''
Special "styled" encoder that makes a ``pps.toml`` redable and
compact by putting `.clears` tables inline and everything else
flat-ish.
'''
separator = ','
def dump_list(self, v):
'''
Dump an inline list with a newline after every element and
with consideration for denoted inline table types.
'''
retval = "[\n"
for u in v:
if isinstance(u, toml.decoder.InlineTableDict):
out = self.dump_inline_table(u)
else:
out = str(self.dump_value(u))
retval += " " + out + "," + "\n"
retval += "]"
return retval
def dump_inline_table(self, section):
"""Preserve inline table in its compact syntax instead of expanding
into subsection.
https://github.com/toml-lang/toml#user-content-inline-table
"""
val_list = []
for k, v in section.items():
# if isinstance(v, toml.decoder.InlineTableDict):
if isinstance(v, dict):
val = self.dump_inline_table(v)
else:
val = str(self.dump_value(v))
val_list.append(k + " = " + val)
retval = "{ " + ", ".join(val_list) + " }"
return retval
def dump_sections(self, o, sup):
retstr = ""
if sup != "" and sup[-1] != ".":
sup += '.'
retdict = self._dict()
arraystr = ""
for section in o:
qsection = str(section)
value = o[section]
if not re.match(r'^[A-Za-z0-9_-]+$', section):
qsection = toml.encoder._dump_str(section)
# arrayoftables = False
if (
self.preserve
and isinstance(value, toml.decoder.InlineTableDict)
):
retstr += (
qsection
+
" = "
+
self.dump_inline_table(o[section])
+
'\n' # only on the final terminating left brace
)
# XXX: this code i'm pretty sure is just blatantly bad
# and/or wrong..
# if isinstance(o[section], list):
# for a in o[section]:
# if isinstance(a, dict):
# arrayoftables = True
# if arrayoftables:
# for a in o[section]:
# arraytabstr = "\n"
# arraystr += "[[" + sup + qsection + "]]\n"
# s, d = self.dump_sections(a, sup + qsection)
# if s:
# if s[0] == "[":
# arraytabstr += s
# else:
# arraystr += s
# while d:
# newd = self._dict()
# for dsec in d:
# s1, d1 = self.dump_sections(d[dsec], sup +
# qsection + "." +
# dsec)
# if s1:
# arraytabstr += ("[" + sup + qsection +
# "." + dsec + "]\n")
# arraytabstr += s1
# for s1 in d1:
# newd[dsec + "." + s1] = d1[s1]
# d = newd
# arraystr += arraytabstr
elif isinstance(value, dict):
retdict[qsection] = o[section]
elif o[section] is not None:
retstr += (
qsection
+
" = "
+
str(self.dump_value(o[section]))
)
# if not isinstance(value, dict):
if not isinstance(value, toml.decoder.InlineTableDict):
# inline tables should not contain newlines:
# https://toml.io/en/v1.0.0#inline-table
retstr += '\n'
else:
raise ValueError(value)
retstr += arraystr
return (retstr, retdict)
@cm
def open_pps(
brokername: str,
acctid: str,
write_on_exit: bool = False,
) -> Generator[PpTable, None, None]:
'''
Read out broker-specific position entries from
incremental update file: ``pps.toml``.
'''
conf, path = config.load('pps')
brokersection = conf.setdefault(brokername, {})
pps = brokersection.setdefault(acctid, {})
conf: dict
conf_path: Path
conf, conf_path = config.load(
f'pps.{brokername}.{acctid}',
)
if brokername in conf:
log.warning(
f'Rewriting {conf_path} keys to drop <broker.acct>!'
)
# legacy key schema including <brokername.account>, so
# rewrite all entries to drop those tables since we now
# put that in the filename!
accounts = conf.pop(brokername)
for acctid in accounts.copy():
entries = accounts.pop(acctid)
conf.update(entries)
# TODO: ideally we can pass in an existing
# pps state to this right? such that we
@ -934,61 +800,72 @@ def open_pps(
brokername,
acctid,
pp_objs,
conf_path,
conf=conf,
)
# unmarshal/load ``pps.toml`` config entries into object form
# and update `PpTable` obj entries.
for fqsn, entry in pps.items():
bsuid = entry['bsuid']
symbol = Symbol.from_fqsn(
fqsn,
for fqme, entry in conf.items():
# NOTE & TODO: right now we fill in the defaults from
# `.data._source.Symbol` but eventually these should always
# either be already written to the pos table or provided at
# write time to ensure always having these values somewhere
# and thus allowing us to get our pos sizing precision
# correct!
info={
'asset_type': entry.get('asset_type', '<unknown>'),
'price_tick_size': entry.get('price_tick_size', 0.01),
'lot_tick_size': entry.get('lot_tick_size', 0.0),
}
# atype = entry.get('asset_type', '<unknown>')
# unique broker market id
bs_mktid = str(
entry.get('bsuid')
or entry.get('bs_mktid')
)
price_tick = Decimal(str(
entry.get('price_tick_size')
or entry.get('price_tick')
or '0.01'
))
size_tick = Decimal(str(
entry.get('lot_tick_size')
or entry.get('size_tick')
or '0.0'
))
# load the pair using the fqme which
# will make the pair "unresolved" until
# the backend broker actually loads
# the market and position info.
mkt = MktPair.from_fqme(
fqme,
price_tick=price_tick,
size_tick=size_tick,
bs_mktid=bs_mktid
)
# TODO: RE: general "events" instead of just "clears":
# - make this an `events` field and support more event types
# such as 'split', 'name_change', 'mkt_info', etc..
# - should be make a ``Struct`` for clear/event entries? convert
# "clear events table" from the toml config (list of a dicts)
# and load it into object form for use in position processing of
# new clear events.
# convert clears sub-tables (only in this form
# for toml re-presentation) back into a master table.
clears_list = entry['clears']
# index clears entries in "object" form by tid in a top
# level dict instead of a list (as is presented in our
# ``pps.toml``).
clears = pp_objs.setdefault(bsuid, {})
# TODO: should be make a ``Struct`` for clear/event entries?
# convert "clear events table" from the toml config (list of
# a dicts) and load it into object form for use in position
# processing of new clear events.
toml_clears_list: list[dict[str, Any]] = entry['clears']
trans: list[Transaction] = []
for clears_table in toml_clears_list:
for clears_table in clears_list:
tid = clears_table.pop('tid')
dtstr = clears_table['dt']
dt = pendulum.parse(dtstr)
clears_table['dt'] = dt
trans.append(Transaction(
fqsn=bsuid,
sym=symbol,
bsuid=bsuid,
fqsn=bs_mktid,
sym=mkt,
bs_mktid=bs_mktid,
tid=tid,
size=clears_table['size'],
price=clears_table['price'],
cost=clears_table['cost'],
dt=dt,
))
clears[tid] = clears_table
size = entry['size']
@ -1004,13 +881,13 @@ def open_pps(
if expiry:
expiry = pendulum.parse(expiry)
pp = pp_objs[bsuid] = Position(
symbol,
pp = pp_objs[bs_mktid] = Position(
mkt,
size=size,
ppu=ppu,
split_ratio=split_ratio,
expiry=expiry,
bsuid=entry['bsuid'],
bs_mktid=bs_mktid,
)
# XXX: super critical, we need to be sure to include
@ -1029,19 +906,3 @@ def open_pps(
finally:
if write_on_exit:
table.write_config()
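Typical read-only usage then looks something like (hypothetical account):
with open_pps('binance', 'paper', write_on_exit=False) as table:
    for bs_mktid, pos in table.pps.items():
        print(pos.symbol.fqme, pos.size, pos.ppu)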
if __name__ == '__main__':
import sys
args = sys.argv
assert len(args) > 1, 'Specify account(s) from `brokers.toml`'
args = args[1:]
for acctid in args:
broker, name = acctid.split('.')
trans, updated_pps = load_pps_from_ledger(broker, name)
print(
f'Processing transactions into pps for {broker}:{acctid}\n'
f'{pformat(trans)}\n\n'
f'{pformat(updated_pps)}'
)


@ -0,0 +1,156 @@
# piker: trading gear for hackers
# Copyright (C) Tyler Goodlet (in stewardship for pikers)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
'''
TOML codec hacks to make position tables look decent.
(looking at you "`toml`-lib"..)
'''
import re
import toml
# TODO: instead see if we can hack tomli and tomli-w to do the same:
# - https://github.com/hukkin/tomli
# - https://github.com/hukkin/tomli-w
class PpsEncoder(toml.TomlEncoder):
'''
Special "styled" encoder that makes a ``pps.toml`` redable and
compact by putting `.clears` tables inline and everything else
flat-ish.
'''
separator = ','
def dump_list(self, v):
'''
Dump an inline list with a newline after every element and
with consideration for denoted inline table types.
'''
retval = "[\n"
for u in v:
if isinstance(u, toml.decoder.InlineTableDict):
out = self.dump_inline_table(u)
else:
out = str(self.dump_value(u))
retval += " " + out + "," + "\n"
retval += "]"
return retval
def dump_inline_table(self, section):
"""Preserve inline table in its compact syntax instead of expanding
into subsection.
https://github.com/toml-lang/toml#user-content-inline-table
"""
val_list = []
for k, v in section.items():
# if isinstance(v, toml.decoder.InlineTableDict):
if isinstance(v, dict):
val = self.dump_inline_table(v)
else:
val = str(self.dump_value(v))
val_list.append(k + " = " + val)
retval = "{ " + ", ".join(val_list) + " }"
return retval
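Eg. a clears entry dumps to a single compact line (sketch, assumes `toml` is installed):
enc = PpsEncoder(preserve=True)
line = enc.dump_inline_table({'price': 10.0, 'size': 1.0})
assert line == '{ price = 10.0, size = 1.0 }'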
def dump_sections(self, o, sup):
retstr = ""
if sup != "" and sup[-1] != ".":
sup += '.'
retdict = self._dict()
arraystr = ""
for section in o:
qsection = str(section)
value = o[section]
if not re.match(r'^[A-Za-z0-9_-]+$', section):
qsection = toml.encoder._dump_str(section)
# arrayoftables = False
if (
self.preserve
and isinstance(value, toml.decoder.InlineTableDict)
):
retstr += (
qsection
+
" = "
+
self.dump_inline_table(o[section])
+
'\n' # only on the final terminating left brace
)
# XXX: this code i'm pretty sure is just blatantly bad
# and/or wrong..
# if isinstance(o[section], list):
# for a in o[section]:
# if isinstance(a, dict):
# arrayoftables = True
# if arrayoftables:
# for a in o[section]:
# arraytabstr = "\n"
# arraystr += "[[" + sup + qsection + "]]\n"
# s, d = self.dump_sections(a, sup + qsection)
# if s:
# if s[0] == "[":
# arraytabstr += s
# else:
# arraystr += s
# while d:
# newd = self._dict()
# for dsec in d:
# s1, d1 = self.dump_sections(d[dsec], sup +
# qsection + "." +
# dsec)
# if s1:
# arraytabstr += ("[" + sup + qsection +
# "." + dsec + "]\n")
# arraytabstr += s1
# for s1 in d1:
# newd[dsec + "." + s1] = d1[s1]
# d = newd
# arraystr += arraytabstr
elif isinstance(value, dict):
retdict[qsection] = o[section]
elif o[section] is not None:
retstr += (
qsection
+
" = "
+
str(self.dump_value(o[section]))
)
# if not isinstance(value, dict):
if not isinstance(value, toml.decoder.InlineTableDict):
# inline tables should not contain newlines:
# https://toml.io/en/v1.0.0#inline-table
retstr += '\n'
else:
raise ValueError(value)
retstr += arraystr
return (retstr, retdict)


@ -0,0 +1,217 @@
# piker: trading gear for hackers
# Copyright (C) Tyler Goodlet (in stewardship for pikers)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
'''
CLI front end for trades ledger and position tracking management.
'''
from typing import (
Any,
)
from rich.console import Console
from rich.markdown import Markdown
import tractor
import trio
import typer
from ..service import (
open_piker_runtime,
)
from ..clearing._messages import BrokerdPosition
from ..calc import humanize
ledger = typer.Typer()
def broker_init(
brokername: str,
loglevel: str | None = None,
**start_actor_kwargs,
) -> dict:
'''
Given an input broker name, load all named arguments
which can be passed to a daemon + context spawn for
the relevant `brokerd` service endpoint.
'''
from ..brokers import get_brokermod
brokermod = get_brokermod(brokername)
modpath = brokermod.__name__
start_actor_kwargs['name'] = f'brokerd.{brokername}'
start_actor_kwargs.update(
getattr(
brokermod,
'_spawn_kwargs',
{},
)
)
# lookup actor-enabled modules declared by the backend offering the
# `brokerd` endpoint(s).
enabled = start_actor_kwargs['enable_modules'] = [modpath]
for submodname in getattr(
brokermod,
'__enable_modules__',
[],
):
subpath = f'{modpath}.{submodname}'
enabled.append(subpath)
# non-blocking setup of brokerd service nursery
from ..data import _setup_persistent_brokerd
return (
start_actor_kwargs, # to `ActorNursery.start_actor()`
_setup_persistent_brokerd, # service task ep
getattr( # trades endpoint
brokermod,
'trades_dialogue',
None,
),
)
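Eg. a hypothetical `binance` spawn setup:
start_kwargs, daemon_ep, trades_ep = broker_init(
    'binance',
    loglevel='info',
)
# `start_kwargs['enable_modules']` now lists the backend's RPC mods;
# `trades_ep` is `None` when the backend has no live trades endpoint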
@ledger.command()
def sync(
fully_qualified_account_name: str,
pdb: bool = False,
loglevel: str = typer.Option(
'error',
"-l",
),
):
console = Console()
try:
brokername, account = fully_qualified_account_name.split('.')
except ValueError:
md = Markdown(
f'=> `{fully_qualified_account_name}` <=\n\n'
'is not a valid '
'__fully qualified account name?__\n\n'
'Your account name needs to be of the form '
'`<brokername>.<account_name>`\n'
)
console.print(md)
return
start_kwargs, _, trades_ep = broker_init(
brokername,
loglevel=loglevel,
)
async def main():
async with (
open_piker_runtime(
name='ledger_cli',
loglevel=loglevel,
debug_mode=pdb,
) as (actor, sockaddr),
tractor.open_nursery() as an,
):
portal = await an.start_actor(
loglevel=loglevel,
debug_mode=pdb,
**start_kwargs,
)
if (
brokername == 'paper'
or trades_ep is None
):
from ..clearing import _paper_engine as paper
open_trades_endpoint = paper.open_paperboi(
fqme=None, # tell paper to not start clearing loop
broker=brokername,
loglevel=loglevel,
)
else:
# open live brokerd trades endpoint
open_trades_endpoint = portal.open_context(
trades_ep,
loglevel=loglevel,
)
positions: dict[str, Any]
accounts: list[str]
async with (
open_trades_endpoint as (
brokerd_ctx,
(positions, accounts,),
),
):
summary: str = (
'[dim underline]Piker Position Summary[/] '
f'[dim blue underline]{brokername}[/]'
'[dim].[/]'
f'[blue underline]{account}[/]'
f'[dim underline] -> total pps: [/]'
f'[green]{len(positions)}[/]\n'
)
for ppdict in positions:
ppmsg = BrokerdPosition(**ppdict)
size = ppmsg.size
if size:
ppu: float = round(
ppmsg.avg_price,
ndigits=2,
)
cost_basis: str = humanize(size * ppu)
h_size: str = humanize(size)
if size < 0:
pcolor = 'red'
else:
pcolor = 'green'
# semantic-highlight of fqme
fqme = ppmsg.symbol
tokens = fqme.split('.')
styled_fqme = f'[blue underline]{tokens[0]}[/]'
for tok in tokens[1:]:
styled_fqme += '[dim].[/]'
styled_fqme += f'[dim blue underline]{tok}[/]'
# TODO: instead display in a ``rich.Table``?
summary += (
styled_fqme +
'[dim]: [/]'
f'[{pcolor}]{h_size}[/]'
'[dim blue]u @[/]'
f'[{pcolor}]{ppu}[/]'
'[dim blue] = [/]'
f'[{pcolor}]$ {cost_basis}\n[/]'
)
console.print(summary)
await brokerd_ctx.cancel()
await portal.cancel_actor()
trio.run(main)
if __name__ == "__main__":
ledger()
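From a shell this presumably runs as eg. `piker ledger sync binance.paper -l info` (assuming the `ledger` sub-app is mounted under the main `piker` CLI entrypoint).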


@ -15,13 +15,28 @@
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""
Handy utils.
Handy cross-broker utils.
"""
from functools import partial
import json
import asks
import logging
from ..log import colorize_json
from ..log import (
get_logger,
get_console_log,
colorize_json,
)
subsys: str = 'piker.brokers'
log = get_logger(subsys)
get_console_log = partial(
get_console_log,
name=subsys,
)
class BrokerError(Exception):
@ -69,7 +84,6 @@ class DataThrottle(BrokerError):
# TODO: add in throttle metrics/feedback
def resproc(
resp: asks.response_objects.Response,
log: logging.Logger,


@ -1,5 +1,8 @@
# piker: trading gear for hackers
# Copyright (C) Guillermo Rodriguez (in stewardship for piker0)
# Copyright (C)
# Guillermo Rodriguez
# Tyler Goodlet
# (in stewardship for pikers)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
@ -20,6 +23,7 @@ Binance backend
"""
from contextlib import asynccontextmanager as acm
from datetime import datetime
from decimal import Decimal
from typing import (
Any, Union, Optional,
AsyncGenerator, Callable,
@ -36,14 +40,19 @@ import numpy as np
import tractor
import wsproto
from ..accounting._mktinfo import (
Asset,
MktPair,
digits_to_dec,
)
from .._cacheables import open_cached_client
from ._util import (
resproc,
SymbolNotFound,
DataUnavailable,
)
from ..log import (
get_logger,
from ._util import (
log,
get_console_log,
)
from ..data.types import Struct
@ -52,8 +61,6 @@ from ..data._web_bs import (
NoBsWs,
)
log = get_logger(__name__)
_url = 'https://api.binance.com'
@ -88,6 +95,9 @@ _show_wap_in_history = False
# https://binance-docs.github.io/apidocs/spot/en/#exchange-information
# TODO: make this frozen again by pre-processing the
# filters list to a dict at init time?
class Pair(Struct, frozen=True):
symbol: str
status: str
@ -114,9 +124,22 @@ class Pair(Struct, frozen=True):
defaultSelfTradePreventionMode: str
allowedSelfTradePreventionModes: list[str]
filters: list[dict[str, Union[str, int, float]]]
filters: dict[
str,
Union[str, int, float]
]
permissions: list[str]
@property
def size_tick(self) -> Decimal:
# XXX: lul, after manually inspecting the response format we
# just directly pick out the info we need
return Decimal(self.filters['LOT_SIZE']['stepSize'].rstrip('0'))
@property
def price_tick(self) -> Decimal:
return Decimal(self.filters['PRICE_FILTER']['tickSize'].rstrip('0'))
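A sketch of the filter-table -> tick mapping with made-up filter values (`Decimal` per the module import above):
filters = {
    'PRICE_FILTER': {'tickSize': '0.01000000'},
    'LOT_SIZE': {'stepSize': '0.00100000'},
}
assert Decimal(filters['PRICE_FILTER']['tickSize'].rstrip('0')) == Decimal('0.01')
assert Decimal(filters['LOT_SIZE']['stepSize'].rstrip('0')) == Decimal('0.001')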
class OHLC(Struct):
'''
@ -159,7 +182,7 @@ class Client:
def __init__(self) -> None:
self._sesh = asks.Session(connections=4)
self._sesh.base_location = _url
self._pairs: dict[str, Any] = {}
self._pairs: dict[str, Pair] = {}
async def _api(
self,
@ -173,48 +196,58 @@ class Client:
)
return resproc(resp, log)
async def symbol_info(
async def exch_info(
self,
sym: Optional[str] = None,
sym: str | None = None,
) -> dict[str, Any]:
'''Get symbol info for the exchange.
) -> dict[str, Pair] | Pair:
'''
Fresh exchange-pairs info query for symbol ``sym: str``:
https://binance-docs.github.io/apidocs/spot/en/#exchange-information
'''
# TODO: we can load from our self._pairs cache
# on repeat calls...
cached_pair = self._pairs.get(sym)
if cached_pair:
return cached_pair
# will retrieve all symbols by default
# retrieve all symbols by default
params = {}
if sym is not None:
sym = sym.lower()
params = {'symbol': sym}
resp = await self._api(
'exchangeInfo',
params=params,
)
resp = await self._api('exchangeInfo', params=params)
entries = resp['symbols']
if not entries:
raise SymbolNotFound(f'{sym} not found')
raise SymbolNotFound(f'{sym} not found:\n{resp}')
syms = {item['symbol']: item for item in entries}
# pre-process .filters field into a table
pairs = {}
for item in entries:
symbol = item['symbol']
filters = {}
filters_ls: list = item.pop('filters')
for entry in filters_ls:
ftype = entry['filterType']
filters[ftype] = entry
pairs[symbol] = Pair(
filters=filters,
**item,
)
# pairs = {
# item['symbol']: Pair(**item) for item in entries
# }
self._pairs.update(pairs)
if sym is not None:
return syms[sym]
return pairs[sym]
else:
return syms
return self._pairs
async def cache_symbols(
self,
) -> dict:
if not self._pairs:
self._pairs = await self.symbol_info()
return self._pairs
symbol_info = exch_info
async def search_symbols(
self,
@ -224,7 +257,7 @@ class Client:
if self._pairs is not None:
data = self._pairs
else:
data = await self.symbol_info()
data = await self.exch_info()
matches = fuzzy.extractBests(
pattern,
@ -299,7 +332,8 @@ class Client:
@acm
async def get_client() -> Client:
client = Client()
await client.cache_symbols()
log.info('Caching exchange infos..')
await client.exch_info()
yield client
@ -439,6 +473,34 @@ async def open_history_client(
yield get_ohlc, {'erlangs': 3, 'rate': 3}
async def get_mkt_info(
fqme: str,
) -> tuple[MktPair, Pair]:
async with open_cached_client('binance') as client:
pair: Pair = await client.exch_info(fqme.upper())
mkt = MktPair(
dst=Asset(
name=pair.baseAsset,
atype='crypto',
tx_tick=digits_to_dec(pair.baseAssetPrecision),
),
src=Asset(
name=pair.quoteAsset,
atype='crypto',
tx_tick=digits_to_dec(pair.quoteAssetPrecision),
),
price_tick=pair.price_tick,
size_tick=pair.size_tick,
bs_mktid=pair.symbol,
broker='binance',
)
return mkt, pair
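Hypothetical call from inside some async task (hits the live binance API):
async def show_btcusdt() -> None:
    mkt, pair = await get_mkt_info('btcusdt')
    # the broker-specific market id round-trips through `MktPair`
    assert mkt.bs_mktid == pair.symbol
    print(mkt.price_tick, mkt.size_tick)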
async def stream_quotes(
send_chan: trio.abc.SendChannel,
@ -453,36 +515,15 @@ async def stream_quotes(
# XXX: required to propagate ``tractor`` loglevel to piker logging
get_console_log(loglevel or tractor.current_actor().loglevel)
sym_infos = {}
uid = 0
async with (
open_cached_client('binance') as client,
send_chan as send_chan,
):
# keep client cached for real-time section
cache = await client.cache_symbols()
mkt_infos: dict[str, MktPair] = {}
for sym in symbols:
d = cache[sym.upper()]
syminfo = Pair(**d) # validation
si = sym_infos[sym] = syminfo.to_dict()
filters = {}
for entry in syminfo.filters:
ftype = entry['filterType']
filters[ftype] = entry
# XXX: after manually inspecting the response format we
# just directly pick out the info we need
si['price_tick_size'] = float(
filters['PRICE_FILTER']['tickSize']
)
si['lot_tick_size'] = float(
filters['LOT_SIZE']['stepSize']
)
si['asset_type'] = 'crypto'
mkt, pair = await get_mkt_info(sym)
mkt_infos[sym] = mkt
symbol = symbols[0]
@ -490,9 +531,10 @@ async def stream_quotes(
# pass back token, and bool, signalling if we're the writer
# and that history has been written
symbol: {
'symbol_info': sym_infos[sym],
'shm_write_opts': {'sum_tick_vml': False},
'fqsn': sym,
'mkt_info': mkt_infos[sym],
'shm_write_opts': {'sum_tick_vml': False},
},
}
@ -579,13 +621,13 @@ async def open_symbol_search(
async with open_cached_client('binance') as client:
# load all symbols locally for fast search
cache = await client.cache_symbols()
cache = await client.exch_info()
await ctx.started()
async with ctx.open_stream() as stream:
async for pattern in stream:
# results = await client.symbol_info(sym=pattern.upper())
# results = await client.exch_info(sym=pattern.upper())
matches = fuzzy.extractBests(
pattern,
@ -593,7 +635,7 @@ async def open_symbol_search(
score_cutoff=50,
)
# repack in dict form
await stream.send(
{item[0]['symbol']: item[0]
for item in matches}
)
await stream.send({
item[0].symbol: item[0]
for item in matches
})


@ -28,7 +28,13 @@ import tractor
from ..cli import cli
from .. import watchlists as wl
from ..log import get_console_log, colorize_json, get_logger
from ..log import (
colorize_json,
)
from ._util import (
log,
get_console_log,
)
from ..service import (
maybe_spawn_brokerd,
maybe_open_pikerd,
@ -38,9 +44,7 @@ from ..brokers import (
get_brokermod,
data,
)
log = get_logger('cli')
DEFAULT_BROKER = 'questrade'
DEFAULT_BROKER = 'binance'
_config_dir = click.get_app_dir('piker')
_watchlists_data_path = os.path.join(_config_dir, 'watchlists.json')


@ -26,15 +26,12 @@ from typing import List, Dict, Any, Optional
import trio
from ..log import get_logger
from ._util import log
from . import get_brokermod
from ..service import maybe_spawn_brokerd
from .._cacheables import open_cached_client
log = get_logger(__name__)
async def api(brokername: str, methname: str, **kwargs) -> dict:
"""Make (proxy through) a broker API call by name and return its result.
"""


@ -41,13 +41,13 @@ import tractor
from tractor.experimental import msgpub
from async_generator import asynccontextmanager
from ..log import get_logger, get_console_log
from ._util import (
log,
get_console_log,
)
from . import get_brokermod
log = get_logger(__name__)
async def wait_for_network(
net_func: Callable,
sleep: int = 1


@ -127,7 +127,7 @@ your ``pps.toml`` file will have position entries like,
[ib.algopaper."mnq.globex.20221216"]
size = -1.0
ppu = 12423.630576923071
bsuid = 515416577
bs_mktid = 515416577
expiry = "2022-12-16T00:00:00+00:00"
clears = [
{ dt = "2022-08-31T18:54:46+00:00", ppu = 12423.630576923071, accum_size = -19.0, price = 12372.75, size = 1.0, cost = 0.57, tid = "0000e1a7.630f5e5a.01.01" },


@ -35,7 +35,6 @@ from .feed import (
)
from .broker import (
trades_dialogue,
norm_trade_records,
)
__all__ = [
@ -46,14 +45,23 @@ __all__ = [
'stream_quotes',
]
# tractor RPC enable arg
__enable_modules__: list[str] = [
_brokerd_mods: list[str] = [
'api',
'feed',
'broker',
]
_datad_mods: list[str] = [
'feed',
]
# tractor RPC enable arg
__enable_modules__: list[str] = (
_brokerd_mods
+
_datad_mods
)
# passed to ``tractor.ActorNursery.start_actor()``
_spawn_kwargs = {
'infect_asyncio': True,


@ -0,0 +1,187 @@
# piker: trading gear for hackers
# Copyright (C) Tyler Goodlet (in stewardship for pikers)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""
"FLEX" report processing utils.
"""
from bidict import bidict
import pendulum
from pprint import pformat
from typing import Any
from .api import (
get_config,
log,
)
from piker.accounting import (
open_trade_ledger,
)
def parse_flex_dt(
record: str,
) -> pendulum.datetime:
date, ts = record.split(';')
dt = pendulum.parse(date)
ts = f'{ts[:2]}:{ts[2:4]}:{ts[4:]}'
tsdt = pendulum.parse(ts)
return dt.set(hour=tsdt.hour, minute=tsdt.minute, second=tsdt.second)
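Eg. assuming the usual compact flex stamp format 'YYYYMMDD;HHMMSS':
dt = parse_flex_dt('20220831;185446')
assert (dt.hour, dt.minute, dt.second) == (18, 54, 46)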
def flex_records_to_ledger_entries(
accounts: bidict,
trade_entries: list[object],
) -> dict:
'''
Convert flex report entry objects into ``dict`` form, pretty much
straight up without modification except add a `pydatetime` field
from the parsed timestamp.
'''
trades_by_account = {}
for t in trade_entries:
entry = t.__dict__
# XXX: LOL apparently ``toml`` has a bug
# where a section key error will show up in the write
# if you leave a table key as an `int`? So i guess
# cast to strs for all keys..
# oddly for some so-called "BookTrade" entries
# this field seems to be blank, no cuckin clue.
# trade['ibExecID']
tid = str(entry.get('ibExecID') or entry['tradeID'])
# date = str(entry['tradeDate'])
# XXX: is it going to cause problems if an account name
# gets lost? The user should be able to find it based
# on the actual exec history right?
acctid = accounts[str(entry['accountId'])]
# probably a flex record with a wonky non-std timestamp..
dt = entry['pydatetime'] = parse_flex_dt(entry['dateTime'])
entry['datetime'] = str(dt)
if not tid:
# this is likely some kind of internal adjustment
# transaction, likely one of the following:
# - an expiry event that will show a "book trade" indicating
# some adjustment to cash balances: zeroing or itm settle.
# - a manual cash balance position adjustment likely done by
# the user from the accounts window in TWS where they can
# manually set the avg price and size:
# https://api.ibkr.com/lib/cstools/faq/web1/index.html#/tag/DTWS_ADJ_AVG_COST
log.warning(f'Skipping ID-less ledger entry:\n{pformat(entry)}')
continue
trades_by_account.setdefault(
acctid, {}
)[tid] = entry
for acctid in trades_by_account:
trades_by_account[acctid] = dict(sorted(
trades_by_account[acctid].items(),
key=lambda entry: entry[1]['pydatetime'],
))
return trades_by_account
def load_flex_trades(
path: str | None = None,
) -> dict[str, Any]:
from ib_insync import flexreport, util
conf = get_config()
if not path:
# load ``brokers.toml`` and try to get the flex
# token and query id that must be previously defined
# by the user.
token = conf.get('flex_token')
if not token:
raise ValueError(
'You must specify a ``flex_token`` field in your '
'`brokers.toml` in order to load your trade log, see our '
'instructions for how to set this up here:\n'
'PUT LINK HERE!'
)
qid = conf['flex_trades_query_id']
# TODO: hack this into our logging
# system like we do with the API client..
util.logToConsole()
# TODO: rewrite the query part of this with async..httpx?
report = flexreport.FlexReport(
token=token,
queryId=qid,
)
else:
# XXX: another project we could potentially look at,
# https://pypi.org/project/ibflex/
report = flexreport.FlexReport(path=path)
trade_entries = report.extract('Trade')
ln = len(trade_entries)
log.info(f'Loaded {ln} trades from flex query')
trades_by_account = flex_records_to_ledger_entries(
conf['accounts'].inverse, # reverse map to user account names
trade_entries,
)
ledger_dict: dict | None = None
for acctid in trades_by_account:
trades_by_id = trades_by_account[acctid]
with open_trade_ledger('ib', acctid) as ledger_dict:
tid_delta = set(trades_by_id) - set(ledger_dict)
log.info(
'New trades detected\n'
f'{pformat(tid_delta)}'
)
if tid_delta:
sorted_delta = dict(sorted(
{tid: trades_by_id[tid] for tid in tid_delta}.items(),
key=lambda entry: entry[1].pop('pydatetime'),
))
ledger_dict.update(sorted_delta)
return ledger_dict
if __name__ == '__main__':
import sys
import os
args = sys.argv
if len(args) > 1:
args = args[1:]
for arg in args:
path = os.path.abspath(arg)
load_flex_trades(path=path)
else:
# expect brokers.toml to have an entry and
# pull from the web service.
load_flex_trades()


@ -24,9 +24,7 @@ import subprocess
import tractor
from piker.log import get_logger
log = get_logger(__name__)
from .._util import log
_reset_tech: Literal[


@ -20,15 +20,22 @@
"""
from __future__ import annotations
from contextlib import asynccontextmanager as acm
from contextlib import (
asynccontextmanager as acm,
contextmanager as cm,
)
from contextlib import AsyncExitStack
from dataclasses import asdict, astuple
from datetime import datetime
from functools import partial
from functools import (
partial,
# lru_cache,
)
import itertools
from math import isnan
from typing import (
Any,
Callable,
Optional,
Union,
)
@ -44,6 +51,7 @@ import trio
import tractor
from tractor import to_asyncio
import pendulum
from eventkit import Event
import ib_insync as ibis
from ib_insync.contract import (
Contract,
@ -68,12 +76,10 @@ import numpy as np
from piker import config
from piker.log import get_logger
from piker.brokers._util import log
from piker.data._source import base_ohlc_dtype
log = get_logger(__name__)
_time_units = {
's': ' sec',
'm': ' mins',
@ -130,11 +136,13 @@ class NonShittyWrapper(Wrapper):
class NonShittyIB(ibis.IB):
"""The beginning of overriding quite a few decisions in this lib.
'''
The beginning of overriding quite a few decisions in this lib.
- Don't use datetimes
- Don't use named tuples
"""
'''
def __init__(self):
# override `ib_insync` internal loggers so we can see wtf
@ -172,6 +180,8 @@ _adhoc_cmdty_set = {
'xagusd.cmdty', # silver spot
}
# NOTE: if you aren't seeing one of these symbols' futures contracts
# show up, it's likely the `.<venue>` part is wrong!
_adhoc_futes_set = {
# equities
@ -197,7 +207,7 @@ _adhoc_futes_set = {
'mgc.comex', # micro
# oil & gas
'cl.comex',
'cl.nymex',
'ni.comex', # silver futes
'qi.comex', # mini-silver futes
@ -311,6 +321,22 @@ _samplings: dict[int, tuple[str, str]] = {
}
@cm
def remove_handler_on_err(
event: Event,
handler: Callable,
) -> None:
try:
yield
except trio.BrokenResourceError:
# XXX: eventkit's ``Event.emit()`` for whatever redic
# reason will catch and ignore regular exceptions
# resulting in tracebacks spammed to console..
# Manually do the dereg ourselves.
log.exception(f'Disconnected from {event} updates')
event.disconnect(handler)
class Client:
'''
IB wrapped for our broker backend API.
@ -330,7 +356,7 @@ class Client:
self.ib.RaiseRequestErrors = True
# contract cache
self._feeds: dict[str, trio.abc.SendChannel] = {}
self._cons: dict[str, Contract] = {}
# NOTE: the ib.client here is "throttled" to 45 rps by default
@ -386,8 +412,7 @@ class Client:
bar_size, duration, dt_duration = _samplings[sample_period_s]
global _enters
# log.info(f'REQUESTING BARS {_enters} @ end={end_dt}')
print(
log.info(
f"REQUESTING {duration}'s worth {bar_size} BARS\n"
f'{_enters} @ end={end_dt}"'
)
@ -614,13 +639,20 @@ class Client:
return con
# TODO: make this work with our `MethodProxy`..
# @lru_cache(maxsize=None)
async def get_con(
self,
conid: int,
) -> Contract:
return await self.ib.qualifyContractsAsync(
ibis.Contract(conId=conid)
)
try:
return self._cons[conid]
except KeyError:
con: Contract = await self.ib.qualifyContractsAsync(
ibis.Contract(conId=conid)
)
self._cons[conid] = con
return con
def parse_patt2fqsn(
self,
@ -644,7 +676,7 @@ class Client:
# fqsn parsing stage
# ------------------
if '.ib' in pattern:
from ..data._source import unpack_fqsn
from ..accounting._mktinfo import unpack_fqsn
_, symbol, expiry = unpack_fqsn(pattern)
else:
@ -722,7 +754,7 @@ class Client:
)
elif (
exch in ('IDEALPRO')
exch in {'IDEALPRO'}
or sectype == 'CASH'
):
# if '/' in symbol:
@ -1008,6 +1040,21 @@ class Client:
self.ib.errorEvent.connect(push_err)
api_err = self.ib.client.apiError
def report_api_err(msg: str) -> None:
with remove_handler_on_err(
api_err,
report_api_err,
):
to_trio.send_nowait((
'error',
msg,
))
api_err.connect(report_api_err)
def positions(
self,
account: str = '',
@ -1137,7 +1184,7 @@ async def load_aio_clients(
# the API TCP in `ib_insync` connection can be flaky af so instead
# retry a few times to get the client going..
connect_retries: int = 3,
connect_timeout: float = 0.5,
connect_timeout: float = 1,
disconnect_on_exit: bool = True,
) -> dict[str, Client]:
@ -1191,9 +1238,14 @@ async def load_aio_clients(
for host, port in combos:
sockaddr = (host, port)
maybe_client = _client_cache.get(sockaddr)
if (
sockaddr in _client_cache
or sockaddr in _scan_ignore
sockaddr in _scan_ignore
or (
maybe_client
and maybe_client.ib.isConnected()
)
):
continue
@ -1204,9 +1256,9 @@ async def load_aio_clients(
await ib.connectAsync(
host,
port,
clientId=client_id,
clientId=client_id + i,
# this timeout is sensative on windows and will
# this timeout is sensitive on windows and will
# fail without a good "timeout error" so be
# careful.
timeout=connect_timeout,
@ -1230,15 +1282,10 @@ async def load_aio_clients(
OSError,
) as ce:
_err = ce
if i > 8:
# cache logic to avoid rescanning if we already have all
# clients loaded.
_scan_ignore.add(sockaddr)
raise
log.warning(
f'Failed to connect on {port} for {i} time, retrying...')
f'Failed to connect on {port} for {i} time with,\n'
f'{ib.client.apiError.value()}\n'
'retrying with a new client id..')
# Pre-collect all accounts available for this
# connection and map account names to this client
@ -1299,19 +1346,13 @@ async def load_clients_for_trio(
a ``tractor.to_asyncio.open_channel_from()``.
'''
global _accounts2clients
async with load_aio_clients() as accts2clients:
if _accounts2clients:
to_trio.send_nowait(_accounts2clients)
to_trio.send_nowait(accts2clients)
# TODO: maybe a sync event to wait on instead?
await asyncio.sleep(float('inf'))
else:
async with load_aio_clients() as accts2clients:
to_trio.send_nowait(accts2clients)
# TODO: maybe a sync event to wait on instead?
await asyncio.sleep(float('inf'))
@acm
async def open_client_proxies() -> tuple[
@ -1451,6 +1492,7 @@ async def open_aio_client_method_relay(
) -> None:
# sync with `open_client_proxy()` caller
to_trio.send_nowait(client)
# TODO: separate channel for error handling?
@ -1460,25 +1502,34 @@ async def open_aio_client_method_relay(
# back results
while not to_trio._closed:
msg = await from_trio.get()
if msg is None:
print('asyncio PROXY-RELAY SHUTDOWN')
break
meth_name, kwargs = msg
meth = getattr(client, meth_name)
match msg:
case None: # termination sentinel
print('asyncio PROXY-RELAY SHUTDOWN')
break
try:
resp = await meth(**kwargs)
# echo the msg back
to_trio.send_nowait({'result': resp})
case (meth_name, kwargs):
meth_name, kwargs = msg
meth = getattr(client, meth_name)
except (
RequestError,
try:
resp = await meth(**kwargs)
# echo the msg back
to_trio.send_nowait({'result': resp})
# TODO: relay all errors to trio?
# BaseException,
) as err:
to_trio.send_nowait({'exception': err})
except (
RequestError,
# TODO: relay all errors to trio?
# BaseException,
) as err:
to_trio.send_nowait({'exception': err})
case {'error': content}:
to_trio.send_nowait({'exception': content})
case _:
raise ValueError(f'Unhandled msg {msg}')
@acm
@ -1509,7 +1560,8 @@ async def open_client_proxy(
# mock all remote methods on ib ``Client``.
for name, method in inspect.getmembers(
Client, predicate=inspect.isfunction
Client,
predicate=inspect.isfunction,
):
if '_' == name[0]:
continue


@ -13,6 +13,7 @@
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""
Order and trades endpoints for use with ``piker``'s EMS.
@ -21,6 +22,7 @@ from __future__ import annotations
from bisect import insort
from contextlib import ExitStack
from dataclasses import asdict
from decimal import Decimal
from functools import partial
from pprint import pformat
import time
@ -37,6 +39,7 @@ from trio_typing import TaskStatus
import tractor
from ib_insync.contract import (
Contract,
Option,
)
from ib_insync.order import (
Trade,
@ -51,14 +54,14 @@ from ib_insync.objects import Position as IbPosition
import pendulum
from piker import config
from piker.pp import (
from piker.accounting import (
Position,
Transaction,
open_trade_ledger,
open_pps,
PpTable,
)
from piker.log import get_console_log
from .._util import get_console_log
from piker.clearing._messages import (
Order,
Status,
@ -70,9 +73,8 @@ from piker.clearing._messages import (
BrokerdFill,
BrokerdError,
)
from piker.data._source import (
from piker.accounting._mktinfo import (
Symbol,
float_digits,
)
from .api import (
_accounts2clients,
@ -83,19 +85,23 @@ from .api import (
Client,
MethodProxy,
)
from ._flex_reports import parse_flex_dt
def pack_position(
pos: IbPosition
) -> dict[str, Any]:
) -> tuple[
str,
dict[str, Any]
]:
con = pos.contract
fqsn, calc_price = con2fqsn(con)
# TODO: options contracts into a sane format..
return (
con.conId,
str(con.conId),
BrokerdPosition(
broker='ib',
account=pos.account,
@ -281,18 +287,21 @@ async def recv_trade_updates(
async def update_ledger_from_api_trades(
trade_entries: list[dict[str, Any]],
client: Union[Client, MethodProxy],
accounts_def_inv: bidict[str, str],
) -> tuple[
dict[str, Transaction],
dict[str, dict],
]:
# XXX; ERRGGG..
# pack in the "primary/listing exchange" value from a
# contract lookup since it seems this isn't available by
# default from the `.fills()` method endpoint...
for entry in trade_entries:
condict = entry['contract']
# print(
# f"{condict['symbol']}: GETTING CONTRACT INFO!\n"
# )
conid = condict['conId']
pexch = condict['primaryExchange']
@ -310,9 +319,8 @@ async def update_ledger_from_api_trades(
# pack in the ``Contract.secType``
entry['asset_type'] = condict['secType']
conf = get_config()
entries = api_trades_to_ledger_entries(
conf['accounts'].inverse,
accounts_def_inv,
trade_entries,
)
# normalize recent session's trades to the `Transaction` type
@ -335,14 +343,15 @@ async def update_and_audit_msgs(
msgs: list[BrokerdPosition] = []
for p in pps:
bsuid = p.bsuid
bs_mktid = p.bs_mktid
# retrieve equivalent ib reported position message
# for comparison/audit versus the piker equivalent
# breakeven pp calcs.
ibppmsg = cids2pps.get((acctid, bsuid))
ibppmsg = cids2pps.get((acctid, bs_mktid))
if ibppmsg:
symbol = ibppmsg.symbol
msg = BrokerdPosition(
broker='ib',
@ -353,13 +362,16 @@ async def update_and_audit_msgs(
# table..
account=ibppmsg.account,
# XXX: the `.ib` is stripped..?
symbol=ibppmsg.symbol,
symbol=symbol,
currency=ibppmsg.currency,
size=p.size,
avg_price=p.ppu,
)
msgs.append(msg)
ibfmtmsg = pformat(ibppmsg.to_dict())
pikerfmtmsg = pformat(msg.to_dict())
if validate:
ibsize = ibppmsg.size
pikersize = msg.size
@ -379,26 +391,24 @@ async def update_and_audit_msgs(
# raise ValueError(
log.error(
f'POSITION MISMATCH ib <-> piker ledger:\n'
f'ib: {ibppmsg}\n'
f'piker: {msg}\n'
f'reverse_split_ratio: {reverse_split_ratio}\n'
f'split_ratio: {split_ratio}\n\n'
'FIGURE OUT WHY TF YOUR LEDGER IS OFF!?!?\n\n'
f'Pos mismatch in ib vs. the piker ledger!\n'
f'IB:\n{ibfmtmsg}\n\n'
f'PIKER:\n{pikerfmtmsg}\n\n'
'If you are expecting a (reverse) split in this '
'instrument you should probably put the following '
f'in the `pps.toml` section:\n{entry}'
'instrument you should probably put the following '
'in the `pps.toml` section:\n'
f'{entry}\n'
# f'reverse_split_ratio: {reverse_split_ratio}\n'
# f'split_ratio: {split_ratio}\n\n'
)
msg.size = ibsize
if ibppmsg.avg_price != msg.avg_price:
# TODO: make this a "propoganda" log level?
# TODO: make this a "propaganda" log level?
log.warning(
'The mega-cucks at IB want you to believe with their '
f'"FIFO" positioning for {msg.symbol}:\n'
f'"ib" mega-cucker avg price: {ibppmsg.avg_price}\n'
f'piker, LIFO breakeven PnL price: {msg.avg_price}'
f'IB "FIFO" avg price for {msg.symbol} is DIFF:\n'
f'ib: {ibppmsg.avg_price}\n'
f'piker: {msg.avg_price}'
)
else:
@ -414,7 +424,7 @@ async def update_and_audit_msgs(
# right since `.broker` is already included?
account=f'ib.{acctid}',
# XXX: the `.ib` is stripped..?
symbol=p.symbol.front_fqsn(),
symbol=p.symbol.fqme,
# currency=ibppmsg.currency,
size=p.size,
avg_price=p.ppu,
@ -422,16 +432,90 @@ async def update_and_audit_msgs(
if validate and p.size:
# raise ValueError(
log.error(
f'UNEXPECTED POSITION says ib:\n'
f'piker: {msg}\n'
'YOU SHOULD FIGURE OUT WHY TF YOUR LEDGER IS OFF!?\n'
'THEY LIQUIDATED YOU OR YOUR MISSING LEDGER RECORDS!?'
f'UNEXPECTED POSITION says IB:\n'
"Maybe they LIQUIDATED YOU or you're missing ledger records?\n"
f'PIKER:\n{pikerfmtmsg}\n\n'
)
msgs.append(msg)
return msgs
async def aggr_open_orders(
order_msgs: list[Status],
client: Client,
proxy: MethodProxy,
accounts_def: bidict[str, str],
) -> None:
'''
Collect all open orders from client and fill in `order_msgs: list`.
'''
trades: list[Trade] = client.ib.openTrades()
for trade in trades:
order = trade.order
quant = trade.order.totalQuantity
action = order.action.lower()
size = {
'sell': -1,
'buy': 1,
}[action] * quant
con = trade.contract
# TODO: in the case of the SMART venue (aka ib's
# router-clearing sys) we probably should handle
# showing such orders overtop of the fqsn for the
# primary exchange, how to map this easily is going
# to be a bit tricky though?
deats = await proxy.con_deats(contracts=[con])
fqsn = list(deats)[0]
reqid = order.orderId
# TODO: maybe embed a ``BrokerdOrder`` instead
# since then we can directly load it on the client
# side in the order mode loop?
msg = Status(
time_ns=time.time_ns(),
resp='open',
oid=str(reqid),
reqid=reqid,
# embedded order info
req=Order(
action=action,
exec_mode='live',
oid=str(reqid),
symbol=fqsn,
account=accounts_def.inverse[order.account],
price=order.lmtPrice,
size=size,
),
src='ib',
)
order_msgs.append(msg)
return order_msgs
# proxy wrapper for starting trade event stream
async def open_trade_event_stream(
client: Client,
task_status: TaskStatus[
trio.abc.ReceiveChannel
] = trio.TASK_STATUS_IGNORED,
):
# each api client has a unique event stream
async with tractor.to_asyncio.open_channel_from(
recv_trade_updates,
client=client,
) as (first, trade_event_stream):
task_status.started(trade_event_stream)
await trio.sleep_forever()
@tractor.context
async def trades_dialogue(
@ -465,7 +549,10 @@ async def trades_dialogue(
# we might also want to delegate a specific actor for
# ledger writing / reading for speed?
async with (
open_client_proxies() as (proxies, aioclients),
open_client_proxies() as (
proxies,
aioclients,
),
):
# Open a trade ledgers stack for appending trade records over
# multiple accounts.
@ -473,6 +560,9 @@ async def trades_dialogue(
ledgers: dict[str, dict] = {}
tables: dict[str, PpTable] = {}
order_msgs: list[Status] = []
conf = get_config()
accounts_def_inv = conf['accounts'].inverse
with (
ExitStack() as lstack,
):
@ -491,146 +581,6 @@ async def trades_dialogue(
acctid,
)
)
table = tables[acctid] = lstack.enter_context(
open_pps(
'ib',
acctid,
write_on_exit=True,
)
)
for account, proxy in proxies.items():
client = aioclients[account]
trades: list[Trade] = client.ib.openTrades()
for trade in trades:
order = trade.order
quant = trade.order.totalQuantity
action = order.action.lower()
size = {
'sell': -1,
'buy': 1,
}[action] * quant
con = trade.contract
# TODO: in the case of the SMART venue (aka ib's
# router-clearing sys) we probably should handle
# showing such orders overtop of the fqsn for the
# primary exchange, how to map this easily is going
# to be a bit tricky though?
deats = await proxy.con_deats(contracts=[con])
fqsn = list(deats)[0]
reqid = order.orderId
# TODO: maybe embed a ``BrokerdOrder`` instead
# since then we can directly load it on the client
# side in the order mode loop?
msg = Status(
time_ns=time.time_ns(),
resp='open',
oid=str(reqid),
reqid=reqid,
# embedded order info
req=Order(
action=action,
exec_mode='live',
oid=str(reqid),
symbol=fqsn,
account=accounts_def.inverse[order.account],
price=order.lmtPrice,
size=size,
),
src='ib',
)
order_msgs.append(msg)
# process pp value reported from ib's system. we only use these
# to cross-check sizing since average pricing on their end uses
# the so called (bs) "FIFO" style which more or less results in
# a price that's not useful for traders who want to not lose
# money.. xb
for pos in client.positions():
# collect all ib-pp reported positions so that we can be
# sure we know which positions to update from the ledger if
# any are missing from the ``pps.toml``
bsuid, msg = pack_position(pos)
acctid = msg.account = accounts_def.inverse[msg.account]
acctid = acctid.strip('ib.')
cids2pps[(acctid, bsuid)] = msg
assert msg.account in accounts, (
f'Position for unknown account: {msg.account}')
ledger = ledgers[acctid]
table = tables[acctid]
pp = table.pps.get(bsuid)
if (
not pp
or pp.size != msg.size
):
trans = norm_trade_records(ledger)
table.update_from_trans(trans)
# update trades ledgers for all accounts from connected
# api clients which report trades for **this session**.
trades = await proxy.trades()
(
trans_by_acct,
api_to_ledger_entries,
) = await update_ledger_from_api_trades(
trades,
proxy,
)
# if new trades are detected from the API, prepare
# them for the ledger file and update the pptable.
if api_to_ledger_entries:
trade_entries = api_to_ledger_entries.get(acctid)
if trade_entries:
# write ledger with all new trades **AFTER**
# we've updated the `pps.toml` from the
# original ledger state! (i.e. this is
# currently done on exit)
ledger.update(trade_entries)
trans = trans_by_acct.get(acctid)
if trans:
table.update_from_trans(trans)
# XXX: not sure exactly why it wouldn't be in
# the updated output (maybe this is a bug?) but
# if you create a pos from TWS and then load it
# from the api trades it seems we get a key
# error from ``update[bsuid]`` ?
pp = table.pps.get(bsuid)
if not pp:
log.error(
f'The contract id for {msg} may have '
f'changed to {bsuid}\nYou may need to '
'adjust your ledger for this, skipping '
'for now.'
)
continue
pp = table.pps[bsuid]
pairinfo = pp.symbol
if msg.size != pp.size:
log.error(
f'Pos size mismatch {pairinfo.front_fqsn()}:\n'
f'ib: {msg.size}\n'
f'piker: {pp.size}\n'
)
active_pps, closed_pps = table.dump_active()
# load all positions from `pps.toml`, cross check with
# ib's positions data, and relay re-formatted pps as
@ -641,6 +591,105 @@ async def trades_dialogue(
# - no new trades yet but we want to reload and audit any
# positions reported by ib's sys that may not yet be in
# piker's ``pps.toml`` state-file.
tables[acctid] = lstack.enter_context(
open_pps(
'ib',
acctid,
write_on_exit=True,
)
)
for account, proxy in proxies.items():
client = aioclients[account]
# order_msgs is filled in by this helper
await aggr_open_orders(
order_msgs,
client,
proxy,
accounts_def,
)
acctid: str = account.strip('ib.')
ledger: dict = ledgers[acctid]
table: PpTable = tables[acctid]
# update trades ledgers for all accounts from connected
# api clients which report trades for **this session**.
api_trades = await proxy.trades()
if api_trades:
trans_by_acct: dict[str, Transaction]
api_to_ledger_entries: dict[str, dict]
(
trans_by_acct,
api_to_ledger_entries,
) = await update_ledger_from_api_trades(
api_trades,
proxy,
accounts_def_inv,
)
# if new api_trades are detected from the API, prepare
# them for the ledger file and update the pptable.
if api_to_ledger_entries:
trade_entries = api_to_ledger_entries.get(acctid)
# TODO: fix this `tractor` BUG!
# https://github.com/goodboy/tractor/issues/354
# await tractor.breakpoint()
if trade_entries:
# write ledger with all new api_trades
# **AFTER** we've updated the `pps.toml`
# from the original ledger state! (i.e. this
# is currently done on exit)
for tid, entry in trade_entries.items():
ledger.setdefault(tid, {}).update(entry)
trans = trans_by_acct.get(acctid)
if trans:
table.update_from_trans(trans)
# update position table with latest ledger from all
# gathered transactions: ledger file + api records.
trans = norm_trade_records(ledger)
table.update_from_trans(trans)
# process pp value reported from ib's system. we only
# use these to cross-check sizing since average pricing
# on their end uses the so called (bs) "FIFO" style
# which more or less results in a price that's not
# useful for traders who want to not lose money.. xb
# -> collect all ib-pp reported positions so that we can be
# sure we know which positions to update from the ledger if
# any are missing from the ``pps.toml``
pos: IbPosition # named tuple subtype
for pos in client.positions():
# NOTE XXX: we skip options for now since we don't
# yet support the symbology nor the live feeds.
if isinstance(pos.contract, Option):
log.warning(
f'Option contracts not supported for now:\n'
f'{pos._asdict()}'
)
continue
bs_mktid, msg = pack_position(pos)
acctid = msg.account = accounts_def.inverse[msg.account]
acctid = acctid.strip('ib.')
cids2pps[(acctid, bs_mktid)] = msg
assert msg.account in accounts, (
f'Position for unknown account: {msg.account}')
# iterate all (newly) updated pps tables for every
# client-account and build out position msgs to deliver to
# EMS.
for acctid, table in tables.items():
active_pps, closed_pps = table.dump_active()
for pps in [active_pps, closed_pps]:
msgs = await update_and_audit_msgs(
acctid,
@ -661,22 +710,6 @@ async def trades_dialogue(
tuple(name for name in accounts_def if name in accounts),
))
# proxy wrapper for starting trade event stream
async def open_trade_event_stream(
client: Client,
task_status: TaskStatus[
trio.abc.ReceiveChannel
] = trio.TASK_STATUS_IGNORED,
):
# each api client has a unique event stream
async with tractor.to_asyncio.open_channel_from(
recv_trade_updates,
client=client,
) as (first, trade_event_stream):
task_status.started(trade_event_stream)
await trio.sleep_forever()
async with (
ctx.open_stream() as ems_stream,
trio.open_nursery() as n,
@ -723,44 +756,50 @@ async def trades_dialogue(
async def emit_pp_update(
ems_stream: tractor.MsgStream,
trade_entry: dict,
accounts_def: bidict,
accounts_def: bidict[str, str],
proxies: dict,
cids2pps: dict,
ledgers,
tables,
ledgers: dict[str, dict[str, Any]],
tables: dict[str, PpTable],
) -> None:
# compute and relay incrementally updated piker pp
acctid = accounts_def.inverse[trade_entry['execution']['acctNumber']]
proxy = proxies[acctid]
acctid = acctid.strip('ib.')
accounts_def_inv: bidict[str, str] = accounts_def.inverse
fq_acctid = accounts_def_inv[trade_entry['execution']['acctNumber']]
proxy = proxies[fq_acctid]
(
records_by_acct,
api_to_ledger_entries,
) = await update_ledger_from_api_trades(
[trade_entry],
proxy,
accounts_def_inv,
)
trans = records_by_acct[acctid]
trans = records_by_acct[fq_acctid]
r = list(trans.values())[0]
acctid = fq_acctid.strip('ib.')
table = tables[acctid]
table.update_from_trans(trans)
active, closed = table.dump_active()
# NOTE: update ledger with all new trades
for acctid, trades_by_id in api_to_ledger_entries.items():
for fq_acctid, trades_by_id in api_to_ledger_entries.items():
acctid = fq_acctid.strip('ib.')
ledger = ledgers[acctid]
ledger.update(trades_by_id)
for tid, tdict in trades_by_id.items():
# NOTE: don't override flex/previous entries with new API
# ones, just update with new fields!
ledger.setdefault(tid, {}).update(tdict)
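For clarity, the merge semantics above are plain `dict` behavior (values illustrative): `setdefault(tid, {}).update(...)` only overwrites the fields present in the new API entry, so fields only the older flex record carried survive.
ledger = {'tid1': {'price': 103.97, 'ibCommission': -0.8}}
api_entry = {'price': 103.97, 'datetime': '2022-05-20T02:26:33'}
ledger.setdefault('tid1', {}).update(api_entry)
assert ledger['tid1']['ibCommission'] == -0.8  # flex-only field kept
assert 'datetime' in ledger['tid1']  # new api field merged in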
# generate pp msgs and cross check with ib's positions data, relay
# re-formatted pps as msgs to the ems.
for pos in filter(
bool,
[active.get(r.bsuid), closed.get(r.bsuid)]
[active.get(r.bs_mktid), closed.get(r.bs_mktid)]
):
msgs = await update_and_audit_msgs(
acctid,
@ -859,8 +898,8 @@ async def deliver_trade_events(
# https://github.com/erdewit/ib_insync/issues/363
# acctid = accounts_def.inverse[trade.order.account]
# # double check there is no error when
# # cancelling.. gawwwd
# double check there is no error when
# cancelling.. gawwwd
# if ib_status_key == 'cancelled':
# last_log = trade.log[-1]
# if (
@ -1000,6 +1039,7 @@ async def deliver_trade_events(
accounts_def,
proxies,
cids2pps,
ledgers,
tables,
)
@ -1034,6 +1074,7 @@ async def deliver_trade_events(
accounts_def,
proxies,
cids2pps,
ledgers,
tables,
)
@ -1095,7 +1136,7 @@ async def deliver_trade_events(
def norm_trade_records(
ledger: dict[str, Any],
) -> list[Transaction]:
) -> dict[str, Transaction]:
'''
Normalize a flex report or API retrieved executions
ledger into our standard record format.
@ -1110,7 +1151,6 @@ def norm_trade_records(
comms = -1*record['ibCommission']
price = record.get('price') or record['tradePrice']
price_tick_digits = float_digits(price)
# the api doesn't do the -/+ on the quantity for you but flex
# records do.. are you fucking serious ib...!?
@ -1153,7 +1193,9 @@ def norm_trade_records(
# special handling of symbol extraction from
# flex records using some ad-hoc schema parsing.
asset_type: str = record.get('assetCategory') or record['secType']
asset_type: str = record.get(
'assetCategory'
) or record.get('secType', 'STK')
# TODO: XXX: WOA this is kinda hacky.. probably
# should figure out the correct future pair key more
@ -1170,49 +1212,55 @@ def norm_trade_records(
suffix = f'{exch}.{expiry}'
expiry = pendulum.parse(expiry)
src: str = record['currency']
# src: str = record['currency']
# price_tick_digits = float_digits(price)
tick_size = Decimal(
Decimal(10)**Decimal(str(price)).as_tuple().exponent
)
# TODO: convert to MktPair!!!
pair = Symbol.from_fqsn(
fqsn=f'{symbol}.{suffix}.ib',
info={
'tick_size_digits': price_tick_digits,
'tick_size': tick_size,
# NOTE: for "legacy" assets, volume is normally discrete, not
# a float, but we keep a digit in case the suitz decide
# to get crazy and change it; we'll be kinda ready
# schema-wise..
'lot_size_digits': 1,
'lot_tick_size': 0.0,
# TODO: remove when we switching from
# ``Symbol`` -> ``MktPair``
'asset_type': asset_type,
# TODO: figure out a target fin-type name
# set and normalize to that here!
'dst_type': asset_type.lower(),
# # TODO: figure out a target fin-type name
# # set and normalize to that here!
# 'dst_type': asset_type.lower(),
# starting to use new key naming as in ``MktPair``
# type we have drafted...
'src': src,
'src_type': 'fiat',
# # starting to use new key naming as in ``MktPair``
# # type we have drafted...
# 'src': src,
# 'src_type': 'fiat',
},
)
fqsn = pair.front_fqsn().rstrip('.ib')
fqme = pair.fqme
# NOTE: for flex records the normal fields for defining an fqsn
# NOTE: for flex records the normal fields for defining an fqme
# sometimes won't be available so we rely on two approaches for
# the "reverse lookup" of piker style fqsn keys:
# the "reverse lookup" of piker style fqme keys:
# - when dealing with API trade records received from
# `IB.trades()` we do a contract lookup at the time of processing
# - when dealing with flex records, it is assumed the record
# is at least a day old and thus the TWS position reporting system
# should already have entries if the pps are still open, in
# which case, we can pull the fqsn from that table (see
# which case, we can pull the fqme from that table (see
# `trades_dialogue()` above).
insort(
records,
Transaction(
fqsn=fqsn,
fqsn=fqme,
sym=pair,
tid=tid,
size=size,
@ -1220,7 +1268,7 @@ def norm_trade_records(
cost=comms,
dt=dt,
expiry=expiry,
bsuid=conid,
bs_mktid=str(conid),
),
key=lambda t: t.dt
)
@ -1228,18 +1276,8 @@ def norm_trade_records(
return {r.tid: r for r in records}
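The tick-size derivation above leans on `Decimal.as_tuple().exponent`, which reports the (negative) count of fractional digits; a minimal standalone sketch of the same trick:
from decimal import Decimal

def tick_from_price(price: float) -> Decimal:
    # 103.97 -> exponent -2 -> tick of 0.01
    exponent = Decimal(str(price)).as_tuple().exponent
    return Decimal(10) ** Decimal(exponent)

assert tick_from_price(103.97) == Decimal('0.01')
assert tick_from_price(4.80907954) == Decimal('1E-8')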
def parse_flex_dt(
record: str,
) -> pendulum.datetime:
date, ts = record.split(';')
dt = pendulum.parse(date)
ts = f'{ts[:2]}:{ts[2:4]}:{ts[4:]}'
tsdt = pendulum.parse(ts)
return dt.set(hour=tsdt.hour, minute=tsdt.minute, second=tsdt.second)
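Example call, assuming ib flex timestamps arrive as 'YYYYMMDD;HHMMSS' strings (value illustrative):
dt = parse_flex_dt('20230414;160040')
assert (dt.hour, dt.minute, dt.second) == (16, 0, 40)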
def api_trades_to_ledger_entries(
accounts: bidict,
accounts: bidict[str, str],
# TODO: maybe we should just be passing through the
# ``ib_insync.order.Trade`` instance directly here
@ -1309,148 +1347,3 @@ def api_trades_to_ledger_entries(
))
return trades_by_account
def flex_records_to_ledger_entries(
accounts: bidict,
trade_entries: list[object],
) -> dict:
'''
Convert flex report entry objects into ``dict`` form, pretty much
straight up without modification except add a `pydatetime` field
from the parsed timestamp.
'''
trades_by_account = {}
for t in trade_entries:
entry = t.__dict__
# XXX: LOL apparently ``toml`` has a bug
# where a section key error will show up in the write
# if you leave a table key as an `int`? So i guess
# cast to strs for all keys..
# oddly for some so-called "BookTrade" entries
# this field seems to be blank, no cuckin clue.
# trade['ibExecID']
tid = str(entry.get('ibExecID') or entry['tradeID'])
# date = str(entry['tradeDate'])
# XXX: is it going to cause problems if an account name
# gets lost? The user should be able to find it based
# on the actual exec history right?
acctid = accounts[str(entry['accountId'])]
# probably a flex record with a wonky non-std timestamp..
dt = entry['pydatetime'] = parse_flex_dt(entry['dateTime'])
entry['datetime'] = str(dt)
if not tid:
# this is likely some kind of internal adjustment
# transaction, likely one of the following:
# - an expiry event that will show a "book trade" indicating
# some adjustment to cash balances: zeroing or itm settle.
# - a manual cash balance position adjustment likely done by
# the user from the accounts window in TWS where they can
# manually set the avg price and size:
# https://api.ibkr.com/lib/cstools/faq/web1/index.html#/tag/DTWS_ADJ_AVG_COST
log.warning(f'Skipping ID-less ledger entry:\n{pformat(entry)}')
continue
trades_by_account.setdefault(
acctid, {}
)[tid] = entry
for acctid in trades_by_account:
trades_by_account[acctid] = dict(sorted(
trades_by_account[acctid].items(),
key=lambda entry: entry[1]['pydatetime'],
))
return trades_by_account
def load_flex_trades(
path: Optional[str] = None,
) -> dict[str, Any]:
from ib_insync import flexreport, util
conf = get_config()
if not path:
# load ``brokers.toml`` and try to get the flex
# token and query id that must be previously defined
# by the user.
token = conf.get('flex_token')
if not token:
raise ValueError(
'You must specify a ``flex_token`` field in your '
'`brokers.toml` in order to load your trade log, see our '
'instructions for how to set this up here:\n'
'PUT LINK HERE!'
)
qid = conf['flex_trades_query_id']
# TODO: hack this into our logging
# system like we do with the API client..
util.logToConsole()
# TODO: rewrite the query part of this with async..httpx?
report = flexreport.FlexReport(
token=token,
queryId=qid,
)
else:
# XXX: another project we could potentially look at,
# https://pypi.org/project/ibflex/
report = flexreport.FlexReport(path=path)
trade_entries = report.extract('Trade')
ln = len(trade_entries)
log.info(f'Loaded {ln} trades from flex query')
trades_by_account = flex_records_to_ledger_entries(
conf['accounts'].inverse, # reverse map to user account names
trade_entries,
)
ledger_dict: Optional[dict] = None
for acctid in trades_by_account:
trades_by_id = trades_by_account[acctid]
with open_trade_ledger('ib', acctid) as ledger_dict:
tid_delta = set(trades_by_id) - set(ledger_dict)
log.info(
'New trades detected\n'
f'{pformat(tid_delta)}'
)
if tid_delta:
sorted_delta = dict(sorted(
{tid: trades_by_id[tid] for tid in tid_delta}.items(),
key=lambda entry: entry[1].pop('pydatetime'),
))
ledger_dict.update(sorted_delta)
return ledger_dict
if __name__ == '__main__':
import sys
import os
args = sys.argv
if len(args) > 1:
args = args[1:]
for arg in args:
path = os.path.abspath(arg)
load_flex_trades(path=path)
else:
# expect brokers.toml to have an entry and
# pull from the web service.
load_flex_trades()

View File

@ -20,6 +20,7 @@ Data feed endpoints pre-wrapped and ready for use with ``tractor``/``trio``.
from __future__ import annotations
import asyncio
from contextlib import asynccontextmanager as acm
from decimal import Decimal
from dataclasses import asdict
from datetime import datetime
from functools import partial
@ -618,7 +619,7 @@ async def _setup_quote_stream(
async def open_aio_quote_stream(
symbol: str,
contract: Optional[Contract] = None,
contract: Contract | None = None,
) -> trio.abc.ReceiveStream:
@ -735,9 +736,19 @@ async def stream_quotes(
sym = symbols[0]
log.info(f'request for real-time quotes: {sym}')
proxy: MethodProxy
async with open_data_client() as proxy:
con, first_ticker, details = await proxy.get_sym_details(symbol=sym)
try:
(
con, # Contract
first_ticker, # Ticker
details, # ContractDetails
) = await proxy.get_sym_details(symbol=sym)
except ConnectionError:
log.exception(f'Proxy is ded {proxy._aio_ns}')
raise
first_quote = normalize(first_ticker)
# print(f'first quote: {first_quote}')
@ -748,6 +759,7 @@ async def stream_quotes(
'''
# pass back some symbol info like min_tick, trading_hours, etc.
con: Contract = details.contract
syminfo = asdict(details)
syminfo.update(syminfo['contract'])
@ -765,15 +777,24 @@ async def stream_quotes(
}:
syminfo['no_vlm'] = True
# XXX: pretty sure we don't need this any more right?
# for stocks it seems TWS reports too small a tick size
# such that you can't submit orders with that granularity?
min_tick = 0.01 if atype == 'stock' else 0
# min_price_tick = Decimal('0.01') if atype == 'stock' else 0
# price_tick = max(price_tick, min_tick)
syminfo['price_tick_size'] = max(syminfo['minTick'], min_tick)
price_tick: Decimal = Decimal(str(syminfo['minTick']))
size_tick: Decimal = Decimal(str(syminfo['minSize']).rstrip('0'))
# for "legacy" assets, volume is normally discrete, not
# a float
syminfo['lot_tick_size'] = 0.0
# XXX: GRRRR they don't support fractional share sizes for
# stocks from the API?!
if con.secType == 'STK':
size_tick = Decimal('1')
syminfo['price_tick_size'] = price_tick
# NOTE: as you'd expect for "legacy" assets, the "volume
# precision" is normally discrete.
syminfo['lot_tick_size'] = size_tick
ibclient = proxy._aio_ns.ib.client
host, port = ibclient.host, ibclient.port
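A condensed sketch of the tick normalization above; the `minTick`/`minSize` field names come from ib's `ContractDetails` as used in this hunk, while the helper name and sample values are illustrative:
from decimal import Decimal

def ib_ticks(
    min_tick: float,
    min_size: float,
    sec_type: str,
) -> tuple[Decimal, Decimal]:
    price_tick = Decimal(str(min_tick))
    # rstrip so eg. '0.010' keeps a clean exponent -> '0.01'
    size_tick = Decimal(str(min_size).rstrip('0'))
    if sec_type == 'STK':
        # ib rejects fractional share sizes over the API..
        size_tick = Decimal('1')
    return price_tick, size_tick

assert ib_ticks(0.01, 1.0, 'STK') == (Decimal('0.01'), Decimal('1'))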
@ -820,7 +841,7 @@ async def stream_quotes(
await trio.sleep_forever()
return # we never expect feed to come up?
cs: Optional[trio.CancelScope] = None
cs: trio.CancelScope | None = None
startup: bool = True
while (
startup

View File

@ -58,7 +58,7 @@ your ``pps.toml`` file will have position entries like,
[kraken.spot."xmreur.kraken"]
size = 4.80907954
ppu = 103.97000000
bsuid = "XXMRZEUR"
bs_mktid = "XXMRZEUR"
clears = [
{ tid = "TFJBKK-SMBZS-VJ4UWS", cost = 0.8, price = 103.97, size = 4.80907954, dt = "2022-05-20T02:26:33.413397+00:00" },
]

View File

@ -34,6 +34,7 @@ from .api import (
get_client,
)
from .feed import (
get_mkt_info,
open_history_client,
open_symbol_search,
stream_quotes,

View File

@ -20,10 +20,10 @@ Kraken web API wrapping.
'''
from contextlib import asynccontextmanager as acm
from datetime import datetime
from decimal import Decimal
import itertools
from typing import (
Any,
Optional,
Union,
)
import time
@ -41,14 +41,18 @@ import trio
from piker import config
from piker.data.types import Struct
from piker.data._source import Symbol
from piker.accounting._mktinfo import (
Asset,
MktPair,
digits_to_dec,
)
from piker.brokers._util import (
resproc,
SymbolNotFound,
BrokerError,
DataThrottle,
)
from piker.pp import Transaction
from piker.accounting import Transaction
from . import log
# <uri>/<version>/
@ -155,12 +159,23 @@ class Pair(Struct):
short_position_limit: float = 0
long_position_limit: float = float('inf')
@property
def price_tick(self) -> Decimal:
return digits_to_dec(self.pair_decimals)
@property
def size_tick(self) -> Decimal:
return digits_to_dec(self.lot_decimals)
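`digits_to_dec()` is assumed here to map kraken's decimal-place counts to a smallest-increment `Decimal`; a rough standalone equivalent:
from decimal import Decimal

def digits_to_dec_sketch(ndigits: int) -> Decimal:
    # eg. pair_decimals=1 -> price_tick 0.1,
    #     lot_decimals=8  -> size_tick 1E-8
    return Decimal('1').scaleb(-ndigits)

assert digits_to_dec_sketch(1) == Decimal('0.1')
assert digits_to_dec_sketch(8) == Decimal('1E-8')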
class Client:
# global symbol normalization table
# symbol mapping from all names to the altname
_ntable: dict[str, str] = {}
_atable: bidict[str, str] = bidict()
# 2-way map of symbol names to their "alt names" ffs XD
_altnames: bidict[str, str] = bidict()
_pairs: dict[str, Pair] = {}
def __init__(
@ -176,11 +191,13 @@ class Client:
'User-Agent':
'krakenex/2.1.0 (+https://github.com/veox/python3-krakenex)'
})
self.conf: dict[str, str] = config
self._name = name
self._api_key = api_key
self._secret = secret
self.conf: dict[str, str] = config
self.assets: dict[str, Asset] = {}
@property
def pairs(self) -> dict[str, Pair]:
if self._pairs is None:
@ -247,20 +264,54 @@ class Client:
'Balance',
{},
)
by_bsuid = resp['result']
by_bsmktid = resp['result']
# TODO: we need to pull out the "asset" decimals
# data and return a `decimal.Decimal` instead here!
# using the underlying Asset
return {
self._atable[sym].lower(): float(bal)
for sym, bal in by_bsuid.items()
self._altnames[sym].lower(): float(bal)
for sym, bal in by_bsmktid.items()
}
async def get_assets(self) -> dict[str, dict]:
'''
Get all assets available for trading and xfer.
https://docs.kraken.com/rest/#tag/Market-Data/operation/getAssetInfo
return msg:
"asset1": {
"aclass": "string",
"altname": "string",
"decimals": 0,
"display_decimals": 0,
"collateral_value": 0,
"status": "string"
}
'''
resp = await self._public('Assets', {})
return resp['result']
async def cache_assets(self) -> None:
assets = self.assets = await self.get_assets()
for bsuid, info in assets.items():
self._atable[bsuid] = info['altname']
'''
Load and cache all asset infos and pack into
our native ``Asset`` struct.
'''
assets = await self.get_assets()
for bs_mktid, info in assets.items():
aname = self._altnames[bs_mktid] = info['altname']
aclass = info['aclass']
self.assets[bs_mktid] = Asset(
name=aname.lower(),
atype=f'crypto_{aclass}',
tx_tick=digits_to_dec(info['decimals']),
info=info,
)
async def get_trades(
self,
@ -323,10 +374,15 @@ class Client:
Currently only withdrawals are supported.
'''
xfers: list[dict] = (await self.endpoint(
resp = await self.endpoint(
'WithdrawStatus',
{'asset': asset},
))['result']
)
try:
xfers: list[dict] = resp['result']
except KeyError:
log.exception(f'Kraken suxxx: {resp}')
return []
# eg. resp schema:
# 'result': [{'method': 'Bitcoin', 'aclass': 'currency', 'asset':
@ -341,28 +397,21 @@ class Client:
# look up the normalized name and asset info
asset_key = entry['asset']
asset_info = self.assets[asset_key]
asset = self._atable[asset_key].lower()
asset = self.assets[asset_key]
asset_key = self._altnames[asset_key].lower()
# XXX: this is in the asset units (likely) so it isn't
# quite the same as a commissions cost necessarily..)
cost = float(entry['fee'])
fqsn = asset + '.kraken'
pairinfo = Symbol.from_fqsn(
fqsn,
info={
'asset_type': 'crypto',
'lot_tick_size': asset_info['decimals'],
},
)
fqme = asset_key + '.kraken'
tran = Transaction(
fqsn=fqsn,
sym=pairinfo,
tx = Transaction(
fqsn=fqme,
sym=asset,
tid=entry['txid'],
dt=pendulum.from_timestamp(entry['time']),
bsuid=f'{asset}{src_asset}',
bs_mktid=f'{asset_key}{src_asset}',
size=-1*(
float(entry['amount'])
+
@ -375,7 +424,7 @@ class Client:
# XXX: see note above
cost=cost,
)
trans[tran.tid] = tran
trans[tx.tid] = tx
return trans
@ -424,9 +473,9 @@ class Client:
# txid is a transaction id given by kraken
return await self.endpoint('CancelOrder', {"txid": reqid})
async def symbol_info(
async def pair_info(
self,
pair: Optional[str] = None,
pair: str | None = None,
) -> dict[str, Pair] | Pair:
@ -447,7 +496,36 @@ class Client:
_, data = next(iter(pairs.items()))
return Pair(**data)
else:
return {key: Pair(**data) for key, data in pairs.items()}
return {
key: Pair(**data)
for key, data in pairs.items()
}
async def mkt_info(
self,
pair_str: str,
) -> MktPair:
(
bs_mktid, # str
pair_info, # Pair
) = Client.normalize_symbol(pair_str)
dst_asset = self.assets[pair_info.base]
# NOTE XXX parse out the src asset name until we figure out
# how to get the src asset's `Pair` info from kraken..
src_key = pair_str.lstrip(dst_asset.name.upper()).lower()
return MktPair(
dst=dst_asset,
price_tick=pair_info.price_tick,
size_tick=pair_info.size_tick,
bs_mktid=bs_mktid,
src=src_key,
broker='kraken',
)
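Hypothetical usage of the new `mkt_info()` endpoint (pair key illustrative; assumes the pair/asset caches are loaded on client entry):
async def show_mkt():
    async with get_client() as client:
        mkt = await client.mkt_info('XBTUSD')
        # kraken's native rest key rides along as the
        # broker-specific market id:
        print(mkt.bs_mktid, mkt.price_tick, mkt.size_tick)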
async def cache_symbols(self) -> dict:
'''
@ -460,7 +538,7 @@ class Client:
'''
if not self._pairs:
self._pairs.update(await self.symbol_info())
self._pairs.update(await self.pair_info())
# table of all ws and rest keys to their alt-name values.
ntable: dict[str, str] = {}
@ -470,7 +548,7 @@ class Client:
pair: Pair = self._pairs[rest_key]
altname = pair.altname
wsname = pair.wsname
ntable[rest_key] = ntable[wsname] = altname
ntable[altname] = ntable[rest_key] = ntable[wsname] = altname
# register the pair under all monikers, a giant flat
# surjection of all possible names to each info obj.
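For example with BTC/USD all three kraken monikers collapse to the altname (names illustrative of kraken's usual scheme):
ntable: dict[str, str] = {}
rest_key, wsname, altname = 'XXBTZUSD', 'XBT/USD', 'XBTUSD'
ntable[altname] = ntable[rest_key] = ntable[wsname] = altname
assert set(ntable.values()) == {'XBTUSD'}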

View File

@ -21,7 +21,6 @@ Order api and machinery
from collections import ChainMap, defaultdict
from contextlib import (
asynccontextmanager as acm,
contextmanager as cm,
)
from functools import partial
from itertools import count
@ -35,20 +34,23 @@ from typing import (
Union,
)
from async_generator import aclosing
from bidict import bidict
import pendulum
import trio
import tractor
from piker.pp import (
from piker.accounting import (
Position,
PpTable,
Transaction,
TransactionLedger,
open_trade_ledger,
open_pps,
get_likely_pair,
)
from piker.accounting._mktinfo import (
MktPair,
)
from piker.data._source import Symbol
from piker.clearing._messages import (
Order,
Status,
@ -67,7 +69,6 @@ from .api import (
get_client,
)
from .feed import (
get_console_log,
open_autorecon_ws,
NoBsWs,
stream_messages,
@ -367,6 +368,8 @@ def trades2pps(
acctid: str,
new_trans: dict[str, Transaction] = {},
write_storage: bool = True,
) -> tuple[
list[BrokerdPosition],
list[Transaction],
@ -397,13 +400,20 @@ def trades2pps(
# right since `.broker` is already
# included?
account='kraken.' + acctid,
symbol=p.symbol.front_fqsn(),
symbol=p.symbol.fqme,
size=p.size,
avg_price=p.ppu,
currency='',
)
position_msgs.append(msg)
if write_storage:
# TODO: ideally this blocks this task
# as little as possible. we need to either do
# these writes in another actor, or try out `trio`'s
# async file IO api?
table.write_config()
return position_msgs
@ -414,9 +424,6 @@ async def trades_dialogue(
) -> AsyncIterator[dict[str, Any]]:
# XXX: required to propagate ``tractor`` loglevel to ``piker`` logging
get_console_log(loglevel or tractor.current_actor().loglevel)
async with get_client() as client:
if not client._api_key:
@ -467,26 +474,39 @@ async def trades_dialogue(
# update things correctly.
simulate_pp_update: bool = False
table: PpTable
ledger: TransactionLedger
with (
open_pps(
'kraken',
acctid
acctid,
write_on_exit=True,
) as table,
open_trade_ledger(
'kraken',
acctid
) as ledger_dict,
acctid,
) as ledger,
):
# transaction-ify the ledger entries
ledger_trans = norm_trade_records(ledger_dict)
ledger_trans = norm_trade_records(ledger)
if not table.pps:
# NOTE: we can't use this since it first needs
# broker: str input support!
# table.update_from_trans(ledger.to_trans())
table.update_from_trans(ledger_trans)
table.write_config()
# TODO: eventually probably only load
# as far back as whatever isn't delivered in the
# most recent 50 trades and assume that by ordering we
# already have those records in the ledger.
tids2trades = await client.get_trades()
ledger_dict.update(tids2trades)
ledger.update(tids2trades)
if tids2trades:
ledger.write_config()
api_trans = norm_trade_records(tids2trades)
# retrieve kraken reported balances
@ -494,13 +514,15 @@ async def trades_dialogue(
# what amount of trades-transactions need
# to be reloaded.
balances = await client.get_balances()
for dst, size in balances.items():
# we don't care about tracking positions
# in the user's source fiat currency.
if (
dst == src_fiat
or not any(
dst in bsuid for bsuid in table.pps
dst in bs_mktid for bs_mktid in table.pps
)
):
log.warning(
@ -508,45 +530,20 @@ async def trades_dialogue(
)
continue
def get_likely_pair(
dst: str,
bsuid: str,
src_fiat: str = src_fiat
) -> str:
'''
Attempt to get the likely trading pair matching
a given destination asset `dst: str`.
'''
try:
src_name_start = bsuid.rindex(src_fiat)
except (
ValueError, # substr not found
):
# TODO: handle nested positions..(i.e.
# positions where the src fiat was used to
# buy some other dst which was further used
# to buy another dst..)
log.warning(
f'No src fiat {src_fiat} found in {bsuid}?'
)
return
likely_dst = bsuid[:src_name_start]
if likely_dst == dst:
return bsuid
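Assuming the factored-out `piker.accounting.get_likely_pair()` keeps the closure semantics shown above, usage looks like (names illustrative):
from piker.accounting import get_likely_pair

assert get_likely_pair('zeur', 'xxmr', 'xxmrzeur') == 'xxmrzeur'
# prefix mismatch -> no likely pair
assert get_likely_pair('zeur', 'xxmr', 'xxbtzeur') is None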
def has_pp(
dst: str,
size: float,
) -> Position | bool:
) -> Position | None:
src2dst: dict[str, str] = {}
for bsuid in table.pps:
likely_pair = get_likely_pair(dst, bsuid)
for bs_mktid in table.pps:
likely_pair = get_likely_pair(
src_fiat,
dst,
bs_mktid,
)
if likely_pair:
src2dst[src_fiat] = dst
@ -565,7 +562,7 @@ async def trades_dialogue(
):
log.warning(
f'`kraken` account says you have a ZERO '
f'balance for {bsuid}:{pair}\n'
f'balance for {bs_mktid}:{pair}\n'
f'but piker seems to think `{pp.size}`\n'
'This is likely a discrepancy in piker '
'accounting if the above number is'
@ -574,7 +571,7 @@ async def trades_dialogue(
)
return pp
return False
return None # signal no entry
pos = has_pp(dst, size)
if not pos:
@ -601,8 +598,12 @@ async def trades_dialogue(
# in the ``pps.toml`` for the necessary pair
# yet and thus this likely pair grabber will
# likely fail.
for bsuid in table.pps:
likely_pair = get_likely_pair(dst, bsuid)
for bs_mktid in table.pps:
likely_pair = get_likely_pair(
src_fiat,
dst,
bs_mktid,
)
if likely_pair:
break
else:
@ -652,6 +653,12 @@ async def trades_dialogue(
)
await ctx.started((ppmsgs, [acc_name]))
# TODO: ideally this blocks this task
# as little as possible. we need to either do
# these writes in another actor, or try out `trio`'s
# async file IO api?
table.write_config()
# Get websocket token for authenticated data stream
# Assert that a token was actually received.
resp = await client.endpoint('GetWebSocketsToken', {})
@ -671,11 +678,9 @@ async def trades_dialogue(
token=token,
),
) as ws,
aclosing(stream_messages(ws)) as stream,
stream_messages(ws) as stream,
trio.open_nursery() as nurse,
):
stream = stream_messages(ws)
# task for processing inbound requests from ems
nurse.start_soon(
handle_order_requests,
@ -724,8 +729,8 @@ async def handle_order_updates(
'''
Main msg handling loop for all things order management.
This code is broken out to make the context explicit and state variables
defined in the signature clear to the reader.
This code is broken out to make the context explicit and state
variables defined in the signature clear to the reader.
'''
async for msg in ws_stream:
@ -827,8 +832,6 @@ async def handle_order_updates(
for pp_msg in ppmsgs:
await ems_stream.send(pp_msg)
ledger_trans.update(new_trans)
# process and relay order state change events
# https://docs.kraken.com/websockets/#message-openOrders
case [
@ -1197,30 +1200,28 @@ def norm_trade_records(
}[record['type']]
# we normalize to kraken's `altname` always..
bsuid, pair_info = Client.normalize_symbol(record['pair'])
fqsn = f'{bsuid}.kraken'
bs_mktid, pair_info = Client.normalize_symbol(
record['pair']
)
fqme = f'{bs_mktid}.kraken'
mktpair = Symbol.from_fqsn(
fqsn,
info={
'lot_size_digits': pair_info.lot_decimals,
'tick_size_digits': pair_info.pair_decimals,
'asset_type': 'crypto',
},
dst, src = pair_info.wsname.lower().split('/')
mkt = MktPair.from_fqme(
fqme,
price_tick=pair_info.price_tick,
size_tick=pair_info.size_tick,
bs_mktid=bs_mktid,
)
records[tid] = Transaction(
fqsn=fqsn,
sym=mktpair,
fqsn=fqme,
sym=mkt,
tid=tid,
size=size,
price=float(record['price']),
cost=float(record['fee']),
dt=pendulum.from_timestamp(float(record['time'])),
bsuid=bsuid,
# XXX: there are no derivs on kraken right?
# expiry=expiry,
bs_mktid=bs_mktid,
)
return records

View File

@ -35,13 +35,15 @@ from trio_util import trio_async_generator
import tractor
import trio
from piker.accounting._mktinfo import (
MktPair,
)
from piker._cacheables import open_cached_client
from piker.brokers._util import (
BrokerError,
DataThrottle,
DataUnavailable,
)
from piker.log import get_console_log
from piker.data.types import Struct
from piker.data._web_bs import open_autorecon_ws, NoBsWs
from . import log
@ -75,6 +77,7 @@ class OHLC(Struct):
ticks: list[Any] = []
@trio_async_generator
async def stream_messages(
ws: NoBsWs,
):
@ -130,63 +133,75 @@ async def process_data_feed_msgs(
Parse and pack data feed messages.
'''
async for msg in stream_messages(ws):
match msg:
case {
'errorMessage': errmsg
}:
raise BrokerError(errmsg)
async with stream_messages(ws) as ws_stream:
async for msg in ws_stream:
match msg:
case {
'errorMessage': errmsg
}:
raise BrokerError(errmsg)
case {
'event': 'subscriptionStatus',
} as sub:
log.info(
'WS subscription is active:\n'
f'{sub}'
)
continue
case [
chan_id,
*payload_array,
chan_name,
pair
]:
if 'ohlc' in chan_name:
ohlc = OHLC(
chan_id,
chan_name,
pair,
*payload_array[0]
case {
'event': 'subscriptionStatus',
} as sub:
log.info(
'WS subscription is active:\n'
f'{sub}'
)
ohlc.typecast()
yield 'ohlc', ohlc
continue
elif 'spread' in chan_name:
case [
chan_id,
*payload_array,
chan_name,
pair
]:
if 'ohlc' in chan_name:
ohlc = OHLC(
chan_id,
chan_name,
pair,
*payload_array[0]
)
ohlc.typecast()
yield 'ohlc', ohlc
bid, ask, ts, bsize, asize = map(
float, payload_array[0])
elif 'spread' in chan_name:
# TODO: really makes you think IB has a horrible API...
quote = {
'symbol': pair.replace('/', ''),
'ticks': [
{'type': 'bid', 'price': bid, 'size': bsize},
{'type': 'bsize', 'price': bid, 'size': bsize},
bid, ask, ts, bsize, asize = map(
float, payload_array[0])
{'type': 'ask', 'price': ask, 'size': asize},
{'type': 'asize', 'price': ask, 'size': asize},
],
}
yield 'l1', quote
# TODO: really makes you think IB has a horrible API...
quote = {
'symbol': pair.replace('/', ''),
'ticks': [
{'type': 'bid', 'price': bid, 'size': bsize},
{'type': 'bsize', 'price': bid, 'size': bsize},
# elif 'book' in msg[-2]:
# chan_id, *payload_array, chan_name, pair = msg
# print(msg)
{'type': 'ask', 'price': ask, 'size': asize},
{'type': 'asize', 'price': ask, 'size': asize},
],
}
yield 'l1', quote
case _:
print(f'UNHANDLED MSG: {msg}')
# yield msg
# elif 'book' in msg[-2]:
# chan_id, *payload_array, chan_name, pair = msg
# print(msg)
case {
'connectionID': conid,
'event': 'systemStatus',
'status': 'online',
'version': ver,
}:
log.info(
f'Established {ver} ws connection with id: {conid}'
)
continue
case _:
print(f'UNHANDLED MSG: {msg}')
# yield msg
def normalize(
@ -263,6 +278,27 @@ async def open_history_client(
yield get_ohlc, {'erlangs': 1, 'rate': 1}
async def get_mkt_info(
fqme: str,
) -> tuple[MktPair, Pair]:
'''
Query for and return a `MktPair` and backend-native `Pair` (or
wtv else) info.
If more than one fqme is provided return a ``dict`` of native
key-strs to `MktPair`s.
'''
async with open_cached_client('kraken') as client:
# uppercase since kraken bs_mktid is always upper
sym_str = fqme.upper()
pair: Pair = await client.pair_info(sym_str)
mkt: MktPair = await client.mkt_info(sym_str)
return mkt, pair
async def stream_quotes(
send_chan: trio.abc.SendChannel,
@ -283,43 +319,29 @@ async def stream_quotes(
``pairs`` must be formatted <crypto_symbol>/<fiat_symbol>.
'''
# XXX: required to propagate ``tractor`` loglevel to piker logging
get_console_log(loglevel or tractor.current_actor().loglevel)
ws_pairs = {}
sym_infos = {}
ws_pairs: list[str] = []
mkt_infos: dict[str, MktPair] = {}
async with open_cached_client('kraken') as client, send_chan as send_chan:
# keep client cached for real-time section
for sym in symbols:
# transform to upper since piker style is always lower
sym = sym.upper()
si: Pair = await client.symbol_info(sym)
# try:
# si = Pair(**sym_info) # validation
# except TypeError:
# fields_diff = set(sym_info) - set(Pair.__struct_fields__)
# raise TypeError(
# f'Missing msg fields {fields_diff}'
# )
syminfo = si.to_dict()
syminfo['price_tick_size'] = 1. / 10**si.pair_decimals
syminfo['lot_tick_size'] = 1. / 10**si.lot_decimals
syminfo['asset_type'] = 'crypto'
sym_infos[sym] = syminfo
ws_pairs[sym] = si.wsname
async with (
send_chan as send_chan,
):
for sym_str in symbols:
mkt, pair = await get_mkt_info(sym_str)
mkt_infos[sym_str] = mkt
ws_pairs.append(pair.wsname)
symbol = symbols[0].lower()
# sync with `.data.feed` caller
# TODO: should we make this init msg a `Struct`?
init_msgs = {
# pass back token, and bool, signalling if we're the writer
# and that history has been written
symbol: {
'symbol_info': sym_infos[sym],
'shm_write_opts': {'sum_tick_vml': False},
'fqsn': sym,
'fqsn': sym_str,
'mkt_info': mkt_infos[sym_str],
'shm_write_opts': {
'sum_tick_vml': False,
},
},
}
@ -332,7 +354,7 @@ async def stream_quotes(
# https://github.com/krakenfx/kraken-wsclient-py/blob/master/kraken_wsclient_py/kraken_wsclient_py.py#L188
ohlc_sub = {
'event': 'subscribe',
'pair': list(ws_pairs.values()),
'pair': ws_pairs,
'subscription': {
'name': 'ohlc',
'interval': 1,
@ -348,7 +370,7 @@ async def stream_quotes(
# trade data (aka L1)
l1_sub = {
'event': 'subscribe',
'pair': list(ws_pairs.values()),
'pair': ws_pairs,
'subscription': {
'name': 'spread',
# 'depth': 10}
@ -363,7 +385,7 @@ async def stream_quotes(
# unsub from all pairs on teardown
if ws.connected():
await ws.send_msg({
'pair': list(ws_pairs.values()),
'pair': ws_pairs,
'event': 'unsubscribe',
'subscription': ['ohlc', 'spread'],
})

View File

@ -43,10 +43,13 @@ from ..calc import humanize, percent_change
from .._cacheables import open_cached_client, async_lifo_cache
from .. import config
from ._util import resproc, BrokerError, SymbolNotFound
from ..log import get_logger, colorize_json, get_console_log
log = get_logger(__name__)
from ..log import (
colorize_json,
)
from .util import (
log,
get_console_log,
)
_use_practice_account = False
_refresh_token_ep = 'https://{}login.questrade.com/oauth2/'

View File

@ -27,12 +27,13 @@ from typing import List
from async_generator import asynccontextmanager
import asks
from ..log import get_logger
from ._util import resproc, BrokerError
from ._util import (
resproc,
BrokerError,
log,
)
from ..calc import percent_change
log = get_logger(__name__)
_service_ep = 'https://api.robinhood.com'
@ -65,8 +66,10 @@ class Client:
self.api = _API(self._sess)
def _zip_in_order(self, symbols: [str], quotes: List[dict]):
return {quote.get('symbol', sym) if quote else sym: quote
for sym, quote in zip(symbols, results_dict)}
return {
quote.get('symbol', sym) if quote else sym: quote
for sym, quote in zip(symbols, quotes)
}
async def quote(self, symbols: [str]):
"""Retrieve quotes for a list of ``symbols``.

View File

@ -18,9 +18,17 @@
Market machinery for order executions, book, management.
"""
from ._client import open_ems
from ..log import get_logger
from ._client import (
open_ems,
OrderClient,
)
__all__ = [
'open_ems',
'OrderClient',
]
log = get_logger(__name__)

View File

@ -1,5 +1,5 @@
# piker: trading gear for hackers
# Copyright (C) Tyler Goodlet (in stewardship for piker0)
# Copyright (C) Tyler Goodlet (in stewardship for pikers)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
@ -27,68 +27,105 @@ import trio
import tractor
from tractor.trionics import broadcast_receiver
from ..log import get_logger
from ._util import (
log, # sub-sys logger
)
from ..accounting._mktinfo import unpack_fqme
from ..data.types import Struct
from ..service import maybe_open_emsd
from ._messages import (
Order,
Cancel,
BrokerdPosition,
)
from ..brokers import get_brokermod
if TYPE_CHECKING:
from ._messages import (
BrokerdPosition,
Status,
)
log = get_logger(__name__)
class OrderClient(Struct):
'''
EMS-client-side order book ctl and tracking.
class OrderBook(Struct):
'''EMS-client-side order book ctl and tracking.
A style similar to "model-view" is used here where this api is
provided as a supervised control for an EMS actor which does all the
hard/fast work of talking to brokers/exchanges to conduct
executions.
Currently, this is mostly for keeping local state to match the EMS
and use received events to trigger graphics updates.
(A)sync API for submitting orders and alerts to the `emsd` service;
this is the main control for execution management from client code.
'''
# IPC stream to `emsd` actor
_ems_stream: tractor.MsgStream
# mem channels used to relay order requests to the EMS daemon
_to_ems: trio.abc.SendChannel
_from_order_book: trio.abc.ReceiveChannel
_to_relay_task: trio.abc.SendChannel
_from_sync_order_client: trio.abc.ReceiveChannel
# history table
_sent_orders: dict[str, Order] = {}
def send(
def send_nowait(
self,
msg: Order | dict,
) -> dict:
) -> dict | Order:
'''
Sync version of ``.send()``.
'''
self._sent_orders[msg.oid] = msg
self._to_ems.send_nowait(msg)
self._to_relay_task.send_nowait(msg)
return msg
def send_update(
async def send(
self,
msg: Order | dict,
) -> dict | Order:
'''
Send a new order msg async to the `emsd` service.
'''
self._sent_orders[msg.oid] = msg
await self._ems_stream.send(msg)
return msg
def update_nowait(
self,
uuid: str,
**data: dict,
) -> dict:
'''
Sync version of ``.update()``.
'''
cmd = self._sent_orders[uuid]
msg = cmd.copy(update=data)
self._sent_orders[uuid] = msg
self._to_ems.send_nowait(msg)
return cmd
self._to_relay_task.send_nowait(msg)
return msg
def cancel(self, uuid: str) -> bool:
"""Cancel an order (or alert) in the EMS.
async def update(
self,
uuid: str,
**data: dict,
) -> dict:
'''
Update an existing order dialog with a msg updated from
``update`` kwargs.
"""
'''
cmd = self._sent_orders[uuid]
msg = cmd.copy(update=data)
self._sent_orders[uuid] = msg
await self._ems_stream.send(msg)
return msg
def _mk_cancel_msg(
self,
uuid: str,
) -> Cancel:
cmd = self._sent_orders.get(uuid)
if not cmd:
log.error(
@ -96,77 +133,77 @@ class OrderBook(Struct):
f'Maybe there is a stale entry or line?\n'
f'You should report this as a bug!'
)
msg = Cancel(
fqme = str(cmd.symbol)
return Cancel(
oid=uuid,
symbol=cmd.symbol,
)
self._to_ems.send_nowait(msg)
_orders: OrderBook = None
def get_orders(
emsd_uid: tuple[str, str] = None
) -> OrderBook:
""""
OrderBook singleton factory per actor.
"""
if emsd_uid is not None:
# TODO: read in target emsd's active book on startup
pass
global _orders
if _orders is None:
size = 100
tx, rx = trio.open_memory_channel(size)
brx = broadcast_receiver(rx, size)
# setup local ui event streaming channels for request/resp
# streaming with the EMS daemon
_orders = OrderBook(
_to_ems=tx,
_from_order_book=brx,
symbol=fqme,
)
return _orders
def cancel_nowait(
self,
uuid: str,
) -> None:
'''
Sync version of ``.cancel()``.
'''
self._to_relay_task.send_nowait(
self._mk_cancel_msg(uuid)
)
async def cancel(
self,
uuid: str,
) -> bool:
'''
Cancel an already existing order (or alert) dialog.
'''
await self._ems_stream.send(
self._mk_cancel_msg(uuid)
)
# TODO: we can get rid of this relay loop once we move
# order_mode inputs to async code!
async def relay_order_cmds_from_sync_code(
_client: OrderClient = None
async def relay_orders_from_sync_code(
client: OrderClient,
symbol_key: str,
to_ems_stream: tractor.MsgStream,
) -> None:
"""
Order streaming task: deliver orders transmitted from UI
to downstream consumers.
'''
Order submission relay task: deliver orders sent from synchronous (UI)
code to the EMS via ``OrderClient._from_sync_order_client``.
This is run in the UI actor (usually the one running Qt but could be
any other client service code). This process simply delivers order
messages to the above ``_to_ems`` send channel (from sync code using
messages to the above ``_to_relay_task`` send channel (from sync code using
``.send_nowait()``); these values are pulled from the channel here
and relayed to any consumer(s) that called this function using
a ``tractor`` portal.
This effectively makes order messages look like they're being
"pushed" from the parent to the EMS where local sync code is likely
doing the pushing from some UI.
doing the pushing from some non-async UI handler.
"""
book = get_orders()
async with book._from_order_book.subscribe() as orders_stream:
async for cmd in orders_stream:
'''
async with (
client._from_sync_order_client.subscribe() as sync_order_cmds
):
async for cmd in sync_order_cmds:
sym = cmd.symbol
msg = pformat(cmd)
msg = pformat(cmd.to_dict())
if sym == symbol_key:
log.info(f'Send order cmd:\n{msg}')
# send msg over IPC / wire
await to_ems_stream.send(cmd)
else:
log.warning(
f'Ignoring unmatched order cmd for {sym} != {symbol_key}:'
@ -176,62 +213,37 @@ async def relay_order_cmds_from_sync_code(
@acm
async def open_ems(
fqsn: str,
fqme: str,
mode: str = 'live',
loglevel: str = 'error',
) -> tuple[
OrderBook,
OrderClient,
tractor.MsgStream,
dict[
# brokername, acctid
tuple[str, str],
list[BrokerdPosition],
dict[str, BrokerdPosition],
],
list[str],
dict[str, Status],
]:
'''
Spawn an EMS daemon and begin sending orders and receiving
alerts.
(Maybe) spawn an EMS-daemon (emsd), deliver an `OrderClient` for
requesting orders/alerts and a `trades_stream` which delivers all
response-msgs.
This EMS tries to reduce most broker's terrible order entry apis to
a very simple protocol built on a few easy to grok and/or
"rantsy" premises:
- most users will prefer "dark mode" where orders are not submitted
to a broker until an execution condition is triggered
(aka client-side "hidden orders")
- Brokers over-complicate their apis and generally speaking hire
poor designers to create them. We're better off creating a super
minimal, schema-simple, request-event-stream protocol to unify all the
existing piles of shit (and shocker, it'll probably just end up
looking like a decent crypto exchange's api)
- all order types can be implemented with client-side limit orders
- we aren't reinventing a wheel in this case since none of these
brokers are exposing FIX protocol; it is they doing the re-invention.
TODO: make some fancy diagrams using mermaid.io
the possible set of responses from the stream is currently:
- 'dark_submitted', 'broker_submitted'
- 'dark_cancelled', 'broker_cancelled'
- 'dark_executed', 'broker_executed'
- 'broker_filled'
This is a "client side" entrypoint which may spawn the `emsd` service
if it can't be discovered and generally speaking is the lowest level
broker control client-API.
'''
# wait for service to connect back to us signalling
# ready for order commands
book = get_orders()
broker, symbol, suffix = unpack_fqme(fqme)
from ..data._source import unpack_fqsn
broker, symbol, suffix = unpack_fqsn(fqsn)
async with maybe_open_emsd(broker) as portal:
async with maybe_open_emsd(
broker,
loglevel=loglevel,
) as portal:
mod = get_brokermod(broker)
if (
@ -244,9 +256,8 @@ async def open_ems(
async with (
# connect to emsd
portal.open_context(
_emsd_main,
fqsn=fqsn,
fqme=fqme,
exec_mode=mode,
loglevel=loglevel,
@ -262,18 +273,39 @@ async def open_ems(
# open 2-way trade command stream
ctx.open_stream() as trades_stream,
):
# use any pre-existing actor singleton client.
global _client
if _client is None:
size = 100
tx, rx = trio.open_memory_channel(size)
brx = broadcast_receiver(rx, size)
# setup local ui event streaming channels for request/resp
# streaming with the EMS daemon
_client = OrderClient(
_ems_stream=trades_stream,
_to_relay_task=tx,
_from_sync_order_client=brx,
)
_client._ems_stream = trades_stream
# start sync code order msg delivery task
async with trio.open_nursery() as n:
n.start_soon(
relay_order_cmds_from_sync_code,
fqsn,
relay_orders_from_sync_code,
_client,
fqme,
trades_stream
)
yield (
book,
_client,
trades_stream,
positions,
accounts,
dialogs,
)
# stop the sync-msg-relay task on exit.
n.cancel_scope.cancel()
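A hypothetical client-side flow for the new async `OrderClient` api; the fqme and order params are illustrative and assume the `Order` fields shown in `_messages.py` below:
import uuid

from piker.clearing import open_ems
from piker.clearing._messages import Order

async def submit_and_cancel():
    async with open_ems('xmreur.kraken') as (
        client,
        trades_stream,
        positions,
        accounts,
        dialogs,
    ):
        oid = str(uuid.uuid4())
        await client.send(Order(
            oid=oid,
            symbol='xmreur.kraken',
            account=accounts[0],
            price=100.0,
            size=0.1,
            action='buy',
            exec_mode='dark',
        ))
        await client.cancel(oid)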

View File

@ -1,5 +1,5 @@
# piker: trading gear for hackers
# Copyright (C) Tyler Goodlet (in stewardship for piker0)
# Copyright (C) Tyler Goodlet (in stewardship for pikers)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
@ -41,11 +41,13 @@ import trio
from trio_typing import TaskStatus
import tractor
from ..log import get_logger
from ._util import (
log, # sub-sys logger
get_console_log,
)
from ..data._normalize import iterticks
from ..data._source import (
unpack_fqsn,
mk_fqsn,
from ..accounting._mktinfo import (
unpack_fqme,
float_digits,
)
from ..data.feed import (
@ -69,9 +71,6 @@ from ._messages import (
)
log = get_logger(__name__)
# TODO: numba all of this
def mk_check(
@ -157,7 +156,7 @@ async def clear_dark_triggers(
brokerd_orders_stream: tractor.MsgStream,
quote_stream: tractor.ReceiveMsgStream, # noqa
broker: str,
fqsn: str,
fqme: str,
book: DarkBook,
@ -232,7 +231,7 @@ async def clear_dark_triggers(
account=account,
size=size,
):
bfqsn: str = symbol.replace(f'.{broker}', '')
bfqme: str = symbol.replace(f'.{broker}', '')
submit_price = price + abs_diff_away
resp = 'triggered' # hidden on client-side
@ -245,7 +244,7 @@ async def clear_dark_triggers(
oid=oid,
account=account,
time_ns=time.time_ns(),
symbol=bfqsn,
symbol=bfqme,
price=submit_price,
size=size,
)
@ -288,14 +287,14 @@ async def clear_dark_triggers(
# send response to client-side
await router.client_broadcast(
fqsn,
fqme,
status,
)
else: # condition scan loop complete
log.debug(f'execs are {execs}')
if execs:
book.triggers[fqsn] = execs
book.triggers[fqme] = execs
# print(f'execs scan took: {time.time() - start}')
@ -316,9 +315,6 @@ class TradesRelay(Struct):
# allowed account names
accounts: tuple[str]
# count of connected ems clients for this ``brokerd``
consumers: int = 0
class Router(Struct):
'''
@ -336,7 +332,7 @@ class Router(Struct):
# sets of clients mapped from subscription keys
subscribers: defaultdict[
str, # sub key, default fqsn
str, # sub key, default fqme
set[tractor.MsgStream], # unique client streams
] = defaultdict(set)
@ -413,6 +409,9 @@ class Router(Struct):
trades_endpoint is None
or exec_mode == 'paper'
):
# for logging purposes
brokermod = paper
# for paper mode we need to mock this trades response feed
# so we load bidir stream to a new sub-actor running
# a paper-simulator clearing engine.
@ -425,7 +424,7 @@ class Router(Struct):
# actor to simulate the real IPC load it'll have when also
# pulling data from feeds
open_trades_endpoint = paper.open_paperboi(
fqsn='.'.join([symbol, broker]),
fqme='.'.join([symbol, broker]),
loglevel=loglevel,
)
@ -466,30 +465,31 @@ class Router(Struct):
# client set.
# locally cache and track positions per account with
# a table of (brokername, acctid) -> `BrokerdPosition`
# msgs.
pps = {}
for msg in positions:
log.info(f'loading pp: {msg}')
account = msg['account']
# TODO: better value error for this which
# dumps the account and message and states the
# mismatch..
assert account in accounts
pps.setdefault(
(broker, account),
[],
).append(msg)
# a nested table of msgs:
# tuple(brokername, acctid) ->
# (fqme: str ->
# `BrokerdPosition`)
relay = TradesRelay(
brokerd_stream=brokerd_trades_stream,
positions=pps,
positions={},
accounts=accounts,
consumers=1,
)
for msg in positions:
msg = BrokerdPosition(**msg)
log.info(
f'loading pp for {brokermod.__name__}:\n'
f'{pformat(msg.to_dict())}',
)
# TODO: state any mismatch here?
account = msg.account
assert account in accounts
relay.positions.setdefault(
(broker, account),
{},
)[msg.symbol] = msg
self.relays[broker] = relay
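The relay's position table is now a nested mapping; its shape in plain terms (values illustrative):
positions: dict[tuple[str, str], dict[str, dict]] = {}
positions.setdefault(
    ('kraken', 'spot'),  # (brokername, acctid)
    {},
)['xmreur.kraken'] = {'size': 4.8}  # stand-in for a `BrokerdPosition`
assert positions[('kraken', 'spot')]['xmreur.kraken']['size'] == 4.8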
@ -507,7 +507,7 @@ class Router(Struct):
async def open_trade_relays(
self,
fqsn: str,
fqme: str,
exec_mode: str,
loglevel: str,
@ -517,29 +517,29 @@ class Router(Struct):
) -> tuple[TradesRelay, Feed]:
'''
Open and yield ``brokerd`` trades dialogue context-stream if
none already exists.
Maybe open a live feed to the target fqme, start `brokerd` order
msg relay and dark clearing tasks to run in the background
indefinitely.
'''
from ..data._source import unpack_fqsn
broker, symbol, suffix = unpack_fqsn(fqsn)
broker, symbol, suffix = unpack_fqme(fqme)
async with (
maybe_open_feed(
[fqsn],
[fqme],
loglevel=loglevel,
) as feed,
):
brokername, _, _ = unpack_fqsn(fqsn)
brokername, _, _ = unpack_fqme(fqme)
brokermod = feed.mods[brokername]
broker = brokermod.name
portal = feed.portals[brokermod]
# XXX: this should be initial price quote from target provider
flume = feed.flumes[fqsn]
flume = feed.flumes[fqme]
first_quote: dict = flume.first_quote
book: DarkBook = self.get_dark_book(broker)
book.lasts[fqsn]: float = first_quote['last']
book.lasts[fqme]: float = first_quote['last']
async with self.maybe_open_brokerd_dialog(
brokermod=brokermod,
@ -558,7 +558,7 @@ class Router(Struct):
relay.brokerd_stream,
flume.stream,
broker,
fqsn, # form: <name>.<venue>.<suffix>.<broker>
fqme, # form: <name>.<venue>.<suffix>.<broker>
book
)
@ -638,11 +638,14 @@ _router: Router = None
@tractor.context
async def _setup_persistent_emsd(
ctx: tractor.Context,
loglevel: str | None = None,
) -> None:
if loglevel:
get_console_log(loglevel)
global _router
# open a root "service nursery" for the ``emsd`` actor
@ -692,16 +695,15 @@ async def translate_and_relay_brokerd_events(
async for brokerd_msg in brokerd_trades_stream:
fmsg = pformat(brokerd_msg)
log.info(
f'Received broker trade event:\n'
f'Rx brokerd trade msg:\n'
f'{fmsg}'
)
status_msg: Optional[Status] = None
status_msg: Status | None = None
match brokerd_msg:
# BrokerdPosition
case {
'name': 'position',
'symbol': sym,
'broker': broker,
}:
pos_msg = BrokerdPosition(**brokerd_msg)
@ -712,9 +714,9 @@ async def translate_and_relay_brokerd_events(
relay.positions.setdefault(
# NOTE: translate to a FQSN!
(broker, sym),
[]
).append(pos_msg)
(broker, pos_msg.account),
{}
)[pos_msg.symbol] = pos_msg
# fan-out-relay position msgs immediately by
# broadcasting updates on all client streams
@ -781,12 +783,11 @@ async def translate_and_relay_brokerd_events(
# no msg to client necessary
continue
# BrokerdOrderError
# BrokerdError
case {
'name': 'error',
'oid': oid, # ems order-dialog id
'reqid': reqid, # brokerd generated order-request id
'symbol': sym,
}:
status_msg = book._active.get(oid)
msg = BrokerdError(**brokerd_msg)
@ -947,9 +948,9 @@ async def translate_and_relay_brokerd_events(
# may end up with collisions?
status_msg = Status(**brokerd_msg)
# NOTE: be sure to pack an fqsn for the client side!
# NOTE: be sure to pack an fqme for the client side!
order = Order(**status_msg.req)
order.symbol = mk_fqsn(broker, order.symbol)
order.symbol = f'{order.symbol}.{broker}'
assert order.price and order.size
status_msg.req = order
@ -1024,7 +1025,7 @@ async def process_client_order_cmds(
client_order_stream: tractor.MsgStream,
brokerd_order_stream: tractor.MsgStream,
fqsn: str,
fqme: str,
flume: Flume,
dark_book: DarkBook,
router: Router,
@ -1051,11 +1052,11 @@ async def process_client_order_cmds(
# backend can be routed and relayed to subscribed clients.
subs = router.dialogs[oid]
# add all subscribed clients for this fqsn (should eventually be
# add all subscribed clients for this fqme (should eventually be
# a more generalized subscription system) to receive order msg
# updates (and thus show stuff in the UI).
subs.add(client_order_stream)
subs.update(router.subscribers[fqsn])
subs.update(router.subscribers[fqme])
reqid = dark_book._ems2brokerd_ids.inverse.get(oid)
@ -1113,7 +1114,7 @@ async def process_client_order_cmds(
and status.resp == 'dark_open'
):
# remove from dark book clearing
entry = dark_book.triggers[fqsn].pop(oid, None)
entry = dark_book.triggers[fqme].pop(oid, None)
if entry:
(
pred,
@ -1129,7 +1130,7 @@ async def process_client_order_cmds(
status.req = cmd
await router.client_broadcast(
fqsn,
fqme,
status,
)
@ -1139,7 +1140,7 @@ async def process_client_order_cmds(
dark_book._active.pop(oid)
else:
log.exception(f'No dark order for {fqsn}?')
log.exception(f'No dark order for {fqme}?')
# TODO: eventually we should be receiving
# this struct on the wire unpacked in a scoped protocol
@ -1148,7 +1149,7 @@ async def process_client_order_cmds(
# LIVE order REQUEST
case {
'oid': oid,
'symbol': fqsn,
'symbol': fqme,
'price': trigger_price,
'size': size,
'action': ('buy' | 'sell') as action,
@ -1161,7 +1162,7 @@ async def process_client_order_cmds(
# remove the broker part before creating a message
# to send to the specific broker since they probably
# aren't expecting their own name, but should they?
sym = fqsn.replace(f'.{broker}', '')
sym = fqme.replace(f'.{broker}', '')
if status is not None:
# if we already had a broker order id then
@ -1218,7 +1219,7 @@ async def process_client_order_cmds(
# DARK-order / alert REQUEST
case {
'oid': oid,
'symbol': fqsn,
'symbol': fqme,
'price': trigger_price,
'size': size,
'exec_mode': exec_mode,
@ -1240,7 +1241,7 @@ async def process_client_order_cmds(
# price received from the feed, instead of being
# like every other shitty tina platform that makes
# the user choose the predicate operator.
last = dark_book.lasts[fqsn]
last = dark_book.lasts[fqme]
# sometimes the real-time feed hasn't come up
# so just pull from the latest history.
@ -1250,7 +1251,7 @@ async def process_client_order_cmds(
pred = mk_check(trigger_price, last, action)
spread_slap: float = 5
min_tick = flume.symbol.tick_size
min_tick = float(flume.symbol.size_tick)
min_tick_digits = float_digits(min_tick)
if action == 'buy':
@ -1282,7 +1283,7 @@ async def process_client_order_cmds(
# NOTE: this may result in an override of an existing
# dark book entry if the order id already exists
dark_book.triggers.setdefault(
fqsn, {}
fqme, {}
)[oid] = (
pred,
tickfilter,
@ -1307,7 +1308,7 @@ async def process_client_order_cmds(
# broadcast status to all subscribed clients
await router.client_broadcast(
fqsn,
fqme,
status,
)
@ -1318,35 +1319,36 @@ async def process_client_order_cmds(
@acm
async def maybe_open_trade_relays(
router: Router,
fqsn: str,
fqme: str,
exec_mode: str, # ('paper', 'live')
loglevel: str = 'info',
) -> tuple:
def cache_on_fqsn_unless_paper(
def cache_on_fqme_unless_paper(
router: Router,
fqsn: str,
fqme: str,
exec_mode: str, # ('paper', 'live')
loglevel: str = 'info',
) -> Hashable:
if exec_mode == 'paper':
return f'paper_{fqsn}'
return f'paper_{fqme}'
else:
return fqsn
return fqme
# XXX: closure to enable below use of
# ``tractor.trionics.maybe_open_context()``
@acm
async def cached_mngr(
router: Router,
fqsn: str,
fqme: str,
exec_mode: str, # ('paper', 'live')
loglevel: str = 'info',
):
relay, feed, client_ready = await _router.nursery.start(
_router.open_trade_relays,
fqsn,
fqme,
exec_mode,
loglevel,
)
@ -1356,11 +1358,11 @@ async def maybe_open_trade_relays(
acm_func=cached_mngr,
kwargs={
'router': _router,
'fqsn': fqsn,
'fqme': fqme,
'exec_mode': exec_mode,
'loglevel': loglevel,
},
key=cache_on_fqsn_unless_paper,
key=cache_on_fqme_unless_paper,
) as (
cache_hit,
(relay, feed, client_ready)
@ -1371,9 +1373,9 @@ async def maybe_open_trade_relays(
@tractor.context
async def _emsd_main(
ctx: tractor.Context,
fqsn: str,
fqme: str,
exec_mode: str, # ('paper', 'live')
loglevel: str = 'info',
loglevel: str | None = None,
) -> tuple[
dict[
@ -1428,7 +1430,7 @@ async def _emsd_main(
global _router
assert _router
broker, symbol, suffix = unpack_fqsn(fqsn)
broker, symbol, suffix = unpack_fqme(fqme)
# TODO: would be nice if in tractor we can require either a ctx arg,
# or a named arg with ctx in it and a type annotation of
@ -1445,7 +1447,7 @@ async def _emsd_main(
# few duplicate streams as necessary per ems actor.
async with maybe_open_trade_relays(
_router,
fqsn,
fqme,
exec_mode,
loglevel,
) as (relay, feed, client_ready):
@ -1468,28 +1470,28 @@ async def _emsd_main(
# register the client side before starting the
# brokerd-side relay task to ensure the client is
# delivered all existing open orders on startup.
# TODO: instead of by fqsn we need a subscription
# TODO: instead of by fqme we need a subscription
# system/schema here to limit what each new client is
# allowed to see in terms of broadcasted order flow
# updates per dialog.
_router.subscribers[fqsn].add(client_stream)
_router.subscribers[fqme].add(client_stream)
client_ready.set()
# start inbound (from attached client) order request processing
# main entrypoint, run here until cancelled.
try:
flume = feed.flumes[fqsn]
flume = feed.flumes[fqme]
await process_client_order_cmds(
client_stream,
brokerd_stream,
fqsn,
fqme,
flume,
dark_book,
_router,
)
finally:
# try to remove client from subscription registry
_router.subscribers[fqsn].remove(client_stream)
_router.subscribers[fqme].remove(client_stream)
for oid, client_streams in _router.dialogs.items():
client_streams.discard(client_stream)
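As a reader's aid, the net effect of the fqme rename above: dark-book trigger entries are keyed `triggers[fqme][oid]`, each holding a predicate closure built by `mk_check()` plus a tick-type filter. A minimal sketch of how such entries get scanned against live ticks (the tuple's trailing fields are elided; `scan_dark_triggers` is a hypothetical helper, not actual ems code):

def scan_dark_triggers(
    dark_book,
    fqme: str,
    tick_type: str,
    price: float,
) -> list[str]:
    fired: list[str] = []
    for oid, (pred, tickfilter, *_rest) in (
        dark_book.triggers.get(fqme, {}).items()
    ):
        if tick_type not in tickfilter:
            continue  # not a tick type this trigger watches
        if pred(price):  # predicate from mk_check(trigger_price, last, action)
            fired.append(oid)
    return fired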

View File

@ -29,7 +29,6 @@ from typing import (
from msgspec import field
from ..data._source import Symbol
from ..data.types import Struct
@ -94,7 +93,8 @@ class Order(Struct):
# internal ``emsd`` unique "order id"
oid: str # uuid4
symbol: str | Symbol
# TODO: figure out how to optionally typecast this to `MktPair`?
symbol: str # | MktPair
account: str # should we set a default as '' ?
price: float
@ -300,10 +300,10 @@ class BrokerdError(Struct):
class BrokerdPosition(Struct):
'''Position update event from brokerd.
'''
Position update event from brokerd.
'''
broker: str
account: str
symbol: str
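Concretely, with the `symbol: str` change above a client-side `Order` now carries a plain fqme string instead of a `Symbol` instance. A hedged construction sketch (values illustrative; the field set is inferred from the order-mode code later in this diff):

import uuid
from piker.clearing._messages import Order

order = Order(
    oid=str(uuid.uuid4()),
    symbol='xbtusdt.kraken',  # plain fqme str, no more `Symbol`
    account='paper',
    price=20_000.0,
    size=0.01,
    action='buy',
    brokers=['kraken'],
    exec_mode='dark',  # or 'live'
)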

View File

@ -1,5 +1,5 @@
# piker: trading gear for hackers
# Copyright (C) Tyler Goodlet (in stewardship for piker0)
# Copyright (C) Tyler Goodlet (in stewardship for pikers)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
@ -19,14 +19,12 @@ Fake trading for forward testing.
"""
from collections import defaultdict
from contextlib import asynccontextmanager
from contextlib import asynccontextmanager as acm
from datetime import datetime
from operator import itemgetter
import itertools
import time
from typing import (
Any,
Optional,
Callable,
)
import uuid
@ -36,18 +34,26 @@ import pendulum
import trio
import tractor
from ..brokers import get_brokermod
from .. import data
from ..data.types import Struct
from ..data._source import Symbol
from ..pp import (
from ..accounting._mktinfo import (
Symbol,
MktPair,
)
from ..accounting import (
Position,
PpTable,
Transaction,
TransactionLedger,
open_trade_ledger,
open_pps,
)
from ..data._normalize import iterticks
from ..data._source import unpack_fqsn
from ..log import get_logger
from ..accounting._mktinfo import unpack_fqme
from ._util import (
log, # sub-sys logger
)
from ._messages import (
BrokerdCancel,
BrokerdOrder,
@ -58,10 +64,6 @@ from ._messages import (
BrokerdError,
)
from ..config import load
log = get_logger(__name__)
class PaperBoi(Struct):
'''
@ -75,13 +77,14 @@ class PaperBoi(Struct):
ems_trades_stream: tractor.MsgStream
ppt: PpTable
ledger: TransactionLedger
# map of paper "live" orders which can be used
# to simulate fills based on paper engine settings
_buys: defaultdict[str, bidict]
_sells: defaultdict[str, bidict]
_reqids: bidict
_positions: dict[str, Position]
_trade_ledger: dict[str, Any]
_syms: dict[str, Symbol] = {}
# init edge case L1 spread
@ -95,7 +98,7 @@ class PaperBoi(Struct):
price: float,
action: str,
size: float,
reqid: Optional[str],
reqid: str | None,
) -> int:
'''
@ -121,7 +124,10 @@ class PaperBoi(Struct):
# in the broker trades event processing loop
await trio.sleep(0.05)
if action == 'sell':
if (
action == 'sell'
and size > 0
):
size = -size
msg = BrokerdStatus(
@ -197,7 +203,7 @@ class PaperBoi(Struct):
async def fake_fill(
self,
fqsn: str,
fqme: str,
price: float,
size: float,
action: str, # one of {'buy', 'sell'}
@ -250,43 +256,48 @@ class PaperBoi(Struct):
)
await self.ems_trades_stream.send(msg)
# lookup any existing position
key = fqsn.rstrip(f'.{self.broker}')
# NOTE: for paper we set the "bs_mktid" as just the fqme since
# we don't actually have any unique backend symbol ourselves
# other than this thing, our fqme address.
bs_mktid: str = fqme
t = Transaction(
fqsn=fqsn,
sym=self._syms[fqsn],
fqsn=fqme,
sym=self._syms[fqme],
tid=oid,
size=size,
price=price,
cost=0, # TODO: cost model
dt=pendulum.from_timestamp(fill_time_s),
bsuid=key,
bs_mktid=bs_mktid,
)
with (
open_trade_ledger(self.broker, 'paper') as ledger,
open_pps(self.broker, 'paper', write_on_exit=True) as table
):
tx = t.to_dict()
tx.pop('sym')
ledger.update({oid: tx})
# Write to pps toml right now
table.update_from_trans({oid: t})
tx = t.to_dict()
tx.pop('sym')
pp = table.pps[key]
pp_msg = BrokerdPosition(
broker=self.broker,
account='paper',
symbol=fqsn,
# TODO: we need to look up the asset currency from
# broker info. i guess for crypto this can be
# inferred from the pair?
currency=key,
size=pp.size,
avg_price=pp.ppu,
)
# update in-mem ledger and pos table
self.ledger.update({oid: tx})
self.ppt.update_from_trans({oid: t})
await self.ems_trades_stream.send(pp_msg)
# transmit pp msg to ems
pp = self.ppt.pps[bs_mktid]
pp_msg = BrokerdPosition(
broker=self.broker,
account='paper',
symbol=fqme,
size=pp.size,
avg_price=pp.ppu,
# TODO: we need to look up the asset currency from
# broker info. i guess for crypto this can be
# inferred from the pair?
# currency=bs_mktid,
)
await self.ems_trades_stream.send(pp_msg)
# write all updates to filesys
self.ledger.write_config()
self.ppt.write_config()
async def simulate_fills(
@ -421,7 +432,7 @@ async def simulate_fills(
# clearing price would have filled entirely
await client.fake_fill(
fqsn=sym,
fqme=sym,
# todo slippage to determine fill price
price=tick_price,
size=size,
@ -469,6 +480,7 @@ async def handle_order_requests(
BrokerdOrderAck(
oid=order.oid,
reqid=reqid,
account='paper'
)
)
@ -512,7 +524,6 @@ _sells: defaultdict[
tuple[float, float, str, str], # order info
]
] = defaultdict(bidict)
_positions: dict[str, Position] = {}
@tractor.context
@ -520,33 +531,66 @@ async def trades_dialogue(
ctx: tractor.Context,
broker: str,
fqsn: str,
loglevel: str = None,
fqme: str | None = None, # if empty, we only boot broker mode
loglevel: str = 'warning',
) -> None:
tractor.log.get_console_log(loglevel)
async with (
data.open_feed(
[fqsn],
loglevel=loglevel,
) as feed,
ppt: PpTable
ledger: TransactionLedger
with (
open_pps(
broker,
'paper',
write_on_exit=True,
) as ppt,
open_trade_ledger(
broker,
'paper',
) as ledger
):
# attempt to get market info from the backend instead of presuming
# the ledger entries have everything correct.
# TODO: how to process ledger info from backends?
# - should we be rolling our own actor-cached version of these
# client API refs or using portal IPC to send requests to the
# existing brokerd daemon?
# - alternatively we can possibly expect and use
# a `.broker.norm_trade_records()` ep?
fqmes: list[str] = [fqme]
if fqme is None:
fqmes = list(ppt.pps)
with open_pps(broker, 'paper') as table:
# save pps in local state
_positions.update(table.pps)
for fqme in fqmes:
mkt: MktPair | None = None
brokermod = get_brokermod(broker)
gmi = getattr(brokermod, 'get_mkt_info', None)
if gmi:
mkt, pair = await brokermod.get_mkt_info(
fqme.rstrip(f'.{broker}'),
)
# update pos table from ledger history
ppt.update_from_trans(
ledger.to_trans(),
# NOTE: here we pass in any `MktPair` provided by the
# backend broker instead of assuming the pps.toml contains
# the correct contents!
force_mkt=mkt
)
pp_msgs: list[BrokerdPosition] = []
pos: Position
token: str # f'{symbol}.{self.broker}'
for token, pos in _positions.items():
for token, pos in ppt.pps.items():
pp_msgs.append(BrokerdPosition(
broker=broker,
account='paper',
symbol=pos.symbol.front_fqsn(),
symbol=pos.symbol.fqme,
size=pos.size,
avg_price=pos.ppu,
))
@ -556,42 +600,63 @@ async def trades_dialogue(
['paper'],
))
# write new positions state in case ledger was
# newer than that tracked in pps.toml
ppt.write_config()
# exit early since no fqme was passed,
# normally this case is just to load
# positions "offline".
if fqme is None:
log.warning(
'Paper engine only running in position delivery mode!\n'
'NO SIMULATED CLEARING LOOP IS ACTIVE!'
)
await trio.sleep_forever()
return
async with (
ctx.open_stream() as ems_stream,
trio.open_nursery() as n,
data.open_feed(
[fqme],
loglevel=loglevel,
) as feed,
):
client = PaperBoi(
broker,
ems_stream,
_buys=_buys,
_sells=_sells,
async with (
ctx.open_stream() as ems_stream,
trio.open_nursery() as n,
):
client = PaperBoi(
broker=broker,
ems_trades_stream=ems_stream,
ppt=ppt,
ledger=ledger,
_reqids=_reqids,
_buys=_buys,
_sells=_sells,
_reqids=_reqids,
_positions=_positions,
# TODO: load positions from ledger file
_syms={
fqme: flume.symbol
for fqme, flume in feed.flumes.items()
}
)
# TODO: load positions from ledger file
_trade_ledger={},
_syms={
fqsn: flume.symbol
for fqsn, flume in feed.flumes.items()
}
)
n.start_soon(
handle_order_requests,
client,
ems_stream,
)
n.start_soon(
handle_order_requests,
client,
ems_stream,
)
# paper engine simulator clearing task
await simulate_fills(feed.streams[broker], client)
# paper engine simulator clearing task
await simulate_fills(feed.streams[broker], client)
@asynccontextmanager
@acm
async def open_paperboi(
fqsn: str,
loglevel: str,
fqme: str | None = None,
broker: str | None = None,
loglevel: str | None = None,
) -> Callable:
'''
@ -599,28 +664,39 @@ async def open_paperboi(
its context.
'''
broker, symbol, expiry = unpack_fqsn(fqsn)
if not fqme:
assert broker, 'One of `broker` or `fqme` is required siss..!'
else:
broker, symbol, expiry = unpack_fqme(fqme)
we_spawned: bool = False
service_name = f'paperboi.{broker}'
async with (
tractor.find_actor(service_name) as portal,
tractor.open_nursery() as tn,
):
# only spawn if no paperboi already is up
# (we likely don't need more then one proc for basic
# simulated order clearing)
# NOTE: only spawn if no paperboi already is up since we likely
# don't need more than one actor for simulated order clearing
# per broker-backend.
if portal is None:
log.info('Starting new paper-engine actor')
portal = await tn.start_actor(
service_name,
enable_modules=[__name__]
)
we_spawned = True
async with portal.open_context(
trades_dialogue,
broker=broker,
fqsn=fqsn,
fqme=fqme,
loglevel=loglevel,
) as (ctx, first):
yield ctx, first
# tear down connection and any spawned actor on exit
await ctx.cancel()
if we_spawned:
await portal.cancel_actor()
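With the reworked kwargs the paper engine can now boot in one of two modes; a usage sketch (the fqme is illustrative):

# simulated clearing loop for a single market:
async with open_paperboi(
    fqme='xbtusdt.kraken',
    loglevel='info',
) as (ctx, first):
    ...  # `first` carries the startup position msgs

# position-delivery only, ie. the `fqme=None` early-exit path
# above where no clearing loop is run:
async with open_paperboi(broker='kraken') as (ctx, first):
    ...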

View File

@ -0,0 +1,33 @@
# piker: trading gear for hackers
# Copyright (C) Tyler Goodlet (in stewardship for pikers)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""
Sub-sys module commons.
"""
from functools import partial
from ..log import (
get_logger,
get_console_log,
)
subsys: str = 'piker.clearing'
log = get_logger(subsys)
get_console_log = partial(
get_console_log,
name=subsys,
)
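Downstream modules in the sub-sys then just import the pre-bound handles, as several files later in this diff now do:

from ._util import (
    log,  # logger already namespaced to 'piker.clearing'
    get_console_log,  # partial with name=subsys pre-applied
)

get_console_log('info')
log.info('clearing sub-sys up!')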

View File

@ -21,8 +21,6 @@ Platform configuration (files) mgmt.
import platform
import sys
import os
from os import path
from os.path import dirname
import shutil
from typing import Optional
from pathlib import Path
@ -126,30 +124,35 @@ def get_app_dir(
)
_config_dir = _click_config_dir = get_app_dir('piker')
_parent_user = os.environ.get('SUDO_USER')
_click_config_dir: Path = Path(get_app_dir('piker'))
_config_dir: Path = _click_config_dir
_parent_user: str = os.environ.get('SUDO_USER')
if _parent_user:
non_root_user_dir = os.path.expanduser(
f'~{_parent_user}'
non_root_user_dir = Path(
os.path.expanduser(f'~{_parent_user}')
)
root = 'root'
root: str = 'root'
_ccds: str = str(_click_config_dir) # click config dir string
i_tail: int = int(_ccds.rfind(root) + len(root))
_config_dir = (
non_root_user_dir +
_click_config_dir[
_click_config_dir.rfind(root) + len(root):
]
non_root_user_dir
/
Path(_ccds[i_tail+1:]) # +1 to skip the '/' after the root segment
)
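A worked example of the re-rooting above, assuming `SUDO_USER=alice` and a click-reported dir of `/root/.config/piker`:

from pathlib import Path

ccds = '/root/.config/piker'
i_tail = ccds.rfind('root') + len('root')  # -> 5
assert ccds[i_tail + 1:] == '.config/piker'  # the +1 skips the '/'
assert (
    Path('/home/alice') / Path(ccds[i_tail + 1:])
) == Path('/home/alice/.config/piker')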
_conf_names: set[str] = {
'brokers',
'pps',
# 'pps',
'trades',
'watchlists',
'paper_trades'
}
_watchlists_data_path = os.path.join(_config_dir, 'watchlists.json')
# TODO: probably drop all this super legacy, questrade specific,
# config stuff XD ?
_watchlists_data_path: Path = _config_dir / Path('watchlists.json')
_context_defaults = dict(
default_map={
# Questrade specific quote poll rates
@ -180,7 +183,7 @@ def _conf_fn_w_ext(
def get_conf_path(
conf_name: str = 'brokers',
) -> str:
) -> Path:
'''
Return the top-level default config path normally under
``~/.config/piker`` on linux for a given ``conf_name``, the config
@ -196,72 +199,68 @@ def get_conf_path(
- strats.toml
'''
assert conf_name in _conf_names
if 'pps.' not in conf_name:
assert str(conf_name) in _conf_names
fn = _conf_fn_w_ext(conf_name)
return os.path.join(
_config_dir,
fn,
)
return _config_dir / Path(fn)
def repodir():
def repodir() -> Path:
'''
Return the abspath to the repo directory.
Return the abspath as ``Path`` to the git repo's root dir.
'''
dirpath = path.abspath(
# we're 3 levels down in **this** module file
dirname(dirname(os.path.realpath(__file__)))
)
return dirpath
return Path(__file__).absolute().parent.parent
def load(
conf_name: str = 'brokers',
path: str = None,
path: Path | None = None,
**tomlkws,
) -> (dict, str):
) -> tuple[dict, str]:
'''
Load config file by name.
'''
path = path or get_conf_path(conf_name)
path: Path = path or get_conf_path(conf_name)
if not os.path.isdir(_config_dir):
Path(_config_dir).mkdir(parents=True, exist_ok=True)
if not os.path.isfile(path):
fn = _conf_fn_w_ext(conf_name)
template = os.path.join(
repodir(),
'config',
fn
if not _config_dir.is_dir():
_config_dir.mkdir(
parents=True,
exist_ok=True,
)
# try to copy in a template config to the user's directory
# if one exists.
if os.path.isfile(template):
if not path.is_file():
fn: str = _conf_fn_w_ext(conf_name)
# try to copy in a template config to the user's directory if
# one exists.
template: Path = repodir() / 'config' / fn
if template.is_file():
shutil.copyfile(template, path)
else:
# create an empty file
with open(path, 'x'):
# create empty file
with path.open(mode='x'):
pass
else:
with open(path, 'r'):
with path.open(mode='r'):
pass # touch it
config = toml.load(path, **tomlkws)
config: dict = toml.load(str(path), **tomlkws)
log.debug(f"Read config file {path}")
return config, path
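Call-side usage after the `Path`-ification stays a one-liner (the conf name must be registered in `_conf_names`):

config, path = load('brokers')
assert isinstance(path, Path)  # no longer a plain str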
def write(
config: dict, # toml config as dict
name: str = 'brokers',
path: str = None,
name: str | None = None,
path: Path | None = None,
fail_empty: bool = True,
**toml_kwargs,
) -> None:
@ -271,21 +270,26 @@ def write(
Create a ``brokers.ini`` file if one does not exist.
'''
path = path or get_conf_path(name)
dirname = os.path.dirname(path)
if not os.path.isdir(dirname):
log.debug(f"Creating config dir {_config_dir}")
os.makedirs(dirname)
if name:
path: Path = path or get_conf_path(name)
dirname: Path = path.parent
if not dirname.is_dir():
log.debug(f"Creating config dir {_config_dir}")
dirname.mkdir()
if not config and fail_empty:
if (
not config
and fail_empty
):
raise ValueError(
"Watch out you're trying to write a blank config!")
"Watch out you're trying to write a blank config!"
)
log.debug(
f"Writing config `{name}` file to:\n"
f"{path}"
)
with open(path, 'w') as cf:
with path.open(mode='w') as cf:
return toml.dump(
config,
cf,

View File

@ -56,6 +56,7 @@ __all__ = [
async def _setup_persistent_brokerd(
ctx: tractor.Context,
brokername: str,
loglevel: str | None = None,
) -> None:
'''
@ -64,7 +65,9 @@ async def _setup_persistent_brokerd(
the broker backend as needed.
'''
get_console_log(tractor.current_actor().loglevel)
get_console_log(
loglevel or tractor.current_actor().loglevel,
)
from .feed import (
_bus,
@ -84,5 +87,3 @@ async def _setup_persistent_brokerd(
# we pin this task to keep the feeds manager active until the
# parent actor decides to tear it down
await trio.sleep_forever()

View File

@ -429,7 +429,7 @@ async def spawn_samplerd(
async def maybe_open_samplerd(
loglevel: str | None = None,
**kwargs,
**pikerd_kwargs,
) -> tractor.Portal: # noqa
'''
@ -442,9 +442,9 @@ async def maybe_open_samplerd(
async with maybe_spawn_daemon(
dname,
service_task_target=spawn_samplerd,
spawn_args={'loglevel': loglevel},
spawn_args={},
loglevel=loglevel,
**kwargs,
**pikerd_kwargs,
) as portal:
yield portal

View File

@ -649,7 +649,7 @@ def maybe_open_shm_array(
token = _known_tokens[key]
return attach_shm_array(token=token, **kwargs), False
except KeyError:
log.warning(f"Could not find {key} in shms cache")
log.debug(f"Could not find {key} in shms cache")
if dtype:
token = _make_token(
key,
@ -659,7 +659,7 @@ def maybe_open_shm_array(
try:
return attach_shm_array(token=token, **kwargs), False
except FileNotFoundError:
log.warning(f"Could not attach to shm with token {token}")
log.debug(f"Could not attach to shm with token {token}")
# This actor does not know about memory
# associated with the provided "key".

View File

@ -28,8 +28,12 @@ from bidict import bidict
import numpy as np
from .types import Struct
# from numba import from_dtype
from ..accounting._mktinfo import (
# mkfqsn,
unpack_fqsn,
# digits_to_dec,
float_digits,
)
ohlc_fields = [
('time', float),
@ -50,6 +54,7 @@ base_ohlc_dtype = np.dtype(ohlc_fields)
# TODO: for now need to construct this manually for readonly arrays, see
# https://github.com/numba/numba/issues/4511
# from numba import from_dtype
# numba_ohlc_dtype = from_dtype(base_ohlc_dtype)
# map time frame "keys" to seconds values
@ -64,32 +69,6 @@ tf_in_1s = bidict({
})
def mk_fqsn(
provider: str,
symbol: str,
) -> str:
'''
Generate a "fully qualified symbol name" which is
a reverse-hierarchical cross broker/provider symbol
'''
return '.'.join([symbol, provider]).lower()
def float_digits(
value: float,
) -> int:
'''
Return the number of precision digits read from a float value.
'''
if value == 0:
return 0
return int(-Decimal(str(value)).as_tuple().exponent)
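Sanity examples for the digit extraction (the function now lives in `accounting._mktinfo` per the import swap earlier in this diff):

assert float_digits(0.001) == 3  # Decimal('0.001') has exponent -3
assert float_digits(0.25) == 2
assert float_digits(0) == 0  # zero is special-cased above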
def ohlc_zeros(length: int) -> np.ndarray:
"""Construct an OHLC field formatted structarray.
@ -100,220 +79,6 @@ def ohlc_zeros(length: int) -> np.ndarray:
return np.zeros(length, dtype=base_ohlc_dtype)
def unpack_fqsn(fqsn: str) -> tuple[str, str, str]:
'''
Unpack a fully-qualified-symbol-name to ``tuple``.
'''
venue = ''
suffix = ''
# TODO: probably reverse the order of all this XD
tokens = fqsn.split('.')
if len(tokens) < 3:
# probably crypto
symbol, broker = tokens
return (
broker,
symbol,
'',
)
elif len(tokens) > 3:
symbol, venue, suffix, broker = tokens
else:
symbol, venue, broker = tokens
suffix = ''
# head, _, broker = fqsn.rpartition('.')
# symbol, _, suffix = head.rpartition('.')
return (
broker,
'.'.join([symbol, venue]),
suffix,
)
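For reference, the tokenization scheme by example (the derivative fqsn below is illustrative):

# 2 tokens -> probably crypto; no venue or suffix:
assert unpack_fqsn('xbtusdt.kraken') == ('kraken', 'xbtusdt', '')

# 3 tokens -> symbol + venue:
assert unpack_fqsn('mnq.cme.ib') == ('ib', 'mnq.cme', '')

# 4 tokens -> symbol + venue + expiry-ish suffix:
assert unpack_fqsn('mnq.cme.20230616.ib') == ('ib', 'mnq.cme', '20230616')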
class MktPair(Struct, frozen=True):
src: str # source asset name being used to buy
src_type: str # source asset's financial type/classification name
# ^ specifies a "class" of financial instrument
# egs. stock, future, option, bond etc.
dst: str # destination asset name being bought
dst_type: str # destination asset's financial type/classification name
price_tick: float # minimum price increment value
price_tick_digits: int # required decimal digits for above
size_tick: float # minimum size (aka vlm) increment value
size_tick_digits: int # required decimal digits for above
venue: str | None = None # market venue provider name
expiry: str | None = None # for derivs, expiry datetime parseable str
# for derivs, info describing contract, egs.
# strike price, call or put, swap type, exercise model, etc.
contract_info: str | None = None
@classmethod
def from_msg(
self,
msg: dict[str, Any],
) -> MktPair:
'''
Constructor for a msg-dict normally received over IPC.
'''
...
# fqa, fqma, .. etc. see issue:
# https://github.com/pikers/piker/issues/467
@property
def fqsn(self) -> str:
'''
Return the fully qualified market (endpoint) name for the
pair of transacting assets.
'''
...
# TODO: rework the below `Symbol` (which was originally inspired and
# derived from stuff in quantdom) into a simpler, ipc msg ready, market
# endpoint meta-data container type as per the drafted interface above.
class Symbol(Struct):
'''
I guess this is some kinda container thing for dealing with
all the different meta-data formats from brokers?
'''
key: str
tick_size: float = 0.01
lot_tick_size: float = 0.0 # "volume" precision as min step value
tick_size_digits: int = 2
lot_size_digits: int = 0
suffix: str = ''
broker_info: dict[str, dict[str, Any]] = {}
@classmethod
def from_broker_info(
cls,
broker: str,
symbol: str,
info: dict[str, Any],
suffix: str = '',
) -> Symbol:
tick_size = info.get('price_tick_size', 0.01)
lot_size = info.get('lot_tick_size', 0.0)
return Symbol(
key=symbol,
tick_size=tick_size,
lot_tick_size=lot_size,
tick_size_digits=float_digits(tick_size),
lot_size_digits=float_digits(lot_size),
suffix=suffix,
broker_info={broker: info},
)
@classmethod
def from_fqsn(
cls,
fqsn: str,
info: dict[str, Any],
) -> Symbol:
broker, key, suffix = unpack_fqsn(fqsn)
return cls.from_broker_info(
broker,
key,
info=info,
suffix=suffix,
)
@property
def type_key(self) -> str:
return list(self.broker_info.values())[0]['asset_type']
@property
def brokers(self) -> list[str]:
return list(self.broker_info.keys())
def nearest_tick(self, value: float) -> float:
'''
Return the nearest tick value based on minimum increment.
'''
mult = 1 / self.tick_size
return round(value * mult) / mult
def front_feed(self) -> tuple[str, str]:
'''
Return the "current" feed key for this symbol.
(i.e. the broker + symbol key in a tuple).
'''
return (
list(self.broker_info.keys())[0],
self.key,
)
def tokens(self) -> tuple[str]:
broker, key = self.front_feed()
if self.suffix:
return (key, self.suffix, broker)
else:
return (key, broker)
@property
def fqsn(self) -> str:
return '.'.join(self.tokens()).lower()
def front_fqsn(self) -> str:
'''
fqsn = "fully qualified symbol name"
Basically the idea here is that all client-ish code (aka programs/actors
that ask the provider agnostic layers in the stack for data) should be
able to tell which backend / venue / derivative each data feed/flow is
from by an explicit string key of the current form:
<instrumentname>.<venue>.<suffixwithmetadata>.<brokerbackendname>
TODO: I have thoughts that we should actually change this to be
more like an "attr lookup" (like how the web should have done
urls, but marketing peeps ruined it etc. etc.):
<broker>.<venue>.<instrumentname>.<suffixwithmetadata>
'''
tokens = self.tokens()
fqsn = '.'.join(map(str.lower, tokens))
return fqsn
def quantize_size(
self,
size: float,
) -> Decimal:
'''
Truncate input ``size: float`` using ``Decimal``
and ``.lot_size_digits``.
'''
digits = self.lot_size_digits
return Decimal(size).quantize(
Decimal(f'1.{"0".ljust(digits, "0")}'),
rounding=ROUND_HALF_EVEN
)
def _nan_to_closest_num(array: np.ndarray):
"""Return interpolated values instead of NaN.

View File

@ -26,6 +26,7 @@ from collections import (
Counter,
)
from contextlib import asynccontextmanager as acm
from decimal import Decimal
from datetime import datetime
from functools import partial
import time
@ -70,11 +71,13 @@ from ._sharedmem import (
)
from .ingest import get_ingestormod
from .types import Struct
from ._source import (
base_iohlc_dtype,
from ..accounting._mktinfo import (
Asset,
MktPair,
unpack_fqme,
Symbol,
unpack_fqsn,
)
from ._source import base_iohlc_dtype
from ..ui import _search
from ._sampling import (
open_sample_stream,
@ -565,7 +568,7 @@ async def tsdb_backfill(
timeframe=timeframe,
)
broker, symbol, expiry = unpack_fqsn(fqsn)
broker, symbol, expiry = unpack_fqme(fqsn)
try:
(
latest_start_dt,
@ -930,6 +933,24 @@ async def manage_history(
await trio.sleep_forever()
class BackendInitMsg(Struct, frozen=True):
'''
A stringent data provider startup msg schema validator.
The fields defined here are matched with those absolutely required
from each backend broker/data provider.
'''
fqme: str
symbol_info: dict | None = None
mkt_info: MktPair | None = None
shm_write_opts: dict[str, Any] | None = None
def validate_init_msg() -> None:
...
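The stub marks where schema enforcement will live; one minimal take (an assumption, not the eventual impl) is to lean on `msgspec`'s required-field checking by constructing the struct directly:

def validate_init_msg(msg: dict) -> BackendInitMsg:
    # construction raises if the required `fqme` field is
    # missing or if an unknown key is passed.
    return BackendInitMsg(**msg)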
async def allocate_persistent_feed(
bus: _FeedsBus,
sub_registered: trio.Event,
@ -974,7 +995,10 @@ async def allocate_persistent_feed(
# establish broker backend quote stream by calling
# ``stream_quotes()``, which is a required broker backend endpoint.
init_msg, first_quote = await bus.nursery.start(
(
init_msg,
first_quote,
) = await bus.nursery.start(
partial(
mod.stream_quotes,
send_chan=send,
@ -1005,21 +1029,58 @@ async def allocate_persistent_feed(
# a small streaming machine around the remote feed which can then
# do the normal work of sampling and writing shm buffers
# (depending on if we want sampling done on the far end or not?)
msg = init_msg[symstr]
per_mkt_init_msg = init_msg[symstr]
# the broker-specific fully qualified symbol name,
# but ensure it is lower-cased for external use.
bfqsn = msg['fqsn'].lower()
bs_mktid = per_mkt_init_msg['fqsn'].lower()
# true fqsn including broker/provider suffix
fqsn = '.'.join((bfqsn, brokername))
# msg['fqsn'] = bfqsn
# true fqme including broker/provider suffix
fqme = '.'.join((bs_mktid, brokername))
symbol = Symbol.from_fqsn(
fqsn=fqsn,
info=msg['symbol_info'],
)
assert symbol.type_key
mktinfo = per_mkt_init_msg.get('mkt_info')
if not mktinfo:
log.warning(
f'BACKEND {brokername} is using old `Symbol` style API\n'
'IT SHOULD BE PORTED TO THE NEW `.accounting._mktinfo.MktPair`\n'
'STATTTTT!!!\n'
)
mktinfo = per_mkt_init_msg['symbol_info']
# TODO: read out renamed/new tick size fields in block below!
price_tick = mktinfo.get(
'price_tick_size',
Decimal('0.01'),
)
size_tick = mktinfo.get(
'lot_tick_size',
Decimal('0.0'),
)
log.warning(f'FQME: {fqme} -> backend needs port to `MktPair`')
mkt = MktPair.from_fqme(
fqme,
price_tick=price_tick,
size_tick=size_tick,
bs_mktid=bs_mktid,
_atype=mktinfo['asset_type']
)
symbol = Symbol.from_fqsn(
fqsn=fqme,
info=mktinfo,
)
else:
# the new msg-protocol is to expect an already packed
# ``Asset`` and ``MktPair`` object from the backend
symbol = mkt = mktinfo
assert isinstance(mkt, MktPair)
assert isinstance(mkt.dst, Asset)
assert mkt.type_key
# HISTORY storage, run 2 tasks:
# - a history loader / maintainer
@ -1040,18 +1101,23 @@ async def allocate_persistent_feed(
manage_history,
mod,
bus,
fqsn,
fqme,
some_data_ready,
feed_is_live,
)
# yield back control to starting nursery once we receive either
# some history or a real-time quote.
log.info(f'waiting on history to load: {fqsn}')
log.info(f'waiting on history to load: {fqme}')
await some_data_ready.wait()
flume = Flume(
# TODO: we have to use this for now since currently the
# MktPair above doesn't render the correct output key it seems
# when we provide the `MktInfo` here?..?
symbol=symbol,
first_quote=first_quote,
_rt_shm_token=rt_shm.token,
_hist_shm_token=hist_shm.token,
@ -1061,7 +1127,7 @@ async def allocate_persistent_feed(
# for ambiguous names we simply apply the retrieved
# feed to that name (for now).
bus.feeds[symstr] = bus.feeds[bfqsn] = flume
bus.feeds[symstr] = bus.feeds[bs_mktid] = flume
task_status.started()
@ -1072,6 +1138,8 @@ async def allocate_persistent_feed(
# the backend will indicate when real-time quotes have begun.
await feed_is_live.wait()
# NOTE: if not configured otherwise, we always sum tick volume
# values in the OHLCV sampler.
sum_tick_vlm: bool = init_msg.get(
'shm_write_opts', {}
).get('sum_tick_vlm', True)
@ -1095,7 +1163,7 @@ async def allocate_persistent_feed(
rt_shm.array['time'][1] = ts + 1
elif hist_shm.array.size == 0:
await tractor.breakpoint()
raise RuntimeError(f'History (1m) Shm for {fqme} is empty!?')
# wait for the spawning parent task to register its subscriber
# send-stream entry before we start the sample loop.
@ -1104,7 +1172,7 @@ async def allocate_persistent_feed(
# start sample loop and shm incrementer task for OHLC style sampling
# at the above registered step periods.
try:
log.info(f'Starting sampler task for {fqsn}')
log.info(f'Starting sampler task for {fqme}')
await sample_and_broadcast(
bus,
rt_shm,
@ -1114,7 +1182,7 @@ async def allocate_persistent_feed(
sum_tick_vlm
)
finally:
log.warning(f'{fqsn} feed task terminated')
log.warning(f'{fqme} feed task terminated')
@tractor.context
@ -1197,26 +1265,28 @@ async def open_feed_bus(
# subscriber
flume = bus.feeds[symbol]
sym = flume.symbol
bfqsn = sym.key
fqsn = sym.fqsn # true fqsn
assert bfqsn in fqsn and brokername in fqsn
bs_mktid = sym.key
fqsn = sym.fqme # true fqsn
assert bs_mktid in fqsn and brokername in fqsn
if sym.suffix:
bfqsn = fqsn.removesuffix(f'.{brokername}')
log.warning(f'{brokername} expanded symbol {symbol} -> {bfqsn}')
bs_mktid = fqsn.removesuffix(f'.{brokername}')
log.warning(f'{brokername} expanded symbol {symbol} -> {bs_mktid}')
# pack for ``.started()`` sync msg
flumes[fqsn] = flume
# we use the broker-specific fqsn (bfqsn) for
# the sampler subscription since the backend isn't (yet)
# expected to append it's own name to the fqsn, so we filter
# on keys which *do not* include that name (e.g .ib) .
bus._subscribers.setdefault(bfqsn, set())
# we use the broker-specific market id (bs_mktid) for the
# sampler subscription since the backend isn't (yet) expected to
# append its own name to the fqsn, so we filter on keys which
# *do not* include that name (e.g. '.ib').
bus._subscribers.setdefault(bs_mktid, set())
# sync feed subscribers with flume handles
await ctx.started(
{fqsn: flume.to_msg() for fqsn, flume in flumes.items()}
{fqsn: flume.to_msg()
for fqsn, flume in flumes.items()
}
)
if not start_stream:
@ -1276,9 +1346,9 @@ async def open_feed_bus(
# maybe use the current task-id to key the sub list that's
# added / removed? Or maybe we can add a general
# pause-resume by sub-key api?
bfqsn = fqsn.removesuffix(f'.{brokername}')
local_subs.setdefault(bfqsn, set()).add(sub)
bus.add_subs(bfqsn, {sub})
bs_mktid = fqsn.removesuffix(f'.{brokername}')
local_subs.setdefault(bs_mktid, set()).add(sub)
bus.add_subs(bs_mktid, {sub})
# sync caller with all subs registered state
sub_registered.set()
@ -1291,16 +1361,16 @@ async def open_feed_bus(
async for msg in stream:
if msg == 'pause':
for bfqsn, subs in local_subs.items():
for bs_mktid, subs in local_subs.items():
log.info(
f'Pausing {bfqsn} feed for {uid}')
bus.remove_subs(bfqsn, subs)
f'Pausing {bs_mktid} feed for {uid}')
bus.remove_subs(bs_mktid, subs)
elif msg == 'resume':
for bfqsn, subs in local_subs.items():
for bs_mktid, subs in local_subs.items():
log.info(
f'Resuming {bfqsn} feed for {uid}')
bus.add_subs(bfqsn, subs)
f'Resuming {bs_mktid} feed for {uid}')
bus.add_subs(bs_mktid, subs)
else:
raise ValueError(msg)
@ -1314,8 +1384,8 @@ async def open_feed_bus(
cs.cancel()
# drop all subs for this task from the bus
for bfqsn, subs in local_subs.items():
bus.remove_subs(bfqsn, subs)
for bs_mktid, subs in local_subs.items():
bus.remove_subs(bs_mktid, subs)
class Feed(Struct):
@ -1512,7 +1582,7 @@ async def open_feed(
feed = Feed()
for fqsn in fqsns:
brokername, key, suffix = unpack_fqsn(fqsn)
brokername, key, suffix = unpack_fqme(fqsn)
bfqsn = fqsn.replace('.' + brokername, '')
try:
@ -1635,7 +1705,7 @@ async def open_feed(
# apply `brokerd`-common stream to each flume
# tracking a symbol from that provider.
for fqsn, flume in feed.flumes.items():
if brokermod.name in flume.symbol.brokers:
if brokermod.name == flume.symbol.broker:
flume.stream = stream
assert len(feed.mods) == len(feed.portals) == len(feed.streams)

View File

@ -22,6 +22,7 @@ real-time data processing data-structures.
"""
from __future__ import annotations
# from decimal import Decimal
from typing import (
TYPE_CHECKING,
)
@ -30,10 +31,11 @@ import tractor
import pendulum
import numpy as np
from .types import Struct
from ._source import (
from ..accounting._mktinfo import (
MktPair,
Symbol,
)
from .types import Struct
from ._sharedmem import (
attach_shm_array,
ShmArray,
@ -89,7 +91,7 @@ class Flume(Struct):
queuing properties.
'''
symbol: Symbol
symbol: Symbol | MktPair
first_quote: dict
_rt_shm_token: _Token
@ -172,8 +174,16 @@ class Flume(Struct):
# TODO: get native msgspec decoding for these working
def to_msg(self) -> dict:
msg = self.to_dict()
# TODO: do we even need to convert to dict
# first now?
# TODO: drop the former.
msg['symbol'] = msg['symbol'].to_dict()
mktpair = msg.get('mktpair')
if mktpair:
msg['mktpair'] = mktpair.to_dict()
# can't serialize the stream or feed objects, it's expected
# you'll have a ref to it since this msg should be rxed on
@ -183,12 +193,30 @@ class Flume(Struct):
return msg
@classmethod
def from_msg(cls, msg: dict) -> dict:
symbol = Symbol(**msg.pop('symbol'))
return cls(
symbol=symbol,
**msg,
)
def from_msg(
cls,
msg: dict,
) -> dict:
'''
Load from an IPC msg presumably in either `dict` or
`msgspec.Struct` form.
'''
sym_msg = msg.pop('symbol')
if 'dst' in sym_msg:
mkt = MktPair.from_msg(sym_msg)
else:
# XXX NOTE: ``msgspec`` can encode `Decimal`
# but it doesn't decode to it by default since
# we aren't spec-cing these msgs as structs, SO
# we have to ensure we do a struct type cast (which `.copy()`
# does) to ensure we get the right type!
mkt = Symbol(**sym_msg).copy()
return cls(symbol=mkt, **msg)
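The upshot is that a `Flume` now round-trips over IPC whichever symbol type it carries; a usage sketch (given some feed-side `flume: Flume`):

msg: dict = flume.to_msg()  # stream/feed refs are dropped

# ..receiving side: the symbol payload is re-typed to `MktPair`
# when a 'dst' field is present, else to a legacy `Symbol` via
# the `.copy()` cast described in the NOTE above:
flume2 = Flume.from_msg(msg)
assert type(flume2.symbol) in (MktPair, Symbol)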
def get_index(
self,

View File

@ -19,7 +19,6 @@ Built-in (extension) types.
"""
import sys
from typing import Optional
from pprint import pformat
import msgspec
@ -59,7 +58,7 @@ class Struct(
def copy(
self,
update: Optional[dict] = None,
update: dict | None = None,
) -> msgspec.Struct:
'''
@ -80,9 +79,11 @@ class Struct(
msgspec.msgpack.Encoder().encode(self)
)
# NOTE XXX: this won't work on frozen types!
# use ``.copy()`` above in such cases.
def typecast(
self,
# fields: Optional[list[str]] = None,
# fields: list[str] | None = None,
) -> None:
for fname, ftype in self.__annotations__.items():
setattr(self, fname, ftype(getattr(self, fname)))
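A tiny (made up) example of the in-place cast, eg. after stringly-typed values slip in un-checked at construction time:

class Quote(Struct):
    price: float
    size: float

q = Quote(price='1.5', size='10')  # msgspec doesn't check __init__ types
q.typecast()
assert q.price == 1.5 and q.size == 10.0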

View File

@ -45,7 +45,7 @@ from ..data._sampling import (
_default_delay_s,
open_sample_stream,
)
from ..data._source import Symbol
from ..accounting._mktinfo import Symbol
from ._api import (
Fsp,
_load_builtins,
@ -104,7 +104,7 @@ async def fsp_compute(
disabled=True
)
fqsn = symbol.front_fqsn()
fqsn = symbol.fqme
out_stream = func(
# TODO: do we even need this if we do the feed api right?

View File

@ -20,6 +20,7 @@ Actor-runtime service orchestration machinery.
"""
from __future__ import annotations
from ._util import log
from ._mngr import Services
from ._registry import ( # noqa
_tractor_kwargs,

View File

@ -34,8 +34,8 @@ from contextlib import (
import tractor
import trio
from ..log import (
get_logger,
from ._util import (
log, # sub-sys logger
get_console_log,
)
from ._mngr import (
@ -47,8 +47,6 @@ from ._registry import ( # noqa
open_registry,
)
log = get_logger(__name__)
def get_tractor_runtime_kwargs() -> dict[str, Any]:
'''
@ -185,7 +183,10 @@ async def open_pikerd(
trio.open_nursery() as service_nursery,
):
if root_actor.accept_addr != reg_addr:
raise RuntimeError(f'Daemon failed to bind on {reg_addr}!?')
raise RuntimeError(
f'`pikerd` failed to bind on {reg_addr}!\n'
'Maybe you have another daemon already running?'
)
# assign globally for future daemon/task creation
Services.actor_n = actor_nursery

View File

@ -48,14 +48,12 @@ from requests.exceptions import (
ReadTimeout,
)
from ..log import (
get_logger,
from ._util import (
log, # sub-sys logger
get_console_log,
)
from .. import config
log = get_logger(__name__)
class DockerNotStarted(Exception):
'Prolly you dint start da daemon bruh'

View File

@ -20,7 +20,6 @@ Daemon-actor spawning "endpoint-hooks".
"""
from __future__ import annotations
from typing import (
Optional,
Callable,
Any,
)
@ -30,9 +29,8 @@ from contextlib import (
import tractor
from ..log import (
get_logger,
get_console_log,
from ._util import (
log, # sub-sys logger
)
from ..brokers import get_brokermod
from ._mngr import (
@ -41,9 +39,8 @@ from ._mngr import (
from ._actor_runtime import maybe_open_pikerd
from ._registry import find_service
log = get_logger(__name__)
# `brokerd` enabled modules
# TODO: move this def to the `.data` subpkg..
# NOTE: keeping this list as small as possible is part of our caps-sec
# model and should be treated with utmost care!
_data_mods = [
@ -60,11 +57,13 @@ async def maybe_spawn_daemon(
service_name: str,
service_task_target: Callable,
spawn_args: dict[str, Any],
loglevel: Optional[str] = None,
spawn_args: dict[str, Any],
loglevel: str | None = None,
singleton: bool = False,
**kwargs,
**pikerd_kwargs,
) -> tractor.Portal:
'''
@ -79,9 +78,6 @@ async def maybe_spawn_daemon(
clients.
'''
if loglevel:
get_console_log(loglevel)
# serialize access to this section to avoid
# 2 or more tasks racing to create a daemon
lock = Services.locks[service_name]
@ -93,18 +89,17 @@ async def maybe_spawn_daemon(
yield portal
return
log.warning(f"Couldn't find any existing {service_name}")
# TODO: really shouldn't the actor spawning be part of the service
# starting method `Services.start_service()` ?
log.warning(
f"Couldn't find any existing {service_name}\n"
'Attempting to spawn new daemon-service..'
)
# ask the root ``pikerd`` daemon to spawn the daemon we need;
# if pikerd is not live we now become the root of the
# process tree.
async with maybe_open_pikerd(
loglevel=loglevel,
**kwargs,
**pikerd_kwargs,
) as pikerd_portal:
@ -117,23 +112,33 @@ async def maybe_spawn_daemon(
# service task for that actor.
started: bool
if pikerd_portal is None:
started = await service_task_target(**spawn_args)
started = await service_task_target(
loglevel=loglevel,
**spawn_args,
)
else:
# tell the remote `pikerd` to start the target,
# the target can't return a non-serializable value
# since it is expected that service startingn is
# non-blocking and the target task will persist running
# on `pikerd` after the client requesting it's start
# disconnects.
# request a remote `pikerd` (service manager) to start the
# target daemon-task, the target can't return
# a non-serializable value since it is expected that service
# starting is non-blocking and the target task will persist
# running "under" or "within" the `pikerd` actor tree after
# the requesting client disconnects. in other words this
# spawns a persistent daemon actor that continues to live
# for the lifespan of whatever the service manager inside
# `pikerd` says it should.
started = await pikerd_portal.run(
service_task_target,
loglevel=loglevel,
**spawn_args,
)
if started:
log.info(f'Service {service_name} started!')
# block until we can discover (by IPC connection) the newly
# spawned daemon-actor and then deliver the portal to the
# caller.
async with tractor.wait_for_actor(service_name) as portal:
lock.release()
yield portal
@ -143,7 +148,8 @@ async def maybe_spawn_daemon(
async def spawn_brokerd(
brokername: str,
loglevel: Optional[str] = None,
loglevel: str | None = None,
**tractor_kwargs,
) -> bool:
@ -182,8 +188,11 @@ async def spawn_brokerd(
await Services.start_service_task(
dname,
portal,
# signature of target root-task endpoint
_setup_persistent_brokerd,
brokername=brokername,
loglevel=loglevel,
)
return True
@ -192,8 +201,9 @@ async def spawn_brokerd(
async def maybe_spawn_brokerd(
brokername: str,
loglevel: Optional[str] = None,
**kwargs,
loglevel: str | None = None,
**pikerd_kwargs,
) -> tractor.Portal:
'''
@ -207,10 +217,10 @@ async def maybe_spawn_brokerd(
service_task_target=spawn_brokerd,
spawn_args={
'brokername': brokername,
'loglevel': loglevel,
},
loglevel=loglevel,
**kwargs,
**pikerd_kwargs,
) as portal:
yield portal
@ -218,7 +228,7 @@ async def maybe_spawn_brokerd(
async def spawn_emsd(
loglevel: Optional[str] = None,
loglevel: str | None = None,
**extra_tractor_kwargs
) -> bool:
@ -245,7 +255,10 @@ async def spawn_emsd(
await Services.start_service_task(
'emsd',
portal,
# signature of target root-task endpoint
_setup_persistent_emsd,
loglevel=loglevel,
)
return True
@ -254,18 +267,18 @@ async def spawn_emsd(
async def maybe_open_emsd(
brokername: str,
loglevel: Optional[str] = None,
**kwargs,
loglevel: str | None = None,
) -> tractor._portal.Portal: # noqa
**pikerd_kwargs,
) -> tractor.Portal: # noqa
async with maybe_spawn_daemon(
'emsd',
service_task_target=spawn_emsd,
spawn_args={'loglevel': loglevel},
spawn_args={},
loglevel=loglevel,
**kwargs,
**pikerd_kwargs,
) as portal:
yield portal

View File

@ -28,12 +28,10 @@ import trio
from trio_typing import TaskStatus
import tractor
from ..log import (
get_logger,
from ._util import (
log, # sub-sys logger
)
log = get_logger(__name__)
# TODO: factor this into a ``tractor.highlevel`` extension
# pack for the library.
@ -58,7 +56,7 @@ class Services:
name: str,
portal: tractor.Portal,
target: Callable,
**kwargs,
**ctx_kwargs,
) -> (trio.CancelScope, tractor.Context):
'''
@ -83,7 +81,7 @@ class Services:
with trio.CancelScope() as cs:
async with portal.open_context(
target,
**kwargs,
**ctx_kwargs,
) as (ctx, first):

View File

@ -28,13 +28,10 @@ from typing import (
import tractor
from ..log import (
get_logger,
from ._util import (
log, # sub-sys logger
)
log = get_logger(__name__)
_default_registry_host: str = '127.0.0.1'
_default_registry_port: int = 6116
_default_reg_addr: tuple[str, int] = (

View File

@ -0,0 +1,33 @@
# piker: trading gear for hackers
# Copyright (C) Tyler Goodlet (in stewardship for pikers)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""
Sub-sys module commons.
"""
from functools import partial
from ..log import (
get_logger,
get_console_log,
)
subsys: str = 'piker.service'
log = get_logger(subsys)
get_console_log = partial(
get_console_log,
name=subsys,
)

View File

@ -20,21 +20,17 @@ from typing import (
TYPE_CHECKING,
)
import asks
if TYPE_CHECKING:
import docker
from ._ahab import DockerContainer
from piker.log import (
get_logger,
get_console_log
from . import log # sub-sys logger
from ._util import (
get_console_log,
)
import asks
log = get_logger(__name__)
# container level config
_config = {
@ -92,7 +88,7 @@ def start_elasticsearch(
'http://localhost:19200/_cat/health',
params={'format': 'json'}
)).json()
kog.info(
log.info(
'ElasticSearch cntr health:\n'
f'{health}'
)

View File

@ -54,14 +54,14 @@ if TYPE_CHECKING:
import docker
from ._ahab import DockerContainer
from ._util import (
log, # sub-sys logger
get_console_log,
)
from ..data.feed import maybe_open_feed
from ..log import get_logger, get_console_log
from .._profile import Profiler
log = get_logger(__name__)
# ahabd-supervisor and container level config
_config = {
'grpc_listen_port': 5995,
@ -703,7 +703,7 @@ async def open_tsdb_client(
# profiler('Finished db arrays diffs')
syms = await storage.client.list_symbols()
_ = await storage.client.list_symbols()
# log.info(f'Existing tsdb symbol set:\n{pformat(syms)}')
# profiler(f'listed symbols {syms}')
yield storage

View File

@ -28,7 +28,7 @@ from ..service import maybe_spawn_brokerd
from . import _event
from ._exec import run_qtractor
from ..data.feed import install_brokerd_search
from ..data._source import unpack_fqsn
from ..accounting._mktinfo import unpack_fqsn
from . import _search
from ._chart import GodWidget
from ..log import get_logger

View File

@ -29,7 +29,7 @@ from PyQt5 import QtCore, QtGui, QtWidgets
from PyQt5.QtCore import QPointF
from . import _pg_overrides as pgo
from ..data._source import float_digits
from ..accounting._mktinfo import float_digits
from ._label import Label
from ._style import DpiAwareFont, hcolor, _font
from ._interaction import ChartView

View File

@ -68,7 +68,7 @@ from ..data.feed import (
Feed,
Flume,
)
from ..data._source import Symbol
from ..accounting._mktinfo import Symbol
from ..log import get_logger
from ._interaction import ChartView
from ._forms import FieldsForm
@ -290,8 +290,8 @@ class GodWidget(QWidget):
symbol = self.rt_linked.symbol
if symbol is not None:
self.window.setWindowTitle(
f'{symbol.front_fqsn()} '
f'tick:{symbol.tick_size}'
f'{symbol.fqme} '
f'tick:{symbol.size_tick}'
)
return order_mode_started

View File

@ -363,7 +363,8 @@ class Cursor(pg.GraphicsObject):
# value used for rounding y-axis discrete tick steps
# computing once, up front, here cuz why not
self._y_incr_mult = 1 / self.linked._symbol.tick_size
mkt = self.linked._symbol
self._y_tick_mult = 1/float(mkt.price_tick)
# line width in view coordinates
self._lw = self.pixelWidth() * self.lines_pen.width()
@ -571,9 +572,15 @@ class Cursor(pg.GraphicsObject):
line_offset = self._lw / 2
# round y value to nearest tick step
m = self._y_incr_mult
m = self._y_tick_mult
iy = round(y * m) / m
vl_y = iy - line_offset
# print(
# f'tick: {self._y_tick}\n'
# f'y: {y}\n'
# f'iy: {iy}\n'
# f'vl_y: {vl_y}\n'
# )
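# worked example of the snap (tick value illustrative):
#   price_tick = 0.25 -> m = 4.0
#   y = 103.37        -> round(103.37 * 4) / 4 == 103.25
# ie. the cursor y is always snapped to the nearest price tick.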
# update y-range items
if iy != last_iy:

View File

@ -1221,7 +1221,6 @@ async def display_symbol_data(
# use expanded contract symbols passed back from feed layer.
fqsns = list(feed.flumes.keys())
# step_size_s = 1
# tf_key = tf_in_1s[step_size_s]
godwidget.window.setWindowTitle(
@ -1288,7 +1287,6 @@ async def display_symbol_data(
hist_ohlcv: ShmArray = flume.hist_shm
symbol = flume.symbol
brokername = symbol.brokers[0]
fqsn = symbol.fqsn
hist_chart = hist_linked.plot_ohlc_main(
@ -1337,8 +1335,7 @@ async def display_symbol_data(
None | ChartPlotWidget
] = {}.fromkeys(feed.flumes)
if (
not symbol.broker_info[brokername].get('no_vlm', False)
and has_vlm(ohlcv)
has_vlm(ohlcv)
and vlm_chart is None
):
vlm_chart = vlm_charts[fqsn] = await ln.start(
@ -1497,13 +1494,13 @@ async def display_symbol_data(
)
# boot order-mode
order_ctl_symbol: str = fqsns[0]
order_ctl_fqme: str = fqsns[0]
mode: OrderMode
async with (
open_order_mode(
feed,
godwidget,
fqsns[0],
order_ctl_fqme,
order_mode_started,
loglevel=loglevel
) as mode
@ -1511,7 +1508,7 @@ async def display_symbol_data(
rt_linked.mode = mode
rt_viz = rt_chart.get_viz(order_ctl_symbol)
rt_viz = rt_chart.get_viz(order_ctl_fqme)
rt_viz.plot.setFocus()
# default view adjustments and sidepane alignment

View File

@ -46,7 +46,7 @@ from ..data._sharedmem import (
try_read,
)
from ..data.feed import Flume
from ..data._source import Symbol
from ..accounting._mktinfo import Symbol
from ._chart import (
ChartPlotWidget,
LinkedSplits,

View File

@ -126,7 +126,7 @@ class LevelLine(pg.InfiniteLine):
self._on_drag_start = lambda l: None
self._on_drag_end = lambda l: None
self._y_incr_mult = 1 / chart.linked.symbol.tick_size
self._y_incr_mult = float(1 / chart.linked.symbol.size_tick)
self._right_end_sc: float = 0
# use px caching

View File

@ -45,8 +45,14 @@ from ..calc import (
pnl,
puterize,
)
from ..clearing._allocate import Allocator
from ..pp import Position
from ..accounting._allocate import Allocator
from ..accounting import (
Position,
)
from ..accounting._mktinfo import (
_derivs,
)
from ..data._normalize import iterticks
from ..data.feed import (
Feed,
@ -85,7 +91,7 @@ async def update_pnl_from_feed(
pp: PositionTracker = order_mode.current_pp
live: Position = pp.live_pp
key: str = live.symbol.front_fqsn()
key: str = live.symbol.fqme
log.info(f'Starting pnl display for {pp.alloc.account}')
@ -424,7 +430,7 @@ class SettingsPane:
# maybe start update task
global _pnl_tasks
fqsn = sym.front_fqsn()
fqsn = sym.fqme
if fqsn not in _pnl_tasks:
_pnl_tasks[fqsn] = True
self.order_mode.nursery.start_soon(
@ -495,14 +501,6 @@ def pp_line(
return line
_derivs = (
'future',
'continuous_future',
'option',
'futures_option',
)
# TODO: move into annoate module?
def mk_level_marker(
chart: ChartPlotWidget,

View File

@ -1,5 +1,5 @@
# piker: trading gear for hackers
# Copyright (C) Tyler Goodlet (in stewardship for piker0)
# Copyright (C) Tyler Goodlet (in stewardship for pikers)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
@ -36,13 +36,16 @@ import trio
from PyQt5.QtCore import Qt
from .. import config
from ..pp import Position
from ..clearing._client import open_ems, OrderBook
from ..clearing._allocate import (
from ..accounting import Position
from ..accounting._allocate import (
mk_allocator,
)
from ..clearing._client import (
open_ems,
OrderClient,
)
from ._style import _font
from ..data._source import Symbol
from ..accounting._mktinfo import Symbol
from ..data.feed import (
Feed,
Flume,
@ -120,7 +123,7 @@ class OrderMode:
chart: ChartPlotWidget # type: ignore # noqa
hist_chart: ChartPlotWidget # type: ignore # noqa
nursery: trio.Nursery # used by ``ui._position`` code?
book: OrderBook
client: OrderClient
lines: LineEditor
arrows: ArrowEditor
multistatus: MultiStatus
@ -286,13 +289,27 @@ class OrderMode:
symbol = self.chart.linked.symbol
# NOTE : we could also use instead,
# symbol.quantize(price, quantity_type='price')
# but it returns a Decimal and it's probably gonna
# be slower?
# TODO: should we be enforcing this precision
# at a different layer in the stack? right now
# any precision error will literally be relayed
# all the way back from the backend.
price = round(
price,
ndigits=symbol.tick_size_digits,
)
order = self._staged_order = Order(
action=action,
price=price,
account=self.current_pp.alloc.account,
size=0,
symbol=symbol,
brokers=symbol.brokers,
brokers=[symbol.broker],
oid='', # filled in on submit
exec_mode=trigger_type, # dark or live
)
@ -349,12 +366,17 @@ class OrderMode:
'''
if not order:
staged: Order = self._staged_order
# apply order fields for ems
oid = str(uuid.uuid4())
order = staged.copy()
order.oid = oid
order.symbol = order.symbol.front_fqsn()
# NOTE: we have to str-ify `MktPair` first since we can't
# cast to it without being mega explicit with
# `msgspec.Struct`, which we're not yet..
order: Order = staged.copy({
'symbol': str(staged.symbol),
'oid': oid,
})
lines = self.lines_from_order(
order,
@ -401,13 +423,13 @@ class OrderMode:
# send order cmd to ems
if send_msg:
self.book.send(order)
self.client.send_nowait(order)
else:
# just register for control over this order
# TODO: some kind of mini-perms system here based on
# an out-of-band tagging/auth sub-sys for multiplayer
# order control?
self.book._sent_orders[order.oid] = order
self.client._sent_orders[order.oid] = order
return dialog
@ -428,14 +450,23 @@ class OrderMode:
line: LevelLine,
) -> None:
'''
Retrieve the level line's end state, compute the size
and price for the new price-level, send an update msg to
the EMS, adjust mirrored level line on secondary chart.
level = line.value()
'''
mktinfo = self.chart.linked.symbol
level = round(
line.value(),
ndigits=mktinfo.tick_size_digits,
)
# updated by level change callback set in ``.new_line_from_order()``
dialog = line.dialog
size = dialog.order.size
# NOTE: sends modified order msg to EMS
self.book.send_update(
self.client.update_nowait(
uuid=line.dialog.uuid,
price=level,
size=size,
@ -551,7 +582,7 @@ class OrderMode:
) -> None:
msg = self.book._sent_orders.pop(uuid, None)
msg = self.client._sent_orders.pop(uuid, None)
if msg is not None:
self.lines.remove_line(uuid=uuid)
@ -607,7 +638,7 @@ class OrderMode:
dialog.last_status_close = cancel_status_close
ids.append(oid)
self.book.cancel(uuid=oid)
self.client.cancel_nowait(uuid=oid)
return ids
@ -674,19 +705,22 @@ async def open_order_mode(
multistatus = chart.window().status_bar
done = multistatus.open_status('starting order mode..')
book: OrderBook
client: OrderClient
trades_stream: tractor.MsgStream
# The keys in this dict **must** be in our set of "normalized"
# symbol names (i.e. the same names you'd get back in search
# results) in order for position msgs to correctly trigger the
# display of a position indicator on screen.
position_msgs: dict[str, list[BrokerdPosition]]
position_msgs: dict[str, dict[str, BrokerdPosition]]
# spawn EMS actor-service
async with (
open_ems(fqsn, loglevel=loglevel) as (
book,
open_ems(
fqsn,
loglevel=loglevel,
) as (
client,
trades_stream,
position_msgs,
brokerd_accounts,
@ -709,7 +743,7 @@ async def open_order_mode(
# load account names from ``brokers.toml``
accounts_def = config.load_accounts(
providers=symbol.brokers
providers=[symbol.broker],
)
# XXX: ``brokerd`` delivers a set of account names that it
@ -737,7 +771,7 @@ async def open_order_mode(
ppu=0,
# XXX: BLEH, do we care about this on the client side?
bsuid=symbol,
bs_mktid=symbol.key,
)
# allocator config
@ -813,7 +847,7 @@ async def open_order_mode(
chart,
hist_chart,
tn,
book,
client,
lines,
arrows,
multistatus,
@ -861,12 +895,14 @@ async def open_order_mode(
# Pack position messages by account, should only be one-to-one.
# NOTE: requires the backend exactly specifies
# the expected symbol key in its positions msg.
for (broker, acctid), msgs in position_msgs.items():
for msg in msgs:
log.info(f'Loading pp for {acctid}@{broker}:\n{pformat(msg)}')
for (
(broker, acctid),
pps_by_fqme
) in position_msgs.items():
for msg in pps_by_fqme.values():
await process_trade_msg(
mode,
book,
client,
msg,
)
@ -900,7 +936,7 @@ async def open_order_mode(
await process_trade_msg(
mode,
book,
client,
msg,
)
@ -908,7 +944,7 @@ async def open_order_mode(
process_trades_and_update_ui,
trades_stream,
mode,
book,
client,
)
yield mode
@ -918,7 +954,7 @@ async def process_trades_and_update_ui(
trades_stream: tractor.MsgStream,
mode: OrderMode,
book: OrderBook,
client: OrderClient,
) -> None:
@ -927,14 +963,14 @@ async def process_trades_and_update_ui(
async for msg in trades_stream:
await process_trade_msg(
mode,
book,
client,
msg,
)
async def process_trade_msg(
mode: OrderMode,
book: OrderBook,
client: OrderClient,
msg: dict,
) -> tuple[Dialog, Status]:
@ -948,13 +984,16 @@ async def process_trade_msg(
):
sym = mode.chart.linked.symbol
pp_msg_symbol = msg['symbol'].lower()
fqsn = sym.front_fqsn()
broker, key = sym.front_feed()
fqme = sym.fqme
broker = sym.broker
if (
pp_msg_symbol == fqsn
or pp_msg_symbol == fqsn.removesuffix(f'.{broker}')
pp_msg_symbol == fqme
or pp_msg_symbol == fqme.removesuffix(f'.{broker}')
):
log.info(f'{fqsn} matched pp msg: {fmsg}')
log.info(
f'Loading position for `{fqme}`:\n'
f'{fmsg}'
)
tracker = mode.trackers[msg['account']]
tracker.live_pp.update_from_msg(msg)
tracker.update_from_pp(set_as_startup=True) # status/pane UI
@ -997,7 +1036,7 @@ async def process_trade_msg(
assert msg.resp in ('open', 'dark_open'), f'Unknown msg: {msg}'
sym = mode.chart.linked.symbol
fqsn = sym.front_fqsn()
fqsn = sym.fqme
if (
((order.symbol + f'.{msg.src}') == fqsn)
@ -1069,7 +1108,7 @@ async def process_trade_msg(
case Status(resp='fill'):
# handle out-of-piker fills reporting?
order: Order = book._sent_orders.get(oid)
order: Order = client._sent_orders.get(oid)
if not order:
log.warning(f'order {oid} is unknown')
order = msg.req

View File

@ -18,6 +18,3 @@
# ``asyncvnc`` for sending interactions to ib-gw inside docker
-e git+https://github.com/pikers/asyncvnc.git@main#egg=asyncvnc
# ``cryptofeed`` for connecting to various crypto exchanges + custom fixes
-e git+https://github.com/pikers/cryptofeed.git@date_parsing#egg=cryptofeed

View File

@ -37,6 +37,7 @@ setup(
'console_scripts': [
'piker = piker.cli:cli',
'pikerd = piker.cli:pikerd',
'ledger = piker.accounting.cli:ledger',
]
},
install_requires=[

View File

@ -5,6 +5,7 @@ import os
from pathlib import Path
import pytest
import pytest_trio
import tractor
from piker import (
config,
@ -13,7 +14,6 @@ from piker.service import (
Services,
)
from piker.log import get_console_log
from piker.clearing._client import open_ems
def pytest_addoption(parser):
@ -87,8 +87,11 @@ def log(
@acm
async def _open_test_pikerd(
tmpconfdir: str,
reg_addr: tuple[str, int] | None = None,
loglevel: str = 'warning',
debug_mode: bool = False,
**kwargs,
) -> tuple[
@ -122,6 +125,8 @@ async def _open_test_pikerd(
# or just in sequence per test, so we keep root.
drop_root_perms_for_ahab=False,
debug_mode=debug_mode,
**kwargs,
) as service_manager,
@ -153,6 +158,11 @@ def open_test_pikerd(
tmpconfdir.mkdir()
tmpconfdir_str: str = str(tmpconfdir)
# override config dir in the root actor (aka
# this top level testing process).
from piker import config
config._config_dir = tmpconfdir
# NOTE: on linux the tmp config dir is generally located at:
# /tmp/pytest-of-<username>/pytest-<run#>/test_<current_test_name>/
# the default `pytest` config ensures that only the last 4 test
@ -171,6 +181,8 @@ def open_test_pikerd(
# bind in level from fixture, which is itself set by
# `--ll <value>` cli flag.
loglevel=loglevel,
debug_mode=request.config.option.usepdb
)
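# e.g. invoking `pytest --pdb <test-module>` will now also enable
# `tractor`'s crash-handling/debug mode inside the test runtime.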
# NOTE: the `tmp_dir` fixture will wipe any files older then 3 test
@ -183,37 +195,3 @@ def open_test_pikerd(
# - no leaked subprocs or shm buffers
# - all requested container service are torn down
# - certain ``tractor`` runtime state?
@acm
async def _open_test_pikerd_and_ems(
fqsn,
mode,
loglevel,
open_test_pikerd
):
async with (
open_test_pikerd() as (_, _, _, services),
open_ems(
fqsn,
mode=mode,
loglevel=loglevel,
) as ems_services,
):
yield (services, ems_services)
@pytest.fixture
def open_test_pikerd_and_ems(
open_test_pikerd,
fqsn: str = 'xbtusdt.kraken',
mode: str = 'paper',
loglevel: str = 'info',
):
yield partial(
_open_test_pikerd_and_ems,
fqsn,
mode,
loglevel,
open_test_pikerd
)

393 tests/test_ems.py 100644
View File

@ -0,0 +1,393 @@
'''
Execution mgmt system (EMS) e2e testing.
Most tests leverage our paper clearing engine found (currently) in
``piker.clearing._paper_engine``.
Ideally, in the longer run, we can support forms of (non-clearing)
live order tests against certain backends which make it possible to do
so..
'''
from contextlib import (
contextmanager as cm,
)
from typing import (
Awaitable,
Callable,
AsyncContextManager,
Literal,
)
import trio
# import pytest_trio
from exceptiongroup import BaseExceptionGroup
import pytest
import tractor
from uuid import uuid4
from piker.service import Services
from piker.log import get_logger
from piker.clearing._messages import (
Order,
Status,
# Cancel,
BrokerdPosition,
)
from piker.clearing import (
open_ems,
OrderClient,
)
from piker.accounting._mktinfo import (
unpack_fqme,
)
from piker.accounting import (
open_pps,
Position,
)
log = get_logger(__name__)
async def open_pikerd(
open_test_pikerd: AsyncContextManager,
) -> Services:
async with (
open_test_pikerd() as (_, _, _, services),
):
yield services
async def order_and_and_wait_for_ppmsg(
client: OrderClient,
trades_stream: tractor.MsgStream,
fqme: str,
action: Literal['buy', 'sell'],
price: float = 100e3, # just a super high price.
size: float = 0.01,
exec_mode: str = 'live',
account: str = 'paper',
) -> list[Status | BrokerdPosition]:
'''
Send a single order via the EMS `client` and wait on `trades_stream`
for the sequence of status msgs terminating in a position msg.
'''
sent: list[Order] = []
broker, key, suffix = unpack_fqme(fqme)
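# e.g. (assumed fqme layout '<key>.<broker>'):
#   unpack_fqme('xbtusdt.kraken') -> ('kraken', 'xbtusdt', '')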
order = Order(
exec_mode=exec_mode,
action=action, # TODO: remove this from our schema?
oid=str(uuid4()),
account=account,
size=size,
symbol=fqme,
price=price,
brokers=[broker],
)
sent.append(order)
await client.send(order)
# TODO: i guess we should still test the old sync-API?
# client.send_nowait(order)
# Wait for position message before moving on to verify flow(s)
# for the multi-order position entry/exit.
msgs: list[Status | BrokerdPosition] = []
async for msg in trades_stream:
match msg:
case {'name': 'position'}:
ppmsg = BrokerdPosition(**msg)
msgs.append(ppmsg)
break
case {'name': 'status'}:
msgs.append(Status(**msg))
return sent, msgs
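# e.g. (sketch) typical usage from a test body:
#   sent, msgs = await order_and_and_wait_for_ppmsg(
#       client, trades_stream, 'xbtusdt.kraken', action='buy',
#   )
#   ppmsg = msgs[-1]  # final msg should be the `BrokerdPosition`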
def run_and_tollerate_cancels(
fn: Callable[..., Awaitable],
expect_errs: tuple[Exception] | None = None,
tollerate_errs: tuple[Exception] = (tractor.ContextCancelled,),
):
'''
Run ``trio``-``piker`` runtime with potential tolerance for
inter-actor cancellation during teardown (normally just
``tractor.ContextCancelled``s).
'''
if expect_errs:
with pytest.raises(BaseExceptionGroup) as exc_info:
trio.run(fn)
for err in exc_info.value.exceptions:
assert type(err) in expect_errs
else:
try:
trio.run(fn)
except tollerate_errs:
pass
@cm
def load_and_check_pos(
order: Order,
ppmsg: BrokerdPosition,
) -> None:
with open_pps(ppmsg.broker, ppmsg.account) as table:
if ppmsg.size == 0:
assert ppmsg.symbol not in table.pps
yield None
return
else:
# NOTE: a special case is here since the `PpTable.pps` are
# normally indexed by the particular broker's
# `Position.bs_mktid: str` (a unique market / symbol id provided
# by their systems/design) but for the paper engine case, this
# is the same as the fqme.
pp: Position = table.pps[ppmsg.symbol]
assert ppmsg.size == pp.size
assert ppmsg.avg_price == pp.ppu
yield pp
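# e.g. (sketch) audit the on-disk pps state right after a fill:
#   with load_and_check_pos(order, ppmsg) as pos:
#       assert pos is None or pos.size == ppmsg.size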
def test_ems_err_on_bad_broker(
open_test_pikerd: Services,
loglevel: str,
):
async def load_bad_fqme():
try:
async with (
open_test_pikerd() as (_, _, _, services),
open_ems(
'doggycoin.doggy',
mode='paper',
loglevel=loglevel,
) as _
):
pytest.fail('EMS is working on non-broker!?')
except ModuleNotFoundError:
pass
run_and_tollerate_cancels(load_bad_fqme)
async def match_ppmsgs_on_ems_boot(
ppmsgs: list[BrokerdPosition],
) -> None:
'''
Given a list of input position msgs, verify they match
what is loaded from the EMS on connect.
'''
by_acct: dict[tuple, list[BrokerdPosition]] = {}
for msg in ppmsgs:
by_acct.setdefault(
(msg.broker, msg.account),
[],
).append(msg)
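# e.g. (hypothetical values) resulting shape:
#   {('kraken', 'paper'): [BrokerdPosition(...), ...]}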
# TODO: actually support multi-mkts to `open_ems()`
# but for now just pass the first fqme.
fqme = msg.symbol
# disconnect from EMS, reconnect and ensure we get our same
# position relayed to us again in the startup msg.
async with (
open_ems(
fqme,
mode='paper',
loglevel='info',
) as (
_, # OrderClient
_, # tractor.MsgStream
startup_pps,
accounts,
_, # dialogs,
)
):
for (broker, account), ppmsgs in by_acct.items():
assert account in accounts
# lookup all msgs rx-ed for this account
rx_msgs = startup_pps[(broker, account)]
for expect_ppmsg in ppmsgs:
rx_msg = BrokerdPosition(**rx_msgs[expect_ppmsg.symbol])
assert rx_msg == expect_ppmsg
async def submit_and_check(
fills: tuple[dict],
loglevel: str,
) -> tuple[
BrokerdPosition,
Position,
]:
'''
Enter a trade and assert entries are made in pps and ledger files.
Shut down the ems-client and ensure on reconnect we get the expected
matching ``BrokerdPosition`` and pps.toml entries.
'''
broker: str = 'kraken'
mkt_key: str = 'xbtusdt'
fqme: str = f'{mkt_key}.{broker}'
startup_pps: dict[
tuple[str, str], # brokername, acctid
dict[str, BrokerdPosition],
]
async with (
open_ems(
fqme,
mode='paper',
loglevel=loglevel,
) as (
client, # OrderClient
trades_stream, # tractor.MsgStream
startup_pps,
accounts,
dialogs,
)
):
# no positions on startup
assert not startup_pps
assert 'paper' in accounts
od: dict
for od in fills:
print(f'Sending order {od} for fill')
size = od['size']
sent, msgs = await order_and_and_wait_for_ppmsg(
client,
trades_stream,
fqme,
action='buy' if size > 0 else 'sell',
price=100e3 if size > 0 else 0,
size=size,
)
last_order: Order = sent[-1]
last_resp = msgs[-1]
assert isinstance(last_resp, BrokerdPosition)
ppmsg = last_resp
# check that pps.toml for account has been updated
# and all ems position msgs match that state.
with load_and_check_pos(
last_order,
ppmsg,
) as pos:
pass
return ppmsg, pos
@pytest.mark.parametrize(
'fills',
[
# buy and leave
({'size': 0.001},),
# sell short, then buy back to net-zero in dst
(
{'size': -0.001},
{'size': 0.001},
),
# multi-partial entry and exits from net-zero, to short and back
# to net-zero.
(
# enters
{'size': 0.001},
{'size': 0.002},
# partial exit
{'size': -0.001},
# partial enter
{'size': 0.0015},
{'size': 0.001},
{'size': 0.002},
# nearly back to zero.
{'size': -0.001},
# switch to net-short
{'size': -0.025},
{'size': -0.0195},
# another entry
{'size': 0.001},
# final cover to net-zero again.
{'size': 0.038},
),
],
ids='fills={}'.format,
)
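# NOTE: the fills in the final multi-fill case net to exactly zero
# (+0.0465 bought vs -0.0465 sold) so the `ppmsg.size == accum_size`
# assert below checks a flat position.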
def test_multi_fill_positions(
open_test_pikerd: AsyncContextManager,
loglevel: str,
fills: tuple[dict],
check_cross_session: bool = False,
) -> None:
ppmsg: BrokerdPosition
pos: Position
accum_size: float = 0
for fill in fills:
accum_size += fill['size']
async def atest():
# export to outer scope for audit on second runtime-boot.
nonlocal ppmsg, pos
async with (
open_test_pikerd() as (_, _, _, services),
):
ppmsg, pos = await submit_and_check(
fills=fills,
loglevel=loglevel,
)
assert ppmsg.size == accum_size
run_and_tollerate_cancels(atest)
if check_cross_session or accum_size != 0:
# rerun just to check that position info is persistent for the paper
# account (i.e. a user can expect to see paper pps persist across
# runtime sessions).
async def just_check_pp():
async with (
open_test_pikerd() as (_, _, _, services),
):
await match_ppmsgs_on_ems_boot([ppmsg])
run_and_tollerate_cancels(just_check_pp)

View File

@ -13,13 +13,14 @@ from piker.data import (
ShmArray,
open_feed,
)
from piker.data._source import (
from piker.data.flows import Flume
from piker.accounting._mktinfo import (
unpack_fqsn,
)
@pytest.mark.parametrize(
'fqsns',
'fqmes',
[
# binance
(100, {'btcusdt.binance', 'ethusdt.binance'}, False),
@ -30,20 +31,20 @@ from piker.data._source import (
# binance + kraken
(100, {'btcusdt.binance', 'xbtusd.kraken'}, False),
],
ids=lambda param: f'quotes={param[0]}@fqsns={param[1]}',
ids=lambda param: f'quotes={param[0]}@fqmes={param[1]}',
)
def test_multi_fqsn_feed(
open_test_pikerd: AsyncContextManager,
fqsns: set[str],
fqmes: set[str],
loglevel: str,
ci_env: bool
):
'''
Start a real-time data feed for provided fqsn and pull
Start a real-time data feed for provided fqme and pull
a few quotes then simply shut down.
'''
max_quotes, fqsns, run_in_ci = fqsns
max_quotes, fqmes, run_in_ci = fqmes
if (
ci_env
@ -52,15 +53,15 @@ def test_multi_fqsn_feed(
pytest.skip('Skipping CI disabled test due to feed restrictions')
brokers = set()
for fqsn in fqsns:
brokername, key, suffix = unpack_fqsn(fqsn)
for fqme in fqmes:
brokername, key, suffix = unpack_fqsn(fqme)
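# e.g. (assumed) each fqme splits as '<key>.<broker>':
#   unpack_fqsn('btcusdt.binance') -> ('binance', 'btcusdt', '')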
brokers.add(brokername)
async def main():
async with (
open_test_pikerd(),
open_feed(
fqsns,
fqmes,
loglevel=loglevel,
# TODO: ensure throttle rate is applied
@ -71,20 +72,20 @@ def test_multi_fqsn_feed(
) as feed
):
# verify shm buffers exist
for fqin in fqsns:
for fqin in fqmes:
flume = feed.flumes[fqin]
ohlcv: ShmArray = flume.rt_shm
hist_ohlcv: ShmArray = flume.hist_shm
async with feed.open_multi_stream(brokers) as stream:
# pull the first startup quotes, one for each fqsn, and
# pull the first startup quotes, one for each fqme, and
# ensure they match each flume's startup quote value.
fqsns_copy = fqsns.copy()
fqsns_copy = fqmes.copy()
with trio.fail_after(0.5):
for _ in range(1):
first_quotes = await stream.receive()
for fqsn, quote in first_quotes.items():
for fqme, quote in first_quotes.items():
# XXX: TODO: WTF apparently this error will get
# suppressed and only show up in the teardown
@ -92,18 +93,18 @@ def test_multi_fqsn_feed(
# <tractorbugurl>
# assert 0
fqsns_copy.remove(fqsn)
flume = feed.flumes[fqsn]
fqsns_copy.remove(fqme)
flume: Flume = feed.flumes[fqme]
assert quote['last'] == flume.first_quote['last']
cntr = Counter()
with trio.fail_after(6):
async for quotes in stream:
for fqsn, quote in quotes.items():
cntr[fqsn] += 1
for fqme, quote in quotes.items():
cntr[fqme] += 1
# await tractor.breakpoint()
flume = feed.flumes[fqsn]
flume = feed.flumes[fqme]
ohlcv: ShmArray = flume.rt_shm
hist_ohlcv: ShmArray = flume.hist_shm
@ -116,7 +117,7 @@ def test_multi_fqsn_feed(
# assert last == rt_row['close']
# assert last == hist_row['close']
pprint(
f'{fqsn}: {quote}\n'
f'{fqme}: {quote}\n'
f'rt_ohlc: {rt_row}\n'
f'hist_ohlc: {hist_row}\n'
)
@ -124,6 +125,6 @@ def test_multi_fqsn_feed(
if cntr.total() >= max_quotes:
break
assert set(cntr.keys()) == fqsns
assert set(cntr.keys()) == fqmes
trio.run(main)

View File

@ -1,230 +0,0 @@
'''
Paper-mode testing
'''
import trio
from exceptiongroup import BaseExceptionGroup
from typing import (
AsyncContextManager,
Literal,
)
import pytest
from tractor._exceptions import ContextCancelled
from uuid import uuid4
from functools import partial
from piker.log import get_logger
from piker.clearing._messages import Order
from piker.pp import (
open_pps,
)
log = get_logger(__name__)
def get_fqsn(broker, symbol):
fqsn = f'{symbol}.{broker}'
return (fqsn, symbol, broker)
oid = ''
test_exec_mode = 'live'
(fqsn, symbol, broker) = get_fqsn('kraken', 'xbtusdt')
brokers = [broker]
account = 'paper'
async def _async_main(
open_test_pikerd_and_ems: AsyncContextManager,
action: Literal['buy', 'sell'] | None = None,
price: int = 30000,
executions: int = 1,
size: float = 0.01,
# Assert options
assert_entries: bool = False,
assert_pps: bool = False,
assert_zeroed_pps: bool = False,
assert_msg: bool = False,
) -> None:
'''
Start piker, place a trade and assert data in
pps stream, ledger and position table.
'''
oid: str = ''
last_msg = {}
async with open_test_pikerd_and_ems() as (
services,
(book, trades_stream, pps, accounts, dialogs),
):
if action:
for x in range(executions):
oid = str(uuid4())
order = Order(
exec_mode=test_exec_mode,
action=action,
oid=oid,
account=account,
size=size,
symbol=fqsn,
price=price,
brokers=brokers,
)
# This is actually a synchronous call to push a message
book.send(order)
async for msg in trades_stream:
last_msg = msg
match msg:
# Wait for position message before moving on
case {'name': 'position'}:
break
# Teardown piker like a user would
raise KeyboardInterrupt
if assert_entries or assert_pps or assert_zeroed_pps or assert_msg:
_assert(
assert_entries,
assert_pps,
assert_zeroed_pps,
pps,
last_msg,
size,
executions,
)
def _assert(
assert_entries,
assert_pps,
assert_zerod_pps,
pps,
last_msg,
size,
executions,
):
with (
open_pps(broker, account, write_on_exit=False) as table,
):
'''
Assert multiple cases including pps,
ledger and final position message state
'''
if assert_entries:
for key, val in [
('broker', broker),
('account', account),
('symbol', fqsn),
('size', size * executions),
('currency', symbol),
('avg_price', table.pps[symbol].ppu)
]:
assert last_msg[key] == val
if assert_pps:
last_ppu = pps[(broker, account)][-1]
assert last_ppu['avg_price'] == table.pps[symbol].ppu
if assert_zerod_pps:
assert not bool(table.pps)
def _run_test_and_check(fn):
'''
Run the test fn and assert teardown raises only the expected
KeyboardInterrupt or ContextCancelled exceptions.
'''
with pytest.raises(BaseExceptionGroup) as exc_info:
trio.run(fn)
for exception in exc_info.value.exceptions:
assert isinstance(exception, KeyboardInterrupt) or isinstance(
exception, ContextCancelled
)
def test_buy(
open_test_pikerd_and_ems: AsyncContextManager,
):
'''
Enter a trade and assert entries are made in pps and ledger files.
'''
_run_test_and_check(
partial(
_async_main,
open_test_pikerd_and_ems=open_test_pikerd_and_ems,
action='buy',
assert_entries=True,
),
)
# Open ems and assert existence of pps entries
_run_test_and_check(
partial(
_async_main,
open_test_pikerd_and_ems=open_test_pikerd_and_ems,
assert_pps=True,
),
)
def test_sell(
open_test_pikerd_and_ems: AsyncContextManager,
):
'''
Sell position and ensure pps are zeroed.
'''
_run_test_and_check(
partial(
_async_main,
open_test_pikerd_and_ems=open_test_pikerd_and_ems,
action='sell',
price=1,
),
)
_run_test_and_check(
partial(
_async_main,
open_test_pikerd_and_ems=open_test_pikerd_and_ems,
assert_zeroed_pps=True,
),
)
def test_multi_sell(
open_test_pikerd_and_ems: AsyncContextManager,
):
'''
Make 5 market limit buy orders and
then sell 5 slots at the same price.
Finally, assert cleared positions.
'''
_run_test_and_check(
partial(
_async_main,
open_test_pikerd_and_ems=open_test_pikerd_and_ems,
action='buy',
executions=5,
),
)
_run_test_and_check(
partial(
_async_main,
open_test_pikerd_and_ems=open_test_pikerd_and_ems,
action='sell',
executions=5,
price=1,
assert_zeroed_pps=True,
),
)

View File

@ -24,7 +24,7 @@ from piker.clearing._messages import (
Status,
)
from piker.clearing._client import (
OrderBook,
OrderClient,
)
@ -121,7 +121,7 @@ def test_ensure_ems_in_paper_actors(
async def main():
# type declares
book: OrderBook
book: OrderClient
trades_stream: tractor.MsgStream
pps: dict[str, list[BrokerdPosition]]
accounts: list[str]