Compare commits

...

132 Commits

Author SHA1 Message Date
Tyler Goodlet bb9f6475f4 Add disclaimer 2020-09-01 12:50:13 -04:00
Tyler Goodlet 54463b3595 Drop to 1k bars on init load 2020-09-01 12:46:30 -04:00
Tyler Goodlet aaf234cbaf Better bg color, tweak margins. 2020-08-31 17:18:35 -04:00
Tyler Goodlet 0f6589d9ff Add proper x-axis time-stamping 2020-08-31 17:18:02 -04:00
Tyler Goodlet 30d8e096c6 Use dashed crosshair, simplify x-axis alloc 2020-08-31 17:17:20 -04:00
Tyler Goodlet 19609178ce Even more colors 2020-08-31 17:16:44 -04:00
Tyler Goodlet 4c39407363 Use dashed lines for crosshair 2020-08-30 12:32:14 -04:00
Tyler Goodlet a345daa522 Try to find cad stocks 2020-08-30 12:31:32 -04:00
Tyler Goodlet ea75281cbc Add and update y-sticky labels on new price data 2020-08-30 12:29:29 -04:00
Tyler Goodlet 86a1f33abb Start color map 2020-08-30 12:28:38 -04:00
Tyler Goodlet 649798a91f Add updateable y-sticky label 2020-08-30 12:27:41 -04:00
Tyler Goodlet 6ce8d1147e Cleanup latency tracker 2020-08-26 21:45:34 -04:00
Tyler Goodlet 0d08e39597 Refer to main chart's data for date axis 2020-08-26 21:45:13 -04:00
Tyler Goodlet 38df68935d Normalize kraken quotes for latency tracking 2020-08-26 21:44:03 -04:00
Tyler Goodlet 778e3c7b06 Properly teardown data feed on cancel 2020-08-26 21:43:21 -04:00
Tyler Goodlet 69aced7521 Add "contents" labels to charts
Add a default "contents label" (eg. OHLC values for bar charts) to each
chart and update on crosshair interaction.

Few technical changes to make this happen:
- adjust bar graphics to have the HL line be in the "middle" of the
  underlying arrays' "index range" in the containing view.
- add a label dict mapping each chart's graphics name to a label + update routine
- use symbol names instead of this "main" identifier crap for referring to
  particular price curves/graphics
2020-08-26 21:42:11 -04:00
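The per-graphics label + update-routine mapping described above can be sketched as follows; every name here is illustrative, not piker's actual API:

```python
# Minimal sketch of a per-chart "contents label" registry: map each
# graphics name to a label object plus an update routine fired on
# crosshair interaction. All names are made up for illustration.
class ContentsLabels:
    def __init__(self):
        # graphics name -> (label, update_fn)
        self._labels = {}

    def register(self, name, label, update_fn):
        self._labels[name] = (label, update_fn)

    def on_crosshair(self, index, array):
        # refresh every registered label from the datum under the cursor
        for name, (label, update) in self._labels.items():
            label['text'] = update(index, array)

labels = ContentsLabels()
labels.register(
    'ohlc_bars',
    {'text': ''},
    lambda i, arr: 'O:{open} H:{high} L:{low} C:{close}'.format(**arr[i]),
)
bars = [{'open': 1.0, 'high': 2.0, 'low': 0.5, 'close': 1.5}]
labels.on_crosshair(0, bars)
print(labels._labels['ohlc_bars'][0]['text'])  # O:1.0 H:2.0 L:0.5 C:1.5
```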
Tyler Goodlet bfdd2c43cc Start brokers.api module 2020-08-21 14:28:02 -04:00
Tyler Goodlet f5ad56a257 WIP initial draft of FSP subsystem
This is a first attempt at a financial signal processing subsystem which
utilizes async generators for streaming frames of numpy array data
between actors. In this initial attempt the focus is on processing price
data and relaying it to the chart app for real-time display. So far this
seems to work (with decent latency) but much more work is likely needed
around improving the data model for even better latency and less data
duplication.

Surprisingly (or not?) a lot of simplifications to the charting code
came out of this in terms of conducting graphics updates in streaming
tasks instead of hiding them inside the obfuscated mess that is the
Qt-style-inheritance-OO-90s-trash. The goal from here onwards will be
to enforce strict semantics around reading and writing of data such that
state is kept outside "object trees" as much as possible and streaming
function semantics guide our flow model. Unsurprisingly, this reduction
in "instance state" is happening wherever we use `trio` ;)

A little summary on the technical changes:
- not going to explain the fsp system yet; it's too nascent and
  probably going to get some heavy editing.
- drop any "update" methods from the `LinkedCharts` type since each
sub-chart will have its own update task and thus a separate update
  loop; further individual graphics (per chart) may eventually require
  this same design.
- delete `ChartView`; moved into separate mod.
- add "stream from fsp" task to start our foray into real-time actor
  processed numpy streaming.
2020-08-19 15:32:09 -04:00
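The fsp streaming idea can be sketched in a single process. The real system streams numpy array frames between `tractor` actors over `trio`; a stdlib `asyncio` toy with plain lists keeps this self-contained, and every name here is made up:

```python
import asyncio

# Single-process sketch of the fsp idea: an async generator computes a
# per-frame metric over streamed price data and a consumer relays each
# result to a (stub) chart update.
async def price_frames():
    for frame in ([100.0, 102.0], [99.0, 101.0], [103.0, 105.0]):
        yield frame
        await asyncio.sleep(0)  # yield control to the event loop

async def fsp_mean(source):
    # a trivial "financial signal processor": mean price per frame
    async for frame in source:
        yield sum(frame) / len(frame)

async def main():
    results = []
    async for value in fsp_mean(price_frames()):
        results.append(value)  # stand-in for a graphics update task
    return results

means = asyncio.run(main())
print(means)  # [101.0, 100.0, 104.0]
```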
Tyler Goodlet f4dddecf17 Drop weird chart type enum 2020-08-19 07:52:17 -04:00
Tyler Goodlet 20250961a6 Begin to use `@tractor.msg.pub` throughout streaming API
Since the new FSP system will require time aligned data amongst actors,
it makes sense to share broker data feeds as much as possible on a local
system. There doesn't seem to be a downside to this approach either since
if not fanning-out in our code, the broker (server) has to do it anyway
(and who knows how junk their implementation is) though with more
clients, sockets etc. in memory on our end. It also preps the code for
introducing a more "serious" pub-sub system like zeromq/nanomsg.
2020-08-19 07:51:43 -04:00
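The fan-out behaviour being delegated to `@tractor.msg.pub` can be pictured with plain `asyncio` queues; this queue plumbing is illustrative only and is not tractor's actual implementation:

```python
import asyncio

# Generic fan-out sketch: one local feed task pushes each quote to every
# subscriber queue, so N consumers share a single upstream connection
# instead of each opening their own broker socket.
class QuoteFanOut:
    def __init__(self):
        self._subs = []

    def subscribe(self):
        q = asyncio.Queue()
        self._subs.append(q)
        return q

    async def publish(self, quote):
        for q in self._subs:
            await q.put(quote)

async def main():
    hub = QuoteFanOut()
    a, b = hub.subscribe(), hub.subscribe()
    await hub.publish({'symbol': 'XBTUSD', 'last': 11_500.0})
    return await a.get(), await b.get()

first, second = asyncio.run(main())
assert first == second  # both subscribers saw the same quote
```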
Tyler Goodlet 2f89979d8c Use partial, pass kwargs to `tractor._main()` 2020-08-19 07:41:18 -04:00
Tyler Goodlet 7acf7106df Start "interaction" module 2020-08-14 22:17:57 -04:00
Tyler Goodlet 23dcc45b63 Port monitor to normalized streams 2020-08-10 15:49:07 -04:00
Tyler Goodlet 37607d61ca Port `DataFeed` api to broker specific normalizer routine 2020-08-10 15:23:35 -04:00
Tyler Goodlet 5fe8e420b8 Add a normalizer routine which emits quote differentials/ticks 2020-08-09 00:03:09 -04:00
Tyler Goodlet 75824f7afa Future todo 2020-08-09 00:02:04 -04:00
Tyler Goodlet a1e2730aa1 Handle (far end forced) disconnects 2020-08-09 00:01:40 -04:00
Tyler Goodlet bdcf5f884b Add `services` cmd for monitoring actors 2020-08-03 21:31:56 -04:00
Tyler Goodlet e6e06a52cb Flatten out chart tasks 2020-08-02 20:10:06 -04:00
Tyler Goodlet ccf600f79a Add ravel() reference link 2020-08-02 20:09:27 -04:00
Tyler Goodlet 0389836fe6 Handle "mouse-not-on-plot" edge cases 2020-08-02 15:23:20 -04:00
Tyler Goodlet bfc1a1fcf5 Attempt more reliable chart startup
Wait for a first actual real-time quote before starting graphics update
tasks. Use the new normalized tick format brokers are expected to emit
as a `quotes['ticks']` list. Auto detect time frame from historical
bars.
2020-08-02 12:18:53 -04:00
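The time-frame auto-detection mentioned above might work roughly like this: take the modal difference between consecutive historical bar timestamps, which stays robust across gaps. The function and field layout are assumptions for illustration:

```python
from statistics import mode

# Hedged sketch of auto-detecting the bar time frame from historical
# bars: the most common delta between consecutive bar timestamps
# (epoch seconds) is taken as the frame period.
def detect_timeframe(bar_times):
    deltas = [b - a for a, b in zip(bar_times, bar_times[1:])]
    return mode(deltas)

# 5s bars with one gap (e.g. a feed hiccup) still detect as 5s
times = [0, 5, 10, 15, 35, 40, 45]
print(detect_timeframe(times))  # 5
```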
Tyler Goodlet 3066ab216a Passthrough loglevel from qtractor 2020-08-02 12:17:38 -04:00
Tyler Goodlet 699fffd964 Trigger connection reset on slowed heartbeat 2020-08-02 12:17:03 -04:00
Tyler Goodlet f779af02f1 Drop forkserver usage.
We've got the sweet and reliable `trio` spawner now :)
2020-08-02 01:42:04 -04:00
Tyler Goodlet 72d6b5b06f Trace log the heartbeat 2020-08-02 01:35:29 -04:00
Tyler Goodlet 9e4ee4d382 Pass piker log level through to tractor for chart app 2020-08-02 00:18:54 -04:00
Tyler Goodlet b743230f7f Add a couple more deps 2020-08-01 22:24:51 -04:00
Tyler Goodlet b872696d9f Set tractor loglevel in cli config 2020-08-01 22:23:19 -04:00
Tyler Goodlet 4e9057621c Drop "pipfiles"; pipenv is getting the hard boot 2020-08-01 22:22:12 -04:00
Tyler Goodlet fad58d18c9 Make ws loop restart on connection failures 2020-08-01 22:12:26 -04:00
Tyler Goodlet 06f03c690c Begin to wrap marketstore as a data feed
Wrap the sync client in an async interface in anticipation of an actual
async client. This starts support for the `open_feed()`/`stream_quotes()`
api though the tick normalization isn't correct yet.
2020-08-01 20:08:05 -04:00
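Wrapping a blocking client behind an async interface, as described above, usually means off-loading calls to a worker thread. The client and its `query` method below are stand-ins, not the real marketstore client API:

```python
import asyncio

# Sketch of wrapping a blocking marketstore-style client in an async
# interface by running calls in a worker thread so the event loop is
# never stalled.
class SyncClient:
    def query(self, tbk):
        return {'tbk': tbk, 'rows': 3}  # pretend blocking network call

class AsyncClientWrapper:
    def __init__(self, sync_client):
        self._client = sync_client

    async def query(self, tbk):
        # off-load the blocking call to a thread (Python 3.9+)
        return await asyncio.to_thread(self._client.query, tbk)

result = asyncio.run(AsyncClientWrapper(SyncClient()).query('BTC/1Min/OHLCV'))
print(result)  # {'tbk': 'BTC/1Min/OHLCV', 'rows': 3}
```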
Tyler Goodlet fa899c3979 Generate tick data correctly using .etime 2020-08-01 16:52:51 -04:00
Tyler Goodlet e2dab3977e Support new normalized ticks format with kraken
Generate tick datums in a list under a `ticks` field in each quote
kinda like how IB does it.
2020-07-31 00:11:17 -04:00
Tyler Goodlet bcd17d0bb6 Also log the payload 2020-07-31 00:10:47 -04:00
Tyler Goodlet e49417a4b8 Add normalization step for ticks
Start a draft normalization format for (sampled) tick data.
Ideally we move toward the dense tick format (DTF) enforced by
tectonicdb, but for now let's just get a dict of something simple
going: `{'type': 'trade', 'price': <price>}` kind of thing. This
gets us started being able to real-time chart from all data feed
back-ends. Oh, and hack in support for XAUUSD..and get subactor
logging working.
2020-07-31 00:03:17 -04:00
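The draft normalization step can be sketched as a function that folds a broker-specific trade payload into a list of simple tick dicts under a `ticks` key; the kraken-ish input layout here is an assumption:

```python
# Sketch of tick normalization: emit ticks roughly of the shape
# `{'type': 'trade', 'price': <price>, 'size': <size>}` in a list under
# a `ticks` field, regardless of the broker's native payload format.
def normalize(symbol, trades):
    return {
        'symbol': symbol,
        'ticks': [
            {'type': 'trade', 'price': float(p), 'size': float(s)}
            for p, s, _time in trades
        ],
    }

quote = normalize('XAUUSD', [('1940.10', '0.5', 1596400000.1)])
print(quote['ticks'])  # [{'type': 'trade', 'price': 1940.1, 'size': 0.5}]
```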
Tyler Goodlet 307bc87738 Fix typo 2020-07-28 14:45:18 -04:00
Tyler Goodlet 80b656e2ab Add startup logic to handle market closure 2020-07-28 14:44:32 -04:00
Tyler Goodlet b16bc9b42d Define "packetizer" in specific broker mod
Allows for formatting published quotes using a broker specific
formatting callback.
2020-07-28 14:27:51 -04:00
Tyler Goodlet 12655f87fd Support the `stream_quotes()` api in questrade backend 2020-07-20 16:58:40 -04:00
Tyler Goodlet a59497c949 Always just look up the current plot on mouse handling 2020-07-17 10:43:03 -04:00
Tyler Goodlet b97286d7f5 Allow for dynamically added plots
Add `ChartPlotWidget.add_plot()` to add sub charts for indicators which
can be updated independently. Clean up rt bar update code and drop some
legacy ohlc loading cruft.
2020-07-17 09:06:20 -04:00
Tyler Goodlet a1032a0cd7 Massively simplify the cross-hair monstrosity
Stop with all this "main chart" special treatment.
Manage all lines in the same way across all referenced plots.
Add `CrossHair.add_plot()` for adding new plots dynamically.

Just, smh.
2020-07-16 21:54:24 -04:00
Tyler Goodlet d6bd964fac Raise errors, fix module script entry 2020-07-15 13:26:48 -04:00
Tyler Goodlet 5513d48c11 Use array of names for lookup 2020-07-15 10:59:29 -04:00
Tyler Goodlet ce1c98463c Change name to qtractor 2020-07-15 09:55:09 -04:00
Tyler Goodlet 7ea6f25993 Pass broker name 2020-07-15 09:54:47 -04:00
Tyler Goodlet d431ec20a8 Override annoying stuff in ib_insync 2020-07-15 09:54:24 -04:00
Tyler Goodlet c6b4b62228 Standardize ohlc dtype 2020-07-15 08:42:01 -04:00
Tyler Goodlet 4a1bcf7626 Fix import error 2020-07-15 08:41:29 -04:00
Tyler Goodlet 8fa569787d Port to new streaming api, yield whole tickers 2020-07-15 08:40:20 -04:00
Tyler Goodlet 50f903d7c5 Handle overloaded arg 2020-07-15 08:28:50 -04:00
Tyler Goodlet d0a9afbb36 Port to new data apis 2020-07-15 08:28:13 -04:00
Tyler Goodlet b05a205d1b Start enforcing a common stream setup api
Add routines for brokerd spawning and quote stream creation.
2020-07-15 08:22:09 -04:00
Tyler Goodlet f9dcb9a984 Add kraken to backend list 2020-07-15 08:20:29 -04:00
Tyler Goodlet 7395b22e3d Add historical bars retrieval 2020-07-15 08:20:03 -04:00
Tyler Goodlet cf8a5d04ce Store lines graphics in struct array to simplify indexing 2020-07-09 08:37:30 -04:00
Tyler Goodlet 93fed5ec28 Drop kivy stuff from docs 2020-07-08 15:42:51 -04:00
Tyler Goodlet 5d0f4bf112 Deps bump 2020-07-08 15:42:32 -04:00
Tyler Goodlet 9dc3bdf273 Add WIP real-time 5s bar charting 2020-07-08 15:42:05 -04:00
Tyler Goodlet ee4b3a327c Always convert to posix time 2020-07-08 15:41:14 -04:00
Tyler Goodlet ac5e9de1b3 Make run_qtrio invoke tractor at top level 2020-07-08 15:40:35 -04:00
Tyler Goodlet 7bc49eac9f Move bar generation into func; support bar appends
There's really nothing coupling it to the graphics class (which frankly
also seems like it doesn't need to be a class.. Qt).

Add support to `.update_from_array()` for diffing with the input array
and creating additional bar-lines where necessary. Note, there are still
issues with the "correctness" here in terms of bucketing open/close
values in the time frame / bar range. Also, this jamming of each bar's 3
lines into a homogeneous array seems like it could be better done with
struct arrays and avoid all this "index + 3" stuff.
2020-07-08 15:06:39 -04:00
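The append-diffing step described above can be sketched as: compare the incoming array length against what is already drawn and create graphics only for the new bars. Plain lists stand in for the numpy array and the per-bar line objects, and all names are illustrative:

```python
# Sketch of diffing in an `update_from_array()`-style method: only the
# appended slice of the input array gets new bar-line graphics.
class Bars:
    def __init__(self):
        self.drawn = 0
        self.lines = []

    def update_from_array(self, array):
        new = array[self.drawn:]  # only the bars appended since last draw
        for bar in new:
            self.lines.append(('hl-line', bar))  # stub graphics object
        self.drawn = len(array)
        return len(new)

bars = Bars()
assert bars.update_from_array([1, 2, 3]) == 3    # initial draw
assert bars.update_from_array([1, 2, 3, 4]) == 1  # one appended bar
print(len(bars.lines))  # 4
```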
Tyler Goodlet 4aa526f400 Use structure array indexing syntax 2020-07-08 14:56:45 -04:00
Tyler Goodlet db76443389 Handle flat bar updates
Flat bars have a rendering issue we work around by hacking values in `QLineF`
but we have to revert those on any last bar that is being updated in
real-time. Comment out candle implementations for now; we can get back
to it if/when the tinas unite. Oh, and make bars have a little space
between them.
2020-07-07 10:44:55 -04:00
Tyler Goodlet 52e258fe83 Docs the ui pkg mod 2020-07-07 10:39:22 -04:00
Tyler Goodlet 27a20c8535 Add better contract search/lookup
Add a `Client.find_contract()` which internally takes
a <symbol>.<exchange> str as input and uses `IB.qualifyContractsAsync()`
internally to try and validate the most likely contract. Make the module
script call this using `asyncio.run()` for console testing.
2020-07-07 10:33:47 -04:00
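The `<symbol>.<exchange>` lookup convention can be sketched as a small parser; the default-exchange fallback name is an assumption for illustration:

```python
# Sketch of parsing a "<symbol>.<exchange>" contract lookup string:
# split on the last dot and fall back to a default exchange when no
# exchange suffix is given.
def parse_contract_key(key, default_exchange='SMART'):
    symbol, sep, exchange = key.rpartition('.')
    if not sep:  # no dot: the whole key is the symbol
        return key.upper(), default_exchange
    return symbol.upper(), exchange.upper()

print(parse_contract_key('tsla.nasdaq'))  # ('TSLA', 'NASDAQ')
print(parse_contract_key('spy'))          # ('SPY', 'SMART')
```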
Tyler Goodlet 4fbb41a978 Convert to stream, parse into dataclass 2020-07-05 11:43:58 -04:00
Tyler Goodlet a852292563 Start kraken backend 2020-07-04 18:59:02 -04:00
Tyler Goodlet 36303f0770 Fix a bunch of scrolling / panning logic
Don't allow zooming to less than a min number of data points. Allow
panning "outside" the data set (i.e. moving one of the sequence "ends"
to the middle of the view). Start adding logging.
2020-07-04 17:48:31 -04:00
Tyler Goodlet a4658ac990 Add ib 2020-07-03 19:09:57 -04:00
Tyler Goodlet 13caf821fe Use msgpack-numpy 2020-07-03 18:32:40 -04:00
Tyler Goodlet ccaedfae3f Handle high = low bars
For whatever reason if the `QLineF` high/low values are the same a weird
little rectangle is drawn (my guess is a `float` precision error of some
sort). Instead, if they're the same just use one of the values.
Also, store local vars to avoid so many lookups.
2020-07-03 18:08:03 -04:00
Tyler Goodlet 4ca4ced6e8 Make search work with ib backend 2020-07-02 16:02:58 -04:00
Tyler Goodlet f216d1f922 Add a mostly actor aware API to IB backend
Infected `asyncio` support is being added to `tractor` in
goodboy/tractor#121 so delegate to all that new machinery.

Start building out an "actor-aware" api which takes care of all the
`trio`-`asyncio` interaction for data streaming and request handling.
Add a little (shudder) method proxy system which can be used to invoke
client methods from another actor. Start on a streaming api in
preparation for real-time charting.
2020-07-02 12:54:34 -04:00
Tyler Goodlet 72a3149dc7 Allow passing in tbk keys to query 2020-06-24 14:23:37 -04:00
Tyler Goodlet 56132d1fcc Use new method name 2020-06-24 14:13:56 -04:00
Tyler Goodlet dc9dbf4385 Add search command to cli 2020-06-24 14:13:36 -04:00
Tyler Goodlet 4d6f529d66 Add symbol search to broker api 2020-06-24 14:13:00 -04:00
Tyler Goodlet 1b31fcca57 Add stocks search to qt client 2020-06-24 14:12:38 -04:00
Tyler Goodlet f768e6d91e Add initial IB broker backend using ib_insync
Start working towards meeting the backend client api.
Infect `asyncio` using `trio`'s new guest mode and demonstrate
real-time ticker streaming to console.
2020-06-24 13:17:29 -04:00
Tyler Goodlet 4e6d1b8bd1 Rework charting internals for real-time plotting
`pg.PlotCurveItem.setData()` is normally used for real-time updates to
curves and takes in a whole new array of data to render.
It makes sense to stick with this interface especially if
the current datum graphic will originally be drawn from tick quotes and
later filled in when bars data is available (eg. IB has this option in
TWS charts for volume). Additionally, having a data feed api where the push
process/task can write to shared memory and the UI task(s) can read from
that space is ideal. It allows for indicator and algo calculations to be
run in parallel (via actors) with initial price draw instructions
such that plotting of downstream metrics can be "pipelined" into the
chart UI's render loop. This essentially makes the chart UI async
programmable from multiple remote processes (or at least that's the
goal).

Some details:
- Only store a single ref to the source array data on the
  `LinkedSplitCharts`.  There should only be one reference since the main
  relation is **that** x-time aligned sequence.
- Add `LinkedSplitCharts.update_from_quote()` which takes in a quote
  dict and updates the OHLC array from its contents.
- Add `ChartPlotWidget.update_from_array()` method to trigger graphics
  updates per chart with consideration for overlay curves.
2020-06-19 08:01:10 -04:00
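The `update_from_quote()` idea above can be sketched as folding the latest trade ticks into the last OHLC row; a plain dict stands in for the shared numpy record array and all names are illustrative:

```python
# Sketch of quote -> OHLC folding: each trade tick bumps the last bar's
# high/low envelope and sets its close.
def update_last_bar(bar, quote):
    for tick in quote.get('ticks', []):
        if tick['type'] != 'trade':
            continue
        price = tick['price']
        bar['high'] = max(bar['high'], price)
        bar['low'] = min(bar['low'], price)
        bar['close'] = price
    return bar

bar = {'open': 100.0, 'high': 100.5, 'low': 99.8, 'close': 100.2}
update_last_bar(bar, {'ticks': [{'type': 'trade', 'price': 101.1}]})
print(bar)  # {'open': 100.0, 'high': 101.1, 'low': 99.8, 'close': 101.1}
```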
Tyler Goodlet b328457f3b Support updating bars graphics from array
This makes an OHLC graphics "sequence" update very similar (actually API
compatible) with `pg.PlotCurveItem.setData()`. The difference here is
that only the latest OHLC datum is used to update the chart's last bar.
2020-06-19 08:00:04 -04:00
Tyler Goodlet b14cb66d7c Drop disk caching of quotes 2020-06-17 22:59:24 -04:00
Tyler Goodlet bae7ea1ac1 Revert weird bad .time access 2020-06-17 22:58:54 -04:00
Tyler Goodlet 9a397c54ac Factor signalling api into new module 2020-06-17 22:56:27 -04:00
Tyler Goodlet 48f266e276 Rip out all usage of `quantdom.bases.Quotes` smh. 2020-06-17 20:45:47 -04:00
Tyler Goodlet feccadc331 Add a sane pandas.DataFrame to recarray converter 2020-06-17 19:22:37 -04:00
Tyler Goodlet cdb70d25f3 Move all Qt components into top level ui module 2020-06-17 19:20:54 -04:00
Tyler Goodlet ea93e96d88 Move all kivy ui components to subpackage 2020-06-17 14:51:29 -04:00
Tyler Goodlet 56f65bcd40 Cleanup yrange auto-update callback
This was a mess before with a weird loop using the parent split charts
to update all "indicators". Instead just have each plot do its own
yrange updates since the signals are being handled just fine per plot.
Handle both the OHLC and plain line chart cases with a hacky
`try:`/`except IndexError:` for now.

Oh, and move the main entry point for the chart app to the relevant
module. I added some WIP bar update code for the moment.
2020-06-17 11:45:43 -04:00
Tyler Goodlet 970a528264 Add zeroed ohlc array constructor 2020-06-17 11:44:54 -04:00
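A zeroed constructor over a standardized structured OHLC dtype might look like this; the exact field set is an assumption, not necessarily piker's actual dtype:

```python
import numpy as np

# Sketch of a zeroed OHLC array constructor over a structured dtype so
# every consumer indexes bars by field name rather than column number.
ohlc_dtype = np.dtype([
    ('index', 'i8'),
    ('time', 'f8'),
    ('open', 'f8'),
    ('high', 'f8'),
    ('low', 'f8'),
    ('close', 'f8'),
    ('volume', 'f8'),
])

def zeroed_ohlc(length):
    return np.zeros(length, dtype=ohlc_dtype)

bars = zeroed_ohlc(1000)
print(bars.shape, bars[0]['close'])  # (1000,) 0.0
```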
Tyler Goodlet 482c4ff87f Add update method for last bars graphic 2020-06-17 09:29:18 -04:00
Tyler Goodlet a92b53d2c1 Use a single array for all lines
Speed up the lines array creation using proper slice assignment.
This gives another 10% speedup to the historical price rendering.
Drop ``_tina_mode`` support for now since we're not testing it.
2020-06-16 14:24:24 -04:00
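The slice-assignment speedup mentioned above amounts to filling one pre-allocated array column-wise instead of appending per-bar values in a Python loop; this is a generic numpy sketch, not piker's actual lines layout:

```python
import numpy as np

# Sketch of building all bars' high-low line endpoints in one
# vectorized pass: one (N, 4) array of segments x0, y0, x1, y1.
highs = np.array([2.0, 3.0, 2.5])
lows = np.array([1.0, 1.5, 2.0])
index = np.arange(len(highs), dtype=float)

lines = np.empty((len(highs), 4))
lines[:, 0] = index   # x0
lines[:, 1] = lows    # y0
lines[:, 2] = index   # x1
lines[:, 3] = highs   # y1
print(lines[1])  # [1.  1.5 1.  3. ]
```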
Tyler Goodlet 11a7530d09 Render plots from provided input sequence(s)
Previously graphics were loaded and rendered implicitly during the
import and creation of certain objects. Remove all this and instead
expect client code to pass in the OHLC sequence to plot. Speed up
the bars graphics rendering by simplifying to a single iteration of
the input array; gives about a 2x speedup.
2020-06-16 13:44:53 -04:00
Tyler Goodlet d102537ca8 Add symbol-info command 2020-06-16 11:55:37 -04:00
Tyler Goodlet d10f80865e Add ui package mod 2020-06-16 11:52:16 -04:00
Tyler Goodlet 16ecd1ffe3 Don't scroll right after max zoom 2020-06-16 11:52:16 -04:00
Tyler Goodlet ef214226a2 Factor components into more suitably named modules 2020-06-16 11:52:16 -04:00
Tyler Goodlet 44984272be Move drawing and resize behavior into chart widget 2020-06-16 11:52:16 -04:00
Tyler Goodlet ea234f4472 Start grouping interactions into a ``ViewBox``
Move chart resize code into our ``ViewBox`` subtype (a ``ChartView``)
in an effort to start organizing interaction behaviour closer to the
appropriate underlying objects. Add some docs for all this and do some
renaming.
2020-06-16 11:52:16 -04:00
Tyler Goodlet 68266f5a20 Lol I guess we probably need this 2020-06-16 11:52:16 -04:00
Tyler Goodlet 856c6f2fd8 Factor common chart configuration 2020-06-16 11:52:16 -04:00
Tyler Goodlet 6bac250db5 Add scrolling from right and cross-hair
Modify the default ``ViewBox`` scroll to zoom behaviour such that
whatever right-most point is visible is used as the "center" for
zooming. Add a "traditional" cross-hair cursor.
2020-06-16 11:52:16 -04:00
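The right-anchored zoom described above reduces to scaling the visible x-range about its right edge so the newest bar stays put; a minimal sketch of that arithmetic (names are illustrative):

```python
# Sketch of right-anchored zooming: shrink or grow the visible x-range
# about its right edge, keeping the right-most (newest) point fixed.
def zoom_about_right(left, right, factor):
    return right - (right - left) * factor, right

print(zoom_about_right(0.0, 100.0, 0.5))  # (50.0, 100.0) - zoomed in 2x
print(zoom_about_right(0.0, 100.0, 2.0))  # (-100.0, 100.0) - zoomed out 2x
```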
Tyler Goodlet 9d4a432757 Styling, start re-org, commenting
- Move out equity plotting to new module.
- Make axis margins and fonts look good on i3.
- Adjust axis labels colors to gray.
- Start commenting a lot of the code after figuring out what it all does
  when cross referencing with ``pyqtgraph``.
- Add option to move date axis to middle.
2020-06-16 11:52:16 -04:00
Tyler Goodlet 49949ae6d5 Who needs it ;P 2020-06-16 11:52:16 -04:00
Tyler Goodlet 0d06dbbefa Add piker chart command 2020-06-16 11:52:16 -04:00
Tyler Goodlet 489e8c226f Move UI spawning cmds to new module 2020-06-16 11:52:16 -04:00
Tyler Goodlet 42aa2bce5b Add charting components from `Quantdom`
Hand select necessary components to get real-time charting with
`pyqtgraph` from the `Quantdom` projects:
https://github.com/constverum/Quantdom

We've offered to collaborate with the author but have received no
response and the project has not been updated in over a year.
Given this, we are moving forward with taking the required components to
make further improvements upon especially since the `pyqtgraph` project
is now being actively maintained again.

If the author comes back we will be more than happy to contribute
modified components upstream:
https://github.com/constverum/Quantdom/issues/18

Relates to #80
2020-06-16 11:52:16 -04:00
Tyler Goodlet b7f306b715 Add initial Qt-trio integration
Use the new "guest mode" available on trio master branch.  Add
entrypoint for `pyqtgraph` based charting based on the `Quantdom`
project.
2020-06-16 11:52:16 -04:00
Tyler Goodlet 0fa515d0e3 Use qt5 and trio guest mode 2020-06-16 11:52:16 -04:00
Tyler Goodlet 98deb4759e Use darkstyle pkg 2020-06-16 11:52:16 -04:00
Tyler Goodlet aa711cdee8 Blind stab at a basic chart 2020-06-16 11:52:16 -04:00
Tyler Goodlet 784f75fff6 Add glue link in readme 2020-06-16 11:52:16 -04:00
Tyler Goodlet 0111bf1a60 Update quote cache on each loop 2020-06-16 11:52:16 -04:00
Tyler Goodlet 9173f22f3b Add tbk tick streaming with trio-websocket 2020-06-16 11:52:16 -04:00
Tyler Goodlet c47df8811b Fix assignment out of order 2020-06-16 11:52:16 -04:00
Tyler Goodlet b2ac571147 Make monitor handle non-full quote messages 2020-06-16 11:52:16 -04:00
Tyler Goodlet 3042d1eec6 Push only new key value pairs over quote streams
This is something I've been meaning to try for a while and will likely
make writing tick data to a db more straightforward (filling in NaN
values is more matter-of-fact) plus it should minimize bandwidth usage.
Note, it'll require stream consumers to be considerate of non-full
quotes arriving and thus using the first "full" quote message to fill
out dynamically formatted systems or displays.
2020-06-16 11:52:16 -04:00
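The differential publishing described above can be sketched as comparing each new quote against the last one sent and pushing only changed key/value pairs; consumers then use the first full quote to seed their state:

```python
# Sketch of quote diffing: emit only the key/value pairs that changed
# since the last published quote, minimizing bandwidth per message.
def diff_quote(last, new):
    return {k: v for k, v in new.items() if last.get(k) != v}

last = {'symbol': 'AAPL', 'last': 100.0, 'bid': 99.9, 'ask': 100.1}
new = {'symbol': 'AAPL', 'last': 100.2, 'bid': 99.9, 'ask': 100.1}
print(diff_quote(last, new))  # {'last': 100.2}
```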
Tyler Goodlet 2f2ff7cded Make stock quote formatter work with diff streams 2020-06-16 11:52:16 -04:00
Tyler Goodlet b47803457e Update version and deps 2020-06-16 11:52:14 -04:00
Tyler Goodlet 63b7c3687c Add support for TICK ingest to marketstore 2020-06-16 11:51:45 -04:00
45 changed files with 5715 additions and 1012 deletions

Pipfile (22 lines deleted)
@@ -1,22 +0,0 @@
[[source]]
url = "https://pypi.python.org/simple"
verify_ssl = true
name = "pypi"
[packages]
e1839a8 = {path = ".",editable = true}
trio = "*"
Cython = "*"
# use master branch kivy since wheels seem borked (due to cython stuff)
kivy = {git = "git://github.com/kivy/kivy.git"}
pdbpp = "*"
msgpack = "*"
tractor = {git = "git://github.com/goodboy/tractor.git"}
toml = "*"
pyqtgraph = "*"
pyside2 = "*"
[dev-packages]
pytest = "*"
pdbpp = "*"
piker = {editable = true,path = "."}

Pipfile.lock (generated, 644 lines deleted)
@@ -1,644 +0,0 @@
(generated lock-file hash listing omitted)
"sha256:7c766c4160636a238e0e4430e2f40b504b13bcc4951902eb78cd5c971f26c898",
"sha256:81fa9b288c6c4b4c91220fcca2002eadb48fc5c3238e8bd88e982e00ffa77c53",
"sha256:ca08a3c95b1b20ac2b243b7b06379609bd73929dbc27b28c01415feffe3bcea1",
"sha256:e2f72b5cfdb8b48bdb55bda4b42ec7d36d1bce0be73d6d7d4a358225d6fb5f25",
"sha256:e6543506cb353d417961b9ec3c6fc726ec2f72eeab609dc88943c2e5cb6d6408"
],
"version": "==5.13.2"
},
"six": {
"hashes": [
"sha256:236bdbdce46e6e6a3d61a337c0f8b763ca1e8717c03b369e87a7ec7ce1319c0a",
"sha256:8f3cd2e254d8f793e7f3d6d9df77b92252b52637291d0f0da013c76ea2724b6c"
],
"version": "==1.14.0"
},
"sniffio": {
"hashes": [
"sha256:20ed6d5b46f8ae136d00b9dcb807615d83ed82ceea6b2058cecb696765246da5",
"sha256:8e3810100f69fe0edd463d02ad407112542a11ffdc29f67db2bf3771afb87a21"
],
"version": "==1.1.0"
},
"sortedcontainers": {
"hashes": [
"sha256:974e9a32f56b17c1bac2aebd9dcf197f3eb9cd30553c5852a3187ad162e1a03a",
"sha256:d9e96492dd51fae31e60837736b38fe42a187b5404c16606ff7ee7cd582d4c60"
],
"version": "==2.1.0"
},
"toml": {
"hashes": [
"sha256:229f81c57791a41d65e399fc06bf0848bab550a9dfd5ed66df18ce5f05e73d5c",
"sha256:235682dd292d5899d361a811df37e04a8828a5b1da3115886b73cf81ebc9100e"
],
"index": "pypi",
"version": "==0.10.0"
},
"tractor": {
"git": "git://github.com/goodboy/tractor.git",
"ref": "ab349cdb8d8cbf2e3d48c0589cb710a43483f233"
},
"trio": {
"hashes": [
"sha256:a6d83c0cb4a177ec0f5179ce88e27914d5c8e6fd01c4285176b949e6ddc88c6c",
"sha256:f1cf00054ad974c86d9b7afa187a65d79fd5995340abe01e8e4784d86f4acb30"
],
"index": "pypi",
"version": "==0.13.0"
},
"wmctrl": {
"hashes": [
"sha256:d806f65ac1554366b6e31d29d7be2e8893996c0acbb2824bbf2b1f49cf628a13"
],
"version": "==0.3"
}
},
"develop": {
"anyio": {
"hashes": [
"sha256:2b758634cf1adc3589937d91f6736b3696d091c1485a10c9cc3f1bd5e6d96813",
"sha256:78db97333af17381cadd2cc0dbd3d4e0258e4932368eab8895cb65a269db054e"
],
"version": "==1.2.3"
},
"asks": {
"hashes": [
"sha256:d7289cb5b7a28614e4fecab63b3734e2a4296d3c323e315f8dc4b546d64f71b7"
],
"version": "==2.3.6"
},
"async-generator": {
"hashes": [
"sha256:01c7bf666359b4967d2cda0000cc2e4af16a0ae098cbffcb8472fb9e8ad6585b",
"sha256:6ebb3d106c12920aaae42ccb6f787ef5eefdcdd166ea3d628fa8476abe712144"
],
"version": "==1.10"
},
"atomicwrites": {
"hashes": [
"sha256:03472c30eb2c5d1ba9227e4c2ca66ab8287fbfbbda3888aa93dc2e28fc6811b4",
"sha256:75a9445bac02d8d058d5e1fe689654ba5a6556a1dfd8ce6ec55a0ed79866cfa6"
],
"version": "==1.3.0"
},
"attrs": {
"hashes": [
"sha256:08a96c641c3a74e44eb59afb61a24f2cb9f4d7188748e76ba4bb5edfa3cb7d1c",
"sha256:f7b7ce16570fe9965acd6d30101a28f62fb4a7f9e926b3bbc9b61f8b04247e72"
],
"version": "==19.3.0"
},
"click": {
"hashes": [
"sha256:8a18b4ea89d8820c5d0c7da8a64b2c324b4dabb695804dbfea19b9be9d88c0cc",
"sha256:e345d143d80bf5ee7534056164e5e112ea5e22716bbb1ce727941f4c8b471b9a"
],
"version": "==7.1.1"
},
"colorlog": {
"hashes": [
"sha256:30aaef5ab2a1873dec5da38fd6ba568fa761c9fa10b40241027fa3edea47f3d2",
"sha256:732c191ebbe9a353ec160d043d02c64ddef9028de8caae4cfa8bd49b6afed53e"
],
"version": "==4.1.0"
},
"cython": {
"hashes": [
"sha256:03f6bbb380ad0acb744fb06e42996ea217e9d00016ca0ff6f2e7d60f580d0360",
"sha256:05e8cfd3a3a6087aec49a1ae08a89171db991956209406d1e5576f9db70ece52",
"sha256:05eb79efc8029d487251c8a2702a909a8ba33c332e06d2f3980866541bd81253",
"sha256:094d28a34c3fa992ae02aea1edbe6ff89b3cc5870b6ee38b5baeb805dc57b013",
"sha256:0c70e842e52e2f50cc43bad43b5e5bc515f30821a374e544abb0e0746f2350ff",
"sha256:1dcdaa319558eb924294a554dcf6c12383ec947acc7e779e8d3622409a7f7d28",
"sha256:1fc5bdda28f25fec44e4721677458aa509d743cd350862270309d61aa148d6ff",
"sha256:280573a01d9348d44a42d6a9c651d9f7eb1fe9217df72555b2a118f902996a10",
"sha256:298ceca7b0f0da4205fcb0b7c9ac9e120e2dafffd5019ba1618e84ef89434b5a",
"sha256:4074a8bff0040035673cc6dd365a762476d6bff4d03d8ce6904e3e53f9a25dc8",
"sha256:41e7068e95fbf9ec94b41437f989caf9674135e770a39cdb9c00de459bafd1bc",
"sha256:47e5e1502d52ef03387cf9d3b3241007961a84a466e58a3b74028e1dd4957f8c",
"sha256:521340844cf388d109ceb61397f3fd5250ccb622a1a8e93559e8de76c80940a9",
"sha256:6c53338c1811f8c6d7f8cb7abd874810b15045e719e8207f957035c9177b4213",
"sha256:75c2dda47dcc3c77449712b1417bb6b89ec3b7b02e18c64262494dceffdf455e",
"sha256:773c5a98e463b52f7e8197254b39b703a5ea1972aef3a94b3b921515d77dd041",
"sha256:78c3068dcba300d473fef57cdf523e34b37de522f5a494ef9ee1ac9b4b8bbe3f",
"sha256:7bc18fc5a170f2c1cef5387a3d997c28942918bbee0f700e73fd2178ee8d474d",
"sha256:7f89eff20e4a7a64b55210dac17aea711ed8a3f2e78f2ff784c0e984302583dd",
"sha256:89458b49976b1dee5d89ab4ac943da3717b4292bf624367e862e4ee172fcce99",
"sha256:986f871c0fa649b293061236b93782d25c293a8dd8117c7ba05f8a61bdc261ae",
"sha256:a0f495a4fe5278aab278feee35e6102efecde5176a8a74dd28c28e3fc5c8d7c7",
"sha256:a14aa436586c41633339415de82a41164691d02d3e661038da533be5d40794a5",
"sha256:b8ab3ab38afc47d8f4fe629b836243544351cef681b6bdb1dc869028d6fdcbfb",
"sha256:bb487881608ebd293592553c618f0c83316f4f13a64cb18605b1d2fb9fd3da3e",
"sha256:c0b24bfe3431b3cb7ced323bca813dbd13aca973a1475b512d3331fd0de8ec60",
"sha256:c7894c06205166d360ab2915ae306d1f7403e9ce3d3aaeff4095eaf98e42ce66",
"sha256:d4039bb7f234ad32267c55e72fd49fb56078ea102f9d9d8559f6ec34d4887630",
"sha256:e4d6bb8703d0319eb04b7319b12ea41580df44fd84d83ccda13ea463c6801414",
"sha256:e8fab9911fd2fa8e5af407057cb8bdf87762f983cba483fa3234be20a9a0af77",
"sha256:f3818e578e687cdb21dc4aa4a3bc6278c656c9c393e9eda14dd04943f478863d",
"sha256:fe666645493d72712c46e4fbe8bec094b06aec3c337400479e9704439c9d9586"
],
"index": "pypi",
"version": "==0.29.14"
},
"fancycompleter": {
"hashes": [
"sha256:09e0feb8ae242abdfd7ef2ba55069a46f011814a80fe5476be48f51b00247272",
"sha256:dd076bca7d9d524cc7f25ec8f35ef95388ffef9ef46def4d3d25e9b044ad7080"
],
"version": "==0.9.1"
},
"h11": {
"hashes": [
"sha256:33d4bca7be0fa039f4e84d50ab00531047e53d6ee8ffbc83501ea602c169cae1",
"sha256:4bc6d6a1238b7615b266ada57e0618568066f57dd6fa967d1290ec9309b2f2f1"
],
"version": "==0.9.0"
},
"idna": {
"hashes": [
"sha256:7588d1c14ae4c77d74036e8c22ff447b26d0fde8f007354fd48a7814db15b7cb",
"sha256:a068a21ceac8a4d63dbfd964670474107f541babbd2250d61922f029858365fa"
],
"version": "==2.9"
},
"more-itertools": {
"hashes": [
"sha256:5dd8bcf33e5f9513ffa06d5ad33d78f31e1931ac9a18f33d37e77a180d393a7c",
"sha256:b1ddb932186d8a6ac451e1d95844b382f55e12686d51ca0c68b6f61f2ab7a507"
],
"version": "==8.2.0"
},
"msgpack": {
"hashes": [
"sha256:0cc7ca04e575ba34fea7cfcd76039f55def570e6950e4155a4174368142c8e1b",
"sha256:187794cd1eb73acccd528247e3565f6760bd842d7dc299241f830024a7dd5610",
"sha256:1904b7cb65342d0998b75908304a03cb004c63ef31e16c8c43fee6b989d7f0d7",
"sha256:229a0ccdc39e9b6c6d1033cd8aecd9c296823b6c87f0de3943c59b8bc7c64bee",
"sha256:24149a75643aeaa81ece4259084d11b792308a6cf74e796cbb35def94c89a25a",
"sha256:30b88c47e0cdb6062daed88ca283b0d84fa0d2ad6c273aa0788152a1c643e408",
"sha256:32fea0ea3cd1ef820286863a6202dcfd62a539b8ec3edcbdff76068a8c2cc6ce",
"sha256:355f7fd0f90134229eaeefaee3cf42e0afc8518e8f3cd4b25f541a7104dcb8f9",
"sha256:4abdb88a9b67e64810fb54b0c24a1fd76b12297b4f7a1467d85a14dd8367191a",
"sha256:757bd71a9b89e4f1db0622af4436d403e742506dbea978eba566815dc65ec895",
"sha256:76df51492bc6fa6cc8b65d09efdb67cbba3cbfe55004c3afc81352af92b4a43c",
"sha256:774f5edc3475917cd95fe593e625d23d8580f9b48b570d8853d06cac171cd170",
"sha256:8a3ada8401736df2bf497f65589293a86c56e197a80ae7634ec2c3150a2f5082",
"sha256:a06efd0482a1942aad209a6c18321b5e22d64eb531ea20af138b28172d8f35ba",
"sha256:b24afc52e18dccc8c175de07c1d680bdf315844566f4952b5bedb908894bec79",
"sha256:b8b4bd3dafc7b92608ae5462add1c8cc881851c2d4f5d8977fdea5b081d17f21",
"sha256:c6e5024fc0cdf7f83b6624850309ddd7e06c48a75fa0d1c5173de4d93300eb19",
"sha256:db7ff14abc73577b0bcbcf73ecff97d3580ecaa0fc8724babce21fdf3fe08ef6",
"sha256:dedf54d72d9e7b6d043c244c8213fe2b8bbfe66874b9a65b39c4cc892dd99dd4",
"sha256:ea3c2f859346fcd55fc46e96885301d9c2f7a36d453f5d8f2967840efa1e1830",
"sha256:f0f47bafe9c9b8ed03e19a100a743662dd8c6d0135e684feea720a0d0046d116"
],
"index": "pypi",
"version": "==0.6.2"
},
"numpy": {
"hashes": [
"sha256:1786a08236f2c92ae0e70423c45e1e62788ed33028f94ca99c4df03f5be6b3c6",
"sha256:17aa7a81fe7599a10f2b7d95856dc5cf84a4eefa45bc96123cbbc3ebc568994e",
"sha256:20b26aaa5b3da029942cdcce719b363dbe58696ad182aff0e5dcb1687ec946dc",
"sha256:2d75908ab3ced4223ccba595b48e538afa5ecc37405923d1fea6906d7c3a50bc",
"sha256:39d2c685af15d3ce682c99ce5925cc66efc824652e10990d2462dfe9b8918c6a",
"sha256:56bc8ded6fcd9adea90f65377438f9fea8c05fcf7c5ba766bef258d0da1554aa",
"sha256:590355aeade1a2eaba17617c19edccb7db8d78760175256e3cf94590a1a964f3",
"sha256:70a840a26f4e61defa7bdf811d7498a284ced303dfbc35acb7be12a39b2aa121",
"sha256:77c3bfe65d8560487052ad55c6998a04b654c2fbc36d546aef2b2e511e760971",
"sha256:9537eecf179f566fd1c160a2e912ca0b8e02d773af0a7a1120ad4f7507cd0d26",
"sha256:9acdf933c1fd263c513a2df3dceecea6f3ff4419d80bf238510976bf9bcb26cd",
"sha256:ae0975f42ab1f28364dcda3dde3cf6c1ddab3e1d4b2909da0cb0191fa9ca0480",
"sha256:b3af02ecc999c8003e538e60c89a2b37646b39b688d4e44d7373e11c2debabec",
"sha256:b6ff59cee96b454516e47e7721098e6ceebef435e3e21ac2d6c3b8b02628eb77",
"sha256:b765ed3930b92812aa698a455847141869ef755a87e099fddd4ccf9d81fffb57",
"sha256:c98c5ffd7d41611407a1103ae11c8b634ad6a43606eca3e2a5a269e5d6e8eb07",
"sha256:cf7eb6b1025d3e169989416b1adcd676624c2dbed9e3bcb7137f51bfc8cc2572",
"sha256:d92350c22b150c1cae7ebb0ee8b5670cc84848f6359cf6b5d8f86617098a9b73",
"sha256:e422c3152921cece8b6a2fb6b0b4d73b6579bd20ae075e7d15143e711f3ca2ca",
"sha256:e840f552a509e3380b0f0ec977e8124d0dc34dc0e68289ca28f4d7c1d0d79474",
"sha256:f3d0a94ad151870978fb93538e95411c83899c9dc63e6fb65542f769568ecfa5"
],
"version": "==1.18.1"
},
"outcome": {
"hashes": [
"sha256:ee46c5ce42780cde85d55a61819d0e6b8cb490f1dbd749ba75ff2629771dcd2d",
"sha256:fc7822068ba7dd0fc2532743611e8a73246708d3564e29a39f93d6ab3701b66f"
],
"version": "==1.0.1"
},
"packaging": {
"hashes": [
"sha256:3c292b474fda1671ec57d46d739d072bfd495a4f51ad01a055121d81e952b7a3",
"sha256:82f77b9bee21c1bafbf35a84905d604d5d1223801d639cf3ed140bd651c08752"
],
"version": "==20.3"
},
"pandas": {
"hashes": [
"sha256:23e177d43e4bf68950b0f8788b6a2fef2f478f4ec94883acb627b9264522a98a",
"sha256:2530aea4fe46e8df7829c3f05e0a0f821c893885d53cb8ac9b89cc67c143448c",
"sha256:303827f0bb40ff610fbada5b12d50014811efcc37aaf6ef03202dc3054bfdda1",
"sha256:3b019e3ea9f5d0cfee0efabae2cfd3976874e90bcc3e97b29600e5a9b345ae3d",
"sha256:3c07765308f091d81b6735d4f2242bb43c332cc3461cae60543df6b10967fe27",
"sha256:5036d4009012a44aa3e50173e482b664c1fae36decd277c49e453463798eca4e",
"sha256:6f38969e2325056f9959efbe06c27aa2e94dd35382265ad0703681d993036052",
"sha256:74a470d349d52b9d00a2ba192ae1ee22155bb0a300fd1ccb2961006c3fa98ed3",
"sha256:7d77034e402165b947f43050a8a415aa3205abfed38d127ea66e57a2b7b5a9e0",
"sha256:7f9a509f6f11fa8b9313002ebdf6f690a7aa1dd91efd95d90185371a0d68220e",
"sha256:942b5d04762feb0e55b2ad97ce2b254a0ffdd344b56493b04a627266e24f2d82",
"sha256:a9fbe41663416bb70ed05f4e16c5f377519c0dc292ba9aa45f5356e37df03a38",
"sha256:d10e83866b48c0cdb83281f786564e2a2b51a7ae7b8a950c3442ad3c9e36b48c",
"sha256:e2140e1bbf9c46db9936ee70f4be6584d15ff8dc3dfff1da022d71227d53bad3"
],
"version": "==1.0.1"
},
"pdbpp": {
"hashes": [
"sha256:73ff220d5006e0ecdc3e2705d8328d8aa5ac27fef95cc06f6e42cd7d22d55eb8"
],
"index": "pypi",
"version": "==0.10.2"
},
"piker": {
"editable": true,
"path": "."
},
"pluggy": {
"hashes": [
"sha256:15b2acde666561e1298d71b523007ed7364de07029219b604cf808bfa1c765b0",
"sha256:966c145cd83c96502c3c3868f50408687b38434af77734af1e9ca461a4081d2d"
],
"version": "==0.13.1"
},
"py": {
"hashes": [
"sha256:5e27081401262157467ad6e7f851b7aa402c5852dbcb3dae06768434de5752aa",
"sha256:c20fdd83a5dbc0af9efd622bee9a5564e278f6380fffcacc43ba6f43db2813b0"
],
"version": "==1.8.1"
},
"pygments": {
"hashes": [
"sha256:647344a061c249a3b74e230c739f434d7ea4d8b1d5f3721bc0f3558049b38f44",
"sha256:ff7a40b4860b727ab48fad6360eb351cc1b33cbf9b15a0f689ca5353e9463324"
],
"version": "==2.6.1"
},
"pyparsing": {
"hashes": [
"sha256:4c830582a84fb022400b85429791bc551f1f4871c33f23e44f353119e92f969f",
"sha256:c342dccb5250c08d45fd6f8b4a559613ca603b57498511740e65cd11a2e7dcec"
],
"version": "==2.4.6"
},
"pyrepl": {
"hashes": [
"sha256:292570f34b5502e871bbb966d639474f2b57fbfcd3373c2d6a2f3d56e681a775"
],
"version": "==0.9.0"
},
"pytest": {
"hashes": [
"sha256:8e256fe71eb74e14a4d20a5987bb5e1488f0511ee800680aaedc62b9358714e8",
"sha256:ff0090819f669aaa0284d0f4aad1a6d9d67a6efdc6dd4eb4ac56b704f890a0d6"
],
"index": "pypi",
"version": "==5.2.4"
},
"python-dateutil": {
"hashes": [
"sha256:73ebfe9dbf22e832286dafa60473e4cd239f8592f699aa5adaf10050e6e1823c",
"sha256:75bb3f31ea686f1197762692a9ee6a7550b59fc6ca3a1f4b5d7e32fb98e2da2a"
],
"version": "==2.8.1"
},
"pytz": {
"hashes": [
"sha256:1c557d7d0e871de1f5ccd5833f60fb2550652da6be2693c1e02300743d21500d",
"sha256:b02c06db6cf09c12dd25137e563b31700d3b80fcc4ad23abb7a315f2789819be"
],
"version": "==2019.3"
},
"six": {
"hashes": [
"sha256:236bdbdce46e6e6a3d61a337c0f8b763ca1e8717c03b369e87a7ec7ce1319c0a",
"sha256:8f3cd2e254d8f793e7f3d6d9df77b92252b52637291d0f0da013c76ea2724b6c"
],
"version": "==1.14.0"
},
"sniffio": {
"hashes": [
"sha256:20ed6d5b46f8ae136d00b9dcb807615d83ed82ceea6b2058cecb696765246da5",
"sha256:8e3810100f69fe0edd463d02ad407112542a11ffdc29f67db2bf3771afb87a21"
],
"version": "==1.1.0"
},
"sortedcontainers": {
"hashes": [
"sha256:974e9a32f56b17c1bac2aebd9dcf197f3eb9cd30553c5852a3187ad162e1a03a",
"sha256:d9e96492dd51fae31e60837736b38fe42a187b5404c16606ff7ee7cd582d4c60"
],
"version": "==2.1.0"
},
"trio": {
"hashes": [
"sha256:a6d83c0cb4a177ec0f5179ce88e27914d5c8e6fd01c4285176b949e6ddc88c6c",
"sha256:f1cf00054ad974c86d9b7afa187a65d79fd5995340abe01e8e4784d86f4acb30"
],
"index": "pypi",
"version": "==0.13.0"
},
"wcwidth": {
"hashes": [
"sha256:8fd29383f539be45b20bd4df0dc29c20ba48654a41e661925e612311e9f3c603",
"sha256:f28b3e8a6483e5d49e7f8949ac1a78314e740333ae305b4ba5defd3e74fb37a8"
],
"version": "==0.1.8"
},
"wmctrl": {
"hashes": [
"sha256:d806f65ac1554366b6e31d29d7be2e8893996c0acbb2824bbf2b1f49cf628a13"
],
"version": "==0.3"
}
}
}

View File

@@ -9,12 +9,14 @@ trading and financial analysis targetted at hardcore Linux users.
 It tries to use as much bleeding edge tech as possible including (but not limited to):
-- Python 3.7+ for glue and business logic
-- trio_ and asyncio_ for async
-- tractor_ as the underlying actor model
+- Python 3.7+ for glue_ and business logic
+- trio_ for structured concurrency
+- tractor_ for distributed, multi-core, real-time streaming
 - marketstore_ for historical and real-time tick data persistence and sharing
 - techtonicdb_ for L2 book storage
 - Qt_ for pristine high performance UIs
+- pyqtgraph_ for real-time charting
+- ``numpy`` for `fast numerics`_

 .. |travis| image:: https://img.shields.io/travis/pikers/piker/master.svg
     :target: https://travis-ci.org/pikers/piker
@@ -23,6 +25,8 @@ It tries to use as much bleeding edge tech as possible including (but not limite
 .. _marketstore: https://github.com/alpacahq/marketstore
 .. _techtonicdb: https://github.com/0b01/tectonicdb
 .. _Qt: https://www.qt.io/
+.. _glue: https://numpy.org/doc/stable/user/c-info.python-as-glue.html#using-python-as-glue
+.. _fast numerics: https://zerowithdot.com/python-numpy-and-pandas-performance/

 Focus and Features:
@@ -64,27 +68,7 @@ Install
 be cloned from this repo and hacked on directly.

 A couple bleeding edge components are being used atm pertaining to
-async ports of libraries for use with `trio`_.
+new components within `trio`_.

-Before installing make sure you have `pipenv`_ and have installed
-``python3.7`` as well as `kivy source build`_ dependencies
-since currently there's reliance on an async development branch.
-
-`kivy` dependencies
-===================
-On Archlinux you need the following dependencies::
-
-    pacman -S python-docutils gstreamer sdl2_ttf sdl2_mixer sdl2_image xclip
-
-To manually install the async branch of ``kivy`` from github do (though
-this should be done as part of the ``pipenv install`` below)::
-
-    pipenv install -e 'git+git://github.com/matham/kivy.git@async-loop#egg=kivy'
-
-.. _kivy source build:
-    https://kivy.org/docs/installation/installation-linux.html#installation-in-a-virtual-environment

 For a development install::
@@ -96,34 +80,36 @@ For a development install::

 Broker Support
 **************
-For live data feeds the only fully functional broker at the moment is Questrade_.
-Eventual support is in the works for `IB`, `TD Ameritrade` and `IEX`.
+For live data feeds the set of supported brokers is:
+
+- Questrade_ which comes with effectively free L1
+- IB_ via ib_insync
+- Webull_ via the reverse engineered public API
+- Kraken_ for crypto over their public websocket API

 If you want your broker supported and they have an API let us know.

 .. _Questrade: https://www.questrade.com/api/documentation
+.. _IB: https://interactivebrokers.github.io/tws-api/index.html
+.. _Webull: https://www.kraken.com/features/api#public-market-data
+.. _Kraken: https://www.kraken.com/features/api#public-market-data

-Play with some UIs
-******************
+Check out some charts
+*********************
+Bet you weren't expecting this from the foss::

-To start the real-time index monitor with the `questrade` backend::
+   piker chart spy.arca

-   piker -l info monitor indexes
-
-If you want to see super granular price changes, increase the
-broker quote query ``rate`` with ``-r``::
-
-   piker monitor indexes -r 10
-
-It is also possible to run the broker data feed micro service as a daemon::
+It is also possible to run a specific broker's data feed as a top
+level micro-service daemon::

-   pikerd -l info
+   pikerd -l info -b ib

 Then start the client app as normal::

-   piker monitor indexes
+   piker chart -b ib ES.GLOBEX

 .. _pipenv: https://docs.pipenv.org/

View File

@@ -1,5 +1,5 @@
 # piker: trading gear for hackers.
-# Copyright 2018 Tyler Goodlet
+# Copyright 2018-forever Tyler Goodlet and p0

 # This program is free software: you can redistribute it and/or modify
 # it under the terms of the GNU Affero General Public License as published by
@@ -16,4 +16,26 @@
 """
 piker: trading gear for hackers.
+
+If your boss told you to checkout this code base, stop, right the fuck now.
+Tell your boss this code base is absolute shit and you'll never be able to
+make it "work" due to scaling and that Python is too slow. If you even for
+a second suggest to them that it's useful you'll be hearing from us.
+
+If you're a yabf (yet another backtesting framework) author/maintainer, also stop.
+We don't slurp callback soup; your code can be called from any FSP routine and
+``piker`` requires no adoption of your "framework". If you want to provide an
+integration, cool, but we aren't going to adopt any oudated concurrency models.
+
+This code is for **you** as a person to defend yourself from our messed
+up finacial system and the charlatans and phonies who run it.
+``piker`` will never be for sale nor influenced by suits of any type.
+Join us, let's make the best software linux has ever seen.
 """
+import msgpack  # noqa
+import msgpack_numpy
+
+# patch msgpack for numpy arrays
+msgpack_numpy.patch()

View File

@@ -11,6 +11,8 @@ asks.init('trio')
 __brokers__ = [
     'questrade',
     'robinhood',
+    'ib',
+    'kraken',
 ]

View File

@@ -0,0 +1,53 @@
+"""
+Actor-aware broker agnostic interface.
+"""
+from contextlib import asynccontextmanager, AsyncExitStack
+
+import trio
+import tractor
+
+from . import get_brokermod
+from ..log import get_logger
+
+
+log = get_logger(__name__)
+
+
+@asynccontextmanager
+async def get_cached_client(
+    brokername: str,
+    *args,
+    **kwargs,
+) -> 'Client':  # noqa
+    """Get a cached broker client from the current actor's local vars.
+
+    If one has not been setup do it and cache it.
+    """
+    # check if a cached client is in the local actor's statespace
+    ss = tractor.current_actor().statespace
+    clients = ss.setdefault('clients', {'_lock': trio.Lock()})
+    lock = clients['_lock']
+    client = None
+    try:
+        log.info(f"Loading existing `{brokername}` daemon")
+        async with lock:
+            client = clients[brokername]
+            client._consumers += 1
+        yield client
+    except KeyError:
+        log.info(f"Creating new client for broker {brokername}")
+        async with lock:
+            brokermod = get_brokermod(brokername)
+            exit_stack = AsyncExitStack()
+            client = await exit_stack.enter_async_context(
+                brokermod.get_client()
+            )
+            client._consumers = 0
+            client._exit_stack = exit_stack
+            clients[brokername] = client
+        yield client
+    finally:
+        client._consumers -= 1
+        if client._consumers <= 0:
+            # teardown the client
+            await client._exit_stack.aclose()

View File

@@ -14,7 +14,7 @@ import tractor
 from ..cli import cli
 from .. import watchlists as wl
 from ..log import get_console_log, colorize_json, get_logger
-from ..brokers.core import maybe_spawn_brokerd_as_subactor
+from ..data import maybe_spawn_brokerd
 from ..brokers import core, get_brokermod, data

 log = get_logger('cli')
@@ -99,7 +99,7 @@ def quote(config, tickers, df_output):
 @cli.command()
 @click.option('--df-output', '-df', flag_value=True,
               help='Output in `pandas.DataFrame` format')
-@click.option('--count', '-c', default=100,
+@click.option('--count', '-c', default=1000,
               help='Number of bars to retrieve')
 @click.argument('symbol', required=True)
 @click.pass_obj
@@ -117,10 +117,11 @@ def bars(config, symbol, count, df_output):
             brokermod,
             symbol,
             count=count,
+            as_np=df_output
         )
     )
-    if not bars:
+    if not len(bars):
         log.error(f"No quotes could be found for {symbol}?")
         return
@@ -130,50 +131,6 @@ def bars(config, symbol, count, df_output):
     click.echo(colorize_json(bars))

-@cli.command()
-@click.option('--tl', is_flag=True, help='Enable tractor logging')
-@click.option('--rate', '-r', default=3, help='Quote rate limit')
-@click.option('--test', '-t', help='Test quote stream file')
-@click.option('--dhost', '-dh', default='127.0.0.1',
-              help='Daemon host address to connect to')
-@click.argument('name', nargs=1, required=True)
-@click.pass_obj
-def monitor(config, rate, name, dhost, test, tl):
-    """Start a real-time watchlist UI
-    """
-    # global opts
-    brokermod = config['brokermod']
-    loglevel = config['loglevel']
-    log = config['log']
-
-    watchlist_from_file = wl.ensure_watchlists(_watchlists_data_path)
-    watchlists = wl.merge_watchlist(watchlist_from_file, wl._builtins)
-    tickers = watchlists[name]
-    if not tickers:
-        log.error(f"No symbols found for watchlist `{name}`?")
-        return
-
-    from ..ui.monitor import _async_main
-
-    async def main(tries):
-        async with maybe_spawn_brokerd_as_subactor(
-            tries=tries, loglevel=loglevel
-        ) as portal:
-            # run app "main"
-            await _async_main(
-                name, portal, tickers,
-                brokermod, rate, test=test,
-            )
-
-    tractor.run(
-        partial(main, tries=1),
-        name='monitor',
-        loglevel=loglevel if tl else None,
-        rpc_module_paths=['piker.ui.monitor'],
-        start_method='forkserver',
-    )

 @cli.command()
 @click.option('--rate', '-r', default=5, help='Logging level')
 @click.option('--filename', '-f', default='quotestream.jsonstream',
@@ -198,7 +155,7 @@ def record(config, rate, name, dhost, filename):
         return

     async def main(tries):
-        async with maybe_spawn_brokerd_as_subactor(
+        async with maybe_spawn_brokerd(
             tries=tries, loglevel=loglevel
         ) as portal:
             # run app "main"
@@ -271,37 +228,44 @@ def optsquote(config, symbol, df_output, date):
 @cli.command()
-@click.option('--tl', is_flag=True, help='Enable tractor logging')
-@click.option('--date', '-d', help='Contracts expiry date')
-@click.option('--test', '-t', help='Test quote stream file')
-@click.option('--rate', '-r', default=1, help='Logging level')
-@click.argument('symbol', required=True)
+@click.argument('tickers', nargs=-1, required=True)
 @click.pass_obj
-def optschain(config, symbol, date, tl, rate, test):
-    """Start an option chain UI
+def symbol_info(config, tickers):
+    """Print symbol quotes to the console
     """
     # global opts
-    loglevel = config['loglevel']
-    brokername = config['broker']
-
-    from ..ui.option_chain import _async_main
-
-    async def main(tries):
-        async with maybe_spawn_brokerd_as_subactor(
-            tries=tries, loglevel=loglevel
-        ):
-            # run app "main"
-            await _async_main(
-                symbol,
-                brokername,
-                rate=rate,
-                loglevel=loglevel,
-                test=test,
-            )
-
-    tractor.run(
-        partial(main, tries=1),
-        name='kivy-options-chain',
-        loglevel=loglevel if tl else None,
-        start_method='forkserver',
-    )
+    brokermod = config['brokermod']
+
+    quotes = trio.run(partial(core.symbol_info, brokermod, tickers))
+    if not quotes:
+        log.error(f"No quotes could be found for {tickers}?")
+        return
+
+    if len(quotes) < len(tickers):
+        syms = tuple(map(itemgetter('symbol'), quotes))
+        for ticker in tickers:
+            if ticker not in syms:
+                brokermod.log.warn(f"Could not find symbol {ticker}?")
+
+    click.echo(colorize_json(quotes))
+
+
+@cli.command()
+@click.argument('pattern', required=True)
+@click.pass_obj
+def search(config, pattern):
+    """Search for symbols from broker backend(s).
+    """
+    # global opts
+    brokermod = config['brokermod']
+
+    quotes = tractor.run(
+        partial(core.symbol_search, brokermod, pattern),
+        start_method='forkserver',
+        loglevel='info',
+    )
+    if not quotes:
+        log.error(f"No matches could be found for {pattern}?")
+        return
+
+    click.echo(colorize_json(quotes))

View File

@ -1,23 +1,18 @@
""" """
Broker high level API layer. Broker high level cross-process API layer.
This API should be kept "remote service compatible" meaning inputs to
routines should be primitive data types where possible.
""" """
import inspect import inspect
from types import ModuleType from types import ModuleType
from typing import List, Dict, Any, Optional from typing import List, Dict, Any, Optional
from async_generator import asynccontextmanager
import tractor
from ..log import get_logger from ..log import get_logger
from .data import DataFeed
from . import get_brokermod from . import get_brokermod
log = get_logger('broker.core') log = get_logger(__name__)
_data_mods = [
'piker.brokers.core',
'piker.brokers.data',
]
async def api(brokername: str, methname: str, **kwargs) -> dict: async def api(brokername: str, methname: str, **kwargs) -> dict:
@ -25,12 +20,11 @@ async def api(brokername: str, methname: str, **kwargs) -> dict:
""" """
brokermod = get_brokermod(brokername) brokermod = get_brokermod(brokername)
async with brokermod.get_client() as client: async with brokermod.get_client() as client:
meth = getattr(client, methname, None)
meth = getattr(client.api, methname, None)
if meth is None: if meth is None:
log.warning( log.debug(
f"Couldn't find API method {methname} looking up on client") f"Couldn't find API method {methname} looking up on client")
meth = getattr(client, methname, None) meth = getattr(client.api, methname, None)
if meth is None: if meth is None:
log.error(f"No api method `{methname}` could be found?") log.error(f"No api method `{methname}` could be found?")
@ -48,24 +42,6 @@ async def api(brokername: str, methname: str, **kwargs) -> dict:
return await meth(**kwargs) return await meth(**kwargs)
@asynccontextmanager
async def maybe_spawn_brokerd_as_subactor(sleep=0.5, tries=10, loglevel=None):
"""If no ``brokerd`` daemon-actor can be found spawn one in a
local subactor.
"""
async with tractor.open_nursery() as nursery:
async with tractor.find_actor('brokerd') as portal:
if not portal:
log.info(
"No broker daemon could be found, spawning brokerd..")
portal = await nursery.start_actor(
'brokerd',
rpc_module_paths=_data_mods,
loglevel=loglevel,
)
yield portal
async def stocks_quote( async def stocks_quote(
brokermod: ModuleType, brokermod: ModuleType,
tickers: List[str] tickers: List[str]
@ -121,3 +97,26 @@ async def bars(
""" """
async with brokermod.get_client() as client: async with brokermod.get_client() as client:
return await client.bars(symbol, **kwargs) return await client.bars(symbol, **kwargs)
async def symbol_info(
brokermod: ModuleType,
symbol: str,
**kwargs,
) -> Dict[str, Dict[str, Dict[str, Any]]]:
"""Return symbol info from broker.
"""
async with brokermod.get_client() as client:
return await client.symbol_info(symbol, **kwargs)
async def symbol_search(
brokermod: ModuleType,
pattern: str,
**kwargs,
) -> Dict[str, Dict[str, Dict[str, Any]]]:
"""Return symbol info from broker.
"""
async with brokermod.get_client() as client:
# TODO: support multiple asset type concurrent searches.
return await client.search_stocks(pattern=pattern, **kwargs)

View File

@@ -12,7 +12,7 @@ import typing
from typing import (
Coroutine, Callable, Dict,
List, Any, Tuple, AsyncGenerator,
-Sequence,
+Sequence
)
import contextlib
from operator import itemgetter
@@ -25,7 +25,7 @@ from ..log import get_logger, get_console_log
from . import get_brokermod
-log = get_logger('broker.data')
+log = get_logger(__name__)
async def wait_for_network(
@@ -47,7 +47,7 @@ async def wait_for_network(
continue
except socket.gaierror:
if not down:  # only report/log network down once
-log.warn(f"Network is down waiting for re-establishment...")
+log.warn("Network is down waiting for re-establishment...")
down = True
await trio.sleep(sleep)
@@ -80,24 +80,30 @@ class BrokerFeed:
@tractor.msg.pub(tasks=['stock', 'option'])
-async def stream_requests(
+async def stream_poll_requests(
-get_topics: typing.Callable,
+get_topics: Callable,
get_quotes: Coroutine,
-feed: BrokerFeed,
+normalizer: Callable,
rate: int = 3,  # delay between quote requests
-diff_cached: bool = True,  # only deliver "new" quotes to the queue
) -> None:
"""Stream requests for quotes for a set of symbols at the given
``rate`` (per second).
+This routine is built for brokers who support quote polling for multiple
+symbols per request. The ``get_topics()`` func is called to retrieve the
+set of symbols each iteration and ``get_quotes()`` is called to retrieve
+the quotes.
A stock-broker client ``get_quotes()`` async function must be
provided which returns an async quote retrieval function.
-"""
-broker_limit = getattr(feed.mod, '_rate_limit', float('inf'))
-if broker_limit < rate:
-rate = broker_limit
-log.warn(f"Limiting {feed.mod.__name__} query rate to {rate}/sec")
+.. note::
+This code is mostly tailored (for now) to the questrade backend.
+It is currently the only broker that doesn't support streaming without
+paying for data. See the note in the diffing section regarding volume
+differentials which needs to be addressed in order to get cross-broker
+support.
+"""
sleeptime = round(1. / rate, 3)
_cache = {}  # ticker to quote caching
@@ -123,28 +129,15 @@ async def stream_poll_requests(
quotes = await wait_for_network(request_quotes)
new_quotes = {}
-if diff_cached:
-# If cache is enabled then only deliver "new" changes.
-# Useful for polling setups but obviously should be
-# disabled if you're rx-ing per-tick data.
-for quote in quotes:
-symbol = quote['symbol']
-last = _cache.setdefault(symbol, {})
-new = set(quote.items()) - set(last.items())
-if new:
-log.info(
-f"New quote {quote['symbol']}:\n{new}")
-_cache[symbol] = quote
-# XXX: we append to a list for the options case where the
-# subscription topic (key) is the same for all
-# expiries even though this is uncessary for the
-# stock case (different topic [i.e. symbol] for each
-# quote).
-new_quotes.setdefault(quote['key'], []).append(quote)
-else:
-# log.debug(f"Delivering quotes:\n{quotes}")
-for quote in quotes:
-new_quotes.setdefault(quote['key'], []).append(quote)
+normalized = normalizer(quotes, _cache)
+for symbol, quote in normalized.items():
+# XXX: we append to a list for the options case where the
+# subscription topic (key) is the same for all
+# expiries even though this is unnecessary for the
+# stock case (different topic [i.e. symbol] for each
+# quote).
+new_quotes.setdefault(quote['key'], []).append(quote)
if new_quotes:
yield new_quotes
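The cache-diffing behavior being factored out of this loop (and into per-broker ``normalizer`` funcs) can be sketched as a standalone helper. This is a minimal illustration of the field-level diffing idea, not the project's actual normalizer API; ``diff_quotes`` is a hypothetical name:

```python
def diff_quotes(quotes: list, cache: dict) -> dict:
    """Return only the quotes whose fields changed since the last poll.

    ``cache`` maps symbol -> last seen quote dict and is mutated in place,
    mirroring the ``_cache`` used by the polling loop above.
    """
    new = {}
    for quote in quotes:
        symbol = quote['symbol']
        last = cache.setdefault(symbol, {})
        # any (key, value) pair not present in the cached quote counts
        # as fresh data worth delivering to subscribers
        if set(quote.items()) - set(last.items()):
            cache[symbol] = quote
            new[symbol] = quote
    return new
```

Repeated polls that return identical payloads then yield nothing downstream, which is exactly why a pure polling feed wants this filter while a per-tick feed does not.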
@@ -168,54 +161,7 @@ async def symbol_data(broker: str, tickers: List[str]):
"""Retrieve baseline symbol info from broker.
"""
async with get_cached_feed(broker) as feed:
-return await feed.client.symbol_data(tickers)
+return await feed.client.symbol_info(tickers)
async def smoke_quote(get_quotes, tickers, broker):
"""Do an initial "smoke" request for symbols in ``tickers`` filtering
out any symbols not supported by the broker queried in the call to
``get_quotes()``.
"""
# TODO: trim out with #37
#################################################
# get a single quote filtering out any bad tickers
# NOTE: this code is always run for every new client
# subscription even when a broker quoter task is already running
# since the new client needs to know what symbols are accepted
log.warn(f"Retrieving smoke quote for symbols {tickers}")
quotes = await get_quotes(tickers)
# report any tickers that aren't returned in the first quote
invalid_tickers = set(tickers) - set(map(itemgetter('key'), quotes))
for symbol in invalid_tickers:
tickers.remove(symbol)
log.warn(
f"Symbol `{symbol}` not found by broker `{broker}`"
)
# pop any tickers that return "empty" quotes
payload = {}
for quote in quotes:
symbol = quote['symbol']
if quote is None:
log.warn(
f"Symbol `{symbol}` not found by broker"
f" `{broker}`")
# XXX: note this mutates the input list (for now)
tickers.remove(symbol)
continue
# report any unknown/invalid symbols (QT specific)
if quote.get('low52w', False) is None:
log.error(
f"{symbol} seems to be defunct")
payload[symbol] = quote
return payload
# end of section to be trimmed out with #37
###########################################
@asynccontextmanager
@@ -260,7 +206,6 @@ async def start_quote_stream(
broker: str,
symbols: List[Any],
feed_type: str = 'stock',
-diff_cached: bool = True,
rate: int = 3,
) -> None:
"""Handle per-broker quote stream subscriptions using a "lazy" pub-sub
@@ -279,8 +224,6 @@ async def start_quote_stream(
f"{ctx.chan.uid} subscribed to {broker} for symbols {symbols}")
# another actor task may have already created it
async with get_cached_feed(broker) as feed:
-# function to format packets delivered to subscribers
-packetizer = None
if feed_type == 'stock':
get_quotes = feed.quoters.setdefault(
@@ -289,7 +232,8 @@ async def start_quote_stream(
)
# do a smoke quote (note this mutates the input list and filters
# out bad symbols for now)
-payload = await smoke_quote(get_quotes, symbols, broker)
+first_quotes = await feed.mod.smoke_quote(get_quotes, symbols)
+formatter = feed.mod.format_stock_quote
elif feed_type == 'option':
# FIXME: yeah we need maybe a more general way to specify
@@ -300,29 +244,40 @@ async def start_quote_stream(
await feed.mod.option_quoter(feed.client, symbols)
)
# packetize
-payload = {
+first_quotes = {
quote['symbol']: quote
for quote in await get_quotes(symbols)
}
+formatter = feed.mod.format_option_quote
-def packetizer(topic, quotes):
-return {quote['symbol']: quote for quote in quotes}
+sd = await feed.client.symbol_info(symbols)
+feed.mod._symbol_info_cache.update(sd)
+normalize = partial(
+feed.mod.normalize,
+formatter=formatter,
+)
+# pre-process first set of quotes
+payload = {}
+for sym, quote in first_quotes.items():
+fquote, _ = formatter(quote, sd)
+assert fquote['displayable']
+payload[sym] = fquote
# push initial smoke quote response for client initialization
await ctx.send_yield(payload)
-await stream_requests(
+await stream_poll_requests(
# ``msg.pub`` required kwargs
task_name=feed_type,
ctx=ctx,
topics=symbols,
-packetizer=packetizer,
+packetizer=feed.mod.packetizer,
# actual func args
-feed=feed,
get_quotes=get_quotes,
-diff_cached=diff_cached,
+normalizer=normalize,
rate=rate,
)
log.info(
@@ -357,7 +312,6 @@ class DataFeed:
symbols: Sequence[str],
feed_type: str,
rate: int = 1,
-diff_cached: bool = True,
test: str = '',
) -> (AsyncGenerator, dict):
if feed_type not in self._allowed:
@@ -378,15 +332,19 @@ class DataFeed:
# subscribe for tickers (this performs a possible filtering
# where invalid symbols are discarded)
sd = await self.portal.run(
-"piker.brokers.data", 'symbol_data',
-broker=self.brokermod.name, tickers=symbols)
+"piker.brokers.data",
+'symbol_data',
+broker=self.brokermod.name,
+tickers=symbols
+)
self._symbol_data_cache.update(sd)
if test:
# stream from a local test file
quote_gen = await self.portal.run(
-"piker.brokers.data", 'stream_from_file',
-filename=test
+"piker.brokers.data",
+'stream_from_file',
+filename=test,
)
else:
log.info(f"Starting new stream for {symbols}")
@@ -397,7 +355,6 @@ class DataFeed:
broker=self.brokermod.name,
symbols=symbols,
feed_type=feed_type,
-diff_cached=diff_cached,
rate=rate,
)

piker/brokers/ib.py 100644

@@ -0,0 +1,553 @@
"""
Interactive Brokers API backend.
Note the client runs under an ``asyncio`` loop (since ``ib_insync`` is
built on it) and thus actor aware API calls must be spawned with
``infected_aio==True``.
"""
from contextlib import asynccontextmanager
from dataclasses import asdict
from functools import partial
from typing import List, Dict, Any, Tuple, Optional, AsyncGenerator, Callable
import asyncio
import logging
import inspect
import itertools
import time
from async_generator import aclosing
from ib_insync.contract import Contract, ContractDetails
from ib_insync.ticker import Ticker
import ib_insync as ibis
from ib_insync.wrapper import Wrapper
from ib_insync.client import Client as ib_Client
import trio
import tractor
from ..log import get_logger, get_console_log
from ..data import maybe_spawn_brokerd
from ..ui._source import from_df
log = get_logger(__name__)
# passed to ``tractor.ActorNursery.start_actor()``
_spawn_kwargs = {
'infect_asyncio': True,
}
_time_units = {
's': ' sec',
'm': ' mins',
'h': ' hours',
}
_time_frames = {
'1s': '1 Sec',
'5s': '5 Sec',
'30s': '30 Sec',
'1m': 'OneMinute',
'2m': 'TwoMinutes',
'3m': 'ThreeMinutes',
'4m': 'FourMinutes',
'5m': 'FiveMinutes',
'10m': 'TenMinutes',
'15m': 'FifteenMinutes',
'20m': 'TwentyMinutes',
'30m': 'HalfHour',
'1h': 'OneHour',
'2h': 'TwoHours',
'4h': 'FourHours',
'D': 'OneDay',
'W': 'OneWeek',
'M': 'OneMonth',
'Y': 'OneYear',
}
# overrides to sidestep pretty questionable design decisions in
# ``ib_insync``:
class NonShittyWrapper(Wrapper):
def tcpDataArrived(self):
"""Override time stamps to be floats for now.
"""
# use a float to store epoch time instead of datetime
self.lastTime = time.time()
for ticker in self.pendingTickers:
ticker.rtTime = None
ticker.ticks = []
ticker.tickByTicks = []
ticker.domTicks = []
self.pendingTickers = set()
class NonShittyIB(ibis.IB):
"""The beginning of overriding quite a few questionable decisions
in this lib.
- Don't use datetimes
- Don't use named tuples
"""
def __init__(self):
self._createEvents()
# XXX: just to override this wrapper
self.wrapper = NonShittyWrapper(self)
self.client = ib_Client(self.wrapper)
self.errorEvent += self._onError
self.client.apiEnd += self.disconnectedEvent
self._logger = logging.getLogger('ib_insync.ib')
# map of symbols to contract ids
_adhoc_cmdty_data_map = {
# https://misc.interactivebrokers.com/cstools/contract_info/v3.10/index.php?action=Conid%20Info&wlId=IB&conid=69067924
# NOTE: cmdtys don't have trade data:
# https://groups.io/g/twsapi/message/44174
'XAUUSD': ({'conId': 69067924}, {'whatToShow': 'MIDPOINT'}),
}
class Client:
"""IB wrapped for our broker backend API.
Note: this client requires running inside an ``asyncio`` loop.
"""
def __init__(
self,
ib: ibis.IB,
) -> None:
self.ib = ib
async def bars(
self,
symbol: str,
# EST in ISO 8601 format is required... below is EPOCH
start_date: str = "1970-01-01T00:00:00.000000-05:00",
time_frame: str = '1m',
count: int = int(2e3), # <- max allowed per query
is_paid_feed: bool = False,
) -> List[Dict[str, Any]]:
"""Retrieve OHLCV bars for a symbol over a range to the present.
"""
bars_kwargs = {'whatToShow': 'TRADES'}
contract = await self.find_contract(symbol)
bars_kwargs.update(getattr(contract, 'bars_kwargs', {}))
# _min = min(2000*100, count)
bars = await self.ib.reqHistoricalDataAsync(
contract,
endDateTime='',
# durationStr='60 S',
# durationStr='1 D',
# time length calcs
durationStr='{count} S'.format(count=1000 * 5),
barSizeSetting='5 secs',
# always use extended hours
useRTH=False,
# restricted per contract type
**bars_kwargs,
# whatToShow='MIDPOINT',
# whatToShow='TRADES',
)
if not bars:
# TODO: raise underlying error here
raise ValueError(f"No bars retrieved for {symbol}?")
# convert to pandas dataframe:
df = ibis.util.df(bars)
return from_df(df)
async def search_stocks(
self,
pattern: str,
# how many contracts to search "up to"
upto: int = 3,
asdicts: bool = True,
) -> Dict[str, ContractDetails]:
"""Search for stocks matching provided ``str`` pattern.
Return a dictionary of ``upto`` entries worth of contract details.
"""
descriptions = await self.ib.reqMatchingSymbolsAsync(pattern)
futs = []
for d in descriptions:
con = d.contract
futs.append(self.ib.reqContractDetailsAsync(con))
# batch request all details
results = await asyncio.gather(*futs)
# XXX: if there is more than one entry in the details list
details = {}
for details_set in results:
# then the contract is so called "ambiguous".
for d in details_set:
con = d.contract
unique_sym = f'{con.symbol}.{con.primaryExchange}'
details[unique_sym] = asdict(d) if asdicts else d
if len(details) == upto:
return details
return details
async def search_futes(
self,
pattern: str,
# how many contracts to search "up to"
upto: int = 3,
asdicts: bool = True,
) -> Dict[str, ContractDetails]:
raise NotImplementedError
async def get_cont_fute(
self,
symbol: str,
exchange: str,
) -> Contract:
"""Get an unqualified contract for the current "continuous" future.
"""
contcon = ibis.ContFuture(symbol, exchange=exchange)
frontcon = (await self.ib.qualifyContractsAsync(contcon))[0]
return ibis.Future(conId=frontcon.conId)
async def find_contract(
self,
symbol,
currency: str = 'USD',
**kwargs,
) -> Contract:
# use heuristics to figure out contract "type"
sym, exch = symbol.upper().split('.')
# futes
if exch in ('GLOBEX', 'NYMEX', 'CME', 'CMECRYPTO'):
con = await self.get_cont_fute(symbol=sym, exchange=exch)
# commodities
elif exch == 'CMDTY': # eg. XAUUSD.CMDTY
con_kwargs, bars_kwargs = _adhoc_cmdty_data_map[sym]
con = ibis.Commodity(**con_kwargs)
con.bars_kwargs = bars_kwargs
# stonks
else:
# TODO: metadata system for all these exchange rules..
if exch in ('PURE', 'TSE'): # non-yankee
currency = 'CAD'
con = ibis.Stock(symbol=sym, exchange=exch, currency=currency)
try:
exch = 'SMART' if not exch else exch
contract = (await self.ib.qualifyContractsAsync(con))[0]
except IndexError:
raise ValueError(f"No contract could be found {con}")
return contract
async def stream_ticker(
self,
symbol: str,
to_trio,
opts: Tuple[int] = ('233', '375'),
# opts: Tuple[int] = ('459',),
) -> None:
"""Stream a ticker using the std L1 api.
"""
contract = await self.find_contract(symbol)
ticker: Ticker = self.ib.reqMktData(contract, ','.join(opts))
def push(t):
log.debug(t)
try:
to_trio.send_nowait(t)
except trio.BrokenResourceError:
# XXX: eventkit's ``Event.emit()`` for whatever redic
# reason will catch and ignore regular exceptions
# resulting in tracebacks spammed to console..
# Manually do the dereg ourselves.
ticker.updateEvent.disconnect(push)
log.error(f"Disconnected stream for `{symbol}`")
self.ib.cancelMktData(contract)
ticker.updateEvent.connect(push)
# let the engine run and stream
await self.ib.disconnectedEvent
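The ``push()`` callback above shovels synchronous ``ib_insync`` ticker events into a channel the async consumer can iterate. The same callback-to-stream bridge can be sketched with stdlib ``asyncio`` alone (the project itself uses a ``trio`` memory channel); ``make_push_stream`` is a hypothetical helper:

```python
import asyncio

def make_push_stream():
    """Bridge a callback-based event source into an async iterator.

    The returned ``push`` is called synchronously by the source and never
    blocks; the returned async generator drains the queue on the loop.
    """
    queue: asyncio.Queue = asyncio.Queue()

    def push(event):
        # safe from callback context: enqueue without awaiting
        queue.put_nowait(event)

    async def stream():
        while True:
            event = await queue.get()
            if event is None:  # sentinel marks stream end/disconnect
                break
            yield event

    return push, stream()

async def demo():
    push, stream = make_push_stream()
    for tick in ('a', 'b'):
        push(tick)
    push(None)
    return [t async for t in stream]
```

The sentinel-on-disconnect pattern plays the role that deregistering ``push`` from ``ticker.updateEvent`` plays in the real client.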
# default config ports
_tws_port: int = 7497
_gw_port: int = 4002
_try_ports = [_tws_port, _gw_port]
@asynccontextmanager
async def _aio_get_client(
host: str = '127.0.0.1',
port: int = None,
client_id: Optional[int] = None,
) -> Client:
"""Return an ``ib_insync.IB`` instance wrapped in our client API.
"""
if client_id is None:
# if this is a persistent brokerd, try to allocate a new id for
# each client
try:
ss = tractor.current_actor().statespace
client_id = next(ss.setdefault('client_ids', itertools.count()))
# TODO: in case the arbiter has no record
# of existing brokerd we need to broadcast for one.
except RuntimeError:
# tractor likely isn't running
client_id = 1
ib = NonShittyIB()
ports = _try_ports if port is None else [port]
_err = None
for port in ports:
try:
await ib.connectAsync(host, port, clientId=client_id)
break
except ConnectionRefusedError as ce:
_err = ce
log.warning(f'Failed to connect on {port}')
else:
raise ConnectionRefusedError(_err)
try:
yield Client(ib)
except BaseException:
ib.disconnect()
raise
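The connect loop above probes the TWS port and then the gateway port, keeping the last refusal and only raising once every candidate fails (Python's ``for``/``else`` fires when the loop never ``break``s). A minimal sketch of that pattern, with a hypothetical ``connect_first`` name and an injected ``connect`` callable standing in for ``ib.connectAsync``:

```python
def connect_first(ports, connect):
    """Try each candidate port in order, returning the first successful
    connection; re-raise the last refusal only after all ports fail."""
    err = None
    for port in ports:
        try:
            return connect(port)
        except ConnectionRefusedError as e:
            err = e  # remember the failure and try the next port
    raise ConnectionRefusedError(err)
```

With the default ``[7497, 4002]`` ordering this prefers a running TWS instance and silently falls back to the headless gateway.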
async def _aio_run_client_method(
meth: str,
to_trio=None,
from_trio=None,
**kwargs,
) -> None:
log.info("Connecting to the EYEEEEBEEEEE GATEWAYYYYYYY!")
async with _aio_get_client() as client:
async_meth = getattr(client, meth)
# handle streaming methods
args = tuple(inspect.getfullargspec(async_meth).args)
if to_trio and 'to_trio' in args:
kwargs['to_trio'] = to_trio
return await async_meth(**kwargs)
async def _trio_run_client_method(
method: str,
**kwargs,
) -> None:
ca = tractor.current_actor()
assert ca.is_infected_aio()
# if the method is an *async gen* stream for it
meth = getattr(Client, method)
if inspect.isasyncgenfunction(meth):
kwargs['_treat_as_stream'] = True
# if the method is an *async func* but manually
# streams back results, make sure to also stream it
args = tuple(inspect.getfullargspec(meth).args)
if 'to_trio' in args:
kwargs['_treat_as_stream'] = True
result = await tractor.to_asyncio.run_task(
_aio_run_client_method,
meth=method,
**kwargs
)
return result
class _MethodProxy:
def __init__(
self,
portal: tractor._portal.Portal
) -> None:
self._portal = portal
async def _run_method(
self,
*,
meth: str = None,
**kwargs
) -> Any:
return await self._portal.run(
__name__,
'_trio_run_client_method',
method=meth,
**kwargs
)
def get_method_proxy(portal, target) -> _MethodProxy:
proxy = _MethodProxy(portal)
# mock all remote methods
for name, method in inspect.getmembers(
target, predicate=inspect.isfunction
):
if '_' == name[0]:
continue
setattr(proxy, name, partial(proxy._run_method, meth=name))
return proxy
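``get_method_proxy()`` mirrors every public method of ``Client`` as a ``partial`` bound to one generic dispatcher, so calls on the proxy turn into ``portal.run()`` round trips. A self-contained sketch of the same trick, with the remote hop replaced by a local call (``Target``/``proxy_for`` are illustrative names, not project API):

```python
from functools import partial
import inspect

class Target:
    def ping(self, n):
        return n + 1

    def _private(self):
        return 'hidden'

class Proxy:
    """All mirrored methods funnel into one dispatcher, like
    ``_MethodProxy._run_method`` funnels into ``portal.run()``."""
    def _run_method(self, *, meth=None, **kwargs):
        # stand-in for the remote round trip
        return getattr(Target(), meth)(**kwargs)

def proxy_for(target_cls):
    proxy = Proxy()
    # mock all public methods of the target class onto the proxy
    for name, _ in inspect.getmembers(target_cls, predicate=inspect.isfunction):
        if name.startswith('_'):
            continue
        setattr(proxy, name, partial(proxy._run_method, meth=name))
    return proxy
```

Note the keyword-only dispatch: since ``meth`` is bound by ``partial``, callers must pass arguments by keyword, which matches how the real proxy forwards ``**kwargs`` over the portal.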
@asynccontextmanager
async def get_client(
**kwargs,
) -> Client:
"""Init the ``ib_insync`` client in another actor and return
a method proxy to it.
"""
async with maybe_spawn_brokerd(
brokername='ib',
expose_mods=[__name__],
infect_asyncio=True,
**kwargs
) as portal:
yield get_method_proxy(portal, Client)
def normalize(
ticker: Ticker,
calc_price: bool = False
) -> dict:
# convert named tuples to dicts so we send usable keys
new_ticks = []
for tick in ticker.ticks:
td = tick._asdict()
if td['tickType'] in (48, 77):
td['type'] = 'trade'
new_ticks.append(td)
ticker.ticks = new_ticks
# some contracts don't have volume so we may want to calculate
# a midpoint price based on data we can acquire (such as bid / ask)
if calc_price:
ticker.ticks.append(
{'type': 'trade', 'price': ticker.marketPrice()}
)
# serialize for transport
data = asdict(ticker)
# add time stamps for downstream latency measurements
data['brokerd_ts'] = time.time()
if ticker.rtTime:
data['broker_ts'] = data['rtTime_s'] = float(ticker.rtTime) / 1000.
return data
# TODO: figure out how to share quote feeds sanely despite
# the wacky ``ib_insync`` api.
# @tractor.msg.pub
async def stream_quotes(
symbols: List[str],
loglevel: str = None,
# compat for @tractor.msg.pub
topics: Any = None,
get_topics: Callable = None,
) -> AsyncGenerator[str, Dict[str, Any]]:
"""Stream symbol quotes.
This is a ``trio`` callable routine meant to be invoked
once the brokerd is up.
"""
# XXX: required to propagate ``tractor`` loglevel to piker logging
get_console_log(loglevel or tractor.current_actor().loglevel)
# TODO: support multiple subscriptions
sym = symbols[0]
stream = await tractor.to_asyncio.run_task(
_trio_run_client_method,
method='stream_ticker',
symbol=sym,
)
async with aclosing(stream):
# first quote can be ignored as a 2nd with newer data is sent?
first_ticker = await stream.__anext__()
if type(first_ticker.contract) not in (ibis.Commodity,):
suffix = 'exchange'
calc_price = False # should be real volume for contract
quote = normalize(first_ticker)
log.debug(f"First ticker received {quote}")
con = quote['contract']
topic = '.'.join((con['symbol'], con[suffix])).lower()
yield {topic: quote}
# ugh, clear ticks since we've consumed them
# (ahem, ib_insync is stateful trash)
first_ticker.ticks = []
async for ticker in stream:
# spin consuming tickers until we get a real market datum
if not ticker.rtTime:
log.debug(f"New unsent ticker: {ticker}")
continue
else:
log.debug("Received first real volume tick")
quote = normalize(ticker)
topic = '.'.join((con['symbol'], con[suffix])).lower()
yield {topic: quote}
# ugh, clear ticks since we've consumed them
# (ahem, ib_insync is stateful trash)
ticker.ticks = []
# XXX: this works because we don't use
# ``aclosing()`` above?
break
else:
# commodities don't have an exchange name for some reason?
suffix = 'secType'
calc_price = True
async for ticker in stream:
quote = normalize(
ticker,
calc_price=calc_price
)
con = quote['contract']
topic = '.'.join((con['symbol'], con[suffix])).lower()
yield {topic: quote}
# ugh, clear ticks since we've consumed them
ticker.ticks = []
if __name__ == '__main__':
import sys
sym = sys.argv[1]
contract = asyncio.run(
_aio_run_client_method(
'find_contract',
symbol=sym,
)
)
print(contract)


@@ -0,0 +1,298 @@
"""
Kraken backend.
"""
from contextlib import asynccontextmanager
from dataclasses import dataclass, asdict, field
from itertools import starmap
from typing import List, Dict, Any, Callable
import json
import time
import trio_websocket
from trio_websocket._impl import ConnectionClosed, DisconnectionTimeout
import arrow
import asks
import numpy as np
import trio
import tractor
from ._util import resproc, SymbolNotFound, BrokerError
from ..log import get_logger, get_console_log
log = get_logger(__name__)
# <uri>/<version>/
_url = 'https://api.kraken.com/0'
# conversion to numpy worthy types
ohlc_dtype = [
('index', int),
('time', int),
('open', float),
('high', float),
('low', float),
('close', float),
('vwap', float),
('volume', float),
('count', int)
]
class Client:
def __init__(self) -> None:
self._sesh = asks.Session(connections=4)
self._sesh.base_location = _url
self._sesh.headers.update({
'User-Agent':
'krakenex/2.1.0 (+https://github.com/veox/python3-krakenex)'
})
async def _public(
self,
method: str,
data: dict,
) -> Dict[str, Any]:
resp = await self._sesh.post(
path=f'/public/{method}',
json=data,
timeout=float('inf')
)
return resproc(resp, log)
async def symbol_info(
self,
pair: str = 'all',
):
resp = await self._public('AssetPairs', {'pair': pair})
err = resp['error']
if err:
raise BrokerError(err)
true_pair_key, data = next(iter(resp['result'].items()))
return data
async def bars(
self,
symbol: str = 'XBTUSD',
# UTC 2017-07-02 12:53:20
since: int = None,
count: int = 720, # <- max allowed per query
as_np: bool = True,
) -> dict:
if since is None:
since = arrow.utcnow().floor('minute').shift(
minutes=-count).timestamp
# UTC 2017-07-02 12:53:20 is oldest seconds value
since = str(max(1499000000, since))
json = await self._public(
'OHLC',
data={
'pair': symbol,
'since': since,
},
)
try:
res = json['result']
res.pop('last')
bars = next(iter(res.values()))
# convert all fields to native types
bars = list(starmap(
lambda i, bar:
(i,) + tuple(
ftype(bar[i]) for i, (name, ftype)
in enumerate(ohlc_dtype[1:])
),
enumerate(bars))
)
return np.array(bars, dtype=ohlc_dtype) if as_np else bars
except KeyError:
raise SymbolNotFound(json['error'][0] + f': {symbol}')
@asynccontextmanager
async def get_client() -> Client:
yield Client()
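The ``starmap`` dance in ``Client.bars()`` prepends a row index and casts kraken's string fields to the native types declared in ``ohlc_dtype`` before handing the rows to numpy. The conversion itself can be sketched in plain Python over an assumed subset of the dtype (``to_native`` is a hypothetical helper):

```python
# assumed subset of the module's ``ohlc_dtype`` field casts, in row order
OHLC_CASTS = [('time', int), ('open', float), ('close', float)]

def to_native(rows):
    """Prepend a running index and cast each string field to the native
    type paired with its field name, mirroring ``Client.bars()``."""
    return [
        (i,) + tuple(cast(row[j]) for j, (_name, cast) in enumerate(OHLC_CASTS))
        for i, row in enumerate(rows)
    ]
```

The resulting list of homogeneous tuples is exactly what ``np.array(..., dtype=ohlc_dtype)`` consumes in the real method.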
@dataclass
class OHLC:
"""Description of the flattened OHLC quote format.
For schema details see:
https://docs.kraken.com/websockets/#message-ohlc
"""
chan_id: int # internal kraken id
chan_name: str # eg. ohlc-1 (name-interval)
pair: str # fx pair
time: float # Begin time of interval, in seconds since epoch
etime: float # End time of interval, in seconds since epoch
open: float # Open price of interval
high: float # High price within interval
low: float # Low price within interval
close: float # Close price of interval
vwap: float # Volume weighted average price within interval
volume: float # Accumulated volume **within interval**
count: int # Number of trades within interval
# (sampled) generated tick data
ticks: List[Any] = field(default_factory=list)
# XXX: ugh, super hideous.. needs built-in converters.
def __post_init__(self):
for f, val in self.__dataclass_fields__.items():
if f == 'ticks':
continue
setattr(self, f, val.type(getattr(self, f)))
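The ``__post_init__`` above coerces every incoming field (kraken sends numbers as strings) to its annotated type by looping over the dataclass field metadata. A minimal stand-in showing the same trick on a smaller record type:

```python
from dataclasses import dataclass, fields

@dataclass
class Bar:
    """Illustrative mini-OHLC record: each value is cast to its annotated
    field type on construction, as kraken's ``OHLC`` does."""
    time: float
    close: float
    count: int

    def __post_init__(self):
        # f.type is the annotation object itself here (no string
        # annotations in play), so it doubles as a converter
        for f in fields(self):
            setattr(self, f.name, f.type(getattr(self, f.name)))
```

This relies on the annotations being real callables; under ``from __future__ import annotations`` they would be strings and this shortcut (like the original's) would break, hence the "needs built-in converters" grumble in the source.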
async def recv_ohlc(recv):
too_slow_count = last_hb = 0
while True:
with trio.move_on_after(1.5) as cs:
msg = await recv()
# trigger reconnection logic if too slow
if cs.cancelled_caught:
too_slow_count += 1
if too_slow_count > 2:
log.warning(
"Heartbeat is too slow, "
"resetting ws connection")
raise trio_websocket._impl.ConnectionClosed(
"Reset Connection")
if isinstance(msg, dict):
if msg.get('event') == 'heartbeat':
now = time.time()
delay = now - last_hb
last_hb = now
log.trace(f"Heartbeat after {delay}")
# TODO: hmm i guess we should use this
# for determining when to do connection
# resets eh?
continue
err = msg.get('errorMessage')
if err:
raise BrokerError(err)
else:
chan_id, ohlc_array, chan_name, pair = msg
yield OHLC(chan_id, chan_name, pair, *ohlc_array)
def normalize(
ohlc: OHLC,
) -> dict:
quote = asdict(ohlc)
quote['broker_ts'] = quote['time']
quote['brokerd_ts'] = time.time()
quote['pair'] = quote['pair'].replace('/', '')
# seriously eh? what's with this non-symmetry everywhere
# in subscription systems...
topic = quote['pair'].replace('/', '')
print(quote)
return topic, quote
@tractor.msg.pub
async def stream_quotes(
get_topics: Callable,
# These are the symbols not expected by the ws api
# they are looked up inside this routine.
symbols: List[str] = ['XBTUSD', 'XMRUSD'],
sub_type: str = 'ohlc',
loglevel: str = None,
) -> None:
"""Subscribe for ohlc stream of quotes for ``pairs``.
``pairs`` must be formatted <crypto_symbol>/<fiat_symbol>.
"""
# XXX: required to propagate ``tractor`` loglevel to piker logging
get_console_log(loglevel or tractor.current_actor().loglevel)
ws_pairs = {}
async with get_client() as client:
for sym in symbols:
ws_pairs[sym] = (await client.symbol_info(sym))['wsname']
while True:
try:
async with trio_websocket.open_websocket_url(
'wss://ws.kraken.com',
) as ws:
# setup subs
# see: https://docs.kraken.com/websockets/#message-subscribe
subs = {
'pair': list(ws_pairs.values()),
'event': 'subscribe',
'subscription': {
'name': sub_type,
'interval': 1, # 1 min
# 'name': 'ticker',
# 'name': 'openOrders',
# 'depth': '25',
},
}
# TODO: we want to eventually allow unsubs which should
# be completely fine to request from a separate task
# since internally the ws methods appear to be FIFO
# locked.
await ws.send_message(json.dumps(subs))
async def recv():
return json.loads(await ws.get_message())
# pull a first quote and deliver
ohlc_gen = recv_ohlc(recv)
ohlc_last = await ohlc_gen.__anext__()
topic, quote = normalize(ohlc_last)
# packetize as {topic: quote}
yield {topic: quote}
# keep start of last interval for volume tracking
last_interval_start = ohlc_last.etime
# start streaming
async for ohlc in ohlc_gen:
# generate tick values to match time & sales pane:
# https://trade.kraken.com/charts/KRAKEN:BTC-USD?period=1m
volume = ohlc.volume
if ohlc.etime > last_interval_start: # new interval
last_interval_start = ohlc.etime
tick_volume = volume
else:
# this is the tick volume *within the interval*
tick_volume = volume - ohlc_last.volume
if tick_volume:
ohlc.ticks.append({
'type': 'trade',
'price': ohlc.close,
'size': tick_volume,
})
topic, quote = normalize(ohlc)
# XXX: format required by ``tractor.msg.pub``
# requires a ``Dict[topic: str, quote: dict]``
yield {topic: quote}
ohlc_last = ohlc
except (ConnectionClosed, DisconnectionTimeout):
log.exception("Good job kraken...reconnecting")
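Kraken's OHLC messages carry *cumulative* volume within the current interval, so the streaming loop above derives per-update traded size: the full volume on the first update of a fresh interval, otherwise the delta since the previous update. That bookkeeping can be isolated into a small helper (``tick_volume``/``state`` are illustrative names):

```python
def tick_volume(ohlc_volume: float, etime: float, state: dict) -> float:
    """Derive per-update trade size from a cumulative interval volume.

    ``state`` carries the last interval end time and last cumulative
    volume between calls, like the loop-local vars above.
    """
    if etime > state.get('interval_start', float('-inf')):
        # new interval: the whole accumulated volume counts
        state['interval_start'] = etime
        tick = ohlc_volume
    else:
        # same interval: only the delta since the last update counts
        tick = ohlc_volume - state.get('last_volume', 0.0)
    state['last_volume'] = ohlc_volume
    return tick
```

A non-zero result is what triggers appending a synthetic ``'trade'`` tick matching kraken's own time & sales pane.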
if __name__ == '__main__':
async def stream_ohlc():
async for msg in stream_quotes():
print(msg)
tractor.run(stream_ohlc)


@@ -3,14 +3,22 @@ Questrade API backend.
"""
from __future__ import annotations
import inspect
+import contextlib
import time
from datetime import datetime
from functools import partial
+import itertools
import configparser
-from typing import List, Tuple, Dict, Any, Iterator, NamedTuple
+from pprint import pformat
+from typing import (
+List, Tuple, Dict, Any, Iterator, NamedTuple,
+AsyncGenerator,
+Callable,
+)
import arrow
import trio
+import tractor
from async_generator import asynccontextmanager
import pandas as pd
import numpy as np
@@ -20,8 +28,11 @@ import asks
from ..calc import humanize, percent_change
from . import config
from ._util import resproc, BrokerError, SymbolNotFound
-from ..log import get_logger, colorize_json
+from ..log import get_logger, colorize_json, get_console_log
from .._async_utils import async_lifo_cache
+from . import get_brokermod
+from . import api
log = get_logger(__name__)
@@ -407,10 +418,10 @@ class Client:
return symbols2ids
-async def symbol_data(self, tickers: List[str]):
+async def symbol_info(self, symbols: List[str]):
-"""Return symbol data for ``tickers``.
+"""Return symbol data for ``symbols``.
"""
-t2ids = await self.tickers2ids(tickers)
+t2ids = await self.tickers2ids(symbols)
ids = ','.join(t2ids.values())
symbols = {}
for pkt in (await self.api.symbols(ids=ids))['symbols']:
@@ -418,6 +429,9 @@ class Client:
return symbols
+# TODO: deprecate
+symbol_data = symbol_info
async def quote(self, tickers: [str]):
"""Return stock quotes for each ticker in ``tickers``.
"""
@@ -554,6 +568,7 @@ class Client:
time_frame: str = '1m',
count: float = 20e3,
is_paid_feed: bool = False,
+as_np: bool = False
) -> List[Dict[str, Any]]:
"""Retrieve OHLCV bars for a symbol over a range to the present.
@@ -598,6 +613,24 @@ class Client:
f"Took {time.time() - start} seconds to retrieve {len(bars)} bars")
return bars
    async def search_stocks(
        self,
        pattern: str,
        # how many contracts to return
        upto: int = 10,
    ) -> Dict[str, str]:
        details = {}
        results = await self.api.search(prefix=pattern)
        for result in results['symbols']:
            sym = result['symbol']
            if '.' not in sym:
                sym = f"{sym}.{result['listingExchange']}"

            details[sym] = result
            if len(details) == upto:
                break

        return details
# marketstore TSD compatible numpy dtype for bar
_qt_bars_dt = [
@@ -808,7 +841,8 @@ _qt_stock_keys = {
    # 'low52w': 'low52w',  # put in info widget
    # 'high52w': 'high52w',
    # "lastTradePriceTrHrs": 7.99,
    # 'lastTradeTime': ('fill_time', datetime.fromisoformat),
    'lastTradeTime': 'fill_time',
    "lastTradeTick": 'tick',  # ("Equal", "Up", "Down")
    # "symbolId": 3575753,
    # "tier": "",
@@ -839,29 +873,43 @@ def format_stock_quote(
    and the second is the same but with all values converted to a
    "display-friendly" string format.
    """
    symbol = quote['symbol']
    previous = symbol_data[symbol]['prevDayClosePrice']

    computed = {'symbol': symbol}
    last = quote.get('lastTradePrice')
    if last:
        change = percent_change(previous, last)
        share_count = symbol_data[symbol].get('outstandingShares', None)
        mktcap = share_count * last if (last and share_count) else 0
        computed.update({
            # 'symbol': quote['symbol'],
            '%': round(change, 3),
            'MC': mktcap,
            # why questrade do you have to be shipping null values!!!
            # '$ vol': round((quote['VWAP'] or 0) * (quote['volume'] or 0), 3),
            'close': previous,
        })

    vwap = quote.get('VWAP')
    volume = quote.get('volume')
    if volume is not None:  # could be 0
        # why questrade do you have to be an asshole shipping null values!!!
        computed['$ vol'] = round((vwap or 0) * (volume or 0), 3)

    new = {}
    displayable = {}

    for key, value in itertools.chain(quote.items(), computed.items()):
        new_key = keymap.get(key)
        if not new_key:
            continue

        # API servers can return `None` vals when markets are closed (weekend)
        value = 0 if value is None else value
        display_value = value

        # convert values to a displayable format using available formatting func
        if isinstance(new_key, tuple):
            new_key, func = new_key
@@ -870,6 +918,7 @@ def format_stock_quote(
        new[new_key] = value
        displayable[new_key] = display_value

    new['displayable'] = displayable

    return new, displayable
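The new remap loop keys off ``itertools.chain`` over the raw and computed quote fields, filtering through the keymap. A standalone sketch of that pattern with a made-up ``keymap`` (the real ``_qt_stock_keys`` maps many more fields):

```python
import itertools

# hypothetical keymap in the ``_qt_stock_keys`` style:
# broker key -> new key, or (new key, display formatting func)
keymap = {
    'lastTradePrice': 'last',
    'VWAP': ('VWAP', lambda v: round(v, 2)),
    '%': '%',
}

quote = {'lastTradePrice': 101.5, 'VWAP': 101.2345, 'unmapped': 1}
computed = {'%': 0.5}

new, displayable = {}, {}
for key, value in itertools.chain(quote.items(), computed.items()):
    new_key = keymap.get(key)
    if not new_key:
        continue  # drop fields the keymap doesn't know about
    value = 0 if value is None else value
    display_value = value
    if isinstance(new_key, tuple):
        # tuple entries carry a display formatting func
        new_key, func = new_key
        display_value = func(value)
    new[new_key] = value
    displayable[new_key] = display_value
```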
@@ -891,7 +940,8 @@ _qt_option_keys = {
    # "theta": ('theta', partial(round, ndigits=3)),
    # "vega": ('vega', partial(round, ndigits=3)),
    '$ vol': ('$ vol', humanize),
    # XXX: required key to trigger trade execution datum msg
    'volume': ('volume', humanize),
    # "2021-01-15T00:00:00.000000-05:00",
    # "isHalted": false,
    # "key": [
@@ -929,6 +979,7 @@ def format_option_quote(
    quote: dict,
    symbol_data: dict,
    keymap: dict = _qt_option_keys,
    include_displayables: bool = True,
) -> Tuple[dict, dict]:
    """Remap a list of quote dicts ``quotes`` using the mapping of old keys
    -> new keys ``keymap`` returning 2 dicts: one with raw data and the other
@@ -939,19 +990,25 @@ def format_option_quote(
    "display-friendly" string format.
    """
    # TODO: need historical data..
    # (cause why would questrade keep their quote structure consistent across
    # assets..)
    # previous = symbol_data[symbol]['prevDayClosePrice']
    # change = percent_change(previous, last)
    computed = {
        # why QT do you have to be an asshole shipping null values!!!
        # '$ vol': round((quote['VWAP'] or 0) * (quote['volume'] or 0), 3),
        # '%': round(change, 3),
        # 'close': previous,
    }
    new = {}
    displayable = {}

    vwap = quote.get('VWAP')
    volume = quote.get('volume')
    if volume is not None:  # could be 0
        # why questrade do you have to be an asshole shipping null values!!!
        computed['$ vol'] = round((vwap or 0) * (volume or 0), 3)

    # structuring and normalization
    for key, new_key in keymap.items():
        display_value = value = computed.get(key) or quote.get(key)
@@ -968,3 +1025,201 @@ def format_option_quote(
        displayable[new_key] = display_value

    return new, displayable
async def smoke_quote(
    get_quotes,
    tickers
):
    """Do an initial "smoke" request for symbols in ``tickers`` filtering
    out any symbols not supported by the broker queried in the call to
    ``get_quotes()``.
    """
    from operator import itemgetter
    # TODO: trim out with #37
    #################################################
    # get a single quote filtering out any bad tickers
    # NOTE: this code is always run for every new client
    # subscription even when a broker quoter task is already running
    # since the new client needs to know what symbols are accepted
    log.warn(f"Retrieving smoke quote for symbols {tickers}")
    quotes = await get_quotes(tickers)

    # report any tickers that aren't returned in the first quote
    invalid_tickers = set(tickers) - set(map(itemgetter('key'), quotes))
    for symbol in invalid_tickers:
        tickers.remove(symbol)
        log.warn(
            f"Symbol `{symbol}` not found")  # by broker `{broker}`"
        # )

    # pop any tickers that return "empty" quotes
    payload = {}
    for quote in quotes:
        symbol = quote['symbol']
        if quote is None:
            log.warn(
                f"Symbol `{symbol}` not found")
            # XXX: note, this mutates the input list (for now)
            tickers.remove(symbol)
            continue

        # report any unknown/invalid symbols (QT specific)
        if quote.get('low52w', False) is None:
            log.error(
                f"{symbol} seems to be defunct")

        quote['symbol'] = symbol
        payload[symbol] = quote

    return payload

    # end of section to be trimmed out with #37
    ###########################################


# unbounded, shared between streaming tasks
_symbol_info_cache = {}


# function to format packets delivered to subscribers
def packetizer(
    topic: str,
    quotes: Dict[str, Any],
) -> Dict[str, Any]:
    """Normalize quotes by name into dicts using broker-specific
    processing.
    """
    # repack into symbol keyed dict
    return {q['symbol']: q for q in quotes}
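``packetizer`` simply re-keys a batch of quote dicts by symbol before they are shipped to subscribers, for example:

```python
quotes = [
    {'symbol': 'SPY', 'last': 420.0},
    {'symbol': 'QQQ', 'last': 350.0},
]

# repack the quote batch into a symbol-keyed dict, as ``packetizer`` does
packet = {q['symbol']: q for q in quotes}
```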
def normalize(
    quotes: Dict[str, Any],
    _cache: Dict[str, Any],  # dict held in scope of the streaming loop
    formatter: Callable,
) -> Dict[str, Any]:
    """Deliver normalized quotes by name into dicts using
    broker-specific processing; only emit changes differing from the
    last quote sample creating a pseudo-tick type datum.
    """
    new = {}
    # XXX: this is effectively emitting "sampled ticks"
    # useful for polling setups but obviously should be
    # disabled if you're already rx-ing per-tick data.
    for quote in quotes:
        symbol = quote['symbol']

        # look up last quote from cache
        last = _cache.setdefault(symbol, {})
        _cache[symbol] = quote

        # compute volume difference
        last_volume = last.get('volume', 0)
        current_volume = quote['volume']
        volume_diff = current_volume - last_volume

        # find all keys that have changed values compared
        # to the last quote received
        changed = set(quote.items()) - set(last.items())
        if changed:
            log.info(f"New quote {symbol}:\n{changed}")

            # TODO: can we reduce the # of iterations here and in
            # called funcs?
            payload = {k: quote[k] for k, v in changed}
            payload['symbol'] = symbol  # required by formatter

            # TODO: we should probably do the "computed" fields
            # processing found inside this func in a downstream actor?
            fquote, _ = formatter(payload, _symbol_info_cache)
            fquote['key'] = fquote['symbol'] = symbol

            # if there was volume likely the last size of
            # shares traded is useful info and it's possible
            # that the set difference from above will disregard
            # a "size" value since the same # of shares were traded
            # volume = payload.get('volume')
            if volume_diff:
                if volume_diff < 0:
                    log.error(f"Uhhh {symbol} volume: {volume_diff} ?")

                fquote['volume_delta'] = volume_diff

                # TODO: We can emit 2 ticks here:
                # - one for the volume differential
                # - one for the last known trade size
                # The first in theory can be unwound and
                # interpolated assuming the broker passes an
                # accurate daily VWAP value.
                # To make this work we need a universal ``size``
                # field that is normalized before hitting this logic.
                fquote['size'] = quote.get('lastTradeSize', 0)
                if 'last' not in fquote:
                    fquote['last'] = quote.get('lastTradePrice', float('nan'))

            new[symbol] = fquote

    if new:
        log.info(f"New quotes:\n{pformat(new)}")
    return new
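The diffing in ``normalize`` relies on ``dict.items()`` views being sets of hashable ``(key, value)`` tuples: subtracting the prior sample's item-set leaves only the pairs whose value changed. A minimal sketch:

```python
last = {'symbol': 'SPY', 'last': 100.0, 'volume': 500}
quote = {'symbol': 'SPY', 'last': 100.25, 'volume': 650}

# only pairs whose value changed since the prior sample survive
changed = set(quote.items()) - set(last.items())
payload = {k: quote[k] for k, v in changed}
payload['symbol'] = quote['symbol']  # always re-attach the key field

# separately track traded volume since the last sample
volume_diff = quote['volume'] - last.get('volume', 0)
```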
@tractor.stream
async def stream_quotes(
    ctx: tractor.Context,  # marks this as a streaming func
    symbols: List[str],
    feed_type: str = 'stock',
    rate: int = 3,
    loglevel: str = None,
    # feed_type: str = 'stock',
) -> AsyncGenerator[str, Dict[str, Any]]:
    # XXX: required to propagate ``tractor`` loglevel to piker logging
    get_console_log(loglevel)

    async with api.get_cached_client('questrade') as client:
        if feed_type == 'stock':
            formatter = format_stock_quote
            get_quotes = await stock_quoter(client, symbols)

            # do a smoke quote (note this mutates the input list and filters
            # out bad symbols for now)
            first_quotes = await smoke_quote(get_quotes, list(symbols))
        else:
            formatter = format_option_quote
            get_quotes = await option_quoter(client, symbols)

            # packetize
            first_quotes = {
                quote['symbol']: quote
                for quote in await get_quotes(symbols)
            }

        # update global symbol data state
        sd = await client.symbol_info(symbols)
        _symbol_info_cache.update(sd)

        # pre-process first set of quotes
        payload = {}
        for sym, quote in first_quotes.items():
            fquote, _ = formatter(quote, sd)
            payload[sym] = fquote

        # push initial smoke quote response for client initialization
        await ctx.send_yield(payload)

        from .data import stream_poll_requests

        await stream_poll_requests(

            # ``msg.pub`` required kwargs
            task_name=feed_type,
            ctx=ctx,
            topics=symbols,
            packetizer=packetizer,

            # actual target "streaming func" args
            get_quotes=get_quotes,
            normalizer=partial(normalize, formatter=formatter),
            rate=rate,
        )
        log.info("Terminating stream quoter task")

@@ -6,9 +6,8 @@ import os
import click
import tractor

from ..log import get_console_log, get_logger, colorize_json
from ..brokers import get_brokermod, config

log = get_logger('cli')
DEFAULT_BROKER = 'questrade'
@@ -17,6 +16,7 @@ _config_dir = click.get_app_dir('piker')
_watchlists_data_path = os.path.join(_config_dir, 'watchlists.json')
_context_defaults = dict(
    default_map={
        # Questrade specific quote poll rates
        'monitor': {
            'rate': 3,
        },
@@ -34,6 +34,7 @@ _context_defaults = dict(
def pikerd(loglevel, host, tl):
    """Spawn the piker broker-daemon.
    """
    from ..data import _data_mods
    get_console_log(loglevel)
    tractor.run_daemon(
        rpc_module_paths=_data_mods,
@@ -46,9 +47,10 @@ def pikerd(loglevel, host, tl):
@click.option('--broker', '-b', default=DEFAULT_BROKER,
              help='Broker backend to use')
@click.option('--loglevel', '-l', default='warning', help='Logging level')
@click.option('--tl', is_flag=True, help='Enable tractor logging')
@click.option('--configdir', '-c', help='Configuration directory')
@click.pass_context
def cli(ctx, broker, loglevel, tl, configdir):
    if configdir is not None:
        assert os.path.isdir(configdir), f"`{configdir}` is not a valid path"
        config._override_config_dir(configdir)
@@ -58,13 +60,50 @@ def cli(ctx, broker, loglevel, configdir):
        'broker': broker,
        'brokermod': get_brokermod(broker),
        'loglevel': loglevel,
        'tractorloglevel': None,
        'log': get_console_log(loglevel),
        'confdir': _config_dir,
        'wl_path': _watchlists_data_path,
    })
    # allow enabling same loglevel in ``tractor`` machinery
    if tl:
        ctx.obj.update({'tractorloglevel': loglevel})


@cli.command()
@click.option('--tl', is_flag=True, help='Enable tractor logging')
@click.argument('names', nargs=-1, required=False)
@click.pass_obj
def services(config, tl, names):

    async def list_services():
        async with tractor.get_arbiter(
            *tractor.current_actor()._arb_addr
        ) as portal:
            registry = await portal.run('self', 'get_registry')
            json_d = {}
            for uid, socket in registry.items():
                name, uuid = uid
                host, port = socket
                json_d[f'{name}.{uuid}'] = f'{host}:{port}'
            click.echo(
                f"Available `piker` services:\n{colorize_json(json_d)}"
            )

    tractor.run(
        list_services,
        name='service_query',
        loglevel=config['loglevel'] if tl else None,
    )


def _load_clis() -> None:
    from ..data import cli as _
    from ..brokers import cli as _  # noqa
    from ..ui import cli as _  # noqa
    from ..watchlists import cli as _  # noqa


# load downstream cli modules
_load_clis()
@@ -0,0 +1,121 @@
"""
Data feed apis and infra.

We provide tsdb integrations for retrieving
and storing data from your brokers as well as
sharing your feeds with other fellow pikers.
"""
from contextlib import asynccontextmanager
from importlib import import_module
from types import ModuleType
from typing import (
    Dict, List, Any,
    Sequence, AsyncIterator, Optional
)

import trio
import tractor

from ..brokers import get_brokermod
from ..log import get_logger, get_console_log


log = get_logger(__name__)

__ingestors__ = [
    'marketstore',
]


def get_ingestormod(name: str) -> ModuleType:
    """Return the imported ingestor module by name.
    """
    module = import_module('.' + name, 'piker.data')
    # we only allow monkeying because it's for internal keying
    module.name = module.__name__.split('.')[-1]
    return module


_data_mods = [
    'piker.brokers.core',
    'piker.brokers.data',
]


@asynccontextmanager
async def maybe_spawn_brokerd(
    brokername: str,
    sleep: float = 0.5,
    loglevel: Optional[str] = None,
    expose_mods: List = [],
    **tractor_kwargs,
) -> tractor._portal.Portal:
    """If no ``brokerd.{brokername}`` daemon-actor can be found,
    spawn one in a local subactor and return a portal to it.
    """
    if loglevel:
        get_console_log(loglevel)

    tractor_kwargs['loglevel'] = loglevel

    brokermod = get_brokermod(brokername)
    dname = f'brokerd.{brokername}'
    async with tractor.find_actor(dname) as portal:
        # WTF: why doesn't this work?
        if portal is not None:
            yield portal
        else:
            log.info(f"Spawning {brokername} broker daemon")
            tractor_kwargs = getattr(brokermod, '_spawn_kwargs', {})
            async with tractor.open_nursery() as nursery:
                try:
                    # spawn new daemon
                    portal = await nursery.start_actor(
                        dname,
                        rpc_module_paths=_data_mods + [brokermod.__name__],
                        loglevel=loglevel,
                        **tractor_kwargs
                    )
                    async with tractor.wait_for_actor(dname) as portal:
                        yield portal
                finally:
                    # client code may block indefinitely so cancel when
                    # teardown is invoked
                    await nursery.cancel()


@asynccontextmanager
async def open_feed(
    name: str,
    symbols: Sequence[str],
    loglevel: Optional[str] = None,
) -> AsyncIterator[Dict[str, Any]]:
    """Open a "data feed" which provides streamed real-time quotes.
    """
    try:
        mod = get_brokermod(name)
    except ImportError:
        mod = get_ingestormod(name)

    if loglevel is None:
        loglevel = tractor.current_actor().loglevel

    async with maybe_spawn_brokerd(
        mod.name,
        loglevel=loglevel,
    ) as portal:
        stream = await portal.run(
            mod.__name__,
            'stream_quotes',
            symbols=symbols,
            topics=symbols,
        )
        # Feed is required to deliver an initial quote asap.
        # TODO: should we timeout and raise a more explicit error?
        # with trio.fail_after(5):
        with trio.fail_after(float('inf')):
            # Retrieve initial quote for each symbol
            # such that consumer code can know the data layout
            first_quote = await stream.__anext__()
            log.info(f"Received first quote {first_quote}")
        yield (first_quote, stream)

@@ -0,0 +1,324 @@
"""
``marketstore`` integration.

- client management routines
- tick data ingest routines
- websocket client for subscribing to write triggers
- todo: tick sequence stream-cloning for testing
- todo: docker container management automation
"""
from contextlib import asynccontextmanager
from typing import Dict, Any, List, Callable, Tuple
import time
from math import isnan

import msgpack
import numpy as np
import pandas as pd
import pymarketstore as pymkts
import tractor
from trio_websocket import open_websocket_url

from ..log import get_logger, get_console_log
from ..data import open_feed


log = get_logger(__name__)

_tick_tbk_ids: Tuple[str, str] = ('1Sec', 'TICK')
_tick_tbk: str = '{}/' + '/'.join(_tick_tbk_ids)
_url: str = 'http://localhost:5993/rpc'
_quote_dt = [
    # these two are required as a "primary key"
    ('Epoch', 'i8'),
    ('Nanoseconds', 'i4'),

    ('Tick', 'i4'),  # (-1, 0, 1) = (on bid, same, on ask)
    # ('fill_time', 'f4'),
    ('Last', 'f4'),
    ('Bid', 'f4'),
    ('Bsize', 'i8'),
    ('Asize', 'i8'),
    ('Ask', 'f4'),
    ('Size', 'i8'),
    ('Volume', 'i8'),
    # ('Broker_time_ns', 'i64'),
    # ('VWAP', 'f4')
]
_quote_tmp = {}.fromkeys(dict(_quote_dt).keys(), np.nan)
_tick_map = {
    'Up': 1,
    'Equal': 0,
    'Down': -1,
    None: np.nan,
}
class MarketStoreError(Exception):
    "Generic marketstore client error"


def err_on_resp(response: dict) -> None:
    """Raise any errors found in responses from client request.
    """
    responses = response['responses']
    if responses is not None:
        for r in responses:
            err = r['error']
            if err:
                raise MarketStoreError(err)


def quote_to_marketstore_structarray(
    quote: Dict[str, Any],
    last_fill: str,
) -> np.array:
    """Return marketstore writeable structarray from quote ``dict``.
    """
    if last_fill:
        # new fill bby
        now = timestamp(last_fill)
    else:
        # this should get inserted upstream by the broker-client to
        # subtract from IPC latency
        now = time.time_ns()

    secs, ns = now / 10**9, now % 10**9

    # pack into List[Tuple[str, Any]]
    array_input = []

    # insert 'Epoch' entry first
    array_input.append(int(secs))

    # insert 'Nanoseconds' field
    array_input.append(int(ns))

    # append remaining fields
    for name, dt in _quote_dt[2:]:
        if 'f' in dt:
            none = np.nan
        else:
            none = 0
        val = quote.get(name.casefold(), none)
        array_input.append(val)

    return np.array([tuple(array_input)], dtype=_quote_dt)


def timestamp(datestr: str) -> int:
    """Return marketstore compatible 'Epoch' integer in nanoseconds
    from a date formatted str.
    """
    return int(pd.Timestamp(datestr).value)


def mk_tbk(keys: Tuple[str, str, str]) -> str:
    """Generate a marketstore table key from a tuple.

    Converts,
    ``('SPY', '1Sec', 'TICK')`` -> ``"SPY/1Sec/TICK"``
    """
    return '/'.join(keys)
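``quote_to_marketstore_structarray`` splits an epoch-nanoseconds value into the two "primary key" fields of ``_quote_dt``. A minimal sketch of that packing, using the same ``pd.Timestamp(...).value`` nanosecond conversion as ``timestamp()`` and a trimmed two-field dtype:

```python
import numpy as np
import pandas as pd

# two-field subset of the ``_quote_dt`` "primary key"
quote_dt = [('Epoch', 'i8'), ('Nanoseconds', 'i4')]

# same nanosecond conversion as ``timestamp()`` above
now = int(pd.Timestamp('2020-09-01T12:00:00').value)
secs, ns = now // 10**9, now % 10**9

# one-row structured array, marketstore-writeable shape
a = np.array([(secs, ns)], dtype=quote_dt)
```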
class Client:
    """Async wrapper around the alpaca ``pymarketstore`` sync client.

    This will serve as the shell for building out a proper async client
    to replace the sync one, which is horribly documented and un-tested..
    """
    def __init__(self, url: str):
        self._client = pymkts.Client(url)

    async def _invoke(
        self,
        meth: Callable,
        *args,
        **kwargs,
    ) -> Any:
        return err_on_resp(meth(*args, **kwargs))

    async def destroy(
        self,
        tbk: Tuple[str, str, str],
    ) -> None:
        return await self._invoke(self._client.destroy, mk_tbk(tbk))

    async def list_symbols(
        self,
        tbk: str,
    ) -> List[str]:
        return await self._invoke(self._client.list_symbols, mk_tbk(tbk))

    async def write(
        self,
        symbol: str,
        array: np.ndarray,
    ) -> None:
        start = time.time()
        await self._invoke(
            self._client.write,
            array,
            _tick_tbk.format(symbol),
            isvariablelength=True
        )
        log.debug(f"{symbol} write time (s): {time.time() - start}")

    def query(
        self,
        symbol,
        tbk: Tuple[str, str] = _tick_tbk_ids,
    ) -> pd.DataFrame:
        # XXX: causes crash
        # client.query(pymkts.Params(symbol, '*', 'OHCLV'
        result = self._client.query(
            pymkts.Params(symbol, *tbk),
        )
        return result.first().df()


@asynccontextmanager
async def get_client(
    url: str = _url,
) -> Client:
    yield Client(url)
async def ingest_quote_stream(
    symbols: List[str],
    brokername: str,
    tries: int = 1,
    loglevel: str = None,
) -> None:
    """Ingest a broker quote stream into marketstore in (sampled) tick format.
    """
    async with open_feed(
        brokername,
        symbols,
        loglevel=loglevel,
    ) as (first_quotes, qstream):

        quote_cache = first_quotes.copy()

        async with get_client() as ms_client:

            # start ingest to marketstore
            async for quotes in qstream:
                log.info(quotes)
                for symbol, quote in quotes.items():

                    # remap tick strs to ints
                    quote['tick'] = _tick_map[quote.get('tick', 'Equal')]

                    # check for volume update (i.e. did trades happen
                    # since last quote)
                    new_vol = quote.get('volume', None)
                    if new_vol is None:
                        log.debug(f"No fills for {symbol}")
                        if new_vol == quote_cache.get('volume'):
                            # should never happen due to field diffing
                            # on sender side
                            log.error(
                                f"{symbol}: got same volume as last quote?")

                    quote_cache.update(quote)

                    a = quote_to_marketstore_structarray(
                        quote,
                        # TODO: check this closer to the broker query api
                        last_fill=quote.get('fill_time', '')
                    )
                    await ms_client.write(symbol, a)
async def stream_quotes(
    symbols: List[str],
    host: str = 'localhost',
    port: int = 5993,
    diff_cached: bool = True,
    loglevel: str = None,
) -> None:
    """Open a symbol stream from a running instance of marketstore and
    log to console.
    """
    # XXX: required to propagate ``tractor`` loglevel to piker logging
    get_console_log(loglevel or tractor.current_actor().loglevel)

    tbks: Dict[str, str] = {sym: f"{sym}/*/*" for sym in symbols}

    async with open_websocket_url(f'ws://{host}:{port}/ws') as ws:
        # send subs topics to server
        resp = await ws.send_message(
            msgpack.dumps({'streams': list(tbks.values())})
        )
        log.info(resp)

        async def recv() -> Dict[str, Any]:
            return msgpack.loads((await ws.get_message()), encoding='utf-8')

        streams = (await recv())['streams']
        log.info(f"Subscribed to {streams}")

        _cache = {}

        while True:
            msg = await recv()

            # unpack symbol and quote data
            # key is in format ``<SYMBOL>/<TIMEFRAME>/<ID>``
            symbol = msg['key'].split('/')[0]
            data = msg['data']

            # calc time stamp(s)
            s, ns = data.pop('Epoch'), data.pop('Nanoseconds')
            ts = s * 10**9 + ns
            data['broker_fill_time_ns'] = ts

            quote = {}
            for k, v in data.items():
                if isnan(v):
                    continue
                quote[k.lower()] = v

            quote['symbol'] = symbol

            quotes = {}

            if diff_cached:
                last = _cache.setdefault(symbol, {})
                new = set(quote.items()) - set(last.items())
                if new:
                    log.info(f"New quote {quote['symbol']}:\n{new}")

                    # only ship diff updates and other required fields
                    payload = {k: quote[k] for k, v in new}
                    payload['symbol'] = symbol

                    # if there was volume likely the last size of
                    # shares traded is useful info and it's possible
                    # that the set difference from above will disregard
                    # a "size" value since the same # of shares were traded
                    size = quote.get('size')
                    volume = quote.get('volume')
                    if size and volume:
                        new_volume_since_last = max(
                            volume - last.get('volume', 0), 0)
                        log.warning(
                            f"NEW VOLUME {symbol}:{new_volume_since_last}")
                        payload['size'] = size
                        payload['last'] = quote.get('last')

                    # XXX: we append to a list for the options case where the
                    # subscription topic (key) is the same for all
                    # expiries even though this is unnecessary for the
                    # stock case (different topic [i.e. symbol] for each
                    # quote).
                    quotes.setdefault(symbol, []).append(payload)

                    # update cache
                    _cache[symbol].update(quote)
            else:
                quotes = {symbol: [{key.lower(): val for key, val in quote.items()}]}

            if quotes:
                yield quotes

piker/fsp.py 100644

@@ -0,0 +1,156 @@
"""
Financial signal processing for the peeps.
"""
from typing import AsyncIterator, List

import numpy as np

from .log import get_logger
from . import data

log = get_logger(__name__)


def rec2array(
    rec: np.ndarray,
    fields: List[str] = None
) -> np.ndarray:
    """Convert record array to std array.

    Taken from:
    https://github.com/scikit-hep/root_numpy/blob/master/root_numpy/_utils.py#L20
    """
    simplify = False

    if fields is None:
        fields = rec.dtype.names
    elif isinstance(fields, str):
        fields = [fields]
        simplify = True

    # Creates a copy and casts all data to the same type
    arr = np.dstack([rec[field] for field in fields])

    # Check for array-type fields. If none, then remove outer dimension.
    # Only need to check first field since np.dstack will anyway raise an
    # exception if the shapes don't match
    # np.dstack will also fail if fields is an empty list
    if not rec.dtype[fields[0]].shape:
        arr = arr[0]

    if simplify:
        # remove last dimension (will be of size 1)
        arr = arr.reshape(arr.shape[:-1])

    return arr
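The core of ``rec2array`` is the ``np.dstack`` call; a standalone sketch of its effect on a record array (without the field-selection and simplify handling):

```python
import numpy as np

# a tiny record array standing in for an OHLC bars array
rec = np.array(
    [(1.0, 2.0), (3.0, 4.0)],
    dtype=[('open', 'f8'), ('close', 'f8')],
)

# ``np.dstack`` on the extracted fields gives shape (1, n, n_fields);
# dropping the outer dim leaves a plain (n, n_fields) std array
arr = np.dstack([rec[f] for f in ('open', 'close')])[0]
```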
async def pull_and_process(
    bars: np.ndarray,
    brokername: str,
    # symbols: List[str],
    symbol: str,
    fsp_func_name: str,
) -> AsyncIterator[dict]:

    # async def _yield_bars():
    #     yield bars

    # hist_out: np.ndarray = None

    func = latency

    # Conduct a single iteration of fsp with historical bars input
    # async for hist_out in func(_yield_bars(), bars):
    #     yield {symbol: hist_out}

    # open a data feed stream with requested broker
    async with data.open_feed(
        brokername,
        [symbol],
    ) as (fquote, stream):

        # TODO: load appropriate fsp with input args

        async def filter_by_sym(sym, stream):
            async for quotes in stream:
                for symbol, quotes in quotes.items():
                    if symbol == sym:
                        yield quotes

        async for processed in func(
            filter_by_sym(symbol, stream),
            bars,
        ):
            print(f"{fsp_func_name}: {processed}")
            yield processed


# TODO: things to figure the fuck out:
# - how to handle non-plottable values
# - composition of fsps / implicit chaining

async def latency(
    source: 'TickStream[Dict[str, float]]',
    ohlcv: np.ndarray
) -> AsyncIterator[np.ndarray]:
    """Compute broker to ``brokerd`` quote latency from the
    timestamps attached to each quote.
    """
    # TODO: do we want to offer yielding this async
    # before the rt data connection comes up?

    # deliver zeros for all prior history
    yield np.zeros(len(ohlcv))

    async for quote in source:
        ts = quote.get('broker_ts')
        if ts:
            print(
                f"broker time: {quote['broker_ts']}"
                f"brokerd time: {quote['brokerd_ts']}"
            )
            value = quote['brokerd_ts'] - quote['broker_ts']
            yield value


async def last(
    source: 'TickStream[Dict[str, float]]',
    ohlcv: np.ndarray
) -> AsyncIterator[np.ndarray]:
    """Stream the last quoted ("close") price.
    """
    # deliver historical processed data first
    yield ohlcv['close']

    async for quote in source:
        yield quote['close']
async def wma(
    source,  #: AsyncStream[np.ndarray],
    ohlcv: np.ndarray,  # price time-frame "aware"
    lookback: np.ndarray,  # price time-frame "aware"
    weights: np.ndarray,
) -> AsyncIterator[np.ndarray]:  # i like FinSigStream
    """Weighted moving average.

    ``weights`` is a sequence of already scaled values. As an example
    for the WMA often found in "technical analysis":
    ``weights = np.arange(1, N + 1) / (N * (N + 1) / 2)``.
    """
    length = len(weights)
    _lookback = np.zeros(length - 1)

    ohlcv.from_tf('5m')

    # async for frame_len, frame in source:
    async for frame in source:
        wma = np.convolve(
            ohlcv[-length:]['close'],
            # np.concatenate((_lookback, frame)),
            weights,
            'valid'
        )
        # TODO: handle case where frame_len < length - 1
        _lookback = frame[-(length-1):]
        yield wma
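As a standalone check of the docstring's weighting scheme (a sketch; the fsp above pulls ``close`` prices off the shared OHLC array instead of a plain vector):

```python
import numpy as np

N = 3
# scaled weights summing to 1; reversed here because ``np.convolve``
# applies the kernel back-to-front and we want the most recent bar
# weighted heaviest
weights = (np.arange(1, N + 1) / (N * (N + 1) / 2))[::-1]

closes = np.array([1.0, 2.0, 3.0, 4.0])
wma_vals = np.convolve(closes, weights, 'valid')
```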

@@ -1,15 +1,6 @@
"""
Stuff for your eyes, aka super hawt Qt UI components.

Currently we only support PyQt5 due to this issue in Pyside2:
https://bugreports.qt.io/projects/PYSIDE/issues/PYSIDE-1313
"""

piker/ui/_axes.py 100644

@@ -0,0 +1,255 @@
"""
Chart axes graphics and behavior.
"""
import time
from functools import partial
from typing import List

# import numpy as np
import pandas as pd
import pyqtgraph as pg
from PyQt5 import QtCore, QtGui
from PyQt5.QtCore import QPointF

from .quantdom.utils import fromtimestamp
from ._style import _font, hcolor


class PriceAxis(pg.AxisItem):

    def __init__(
        self,
    ) -> None:
        super().__init__(orientation='right')
        self.setStyle(**{
            'textFillLimits': [(0, 0.5)],
            # 'tickTextWidth': 10,
            # 'tickTextHeight': 25,
            # 'autoExpandTextSpace': True,
            # 'maxTickLength': -20,
            # 'stopAxisAtTick': (True, True),
        })
        self.setLabel(**{'font-size': '10pt'})
        self.setTickFont(_font)
        self.setWidth(125)

    # XXX: drop for now since it just eats up h space
    # def tickStrings(self, vals, scale, spacing):
    #     digts = max(0, np.ceil(-np.log10(spacing * scale)))
    #     return [
    #         ('{:<8,.%df}' % digts).format(v).replace(',', ' ') for v in vals
    #     ]


class DynamicDateAxis(pg.AxisItem):

    # time formats mapped by seconds between bars
    tick_tpl = {
        60*60*24: '%Y-%b-%d',
        60: '%H:%M',
        30: '%H:%M:%S',
        5: '%H:%M:%S',
    }

    def __init__(
        self,
        linked_charts,
        *args,
        **kwargs
    ) -> None:
        super().__init__(*args, **kwargs)
        self.linked_charts = linked_charts
        self.setTickFont(_font)

        # default styling
        self.setStyle(
            tickTextOffset=7,
            textFillLimits=[(0, 0.70)],
            # TODO: doesn't seem to work -> bug in pyqtgraph?
            # tickTextHeight=2,
        )
        # self.setHeight(35)

    def _indexes_to_timestrs(
        self,
        indexes: List[int],
    ) -> List[str]:
        bars = self.linked_charts.chart._array
        times = bars['time']
        bars_len = len(bars)
        delay = times[-1] - times[times != times[-1]][-1]

        epochs = times[list(
            map(int, filter(lambda i: i < bars_len, indexes))
        )]
        # TODO: **don't** have this hard coded shift to EST
        dts = pd.to_datetime(epochs, unit='s') - 4*pd.offsets.Hour()
        return dts.strftime(self.tick_tpl[delay])

    def tickStrings(self, values: List[float], scale, spacing):
        return self._indexes_to_timestrs(values)
class AxisLabel(pg.GraphicsObject):
# bg_color = pg.mkColor('#a9a9a9')
bg_color = pg.mkColor(hcolor('gray'))
fg_color = pg.mkColor(hcolor('black'))
def __init__(
self,
parent=None,
digits=2,
color=None,
opacity=1,
**kwargs
):
super().__init__(parent)
self.parent = parent
self.opacity = opacity
self.label_str = ''
self.digits = digits
# some weird color conversion logic?
if isinstance(color, QtGui.QPen):
self.bg_color = color.color()
self.fg_color = pg.mkColor(hcolor('black'))
elif isinstance(color, list):
self.bg_color = {'>0': color[0].color(), '<0': color[1].color()}
self.fg_color = pg.mkColor(hcolor('white'))
self.setFlag(self.ItemIgnoresTransformations)
def paint(self, p, option, widget):
p.setRenderHint(p.TextAntialiasing, True)
p.setPen(self.fg_color)
if self.label_str:
if not isinstance(self.bg_color, dict):
bg_color = self.bg_color
else:
# note: label strings may contain decimals so parse as float
if float(self.label_str.replace(' ', '')) > 0:
bg_color = self.bg_color['>0']
else:
bg_color = self.bg_color['<0']
p.setOpacity(self.opacity)
p.fillRect(option.rect, bg_color)
p.setOpacity(1)
p.setFont(_font)
p.drawText(option.rect, self.text_flags, self.label_str)
# uggggghhhh
def tick_to_string(self, tick_pos):
raise NotImplementedError()
def boundingRect(self): # noqa
raise NotImplementedError()
def update_label(self, evt_pos, point_view):
raise NotImplementedError()
# end uggggghhhh
# _common_text_flags = (
# QtCore.Qt.TextDontClip |
# QtCore.Qt.AlignCenter |
# QtCore.Qt.AlignTop |
# QtCore.Qt.AlignHCenter |
# QtCore.Qt.AlignVCenter
# )
class XAxisLabel(AxisLabel):
text_flags = (
QtCore.Qt.TextDontClip
| QtCore.Qt.AlignCenter
# | QtCore.Qt.AlignTop
| QtCore.Qt.AlignVCenter
# | QtCore.Qt.AlignHCenter
)
# text_flags = _common_text_flags
def boundingRect(self): # noqa
# TODO: we need to get the parent axe's dimensions transformed
# to abs coords to be 100% correct here:
# self.parent.boundingRect()
return QtCore.QRectF(0, 0, 100, 31)
def update_label(
self,
abs_pos: QPointF, # scene coords
data: float, # data for text
offset: int = 0 # if have margins, k?
) -> None:
self.label_str = self.parent._indexes_to_timestrs([int(data)])[0]
width = self.boundingRect().width()
new_pos = QPointF(abs_pos.x() - width / 2 - offset, 0)
self.setPos(new_pos)
class YAxisLabel(AxisLabel):
# text_flags = _common_text_flags
text_flags = (
QtCore.Qt.AlignLeft
| QtCore.Qt.TextDontClip
| QtCore.Qt.AlignVCenter
)
def tick_to_string(self, tick_pos):
# WTF IS THIS FORMAT?
return ('{: ,.%df}' % self.digits).format(tick_pos).replace(',', ' ')
def boundingRect(self): # noqa
return QtCore.QRectF(0, 0, 120, 30)
def update_label(
self,
abs_pos: QPointF, # scene coords
data: float, # data for text
offset: int = 0 # if have margins, k?
) -> None:
self.label_str = self.tick_to_string(data)
height = self.boundingRect().height()
new_pos = QPointF(0, abs_pos.y() - height / 2 - offset)
self.setPos(new_pos)
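The number format in ``tick_to_string`` above uses a space sign-slot, comma grouping, and fixed decimals, then swaps commas for spaces; a minimal standalone sketch:

```python
# sketch of YAxisLabel.tick_to_string's number formatting: a space
# sign-slot, comma thousands grouping, fixed decimals, then commas
# replaced with spaces for the on-axis look.
def tick_to_string(tick_pos: float, digits: int = 2) -> str:
    return ('{: ,.%df}' % digits).format(tick_pos).replace(',', ' ')
```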
class YSticky(YAxisLabel):
"""Y-axis label that sticks to where it's placed despite chart resizing.
"""
def __init__(
self,
chart,
*args,
**kwargs
) -> None:
super().__init__(*args, **kwargs)
self._chart = chart
# XXX: not sure why this wouldn't work with a proxy?
# pg.SignalProxy(
# delay=0,
# rateLimit=60,
# slot=last.update_on_resize,
# )
chart.sigRangeChanged.connect(self.update_on_resize)
def update_on_resize(self, vr, r):
# TODO: figure out how to generalize across data schema
self.update_from_data(*self._chart._array[-1][['index', 'close']])
def update_from_data(
self,
index: int,
last: float,
) -> None:
self.update_label(
self._chart.mapFromView(QPointF(index, last)),
last
)

piker/ui/_chart.py
@@ -0,0 +1,784 @@
"""
High level Qt chart widgets.
"""
from typing import Tuple, Dict, Any
import time
from PyQt5 import QtCore, QtGui
import numpy as np
import pyqtgraph as pg
import tractor
import trio
from ._axes import (
DynamicDateAxis,
PriceAxis,
)
from ._graphics import CrossHair, BarItems
from ._axes import YSticky
from ._style import _xaxis_at, _min_points_to_show, hcolor
from ._source import Symbol
from .. import brokers
from .. import data
from ..log import get_logger
from ._exec import run_qtractor
from ._source import ohlc_dtype
from ._interaction import ChartView
from .. import fsp
log = get_logger(__name__)
# margins
CHART_MARGINS = (0, 0, 5, 3)
class ChartSpace(QtGui.QWidget):
"""High level widget which contains layouts for organizing
lower level charts as well as other widgets used to control
or modify them.
"""
def __init__(self, parent=None):
super().__init__(parent)
self.v_layout = QtGui.QVBoxLayout(self)
self.v_layout.setContentsMargins(0, 0, 0, 0)
self.toolbar_layout = QtGui.QHBoxLayout()
self.toolbar_layout.setContentsMargins(5, 5, 10, 0)
self.h_layout = QtGui.QHBoxLayout()
# self.init_timeframes_ui()
# self.init_strategy_ui()
self.v_layout.addLayout(self.toolbar_layout)
self.v_layout.addLayout(self.h_layout)
self._chart_cache = {}
def init_timeframes_ui(self):
self.tf_layout = QtGui.QHBoxLayout()
self.tf_layout.setSpacing(0)
self.tf_layout.setContentsMargins(0, 12, 0, 0)
time_frames = ('1M', '5M', '15M', '30M', '1H', '1D', '1W', 'MN')
btn_prefix = 'TF'
for tf in time_frames:
btn_name = ''.join([btn_prefix, tf])
btn = QtGui.QPushButton(tf)
# TODO:
btn.setEnabled(False)
setattr(self, btn_name, btn)
self.tf_layout.addWidget(btn)
self.toolbar_layout.addLayout(self.tf_layout)
# XXX: strat loader/saver that we don't need yet.
# def init_strategy_ui(self):
# self.strategy_box = StrategyBoxWidget(self)
# self.toolbar_layout.addWidget(self.strategy_box)
def load_symbol(
self,
symbol: str,
data: np.ndarray,
) -> None:
"""Load a new contract into the charting app.
"""
# XXX: let's see if this causes mem problems
self.window.setWindowTitle(f'piker chart {symbol}')
linkedcharts = self._chart_cache.setdefault(
symbol,
LinkedSplitCharts()
)
s = Symbol(key=symbol)
# remove any existing plots
if not self.h_layout.isEmpty():
self.h_layout.removeWidget(linkedcharts)
main_chart = linkedcharts.plot_main(s, data)
self.h_layout.addWidget(linkedcharts)
return linkedcharts, main_chart
# TODO: add signalling painter system
# def add_signals(self):
# self.chart.add_signals()
class LinkedSplitCharts(QtGui.QWidget):
"""Widget that holds a central chart plus derived
subcharts computed from the original data set apart
by splitters for resizing.
A single internal reference to the data is maintained
for each chart and can be updated externally.
"""
long_pen = pg.mkPen('#006000')
long_brush = pg.mkBrush('#00ff00')
short_pen = pg.mkPen('#600000')
short_brush = pg.mkBrush('#ff0000')
zoomIsDisabled = QtCore.pyqtSignal(bool)
def __init__(self):
super().__init__()
self.signals_visible: bool = False
self._array: np.ndarray = None # main data source
self._ch: CrossHair = None # crosshair graphics
self.chart: ChartPlotWidget = None # main (ohlc) chart
self.subplots: Dict[Tuple[str, ...], ChartPlotWidget] = {}
self.xaxis = DynamicDateAxis(
orientation='bottom',
linked_charts=self
)
self.xaxis_ind = DynamicDateAxis(
orientation='bottom',
linked_charts=self
)
if _xaxis_at == 'bottom':
self.xaxis.setStyle(showValues=False)
else:
self.xaxis_ind.setStyle(showValues=False)
self.splitter = QtGui.QSplitter(QtCore.Qt.Vertical)
self.splitter.setHandleWidth(5)
self.layout = QtGui.QVBoxLayout(self)
self.layout.setContentsMargins(0, 0, 0, 0)
self.layout.addWidget(self.splitter)
def set_split_sizes(
self,
prop: float = 0.25 # proportion allocated to consumer subcharts
) -> None:
"""Set the proportion of space allocated for linked subcharts.
"""
major = 1 - prop
min_h_ind = int((self.height() * prop) / len(self.subplots))
sizes = [int(self.height() * major)]
sizes.extend([min_h_ind] * len(self.subplots))
self.splitter.setSizes(sizes) # , int(self.height()*0.2)
def plot_main(
self,
symbol: Symbol,
array: np.ndarray,
ohlc: bool = True,
) -> 'ChartPlotWidget':
"""Start up and show main (price) chart and all linked subcharts.
"""
self.digits = symbol.digits()
# TODO: this should eventually be a view onto shared mem or some
# higher level type / API
self._array = array
# add crosshairs
self._ch = CrossHair(
linkedsplitcharts=self,
digits=self.digits
)
self.chart = self.add_plot(
name=symbol.key,
array=array,
xaxis=self.xaxis,
ohlc=True,
_is_main=True,
)
# add crosshair graphic
self.chart.addItem(self._ch)
# style?
self.chart.setFrameStyle(QtGui.QFrame.StyledPanel | QtGui.QFrame.Plain)
return self.chart
def add_plot(
self,
name: str,
array: np.ndarray,
xaxis: DynamicDateAxis = None,
ohlc: bool = False,
_is_main: bool = False,
) -> 'ChartPlotWidget':
"""Add (sub)plots to chart widget by name.
If ``name`` == ``"main"`` the chart will be the primary view.
"""
if self.chart is None and not _is_main:
raise RuntimeError(
"A main plot must be created first with `.plot_main()`")
# source of our custom interactions
cv = ChartView()
cv.linked_charts = self
# use "indicator axis" by default
xaxis = self.xaxis_ind if xaxis is None else xaxis
cpw = ChartPlotWidget(
array=array,
parent=self.splitter,
axisItems={'bottom': xaxis, 'right': PriceAxis()},
viewBox=cv,
)
# this name will be used to register the primary
# graphics curve managed by the subchart
cpw.name = name
cpw.plotItem.vb.linked_charts = self
cpw.setFrameStyle(QtGui.QFrame.StyledPanel | QtGui.QFrame.Plain)
cpw.getPlotItem().setContentsMargins(*CHART_MARGINS)
# link chart x-axis to main quotes chart
cpw.setXLink(self.chart)
# draw curve graphics
if ohlc:
cpw.draw_ohlc(name, array)
else:
cpw.draw_curve(name, array)
# add to cross-hair's known plots
self._ch.add_plot(cpw)
if not _is_main:
# track by name
self.subplots[name] = cpw
# scale split regions
self.set_split_sizes()
# XXX: we need this right?
# self.splitter.addWidget(cpw)
return cpw
class ChartPlotWidget(pg.PlotWidget):
"""``GraphicsView`` subtype containing a single ``PlotItem``.
- The added methods allow for plotting OHLC sequences from
``np.ndarray``s with appropriate field names.
- Overrides a ``pyqtgraph.PlotWidget`` (a ``GraphicsView`` containing
a single ``PlotItem``) to intercept and re-emit mouse enter/exit
events.
(Could be replaced with a ``pg.GraphicsLayoutWidget`` if we
eventually want multiple plots managed together?)
"""
sig_mouse_leave = QtCore.Signal(object)
sig_mouse_enter = QtCore.Signal(object)
# TODO: can take a ``background`` color setting - maybe there's
# a better one?
def __init__(
self,
# the data view we generate graphics from
array: np.ndarray,
**kwargs,
):
"""Configure chart display settings.
"""
super().__init__(
background=hcolor('papas_special'),
# parent=None,
# plotItem=None,
**kwargs
)
self._array = array # readonly view of data
self._graphics = {} # registry of underlying graphics
self._labels = {} # registry of underlying graphics
self._ysticks = {} # registry of underlying graphics
# show only right side axes
self.hideAxis('left')
self.showAxis('right')
# show background grid
self.showGrid(x=True, y=True, alpha=0.4)
self.plotItem.vb.setXRange(0, 0)
# use cross-hair for cursor
self.setCursor(QtCore.Qt.CrossCursor)
# assign callback for rescaling y-axis automatically
# based on ohlc contents
self.sigXRangeChanged.connect(self._set_yrange)
def _update_contents_label(self, index: int) -> None:
if index > 0 and index < len(self._array):
for name, (label, update) in self._labels.items():
update(index)
def _set_xlimits(
self,
xfirst: int,
xlast: int
) -> None:
"""Set view limits (what's shown in the main chart "pane")
based on max/min x/y coords.
"""
self.setLimits(
xMin=xfirst,
xMax=xlast,
minXRange=_min_points_to_show,
)
def view_range(self) -> Tuple[int, int]:
vr = self.viewRect()
return int(vr.left()), int(vr.right())
def bars_range(self) -> Tuple[int, int, int, int]:
"""Return a range tuple for the bars present in view.
"""
l, r = self.view_range()
lbar = max(l, 0)
rbar = min(r, len(self._array))
return l, lbar, rbar, r
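The clamping in ``bars_range()`` can be sketched standalone: the view may scroll "off" the data on either side, so the bar indices are clamped to the valid array range while the raw view bounds are kept.

```python
# sketch of the view-range clamping done by bars_range(): the raw
# view bounds (l, r) pass through while the bar indices are clamped
# to the data set's valid index range.
def bars_range(l: int, r: int, array_len: int):
    lbar = max(l, 0)
    rbar = min(r, array_len)
    return l, lbar, rbar, r
```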
def draw_ohlc(
self,
name: str,
data: np.ndarray,
# XXX: pretty sure this is dumb and we don't need an Enum
style: pg.GraphicsObject = BarItems,
) -> pg.GraphicsObject:
"""Draw OHLC datums to chart.
"""
# remember it's an enum type..
graphics = style()
# adds all bar/candle graphics objects for each data point in
# the np array buffer to be drawn on next render cycle
graphics.draw_from_data(data)
self.addItem(graphics)
self._graphics[name] = graphics
# XXX: How to stack labels vertically?
# Ogi says: "
label = pg.LabelItem(
justify='left',
size='5pt',
)
self.scene().addItem(label)
def update(index: int) -> None:
label.setText(
"{name} O:{} H:{} L:{} C:{} V:{}".format(
*self._array[index].item()[2:],
name=name,
)
)
self._labels[name] = (label, update)
self._update_contents_label(index=-1)
# set xrange limits
xlast = data[-1]['index']
# show last 50 points on startup
self.plotItem.vb.setXRange(xlast - 50, xlast + 50)
self._add_sticky(name)
return graphics
def draw_curve(
self,
name: str,
data: np.ndarray,
) -> pg.PlotDataItem:
# draw the indicator as a plain curve
curve = pg.PlotDataItem(
data,
antialias=True,
# TODO: see how this handles with custom ohlcv bars graphics
clipToView=True,
)
self.addItem(curve)
# register overlay curve with name
if not self._graphics and name is None:
name = 'a_line_bby'
self._graphics[name] = curve
# XXX: How to stack labels vertically?
label = pg.LabelItem(
justify='left',
size='5pt',
)
self.scene().addItem(label)
def update(index: int) -> None:
data = self._array[index]
label.setText(f"{name}: {index} {data}")
self._labels[name] = (label, update)
self._update_contents_label(index=-1)
# set a "startup view"
xlast = len(data) - 1
# show last 50 points on startup
self.plotItem.vb.setXRange(xlast - 50, xlast + 50)
# TODO: we should instead implement a diff based
# "only update with new items" on the pg.PlotDataItem
curve.update_from_array = curve.setData
return curve
def _add_sticky(
self,
name: str,
# retrieve: Callable[None, np.ndarray],
) -> YSticky:
# add y-axis "last" value label
last = self._ysticks['last'] = YSticky(
chart=self,
parent=self.getAxis('right'),
# digits=0,
opacity=1,
color=pg.mkPen(hcolor('gray'))
)
return last
def update_from_array(
self,
name: str,
array: np.ndarray,
**kwargs,
) -> pg.GraphicsObject:
graphics = self._graphics[name]
graphics.update_from_array(array, **kwargs)
return graphics
def _set_yrange(
self,
) -> None:
"""Set the viewable y-range based on embedded data.
This adds auto-scaling like zoom on the scroll wheel such
that data always fits nicely inside the current view of the
data set.
"""
l, lbar, rbar, r = self.bars_range()
# figure out x-range in view such that user can scroll "off" the data
# set up to the point where ``_min_points_to_show`` are left.
# if l < lbar or r > rbar:
bars_len = rbar - lbar
view_len = r - l
# TODO: logic to check if end of bars in view
extra = view_len - _min_points_to_show
begin = 0 - extra
end = len(self._array) - 1 + extra
log.trace(
f"\nl: {l}, lbar: {lbar}, rbar: {rbar}, r: {r}\n"
f"view_len: {view_len}, bars_len: {bars_len}\n"
f"begin: {begin}, end: {end}, extra: {extra}"
)
self._set_xlimits(begin, end)
# TODO: this should be some kind of numpy view api
bars = self._array[lbar:rbar]
if not len(bars):
# likely no data loaded yet
log.error(f"WTF bars_range = {lbar}:{rbar}")
return
elif lbar < 0:
breakpoint()
# TODO: should probably just have some kinda attr mark
# that determines this behavior based on array type
try:
ylow = bars['low'].min()
yhigh = bars['high'].max()
std = np.std(bars['close'])
except IndexError:
# must be non-ohlc array?
ylow = bars.min()
yhigh = bars.max()
std = np.std(bars)
# view margins: stay within 10% of the "true range"
diff = yhigh - ylow
ylow = ylow - (diff * 0.1)
yhigh = yhigh + (diff * 0.1)
chart = self
chart.setLimits(
yMin=ylow,
yMax=yhigh,
minYRange=std
)
chart.setYRange(ylow, yhigh)
def enterEvent(self, ev): # noqa
# pg.PlotWidget.enterEvent(self, ev)
self.sig_mouse_enter.emit(self)
def leaveEvent(self, ev): # noqa
# pg.PlotWidget.leaveEvent(self, ev)
self.sig_mouse_leave.emit(self)
self.scene().leaveEvent(ev)
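The 10% "view margin" math in ``_set_yrange()`` above can be shown in isolation, assuming plain arrays of the lows/highs currently in view:

```python
import numpy as np

# sketch of _set_yrange()'s margin calc: pad the true y-range by
# 10% on each side so bars never touch the view edges.
lows = np.array([9.0, 9.5, 9.2])
highs = np.array([11.0, 10.8, 11.5])

ylow = lows.min()
yhigh = highs.max()

# view margins: stay within 10% of the "true range"
diff = yhigh - ylow
ylow = ylow - (diff * 0.1)
yhigh = yhigh + (diff * 0.1)
```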
async def add_new_bars(delay_s, linked_charts):
"""Task which inserts new bars into the ohlc every ``delay_s`` seconds.
"""
# TODO: right now we'll spin printing bars if the last time
# stamp is before a large period of no market activity.
# Likely the best way to solve this is to make this task
# aware of the instrument's tradable hours?
# adjust delay to compensate for trio processing time
ad = delay_s - 0.002
price_chart = linked_charts.chart
ohlc = price_chart._array
async def sleep():
"""Sleep until next time frames worth has passed from last bar.
"""
last_ts = ohlc[-1]['time']
delay = max((last_ts + ad) - time.time(), 0)
await trio.sleep(delay)
# sleep for duration of current bar
await sleep()
while True:
# TODO: bunch of stuff:
# - I'm starting to think all this logic should be
# done in one place and "graphics update routines"
# should not be doing any length checking and array diffing.
# - don't keep appending, but instead increase the
# underlying array's size less frequently
# - handle odd lot orders
# - update last open price correctly instead
# of copying it from last bar's close
# - 5 sec bar lookback-autocorrection like tws does?
def incr_ohlc_array(array: np.ndarray):
(index, t, close) = array[-1][['index', 'time', 'close']]
new_array = np.append(
array,
np.array(
[(index + 1, t + delay_s, close, close,
close, close, 0)],
dtype=array.dtype
),
)
return new_array
# add new increment/bar
ohlc = price_chart._array = incr_ohlc_array(ohlc)
# TODO: generalize this increment logic
for name, chart in linked_charts.subplots.items():
data = chart._array
chart._array = np.append(
data,
np.array(data[-1], dtype=data.dtype)
)
# read value at "open" of bar
# last_quote = ohlc[-1]
# XXX: If the last bar has not changed print a flat line and
# move to the next. This is a "animation" choice that we may not
# keep.
# if last_quote == ohlc[-1]:
# log.debug("Printing flat line for {sym}")
# update chart graphics and resize view
price_chart.update_from_array(price_chart.name, ohlc)
price_chart._set_yrange()
for name, chart in linked_charts.subplots.items():
chart.update_from_array(chart.name, chart._array)
chart._set_yrange()
# We **don't** update the bar right now
# since the next quote that arrives should be
# handled in the tick streaming task
await sleep()
# TODO: should we update the graphics again here?
# Think about race conditions with data update task.
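The flat-bar append done by ``incr_ohlc_array`` above can be sketched standalone; the ohlc dtype below is an assumed stand-in for the real ``ohlc_dtype`` from ``._source``:

```python
import numpy as np

# assumed minimal stand-in for piker's ohlc_dtype
ohlc_dtype = np.dtype([
    ('index', int), ('time', float),
    ('open', float), ('high', float), ('low', float),
    ('close', float), ('volume', int),
])

delay_s = 60
array = np.array(
    [(0, 0.0, 10.0, 11.0, 9.0, 10.5, 100)],
    dtype=ohlc_dtype,
)

# append a new "flat" bar: open/high/low/close all copy the
# previous bar's close, volume starts at zero
(index, t, close) = array[-1][['index', 'time', 'close']]
new_array = np.append(
    array,
    np.array(
        [(index + 1, t + delay_s, close, close, close, close, 0)],
        dtype=array.dtype,
    ),
)
```

Note ``np.append`` copies the whole array each call, which is why the TODO above suggests growing the underlying buffer less frequently.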
async def _async_main(
sym: str,
brokername: str,
# implicit required argument provided by ``qtractor_run()``
widgets: Dict[str, Any],
# all kwargs are passed through from the CLI entrypoint
loglevel: str = None,
) -> None:
"""Main Qt-trio routine invoked by the Qt loop with
the widgets ``dict``.
"""
chart_app = widgets['main']
# historical data fetch
brokermod = brokers.get_brokermod(brokername)
async with brokermod.get_client() as client:
# figure out the exact symbol
bars = await client.bars(symbol=sym)
# remember, msgpack-numpy's ``from_buffer`` returns a read-only array
bars = np.array(bars[list(ohlc_dtype.names)])
# load in symbol's ohlc data
linked_charts, chart = chart_app.load_symbol(sym, bars)
# determine ohlc delay between bars
times = bars['time']
# find expected time step between datums
delay = times[-1] - times[times != times[-1]][-1]
async with trio.open_nursery() as n:
# load initial fsp chain (otherwise known as "indicators")
n.start_soon(
chart_from_fsp,
linked_charts,
fsp.latency,
sym,
bars,
brokermod,
loglevel,
)
# update last price sticky
last = chart._ysticks['last']
last.update_from_data(*chart._array[-1][['index', 'close']])
# graphics update loop
async with data.open_feed(
brokername,
[sym],
loglevel=loglevel,
) as (fquote, stream):
# wait for a first quote before we start any update tasks
quote = await stream.__anext__()
log.info(f'RECEIVED FIRST QUOTE {quote}')
# start graphics tasks after receiving first live quote
n.start_soon(add_new_bars, delay, linked_charts)
async for quotes in stream:
for sym, quote in quotes.items():
ticks = quote.get('ticks', ())
for tick in ticks:
if tick.get('type') == 'trade':
# TODO: eventually we'll want to update
# bid/ask labels and other data as
# subscribed by underlying UI consumers.
# last = quote.get('last') or quote['close']
last = tick['price']
# update ohlc (I guess we're enforcing this
# for now?) overwrite from quote
high, low = chart._array[-1][['high', 'low']]
chart._array[['high', 'low', 'close']][-1] = (
max(high, last),
min(low, last),
last,
)
chart.update_from_array(
chart.name,
chart._array,
)
# update sticky(s)
last = chart._ysticks['last']
last.update_from_data(
*chart._array[-1][['index', 'close']])
chart._set_yrange()
async def chart_from_fsp(
linked_charts,
fsp_func,
sym,
bars,
brokermod,
loglevel,
) -> None:
"""Start financial signal processing in subactor.
Pass target entrypoint and historical data.
"""
func_name = fsp_func.__name__
async with tractor.open_nursery() as n:
portal = await n.run_in_actor(
f'fsp.{func_name}', # name as title of sub-chart
# subactor entrypoint
fsp.pull_and_process,
bars=bars,
brokername=brokermod.name,
symbol=sym,
fsp_func_name=func_name,
# tractor config
loglevel=loglevel,
)
stream = await portal.result()
# receive processed historical data-array as first message
history: np.ndarray = (await stream.__anext__())
# TODO: enforce type checking here
newbars = np.array(history)
chart = linked_charts.add_plot(
name=func_name,
array=newbars,
)
# check for data length misalignment and fill missing values
diff = len(chart._array) - len(linked_charts.chart._array)
if diff < 0:
data = chart._array
chart._array = np.append(
data,
np.full(abs(diff), data[-1], dtype=data.dtype)
)
# update chart graphics
async for value in stream:
chart._array[-1] = value
chart.update_from_array(chart.name, chart._array)
chart._set_yrange()
def _main(
sym: str,
brokername: str,
tractor_kwargs,
) -> None:
"""Sync entry point to start a chart app.
"""
# Qt entry point
run_qtractor(
func=_async_main,
args=(sym, brokername),
main_widget=ChartSpace,
tractor_kwargs=tractor_kwargs,
)

piker/ui/_exec.py
@@ -0,0 +1,117 @@
"""
Trio - Qt integration
Run ``trio`` in guest mode on top of the Qt event loop.
All global Qt runtime settings are mostly defined here.
"""
from functools import partial
import traceback
from typing import Tuple, Callable, Dict, Any
import PyQt5 # noqa
from pyqtgraph import QtGui
from PyQt5 import QtCore
from PyQt5.QtCore import pyqtRemoveInputHook
import qdarkstyle
import trio
import tractor
from outcome import Error
class MainWindow(QtGui.QMainWindow):
size = (800, 500)
title = 'piker chart (bby)'
def __init__(self, parent=None):
super().__init__(parent)
# self.setMinimumSize(*self.size)
self.setWindowTitle(self.title)
def run_qtractor(
func: Callable,
args: Tuple,
main_widget: QtGui.QWidget,
tractor_kwargs: Dict[str, Any] = {},
window_type: QtGui.QMainWindow = MainWindow,
) -> None:
# avoids annoying message when entering debugger from qt loop
pyqtRemoveInputHook()
app = QtGui.QApplication.instance()
if app is None:
app = PyQt5.QtWidgets.QApplication([])
# TODO: we might not need this if it's desired
# to cancel the tractor machinery on Qt loop
# close, however the details of doing that correctly
# currently seem tricky..
app.setQuitOnLastWindowClosed(False)
# This code is from Nathaniel, and I quote:
# "This is substantially faster than using a signal... for some
# reason Qt signal dispatch is really slow (and relies on events
# underneath anyway, so this is strictly less work)."
REENTER_EVENT = QtCore.QEvent.Type(QtCore.QEvent.registerEventType())
class ReenterEvent(QtCore.QEvent):
pass
class Reenter(QtCore.QObject):
def event(self, event):
event.fn()
return False
reenter = Reenter()
def run_sync_soon_threadsafe(fn):
event = ReenterEvent(REENTER_EVENT)
event.fn = fn
app.postEvent(reenter, event)
def done_callback(outcome):
print(f"Outcome: {outcome}")
if isinstance(outcome, Error):
exc = outcome.error
traceback.print_exception(type(exc), exc, exc.__traceback__)
app.quit()
# load dark theme
app.setStyleSheet(qdarkstyle.load_stylesheet(qt_api='pyqt5'))
# make window and exec
window = window_type()
instance = main_widget()
instance.window = window
widgets = {
'window': window,
'main': instance,
}
# setup tractor entry point args
main = partial(
tractor._main,
async_fn=func,
args=args + (widgets,),
arbiter_addr=(
tractor._default_arbiter_host,
tractor._default_arbiter_port,
),
name='qtractor',
**tractor_kwargs,
)
# guest mode
trio.lowlevel.start_guest_run(
main,
run_sync_soon_threadsafe=run_sync_soon_threadsafe,
done_callback=done_callback,
)
window.main_widget = main_widget
window.setCentralWidget(instance)
# actually render to screen
window.show()
app.exec_()

@@ -0,0 +1,410 @@
"""
Chart graphics for displaying a slew of different data types.
"""
from typing import List
from itertools import chain
import numpy as np
import pyqtgraph as pg
from PyQt5 import QtCore, QtGui
from PyQt5.QtCore import QLineF
from .quantdom.utils import timeit
from ._style import _xaxis_at, hcolor
from ._axes import YAxisLabel, XAxisLabel
# TODO:
# - checkout pyqtgraph.PlotCurveItem.setCompositionMode
_mouse_rate_limit = 30
_debounce_delay = 10
_ch_label_opac = 1
class CrossHair(pg.GraphicsObject):
def __init__(
self,
linkedsplitcharts: 'LinkedSplitCharts', # noqa
digits: int = 0
) -> None:
super().__init__()
# XXX: not sure why these are instance variables?
# It's not like we can change them on the fly..?
self.pen = pg.mkPen(
color=hcolor('default'),
style=QtCore.Qt.DashLine,
)
self.lines_pen = pg.mkPen(
color='#a9a9a9', # gray?
style=QtCore.Qt.DashLine,
)
self.lsc = linkedsplitcharts
self.graphics = {}
self.plots = []
self.active_plot = None
self.digits = digits
def add_plot(
self,
plot: 'ChartPlotWidget', # noqa
digits: int = 0,
) -> None:
# add ``pg.graphicsItems.InfiniteLine``s
# vertical and horizontal lines and a y-axis label
vl = plot.addLine(x=0, pen=self.lines_pen, movable=False)
hl = plot.addLine(y=0, pen=self.lines_pen, movable=False)
yl = YAxisLabel(
parent=plot.getAxis('right'),
digits=digits or self.digits,
opacity=_ch_label_opac,
color=self.pen,
)
# TODO: checkout what ``.sigDelayed`` can be used for
# (emitted once a sufficient delay occurs in mouse movement)
px_moved = pg.SignalProxy(
plot.scene().sigMouseMoved,
rateLimit=_mouse_rate_limit,
slot=self.mouseMoved,
delay=_debounce_delay,
)
px_enter = pg.SignalProxy(
plot.sig_mouse_enter,
rateLimit=_mouse_rate_limit,
slot=lambda: self.mouseAction('Enter', plot),
delay=_debounce_delay,
)
px_leave = pg.SignalProxy(
plot.sig_mouse_leave,
rateLimit=_mouse_rate_limit,
slot=lambda: self.mouseAction('Leave', plot),
delay=_debounce_delay,
)
self.graphics[plot] = {
'vl': vl,
'hl': hl,
'yl': yl,
'px': (px_moved, px_enter, px_leave),
}
self.plots.append(plot)
# Determine where to place x-axis label.
# Place below the last plot by default, otherwise
# keep the x-axis right below the main chart
plot_index = -1 if _xaxis_at == 'bottom' else 0
self.xaxis_label = XAxisLabel(
parent=self.plots[plot_index].getAxis('bottom'),
opacity=_ch_label_opac,
color=self.pen,
)
def mouseAction(self, action, plot): # noqa
if action == 'Enter':
# show horiz line and y-label
self.graphics[plot]['hl'].show()
self.graphics[plot]['yl'].show()
self.active_plot = plot
else: # Leave
# hide horiz line and y-label
self.graphics[plot]['hl'].hide()
self.graphics[plot]['yl'].hide()
self.active_plot = None
def mouseMoved(
self,
evt: 'Tuple[QMouseEvent]', # noqa
) -> None: # noqa
"""Update horizonal and vertical lines when mouse moves inside
either the main chart or any indicator subplot.
"""
pos = evt[0]
# find position inside active plot
try:
# map to view coordinate system
mouse_point = self.active_plot.mapToView(pos)
except AttributeError:
# mouse was not on active plot
return
x, y = mouse_point.x(), mouse_point.y()
plot = self.active_plot
self.graphics[plot]['hl'].setY(y)
self.graphics[self.active_plot]['yl'].update_label(
abs_pos=pos, data=y
)
for plot, opts in self.graphics.items():
# move the vertical line to the current x
opts['vl'].setX(x)
# update the chart's "contents" label
plot._update_contents_label(int(x))
# update the label on the bottom of the crosshair
self.xaxis_label.update_label(
abs_pos=pos,
data=x
)
def boundingRect(self):
try:
return self.active_plot.boundingRect()
except AttributeError:
return self.plots[0].boundingRect()
def _mk_lines_array(data: List, size: int) -> np.ndarray:
"""Create an ndarray to hold lines graphics objects.
"""
# TODO: might want to just make this a 2d array to be faster at
# flattening using .ravel(): https://stackoverflow.com/a/60089929
return np.zeros_like(
data,
shape=(int(size),),
dtype=[
('index', int),
('body', object),
('rarm', object),
('larm', object)
],
)
def bars_from_ohlc(
data: np.ndarray,
w: float,
start: int = 0,
) -> np.ndarray:
"""Generate an array of lines objects from input ohlc data.
"""
lines = _mk_lines_array(data, data.shape[0])
for i, q in enumerate(data[start:], start=start):
open, high, low, close, index = q[
['open', 'high', 'low', 'close', 'index']]
# place the x-coord start as "middle" of the drawing range such
# that the open arm line-graphic is at the left-most-side of
# the index's range according to the view mapping.
index_start = index + w
# high - low line
if low != high:
# hl = QLineF(index, low, index, high)
hl = QLineF(index_start, low, index_start, high)
else:
# XXX: if we don't do this it renders a weird rectangle?
# see below too for handling this later...
hl = QLineF(low, low, low, low)
hl._flat = True
# open line
o = QLineF(index_start - w, open, index_start, open)
# close line
c = QLineF(index_start + w, close, index_start, close)
# indexing here is as per the below comments
# lines[3*i:3*i+3] = (hl, o, c)
lines[i] = (index, hl, o, c)
# if not _tina_mode: # piker mode
# else _tina_mode:
# self.lines = lines = np.concatenate(
# [high_to_low, open_sticks, close_sticks])
# use traditional up/down green/red coloring
# long_bars = np.resize(Quotes.close > Quotes.open, len(lines))
# short_bars = np.resize(
# Quotes.close < Quotes.open, len(lines))
# ups = lines[long_bars]
# downs = lines[short_bars]
# # draw "up" bars
# p.setPen(self.bull_brush)
# p.drawLines(*ups)
# # draw "down" bars
# p.setPen(self.bear_brush)
# p.drawLines(*downs)
return lines
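The arm geometry above can be checked without Qt using plain coordinate pairs: with the high-low line drawn at ``x = index + w``, the open arm extends left back to the index and the close arm extends right to ``index + 2*w``. A hypothetical tuple-based sketch:

```python
# Qt-free sketch of bars_from_ohlc()'s arm geometry using
# ((x1, y1), (x2, y2)) pairs in place of QLineF.
w = 0.43
index, open_, close = 10, 100.0, 101.0

# hl line sits in the "middle" of the index's drawing range
x = index + w

# open arm: from index_start - w back to the hl line
open_arm = ((x - w, open_), (x, open_))
# close arm: from index_start + w back to the hl line
close_arm = ((x + w, close), (x, close))
```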
class BarItems(pg.GraphicsObject):
"""Price range bars graphics rendered from a OHLC sequence.
"""
sigPlotChanged = QtCore.Signal(object)
# 0.5 is no overlap between arms, 1.0 is full overlap
w: float = 0.43
bull_pen = pg.mkPen(hcolor('gray'))
# XXX: tina mode, see below
# bull_brush = pg.mkPen('#00cc00')
# bear_brush = pg.mkPen('#fa0000')
def __init__(self):
super().__init__()
self.picture = QtGui.QPicture()
# XXX: not sure this actually needs to be an array other
# than for the old tina mode calcs for up/down bars below?
# lines container
self.lines = _mk_lines_array([], 50e3)
# track the current length of drawable lines within the larger array
self.index: int = 0
def last_value(self) -> QLineF:
return self.lines[self.index - 1]['rarm']
@timeit
def draw_from_data(
self,
data: np.recarray,
start: int = 0,
):
"""Draw OHLC datum graphics from a ``np.recarray``.
"""
lines = bars_from_ohlc(data, self.w, start=start)
# save graphics for later reference and keep track
# of current internal "last index"
index = len(lines)
self.lines[:index] = lines
self.index = index
self.draw_lines()
def draw_lines(self):
"""Draw the current line set using the painter.
"""
to_draw = self.lines[
['body', 'larm', 'rarm']][:self.index]
# pre-computing a QPicture object allows paint() to run much
# more quickly, rather than re-drawing the shapes every time.
p = QtGui.QPainter(self.picture)
p.setPen(self.bull_pen)
# TODO: might be better to use 2d array?
# try our fsp.rec2array() and a np.ravel() for speedup
# otherwise we might just have to go 2d ndarray of objects.
# see conclusion on speed here: https://stackoverflow.com/a/60089929
p.drawLines(*chain.from_iterable(to_draw))
p.end()
def update_from_array(
self,
array: np.ndarray,
) -> None:
"""Update the last datum's bar graphic from input data array.
This routine should be interface compatible with
``pg.PlotCurveItem.setData()``. Normally this method in
``pyqtgraph`` seems to update all the data passed to the
graphics object, and then update/rerender, but here we're
assuming the prior graphics haven't changed (OHLC history rarely
does) so this "should" be simpler and faster.
"""
index = self.index
length = len(array)
extra = length - index
if extra > 0:
# generate new graphics to match provided array
new = array[index:index + extra]
lines = bars_from_ohlc(new, self.w)
bars_added = len(lines)
self.lines[index:index + bars_added] = lines
self.index += bars_added
# else: # current bar update
# do we really need to verify the entire past data set?
# index, time, open, high, low, close, volume
i, time, open, _, _, close, _ = array[-1]
last = close
i, body, larm, rarm = self.lines[index-1]
if not rarm:
breakpoint()
# XXX: is there a faster way to modify this?
# update right arm
rarm.setLine(rarm.x1(), last, rarm.x2(), last)
# update body
high = body.y2()
low = body.y1()
if last < low:
low = last
if last > high:
high = last
if getattr(body, '_flat', None) and low != high:
# if the bar was flat it likely does not have
# the index set correctly due to a rendering bug
# see above
body.setLine(i + self.w, low, i + self.w, high)
body._flat = False
else:
body.setLine(body.x1(), low, body.x2(), high)
# draw the pic
self.draw_lines()
# trigger re-render
self.update()
# be compat with ``pg.PlotCurveItem``
setData = update_from_array
# XXX: From the customGraphicsItem.py example:
# The only required methods are paint() and boundingRect()
def paint(self, p, opt, widget):
p.drawPicture(0, 0, self.picture)
def boundingRect(self):
# boundingRect _must_ indicate the entire area that will be
# drawn on or else we will get artifacts and possibly crashing.
# (in this case, QPicture does all the work of computing the
# bounding rect for us)
return QtCore.QRectF(self.picture.boundingRect())
# XXX: when we get back to enabling tina mode for xb
# class CandlestickItems(BarItems):
# w2 = 0.7
# line_pen = pg.mkPen('#000000')
# bull_brush = pg.mkBrush('#00ff00')
# bear_brush = pg.mkBrush('#ff0000')
# def _generate(self, p):
# rects = np.array(
# [
# QtCore.QRectF(
# q.id - self.w,
# q.open,
# self.w2,
# q.close - q.open
# )
# for q in Quotes
# ]
# )
# p.setPen(self.line_pen)
# p.drawLines(
# [QtCore.QLineF(q.id, q.low, q.id, q.high)
# for q in Quotes]
# )
# p.setBrush(self.bull_brush)
# p.drawRects(*rects[Quotes.close > Quotes.open])
# p.setBrush(self.bear_brush)
# p.drawRects(*rects[Quotes.close < Quotes.open])


@ -0,0 +1,77 @@
"""
UX interaction customs.
"""
import pyqtgraph as pg
from pyqtgraph import functions as fn
from ..log import get_logger
from ._style import _min_points_to_show
log = get_logger(__name__)
class ChartView(pg.ViewBox):
"""Price chart view box with interaction behaviors you'd expect from
any interactive platform:
- zoom on mouse scroll that auto fits y-axis
- no vertical scrolling
- zoom to a "fixed point" on the y-axis
"""
def __init__(
self,
parent=None,
**kwargs,
):
super().__init__(parent=parent, **kwargs)
# disable vertical scrolling
self.setMouseEnabled(x=True, y=False)
self.linked_charts = None
def wheelEvent(self, ev, axis=None):
"""Override "center-point" location for scrolling.
This is an override of the ``ViewBox`` method simply changing
the center of the zoom to be the y-axis.
TODO: PR a method into ``pyqtgraph`` to make this configurable
"""
if axis in (0, 1):
mask = [False, False]
mask[axis] = self.state['mouseEnabled'][axis]
else:
mask = self.state['mouseEnabled'][:]
# don't zoom more than the min points setting
l, lbar, rbar, r = self.linked_charts.chart.bars_range()
vl = r - l
if ev.delta() > 0 and vl <= _min_points_to_show:
log.trace("Max zoom bruh...")
return
if ev.delta() < 0 and vl >= len(self.linked_charts._array):
log.trace("Min zoom bruh...")
return
# actual scaling factor
s = 1.015 ** (ev.delta() * -1 / 20) # self.state['wheelScaleFactor'])
s = [(None if m is False else s) for m in mask]
# center = pg.Point(
# fn.invertQTransform(self.childGroup.transform()).map(ev.pos())
# )
# XXX: scroll "around" the right most element in the view
furthest_right_coord = self.boundingRect().topRight()
center = pg.Point(
fn.invertQTransform(
self.childGroup.transform()
).map(furthest_right_coord)
)
self._resetTarget()
self.scaleBy(s, center)
ev.accept()
self.sigRangeChangedManually.emit(mask)
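The scaling above is pure arithmetic and can be checked in isolation; a minimal sketch (the `1.015` base and `/ 20` divisor come from the snippet above, `wheel_zoom_factor` is an illustrative name, and `delta=120` is the conventional one-notch wheel delta):

```python
def wheel_zoom_factor(delta: float, base: float = 1.015) -> float:
    # exponential zoom: one notch out (delta == 120) shrinks the
    # view scale by roughly 8.5%, one notch in grows it by the
    # exact reciprocal, so a notch out followed by a notch in
    # returns the view to its original range
    return base ** (delta * -1 / 20)

zoom_out = wheel_zoom_factor(120)
zoom_in = wheel_zoom_factor(-120)
```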


@ -0,0 +1,98 @@
"""
Signalling graphics and APIs.
WARNING: this code likely doesn't work at all (yet)
since it was copied from another class that shouldn't
have had it.
"""
import numpy as np
import pyqtgraph as pg
from PyQt5 import QtCore, QtGui
from .quantdom.charts import CenteredTextItem
from .quantdom.base import Quotes
from .quantdom.portfolio import Order, Portfolio
class SignallingApi(object):
def __init__(self, plotgroup):
self.plotgroup = plotgroup
self.chart = plotgroup.chart
def _show_text_signals(self, lbar, rbar):
signals = [
sig
for sig in self.signals_text_items[lbar:rbar]
if isinstance(sig, CenteredTextItem)
]
if len(signals) <= 50:
for sig in signals:
sig.show()
else:
for sig in signals:
sig.hide()
def _remove_signals(self):
self.chart.removeItem(self.signals_group_arrow)
self.chart.removeItem(self.signals_group_text)
del self.signals_text_items
del self.signals_group_arrow
del self.signals_group_text
self.signals_visible = False
def add_signals(self):
self.signals_group_text = QtGui.QGraphicsItemGroup()
self.signals_group_arrow = QtGui.QGraphicsItemGroup()
self.signals_text_items = np.empty(len(Quotes), dtype=object)
for p in Portfolio.positions:
x, price = p.id_bar_open, p.open_price
if p.type == Order.BUY:
y = Quotes[x].low * 0.99
pg.ArrowItem(
parent=self.signals_group_arrow,
pos=(x, y),
pen=self.plotgroup.long_pen,
brush=self.plotgroup.long_brush,
angle=90,
headLen=12,
tipAngle=50,
)
text_sig = CenteredTextItem(
parent=self.signals_group_text,
pos=(x, y),
pen=self.plotgroup.long_pen,
brush=self.plotgroup.long_brush,
text=(
'Buy at {:.%df}' % self.plotgroup.digits).format(
price),
valign=QtCore.Qt.AlignBottom,
)
text_sig.hide()
else:
y = Quotes[x].high * 1.01
pg.ArrowItem(
parent=self.signals_group_arrow,
pos=(x, y),
pen=self.plotgroup.short_pen,
brush=self.plotgroup.short_brush,
angle=-90,
headLen=12,
tipAngle=50,
)
text_sig = CenteredTextItem(
parent=self.signals_group_text,
pos=(x, y),
pen=self.plotgroup.short_pen,
brush=self.plotgroup.short_brush,
text=('Sell at {:.%df}' % self.plotgroup.digits).format(
price),
valign=QtCore.Qt.AlignTop,
)
text_sig.hide()
self.signals_text_items[x] = text_sig
self.chart.addItem(self.signals_group_arrow)
self.chart.addItem(self.signals_group_text)
self.signals_visible = True

piker/ui/_source.py 100644

@ -0,0 +1,112 @@
"""
Numpy data source machinery.
"""
import math
from dataclasses import dataclass
import numpy as np
import pandas as pd
ohlc_dtype = np.dtype(
[
('index', int),
('time', float),
('open', float),
('high', float),
('low', float),
('close', float),
('volume', int),
]
)
# map time frame "keys" to minutes values
tf_in_1m = {
'1m': 1,
'5m': 5,
'15m': 15,
'30m': 30,
'1h': 60,
'4h': 240,
'1d': 1440,
}
def ohlc_zeros(length: int) -> np.ndarray:
"""Construct an OHLC field formatted structarray.
For "why a structarray" see here: https://stackoverflow.com/a/52443038
Bottom line, they're faster than ``np.recarray``.
"""
return np.zeros(length, dtype=ohlc_dtype)
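A quick sketch of what such a structarray gives you (field names are from ``ohlc_dtype`` above; the sample values are made up):

```python
import numpy as np

ohlc_dtype = np.dtype([
    ('index', int),
    ('time', float),
    ('open', float),
    ('high', float),
    ('low', float),
    ('close', float),
    ('volume', int),
])

bars = np.zeros(3, dtype=ohlc_dtype)   # same as ohlc_zeros(3)
bars['close'] = [100.0, 101.5, 99.75]  # column-style assignment
last_bar = bars[-1]                    # row-style access
```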
@dataclass
class Symbol:
"""I guess this is some kinda container thing for dealing with
all the different meta-data formats from brokers?
"""
key: str = ''
min_tick: float = 0.01
contract: str = ''
def digits(self) -> int:
"""Return the trailing number of digits specified by the
min tick size for the instrument.
"""
# round first: two-arg ``math.log`` is not exact near integer
# results (eg. ``math.log(0.01, 0.1)`` can land just below 2)
return int(round(math.log(self.min_tick, 0.1)))
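As a sanity check on the digits math, an equivalent formulation via `log10` (the `tick_digits` helper name is illustrative, not part of the module):

```python
import math

def tick_digits(min_tick: float) -> int:
    # decimal places implied by a tick size, eg. 0.01 -> 2;
    # rounding guards against float error near exact powers of 10
    return int(round(-math.log10(min_tick)))
```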
def from_df(
df: pd.DataFrame,
source=None,
default_tf=None
) -> np.recarray:
"""Convert OHLC formatted ``pandas.DataFrame`` to ``numpy.recarray``.
"""
df.reset_index(inplace=True)
# hackery to convert field names
date = 'Date'
if 'date' in df.columns:
date = 'date'
# convert to POSIX time
df[date] = [d.timestamp() for d in df[date]]
# try to rename from some camel case
columns = {
'Date': 'time',
'date': 'time',
'Open': 'open',
'High': 'high',
'Low': 'low',
'Close': 'close',
'Volume': 'volume',
}
df = df.rename(columns=columns)
for name in df.columns:
if name not in ohlc_dtype.names[1:]:
del df[name]
# TODO: it turns out column access on recarrays is actually slower:
# https://jakevdp.github.io/PythonDataScienceHandbook/02.09-structured-data-numpy.html#RecordArrays:-Structured-Arrays-with-a-Twist
# it might make sense to make these structured arrays?
array = df.to_records()
_nan_to_closest_num(array)
return array
def _nan_to_closest_num(array: np.ndarray):
"""Return interpolated values instead of NaN.
"""
for col in ['open', 'high', 'low', 'close']:
mask = np.isnan(array[col])
# ``mask.size`` is always the full column length; only skip
# when there are no NaNs at all
if not mask.any():
continue
array[col][mask] = np.interp(
np.flatnonzero(mask), np.flatnonzero(~mask), array[col][~mask]
)
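The interpolation trick can be verified on a toy column (sample values are made up): NaN positions are filled by linearly interpolating between the nearest known neighbors.

```python
import numpy as np

col = np.array([1.0, np.nan, 3.0, np.nan, np.nan, 6.0])
mask = np.isnan(col)
col[mask] = np.interp(
    np.flatnonzero(mask),   # x-coords of the NaN gaps
    np.flatnonzero(~mask),  # x-coords of the known samples
    col[~mask],             # known sample values
)
```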

piker/ui/_style.py 100644

@ -0,0 +1,63 @@
"""
Qt UI styling.
"""
import pyqtgraph as pg
from PyQt5 import QtGui
from qdarkstyle.palette import DarkPalette
# chart-wide font
_font = QtGui.QFont("Hack", 4)
_i3_rgba = QtGui.QColor.fromRgbF(*[0.14]*3 + [1])
# splitter widget config
_xaxis_at = 'bottom'
# charting config
_min_points_to_show = 3
_tina_mode = False
def enable_tina_mode() -> None:
"""Enable "tina mode" to make everything look "conventional"
like your pet hedgehog always wanted.
"""
# white background (for tinas like our pal xb)
pg.setConfigOption('background', 'w')
def hcolor(name: str) -> str:
"""Hex color codes by hipster speak.
"""
return {
# lives matter
'black': '#000000',
'erie_black': '#1B1B1B',
'licorice': '#1A1110',
'papas_special': '#06070c',
# fifty shades
'gray': '#808080', # like the kick
'jet': '#343434',
'charcoal': '#36454F',
# palette
'default': DarkPalette.COLOR_BACKGROUND_NORMAL,
'white': '#ffffff', # for tinas and sunbathers
# blue zone
'dad_blue': '#326693', # like his shirt
'vwap_blue': '#0582fb',
'dodger_blue': '#1e90ff', # like the team?
'panasonic_blue': '#0040be', # from japan
# traditional
'tina_green': '#00cc00',
'tina_red': '#fa0000',
}[name]

piker/ui/cli.py 100644

@ -0,0 +1,129 @@
"""
Console interface to UI components.
"""
from functools import partial
import os
import click
import tractor
from ..cli import cli
from .. import watchlists as wl
from ..data import maybe_spawn_brokerd
_config_dir = click.get_app_dir('piker')
_watchlists_data_path = os.path.join(_config_dir, 'watchlists.json')
def _kivy_import_hack():
# Command line hacks to make it work.
# See the pkg mod.
from .kivy import kivy # noqa
@cli.command()
@click.option('--tl', is_flag=True, help='Enable tractor logging')
@click.option('--rate', '-r', default=3, help='Quote rate limit')
@click.option('--test', '-t', help='Test quote stream file')
@click.option('--dhost', '-dh', default='127.0.0.1',
help='Daemon host address to connect to')
@click.argument('name', nargs=1, required=True)
@click.pass_obj
def monitor(config, rate, name, dhost, test, tl):
"""Start a real-time watchlist UI
"""
# global opts
brokermod = config['brokermod']
loglevel = config['loglevel']
log = config['log']
watchlist_from_file = wl.ensure_watchlists(_watchlists_data_path)
watchlists = wl.merge_watchlist(watchlist_from_file, wl._builtins)
tickers = watchlists[name]
if not tickers:
log.error(f"No symbols found for watchlist `{name}`?")
return
_kivy_import_hack()
from .kivy.monitor import _async_main
async def main(tries):
async with maybe_spawn_brokerd(
brokername=brokermod.name,
tries=tries, loglevel=loglevel
) as portal:
# run app "main"
await _async_main(
name, portal, tickers,
brokermod, rate, test=test,
)
tractor.run(
partial(main, tries=1),
name='monitor',
loglevel=loglevel if tl else None,
rpc_module_paths=['piker.ui.kivy.monitor'],
debug_mode=True,
)
@cli.command()
@click.option('--tl', is_flag=True, help='Enable tractor logging')
@click.option('--date', '-d', help='Contracts expiry date')
@click.option('--test', '-t', help='Test quote stream file')
@click.option('--rate', '-r', default=1, help='Quote rate limit')
@click.argument('symbol', required=True)
@click.pass_obj
def optschain(config, symbol, date, tl, rate, test):
"""Start an option chain UI
"""
# global opts
loglevel = config['loglevel']
brokername = config['broker']
_kivy_import_hack()
from .kivy.option_chain import _async_main
async def main(tries):
async with maybe_spawn_brokerd(
tries=tries, loglevel=loglevel
):
# run app "main"
await _async_main(
symbol,
brokername,
rate=rate,
loglevel=loglevel,
test=test,
)
tractor.run(
partial(main, tries=1),
name='kivy-options-chain',
loglevel=loglevel if tl else None,
)
@cli.command()
@click.option('--date', '-d', help='Contracts expiry date')
@click.option('--test', '-t', help='Test quote stream file')
@click.option('--rate', '-r', default=1, help='Quote rate limit')
@click.argument('symbol', required=True)
@click.pass_obj
def chart(config, symbol, date, rate, test):
"""Start an option chain UI
"""
from ._chart import _main
# global opts
brokername = config['broker']
tractorloglevel = config['tractorloglevel']
_main(
sym=symbol,
brokername=brokername,
tractor_kwargs={
'debug_mode': True,
'loglevel': tractorloglevel,
},
)


@ -0,0 +1,15 @@
"""
Legacy kivy components.
"""
import os
import sys
# XXX clear all flags at import to avoid upsetting
# ol' kivy see: https://github.com/kivy/kivy/issues/4225
# though this is likely a ``click`` problem
sys.argv[1:] = []
# use the trio async loop
os.environ['KIVY_EVENTLOOP'] = 'trio'
import kivy
kivy.require('1.10.0')


@ -15,11 +15,11 @@ from kivy.lang import Builder
 from kivy.app import async_runTouchApp
 from kivy.core.window import Window

-from ..brokers.data import DataFeed
+from ...brokers.data import DataFeed
 from .tabular import (
     Row, TickerTable, _kv, _black_rgba, colorcode,
 )
-from ..log import get_logger
+from ...log import get_logger
 from .pager import PagerView
@ -51,23 +51,24 @@ async def update_quotes(
         chngcell = row.get_cell('%')

         # determine daily change color
-        color = colorcode('gray')
         percent_change = record.get('%')
-        if percent_change:
-            daychange = float(record['%'])
+        if percent_change is not None and percent_change != chngcell:
+            daychange = float(percent_change)
             if daychange < 0.:
                 color = colorcode('red2')
             elif daychange > 0.:
                 color = colorcode('forestgreen')
+            else:
+                color = colorcode('gray')
+
+            # update row header and '%' cell text color
+            if chngcell:
+                chngcell.color = color
+                hdrcell.color = color

         # if the cell has been "highlighted" make sure to change its color
         if hdrcell.background_color != [0]*4:
             hdrcell.background_color = color
-        # update row header and '%' cell text color
-        chngcell.color = color
-        hdrcell.color = color

         # briefly highlight bg of certain cells on each trade execution
         unflash = set()
         tick_color = None
@ -103,36 +104,37 @@ async def update_quotes(
     # initial coloring
     to_sort = set()
-    for sym, quote in first_quotes.items():
-        row = table.get_row(sym)
-        record, displayable = formatter(
-            quote, symbol_data=symbol_data)
-        row.update(record, displayable)
-        color_row(row, record, {})
+    for quote in first_quotes:
+        row = table.get_row(quote['symbol'])
+        row.update(quote)
+        color_row(row, quote, {})
         to_sort.add(row.widget)

     table.render_rows(to_sort)
     log.debug("Finished initializing update loop")
     task_status.started()

     # real-time cell update loop
     async for quotes in agen:  # new quotes data only
         to_sort = set()
         for symbol, quote in quotes.items():
             row = table.get_row(symbol)
-            record, displayable = formatter(
-                quote, symbol_data=symbol_data)
+
+            # don't red/green the header cell in ``row.update()``
+            quote.pop('symbol')
+            quote.pop('key')

             # determine if sorting should happen
             sort_key = table.sort_key

-            new = record[sort_key]
             last = row.get_field(sort_key)
+            new = quote.get(sort_key, last)
             if new != last:
                 to_sort.add(row.widget)

             # update and color
-            cells = row.update(record, displayable)
-            color_row(row, record, cells)
+            cells = row.update(quote)
+            color_row(row, quote, cells)

         if to_sort:
             table.render_rows(to_sort)
@ -174,18 +176,14 @@ async def _async_main(
     This is started with cli cmd `piker monitor`.
     '''
     feed = DataFeed(portal, brokermod)
-    quote_gen, quotes = await feed.open_stream(
+    quote_gen, first_quotes = await feed.open_stream(
         symbols,
         'stock',
         rate=rate,
         test=test,
     )
-    first_quotes, _ = feed.format_quotes(quotes)
-    if first_quotes[0].get('last') is None:
-        log.error("Broker API is down temporarily")
-        return
+    first_quotes_list = list(first_quotes.copy().values())
+    quotes = list(first_quotes.copy().values())

     # build out UI
     Window.set_title(f"monitor: {name}\t(press ? for help)")
@ -197,7 +195,9 @@ async def _async_main(
     bidasks = brokermod._stock_bidasks

     # add header row
-    headers = first_quotes[0].keys()
+    headers = list(first_quotes_list[0].keys())
+    headers.remove('displayable')
     header = Row(
         {key: key for key in headers},
         headers=headers,
@ -212,11 +212,17 @@ async def _async_main(
         cols=1,
         size_hint=(1, None),
     )
-    for ticker_record in first_quotes:
+    for ticker_record in first_quotes_list:
+        symbol = ticker_record['symbol']
         table.append_row(
-            ticker_record['symbol'],
-            Row(ticker_record, headers=('symbol',),
-                bidasks=bidasks, table=table)
+            symbol,
+            Row(
+                ticker_record,
+                headers=('symbol',),
+                bidasks=bidasks,
+                no_cell=('displayable',),
+                table=table
+            )
         )
     table.last_clicked_row = next(iter(table.symbols2rows.values()))


@ -11,7 +11,6 @@ from kivy.properties import BooleanProperty, ObjectProperty
 from kivy.core.window import Window
 from kivy.clock import Clock

 from ...log import get_logger
@ -100,7 +99,8 @@ class MouseOverBehavior(object):
     # throttle at 10ms latency
     @triggered(timeout=0.01, interval=False)
     def _on_mouse_pos(cls, *args):
-        log.debug(f"{cls} time since last call: {time.time() - cls._last_time}")
+        log.debug(
+            f"{cls} time since last call: {time.time() - cls._last_time}")
         cls._last_time = time.time()
         # XXX: how to still do this at the class level?
         # don't proceed if I'm not displayed <=> If have no parent


@ -15,11 +15,10 @@ from kivy.app import async_runTouchApp
 from kivy.core.window import Window
 from kivy.uix.label import Label

-from ..log import get_logger, get_console_log
-from ..brokers.data import DataFeed
-from ..brokers import get_brokermod
+from ...log import get_logger, get_console_log
+from ...brokers.data import DataFeed
+from ...brokers import get_brokermod
 from .pager import PagerView
 from .tabular import Row, HeaderCell, Cell, TickerTable
 from .monitor import update_quotes


@ -9,8 +9,8 @@ from kivy.uix.widget import Widget
 from kivy.uix.textinput import TextInput
 from kivy.uix.scrollview import ScrollView

-from ..log import get_logger
-from .kivy.utils_async import async_bind
+from ...log import get_logger
+from .utils_async import async_bind

 log = get_logger('keyboard')


@ -12,8 +12,8 @@ from kivy.uix.button import Button
 from kivy import utils
 from kivy.properties import BooleanProperty

-from ..log import get_logger
-from .kivy.mouse_over import new_mouse_over_group
+from ...log import get_logger
+from .mouse_over import new_mouse_over_group

 HoverBehavior = new_mouse_over_group()
@ -300,10 +300,10 @@ class Row(HoverBehavior, GridLayout):
             # handle bidask cells
             if key in layouts:
                 self.add_widget(layouts[key])
-            elif key in children_flat:
+            elif key in children_flat or key in no_cell:
                 # these cells have already been added to the `BidAskLayout`
                 continue
-            elif key not in no_cell:
+            else:
                 cell = self._append_cell(val, key, header=header)
                 cell.key = key
                 self._cell_widgets[key] = cell
@ -329,7 +329,7 @@ class Row(HoverBehavior, GridLayout):
         self.add_widget(cell)
         return cell

-    def update(self, record, displayable):
+    def update(self, record):
         """Update this row's cells with new values from a quote
         ``record``.
@ -340,7 +340,12 @@ class Row(HoverBehavior, GridLayout):
         gray = colorcode('gray')
         fgreen = colorcode('forestgreen')
         red = colorcode('red2')
+        displayable = record['displayable']
         for key, val in record.items():
+            if key not in displayable:
+                continue
             last = self.get_field(key)
             color = gray
             try:
@ -361,7 +366,7 @@ class Row(HoverBehavior, GridLayout):
             if color != gray:
                 cells[key] = cell
-        self._last_record = record
+        self._last_record.update(record)
         return cells

     # mouse over handlers


@ -61,10 +61,10 @@ class AsyncCallbackQueue(object):
         return self

     async def __anext__(self):
-        self.event.clear()
+        self.event = async_lib.Event()
         while not self.callback_result and not self.quit:
             await self.event.wait()
-            self.event.clear()
+            self.event = async_lib.Event()

         if self.callback_result:
             return self.callback_result.popleft()


@ -0,0 +1,3 @@
"""
Super hawt Qt UI components
"""


@ -0,0 +1,67 @@
import sys
from PySide2.QtCharts import QtCharts
from PySide2.QtWidgets import QApplication, QMainWindow
from PySide2.QtCore import Qt, QPointF
from PySide2 import QtGui
import qdarkstyle
data = ((1, 7380, 7520, 7380, 7510, 7324),
(2, 7520, 7580, 7410, 7440, 7372),
(3, 7440, 7650, 7310, 7520, 7434),
(4, 7450, 7640, 7450, 7550, 7480),
(5, 7510, 7590, 7460, 7490, 7502),
(6, 7500, 7590, 7480, 7560, 7512),
(7, 7560, 7830, 7540, 7800, 7584))
app = QApplication([])
# set dark stylesheet
# import pdb; pdb.set_trace()
app.setStyleSheet(qdarkstyle.load_stylesheet_pyside())
series = QtCharts.QCandlestickSeries()
series.setDecreasingColor(Qt.darkRed)
series.setIncreasingColor(Qt.darkGreen)
ma5 = QtCharts.QLineSeries() # 5-days average data line
tm = [] # stores str type data
# in a loop, series and ma5 append corresponding data
for num, o, h, l, c, m in data:
candle = QtCharts.QCandlestickSet(o, h, l, c)
series.append(candle)
ma5.append(QPointF(num, m))
tm.append(str(num))
pen = candle.pen()
# import pdb; pdb.set_trace()
chart = QtCharts.QChart()
# import pdb; pdb.set_trace()
series.setBodyOutlineVisible(False)
series.setCapsVisible(False)
# brush = QtGui.QBrush()
# brush.setColor(Qt.green)
# series.setBrush(brush)
chart.addSeries(series) # candle
chart.addSeries(ma5) # ma5 line
chart.setAnimationOptions(QtCharts.QChart.SeriesAnimations)
chart.createDefaultAxes()
chart.legend().hide()
chart.axisX(series).setCategories(tm)
chart.axisX(ma5).setVisible(False)
view = QtCharts.QChartView(chart)
view.chart().setTheme(QtCharts.QChart.ChartTheme.ChartThemeDark)
view.setRubberBand(QtCharts.QChartView.HorizontalRubberBand)
# chartview.chart().setTheme(QtCharts.QChart.ChartTheme.ChartThemeBlueCerulean)
ui = QMainWindow()
# ui.setGeometry(50, 50, 500, 300)
ui.setCentralWidget(view)
ui.show()
sys.exit(app.exec_())


@ -0,0 +1,10 @@
"""
Curated set of components from ``Quantdom`` used as a starting
draft for real-time charting with ``pyqtgraph``.
Much thanks to the author:
https://github.com/constverum/Quantdom
Note this code is licensed Apache 2.0:
https://github.com/constverum/Quantdom/blob/master/LICENSE
"""


@ -0,0 +1,188 @@
"""
Strategy and performance charting
"""
import numpy as np
import pyqtgraph as pg
from PyQt5 import QtCore, QtGui
from .base import Quotes
from .portfolio import Portfolio
from .utils import timeit
from .charts import (
PriceAxis,
CHART_MARGINS,
SampleLegendItem,
YAxisLabel,
CrossHairItem,
)
class EquityChart(QtGui.QWidget):
eq_pen_pos_color = pg.mkColor('#00cc00')
eq_pen_neg_color = pg.mkColor('#cc0000')
eq_brush_pos_color = pg.mkColor('#40ee40')
eq_brush_neg_color = pg.mkColor('#ee4040')
long_pen_color = pg.mkColor('#008000')
short_pen_color = pg.mkColor('#800000')
buy_and_hold_pen_color = pg.mkColor('#4444ff')
def __init__(self):
super().__init__()
self.xaxis = pg.DateAxisItem()
self.xaxis.setStyle(tickTextOffset=7, textFillLimits=[(0, 0.80)])
self.yaxis = PriceAxis()
self.layout = QtGui.QVBoxLayout(self)
self.layout.setContentsMargins(0, 0, 0, 0)
self.chart = pg.PlotWidget(
axisItems={'bottom': self.xaxis, 'right': self.yaxis},
enableMenu=False,
)
self.chart.setFrameStyle(QtGui.QFrame.StyledPanel | QtGui.QFrame.Plain)
self.chart.getPlotItem().setContentsMargins(*CHART_MARGINS)
self.chart.showGrid(x=True, y=True)
self.chart.hideAxis('left')
self.chart.showAxis('right')
self.chart.setCursor(QtCore.Qt.BlankCursor)
self.chart.sigXRangeChanged.connect(self._update_yrange_limits)
self.layout.addWidget(self.chart)
def _add_legend(self):
legend = pg.LegendItem((140, 100), offset=(10, 10))
legend.setParentItem(self.chart.getPlotItem())
for arr, item in self.curves:
legend.addItem(
SampleLegendItem(item),
item.opts['name']
if not isinstance(item, tuple)
else item[0].opts['name'],
)
def _add_ylabels(self):
self.ylabels = []
for arr, item in self.curves:
color = (
item.opts['pen']
if not isinstance(item, tuple)
else [i.opts['pen'] for i in item]
)
label = YAxisLabel(parent=self.yaxis, color=color)
self.ylabels.append(label)
def _update_ylabels(self, vb, rbar):
for i, curve in enumerate(self.curves):
arr, item = curve
ylast = arr[rbar]
ypos = vb.mapFromView(QtCore.QPointF(0, ylast)).y()
axlabel = self.ylabels[i]
axlabel.update_label_test(ypos=ypos, ydata=ylast)
def _update_yrange_limits(self, vb=None):
if not hasattr(self, 'min_curve'):
return
vr = self.chart.viewRect()
lbar, rbar = int(vr.left()), int(vr.right())
ylow = self.min_curve[lbar:rbar].min() * 1.1
yhigh = self.max_curve[lbar:rbar].max() * 1.1
std = np.std(self.max_curve[lbar:rbar]) * 4
self.chart.setLimits(yMin=ylow, yMax=yhigh, minYRange=std)
self.chart.setYRange(ylow, yhigh)
self._update_ylabels(vb, rbar)
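The numeric core of the y-range logic above, pulled out for clarity (function and variable names are illustrative; the 10% padding and 4-sigma minimum range mirror the snippet, and the sample curves are made up):

```python
import numpy as np

def view_yrange(curve_min, curve_max, lbar, rbar):
    # pad the in-view extrema by 10% and derive a minimum
    # y-range from the dispersion of the upper envelope curve
    ylow = curve_min[lbar:rbar].min() * 1.1
    yhigh = curve_max[lbar:rbar].max() * 1.1
    min_yrange = np.std(curve_max[lbar:rbar]) * 4
    return ylow, yhigh, min_yrange

curve_min = np.array([-10.0, -5.0, -2.0])  # elementwise min of all curves
curve_max = np.array([1.0, 2.0, 3.0])      # elementwise max of all curves
ylow, yhigh, min_yrange = view_yrange(curve_min, curve_max, 0, 3)
```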
@timeit
def plot(self):
equity_curve = Portfolio.equity_curve
eq_pos = np.zeros_like(equity_curve)
eq_neg = np.zeros_like(equity_curve)
eq_pos[equity_curve >= 0] = equity_curve[equity_curve >= 0]
eq_neg[equity_curve <= 0] = equity_curve[equity_curve <= 0]
# Equity
self.eq_pos_curve = pg.PlotCurveItem(
eq_pos,
name='Equity',
fillLevel=0,
antialias=True,
pen=self.eq_pen_pos_color,
brush=self.eq_brush_pos_color,
)
self.eq_neg_curve = pg.PlotCurveItem(
eq_neg,
name='Equity',
fillLevel=0,
antialias=True,
pen=self.eq_pen_neg_color,
brush=self.eq_brush_neg_color,
)
self.chart.addItem(self.eq_pos_curve)
self.chart.addItem(self.eq_neg_curve)
# Only Long
self.long_curve = pg.PlotCurveItem(
Portfolio.long_curve,
name='Only Long',
pen=self.long_pen_color,
antialias=True,
)
self.chart.addItem(self.long_curve)
# Only Short
self.short_curve = pg.PlotCurveItem(
Portfolio.short_curve,
name='Only Short',
pen=self.short_pen_color,
antialias=True,
)
self.chart.addItem(self.short_curve)
# Buy and Hold
self.buy_and_hold_curve = pg.PlotCurveItem(
Portfolio.buy_and_hold_curve,
name='Buy and Hold',
pen=self.buy_and_hold_pen_color,
antialias=True,
)
self.chart.addItem(self.buy_and_hold_curve)
self.curves = [
(Portfolio.equity_curve, (self.eq_pos_curve, self.eq_neg_curve)),
(Portfolio.long_curve, self.long_curve),
(Portfolio.short_curve, self.short_curve),
(Portfolio.buy_and_hold_curve, self.buy_and_hold_curve),
]
self._add_legend()
self._add_ylabels()
ch = CrossHairItem(self.chart)
self.chart.addItem(ch)
arrs = (
Portfolio.equity_curve,
Portfolio.buy_and_hold_curve,
Portfolio.long_curve,
Portfolio.short_curve,
)
np_arrs = np.concatenate(arrs)
_min = abs(np_arrs.min()) * -1.1
_max = np_arrs.max() * 1.1
self.chart.setLimits(
xMin=Quotes[0].id,
xMax=Quotes[-1].id,
yMin=_min,
yMax=_max,
minXRange=60,
)
self.min_curve = arrs[0].copy()
self.max_curve = arrs[0].copy()
for arr in arrs[1:]:
self.min_curve = np.minimum(self.min_curve, arr)
self.max_curve = np.maximum(self.max_curve, arr)


@ -0,0 +1,147 @@
"""Base classes."""
from enum import Enum, auto
import numpy as np
import pandas as pd
from .const import ChartType, TimeFrame
__all__ = ('Indicator', 'Symbol', 'Quotes')
# I actually can't think of a worse reason to override an array than
# this:
# - a method .new() that mutates the data from an input data frame
# - mutating the time column wholesale based on a setting
# - enforcing certain fields / columns
# - zero overriding of any of the array interface for the purposes of
# a different underlying implementation.
# Literally all this can be done in a simple function with way less
# confusion for the reader.
class BaseQuotes(np.recarray):
def __new__(cls, shape=None, dtype=None, order='C'):
dt = np.dtype(
[
('id', int),
('time', float),
('open', float),
('high', float),
('low', float),
('close', float),
('volume', int),
]
)
shape = shape or (1,)
return np.ndarray.__new__(cls, shape, (np.record, dt), order=order)
def _nan_to_closest_num(self):
"""Return interpolated values instead of NaN."""
for col in ['open', 'high', 'low', 'close']:
mask = np.isnan(self[col])
# ``mask.size`` is always the full column length; only skip
# when there are no NaNs at all
if not mask.any():
continue
self[col][mask] = np.interp(
np.flatnonzero(mask), np.flatnonzero(~mask), self[col][~mask]
)
def _set_time_frame(self, default_tf):
tf = {
1: TimeFrame.M1,
5: TimeFrame.M5,
15: TimeFrame.M15,
30: TimeFrame.M30,
60: TimeFrame.H1,
240: TimeFrame.H4,
1440: TimeFrame.D1,
}
minutes = int(np.diff(self.time[-10:]).min() / 60)
self.timeframe = tf.get(minutes) or tf[default_tf]
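The timeframe inference above boils down to the smallest gap between consecutive timestamps, converted to minutes; for example (sample timestamps are made up):

```python
import numpy as np

# POSIX timestamps spaced 5 minutes (300s) apart
times = 1_598_000_000 + np.arange(6) * 300.0

# smallest inter-sample gap over the last 10 entries, in minutes
minutes = int(np.diff(times[-10:]).min() / 60)
```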
# bruh this isn't creating anything it's copying data in
# from a data frame...
def new(self, data, source=None, default_tf=None):
shape = (len(data),)
self.resize(shape, refcheck=False)
if isinstance(data, pd.DataFrame):
data.reset_index(inplace=True)
data.insert(0, 'id', data.index)
data.Date = self.convert_dates(data.Date)
data = data.rename(
columns={
'Date': 'time',
'Open': 'open',
'High': 'high',
'Low': 'low',
'Close': 'close',
'Volume': 'volume',
}
)
for name in self.dtype.names:
self[name] = data[name]
elif isinstance(data, (np.recarray, BaseQuotes)):
self[:] = data[:]
self._nan_to_closest_num()
self._set_time_frame(default_tf)
return self
def convert_dates(self, dates):
return np.array([d.timestamp() for d in dates])
class SymbolType(Enum):
FOREX = auto()
CFD = auto()
FUTURES = auto()
SHARES = auto()
class Symbol:
FOREX = SymbolType.FOREX
CFD = SymbolType.CFD
FUTURES = SymbolType.FUTURES
SHARES = SymbolType.SHARES
def __init__(self, ticker, mode, tick_size=0, tick_value=None):
self.ticker = ticker
self.mode = mode
if self.mode in [self.FOREX, self.CFD]:
# number of units of the commodity, currency
# or financial asset in one lot
self.contract_size = 100_000 # (100000 == 1 Lot)
elif self.mode == self.FUTURES:
# cost of a single price change point ($10) /
# one minimum price movement
self.tick_value = tick_value
# minimum price change step (0.0001)
self.tick_size = tick_size
if isinstance(tick_size, float):
self.digits = len(str(tick_size).split('.')[1])
else:
self.digits = 0
def __repr__(self):
return 'Symbol (%s | %s)' % (self.ticker, self.mode)
class Indicator:
def __init__(
self, label=None, window=None, data=None, tp=None, base=None, **kwargs
):
self.label = label
self.window = window
self.data = data or [0]
self.type = tp or ChartType.LINE
self.base = base or {'linewidth': 0.5, 'color': 'black'}
self.lineStyle = {'linestyle': '-', 'linewidth': 0.5, 'color': 'blue'}
self.lineStyle.update(kwargs)
# This creates a global array that seems to be shared between all
# charting UI components
Quotes = BaseQuotes()
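The module-header note argues that everything `BaseQuotes` does can be a plain function. A minimal sketch of that alternative (hypothetical helper, not part of this module; assumes a Yahoo-style dataframe with `Date/Open/High/Low/Close/Volume` columns):

```python
import numpy as np
import pandas as pd

# OHLC dtype mirroring the one defined in BaseQuotes.__new__ above.
OHLC_DTYPE = np.dtype([
    ('id', int),
    ('time', float),
    ('open', float),
    ('high', float),
    ('low', float),
    ('close', float),
    ('volume', int),
])


def quotes_from_df(df: pd.DataFrame) -> np.recarray:
    """Build an OHLC record array from a Yahoo-style dataframe."""
    out = np.zeros(len(df), dtype=OHLC_DTYPE).view(np.recarray)
    out.id = np.arange(len(df))
    # convert pandas Timestamps to epoch floats
    out.time = df['Date'].map(lambda d: d.timestamp()).to_numpy()
    for src, dst in [('Open', 'open'), ('High', 'high'),
                     ('Low', 'low'), ('Close', 'close'),
                     ('Volume', 'volume')]:
        out[dst] = df[src].to_numpy()
    return out
```

No subclassing, no in-place `resize()`, and no module-global mutable state; the caller owns the returned array.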

View File

@ -0,0 +1,79 @@
"""
Real-time quotes charting components
"""
import pyqtgraph as pg
from PyQt5 import QtCore, QtGui
class SampleLegendItem(pg.graphicsItems.LegendItem.ItemSample):
def paint(self, p, *args):
p.setRenderHint(p.Antialiasing)
if isinstance(self.item, tuple):
positive = self.item[0].opts
negative = self.item[1].opts
p.setPen(pg.mkPen(positive['pen']))
p.setBrush(pg.mkBrush(positive['brush']))
p.drawPolygon(
QtGui.QPolygonF(
[
QtCore.QPointF(0, 0),
QtCore.QPointF(18, 0),
QtCore.QPointF(18, 18),
]
)
)
p.setPen(pg.mkPen(negative['pen']))
p.setBrush(pg.mkBrush(negative['brush']))
p.drawPolygon(
QtGui.QPolygonF(
[
QtCore.QPointF(0, 0),
QtCore.QPointF(0, 18),
QtCore.QPointF(18, 18),
]
)
)
else:
opts = self.item.opts
p.setPen(pg.mkPen(opts['pen']))
p.drawRect(0, 10, 18, 0.5)
class CenteredTextItem(QtGui.QGraphicsTextItem):
def __init__(
self,
text='',
parent=None,
pos=(0, 0),
pen=None,
brush=None,
valign=None,
opacity=0.1,
):
super().__init__(text, parent)
self.pen = pen
self.brush = brush
self.opacity = opacity
self.valign = valign
self.text_flags = QtCore.Qt.AlignCenter
self.setPos(*pos)
self.setFlag(self.ItemIgnoresTransformations)
def boundingRect(self): # noqa
r = super().boundingRect()
if self.valign == QtCore.Qt.AlignTop:
return QtCore.QRectF(-r.width() / 2, -37, r.width(), r.height())
elif self.valign == QtCore.Qt.AlignBottom:
return QtCore.QRectF(-r.width() / 2, 15, r.width(), r.height())
def paint(self, p, option, widget):
p.setRenderHint(p.Antialiasing, False)
p.setRenderHint(p.TextAntialiasing, True)
p.setPen(self.pen)
if self.brush.style() != QtCore.Qt.NoBrush:
p.setOpacity(self.opacity)
p.fillRect(option.rect, self.brush)
p.setOpacity(1)
p.drawText(option.rect, self.text_flags, self.toPlainText())

View File

@ -0,0 +1,36 @@
"""Constants."""
from enum import Enum, auto
__all__ = ('ChartType', 'TimeFrame')
class ChartType(Enum):
BAR = auto()
CANDLESTICK = auto()
LINE = auto()
class TimeFrame(Enum):
M1 = auto()
M5 = auto()
M15 = auto()
M30 = auto()
H1 = auto()
H4 = auto()
D1 = auto()
W1 = auto()
MN = auto()
ANNUAL_PERIOD = 252 # number of trading days in a year
# # TODO: 6.5 - US trading hours (trading session); fix it for fx
# ANNUALIZATION_FACTORS = {
# TimeFrame.M1: int(252 * 6.5 * 60),
# TimeFrame.M5: int(252 * 6.5 * 12),
# TimeFrame.M15: int(252 * 6.5 * 4),
# TimeFrame.M30: int(252 * 6.5 * 2),
# TimeFrame.H1: int(252 * 6.5),
# TimeFrame.D1: 252,
# }
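The commented-out `ANNUALIZATION_FACTORS` table above is just "bars per trading year" for each timeframe; a sketch of the arithmetic, assuming 252 sessions of 6.5 US trading hours (as the TODO notes, fx sessions differ):

```python
# assumptions: 252 trading days/year, 6.5-hour US session
TRADING_DAYS = 252
SESSION_HOURS = 6.5


def bars_per_year(minutes_per_bar: int) -> int:
    """Number of bars of the given size in one trading year."""
    return int(TRADING_DAYS * SESSION_HOURS * 60 / minutes_per_bar)
```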

View File

@ -0,0 +1,184 @@
"""Parser."""
import logging
import os.path
import pickle
import datetime
import numpy as np
import pandas as pd
import pandas_datareader.data as web
from pandas_datareader._utils import RemoteDataError
from pandas_datareader.data import (
get_data_quandl,
get_data_yahoo,
get_data_alphavantage,
)
from pandas_datareader.nasdaq_trader import get_nasdaq_symbols
from pandas_datareader.exceptions import ImmediateDeprecationError
from .base import Quotes, Symbol
from .utils import get_data_path, timeit
__all__ = (
'YahooQuotesLoader',
'QuandleQuotesLoader',
'get_symbols',
'get_quotes',
)
logger = logging.getLogger(__name__)
class QuotesLoader:
source = None
timeframe = '1D'
sort_index = False
default_tf = None
name_format = '%(symbol)s_%(tf)s_%(date_from)s_%(date_to)s.%(ext)s'
@classmethod
def _get(cls, symbol, date_from, date_to):
quotes = web.DataReader(
symbol, cls.source, start=date_from, end=date_to
)
if cls.sort_index:
quotes.sort_index(inplace=True)
return quotes
@classmethod
def _get_file_path(cls, symbol, tf, date_from, date_to):
fname = cls.name_format % {
'symbol': symbol,
'tf': tf,
'date_from': date_from.isoformat(),
'date_to': date_to.isoformat(),
'ext': 'qdom',
}
return os.path.join(get_data_path('stock_data'), fname)
@classmethod
def _save_to_disk(cls, fpath, data):
logger.debug('Saving quotes to a file: %s', fpath)
with open(fpath, 'wb') as f:
pickle.dump(data, f, pickle.HIGHEST_PROTOCOL)
@classmethod
def _load_from_disk(cls, fpath):
logger.debug('Loading quotes from a file: %s', fpath)
with open(fpath, 'rb') as f:
data = pickle.load(f)
return data
@classmethod
@timeit
def get_quotes(
cls,
symbol: Symbol,
date_from: datetime.datetime,
date_to: datetime.datetime,
) -> Quotes:
"""Retrieve quotes data from a provider and return a ``numpy.ndarray`` subtype.
"""
quotes = None
fpath = cls._get_file_path(symbol, cls.timeframe, date_from, date_to)
# if os.path.exists(fpath):
# quotes = Quotes.new(cls._load_from_disk(fpath))
# else:
quotes_raw = cls._get(symbol, date_from, date_to)
# quotes = Quotes.new(
# quotes_raw, source=cls.source, default_tf=cls.default_tf
# )
# cls._save_to_disk(fpath, quotes)
return quotes_raw
class YahooQuotesLoader(QuotesLoader):
source = 'yahoo'
@classmethod
def _get(cls, symbol, date_from, date_to):
return get_data_yahoo(symbol, date_from, date_to)
class QuandleQuotesLoader(QuotesLoader):
source = 'quandle'
@classmethod
def _get(cls, symbol, date_from, date_to):
quotes = get_data_quandl(symbol, date_from, date_to)
quotes.sort_index(inplace=True)
return quotes
class AlphaVantageQuotesLoader(QuotesLoader):
source = 'alphavantage'
api_key = 'demo'
@classmethod
def _get(cls, symbol, date_from, date_to):
quotes = get_data_alphavantage(
symbol, date_from, date_to, api_key=cls.api_key
)
return quotes
class StooqQuotesLoader(QuotesLoader):
source = 'stooq'
sort_index = True
default_tf = 1440
class IEXQuotesLoader(QuotesLoader):
source = 'iex'
@classmethod
def _get(cls, symbol, date_from, date_to):
quotes = web.DataReader(
symbol, cls.source, start=date_from, end=date_to
)
quotes['Date'] = pd.to_datetime(quotes.index)
return quotes
class RobinhoodQuotesLoader(QuotesLoader):
source = 'robinhood'
def get_symbols():
fpath = os.path.join(get_data_path('stock_data'), 'symbols.qdom')
if os.path.exists(fpath):
with open(fpath, 'rb') as f:
symbols = pickle.load(f)
else:
symbols = get_nasdaq_symbols()
symbols.reset_index(inplace=True)
with open(fpath, 'wb') as f:
pickle.dump(symbols, f, pickle.HIGHEST_PROTOCOL)
return symbols
def get_quotes(*args, **kwargs):
quotes = []
# these loaders currently don't work:
# GoogleQuotesLoader, QuandleQuotesLoader,
# AlphaVantageQuotesLoader, RobinhoodQuotesLoader
loaders = [YahooQuotesLoader, IEXQuotesLoader, StooqQuotesLoader]
while loaders:
loader = loaders.pop(0)
try:
quotes = loader.get_quotes(*args, **kwargs)
break
except (RemoteDataError, ImmediateDeprecationError) as e:
logger.error('get_quotes => error: %r', e)
return quotes
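The `get_quotes()` loop above is the classic provider-fallback pattern: try each loader in order and return the first successful result. A generic sketch (the loader callables below are illustrative stand-ins, not real providers):

```python
import logging


def first_successful(loaders, *args, **kwargs):
    """Call each loader in order; return the first result that doesn't raise.

    Mirrors get_quotes(): failures are logged and the next provider is
    tried; an empty list is returned if every provider fails.
    """
    for loader in loaders:
        try:
            return loader(*args, **kwargs)
        except Exception as e:
            logging.getLogger(__name__).error('loader failed: %r', e)
    return []
```

Catching only provider-specific exceptions (as the original does with `RemoteDataError` and `ImmediateDeprecationError`) is preferable in real code, so programming errors aren't silently swallowed.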

View File

@ -0,0 +1,350 @@
"""Performance."""
import codecs
import json
from collections import OrderedDict, defaultdict
import numpy as np
from .base import Quotes
from .const import ANNUAL_PERIOD
from .utils import fromtimestamp, get_resource_path
__all__ = (
'BriefPerformance',
'Performance',
'Stats',
'REPORT_COLUMNS',
'REPORT_ROWS',
)
REPORT_COLUMNS = ('All', 'Long', 'Short', 'Market')
with codecs.open(
get_resource_path('report_rows.json'), mode='r', encoding='utf-8'
) as f:
REPORT_ROWS = OrderedDict(json.load(f))
class Stats(np.recarray):
def __new__(cls, positions, shape=None, dtype=None, order='C'):
shape = shape or (len(positions['All']),)
dtype = np.dtype(
[
('type', object),
('symbol', object),
('volume', float),
('open_time', float),
('close_time', float),
('open_price', float),
('close_price', float),
('total_profit', float),
('entry_name', object),
('exit_name', object),
('status', object),
('comment', object),
('abs', float),
('perc', float),
('bars', float),
('on_bar', float),
('mae', float),
('mfe', float),
]
)
dt = [(col, dtype) for col in REPORT_COLUMNS]
return np.ndarray.__new__(cls, shape, (np.record, dt), order=order)
def __init__(self, positions, **kwargs):
for col, _positions in positions.items():
for i, p in enumerate(_positions):
self._add_position(p, col, i)
def _add_position(self, p, col, i):
self[col][i].type = p.type
self[col][i].symbol = p.symbol
self[col][i].volume = p.volume
self[col][i].open_time = p.open_time
self[col][i].close_time = p.close_time
self[col][i].open_price = p.open_price
self[col][i].close_price = p.close_price
self[col][i].total_profit = p.total_profit
self[col][i].entry_name = p.entry_name
self[col][i].exit_name = p.exit_name
self[col][i].status = p.status
self[col][i].comment = p.comment
self[col][i].abs = p.profit
self[col][i].perc = p.profit_perc
quotes_on_trade = Quotes[p.id_bar_open : p.id_bar_close]
if not quotes_on_trade.size:
# if position was opened and closed on the last bar
quotes_on_trade = Quotes[p.id_bar_open : p.id_bar_close + 1]
kwargs = {
'low': quotes_on_trade.low.min(),
'high': quotes_on_trade.high.max(),
}
self[col][i].mae = p.calc_mae(**kwargs)
self[col][i].mfe = p.calc_mfe(**kwargs)
bars = p.id_bar_close - p.id_bar_open
self[col][i].bars = bars
self[col][i].on_bar = p.profit_perc / bars
class BriefPerformance(np.recarray):
def __new__(cls, shape=None, dtype=None, order='C'):
dt = np.dtype(
[
('kwargs', object),
('net_profit_abs', float),
('net_profit_perc', float),
('year_profit', float),
('win_average_profit_perc', float),
('loss_average_profit_perc', float),
('max_drawdown_abs', float),
('total_trades', int),
('win_trades_abs', int),
('win_trades_perc', float),
('profit_factor', float),
('recovery_factor', float),
('payoff_ratio', float),
]
)
shape = shape or (1,)
return np.ndarray.__new__(cls, shape, (np.record, dt), order=order)
def _days_count(self, positions):
if hasattr(self, 'days'):
return self.days
self.days = (
(
fromtimestamp(positions[-1].close_time)
- fromtimestamp(positions[0].open_time)
).days
if positions
else 1
)
return self.days
def add(self, initial_balance, positions, i, kwargs):
position_count = len(positions)
profit = np.recarray(
(position_count,), dtype=[('abs', float), ('perc', float)]
)
for n, position in enumerate(positions):
profit[n].abs = position.profit
profit[n].perc = position.profit_perc
s = self[i]
s.kwargs = kwargs
s.net_profit_abs = np.sum(profit.abs)
s.net_profit_perc = np.sum(profit.perc)
days = self._days_count(positions)
gain_factor = (s.net_profit_abs + initial_balance) / initial_balance
s.year_profit = (gain_factor ** (365 / days) - 1) * 100
s.win_average_profit_perc = np.mean(profit.perc[profit.perc > 0])
s.loss_average_profit_perc = np.mean(profit.perc[profit.perc < 0])
s.max_drawdown_abs = profit.abs.min()
s.total_trades = position_count
wins = profit.abs[profit.abs > 0]
loss = profit.abs[profit.abs < 0]
s.win_trades_abs = len(wins)
s.win_trades_perc = round(s.win_trades_abs / s.total_trades * 100, 2)
s.profit_factor = abs(np.sum(wins) / np.sum(loss))
s.recovery_factor = abs(s.net_profit_abs / s.max_drawdown_abs)
s.payoff_ratio = abs(np.mean(wins) / np.mean(loss))
class Performance:
"""Performance Metrics."""
rows = REPORT_ROWS
columns = REPORT_COLUMNS
def __init__(self, initial_balance, stats, positions):
self._data = {}
for col in self.columns:
column = type('Column', (object,), dict.fromkeys(self.rows, 0))
column.initial_balance = initial_balance
self._data[col] = column
self.calculate(column, stats[col], positions[col])
def __getitem__(self, col):
return self._data[col]
def _calc_trade_series(self, col, positions):
win_in_series, loss_in_series = 0, 0
for i, p in enumerate(positions):
if p.profit >= 0:
win_in_series += 1
loss_in_series = 0
if win_in_series > col.win_in_series:
col.win_in_series = win_in_series
else:
win_in_series = 0
loss_in_series += 1
if loss_in_series > col.loss_in_series:
col.loss_in_series = loss_in_series
def calculate(self, col, stats, positions):
self._calc_trade_series(col, positions)
col.total_trades = len(positions)
profit_abs = stats[np.flatnonzero(stats.abs)].abs
profit_perc = stats[np.flatnonzero(stats.perc)].perc
bars = stats[np.flatnonzero(stats.bars)].bars
on_bar = stats[np.flatnonzero(stats.on_bar)].on_bar
gt_zero_abs = stats[stats.abs > 0].abs
gt_zero_perc = stats[stats.perc > 0].perc
win_bars = stats[stats.perc > 0].bars
lt_zero_abs = stats[stats.abs < 0].abs
lt_zero_perc = stats[stats.perc < 0].perc
los_bars = stats[stats.perc < 0].bars
col.average_profit_abs = np.mean(profit_abs) if profit_abs.size else 0
col.average_profit_perc = (
np.mean(profit_perc) if profit_perc.size else 0
)
col.bars_on_trade = np.mean(bars) if bars.size else 0
col.bar_profit = np.mean(on_bar) if on_bar.size else 0
col.win_average_profit_abs = (
np.mean(gt_zero_abs) if gt_zero_abs.size else 0
)
col.win_average_profit_perc = (
np.mean(gt_zero_perc) if gt_zero_perc.size else 0
)
col.win_bars_on_trade = np.mean(win_bars) if win_bars.size else 0
col.loss_average_profit_abs = (
np.mean(lt_zero_abs) if lt_zero_abs.size else 0
)
col.loss_average_profit_perc = (
np.mean(lt_zero_perc) if lt_zero_perc.size else 0
)
col.loss_bars_on_trade = np.mean(los_bars) if los_bars.size else 0
col.win_trades_abs = len(gt_zero_abs)
col.win_trades_perc = (
round(col.win_trades_abs / col.total_trades * 100, 2)
if col.total_trades
else 0
)
col.loss_trades_abs = len(lt_zero_abs)
col.loss_trades_perc = (
round(col.loss_trades_abs / col.total_trades * 100, 2)
if col.total_trades
else 0
)
col.total_profit = np.sum(gt_zero_abs)
col.total_loss = np.sum(lt_zero_abs)
col.net_profit_abs = np.sum(stats.abs)
col.net_profit_perc = np.sum(stats.perc)
col.total_mae = np.sum(stats.mae)
col.total_mfe = np.sum(stats.mfe)
# https://financial-calculators.com/roi-calculator
days = (
(
fromtimestamp(positions[-1].close_time)
- fromtimestamp(positions[0].open_time)
).days
if positions
else 1
)
gain_factor = (
col.net_profit_abs + col.initial_balance
) / col.initial_balance
col.year_profit = (gain_factor ** (365 / days) - 1) * 100
col.month_profit = (gain_factor ** (365 / days / 12) - 1) * 100
col.max_profit_abs = stats.abs.max()
col.max_profit_perc = stats.perc.max()
col.max_profit_abs_day = fromtimestamp(
stats.close_time[stats.abs == col.max_profit_abs][0]
)
col.max_profit_perc_day = fromtimestamp(
stats.close_time[stats.perc == col.max_profit_perc][0]
)
col.max_drawdown_abs = stats.abs.min()
col.max_drawdown_perc = stats.perc.min()
col.max_drawdown_abs_day = fromtimestamp(
stats.close_time[stats.abs == col.max_drawdown_abs][0]
)
col.max_drawdown_perc_day = fromtimestamp(
stats.close_time[stats.perc == col.max_drawdown_perc][0]
)
col.profit_factor = (
abs(col.total_profit / col.total_loss) if col.total_loss else 0
)
col.recovery_factor = (
abs(col.net_profit_abs / col.max_drawdown_abs)
if col.max_drawdown_abs
else 0
)
col.payoff_ratio = (
abs(col.win_average_profit_abs / col.loss_average_profit_abs)
if col.loss_average_profit_abs
else 0
)
col.sharpe_ratio = annualized_sharpe_ratio(stats)
col.sortino_ratio = annualized_sortino_ratio(stats)
# TODO:
col.alpha_ratio = np.nan
col.beta_ratio = np.nan
def day_percentage_returns(stats):
days = defaultdict(float)
trade_count = np.count_nonzero(stats)
if trade_count == 1:
# market position, so returns should be based on quotes:
# calculate percentage changes over the quote series
changes = np.diff(Quotes.close) / Quotes[:-1].close * 100
data = np.column_stack((Quotes[1:].time, changes)) # np.c_
else:
# slice `:trade_count` to exclude zero values in long/short columns
data = stats[['close_time', 'perc']][:trade_count]
# FIXME: [FutureWarning] https://github.com/numpy/numpy/issues/8383
for close_time, perc in data:
days[fromtimestamp(close_time).date()] += perc
returns = np.array(list(days.values()))
# if np.count_nonzero(stats) == 1:
# import pudb; pudb.set_trace()
if len(returns) >= ANNUAL_PERIOD:
return returns
_returns = np.zeros(ANNUAL_PERIOD)
_returns[: len(returns)] = returns
return _returns
def annualized_sharpe_ratio(stats):
# risk_free = 0
returns = day_percentage_returns(stats)
return np.sqrt(ANNUAL_PERIOD) * np.mean(returns) / np.std(returns)
def annualized_sortino_ratio(stats):
# http://www.cmegroup.com/education/files/sortino-a-sharper-ratio.pdf
required_return = 0
returns = day_percentage_returns(stats)
mask = returns < required_return
tdd = np.zeros(len(returns))
tdd[mask] = returns[mask]  # keep only negative values and zeros
# "or 1" to prevent division by zero, if we don't have negative returns
tdd = np.sqrt(np.mean(np.square(tdd))) or 1
return np.sqrt(ANNUAL_PERIOD) * np.mean(returns) / tdd
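The two ratios above can be restated in a self-contained form, assuming (as the code does) 252 trading days per year and a zero risk-free/required return:

```python
import numpy as np

ANNUAL_PERIOD = 252  # trading days per year, as in const.py


def sharpe(returns):
    """Annualized Sharpe ratio with a zero risk-free rate."""
    return np.sqrt(ANNUAL_PERIOD) * np.mean(returns) / np.std(returns)


def sortino(returns, required_return=0.0):
    """Annualized Sortino ratio: penalize only downside deviation."""
    downside = np.where(returns < required_return, returns, 0.0)
    # target downside deviation; "or 1" avoids division by zero when
    # there are no negative returns (same guard as the code above)
    tdd = np.sqrt(np.mean(np.square(downside))) or 1
    return np.sqrt(ANNUAL_PERIOD) * np.mean(returns) / tdd
```

The Sortino denominator uses only below-target returns, so a strategy with rare small losses scores better than one with the same volatility split evenly between gains and losses.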

View File

@ -0,0 +1,412 @@
"""Portfolio."""
import itertools
from contextlib import contextmanager
from enum import Enum, auto
import numpy as np
from .base import Quotes
from .performance import BriefPerformance, Performance, Stats
from .utils import fromtimestamp, timeit
__all__ = ('Portfolio', 'Position', 'Order')
class BasePortfolio:
def __init__(self, balance=100_000, leverage=5):
self._initial_balance = balance
self.balance = balance
self.equity = None
# TODO:
# self.cash
# self.currency
self.leverage = leverage
self.positions = []
self.balance_curve = None
self.equity_curve = None
self.long_curve = None
self.short_curve = None
self.mae_curve = None
self.mfe_curve = None
self.stats = None
self.performance = None
self.brief_performance = None
def clear(self):
self.positions.clear()
self.balance = self._initial_balance
@property
def initial_balance(self):
return self._initial_balance
@initial_balance.setter
def initial_balance(self, value):
self._initial_balance = value
def add_position(self, position):
position.ticket = len(self.positions) + 1
self.positions.append(position)
def position_count(self, tp=None):
if tp == Order.BUY:
return len([p for p in self.positions if p.type == Order.BUY])
elif tp == Order.SELL:
return len([p for p in self.positions if p.type == Order.SELL])
return len(self.positions)
def _close_open_positions(self):
for p in self.positions:
if p.status == Position.OPEN:
p.close(
price=Quotes[-1].open,
volume=p.volume,
time=Quotes[-1].time
)
def _get_market_position(self):
p = self.positions[0]  # first real position
p = Position(
symbol=p.symbol,
ptype=Order.BUY,
volume=p.volume,
price=Quotes[0].open,
open_time=Quotes[0].time,
close_price=Quotes[-1].close,
close_time=Quotes[-1].time,
id_bar_close=len(Quotes) - 1,
status=Position.CLOSED,
)
p.profit = p.calc_profit(close_price=Quotes[-1].close)
p.profit_perc = p.profit / self._initial_balance * 100
return p
def _calc_equity_curve(self):
"""Equity curve."""
self.equity_curve = np.zeros_like(Quotes.time)
for i, p in enumerate(self.positions):
balance = np.sum(self.stats['All'][:i].abs)
for ibar in range(p.id_bar_open, p.id_bar_close):
profit = p.calc_profit(close_price=Quotes[ibar].close)
self.equity_curve[ibar] = balance + profit
# taking into account the real balance after the last trade
self.equity_curve[-1] = self.balance_curve[-1]
def _calc_buy_and_hold_curve(self):
"""Buy and Hold."""
p = self._get_market_position()
self.buy_and_hold_curve = np.array(
[p.calc_profit(close_price=price) for price in Quotes.close]
)
def _calc_long_short_curves(self):
"""Only Long/Short positions curve."""
self.long_curve = np.zeros_like(Quotes.time)
self.short_curve = np.zeros_like(Quotes.time)
for i, p in enumerate(self.positions):
if p.type == Order.BUY:
name = 'Long'
curve = self.long_curve
else:
name = 'Short'
curve = self.short_curve
balance = np.sum(self.stats[name][:i].abs)
# Calculate equity for this position
for ibar in range(p.id_bar_open, p.id_bar_close):
profit = p.calc_profit(close_price=Quotes[ibar].close)
curve[ibar] = balance + profit
for name, curve in [
('Long', self.long_curve),
('Short', self.short_curve),
]:
curve[:] = fill_zeros_with_last(curve)
# taking into account the real balance after the last trade
curve[-1] = np.sum(self.stats[name].abs)
def _calc_curves(self):
self.mae_curve = np.cumsum(self.stats['All'].mae)
self.mfe_curve = np.cumsum(self.stats['All'].mfe)
self.balance_curve = np.cumsum(self.stats['All'].abs)
self._calc_equity_curve()
self._calc_buy_and_hold_curve()
self._calc_long_short_curves()
@contextmanager
def optimization_mode(self):
"""Backup and restore current balance and positions."""
# mode='general',
self.backup_balance = self.balance
self.backup_positions = self.positions.copy()
self.balance = self._initial_balance
self.positions.clear()
yield
self.balance = self.backup_balance
self.positions = self.backup_positions.copy()
self.backup_positions.clear()
@timeit
def run_optimization(self, strategy, params):
keys = list(params.keys())
vals = list(params.values())
variants = list(itertools.product(*vals))
self.brief_performance = BriefPerformance(shape=(len(variants),))
with self.optimization_mode():
for i, vals in enumerate(variants):
kwargs = {keys[n]: val for n, val in enumerate(vals)}
strategy.start(**kwargs)
self._close_open_positions()
self.brief_performance.add(
self._initial_balance, self.positions, i, kwargs
)
self.clear()
@timeit
def summarize(self):
self._close_open_positions()
positions = {
'All': self.positions,
'Long': [p for p in self.positions if p.type == Order.BUY],
'Short': [p for p in self.positions if p.type == Order.SELL],
'Market': [self._get_market_position()],
}
self.stats = Stats(positions)
self.performance = Performance(
self._initial_balance, self.stats, positions
)
self._calc_curves()
Portfolio = BasePortfolio()
class PositionStatus(Enum):
OPEN = auto()
CLOSED = auto()
CANCELED = auto()
class Position:
OPEN = PositionStatus.OPEN
CLOSED = PositionStatus.CLOSED
CANCELED = PositionStatus.CANCELED
__slots__ = (
'type',
'symbol',
'ticket',
'open_price',
'close_price',
'open_time',
'close_time',
'volume',
'sl',
'tp',
'status',
'profit',
'profit_perc',
'commis',
'id_bar_open',
'id_bar_close',
'entry_name',
'exit_name',
'total_profit',
'comment',
)
def __init__(
self,
symbol,
ptype,
price,
volume,
open_time,
sl=None,
tp=None,
status=OPEN,
entry_name='',
exit_name='',
comment='',
**kwargs,
):
self.type = ptype
self.symbol = symbol
self.ticket = None
self.open_price = price
self.close_price = None
self.open_time = open_time
self.close_time = None
self.volume = volume
self.sl = sl
self.tp = tp
self.status = status
self.profit = None
self.profit_perc = None
self.commis = None
self.id_bar_open = np.where(Quotes.time == self.open_time)[0][0]
self.id_bar_close = None
self.entry_name = entry_name
self.exit_name = exit_name
self.total_profit = 0
self.comment = comment
# self.bars_on_trade = None
# self.is_profitable = False
for k, v in kwargs.items():
setattr(self, k, v)
def __repr__(self):
_type = 'LONG' if self.type == Order.BUY else 'SHORT'
time = fromtimestamp(self.open_time).strftime('%d.%m.%y %H:%M')
return '%s/%s/[%s - %.4f]' % (
self.status.name,
_type,
time,
self.open_price,
)
def close(self, price, time, volume=None):
# TODO: allow closing only part of the volume
self.close_price = price
self.close_time = time
self.id_bar_close = np.where(Quotes.time == self.close_time)[0][0]
self.profit = self.calc_profit(volume=volume or self.volume)
self.profit_perc = self.profit / Portfolio.balance * 100
Portfolio.balance += self.profit
self.total_profit = Portfolio.balance - Portfolio.initial_balance
self.status = self.CLOSED
def calc_profit(self, volume=None, close_price=None):
# TODO: rewrite it
close_price = close_price or self.close_price
volume = volume or self.volume
factor = 1 if self.type == Order.BUY else -1
price_delta = (close_price - self.open_price) * factor
if self.symbol.mode in [self.symbol.FOREX, self.symbol.CFD]:
# Margin: Lots*Contract_Size/Leverage
if (
self.symbol.mode == self.symbol.FOREX
and self.symbol.ticker[:3] == 'USD'
):
# Example: 'USD/JPY'
# profit      point    position   current
# in points * size   * volume   / rate
# 1 * 0.0001 * 100000 / 1.00770
# USD/CHF: 1*0.0001*100000/1.00770 = $9.92
# USD/JPY: 1*0.01*100000/121.35 = $8.24 (point size is 0.01)
# (1.00770-1.00595)/0.0001 = 17.5 points
# (1.00770-1.00595)/0.0001*0.0001*100000*1/1.00770*1
_points = price_delta / self.symbol.tick_size
_profit = (
_points
* self.symbol.tick_size
* self.symbol.contract_size
/ close_price
* volume
)
elif (
self.symbol.mode == self.symbol.FOREX
and self.symbol.ticker[-3:] == 'USD'
):
# Example: 'EUR/USD'
# Profit: (close_price-open_price)*Contract_Size*Lots
# EUR/USD BUY: (1.05875-1.05850)*100000*1 = +$25 (excl. commission)
_profit = price_delta * self.symbol.contract_size * volume
else:
# Cross rates. Example: 'GBP/CHF'
# point value =
#   position volume * point size * base-currency/USD rate / cross rate
# GBP/CHF: 100000*0.0001*1.48140/1.48985 = $9.94
# TODO: temporary patch (same as the previous choice) -
# in the future connect to some quotes provider and get rates
_profit = price_delta * self.symbol.contract_size * volume
elif self.symbol.mode == self.symbol.FUTURES:
# Margin: Lots *InitialMargin*Percentage/100
# Profit: (close_price-open_price)*TickPrice/TickSize*Lots
# CL BUY: (46.35-46.30)*10/0.01*1 = $50 (excl. commission!)
# EuroFX(6E) BUY: (1.05875-1.05850)*12.50/0.0001*1 = $31.25 (excl. commission)
# RTS (RIH5) BUY: (84510-84500)*12.26506/10*1 = 12.26506 (excl. commission)
# E-miniSP500 BUY: (2065.95-2065.25)*12.50/0.25 = $35 (excl. commission)
# http://americanclearing.ru/specifications.php
# http://www.moex.com/ru/contract.aspx?code=RTS-3.18
# http://www.cmegroup.com/trading/equity-index/us-index/e-mini-sandp500_contract_specifications.html
_profit = (
price_delta
* self.symbol.tick_value
/ self.symbol.tick_size
* volume
)
else:
# shares
_profit = price_delta * volume
return _profit
def calc_mae(self, low, high):
"""Return [MAE] Maximum Adverse Excursion."""
if self.type == Order.BUY:
return self.calc_profit(close_price=low)
return self.calc_profit(close_price=high)
def calc_mfe(self, low, high):
"""Return [MFE] Maximum Favorable Excursion."""
if self.type == Order.BUY:
return self.calc_profit(close_price=high)
return self.calc_profit(close_price=low)
class OrderType(Enum):
BUY = auto()
SELL = auto()
BUY_LIMIT = auto()
SELL_LIMIT = auto()
BUY_STOP = auto()
SELL_STOP = auto()
class Order:
BUY = OrderType.BUY
SELL = OrderType.SELL
BUY_LIMIT = OrderType.BUY_LIMIT
SELL_LIMIT = OrderType.SELL_LIMIT
BUY_STOP = OrderType.BUY_STOP
SELL_STOP = OrderType.SELL_STOP
@staticmethod
def open(symbol, otype, price, volume, time, sl=None, tp=None):
# TODO: add margin calculation
# and if the margin is not enough - do not open the position
position = Position(
symbol=symbol,
ptype=otype,
price=price,
volume=volume,
open_time=time,
sl=sl,
tp=tp,
)
Portfolio.add_position(position)
return position
@staticmethod
def close(position, price, time, volume=None):
# FIXME: only part of the volume may be closed here, yet the
# position status is still flipped to CLOSED
position.close(price=price, time=time, volume=volume)
def fill_zeros_with_last(arr):
"""Fill empty(zero) elements (between positions)."""
index = np.arange(len(arr))
index[arr == 0] = 0
index = np.maximum.accumulate(index)
return arr[index]
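`fill_zeros_with_last()` above is a vectorized forward-fill: zero slots are pointed back at index 0, then `np.maximum.accumulate` carries the last nonzero index forward. A stand-alone restatement with a worked example (note: like the original, a leading zero run simply keeps `arr[0]`):

```python
import numpy as np


def forward_fill_zeros(arr: np.ndarray) -> np.ndarray:
    """Replace zero runs with the last preceding nonzero value."""
    index = np.arange(len(arr))
    index[arr == 0] = 0                    # zeros point back to slot 0
    index = np.maximum.accumulate(index)   # carry last nonzero index forward
    return arr[index]
```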

View File

@ -0,0 +1,82 @@
"""Utils."""
import importlib.util
import inspect
import logging
import os
import os.path
import sys
import time
from datetime import datetime
from functools import wraps
from PyQt5 import QtCore
__all__ = (
'BASE_DIR',
'Settings',
'timeit',
'fromtimestamp',
'get_data_path',
'get_resource_path',
'strategies_from_file',
)
BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
def get_data_path(path=''):
data_path = QtCore.QStandardPaths.writableLocation(
QtCore.QStandardPaths.AppDataLocation
)
data_path = os.path.join(data_path, path)
os.makedirs(data_path, mode=0o755, exist_ok=True)
return data_path
def get_resource_path(relative_path):
# PyInstaller creates a temp folder and stores path in _MEIPASS
base_path = getattr(sys, '_MEIPASS', BASE_DIR)
return os.path.join(base_path, relative_path)
config_path = os.path.join(get_data_path(), 'Quantdom', 'config.ini')
Settings = QtCore.QSettings(config_path, QtCore.QSettings.IniFormat)
def timeit(fn):
@wraps(fn)
def wrapper(*args, **kwargs):
t = time.time()
res = fn(*args, **kwargs)
logger = logging.getLogger('runtime')
logger.debug(
'%s.%s: %.4f sec'
% (fn.__module__, fn.__qualname__, time.time() - t)
)
return res
return wrapper
def fromtimestamp(timestamp):
if timestamp == 0:
# on Windows a zero timestamp raises an error
return datetime(1970, 1, 1)
return datetime.fromtimestamp(timestamp)
def strategies_from_file(filepath):
from .strategy import AbstractStrategy
spec = importlib.util.spec_from_file_location('Strategy', filepath)
module = importlib.util.module_from_spec(spec)
spec.loader.exec_module(module)
is_strategy = lambda _class: ( # noqa:E731
inspect.isclass(_class)
and issubclass(_class, AbstractStrategy)
and _class.__name__ != 'AbstractStrategy'
)
return [_class for _, _class in inspect.getmembers(module, is_strategy)]
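`strategies_from_file()` above is an instance of the generic "import a source file as a module" pattern; stripped of the `AbstractStrategy` filter, the core looks like this sketch (hypothetical helper name):

```python
import importlib.util
import os
import tempfile


def module_from_path(name, filepath):
    """Import a python source file as a module without touching sys.path."""
    spec = importlib.util.spec_from_file_location(name, filepath)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)   # run the file's top-level code
    return module
```

`inspect.getmembers()` with a class predicate (as in the original) can then pull strategy classes out of the returned module.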

View File

@ -30,7 +30,6 @@ setup(
license='AGPLv3',
author='Tyler Goodlet',
maintainer='Tyler Goodlet',
- maintainer_email='tgoodlet@gmail.com',
url='https://github.com/pikers/piker',
platforms=['linux'],
packages=[
@ -46,25 +45,47 @@ setup(
]
},
install_requires=[
- 'click', 'colorlog', 'trio', 'attrs', 'async_generator',
- 'pygments', 'cython', 'asks', 'pandas', 'msgpack',
+ 'click',
+ 'colorlog',
+ 'attrs',
+ 'pygments',
+
+ # async
+ 'trio',
+ 'trio-websocket',
+ # 'tractor',  # from github currently
+ 'async_generator',
+
+ # brokers
+ 'asks',
+ 'ib_insync',
+
+ # numerics
+ 'arrow',  # better datetimes
+ 'cython',
+ 'numpy',
+ 'pandas',
+ 'msgpack-numpy',
+
+ # UI
+ 'PyQt5',
+ 'pyqtgraph',
+ 'qdarkstyle',
+
+ # tsdbs
+ 'pymarketstore',
#'kivy', see requirement.txt; using a custom branch atm
],
- extras_require={
-     'questrade': ['asks'],
- },
tests_require=['pytest'],
python_requires=">=3.7",  # literally for ``datetime.datetime.fromisoformat``...
keywords=["async", "trading", "finance", "quant", "charting"],
classifiers=[
'Development Status :: 3 - Alpha',
- 'License :: OSI Approved :: Mozilla Public License 2.0 (MPL 2.0)',
+ 'License :: OSI Approved :: ',
'Operating System :: POSIX :: Linux',
"Programming Language :: Python :: Implementation :: CPython",
- "Programming Language :: Python :: Implementation :: PyPy",
"Programming Language :: Python :: 3 :: Only",
- "Programming Language :: Python :: 3.5",
- "Programming Language :: Python :: 3.6",
+ "Programming Language :: Python :: 3.7",
'Intended Audience :: Financial and Insurance Industry',
'Intended Audience :: Science/Research',
'Intended Audience :: Developers',