Compare commits


No commits in common. "310_plus" and "use_tractor_logging" have entirely different histories.

125 changed files with 2027 additions and 35059 deletions


@@ -1,34 +0,0 @@
name: CI

on:
  # Triggers the workflow on push or pull request events but only for the master branch
  push:
    branches: [ master ]
  pull_request:
    branches: [ master ]

  # Allows you to run this workflow manually from the Actions tab
  workflow_dispatch:

jobs:
  testing:
    name: 'install + test-suite'
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v3

      - name: Setup python
        uses: actions/setup-python@v3
        with:
          python-version: '3.10'

      - name: Install dependencies
        run: pip install -U . -r requirements-test.txt -r requirements.txt --upgrade-strategy eager

      - name: Test suite
        run: pytest tests -rs
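
For anyone wanting to reproduce what this (now removed) workflow did, the same install-and-test flow can be run locally; this is just the two ``run`` steps lifted out of the YAML above::

    pip install -U . -r requirements-test.txt -r requirements.txt --upgrade-strategy eager
    pytest tests -rs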

.gitignore vendored (4 changed lines)

@@ -97,9 +97,5 @@ ENV/
 # mkdocs documentation
 /site
 
-# extra scripts dir
-/snippets
-
 # mypy
 .mypy_cache/
-.vscode/settings.json

.travis.yml 100644 (24 changed lines)

@@ -0,0 +1,24 @@
language: python

matrix:
  include:
    - python: 3.7
      dist: xenial
      sudo: required

before_install:
  - sudo apt-get -qq update
  # deps to build kivy from sources for use with trio
  - sudo apt-get install -y build-essential libav-tools libgles2-mesa-dev libgles2-mesa-dev libsdl2-dev libsdl2-image-dev libsdl2-mixer-dev libsdl2-ttf-dev libportmidi-dev libswscale-dev libavformat-dev libavcodec-dev zlib1g-dev

install:
  - pip install pipenv
  - cd $TRAVIS_BUILD_DIR
  - pipenv install --dev -e .

cache:
  directories:
    - $HOME/.config/piker/

script:
  - pipenv run pytest tests/
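
The Travis flow likewise maps onto a plain local session (assuming the ``kivy`` build dependencies listed above are already installed system-wide)::

    pip install pipenv
    pipenv install --dev -e .
    pipenv run pytest tests/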

LICENSE (147 changed lines)

@ -1,21 +1,23 @@
GNU AFFERO GENERAL PUBLIC LICENSE GNU GENERAL PUBLIC LICENSE
Version 3, 19 November 2007 Version 3, 29 June 2007
Copyright (C) 2007 Free Software Foundation, Inc. <https://fsf.org/> Copyright (C) 2007 Free Software Foundation, Inc. <http://fsf.org/>
Everyone is permitted to copy and distribute verbatim copies Everyone is permitted to copy and distribute verbatim copies
of this license document, but changing it is not allowed. of this license document, but changing it is not allowed.
Preamble Preamble
The GNU Affero General Public License is a free, copyleft license for The GNU General Public License is a free, copyleft license for
software and other kinds of works, specifically designed to ensure software and other kinds of works.
cooperation with the community in the case of network server software.
The licenses for most software and other practical works are designed The licenses for most software and other practical works are designed
to take away your freedom to share and change the works. By contrast, to take away your freedom to share and change the works. By contrast,
our General Public Licenses are intended to guarantee your freedom to the GNU General Public License is intended to guarantee your freedom to
share and change all versions of a program--to make sure it remains free share and change all versions of a program--to make sure it remains free
software for all its users. software for all its users. We, the Free Software Foundation, use the
GNU General Public License for most of our software; it applies also to
any other work released this way by its authors. You can apply it to
your programs, too.
When we speak of free software, we are referring to freedom, not When we speak of free software, we are referring to freedom, not
price. Our General Public Licenses are designed to make sure that you price. Our General Public Licenses are designed to make sure that you
@ -24,34 +26,44 @@ them if you wish), that you receive source code or can get it if you
want it, that you can change the software or use pieces of it in new want it, that you can change the software or use pieces of it in new
free programs, and that you know you can do these things. free programs, and that you know you can do these things.
Developers that use our General Public Licenses protect your rights To protect your rights, we need to prevent others from denying you
with two steps: (1) assert copyright on the software, and (2) offer these rights or asking you to surrender the rights. Therefore, you have
you this License which gives you legal permission to copy, distribute certain responsibilities if you distribute copies of the software, or if
and/or modify the software. you modify it: responsibilities to respect the freedom of others.
A secondary benefit of defending all users' freedom is that For example, if you distribute copies of such a program, whether
improvements made in alternate versions of the program, if they gratis or for a fee, you must pass on to the recipients the same
receive widespread use, become available for other developers to freedoms that you received. You must make sure that they, too, receive
incorporate. Many developers of free software are heartened and or can get the source code. And you must show them these terms so they
encouraged by the resulting cooperation. However, in the case of know their rights.
software used on network servers, this result may fail to come about.
The GNU General Public License permits making a modified version and
letting the public access it on a server without ever releasing its
source code to the public.
The GNU Affero General Public License is designed specifically to Developers that use the GNU GPL protect your rights with two steps:
ensure that, in such cases, the modified source code becomes available (1) assert copyright on the software, and (2) offer you this License
to the community. It requires the operator of a network server to giving you legal permission to copy, distribute and/or modify it.
provide the source code of the modified version running there to the
users of that server. Therefore, public use of a modified version, on
a publicly accessible server, gives the public access to the source
code of the modified version.
An older license, called the Affero General Public License and For the developers' and authors' protection, the GPL clearly explains
published by Affero, was designed to accomplish similar goals. This is that there is no warranty for this free software. For both users' and
a different license, not a version of the Affero GPL, but Affero has authors' sake, the GPL requires that modified versions be marked as
released a new version of the Affero GPL which permits relicensing under changed, so that their problems will not be attributed erroneously to
this license. authors of previous versions.
Some devices are designed to deny users access to install or run
modified versions of the software inside them, although the manufacturer
can do so. This is fundamentally incompatible with the aim of
protecting users' freedom to change the software. The systematic
pattern of such abuse occurs in the area of products for individuals to
use, which is precisely where it is most unacceptable. Therefore, we
have designed this version of the GPL to prohibit the practice for those
products. If such problems arise substantially in other domains, we
stand ready to extend this provision to those domains in future versions
of the GPL, as needed to protect the freedom of users.
Finally, every program is threatened constantly by software patents.
States should not allow patents to restrict development and use of
software on general-purpose computers, but in those that do, we wish to
avoid the special danger that patents applied to a free program could
make it effectively proprietary. To prevent this, the GPL assures that
patents cannot be used to render the program non-free.
The precise terms and conditions for copying, distribution and The precise terms and conditions for copying, distribution and
modification follow. modification follow.
@ -60,7 +72,7 @@ modification follow.
0. Definitions. 0. Definitions.
"This License" refers to version 3 of the GNU Affero General Public License. "This License" refers to version 3 of the GNU General Public License.
"Copyright" also means copyright-like laws that apply to other kinds of "Copyright" also means copyright-like laws that apply to other kinds of
works, such as semiconductor masks. works, such as semiconductor masks.
@ -537,45 +549,35 @@ to collect a royalty for further conveying from those to whom you convey
the Program, the only way you could satisfy both those terms and this the Program, the only way you could satisfy both those terms and this
License would be to refrain entirely from conveying the Program. License would be to refrain entirely from conveying the Program.
13. Remote Network Interaction; Use with the GNU General Public License. 13. Use with the GNU Affero General Public License.
Notwithstanding any other provision of this License, if you modify the
Program, your modified version must prominently offer all users
interacting with it remotely through a computer network (if your version
supports such interaction) an opportunity to receive the Corresponding
Source of your version by providing access to the Corresponding Source
from a network server at no charge, through some standard or customary
means of facilitating copying of software. This Corresponding Source
shall include the Corresponding Source for any work covered by version 3
of the GNU General Public License that is incorporated pursuant to the
following paragraph.
Notwithstanding any other provision of this License, you have Notwithstanding any other provision of this License, you have
permission to link or combine any covered work with a work licensed permission to link or combine any covered work with a work licensed
under version 3 of the GNU General Public License into a single under version 3 of the GNU Affero General Public License into a single
combined work, and to convey the resulting work. The terms of this combined work, and to convey the resulting work. The terms of this
License will continue to apply to the part which is the covered work, License will continue to apply to the part which is the covered work,
but the work with which it is combined will remain governed by version but the special requirements of the GNU Affero General Public License,
3 of the GNU General Public License. section 13, concerning interaction through a network will apply to the
combination as such.
14. Revised Versions of this License. 14. Revised Versions of this License.
The Free Software Foundation may publish revised and/or new versions of The Free Software Foundation may publish revised and/or new versions of
the GNU Affero General Public License from time to time. Such new versions the GNU General Public License from time to time. Such new versions will
will be similar in spirit to the present version, but may differ in detail to be similar in spirit to the present version, but may differ in detail to
address new problems or concerns. address new problems or concerns.
Each version is given a distinguishing version number. If the Each version is given a distinguishing version number. If the
Program specifies that a certain numbered version of the GNU Affero General Program specifies that a certain numbered version of the GNU General
Public License "or any later version" applies to it, you have the Public License "or any later version" applies to it, you have the
option of following the terms and conditions either of that numbered option of following the terms and conditions either of that numbered
version or of any later version published by the Free Software version or of any later version published by the Free Software
Foundation. If the Program does not specify a version number of the Foundation. If the Program does not specify a version number of the
GNU Affero General Public License, you may choose any version ever published GNU General Public License, you may choose any version ever published
by the Free Software Foundation. by the Free Software Foundation.
If the Program specifies that a proxy can decide which future If the Program specifies that a proxy can decide which future
versions of the GNU Affero General Public License can be used, that proxy's versions of the GNU General Public License can be used, that proxy's
public statement of acceptance of a version permanently authorizes you public statement of acceptance of a version permanently authorizes you
to choose that version for the Program. to choose that version for the Program.
@ -633,29 +635,40 @@ the "copyright" line and a pointer to where the full notice is found.
Copyright (C) <year> <name of author> Copyright (C) <year> <name of author>
This program is free software: you can redistribute it and/or modify This program is free software: you can redistribute it and/or modify
it under the terms of the GNU Affero General Public License as published by it under the terms of the GNU General Public License as published by
the Free Software Foundation, either version 3 of the License, or the Free Software Foundation, either version 3 of the License, or
(at your option) any later version. (at your option) any later version.
This program is distributed in the hope that it will be useful, This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU Affero General Public License for more details. GNU General Public License for more details.
You should have received a copy of the GNU Affero General Public License You should have received a copy of the GNU General Public License
along with this program. If not, see <https://www.gnu.org/licenses/>. along with this program. If not, see <http://www.gnu.org/licenses/>.
Also add information on how to contact you by electronic and paper mail. Also add information on how to contact you by electronic and paper mail.
If your software can interact with users remotely through a computer If the program does terminal interaction, make it output a short
network, you should also make sure that it provides a way for users to notice like this when it starts in an interactive mode:
get its source. For example, if your program is a web application, its
interface could display a "Source" link that leads users to an archive <program> Copyright (C) <year> <name of author>
of the code. There are many ways you could offer source, and different This program comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
solutions will be better for different programs; see section 13 for the This is free software, and you are welcome to redistribute it
specific requirements. under certain conditions; type `show c' for details.
The hypothetical commands `show w' and `show c' should show the appropriate
parts of the General Public License. Of course, your program's commands
might be different; for a GUI interface, you would use an "about box".
You should also get your employer (if you work as a programmer) or school, You should also get your employer (if you work as a programmer) or school,
if any, to sign a "copyright disclaimer" for the program, if necessary. if any, to sign a "copyright disclaimer" for the program, if necessary.
For more information on this, and how to apply and follow the GNU AGPL, see For more information on this, and how to apply and follow the GNU GPL, see
<https://www.gnu.org/licenses/>. <http://www.gnu.org/licenses/>.
The GNU General Public License does not permit incorporating your program
into proprietary programs. If your program is a subroutine library, you
may consider it more useful to permit linking proprietary applications with
the library. If this is what you want to do, use the GNU Lesser General
Public License instead of this License. But first, please read
<http://www.gnu.org/philosophy/why-not-lgpl.html>.


@@ -1,2 +1 @@
 include README.rst
-include data/brokers.toml

Pipfile 100644 (22 changed lines)

@@ -0,0 +1,22 @@
[[source]]
url = "https://pypi.python.org/simple"
verify_ssl = true
name = "pypi"
[packages]
e1839a8 = {path = ".",editable = true}
trio = "*"
Cython = "*"
# use master branch kivy since wheels seem borked (due to cython stuff)
kivy = {git = "git://github.com/kivy/kivy.git"}
pdbpp = "*"
msgpack = "*"
tractor = {git = "git://github.com/goodboy/tractor.git"}
toml = "*"
pyqtgraph = "*"
pyside2 = "*"
[dev-packages]
pytest = "*"
pdbpp = "*"
piker = {editable = true,path = "."}
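
A note on the odd ``e1839a8`` key above: it appears to be the hash-derived name ``pipenv`` auto-generates when the local project path is recorded as an editable dependency, i.e. roughly the result of::

    pipenv install -e .   # records the project itself as an editable path dep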

Pipfile.lock generated 100644 (735 changed lines)

@@ -0,0 +1,735 @@
{
"_meta": {
"hash": {
"sha256": "40988e1c2ed3ed6dc896f41b602154af1d8b22389a3d54962998a7d3b996b87f"
},
"pipfile-spec": 6,
"requires": {},
"sources": [
{
"name": "pypi",
"url": "https://pypi.python.org/simple",
"verify_ssl": true
}
]
},
"default": {
"anyio": {
"hashes": [
"sha256:0fba37b3f835ba6ebeb10e9754aff13e6fbc322f10ab1f2ab9e6c8534492930d",
"sha256:5ed13f4668d219a30b5ca1f0cde5efaa94af272c02dd04f6429aaab35906f637"
],
"version": "==1.2.0"
},
"asks": {
"hashes": [
"sha256:d7289cb5b7a28614e4fecab63b3734e2a4296d3c323e315f8dc4b546d64f71b7"
],
"version": "==2.3.6"
},
"async-generator": {
"hashes": [
"sha256:01c7bf666359b4967d2cda0000cc2e4af16a0ae098cbffcb8472fb9e8ad6585b",
"sha256:6ebb3d106c12920aaae42ccb6f787ef5eefdcdd166ea3d628fa8476abe712144"
],
"version": "==1.10"
},
"attrs": {
"hashes": [
"sha256:08a96c641c3a74e44eb59afb61a24f2cb9f4d7188748e76ba4bb5edfa3cb7d1c",
"sha256:f7b7ce16570fe9965acd6d30101a28f62fb4a7f9e926b3bbc9b61f8b04247e72"
],
"version": "==19.3.0"
},
"certifi": {
"hashes": [
"sha256:e4f3620cfea4f83eedc95b24abd9cd56f3c4b146dd0177e83a21b4eb49e21e50",
"sha256:fd7c7c74727ddcf00e9acd26bba8da604ffec95bf1c2144e67aff7a8b50e6cef"
],
"version": "==2019.9.11"
},
"chardet": {
"hashes": [
"sha256:84ab92ed1c4d4f16916e05906b6b75a6c0fb5db821cc65e70cbd64a3e2a5eaae",
"sha256:fc323ffcaeaed0e0a02bf4d117757b98aed530d9ed4531e3e15460124c106691"
],
"version": "==3.0.4"
},
"click": {
"hashes": [
"sha256:2335065e6395b9e67ca716de5f7526736bfa6ceead690adf616d925bdc622b13",
"sha256:5b94b49521f6456670fdb30cd82a4eca9412788a93fa6dd6df72c94d5a8ff2d7"
],
"version": "==7.0"
},
"colorlog": {
"hashes": [
"sha256:3cf31b25cbc8f86ec01fef582ef3b840950dea414084ed19ab922c8b493f9b42",
"sha256:450f52ea2a2b6ebb308f034ea9a9b15cea51e65650593dca1da3eb792e4e4981"
],
"version": "==4.0.2"
},
"cython": {
"hashes": [
"sha256:03f6bbb380ad0acb744fb06e42996ea217e9d00016ca0ff6f2e7d60f580d0360",
"sha256:05e8cfd3a3a6087aec49a1ae08a89171db991956209406d1e5576f9db70ece52",
"sha256:05eb79efc8029d487251c8a2702a909a8ba33c332e06d2f3980866541bd81253",
"sha256:094d28a34c3fa992ae02aea1edbe6ff89b3cc5870b6ee38b5baeb805dc57b013",
"sha256:0c70e842e52e2f50cc43bad43b5e5bc515f30821a374e544abb0e0746f2350ff",
"sha256:1dcdaa319558eb924294a554dcf6c12383ec947acc7e779e8d3622409a7f7d28",
"sha256:1fc5bdda28f25fec44e4721677458aa509d743cd350862270309d61aa148d6ff",
"sha256:280573a01d9348d44a42d6a9c651d9f7eb1fe9217df72555b2a118f902996a10",
"sha256:298ceca7b0f0da4205fcb0b7c9ac9e120e2dafffd5019ba1618e84ef89434b5a",
"sha256:4074a8bff0040035673cc6dd365a762476d6bff4d03d8ce6904e3e53f9a25dc8",
"sha256:41e7068e95fbf9ec94b41437f989caf9674135e770a39cdb9c00de459bafd1bc",
"sha256:47e5e1502d52ef03387cf9d3b3241007961a84a466e58a3b74028e1dd4957f8c",
"sha256:521340844cf388d109ceb61397f3fd5250ccb622a1a8e93559e8de76c80940a9",
"sha256:6c53338c1811f8c6d7f8cb7abd874810b15045e719e8207f957035c9177b4213",
"sha256:75c2dda47dcc3c77449712b1417bb6b89ec3b7b02e18c64262494dceffdf455e",
"sha256:773c5a98e463b52f7e8197254b39b703a5ea1972aef3a94b3b921515d77dd041",
"sha256:78c3068dcba300d473fef57cdf523e34b37de522f5a494ef9ee1ac9b4b8bbe3f",
"sha256:7bc18fc5a170f2c1cef5387a3d997c28942918bbee0f700e73fd2178ee8d474d",
"sha256:7f89eff20e4a7a64b55210dac17aea711ed8a3f2e78f2ff784c0e984302583dd",
"sha256:89458b49976b1dee5d89ab4ac943da3717b4292bf624367e862e4ee172fcce99",
"sha256:986f871c0fa649b293061236b93782d25c293a8dd8117c7ba05f8a61bdc261ae",
"sha256:a0f495a4fe5278aab278feee35e6102efecde5176a8a74dd28c28e3fc5c8d7c7",
"sha256:a14aa436586c41633339415de82a41164691d02d3e661038da533be5d40794a5",
"sha256:b8ab3ab38afc47d8f4fe629b836243544351cef681b6bdb1dc869028d6fdcbfb",
"sha256:bb487881608ebd293592553c618f0c83316f4f13a64cb18605b1d2fb9fd3da3e",
"sha256:c0b24bfe3431b3cb7ced323bca813dbd13aca973a1475b512d3331fd0de8ec60",
"sha256:c7894c06205166d360ab2915ae306d1f7403e9ce3d3aaeff4095eaf98e42ce66",
"sha256:d4039bb7f234ad32267c55e72fd49fb56078ea102f9d9d8559f6ec34d4887630",
"sha256:e4d6bb8703d0319eb04b7319b12ea41580df44fd84d83ccda13ea463c6801414",
"sha256:e8fab9911fd2fa8e5af407057cb8bdf87762f983cba483fa3234be20a9a0af77",
"sha256:f3818e578e687cdb21dc4aa4a3bc6278c656c9c393e9eda14dd04943f478863d",
"sha256:fe666645493d72712c46e4fbe8bec094b06aec3c337400479e9704439c9d9586"
],
"index": "pypi",
"version": "==0.29.14"
},
"docutils": {
"hashes": [
"sha256:6c4f696463b79f1fb8ba0c594b63840ebd41f059e92b31957c46b74a4599b6d0",
"sha256:9e4d7ecfc600058e07ba661411a2b7de2fd0fafa17d1a7f7361cd47b1175c827",
"sha256:a2aeea129088da402665e92e0b25b04b073c04b2dce4ab65caaa38b7ce2e1a99"
],
"version": "==0.15.2"
},
"e1839a8": {
"editable": true,
"path": "."
},
"fancycompleter": {
"hashes": [
"sha256:d2522f1f3512371f295379c4c0d1962de06762eb586c199620a2a5d423539b12"
],
"version": "==0.8"
},
"h11": {
"hashes": [
"sha256:33d4bca7be0fa039f4e84d50ab00531047e53d6ee8ffbc83501ea602c169cae1",
"sha256:4bc6d6a1238b7615b266ada57e0618568066f57dd6fa967d1290ec9309b2f2f1"
],
"version": "==0.9.0"
},
"helpdev": {
"hashes": [
"sha256:6cb2c604d97101a47b81d6d234b48e0f452efe4490b4cc67c33708420c2deaa0",
"sha256:9e61d24458b7506809670222ca656b139e67d46c530cd227a899780152d7b44e"
],
"version": "==0.6.10"
},
"idna": {
"hashes": [
"sha256:c357b3f628cf53ae2c4c05627ecc484553142ca23264e593d327bcde5e9c3407",
"sha256:ea8b7f6188e6fa117537c3df7da9fc686d485087abf6ac197f9c46432f7e4a3c"
],
"version": "==2.8"
},
"importlib-metadata": {
"hashes": [
"sha256:aa18d7378b00b40847790e7c27e11673d7fed219354109d0e7b9e5b25dc3ad26",
"sha256:d5f18a79777f3aa179c145737780282e27b508fc8fd688cb17c7a813e8bd39af"
],
"version": "==0.23"
},
"kivy": {
"hashes": [
"sha256:090d3ded9835a17477cd93fbdaf0a7c42ff2218981cf198ded5ad8795bc74391",
"sha256:11e85eaf6efbfa2362a3334ffdad179a1b0ca8d255cca79eaa6a2765560d4982",
"sha256:1a1ff32f8a95f1e175198cbab81fcd2596783b180d4eafe63e87d171aa7fdb5e",
"sha256:1d28b198a64c30db8d94a0488e85f3037af60d514ab0d7ad5ab45add3ab77090",
"sha256:4a5480cbf837d3780c77a4f61b32b56d22ae9f03845e7a89dd3eaef1ae5fd037",
"sha256:4d0e596f74271e901b551f77661dde238df4765484fce9f5d1c72e8022984e84",
"sha256:5c3d0f2749522d62e9cce09cd54b2d823bf1b6b644ff1f627be49de6f3e3cba0",
"sha256:815a5c0b3b72fcd81ca7b2aa0744087163ed03e4cf9ab4e7c9733cea99fc1571",
"sha256:8819a27a09871af451760cb69486ced52e830c8a0a37480f22ef5e692f12c05b",
"sha256:a687602d90c4629dd036f577ca39acb76ba581370f9d915f3cab99be818ba8ad",
"sha256:b7ef6aad43a86d8df3fb865db864e354f2155a748019f8517f69f65c1a29cb64",
"sha256:b85ccf165050cbf2ee8447671eebbc222b369b40f0e0038dd9547d49a5e37373",
"sha256:c36652caa7f6c327dee834cfc699d5962d346b7a53e54bd81abc17c314226d89",
"sha256:ece170514db3f49844a41e4c910ad9ce9bc46da6f47a49158e11266bdcc6e479",
"sha256:f3bea6e4a21991827885d04127fc6d09a0e974ecfa12da7bf5faae93562ea102",
"sha256:f835462dd9aa491272552ef079b948a088598e2e95d68bb1d885d2c3f3d4e2c3"
],
"index": "pypi",
"version": "==1.11.1"
},
"kivy-garden": {
"hashes": [
"sha256:c256f42788421273a08fbb0a228f0fb0e80dd86b629fb8c0920507f645be6c72"
],
"version": "==0.1.4"
},
"more-itertools": {
"hashes": [
"sha256:409cd48d4db7052af495b09dec721011634af3753ae1ef92d2b32f73a745f832",
"sha256:92b8c4b06dac4f0611c0729b2f2ede52b2e1bac1ab48f089c7ddc12e26bb60c4"
],
"version": "==7.2.0"
},
"msgpack": {
"hashes": [
"sha256:0cc7ca04e575ba34fea7cfcd76039f55def570e6950e4155a4174368142c8e1b",
"sha256:187794cd1eb73acccd528247e3565f6760bd842d7dc299241f830024a7dd5610",
"sha256:1904b7cb65342d0998b75908304a03cb004c63ef31e16c8c43fee6b989d7f0d7",
"sha256:229a0ccdc39e9b6c6d1033cd8aecd9c296823b6c87f0de3943c59b8bc7c64bee",
"sha256:24149a75643aeaa81ece4259084d11b792308a6cf74e796cbb35def94c89a25a",
"sha256:30b88c47e0cdb6062daed88ca283b0d84fa0d2ad6c273aa0788152a1c643e408",
"sha256:32fea0ea3cd1ef820286863a6202dcfd62a539b8ec3edcbdff76068a8c2cc6ce",
"sha256:355f7fd0f90134229eaeefaee3cf42e0afc8518e8f3cd4b25f541a7104dcb8f9",
"sha256:4abdb88a9b67e64810fb54b0c24a1fd76b12297b4f7a1467d85a14dd8367191a",
"sha256:757bd71a9b89e4f1db0622af4436d403e742506dbea978eba566815dc65ec895",
"sha256:76df51492bc6fa6cc8b65d09efdb67cbba3cbfe55004c3afc81352af92b4a43c",
"sha256:774f5edc3475917cd95fe593e625d23d8580f9b48b570d8853d06cac171cd170",
"sha256:8a3ada8401736df2bf497f65589293a86c56e197a80ae7634ec2c3150a2f5082",
"sha256:a06efd0482a1942aad209a6c18321b5e22d64eb531ea20af138b28172d8f35ba",
"sha256:b24afc52e18dccc8c175de07c1d680bdf315844566f4952b5bedb908894bec79",
"sha256:b8b4bd3dafc7b92608ae5462add1c8cc881851c2d4f5d8977fdea5b081d17f21",
"sha256:c6e5024fc0cdf7f83b6624850309ddd7e06c48a75fa0d1c5173de4d93300eb19",
"sha256:db7ff14abc73577b0bcbcf73ecff97d3580ecaa0fc8724babce21fdf3fe08ef6",
"sha256:dedf54d72d9e7b6d043c244c8213fe2b8bbfe66874b9a65b39c4cc892dd99dd4",
"sha256:ea3c2f859346fcd55fc46e96885301d9c2f7a36d453f5d8f2967840efa1e1830",
"sha256:f0f47bafe9c9b8ed03e19a100a743662dd8c6d0135e684feea720a0d0046d116"
],
"index": "pypi",
"version": "==0.6.2"
},
"numpy": {
"hashes": [
"sha256:0a7a1dd123aecc9f0076934288ceed7fd9a81ba3919f11a855a7887cbe82a02f",
"sha256:0c0763787133dfeec19904c22c7e358b231c87ba3206b211652f8cbe1241deb6",
"sha256:3d52298d0be333583739f1aec9026f3b09fdfe3ddf7c7028cb16d9d2af1cca7e",
"sha256:43bb4b70585f1c2d153e45323a886839f98af8bfa810f7014b20be714c37c447",
"sha256:475963c5b9e116c38ad7347e154e5651d05a2286d86455671f5b1eebba5feb76",
"sha256:64874913367f18eb3013b16123c9fed113962e75d809fca5b78ebfbb73ed93ba",
"sha256:683828e50c339fc9e68720396f2de14253992c495fdddef77a1e17de55f1decc",
"sha256:6ca4000c4a6f95a78c33c7dadbb9495c10880be9c89316aa536eac359ab820ae",
"sha256:75fd817b7061f6378e4659dd792c84c0b60533e867f83e0d1e52d5d8e53df88c",
"sha256:7d81d784bdbed30137aca242ab307f3e65c8d93f4c7b7d8f322110b2e90177f9",
"sha256:8d0af8d3664f142414fd5b15cabfd3b6cc3ef242a3c7a7493257025be5a6955f",
"sha256:9679831005fb16c6df3dd35d17aa31dc0d4d7573d84f0b44cc481490a65c7725",
"sha256:a8f67ebfae9f575d85fa859b54d3bdecaeece74e3274b0b5c5f804d7ca789fe1",
"sha256:acbf5c52db4adb366c064d0b7c7899e3e778d89db585feadd23b06b587d64761",
"sha256:ada4805ed51f5bcaa3a06d3dd94939351869c095e30a2b54264f5a5004b52170",
"sha256:c7354e8f0eca5c110b7e978034cd86ed98a7a5ffcf69ca97535445a595e07b8e",
"sha256:e2e9d8c87120ba2c591f60e32736b82b67f72c37ba88a4c23c81b5b8fa49c018",
"sha256:e467c57121fe1b78a8f68dd9255fbb3bb3f4f7547c6b9e109f31d14569f490c3",
"sha256:ede47b98de79565fcd7f2decb475e2dcc85ee4097743e551fe26cfc7eb3ff143",
"sha256:f58913e9227400f1395c7b800503ebfdb0772f1c33ff8cb4d6451c06cabdf316",
"sha256:fe39f5fd4103ec4ca3cb8600b19216cd1ff316b4990f4c0b6057ad982c0a34d5"
],
"version": "==1.17.4"
},
"outcome": {
"hashes": [
"sha256:ee46c5ce42780cde85d55a61819d0e6b8cb490f1dbd749ba75ff2629771dcd2d",
"sha256:fc7822068ba7dd0fc2532743611e8a73246708d3564e29a39f93d6ab3701b66f"
],
"version": "==1.0.1"
},
"pandas": {
"hashes": [
"sha256:00dff3a8e337f5ed7ad295d98a31821d3d0fe7792da82d78d7fd79b89c03ea9d",
"sha256:22361b1597c8c2ffd697aa9bf85423afa9e1fcfa6b1ea821054a244d5f24d75e",
"sha256:255920e63850dc512ce356233081098554d641ba99c3767dde9e9f35630f994b",
"sha256:26382aab9c119735908d94d2c5c08020a4a0a82969b7e5eefb92f902b3b30ad7",
"sha256:33970f4cacdd9a0ddb8f21e151bfb9f178afb7c36eb7c25b9094c02876f385c2",
"sha256:4545467a637e0e1393f7d05d61dace89689ad6d6f66f267f86fff737b702cce9",
"sha256:52da74df8a9c9a103af0a72c9d5fdc8e0183a90884278db7f386b5692a2220a4",
"sha256:61741f5aeb252f39c3031d11405305b6d10ce663c53bc3112705d7ad66c013d0",
"sha256:6a3ac2c87e4e32a969921d1428525f09462770c349147aa8e9ab95f88c71ec71",
"sha256:7458c48e3d15b8aaa7d575be60e1e4dd70348efcd9376656b72fecd55c59a4c3",
"sha256:78bf638993219311377ce9836b3dc05f627a666d0dbc8cec37c0ff3c9ada673b",
"sha256:8153705d6545fd9eb6dd2bc79301bff08825d2e2f716d5dced48daafc2d0b81f",
"sha256:975c461accd14e89d71772e89108a050fa824c0b87a67d34cedf245f6681fc17",
"sha256:9962957a27bfb70ab64103d0a7b42fa59c642fb4ed4cb75d0227b7bb9228535d",
"sha256:adc3d3a3f9e59a38d923e90e20c4922fc62d1e5a03d083440468c6d8f3f1ae0a",
"sha256:bbe3eb765a0b1e578833d243e2814b60c825b7fdbf4cdfe8e8aae8a08ed56ecf",
"sha256:df8864824b1fe488cf778c3650ee59c3a0d8f42e53707de167ba6b4f7d35f133",
"sha256:e45055c30a608076e31a9fcd780a956ed3b1fa20db61561b8d88b79259f526f7",
"sha256:ee50c2142cdcf41995655d499a157d0a812fce55c97d9aad13bc1eef837ed36c"
],
"version": "==0.25.3"
},
"pdbpp": {
"hashes": [
"sha256:73ff220d5006e0ecdc3e2705d8328d8aa5ac27fef95cc06f6e42cd7d22d55eb8"
],
"index": "pypi",
"version": "==0.10.2"
},
"psutil": {
"hashes": [
"sha256:021d361439586a0fd8e64f8392eb7da27135db980f249329f1a347b9de99c695",
"sha256:145e0f3ab9138165f9e156c307100905fd5d9b7227504b8a9d3417351052dc3d",
"sha256:348ad4179938c965a27d29cbda4a81a1b2c778ecd330a221aadc7bd33681afbd",
"sha256:3feea46fbd634a93437b718518d15b5dd49599dfb59a30c739e201cc79bb759d",
"sha256:474e10a92eeb4100c276d4cc67687adeb9d280bbca01031a3e41fb35dfc1d131",
"sha256:47aeb4280e80f27878caae4b572b29f0ec7967554b701ba33cd3720b17ba1b07",
"sha256:73a7e002781bc42fd014dfebb3fc0e45f8d92a4fb9da18baea6fb279fbc1d966",
"sha256:d051532ac944f1be0179e0506f6889833cf96e466262523e57a871de65a15147",
"sha256:dfb8c5c78579c226841908b539c2374da54da648ee5a837a731aa6a105a54c00",
"sha256:e3f5f9278867e95970854e92d0f5fe53af742a7fc4f2eba986943345bcaed05d",
"sha256:e9649bb8fc5cea1f7723af53e4212056a6f984ee31784c10632607f472dec5ee"
],
"version": "==5.6.5"
},
"pygments": {
"hashes": [
"sha256:71e430bc85c88a430f000ac1d9b331d2407f681d6f6aec95e8bcfbc3df5b0127",
"sha256:881c4c157e45f30af185c1ffe8d549d48ac9127433f2c380c24b84572ad66297"
],
"version": "==2.4.2"
},
"pyqtgraph": {
"hashes": [
"sha256:4c08ab34881fae5ecf9ddfe6c1220b9e41e6d3eb1579a7d8ef501abb8e509251"
],
"index": "pypi",
"version": "==0.10.0"
},
"pyside2": {
"hashes": [
"sha256:589b90944c24046d31bf76694590a600d59d20130015086491b793a81753629a",
"sha256:63cc845434388b398b79b65f7b5312b9b5348fbc772d84092c9245efbf341197",
"sha256:7c57fe60ed57a3a8b95d9163abca9caa803a1470f29b40bff8ef4103b97a96c8",
"sha256:7c61a6883f3474939097b9dabc80f028887046be003ce416da1b3565a08d1f92",
"sha256:ed6d22c7a3a99f480d4c9348bcced97ef7bc0c9d353ad3665ae705e8eb61feb5",
"sha256:ede8ed6e7021232184829d16614eb153f188ea250862304eac35e04b2bd0120c"
],
"index": "pypi",
"version": "==5.13.2"
},
"python-dateutil": {
"hashes": [
"sha256:73ebfe9dbf22e832286dafa60473e4cd239f8592f699aa5adaf10050e6e1823c",
"sha256:75bb3f31ea686f1197762692a9ee6a7550b59fc6ca3a1f4b5d7e32fb98e2da2a"
],
"version": "==2.8.1"
},
"pytz": {
"hashes": [
"sha256:1c557d7d0e871de1f5ccd5833f60fb2550652da6be2693c1e02300743d21500d",
"sha256:b02c06db6cf09c12dd25137e563b31700d3b80fcc4ad23abb7a315f2789819be"
],
"version": "==2019.3"
},
"qdarkstyle": {
"hashes": [
"sha256:c012bfaa482a1444b477b43a0b3585ea2f844ac16bfa2dff40ea535ce669c054",
"sha256:de7aba62119a839556afd7b3b962588410b6249cbcfcbd56e51e6e827264c38f"
],
"index": "pypi",
"version": "==2.7"
},
"requests": {
"hashes": [
"sha256:11e007a8a2aa0323f5a921e9e6a2d7e4e67d9877e85773fba9ba6419025cbeb4",
"sha256:9cf5292fcd0f598c671cfc1e0d7d1a7f13bb8085e9a590f48c010551dc6c4b31"
],
"version": "==2.22.0"
},
"shiboken2": {
"hashes": [
"sha256:5e84a4b4e7ab08bb5db0a8168e5d0316fbf3c25b788012701a82079faadfb19b",
"sha256:7c766c4160636a238e0e4430e2f40b504b13bcc4951902eb78cd5c971f26c898",
"sha256:81fa9b288c6c4b4c91220fcca2002eadb48fc5c3238e8bd88e982e00ffa77c53",
"sha256:ca08a3c95b1b20ac2b243b7b06379609bd73929dbc27b28c01415feffe3bcea1",
"sha256:e2f72b5cfdb8b48bdb55bda4b42ec7d36d1bce0be73d6d7d4a358225d6fb5f25",
"sha256:e6543506cb353d417961b9ec3c6fc726ec2f72eeab609dc88943c2e5cb6d6408"
],
"version": "==5.13.2"
},
"six": {
"hashes": [
"sha256:1f1b7d42e254082a9db6279deae68afb421ceba6158efa6131de7b3003ee93fd",
"sha256:30f610279e8b2578cab6db20741130331735c781b56053c59c4076da27f06b66"
],
"version": "==1.13.0"
},
"sniffio": {
"hashes": [
"sha256:20ed6d5b46f8ae136d00b9dcb807615d83ed82ceea6b2058cecb696765246da5",
"sha256:8e3810100f69fe0edd463d02ad407112542a11ffdc29f67db2bf3771afb87a21"
],
"version": "==1.1.0"
},
"sortedcontainers": {
"hashes": [
"sha256:974e9a32f56b17c1bac2aebd9dcf197f3eb9cd30553c5852a3187ad162e1a03a",
"sha256:d9e96492dd51fae31e60837736b38fe42a187b5404c16606ff7ee7cd582d4c60"
],
"version": "==2.1.0"
},
"toml": {
"hashes": [
"sha256:229f81c57791a41d65e399fc06bf0848bab550a9dfd5ed66df18ce5f05e73d5c",
"sha256:235682dd292d5899d361a811df37e04a8828a5b1da3115886b73cf81ebc9100e"
],
"index": "pypi",
"version": "==0.10.0"
},
"tractor": {
"git": "git://github.com/goodboy/tractor.git",
"ref": "ab349cdb8d8cbf2e3d48c0589cb710a43483f233"
},
"trio": {
"hashes": [
"sha256:a6d83c0cb4a177ec0f5179ce88e27914d5c8e6fd01c4285176b949e6ddc88c6c",
"sha256:f1cf00054ad974c86d9b7afa187a65d79fd5995340abe01e8e4784d86f4acb30"
],
"index": "pypi",
"version": "==0.13.0"
},
"urllib3": {
"hashes": [
"sha256:a8a318824cc77d1fd4b2bec2ded92646630d7fe8619497b142c84a9e6f5a7293",
"sha256:f3c5fd51747d450d4dcf6f923c81f78f811aab8205fda64b0aba34a4e48b0745"
],
"version": "==1.25.7"
},
"wmctrl": {
"hashes": [
"sha256:d806f65ac1554366b6e31d29d7be2e8893996c0acbb2824bbf2b1f49cf628a13"
],
"version": "==0.3"
},
"zipp": {
"hashes": [
"sha256:3718b1cbcd963c7d4c5511a8240812904164b7f381b647143a89d3b98f9bcd8e",
"sha256:f06903e9f1f43b12d371004b4ac7b06ab39a44adc747266928ae6debfa7b3335"
],
"version": "==0.6.0"
}
},
"develop": {
"anyio": {
"hashes": [
"sha256:0fba37b3f835ba6ebeb10e9754aff13e6fbc322f10ab1f2ab9e6c8534492930d",
"sha256:5ed13f4668d219a30b5ca1f0cde5efaa94af272c02dd04f6429aaab35906f637"
],
"version": "==1.2.0"
},
"asks": {
"hashes": [
"sha256:d7289cb5b7a28614e4fecab63b3734e2a4296d3c323e315f8dc4b546d64f71b7"
],
"version": "==2.3.6"
},
"async-generator": {
"hashes": [
"sha256:01c7bf666359b4967d2cda0000cc2e4af16a0ae098cbffcb8472fb9e8ad6585b",
"sha256:6ebb3d106c12920aaae42ccb6f787ef5eefdcdd166ea3d628fa8476abe712144"
],
"version": "==1.10"
},
"atomicwrites": {
"hashes": [
"sha256:03472c30eb2c5d1ba9227e4c2ca66ab8287fbfbbda3888aa93dc2e28fc6811b4",
"sha256:75a9445bac02d8d058d5e1fe689654ba5a6556a1dfd8ce6ec55a0ed79866cfa6"
],
"version": "==1.3.0"
},
"attrs": {
"hashes": [
"sha256:08a96c641c3a74e44eb59afb61a24f2cb9f4d7188748e76ba4bb5edfa3cb7d1c",
"sha256:f7b7ce16570fe9965acd6d30101a28f62fb4a7f9e926b3bbc9b61f8b04247e72"
],
"version": "==19.3.0"
},
"click": {
"hashes": [
"sha256:2335065e6395b9e67ca716de5f7526736bfa6ceead690adf616d925bdc622b13",
"sha256:5b94b49521f6456670fdb30cd82a4eca9412788a93fa6dd6df72c94d5a8ff2d7"
],
"version": "==7.0"
},
"colorlog": {
"hashes": [
"sha256:3cf31b25cbc8f86ec01fef582ef3b840950dea414084ed19ab922c8b493f9b42",
"sha256:450f52ea2a2b6ebb308f034ea9a9b15cea51e65650593dca1da3eb792e4e4981"
],
"version": "==4.0.2"
},
"cython": {
"hashes": [
"sha256:03f6bbb380ad0acb744fb06e42996ea217e9d00016ca0ff6f2e7d60f580d0360",
"sha256:05e8cfd3a3a6087aec49a1ae08a89171db991956209406d1e5576f9db70ece52",
"sha256:05eb79efc8029d487251c8a2702a909a8ba33c332e06d2f3980866541bd81253",
"sha256:094d28a34c3fa992ae02aea1edbe6ff89b3cc5870b6ee38b5baeb805dc57b013",
"sha256:0c70e842e52e2f50cc43bad43b5e5bc515f30821a374e544abb0e0746f2350ff",
"sha256:1dcdaa319558eb924294a554dcf6c12383ec947acc7e779e8d3622409a7f7d28",
"sha256:1fc5bdda28f25fec44e4721677458aa509d743cd350862270309d61aa148d6ff",
"sha256:280573a01d9348d44a42d6a9c651d9f7eb1fe9217df72555b2a118f902996a10",
"sha256:298ceca7b0f0da4205fcb0b7c9ac9e120e2dafffd5019ba1618e84ef89434b5a",
"sha256:4074a8bff0040035673cc6dd365a762476d6bff4d03d8ce6904e3e53f9a25dc8",
"sha256:41e7068e95fbf9ec94b41437f989caf9674135e770a39cdb9c00de459bafd1bc",
"sha256:47e5e1502d52ef03387cf9d3b3241007961a84a466e58a3b74028e1dd4957f8c",
"sha256:521340844cf388d109ceb61397f3fd5250ccb622a1a8e93559e8de76c80940a9",
"sha256:6c53338c1811f8c6d7f8cb7abd874810b15045e719e8207f957035c9177b4213",
"sha256:75c2dda47dcc3c77449712b1417bb6b89ec3b7b02e18c64262494dceffdf455e",
"sha256:773c5a98e463b52f7e8197254b39b703a5ea1972aef3a94b3b921515d77dd041",
"sha256:78c3068dcba300d473fef57cdf523e34b37de522f5a494ef9ee1ac9b4b8bbe3f",
"sha256:7bc18fc5a170f2c1cef5387a3d997c28942918bbee0f700e73fd2178ee8d474d",
"sha256:7f89eff20e4a7a64b55210dac17aea711ed8a3f2e78f2ff784c0e984302583dd",
"sha256:89458b49976b1dee5d89ab4ac943da3717b4292bf624367e862e4ee172fcce99",
"sha256:986f871c0fa649b293061236b93782d25c293a8dd8117c7ba05f8a61bdc261ae",
"sha256:a0f495a4fe5278aab278feee35e6102efecde5176a8a74dd28c28e3fc5c8d7c7",
"sha256:a14aa436586c41633339415de82a41164691d02d3e661038da533be5d40794a5",
"sha256:b8ab3ab38afc47d8f4fe629b836243544351cef681b6bdb1dc869028d6fdcbfb",
"sha256:bb487881608ebd293592553c618f0c83316f4f13a64cb18605b1d2fb9fd3da3e",
"sha256:c0b24bfe3431b3cb7ced323bca813dbd13aca973a1475b512d3331fd0de8ec60",
"sha256:c7894c06205166d360ab2915ae306d1f7403e9ce3d3aaeff4095eaf98e42ce66",
"sha256:d4039bb7f234ad32267c55e72fd49fb56078ea102f9d9d8559f6ec34d4887630",
"sha256:e4d6bb8703d0319eb04b7319b12ea41580df44fd84d83ccda13ea463c6801414",
"sha256:e8fab9911fd2fa8e5af407057cb8bdf87762f983cba483fa3234be20a9a0af77",
"sha256:f3818e578e687cdb21dc4aa4a3bc6278c656c9c393e9eda14dd04943f478863d",
"sha256:fe666645493d72712c46e4fbe8bec094b06aec3c337400479e9704439c9d9586"
],
"index": "pypi",
"version": "==0.29.14"
},
"fancycompleter": {
"hashes": [
"sha256:d2522f1f3512371f295379c4c0d1962de06762eb586c199620a2a5d423539b12"
],
"version": "==0.8"
},
"h11": {
"hashes": [
"sha256:33d4bca7be0fa039f4e84d50ab00531047e53d6ee8ffbc83501ea602c169cae1",
"sha256:4bc6d6a1238b7615b266ada57e0618568066f57dd6fa967d1290ec9309b2f2f1"
],
"version": "==0.9.0"
},
"idna": {
"hashes": [
"sha256:c357b3f628cf53ae2c4c05627ecc484553142ca23264e593d327bcde5e9c3407",
"sha256:ea8b7f6188e6fa117537c3df7da9fc686d485087abf6ac197f9c46432f7e4a3c"
],
"version": "==2.8"
},
"more-itertools": {
"hashes": [
"sha256:409cd48d4db7052af495b09dec721011634af3753ae1ef92d2b32f73a745f832",
"sha256:92b8c4b06dac4f0611c0729b2f2ede52b2e1bac1ab48f089c7ddc12e26bb60c4"
],
"version": "==7.2.0"
},
"msgpack": {
"hashes": [
"sha256:0cc7ca04e575ba34fea7cfcd76039f55def570e6950e4155a4174368142c8e1b",
"sha256:187794cd1eb73acccd528247e3565f6760bd842d7dc299241f830024a7dd5610",
"sha256:1904b7cb65342d0998b75908304a03cb004c63ef31e16c8c43fee6b989d7f0d7",
"sha256:229a0ccdc39e9b6c6d1033cd8aecd9c296823b6c87f0de3943c59b8bc7c64bee",
"sha256:24149a75643aeaa81ece4259084d11b792308a6cf74e796cbb35def94c89a25a",
"sha256:30b88c47e0cdb6062daed88ca283b0d84fa0d2ad6c273aa0788152a1c643e408",
"sha256:32fea0ea3cd1ef820286863a6202dcfd62a539b8ec3edcbdff76068a8c2cc6ce",
"sha256:355f7fd0f90134229eaeefaee3cf42e0afc8518e8f3cd4b25f541a7104dcb8f9",
"sha256:4abdb88a9b67e64810fb54b0c24a1fd76b12297b4f7a1467d85a14dd8367191a",
"sha256:757bd71a9b89e4f1db0622af4436d403e742506dbea978eba566815dc65ec895",
"sha256:76df51492bc6fa6cc8b65d09efdb67cbba3cbfe55004c3afc81352af92b4a43c",
"sha256:774f5edc3475917cd95fe593e625d23d8580f9b48b570d8853d06cac171cd170",
"sha256:8a3ada8401736df2bf497f65589293a86c56e197a80ae7634ec2c3150a2f5082",
"sha256:a06efd0482a1942aad209a6c18321b5e22d64eb531ea20af138b28172d8f35ba",
"sha256:b24afc52e18dccc8c175de07c1d680bdf315844566f4952b5bedb908894bec79",
"sha256:b8b4bd3dafc7b92608ae5462add1c8cc881851c2d4f5d8977fdea5b081d17f21",
"sha256:c6e5024fc0cdf7f83b6624850309ddd7e06c48a75fa0d1c5173de4d93300eb19",
"sha256:db7ff14abc73577b0bcbcf73ecff97d3580ecaa0fc8724babce21fdf3fe08ef6",
"sha256:dedf54d72d9e7b6d043c244c8213fe2b8bbfe66874b9a65b39c4cc892dd99dd4",
"sha256:ea3c2f859346fcd55fc46e96885301d9c2f7a36d453f5d8f2967840efa1e1830",
"sha256:f0f47bafe9c9b8ed03e19a100a743662dd8c6d0135e684feea720a0d0046d116"
],
"index": "pypi",
"version": "==0.6.2"
},
"numpy": {
"hashes": [
"sha256:0a7a1dd123aecc9f0076934288ceed7fd9a81ba3919f11a855a7887cbe82a02f",
"sha256:0c0763787133dfeec19904c22c7e358b231c87ba3206b211652f8cbe1241deb6",
"sha256:3d52298d0be333583739f1aec9026f3b09fdfe3ddf7c7028cb16d9d2af1cca7e",
"sha256:43bb4b70585f1c2d153e45323a886839f98af8bfa810f7014b20be714c37c447",
"sha256:475963c5b9e116c38ad7347e154e5651d05a2286d86455671f5b1eebba5feb76",
"sha256:64874913367f18eb3013b16123c9fed113962e75d809fca5b78ebfbb73ed93ba",
"sha256:683828e50c339fc9e68720396f2de14253992c495fdddef77a1e17de55f1decc",
"sha256:6ca4000c4a6f95a78c33c7dadbb9495c10880be9c89316aa536eac359ab820ae",
"sha256:75fd817b7061f6378e4659dd792c84c0b60533e867f83e0d1e52d5d8e53df88c",
"sha256:7d81d784bdbed30137aca242ab307f3e65c8d93f4c7b7d8f322110b2e90177f9",
"sha256:8d0af8d3664f142414fd5b15cabfd3b6cc3ef242a3c7a7493257025be5a6955f",
"sha256:9679831005fb16c6df3dd35d17aa31dc0d4d7573d84f0b44cc481490a65c7725",
"sha256:a8f67ebfae9f575d85fa859b54d3bdecaeece74e3274b0b5c5f804d7ca789fe1",
"sha256:acbf5c52db4adb366c064d0b7c7899e3e778d89db585feadd23b06b587d64761",
"sha256:ada4805ed51f5bcaa3a06d3dd94939351869c095e30a2b54264f5a5004b52170",
"sha256:c7354e8f0eca5c110b7e978034cd86ed98a7a5ffcf69ca97535445a595e07b8e",
"sha256:e2e9d8c87120ba2c591f60e32736b82b67f72c37ba88a4c23c81b5b8fa49c018",
"sha256:e467c57121fe1b78a8f68dd9255fbb3bb3f4f7547c6b9e109f31d14569f490c3",
"sha256:ede47b98de79565fcd7f2decb475e2dcc85ee4097743e551fe26cfc7eb3ff143",
"sha256:f58913e9227400f1395c7b800503ebfdb0772f1c33ff8cb4d6451c06cabdf316",
"sha256:fe39f5fd4103ec4ca3cb8600b19216cd1ff316b4990f4c0b6057ad982c0a34d5"
],
"version": "==1.17.4"
},
"outcome": {
"hashes": [
"sha256:ee46c5ce42780cde85d55a61819d0e6b8cb490f1dbd749ba75ff2629771dcd2d",
"sha256:fc7822068ba7dd0fc2532743611e8a73246708d3564e29a39f93d6ab3701b66f"
],
"version": "==1.0.1"
},
"packaging": {
"hashes": [
"sha256:28b924174df7a2fa32c1953825ff29c61e2f5e082343165438812f00d3a7fc47",
"sha256:d9551545c6d761f3def1677baf08ab2a3ca17c56879e70fecba2fc4dde4ed108"
],
"version": "==19.2"
},
"pandas": {
"hashes": [
"sha256:00dff3a8e337f5ed7ad295d98a31821d3d0fe7792da82d78d7fd79b89c03ea9d",
"sha256:22361b1597c8c2ffd697aa9bf85423afa9e1fcfa6b1ea821054a244d5f24d75e",
"sha256:255920e63850dc512ce356233081098554d641ba99c3767dde9e9f35630f994b",
"sha256:26382aab9c119735908d94d2c5c08020a4a0a82969b7e5eefb92f902b3b30ad7",
"sha256:33970f4cacdd9a0ddb8f21e151bfb9f178afb7c36eb7c25b9094c02876f385c2",
"sha256:4545467a637e0e1393f7d05d61dace89689ad6d6f66f267f86fff737b702cce9",
"sha256:52da74df8a9c9a103af0a72c9d5fdc8e0183a90884278db7f386b5692a2220a4",
"sha256:61741f5aeb252f39c3031d11405305b6d10ce663c53bc3112705d7ad66c013d0",
"sha256:6a3ac2c87e4e32a969921d1428525f09462770c349147aa8e9ab95f88c71ec71",
"sha256:7458c48e3d15b8aaa7d575be60e1e4dd70348efcd9376656b72fecd55c59a4c3",
"sha256:78bf638993219311377ce9836b3dc05f627a666d0dbc8cec37c0ff3c9ada673b",
"sha256:8153705d6545fd9eb6dd2bc79301bff08825d2e2f716d5dced48daafc2d0b81f",
"sha256:975c461accd14e89d71772e89108a050fa824c0b87a67d34cedf245f6681fc17",
"sha256:9962957a27bfb70ab64103d0a7b42fa59c642fb4ed4cb75d0227b7bb9228535d",
"sha256:adc3d3a3f9e59a38d923e90e20c4922fc62d1e5a03d083440468c6d8f3f1ae0a",
"sha256:bbe3eb765a0b1e578833d243e2814b60c825b7fdbf4cdfe8e8aae8a08ed56ecf",
"sha256:df8864824b1fe488cf778c3650ee59c3a0d8f42e53707de167ba6b4f7d35f133",
"sha256:e45055c30a608076e31a9fcd780a956ed3b1fa20db61561b8d88b79259f526f7",
"sha256:ee50c2142cdcf41995655d499a157d0a812fce55c97d9aad13bc1eef837ed36c"
],
"version": "==0.25.3"
},
"pdbpp": {
"hashes": [
"sha256:73ff220d5006e0ecdc3e2705d8328d8aa5ac27fef95cc06f6e42cd7d22d55eb8"
],
"index": "pypi",
"version": "==0.10.2"
},
"piker": {
"editable": true,
"path": "."
},
"pluggy": {
"hashes": [
"sha256:0db4b7601aae1d35b4a033282da476845aa19185c1e6964b25cf324b5e4ec3e6",
"sha256:fa5fa1622fa6dd5c030e9cad086fa19ef6a0cf6d7a2d12318e10cb49d6d68f34"
],
"version": "==0.13.0"
},
"py": {
"hashes": [
"sha256:64f65755aee5b381cea27766a3a147c3f15b9b6b9ac88676de66ba2ae36793fa",
"sha256:dc639b046a6e2cff5bbe40194ad65936d6ba360b52b3c3fe1d08a82dd50b5e53"
],
"version": "==1.8.0"
},
"pygments": {
"hashes": [
"sha256:71e430bc85c88a430f000ac1d9b331d2407f681d6f6aec95e8bcfbc3df5b0127",
"sha256:881c4c157e45f30af185c1ffe8d549d48ac9127433f2c380c24b84572ad66297"
],
"version": "==2.4.2"
},
"pyparsing": {
"hashes": [
"sha256:20f995ecd72f2a1f4bf6b072b63b22e2eb457836601e76d6e5dfcd75436acc1f",
"sha256:4ca62001be367f01bd3e92ecbb79070272a9d4964dce6a48a82ff0b8bc7e683a"
],
"version": "==2.4.5"
},
"pytest": {
"hashes": [
"sha256:8e256fe71eb74e14a4d20a5987bb5e1488f0511ee800680aaedc62b9358714e8",
"sha256:ff0090819f669aaa0284d0f4aad1a6d9d67a6efdc6dd4eb4ac56b704f890a0d6"
],
"index": "pypi",
"version": "==5.2.4"
},
"python-dateutil": {
"hashes": [
"sha256:73ebfe9dbf22e832286dafa60473e4cd239f8592f699aa5adaf10050e6e1823c",
"sha256:75bb3f31ea686f1197762692a9ee6a7550b59fc6ca3a1f4b5d7e32fb98e2da2a"
],
"version": "==2.8.1"
},
"pytz": {
"hashes": [
"sha256:1c557d7d0e871de1f5ccd5833f60fb2550652da6be2693c1e02300743d21500d",
"sha256:b02c06db6cf09c12dd25137e563b31700d3b80fcc4ad23abb7a315f2789819be"
],
"version": "==2019.3"
},
"six": {
"hashes": [
"sha256:1f1b7d42e254082a9db6279deae68afb421ceba6158efa6131de7b3003ee93fd",
"sha256:30f610279e8b2578cab6db20741130331735c781b56053c59c4076da27f06b66"
],
"version": "==1.13.0"
},
"sniffio": {
"hashes": [
"sha256:20ed6d5b46f8ae136d00b9dcb807615d83ed82ceea6b2058cecb696765246da5",
"sha256:8e3810100f69fe0edd463d02ad407112542a11ffdc29f67db2bf3771afb87a21"
],
"version": "==1.1.0"
},
"sortedcontainers": {
"hashes": [
"sha256:974e9a32f56b17c1bac2aebd9dcf197f3eb9cd30553c5852a3187ad162e1a03a",
"sha256:d9e96492dd51fae31e60837736b38fe42a187b5404c16606ff7ee7cd582d4c60"
],
"version": "==2.1.0"
},
"trio": {
"hashes": [
"sha256:a6d83c0cb4a177ec0f5179ce88e27914d5c8e6fd01c4285176b949e6ddc88c6c",
"sha256:f1cf00054ad974c86d9b7afa187a65d79fd5995340abe01e8e4784d86f4acb30"
],
"index": "pypi",
"version": "==0.13.0"
},
"wcwidth": {
"hashes": [
"sha256:3df37372226d6e63e1b1e1eda15c594bca98a22d33a23832a90998faa96bc65e",
"sha256:f4ebe71925af7b40a864553f761ed559b43544f8f71746c2d756c7fe788ade7c"
],
"version": "==0.1.7"
},
"wmctrl": {
"hashes": [
"sha256:d806f65ac1554366b6e31d29d7be2e8893996c0acbb2824bbf2b1f49cf628a13"
],
"version": "==0.3"
}
}
}
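
Since both ``tractor`` and ``kivy`` are VCS dependencies, the lockfile pins them to exact refs (see the ``ref`` field under ``tractor`` above); a matching environment can then presumably be recreated from the lock alone::

    pipenv sync        # install exactly the pinned "default" section
    pipenv sync --dev  # also install the "develop" section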


@@ -1,231 +1,98 @@
 piker
 -----
-trading gear for hackers.
+Trading gear for hackers.
 
-|gh_actions|
+|travis|
 
-.. |gh_actions| image:: https://img.shields.io/endpoint.svg?url=https%3A%2F%2Factions-badge.atrox.dev%2Fpikers%2Fpiker%2Fbadge&style=popout-square
-    :target: https://actions-badge.atrox.dev/piker/pikers/goto
+``piker`` is an attempt at a pro-grade, broker agnostic, next-gen FOSS toolset for real-time
+trading and financial analysis.
 
-``piker`` is a broker agnostic, next-gen FOSS toolset for real-time
-computational trading targeted at `hardcore Linux users <comp_trader>`_ .
+It tries to use as much cutting edge tech as possible including (but not limited to):
 
-we use as much bleeding edge tech as possible including (but not limited to):
-
-- latest python for glue_
-- trio_ for `structured concurrency`_
-- tractor_ for distributed, multi-core, real-time streaming
-- marketstore_ for historical and real-time tick data persistence and sharing
-- techtonicdb_ for L2 book storage
-- Qt_ for pristine high performance UIs
-- pyqtgraph_ for real-time charting
-- ``numpy`` and ``numba`` for `fast numerics`_
+- Python 3.7+
+- trio_
+- tractor_
+- kivy_
 
 .. |travis| image:: https://img.shields.io/travis/pikers/piker/master.svg
     :target: https://travis-ci.org/pikers/piker
 .. _trio: https://github.com/python-trio/trio
 .. _tractor: https://github.com/goodboy/tractor
-.. _structured concurrency: https://trio.discourse.group/
-.. _marketstore: https://github.com/alpacahq/marketstore
-.. _techtonicdb: https://github.com/0b01/tectonicdb
-.. _Qt: https://www.qt.io/
-.. _pyqtgraph: https://github.com/pyqtgraph/pyqtgraph
-.. _glue: https://numpy.org/doc/stable/user/c-info.python-as-glue.html#using-python-as-glue
-.. _fast numerics: https://zerowithdot.com/python-numpy-and-pandas-performance/
-.. _comp_trader: https://jfaleiro.wordpress.com/2019/10/09/computational-trader/
+.. _kivy: https://kivy.org
 
-focus and features:
-*******************
-- 100% federated: your code, your hardware, your data feeds, your broker fills.
-- zero web: low latency, native software that doesn't try to re-invent the OS
-- maximal **privacy**: prevent brokers and mms from knowing your
-  planz; smack their spreads with dark volume.
-- zero clutter: modal, context oriented UIs that eschew minimalism, reduce
-  thought noise and encourage un-emotion.
-- first class parallelism: built from the ground up on next-gen structured concurrency
-  primitives.
-- traders first: broker/exchange/asset-class agnostic
-- systems grounded: real-time financial signal processing that will
-  make any queuing or DSP eng juice their shorts.
-- non-tina UX: sleek, powerful keyboard driven interaction with expected use in tiling wms
-- data collaboration: every process and protocol is multi-host scalable.
-- fight club ready: zero interest in adoption by suits; no corporate friendly license, ever.
+Also, we're always open to new framework suggestions and ideas!
 
-fitting with these tenets, we're always open to new framework suggestions and ideas.
+Building the best looking, most reliable, keyboard friendly trading platform is the dream.
+Feel free to pipe in with your ideas and quiffs.
 
-building the best looking, most reliable, keyboard friendly trading
-platform is the dream; join the cause.
-
-install
+Install
 *******
-``piker`` is currently under heavy pre-alpha development and as such
-should be cloned from this repo and hacked on directly.
+``piker`` is currently under heavy pre-alpha development and as such should
+be cloned from this repo and hacked on directly.
 
-for a development install::
+A couple bleeding edge components are being used atm pertaining to
+async ports of libraries for use with `trio`_.
+
+Before installing make sure you have `pipenv`_ and have installed
+``python3.7`` as well as `kivy source build`_ dependencies
+since currently there's reliance on an async development branch.
+
+`kivy` dependencies
+===================
+On Archlinux you need the following dependencies::
+
+    pacman -S python-docutils gstreamer sdl2_ttf sdl2_mixer sdl2_image xclip
+
+To manually install the async branch of ``kivy`` from github do (though
+this should be done as part of the ``pipenv install`` below)::
+
+    pipenv install -e 'git+git://github.com/matham/kivy.git@async-loop#egg=kivy'
+
+.. _kivy source build:
+    https://kivy.org/docs/installation/installation-linux.html#installation-in-a-virtual-environment
+
+For a development install::
 
     git clone git@github.com:pikers/piker.git
     cd piker
-    virtualenv env
-    source ./env/bin/activate
-    pip install -r requirements.txt -e .
+    pipenv install --pre -e .
+    pipenv shell
 
-install for tinas
-*****************
-for windows peeps you can start by installing all the prerequisite software:
+Broker Support
+**************
+For live data feeds the only fully functional broker at the moment is Questrade_.
+Eventual support is in the works for `IB`, `TD Ameritrade` and `IEX`.
+If you want your broker supported and they have an API let us know.
 
-- install git with all default settings - https://git-scm.com/download/win
-- install anaconda all default settings - https://www.anaconda.com/products/individual
-- install microsoft build tools (check the box for Desktop development for C++, you might be able to uncheck some optional downloads) - https://visualstudio.microsoft.com/visual-cpp-build-tools/
-- install visual studio code default settings - https://code.visualstudio.com/download
+.. _Questrade: https://www.questrade.com/api/documentation
 
-then, `crack a conda shell`_ and run the following commands::
+Play with some UIs
+******************
+To start the real-time index monitor with the `questrade` backend::
 
-    mkdir code # create code directory
-    cd code # change directory to code
-    git clone https://github.com/pikers/piker.git # downloads piker installation package from github
-    cd piker # change directory to piker
-
-    conda create -n pikonda # creates conda environment named pikonda
-    conda activate pikonda # activates pikonda
-
-    conda install -c conda-forge python-levenshtein # in case it is not already installed
-    conda install pip # may already be installed
-    pip # will show if pip is installed
-
-    pip install -e . -r requirements.txt # install piker in editable mode
-
-test Piker to see if it is working::
-
-    piker -b binance chart btcusdt.binance # formatting for loading a chart
-    piker -b kraken -b binance chart xbtusdt.kraken
-    piker -b kraken -b binance -b ib chart qqq.nasdaq.ib
-    piker -b ib chart tsla.nasdaq.ib
-
-potential error::
-
-    FileNotFoundError: [Errno 2] No such file or directory: 'C:\\Users\\user\\AppData\\Roaming\\piker\\brokers.toml'
-
-solution:
-
-- navigate to file directory above (may be different on your machine, location should be listed in the error code)
-- copy and paste file from 'C:\\Users\\user\\code\\data/brokers.toml' or create a blank file using notepad at the location above
-
-Visual Studio Code setup:
-
-- now that piker is installed we can set up vscode as the default terminal for running piker and editing the code
-- open Visual Studio Code
-- file --> Add Folder to Workspace --> C:\Users\user\code\piker (adds piker directory where all piker files are located)
-- file --> Save Workspace As --> save it wherever you want and call it whatever you want, this is going to be your default workspace for running and editing piker code
-- ctrl + shift + p --> start typing Python: Select Interpreter --> when the option comes up select it --> Select at the workspace level --> select the one that shows ('pikonda')
-- change the default terminal to cmd.exe instead of powershell (default)
-- now when you create a new terminal VScode should automatically activate your conda env so that piker can be run as the first command after a new terminal is created
-
-also, try out fancyzones as part of powertoyz for a decent tiling windows manager to manage all the cool new software you are going to be running.
-
-.. _conda installed: https://
-.. _C++ build toolz: https://
-.. _crack a conda shell: https://
-.. _vscode: https://
-
-.. link to the tina guide
-.. _setup a coolio tiled wm console: https://
-
-provider support
-****************
-for live data feeds the in-progress set of supported brokers is:
-
-- IB_ via ``ib_insync``, also see our `container docs`_
-- binance_ and kraken_ for crypto over their public websocket API
-- questrade_ (ish) which comes with effectively free L1
-
-coming soon...
-
-- webull_ via the reverse engineered public API
-- yahoo via yliveticker_
-
-if you want your broker supported and they have an API let us know.
-
-.. _IB: https://interactivebrokers.github.io/tws-api/index.html
-.. _container docs: https://github.com/pikers/piker/tree/master/dockering/ib
-.. _questrade: https://www.questrade.com/api/documentation
-.. _kraken: https://www.kraken.com/features/api#public-market-data
-.. _binance: https://github.com/pikers/piker/pull/182
-.. _webull: https://github.com/tedchou12/webull
-.. _yliveticker: https://github.com/yahoofinancelive/yliveticker
-.. _coinbase: https://docs.pro.coinbase.com/#websocket-feed
-
-check out our charts
-********************
-bet you weren't expecting this from the foss::
-
-    piker -l info -b kraken -b binance chart btcusdt.binance --pdb
-
-this runs the main chart (currently with 1m sampled OHLC) in debug
-mode and you can practice paper trading using the following
-micro-manual:
-
-``order_mode`` (
-    edge triggered activation by any of the following keys,
-    ``mouse-click`` on y-level to submit at that price
-    ):
-
-    - ``f``/ ``ctl-f`` to stage buy
-    - ``d``/ ``ctl-d`` to stage sell
-    - ``a`` to stage alert
-
-``search_mode`` (
-    ``ctl-l`` or ``ctl-space`` to open,
-    ``ctl-c`` or ``ctl-space`` to close
-    ) :
-
-    - begin typing to have symbol search automatically lookup
-      symbols from all loaded backend (broker) providers
-    - arrow keys and mouse click to navigate selection
-    - vi-like ``ctl-[hjkl]`` for navigation
-
-you can also configure your position allocation limits from the
-sidepane.
-
-run in distributed mode
-***********************
-start the service manager and data feed daemon in the background and
-connect to it::
-
-    pikerd -l info --pdb
-
-connect your chart::
-
-    piker -l info -b kraken -b binance chart xmrusdt.binance --pdb
-
-enjoy persistent real-time data feeds tied to daemon lifetime. the next
-time you spawn a chart it will load much faster since the data feed has
-been cached and is now always running live in the background until you
-kill ``pikerd``.
-
-if anyone asks you what this project is about
-*********************************************
-you don't talk about it.
-
-how do i get involved?
-**********************
-enter the matrix.
-
-how come there ain't that many docs
-***********************************
-suck it up, learn the code; no one is trying to sell you on anything.
-also, we need lotsa help so if you want to start somewhere and can't
-necessarily write serious code, this might be the place for you!
+    piker -l info monitor indexes
+
+If you want to see super granular price changes, increase the
+broker quote query ``rate`` with ``-r``::
+
+    piker monitor indexes -r 10
+
+It is also possible to run the broker data feed micro service as a daemon::
+
+    pikerd -l info
+
+Then start the client app as normal::
+
+    piker monitor indexes
+
+.. _pipenv: https://docs.pipenv.org/

View File

@ -1,52 +0,0 @@
[questrade]
refresh_token = ""
access_token = ""
api_server = "https://api06.iq.questrade.com/"
expires_in = 1800
token_type = "Bearer"
expires_at = 1616095326.355846
[kraken]
key_descr = "api_0"
api_key = ""
secret = ""
[ib]
hosts = [
"127.0.0.1",
]
# XXX: the order in which ports will be scanned
# (by the `brokerd` daemon-actor)
# is determined by the line order here.
# TODO: when we eventually spawn gateways in our
# container, we can just dynamically allocate these
# using IBC.
ports = [
4002, # gw
7497, # tws
]
# XXX: for a paper account the flex web query service
# is not supported so you have to manually download
# an XML report and put it in a location that can be
# accessed by the ``brokerd.ib`` backend code for parsing.
flex_token = '666666666666666666666666'
flex_trades_query_id = '666666' # live account
# when clients are being scanned this determines
# which clients are preferred to be used for data
# feeds based on the order of account names, if
# detected as active on an API client.
prefer_data_account = [
'paper',
'margin',
'ira',
]
[ib.accounts]
# the order in which accounts will be selectable
# in the order mode UI (if found via clients during
# API-app scanning) when a new symbol is loaded.
paper = "XX0000000"
margin = "X0000000"
ira = "X0000000"

View File

@ -1,30 +0,0 @@
running ``ib`` gateway in ``docker``
------------------------------------
We have a config based on the (now defunct)
image from "waytrade":
https://github.com/waytrade/ib-gateway-docker
To startup this image with our custom settings
simply run the command::
docker compose up
And you should have the following socket-available services:
- ``x11vnc1@127.0.0.1:3003``
- ``ib-gw@127.0.0.1:4002``
You can attach to the container via a VNC client
without password auth.
SECURITY STUFF!?!?!
-------------------
Though "``ib``" claims they host filter connections outside
localhost (aka ``127.0.0.1``) it's probably better if you filter
the socket at the OS level using a stateless firewall rule::
ip rule add not unicast iif lo to 0.0.0.0/0 dport 4002
We will soon have this baked into our own custom image but for
now you'll have to do it urself dawgy.

View File

@ -1,64 +0,0 @@
# rework from the original @
# https://github.com/waytrade/ib-gateway-docker/blob/master/docker-compose.yml
version: "3.5"
services:
ib-gateway:
# other image tags available:
# https://github.com/waytrade/ib-gateway-docker#supported-tags
image: waytrade/ib-gateway:981.3j
restart: always
network_mode: 'host'
volumes:
- type: bind
source: ./jts.ini
target: /root/Jts/jts.ini
# don't let IBC clobber this file for
# the main reason of not having a stupid
# timezone set..
read_only: true
# force our own IBC config
- type: bind
source: ./ibc.ini
target: /root/ibc/config.ini
# force our noop script - socat isn't needed in host mode.
- type: bind
source: ./fork_ports_delayed.sh
target: /root/scripts/fork_ports_delayed.sh
# force our noop script - socat isn't needed in host mode.
- type: bind
source: ./run_x11_vnc.sh
target: /root/scripts/run_x11_vnc.sh
read_only: true
# NOTE: to fill these out, define an `.env` file in the same dir as
# this compose file which looks something like:
# TWS_USERID='myuser'
# TWS_PASSWORD='guest'
# TRADING_MODE=paper (or live)
# VNC_SERVER_PASSWORD='diggity'
environment:
TWS_USERID: ${TWS_USERID}
TWS_PASSWORD: ${TWS_PASSWORD}
TRADING_MODE: ${TRADING_MODE:-paper}
VNC_SERVER_PASSWORD: ${VNC_SERVER_PASSWORD:-}
# ports:
# - target: 4002
# host_ip: 127.0.0.1
# published: 4002
# protocol: tcp
# original mappings for use in non-host-mode
# which we won't really need going forward since
# ideally we just pick the port to have ib-gw listen
# on **when** we spawn the container - i.e. everything
# will be driven by a ``brokers.toml`` def.
# - "127.0.0.1:4001:4001"
# - "127.0.0.1:4002:4002"
# - "127.0.0.1:5900:5900"

View File

@ -1,6 +0,0 @@
#!/bin/sh
# we now just set this is to a noop script
# since we can just run the container in
# `network_mode: 'host'` and get literally
# the exact same behaviour XD

View File

@ -1,711 +0,0 @@
# Note that in the comments in this file, TWS refers to both the Trader
# Workstation and the IB Gateway, unless explicitly stated otherwise.
#
# When referred to below, the default value for a setting is the value
# assumed if either the setting is included but no value is specified, or
# the setting is not included at all.
#
# IBC may also be used to start the FIX CTCI Gateway. All settings
# relating to this have names prefixed with FIX.
#
# The IB API Gateway and the FIX CTCI Gateway share the same code. Which
# gateway actually runs is governed by an option on the initial gateway
# login screen. The FIX setting described under IBC Startup
# Settings below controls this.
# =============================================================================
# 1. IBC Startup Settings
# =============================================================================
# IBC may be used to start the IB Gateway for the FIX CTCI. This
# setting must be set to 'yes' if you want to run the FIX CTCI gateway. The
# default is 'no'.
FIX=no
# =============================================================================
# 2. Authentication Settings
# =============================================================================
# TWS and the IB API gateway require a single username and password.
# You may specify the username and password using the following settings:
#
# IbLoginId
# IbPassword
#
# Alternatively, you can specify the username and password in the command
# files used to start TWS or the Gateway, but this is not recommended for
# security reasons.
#
# If you don't specify them, you will be prompted for them in the usual
# login dialog when TWS starts (but whatever you have specified will be
# included in the dialog automatically: for example you may specify the
# username but not the password, and then you will be prompted for the
# password via the login dialog). Note that if you specify either
# the username or the password (or both) in the command file, then
# IbLoginId and IbPassword settings defined in this file are ignored.
#
#
# The FIX CTCI gateway requires one username and password for FIX order
# routing, and optionally a separate username and password for market
# data connections. You may specify the usernames and passwords using
# the following settings:
#
# FIXLoginId
# FIXPassword
# IbLoginId (optional - for market data connections)
# IbPassword (optional - for market data connections)
#
# Alternatively you can specify the FIX username and password in the
# command file used to start the FIX CTCI Gateway, but this is not
# recommended for security reasons.
#
# If you don't specify them, you will be prompted for them in the usual
# login dialog when FIX CTCI gateway starts (but whatever you have
# specified will be included in the dialog automatically: for example
# you may specify the usernames but not the passwords, and then you will
# be prompted for the passwords via the login dialog). Note that if you
# specify either the FIX username or the FIX password (or both) on the
# command line, then FIXLoginId and FIXPassword settings defined in this
# file are ignored; the same applies to the market data username and
# password.
# IB API Authentication Settings
# ------------------------------
# Your TWS username:
IbLoginId=
# Your TWS password:
IbPassword=
# FIX CTCI Authentication Settings
# --------------------------------
# Your FIX CTCI username:
FIXLoginId=
# Your FIX CTCI password:
FIXPassword=
# Second Factor Authentication Settings
# -------------------------------------
# If you have enabled more than one second factor authentication
# device, TWS presents a list from which you must select the device
# you want to use for this login. You can use this setting to
# instruct IBC to select a particular item in the list on your
# behalf. Note that you must spell this value exactly as it appears
# in the list. If no value is set, you must manually select the
# relevant list entry.
SecondFactorDevice=
# If you use the IBKR Mobile app for second factor authentication,
# and you fail to complete the process before the time limit imposed
# by IBKR, you can use this setting to tell IBC to exit: arrangements
# can then be made to automatically restart IBC in order to initiate
# the login sequence afresh. Otherwise, manual intervention at TWS's
# Second Factor Authentication dialog is needed to complete the
# login.
#
# Permitted values are 'yes' and 'no'. The default is 'no'.
#
# Note that the scripts provided with the IBC zips for Windows and
# Linux provide options to automatically restart in these
# circumstances, but only if this setting is also set to 'yes'.
ExitAfterSecondFactorAuthenticationTimeout=no
# This setting is only relevant if
# ExitAfterSecondFactorAuthenticationTimeout is set to 'yes'.
#
# It controls how long (in seconds) IBC waits for login to complete
# after the user acknowledges the second factor authentication
# alert at the IBKR Mobile app. If login has not completed after
# this time, IBC terminates.
# The default value is 40.
SecondFactorAuthenticationExitInterval=
# Trading Mode
# ------------
#
# TWS 955 introduced a new Trading Mode combo box on its login
# dialog. This indicates whether the live account or the paper
# trading account corresponding to the supplied credentials is
# to be used. The allowed values are 'live' (the default) and
# 'paper'. For earlier versions of TWS this setting has no
# effect.
TradingMode=
# Paper-trading Account Warning
# -----------------------------
#
# Logging in to a paper-trading account results in TWS displaying
# a dialog asking the user to confirm that they are aware that this
# is not a brokerage account. Until this dialog has been accepted,
# TWS will not allow API connections to succeed. Setting this
# to 'yes' (the default) will cause IBC to automatically
# confirm acceptance. Setting it to 'no' will leave the dialog
# on display, and the user will have to deal with it manually.
AcceptNonBrokerageAccountWarning=yes
# Login Dialog Display Timeout
#-----------------------------
#
# In some circumstances, starting TWS may result in failure to display
# the login dialog. Restarting TWS may help to resolve this situation,
# and IBC does this automatically.
#
# This setting controls how long (in seconds) IBC waits for the login
# dialog to appear before restarting TWS.
#
# Note that in normal circumstances with a reasonably specified
# computer the time to displaying the login dialog is typically less
# than 20 seconds, and frequently much less. However many factors can
# influence this, and it is unwise to set this value too low.
#
# The default value is 60.
LoginDialogDisplayTimeout = 60
# =============================================================================
# 3. TWS Startup Settings
# =============================================================================
# Path to settings store
# ----------------------
#
# Path to the directory where TWS should store its settings. This is
# normally the folder in which TWS is installed. However you may set
# it to some other location if you wish (for example if you want to
# run multiple instances of TWS with different settings).
#
# It is recommended for clarity that you use an absolute path. The
# effect of using a relative path is undefined.
#
# Linux and macOS users should use the appropriate path syntax.
#
# Note that, for Windows users, you MUST use double separator
# characters to separate the elements of the folder path: for
# example, IbDir=C:\\IBLiveSettings is valid, but
# IbDir=C:\IBLiveSettings is NOT valid and will give unexpected
# results. Linux and macOS users need not use double separators,
# but they are acceptable.
#
# The default is the current working directory when IBC is
# started.
IbDir=/root/Jts
# Store settings on server
# ------------------------
#
# If you wish to store a copy of your TWS settings on IB's
# servers as well as locally on your computer, set this to
# 'yes': this enables you to run TWS on different computers
# with the same configuration, market data lines, etc. If set
# to 'no', running TWS on different computers will not share the
# same settings. If no value is specified, TWS will obtain its
# settings from the same place as the last time this user logged
# in (whether manually or using IBC).
StoreSettingsOnServer=
# Minimize TWS on startup
# -----------------------
#
# Set to 'yes' to minimize TWS when it starts:
MinimizeMainWindow=no
# Existing Session Detected Action
# --------------------------------
#
# When a user logs on to an IBKR account for trading purposes by any means, the
# IBKR account server checks to see whether the account is already logged in
# elsewhere. If so, a dialog is displayed to both the users that enables them
# to determine what happens next. The 'ExistingSessionDetectedAction' setting
# instructs TWS how to proceed when it displays this dialog:
#
# * If the new TWS session is set to 'secondary', the existing session continues
# and the new session terminates. Thus a secondary TWS session can never
# override any other session.
#
# * If the existing TWS session is set to 'primary', the existing session
# continues and the new session terminates (even if the new session is also
# set to primary). Thus a primary TWS session can never be overridden by
# any new session).
#
# * If both the existing and the new TWS sessions are set to 'primaryoverride',
# the existing session terminates and the new session proceeds.
#
# * If the existing TWS session is set to 'manual', the user must handle the
# dialog.
#
# The difference between 'primary' and 'primaryoverride' is that a
# 'primaryoverride' session can be overridden by a new 'primary' session,
# but a 'primary' session cannot be overridden by any other session.
#
# When set to 'primary', if another TWS session is started and manually told to
# end the 'primary' session, the 'primary' session is automatically reconnected.
#
# The default is 'manual'.
ExistingSessionDetectedAction=primary
# Override TWS API Port Number
# ----------------------------
#
# If OverrideTwsApiPort is set to an integer, IBC changes the
# 'Socket port' in TWS's API configuration to that number shortly
# after startup. Leaving the setting blank will make no change to
# the current setting. This setting is only intended for use in
# certain specialized situations where the port number needs to
# be set dynamically at run-time: most users will never need it,
# so don't use it unless you know you need it.
OverrideTwsApiPort=4002
# Read-only Login
# ---------------
#
# If ReadOnlyLogin is set to 'yes', and the user is enrolled in IB's
# account security programme, the user will not be asked to perform
# the second factor authentication action, and login to TWS will
# occur automatically in read-only mode: in this mode, placing or
# managing orders is not allowed. If set to 'no', and the user is
# enrolled in IB's account security programme, the user must perform
# the relevant second factor authentication action to complete the
# login.
# If the user is not enrolled in IB's account security programme,
# this setting is ignored. The default is 'no'.
ReadOnlyLogin=no
# Read-only API
# -------------
#
# If ReadOnlyApi is set to 'yes', API programs cannot submit, modify
# or cancel orders. If set to 'no', API programs can do these things.
# If not set, the existing TWS/Gateway configuration is unchanged.
# NB: this setting is really only supplied for the benefit of new TWS
# or Gateway instances that are being automatically installed and
# started without user intervention (eg Docker containers). Where
# a user is involved, they should use the Global Configuration to
# set the relevant checkbox (this only needs to be done once) and
# not provide a value for this setting.
ReadOnlyApi=no
# Market data size for US stocks - lots or shares
# -----------------------------------------------
#
# Since IB introduced the option of market data for US stocks showing
# bid, ask and last sizes in shares rather than lots, TWS and Gateway
# display a dialog immediately after login notifying the user about
# this and requiring user input before allowing market data to be
# accessed. The user can request that the dialog not be shown again.
#
# It is recommended that the user should handle this dialog manually
# rather than using these settings, which are provided for situations
# where the user interface is not easily accessible, or where user
# settings are not preserved between sessions (eg some Docker images).
#
# - If this setting is set to 'accept', the dialog will be handled
# automatically and the option to not show it again will be
# selected.
#
# Note that in this case, the only way to allow the dialog to be
# displayed again is to manually enable the 'Bid, Ask and Last
# Size Display Update' message in the 'Messages' section of the TWS
# configuration dialog. So you should only use 'Accept' if you are
# sure you really don't want the dialog to be displayed again, or
# you have easy access to the user interface.
#
# - If set to 'defer', the dialog will be handled automatically (so
# that market data will start), but the option to not show it again
# will not be selected, and it will be shown again after the next
# login.
#
# - If set to 'ignore', the user has to deal with the dialog manually.
#
# The default value is 'ignore'.
#
# Note if set to 'accept' or 'defer', TWS also automatically sets
# the API settings checkbox labelled 'Send market data in lots for
# US stocks for dual-mode API clients'. IBC cannot prevent this.
# However you can change this immediately by setting
# SendMarketDataInLotsForUSstocks (see below) to 'no'.
AcceptBidAskLastSizeDisplayUpdateNotification=accept
# This setting determines whether the API settings checkbox labelled
# 'Send market data in lots for US stocks for dual-mode API clients'
# is set or cleared. If set to 'yes', the checkbox is set. If set to
# 'no' the checkbox is cleared. If defaulted, the checkbox is
# unchanged.
SendMarketDataInLotsForUSstocks=
# =============================================================================
# 4. TWS Auto-Closedown
# =============================================================================
#
# IMPORTANT NOTE: Starting with TWS 974, this setting no longer
# works properly, because IB have changed the way TWS handles its
# autologoff mechanism.
#
# You should now configure the TWS autologoff time to something
# convenient for you, and restart IBC each day.
#
# Alternatively, discontinue use of IBC and use the auto-relogin
# mechanism within TWS 974 and later versions (note that the
# auto-relogin mechanism provided by IB is not available if you
# use IBC).
# Set to yes or no (lower case).
#
# yes means allow TWS to shut down automatically at its
# specified shutdown time, which is set via the TWS
# configuration menu.
#
# no means TWS never shuts down automatically.
#
# NB: IB recommends that you do not keep TWS running
# continuously. If you set this setting to 'no', you may
# experience incorrect TWS operation.
#
# NB: the default for this setting is 'no'. Since this will
# only work properly with TWS versions earlier than 974, you
# should explicitly set this to 'yes' for version 974 and later.
IbAutoClosedown=yes
# =============================================================================
# 5. TWS Tidy Closedown Time
# =============================================================================
#
# NB: starting with TWS 974 this is no longer a useful option
# because both TWS and Gateway now have the same auto-logoff
# mechanism, and IBC can no longer avoid this.
#
# Note that giving this setting a value does not change TWS's
# auto-logoff in any way: any setting will be additional to the
# TWS auto-logoff.
#
# To tell IBC to tidily close TWS at a specified time every
# day, set this value to <hh:mm>, for example:
# ClosedownAt=22:00
#
# To tell IBC to tidily close TWS at a specified day and time
# each week, set this value to <dayOfWeek hh:mm>, for example:
# ClosedownAt=Friday 22:00
#
# Note that the day of the week must be specified using your
# default locale. Also note that Java will only accept
# characters encoded to ISO 8859-1 (Latin-1). This means that
# if the day name in your default locale uses any non-Latin-1
# characters you need to encode them using Unicode escapes
# (see http://java.sun.com/docs/books/jls/third_edition/html/lexical.html#3.3
# for details). For example, to tidily close TWS at 12:00 on
# Saturday where the default locale is Simplified Chinese,
# use the following:
# #ClosedownAt=\u661F\u671F\u516D 12:00
ClosedownAt=
# =============================================================================
# 6. Other TWS Settings
# =============================================================================
# Accept Incoming Connection
# --------------------------
#
# If set to 'accept', IBC automatically accepts incoming
# API connection dialogs. If set to 'reject', IBC
# automatically rejects incoming API connection dialogs. If
# set to 'manual', the user must decide whether to accept or reject
# incoming API connection dialogs. The default is 'manual'.
# NB: it is recommended to set this to 'reject', and to explicitly
# configure which IP addresses can connect to the API in TWS's API
# configuration page, as this is much more secure (in this case, no
# incoming API connection dialogs will occur for those IP addresses).
AcceptIncomingConnectionAction=reject
# Allow Blind Trading
# -------------------
#
# If you attempt to place an order for a contract for which
# you have no market data subscription, TWS displays a dialog
# to warn you against such blind trading.
#
# yes means the dialog is dismissed as though the user had
# clicked the 'Ok' button: this means that you accept
# the risk and want the order to be submitted.
#
# no means the dialog remains on display and must be
# handled by the user.
AllowBlindTrading=yes
# Save Settings on a Schedule
# ---------------------------
#
# You can tell TWS to automatically save its settings on a schedule
# of your choosing. You can specify one or more specific times,
# like this:
#
# SaveTwsSettingsAt=HH:MM [ HH:MM]...
#
# for example:
# SaveTwsSettingsAt=08:00 12:30 17:30
#
# Or you can specify an interval at which settings are to be saved,
# optionally starting at a specific time and continuing until another
# time, like this:
#
#SaveTwsSettingsAt=Every n [{mins | hours}] [hh:mm] [hh:mm]
#
# where the first hh:mm is the start time and the second is the end
# time. If you don't specify the end time, settings are saved regularly
# from the start time till midnight. If you don't specify the start time,
# settings are saved regularly all day, beginning at 00:00. Note that
# settings will always be saved at the end time, even if that is not
# exactly one interval later than the previous time. If neither 'mins'
# nor 'hours' is specified, 'mins' is assumed. Examples:
#
# To save every 30 minutes all day starting at 00:00
#SaveTwsSettingsAt=Every 30
#SaveTwsSettingsAt=Every 30 mins
#
# To save every hour starting at 08:00 and ending at midnight
#SaveTwsSettingsAt=Every 1 hours 08:00
#SaveTwsSettingsAt=Every 1 hours 08:00 00:00
#
# To save every 90 minutes starting at 08:00 up to and including 17:43
#SaveTwsSettingsAt=Every 90 08:00 17:43
SaveTwsSettingsAt=
# =============================================================================
# 7. Settings Specific to Indian Versions of TWS
# =============================================================================
# Indian versions of TWS may display a password expiry
# notification dialog and a NSE Compliance dialog. These can be
# dismissed by setting the following to yes. By default the
# password expiry notice is not dismissed, but the NSE Compliance
# notice is dismissed.
# Warning: setting DismissPasswordExpiryWarning=yes will mean
# you will not be notified when your password is about to expire.
# You must then take other measures to ensure that your password
# is changed within the expiry period, otherwise IBC will
# not be able to login successfully.
DismissPasswordExpiryWarning=no
DismissNSEComplianceNotice=yes
# =============================================================================
# 8. IBC Command Server Settings
# =============================================================================
# Do NOT CHANGE THE FOLLOWING SETTINGS unless you
# intend to issue commands to IBC (for example
# using telnet). Note that these settings have nothing to
# do with running programs that use the TWS API.
# Command Server Port Number
# --------------------------
#
# The port number that IBC listens on for commands
# such as "STOP". DO NOT set this to the port number
# used for TWS API connections. There is no good reason
# to change this setting unless the port is used by
# some other application (typically another instance of
# IBC). The default value is 0, which tells IBC not to
# start the command server
#CommandServerPort=7462
# Permitted Command Sources
# -------------------------
#
# A comma separated list of IP addresses, or host names,
# which are allowed addresses for sending commands to
# IBC. Commands can always be sent from the
# same host as IBC is running on.
ControlFrom=127.0.0.1
# Address for Receiving Commands
# ------------------------------
#
# Specifies the IP address on which the Command Server
# is to listen. For a multi-homed host, this can be used
# to specify that connection requests are only to be
# accepted on the specified address. The default is to
# accept connection requests on all local addresses.
BindAddress=127.0.0.1
# Command Prompt
# --------------
#
# The specified string is output by the server when
# the connection is first opened and after the completion
# of each command. This can be useful if sending commands
# using an interactive program such as telnet. The default
# is that no prompt is output.
# For example:
#
# CommandPrompt=>
CommandPrompt=
# Suppress Command Server Info Messages
# -------------------------------------
#
# Some commands can return intermediate information about
# their progress. This setting controls whether such
# information is sent. The default is that such information
# is not sent.
SuppressInfoMessages=no
# =============================================================================
# 9. Diagnostic Settings
# =============================================================================
#
# IBC can log information about the structure of windows
# displayed by TWS. This information is useful when adding
# new features to IBC or when behaviour is not as expected.
#
# The logged information shows the hierarchical organisation
# of all the components of the window, and includes the
# current values of text boxes and labels.
#
# Note that this structure logging has a small performance
# impact, and depending on the settings can cause the logfile
# size to be significantly increased. It is therefore
# recommended that the LogStructureWhen setting be set to
# 'never' (the default) unless there is a specific reason
# that this information is needed.
# Scope of Structure Logging
# --------------------------
#
# The LogStructureScope setting indicates which windows are
# eligible for structure logging:
#
# - if set to 'known', only windows that IBC recognizes
# are eligible - these are windows that IBC has some
# interest in monitoring, usually to take some action
# on the user's behalf;
#
# - if set to 'unknown', only windows that IBC does not
# recognize are eligible. Most windows displayed by
# TWS fall into this category;
#
# - if set to 'untitled', only windows that IBC does not
# recognize and that have no title are eligible. These
# are usually message boxes or similar small windows;
#
# - if set to 'all', then every window displayed by TWS
# is eligible.
#
# The default value is 'known'.
LogStructureScope=all
# When to Log Window Structure
# ----------------------------
#
# The LogStructureWhen setting specifies the circumstances
# when eligible TWS windows have their structure logged:
#
# - if set to 'open' or 'yes' or 'true', IBC logs the
# structure of an eligible window the first time it
# is encountered;
#
# - if set to 'activate', the structure is logged every
# time an eligible window is made active;
#
# - if set to 'never' or 'no' or 'false', structure
# information is never logged.
#
# The default value is 'never'.
LogStructureWhen=never
# DEPRECATED SETTING
# ------------------
#
# LogComponents - THIS SETTING WILL BE REMOVED IN A FUTURE
# RELEASE
#
# If LogComponents is set to any value, this is equivalent
# to setting LogStructureWhen to that same value and
# LogStructureScope to 'all': the actual values of those
# settings are ignored. The default is that the values
# of LogStructureScope and LogStructureWhen are honoured.
#LogComponents=

View File

@ -1,33 +0,0 @@
[IBGateway]
ApiOnly=true
LocalServerPort=4002
# NOTE: must be set if using IBC's "reject" mode
TrustedIPs=127.0.0.1
; RemoteHostOrderRouting=ndc1.ibllc.com
; WriteDebug=true
; RemotePortOrderRouting=4001
; useRemoteSettings=false
; tradingMode=p
; Steps=8
; colorPalletName=dark
# window geo, this may be useful for sending `xdotool` commands?
; MainWindow.Width=1986
; screenHeight=3960
[Logon]
Locale=en
# most markets are oriented around this zone
# so might as well hard code it.
TimeZone=America/New_York
UseSSL=true
displayedproxymsg=1
os_titlebar=true
s3store=true
useRemoteSettings=false
[Communication]
ctciAutoEncrypt=true
Region=usr
; Peer=cdc1.ibllc.com:4001

View File

@ -1,16 +0,0 @@
#!/bin/sh
# start VNC server
x11vnc \
-ncache_cr \
-listen localhost \
-display :1 \
-forever \
-shared \
-logappend /var/log/x11vnc.log \
-bg \
-noipv6 \
-autoport 3003
# can't use this because of ``asyncvnc`` issue:
# https://github.com/barneygale/asyncvnc/issues/1
# -passwd 'ibcansmbz'

View File

@ -1,28 +0,0 @@
Notes to self
=============
chicken scratch we shan't forget, consider this staging
for actual feature issues on wtv git wrapper-provider we're
using (no we shan't stick with GH long term likely).
cool chart features
-------------------
- allow right-click to spawn shell with current in view
data passed to the new process via ``msgpack-numpy``.
- expand OHLC datum to lower time frame.
- auto-highlight current time range on tick feed
features from IB charting
-------------------------
- vlm diffing from ticks and compare when bar arrives from historical
- should help isolate dark vlm / trades
chart ux ideas
--------------
- hotkey to zoom to order intersection (horizontal line) with previous
price levels (+ some margin obvs).
- L1 "lines" (queue size repr) should normalize to some fixed x width
such that when levels with more vlm appear other smaller levels are
scaled down giving an immediate indication of the liquidity diff.

View File

@ -1,20 +1,17 @@
-# piker: trading gear for hackers.
-# Copyright 2020-eternity Tyler Goodlet (in stewardship for piker0)
+# Copyright 2018 Tyler Goodlet

 # This program is free software: you can redistribute it and/or modify
-# it under the terms of the GNU Affero General Public License as published by
+# it under the terms of the GNU General Public License as published by
 # the Free Software Foundation, either version 3 of the License, or
 # (at your option) any later version.

 # This program is distributed in the hope that it will be useful,
 # but WITHOUT ANY WARRANTY; without even the implied warranty of
 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
-# GNU Affero General Public License for more details.
+# GNU General Public License for more details.

-# You should have received a copy of the GNU Affero General Public License
-# along with this program.  If not, see <https://www.gnu.org/licenses/>.
+# You should have received a copy of the GNU General Public License
+# along with this program.  If not, see <http://www.gnu.org/licenses/>.

 """
-piker: trading gear for hackers.
+piker: destroy all suits
 """

View File

@ -0,0 +1,33 @@
"""
Async utils no one seems to have built into a core lib (yet).
"""
from collections import OrderedDict
def async_lifo_cache(maxsize=128):
"""Async ``cache`` with a LIFO policy.
Implemented my own since no one else seems to have
a standard. I'll wait for the smarter people to come
up with one, but until then...
"""
cache = OrderedDict()
def decorator(fn):
async def wrapper(*args):
key = args
try:
return cache[key]
except KeyError:
if len(cache) >= maxsize:
# discard the most recently added entry
cache.popitem()
# do it
cache[key] = await fn(*args)
return cache[key]
return wrapper
return decorator
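
# a minimal usage sketch; ``get_quote`` is a made-up stand-in. results
# are memoized per positional-args key and, once ``maxsize`` entries
# exist, the most recently cached entry is evicted to make room (the
# LIFO policy implemented by ``cache.popitem()`` above).
@async_lifo_cache(maxsize=2)
async def get_quote(sym: str) -> str:
    # stand-in for an expensive async lookup (network round trip etc.)
    return f'{sym}-quote'

# first call per key awaits the wrapped coro; repeats hit the cache:
#
#   await get_quote('btcusdt')  # miss -> runs and caches
#   await get_quote('btcusdt')  # hit  -> served from the cache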

View File

@ -1,79 +0,0 @@
# piker: trading gear for hackers
# Copyright (C) Tyler Goodlet (in stewardship for piker0)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""
Cacheing apis and toolz.
"""
from collections import OrderedDict
from contextlib import (
asynccontextmanager,
)
from tractor.trionics import maybe_open_context
from .brokers import get_brokermod
from .log import get_logger
log = get_logger(__name__)
def async_lifo_cache(maxsize=128):
"""Async ``cache`` with a LIFO policy.
Implemented my own since no one else seems to have
a standard. I'll wait for the smarter people to come
up with one, but until then...
"""
cache = OrderedDict()
def decorator(fn):
async def wrapper(*args):
key = args
try:
return cache[key]
except KeyError:
if len(cache) >= maxsize:
# discard the most recently added entry
cache.popitem()
# do it
cache[key] = await fn(*args)
return cache[key]
return wrapper
return decorator
@asynccontextmanager
async def open_cached_client(
brokername: str,
) -> 'Client': # noqa
'''
Get a cached broker client from the current actor's local vars.
If one has not been setup do it and cache it.
'''
brokermod = get_brokermod(brokername)
async with maybe_open_context(
acm_func=brokermod.get_client,
) as (cache_hit, client):
yield client
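
# a hedged usage sketch: tasks in the same actor share one cached
# client per broker name; the ``binance`` backend shown elsewhere in
# this changeset provides the ``get_client()`` acm this relies on.
async def load_binance_symbols() -> None:
    async with open_cached_client('binance') as client:
        # ``cache_symbols()`` is defined on the binance ``Client``
        pairs = await client.cache_symbols()
        log.info(f'cached {len(pairs)} binance pairs')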

View File

@ -1,561 +0,0 @@
# piker: trading gear for hackers
# Copyright (C) Tyler Goodlet (in stewardship for piker0)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""
Structured, daemon tree service management.
"""
from typing import Optional, Union, Callable, Any
from contextlib import asynccontextmanager as acm
from collections import defaultdict
from pydantic import BaseModel
import trio
from trio_typing import TaskStatus
import tractor
from .log import get_logger, get_console_log
from .brokers import get_brokermod
log = get_logger(__name__)
_root_dname = 'pikerd'
_registry_addr = ('127.0.0.1', 6116)
_tractor_kwargs: dict[str, Any] = {
# use a different registry addr then tractor's default
'arbiter_addr': _registry_addr
}
_root_modules = [
__name__,
'piker.clearing._ems',
'piker.clearing._client',
]
class Services(BaseModel):
actor_n: tractor._supervise.ActorNursery
service_n: trio.Nursery
debug_mode: bool # tractor sub-actor debug mode flag
service_tasks: dict[str, tuple[trio.CancelScope, tractor.Portal]] = {}
class Config:
arbitrary_types_allowed = True
async def start_service_task(
self,
name: str,
portal: tractor.Portal,
target: Callable,
**kwargs,
) -> (trio.CancelScope, tractor.Context):
'''
Open a context in a service sub-actor, add to a stack
that gets unwound at ``pikerd`` teardown.
This allows for allocating long-running sub-services in our main
daemon and explicitly controlling their lifetimes.
'''
async def open_context_in_task(
task_status: TaskStatus[
trio.CancelScope] = trio.TASK_STATUS_IGNORED,
) -> Any:
with trio.CancelScope() as cs:
async with portal.open_context(
target,
**kwargs,
) as (ctx, first):
# unblock once the remote context has started
task_status.started((cs, first))
log.info(
f'`pikerd` service {name} started with value {first}'
)
try:
# wait on any context's return value
ctx_res = await ctx.result()
except tractor.ContextCancelled:
return await self.cancel_service(name)
else:
# wait on any error from the sub-actor
# NOTE: this will block indefinitely until
# cancelled either by error from the target
# context function or by being cancelled here by
# the surrounding cancel scope
return (await portal.result(), ctx_res)
cs, first = await self.service_n.start(open_context_in_task)
# store the cancel scope and portal for later cancellation or
# restart if needed.
self.service_tasks[name] = (cs, portal)
return cs, first
# TODO: per service cancellation by scope, we aren't using this
# anywhere right?
async def cancel_service(
self,
name: str,
) -> Any:
log.info(f'Cancelling `pikerd` service {name}')
cs, portal = self.service_tasks[name]
# XXX: not entirely sure why this is required,
# and should probably be better fine tuned in
# ``tractor``?
cs.cancel()
return await portal.cancel_actor()
_services: Optional[Services] = None
@acm
async def open_pikerd(
start_method: str = 'trio',
loglevel: Optional[str] = None,
# XXX: you should pretty much never want debug mode
# for data daemons when running in production.
debug_mode: bool = False,
) -> Optional[tractor._portal.Portal]:
'''
Start a root piker daemon whose lifetime extends indefinitely
until cancelled.
A root actor nursery is created which can be used to create and keep
alive underling services (see below).
'''
global _services
assert _services is None
# XXX: this may open a root actor as well
async with (
tractor.open_root_actor(
# passed through to ``open_root_actor``
arbiter_addr=_registry_addr,
name=_root_dname,
loglevel=loglevel,
debug_mode=debug_mode,
start_method=start_method,
# TODO: eventually we should be able to avoid
# having the root have more than permissions to
# spawn other specialized daemons I think?
enable_modules=_root_modules,
) as _,
tractor.open_nursery() as actor_nursery,
):
async with trio.open_nursery() as service_nursery:
# # setup service mngr singleton instance
# async with AsyncExitStack() as stack:
# assign globally for future daemon/task creation
_services = Services(
actor_n=actor_nursery,
service_n=service_nursery,
debug_mode=debug_mode,
)
yield _services
@acm
async def open_piker_runtime(
name: str,
enable_modules: list[str] = [],
start_method: str = 'trio',
loglevel: Optional[str] = None,
# XXX: you should pretty much never want debug mode
# for data daemons when running in production.
debug_mode: bool = False,
) -> Optional[tractor._portal.Portal]:
'''
Start a piker actor whose runtime will automatically
sync with existing piker actors in local network
based on configuration.
'''
global _services
assert _services is None
# XXX: this may open a root actor as well
async with (
tractor.open_root_actor(
# passed through to ``open_root_actor``
arbiter_addr=_registry_addr,
name=name,
loglevel=loglevel,
debug_mode=debug_mode,
start_method=start_method,
# TODO: eventually we should be able to avoid
# having the root have more than permissions to
# spawn other specialized daemons I think?
enable_modules=_root_modules,
) as _,
):
yield tractor.current_actor()
@acm
async def maybe_open_runtime(
loglevel: Optional[str] = None,
**kwargs,
) -> None:
"""
Start the ``tractor`` runtime (a root actor) if none exists.
"""
settings = _tractor_kwargs
settings.update(kwargs)
if not tractor.current_actor(err_on_no_runtime=False):
async with tractor.open_root_actor(
loglevel=loglevel,
**settings,
):
yield
else:
yield
@acm
async def maybe_open_pikerd(
loglevel: Optional[str] = None,
**kwargs,
) -> Union[tractor._portal.Portal, Services]:
"""If no ``pikerd`` daemon-root-actor can be found start it and
yield up (we should probably figure out returning a portal to self
though).
"""
if loglevel:
get_console_log(loglevel)
# subtle, we must have the runtime up here or portal lookup will fail
async with maybe_open_runtime(loglevel, **kwargs):
async with tractor.find_actor(_root_dname) as portal:
# assert portal is not None
if portal is not None:
yield portal
return
# presume pikerd role since no daemon could be found at
# configured address
async with open_pikerd(
loglevel=loglevel,
debug_mode=kwargs.get('debug_mode', False),
) as _:
# in the case where we're starting up the
# tractor-piker runtime stack in **this** process
# we return no portal to self.
yield None
# brokerd enabled modules
_data_mods = [
'piker.brokers.core',
'piker.brokers.data',
'piker.data',
'piker.data.feed',
'piker.data._sampling'
]
class Brokerd:
locks = defaultdict(trio.Lock)
@acm
async def find_service(
service_name: str,
) -> Optional[tractor.Portal]:
log.info(f'Scanning for service `{service_name}`')
# attach to existing daemon by name if possible
async with tractor.find_actor(
service_name,
arbiter_sockaddr=_registry_addr,
) as maybe_portal:
yield maybe_portal
async def check_for_service(
service_name: str,
) -> bool:
'''
Service daemon "liveness" predicate.
'''
async with tractor.query_actor(
service_name,
arbiter_sockaddr=_registry_addr,
) as sockaddr:
return sockaddr
@acm
async def maybe_spawn_daemon(
service_name: str,
service_task_target: Callable,
spawn_args: dict[str, Any],
loglevel: Optional[str] = None,
**kwargs,
) -> tractor.Portal:
'''
If no ``service_name`` daemon-actor can be found,
spawn one in a local subactor and return a portal to it.
If this function is called from a non-pikerd actor, the
spawned service will persist as long as pikerd does or
it is requested to be cancelled.
This can be seen as a service starting api for remote-actor
clients.
'''
if loglevel:
get_console_log(loglevel)
# serialize access to this section to avoid
# 2 or more tasks racing to create a daemon
lock = Brokerd.locks[service_name]
await lock.acquire()
async with find_service(service_name) as portal:
if portal is not None:
lock.release()
yield portal
return
log.warning(f"Couldn't find any existing {service_name}")
# ask root ``pikerd`` daemon to spawn the daemon we need if
# pikerd is not live we now become the root of the
# process tree
async with maybe_open_pikerd(
loglevel=loglevel,
**kwargs,
) as pikerd_portal:
if pikerd_portal is None:
# we are the root and thus are `pikerd`
# so spawn the target service directly by calling
# the provided target routine.
# XXX: this assumes that the target is well formed and will
# do the right things to setup both a sub-actor **and** call
# the ``_Services`` api from above to start the top level
# service task for that actor.
await service_task_target(**spawn_args)
else:
# tell the remote `pikerd` to start the target,
# the target can't return a non-serializable value
# since it is expected that service starting is
# non-blocking and the target task will persist running
# on `pikerd` after the client requesting its start
# disconnects.
await pikerd_portal.run(
service_task_target,
**spawn_args,
)
async with tractor.wait_for_actor(service_name) as portal:
lock.release()
yield portal
await portal.cancel_actor()
async def spawn_brokerd(
brokername: str,
loglevel: Optional[str] = None,
**tractor_kwargs,
) -> bool:
log.info(f'Spawning {brokername} broker daemon')
brokermod = get_brokermod(brokername)
dname = f'brokerd.{brokername}'
extra_tractor_kwargs = getattr(brokermod, '_spawn_kwargs', {})
tractor_kwargs.update(extra_tractor_kwargs)
global _services
assert _services
# ask `pikerd` to spawn a new sub-actor and manage it under its
# actor nursery
modpath = brokermod.__name__
broker_enable = [modpath]
for submodname in getattr(
brokermod,
'__enable_modules__',
[],
):
subpath = f'{modpath}.{submodname}'
broker_enable.append(subpath)
portal = await _services.actor_n.start_actor(
dname,
enable_modules=_data_mods + broker_enable,
loglevel=loglevel,
debug_mode=_services.debug_mode,
**tractor_kwargs
)
# non-blocking setup of brokerd service nursery
from .data import _setup_persistent_brokerd
await _services.start_service_task(
dname,
portal,
_setup_persistent_brokerd,
brokername=brokername,
)
return True
@acm
async def maybe_spawn_brokerd(
brokername: str,
loglevel: Optional[str] = None,
**kwargs,
) -> tractor.Portal:
'''
Helper to spawn a brokerd service *from* a client
who wishes to use the sub-actor-daemon.
'''
async with maybe_spawn_daemon(
f'brokerd.{brokername}',
service_task_target=spawn_brokerd,
spawn_args={'brokername': brokername, 'loglevel': loglevel},
loglevel=loglevel,
**kwargs,
) as portal:
yield portal
async def spawn_emsd(
loglevel: Optional[str] = None,
**extra_tractor_kwargs
) -> bool:
"""
Start the clearing engine under ``pikerd``.
"""
log.info('Spawning emsd')
global _services
assert _services
portal = await _services.actor_n.start_actor(
'emsd',
enable_modules=[
'piker.clearing._ems',
'piker.clearing._client',
],
loglevel=loglevel,
debug_mode=_services.debug_mode, # set by pikerd flag
**extra_tractor_kwargs
)
# non-blocking setup of clearing service
from .clearing._ems import _setup_persistent_emsd
await _services.start_service_task(
'emsd',
portal,
_setup_persistent_emsd,
)
return True
@acm
async def maybe_open_emsd(
brokername: str,
loglevel: Optional[str] = None,
**kwargs,
) -> tractor._portal.Portal: # noqa
async with maybe_spawn_daemon(
'emsd',
service_task_target=spawn_emsd,
spawn_args={'loglevel': loglevel},
loglevel=loglevel,
**kwargs,
) as portal:
yield portal
# TODO: ideally we can start the tsdb "on demand" but it's
# probably going to require "rootless" docker, at least if we don't
# want to expect the user to start ``pikerd`` with root perms all the
# time.
# async def maybe_open_marketstored(
# loglevel: Optional[str] = None,
# **kwargs,
# ) -> tractor._portal.Portal: # noqa
# async with maybe_spawn_daemon(
# 'marketstored',
# service_task_target=spawn_emsd,
# spawn_args={'loglevel': loglevel},
# loglevel=loglevel,
# **kwargs,
# ) as portal:
# yield portal

View File

@ -1,46 +0,0 @@
# piker: trading gear for hackers
# Copyright (C) 2018-present Tyler Goodlet (in stewardship of piker0)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""
Profiling wrappers for internal libs.
"""
import time
from functools import wraps
# NOTE: you can pass a flag to enable this:
# ``piker chart <args> --profile``.
_pg_profile: bool = False
ms_slower_then: float = 0
def pg_profile_enabled() -> bool:
global _pg_profile
return _pg_profile
def timeit(fn):
@wraps(fn)
def wrapper(*args, **kwargs):
t = time.time()
res = fn(*args, **kwargs)
print(
'%s.%s: %.4f sec'
% (fn.__module__, fn.__qualname__, time.time() - t)
)
return res
return wrapper
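
# a quick usage sketch; ``snooze`` is a made-up example fn. wrapping a
# function prints its module-qualified name and wall-clock runtime:
@timeit
def snooze(seconds: float) -> None:
    time.sleep(seconds)

# snooze(0.1)  # prints something like: __main__.snooze: 0.1002 sec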

View File

@ -1,19 +1,3 @@
# piker: trading gear for hackers
# Copyright (C) 2018-present Tyler Goodlet (in stewardship of piker0)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
""" """
Broker clients, daemons and general back end machinery. Broker clients, daemons and general back end machinery.
""" """
@ -25,11 +9,8 @@ import asks
 asks.init('trio')

 __brokers__ = [
-    'binance',
     'questrade',
     'robinhood',
-    'ib',
-    'kraken',
 ]

View File

@ -1,19 +1,3 @@
# piker: trading gear for hackers
# Copyright (C) 2018-present Tyler Goodlet (in stewardship of piker0)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
""" """
Handy utils. Handy utils.
""" """
@ -28,54 +12,10 @@ class BrokerError(Exception):
"Generic broker issue" "Generic broker issue"
class SymbolNotFound(BrokerError):
"Symbol not found by broker search"
class NoData(BrokerError):
'''
Symbol data not permitted or no data
for time range found.
'''
def __init__(
self,
*args,
frame_size: int = 1000,
) -> None:
super().__init__(*args)
# when raised, machinery can check if the backend
# set a "frame size" for doing datetime calcs.
self.frame_size: int = 1000
class DataUnavailable(BrokerError):
'''
Signal storage requests to terminate.
'''
# TODO: add in a reason that can be displayed in the
# UI (for eg. `kraken` is bs and you should complain
# to them that you can't pull more OHLC data..)
class DataThrottle(BrokerError):
'''
Broker throttled request rate for data.
'''
# TODO: add in throttle metrics/feedback
 def resproc(
     resp: asks.response_objects.Response,
     log: logging.Logger,
-    return_json: bool = True,
-    log_resp: bool = False,
+    return_json: bool = True
 ) -> asks.response_objects.Response:
     """Process response and return its json content.
@ -84,12 +24,11 @@ def resproc(
     if not resp.status_code == 200:
         raise BrokerError(resp.body)
     try:
-        msg = resp.json()
+        json = resp.json()
     except json.decoder.JSONDecodeError:
         log.exception(f"Failed to process {resp}:\n{resp.text}")
         raise BrokerError(resp.text)
+    else:
+        log.trace(f"Received json contents:\n{colorize_json(json)}")

-    if log_resp:
-        log.debug(f"Received json contents:\n{colorize_json(msg)}")

-    return msg if return_json else resp
+    return json if return_json else resp

View File

@ -1,566 +0,0 @@
# piker: trading gear for hackers
# Copyright (C) Guillermo Rodriguez (in stewardship for piker0)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""
Binance backend
"""
from contextlib import asynccontextmanager as acm
from datetime import datetime
from typing import (
Any, Union, Optional,
AsyncGenerator, Callable,
)
import time
import trio
from trio_typing import TaskStatus
import pendulum
import asks
from fuzzywuzzy import process as fuzzy
import numpy as np
import tractor
from pydantic.dataclasses import dataclass
from pydantic import BaseModel
import wsproto
from .._cacheables import open_cached_client
from ._util import resproc, SymbolNotFound
from ..log import get_logger, get_console_log
from ..data import ShmArray
from ..data._web_bs import open_autorecon_ws, NoBsWs
log = get_logger(__name__)
_url = 'https://api.binance.com'
# Broker specific ohlc schema (rest)
_ohlc_dtype = [
('index', int),
('time', int),
('open', float),
('high', float),
('low', float),
('close', float),
('volume', float),
('bar_wap', float), # will be zeroed by sampler if not filled
# XXX: some additional fields are defined in the docs:
# https://binance-docs.github.io/apidocs/spot/en/#kline-candlestick-data
# ('close_time', int),
# ('quote_vol', float),
# ('num_trades', int),
# ('buy_base_vol', float),
# ('buy_quote_vol', float),
# ('ignore', float),
]
# UI components allow this to be declared such that additional
# (historical) fields can be exposed.
ohlc_dtype = np.dtype(_ohlc_dtype)
_show_wap_in_history = False
# https://binance-docs.github.io/apidocs/spot/en/#exchange-information
class Pair(BaseModel):
symbol: str
status: str
baseAsset: str
baseAssetPrecision: int
quoteAsset: str
quotePrecision: int
quoteAssetPrecision: int
baseCommissionPrecision: int
quoteCommissionPrecision: int
orderTypes: list[str]
icebergAllowed: bool
ocoAllowed: bool
quoteOrderQtyMarketAllowed: bool
isSpotTradingAllowed: bool
isMarginTradingAllowed: bool
filters: list[dict[str, Union[str, int, float]]]
permissions: list[str]
@dataclass
class OHLC:
"""Description of the flattened OHLC quote format.
For schema details see:
https://binance-docs.github.io/apidocs/spot/en/#kline-candlestick-streams
"""
time: int
open: float
high: float
low: float
close: float
volume: float
close_time: int
quote_vol: float
num_trades: int
buy_base_vol: float
buy_quote_vol: float
ignore: int
# null the place holder for `bar_wap` until we
# figure out what to extract for this.
bar_wap: float = 0.0
# convert datetime obj timestamp to unixtime in milliseconds
def binance_timestamp(when):
return int((when.timestamp() * 1000) + (when.microsecond / 1000))
class Client:
def __init__(self) -> None:
self._sesh = asks.Session(connections=4)
self._sesh.base_location = _url
self._pairs: dict[str, Any] = {}
async def _api(
self,
method: str,
params: dict,
) -> dict[str, Any]:
resp = await self._sesh.get(
path=f'/api/v3/{method}',
params=params,
timeout=float('inf')
)
return resproc(resp, log)
async def symbol_info(
self,
sym: Optional[str] = None,
) -> dict[str, Any]:
'''Get symbol info for the exchange.
'''
# TODO: we can load from our self._pairs cache
# on repeat calls...
# will retrieve all symbols by default
params = {}
if sym is not None:
sym = sym.upper()
params = {'symbol': sym}
resp = await self._api(
'exchangeInfo',
params=params,
)
entries = resp['symbols']
if not entries:
raise SymbolNotFound(f'{sym} not found')
syms = {item['symbol']: item for item in entries}
if sym is not None:
return syms[sym]
else:
return syms
async def cache_symbols(
self,
) -> dict:
if not self._pairs:
self._pairs = await self.symbol_info()
return self._pairs
async def search_symbols(
self,
pattern: str,
limit: Optional[int] = None,
) -> dict[str, Any]:
if self._pairs is not None:
data = self._pairs
else:
data = await self.symbol_info()
matches = fuzzy.extractBests(
pattern,
data,
score_cutoff=50,
)
# repack in dict form
return {item[0]['symbol']: item[0]
for item in matches}
async def bars(
self,
symbol: str,
start_dt: Optional[datetime] = None,
end_dt: Optional[datetime] = None,
limit: int = 1000, # <- max allowed per query
as_np: bool = True,
) -> dict:
if end_dt is None:
end_dt = pendulum.now('UTC')
if start_dt is None:
start_dt = end_dt.start_of(
'minute').subtract(minutes=limit)
start_time = binance_timestamp(start_dt)
end_time = binance_timestamp(end_dt)
# https://binance-docs.github.io/apidocs/spot/en/#kline-candlestick-data
bars = await self._api(
'klines',
params={
'symbol': symbol.upper(),
'interval': '1m',
'startTime': start_time,
'endTime': end_time,
'limit': limit
}
)
# TODO: pack this bars scheme into a ``pydantic`` validator type:
# https://binance-docs.github.io/apidocs/spot/en/#kline-candlestick-data
# TODO: we should port this to ``pydantic`` to avoid doing
# manual validation ourselves..
new_bars = []
for i, bar in enumerate(bars):
bar = OHLC(*bar)
row = []
for j, (name, ftype) in enumerate(_ohlc_dtype[1:]):
# TODO: maybe we should go nanoseconds on all
# history time stamps?
if name == 'time':
# convert to epoch seconds: float
row.append(bar.time / 1000.0)
else:
row.append(getattr(bar, name))
new_bars.append((i,) + tuple(row))
array = np.array(new_bars, dtype=_ohlc_dtype) if as_np else bars
return array
@acm
async def get_client() -> Client:
client = Client()
await client.cache_symbols()
yield client
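# A hedged usage sketch (not part of the backend API; ``example()`` is
# illustrative only): pull the most recent 1m bars through the cached
# client.
#
# async def example():
#     async with get_client() as client:
#         bars = await client.bars('btcusdt', limit=100)
#         # a struct-array with the ``_ohlc_dtype`` fields from above
#         print(bars['close'][-5:])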
# validation type
class AggTrade(BaseModel):
e: str # Event type
E: int # Event time
s: str # Symbol
a: int # Aggregate trade ID
p: float # Price
q: float # Quantity
f: int # First trade ID
l: int # Last trade ID
T: int # Trade time
m: bool # Is the buyer the market maker?
M: bool # Ignore
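# For reference, a raw ``aggTrade`` payload (per the binance docs linked
# below) looks like the following; ``AggTrade(**msg)`` coerces the
# stringly typed price/size fields to ``float``:
#
# {
#     "e": "aggTrade", "E": 123456789, "s": "BNBBTC", "a": 12345,
#     "p": "0.001", "q": "100", "f": 100, "l": 105, "T": 123456785,
#     "m": true, "M": true,
# }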
async def stream_messages(ws: NoBsWs) -> AsyncGenerator[tuple[str, dict], None]:
timeouts = 0
while True:
with trio.move_on_after(3) as cs:
msg = await ws.recv_msg()
if cs.cancelled_caught:
timeouts += 1
if timeouts > 2:
log.error("binance feed seems down and slow af? rebooting...")
await ws._connect()
continue
# for l1 streams binance doesn't add an event type field so
# identify those messages by matching keys
# https://binance-docs.github.io/apidocs/spot/en/#individual-symbol-book-ticker-streams
if msg.get('u'):
sym = msg['s']
bid = float(msg['b'])
bsize = float(msg['B'])
ask = float(msg['a'])
asize = float(msg['A'])
yield 'l1', {
'symbol': sym,
'ticks': [
{'type': 'bid', 'price': bid, 'size': bsize},
{'type': 'bsize', 'price': bid, 'size': bsize},
{'type': 'ask', 'price': ask, 'size': asize},
{'type': 'asize', 'price': ask, 'size': asize}
]
}
elif msg.get('e') == 'aggTrade':
# validate
msg = AggTrade(**msg)
# TODO: type out and require this quote format
# from all backends!
yield 'trade', {
'symbol': msg.s,
'last': msg.p,
'brokerd_ts': time.time(),
'ticks': [{
'type': 'trade',
'price': msg.p,
'size': msg.q,
'broker_ts': msg.T,
}],
}
def make_sub(pairs: list[str], sub_name: str, uid: int) -> dict[str, str]:
"""Create a request subscription packet dict.
https://binance-docs.github.io/apidocs/spot/en/#live-subscribing-unsubscribing-to-streams
"""
return {
'method': 'SUBSCRIBE',
'params': [
f'{pair.lower()}@{sub_name}'
for pair in pairs
],
'id': uid
}
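# For example, ``make_sub(['btcusdt'], 'bookTicker', 0)`` renders to:
#
# {
#     'method': 'SUBSCRIBE',
#     'params': ['btcusdt@bookTicker'],
#     'id': 0,
# }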
@acm
async def open_history_client(
symbol: str,
) -> tuple[Callable, dict]:
# TODO implement history getter for the new storage layer.
async with open_cached_client('binance') as client:
async def get_ohlc(
end_dt: Optional[datetime] = None,
start_dt: Optional[datetime] = None,
) -> tuple[
np.ndarray,
datetime, # start
datetime, # end
]:
array = await client.bars(
symbol,
start_dt=start_dt,
end_dt=end_dt,
)
start_dt = pendulum.from_timestamp(array[0]['time'])
end_dt = pendulum.from_timestamp(array[-1]['time'])
return array, start_dt, end_dt
yield get_ohlc, {'erlangs': 3, 'rate': 3}
async def backfill_bars(
sym: str,
shm: ShmArray, # type: ignore # noqa
task_status: TaskStatus[trio.CancelScope] = trio.TASK_STATUS_IGNORED,
) -> None:
"""Fill historical bars into shared mem / storage afap.
"""
with trio.CancelScope() as cs:
async with open_cached_client('binance') as client:
bars = await client.bars(symbol=sym)
shm.push(bars)
task_status.started(cs)
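# A hedged usage sketch: run the backfill under a nursery via
# ``.start()`` so the caller receives the cancel scope once the first
# history push completes.
#
# async with trio.open_nursery() as n:
#     cs = await n.start(backfill_bars, 'btcusdt', shm)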
async def stream_quotes(
send_chan: trio.abc.SendChannel,
symbols: list[str],
feed_is_live: trio.Event,
loglevel: str = None,
# startup sync
task_status: TaskStatus[tuple[dict, dict]] = trio.TASK_STATUS_IGNORED,
) -> None:
# XXX: required to propagate ``tractor`` loglevel to piker logging
get_console_log(loglevel or tractor.current_actor().loglevel)
sym_infos = {}
uid = 0
async with (
open_cached_client('binance') as client,
send_chan as send_chan,
):
# keep client cached for real-time section
cache = await client.cache_symbols()
for sym in symbols:
d = cache[sym.upper()]
syminfo = Pair(**d) # validation
si = sym_infos[sym] = syminfo.dict()
# XXX: after manually inspecting the response format we
# just directly pick out the info we need
si['price_tick_size'] = float(syminfo.filters[0]['tickSize'])
si['lot_tick_size'] = float(syminfo.filters[2]['stepSize'])
si['asset_type'] = 'crypto'
symbol = symbols[0]
init_msgs = {
# pass back token, and bool, signalling if we're the writer
# and that history has been written
symbol: {
            'symbol_info': sym_infos[symbol],
            'shm_write_opts': {'sum_tick_vml': False},
            'fqsn': symbol,
},
}
@acm
async def subscribe(ws: wsproto.WSConnection):
# setup subs
# trade data (aka L1)
# https://binance-docs.github.io/apidocs/spot/en/#symbol-order-book-ticker
l1_sub = make_sub(symbols, 'bookTicker', uid)
await ws.send_msg(l1_sub)
# aggregate (each order clear by taker **not** by maker)
# trades data:
# https://binance-docs.github.io/apidocs/spot/en/#aggregate-trade-streams
agg_trades_sub = make_sub(symbols, 'aggTrade', uid)
await ws.send_msg(agg_trades_sub)
# ack from ws server
res = await ws.recv_msg()
assert res['id'] == uid
yield
subs = []
for sym in symbols:
subs.append("{sym}@aggTrade")
subs.append("{sym}@bookTicker")
# unsub from all pairs on teardown
await ws.send_msg({
"method": "UNSUBSCRIBE",
"params": subs,
"id": uid,
})
# XXX: do we need to ack the unsub?
# await ws.recv_msg()
async with open_autorecon_ws(
'wss://stream.binance.com/ws',
fixture=subscribe,
) as ws:
# pull a first quote and deliver
msg_gen = stream_messages(ws)
typ, quote = await msg_gen.__anext__()
while typ != 'trade':
# TODO: use ``anext()`` when it lands in 3.10!
typ, quote = await msg_gen.__anext__()
task_status.started((init_msgs, quote))
# signal to caller feed is ready for consumption
feed_is_live.set()
# import time
# last = time.time()
# start streaming
async for typ, msg in msg_gen:
# period = time.time() - last
# hz = 1/period if period else float('inf')
# if hz > 60:
# log.info(f'Binance quotez : {hz}')
topic = msg['symbol'].lower()
await send_chan.send({topic: msg})
# last = time.time()
@tractor.context
async def open_symbol_search(
ctx: tractor.Context,
) -> Client:
async with open_cached_client('binance') as client:
# load all symbols locally for fast search
cache = await client.cache_symbols()
await ctx.started()
async with ctx.open_stream() as stream:
async for pattern in stream:
# results = await client.symbol_info(sym=pattern.upper())
matches = fuzzy.extractBests(
pattern,
cache,
score_cutoff=50,
)
# repack in dict form
await stream.send(
{item[0]['symbol']: item[0]
for item in matches}
)


@ -1,289 +0,0 @@
# piker: trading gear for hackers
# Copyright (C) 2018-present Tyler Goodlet (in stewardship of piker0)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""
Console interface to broker client/daemons.
"""
import os
from functools import partial
from operator import attrgetter
from operator import itemgetter
import click
import trio
import tractor
from ..cli import cli
from .. import watchlists as wl
from ..log import get_console_log, colorize_json, get_logger
from .._daemon import maybe_spawn_brokerd, maybe_open_pikerd
from ..brokers import core, get_brokermod, data
log = get_logger('cli')
DEFAULT_BROKER = 'questrade'
_config_dir = click.get_app_dir('piker')
_watchlists_data_path = os.path.join(_config_dir, 'watchlists.json')
@cli.command()
@click.option('--keys', '-k', multiple=True,
help='Return results only for these keys')
@click.argument('meth', nargs=1)
@click.argument('kwargs', nargs=-1)
@click.pass_obj
def api(config, meth, kwargs, keys):
'''
Make a broker-client API method call
'''
# global opts
broker = config['brokers'][0]
_kwargs = {}
for kwarg in kwargs:
if '=' not in kwarg:
log.error(f"kwarg `{kwarg}` must be of form <key>=<value>")
else:
key, _, value = kwarg.partition('=')
_kwargs[key] = value
data = trio.run(
partial(core.api, broker, meth, **_kwargs)
)
if keys:
# filter to requested keys
filtered = []
if meth in data: # often a list of dicts
for item in data[meth]:
filtered.append({key: item[key] for key in keys})
else: # likely just a dict
filtered.append({key: data[key] for key in keys})
data = filtered
click.echo(colorize_json(data))
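# A hypothetical invocation (the method and kwarg names depend entirely
# on the loaded broker client's API):
#
#   piker api quote symbols=SPY -k symbol -k lastTradePrice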
@cli.command()
@click.argument('tickers', nargs=-1, required=True)
@click.pass_obj
def quote(config, tickers):
'''
Print symbol quotes to the console
'''
# global opts
brokermod = config['brokermods'][0]
quotes = trio.run(partial(core.stocks_quote, brokermod, tickers))
if not quotes:
log.error(f"No quotes could be found for {tickers}?")
return
if len(quotes) < len(tickers):
syms = tuple(map(itemgetter('symbol'), quotes))
for ticker in tickers:
if ticker not in syms:
brokermod.log.warn(f"Could not find symbol {ticker}?")
click.echo(colorize_json(quotes))
@cli.command()
@click.option('--count', '-c', default=1000,
help='Number of bars to retrieve')
@click.argument('symbol', required=True)
@click.pass_obj
def bars(config, symbol, count):
'''
Retrieve 1m bars for a symbol and print them to the console
'''
# global opts
brokermod = config['brokermods'][0]
# broker backend should return at the least a
# list of candle dictionaries
bars = trio.run(
partial(
core.bars,
brokermod,
symbol,
count=count,
as_np=False,
)
)
if not len(bars):
log.error(f"No quotes could be found for {symbol}?")
return
click.echo(colorize_json(bars))
@cli.command()
@click.option('--rate', '-r', default=5, help='Quote rate to stream at')
@click.option('--filename', '-f', default='quotestream.jsonstream',
              help='Output filename')
@click.option('--dhost', '-dh', default='127.0.0.1',
help='Daemon host address to connect to')
@click.argument('name', nargs=1, required=True)
@click.pass_obj
def record(config, rate, name, dhost, filename):
'''
Record client side quotes to a file on disk
'''
# global opts
brokermod = config['brokermods'][0]
loglevel = config['loglevel']
log = config['log']
watchlist_from_file = wl.ensure_watchlists(_watchlists_data_path)
watchlists = wl.merge_watchlist(watchlist_from_file, wl._builtins)
tickers = watchlists[name]
if not tickers:
log.error(f"No symbols found for watchlist `{name}`?")
return
async def main(tries):
async with maybe_spawn_brokerd(
tries=tries, loglevel=loglevel
) as portal:
# run app "main"
return await data.stream_to_file(
name, filename,
portal, tickers,
brokermod, rate,
)
filename = tractor.run(partial(main, tries=1), name='data-feed-recorder')
click.echo(f"Data feed recording saved to {filename}")
# options utils
@cli.command()
@click.option('--broker', '-b', default=DEFAULT_BROKER,
help='Broker backend to use')
@click.option('--loglevel', '-l', default='warning', help='Logging level')
@click.option('--ids', flag_value=True, help='Include numeric ids in output')
@click.argument('symbol', required=True)
@click.pass_context
def contracts(ctx, loglevel, broker, symbol, ids):
'''
Get list of all option contracts for symbol
'''
brokermod = get_brokermod(broker)
get_console_log(loglevel)
contracts = trio.run(partial(core.contracts, brokermod, symbol))
if not ids:
# just print out expiry dates which can be used with
# the option_chain_quote cmd
output = tuple(map(attrgetter('expiry'), contracts))
else:
output = tuple(contracts.items())
# TODO: need a cli test to verify
click.echo(colorize_json(output))
@cli.command()
@click.option('--date', '-d', help='Contracts expiry date')
@click.argument('symbol', required=True)
@click.pass_obj
def optsquote(config, symbol, date):
'''
Retrieve symbol option quotes on the console
'''
# global opts
brokermod = config['brokermods'][0]
quotes = trio.run(
partial(
core.option_chain, brokermod, symbol, date
)
)
if not quotes:
log.error(f"No option quotes could be found for {symbol}?")
return
click.echo(colorize_json(quotes))
@cli.command()
@click.argument('tickers', nargs=-1, required=True)
@click.pass_obj
def symbol_info(config, tickers):
'''
Print symbol info to the console
'''
# global opts
brokermod = config['brokermods'][0]
quotes = trio.run(partial(core.symbol_info, brokermod, tickers))
if not quotes:
log.error(f"No quotes could be found for {tickers}?")
return
if len(quotes) < len(tickers):
syms = tuple(map(itemgetter('symbol'), quotes))
for ticker in tickers:
if ticker not in syms:
brokermod.log.warn(f"Could not find symbol {ticker}?")
click.echo(colorize_json(quotes))
@cli.command()
@click.argument('pattern', required=True)
@click.pass_obj
def search(config, pattern):
'''
Search for symbols from broker backend(s).
'''
# global opts
brokermods = config['brokermods']
# define tractor entrypoint
async def main(func):
async with maybe_open_pikerd(
loglevel=config['loglevel'],
):
return await func()
quotes = trio.run(
main,
partial(
core.symbol_search,
brokermods,
pattern,
),
)
if not quotes:
log.error(f"No matches could be found for {pattern}?")
return
click.echo(colorize_json(quotes))


@ -0,0 +1,58 @@
"""
Broker configuration mgmt.
"""
import os
import configparser
import toml
import click
from ..log import get_logger
log = get_logger('broker-config')
_config_dir = click.get_app_dir('piker')
_file_name = 'brokers.toml'
def _override_config_dir(
path: str
) -> None:
global _config_dir
_config_dir = path
def get_broker_conf_path():
return os.path.join(_config_dir, _file_name)
def load(
path: str = None
) -> tuple[dict, str]:
"""Load broker config.
"""
path = path or get_broker_conf_path()
config = toml.load(path)
log.debug(f"Read config file {path}")
return config, path
def write(
config: dict, # toml config as dict
path: str = None,
) -> None:
"""Write broker config to disk.
Create a ``brokers.toml`` file if one does not exist.
"""
path = path or get_broker_conf_path()
dirname = os.path.dirname(path)
if not os.path.isdir(dirname):
log.debug(f"Creating config dir {_config_dir}")
os.makedirs(dirname)
if not config:
raise ValueError(
"Watch out you're trying to write a blank config!")
log.debug(f"Writing config file {path}")
with open(path, 'w') as cf:
return toml.dump(config, cf)
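# A hedged round-trip sketch; assumes a ``brokers.toml`` already exists
# since ``load()`` does not create one (the section and key names here
# are illustrative):
#
# config, path = load()
# config.setdefault('questrade', {})['refresh_token'] = '<token>'
# write(config)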


@ -1,38 +1,23 @@
# piker: trading gear for hackers
# Copyright (C) 2018-present Tyler Goodlet (in stewardship of piker0)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
""" """
Broker high level cross-process API layer. Broker high level API layer.
This API should be kept "remote service compatible" meaning inputs to
routines should be primitive data types where possible.
""" """
import inspect import inspect
from types import ModuleType from types import ModuleType
from typing import List, Dict, Any, Optional from typing import List, Dict, Any, Optional
import trio from async_generator import asynccontextmanager
import tractor
from ..log import get_logger from ..log import get_logger
from .data import DataFeed
from . import get_brokermod from . import get_brokermod
from .._daemon import maybe_spawn_brokerd
from .._cacheables import open_cached_client
log = get_logger(__name__) log = get_logger('broker.core')
_data_mods = [
'piker.brokers.core',
'piker.brokers.data',
]
async def api(brokername: str, methname: str, **kwargs) -> dict: async def api(brokername: str, methname: str, **kwargs) -> dict:
@ -40,11 +25,12 @@ async def api(brokername: str, methname: str, **kwargs) -> dict:
""" """
brokermod = get_brokermod(brokername) brokermod = get_brokermod(brokername)
async with brokermod.get_client() as client: async with brokermod.get_client() as client:
meth = getattr(client, methname, None)
if meth is None:
log.debug(
f"Couldn't find API method {methname} looking up on client")
meth = getattr(client.api, methname, None) meth = getattr(client.api, methname, None)
if meth is None:
log.warning(
f"Couldn't find API method {methname} looking up on client")
meth = getattr(client, methname, None)
if meth is None: if meth is None:
log.error(f"No api method `{methname}` could be found?") log.error(f"No api method `{methname}` could be found?")
@ -62,6 +48,24 @@ async def api(brokername: str, methname: str, **kwargs) -> dict:
return await meth(**kwargs) return await meth(**kwargs)
@asynccontextmanager
async def maybe_spawn_brokerd_as_subactor(sleep=0.5, tries=10, loglevel=None):
"""If no ``brokerd`` daemon-actor can be found spawn one in a
local subactor.
"""
async with tractor.open_nursery() as nursery:
async with tractor.find_actor('brokerd') as portal:
if not portal:
log.info(
"No broker daemon could be found, spawning brokerd..")
portal = await nursery.start_actor(
'brokerd',
rpc_module_paths=_data_mods,
loglevel=loglevel,
)
yield portal
async def stocks_quote( async def stocks_quote(
brokermod: ModuleType, brokermod: ModuleType,
tickers: List[str] tickers: List[str]
@ -106,73 +110,3 @@ async def contracts(
async with brokermod.get_client() as client: async with brokermod.get_client() as client:
# return await client.get_all_contracts([symbol]) # return await client.get_all_contracts([symbol])
return await client.get_all_contracts([symbol]) return await client.get_all_contracts([symbol])
async def bars(
brokermod: ModuleType,
symbol: str,
**kwargs,
) -> Dict[str, Dict[str, Dict[str, Any]]]:
"""Return option contracts (all expiries) for ``symbol``.
"""
async with brokermod.get_client() as client:
return await client.bars(symbol, **kwargs)
async def symbol_info(
brokermod: ModuleType,
symbol: str,
**kwargs,
) -> Dict[str, Dict[str, Dict[str, Any]]]:
"""Return symbol info from broker.
"""
async with brokermod.get_client() as client:
return await client.symbol_info(symbol, **kwargs)
async def search_w_brokerd(name: str, pattern: str) -> dict:
async with open_cached_client(name) as client:
# TODO: support multiple asset type concurrent searches.
return await client.search_symbols(pattern=pattern)
async def symbol_search(
brokermods: list[ModuleType],
pattern: str,
**kwargs,
) -> Dict[str, Dict[str, Dict[str, Any]]]:
'''
Return symbol info from broker.
'''
results = []
    async def search_backend(
        brokermod: ModuleType
    ) -> None:
        brokername: str = brokermod.name
        async with maybe_spawn_brokerd(
            brokermod.name,
            infect_asyncio=getattr(brokermod, '_infect_asyncio', False),
        ) as portal:
results.append((
brokername,
await portal.run(
search_w_brokerd,
name=brokername,
pattern=pattern,
),
))
async with trio.open_nursery() as n:
for mod in brokermods:
            n.start_soon(search_backend, mod)
return results


@ -1,30 +1,10 @@
# piker: trading gear for hackers """
# Copyright (C) 2018-present Tyler Goodlet (in stewardship of piker0) Real-time data feed machinery
"""
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
'''
NB: this is the old original implementation that was used way way back
when the project started with ``kivy``.
This code is left for reference but will likely be merged in
appropriately and removed.
'''
import time import time
from functools import partial from functools import partial
from dataclasses import dataclass, field from dataclasses import dataclass, field
from itertools import cycle
import socket import socket
import json import json
from types import ModuleType from types import ModuleType
@ -32,20 +12,20 @@ import typing
from typing import ( from typing import (
Coroutine, Callable, Dict, Coroutine, Callable, Dict,
List, Any, Tuple, AsyncGenerator, List, Any, Tuple, AsyncGenerator,
Sequence Sequence,
) )
import contextlib import contextlib
from operator import itemgetter
import trio import trio
import tractor import tractor
from tractor.experimental import msgpub
from async_generator import asynccontextmanager from async_generator import asynccontextmanager
from ..log import get_logger, get_console_log from ..log import get_logger, get_console_log
from . import get_brokermod from . import get_brokermod
log = get_logger(__name__) log = get_logger('broker.data')
async def wait_for_network( async def wait_for_network(
@ -67,7 +47,7 @@ async def wait_for_network(
continue continue
except socket.gaierror: except socket.gaierror:
if not down: # only report/log network down once if not down: # only report/log network down once
log.warn("Network is down waiting for re-establishment...") log.warn(f"Network is down waiting for re-establishment...")
down = True down = True
await trio.sleep(sleep) await trio.sleep(sleep)
@ -99,31 +79,25 @@ class BrokerFeed:
) )
@msgpub(tasks=['stock', 'option']) @tractor.msg.pub(tasks=['stock', 'option'])
async def stream_poll_requests( async def stream_requests(
get_topics: Callable, get_topics: typing.Callable,
get_quotes: Coroutine, get_quotes: Coroutine,
normalizer: Callable, feed: BrokerFeed,
rate: int = 3, # delay between quote requests rate: int = 3, # delay between quote requests
diff_cached: bool = True, # only deliver "new" quotes to the queue
) -> None: ) -> None:
"""Stream requests for quotes for a set of symbols at the given """Stream requests for quotes for a set of symbols at the given
``rate`` (per second). ``rate`` (per second).
This routine is built for brokers who support quote polling for multiple A stock-broker client ``get_quotes()`` async context manager must be
symbols per request. The ``get_topics()`` func is called to retreive the
set of symbols each iteration and ``get_quotes()`` is to retreive
the quotes.
A stock-broker client ``get_quotes()`` async function must be
provided which returns an async quote retrieval function. provided which returns an async quote retrieval function.
.. note::
This code is mostly tailored (for now) to the questrade backend.
It is currently the only broker that doesn't support streaming without
paying for data. See the note in the diffing section regarding volume
differentials which needs to be addressed in order to get cross-broker
support.
""" """
broker_limit = getattr(feed.mod, '_rate_limit', float('inf'))
if broker_limit < rate:
rate = broker_limit
log.warn(f"Limiting {feed.mod.__name__} query rate to {rate}/sec")
sleeptime = round(1. / rate, 3) sleeptime = round(1. / rate, 3)
_cache = {} # ticker to quote caching _cache = {} # ticker to quote caching
@ -149,15 +123,28 @@ async def stream_poll_requests(
quotes = await wait_for_network(request_quotes) quotes = await wait_for_network(request_quotes)
new_quotes = {} new_quotes = {}
if diff_cached:
normalized = normalizer(quotes, _cache) # If cache is enabled then only deliver "new" changes.
for symbol, quote in normalized.items(): # Useful for polling setups but obviously should be
# disabled if you're rx-ing per-tick data.
for quote in quotes:
symbol = quote['symbol']
last = _cache.setdefault(symbol, {})
new = set(quote.items()) - set(last.items())
if new:
log.info(
f"New quote {quote['symbol']}:\n{new}")
_cache[symbol] = quote
# XXX: we append to a list for the options case where the # XXX: we append to a list for the options case where the
# subscription topic (key) is the same for all # subscription topic (key) is the same for all
# expiries even though this is unnecessary for the # expiries even though this is unnecessary for the
# stock case (different topic [i.e. symbol] for each # stock case (different topic [i.e. symbol] for each
# quote). # quote).
new_quotes.setdefault(quote['key'], []).append(quote) new_quotes.setdefault(quote['key'], []).append(quote)
else:
# log.debug(f"Delivering quotes:\n{quotes}")
for quote in quotes:
new_quotes.setdefault(quote['key'], []).append(quote)
if new_quotes: if new_quotes:
yield new_quotes yield new_quotes
@ -181,23 +168,65 @@ async def symbol_data(broker: str, tickers: List[str]):
"""Retrieve baseline symbol info from broker. """Retrieve baseline symbol info from broker.
""" """
async with get_cached_feed(broker) as feed: async with get_cached_feed(broker) as feed:
return await feed.client.symbol_info(tickers) return await feed.client.symbol_data(tickers)
_feeds_cache = {} async def smoke_quote(get_quotes, tickers, broker):
"""Do an initial "smoke" request for symbols in ``tickers`` filtering
out any symbols not supported by the broker queried in the call to
``get_quotes()``.
"""
# TODO: trim out with #37
#################################################
# get a single quote filtering out any bad tickers
# NOTE: this code is always run for every new client
# subscription even when a broker quoter task is already running
# since the new client needs to know what symbols are accepted
log.warn(f"Retrieving smoke quote for symbols {tickers}")
quotes = await get_quotes(tickers)
# report any tickers that aren't returned in the first quote
invalid_tickers = set(tickers) - set(map(itemgetter('key'), quotes))
for symbol in invalid_tickers:
tickers.remove(symbol)
log.warn(
f"Symbol `{symbol}` not found by broker `{broker}`"
)
# pop any tickers that return "empty" quotes
payload = {}
for quote in quotes:
symbol = quote['symbol']
if quote is None:
log.warn(
f"Symbol `{symbol}` not found by broker"
f" `{broker}`")
# XXX: note this mutates the input list (for now)
tickers.remove(symbol)
continue
# report any unknown/invalid symbols (QT specific)
if quote.get('low52w', False) is None:
log.error(
f"{symbol} seems to be defunct")
payload[symbol] = quote
return payload
# end of section to be trimmed out with #37
###########################################
# TODO: use the version of this from .api ?
@asynccontextmanager @asynccontextmanager
async def get_cached_feed( async def get_cached_feed(
brokername: str, brokername: str,
) -> BrokerFeed: ) -> BrokerFeed:
"""Get/create a ``BrokerFeed`` from/in the current actor. """Get/create a ``BrokerFeed`` from/in the current actor.
""" """
global _feeds_cache # check if a cached client is in the local actor's statespace
ss = tractor.current_actor().statespace
# check if a cached feed is in the local actor feeds = ss.setdefault('feeds', {'_lock': trio.Lock()})
feeds = _feeds_cache.setdefault('feeds', {'_lock': trio.Lock()})
lock = feeds['_lock'] lock = feeds['_lock']
feed = None feed = None
try: try:
@ -231,6 +260,7 @@ async def start_quote_stream(
broker: str, broker: str,
symbols: List[Any], symbols: List[Any],
feed_type: str = 'stock', feed_type: str = 'stock',
diff_cached: bool = True,
rate: int = 3, rate: int = 3,
) -> None: ) -> None:
"""Handle per-broker quote stream subscriptions using a "lazy" pub-sub """Handle per-broker quote stream subscriptions using a "lazy" pub-sub
@ -238,7 +268,7 @@ async def start_quote_stream(
Spawns new quoter tasks for each broker backend on-demand. Spawns new quoter tasks for each broker backend on-demand.
Since most brokers seems to support batch quote requests we Since most brokers seems to support batch quote requests we
limit to one task per process (for now). limit to one task per process for now.
""" """
# XXX: why do we need this again? # XXX: why do we need this again?
get_console_log(tractor.current_actor().loglevel) get_console_log(tractor.current_actor().loglevel)
@ -249,6 +279,8 @@ async def start_quote_stream(
f"{ctx.chan.uid} subscribed to {broker} for symbols {symbols}") f"{ctx.chan.uid} subscribed to {broker} for symbols {symbols}")
# another actor task may have already created it # another actor task may have already created it
async with get_cached_feed(broker) as feed: async with get_cached_feed(broker) as feed:
# function to format packets delivered to subscribers
packetizer = None
if feed_type == 'stock': if feed_type == 'stock':
get_quotes = feed.quoters.setdefault( get_quotes = feed.quoters.setdefault(
@ -257,8 +289,7 @@ async def start_quote_stream(
) )
# do a smoke quote (note this mutates the input list and filters # do a smoke quote (note this mutates the input list and filters
# out bad symbols for now) # out bad symbols for now)
first_quotes = await feed.mod.smoke_quote(get_quotes, symbols) payload = await smoke_quote(get_quotes, symbols, broker)
formatter = feed.mod.format_stock_quote
elif feed_type == 'option': elif feed_type == 'option':
# FIXME: yeah we need maybe a more general way to specify # FIXME: yeah we need maybe a more general way to specify
@ -269,40 +300,29 @@ async def start_quote_stream(
await feed.mod.option_quoter(feed.client, symbols) await feed.mod.option_quoter(feed.client, symbols)
) )
# packetize # packetize
first_quotes = { payload = {
quote['symbol']: quote quote['symbol']: quote
for quote in await get_quotes(symbols) for quote in await get_quotes(symbols)
} }
formatter = feed.mod.format_option_quote
sd = await feed.client.symbol_info(symbols) def packetizer(topic, quotes):
feed.mod._symbol_info_cache.update(sd) return {quote['symbol']: quote for quote in quotes}
normalize = partial(
feed.mod.normalize,
formatter=formatter,
)
# pre-process first set of quotes
payload = {}
for sym, quote in first_quotes.items():
fquote, _ = formatter(quote, sd)
assert fquote['displayable']
payload[sym] = fquote
# push initial smoke quote response for client initialization
await ctx.send_yield(payload) await ctx.send_yield(payload)
await stream_poll_requests( await stream_requests(
# ``trionics.msgpub`` required kwargs # pub required kwargs
task_name=feed_type, task_name=feed_type,
ctx=ctx, ctx=ctx,
topics=symbols, topics=symbols,
packetizer=feed.mod.packetizer, packetizer=packetizer,
# actual func args # actual func args
feed=feed,
get_quotes=get_quotes, get_quotes=get_quotes,
normalizer=normalize, diff_cached=diff_cached,
rate=rate, rate=rate,
) )
log.info( log.info(
@ -332,13 +352,13 @@ class DataFeed:
self.quote_gen = None self.quote_gen = None
self._symbol_data_cache: Dict[str, Any] = {} self._symbol_data_cache: Dict[str, Any] = {}
@asynccontextmanager
async def open_stream( async def open_stream(
self, self,
symbols: Sequence[str], symbols: Sequence[str],
feed_type: str, feed_type: str,
rate: int = 1, rate: int = 1,
test: str = '', diff_cached: bool = True,
test: bool = None,
) -> (AsyncGenerator, dict): ) -> (AsyncGenerator, dict):
if feed_type not in self._allowed: if feed_type not in self._allowed:
raise ValueError(f"Only feed types {self._allowed} are supported") raise ValueError(f"Only feed types {self._allowed} are supported")
@ -358,22 +378,28 @@ class DataFeed:
# subscribe for tickers (this performs a possible filtering # subscribe for tickers (this performs a possible filtering
# where invalid symbols are discarded) # where invalid symbols are discarded)
sd = await self.portal.run( sd = await self.portal.run(
symbol_data, "piker.brokers.data", 'symbol_data',
broker=self.brokermod.name, broker=self.brokermod.name, tickers=symbols)
tickers=symbols
)
self._symbol_data_cache.update(sd) self._symbol_data_cache.update(sd)
if test:
# stream from a local test file
quote_gen = await self.portal.run(
"piker.brokers.data", 'stream_from_file',
filename=test
)
else:
log.info(f"Starting new stream for {symbols}") log.info(f"Starting new stream for {symbols}")
# start live streaming from broker daemon # start live streaming from broker daemon
async with self.portal.open_stream_from( quote_gen = await self.portal.run(
start_quote_stream, "piker.brokers.data",
'start_quote_stream',
broker=self.brokermod.name, broker=self.brokermod.name,
symbols=symbols, symbols=symbols,
feed_type=feed_type, feed_type=feed_type,
diff_cached=diff_cached,
rate=rate, rate=rate,
) as quote_gen: )
# get first quotes response # get first quotes response
log.debug(f"Waiting on first quote for {symbols}...") log.debug(f"Waiting on first quote for {symbols}...")
@ -382,8 +408,7 @@ class DataFeed:
self.quote_gen = quote_gen self.quote_gen = quote_gen
self.first_quotes = quotes self.first_quotes = quotes
yield quote_gen, quotes return quote_gen, quotes
except Exception: except Exception:
if self.quote_gen: if self.quote_gen:
await self.quote_gen.aclose() await self.quote_gen.aclose()
@ -391,8 +416,6 @@ class DataFeed:
raise raise
def format_quotes(self, quotes, symbol_data={}): def format_quotes(self, quotes, symbol_data={}):
"""Format ``quotes`` using broker defined formatter.
"""
self._symbol_data_cache.update(symbol_data) self._symbol_data_cache.update(symbol_data)
formatter = getattr(self.brokermod, f'format_{self._quote_type}_quote') formatter = getattr(self.brokermod, f'format_{self._quote_type}_quote')
records, displayables = zip(*[ records, displayables = zip(*[
@ -405,7 +428,8 @@ class DataFeed:
"""Call a broker ``Client`` method using RPC and return result. """Call a broker ``Client`` method using RPC and return result.
""" """
return await self.portal.run( return await self.portal.run(
call_client, 'piker.brokers.data',
'call_client',
broker=self.brokermod.name, broker=self.brokermod.name,
methname=method, methname=method,
**kwargs **kwargs
@ -423,11 +447,9 @@ async def stream_to_file(
"""Record client side received quotes to file ``filename``. """Record client side received quotes to file ``filename``.
""" """
# an async generator instance # an async generator instance
async with portal.open_stream_from( agen = await portal.run(
start_quote_stream, "piker.brokers.data", 'start_quote_stream',
broker=brokermod.name, broker=brokermod.name, tickers=tickers)
symbols=tickers
) as agen:
fname = filename or f'{watchlist_name}.jsonstream' fname = filename or f'{watchlist_name}.jsonstream'
with open(fname, 'a') as f: with open(fname, 'a') as f:
@ -438,14 +460,14 @@ async def stream_to_file(
return fname return fname
# async def stream_from_file( async def stream_from_file(
# filename: str, filename: str,
# ): ):
# with open(filename, 'r') as quotes_file: with open(filename, 'r') as quotes_file:
# content = quotes_file.read() content = quotes_file.read()
# pkts = content.split('--')[:-1] # simulate 2 separate quote packets pkts = content.split('--')[:-1] # simulate 2 separate quote packets
# payloads = [json.loads(pkt) for pkt in pkts] payloads = [json.loads(pkt) for pkt in pkts]
# for payload in cycle(payloads): for payload in cycle(payloads):
# yield payload yield payload
# await trio.sleep(0.3) await trio.sleep(0.3)


@ -1,67 +0,0 @@
# piker: trading gear for hackers
# Copyright (C) Tyler Goodlet (in stewardship for pikers)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""
Interactive Brokers API backend.
Sub-modules within break into the core functionalities:
- ``broker.py`` part for orders / trading endpoints
- ``data.py`` for real-time data feed endpoints
- ``client.py`` for the core API machinery which is ``trio``-ized
wrapping around ``ib_insync``.
- ``report.py`` for the hackery to build manual pp calcs
to avoid ib's absolute bullshit FIFO style position
tracking..
"""
from .api import (
get_client,
)
from .feed import (
open_history_client,
open_symbol_search,
stream_quotes,
)
from .broker import trades_dialogue
__all__ = [
'get_client',
'trades_dialogue',
'open_history_client',
'open_symbol_search',
'stream_quotes',
]
# tractor RPC enable arg
__enable_modules__: list[str] = [
'api',
'feed',
'broker',
]
# passed to ``tractor.ActorNursery.start_actor()``
_spawn_kwargs = {
'infect_asyncio': True,
}
# annotation to let backend agnostic code
# know if ``brokerd`` should be spawned with
# ``tractor``'s aio mode.
_infect_asyncio: bool = True

File diff suppressed because it is too large


@ -1,590 +0,0 @@
# piker: trading gear for hackers
# Copyright (C) Tyler Goodlet (in stewardship for pikers)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""
Order and trades endpoints for use with ``piker``'s EMS.
"""
from __future__ import annotations
from dataclasses import asdict
from functools import partial
from pprint import pformat
import time
from typing import (
Any,
Optional,
AsyncIterator,
)
import trio
from trio_typing import TaskStatus
import tractor
from ib_insync.contract import (
Contract,
Option,
)
from ib_insync.order import (
Trade,
OrderStatus,
)
from ib_insync.objects import (
Fill,
Execution,
)
from ib_insync.objects import Position
from piker import config
from piker.log import get_console_log
from piker.clearing._messages import (
BrokerdOrder,
BrokerdOrderAck,
BrokerdStatus,
BrokerdPosition,
BrokerdCancel,
BrokerdFill,
BrokerdError,
)
from .api import (
_accounts2clients,
_adhoc_futes_set,
log,
get_config,
open_client_proxies,
Client,
)
def pack_position(
pos: Position
) -> dict[str, Any]:
con = pos.contract
if isinstance(con, Option):
# TODO: option symbol parsing and sane display:
symbol = con.localSymbol.replace(' ', '')
else:
# TODO: lookup fqsn even for derivs.
symbol = con.symbol.lower()
exch = (con.primaryExchange or con.exchange).lower()
symkey = '.'.join((symbol, exch))
if not exch:
# attempt to lookup the symbol from our
# hacked set..
for sym in _adhoc_futes_set:
if symbol in sym:
symkey = sym
break
expiry = con.lastTradeDateOrContractMonth
if expiry:
symkey += f'.{expiry}'
# TODO: options contracts into a sane format..
return BrokerdPosition(
broker='ib',
account=pos.account,
symbol=symkey,
currency=con.currency,
size=float(pos.position),
avg_price=float(pos.avgCost) / float(con.multiplier or 1.0),
)
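# A hedged example of the key format built above: an ``MES`` future on
# GLOBEX with expiry ``20220318`` packs to a fqsn-style key like
# ``mes.globex.20220318``.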
async def handle_order_requests(
ems_order_stream: tractor.MsgStream,
accounts_def: dict[str, str],
) -> None:
request_msg: dict
async for request_msg in ems_order_stream:
log.info(f'Received order request {request_msg}')
action = request_msg['action']
account = request_msg['account']
acct_number = accounts_def.get(account)
if not acct_number:
log.error(
f'An IB account number for name {account} is not found?\n'
'Make sure you have all TWS and GW instances running.'
)
await ems_order_stream.send(BrokerdError(
oid=request_msg['oid'],
symbol=request_msg['symbol'],
reason=f'No account found: `{account}` ?',
).dict())
continue
client = _accounts2clients.get(account)
if not client:
log.error(
f'An IB client for account name {account} is not found.\n'
'Make sure you have all TWS and GW instances running.'
)
await ems_order_stream.send(BrokerdError(
oid=request_msg['oid'],
symbol=request_msg['symbol'],
reason=f'No api client loaded for account: `{account}` ?',
).dict())
continue
if action in {'buy', 'sell'}:
# validate
order = BrokerdOrder(**request_msg)
# call our client api to submit the order
reqid = client.submit_limit(
oid=order.oid,
symbol=order.symbol,
price=order.price,
action=order.action,
size=order.size,
account=acct_number,
# XXX: by default 0 tells ``ib_insync`` methods that
# there is no existing order so ask the client to create
# a new one (which it seems to do by allocating an int
# counter - collision prone..)
reqid=order.reqid,
)
if reqid is None:
await ems_order_stream.send(BrokerdError(
oid=request_msg['oid'],
symbol=request_msg['symbol'],
reason='Order already active?',
).dict())
# deliver ack that order has been submitted to broker routing
await ems_order_stream.send(
BrokerdOrderAck(
# ems order request id
oid=order.oid,
# broker specific request id
reqid=reqid,
time_ns=time.time_ns(),
account=account,
).dict()
)
elif action == 'cancel':
msg = BrokerdCancel(**request_msg)
client.submit_cancel(reqid=msg.reqid)
else:
log.error(f'Unknown order command: {request_msg}')
async def recv_trade_updates(
client: Client,
to_trio: trio.abc.SendChannel,
) -> None:
"""Stream a ticker using the std L1 api.
"""
client.inline_errors(to_trio)
# sync with trio task
to_trio.send_nowait(None)
def push_tradesies(eventkit_obj, obj, fill=None):
"""Push events to trio task.
"""
if fill is not None:
# execution details event
item = ('fill', (obj, fill))
elif eventkit_obj.name() == 'positionEvent':
item = ('position', obj)
else:
item = ('status', obj)
log.info(f'eventkit event ->\n{pformat(item)}')
try:
to_trio.send_nowait(item)
except trio.BrokenResourceError:
log.exception(f'Disconnected from {eventkit_obj} updates')
eventkit_obj.disconnect(push_tradesies)
# hook up to the weird eventkit object - event stream api
for ev_name in [
'orderStatusEvent', # all order updates
'execDetailsEvent', # all "fill" updates
'positionEvent', # avg price updates per symbol per account
# 'commissionReportEvent',
# XXX: ugh, it is a separate event from IB and it's
# emitted as follows:
# self.ib.commissionReportEvent.emit(trade, fill, report)
# XXX: not sure yet if we need these
# 'updatePortfolioEvent',
# XXX: these all seem to be weird ib_insync intrernal
# events that we probably don't care that much about
# given the internal design is wonky af..
# 'newOrderEvent',
# 'orderModifyEvent',
# 'cancelOrderEvent',
# 'openOrderEvent',
]:
eventkit_obj = getattr(client.ib, ev_name)
handler = partial(push_tradesies, eventkit_obj)
eventkit_obj.connect(handler)
# let the engine run and stream
await client.ib.disconnectedEvent
@tractor.context
async def trades_dialogue(
ctx: tractor.Context,
loglevel: str = None,
) -> AsyncIterator[dict[str, Any]]:
# XXX: required to propagate ``tractor`` loglevel to piker logging
get_console_log(loglevel or tractor.current_actor().loglevel)
accounts_def = config.load_accounts(['ib'])
global _client_cache
# deliver positions to subscriber before anything else
all_positions = []
accounts = set()
clients: list[tuple[Client, trio.MemoryReceiveChannel]] = []
async with (
trio.open_nursery() as nurse,
open_client_proxies() as (proxies, aioclients),
):
for account, proxy in proxies.items():
client = aioclients[account]
async def open_stream(
task_status: TaskStatus[
trio.abc.ReceiveChannel
] = trio.TASK_STATUS_IGNORED,
):
# each api client has a unique event stream
async with tractor.to_asyncio.open_channel_from(
recv_trade_updates,
client=client,
) as (first, trade_event_stream):
task_status.started(trade_event_stream)
await trio.sleep_forever()
trade_event_stream = await nurse.start(open_stream)
clients.append((client, trade_event_stream))
assert account in accounts_def
accounts.add(account)
for client in aioclients.values():
for pos in client.positions():
msg = pack_position(pos)
msg.account = accounts_def.inverse[msg.account]
assert msg.account in accounts, (
f'Position for unknown account: {msg.account}')
all_positions.append(msg.dict())
trades: list[dict] = []
for proxy in proxies.values():
trades.append(await proxy.trades())
log.info(f'Loaded {len(trades)} from this session')
# TODO: write trades to local ``trades.toml``
# - use above per-session trades data and write to local file
# - get the "flex reports" working and pull historical data and
# also save locally.
await ctx.started((
all_positions,
tuple(name for name in accounts_def if name in accounts),
))
async with (
ctx.open_stream() as ems_stream,
trio.open_nursery() as n,
):
# start order request handler **before** local trades event loop
n.start_soon(handle_order_requests, ems_stream, accounts_def)
# allocate event relay tasks for each client connection
for client, stream in clients:
n.start_soon(
deliver_trade_events,
stream,
ems_stream,
accounts_def
)
# block until cancelled
await trio.sleep_forever()
async def deliver_trade_events(
trade_event_stream: trio.MemoryReceiveChannel,
ems_stream: tractor.MsgStream,
accounts_def: dict[str, str],
) -> None:
'''Format and relay all trade events for a given client to the EMS.
'''
action_map = {'BOT': 'buy', 'SLD': 'sell'}
# TODO: for some reason we can receive a ``None`` here when the
# ib-gw goes down? Not sure exactly how that's happening looking
# at the eventkit code above but we should probably handle it...
async for event_name, item in trade_event_stream:
log.info(f'ib sending {event_name}:\n{pformat(item)}')
# TODO: templating the ib statuses in comparison with other
# brokers is likely the way to go:
# https://interactivebrokers.github.io/tws-api/interfaceIBApi_1_1EWrapper.html#a17f2a02d6449710b6394d0266a353313
# short list:
# - PendingSubmit
# - PendingCancel
# - PreSubmitted (simulated orders)
# - ApiCancelled (cancelled by client before submission
# to routing)
# - Cancelled
# - Filled
# - Inactive (reject or cancelled but not by trader)
# XXX: here's some other sucky cases from the api
# - short-sale but securities haven't been located, in this
# case we should probably keep the order in some kind of
# weird state or cancel it outright?
# status='PendingSubmit', message=''),
# status='Cancelled', message='Error 404,
# reqId 1550: Order held while securities are located.'),
# status='PreSubmitted', message='')],
if event_name == 'status':
# XXX: begin normalization of nonsense ib_insync internal
# object-state tracking representations...
# unwrap needed data from ib_insync internal types
trade: Trade = item
status: OrderStatus = trade.orderStatus
# skip duplicate filled updates - we get the deats
# from the execution details event
msg = BrokerdStatus(
reqid=trade.order.orderId,
time_ns=time.time_ns(), # cuz why not
account=accounts_def.inverse[trade.order.account],
# everyone doin camel case..
status=status.status.lower(), # force lower case
filled=status.filled,
reason=status.whyHeld,
# this seems to not be necessarily up to date in the
# execDetails event.. so we have to send it here I guess?
remaining=status.remaining,
broker_details={'name': 'ib'},
)
elif event_name == 'fill':
# for wtv reason this is a separate event type
# from IB, not sure why it's needed other than for extra
# complexity and over-engineering :eyeroll:.
# we may just end up dropping these events (or
# translating them to ``Status`` msgs) if we can
# show the equivalent status events are no more latent.
# unpack ib_insync types
# pep-0526 style:
# https://www.python.org/dev/peps/pep-0526/#global-and-local-variable-annotations
trade: Trade
fill: Fill
trade, fill = item
execu: Execution = fill.execution
# TODO: normalize out commissions details?
details = {
'contract': asdict(fill.contract),
'execution': asdict(fill.execution),
'commissions': asdict(fill.commissionReport),
'broker_time': execu.time, # supposedly server fill time
'name': 'ib',
}
msg = BrokerdFill(
# should match the value returned from `.submit_limit()`
reqid=execu.orderId,
time_ns=time.time_ns(), # cuz why not
action=action_map[execu.side],
size=execu.shares,
price=execu.price,
broker_details=details,
# XXX: required by order mode currently
broker_time=details['broker_time'],
)
elif event_name == 'error':
err: dict = item
# f$#$% gawd dammit insync..
con = err['contract']
if isinstance(con, Contract):
err['contract'] = asdict(con)
if err['reqid'] == -1:
log.error(f'TWS external order error:\n{pformat(err)}')
# TODO: what schema for this msg if we're going to make it
# portable across all backends?
# msg = BrokerdError(**err)
continue
elif event_name == 'position':
msg = pack_position(item)
msg.account = accounts_def.inverse[msg.account]
elif event_name == 'event':
# it's either a general system status event or an external
# trade event?
log.info(f"TWS system status: \n{pformat(item)}")
# TODO: support this again but needs parsing at the callback
# level...
# reqid = item.get('reqid', 0)
# if getattr(msg, 'reqid', 0) < -1:
# log.info(f"TWS triggered trade\n{pformat(msg.dict())}")
continue
# msg.reqid = 'tws-' + str(-1 * reqid)
# mark msg as from "external system"
# TODO: probably something better then this.. and start
# considering multiplayer/group trades tracking
# msg.broker_details['external_src'] = 'tws'
# XXX: we always serialize to a dict for msgpack
# translations, ideally we can move to an msgspec (or other)
# encoder # that can be enabled in ``tractor`` ahead of
# time so we can pass through the message types directly.
await ems_stream.send(msg.dict())
def load_flex_trades(
path: Optional[str] = None,
) -> dict[str, str]:
from pprint import pprint
from ib_insync import flexreport, util
conf = get_config()
if not path:
# load ``brokers.toml`` and try to get the flex
# token and query id that must be previously defined
# by the user.
token = conf.get('flex_token')
if not token:
raise ValueError(
                'You must specify a ``flex_token`` field in your '
                '``brokers.toml`` in order to load your trade log, see '
                'our instructions for how to set this up here:\n'
                'PUT LINK HERE!'
)
qid = conf['flex_trades_query_id']
# TODO: hack this into our logging
# system like we do with the API client..
util.logToConsole()
# TODO: rewrite the query part of this with async..httpx?
report = flexreport.FlexReport(
token=token,
queryId=qid,
)
else:
# XXX: another project we could potentially look at,
# https://pypi.org/project/ibflex/
report = flexreport.FlexReport(path=path)
trade_entries = report.extract('Trade')
trades = {
# XXX: LOL apparently ``toml`` has a bug
# where a section key error will show up in the write
# if you leave this as an ``int``?
str(t.__dict__['tradeID']): t.__dict__
for t in trade_entries
}
ln = len(trades)
log.info(f'Loaded {ln} trades from flex query')
trades_by_account = {}
for tid, trade in trades.items():
trades_by_account.setdefault(
# oddly for some so-called "BookTrade" entries
# this field seems to be blank, no cuckin clue.
# trade['ibExecID']
str(trade['accountId']), {}
)[tid] = trade
section = {'ib': trades_by_account}
pprint(section)
# TODO: load the config first and append in
# the new trades loaded here..
try:
config.write(section, 'trades')
except KeyError:
import pdbpp; pdbpp.set_trace() # noqa
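# A hedged ``brokers.toml`` sketch for the flex settings consumed above
# (these keys live in the section returned by ``get_config()``,
# presumably ``[ib]``; the values are placeholders):
#
# [ib]
# flex_token = '<token>'
# flex_trades_query_id = '<query id>'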
if __name__ == '__main__':
load_flex_trades()


@ -1,938 +0,0 @@
# piker: trading gear for hackers
# Copyright (C) Tyler Goodlet (in stewardship for pikers)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""
Data feed endpoints pre-wrapped and ready for use with ``tractor``/``trio``.
"""
from __future__ import annotations
import asyncio
from contextlib import asynccontextmanager as acm
from dataclasses import asdict
from datetime import datetime
from math import isnan
import time
from typing import (
Callable,
Optional,
Awaitable,
)
from async_generator import aclosing
from fuzzywuzzy import process as fuzzy
import numpy as np
import pendulum
import tractor
import trio
from trio_typing import TaskStatus
from piker.data._sharedmem import ShmArray
from .._util import SymbolNotFound, NoData
from .api import (
_adhoc_futes_set,
log,
load_aio_clients,
ibis,
MethodProxy,
open_client_proxies,
get_preferred_data_client,
Ticker,
RequestError,
Contract,
)
# https://interactivebrokers.github.io/tws-api/tick_types.html
tick_types = {
77: 'trade',
# a "utrade" aka an off exchange "unreportable" (dark) vlm:
# https://interactivebrokers.github.io/tws-api/tick_types.html#rt_volume
48: 'dark_trade',
# standard L1 ticks
0: 'bsize',
1: 'bid',
2: 'ask',
3: 'asize',
4: 'last',
5: 'size',
8: 'volume',
# ``ib_insync`` already packs these into
# quotes under the following fields.
# 55: 'trades_per_min', # `'tradeRate'`
# 56: 'vlm_per_min', # `'volumeRate'`
# 89: 'shortable', # `'shortableShares'`
}
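# A hedged helper sketch (illustrative, not part of this backend):
# resolve a raw tick id to its normalized name, falling back to the
# stringified id for unmapped types.
#
# def tick_name(tick_type: int) -> str:
#     return tick_types.get(tick_type, str(tick_type))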
@acm
async def open_data_client() -> MethodProxy:
'''
Open the first found preferred "data client" as defined in the
user's ``brokers.toml`` in the ``ib.prefer_data_account`` variable
and deliver that client wrapped in a ``MethodProxy``.
'''
async with (
open_client_proxies() as (proxies, clients),
):
account_name, client = get_preferred_data_client(clients)
proxy = proxies.get(f'ib.{account_name}')
if not proxy:
raise ValueError(
f'No preferred data client could be found for {account_name}!'
)
yield proxy
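# A hedged ``brokers.toml`` sketch of the preference read above (the
# exact value shape is whatever ``get_preferred_data_client()`` expects;
# the account names are placeholders):
#
# [ib]
# prefer_data_account = ['paper', 'margin']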
@acm
async def open_history_client(
symbol: str,
) -> tuple[Callable, dict]:
'''
History retrieval endpoint - delivers a historical frame callable
that takes in ``pendulum.datetime`` and returns ``numpy`` arrays.
'''
async with open_data_client() as proxy:
async def get_hist(
end_dt: Optional[datetime] = None,
start_dt: Optional[datetime] = None,
) -> tuple[np.ndarray, str]:
out, fails = await get_bars(proxy, symbol, end_dt=end_dt)
# TODO: add logic here to handle tradable hours and only grab
# valid bars in the range
if out is None:
# could be trying to retrieve bars over weekend
log.error(f"Can't grab bars starting at {end_dt}!?!?")
raise NoData(
f'{end_dt}',
frame_size=2000,
)
bars, bars_array, first_dt, last_dt = out
# volume cleaning since there's -ve entries,
# wood luv to know what crookery that is..
vlm = bars_array['volume']
vlm[vlm < 0] = 0
return bars_array, first_dt, last_dt
# TODO: it seems like we can do async queries for ohlc
# but getting the order right still isn't working and I'm not
# quite sure why.. needs some tinkering and probably
# a lookthrough of the ``ib_insync`` machinery, for eg. maybe
# we have to do the batch queries on the `asyncio` side?
yield get_hist, {'erlangs': 1, 'rate': 6}
_pacing: str = (
'Historical Market Data Service error '
'message:Historical data request pacing violation'
)
async def get_bars(
proxy: MethodProxy,
fqsn: str,
# blank to start which tells ib to look up the latest datum
end_dt: str = '',
) -> tuple[dict, np.ndarray]:
'''
Retrieve historical data from a ``trio``-side task using
a ``MethoProxy``.
'''
fails = 0
bars: Optional[list] = None
    first_dt: Optional[datetime] = None
    last_dt: Optional[datetime] = None
if end_dt:
last_dt = pendulum.from_timestamp(end_dt.timestamp())
for _ in range(10):
try:
out = await proxy.bars(
fqsn=fqsn,
end_dt=end_dt,
)
if out:
bars, bars_array = out
else:
await tractor.breakpoint()
if bars_array is None:
raise SymbolNotFound(fqsn)
first_dt = pendulum.from_timestamp(
bars[0].date.timestamp())
last_dt = pendulum.from_timestamp(
bars[-1].date.timestamp())
time = bars_array['time']
assert time[-1] == last_dt.timestamp()
assert time[0] == first_dt.timestamp()
log.info(
f'{len(bars)} bars retrieved for {first_dt} -> {last_dt}'
)
return (bars, bars_array, first_dt, last_dt), fails
except RequestError as err:
msg = err.message
# why do we always need to rebind this?
# _err = err
if 'No market data permissions for' in msg:
# TODO: signalling for no permissions searches
raise NoData(
f'Symbol: {fqsn}',
)
elif (
err.code == 162
and 'HMDS query returned no data' in err.message
):
# XXX: this is now done in the storage mgmt layer
# and we shouldn't implicitly decrement the frame dt
# index since the upper layer may be doing so
# concurrently and we don't want to be delivering frames
# that weren't asked for.
log.warning(
f'NO DATA found ending @ {end_dt}\n'
)
# try to decrement start point and look further back
# end_dt = last_dt = last_dt.subtract(seconds=2000)
raise NoData(
f'Symbol: {fqsn}',
frame_size=2000,
)
elif _pacing in msg:
log.warning(
'History throttle rate reached!\n'
'Resetting farms with `ctrl-alt-f` hack\n'
)
# TODO: we might have to put a task lock around this
# method..
hist_ev = proxy.status_event(
'HMDS data farm connection is OK:ushmds'
)
# XXX: other event messages we might want to try and
# wait for but i wasn't able to get any of this
# reliable..
# reconnect_start = proxy.status_event(
# 'Market data farm is connecting:usfuture'
# )
# live_ev = proxy.status_event(
# 'Market data farm connection is OK:usfuture'
# )
# try to wait on the reset event(s) to arrive; a timeout
# will trigger a retry (bounded by the outer request loop).
tries: int = 2
timeout: float = 10
# try with a data reset first, then fail over to
# a connection reset.
for i in range(1, tries):
log.warning('Sending DATA RESET request')
await data_reset_hack(reset_type='data')
with trio.move_on_after(timeout) as cs:
for name, ev in [
# TODO: not sure if waiting on other events
# is all that useful here or not. in theory
# you could wait on one of the ones above
# first to verify the reset request was
# sent?
('history', hist_ev),
]:
await ev.wait()
log.info(f"{name} DATA RESET")
break
if cs.cancelled_caught:
fails += 1
log.warning(
f'Data reset {name} timeout, retrying {i}.'
)
continue
else:
log.warning('Sending CONNECTION RESET')
await data_reset_hack(reset_type='connection')
with trio.move_on_after(timeout) as cs:
for name, ev in [
# TODO: not sure if waiting on other events
# is all that useful here or not. in theory
# you could wait on one of the ones above
# first to verify the reset request was
# sent?
('history', hist_ev),
]:
await ev.wait()
log.info(f"{name} DATA RESET")
if cs.cancelled_caught:
fails += 1
log.warning('Data CONNECTION RESET timeout!?')
else:
raise
return None, None
# else: # throttle wasn't fixed so error out immediately
# raise _err
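# NOTE: sketch of the expected call pattern for the above - a
# ``(None, None)`` result means all reset retries failed, otherwise
# the frame bundle plus the accumulated (timeout) fail count is
# returned:
#
# out, fails = await get_bars(proxy, 'mnq.globex')
# if out is not None:
#     bars, bars_array, first_dt, last_dt = out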
async def backfill_bars(
fqsn: str,
shm: ShmArray, # type: ignore # noqa
# TODO: we want to avoid overrunning the underlying shm array buffer
# and we should probably calc the number of calls to make depending
# on that until we have the `marketstore` daemon in place in which
# case the shm size will be driven by user config and available sys
# memory.
count: int = 16,
task_status: TaskStatus[trio.CancelScope] = trio.TASK_STATUS_IGNORED,
) -> None:
'''
Fill historical bars into shared mem / storage afap.
TODO: avoid pacing constraints:
https://github.com/pikers/piker/issues/128
'''
# last_dt1 = None
last_dt = None
with trio.CancelScope() as cs:
async with open_data_client() as proxy:
out, fails = await get_bars(proxy, fqsn)
if out is None:
raise RuntimeError("Could not pull currrent history?!")
(first_bars, bars_array, first_dt, last_dt) = out
vlm = bars_array['volume']
vlm[vlm < 0] = 0
last_dt = first_dt
# write historical data to buffer
shm.push(bars_array)
task_status.started(cs)
i = 0
while i < count:
out, fails = await get_bars(proxy, fqsn, end_dt=first_dt)
if out is None:
# could be trying to retrieve bars over the weekend
# TODO: add logic here to handle tradable hours and
# only grab valid bars in the range
log.error(f"Can't grab bars starting at {first_dt}!?!?")
# XXX: get_bars() should internally decrement dt by
# 2k seconds and try again.
continue
(first_bars, bars_array, first_dt, last_dt) = out
# last_dt1 = last_dt
# last_dt = first_dt
# volume cleaning since there's -ve entries,
# wood luv to know what crookery that is..
vlm = bars_array['volume']
vlm[vlm < 0] = 0
# TODO we should probably dig into forums to see what peeps
# think this data "means" and then use it as an indicator of
# sorts? dinkus has mentioned that $vlms for the day dont'
# match other platforms nor the summary stat tws shows in
# the monitor - it's probably worth investigating.
shm.push(bars_array, prepend=True)
i += 1
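# NOTE: since the loop above walks *backwards* through history each
# new frame is prepended; with a (hypothetical) 2k-bar frame size the
# shm buffer fills like ``[frame_n, ..., frame_1, frame_0]`` where
# ``frame_0`` holds the most recent bars.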
asset_type_map = {
'STK': 'stock',
'OPT': 'option',
'FUT': 'future',
'CONTFUT': 'continuous_future',
'CASH': 'forex',
'IND': 'index',
'CFD': 'cfd',
'BOND': 'bond',
'CMDTY': 'commodity',
'FOP': 'futures_option',
'FUND': 'mutual_fund',
'WAR': 'warrant',
'IOPT': 'warrant',
'BAG': 'bag',
# 'NEWS': 'news',
}
_quote_streams: dict[str, trio.abc.ReceiveStream] = {}
async def _setup_quote_stream(
from_trio: asyncio.Queue,
to_trio: trio.abc.SendChannel,
symbol: str,
opts: tuple[str, ...] = (
'375', # RT trade volume (excludes utrades)
'233', # RT trade volume (includes utrades)
'236', # Shortable shares
# these all appear to only be updated every 25s thus
# making them mostly useless and explains why the scanner
# is always slow XD
# '293', # Trade count for day
'294', # Trade rate / minute
'295', # Vlm rate / minute
),
contract: Optional[Contract] = None,
) -> None:
'''
Stream a ticker using the std L1 api.
This task is ``asyncio``-side and must be called from
``tractor.to_asyncio.open_channel_from()``.
'''
global _quote_streams
to_trio.send_nowait(None)
async with load_aio_clients() as accts2clients:
caccount_name, client = get_preferred_data_client(accts2clients)
contract = contract or (await client.find_contract(symbol))
ticker: Ticker = client.ib.reqMktData(contract, ','.join(opts))
# NOTE: it's batch-wise and slow af but I guess could
# be good for backchecking? Seems to be every 5s maybe?
# ticker: Ticker = client.ib.reqTickByTickData(
# contract, 'Last',
# )
# # define a simple queue push routine that streams quote packets
# # to trio over the ``to_trio`` memory channel.
# to_trio, from_aio = trio.open_memory_channel(2**8) # type: ignore
def teardown():
ticker.updateEvent.disconnect(push)
log.error(f"Disconnected stream for `{symbol}`")
client.ib.cancelMktData(contract)
# decouple broadcast mem chan
_quote_streams.pop(symbol, None)
def push(t: Ticker) -> None:
"""
Push quotes to trio task.
"""
# log.debug(t)
try:
to_trio.send_nowait(t)
except (
trio.BrokenResourceError,
# XXX: HACK, not sure why this gets left stale (probably
# due to our terrible ``tractor.to_asyncio``
# implementation for streams.. but if the mem chan
# gets left here and starts blocking just kill the feed?
# trio.WouldBlock,
):
# XXX: eventkit's ``Event.emit()`` for whatever redic
# reason will catch and ignore regular exceptions
# resulting in tracebacks spammed to console..
# Manually do the dereg ourselves.
teardown()
except trio.WouldBlock:
log.warning(
f'channel is blocking symbol feed for {symbol}?'
f'\n{to_trio.statistics()}'
)
# except trio.WouldBlock:
# # for slow debugging purposes to avoid clobbering prompt
# # with log msgs
# pass
ticker.updateEvent.connect(push)
try:
await asyncio.sleep(float('inf'))
finally:
teardown()
# return from_aio
@acm
async def open_aio_quote_stream(
symbol: str,
contract: Optional[Contract] = None,
) -> trio.abc.ReceiveStream:
from tractor.trionics import broadcast_receiver
global _quote_streams
from_aio = _quote_streams.get(symbol)
if from_aio:
# if we already have a cached feed deliver a rx side clone to consumer
async with broadcast_receiver(
from_aio,
2**6,
) as from_aio:
yield from_aio
return
async with tractor.to_asyncio.open_channel_from(
_setup_quote_stream,
symbol=symbol,
contract=contract,
) as (first, from_aio):
# cache feed for later consumers
_quote_streams[symbol] = from_aio
yield from_aio
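# NOTE: sketch of the caching behavior above - the first consumer
# spawns the ``asyncio`` feed task, subsequent (same-actor) consumers
# get a broadcast-receiver clone of the cached channel:
#
# async with open_aio_quote_stream('mnq.globex') as stream:
#     async for ticker in stream:
#         ...  # process quotes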
# TODO: cython/mypyc/numba this!
def normalize(
ticker: Ticker,
calc_price: bool = False
) -> dict:
# should be real volume for this contract by default
calc_price = False
# check for special contract types
con = ticker.contract
if type(con) in (
ibis.Commodity,
ibis.Forex,
):
# commodities and forex don't have an exchange name and
# no real volume so we have to calculate the price
suffix = con.secType
# no real volume on this contract
calc_price = True
else:
suffix = con.primaryExchange
if not suffix:
suffix = con.exchange
# append a `.<suffix>` to the returned symbol
# key for derivatives that normally is the expiry
# date key.
expiry = con.lastTradeDateOrContractMonth
if expiry:
suffix += f'.{expiry}'
# convert named tuples to dicts so we send usable keys
new_ticks = []
for tick in ticker.ticks:
if tick and not isinstance(tick, dict):
td = tick._asdict()
td['type'] = tick_types.get(
td['tickType'],
'n/a',
)
new_ticks.append(td)
tbt = ticker.tickByTicks
if tbt:
print(f'tickbyticks:\n {ticker.tickByTicks}')
ticker.ticks = new_ticks
# some contracts don't have volume so we may want to calculate
# a midpoint price based on data we can acquire (such as bid / ask)
if calc_price:
ticker.ticks.append(
{'type': 'trade', 'price': ticker.marketPrice()}
)
# serialize for transport
data = asdict(ticker)
# generate fqsn with possible specialized suffix
# for derivatives, note the lowercase.
data['symbol'] = data['fqsn'] = '.'.join(
(con.symbol, suffix)
).lower()
# convert named tuples to dicts for transport
tbts = data.get('tickByTicks')
if tbts:
data['tickByTicks'] = [tbt._asdict() for tbt in tbts]
# add time stamps for downstream latency measurements
data['brokerd_ts'] = time.time()
# stupid stupid shit...don't even care any more..
# leave it until we do a proper latency study
# if ticker.rtTime is not None:
# data['broker_ts'] = data['rtTime_s'] = float(
# ticker.rtTime.timestamp) / 1000.
data.pop('rtTime')
return data
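# NOTE: (hypothetical) example of the fqsn suffixing above - a futes
# contract with ``symbol='MNQ'``, ``primaryExchange='GLOBEX'`` and
# ``lastTradeDateOrContractMonth='20220617'`` normalizes to the key
# 'mnq.globex.20220617'.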
async def stream_quotes(
send_chan: trio.abc.SendChannel,
symbols: list[str],
feed_is_live: trio.Event,
loglevel: str = None,
# startup sync
task_status: TaskStatus[tuple[dict, dict]] = trio.TASK_STATUS_IGNORED,
) -> None:
'''
Stream symbol quotes.
This is a ``trio`` callable routine meant to be invoked
once the brokerd is up.
'''
# TODO: support multiple subscriptions
sym = symbols[0]
log.info(f'request for real-time quotes: {sym}')
async with open_data_client() as proxy:
con, first_ticker, details = await proxy.get_sym_details(symbol=sym)
first_quote = normalize(first_ticker)
# print(f'first quote: {first_quote}')
def mk_init_msgs() -> dict[str, dict]:
'''
Collect a bunch of meta-data useful for feed startup and
pack in a `dict`-msg.
'''
# pass back some symbol info like min_tick, trading_hours, etc.
syminfo = asdict(details)
syminfo.update(syminfo['contract'])
# nested dataclass we probably don't need and that won't IPC
# serialize
syminfo.pop('secIdList')
# TODO: more consistent field translation
atype = syminfo['asset_type'] = asset_type_map[syminfo['secType']]
# for stocks it seems TWS reports too small a tick size
# such that you can't submit orders with that granularity?
min_tick = 0.01 if atype == 'stock' else 0
syminfo['price_tick_size'] = max(syminfo['minTick'], min_tick)
# for "traditional" assets, volume is normally discreet, not
# a float
syminfo['lot_tick_size'] = 0.0
ibclient = proxy._aio_ns.ib.client
host, port = ibclient.host, ibclient.port
# TODO: for loop through all symbols passed in
init_msgs = {
# pass back token, and bool, signalling if we're the writer
# and that history has been written
sym: {
'symbol_info': syminfo,
'fqsn': first_quote['fqsn'],
},
'status': {
'data_ep': f'{host}:{port}',
},
}
return init_msgs
init_msgs = mk_init_msgs()
# TODO: we should instead spawn a task that waits on a feed to start
# and let it wait indefinitely..instead of this hard coded stuff.
with trio.move_on_after(1):
contract, first_ticker, details = await proxy.get_quote(symbol=sym)
# it might be outside regular trading hours so see if we can at
# least grab history.
if isnan(first_ticker.last):
task_status.started((init_msgs, first_quote))
# it's not really live but this will unblock
# the brokerd feed task to tell the ui to update?
feed_is_live.set()
# block and let data history backfill code run.
await trio.sleep_forever()
return # we never expect feed to come up?
async with open_aio_quote_stream(
symbol=sym,
contract=con,
) as stream:
# ugh, clear ticks since we've consumed them
# (ahem, ib_insync is stateful trash)
first_ticker.ticks = []
task_status.started((init_msgs, first_quote))
async with aclosing(stream):
if type(first_ticker.contract) not in (
ibis.Commodity,
ibis.Forex
):
# wait for real volume on feed (trading might be closed)
while True:
ticker = await stream.receive()
# for a real volume contract we wait for the first
# "real" trade to take place
if (
# not calc_price
# and not ticker.rtTime
not ticker.rtTime
):
# spin consuming tickers until we get a real
# market datum
log.debug(f"New unsent ticker: {ticker}")
continue
else:
log.debug("Received first real volume tick")
# ugh, clear ticks since we've consumed them
# (ahem, ib_insync is truly stateful trash)
ticker.ticks = []
# XXX: this works because we don't use
# ``aclosing()`` above?
break
quote = normalize(ticker)
log.debug(f"First ticker received {quote}")
# tell caller quotes are now coming in live
feed_is_live.set()
# last = time.time()
async for ticker in stream:
quote = normalize(ticker)
await send_chan.send({quote['fqsn']: quote})
# ugh, clear ticks since we've consumed them
ticker.ticks = []
# last = time.time()
async def data_reset_hack(
reset_type: str = 'data',
) -> None:
'''
Run key combos for resetting data feeds and yield back to caller
when complete.
This is a linux-only hack around:
https://interactivebrokers.github.io/tws-api/historical_limitations.html#pacing_violations
TODOs:
- a return type that hopefully determines if the hack was
successful.
- other OS support?
- integration with ``ib-gw`` run in docker + Xorg?
'''
async def vnc_click_hack(
reset_type: str = 'data'
) -> None:
'''
Reset the data or network connection for the VNC attached
ib gateway using magic combos.
'''
key = {'data': 'f', 'connection': 'r'}[reset_type]
import asyncvnc
async with asyncvnc.connect(
'localhost',
port=3003,
# password='ibcansmbz',
) as client:
# move to middle of screen
# 640x1800
client.mouse.move(
x=500,
y=500,
)
client.mouse.click()
client.keyboard.press('Ctrl', 'Alt', key) # keys are stacked
await tractor.to_asyncio.run_task(vnc_click_hack)
# we don't really need the ``xdotool`` approach any more B)
return True
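# NOTE: usage sketch - invoked from the ``trio`` side on pacing
# violations (see ``get_bars()`` above); the VNC endpoint
# (localhost:3003) is an assumption from our ib-gw docker setup:
#
# await data_reset_hack(reset_type='data')        # ctrl-alt-f
# await data_reset_hack(reset_type='connection')  # ctrl-alt-r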
@tractor.context
async def open_symbol_search(
ctx: tractor.Context,
) -> None:
# TODO: load user defined symbol set locally for fast search?
await ctx.started({})
async with open_data_client() as proxy:
async with ctx.open_stream() as stream:
last = time.time()
async for pattern in stream:
log.debug(f'received {pattern}')
now = time.time()
assert pattern, 'IB can not accept blank search pattern'
# throttle search requests to no faster than 1Hz
diff = now - last
if diff < 1.0:
log.debug('throttle sleeping')
await trio.sleep(1.0 - diff)
try:
pattern = stream.receive_nowait()
except trio.WouldBlock:
pass
if not pattern or pattern.isspace():
log.warning('empty pattern received, skipping..')
# TODO: *BUG* if nothing is returned here the client
# side will cache a null set result and not show
# anything to the user on re-searches when this query
# timed out. We probably need a special "timeout" msg
# or something...
# XXX: this unblocks the far end search task which may
# hold up a multi-search nursery block
await stream.send({})
continue
log.debug(f'searching for {pattern}')
last = time.time()
# async batch search using api stocks endpoint and module
# defined adhoc symbol set.
stock_results = []
async def stash_results(target: Awaitable[list]):
stock_results.extend(await target)
async with trio.open_nursery() as sn:
sn.start_soon(
stash_results,
proxy.search_symbols(
pattern=pattern,
upto=5,
),
)
# trigger async request
await trio.sleep(0)
# match against our ad-hoc set immediately
adhoc_matches = fuzzy.extractBests(
pattern,
list(_adhoc_futes_set),
score_cutoff=90,
)
log.info(f'fuzzy matched adhocs: {adhoc_matches}')
adhoc_match_results = {}
if adhoc_matches:
# TODO: do we need to pull contract details?
adhoc_match_results = {i[0]: {} for i in adhoc_matches}
log.debug(f'fuzzy matching stocks {stock_results}')
stock_matches = fuzzy.extractBests(
pattern,
stock_results,
score_cutoff=50,
)
matches = adhoc_match_results | {
item[0]: {} for item in stock_matches
}
# TODO: we used to deliver contract details
# {item[2]: item[0] for item in stock_matches}
log.debug(f"sending matches: {matches.keys()}")
await stream.send(matches)
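# NOTE: hedged client-side sketch (names are illustrative, not the
# actual UI search code) - the context streams pattern -> matches
# request/response pairs over a ``tractor`` IPC stream:
#
# async with (
#     portal.open_context(open_symbol_search) as (ctx, _),
#     ctx.open_stream() as stream,
# ):
#     await stream.send('mnq')
#     matches = await stream.receive()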

File diff suppressed because it is too large

View File

@@ -1,19 +1,3 @@
-# piker: trading gear for hackers
-# Copyright (C) 2018-present Tyler Goodlet (in stewardship of piker0)
-# This program is free software: you can redistribute it and/or modify
-# it under the terms of the GNU Affero General Public License as published by
-# the Free Software Foundation, either version 3 of the License, or
-# (at your option) any later version.
-# This program is distributed in the hope that it will be useful,
-# but WITHOUT ANY WARRANTY; without even the implied warranty of
-# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
-# GNU Affero General Public License for more details.
-# You should have received a copy of the GNU Affero General Public License
-# along with this program. If not, see <https://www.gnu.org/licenses/>.
 """
 Questrade API backend.
 """
@@ -22,29 +6,19 @@ import inspect
 import time
 from datetime import datetime
 from functools import partial
-import itertools
 import configparser
-from pprint import pformat
-from typing import (
-    List, Tuple, Dict, Any, Iterator, NamedTuple,
-    AsyncGenerator,
-    Callable,
-)
-import pendulum
+from typing import List, Tuple, Dict, Any, Iterator, NamedTuple
 import trio
-import tractor
 from async_generator import asynccontextmanager
-import numpy as np
 import wrapt
 import asks
 from ..calc import humanize, percent_change
-from .._cacheables import open_cached_client, async_lifo_cache
-from .. import config
-from ._util import resproc, BrokerError, SymbolNotFound
-from ..log import get_logger, colorize_json, get_console_log
+from . import config
+from ._util import resproc, BrokerError
+from ..log import get_logger, colorize_json
+from .._async_utils import async_lifo_cache
 log = get_logger(__name__)
@@ -56,25 +30,6 @@ _version = 'v1'
 # it seems 4 rps is best we can do total
 _rate_limit = 4
-_time_frames = {
-    '1m': 'OneMinute',
-    '2m': 'TwoMinutes',
-    '3m': 'ThreeMinutes',
-    '4m': 'FourMinutes',
-    '5m': 'FiveMinutes',
-    '10m': 'TenMinutes',
-    '15m': 'FifteenMinutes',
-    '20m': 'TwentyMinutes',
-    '30m': 'HalfHour',
-    '1h': 'OneHour',
-    '2h': 'TwoHours',
-    '4h': 'FourHours',
-    'D': 'OneDay',
-    'W': 'OneWeek',
-    'M': 'OneMonth',
-    'Y': 'OneYear',
-}
 class QuestradeError(Exception):
     "Non-200 OK response code"
@@ -107,7 +62,7 @@ def refresh_token_on_err(tries=3):
         for i in range(1, tries):
             try:
                 try:
-                    client._request_not_in_progress = trio.Event()
+                    client._request_not_in_progress.clear()
                     return await wrapped(*args, **kwargs)
                 finally:
                     client._request_not_in_progress.set()
@@ -115,9 +70,9 @@ def refresh_token_on_err(tries=3):
                 if "Access token is invalid" not in str(qterr.args[0]):
                     raise
                 # TODO: this will crash when run from a sub-actor since
-                # STDIN can't be acquired (ONLY WITH MP). The right way
-                # to handle this is to make a request to the parent
-                # actor (i.e. spawner of this) to call this
+                # STDIN can't be acquired. The right way to handle this
+                # is to make a request to the parent actor (i.e.
+                # spawner of this) to call this
                 # `client.ensure_access()` locally thus blocking until
                 # the user provides an API key on the "client side"
                 log.warning(f"Tokens are invalid refreshing try {i}..")
@@ -213,18 +168,8 @@ class _API:
             quote['key'] = quote['symbol']
         return quotes
-    async def candles(
-        self,
-        symbol_id: str,
-        start: str,
-        end: str,
-        interval: str
-    ) -> List[Dict[str, float]]:
-        """Retrieve historical candles for provided date range.
-        """
-        return (await self._get(
-            f'markets/candles/{symbol_id}',
-            params={'startTime': start, 'endTime': end, 'interval': interval},
-        ))['candles']
+    async def candles(self, id: str, start: str, end, interval) -> dict:
+        return await self._get(f'markets/candles/{id}', params={})
     async def option_contracts(self, symbol_id: str) -> dict:
         "Retrieve all option contract API ids with expiry -> strike prices."
@@ -248,7 +193,7 @@ class _API:
             for (symbol, symbol_id, expiry), bystrike in contracts.items()
         ]
         resp = await self._sess.post(
-            path='/markets/quotes/options',
+            path=f'/markets/quotes/options',
             # XXX: b'{"code":1024,"message":"The size of the array requested
             # is not valid: optionIds"}'
             # ^ what I get when trying to use too many ids manually...
@@ -326,7 +271,7 @@ class Client:
         # token whereby their service can't handle concurrent requests
         # to different end points (particularly the auth ep) which
         # causes hangs and premature token invalidation issues.
-        self._has_access = trio.Event()
+        self._has_access.clear()
         try:
             # don't allow simultaneous token refresh requests
             async with self._mutex:
@@ -404,10 +349,7 @@ class Client:
         return data
-    async def tickers2ids(
-        self,
-        tickers: Iterator[str]
-    ) -> Dict[str, int]:
+    async def tickers2ids(self, tickers):
         """Helper routine that takes a sequence of ticker symbols and returns
         their corresponding QT numeric symbol ids.
@@ -420,7 +362,7 @@ class Client:
             if id is not None:
                 symbols2ids[symbol] = id
-        # still missing uncached values - hit the api server
+        # still missing uncached values - hit the server
        to_lookup = list(set(tickers) - set(symbols2ids))
        if to_lookup:
            data = await self.api.symbols(names=','.join(to_lookup))
@@ -430,10 +372,10 @@ class Client:
         return symbols2ids
-    async def symbol_info(self, symbols: List[str]):
-        """Return symbol data for ``symbols``.
+    async def symbol_data(self, tickers: List[str]):
+        """Return symbol data for ``tickers``.
         """
-        t2ids = await self.tickers2ids(symbols)
+        t2ids = await self.tickers2ids(tickers)
         ids = ','.join(t2ids.values())
         symbols = {}
         for pkt in (await self.api.symbols(ids=ids))['symbols']:
@@ -441,9 +383,6 @@ class Client:
         return symbols
-    # TODO: deprecate
-    symbol_data = symbol_info
     async def quote(self, tickers: [str]):
         """Return stock quotes for each ticker in ``tickers``.
         """
@@ -572,114 +511,6 @@ class Client:
        return quotes
async def bars(
self,
symbol: str,
# EST in ISO 8601 format is required... below is EPOCH
start_date: str = "1970-01-01T00:00:00.000000-05:00",
time_frame: str = '1m',
count: float = 20e3,
is_paid_feed: bool = False,
) -> List[Dict[str, Any]]:
"""Retreive OHLCV bars for a symbol over a range to the present.
.. note::
The candles endpoint only allows "2000" points per query
however tests here show that it is 20k candles per query.
"""
# fix case
if symbol.islower():
symbol = symbol.swapcase()
sids = await self.tickers2ids([symbol])
if not sids:
raise SymbolNotFound(symbol)
sid = sids[symbol]
# get last market open end time
est_end = now = pendulum.now('UTC').in_timezone(
'America/New_York').start_of('minute')
# on non-paid feeds we can't retrieve the first 15 mins
wd = now.isoweekday()
if wd > 5:
quotes = await self.quote([symbol])
est_end = pendulum.parse(
quotes[0]['lastTradeTime']
)
if est_end.hour == 0:
# XXX don't bother figuring out extended hours for now
est_end = est_end.replace(hour=17)
if not is_paid_feed:
est_end = est_end.shift(minutes=-15)
est_start = est_end.shift(minutes=-count)
start = time.time()
bars = await self.api.candles(
sid,
start=est_start.isoformat(),
end=est_end.isoformat(),
interval=_time_frames[time_frame],
)
log.debug(
f"Took {time.time() - start} seconds to retreive {len(bars)} bars")
return bars
async def search_symbols(
self,
pattern: str,
# how many contracts to return
upto: int = 10,
) -> Dict[str, str]:
details = {}
results = await self.api.search(prefix=pattern)
for result in results['symbols']:
sym = result['symbol']
if '.' not in sym:
sym = f"{sym}.{result['listingExchange']}"
details[sym] = result
if len(details) == upto:
return details
# marketstore TSD compatible numpy dtype for bar
_qt_bars_dt = [
('Epoch', 'i8'),
# ('start', 'S40'),
# ('end', 'S40'),
('low', 'f4'),
('high', 'f4'),
('open', 'f4'),
('close', 'f4'),
('volume', 'i8'),
# ('VWAP', 'f4')
]
def get_OHLCV(
bar: Dict[str, Any]
) -> Tuple[str, Any]:
"""Return a marketstore key-compatible OHCLV dictionary.
"""
del bar['end']
del bar['VWAP']
bar['start'] = pendulum.from_timestamp(bar['start']) / 10**9
return tuple(bar.values())
def bars_to_marketstore_structarray(
bars: List[Dict[str, Any]]
) -> np.array:
"""Return marketstore writeable recarray from sequence of bars
retrieved via the ``candles`` endpoint.
"""
return np.array(list(map(get_OHLCV, bars)), dtype=_qt_bars_dt)
async def token_refresher(client):
"""Continually refresh the ``access_token`` near its expiry time.
@@ -718,7 +549,7 @@ def get_config(
     has_token = section.get('refresh_token') if section else False
     if force_from_user or ask_user_on_failure and not (section or has_token):
-        log.warn("Forcing manual token auth from user")
+        log.warn(f"Forcing manual token auth from user")
         _token_from_user(conf)
     else:
         if not section:
@@ -803,7 +634,7 @@ async def option_quoter(client: Client, tickers: List[str]):
     if isinstance(tickers[0], tuple):
         datetime.fromisoformat(tickers[0][1])
     else:
-        raise ValueError('Option subscription format is (symbol, expiry)')
+        raise ValueError(f'Option subscription format is (symbol, expiry)')
 @async_lifo_cache(maxsize=128)
 async def get_contract_by_date(
@@ -848,7 +679,7 @@ _qt_stock_keys = {
     'VWAP': ('VWAP', partial(round, ndigits=3)),
     'MC': ('MC', humanize),
     '$ vol': ('$ vol', humanize),
-    'volume': ('volume', humanize),
+    'volume': ('vol', humanize),
     # 'close': 'close',
     # 'openPrice': 'open',
     'lowPrice': 'low',
@@ -856,9 +687,8 @@ _qt_stock_keys = {
     # 'low52w': 'low52w',  # put in info widget
     # 'high52w': 'high52w',
     # "lastTradePriceTrHrs": 7.99,
-    # 'lastTradeTime': ('fill_time', datetime.fromisoformat),
-    'lastTradeTime': 'fill_time',
-    "lastTradeTick": 'tick',  # ("Equal", "Up", "Down")
+    # 'lastTradeTime': ('time', datetime.fromisoformat),
+    # "lastTradeTick": "Equal",
     # "symbolId": 3575753,
     # "tier": "",
     # 'isHalted': 'halted',  # as subscript 'h'
@@ -866,12 +696,12 @@ _qt_stock_keys = {
 }
 # BidAskLayout columns which will contain three cells the first stacked on top
-# of the other 2 (this is a UI layout instruction)
+# of the other 2
 _stock_bidasks = {
     'last': ['bid', 'ask'],
     'size': ['bsize', 'asize'],
     'VWAP': ['low', 'high'],
-    'volume': ['MC', '$ vol'],
+    'vol': ['MC', '$ vol'],
 }
@@ -888,43 +718,29 @@ def format_stock_quote(
     and the second is the same but with all values converted to a
     "display-friendly" string format.
     """
+    last = quote['lastTradePrice']
     symbol = quote['symbol']
     previous = symbol_data[symbol]['prevDayClosePrice']
-    computed = {'symbol': symbol}
-    last = quote.get('lastTradePrice')
-    if last:
-        change = percent_change(previous, last)
+    change = percent_change(previous, last)
     share_count = symbol_data[symbol].get('outstandingShares', None)
     mktcap = share_count * last if (last and share_count) else 0
-    computed.update({
-        # 'symbol': quote['symbol'],
+    computed = {
+        'symbol': quote['symbol'],
         '%': round(change, 3),
         'MC': mktcap,
-        # why questrade do you have to be shipping null values!!!
-        # '$ vol': round((quote['VWAP'] or 0) * (quote['volume'] or 0), 3),
+        # why QT do you have to be an asshole shipping null values!!!
+        '$ vol': round((quote['VWAP'] or 0) * (quote['volume'] or 0), 3),
         'close': previous,
-    })
+    }
-    vwap = quote.get('VWAP')
-    volume = quote.get('volume')
-    if volume is not None:  # could be 0
-        # why questrade do you have to be an asshole shipping null values!!!
-        computed['$ vol'] = round((vwap or 0) * (volume or 0), 3)
     new = {}
     displayable = {}
-    for key, value in itertools.chain(quote.items(), computed.items()):
-        new_key = keymap.get(key)
-        if not new_key:
-            continue
+    for key, new_key in keymap.items():
+        display_value = value = computed.get(key) or quote.get(key)
         # API servers can return `None` vals when markets are closed (weekend)
         value = 0 if value is None else value
-        display_value = value
         # convert values to a displayable format using available formatting func
         if isinstance(new_key, tuple):
             new_key, func = new_key
@@ -933,7 +749,6 @@ def format_stock_quote(
         new[new_key] = value
         displayable[new_key] = display_value
-    new['displayable'] = displayable
     return new, displayable
@@ -955,8 +770,7 @@ _qt_option_keys = {
     # "theta": ('theta', partial(round, ndigits=3)),
     # "vega": ('vega', partial(round, ndigits=3)),
     '$ vol': ('$ vol', humanize),
-    # XXX: required key to trigger trade execution datum msg
-    'volume': ('volume', humanize),
+    'volume': ('vol', humanize),
     # "2021-01-15T00:00:00.000000-05:00",
     # "isHalted": false,
     # "key": [
@@ -994,7 +808,6 @@ def format_option_quote(
     quote: dict,
     symbol_data: dict,
     keymap: dict = _qt_option_keys,
-    include_displayables: bool = True,
 ) -> Tuple[dict, dict]:
     """Remap a list of quote dicts ``quotes`` using the mapping of old keys
     -> new keys ``keymap`` returning 2 dicts: one with raw data and the other
@@ -1005,25 +818,19 @@ def format_option_quote(
     "display-friendly" string format.
     """
     # TODO: need historical data..
-    # (cause why would questrade keep their quote structure consistent across
+    # (cause why would QT keep their quote structure consistent across
     # assets..)
     # previous = symbol_data[symbol]['prevDayClosePrice']
     # change = percent_change(previous, last)
     computed = {
         # why QT do you have to be an asshole shipping null values!!!
-        # '$ vol': round((quote['VWAP'] or 0) * (quote['volume'] or 0), 3),
+        '$ vol': round((quote['VWAP'] or 0) * (quote['volume'] or 0), 3),
         # '%': round(change, 3),
         # 'close': previous,
     }
     new = {}
     displayable = {}
-    vwap = quote.get('VWAP')
-    volume = quote.get('volume')
-    if volume is not None:  # could be 0
-        # why questrade do you have to be an asshole shipping null values!!!
-        computed['$ vol'] = round((vwap or 0) * (volume or 0), 3)
     # structuring and normalization
     for key, new_key in keymap.items():
         display_value = value = computed.get(key) or quote.get(key)
@@ -1040,206 +847,3 @@ def format_option_quote(
        displayable[new_key] = display_value
        return new, displayable
async def smoke_quote(
get_quotes,
tickers
):
"""Do an initial "smoke" request for symbols in ``tickers`` filtering
out any symbols not supported by the broker queried in the call to
``get_quotes()``.
"""
from operator import itemgetter
# TODO: trim out with #37
#################################################
# get a single quote filtering out any bad tickers
# NOTE: this code is always run for every new client
# subscription even when a broker quoter task is already running
# since the new client needs to know what symbols are accepted
log.warn(f"Retrieving smoke quote for symbols {tickers}")
quotes = await get_quotes(tickers)
# report any tickers that aren't returned in the first quote
invalid_tickers = set(tickers) - set(map(itemgetter('key'), quotes))
for symbol in invalid_tickers:
tickers.remove(symbol)
log.warn(
f"Symbol `{symbol}` not found") # by broker `{broker}`"
# )
# pop any tickers that return "empty" quotes
payload = {}
for quote in quotes:
symbol = quote['symbol']
if quote is None:
log.warn(
f"Symbol `{symbol}` not found")
# XXX: note this mutates the input list (for now)
tickers.remove(symbol)
continue
# report any unknown/invalid symbols (QT specific)
if quote.get('low52w', False) is None:
log.error(
f"{symbol} seems to be defunct")
quote['symbol'] = symbol
payload[symbol] = quote
return payload
# end of section to be trimmed out with #37
###########################################
# unbounded, shared between streaming tasks
_symbol_info_cache = {}
# function to format packets delivered to subscribers
def packetizer(
topic: str,
quotes: Dict[str, Any],
) -> Dict[str, Any]:
"""Normalize quotes by name into dicts using broker-specific
processing.
"""
# repack into symbol keyed dict
return {q['symbol']: q for q in quotes}
def normalize(
quotes: Dict[str, Any],
_cache: Dict[str, Any], # dict held in scope of the streaming loop
formatter: Callable,
) -> Dict[str, Any]:
"""Deliver normalized quotes by name into dicts using
broker-specific processing; only emit changes differing from the
last quote sample creating a pseudo-tick type datum.
"""
new = {}
# XXX: this is effectively emitting "sampled ticks"
# useful for polling setups but obviously should be
# disabled if you're already rx-ing per-tick data.
for quote in quotes:
symbol = quote['symbol']
# look up last quote from cache
last = _cache.setdefault(symbol, {})
_cache[symbol] = quote
# compute volume difference
last_volume = last.get('volume', 0)
current_volume = quote['volume']
volume_diff = current_volume - last_volume
# find all keys that have match to a new value compared
# to the last quote received
changed = set(quote.items()) - set(last.items())
if changed:
log.info(f"New quote {symbol}:\n{changed}")
# TODO: can we reduce the # of iterations here and in
# called funcs?
payload = {k: quote[k] for k, v in changed}
payload['symbol'] = symbol # required by formatter
# TODO: we should probaby do the "computed" fields
# processing found inside this func in a downstream actor?
fquote, _ = formatter(payload, _symbol_info_cache)
fquote['key'] = fquote['symbol'] = symbol
# if there was volume likely the last size of
# shares traded is useful info and it's possible
# that the set difference from above will disregard
# a "size" value since the same # of shares were traded
# volume = payload.get('volume')
if volume_diff:
if volume_diff < 0:
log.error(f"Uhhh {symbol} volume: {volume_diff} ?")
fquote['volume_delta'] = volume_diff
# TODO: We can emit 2 ticks here:
# - one for the volume differential
# - one for the last known trade size
# The first in theory can be unwound and
# interpolated assuming the broker passes an
# accurate daily VWAP value.
# To make this work we need a universal ``size``
# field that is normalized before hitting this logic.
fquote['size'] = quote.get('lastTradeSize', 0)
if 'last' not in fquote:
fquote['last'] = quote.get('lastTradePrice', float('nan'))
new[symbol] = fquote
if new:
log.info(f"New quotes:\n{pformat(new)}")
return new
# TODO: currently this backend uses entirely different
# data feed machinery that was written earlier than the
# existing stuff used in other backends. This needs to
# be ported eventually and should *just work* despite
# being a multi-symbol, poll-style feed system.
@tractor.stream
async def stream_quotes(
ctx: tractor.Context, # marks this as a streaming func
symbols: List[str],
feed_type: str = 'stock',
rate: int = 3,
loglevel: str = None,
# feed_type: str = 'stock',
) -> AsyncGenerator[str, Dict[str, Any]]:
# XXX: required to propagate ``tractor`` loglevel to piker logging
get_console_log(loglevel)
async with open_cached_client('questrade') as client:
if feed_type == 'stock':
formatter = format_stock_quote
get_quotes = await stock_quoter(client, symbols)
# do a smoke quote (note this mutates the input list and filters
# out bad symbols for now)
first_quotes = await smoke_quote(get_quotes, list(symbols))
else:
formatter = format_option_quote
get_quotes = await option_quoter(client, symbols)
# packetize
first_quotes = {
quote['symbol']: quote
for quote in await get_quotes(symbols)
}
# update global symbol data state
sd = await client.symbol_info(symbols)
_symbol_info_cache.update(sd)
# pre-process first set of quotes
payload = {}
for sym, quote in first_quotes.items():
fquote, _ = formatter(quote, sd)
payload[sym] = fquote
# push initial smoke quote response for client initialization
await ctx.send_yield(payload)
from .data import stream_poll_requests
await stream_poll_requests(
# ``msg.pub`` required kwargs
task_name=feed_type,
ctx=ctx,
topics=symbols,
packetizer=packetizer,
# actual target "streaming func" args
get_quotes=get_quotes,
normalizer=partial(normalize, formatter=formatter),
rate=rate,
)
log.info("Terminating stream quoter task")

View File

@@ -1,19 +1,3 @@
-# piker: trading gear for hackers
-# Copyright (C) 2018-present Tyler Goodlet (in stewardship of piker0)
-# This program is free software: you can redistribute it and/or modify
-# it under the terms of the GNU Affero General Public License as published by
-# the Free Software Foundation, either version 3 of the License, or
-# (at your option) any later version.
-# This program is distributed in the hope that it will be useful,
-# but WITHOUT ANY WARRANTY; without even the implied warranty of
-# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
-# GNU Affero General Public License for more details.
-# You should have received a copy of the GNU Affero General Public License
-# along with this program. If not, see <https://www.gnu.org/licenses/>.
 """
 Robinhood API backend.
 """

View File

@@ -1,103 +1,33 @@
-# piker: trading gear for hackers
-# Copyright (C) 2018-present Tyler Goodlet (in stewardship of piker0)
-# This program is free software: you can redistribute it and/or modify
-# it under the terms of the GNU Affero General Public License as published by
-# the Free Software Foundation, either version 3 of the License, or
-# (at your option) any later version.
-# This program is distributed in the hope that it will be useful,
-# but WITHOUT ANY WARRANTY; without even the implied warranty of
-# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
-# GNU Affero General Public License for more details.
-# You should have received a copy of the GNU Affero General Public License
-# along with this program. If not, see <https://www.gnu.org/licenses/>.
 """
 Handy financial calculations.
 """
 import math
 import itertools
-from bidict import bidict
-_mag2suffix = bidict({3: 'k', 6: 'M', 9: 'B'})
-def humanize(
-    number: float,
-    digits: int = 1
-) -> str:
-    '''
-    Convert large numbers to something with at most ``digits`` and
-    a letter suffix (eg. k: thousand, M: million, B: billion).
-    '''
+def humanize(number, digits=1):
+    """Convert large numbers to something with at most 3 digits and
+    a letter suffix (eg. k: thousand, M: million, B: billion).
+    """
     try:
         float(number)
     except ValueError:
-        return '0'
+        return 0
     if not number or number <= 0:
-        return str(round(number, ndigits=digits))
+        return number
+    mag2suffix = {3: 'k', 6: 'M', 9: 'B'}
-    mag = round(math.log(number, 10))
+    mag = math.floor(math.log(number, 10))
     if mag < 3:
-        return str(round(number, ndigits=digits))
+        return number
+    maxmag = max(itertools.takewhile(lambda key: mag >= key, mag2suffix))
-    maxmag = max(
-        itertools.takewhile(
-            lambda key: mag >= key, _mag2suffix
-        )
-    )
-    return "{value}{suffix}".format(
-        value=round(number/10**maxmag, ndigits=digits),
-        suffix=_mag2suffix[maxmag],
-    )
+    return "{:.{digits}f}{}".format(
+        number/10**maxmag, mag2suffix[maxmag], digits=digits)
-def puterize(
-    text: str,
-    digits: int = 1,
-) -> float:
-    '''Inverse of ``humanize()`` above.
-    '''
-    try:
-        suffix = str(text)[-1]
-        mult = _mag2suffix.inverse[suffix]
-        value = text.rstrip(suffix)
-        return round(float(value) * 10**mult, ndigits=digits)
-    except KeyError:
-        # no matching suffix try just the value
-        return float(text)
-def pnl(
-    init: float,
-    new: float,
-) -> float:
-    '''Calculate the percentage change of some ``new`` value
-    from some initial value, ``init``.
-    '''
+def percent_change(init, new):
+    """Calculate the percentage change of some ``new`` value
+    from some initial value, ``init``.
+    """
     if not (init and new):
         return 0
-    return (new - init) / init
+    return (new - init) / init * 100.
-def percent_change(
-    init: float,
-    new: float,
-) -> float:
-    return pnl(init, new) * 100.

View File

@@ -1,20 +0,0 @@
# piker: trading gear for hackers
# Copyright (C) Tyler Goodlet (in stewardship for piker0)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""
Market machinery for order executions, book, management.
"""

View File

@@ -1,359 +0,0 @@
# piker: trading gear for hackers
# Copyright (C) Tyler Goodlet (in stewardship for piker0)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
'''
Position allocation logic and protocols.
'''
from enum import Enum
from typing import Optional
from bidict import bidict
from pydantic import BaseModel, validator
from ..data._source import Symbol
from ._messages import BrokerdPosition, Status
class Position(BaseModel):
'''
Basic pp (personal position) model with attached fills history.
This type should be IPC wire ready?
'''
symbol: Symbol
# last size and avg entry price
size: float
avg_price: float # TODO: contextual pricing
# ordered record of known constituent trade messages
fills: list[Status] = []
def update_from_msg(
self,
msg: BrokerdPosition,
) -> None:
# XXX: better place to do this?
symbol = self.symbol
lot_size_digits = symbol.lot_size_digits
avg_price, size = (
round(msg['avg_price'], ndigits=symbol.tick_size_digits),
round(msg['size'], ndigits=lot_size_digits),
)
self.avg_price = avg_price
self.size = size
@property
def dsize(self) -> float:
'''
The "dollar" size of the pp, normally in trading (fiat) unit
terms.
'''
return self.avg_price * self.size
_size_units = bidict({
'currency': '$ size',
'units': '# units',
# TODO: but we'll need a `<brokermod>.get_accounts()` or something
# 'percent_of_port': '% of port',
})
SizeUnit = Enum(
'SizeUnit',
_size_units,
)
class Allocator(BaseModel):
class Config:
validate_assignment = True
copy_on_model_validation = False
arbitrary_types_allowed = True
# required to get the account validator lookup working?
extra = 'allow'
underscore_attrs_are_private = False
symbol: Symbol
account: Optional[str] = 'paper'
# TODO: for enums this clearly doesn't fucking work, you can't set
# a default at startup by passing in a `dict` but yet you can set
# that value through assignment..for wtv cucked reason.. honestly, pure
# unintuitive garbage.
size_unit: str = 'currency'
_size_units: dict[str, Optional[str]] = _size_units
@validator('size_unit', pre=True)
def maybe_lookup_key(cls, v):
# apply the corresponding enum key for the text "description" value
if v not in _size_units:
return _size_units.inverse[v]
assert v in _size_units
return v
# TODO: if we ever want to support non-uniform entry-slot-proportion
# "sizes"
# disti_weight: str = 'uniform'
units_limit: float
currency_limit: float
slots: int
def step_sizes(
self,
) -> (float, float):
'''
Return the units size for each unit type as a tuple.
'''
slots = self.slots
return (
self.units_limit / slots,
self.currency_limit / slots,
)
def limit(self) -> float:
if self.size_unit == 'currency':
return self.currency_limit
else:
return self.units_limit
def next_order_info(
self,
# we only need a startup size for exit calcs; we can then
# determine how large slots should be if the initial pp size was
# larger than the current live one, and the live one is smaller
# than the initial config settings.
startup_pp: Position,
live_pp: Position,
price: float,
action: str,
) -> dict:
'''
Generate order request info for the "next" submittable order
depending on position / order entry config.
'''
sym = self.symbol
ld = sym.lot_size_digits
size_unit = self.size_unit
live_size = live_pp.size
abs_live_size = abs(live_size)
abs_startup_size = abs(startup_pp.size)
u_per_slot, currency_per_slot = self.step_sizes()
if size_unit == 'units':
slot_size = u_per_slot
l_sub_pp = self.units_limit - abs_live_size
elif size_unit == 'currency':
live_cost_basis = abs_live_size * live_pp.avg_price
slot_size = currency_per_slot / price
l_sub_pp = (self.currency_limit - live_cost_basis) / price
else:
raise ValueError(
f"Not valid size unit '{size_unit}'"
)
# an entry (adding-to or starting a pp)
if (
action == 'buy' and live_size > 0 or
action == 'sell' and live_size < 0 or
live_size == 0
):
order_size = min(slot_size, l_sub_pp)
# an exit (removing-from or going to net-zero pp)
else:
# when exiting a pp we always try to slot the position
# in the instrument's units, since doing so in a derived
# size measure (eg. currency value, percent of port) would
# result in a mis-mapping of slots sizes in unit terms
# (i.e. it would take *more* slots to exit at a profit and
# *less* slots to exit at a loss).
pp_size = max(abs_startup_size, abs_live_size)
slotted_pp = pp_size / self.slots
if size_unit == 'currency':
# compute the "projected" limit's worth of units at the
# current pp (weighted) price:
slot_size = currency_per_slot / live_pp.avg_price
else:
slot_size = u_per_slot
# TODO: ensure that the limit can never be set **lower**
# than the current pp size? It should be configured
# correctly at startup right?
# if our position is greater than our limit setting
# we'll want to use slot sizes which are larger then what
# the limit would normally determine.
order_size = max(slotted_pp, slot_size)
if (
abs_live_size < slot_size or
# NOTE: front/back "loading" heurstic:
# if the remaining pp is in between 0-1.5x a slot's
# worth, dump the whole position in this last exit
# therefore conducting so called "back loading" but
# **without** going past a net-zero pp. if the pp is
# > 1.5x a slot size, then front load: exit a slot's and
# expect net-zero to be acquired on the final exit.
slot_size < pp_size < round((1.5*slot_size), ndigits=ld) or
# underlying requires discrete (int) units (eg. stocks)
# and thus our slot size (based on our limit) would
# exit a fractional unit's worth so, presuming we aren't
# supporting a fractional-units-style broker, we need to
# exit the final unit.
ld == 0 and abs_live_size == 1
):
order_size = abs_live_size
slots_used = 1.0 # the default uniform policy
if order_size < slot_size:
# compute a fractional slots size to display
slots_used = self.slots_used(
Position(symbol=sym, size=order_size, avg_price=price)
)
return {
'size': abs(round(order_size, ndigits=ld)),
'size_digits': ld,
# TODO: incorporate multipliers for relevant derivatives
'fiat_size': round(order_size * price, ndigits=2),
'slots_used': slots_used,
# update line LHS label with account name
'account': self.account,
}
def slots_used(
self,
pp: Position,
) -> float:
'''
Calc and return the number of slots used by this ``Position``.
'''
abs_pp_size = abs(pp.size)
if self.size_unit == 'currency':
# live_currency_size = size or (abs_pp_size * pp.avg_price)
live_currency_size = abs_pp_size * pp.avg_price
prop = live_currency_size / self.currency_limit
else:
# return (size or abs_pp_size) / alloc.units_limit
prop = abs_pp_size / self.units_limit
# TODO: REALLY need a way to show partial slots..
# for now we round at the midway point between slots
return round(prop * self.slots)
_derivs = (
'future',
'continuous_future',
'option',
'futures_option',
)
def mk_allocator(
symbol: Symbol,
startup_pp: Position,
# default allocation settings
defaults: dict[str, float] = {
'account': None, # select paper by default
'size_unit': 'currency',
'units_limit': 400,
'currency_limit': 5e3,
'slots': 4,
},
**kwargs,
) -> Allocator:
if kwargs:
defaults.update(kwargs)
# load and retrieve user settings for default allocations
# ``config.toml``
user_def = {
'currency_limit': 6e3,
'slots': 6,
}
defaults.update(user_def)
alloc = Allocator(
symbol=symbol,
**defaults,
)
asset_type = symbol.type_key
# specific configs by asset class / type
if asset_type in _derivs:
# since it's harder to know how currency "applies" in this case
# given leverage properties
alloc.size_unit = '# units'
# set the units limit to the slots size, thus making the next
# entry step 1.0
alloc.units_limit = alloc.slots
# if the current position is already greater than the limit
# settings, increase the limit to the current position
if alloc.size_unit == 'currency':
startup_size = startup_pp.size * startup_pp.avg_price
if startup_size > alloc.currency_limit:
alloc.currency_limit = round(startup_size, ndigits=2)
else:
startup_size = abs(startup_pp.size)
if startup_size > alloc.units_limit:
alloc.units_limit = startup_size
if asset_type in _derivs:
alloc.slots = alloc.units_limit
return alloc

View File

@@ -1,228 +0,0 @@
# piker: trading gear for hackers
# Copyright (C) Tyler Goodlet (in stewardship for piker0)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""
Orders and execution client API.
"""
from contextlib import asynccontextmanager as acm
from typing import Dict
from pprint import pformat
from dataclasses import dataclass, field
import trio
import tractor
from tractor.trionics import broadcast_receiver
from ..log import get_logger
from ._ems import _emsd_main
from .._daemon import maybe_open_emsd
from ._messages import Order, Cancel
log = get_logger(__name__)
@dataclass
class OrderBook:
'''EMS-client-side order book ctl and tracking.
A style similar to "model-view" is used here where this api is
provided as a supervised control for an EMS actor which does all the
hard/fast work of talking to brokers/exchanges to conduct
executions.
Currently, this is mostly for keeping local state to match the EMS
and use received events to trigger graphics updates.
'''
# mem channels used to relay order requests to the EMS daemon
_to_ems: trio.abc.SendChannel
_from_order_book: trio.abc.ReceiveChannel
_sent_orders: Dict[str, Order] = field(default_factory=dict)
_ready_to_receive: trio.Event = trio.Event()
def send(
self,
msg: Order,
) -> dict:
self._sent_orders[msg.oid] = msg
self._to_ems.send_nowait(msg.dict())
return msg
def update(
self,
uuid: str,
**data: dict,
) -> dict:
cmd = self._sent_orders[uuid]
msg = cmd.dict()
msg.update(data)
self._sent_orders[uuid] = Order(**msg)
self._to_ems.send_nowait(msg)
return cmd
def cancel(self, uuid: str) -> bool:
"""Cancel an order (or alert) in the EMS.
"""
cmd = self._sent_orders[uuid]
msg = Cancel(
oid=uuid,
symbol=cmd.symbol,
)
self._to_ems.send_nowait(msg.dict())
_orders: OrderBook = None
def get_orders(
emsd_uid: tuple[str, str] = None
) -> OrderBook:
""""
OrderBook singleton factory per actor.
"""
if emsd_uid is not None:
# TODO: read in target emsd's active book on startup
pass
global _orders
if _orders is None:
size = 100
tx, rx = trio.open_memory_channel(size)
brx = broadcast_receiver(rx, size)
# setup local ui event streaming channels for request/resp
# streaming with the EMS daemon
_orders = OrderBook(
_to_ems=tx,
_from_order_book=brx,
)
return _orders
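# hypothetical usage sketch for the singleton + request api above; the
# ``Order`` fields follow ``_messages.py`` and the fqsn is made up
from uuid import uuid4

def example_submit_and_cancel() -> None:
    book = get_orders()
    assert book is get_orders()  # per-actor singleton
    oid = str(uuid4())
    book.send(Order(
        action='buy',
        oid=oid,
        symbol='xbtusd.kraken',
        account='paper',
        price=50_000.0,
        size=0.1,
        brokers=['kraken'],
        exec_mode='dark',
    ))
    # queues a ``Cancel`` msg; an EMS must be consuming the channel
    # for anything to actually happen
    book.cancel(oid)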
# TODO: we can get rid of this relay loop once we move
# order_mode inputs to async code!
async def relay_order_cmds_from_sync_code(
symbol_key: str,
to_ems_stream: tractor.MsgStream,
) -> None:
"""
Order streaming task: deliver orders transmitted from UI
to downstream consumers.
This is run in the UI actor (usually the one running Qt but could be
any other client service code). This process simply delivers order
messages to the above ``_to_ems`` send channel (from sync code using
``.send_nowait()``); these values are pulled from the channel here
and relayed to any consumer(s) that called this function using
a ``tractor`` portal.
This effectively makes order messages look like they're being
"pushed" from the parent to the EMS where local sync code is likely
doing the pushing from some UI.
"""
book = get_orders()
async with book._from_order_book.subscribe() as orders_stream:
async for cmd in orders_stream:
if cmd['symbol'] == symbol_key:
log.info(f'Send order cmd:\n{pformat(cmd)}')
# send msg over IPC / wire
await to_ems_stream.send(cmd)
@acm
async def open_ems(
fqsn: str,
) -> (
OrderBook,
tractor.MsgStream,
dict,
):
'''
Spawn an EMS daemon and begin sending orders and receiving
alerts.
This EMS tries to reduce most brokers' terrible order entry apis to
a very simple protocol built on a few easy to grok and/or
"rantsy" premises:
- most users will prefer "dark mode" where orders are not submitted
to a broker until an execution condition is triggered
(aka client-side "hidden orders")
- Brokers over-complicate their apis and generally speaking hire
poor designers to create them. We're better off creating a super
minimal, schema-simple, request-event-stream protocol to unify all the
existing piles of shit (and shocker, it'll probably just end up
looking like a decent crypto exchange's api)
- all order types can be implemented with client-side limit orders
- we aren't reinventing a wheel in this case since none of these
brokers are exposing the FIX protocol; it is they who are doing the re-inventing.
TODO: make some fancy diagrams using mermaid.io
the possible set of responses from the stream is currently:
- 'dark_submitted', 'broker_submitted'
- 'dark_cancelled', 'broker_cancelled'
- 'dark_executed', 'broker_executed'
- 'broker_filled'
'''
# wait for service to connect back to us signalling
# ready for order commands
book = get_orders()
from ..data._source import unpack_fqsn
broker, symbol, suffix = unpack_fqsn(fqsn)
async with maybe_open_emsd(broker) as portal:
async with (
# connect to emsd
portal.open_context(
_emsd_main,
fqsn=fqsn,
) as (ctx, (positions, accounts)),
# open 2-way trade command stream
ctx.open_stream() as trades_stream,
):
async with trio.open_nursery() as n:
n.start_soon(
relay_order_cmds_from_sync_code,
fqsn,
trades_stream
)
yield book, trades_stream, positions, accounts
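# hedged consumer sketch for ``open_ems()``; the fqsn and the msg
# handling here are illustrative only, not a prescribed pattern
async def example_consume_clearing_events() -> None:
    async with open_ems('xbtusd.kraken') as (
        book, trades_stream, positions, accounts,
    ):
        async for msg in trades_stream:
            if msg.get('resp') in ('dark_executed', 'broker_filled'):
                log.info(f'clearing event: {msg}')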

File diff suppressed because it is too large Load Diff

View File

@ -1,263 +0,0 @@
# piker: trading gear for hackers
# Copyright (C) Tyler Goodlet (in stewardship for piker0)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""
Clearing system messaging types and protocols.
"""
from typing import Optional, Union
# TODO: try out just encoding/send direction for now?
# import msgspec
from pydantic import BaseModel
from ..data._source import Symbol
# Client -> emsd
class Cancel(BaseModel):
'''Cancel msg for removing a dark (ems triggered) or
broker-submitted (live) trigger/order.
'''
action: str = 'cancel'
oid: str # uuid4
symbol: str
class Order(BaseModel):
action: str # {'buy', 'sell', 'alert'}
# internal ``emsd`` unique "order id"
oid: str # uuid4
symbol: Union[str, Symbol]
account: str # should we set a default as '' ?
price: float
size: float
brokers: list[str]
# Assigned once initial ack is received
# ack_time_ns: Optional[int] = None
# determines whether the create execution
# will be submitted to the ems or directly to
# the backend broker
exec_mode: str # {'dark', 'live', 'paper'}
class Config:
# just for pre-loading a ``Symbol`` when used
# in the order mode staging process
arbitrary_types_allowed = True
# don't copy this model instance when used in
# a recursive model
copy_on_model_validation = False
# Client <- emsd
# update msgs from ems which relay state change info
# from the active clearing engine.
class Status(BaseModel):
name: str = 'status'
oid: str # uuid4
time_ns: int
# {
# 'dark_submitted',
# 'dark_cancelled',
# 'dark_triggered',
# 'broker_submitted',
# 'broker_cancelled',
# 'broker_executed',
# 'broker_filled',
# 'broker_errored',
# 'alert_submitted',
# 'alert_triggered',
# }
resp: str # "response", see above
# symbol: str
# trigger info
trigger_price: Optional[float] = None
# price: float
# broker: Optional[str] = None
# this maps normally to the ``BrokerdOrder.reqid`` below, an id
# normally allocated internally by the backend broker routing system
broker_reqid: Optional[Union[int, str]] = None
# for relaying backend msg data "through" the ems layer
brokerd_msg: dict = {}
# emsd -> brokerd
# requests *sent* from ems to respective backend broker daemon
class BrokerdCancel(BaseModel):
action: str = 'cancel'
oid: str # piker emsd order id
time_ns: int
account: str
# "broker request id": broker specific/internal order id if this is
# None, creates a new order otherwise if the id is valid the backend
# api must modify the existing matching order. If the broker allows
# for setting a unique order id then this value will be relayed back
# on the emsd order request stream as the ``BrokerdOrderAck.reqid``
# field
reqid: Optional[Union[int, str]] = None
class BrokerdOrder(BaseModel):
action: str # {buy, sell}
oid: str
account: str
time_ns: int
# "broker request id": broker specific/internal order id if this is
# None, creates a new order otherwise if the id is valid the backend
# api must modify the existing matching order. If the broker allows
# for setting a unique order id then this value will be relayed back
# on the emsd order request stream as the ``BrokerdOrderAck.reqid``
# field
reqid: Optional[Union[int, str]] = None
symbol: str # symbol.<providername> ?
price: float
size: float
# emsd <- brokerd
# requests *received* to ems from broker backend
class BrokerdOrderAck(BaseModel):
'''
Immediate response to a brokerd order request providing the broker
specific unique order id so that the EMS can associate this
(presumably differently formatted broker side ID) with our own
``.oid`` (which is a uuid4).
'''
name: str = 'ack'
# defined and provided by backend
reqid: Union[int, str]
# emsd id originally sent in matching request msg
oid: str
account: str = ''
class BrokerdStatus(BaseModel):
name: str = 'status'
reqid: Union[int, str]
time_ns: int
# XXX: should be best effort set for every update
account: str = ''
# {
# 'submitted',
# 'cancelled',
# 'filled',
# }
status: str
filled: float = 0.0
reason: str = ''
remaining: float = 0.0
# XXX: better design/name here?
# flag that can be set to indicate a message for an order
# event that wasn't originated by piker's emsd (eg. some external
# trading system which does its own order control but that you
# might want to "track" using piker UIs/systems).
external: bool = False
# XXX: not required schema as of yet
broker_details: dict = {
'name': '',
}
class BrokerdFill(BaseModel):
'''
A single message indicating a "fill-details" event from the broker
if available.
'''
name: str = 'fill'
reqid: Union[int, str]
time_ns: int
# order execution related
action: str
size: float
price: float
broker_details: dict = {} # meta-data (eg. commissions etc.)
# brokerd timestamp required for order mode arrow placement on x-axis
# TODO: maybe int if we force ns?
# we need to normalize this somehow since backends will use their
# own format and likely across many disparate epoch clocks...
broker_time: float
class BrokerdError(BaseModel):
'''
Optional error type that can be relayed to emsd for error handling.
This is still a TODO thing since we're not sure how to employ it yet.
'''
name: str = 'error'
oid: str
# if no brokerd order request was actually submitted (eg. we errored
# at the ``pikerd`` layer) then there will be no ``reqid`` allocated.
reqid: Optional[Union[int, str]] = None
symbol: str
reason: str
broker_details: dict = {}
class BrokerdPosition(BaseModel):
'''Position update event from brokerd.
'''
name: str = 'position'
broker: str
account: str
symbol: str
currency: str
size: float
avg_price: float
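# hypothetical request/ack round trip under the schema above; the ids,
# symbol and prices are made up for illustration
import time as _time
from uuid import uuid4 as _uuid4

def example_order_ack_pair() -> None:
    order = BrokerdOrder(
        action='buy',
        oid=str(_uuid4()),  # emsd-side id
        account='paper',
        time_ns=_time.time_ns(),
        reqid=None,  # None -> backend should create a new order
        symbol='xbtusd.kraken',
        price=50_000.0,
        size=0.1,
    )
    ack = BrokerdOrderAck(
        reqid=1337,  # backend-allocated id
        oid=order.oid,
        account=order.account,
    )
    # the EMS can now map the backend ``reqid`` to its own ``oid``
    assert ack.oid == order.oid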

View File

@ -1,527 +0,0 @@
# piker: trading gear for hackers
# Copyright (C) Tyler Goodlet (in stewardship for piker0)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""
Fake trading for forward testing.
"""
from contextlib import asynccontextmanager
from datetime import datetime
from operator import itemgetter
import time
from typing import Tuple, Optional, Callable
import uuid
from bidict import bidict
import trio
import tractor
from dataclasses import dataclass
from .. import data
from ..data._normalize import iterticks
from ..data._source import unpack_fqsn
from ..log import get_logger
from ._messages import (
BrokerdCancel, BrokerdOrder, BrokerdOrderAck, BrokerdStatus,
BrokerdFill, BrokerdPosition, BrokerdError
)
log = get_logger(__name__)
@dataclass
class PaperBoi:
"""
Emulates a broker order client providing the same API and
delivering an order-event response stream but with methods for
triggering desired events based on forward testing engine
requirements.
"""
broker: str
ems_trades_stream: tractor.MsgStream
# map of paper "live" orders which can be used
# to simulate fills based on paper engine settings
_buys: bidict
_sells: bidict
_reqids: bidict
_positions: dict[str, BrokerdPosition]
# init edge case L1 spread
last_ask: Tuple[float, float] = (float('inf'), 0) # price, size
last_bid: Tuple[float, float] = (0, 0)
async def submit_limit(
self,
oid: str, # XXX: see return value
symbol: str,
price: float,
action: str,
size: float,
reqid: Optional[str],
) -> int:
"""Place an order and return integer request id provided by client.
"""
is_modify: bool = False
if reqid is None:
reqid = str(uuid.uuid4())
else:
# order is already existing, this is a modify
(oid, symbol, action, old_price) = self._reqids[reqid]
assert old_price != price
is_modify = True
# register order internally
self._reqids[reqid] = (oid, symbol, action, price)
if action == 'alert':
# bypass all fill simulation
return reqid
# TODO: net latency model
# we checkpoint here quickly, particularly
# for dark orders since we want the dark_executed
# to trigger first thus creating a lookup entry
# in the broker trades event processing loop
await trio.sleep(0.05)
if action == 'sell':
size = -size
msg = BrokerdStatus(
status='submitted',
reqid=reqid,
broker=self.broker,
time_ns=time.time_ns(),
filled=0.0,
reason='paper_trigger',
remaining=size,
)
await self.ems_trades_stream.send(msg.dict())
# if we're already a clearing price simulate an immediate fill
if (
action == 'buy' and (clear_price := self.last_ask[0]) <= price
) or (
action == 'sell' and (clear_price := self.last_bid[0]) >= price
):
await self.fake_fill(symbol, clear_price, size, action, reqid, oid)
else:
# register this submission as a paper live order
# submit order to book simulation fill loop
if action == 'buy':
orders = self._buys
elif action == 'sell':
orders = self._sells
# set the simulated order in the respective table for lookup
# and trigger by the simulated clearing task normally
# running ``simulate_fills()``.
if is_modify:
# remove any existing order for the old price
orders[symbol].pop((oid, old_price))
# buys/sells: (symbol -> (price -> order))
orders.setdefault(symbol, {})[(oid, price)] = (size, reqid, action)
return reqid
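# standalone illustration of the immediate-fill condition above:
# a buy limit at or above the best ask crosses the book right away
# (hypothetical numbers)
example_last_ask = (49_900.0, 1.0)  # (price, size)
example_limit_price = 50_000.0
assert example_last_ask[0] <= example_limit_price  # would fill now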
async def submit_cancel(
self,
reqid: str,
) -> None:
# TODO: fake market simulation effects
oid, symbol, action, price = self._reqids[reqid]
if action == 'buy':
self._buys[symbol].pop((oid, price))
elif action == 'sell':
self._sells[symbol].pop((oid, price))
# TODO: net latency model
await trio.sleep(0.05)
msg = BrokerdStatus(
status='cancelled',
oid=oid,
reqid=reqid,
broker=self.broker,
time_ns=time.time_ns(),
)
await self.ems_trades_stream.send(msg.dict())
async def fake_fill(
self,
symbol: str,
price: float,
size: float,
action: str, # one of {'buy', 'sell'}
reqid: str,
oid: str,
# determine whether to send a filled status that has zero
# remaining lots to fill
order_complete: bool = True,
remaining: float = 0,
) -> None:
"""Pretend to fill a broker order @ price and size.
"""
# TODO: net latency model
await trio.sleep(0.05)
msg = BrokerdFill(
reqid=reqid,
time_ns=time.time_ns(),
action=action,
size=size,
price=price,
broker_time=datetime.now().timestamp(),
broker_details={
'paper_info': {
'oid': oid,
},
# mocking ib
'name': self.broker + '_paper',
},
)
await self.ems_trades_stream.send(msg.dict())
if order_complete:
msg = BrokerdStatus(
reqid=reqid,
time_ns=time.time_ns(),
status='filled',
filled=size,
remaining=0 if order_complete else remaining,
action=action,
size=size,
price=price,
broker_details={
'paper_info': {
'oid': oid,
},
'name': self.broker,
},
)
await self.ems_trades_stream.send(msg.dict())
# lookup any existing position
token = f'{symbol}.{self.broker}'
pp_msg = self._positions.setdefault(
token,
BrokerdPosition(
broker=self.broker,
account='paper',
symbol=symbol,
# TODO: we need to look up the asset currency from
# broker info. i guess for crypto this can be
# inferred from the pair?
currency='',
size=0.0,
avg_price=0,
)
)
# "avg position price" calcs
# TODO: eventually it'd be nice to have a small set of routines
# to do this stuff from a sequence of cleared orders to enable
# so called "contextual positions".
new_size = size + pp_msg.size
# old size minus the new size gives us size differential with
# +ve -> increase in pp size
# -ve -> decrease in pp size
size_diff = abs(new_size) - abs(pp_msg.size)
if new_size == 0:
pp_msg.avg_price = 0
elif size_diff > 0:
# only update the "average position price" when the position
# size increases not when it decreases (i.e. the position is
# being made smaller)
pp_msg.avg_price = (
abs(size) * price + pp_msg.avg_price * abs(pp_msg.size)
) / abs(new_size)
pp_msg.size = new_size
await self.ems_trades_stream.send(pp_msg.dict())
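# worked example of the avg-price update above (hypothetical fills):
# 2 units held @ 100, then a 1 unit buy fill @ 130
example_pp_size, example_pp_avg = 2.0, 100.0
example_fill_size, example_fill_price = 1.0, 130.0
example_new_size = example_fill_size + example_pp_size  # -> 3.0
# the position grew, so the weighted avg entry price is updated
example_pp_avg = (
    abs(example_fill_size) * example_fill_price
    + example_pp_avg * abs(example_pp_size)
) / abs(example_new_size)
assert example_pp_avg == 110.0  # (1*130 + 2*100) / 3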
async def simulate_fills(
quote_stream: 'tractor.ReceiveStream', # noqa
client: PaperBoi,
) -> None:
# TODO: more machinery to better simulate real-world market things:
# - slippage models, check what quantopian has:
# https://github.com/quantopian/zipline/blob/master/zipline/finance/slippage.py
# * this should help with simulating partial fills in a fast moving mkt
# afaiu
# - commissions models, also quantopian has em:
# https://github.com/quantopian/zipline/blob/master/zipline/finance/commission.py
# - network latency models ??
# - position tracking:
# https://github.com/quantopian/zipline/blob/master/zipline/finance/ledger.py
# this stream may eventually contain multiple symbols
async for quotes in quote_stream:
for sym, quote in quotes.items():
for tick in iterticks(
quote,
# dark order price filter(s)
types=('ask', 'bid', 'trade', 'last')
):
# print(tick)
tick_price = tick.get('price')
ttype = tick['type']
if ttype in ('ask',):
client.last_ask = (
tick_price,
tick.get('size', client.last_ask[1]),
)
orders = client._buys.get(sym, {})
book_sequence = reversed(
sorted(orders.keys(), key=itemgetter(1)))
def pred(our_price):
return tick_price < our_price
elif ttype in ('bid',):
client.last_bid = (
tick_price,
tick.get('size', client.last_bid[1]),
)
orders = client._sells.get(sym, {})
book_sequence = sorted(orders.keys(), key=itemgetter(1))
def pred(our_price):
return tick_price > our_price
elif ttype in ('trade', 'last'):
# TODO: simulate actual book queues and our orders
# place in it, might require full L2 data?
continue
# iterate book prices descending
for oid, our_price in book_sequence:
if pred(our_price):
# retrieve order info
(size, reqid, action) = orders.pop((oid, our_price))
# clearing price would have filled entirely
await client.fake_fill(
symbol=sym,
# todo slippage to determine fill price
price=tick_price,
size=size,
action=action,
reqid=reqid,
oid=oid,
)
else:
# prices are iterated in sorted order so we're done
break
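# standalone illustration of the buy-side book walk above: orders are
# checked from the highest bid down and a falling ask clears only the
# crossed top of book (all values hypothetical)
example_tick_price = 99.0  # new best ask
example_book = {('oid1', 100.0): 'order1', ('oid2', 98.0): 'order2'}
example_filled = []
for _oid, _px in reversed(sorted(example_book, key=lambda k: k[1])):
    if example_tick_price < _px:  # ask fell below our bid -> fill
        example_filled.append(_oid)
    else:
        break  # prices are sorted so nothing further can clear
assert example_filled == ['oid1']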
async def handle_order_requests(
client: PaperBoi,
ems_order_stream: tractor.MsgStream,
) -> None:
# order_request: dict
async for request_msg in ems_order_stream:
action = request_msg['action']
if action in {'buy', 'sell'}:
account = request_msg['account']
if account != 'paper':
log.error(
'This is a paper account, only a `paper` selection is valid'
)
await ems_order_stream.send(BrokerdError(
oid=request_msg['oid'],
symbol=request_msg['symbol'],
reason=f'Paper only. No account found: `{account}` ?',
).dict())
continue
# validate
order = BrokerdOrder(**request_msg)
# call our client api to submit the order
reqid = await client.submit_limit(
oid=order.oid,
symbol=order.symbol,
price=order.price,
action=order.action,
size=order.size,
# XXX: by default 0 tells ``ib_insync`` methods that
# there is no existing order so ask the client to create
# a new one (which it seems to do by allocating an int
# counter - collision prone..)
reqid=order.reqid,
)
# deliver ack that order has been submitted to broker routing
await ems_order_stream.send(
BrokerdOrderAck(
# ems order request id
oid=order.oid,
# broker specific request id
reqid=reqid,
).dict()
)
elif action == 'cancel':
msg = BrokerdCancel(**request_msg)
await client.submit_cancel(
reqid=msg.reqid
)
else:
log.error(f'Unknown order command: {request_msg}')
@tractor.context
async def trades_dialogue(
ctx: tractor.Context,
broker: str,
fqsn: str,
loglevel: str = None,
) -> None:
tractor.log.get_console_log(loglevel)
async with (
data.open_feed(
[fqsn],
loglevel=loglevel,
) as feed,
):
# TODO: load paper positions per broker from .toml config file
# and pass as symbol to position data mapping: ``dict[str, dict]``
# await ctx.started(all_positions)
await ctx.started(({}, {'paper',}))
async with (
ctx.open_stream() as ems_stream,
trio.open_nursery() as n,
):
client = PaperBoi(
broker,
ems_stream,
_buys={},
_sells={},
_reqids={},
# TODO: load paper positions from ``positions.toml``
_positions={},
)
n.start_soon(handle_order_requests, client, ems_stream)
# paper engine simulator clearing task
await simulate_fills(feed.stream, client)
@asynccontextmanager
async def open_paperboi(
fqsn: str,
loglevel: str,
) -> Callable:
'''
Spawn a paper engine actor and yield through access to
its context.
'''
broker, symbol, expiry = unpack_fqsn(fqsn)
service_name = f'paperboi.{broker}'
async with (
tractor.find_actor(service_name) as portal,
tractor.open_nursery() as tn,
):
# only spawn if no paperboi already is up
# (we likely don't need more than one proc for basic
# simulated order clearing)
if portal is None:
portal = await tn.start_actor(
service_name,
enable_modules=[__name__]
)
async with portal.open_context(
trades_dialogue,
broker=broker,
fqsn=fqsn,
loglevel=loglevel,
) as (ctx, first):
yield ctx, first
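# hedged usage sketch for ``open_paperboi()``; the fqsn is made up and
# ``first`` unpacks the (positions, accounts) pair sent by
# ``trades_dialogue()`` via ``ctx.started()``
async def example_start_paper_clearing() -> None:
    async with open_paperboi(
        fqsn='xbtusd.kraken',
        loglevel='info',
    ) as (ctx, (positions, accounts)):
        assert 'paper' in accounts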

409
piker/cli.py 100644
View File

@ -0,0 +1,409 @@
"""
Console interface to broker client/daemons.
"""
from functools import partial
import json
import os
from operator import attrgetter
from operator import itemgetter
import click
import pandas as pd
import trio
import tractor
from . import watchlists as wl
from .log import get_console_log, colorize_json, get_logger
from .brokers import core, get_brokermod, data, config
from .brokers.core import maybe_spawn_brokerd_as_subactor, _data_mods
log = get_logger('cli')
DEFAULT_BROKER = 'questrade'
_config_dir = click.get_app_dir('piker')
_watchlists_data_path = os.path.join(_config_dir, 'watchlists.json')
_context_defaults = dict(
default_map={
'monitor': {
'rate': 3,
},
'optschain': {
'rate': 1,
},
}
)
@click.command()
@click.option('--loglevel', '-l', default='warning', help='Logging level')
@click.option('--tl', is_flag=True, help='Enable tractor logging')
@click.option('--host', '-h', default='127.0.0.1', help='Host address to bind')
def pikerd(loglevel, host, tl):
"""Spawn the piker broker-daemon.
"""
get_console_log(loglevel)
tractor.run_daemon(
rpc_module_paths=_data_mods,
name='brokerd',
loglevel=loglevel if tl else None,
)
@click.group(context_settings=_context_defaults)
@click.option('--broker', '-b', default=DEFAULT_BROKER,
help='Broker backend to use')
@click.option('--loglevel', '-l', default='warning', help='Logging level')
@click.option('--configdir', '-c', help='Configuration directory')
@click.pass_context
def cli(ctx, broker, loglevel, configdir):
if configdir is not None:
assert os.path.isdir(configdir), f"`{configdir}` is not a valid path"
config._override_config_dir(configdir)
# ensure that ctx.obj exists even though we aren't using it (yet)
ctx.ensure_object(dict)
ctx.obj.update({
'broker': broker,
'brokermod': get_brokermod(broker),
'loglevel': loglevel,
'log': get_console_log(loglevel),
})
@cli.command()
@click.option('--keys', '-k', multiple=True,
help='Return results only for these keys')
@click.argument('meth', nargs=1)
@click.argument('kwargs', nargs=-1)
@click.pass_obj
def api(config, meth, kwargs, keys):
"""client for testing broker API methods with pretty printing of output.
"""
# global opts
broker = config['broker']
_kwargs = {}
for kwarg in kwargs:
if '=' not in kwarg:
log.error(f"kwarg `{kwarg}` must be of form <key>=<value>")
else:
key, _, value = kwarg.partition('=')
_kwargs[key] = value
data = trio.run(
partial(core.api, broker, meth, **_kwargs)
)
if keys:
# filter to requested keys
filtered = []
if meth in data: # often a list of dicts
for item in data[meth]:
filtered.append({key: item[key] for key in keys})
else: # likely just a dict
filtered.append({key: data[key] for key in keys})
data = filtered
click.echo(colorize_json(data))
@cli.command()
@click.option('--df-output', '-df', flag_value=True,
help='Output in `pandas.DataFrame` format')
@click.argument('tickers', nargs=-1, required=True)
@click.pass_obj
def quote(config, tickers, df_output):
"""Retreive symbol quotes on the console in either json or dataframe
format.
"""
# global opts
brokermod = config['brokermod']
quotes = trio.run(partial(core.stocks_quote, brokermod, tickers))
if not quotes:
log.error(f"No quotes could be found for {tickers}?")
return
if len(quotes) < len(tickers):
syms = tuple(map(itemgetter('symbol'), quotes))
for ticker in tickers:
if ticker not in syms:
brokermod.log.warn(f"Could not find symbol {ticker}?")
if df_output:
cols = next(filter(bool, quotes.values())).copy()
cols.pop('symbol')
df = pd.DataFrame(
(quote or {} for quote in quotes.values()),
index=quotes.keys(),
columns=cols,
)
click.echo(df)
else:
click.echo(colorize_json(quotes))
@cli.command()
@click.option('--tl', is_flag=True, help='Enable tractor logging')
@click.option('--rate', '-r', default=3, help='Quote rate limit')
@click.option('--test', '-t', help='Test quote stream file')
@click.option('--dhost', '-dh', default='127.0.0.1',
help='Daemon host address to connect to')
@click.argument('name', nargs=1, required=True)
@click.pass_obj
def monitor(config, rate, name, dhost, test, tl):
"""Spawn a real-time watchlist.
"""
# global opts
brokermod = config['brokermod']
loglevel = config['loglevel']
log = config['log']
watchlist_from_file = wl.ensure_watchlists(_watchlists_data_path)
watchlists = wl.merge_watchlist(watchlist_from_file, wl._builtins)
tickers = watchlists[name]
if not tickers:
log.error(f"No symbols found for watchlist `{name}`?")
return
from .ui.monitor import _async_main
async def main(tries):
async with maybe_spawn_brokerd_as_subactor(
tries=tries, loglevel=loglevel
) as portal:
# run app "main"
await _async_main(
name, portal, tickers,
brokermod, rate, test=test,
)
tractor.run(
partial(main, tries=1),
name='monitor',
loglevel=loglevel if tl else None,
rpc_module_paths=['piker.ui.monitor'],
)
@cli.command()
@click.option('--rate', '-r', default=5, help='Quote rate limit')
@click.option('--filename', '-f', default='quotestream.jsonstream',
help='Output filename')
@click.option('--dhost', '-dh', default='127.0.0.1',
help='Daemon host address to connect to')
@click.argument('name', nargs=1, required=True)
@click.pass_obj
def record(config, rate, name, dhost, filename):
"""Record client side quotes to file
"""
# global opts
brokermod = config['brokermod']
loglevel = config['loglevel']
log = config['log']
watchlist_from_file = wl.ensure_watchlists(_watchlists_data_path)
watchlists = wl.merge_watchlist(watchlist_from_file, wl._builtins)
tickers = watchlists[name]
if not tickers:
log.error(f"No symbols found for watchlist `{name}`?")
return
async def main(tries):
async with maybe_spawn_brokerd_as_subactor(
tries=tries, loglevel=loglevel
) as portal:
# run app "main"
return await data.stream_to_file(
name, filename,
portal, tickers,
brokermod, rate,
)
filename = tractor.run(partial(main, tries=1), name='data-feed-recorder')
click.echo(f"Data feed recording saved to {filename}")
@cli.group()
@click.option('--config_dir', '-d', default=_watchlists_data_path,
help='Path to piker configuration directory')
@click.pass_context
def watchlists(ctx, config_dir):
"""Watchlists commands and operations
"""
loglevel = ctx.parent.params['loglevel']
get_console_log(loglevel) # activate console logging
wl.make_config_dir(_config_dir)
ctx.ensure_object(dict)
ctx.obj = {'path': config_dir,
'watchlist': wl.ensure_watchlists(config_dir)}
@watchlists.command(help='show watchlist')
@click.argument('name', nargs=1, required=False)
@click.pass_context
def show(ctx, name):
watchlist = wl.merge_watchlist(ctx.obj['watchlist'], wl._builtins)
click.echo(colorize_json(
watchlist if name is None else watchlist[name]))
@watchlists.command(help='load passed in watchlist')
@click.argument('data', nargs=1, required=True)
@click.pass_context
def load(ctx, data):
try:
wl.write_to_file(json.loads(data), ctx.obj['path'])
except (json.JSONDecodeError, IndexError):
click.echo('You have passed an invalid text representation of a '
'JSON object. Try again.')
@watchlists.command(help='add ticker to watchlist')
@click.argument('name', nargs=1, required=True)
@click.argument('ticker_names', nargs=-1, required=True)
@click.pass_context
def add(ctx, name, ticker_names):
for ticker in ticker_names:
watchlist = wl.add_ticker(
name, ticker, ctx.obj['watchlist'])
wl.write_to_file(watchlist, ctx.obj['path'])
@watchlists.command(help='remove ticker from watchlist')
@click.argument('name', nargs=1, required=True)
@click.argument('ticker_name', nargs=1, required=True)
@click.pass_context
def remove(ctx, name, ticker_name):
try:
watchlist = wl.remove_ticker(name, ticker_name, ctx.obj['watchlist'])
except KeyError:
log.error(f"No watchlist with name `{name}` could be found?")
except ValueError:
if name in wl._builtins and ticker_name in wl._builtins[name]:
log.error(f"Can not remove ticker `{ticker_name}` from built-in "
f"list `{name}`")
else:
log.error(f"Ticker `{ticker_name}` not found in list `{name}`")
else:
wl.write_to_file(watchlist, ctx.obj['path'])
@watchlists.command(help='delete watchlist group')
@click.argument('name', nargs=1, required=True)
@click.pass_context
def delete(ctx, name):
watchlist = wl.delete_group(name, ctx.obj['watchlist'])
wl.write_to_file(watchlist, ctx.obj['path'])
@watchlists.command(help='merge a watchlist from another user')
@click.argument('watchlist_to_merge', nargs=1, required=True)
@click.pass_context
def merge(ctx, watchlist_to_merge):
merged_watchlist = wl.merge_watchlist(json.loads(watchlist_to_merge),
ctx.obj['watchlist'])
wl.write_to_file(merged_watchlist, ctx.obj['path'])
@watchlists.command(help='dump text representation of a watchlist to console')
@click.argument('name', nargs=1, required=False)
@click.pass_context
def dump(ctx, name):
click.echo(json.dumps(ctx.obj['watchlist']))
# options utils
@cli.command()
@click.option('--broker', '-b', default=DEFAULT_BROKER,
help='Broker backend to use')
@click.option('--loglevel', '-l', default='warning', help='Logging level')
@click.option('--ids', flag_value=True, help='Include numeric ids in output')
@click.argument('symbol', required=True)
@click.pass_context
def contracts(ctx, loglevel, broker, symbol, ids):
brokermod = get_brokermod(broker)
get_console_log(loglevel)
contracts = trio.run(partial(core.contracts, brokermod, symbol))
if not ids:
# just print out expiry dates which can be used with
# the option_chain_quote cmd
output = tuple(map(attrgetter('expiry'), contracts))
else:
output = tuple(contracts.items())
# TODO: need a cli test to verify
click.echo(colorize_json(output))
@cli.command()
@click.option('--df-output', '-df', flag_value=True,
help='Output in `pandas.DataFrame` format')
@click.option('--date', '-d', help='Contracts expiry date')
@click.argument('symbol', required=True)
@click.pass_obj
def optsquote(config, symbol, df_output, date):
"""Retreive symbol quotes on the console in either
json or dataframe format.
"""
# global opts
brokermod = config['brokermod']
quotes = trio.run(
partial(
core.option_chain, brokermod, symbol, date
)
)
if not quotes:
log.error(f"No option quotes could be found for {symbol}?")
return
if df_output:
df = pd.DataFrame(
(quote.values() for quote in quotes),
columns=quotes[0].keys(),
)
click.echo(df)
else:
click.echo(colorize_json(quotes))
@cli.command()
@click.option('--tl', is_flag=True, help='Enable tractor logging')
@click.option('--date', '-d', help='Contracts expiry date')
@click.option('--test', '-t', help='Test quote stream file')
@click.option('--rate', '-r', default=1, help='Quote rate limit')
@click.argument('symbol', required=True)
@click.pass_obj
def optschain(config, symbol, date, tl, rate, test):
"""Start the real-time option chain UI.
"""
# global opts
loglevel = config['loglevel']
brokername = config['broker']
from .ui.option_chain import _async_main
async def main(tries):
async with maybe_spawn_brokerd_as_subactor(
tries=tries, loglevel=loglevel
) as portal:
# run app "main"
await _async_main(
symbol,
brokername,
rate=rate,
loglevel=loglevel,
test=test,
)
tractor.run(
partial(main, tries=1),
name='kivy-options-chain',
loglevel=loglevel if tl else None,
)

View File

@ -1,171 +0,0 @@
# piker: trading gear for hackers
# Copyright (C) 2018-present Tyler Goodlet (in stewardship of pikers)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
'''
CLI commons.
'''
import os
from pprint import pformat
import click
import trio
import tractor
from ..log import get_console_log, get_logger, colorize_json
from ..brokers import get_brokermod
from .._daemon import _tractor_kwargs
from .. import config
log = get_logger('cli')
DEFAULT_BROKER = 'questrade'
@click.command()
@click.option('--loglevel', '-l', default='warning', help='Logging level')
@click.option('--tl', is_flag=True, help='Enable tractor logging')
@click.option('--pdb', is_flag=True, help='Enable tractor debug mode')
@click.option('--host', '-h', default='127.0.0.1', help='Host address to bind')
@click.option(
'--tsdb',
is_flag=True,
help='Enable local ``marketstore`` instance'
)
def pikerd(loglevel, host, tl, pdb, tsdb):
'''
Spawn the piker broker-daemon.
'''
from .._daemon import open_pikerd
log = get_console_log(loglevel)
if pdb:
log.warning((
"\n"
"!!! You have enabled daemon DEBUG mode !!!\n"
"If a daemon crashes it will likely block"
" the service until resumed from console!\n"
"\n"
))
async def main():
async with (
open_pikerd(
loglevel=loglevel,
debug_mode=pdb,
), # normally delivers a ``Services`` handle
trio.open_nursery() as n,
):
if tsdb:
from piker.data._ahab import start_ahab
from piker.data.marketstore import start_marketstore
log.info('Spawning `marketstore` supervisor')
ctn_ready, config, (cid, pid) = await n.start(
start_ahab,
'marketstored',
start_marketstore,
)
log.info(
f'`marketstore` up!\n'
f'`marketstored` pid: {pid}\n'
f'docker container id: {cid}\n'
f'config: {pformat(config)}'
)
await trio.sleep_forever()
trio.run(main)
@click.group(context_settings=config._context_defaults)
@click.option(
'--brokers', '-b',
default=[DEFAULT_BROKER],
multiple=True,
help='Broker backend to use'
)
@click.option('--loglevel', '-l', default='warning', help='Logging level')
@click.option('--tl', is_flag=True, help='Enable tractor logging')
@click.option('--configdir', '-c', help='Configuration directory')
@click.pass_context
def cli(ctx, brokers, loglevel, tl, configdir):
if configdir is not None:
assert os.path.isdir(configdir), f"`{configdir}` is not a valid path"
config._override_config_dir(configdir)
ctx.ensure_object(dict)
if len(brokers) == 1:
brokermods = [get_brokermod(brokers[0])]
else:
brokermods = [get_brokermod(broker) for broker in brokers]
ctx.obj.update({
'brokers': brokers,
'brokermods': brokermods,
'loglevel': loglevel,
'tractorloglevel': None,
'log': get_console_log(loglevel),
'confdir': config._config_dir,
'wl_path': config._watchlists_data_path,
})
# allow enabling same loglevel in ``tractor`` machinery
if tl:
ctx.obj.update({'tractorloglevel': loglevel})
@cli.command()
@click.option('--tl', is_flag=True, help='Enable tractor logging')
@click.argument('names', nargs=-1, required=False)
@click.pass_obj
def services(config, tl, names):
async def list_services():
async with tractor.get_arbiter(
*_tractor_kwargs['arbiter_addr']
) as portal:
registry = await portal.run_from_ns('self', 'get_registry')
json_d = {}
for key, socket in registry.items():
# name, uuid = uid
host, port = socket
json_d[key] = f'{host}:{port}'
click.echo(f"{colorize_json(json_d)}")
tractor.run(
list_services,
name='service_query',
loglevel=config['loglevel'] if tl else None,
arbiter_addr=_tractor_kwargs['arbiter_addr'],
)
def _load_clis() -> None:
from ..data import marketstore # noqa
from ..data import cli # noqa
from ..brokers import cli # noqa
from ..ui import cli # noqa
from ..watchlists import cli # noqa
# load downstream cli modules
_load_clis()

View File

@ -1,267 +0,0 @@
# piker: trading gear for hackers
# Copyright (C) 2018-present Tyler Goodlet (in stewardship for pikers)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""
Broker configuration mgmt.
"""
import platform
import sys
import os
from os.path import dirname
import shutil
from typing import Optional
from bidict import bidict
import toml
from .log import get_logger
log = get_logger('broker-config')
# taken from ``click`` since apparently they have some
# super weirdness with sigint and sudo..no clue
def get_app_dir(app_name, roaming=True, force_posix=False):
r"""Returns the config folder for the application. The default behavior
is to return whatever is most appropriate for the operating system.
To give you an idea, for an app called ``"Foo Bar"``, something like
the following folders could be returned:
Mac OS X:
``~/Library/Application Support/Foo Bar``
Mac OS X (POSIX):
``~/.foo-bar``
Unix:
``~/.config/foo-bar``
Unix (POSIX):
``~/.foo-bar``
Win XP (roaming):
``C:\Documents and Settings\<user>\Local Settings\Application Data\Foo``
Win XP (not roaming):
``C:\Documents and Settings\<user>\Application Data\Foo Bar``
Win 7 (roaming):
``C:\Users\<user>\AppData\Roaming\Foo Bar``
Win 7 (not roaming):
``C:\Users\<user>\AppData\Local\Foo Bar``
.. versionadded:: 2.0
:param app_name: the application name. This should be properly capitalized
and can contain whitespace.
:param roaming: controls if the folder should be roaming or not on Windows.
Has no effect otherwise.
:param force_posix: if this is set to `True` then on any POSIX system the
folder will be stored in the home folder with a leading
dot instead of the XDG config home or darwin's
application support folder.
"""
def _posixify(name):
return "-".join(name.split()).lower()
# if WIN:
if platform.system() == 'Windows':
key = "APPDATA" if roaming else "LOCALAPPDATA"
folder = os.environ.get(key)
if folder is None:
folder = os.path.expanduser("~")
return os.path.join(folder, app_name)
if force_posix:
return os.path.join(
os.path.expanduser("~/.{}".format(_posixify(app_name))))
if sys.platform == "darwin":
return os.path.join(
os.path.expanduser("~/Library/Application Support"), app_name
)
return os.path.join(
os.environ.get("XDG_CONFIG_HOME", os.path.expanduser("~/.config")),
_posixify(app_name),
)
_config_dir = _click_config_dir = get_app_dir('piker')
_parent_user = os.environ.get('SUDO_USER')
if _parent_user:
non_root_user_dir = os.path.expanduser(
f'~{_parent_user}'
)
root = 'root'
_config_dir = (
non_root_user_dir +
_click_config_dir[
_click_config_dir.rfind(root) + len(root):
]
)
_conf_names: set[str] = {
'brokers',
'trades',
'watchlists',
}
_watchlists_data_path = os.path.join(_config_dir, 'watchlists.json')
_context_defaults = dict(
default_map={
# Questrade specific quote poll rates
'monitor': {
'rate': 3,
},
'optschain': {
'rate': 1,
},
}
)
def _override_config_dir(
path: str
) -> None:
global _config_dir
_config_dir = path
def _conf_fn_w_ext(
name: str,
) -> str:
# change this if we ever change the config file format.
return f'{name}.toml'
def get_conf_path(
conf_name: str = 'brokers',
) -> str:
"""Return the default config path normally under
``~/.config/piker`` on linux.
Contains files such as:
- brokers.toml
- watchlists.toml
- trades.toml
# maybe coming soon ;)
- signals.toml
- strats.toml
"""
assert conf_name in _conf_names
fn = _conf_fn_w_ext(conf_name)
return os.path.join(
_config_dir,
fn,
)
def repodir():
'''
Return the abspath to the repo directory.
'''
dirpath = os.path.abspath(
# we're 3 levels down in **this** module file
dirname(dirname(os.path.realpath(__file__)))
)
return dirpath
def load(
conf_name: str = 'brokers',
path: str = None
) -> (dict, str):
'''
Load config file by name.
'''
path = path or get_conf_path(conf_name)
if not os.path.isfile(path):
fn = _conf_fn_w_ext(conf_name)
template = os.path.join(
repodir(),
'config',
fn
)
# try to copy in a template config to the user's directory
# if one exists.
if os.path.isfile(template):
shutil.copyfile(template, path)
config = toml.load(path)
log.debug(f"Read config file {path}")
return config, path
def write(
config: dict, # toml config as dict
name: str = 'brokers',
path: str = None,
) -> None:
'''
Write broker config to disk.
Create a ``brokers.toml`` file if one does not exist.
'''
path = path or get_conf_path(name)
dirname = os.path.dirname(path)
if not os.path.isdir(dirname):
log.debug(f"Creating config dir {_config_dir}")
os.makedirs(dirname)
if not config:
raise ValueError(
"Watch out you're trying to write a blank config!")
log.debug(
f"Writing config `{name}` file to:\n"
f"{path}"
)
with open(path, 'w') as cf:
return toml.dump(config, cf)
def load_accounts(
providers: Optional[list[str]] = None
) -> bidict[str, Optional[str]]:
conf, path = load()
accounts = bidict()
for provider_name, section in conf.items():
accounts_section = section.get('accounts')
if (
providers is None or
providers and provider_name in providers
):
if accounts_section is None:
log.warning(f'No accounts named for {provider_name}?')
continue
else:
for label, value in accounts_section.items():
accounts[
f'{provider_name}.{label}'
] = value
# our default paper engine entry
accounts['paper'] = None
return accounts
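# hedged sketch of the accounts-section layout this expects in
# ``brokers.toml`` and the mapping it produces; the provider/account
# names here are made up (``toml``/``bidict`` imported above)
def example_accounts_mapping() -> None:
    conf = toml.loads(
        '[ib.accounts]\n'
        'margin = "X0000000"\n'
    )
    accounts = bidict({'paper': None})
    for provider, section in conf.items():
        for label, value in section['accounts'].items():
            accounts[f'{provider}.{label}'] = value
    assert accounts['ib.margin'] == 'X0000000'
    assert accounts.inverse['X0000000'] == 'ib.margin'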

View File

@ -1,48 +0,0 @@
# piker: trading gear for hackers
# Copyright (C) Tyler Goodlet (in stewardship for piker0)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""
Data infra.
We provide tsdb integrations for retrieving
and storing data from your brokers as well as
sharing live streams over a network.
"""
from ._normalize import iterticks
from ._sharedmem import (
maybe_open_shm_array,
attach_shm_array,
open_shm_array,
get_shm_token,
ShmArray,
)
from .feed import (
open_feed,
_setup_persistent_brokerd,
)
__all__ = [
'open_feed',
'ShmArray',
'iterticks',
'maybe_open_shm_array',
'attach_shm_array',
'open_shm_array',
'get_shm_token',
'_setup_persistent_brokerd',
]

View File

@ -1,385 +0,0 @@
# piker: trading gear for hackers
# Copyright (C) 2018-present Tyler Goodlet (in stewardship of pikers)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
'''
Supervisor for docker with included specific-image service helpers.
'''
import os
import time
from typing import (
Optional,
Callable,
Any,
)
from contextlib import asynccontextmanager as acm
import trio
from trio_typing import TaskStatus
import tractor
from tractor.msg import NamespacePath
import docker
import json
from docker.models.containers import Container as DockerContainer
from docker.errors import (
DockerException,
APIError,
)
from requests.exceptions import ConnectionError, ReadTimeout
from ..log import get_logger, get_console_log
from .. import config
log = get_logger(__name__)
class DockerNotStarted(Exception):
'Prolly you dint start da daemon bruh'
class ContainerError(RuntimeError):
'Error reported via app-container logging level'
@acm
async def open_docker(
url: Optional[str] = None,
**kwargs,
) -> docker.DockerClient:
client: Optional[docker.DockerClient] = None
try:
client = docker.DockerClient(
base_url=url,
**kwargs
) if url else docker.from_env(**kwargs)
yield client
except (
DockerException,
APIError,
) as err:
def unpack_msg(err: Exception) -> str:
args = getattr(err, 'args', None)
if args:
return args
else:
return str(err)
# could be more specific so let's check if it's just perms.
if err.args:
errs = err.args
for err in errs:
msg = unpack_msg(err)
if 'PermissionError' in msg:
raise DockerException('You dint run as root yo!')
elif 'FileNotFoundError' in msg:
raise DockerNotStarted('Did you start da service sister?')
# not perms?
raise
finally:
if client:
client.close()
class Container:
'''
Wrapper around a ``docker.models.containers.Container`` to include
log capture and relay through our native logging system and helper
method(s) for cancellation/teardown.
'''
def __init__(
self,
cntr: DockerContainer,
) -> None:
self.cntr = cntr
# log msg de-duplication
self.seen_so_far = set()
async def process_logs_until(
self,
patt: str,
bp_on_msg: bool = False,
) -> bool:
'''
Attempt to capture container log messages and relay through our
native logging system.
'''
seen_so_far = self.seen_so_far
while True:
logs = self.cntr.logs()
entries = logs.decode().split('\n')
for entry in entries:
# ignore null lines
if not entry:
continue
try:
record = json.loads(entry.strip())
except json.JSONDecodeError:
if 'Error' in entry:
raise RuntimeError(entry)
raise
msg = record['msg']
level = record['level']
if msg and entry not in seen_so_far:
seen_so_far.add(entry)
if bp_on_msg:
await tractor.breakpoint()
getattr(log, level, log.error)(f'{msg}')
# print(f'level: {level}')
if level in ('error', 'fatal'):
raise ContainerError(msg)
if patt in msg:
return True
# do a checkpoint so we don't block if cancelled B)
await trio.sleep(0.01)
return False
def try_signal(
self,
signal: str = 'SIGINT',
) -> bool:
try:
# XXX: market store doesn't seem to shutdown nicely all the
# time with this (maybe because there are still open grpc
# connections?) noticeably after client connections have been
# made or are in use/teardown. It works just fine if you
# just start and stop the container tho?..
log.cancel(f'SENDING {signal} to {self.cntr.id}')
self.cntr.kill(signal)
return True
except docker.errors.APIError as err:
if 'is not running' in err.explanation:
return False
async def cancel(
self,
stop_msg: str,
) -> None:
cid = self.cntr.id
# first try a graceful cancel
log.cancel(
f'SIGINT cancelling container: {cid}\n'
f'waiting on stop msg: "{stop_msg}"'
)
self.try_signal('SIGINT')
start = time.time()
for _ in range(30):
with trio.move_on_after(0.5) as cs:
cs.shield = True
await self.process_logs_until(stop_msg)
# if we aren't cancelled on above checkpoint then we
# assume we read the expected stop msg and terminated.
break
try:
log.info(f'Polling for container shutdown:\n{cid}')
if self.cntr.status not in {'exited', 'not-running'}:
self.cntr.wait(
timeout=0.1,
condition='not-running',
)
break
except (
ReadTimeout,
):
log.info(f'Still waiting on container:\n{cid}')
continue
except (
docker.errors.APIError,
ConnectionError,
):
log.exception('Docker connection failure')
break
else:
delay = time.time() - start
log.error(
f'Failed to kill container {cid} after {delay}s\n'
'sending SIGKILL..'
)
# get out the big guns, bc apparently marketstore
# doesn't actually know how to terminate gracefully
# :eyeroll:...
self.try_signal('SIGKILL')
self.cntr.wait(
timeout=3,
condition='not-running',
)
log.cancel(f'Container stopped: {cid}')
@tractor.context
async def open_ahabd(
ctx: tractor.Context,
endpoint: str, # ns-pointer str-msg-type
**kwargs,
) -> None:
get_console_log('info', name=__name__)
async with open_docker() as client:
# TODO: eventually offer a config-oriented API to do the mounts,
# params, etc. passing to ``Container.run()``?
# call into endpoint for container config/init
ep_func = NamespacePath(endpoint).load_ref()
(
dcntr,
cntr_config,
start_msg,
stop_msg,
) = ep_func(client)
cntr = Container(dcntr)
with trio.move_on_after(1):
found = await cntr.process_logs_until(start_msg)
if not found and cntr not in client.containers.list():
raise RuntimeError(
'Failed to start `marketstore`, check logs for deats'
)
await ctx.started((
cntr.cntr.id,
os.getpid(),
cntr_config,
))
try:
# TODO: we might eventually want a proxy-style msg-prot here
# to allow remote control of containers without needing
# callers to have root perms?
await trio.sleep_forever()
finally:
with trio.CancelScope(shield=True):
await cntr.cancel(stop_msg)
async def start_ahab(
service_name: str,
endpoint: Callable[[docker.DockerClient], DockerContainer],
task_status: TaskStatus[
tuple[
trio.Event,
dict[str, Any],
],
] = trio.TASK_STATUS_IGNORED,
) -> None:
'''
Start a ``docker`` container supervisor with given service name.
Currently the actor calling this task should normally be started
with root permissions (until we decide to use something that doesn't
require this, like docker's rootless mode or some wrapper project) but
the root perms are de-escalated after the docker supervisor sub-actor
is started.
'''
cn_ready = trio.Event()
try:
async with tractor.open_nursery(
loglevel='runtime',
) as tn:
portal = await tn.start_actor(
service_name,
enable_modules=[__name__]
)
# TODO: we have issues with this on teardown
# where ``tractor`` tries to issue ``os.kill()``
# and hits perms errors since the root process
# no longer has root perms..
# de-escalate root perms to the original user
# after the docker supervisor actor is spawned.
if config._parent_user:
import pwd
os.setuid(
pwd.getpwnam(
config._parent_user
)[2] # named user's uid
)
async with portal.open_context(
open_ahabd,
endpoint=str(NamespacePath.from_ref(endpoint)),
) as (ctx, first):
cid, pid, cntr_config = first
task_status.started((
cn_ready,
cntr_config,
(cid, pid),
))
await trio.sleep_forever()
# since we demoted root perms in this parent
# we'll get a perms error on proc cleanup in
# ``tractor`` nursery exit. just make sure
# the child is terminated and don't raise the
# error if so.
# TODO: we could also consider adding
# a ``tractor.ZombieDetected`` or something that we could raise
# if we find the child didn't terminate.
except PermissionError:
log.warning('Failed to cancel root permsed container')
except (
trio.MultiError,
) as err:
for subexc in err.exceptions:
if isinstance(subexc, PermissionError):
log.warning('Failed to cancel root perms-ed container')
return
else:
raise

View File

@ -1,82 +0,0 @@
# piker: trading gear for hackers
# Copyright (C) Tyler Goodlet (in stewardship for piker0)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
'''
Stream format enforcement.
'''
from itertools import chain
from typing import AsyncIterator
def iterticks(
quote: dict,
types: tuple[str] = (
'trade',
'dark_trade',
),
deduplicate_darks: bool = False,
) -> AsyncIterator:
'''
Iterate through ticks delivered per quote cycle.
'''
if deduplicate_darks:
assert 'dark_trade' in types
# print(f"{quote}\n\n")
ticks = quote.get('ticks', ())
trades = {}
darks = {}
if ticks:
# do a first pass and attempt to remove duplicate dark
# trades with the same tick signature.
if deduplicate_darks:
for tick in ticks:
ttype = tick.get('type')
time = tick.get('time', None)
if time:
sig = (
time,
tick['price'],
tick['size']
)
if ttype == 'dark_trade':
darks[sig] = tick
elif ttype == 'trade':
trades[sig] = tick
# filter duplicates
for sig, tick in trades.items():
tick = darks.pop(sig, None)
if tick:
ticks.remove(tick)
# print(f'DUPLICATE {tick}')
# re-insert ticks
ticks.extend(list(chain(trades.values(), darks.values())))
for tick in ticks:
# print(f"{quote['symbol']}: {tick}")
ttype = tick.get('type')
if ttype in types:
yield tick
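# hedged usage sketch; the quote/tick shapes follow the code above but
# the symbol and values are made up
def example_filter_trades() -> None:
    quote = {
        'symbol': 'xbtusd',
        'ticks': [
            {'type': 'trade', 'price': 50_000.0, 'size': 0.1, 'time': 1},
            {'type': 'ask', 'price': 50_001.0, 'size': 2.0},
            {'type': 'bid', 'price': 49_999.0, 'size': 3.0},
        ],
    }
    # only tick types named in ``types`` are yielded
    trades = list(iterticks(quote, types=('trade',)))
    assert [t['price'] for t in trades] == [50_000.0]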

View File

@ -1,519 +0,0 @@
# piker: trading gear for hackers
# Copyright (C) 2018-present Tyler Goodlet (in stewardship of pikers)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""
Sampling and broadcast machinery for (soft) real-time delivery of
financial data flows.
"""
from __future__ import annotations
from collections import Counter
import time
from typing import TYPE_CHECKING, Optional, Union
import tractor
import trio
from trio_typing import TaskStatus
from ..log import get_logger
if TYPE_CHECKING:
from ._sharedmem import ShmArray
from .feed import _FeedsBus
log = get_logger(__name__)
class sampler:
'''
Global sampling engine registry.
Manages state for sampling events, shm incrementing and
sample period logic.
'''
# TODO: we could stick these in a composed type to avoid
# angering the "i hate module scoped variables crowd" (yawn).
ohlcv_shms: dict[int, list[ShmArray]] = {}
# holds one-task-per-sample-period tasks which are spawned as-needed by
# data feed requests with a given detected time step usually from
# history loading.
incrementers: dict[int, trio.CancelScope] = {}
# holds all the ``tractor.Context`` remote subscriptions for
# a particular sample period increment event: all subscribers are
# notified on a step.
subscribers: dict[int, tractor.Context] = {}
async def increment_ohlc_buffer(
delay_s: int,
task_status: TaskStatus[trio.CancelScope] = trio.TASK_STATUS_IGNORED,
):
'''
Task which inserts new bars into the provided shared memory array
every ``delay_s`` seconds.
This task fulfills 2 purposes:
- it takes the subscribed set of shm arrays and increments them
on a common time period
- it broadcasts this increment "signal" message to other actor
subscribers
Note that if **no** actor has initiated this task then **none** of
the underlying buffers will actually be incremented.
'''
# # wait for brokerd to signal we should start sampling
# await shm_incrementing(shm_token['shm_name']).wait()
# TODO: right now we'll spin printing bars if the last time stamp is
# before a large period of no market activity. Likely the best way
# to solve this is to make this task aware of the instrument's
# tradable hours?
# adjust delay to compensate for trio processing time
total_s = 0 # total seconds counted
lowest = min(sampler.ohlcv_shms.keys())
lowest_shm = sampler.ohlcv_shms[lowest][0]
ad = lowest - 0.001
with trio.CancelScope() as cs:
# register this time period step as active
sampler.incrementers[delay_s] = cs
task_status.started(cs)
while True:
# TODO: do we want to support dynamically
# adding a "lower" lowest increment period?
await trio.sleep(ad)
total_s += lowest
# increment all subscribed shm arrays
# TODO:
# - this in ``numba``
# - just lookup shms for this step instead of iterating?
for delay_s, shms in sampler.ohlcv_shms.items():
if total_s % delay_s != 0:
continue
# TODO: ``numba`` this!
for shm in shms:
# TODO: in theory we could make this faster by copying the
# "last" readable value into the underlying larger buffer's
# next value and then incrementing the counter instead of
# using ``.push()``?
# append new entry to buffer thus "incrementing" the bar
array = shm.array
last = array[-1:][shm._write_fields].copy()
# (index, t, close) = last[0][['index', 'time', 'close']]
(t, close) = last[0][['time', 'close']]
# this copies non-std fields (eg. vwap) from the last datum
last[
['time', 'volume', 'open', 'high', 'low', 'close']
][0] = (t + delay_s, 0, close, close, close, close)
# write to the buffer
shm.push(last)
await broadcast(delay_s, shm=lowest_shm)
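# standalone illustration of the modulo period-gating above: with shm
# groups registered at 1s and 60s steps, every 1s pass increments the
# 1s buffers but only every 60th pass also increments the 60s buffers
example_fired = {1: 0, 60: 0}
example_total_s = 0
for _ in range(120):
    example_total_s += 1  # the lowest registered period
    for _delay_s in (1, 60):
        if example_total_s % _delay_s == 0:
            example_fired[_delay_s] += 1
assert example_fired == {1: 120, 60: 2}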
async def broadcast(
delay_s: int,
shm: Optional[ShmArray] = None,
) -> None:
'''
Broadcast the given ``shm: ShmArray``'s buffer index step to any
subscribers for a given sample period.
The sent msg will include the first and last index which slice into
the buffer's non-empty data.
'''
subs = sampler.subscribers.get(delay_s, ())
first = last = -1
if shm is None:
periods = sampler.ohlcv_shms.keys()
# if this is an update triggered by a history update there
# might not actually be any sampling bus setup since there's
# no "live feed" active yet.
if periods:
lowest = min(periods)
shm = sampler.ohlcv_shms[lowest][0]
first = shm._first.value
last = shm._last.value
for stream in subs:
try:
await stream.send({
'first': first,
'last': last,
'index': last,
})
except (
trio.BrokenResourceError,
trio.ClosedResourceError
):
log.error(
f'{stream._ctx.chan.uid} dropped connection'
)
try:
subs.remove(stream)
except ValueError:
log.warning(
f'{stream._ctx.chan.uid} sub already removed!?'
)
@tractor.context
async def iter_ohlc_periods(
ctx: tractor.Context,
delay_s: int,
) -> None:
'''
Subscribe to OHLC sampling "step" events: when the time
aggregation period increments, this event stream emits an index
event.
'''
# add our subscription
subs = sampler.subscribers.setdefault(delay_s, [])
await ctx.started()
async with ctx.open_stream() as stream:
subs.append(stream)
try:
# stream and block until cancelled
await trio.sleep_forever()
finally:
try:
subs.remove(stream)
except ValueError:
log.error(
f'iOHLC step stream was already dropped {ctx.chan.uid}?'
)
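# NOTE (editor's sketch): consumer-side subscription, assuming an
# already-open ``tractor`` portal to the actor running this endpoint:
#
# async with (
#     portal.open_context(iter_ohlc_periods, delay_s=1) as (ctx, _),
#     ctx.open_stream() as stream,
# ):
#     async for msg in stream:
#         first, last = msg['first'], msg['last']
#         # redraw graphics, update views, etc.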
async def sample_and_broadcast(
bus: _FeedsBus, # noqa
shm: ShmArray,
quote_stream: trio.abc.ReceiveChannel,
brokername: str,
sum_tick_vlm: bool = True,
) -> None:
log.info("Started shared mem bar writer")
overruns = Counter()
# iterate stream delivered by broker
async for quotes in quote_stream:
# TODO: ``numba`` this!
for broker_symbol, quote in quotes.items():
# TODO: in theory you can send the IPC msg *before* writing
# to the sharedmem array to decrease latency, however, that
# will require at least some way to prevent task switching
# at the yield such that the array write isn't delayed while
# another consumer is serviced..
# start writing the shm buffer with appropriate
# trade data
# TODO: we should probably not write every single
# value to an OHLC sample stream XD
# for a tick stream sure.. but this is excessive..
ticks = quote['ticks']
for tick in ticks:
ticktype = tick['type']
# write trade events to shm last OHLC sample
if ticktype in ('trade', 'utrade'):
last = tick['price']
# update last entry
# benchmarked in the 4-5 us range
o, high, low, v = shm.array[-1][
['open', 'high', 'low', 'volume']
]
new_v = tick.get('size', 0)
if v == 0 and new_v:
# no trades for this bar yet so the open
# is also the close/last trade price
o = last
if sum_tick_vlm:
volume = v + new_v
else:
# presume backend takes care of summing
# its own vlm
volume = quote['volume']
shm.array[[
'open',
'high',
'low',
'close',
'bar_wap', # can be optionally provided
'volume',
]][-1] = (
o,
max(high, last),
min(low, last),
last,
quote.get('bar_wap', 0),
volume,
)
# XXX: we need to be very cautious here that no
# context-channel is left lingering which doesn't have
# a far end receiver actor-task. In such a case you can
# end up triggering backpressure which will
# eventually block this producer end of the feed and
# thus other consumers still attached.
subs: list[
tuple[
Union[tractor.MsgStream, trio.MemorySendChannel],
tractor.Context,
Optional[float], # tick throttle in Hz
]
] = bus._subscribers[broker_symbol.lower()]
# NOTE: by default the broker backend doesn't append
# it's own "name" into the fqsn schema (but maybe it
# should?) so we have to manually generate the correct
# key here.
bsym = f'{broker_symbol}.{brokername}'
lags: int = 0
for (stream, ctx, tick_throttle) in subs:
try:
with trio.move_on_after(0.2) as cs:
if tick_throttle:
# this is a send mem chan that likely
# pushes to the ``uniform_rate_send()`` below.
try:
stream.send_nowait(
(bsym, quote)
)
except trio.WouldBlock:
chan = ctx.chan
if ctx:
log.warning(
f'Feed overrun {bus.brokername} ->'
f'{chan.uid} !!!'
)
else:
key = id(stream)
overruns[key] += 1
log.warning(
f'Feed overrun {broker_symbol}'
f'@{bus.brokername} -> '
f'feed @ {tick_throttle} Hz'
)
if overruns[key] > 6:
# TODO: should we check for the
# context being cancelled? this
# could happen but the
# channel-ipc-pipe is still up.
if not chan.connected():
log.warning(
'Dropping broken consumer:\n'
f'{broker_symbol}:'
f'{ctx.cid}@{chan.uid}'
)
await stream.aclose()
raise trio.BrokenResourceError
else:
log.warning(
'Feed getting overrun bro!\n'
f'{broker_symbol}:'
f'{ctx.cid}@{chan.uid}'
)
continue
else:
await stream.send(
{bsym: quote}
)
if cs.cancelled_caught:
lags += 1
if lags > 10:
await tractor.breakpoint()
except (
trio.BrokenResourceError,
trio.ClosedResourceError,
trio.EndOfChannel,
):
chan = ctx.chan
if ctx:
log.warning(
'Dropped `brokerd`-quotes-feed connection:\n'
f'{broker_symbol}:'
f'{ctx.cid}@{chan.uid}'
)
if tick_throttle:
assert stream._closed
# XXX: do we need to deregister here
# if it's done in the feed bus code?
# so far seems like no since this should all
# be single-threaded. Doing it anyway though
# since there seems to be some kinda race..
try:
subs.remove((stream, ctx, tick_throttle))
except ValueError:
log.error(
f'Stream was already removed from subs!?\n'
f'{broker_symbol}:'
f'{ctx.cid}@{chan.uid}'
)
# TODO: a less naive throttler, here's some snippets:
# token bucket by njs:
# https://gist.github.com/njsmith/7ea44ec07e901cb78ebe1dd8dd846cb9
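# NOTE (editor's sketch): that gist boils down to something like the
# following minimal bucket (burst size and clock choice illustrative):
#
# class TokenBucket:
#     def __init__(self, rate: float, burst: float = 1):
#         self.rate, self.burst = rate, burst
#         self._tokens = burst
#         self._last = time.monotonic()
#
#     async def acquire(self) -> None:
#         while True:
#             now = time.monotonic()
#             # refill proportionally to elapsed time, capped at burst
#             self._tokens = min(
#                 self.burst,
#                 self._tokens + (now - self._last) * self.rate,
#             )
#             self._last = now
#             if self._tokens >= 1:
#                 self._tokens -= 1
#                 return
#             await trio.sleep((1 - self._tokens) / self.rate)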
async def uniform_rate_send(
rate: float,
quote_stream: trio.abc.ReceiveChannel,
stream: tractor.MsgStream,
task_status: TaskStatus = trio.TASK_STATUS_IGNORED,
) -> None:
# TODO: compute the approx overhead latency per cycle
left_to_sleep = throttle_period = 1/rate - 0.000616
# send cycle state
first_quote = last_quote = None
last_send = time.time()
diff = 0
task_status.started()
while True:
# compute the remaining time to sleep for this throttled cycle
left_to_sleep = throttle_period - diff
if left_to_sleep > 0:
with trio.move_on_after(left_to_sleep) as cs:
try:
sym, last_quote = await quote_stream.receive()
except trio.EndOfChannel:
log.exception(f"feed for {stream} ended?")
break
diff = time.time() - last_send
if not first_quote:
first_quote = last_quote
if (throttle_period - diff) > 0:
# received a quote but the send cycle period hasn't yet
# expired; we aren't supposed to send yet so append
# to the tick frame.
# append quotes since last iteration into the last quote's
# tick array/buffer.
ticks = last_quote.get('ticks')
# XXX: idea for frame type data structure we could
# use on the wire instead of a simple list?
# frames = {
# 'index': ['type_a', 'type_c', 'type_n', 'type_n'],
# 'type_a': [tick0, tick1, tick2, .., tickn],
# 'type_b': [tick0, tick1, tick2, .., tickn],
# 'type_c': [tick0, tick1, tick2, .., tickn],
# ...
# 'type_n': [tick0, tick1, tick2, .., tickn],
# }
# TODO: once we decide to get fancy really we should
# have a shared mem tick buffer that is just
# continually filled and the UI just reads from it
# at its display rate.
if ticks:
first_quote['ticks'].extend(ticks)
# send cycle isn't due yet so continue waiting
continue
if cs.cancelled_caught:
# 2 cases:
# no quote has arrived yet this cycle so wait for
# the next one.
if not first_quote:
# if no last quote was received since the last send
# cycle **AND** if we timed out waiting for a most
# recent quote **but** the throttle cycle is now due to
# be sent -> we want to immediately send the next
# received quote ASAP.
sym, first_quote = await quote_stream.receive()
# we have a quote already so send it now.
# measured_rate = 1 / (time.time() - last_send)
# log.info(
# f'`{sym}` throttled send hz: {round(measured_rate, ndigits=1)}'
# )
# TODO: now if only we could sync this to the display
# rate timing exactly lul
try:
await stream.send({sym: first_quote})
except (
# NOTE: any of these can be raised by ``tractor``'s IPC
# transport-layer and we want to be highly resilient
# to consumers which crash or lose network connection.
# I.e. we **DO NOT** want to crash and propagate up to
# ``pikerd`` these kinds of errors!
trio.ClosedResourceError,
trio.BrokenResourceError,
ConnectionResetError,
):
# if the feed consumer goes down then drop
# out of this rate limiter
log.warning(f'{stream} closed')
await stream.aclose()
return
# reset send cycle state
first_quote = last_quote = None
diff = 0
last_send = time.time()

View File

@ -1,684 +0,0 @@
# piker: trading gear for hackers
# Copyright (C) Tyler Goodlet (in stewardship for piker0)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""
NumPy compatible shared memory buffers for real-time IPC streaming.
"""
from __future__ import annotations
from sys import byteorder
import time
from typing import Optional
from multiprocessing.shared_memory import SharedMemory, _USE_POSIX
if _USE_POSIX:
from _posixshmem import shm_unlink
import tractor
import numpy as np
from pydantic import BaseModel
from numpy.lib import recfunctions as rfn
from ..log import get_logger
from ._source import base_iohlc_dtype
log = get_logger(__name__)
# how much is probably dependent on lifestyle
_secs_in_day = int(60 * 60 * 24)
# we try for a buncha times, but only on a run-every-other-day kinda week.
_days_worth = 16
_default_size = _days_worth * _secs_in_day
# where to start the new data append index
_rt_buffer_start = int((_days_worth - 1) * _secs_in_day)
def cuckoff_mantracker():
from multiprocessing import resource_tracker as mantracker
# Tell the "resource tracker" thing to fuck off.
class ManTracker(mantracker.ResourceTracker):
def register(self, name, rtype):
pass
def unregister(self, name, rtype):
pass
def ensure_running(self):
pass
# "know your land and know your prey"
# https://www.dailymotion.com/video/x6ozzco
mantracker._resource_tracker = ManTracker()
mantracker.register = mantracker._resource_tracker.register
mantracker.ensure_running = mantracker._resource_tracker.ensure_running
# ensure_running = mantracker._resource_tracker.ensure_running
mantracker.unregister = mantracker._resource_tracker.unregister
mantracker.getfd = mantracker._resource_tracker.getfd
cuckoff_mantracker()
class SharedInt:
"""Wrapper around a single entry shared memory array which
holds an ``int`` value used as an index counter.
"""
def __init__(
self,
shm: SharedMemory,
) -> None:
self._shm = shm
@property
def value(self) -> int:
return int.from_bytes(self._shm.buf, byteorder)
@value.setter
def value(self, value) -> None:
self._shm.buf[:] = value.to_bytes(self._shm.size, byteorder)
def destroy(self) -> None:
if _USE_POSIX:
# We manually unlink to bypass all the "resource tracker"
# nonsense meant for non-SC systems.
name = self._shm.name
try:
shm_unlink(name)
except FileNotFoundError:
# might be a teardown race here?
log.warning(f'Shm for {name} already unlinked?')
class _Token(BaseModel):
'''
Internal representation of a shared memory "token"
which can be used to key a system wide posix shm entry.
'''
class Config:
frozen = True
shm_name: str # this serves as a "key" value
shm_first_index_name: str
shm_last_index_name: str
dtype_descr: tuple
@property
def dtype(self) -> np.dtype:
return np.dtype(list(map(tuple, self.dtype_descr))).descr
def as_msg(self):
return self.dict()
@classmethod
def from_msg(cls, msg: dict) -> _Token:
if isinstance(msg, _Token):
return msg
msg['dtype_descr'] = tuple(map(tuple, msg['dtype_descr']))
return _Token(**msg)
# TODO: this api?
# _known_tokens = tractor.ActorVar('_shm_tokens', {})
# _known_tokens = tractor.ContextStack('_known_tokens', )
# _known_tokens = trio.RunVar('shms', {})
# process-local store of keys to tokens
_known_tokens = {}
def get_shm_token(key: str) -> _Token:
"""Convenience func to check if a token
for the provided key is known by this process.
"""
return _known_tokens.get(key)
def _make_token(
key: str,
dtype: Optional[np.dtype] = None,
) -> _Token:
'''
Create a serializable token that can be used
to access a shared array.
'''
dtype = base_iohlc_dtype if dtype is None else dtype
return _Token(
shm_name=key,
shm_first_index_name=key + "_first",
shm_last_index_name=key + "_last",
dtype_descr=np.dtype(dtype).descr
)
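# NOTE (editor's sketch): tokens round-trip through msg dicts, e.g.
# (the key below is illustrative):
#
# token = _make_token('btcusdt.binance.hist')
# assert _Token.from_msg(token.as_msg()) == token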
class ShmArray:
'''
A shared memory ``numpy`` (compatible) array API.
An underlying shared memory buffer is allocated based on
a user specified ``numpy.ndarray``. This fixed size array
can be read and written to by pushing data both onto the "front"
or "back" of a set index range. The indexes for the "first" and
"last" index are themselves stored in shared memory (accessed via
``SharedInt`` interfaces) values such that multiple processes can
interact with the same array using a synchronized-index.
'''
def __init__(
self,
shmarr: np.ndarray,
first: SharedInt,
last: SharedInt,
shm: SharedMemory,
# readonly: bool = True,
) -> None:
self._array = shmarr
# indexes for the first and last positions corresponding
# to filled data
self._first = first
self._last = last
self._len = len(shmarr)
self._shm = shm
self._post_init: bool = False
# pushing data does not write the index (aka primary key)
dtype = shmarr.dtype
if dtype.fields:
self._write_fields = list(shmarr.dtype.fields.keys())[1:]
else:
self._write_fields = None
# TODO: ringbuf api?
@property
def _token(self) -> _Token:
return _Token(
shm_name=self._shm.name,
shm_first_index_name=self._first._shm.name,
shm_last_index_name=self._last._shm.name,
dtype_descr=tuple(self._array.dtype.descr),
)
@property
def token(self) -> dict:
"""Shared memory token that can be serialized and used by
another process to attach to this array.
"""
return self._token.as_msg()
@property
def index(self) -> int:
return self._last.value % self._len
@property
def array(self) -> np.ndarray:
'''
Return an up-to-date ``np.ndarray`` view of the
so-far-written data to the underlying shm buffer.
'''
a = self._array[self._first.value:self._last.value]
# first, last = self._first.value, self._last.value
# a = self._array[first:last]
# TODO: eventually comment this once we've not seen it in the
# wild in a long time..
# XXX: race where first/last indexes cause a reader
# to load an empty array..
if len(a) == 0 and self._post_init:
raise RuntimeError('Empty array race condition hit!?')
# breakpoint()
return a
def ustruct(
self,
fields: Optional[list[str]] = None,
# type that all field values will be cast to
# in the returned view.
common_dtype: np.dtype = np.float64,
) -> np.ndarray:
array = self._array
if fields:
selection = array[fields]
# fcount = len(fields)
else:
selection = array
# fcount = len(array.dtype.fields)
# XXX: manual ``.view()`` attempt that also doesn't work.
# uview = selection.view(
# dtype='<f16',
# ).reshape(-1, 4, order='A')
# assert len(selection) == len(uview)
u = rfn.structured_to_unstructured(
selection,
# dtype=float,
copy=True,
)
# unstruct = np.ndarray(u.shape, dtype=a.dtype, buffer=shm.buf)
# array[:] = a[:]
return u
# return ShmArray(
# shmarr=u,
# first=self._first,
# last=self._last,
# shm=self._shm
# )
def last(
self,
length: int = 1,
) -> np.ndarray:
'''
Return the last ``length``'s worth of ("row") entries from the
array.
'''
return self.array[-length:]
def push(
self,
data: np.ndarray,
field_map: Optional[dict[str, str]] = None,
prepend: bool = False,
update_first: bool = True,
start: Optional[int] = None,
) -> int:
'''
Ring buffer like "push" to append data
into the buffer and return updated "last" index.
NB: no actual ring logic yet to give a "loop around" on overflow
condition, lel.
'''
length = len(data)
if prepend:
index = (start or self._first.value) - length
if index < 0:
raise ValueError(
f'Array size of {self._len} was overrun during prepend.\n'
f'You have passed {abs(index)} too many datums.'
)
else:
index = start if start is not None else self._last.value
end = index + length
if field_map:
src_names, dst_names = zip(*field_map.items())
else:
dst_names = src_names = self._write_fields
try:
self._array[
list(dst_names)
][index:end] = data[list(src_names)][:]
# NOTE: there was a race here between updating
# the first and last indices and when the next reader
# tries to access ``.array`` (which due to the index
# overlap will be empty). Pretty sure we've fixed it now
# but leaving this here as a reminder.
if prepend and update_first and length:
assert index < self._first.value
if (
index < self._first.value
and update_first
):
assert prepend, 'prepend=True not passed but index decreased?'
self._first.value = index
elif not prepend:
self._last.value = end
self._post_init = True
return end
except ValueError as err:
if field_map:
raise
# should raise if diff detected
self.diff_err_fields(data)
raise err
def diff_err_fields(
self,
data: np.ndarray,
) -> None:
# reraise with any field discrepancy
our_fields, their_fields = (
set(self._array.dtype.fields),
set(data.dtype.fields),
)
only_in_ours = our_fields - their_fields
only_in_theirs = their_fields - our_fields
if only_in_ours:
raise TypeError(
f"Input array is missing field(s): {only_in_ours}"
)
elif only_in_theirs:
raise TypeError(
f"Input array has unknown field(s): {only_in_theirs}"
)
# TODO: support "silent" prepends that don't update ._first.value?
def prepend(
self,
data: np.ndarray,
) -> int:
end = self.push(data, prepend=True)
assert end
return end
def close(self) -> None:
self._first._shm.close()
self._last._shm.close()
self._shm.close()
def destroy(self) -> None:
if _USE_POSIX:
# We manually unlink to bypass all the "resource tracker"
# nonsense meant for non-SC systems.
shm_unlink(self._shm.name)
self._first.destroy()
self._last.destroy()
def flush(self) -> None:
# TODO: flush to storage backend like markestore?
...
def open_shm_array(
key: Optional[str] = None,
size: int = _default_size,
dtype: Optional[np.dtype] = None,
readonly: bool = False,
) -> ShmArray:
'''Open a shared memory ``numpy`` array using the standard library.
This call unlinks (aka permanently destroys) the buffer on teardown
and thus should be used from the parent-most accessor (process).
'''
# create new shared mem segment for which we
# have write permission
a = np.zeros(size, dtype=dtype)
a['index'] = np.arange(len(a))
shm = SharedMemory(
name=key,
create=True,
size=a.nbytes
)
array = np.ndarray(
a.shape,
dtype=a.dtype,
buffer=shm.buf
)
array[:] = a[:]
array.setflags(write=int(not readonly))
token = _make_token(
key=key,
dtype=dtype
)
# create single entry arrays for storing the first and last indices
first = SharedInt(
shm=SharedMemory(
name=token.shm_first_index_name,
create=True,
size=4, # std int
)
)
last = SharedInt(
shm=SharedMemory(
name=token.shm_last_index_name,
create=True,
size=4, # std int
)
)
# start the "real-time" updated section after 3-days worth of 1s
# sampled OHLC. this allows appending up to a days worth from
# tick/quote feeds before having to flush to a (tsdb) storage
# backend, and looks something like,
# -------------------------
# | | i
# _________________________
# <-------------> <------->
# history real-time
#
# Once fully "prepended", the history section will leave the
# ``ShmArray._first.value: int = 0`` and the yet-to-be written
# real-time section will start at ``ShmArray.index: int``.
# this sets the index near the end of the buffer
# leaving a "day's worth of second samples" for the real-time
# section.
last.value = first.value = _rt_buffer_start
shmarr = ShmArray(
array,
first,
last,
shm,
)
assert shmarr._token == token
_known_tokens[key] = shmarr.token
# "unlink" created shm on process teardown by
# pushing teardown calls onto actor context stack
tractor._actor._lifetime_stack.callback(shmarr.close)
tractor._actor._lifetime_stack.callback(shmarr.destroy)
return shmarr
def attach_shm_array(
token: tuple[str, str, tuple[str, str]],
size: int = _default_size,
readonly: bool = True,
) -> ShmArray:
'''
Attach to an existing shared memory array previously
created by another process using ``open_shared_array``.
No new shared mem is allocated but wrapper types for read/write
access are constructed.
'''
token = _Token.from_msg(token)
key = token.shm_name
if key in _known_tokens:
assert _Token.from_msg(_known_tokens[key]) == token, "WTF"
# XXX: ugh, looks like due to the ``shm_open()`` C api we can't
# actually place files in a subdir, see discussion here:
# https://stackoverflow.com/a/11103289
# attach to array buffer and view as per dtype
_err: Optional[Exception] = None
for _ in range(3):
try:
shm = SharedMemory(
name=key,
create=False,
)
break
except OSError as oserr:
_err = oserr
time.sleep(0.1)
else:
if _err:
raise _err
shmarr = np.ndarray(
(size,),
dtype=token.dtype,
buffer=shm.buf
)
shmarr.setflags(write=int(not readonly))
first = SharedInt(
shm=SharedMemory(
name=token.shm_first_index_name,
create=False,
size=4, # std int
),
)
last = SharedInt(
shm=SharedMemory(
name=token.shm_last_index_name,
create=False,
size=4, # std int
),
)
# make sure we can read
first.value
sha = ShmArray(
shmarr,
first,
last,
shm,
)
# read test
sha.array
# Stash key -> token knowledge for future queries
# via ``maybe_open_shm_array()`` but only after we know
# we can attach.
if key not in _known_tokens:
_known_tokens[key] = token
# "close" attached shm on process teardown
tractor._actor._lifetime_stack.callback(sha.close)
return sha
def maybe_open_shm_array(
key: str,
dtype: Optional[np.dtype] = None,
**kwargs,
) -> tuple[ShmArray, bool]:
'''
Attempt to attach to a shared memory block using a "key" lookup
to registered blocks in the user's overall "system" registry
(presumes you don't have the block's explicit token).
This function is meant to solve the problem of discovering whether
a shared array token has been allocated or discovered by the actor
running in **this** process. Systems where multiple actors may seek
to access a common block can use this function to attempt to acquire
a token as discovered by the actors who have previously stored
a "key" -> ``_Token`` map in an actor local (aka python global)
variable.
If you know the explicit ``_Token`` for your memory segment instead
use ``attach_shm_array``.
'''
try:
# see if we already know this key
token = _known_tokens[key]
return attach_shm_array(token=token, **kwargs), False
except KeyError:
log.warning(f"Could not find {key} in shms cache")
if dtype:
token = _make_token(key, dtype)
try:
return attach_shm_array(token=token, **kwargs), False
except FileNotFoundError:
log.warning(f"Could not attach to shm with token {token}")
# This actor does not know about memory
# associated with the provided "key".
# Attempt to open a block and expect
# to fail if a block has been allocated
# on the OS by someone else.
return open_shm_array(key=key, dtype=dtype, **kwargs), True
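# NOTE (editor's sketch): the intended cross-actor flow; the key name
# is illustrative and the token must be shipped over IPC by the caller:
#
# in the writer (parent-most) actor:
#   shm, opened = maybe_open_shm_array(
#       'btcusdt.binance.hist',
#       dtype=base_iohlc_dtype,
#   )
#   token = shm.token  # serializable msg-dict; send to readers
#
# in a reader actor:
#   shm = attach_shm_array(token=token, readonly=True)
#   last_rows = shm.last(100)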
def try_read(
array: np.ndarray
) -> Optional[np.ndarray]:
'''
Try to read the last row from a shared mem array, returning ``None``
if the read produces a zero-length array result.
Can be used to check for backfilling race conditions where an array
is currently being (re-)written by a writer actor but the reader is
unaware and reads during the window where the first and last indexes
are being updated.
'''
try:
return array[-1]
except IndexError:
# XXX: race condition with backfilling shm.
#
# the underlying issue is that a backfill (aka prepend) and subsequent
# shm array first/last index update could result in an empty array
# read here since the indices may be updated in such a way that
# a read delivers an empty array (though it seems like we
# *should* be able to prevent that?). also, as an alternative and
# something we need anyway, maybe there should be some kind of
# signal that a prepend is taking place so this consumer can
# respond (eg. redrawing graphics) accordingly.
# the array read was empty
return None
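# NOTE (editor's sketch): typical guarded read from a graphics consumer:
#
# row = try_read(shm.array)
# if row is None:
#     return  # skip this render cycle; a (re-)write is in progress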

View File

@ -1,267 +0,0 @@
# piker: trading gear for hackers
# Copyright (C) 2018-present Tyler Goodlet (in stewardship for piker0)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""
numpy data source conversion helpers.
"""
from __future__ import annotations
from typing import Any
import decimal
from bidict import bidict
import numpy as np
from pydantic import BaseModel
# from numba import from_dtype
ohlc_fields = [
('time', float),
('open', float),
('high', float),
('low', float),
('close', float),
('volume', float),
('bar_wap', float),
]
ohlc_with_index = ohlc_fields.copy()
ohlc_with_index.insert(0, ('index', int))
# our minimum structured array layout for ohlc data
base_iohlc_dtype = np.dtype(ohlc_with_index)
base_ohlc_dtype = np.dtype(ohlc_fields)
# TODO: for now need to construct this manually for readonly arrays, see
# https://github.com/numba/numba/issues/4511
# numba_ohlc_dtype = from_dtype(base_ohlc_dtype)
# map time frame "keys" to seconds values
tf_in_1s = bidict({
1: '1s',
60: '1m',
60*5: '5m',
60*15: '15m',
60*30: '30m',
60*60: '1h',
60*60*24: '1d',
})
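# NOTE (editor): being a ``bidict`` this maps both ways, e.g.
# ``tf_in_1s[60] == '1m'`` and ``tf_in_1s.inverse['1m'] == 60``.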
def mk_fqsn(
provider: str,
symbol: str,
) -> str:
'''
Generate a "fully qualified symbol name" which is
a reverse-hierarchical cross broker/provider symbol
'''
return '.'.join([symbol, provider]).lower()
def float_digits(
value: float,
) -> int:
if value == 0:
return 0
return int(-decimal.Decimal(str(value)).as_tuple().exponent)
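# NOTE (editor): e.g. ``float_digits(0.001) == 3`` and
# ``float_digits(0.25) == 2``; used to infer display precision
# from a tick size.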
def ohlc_zeros(length: int) -> np.ndarray:
"""Construct an OHLC field formatted structarray.
For "why a structarray" see here: https://stackoverflow.com/a/52443038
Bottom line, they're faster than ``np.recarray``.
"""
return np.zeros(length, dtype=base_ohlc_dtype)
def unpack_fqsn(fqsn: str) -> tuple[str, str, str]:
'''
Unpack a fully-qualified-symbol-name to ``tuple``.
'''
venue = ''
suffix = ''
# TODO: probably reverse the order of all this XD
tokens = fqsn.split('.')
if len(tokens) < 3:
# probably crypto
symbol, broker = tokens
return (
broker,
symbol,
'',
)
elif len(tokens) > 3:
symbol, venue, suffix, broker = tokens
else:
symbol, venue, broker = tokens
suffix = ''
# head, _, broker = fqsn.rpartition('.')
# symbol, _, suffix = head.rpartition('.')
return (
broker,
'.'.join([symbol, venue]),
suffix,
)
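# NOTE (editor's sketch): round-trip examples per the rules above
# (symbols illustrative):
#
# assert unpack_fqsn('btcusdt.binance') == ('binance', 'btcusdt', '')
# assert unpack_fqsn('mnq.globex.ib') == ('ib', 'mnq.globex', '')
# assert unpack_fqsn('mnq.globex.20220617.ib') == (
#     'ib', 'mnq.globex', '20220617',
# )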
class Symbol(BaseModel):
'''
I guess this is some kinda container thing for dealing with
all the different meta-data formats from brokers?
'''
key: str
tick_size: float = 0.01
lot_tick_size: float = 0.0 # "volume" precision as min step value
tick_size_digits: int = 2
lot_size_digits: int = 0
suffix: str = ''
broker_info: dict[str, dict[str, Any]] = {}
# specifies a "class" of financial instrument
# ex. stock, future, option, bond etc.
# @validate_arguments
@classmethod
def from_broker_info(
cls,
broker: str,
symbol: str,
info: dict[str, Any],
suffix: str = '',
) -> Symbol:
tick_size = info.get('price_tick_size', 0.01)
lot_tick_size = info.get('lot_tick_size', 0.0)
return Symbol(
key=symbol,
tick_size=tick_size,
lot_tick_size=lot_tick_size,
tick_size_digits=float_digits(tick_size),
lot_size_digits=float_digits(lot_tick_size),
suffix=suffix,
broker_info={broker: info},
)
@classmethod
def from_fqsn(
cls,
fqsn: str,
info: dict[str, Any],
) -> Symbol:
broker, key, suffix = unpack_fqsn(fqsn)
return cls.from_broker_info(
broker,
key,
info=info,
suffix=suffix,
)
@property
def type_key(self) -> str:
return list(self.broker_info.values())[0]['asset_type']
@property
def brokers(self) -> list[str]:
return list(self.broker_info.keys())
def nearest_tick(self, value: float) -> float:
'''
Return the nearest tick value based on minimum increment.
'''
mult = 1 / self.tick_size
return round(value * mult) / mult
def front_feed(self) -> tuple[str, str]:
'''
Return the "current" feed key for this symbol.
(i.e. the broker + symbol key in a tuple).
'''
return (
list(self.broker_info.keys())[0],
self.key,
)
def tokens(self) -> tuple[str]:
broker, key = self.front_feed()
if self.suffix:
return (key, self.suffix, broker)
else:
return (key, broker)
def front_fqsn(self) -> str:
'''
fqsn = "fully qualified symbol name"
Basically the idea here is that all client-ish code (aka programs/actors
that ask the provider agnostic layers in the stack for data) should be
able to tell which backend / venue / derivative each data feed/flow is
from by an explicit string key of the current form:
<instrumentname>.<venue>.<suffixwithmetadata>.<brokerbackendname>
TODO: I have thoughts that we should actually change this to be
more like an "attr lookup" (like how the web should have done
urls, but marketing peeps ruined it etc. etc.):
<broker>.<venue>.<instrumentname>.<suffixwithmetadata>
'''
tokens = self.tokens()
fqsn = '.'.join(tokens)
return fqsn
def iterfqsns(self) -> list[str]:
keys = []
for broker in self.broker_info.keys():
fqsn = mk_fqsn(broker, self.key)
if self.suffix:
fqsn += f'.{self.suffix}'
keys.append(fqsn)
return keys
def _nan_to_closest_num(array: np.ndarray):
"""Return interpolated values instead of NaN.
"""
for col in ['open', 'high', 'low', 'close']:
mask = np.isnan(array[col])
if not mask.any():
continue
array[col][mask] = np.interp(
np.flatnonzero(mask), np.flatnonzero(~mask), array[col][~mask]
)

View File

@ -1,152 +0,0 @@
# piker: trading gear for hackers
# Copyright (C) Tyler Goodlet (in stewardship for piker0)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""
ToOlS fOr CoPInG wITh "tHE wEB" protocols.
"""
from contextlib import asynccontextmanager, AsyncExitStack
from types import ModuleType
from typing import Any, Callable, AsyncGenerator
import json
import trio
import trio_websocket
from trio_websocket._impl import (
ConnectionClosed,
DisconnectionTimeout,
ConnectionRejected,
HandshakeError,
ConnectionTimeout,
)
from ..log import get_logger
log = get_logger(__name__)
class NoBsWs:
"""Make ``trio_websocket`` sockets stay up no matter the bs.
"""
recon_errors = (
ConnectionClosed,
DisconnectionTimeout,
ConnectionRejected,
HandshakeError,
ConnectionTimeout,
)
def __init__(
self,
url: str,
token: str,
stack: AsyncExitStack,
fixture: Callable,
serializer: ModuleType = json,
):
self.url = url
self.token = token
self.fixture = fixture
self._stack = stack
self._ws: 'WebSocketConnection' = None # noqa
async def _connect(
self,
tries: int = 1000,
) -> None:
while True:
try:
await self._stack.aclose()
except (DisconnectionTimeout, RuntimeError):
await trio.sleep(0.5)
else:
break
last_err = None
for i in range(tries):
try:
self._ws = await self._stack.enter_async_context(
trio_websocket.open_websocket_url(self.url)
)
# rerun user code fixture
if self.token == '':
ret = await self._stack.enter_async_context(
self.fixture(self)
)
else:
ret = await self._stack.enter_async_context(
self.fixture(self, self.token)
)
assert ret is None
log.info(f'Connection success: {self.url}')
return self._ws
except self.recon_errors as err:
last_err = err
log.error(
f'{self} connection bail with '
f'{type(err)}...retry attempt {i}'
)
await trio.sleep(0.5)
continue
else:
log.exception('ws connection fail...')
raise last_err
async def send_msg(
self,
data: Any,
) -> None:
while True:
try:
return await self._ws.send_message(json.dumps(data))
except self.recon_errors:
await self._connect()
async def recv_msg(
self,
) -> Any:
while True:
try:
return json.loads(await self._ws.get_message())
except self.recon_errors:
await self._connect()
@asynccontextmanager
async def open_autorecon_ws(
url: str,
# TODO: proper type annot smh
fixture: Callable,
# used for authenticated websockets
token: str = '',
) -> AsyncGenerator[NoBsWs, None]:
"""Apparently we can QoS for all sorts of reasons..so catch em.
"""
async with AsyncExitStack() as stack:
ws = NoBsWs(url, token, stack, fixture=fixture)
await ws._connect()
try:
yield ws
finally:
await stack.aclose()
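# NOTE (editor's sketch): typical use with a (re)subscription fixture;
# the url and subscribe payload are illustrative of a json feed:
#
# @asynccontextmanager
# async def subscribe(ws: NoBsWs):
#     await ws.send_msg({'event': 'subscribe', 'pair': ['XBT/USD']})
#     yield  # fixture re-runs on every reconnect
#
# async with open_autorecon_ws(
#     'wss://ws.kraken.com/',
#     fixture=subscribe,
# ) as ws:
#     msg = await ws.recv_msg()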

View File

@ -1,196 +0,0 @@
# piker: trading gear for hackers
# Copyright (C) 2018-present Tyler Goodlet (in stewardship of piker0)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""
marketstore cli.
"""
from functools import partial
from pprint import pformat
from anyio_marketstore import open_marketstore_client
import trio
import tractor
import click
import numpy as np
from .marketstore import (
get_client,
# stream_quotes,
ingest_quote_stream,
# _url,
_tick_tbk_ids,
mk_tbk,
)
from ..cli import cli
from .. import watchlists as wl
from ..log import get_logger
from ._sharedmem import (
maybe_open_shm_array,
)
from ._source import (
base_iohlc_dtype,
)
log = get_logger(__name__)
@cli.command()
@click.option(
'--url',
default='ws://localhost:5993/ws',
help='HTTP URL of marketstore instance'
)
@click.argument('names', nargs=-1)
@click.pass_obj
def ms_stream(
config: dict,
names: list[str],
url: str,
) -> None:
'''
Connect to a marketstore time bucket stream for (a set of) symbols(s)
and print to console.
'''
async def main():
# async for quote in stream_quotes(symbols=names):
# log.info(f"Received quote:\n{quote}")
...
trio.run(main)
# @cli.command()
# @click.option(
# '--url',
# default=_url,
# help='HTTP URL of marketstore instance'
# )
# @click.argument('names', nargs=-1)
# @click.pass_obj
# def ms_destroy(config: dict, names: list[str], url: str) -> None:
# """Destroy symbol entries in the local marketstore instance.
# """
# async def main():
# nonlocal names
# async with get_client(url) as client:
#
# if not names:
# names = await client.list_symbols()
#
# # default is to wipe db entirely.
# answer = input(
# "This will entirely wipe you local marketstore db @ "
# f"{url} of the following symbols:\n {pformat(names)}"
# "\n\nDelete [N/y]?\n")
#
# if answer == 'y':
# for sym in names:
# # tbk = _tick_tbk.format(sym)
# tbk = tuple(sym, *_tick_tbk_ids)
# print(f"Destroying {tbk}..")
# await client.destroy(mk_tbk(tbk))
# else:
# print("Nothing deleted.")
#
# tractor.run(main)
@cli.command()
@click.option(
'--tl',
is_flag=True,
help='Enable tractor logging')
@click.option(
'--host',
default='localhost'
)
@click.option(
'--port',
default=5993
)
@click.argument('symbols', nargs=-1)
@click.pass_obj
def storesh(
config,
tl,
host,
port,
symbols: list[str],
):
'''
Start an IPython shell ready to query the local marketstore db.
'''
from piker.data.marketstore import tsdb_history_update
from piker._daemon import open_piker_runtime
async def main():
nonlocal symbols
async with open_piker_runtime(
'storesh',
enable_modules=['piker.data._ahab'],
):
symbol = symbols[0]
await tsdb_history_update(symbol)
trio.run(main)
@cli.command()
@click.option('--test-file', '-t', help='Test quote stream file')
@click.option('--tl', is_flag=True, help='Enable tractor logging')
@click.argument('name', nargs=1, required=True)
@click.pass_obj
def ingest(config, name, test_file, tl):
'''
Ingest real-time broker quotes and ticks to a marketstore instance.
'''
# global opts
loglevel = config['loglevel']
tractorloglevel = config['tractorloglevel']
# log = config['log']
watchlist_from_file = wl.ensure_watchlists(config['wl_path'])
watchlists = wl.merge_watchlist(watchlist_from_file, wl._builtins)
symbols = watchlists[name]
grouped_syms = {}
for sym in symbols:
symbol, _, provider = sym.rpartition('.')
if provider not in grouped_syms:
grouped_syms[provider] = []
grouped_syms[provider].append(symbol)
async def entry_point():
async with tractor.open_nursery() as n:
for provider, symbols in grouped_syms.items():
await n.run_in_actor(
ingest_quote_stream,
name='ingest_marketstore',
symbols=symbols,
brokername=provider,
tries=1,
actorloglevel=loglevel,
loglevel=tractorloglevel
)
tractor.run(entry_point)

File diff suppressed because it is too large

View File

@ -1,41 +0,0 @@
# piker: trading gear for hackers
# Copyright (C) Tyler Goodlet (in stewardship for piker0)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""
Ingestion, for dataz.
Api layer likely in here...
"""
from types import ModuleType
from importlib import import_module
from ..log import get_logger
log = get_logger(__name__)
__ingestors__ = [
'marketstore',
]
def get_ingestormod(name: str) -> ModuleType:
"""Return the imported ingestor module by name.
"""
module = import_module('.' + name, 'piker.data')
# we only allow monkeying because it's for internal keying
module.name = module.__name__.split('.')[-1]
return module

View File

@ -1,851 +0,0 @@
# piker: trading gear for hackers
# Copyright (C) Tyler Goodlet (in stewardship for piker0)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
'''
``marketstore`` integration.
- client management routines
- ticK data ingest routines
- websocket client for subscribing to write triggers
- todo: tick sequence stream-cloning for testing
'''
from __future__ import annotations
from contextlib import asynccontextmanager as acm
from datetime import datetime
from pprint import pformat
from typing import (
Any,
Optional,
Union,
TYPE_CHECKING,
)
import time
from math import isnan
from bidict import bidict
import msgpack
import pyqtgraph as pg
import numpy as np
import tractor
from trio_websocket import open_websocket_url
from anyio_marketstore import (
open_marketstore_client,
MarketstoreClient,
Params,
)
import pendulum
import purerpc
if TYPE_CHECKING:
import docker
from ._ahab import DockerContainer
from .feed import maybe_open_feed
from ..log import get_logger, get_console_log
log = get_logger(__name__)
# container level config
_config = {
'grpc_listen_port': 5995,
'ws_listen_port': 5993,
'log_level': 'debug',
}
_yaml_config = '''
# piker's ``marketstore`` config.
# mount this config using:
# sudo docker run --mount \
# type=bind,source="$HOME/.config/piker/",target="/etc" -i -p \
# 5993:5993 alpacamarkets/marketstore:latest
root_directory: data
listen_port: {ws_listen_port}
grpc_listen_port: {grpc_listen_port}
log_level: {log_level}
queryable: true
stop_grace_period: 0
wal_rotate_interval: 5
stale_threshold: 5
enable_add: true
enable_remove: false
triggers:
- module: ondiskagg.so
on: "*/1Sec/OHLCV"
config:
# filter: "nasdaq"
destinations:
- 1Min
- 5Min
- 15Min
- 1H
- 1D
- module: stream.so
on: '*/*/*'
# config:
# filter: "nasdaq"
'''.format(**_config)
def start_marketstore(
client: docker.DockerClient,
**kwargs,
) -> tuple[DockerContainer, dict[str, Any], str, str]:
'''
Start and supervise a marketstore instance with its config bind-mounted
in from the piker config directory on the system.
The equivalent cli cmd to this code is:
sudo docker run --mount \
type=bind,source="$HOME/.config/piker/",target="/etc" -i -p \
5993:5993 alpacamarkets/marketstore:latest
'''
import os
import docker
from .. import config
get_console_log('info', name=__name__)
mktsdir = os.path.join(config._config_dir, 'marketstore')
# create when dne
if not os.path.isdir(mktsdir):
os.mkdir(mktsdir)
yml_file = os.path.join(mktsdir, 'mkts.yml')
if not os.path.isfile(yml_file):
log.warning(
f'No `marketstore` config exists?: {yml_file}\n'
'Generating new file from template:\n'
f'{_yaml_config}\n'
)
with open(yml_file, 'w') as yf:
yf.write(_yaml_config)
# create a mount from user's local piker config dir into container
config_dir_mnt = docker.types.Mount(
target='/etc',
source=mktsdir,
type='bind',
)
# create a user config subdir where the marketstore
# backing filesystem database can be persisted.
persistent_data_dir = os.path.join(
mktsdir, 'data',
)
if not os.path.isdir(persistent_data_dir):
os.mkdir(persistent_data_dir)
data_dir_mnt = docker.types.Mount(
target='/data',
source=persistent_data_dir,
type='bind',
)
dcntr: DockerContainer = client.containers.run(
'alpacamarkets/marketstore:latest',
# do we need this for cmds?
# '-i',
# '-p 5993:5993',
ports={
'5993/tcp': 5993, # jsonrpc / ws?
'5995/tcp': 5995, # grpc
},
mounts=[
config_dir_mnt,
data_dir_mnt,
],
detach=True,
# stop_signal='SIGINT',
init=True,
# remove=True,
)
return (
dcntr,
_config,
# expected startup and stop msgs
"launching tcp listener for all services...",
"exiting...",
)
_tick_tbk_ids: tuple[str, str] = ('1Sec', 'TICK')
_tick_tbk: str = '{}/' + '/'.join(_tick_tbk_ids)
_tick_dt = [
# these two are required as a "primary key"
('Epoch', 'i8'),
('Nanoseconds', 'i4'),
('IsTrade', 'i1'),
('IsBid', 'i1'),
('Price', 'f4'),
('Size', 'f4')
]
_quote_dt = [
# these two are required as a "primary key"
('Epoch', 'i8'),
('Nanoseconds', 'i4'),
('Tick', 'i4'), # (-1, 0, 1) = (on bid, same, on ask)
# ('fill_time', 'f4'),
('Last', 'f4'),
('Bid', 'f4'),
('Bsize', 'i8'),
('Asize', 'i8'),
('Ask', 'f4'),
('Size', 'i8'),
('Volume', 'i8'),
# ('brokerd_ts', 'i64'),
# ('VWAP', 'f4')
]
_quote_tmp = {}.fromkeys(dict(_quote_dt).keys(), np.nan)
_tick_map = {
'Up': 1,
'Equal': 0,
'Down': -1,
None: np.nan,
}
_ohlcv_dt = [
# these two are required as a "primary key"
('Epoch', 'i8'),
# ('Nanoseconds', 'i4'),
# ohlcv sampling
('Open', 'f4'),
('High', 'f4'),
('Low', 'f4'),
('Close', 'f4'),
('Volume', 'f4'),
]
ohlc_key_map = bidict({
'Epoch': 'time',
'Open': 'open',
'High': 'high',
'Low': 'low',
'Close': 'close',
'Volume': 'volume',
})
def mk_tbk(keys: tuple[str, str, str]) -> str:
'''
Generate a marketstore table key from a tuple.
Converts,
``('SPY', '1Sec', 'TICK')`` -> ``"SPY/1Sec/TICK"```
'''
return '/'.join(keys)
def quote_to_marketstore_structarray(
quote: dict[str, Any],
last_fill: Optional[float]
) -> np.array:
'''
Return marketstore writeable structarray from quote ``dict``.
'''
if last_fill:
# new fill bby
# convert to ns to match the ``time.time_ns()`` branch below
now = int(pendulum.parse(last_fill).timestamp() * 10**9)
else:
# this should get inserted upstream by the broker-client to
# subtract from IPC latency
now = time.time_ns()
secs, ns = now / 10**9, now % 10**9
# pack into list[tuple[str, Any]]
array_input = []
# insert 'Epoch' entry first and then 'Nanoseconds'.
array_input.append(int(secs))
array_input.append(int(ns))
# append remaining fields
for name, dt in _quote_dt[2:]:
if 'f' in dt:
none = np.nan
else:
# for ``np.int`` we use 0 as a null value
none = 0
# casefold? see https://github.com/alpacahq/marketstore/issues/324
val = quote.get(name.casefold(), none)
array_input.append(val)
return np.array([tuple(array_input)], dtype=_quote_dt)
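# NOTE (editor's sketch): e.g. a minimal quote (values illustrative);
# missing float fields null to ``np.nan`` and ints to 0 per above:
#
# arr = quote_to_marketstore_structarray(
#     {'tick': 1, 'last': 4000.25, 'size': 3, 'volume': 1204},
#     last_fill=None,  # timestamp taken from ``time.time_ns()``
# )
# assert arr.dtype == np.dtype(_quote_dt)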
@acm
async def get_client(
host: str = 'localhost',
port: int = 5995
) -> MarketstoreClient:
'''
Load a ``anyio_marketstore`` grpc client connected
to an existing ``marketstore`` server.
'''
async with open_marketstore_client(
host,
port
) as client:
yield client
class MarketStoreError(Exception):
"Generic marketstore client error"
# def err_on_resp(response: dict) -> None:
# """Raise any errors found in responses from client request.
# """
# responses = response['responses']
# if responses is not None:
# for r in responses:
# err = r['error']
# if err:
# raise MarketStoreError(err)
# map of seconds ints to "time frame" accepted keys
tf_in_1s = bidict({
1: '1Sec',
60: '1Min',
60*5: '5Min',
60*15: '15Min',
60*30: '30Min',
60*60: '1H',
60*60*24: '1D',
})
class Storage:
'''
High level storage api for both real-time and historical ingest.
'''
def __init__(
self,
client: MarketstoreClient,
) -> None:
# TODO: eventually this should be an api/interface type that
# ensures we can support multiple tsdb backends.
self.client = client
# series' cache from tsdb reads
self._arrays: dict[str, np.ndarray] = {}
async def list_keys(self) -> list[str]:
return await self.client.list_symbols()
async def search_keys(self, pattern: str) -> list[str]:
'''
Search for time series key in the storage backend.
'''
...
async def write_ticks(self, ticks: list) -> None:
...
async def load(
self,
fqsn: str,
) -> tuple[
dict[int, np.ndarray], # timeframe (in secs) to series
Optional[datetime], # first dt
Optional[datetime], # last dt
]:
first_tsdb_dt, last_tsdb_dt = None, None
tsdb_arrays = await self.read_ohlcv(
fqsn,
# on first load we don't need to pull the max
# history per request size worth.
limit=3000,
)
log.info(f'Loaded tsdb history {tsdb_arrays}')
if tsdb_arrays:
fastest = list(tsdb_arrays.values())[0]
times = fastest['Epoch']
first, last = times[0], times[-1]
first_tsdb_dt, last_tsdb_dt = map(
pendulum.from_timestamp, [first, last]
)
return tsdb_arrays, first_tsdb_dt, last_tsdb_dt
async def read_ohlcv(
self,
fqsn: str,
timeframe: Optional[Union[int, str]] = None,
end: Optional[int] = None,
limit: int = int(800e3),
) -> tuple[
MarketstoreClient,
Union[dict, np.ndarray]
]:
client = self.client
syms = await client.list_symbols()
if fqsn not in syms:
return {}
tfstr = tf_in_1s[1]
params = Params(
symbols=fqsn,
timeframe=tfstr,
attrgroup='OHLCV',
end=end,
# limit_from_start=True,
# TODO: figure the max limit here given the
# ``purerpc`` msg size limit: 33554432
limit=limit,
)
if timeframe is None:
log.info(f'starting {fqsn} tsdb granularity scan..')
# loop through and try to find highest granularity
for tfstr in tf_in_1s.values():
try:
log.info(f'querying for {tfstr}@{fqsn}')
params.set('timeframe', tfstr)
result = await client.query(params)
break
except purerpc.grpclib.exceptions.UnknownError:
# XXX: this is already logged by the container and
# thus shows up through `marketstored` logs relay.
# log.warning(f'{tfstr}@{fqsn} not found')
continue
else:
return {}
else:
result = await client.query(params)
# TODO: it turns out column access on recarrays is actually slower:
# https://jakevdp.github.io/PythonDataScienceHandbook/02.09-structured-data-numpy.html#RecordArrays:-Structured-Arrays-with-a-Twist
# it might make sense to make these structured arrays?
# Fill out a `numpy` array-results map
arrays = {}
for fqsn, data_set in result.by_symbols().items():
arrays.setdefault(fqsn, {})[
tf_in_1s.inverse[data_set.timeframe]
] = data_set.array
return arrays[fqsn][timeframe] if timeframe else arrays[fqsn]
async def delete_ts(
self,
key: str,
timeframe: Optional[Union[int, str]] = None,
) -> bool:
client = self.client
syms = await client.list_symbols()
print(syms)
# if key not in syms:
# raise KeyError(f'`{fqsn}` table key not found?')
return await client.destroy(tbk=key)
async def write_ohlcv(
self,
fqsn: str,
ohlcv: np.ndarray,
append_and_duplicate: bool = True,
limit: int = int(800e3),
) -> None:
# build mkts schema compat array for writing
mkts_dt = np.dtype(_ohlcv_dt)
mkts_array = np.zeros(
len(ohlcv),
dtype=mkts_dt,
)
# copy from shm array (yes it's this easy):
# https://numpy.org/doc/stable/user/basics.rec.html#assignment-from-other-structured-arrays
mkts_array[:] = ohlcv[[
'time',
'open',
'high',
'low',
'close',
'volume',
]]
m, r = divmod(len(mkts_array), limit)
for i in range(1, m + 1):
to_push = mkts_array[(i - 1) * limit:i * limit]
# write to db
resp = await self.client.write(
to_push,
tbk=f'{fqsn}/1Sec/OHLCV',
# NOTE: this will append duplicates
# for the same timestamp-index.
# TODO: pre deduplicate?
isvariablelength=append_and_duplicate,
)
log.info(
f'Wrote {to_push.size} datums to tsdb\n'
)
for resp in resp.responses:
err = resp.error
if err:
raise MarketStoreError(err)
if r:
to_push = mkts_array[m*limit:]
# write to db
resp = await self.client.write(
to_push,
tbk=f'{fqsn}/1Sec/OHLCV',
# NOTE: this will append duplicates
# for the same timestamp-index.
# TODO: pre deduplicate?
isvariablelength=append_and_duplicate,
)
log.info(
f'Wrote {to_push.size} datums to tsdb\n'
)
for resp in resp.responses:
err = resp.error
if err:
raise MarketStoreError(err)
# XXX: currently the only way to do this is through the CLI:
# sudo ./marketstore connect --dir ~/.config/piker/data
# >> \show mnq.globex.20220617.ib/1Sec/OHLCV 2022-05-15
# and this seems to block and use up mem..
# >> \trim mnq.globex.20220617.ib/1Sec/OHLCV 2022-05-15
# relevant source code for this is here:
# https://github.com/alpacahq/marketstore/blob/master/cmd/connect/session/trim.go#L14
# def delete_range(self, start_dt, end_dt) -> None:
# ...
@acm
async def open_storage_client(
fqsn: str,
period: Optional[Union[int, str]] = None, # in seconds
) -> Storage:
'''
Open a high level storage client for reading/writing tsdb series
in ``numpy`` struct array format.
'''
async with (
# eventually a storage backend endpoint
get_client() as client,
):
# slap on our wrapper api
yield Storage(client)
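# NOTE (editor's sketch): e.g. reading back 1s bars (fqsn illustrative):
#
# async with open_storage_client('mnq.globex.ib') as storage:
#     bars = await storage.read_ohlcv('mnq.globex.ib', timeframe=1)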
async def tsdb_history_update(
fqsn: Optional[str] = None,
) -> list[str]:
# TODO: real-time dedicated task for ensuring
# history consistency between the tsdb, shm and real-time feed..
# update sequence design notes:
# - load existing highest frequency data from mkts
# * how do we want to offer this to the UI?
# - lazy loading?
# - try to load it all and expect graphics caching/diffing
# to hide extra bits that aren't in view?
# - compute the diff between latest data from broker and shm
# * use sql api in mkts to determine where the backend should
# start querying for data?
# * append any diff with new shm length
# * determine missing (gapped) history by scanning
# * how far back do we look?
# - begin rt update ingest and aggregation
# * could start by always writing ticks to mkts instead of
# worrying about a shm queue for now.
# * we have a short list of shm queues worth groking:
# - https://github.com/pikers/piker/issues/107
# * the original data feed arch blurb:
# - https://github.com/pikers/piker/issues/98
#
profiler = pg.debug.Profiler(
disabled=False, # not pg_profile_enabled(),
delayed=False,
)
async with (
open_storage_client(fqsn) as storage,
maybe_open_feed(
[fqsn],
start_stream=False,
) as (feed, stream),
):
profiler(f'opened feed for {fqsn}')
to_append = feed.shm.array
to_prepend = None
if fqsn:
symbol = feed.symbols.get(fqsn)
if symbol:
fqsn = symbol.front_fqsn()
# diff db history with shm and only write the missing portions
ohlcv = feed.shm.array
# TODO: use pg profiler
tsdb_arrays = await storage.read_ohlcv(fqsn)
# hist diffing
if tsdb_arrays:
for secs in (1, 60):
ts = tsdb_arrays.get(secs)
if ts is not None and len(ts):
# these aren't currently used but can be referenced from
# within the embedded ipython shell below.
to_append = ohlcv[ohlcv['time'] > ts['Epoch'][-1]]
to_prepend = ohlcv[ohlcv['time'] < ts['Epoch'][0]]
profiler('Finished db arrays diffs')
syms = await storage.client.list_symbols()
log.info(f'Existing tsdb symbol set:\n{pformat(syms)}')
profiler(f'listed symbols {syms}')
# TODO: ask if user wants to write history for detected
# available shm buffers?
from tractor.trionics import ipython_embed
await ipython_embed()
# for array in [to_append, to_prepend]:
# if array is None:
# continue
# log.info(
# f'Writing datums {array.size} -> to tsdb from shm\n'
# )
# await storage.write_ohlcv(fqsn, array)
# profiler('Finished db writes')
async def ingest_quote_stream(
symbols: list[str],
brokername: str,
tries: int = 1,
loglevel: str = None,
) -> None:
'''
Ingest a broker quote stream into a ``marketstore`` tsdb.
'''
async with (
maybe_open_feed(brokername, symbols, loglevel=loglevel) as feed,
get_client() as ms_client,
):
async for quotes in feed.stream:
log.info(quotes)
for symbol, quote in quotes.items():
for tick in quote.get('ticks', ()):
ticktype = tick.get('type', 'n/a')
# techtonic tick write
array = quote_to_marketstore_structarray({
'IsTrade': 1 if ticktype == 'trade' else 0,
'IsBid': 1 if ticktype in ('bid', 'bsize') else 0,
'Price': tick.get('price'),
'Size': tick.get('size')
}, last_fill=quote.get('broker_ts', None))
await ms_client.write(array, _tick_tbk.format(symbol))
# LEGACY WRITE LOOP (using old tick dt)
# quote_cache = {
# 'size': 0,
# 'tick': 0
# }
# async for quotes in qstream:
# log.info(quotes)
# for symbol, quote in quotes.items():
# # remap tick strs to ints
# quote['tick'] = _tick_map[quote.get('tick', 'Equal')]
# # check for volume update (i.e. did trades happen
# # since last quote)
# new_vol = quote.get('volume', None)
# if new_vol is None:
# log.debug(f"No fills for {symbol}")
# if new_vol == quote_cache.get('volume'):
# # should never happen due to field diffing
# # on sender side
# log.error(
# f"{symbol}: got same volume as last quote?")
# quote_cache.update(quote)
# a = quote_to_marketstore_structarray(
# quote,
# # TODO: check this closer to the broker query api
# last_fill=quote.get('fill_time', '')
# )
# await ms_client.write(symbol, a)
async def stream_quotes(
symbols: list[str],
host: str = 'localhost',
port: int = 5993,
diff_cached: bool = True,
loglevel: str = None,
) -> None:
'''
Open a symbol stream from a running instance of marketstore and
log to console.
'''
# XXX: required to propagate ``tractor`` loglevel to piker logging
get_console_log(loglevel or tractor.current_actor().loglevel)
tbks: dict[str, str] = {sym: f"{sym}/*/*" for sym in symbols}
async with open_websocket_url(f'ws://{host}:{port}/ws') as ws:
# send subs topics to server
resp = await ws.send_message(
msgpack.dumps({'streams': list(tbks.values())})
)
log.info(resp)
async def recv() -> dict[str, Any]:
return msgpack.loads(await ws.get_message())
streams = (await recv())['streams']
log.info(f"Subscribed to {streams}")
_cache = {}
while True:
msg = await recv()
# unpack symbol and quote data
# key is in format ``<SYMBOL>/<TIMEFRAME>/<ID>``
symbol = msg['key'].split('/')[0]
data = msg['data']
# calc time stamp(s)
s, ns = data.pop('Epoch'), data.pop('Nanoseconds')
ts = s * 10**9 + ns
data['broker_fill_time_ns'] = ts
quote = {}
for k, v in data.items():
if isnan(v):
continue
quote[k.lower()] = v
quote['symbol'] = symbol
quotes = {}
if diff_cached:
last = _cache.setdefault(symbol, {})
new = set(quote.items()) - set(last.items())
if new:
log.info(f"New quote {quote['symbol']}:\n{new}")
# only ship diff updates and other required fields
payload = {k: quote[k] for k, v in new}
payload['symbol'] = symbol
# if there was volume likely the last size of
# shares traded is useful info and it's possible
# that the set difference from above will disregard
# a "size" value since the same # of shares were traded
size = quote.get('size')
volume = quote.get('volume')
if size and volume:
new_volume_since_last = max(
volume - last.get('volume', 0), 0)
log.warning(
f"NEW VOLUME {symbol}:{new_volume_since_last}")
payload['size'] = size
payload['last'] = quote.get('last')
# XXX: we append to a list for the options case where the
# subscription topic (key) is the same for all
# expiries even though this is unnecessary for the
# stock case (different topic [i.e. symbol] for each
# quote).
quotes.setdefault(symbol, []).append(payload)
# update cache
_cache[symbol].update(quote)
else:
quotes = {
symbol: [{key.lower(): val for key, val in quote.items()}]}
if quotes:
yield quotes
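A minimal sketch of the ``diff_cached`` item-set diffing used above; the quote values are hypothetical:

    # dict-item set difference yields only the changed fields
    last = {'last': 100.0, 'size': 5, 'volume': 1000}
    quote = {'last': 101.0, 'size': 5, 'volume': 1010}
    new = set(quote.items()) - set(last.items())
    payload = {k: v for k, v in new}
    assert payload == {'last': 101.0, 'volume': 1010}
    # an unchanged 'size' drops out, which is why the loop above
    # re-attaches 'size' and 'last' whenever fresh volume shows up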

View File

@ -1,49 +0,0 @@
# piker: trading gear for hackers
# Copyright (C) Tyler Goodlet (in stewardship of piker0)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
'''
Fin-sig-proc for the peeps!
'''
from typing import AsyncIterator
import numpy as np
from ._engine import cascade
__all__ = ['cascade']
async def latency(
source: 'TickStream[Dict[str, float]]', # noqa
ohlcv: np.ndarray
) -> AsyncIterator[np.ndarray]:
"""Latency measurements, broker to piker.
"""
# TODO: do we want to offer yielding this async
# before the rt data connection comes up?
# deliver zeros for all prior history
yield np.zeros(len(ohlcv))
async for quote in source:
ts = quote.get('broker_ts')
if ts:
# This is codified in the per-broker normalization layer
# TODO: Add more measure points and diffs for full system
# stack tracing.
value = quote['brokerd_ts'] - quote['broker_ts']
yield value

View File

@ -1,199 +0,0 @@
# piker: trading gear for hackers
# Copyright (C) Tyler Goodlet (in stewardship of pikers)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
'''
FSP (financial signal processing) apis.
'''
# TODO: things to figure the heck out:
# - how to handle non-plottable values (pyqtgraph has facility for this
# now in `arrayToQPath()`)
# - composition of fsps / implicit chaining syntax (we need an issue)
from __future__ import annotations
from functools import partial
from typing import (
Any,
Callable,
Awaitable,
Optional,
)
import numpy as np
import tractor
from tractor.msg import NamespacePath
from ..data._sharedmem import (
ShmArray,
maybe_open_shm_array,
attach_shm_array,
_Token,
)
from ..log import get_logger
log = get_logger(__name__)
# global fsp registry filled out by @fsp decorator below
_fsp_registry = {}
def _load_builtins() -> dict[tuple, Callable]:
# import to implicitly trigger registration via ``@fsp``
from . import _momo # noqa
from . import _volume # noqa
return _fsp_registry
class Fsp:
'''
"Financial signal processor" decorator wrapped async function.
'''
# TODO: checkout the advanced features from ``wrapt``:
# - dynamic enable toggling,
# https://wrapt.readthedocs.io/en/latest/decorators.html#dynamically-disabling-decorators
# - custom object proxies, might be useful for implementing n-compose
# https://wrapt.readthedocs.io/en/latest/wrappers.html#custom-object-proxies
# - custom function wrappers,
# https://wrapt.readthedocs.io/en/latest/wrappers.html#custom-function-wrappers
# actor-local map of source flow shm tokens
# + the consuming fsp *to* the consumers output
# shm flow.
_flow_registry: dict[
tuple[_Token, str], _Token,
] = {}
def __init__(
self,
func: Callable[..., Awaitable],
*,
outputs: tuple[str] = (),
display_name: Optional[str] = None,
**config,
) -> None:
# TODO (maybe):
# - type introspection?
# - should we make this a wrapt object proxy?
self.func = func
self.__name__ = func.__name__ # XXX: must have func-object name
self.ns_path: tuple[str, str] = NamespacePath.from_ref(func)
self.outputs = outputs
self.config: dict[str, Any] = config
# register with declared set.
_fsp_registry[self.ns_path] = self
@property
def name(self) -> str:
return self.__name__
def __call__(
self,
# TODO: when we settle on py3.10 we should probably use the new
# type annots from pep 612:
# https://www.python.org/dev/peps/pep-0612/
# instance,
*args,
**kwargs
):
return self.func(*args, **kwargs)
# TODO: lru_cache this? prettty sure it'll work?
def get_shm(
self,
src_shm: ShmArray,
) -> ShmArray:
'''
Provide access to allocated shared mem array
for this "instance" of a signal processor for
the given ``key``.
'''
dst_token = self._flow_registry[
(src_shm._token, self.name)
]
shm = attach_shm_array(dst_token)
return shm
def fsp(
wrapped=None,
*,
outputs: tuple[str] = (),
display_name: Optional[str] = None,
**config,
) -> Fsp:
if wrapped is None:
return partial(
Fsp,
outputs=outputs,
display_name=display_name,
**config,
)
return Fsp(wrapped, outputs=(wrapped.__name__,))
def mk_fsp_shm_key(
sym: str,
target: Fsp
) -> str:
uid = tractor.current_actor().uid
return f'{sym}.fsp.{target.name}.{".".join(uid)}'
def maybe_mk_fsp_shm(
sym: str,
target: Fsp,
readonly: bool = True,
) -> tuple[str, ShmArray, bool]:
'''
Allocate a single row shm array for a symbol-fsp pair if none
exists, otherwise load the shm already existing for that token.
'''
assert isinstance(sym, str), '`sym` should be file-name-friendly `str`'
# TODO: load output types from `Fsp`
# - should `index` be a required internal field?
fsp_dtype = np.dtype(
[('index', int)] +
[(field_name, float) for field_name in target.outputs]
)
key = mk_fsp_shm_key(sym, target)
shm, opened = maybe_open_shm_array(
key,
# TODO: create entry for each time frame
dtype=fsp_dtype,
readonly=readonly,
)
return key, shm, opened
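A hypothetical usage sketch of the ``@fsp`` decorator and shm helpers above; ``my_ma`` and its body are illustrative only, and a running ``tractor`` actor is assumed for the shm key:

    @fsp(outputs=('my_ma',))
    async def my_ma(source, ohlcv):
        # first yield delivers history for the full source array
        yield np.zeros(len(ohlcv.array))
        # then stream (key, datum) pairs per quote
        async for quote in source:
            yield 'my_ma', quote.get('last')

    # allocate (or attach to) the single-row output buffer
    key, shm, opened = maybe_mk_fsp_shm('xbtusd.kraken', my_ma)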

View File

@ -1,460 +0,0 @@
# piker: trading gear for hackers
# Copyright (C) Tyler Goodlet (in stewardship of pikers)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
'''
core task logic for processing chains
'''
from dataclasses import dataclass
from functools import partial
from typing import (
AsyncIterator, Callable, Optional,
Union,
)
import numpy as np
import pyqtgraph as pg
import trio
from trio_typing import TaskStatus
import tractor
from tractor.msg import NamespacePath
from ..log import get_logger, get_console_log
from .. import data
from ..data import attach_shm_array
from ..data.feed import Feed
from ..data._sharedmem import ShmArray
from ..data._source import Symbol
from ._api import (
Fsp,
_load_builtins,
_Token,
)
log = get_logger(__name__)
@dataclass
class TaskTracker:
complete: trio.Event
cs: trio.CancelScope
async def filter_quotes_by_sym(
sym: str,
quote_stream: tractor.MsgStream,
) -> AsyncIterator[dict]:
'''
Filter quote stream by target symbol.
'''
# TODO: make this the actual first quote from feed
# XXX: this allows for a single iteration to run for history
# processing without waiting on the real-time feed for a new quote
yield {}
async for quotes in quote_stream:
quote = quotes.get(sym)
if quote:
yield quote
async def fsp_compute(
symbol: Symbol,
feed: Feed,
quote_stream: trio.abc.ReceiveChannel,
src: ShmArray,
dst: ShmArray,
func: Callable,
# attach_stream: bool = False,
task_status: TaskStatus[None] = trio.TASK_STATUS_IGNORED,
) -> None:
profiler = pg.debug.Profiler(
delayed=False,
disabled=True
)
fqsn = symbol.front_fqsn()
out_stream = func(
# TODO: do we even need this if we do the feed api right?
# shouldn't a local stream do this before we get a handle
# to the async iterable? it's that or we do some kinda
# async itertools style?
filter_quotes_by_sym(fqsn, quote_stream),
# XXX: currently the ``ohlcv`` arg
feed.shm,
)
# Conduct a single iteration of fsp with historical bars input
# and get historical output
history_output: Union[
dict[str, np.ndarray], # multi-output case
np.ndarray, # single output case
]
history_output = await out_stream.__anext__()
func_name = func.__name__
profiler(f'{func_name} generated history')
# build struct array with an 'index' field to push as history
# TODO: push using a[['f0', 'f1', .., 'fn']] = .. syntax no?
# if the output array is multi-field then push
# each respective field.
fields = getattr(dst.array.dtype, 'fields', None).copy()
fields.pop('index')
history: Optional[np.ndarray] = None # TODO: nptyping here!
if fields and len(fields) > 1:
if not isinstance(history_output, dict):
raise ValueError(
f'`{func_name}` is a multi-output FSP and should yield a '
'`dict[str, np.ndarray]` for history'
)
for key in fields.keys():
if key in history_output:
output = history_output[key]
if history is None:
if output is None:
length = len(src.array)
else:
length = len(output)
# using the first output, determine
# the length of the struct-array that
# will be pushed to shm.
history = np.zeros(
length,
dtype=dst.array.dtype
)
if output is None:
continue
history[key] = output
# single-key output stream
else:
if not isinstance(history_output, np.ndarray):
raise ValueError(
f'`{func_name}` is a single output FSP and should yield an '
'`np.ndarray` for history'
)
history = np.zeros(
len(history_output),
dtype=dst.array.dtype
)
history[func_name] = history_output
# TODO: XXX:
# THERE'S A BIG BUG HERE WITH THE `index` field since we're
# prepending a copy of the first value a few times to make
# sub-curves align with the parent bar chart.
# This likely needs to be fixed either by,
# - manually assigning the index and historical data
# separately to the shm array (i.e. not using .push())
# - developing some system on top of the shared mem array that
# is `index` aware such that historical data can be indexed
# relative to the true first datum? Not sure if this is sane
# for incremental computations.
first = dst._first.value = src._first.value
# TODO: can we use this `start` flag instead of the manual
# setting above?
index = dst.push(history, start=first)
profiler(f'{func_name} pushed history')
profiler.finish()
# setup a respawn handle
with trio.CancelScope() as cs:
# TODO: might be better to just make a "restart" method where
# the target task is spawned implicitly and then the event is
# set via some higher level api? At that point we might as well
# be writing a one-cancels-one nursery though right?
tracker = TaskTracker(trio.Event(), cs)
task_status.started((tracker, index))
profiler(f'{func_name} yield last index')
# import time
# last = time.time()
try:
async for processed in out_stream:
log.debug(f"{func_name}: {processed}")
key, output = processed
index = src.index
dst.array[-1][key] = output
# NOTE: for now we aren't streaming this to the consumer
# stream latest array index entry which basically just acts
# as trigger msg to tell the consumer to read from shm
# TODO: further this should likely be implemented much
# like our `Feed` api where there is one background
# "service" task which computes output and then sends to
# N-consumers who subscribe for the real-time output,
# which we'll likely want to implement using local-mem
# chans for the fan out?
# if attach_stream:
# await client_stream.send(index)
# period = time.time() - last
# hz = 1/period if period else float('nan')
# if hz > 60:
# log.info(f'FSP quote too fast: {hz}')
# last = time.time()
finally:
tracker.complete.set()
@tractor.context
async def cascade(
ctx: tractor.Context,
# data feed key
fqsn: str,
src_shm_token: dict,
dst_shm_token: tuple[str, np.dtype],
ns_path: NamespacePath,
shm_registry: dict[str, _Token],
zero_on_step: bool = False,
loglevel: Optional[str] = None,
) -> None:
'''
Chain streaming signal processors and deliver output to
destination shm array buffer.
'''
profiler = pg.debug.Profiler(
delayed=False,
disabled=False
)
if loglevel:
get_console_log(loglevel)
src = attach_shm_array(token=src_shm_token)
dst = attach_shm_array(readonly=False, token=dst_shm_token)
reg = _load_builtins()
lines = '\n'.join([f'{key.rpartition(":")[2]} => {key}' for key in reg])
log.info(
f'Registered FSP set:\n{lines}'
)
# update actorlocal flows table which registers
# readonly "instances" of this fsp for symbol/source
# so that consumer fsps can look it up by source + fsp.
# TODO: ugh i hate this wind/unwind to list over the wire
# but not sure how else to do it.
for (token, fsp_name, dst_token) in shm_registry:
Fsp._flow_registry[
(_Token.from_msg(token), fsp_name)
] = _Token.from_msg(dst_token)
fsp: Fsp = reg.get(
NamespacePath(ns_path)
)
func = fsp.func
if not func:
# TODO: assume it's a func target path
raise ValueError(f'Unknown fsp target: {ns_path}')
# open a data feed stream with requested broker
async with data.feed.maybe_open_feed(
[fqsn],
# TODO throttle tick outputs from *this* daemon since
# it'll emit tons of ticks due to the throttle only
# limits quote arrival periods, so the consumer of *this*
# needs to get throttled the ticks we generate.
# tick_throttle=60,
) as (feed, quote_stream):
symbol = feed.symbols[fqsn]
profiler(f'{func}: feed up')
assert src.token == feed.shm.token
# last_len = new_len = len(src.array)
func_name = func.__name__
async with (
trio.open_nursery() as n,
):
fsp_target = partial(
fsp_compute,
symbol=symbol,
feed=feed,
quote_stream=quote_stream,
# shm
src=src,
dst=dst,
# target
func=func
)
tracker, index = await n.start(fsp_target)
if zero_on_step:
last = dst.array[-1:]
zeroed = np.zeros(last.shape, dtype=last.dtype)
profiler(f'{func_name}: fsp up')
# sync client
await ctx.started(index)
# XXX: rt stream with client which we MUST
# open here (and keep it open) in order to make
# incremental "updates" as history prepends take
# place.
async with ctx.open_stream() as client_stream:
# TODO: these likely should all become
# methods of this ``TaskLifetime`` or wtv
# abstraction..
async def resync(
tracker: TaskTracker,
) -> tuple[TaskTracker, int]:
# TODO: adopt an incremental update engine/approach
# where possible here eventually!
log.debug(f're-syncing fsp {func_name} to source')
tracker.cs.cancel()
await tracker.complete.wait()
tracker, index = await n.start(fsp_target)
# always trigger UI refresh after history update,
# see ``piker.ui._fsp.FspAdmin.open_chain()`` and
# ``piker.ui._display.trigger_update()``.
await client_stream.send({
'fsp_update': {
'key': dst_shm_token,
'first': dst._first.value,
'last': dst._last.value,
}})
return tracker, index
def is_synced(
src: ShmArray,
dst: ShmArray
) -> tuple[bool, int, int]:
'''Predicate to determine if a destination FSP
output array is aligned to its source array.
'''
step_diff = src.index - dst.index
len_diff = abs(len(src.array) - len(dst.array))
return not (
# the source is likely backfilling and we must
# sync history calculations
len_diff > 2 or
# we aren't step synced to the source and may be
# leading/lagging by a step
step_diff > 1 or
step_diff < 0
), step_diff, len_diff
async def poll_and_sync_to_step(
tracker: TaskTracker,
src: ShmArray,
dst: ShmArray,
) -> tuple[TaskTracker, int]:
synced, step_diff, _ = is_synced(src, dst)
while not synced:
tracker, index = await resync(tracker)
synced, step_diff, _ = is_synced(src, dst)
return tracker, step_diff
s, step, ld = is_synced(src, dst)
# detect sample period step for subscription to increment
# signal
times = src.array['time']
delay_s = times[-1] - times[times != times[-1]][-1]
# Increment the underlying shared memory buffer on every
# "increment" msg received from the underlying data feed.
async with feed.index_stream(
int(delay_s)
) as istream:
profiler(f'{func_name}: sample stream up')
profiler.finish()
async for _ in istream:
# respawn the compute task if the source
# array has been updated such that we compute
# new history from the (prepended) source.
synced, step_diff, _ = is_synced(src, dst)
if not synced:
tracker, step_diff = await poll_and_sync_to_step(
tracker,
src,
dst,
)
# skip adding a last bar since we should already
# be step aligned
if step_diff == 0:
continue
# read out last shm row, copy and write new row
array = dst.array
# some metrics like vlm should be reset
# to zero every step.
if zero_on_step:
last = zeroed
else:
last = array[-1:].copy()
dst.push(last)
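A standalone illustration of the ``is_synced()`` predicate above, using plain ints for the shm indices (values hypothetical):

    src_index, dst_index = 1005, 1004   # dst lags src by one step
    src_len, dst_len = 1000, 1000       # no backfill in progress
    step_diff = src_index - dst_index
    len_diff = abs(src_len - dst_len)
    synced = not (len_diff > 2 or step_diff > 1 or step_diff < 0)
    assert synced  # a one step lag is tolerated; a dst lead is not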

View File

@ -1,253 +0,0 @@
# piker: trading gear for hackers
# Copyright (C) Tyler Goodlet (in stewardship of pikers)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""
Momentum bby.
"""
from typing import AsyncIterator, Optional
import numpy as np
from numba import jit, float64, optional, int64
from ._api import fsp
from ..data._normalize import iterticks
from ..data._sharedmem import ShmArray
@jit(
float64[:](
float64[:],
optional(float64),
optional(float64)
),
nopython=True,
nogil=True
)
def ema(
y: 'np.ndarray[float64]',
alpha: optional(float64) = None,
ylast: optional(float64) = None,
) -> 'np.ndarray[float64]':
r'''
Exponential weighted moving average aka 'Exponential smoothing'.
- https://en.wikipedia.org/wiki/Moving_average#Exponential_moving_average
- https://en.wikipedia.org/wiki/Exponential_smoothing
Fun facts:
A geometric progression is the discrete version of an
exponential function, that is where the name for this
smoothing method originated according to statistics lore. In
signal processing parlance, an EMA is a first order IIR filter.
.. math::
.tex
{S_{t}={\begin{cases}Y_{1},&t=1
\\\alpha Y_{t}+(1-\alpha )\cdot S_{t-1},&t>1\end{cases}}}
.nerd
(2) s = {
s[0] = y[0]; t = 0
s[t] = a*y[t] + (1-a)*s[t-1], t > 0.
}
More discussion here:
https://stackoverflow.com/questions/42869495/numpy-version-of-exponential-weighted-moving-average-equivalent-to-pandas-ewm
'''
n = y.shape[0]
if alpha is None:
# https://en.wikipedia.org/wiki/Moving_average#Relationship_between_SMA_and_EMA
# use the "center of mass" convention making an ema compare
# directly to the com of a SMA or WMA:
alpha = 2 / float(n + 1)
s = np.empty(n, dtype=float64)
if n == 1:
s[0] = y[0] * alpha + ylast * (1 - alpha)
else:
if ylast is None:
s[0] = y[0]
else:
s[0] = ylast
for i in range(1, n):
s[i] = y[i] * alpha + s[i-1] * (1 - alpha)
return s
# @jit(
# float64[:](
# float64[:],
# int64,
# float64,
# float64,
# ),
# nopython=True,
# nogil=True
# )
def _rsi(
# TODO: use https://github.com/ramonhagenaars/nptyping
signal: 'np.ndarray[float64]',
period: int64 = 14,
up_ema_last: float64 = None,
down_ema_last: float64 = None,
) -> 'np.ndarray[float64]':
'''
relative strengggth.
'''
alpha = 1/period
df = np.diff(signal, prepend=0)
up = np.where(df > 0, df, 0)
up_ema = ema(up, alpha, up_ema_last)
down = np.where(df < 0, -df, 0)
down_ema = ema(down, alpha, down_ema_last)
# avoid dbz errors, this leaves the first
# index == 0 right?
rs = np.divide(
up_ema,
down_ema,
out=np.zeros_like(signal),
where=down_ema != 0
)
# map rs through sigmoid (with range [0, 100])
rsi = 100 - 100 / (1 + rs)
# rsi = 100 * (up_ema / (up_ema + down_ema))
# also return the last ema state for next iteration
return rsi, up_ema[-1], down_ema[-1]
def _wma(
signal: np.ndarray,
length: int,
weights: Optional[np.ndarray] = None,
) -> np.ndarray:
'''
Compute a windowed moving average of ``signal`` with window
``length`` and optional ``weights`` (must be same size as
``signal``).
'''
if weights is None:
# default is a standard arithmetic mean
seq = np.full((length,), 1)
weights = seq / seq.sum()
assert length == len(weights)
# lol, for long sequences this is nutso slow and expensive..
return np.convolve(signal, weights, 'valid')
@fsp
async def wma(
source, #: AsyncStream[np.ndarray],
length: int,
ohlcv: np.ndarray, # price time-frame "aware"
) -> AsyncIterator[np.ndarray]: # maybe something like like FspStream?
'''
Streaming weighted moving average.
``weights`` is a sequence of already scaled values. As an example
for the WMA often found in "technical analysis":
``weights = np.arange(1, N) / (N*(N-1)/2)``.
'''
# deliver historical output as "first yield"
yield _wma(ohlcv.array['close'], length)
# begin real-time section
async for quote in source:
for tick in iterticks(quote, types=('trade',)):
yield _wma(ohlcv.last(length)['close'], length)
@fsp
async def rsi(
source: 'QuoteStream[Dict[str, Any]]', # noqa
ohlcv: ShmArray,
period: int = 14,
) -> AsyncIterator[np.ndarray]:
'''
Multi-timeframe streaming RSI.
https://en.wikipedia.org/wiki/Relative_strength_index
'''
sig = ohlcv.array['close']
# wilder says to seed the RSI EMAs with the SMA for the "period"
seed = _wma(ohlcv.last(period)['close'], period)[0]
# TODO: the emas here should be seeded with a period SMA as per
# wilder's original formula..
rsi_h, last_up_ema_close, last_down_ema_close = _rsi(
sig, period, seed, seed)
up_ema_last = last_up_ema_close
down_ema_last = last_down_ema_close
# deliver history
yield rsi_h
index = ohlcv.index
async for quote in source:
# tick based updates
for tick in iterticks(quote):
# though incorrect below is interesting
# sig = ohlcv.last(period)['close']
# get only the last 2 "datums" which will be diffed to
# calculate the real-time RSI output datum
sig = ohlcv.last(2)['close']
# the ema needs to be computed from the "last bar"
# TODO: how to make this cleaner
if ohlcv.index > index:
last_up_ema_close = up_ema_last
last_down_ema_close = down_ema_last
index = ohlcv.index
rsi_out, up_ema_last, down_ema_last = _rsi(
sig,
period=period,
up_ema_last=last_up_ema_close,
down_ema_last=last_down_ema_close,
)
yield rsi_out[-1:]
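A short sketch of incremental use of the ``ema()`` kernel above: feeding the last output back in as ``ylast`` continues the same series across calls (numbers hypothetical):

    close_hist = np.array([10.0, 10.5, 10.25, 10.75])
    alpha = 1 / 14
    s = ema(close_hist, alpha, None)             # historical pass
    live = ema(np.array([11.0]), alpha, s[-1])   # one new datum
    # live[0] == 11.0 * alpha + s[-1] * (1 - alpha)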

View File

@ -1,358 +0,0 @@
# piker: trading gear for hackers
# Copyright (C) Tyler Goodlet (in stewardship of pikers)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
from typing import AsyncIterator, Optional, Union
import numpy as np
from tractor.trionics._broadcast import AsyncReceiver
from ._api import fsp
from ..data._normalize import iterticks
from ..data._sharedmem import ShmArray
from ._momo import _wma
from ..log import get_logger
log = get_logger(__name__)
# NOTE: is the same as our `wma` fsp, and if so which one is faster?
# Ohhh, this is an IIR style i think? So it has an anchor point
# effectively instead of a moving window/FIR style?
def wap(
signal: np.ndarray,
weights: np.ndarray,
) -> np.ndarray:
'''
Weighted average price from signal and weights.
'''
cum_weights = np.cumsum(weights)
cum_weighted_input = np.cumsum(signal * weights)
# cum_weighted_input / cum_weights
# but, avoid divide by zero errors
avg = np.divide(
cum_weighted_input,
cum_weights,
out=np.zeros_like(cum_weighted_input),
where=cum_weights != 0
)
return (
avg,
cum_weighted_input,
cum_weights,
)
@fsp
async def tina_vwap(
source: AsyncReceiver[dict],
ohlcv: ShmArray, # OHLC sampled history
# TODO: anchor logic (eg. to session start)
anchors: Optional[np.ndarray] = None,
) -> Union[
AsyncIterator[np.ndarray],
float
]:
'''
Streaming volume weighted moving average.
Calling this "tina" for now since we're using HLC3 instead of tick.
'''
if anchors is None:
# TODO:
# anchor to session start of data if possible
pass
a = ohlcv.array
chl3 = (a['close'] + a['high'] + a['low']) / 3
v = a['volume']
h_vwap, cum_wp, cum_v = wap(chl3, v)
# deliver historical output as "first yield"
yield h_vwap
w_tot = cum_wp[-1]
v_tot = cum_v[-1]
# vwap_tot = h_vwap[-1]
async for quote in source:
for tick in iterticks(
quote,
types=['trade'],
):
# c, h, l, v = ohlcv.array[-1][
# ['closes', 'high', 'low', 'volume']
# ]
# this computes tick-by-tick weightings from here forward
size = tick['size']
price = tick['price']
v_tot += size
w_tot += price * size
# yield ((((o + h + l) / 3) * v) weights_tot) / v_tot
yield 'tina_vwap', w_tot / v_tot
@fsp(
outputs=(
'dolla_vlm',
'dark_vlm',
'trade_count',
'dark_trade_count',
),
curve_style='step',
)
async def dolla_vlm(
source: AsyncReceiver[dict],
ohlcv: ShmArray, # OHLC sampled history
) -> AsyncIterator[
tuple[str, Union[np.ndarray, float]],
]:
'''
"Dollar Volume", aka the volume in asset-currency-units (usually
a fiat) computed from some price function for the sample step
*multiplied* (*) by the asset unit volume.
Useful for comparing cross asset "money flow" in #s that are
asset-currency-independent.
'''
a = ohlcv.array
chl3 = (a['close'] + a['high'] + a['low']) / 3
v = a['volume']
# on first iteration yield history
yield {
'dolla_vlm': chl3 * v,
'dark_vlm': None,
}
i = ohlcv.index
dvlm = vlm = 0
dark_trade_count = trade_count = 0
async for quote in source:
for tick in iterticks(
quote,
types=(
'trade',
'dark_trade',
),
deduplicate_darks=True,
):
# this computes tick-by-tick weightings from here forward
size = tick['size']
price = tick['price']
li = ohlcv.index
if li > i:
i = li
trade_count = dark_trade_count = dvlm = vlm = 0
# TODO: for marginned instruments (futes, etfs?) we need to
# show the margin $vlm by multiplying by whatever multiplier
# is reported in the sym info.
ttype = tick.get('type')
if ttype == 'dark_trade':
dvlm += price * size
yield 'dark_vlm', dvlm
dark_trade_count += 1
yield 'dark_trade_count', dark_trade_count
# print(f'{dark_trade_count}th dark_trade: {tick}')
else:
# print(f'vlm: {tick}')
vlm += price * size
yield 'dolla_vlm', vlm
trade_count += 1
yield 'trade_count', trade_count
# TODO: plot both to compare?
# c, h, l, v = ohlcv.last()[
# ['close', 'high', 'low', 'volume']
# ][0]
# tina_lvlm = c+h+l/3 * v
# print(f' tinal vlm: {tina_lvlm}')
@fsp(
# TODO: eventually I guess we should support some kinda declarative
# graphics config syntax per output yah? That seems like a clean way
# to let users configure things? Not sure how exactly to offer that
# api as well as how to expose such a thing *inside* the body?
outputs=(
# pulled verbatim from `ib` for now
'1m_trade_rate',
'1m_vlm_rate',
# our own instantaneous rate calcs which are all
# parameterized by a samples count (bars) period
'trade_rate',
'dark_trade_rate',
'dvlm_rate',
'dark_dvlm_rate',
),
curve_style='line',
)
async def flow_rates(
source: AsyncReceiver[dict],
ohlcv: ShmArray, # OHLC sampled history
# TODO (idea): a dynamic generic / boxing type that can be updated by other
# FSPs, user input, and possibly any general event stream in
# real-time. Hint: ideally implemented with caching until mutated
# ;)
period: 'Param[int]' = 6, # noqa
# TODO: support other means by providing a map
# to weights `partial()`-ed with `wma()`?
mean_type: str = 'arithmetic',
# TODO (idea): a generic for declaring boxed fsps much like ``pytest``
# fixtures? This probably needs a lot of thought if we want to offer
# a higher level composition syntax eventually (oh right gotta make
# an issue for that).
# ideas for how to allow composition / intercalling:
# - offer a `Fsp.get_history()` to do the first yield output?
# * err wait can we just have shm access directly?
# - how would it work if some consumer fsp wanted to dynamically
# change params which are input to the callee fsp? i guess we could
# lazy copy in that case?
# dvlm: 'Fsp[dolla_vlm]'
) -> AsyncIterator[
tuple[str, Union[np.ndarray, float]],
]:
# generally no history available prior to real-time calcs
yield {
# from ib
'1m_trade_rate': None,
'1m_vlm_rate': None,
'trade_rate': None,
'dark_trade_rate': None,
'dvlm_rate': None,
'dark_dvlm_rate': None,
}
# TODO: 3.10 do ``anext()``
quote = await source.__anext__()
# ltr = 0
# lvr = 0
tr = quote.get('tradeRate')
yield '1m_trade_rate', tr or 0
vr = quote.get('volumeRate')
yield '1m_vlm_rate', vr or 0
yield 'trade_rate', 0
yield 'dark_trade_rate', 0
yield 'dvlm_rate', 0
yield 'dark_dvlm_rate', 0
# NOTE: in theory we could dynamically allocate a cascade based on
# this call but not sure if that's too "dynamic" in terms of
# validating cascade flows from message typing perspective.
# attach to ``dolla_vlm`` fsp running
# on this same source flow.
dvlm_shm = dolla_vlm.get_shm(ohlcv)
# precompute arithmetic mean weights (all ones)
seq = np.full((period,), 1)
weights = seq / seq.sum()
async for quote in source:
if not quote:
log.error("OH WTF NO QUOTE IN FSP")
continue
# dvlm_wma = _wma(
# dvlm_shm.array['dolla_vlm'],
# period,
# weights=weights,
# )
# yield 'dvlm_rate', dvlm_wma[-1]
if period > 1:
trade_rate_wma = _wma(
dvlm_shm.array['trade_count'][-period:],
period,
weights=weights,
)
trade_rate = trade_rate_wma[-1]
# print(trade_rate)
yield 'trade_rate', trade_rate
else:
# instantaneous rate per sample step
count = dvlm_shm.array['trade_count'][-1]
yield 'trade_rate', count
# TODO: skip this if no dark vlm is declared
# by symbol info (eg. in crypto$)
# dark_dvlm_wma = _wma(
# dvlm_shm.array['dark_vlm'],
# period,
# weights=weights,
# )
# yield 'dark_dvlm_rate', dark_dvlm_wma[-1]
if period > 1:
dark_trade_rate_wma = _wma(
dvlm_shm.array['dark_trade_count'][-period:],
period,
weights=weights,
)
yield 'dark_trade_rate', dark_trade_rate_wma[-1]
else:
# instantaneous rate per sample step
dark_count = dvlm_shm.array['dark_trade_count'][-1]
yield 'dark_trade_rate', dark_count
# XXX: ib specific schema we should
# probably pre-pack ourselves.
# tr = quote.get('tradeRate')
# if tr is not None and tr != ltr:
# # print(f'trade rate: {tr}')
# yield '1m_trade_rate', tr
# ltr = tr
# vr = quote.get('volumeRate')
# if vr is not None and vr != lvr:
# # print(f'vlm rate: {vr}')
# yield '1m_vlm_rate', vr
# lvr = vr
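A tiny numeric sketch of the "dollar volume" definition used by ``dolla_vlm`` above, an HLC3 price proxy times unit volume (numbers hypothetical):

    high = np.array([10.0, 11.0])
    low = np.array([9.0, 10.0])
    close = np.array([9.5, 10.5])
    volume = np.array([100.0, 200.0])
    chl3 = (close + high + low) / 3
    dvlm = chl3 * volume  # -> array([ 950., 2100.])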

View File

@ -1,19 +1,3 @@
# piker: trading gear for hackers
# Copyright (C) 2018-present Tyler Goodlet (in stewardship of piker0)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
""" """
Log like a forester! Log like a forester!
""" """
@ -23,15 +7,10 @@ import json
import tractor import tractor
from pygments import highlight, lexers, formatters from pygments import highlight, lexers, formatters
# Makes it so we only see the full module name when using ``__name__`` _proj_name = 'piker'
# without the extra "piker." prefix.
_proj_name: str = 'piker'
def get_logger( def get_logger(name: str = None) -> logging.Logger:
name: str = None,
) -> logging.Logger:
'''Return the package log or a sub-log for `name` if provided. '''Return the package log or a sub-log for `name` if provided.
''' '''
return tractor.log.get_logger(name=name, _root_name=_proj_name) return tractor.log.get_logger(name=name, _root_name=_proj_name)

View File

@ -1,80 +0,0 @@
# piker: trading gear for hackers
# Copyright (C) Tyler Goodlet (in stewardship of piker0)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
'''
sugarz for trio/tractor conc peeps.
'''
from typing import AsyncContextManager
from typing import TypeVar
from contextlib import asynccontextmanager as acm
import trio
# A regular invariant generic type
T = TypeVar("T")
async def _enter_and_sleep(
mngr: AsyncContextManager[T],
to_yield: dict[int, T],
all_entered: trio.Event,
# task_status: TaskStatus[T] = trio.TASK_STATUS_IGNORED,
) -> T:
'''Open the async context manager, deliver its value
to this task's spawner and sleep until cancelled.
'''
async with mngr as value:
to_yield[id(mngr)] = value
if all(to_yield.values()):
all_entered.set()
# sleep until cancelled
await trio.sleep_forever()
@acm
async def async_enter_all(
*mngrs: AsyncContextManager[T],
) -> tuple[T]:
to_yield = {}.fromkeys(id(mngr) for mngr in mngrs)
all_entered = trio.Event()
async with trio.open_nursery() as n:
for mngr in mngrs:
n.start_soon(
_enter_and_sleep,
mngr,
to_yield,
all_entered,
)
# deliver control once all managers have started up
await all_entered.wait()
yield tuple(to_yield.values())
# tear down all sleeper tasks thus triggering individual
# mngr ``__aexit__()``s.
n.cancel_scope.cancel()
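A hedged usage sketch for ``async_enter_all()`` above; the stand-in managers are illustrative only:

    @acm
    async def open_conn(name: str):
        yield f'{name}-connected'

    async def main():
        async with async_enter_all(
            open_conn('feed'),
            open_conn('broker'),
        ) as (feed, broker):
            assert (feed, broker) == ('feed-connected', 'broker-connected')

    trio.run(main)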

View File

@ -1,22 +1,15 @@
# piker: trading gear for hackers
# Copyright (C) 2018-present Tyler Goodlet (in stewardship of piker0)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
""" """
Stuff for your eyes, aka super hawt Qt UI components. Stuff for your eyes.
Currently we only support PyQt5 due to this issue in Pyside2:
https://bugreports.qt.io/projects/PYSIDE/issues/PYSIDE-1313
""" """
import os
import sys
# XXX clear all flags at import to avoid upsetting
# ol' kivy see: https://github.com/kivy/kivy/issues/4225
# though this is likely a ``click`` problem
sys.argv[1:] = []
# use the trio async loop
os.environ['KIVY_EVENTLOOP'] = 'trio'
import kivy
kivy.require('1.10.0')

View File

@ -1,130 +0,0 @@
# piker: trading gear for hackers
# Copyright (C) Tyler Goodlet (in stewardship for piker0)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
'''
Anchor functions for UI placement of annotations.
'''
from __future__ import annotations
from typing import Callable, TYPE_CHECKING
from PyQt5.QtCore import QPointF
from PyQt5.QtWidgets import QGraphicsPathItem
if TYPE_CHECKING:
from ._chart import ChartPlotWidget
from ._label import Label
def vbr_left(
label: Label,
) -> Callable[..., float]:
'''
Return a closure which gives the scene x-coordinate for the leftmost
point of the containing view box.
'''
return label.vbr().left
def right_axis(
chart: ChartPlotWidget, # noqa
label: Label,
side: str = 'left',
offset: float = 10,
avoid_book: bool = True,
# width: float = None,
) -> Callable[..., float]:
'''Return a position closure which gives the scene x-coordinate for
the x point on the right y-axis minus the width of the label given
its contents.
'''
ryaxis = chart.getAxis('right')
if side == 'left':
if avoid_book:
def right_axis_offset_by_w() -> float:
# l1 spread graphics x-size
l1_len = chart._max_l1_line_len
# sum of all distances "from" the y-axis
right_offset = l1_len + label.w + offset
return ryaxis.pos().x() - right_offset
else:
def right_axis_offset_by_w() -> float:
return ryaxis.pos().x() - (label.w + offset)
return right_axis_offset_by_w
elif side == 'right':
# axis_offset = ryaxis.style['tickTextOffset'][0]
def on_axis() -> float:
return ryaxis.pos().x() # + axis_offset - 2
return on_axis
def gpath_pin(
gpath: QGraphicsPathItem,
label: Label, # noqa
location_description: str = 'right-of-path-centered',
use_right_of_pp_label: bool = False,
) -> QPointF:
# get actual arrow graphics path
path_br = gpath.mapToScene(gpath.path()).boundingRect()
# label.vb.locate(label.txt) #, children=True)
if location_description == 'right-of-path-centered':
return path_br.topRight() - QPointF(label.h/16, label.h / 3)
if location_description == 'left-of-path-centered':
return path_br.topLeft() - QPointF(label.w, label.h / 6)
elif location_description == 'below-path-left-aligned':
return path_br.bottomLeft() - QPointF(0, label.h / 6)
elif location_description == 'below-path-right-aligned':
return path_br.bottomRight() - QPointF(label.w, label.h / 6)
def pp_tight_and_right(
label: Label
) -> QPointF:
'''
Place *just* right of the pp label.
'''
# txt = label.txt
return label.txt.pos() + QPointF(label.w - label.h/3, 0)
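The anchor helpers above all share one pattern: a zero-arg closure that is re-evaluated on every paint cycle. A Qt-free sketch of the same idea (names hypothetical):

    def right_of_axis(get_axis_x, offset: float):
        # capture dynamic state; return the current x on each call
        def pos() -> float:
            return get_axis_x() - offset
        return pos

    get_x = right_of_axis(lambda: 800.0, 10.0)
    assert get_x() == 790.0  # recomputed whenever the axis moves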

View File

@ -1,281 +0,0 @@
# piker: trading gear for hackers
# Copyright (C) Tyler Goodlet (in stewardship for piker0)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""
Annotations for ur faces.
"""
from typing import Callable, Optional
from PyQt5 import QtCore, QtGui, QtWidgets
from PyQt5.QtCore import QPointF, QRectF
from PyQt5.QtWidgets import QGraphicsPathItem
from pyqtgraph import Point, functions as fn, Color
import numpy as np
def mk_marker_path(
style: str,
) -> QtGui.QPainterPath:
"""Build a unit-sized marker path (the caller wraps and scales it,
e.g. via a ``QGraphicsPathItem``) ready to be placed using scene
coordinates (not view).
**Arguments**
style String indicating the style of marker to add:
``'<|'``, ``'|>'``, ``'>|'``, ``'|<'``, ``'<|>'``,
``'>|<'``, ``'^'``, ``'v'``, ``'o'``
"""
path = QtGui.QPainterPath()
if style == 'o':
path.addEllipse(QtCore.QRectF(-0.5, -0.5, 1, 1))
# arrow pointing away-from the top of line
if '<|' in style:
p = QtGui.QPolygonF([Point(0.5, 0), Point(0, -0.5), Point(-0.5, 0)])
path.addPolygon(p)
path.closeSubpath()
# arrow pointing away-from the bottom of line
if '|>' in style:
p = QtGui.QPolygonF([Point(0.5, 0), Point(0, 0.5), Point(-0.5, 0)])
path.addPolygon(p)
path.closeSubpath()
# arrow pointing in-to the top of line
if '>|' in style:
p = QtGui.QPolygonF([Point(0.5, -0.5), Point(0, 0), Point(-0.5, -0.5)])
path.addPolygon(p)
path.closeSubpath()
# arrow pointing in-to the bottom of line
if '|<' in style:
p = QtGui.QPolygonF([Point(0.5, 0.5), Point(0, 0), Point(-0.5, 0.5)])
path.addPolygon(p)
path.closeSubpath()
if '^' in style:
p = QtGui.QPolygonF([Point(0, -0.5), Point(0.5, 0), Point(0, 0.5)])
path.addPolygon(p)
path.closeSubpath()
if 'v' in style:
p = QtGui.QPolygonF([Point(0, -0.5), Point(-0.5, 0), Point(0, 0.5)])
path.addPolygon(p)
path.closeSubpath()
# self._maxMarkerSize = max([m[2] / 2. for m in self.markers])
return path
class LevelMarker(QGraphicsPathItem):
'''An arrow marker path graphic which redraws itself
to the specified view coordinate level on each paint cycle.
'''
def __init__(
self,
chart: 'ChartPlotWidget', # noqa
style: str,
get_level: Callable[..., float],
size: float = 20,
keep_in_view: bool = True,
on_paint: Optional[Callable] = None,
) -> None:
# get polygon and scale
super().__init__()
self.scale(size, size)
# internally generates path
self._style = None
self.style = style
self.chart = chart
self.get_level = get_level
self._on_paint = on_paint
self.scene_x = lambda: chart.marker_right_points()[1]
self.level: float = 0
self.keep_in_view = keep_in_view
@property
def style(self) -> str:
return self._style
@style.setter
def style(self, value: str) -> None:
if self._style != value:
polygon = mk_marker_path(value)
self.setPath(polygon)
self._style = value
def path_br(self) -> QRectF:
'''Return the bounding rect for the opaque path part
of this item.
'''
return self.mapToScene(
self.path()
).boundingRect()
def delete(self) -> None:
self.scene().removeItem(self)
@property
def h(self) -> float:
return self.path_br().height()
@property
def w(self) -> float:
return self.path_br().width()
def position_in_view(
self,
# level: float,
) -> None:
'''Show a pp off-screen indicator for a level label.
This is like in fps games where you have a gps "nav" indicator
but your teammate is outside the range of view, except in 2D, on
the y-dimension.
'''
level = self.get_level()
view = self.chart.getViewBox()
vr = view.state['viewRange']
ymn, ymx = vr[1]
# _, marker_right, _ = line._chart.marker_right_points()
x = self.scene_x()
if self.style == '>|': # short style, points "down-to" line
top_offset = self.h
bottom_offset = 0
else:
top_offset = 0
bottom_offset = self.h
if level > ymx: # pin to top of view
self.setPos(
QPointF(
x,
top_offset + self.h/3,
)
)
elif level < ymn: # pin to bottom of view
self.setPos(
QPointF(
x,
view.height() - (bottom_offset + self.h/3),
)
)
else:
# pp line is viewable so show marker normally
self.setPos(
x,
self.chart.view.mapFromView(
QPointF(0, self.get_level())
).y()
)
def paint(
self,
p: QtGui.QPainter,
opt: QtWidgets.QStyleOptionGraphicsItem,
w: QtWidgets.QWidget
) -> None:
'''Core paint which we override to always update
our marker position in scene coordinates from a
view coordinate "level".
'''
if self.keep_in_view:
self.position_in_view()
super().paint(p, opt, w)
if self._on_paint:
self._on_paint(self)
def qgo_draw_markers(
markers: list,
color: Color,
p: QtGui.QPainter,
left: float,
right: float,
right_offset: float,
) -> float:
"""Paint markers in ``pg.GraphicsItem`` style by first
removing the view transform for the painter, drawing the markers
in scene coords, then restoring the view coords.
"""
# paint markers in native coordinate system
orig_tr = p.transform()
start = orig_tr.map(Point(left, 0))
end = orig_tr.map(Point(right, 0))
up = orig_tr.map(Point(left, 1))
dif = end - start
# length = Point(dif).length()
angle = np.arctan2(dif.y(), dif.x()) * 180 / np.pi
p.resetTransform()
p.translate(start)
p.rotate(angle)
up = up - start
det = up.x() * dif.y() - dif.x() * up.y()
p.scale(1, 1 if det > 0 else -1)
p.setBrush(fn.mkBrush(color))
# p.setBrush(fn.mkBrush(self.currentPen.color()))
tr = p.transform()
sizes = []
for path, pos, size in markers:
p.setTransform(tr)
# XXX: we drop the "scale / %" placement
# x = length * pos
x = right_offset
p.translate(x, 0)
p.scale(size, size)
p.drawPath(path)
sizes.append(size)
p.setTransform(orig_tr)
return max(sizes)
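A brief usage sketch for ``mk_marker_path()`` above (assumes a Qt application context; unit-sized paths are scaled to pixels by the caller, e.g. ``LevelMarker``):

    path = mk_marker_path('<|>')   # double-headed arrow
    item = QGraphicsPathItem(path)
    item.setScale(20)              # 20px marker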

View File

@ -1,183 +0,0 @@
# piker: trading gear for hackers
# Copyright (C) Tyler Goodlet (in stewardship for pikers)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
'''
Main app startup and run.
'''
from functools import partial
from PyQt5.QtCore import QEvent
import trio
from .._daemon import maybe_spawn_brokerd
from ..brokers import get_brokermod
from . import _event
from ._exec import run_qtractor
from ..data.feed import install_brokerd_search
from . import _search
from ._chart import GodWidget
from ..log import get_logger
log = get_logger(__name__)
async def load_provider_search(
broker: str,
loglevel: str,
) -> None:
log.info(f'loading brokerd for {broker}..')
async with (
maybe_spawn_brokerd(
broker,
loglevel=loglevel
) as portal,
install_brokerd_search(
portal,
get_brokermod(broker),
),
):
# keep search engine stream up until cancelled
await trio.sleep_forever()
async def _async_main(
# implicit required argument provided by ``qtractor_run()``
main_widget: GodWidget,
sym: str,
brokernames: str,
loglevel: str,
) -> None:
"""
Main Qt-trio routine invoked by the Qt loop with the widgets ``dict``.
Provision the "main" widget with initial symbol data and root nursery.
"""
from . import _display
godwidget = main_widget
# attempt to configure DPI aware font size
screen = godwidget.window.current_screen()
# configure graphics update throttling based on display refresh rate
_display._quote_throttle_rate = min(
round(screen.refreshRate()),
_display._quote_throttle_rate,
)
log.info(f'Set graphics update rate to {_display._quote_throttle_rate} Hz')
# TODO: do styling / themeing setup
# _style.style_ze_sheets(godwidget)
sbar = godwidget.window.status_bar
starting_done = sbar.open_status('starting ze sexy chartz')
async with (
trio.open_nursery() as root_n,
):
# set root nursery and task stack for spawning other charts/feeds
# that run cached in the bg
godwidget._root_n = root_n
# setup search widget and focus main chart view at startup
# search widget is a singleton alongside the godwidget
search = _search.SearchWidget(godwidget=godwidget)
search.bar.unfocus()
godwidget.hbox.addWidget(search)
godwidget.search = search
symbol, _, provider = sym.rpartition('.')
# this internally starts a ``display_symbol_data()`` task above
order_mode_ready = await godwidget.load_symbol(
provider,
symbol,
loglevel
)
# spin up a search engine for the local cached symbol set
async with _search.register_symbol_search(
provider_name='cache',
search_routine=partial(
_search.search_simple_dict,
source=godwidget._chart_cache,
),
# cache is super fast so debounce on super short period
pause_period=0.01,
):
# load other providers into search **after**
# the chart's select cache
for broker in brokernames:
root_n.start_soon(load_provider_search, broker, loglevel)
await order_mode_ready.wait()
# start handling peripherals input for top level widgets
async with (
# search bar kb input handling
_event.open_handlers(
[search.bar],
event_types={
QEvent.KeyPress,
},
async_handler=_search.handle_keyboard_input,
filter_auto_repeats=False, # let repeats passthrough
),
# completer view mouse click signal handling
_event.open_signal_handler(
search.view.pressed,
search.view.on_pressed,
),
):
# remove startup status text
starting_done()
await trio.sleep_forever()
def _main(
sym: str,
brokernames: list[str],
piker_loglevel: str,
tractor_kwargs,
) -> None:
'''
Sync entry point to start a chart: boots a ``tractor`` + Qt runtime.
'''
run_qtractor(
func=_async_main,
args=(sym, brokernames, piker_loglevel),
main_widget=GodWidget,
tractor_kwargs=tractor_kwargs,
)

View File

@ -1,593 +0,0 @@
# piker: trading gear for hackers
# Copyright (C) Tyler Goodlet (in stewardship for piker0)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""
Chart axes graphics and behavior.
"""
from functools import lru_cache
from typing import Optional, Callable
from math import floor
import numpy as np
import pyqtgraph as pg
from PyQt5 import QtCore, QtGui, QtWidgets
from PyQt5.QtCore import QPointF
from ..data._source import float_digits
from ._label import Label
from ._style import DpiAwareFont, hcolor, _font
from ._interaction import ChartView
_axis_pen = pg.mkPen(hcolor('bracket'))
class Axis(pg.AxisItem):
'''
A better axis that sizes tick contents considering font size.
'''
def __init__(
self,
linkedsplits,
typical_max_str: str = '100 000.000',
text_color: str = 'bracket',
**kwargs
) -> None:
super().__init__(
# textPen=textPen,
**kwargs
)
# XXX: pretty sure this makes things slower
# self.setCacheMode(QtWidgets.QGraphicsItem.DeviceCoordinateCache)
self.linkedsplits = linkedsplits
self._dpi_font = _font
self.setTickFont(_font.font)
font_size = self._dpi_font.font.pixelSize()
if self.orientation in ('bottom',):
text_offset = floor(0.25 * font_size)
elif self.orientation in ('left', 'right'):
text_offset = floor(font_size / 2)
self.setStyle(**{
'textFillLimits': [(0, 0.5)],
'tickFont': self._dpi_font.font,
# offset of text *away from* axis line in px
# use approx. half the font pixel size (height)
'tickTextOffset': text_offset,
})
self.setTickFont(_font.font)
# NOTE: this is for surrounding "border"
self.setPen(_axis_pen)
# this is the text color
# self.setTextPen(pg.mkPen(hcolor(text_color)))
self.text_color = text_color
self.typical_br = _font._qfm.boundingRect(typical_max_str)
# size the pertinent axis dimension to a "typical value"
self.size_to_values()
@property
def text_color(self) -> str:
return self._text_color
@text_color.setter
def text_color(self, text_color: str) -> None:
self.setTextPen(pg.mkPen(hcolor(text_color)))
self._text_color = text_color
def size_to_values(self) -> None:
pass
def txt_offsets(self) -> tuple[int, int]:
return tuple(self.style['tickTextOffset'])
class PriceAxis(Axis):
def __init__(
self,
*args,
min_tick: int = 2,
title: str = '',
formatter: Optional[Callable[[float], str]] = None,
**kwargs
) -> None:
super().__init__(*args, **kwargs)
self.formatter = formatter
self._min_tick: int = min_tick
self.title = None
def set_title(
self,
title: str,
view: Optional[ChartView] = None,
color: Optional[str] = None,
) -> Label:
'''
Set a sane UX label using our built-in ``Label``.
'''
# XXX: built-in labels but they're huge, and placed weird..
# self.setLabel(title)
# self.showLabel()
label = self.title = Label(
view=view or self.linkedView(),
fmt_str=title,
color=color or self.text_color,
parent=self,
# update_on_range_change=False,
)
def below_axis() -> QPointF:
return QPointF(
0,
self.size().height(),
)
# XXX: doesn't work? have to pass it above
# label.txt.setParent(self)
label.scene_anchor = below_axis
label.render()
label.show()
label.update()
return label
def set_min_tick(
self,
size: int
) -> None:
self._min_tick = size
def size_to_values(self) -> None:
# self.typical_br = _font._qfm.boundingRect(typical_max_str)
self.setWidth(self.typical_br.width())
# XXX: drop for now since it just eats up h space
def tickStrings(
self,
vals: tuple[float],
scale: float,
spacing: float,
) -> list[str]:
# TODO: figure out how to enforce min tick spacing by passing it
# into the parent type
digits = max(
float_digits(spacing * scale),
self._min_tick,
)
if self.title:
self.title.update()
# print(f'vals: {vals}\nscale: {scale}\nspacing: {spacing}')
# print(f'digits: {digits}')
if not self.formatter:
return [
('{value:,.{digits}f}').format(
digits=digits,
value=v,
).replace(',', ' ') for v in vals
]
else:
return list(map(self.formatter, vals))
class DynamicDateAxis(Axis):
# time formats mapped by seconds between bars
tick_tpl = {
60 * 60 * 24: '%Y-%b-%d',
60: '%H:%M',
30: '%H:%M:%S',
5: '%H:%M:%S',
1: '%H:%M:%S',
}
def size_to_values(self) -> None:
self.setHeight(self.typical_br.height() + 1)
def _indexes_to_timestrs(
self,
indexes: list[int],
) -> list[str]:
chart = self.linkedsplits.chart
flow = chart._flows[chart.name]
shm = flow.shm
bars = shm.array
first = shm._first.value
bars_len = len(bars)
times = bars['time']
epochs = times[list(
map(
int,
filter(
lambda i: i > 0 and i < bars_len,
(i-first for i in indexes)
)
)
)]
# TODO: **don't** have this hard coded shift to EST
# delay = times[-1] - times[-2]
dts = np.array(epochs, dtype='datetime64[s]')
# see units listing:
# https://numpy.org/devdocs/reference/arrays.datetime.html#datetime-units
return list(np.datetime_as_string(dts))
# TODO: per timeframe formatting?
# - we probably need this based on zoom now right?
# prec = self.np_dt_precision[delay]
# return dts.strftime(self.tick_tpl[delay])
def tickStrings(
self,
values: tuple[float],
scale: float,
spacing: float,
) -> list[str]:
# info = self.tickStrings.cache_info()
# print(info)
return self._indexes_to_timestrs(values)
class AxisLabel(pg.GraphicsObject):
_x_margin = 0
_y_margin = 0
def __init__(
self,
parent: pg.GraphicsItem,
digits: int = 2,
bg_color: str = 'bracket',
fg_color: str = 'black',
opacity: int = 1, # XXX: seriously don't set this to 0
font_size: str = 'default',
use_arrow: bool = True,
) -> None:
super().__init__()
self.setParentItem(parent)
self.setFlag(self.ItemIgnoresTransformations)
# XXX: pretty sure this is faster
self.setCacheMode(QtWidgets.QGraphicsItem.DeviceCoordinateCache)
self._parent = parent
self.opacity = opacity
self.label_str = ''
self.digits = digits
self._txt_br: QtCore.QRect = None
self._dpifont = DpiAwareFont(font_size=font_size)
self._dpifont.configure_to_dpi()
self.bg_color = pg.mkColor(hcolor(bg_color))
self.fg_color = pg.mkColor(hcolor(fg_color))
self._use_arrow = use_arrow
# create triangle path
self.path = None
self.rect = None
self._pw = self.pixelWidth()
def paint(
self,
p: QtGui.QPainter,
opt: QtWidgets.QStyleOptionGraphicsItem,
w: QtWidgets.QWidget
) -> None:
"""Draw a filled rectangle based on the size of ``.label_str`` text.
Subtypes can customize further by overloading ``.draw()``.
"""
# p.setCompositionMode(QtWidgets.QPainter.CompositionMode_SourceOver)
if self.label_str:
# if not self.rect:
self._size_br_from_str(self.label_str)
# can be overridden in subtype
self.draw(p, self.rect)
p.setFont(self._dpifont.font)
p.setPen(self.fg_color)
p.drawText(self.rect, self.text_flags, self.label_str)
def draw(
self,
p: QtGui.QPainter,
rect: QtCore.QRectF
) -> None:
if self._use_arrow:
if not self.path:
self._draw_arrow_path()
p.drawPath(self.path)
p.fillPath(self.path, pg.mkBrush(self.bg_color))
# this adds a nice black outline around the label for some odd
# reason; ok by us
p.setOpacity(self.opacity)
# this causes the L1 labels to glitch out if used in the subtype
# and it will leave a small black strip with the arrow path if
# done before the above
p.fillRect(self.rect, self.bg_color)
def boundingRect(self): # noqa
'''
Size the graphics space from the text contents.
'''
if self.label_str:
self._size_br_from_str(self.label_str)
# if self.path:
# self.tl = self.path.controlPointRect().topLeft()
if not self.path:
self.tl = self.rect.topLeft()
return QtCore.QRectF(
self.tl,
self.rect.bottomRight(),
)
return QtCore.QRectF()
# TODO: but the input probably needs to be the "len" of
# the current text value:
@lru_cache
def _size_br_from_str(
self,
value: str
) -> tuple[float, float]:
'''
Do our best to render the bounding rect to a set margin
around provided string contents.
'''
# size the filled rect to text and/or parent axis
# if not self._txt_br:
# # XXX: this can't be called until stuff is rendered?
# self._txt_br = self._dpifont.boundingRect(value)
txt_br = self._txt_br = self._dpifont.boundingRect(value)
txt_h, txt_w = txt_br.height(), txt_br.width()
# print(f'wsw: {self._dpifont.boundingRect(" ")}')
# allow subtypes to specify a static width and height
h, w = self.size_hint()
# print(f'axis size: {self._parent.size()}')
# print(f'axis geo: {self._parent.geometry()}')
self.rect = QtCore.QRectF(
0, 0,
(w or txt_w) + self._x_margin / 2,
(h or txt_h) + self._y_margin / 2,
)
# print(self.rect)
# hb = self.path.controlPointRect()
# hb_size = hb.size()
return (self.rect.width(), self.rect.height())
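# NOTE (sketch): since ``functools.lru_cache`` keys on the
# ``(self, value)`` pair, every distinct label string gets its own
# cache entry; the TODO above suggests keying on ``len(value)``
# instead so equal-length strings would share one computed rect
# (an approximation for non-monospace fonts).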
# _common_text_flags = (
# QtCore.Qt.TextDontClip |
# QtCore.Qt.AlignCenter |
# QtCore.Qt.AlignTop |
# QtCore.Qt.AlignHCenter |
# QtCore.Qt.AlignVCenter
# )
class XAxisLabel(AxisLabel):
_x_margin = 8
text_flags = (
QtCore.Qt.TextDontClip
| QtCore.Qt.AlignCenter
)
def size_hint(self) -> tuple[float, float]:
# size to parent axis height
return self._parent.height(), None
def update_label(
self,
abs_pos: QPointF, # scene coords
value: float, # data for text
offset: int = 0  # extra offset to use when margins are set
) -> None:
timestrs = self._parent._indexes_to_timestrs([int(value)])
if not len(timestrs):
return
pad = 1*' '
self.label_str = pad + str(timestrs[0]) + pad
_, y_offset = self._parent.txt_offsets()
w = self.boundingRect().width()
self.setPos(
QPointF(
abs_pos.x() - w/2 - self._pw,
y_offset/2,
)
)
self.update()
def _draw_arrow_path(self):
y_offset = self._parent.style['tickTextOffset'][1]
path = QtGui.QPainterPath()
h, w = self.rect.height(), self.rect.width()
middle = w/2 - self._pw * 0.5
aw = h/2
left = middle - aw
right = middle + aw
path.moveTo(left, 0)
path.lineTo(middle, -y_offset)
path.lineTo(right, 0)
path.closeSubpath()
self.path = path
# top left point is local origin and tip of the arrow path
self.tl = QtCore.QPointF(0, -y_offset)
class YAxisLabel(AxisLabel):
_y_margin = 4
text_flags = (
QtCore.Qt.AlignLeft
# QtCore.Qt.AlignHCenter
| QtCore.Qt.AlignVCenter
| QtCore.Qt.TextDontClip
)
def __init__(
self,
chart,
*args,
**kwargs
) -> None:
super().__init__(*args, **kwargs)
self._chart = chart
chart.sigRangeChanged.connect(self.update_on_resize)
self._last_datum = (None, None)
# pull the text offset from the parent axis
if getattr(self._parent, 'txt_offsets', False):
self.x_offset, y_offset = self._parent.txt_offsets()
def size_hint(self) -> tuple[float, float]:
# size to parent axis width(-ish)
wsh = self._dpifont.boundingRect(' ').height() / 2
return (
None,
self._parent.size().width() - wsh,
)
def update_label(
self,
abs_pos: QPointF, # scene coords
value: float, # data for text
# on odd dimension and/or adds nice black line
x_offset: Optional[int] = None
) -> None:
# this is read inside ``.paint()``
self.label_str = '{value:,.{digits}f}'.format(
digits=self.digits, value=value).replace(',', ' ')
# pull the text offset from the parent axis
x_offset = x_offset or self.x_offset
br = self.boundingRect()
h = br.height()
self.setPos(
QPointF(
x_offset,
abs_pos.y() - h / 2 - self._pw,
)
)
self.update()
def update_on_resize(self, vr, r):
'''
This is a ``.sigRangeChanged()`` handler.
'''
index, last = self._last_datum
if index is not None:
self.update_from_data(index, last)
def update_from_data(
self,
index: int,
value: float,
_save_last: bool = True,
) -> None:
'''
Update the label's text contents **and** position from
a view box coordinate datum.
'''
if _save_last:
self._last_datum = (index, value)
self.update_label(
self._chart.mapFromView(QPointF(index, value)),
value
)
def _draw_arrow_path(self):
x_offset = self._parent.style['tickTextOffset'][0]
path = QtGui.QPainterPath()
h = self.rect.height()
path.moveTo(0, 0)
path.lineTo(-x_offset - h/4, h/2. - self._pw/2)
path.lineTo(0, h)
path.closeSubpath()
self.path = path
self.tl = path.controlPointRect().topLeft()

File diff suppressed because it is too large

View File

@ -1,318 +0,0 @@
# piker: trading gear for hackers
# Copyright (C) Tyler Goodlet (in stewardship for pikers)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
'''
Graphics-related downsampling routines for compressing to pixel
limits on the display device.
'''
import math
from typing import Optional
import numpy as np
from numpy.lib import recfunctions as rfn
from numba import (
jit,
# float64, optional, int64,
)
from ..log import get_logger
log = get_logger(__name__)
def hl2mxmn(ohlc: np.ndarray) -> np.ndarray:
'''
Convert an OHLC struct-array containing 'high'/'low' columns
to a "joined" max/min 1-d array.
'''
index = ohlc['index']
hls = ohlc[[
'low',
'high',
]]
mxmn = np.empty(2*hls.size, dtype=np.float64)
x = np.empty(2*hls.size, dtype=np.float64)
trace_hl(hls, mxmn, x, index[0])
x = x + index[0]
return mxmn, x
@jit(
# TODO: the type annots..
# float64[:](float64[:],),
nopython=True,
)
def trace_hl(
hl: 'np.ndarray',
out: np.ndarray,
x: np.ndarray,
start: int,
# the "offset" values in the x-domain which
# place the 2 output points around each ``int``
# master index.
margin: float = 0.43,
) -> None:
'''
"Trace" the outline of the high-low values of an ohlc sequence
as a line such that the maximum deviation (aka dispersion) between
bars is preserved.
This routine is expected to modify input arrays in-place.
'''
last_l = hl['low'][0]
last_h = hl['high'][0]
for i in range(hl.size):
row = hl[i]
l, h = row['low'], row['high']
up_diff = h - last_l
down_diff = last_h - l
if up_diff > down_diff:
out[2*i + 1] = h
out[2*i] = last_l
else:
out[2*i + 1] = l
out[2*i] = last_h
last_l = l
last_h = h
x[2*i] = int(i) - margin
x[2*i + 1] = int(i) + margin
return out
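# NOTE (sketch): a worked pass of the loop above for two bars with
# (low, high) = (9, 11) then (10, 14):
#   i=0: up_diff = 11-9 = 2, down_diff = 11-9 = 2 (seeded from bar 0),
#        tie -> else branch: out[0:2] = (last_h, l) = (11, 9)
#   i=1: up_diff = 14-9 = 5 > down_diff = 11-10 = 1,
#        so out[2:4] = (last_l, h) = (9, 14)
# i.e. each bar contributes 2 points, ordered so the maximum spread
# (dispersion) between adjacent bars is preserved.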
def ohlc_flatten(
ohlc: np.ndarray,
use_mxmn: bool = True,
) -> tuple[np.ndarray, np.ndarray]:
'''
Convert an OHLCV struct-array into a flat, ready-for-line-plotting
1-d array with x-domain values distributed evenly across each index:
the full-OHLC path emits 4 points per bar, the default max/min path 2.
'''
index = ohlc['index']
if use_mxmn:
# traces a line optimally over highs to lows
# using numba. NOTE: pretty sure this is faster
# and looks about the same as the below output.
flat, x = hl2mxmn(ohlc)
else:
flat = rfn.structured_to_unstructured(
ohlc[['open', 'high', 'low', 'close']]
).flatten()
x = np.linspace(
start=index[0] - 0.5,
stop=index[-1] + 0.5,
num=len(flat),
)
return x, flat
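# NOTE (sketch): minimal usage of ``ohlc_flatten()`` with a
# hand-rolled struct-array (the dtype below is an assumption
# matching the field accessors used above):
#
#   import numpy as np
#   bars = np.array(
#       [(0, 1.0, 2.0, 0.5, 1.5), (1, 1.5, 2.5, 1.0, 2.0)],
#       dtype=[
#           ('index', 'i8'), ('open', 'f8'), ('high', 'f8'),
#           ('low', 'f8'), ('close', 'f8'),
#       ],
#   )
#   x, flat = ohlc_flatten(bars, use_mxmn=False)
#   assert len(flat) == 4 * len(bars)  # 4 points per bar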
def ds_m4(
x: np.ndarray,
y: np.ndarray,
# units-per-pixel-x(dimension)
uppx: float,
# XXX: troll zone / easter egg..
# want to mess with ur pal, pass in the actual
# pixel width here instead of uppx-proper (i.e. pass
# in our ``pg.GraphicsObject`` derivative's ``.px_width()``
# to mega-trip-out ur bud). Hint, it used to be implemented
# (wrongly) using "pixel width", so check the git history ;)
xrange: Optional[float] = None,
) -> tuple[int, np.ndarray, np.ndarray]:
'''
Downsample using the M4 algorithm.
This is more or less an OHLC style sampling of a line-style series.
'''
# NOTE: this method is a so-called "visualization driven data
# aggregation" approach. It gives error-free line chart
# downsampling, see
# further scientific paper resources:
# - http://www.vldb.org/pvldb/vol7/p797-jugel.pdf
# - http://www.vldb.org/2014/program/papers/demo/p997-jugel.pdf
# Details on implementation of this algo are based in,
# https://github.com/pikers/piker/issues/109
# XXX: from infinite on downsampling viewable graphics:
# "one thing i remembered about the binning - if you are
# picking a range within your timeseries the start and end bin
# should be one more bin size outside the visual range, then
# you get better visual fidelity at the edges of the graph"
# "i didn't show it in the sample code, but it's accounted for
# in the start and end indices and number of bins"
# should never get called unless actually needed
assert uppx > 1
# NOTE: if we didn't pre-slice the data to downsample
# you could in theory pass these as the slicing params,
# do we care though since we can always just pre-slice the
# input?
x_start = x[0] # x value start/lowest in domain
if xrange is None:
x_end = x[-1] # x end value/highest in domain
xrange = (x_end - x_start)
# XXX: always round up on the input pixels
# lnx = len(x)
# uppx *= max(4 / (1 + math.log(uppx, 2)), 1)
pxw = math.ceil(xrange / uppx)
# scale up the frame "width" directly with uppx
w = uppx
# ensure we make more than enough
# frames (windows) for the output pixel
frames = pxw
# if the x-range doesn't divide evenly into the frame count
# (i.e. there's a remainder of datum-domain-points per
# window-frame), add one more window to ensure we have room
# for all output down-samples.
pts_per_pixel, r = divmod(xrange, frames)
if r:
# while r:
frames += 1
pts_per_pixel, r = divmod(xrange, frames)
# print(
# f'uppx: {uppx}\n'
# f'xrange: {xrange}\n'
# f'pxw: {pxw}\n'
# f'frames: {frames}\n'
# )
assert frames >= (xrange / uppx)
# call into ``numba``
nb, i_win, y_out = _m4(
x,
y,
frames,
# TODO: see func below..
# i_win,
# y_out,
# first index in x data to start at
x_start,
# window size for each "frame" of data to downsample (normally
# scaled by the ratio of pixels on screen to data in x-range).
w,
)
# filter out any overshoot in the input allocation arrays by
# removing zero-ed tail entries which should start at a certain
# index.
i_win = i_win[i_win != 0]
y_out = y_out[:i_win.size]
return nb, i_win, y_out
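# NOTE (sketch): calling ``ds_m4()`` on a toy series, assuming a zoom
# level of 4 x-units per pixel:
#
#   x = np.arange(1, 65, dtype='f8')
#   y = np.sin(x / 8)
#   nb, i_win, y_out = ds_m4(x, y, uppx=4)
#   # each surviving window yields 4 y-points: (first, min, max, last)
#   assert y_out.shape == (i_win.size, 4)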
@jit(
nopython=True,
nogil=True,
)
def _m4(
xs: np.ndarray,
ys: np.ndarray,
frames: int,
# TODO: pre-allocating these arrays in pure python and passing
# them in (instead of the ``.zeros()`` alloc lines below) was
# causing seg faults and alloc crashes..
# we might need to see how it behaves with shm arrays and consider
# allocating them once at startup?
# pre-alloc array of x indices mapping to the start
# of each window used for downsampling in y.
# i_win: np.ndarray,
# pre-alloc array of output downsampled y values
# y_out: np.ndarray,
x_start: int,
step: float,
) -> int:
# nbins = len(i_win)
# count = len(xs)
# these are pre-allocated and mutated by ``numba``
# code in-place.
y_out = np.zeros((frames, 4), ys.dtype)
i_win = np.zeros(frames, xs.dtype)
bincount = 0
x_left = x_start
# Find the first window's starting value which *includes* the
# first value in the x-domain array, i.e. the first
# "left-side-of-window" **plus** the downsampling step,
# creates a window which includes the first x **value**.
while xs[0] >= x_left + step:
x_left += step
# set all bins in the left-most entry to the starting left-most x value
# (aka a row broadcast).
i_win[bincount] = x_left
# set all y-values to the first value passed in.
y_out[bincount] = ys[0]
for i in range(len(xs)):
x = xs[i]
y = ys[i]
if x < x_left + step: # the current window "step" is [bin, bin+1)
y_out[bincount, 1] = min(y, y_out[bincount, 1])
y_out[bincount, 2] = max(y, y_out[bincount, 2])
y_out[bincount, 3] = y
else:
# Find the next bin
while x >= x_left + step:
x_left += step
bincount += 1
i_win[bincount] = x_left
y_out[bincount] = y
return bincount, i_win, y_out
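# NOTE (sketch): the initial window-advance loop above, worked for
# x_start=0, step=2 and x data beginning at 5:
#   5 >= 0+2 -> x_left=2; 5 >= 2+2 -> x_left=4; 5 >= 4+2 is False,
# leaving the first window [4, 6) which indeed contains x[0] = 5.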

View File

@ -1,661 +0,0 @@
# piker: trading gear for hackers
# Copyright (C) Tyler Goodlet (in stewardship for piker0)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""
Mouse interaction graphics
"""
from functools import partial
from typing import Optional, Callable
import inspect
import numpy as np
import pyqtgraph as pg
from PyQt5 import QtCore, QtWidgets
from PyQt5.QtCore import QPointF, QRectF
from ._style import (
_xaxis_at,
hcolor,
_font_small,
_font,
)
from ._axes import YAxisLabel, XAxisLabel
from ..log import get_logger
log = get_logger(__name__)
# XXX: these settings seem to result in really decent mouse scroll
# latency (in terms of perceived lag in cross hair) so really be sure
# there's an improvement if you want to change it!
_mouse_rate_limit = 60  # TODO: should we calc current screen refresh rate?
_debounce_delay = 0
_ch_label_opac = 1
# TODO: we need to handle the case where index is outside
# the underlying datums range
class LineDot(pg.CurvePoint):
def __init__(
self,
curve: pg.PlotCurveItem,
index: int,
plot: 'ChartPlotWidget',  # type: ignore # noqa
pos=None,
color: str = 'default_light',
) -> None:
# scale from dpi aware font size
size = int(_font.px_size * 0.375)
pg.CurvePoint.__init__(
self,
curve,
index=index,
pos=pos,
rotate=False,
)
self._plot = plot
# TODO: get pen from curve if not defined?
cdefault = hcolor(color)
pen = pg.mkPen(cdefault)
brush = pg.mkBrush(cdefault)
# presuming this is fast since it's built in?
dot = self.dot = QtWidgets.QGraphicsEllipseItem(
QtCore.QRectF(-size / 2, -size / 2, size, size)
)
# if we needed a transformable dot?
# dot.translate(-size*0.5, -size*0.5)
dot.setPen(pen)
dot.setBrush(brush)
dot.setParentItem(self)
# keep a static size
self.setFlag(self.ItemIgnoresTransformations)
def event(
self,
ev: QtCore.QEvent,
) -> bool:
if (
not isinstance(ev, QtCore.QDynamicPropertyChangeEvent)
or self.curve() is None
):
return False
# TODO: get rid of this ``.getData()`` and
# make a more pythonic api to retrieve backing
# numpy arrays...
# (x, y) = self.curve().getData()
# index = self.property('index')
# # first = self._plot._arrays['ohlc'][0]['index']
# # first = x[0]
# # i = index - first
# if index:
# i = round(index - x[0])
# if i > 0 and i < len(y):
# newPos = (index, y[i])
# QtWidgets.QGraphicsItem.setPos(
# self,
# *newPos,
# )
# return True
return False
# TODO: change this into our own ``_label.Label``
class ContentsLabel(pg.LabelItem):
"""Label anchored to a ``ViewBox`` typically for displaying
datum-wise points from the "viewed" contents.
"""
_corner_anchors = {
'top': 0,
'left': 0,
'bottom': 1,
'right': 1,
}
# XXX: fyi naming here is confusing / opposite to coords
_corner_margins = {
('top', 'left'): (-2, lambda font_size: -font_size*0.25),
('top', 'right'): (2, lambda font_size: -font_size*0.25),
('bottom', 'left'): (-2, lambda font_size: font_size),
('bottom', 'right'): (2, lambda font_size: font_size),
}
def __init__(
self,
# chart: 'ChartPlotWidget', # noqa
view: pg.ViewBox,
anchor_at: str = ('top', 'right'),
justify_text: str = 'left',
font_size: Optional[int] = None,
) -> None:
font_size = font_size or _font_small.px_size
super().__init__(
justify=justify_text,
size=f'{str(font_size)}px'
)
# anchor to viewbox
self.setParentItem(view)
self.vb = view
view.scene().addItem(self)
v, h = anchor_at
index = (self._corner_anchors[h], self._corner_anchors[v])
margins = self._corner_margins[(v, h)]
ydim = margins[1]
if inspect.isfunction(margins[1]):
margins = margins[0], ydim(font_size)
self.anchor(itemPos=index, parentPos=index, offset=margins)
def update_from_ohlc(
self,
name: str,
index: int,
array: np.ndarray,
) -> None:
# this being "html" is the dumbest shit :eyeroll:
first = array[0]['index']
self.setText(
"<b>i</b>:{index}<br/>"
# NB: these fields must be indexed in the correct order via
# the slice syntax below.
"<b>epoch</b>:{}<br/>"
"<b>O</b>:{}<br/>"
"<b>H</b>:{}<br/>"
"<b>L</b>:{}<br/>"
"<b>C</b>:{}<br/>"
"<b>V</b>:{}<br/>"
"<b>wap</b>:{}".format(
*array[index - first][
[
'time',
'open',
'high',
'low',
'close',
'volume',
'bar_wap',
]
],
name=name,
index=index,
)
)
def update_from_value(
self,
name: str,
index: int,
array: np.ndarray,
) -> None:
first = array[0]['index']
if index < array[-1]['index'] and index > first:
data = array[index - first][name]
self.setText(f"{name}: {data:.2f}")
class ContentsLabels:
'''Collection of labels that span a ``LinkedSplits`` set of chart plots
and can be updated from the underlying data at an x-index value sent
as input from a cursor or other query mechanism.
'''
def __init__(
self,
linkedsplits: 'LinkedSplits', # type: ignore # noqa
) -> None:
self.linkedsplits = linkedsplits
self._labels: list[(
'ChartPlotWidget',  # type: ignore # noqa
str,
ContentsLabel,
Callable
)] = []
def update_labels(
self,
index: int,
) -> None:
for chart, name, label, update in self._labels:
flow = chart._flows[name]
array = flow.shm.array
if not (
index >= 0
and index < array[-1]['index']
):
# out of range
print('WTF out of range?')
continue
# call provided update func with data point
try:
label.show()
update(index, array)
except IndexError:
log.exception(f"Failed to update label: {name}")
def hide(self) -> None:
for chart, name, label, update in self._labels:
label.hide()
def add_label(
self,
chart: 'ChartPlotWidget', # type: ignore # noqa
name: str,
anchor_at: tuple[str, str] = ('top', 'left'),
update_func: Callable = ContentsLabel.update_from_value,
) -> ContentsLabel:
label = ContentsLabel(
view=chart.view,
anchor_at=anchor_at,
)
self._labels.append(
(chart, name, label, partial(update_func, label, name))
)
label.hide()
return label
class Cursor(pg.GraphicsObject):
'''
Multi-plot cursor for use on a ``LinkedSplits`` chart (set).
'''
def __init__(
self,
linkedsplits: 'LinkedSplits', # noqa
digits: int = 0
) -> None:
super().__init__()
self.linked = linkedsplits
self.graphics: dict[str, pg.GraphicsObject] = {}
self.plots: list['ChartPlotWidget'] = []  # type: ignore # noqa
self.active_plot = None
self.digits: int = digits
self._datum_xy: tuple[int, float] = (0, 0)
self._hovered: set[pg.GraphicsObject] = set()
self._trackers: set[pg.GraphicsObject] = set()
# XXX: not sure why these are instance variables?
# It's not like we can change them on the fly..?
self.pen = pg.mkPen(
color=hcolor('default'),
style=QtCore.Qt.DashLine,
)
self.lines_pen = pg.mkPen(
color=hcolor('davies'),
style=QtCore.Qt.DashLine,
)
# value used for rounding y-axis discrete tick steps
# computing once, up front, here cuz why not
self._y_incr_mult = 1 / self.linked._symbol.tick_size
# line width in view coordinates
self._lw = self.pixelWidth() * self.lines_pen.width()
# xhair label's color name
self.label_color: str = 'default'
self._y_label_update: bool = True
self.contents_labels = ContentsLabels(self.linked)
self._in_query_mode: bool = False
@property
def in_query_mode(self) -> bool:
return self._in_query_mode
@in_query_mode.setter
def in_query_mode(self, value: bool) -> None:
if self._in_query_mode and not value:
# edge trigger "off" hide all labels
self.contents_labels.hide()
elif not self._in_query_mode and value:
# edge trigger "on" hide all labels
self.contents_labels.update_labels(self._datum_xy[0])
self._in_query_mode = value
def add_hovered(
self,
item: pg.GraphicsObject,
) -> None:
assert getattr(item, 'delete', None), f"{item} must define a ``.delete()``"
self._hovered.add(item)
def add_plot(
self,
plot: 'ChartPlotWidget', # noqa
digits: int = 0,
) -> None:
'''
Add chart to tracked set such that a cross-hair and possibly
curve tracking cursor can be drawn on the plot.
'''
# add ``pg.graphicsItems.InfiniteLine``s
# vertical and horizontal lines and a y-axis label
vl = plot.addLine(x=0, pen=self.lines_pen, movable=False)
vl.setCacheMode(QtWidgets.QGraphicsItem.DeviceCoordinateCache)
hl = plot.addLine(y=0, pen=self.lines_pen, movable=False)
hl.setCacheMode(QtWidgets.QGraphicsItem.DeviceCoordinateCache)
hl.hide()
yl = YAxisLabel(
chart=plot,
# parent=plot.getAxis('right'),
parent=plot.pi_overlay.get_axis(plot.plotItem, 'right'),
digits=digits or self.digits,
opacity=_ch_label_opac,
bg_color=self.label_color,
)
yl.hide() # on startup if mouse is off screen
# TODO: checkout what ``.sigDelayed`` can be used for
# (emitted once a sufficient delay occurs in mouse movement)
px_moved = pg.SignalProxy(
plot.scene().sigMouseMoved,
rateLimit=_mouse_rate_limit,
slot=self.mouseMoved,
delay=_debounce_delay,
)
px_enter = pg.SignalProxy(
plot.sig_mouse_enter,
rateLimit=_mouse_rate_limit,
slot=lambda: self.mouseAction('Enter', plot),
delay=_debounce_delay,
)
px_leave = pg.SignalProxy(
plot.sig_mouse_leave,
rateLimit=_mouse_rate_limit,
slot=lambda: self.mouseAction('Leave', plot),
delay=_debounce_delay,
)
self.graphics[plot] = {
'vl': vl,
'hl': hl,
'yl': yl,
'px': (px_moved, px_enter, px_leave),
}
self.plots.append(plot)
# Determine where to place x-axis label.
# Place below the last plot by default, otherwise
# keep x-axis right below main chart
plot_index = -1 if _xaxis_at == 'bottom' else 0
# ONLY create an x-axis label for the cursor
# if this plot owns the 'bottom' axis.
# if 'bottom' in plot.plotItem.axes:
if plot.linked.xaxis_chart is plot:
xlabel = self.xaxis_label = XAxisLabel(
parent=self.plots[plot_index].getAxis('bottom'),
# parent=self.plots[plot_index].pi_overlay.get_axis(
# plot.plotItem, 'bottom'
# ),
opacity=_ch_label_opac,
bg_color=self.label_color,
)
# place label off-screen during startup
xlabel.setPos(
self.plots[0].mapFromView(QPointF(0, 0))
)
xlabel.show()
def add_curve_cursor(
self,
plot: 'ChartPlotWidget', # noqa
curve: 'PlotCurveItem', # noqa
) -> LineDot:
# if this plot contains curves add line dot "cursors" to denote
# the current sample under the mouse
main_flow = plot._flows[plot.name]
# read out last index
i = main_flow.shm.array[-1]['index']
cursor = LineDot(
curve,
index=i,
plot=plot
)
plot.addItem(cursor)
self.graphics[plot].setdefault('cursors', []).append(cursor)
return cursor
def mouseAction(self, action, plot): # noqa
log.debug(f"{(action, plot.name)}")
if action == 'Enter':
self.active_plot = plot
# show horiz line and y-label
self.graphics[plot]['hl'].show()
self.graphics[plot]['yl'].show()
else: # Leave
# hide horiz line and y-label
self.graphics[plot]['hl'].hide()
self.graphics[plot]['yl'].hide()
def mouseMoved(
self,
coords: tuple[QPointF], # noqa
) -> None:
'''
Update horizontal and vertical lines when mouse moves inside
either the main chart or any indicator subplot.
'''
pos = coords[0]
# find position inside active plot
try:
# map to view coordinate system
mouse_point = self.active_plot.mapToView(pos)
except AttributeError:
# mouse was not on active plot
return
x, y = mouse_point.x(), mouse_point.y()
plot = self.active_plot
# Update x if cursor changed after discretization calc
# (this saves draw cycles on small mouse moves)
last_ix, last_iy = self._datum_xy
ix = round(x) # since bars are centered around index
# px perfect...
line_offset = self._lw / 2
# round y value to nearest tick step
m = self._y_incr_mult
iy = round(y * m) / m
vl_y = iy - line_offset
# update y-range items
if iy != last_iy:
if self._y_label_update:
self.graphics[self.active_plot]['yl'].update_label(
# abs_pos=plot.mapFromView(QPointF(ix, iy)),
abs_pos=plot.mapFromView(QPointF(ix, vl_y)),
value=iy
)
# only update horizontal xhair line if label is enabled
# self.graphics[plot]['hl'].setY(iy)
self.graphics[plot]['hl'].setY(vl_y)
# update all trackers
for item in self._trackers:
item.on_tracked_source(ix, iy)
if ix != last_ix:
if self.in_query_mode:
# show contents labels on all linked charts and update
# with cursor movement
self.contents_labels.update_labels(ix)
vl_x = ix + line_offset
for plot, opts in self.graphics.items():
# move the vertical line to the current "center of bar"
opts['vl'].setX(vl_x)
# update all subscribed curve dots
for cursor in opts.get('cursors', ()):
cursor.setIndex(ix)
# Update the label on the bottom of the crosshair.
# TODO: make this an up-front calc that we update
# on axis-widget resize events instead of on every mouse
# update cycle.
# left axis offset width for calculating
# absolute x-axis label placement.
left_axis_width = 0
if len(plot.pi_overlay.overlays):
# breakpoint()
lefts = plot.pi_overlay.get_axes('left')
if lefts:
for left in lefts:
left_axis_width += left.width()
# map back to abs (label-local) coordinates
self.xaxis_label.update_label(
abs_pos=(
plot.mapFromView(QPointF(vl_x, iy)) -
QPointF(left_axis_width, 0)
),
value=ix,
)
self._datum_xy = ix, iy
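# NOTE (sketch): the y-quantization above, worked for an assumed
# tick_size of 0.25 (so m = 1/0.25 = 4):
#   iy = round(101.13 * 4) / 4 = 405 / 4 = 101.25
# i.e. the cross-hair y-value snaps to the nearest price tick.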
def boundingRect(self) -> QRectF:
try:
return self.active_plot.boundingRect()
except AttributeError:
return self.plots[0].boundingRect()
def show_xhair(
self,
y_label_level: float = None,
) -> None:
plot = self.active_plot
if not plot:
return
g = self.graphics[plot]
# show horiz line and y-label
g['hl'].show()
g['vl'].show()
self._y_label_update = True
yl = g['yl']
# yl.fg_color = pg.mkColor(hcolor('black'))
# yl.bg_color = pg.mkColor(hcolor(self.label_color))
if y_label_level:
yl.update_from_data(0, y_label_level, _save_last=False)
yl.show()
def hide_xhair(
self,
hide_label: bool = False,
y_label_level: float = None,
just_vertical: bool = False,
fg_color: str = None,
# bg_color: str = 'papas_special',
) -> None:
g = self.graphics[self.active_plot]
hl = g['hl']
if not just_vertical:
hl.hide()
g['vl'].hide()
# only disable cursor y-label updates
# if we're highlighting a line
yl = g['yl']
if hide_label:
yl.hide()
elif y_label_level:
yl.update_from_data(0, y_label_level, _save_last=False)
hl.setY(y_label_level)
if fg_color is not None:
yl.fg_color = pg.mkColor(hcolor(fg_color))
yl.bg_color = pg.mkColor(hcolor('papas_special'))

View File

@ -1,484 +0,0 @@
# piker: trading gear for hackers
# Copyright (C) Tyler Goodlet (in stewardship for pikers)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""
Fast, smooth, sexy curves.
"""
from contextlib import contextmanager as cm
from typing import Optional, Callable
import numpy as np
import pyqtgraph as pg
from PyQt5 import QtWidgets
from PyQt5.QtWidgets import QGraphicsItem
from PyQt5.QtCore import (
Qt,
QLineF,
QSizeF,
QRectF,
# QRect,
QPointF,
)
from PyQt5.QtGui import (
QPainter,
QPainterPath,
)
from .._profile import pg_profile_enabled, ms_slower_then
from ._style import hcolor
# from ._compression import (
# # ohlc_to_m4_line,
# ds_m4,
# )
from ..log import get_logger
log = get_logger(__name__)
_line_styles: dict[str, int] = {
'solid': Qt.PenStyle.SolidLine,
'dash': Qt.PenStyle.DashLine,
'dot': Qt.PenStyle.DotLine,
'dashdot': Qt.PenStyle.DashDotLine,
}
class Curve(pg.GraphicsObject):
'''
A faster, simpler, append friendly version of
``pyqtgraph.PlotCurveItem`` built for highly customizable real-time
updates.
This type is a much stripped down version of a ``pyqtgraph`` style
"graphics object" in the sense that the internal lower level
graphics which are drawn in the ``.paint()`` method are actually
rendered outside of this class entirely and instead are assigned as
state (instance vars) here and then drawn during a Qt graphics
cycle.
The main motivation for this more modular, composed design is that
lower level graphics data can be rendered in different threads and
then read and drawn in this main thread without having to worry
about dealing with Qt's concurrency primitives. See
``piker.ui._flows.Renderer`` for details and logic related to lower
level path generation and incremental update. The main differences in
the path generation code include:
- avoiding regeneration of the entire historical path where possible
and instead only updating the "new" segment(s) via a ``numpy``
array diff calc.
- here, the "last" graphics datum-segment is drawn independently
such that near-term (high frequency) discrete-time-sampled style
updates don't trigger a full path redraw.
'''
# sub-type customization methods
sub_br: Optional[Callable] = None
sub_paint: Optional[Callable] = None
declare_paintables: Optional[Callable] = None
def __init__(
self,
*args,
step_mode: bool = False,
color: str = 'default_lightest',
fill_color: Optional[str] = None,
style: str = 'solid',
name: Optional[str] = None,
use_fpath: bool = True,
**kwargs
) -> None:
self._name = name
# brutaaalll, see comments within..
self.yData = None
self.xData = None
# self._last_cap: int = 0
self.path: Optional[QPainterPath] = None
# additional path used for appends which tries to avoid
# triggering an update/redraw of the presumably larger
# historical ``.path`` above.
self.use_fpath = use_fpath
self.fast_path: Optional[QPainterPath] = None
# TODO: we can probably just dispense with the parent since
# we're basically only using the pen setting now...
super().__init__(*args, **kwargs)
# all history of curve is drawn in single px thickness
pen = pg.mkPen(hcolor(color))
pen.setStyle(_line_styles[style])
if 'dash' in style:
pen.setDashPattern([8, 3])
self._pen = pen
# last segment is drawn in 2px thickness for emphasis
# self.last_step_pen = pg.mkPen(hcolor(color), width=2)
self.last_step_pen = pg.mkPen(pen, width=2)
# self._last_line: Optional[QLineF] = None
self._last_line = QLineF()
self._last_w: float = 1
# flat-top style histogram-like discrete curve
# self._step_mode: bool = step_mode
# self._fill = True
self._brush = pg.functions.mkBrush(hcolor(fill_color or color))
# NOTE: this setting seems to mostly prevent redraws on mouse
# interaction which is a huge boon for avg interaction latency.
# TODO: one question still remaining is if this makes transform
# interactions slower (such as zooming) and if so maybe if/when
# we implement a "history" mode for the view we disable this in
# that mode?
# don't enable caching by default for the case where the
# only thing drawn is the "last" line segment which can
# have a weird artifact where it won't be fully drawn to its
# endpoint (something we saw on trade rate curves)
self.setCacheMode(QGraphicsItem.DeviceCoordinateCache)
# XXX: see explanation for different caching modes:
# https://stackoverflow.com/a/39410081
# seems to only be useful if we don't re-generate the entire
# QPainterPath every time
# curve.setCacheMode(QtWidgets.QGraphicsItem.DeviceCoordinateCache)
# don't ever use this - it's a colossal nightmare of artefacts
# and is disastrous for performance.
# curve.setCacheMode(QtWidgets.QGraphicsItem.ItemCoordinateCache)
# allow sub-type customization
declare = self.declare_paintables
if declare:
declare()
# TODO: probably stick this in a new parent
# type which will contain our own version of
# what ``PlotCurveItem`` had in terms of base
# functionality? A `FlowGraphic` maybe?
def x_uppx(self) -> int:
px_vecs = self.pixelVectors()[0]
if px_vecs:
xs_in_px = px_vecs.x()
return round(xs_in_px)
else:
return 0
def px_width(self) -> float:
vb = self.getViewBox()
if not vb:
return 0
vr = self.viewRect()
l, r = int(vr.left()), int(vr.right())
start, stop = self._xrange
lbar = max(l, start)
rbar = min(r, stop)
return vb.mapViewToDevice(
QLineF(lbar, 0, rbar, 0)
).length()
# XXX: lol brutal, the internals of `CurvePoint` (inherited by
# our `LineDot`) required ``.getData()`` to work..
def getData(self):
return self.xData, self.yData
def clear(self):
'''
Clear internal graphics making object ready for full re-draw.
'''
# NOTE: original code from ``pg.PlotCurveItem``
self.xData = None
self.yData = None
# XXX: previously, if not trying to leverage `.reserve()` allocs
# then you might as well create a new one..
# self.path = None
# path reservation aware non-mem de-alloc cleaning
if self.path:
self.path.clear()
if self.fast_path:
# self.fast_path.clear()
self.fast_path = None
@cm
def reset_cache(self) -> None:
self.setCacheMode(QtWidgets.QGraphicsItem.NoCache)
yield
self.setCacheMode(QGraphicsItem.DeviceCoordinateCache)
def boundingRect(self):
'''
Compute and then cache our rect.
'''
if self.path is None:
return QPainterPath().boundingRect()
else:
# dynamically override this method after initial
# path is created to avoid requiring the above None check
self.boundingRect = self._path_br
return self._path_br()
def _path_br(self):
'''
Post init ``.boundingRect()``.
'''
# hb = self.path.boundingRect()
hb = self.path.controlPointRect()
hb_size = hb.size()
fp = self.fast_path
if fp:
fhb = fp.controlPointRect()
hb_size = fhb.size() + hb_size
# print(f'hb_size: {hb_size}')
# if self._last_step_rect:
# hb_size += self._last_step_rect.size()
# if self._line:
# br = self._last_step_rect.bottomRight()
# tl = QPointF(
# # self._vr[0],
# # hb.topLeft().y(),
# # 0,
# # hb_size.height() + 1
# )
# br = self._last_step_rect.bottomRight()
w = hb_size.width()
h = hb_size.height()
sbr = self.sub_br
if sbr:
w, h = self.sub_br(w, h)
else:
# assume plain line graphic and use
# default unit step in each direction.
# only on a plain line do we include
# an extra index step's worth of width
# since in the step case the end of the curve
# actually terminates earlier so we don't need
# this for the last step.
w += self._last_w
# ll = self._last_line
h += 1 # ll.y2() - ll.y1()
# br = QPointF(
# self._vr[-1],
# # tl.x() + w,
# tl.y() + h,
# )
br = QRectF(
# top left
# hb.topLeft()
# tl,
QPointF(hb.topLeft()),
# br,
# total size
# QSizeF(hb_size)
# hb_size,
QSizeF(w, h)
)
# print(f'bounding rect: {br}')
return br
def paint(
self,
p: QPainter,
opt: QtWidgets.QStyleOptionGraphicsItem,
w: QtWidgets.QWidget
) -> None:
profiler = pg.debug.Profiler(
msg=f'Curve.paint(): `{self._name}`',
disabled=not pg_profile_enabled(),
ms_threshold=ms_slower_then,
)
sub_paint = self.sub_paint
if sub_paint:
sub_paint(p, profiler)
p.setPen(self.last_step_pen)
p.drawLine(self._last_line)
profiler('.drawLine()')
p.setPen(self._pen)
path = self.path
# cap = path.capacity()
# if cap != self._last_cap:
# print(f'NEW CAPACITY: {self._last_cap} -> {cap}')
# self._last_cap = cap
if path:
p.drawPath(path)
profiler(f'.drawPath(path): {path.capacity()}')
fp = self.fast_path
if fp:
p.drawPath(fp)
profiler('.drawPath(fast_path)')
# TODO: try out new work from `pyqtgraph` main which should
# repair horrid perf (pretty sure i did and it was still
# horrible?):
# https://github.com/pyqtgraph/pyqtgraph/pull/2032
# if self._fill:
# brush = self.opts['brush']
# p.fillPath(self.path, brush)
def draw_last_datum(
self,
path: QPainterPath,
src_data: np.ndarray,
render_data: np.ndarray,
reset: bool,
array_key: str,
) -> None:
# default line draw last call
# with self.reset_cache():
x = render_data['index']
y = render_data[array_key]
# draw the "current" step graphic segment so it
# lines up with the "middle" of the current
# (OHLC) sample.
self._last_line = QLineF(
x[-2], y[-2],
x[-1], y[-1],
)
return x, y
# TODO: this should probably be a "downsampled" curve type
# that draws a bar-style (but for the px column) last graphics
# element such that the current datum in view can be shown
# (via its max / min) even when highly zoomed out.
class FlattenedOHLC(Curve):
def draw_last_datum(
self,
path: QPainterPath,
src_data: np.ndarray,
render_data: np.ndarray,
reset: bool,
array_key: str,
) -> None:
lasts = src_data[-2:]
x = lasts['index']
y = lasts['close']
# draw the "current" step graphic segment so it
# lines up with the "middle" of the current
# (OHLC) sample.
self._last_line = QLineF(
x[-2], y[-2],
x[-1], y[-1]
)
return x, y
class StepCurve(Curve):
def declare_paintables(
self,
) -> None:
self._last_step_rect = QRectF()
def draw_last_datum(
self,
path: QPainterPath,
src_data: np.ndarray,
render_data: np.ndarray,
reset: bool,
array_key: str,
w: float = 0.5,
) -> None:
# TODO: remove this and instead place all step curve
# updating into pre-path data render callbacks.
# full input data
x = src_data['index']
y = src_data[array_key]
x_last = x[-1]
y_last = y[-1]
# lol, commenting this makes step curves
# all "black" for me :eyeroll:..
self._last_line = QLineF(
x_last - w, 0,
x_last + w, 0,
)
self._last_step_rect = QRectF(
x_last - w, 0,
x_last + w, y_last,
)
return x, y
def sub_paint(
self,
p: QPainter,
profiler: pg.debug.Profiler,
) -> None:
# p.drawLines(*tuple(filter(bool, self._last_step_lines)))
# p.drawRect(self._last_step_rect)
p.fillRect(self._last_step_rect, self._brush)
profiler('.fillRect()')
def sub_br(
self,
path_w: float,
path_h: float,
) -> (float, float):
# passthrough
return path_w, path_h

View File

@ -1,863 +0,0 @@
# piker: trading gear for hackers
# Copyright (C) Tyler Goodlet (in stewardship for pikers)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
'''
real-time display tasks for charting graphics update.
this module ties together quote and computational (fsp) streams with
graphics update methods via our custom ``pyqtgraph`` charting api.
'''
from dataclasses import dataclass
from functools import partial
import time
from typing import Optional, Any, Callable
import numpy as np
import tractor
import trio
import pendulum
import pyqtgraph as pg
# from .. import brokers
from ..data.feed import open_feed
from ._axes import YAxisLabel
from ._chart import (
ChartPlotWidget,
LinkedSplits,
GodWidget,
)
from ._l1 import L1Labels
from ._fsp import (
update_fsp_chart,
start_fsp_displays,
has_vlm,
open_vlm_displays,
)
from ..data._sharedmem import ShmArray
from ..data._source import tf_in_1s
from ._forms import (
FieldsForm,
mk_order_pane_layout,
)
from .order_mode import open_order_mode
from .._profile import (
pg_profile_enabled,
ms_slower_then,
)
from ..log import get_logger
log = get_logger(__name__)
# TODO: load this from a config.toml!
_quote_throttle_rate: int = 22 # Hz
# a working tick-type-classes template
_tick_groups = {
'clears': {'trade', 'utrade', 'last'},
'bids': {'bid', 'bsize'},
'asks': {'ask', 'asize'},
}
# TODO: delegate this to each `Flow.maxmin()` which includes
# caching and further we should implement the following stream based
# approach, likely with ``numba``:
# https://arxiv.org/abs/cs/0610046
# https://github.com/lemire/pythonmaxmin
def chart_maxmin(
chart: ChartPlotWidget,
ohlcv_shm: ShmArray,
vlm_chart: Optional[ChartPlotWidget] = None,
) -> tuple[
tuple[int, int, int, int],
float,
float,
float,
]:
'''
Compute max and min datums "in view" for range limits.
'''
last_bars_range = chart.bars_range()
out = chart.maxmin()
if out is None:
return (last_bars_range, 0, 0, 0)
mn, mx = out
mx_vlm_in_view = 0
if vlm_chart:
out = vlm_chart.maxmin()
if out:
_, mx_vlm_in_view = out
return (
last_bars_range,
mx,
max(mn, 0), # presuming price can't be negative?
mx_vlm_in_view,
)
@dataclass
class DisplayState:
'''
Chart-local real-time graphics state container.
'''
quotes: dict[str, Any]
maxmin: Callable
ohlcv: ShmArray
# high level chart handles
linked: LinkedSplits
chart: ChartPlotWidget
vlm_chart: ChartPlotWidget
# axis labels
l1: L1Labels
last_price_sticky: YAxisLabel
vlm_sticky: YAxisLabel
# misc state tracking
vars: dict[str, Any]
wap_in_history: bool = False
async def graphics_update_loop(
linked: LinkedSplits,
stream: tractor.MsgStream,
ohlcv: np.ndarray,
wap_in_history: bool = False,
vlm_chart: Optional[ChartPlotWidget] = None,
) -> None:
'''
The 'main' (price) chart real-time update loop.
Receive from the primary instrument quote stream and update the OHLC
chart.
'''
# TODO: bunch of stuff (some might be done already, can't remember):
# - I'm starting to think all this logic should be
# done in one place and "graphics update routines"
# should not be doing any length checking and array diffing.
# - handle odd lot orders
# - update last open price correctly instead
# of copying it from last bar's close
# - 1-5 sec bar lookback-autocorrection like tws does?
# (would require a background history checker task)
display_rate = linked.godwidget.window.current_screen().refreshRate()
chart = linked.chart
# update last price sticky
last_price_sticky = chart._ysticks[chart.name]
last_price_sticky.update_from_data(
*ohlcv.array[-1][['index', 'close']]
)
vlm_sticky = None  # only defined when a vlm chart is up
if vlm_chart:
    vlm_sticky = vlm_chart._ysticks['volume']
maxmin = partial(
chart_maxmin,
chart,
ohlcv,
vlm_chart,
)
last_bars_range: tuple[float, float]
(
last_bars_range,
last_mx,
last_mn,
last_mx_vlm,
) = maxmin()
last, volume = ohlcv.array[-1][['close', 'volume']]
symbol = chart.linked.symbol
l1 = L1Labels(
chart,
# determine precision/decimal lengths
digits=symbol.tick_size_digits,
size_digits=symbol.lot_size_digits,
)
chart._l1_labels = l1
# TODO:
# - in theory we should be able to read buffer data faster
# than msgs arrive.. needs some tinkering and testing
# - if trade volume jumps above / below prior L1 price
# levels this might be dark volume we need to
# present differently -> likely dark vlm
tick_size = chart.linked.symbol.tick_size
tick_margin = 3 * tick_size
chart.show()
# view = chart.view
last_quote = time.time()
i_last = ohlcv.index
# async def iter_drain_quotes():
# # NOTE: all code below this loop is expected to be synchronous
# # and thus draw instructions are not picked up until the next
# # wait / iteration.
# async for quotes in stream:
# while True:
# try:
# moar = stream.receive_nowait()
# except trio.WouldBlock:
# yield quotes
# break
# else:
# for sym, quote in moar.items():
# ticks_frame = quote.get('ticks')
# if ticks_frame:
# quotes[sym].setdefault(
# 'ticks', []).extend(ticks_frame)
# print('pulled extra')
# yield quotes
# async for quotes in iter_drain_quotes():
ds = linked.display_state = DisplayState(**{
'quotes': {},
'linked': linked,
'maxmin': maxmin,
'ohlcv': ohlcv,
'chart': chart,
'last_price_sticky': last_price_sticky,
'vlm_chart': vlm_chart,
'vlm_sticky': vlm_sticky,
'l1': l1,
'vars': {
'tick_margin': tick_margin,
'i_last': i_last,
'i_last_append': i_last,
'last_mx_vlm': last_mx_vlm,
'last_mx': last_mx,
'last_mn': last_mn,
}
})
chart.default_view()
# main real-time quotes update loop
async for quotes in stream:
ds.quotes = quotes
quote_period = time.time() - last_quote
quote_rate = round(
1/quote_period, 1) if quote_period > 0 else float('inf')
if (
quote_period <= 1/_quote_throttle_rate
# in the absolute worst case we shouldn't see more than
# twice the expected throttle rate right!?
# and quote_rate >= _quote_throttle_rate * 2
and quote_rate >= display_rate
):
log.warning(f'High quote rate {symbol.key}: {quote_rate}')
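# NOTE (sketch): with the default 22 Hz throttle a quote period
# under ~45ms (1/22 s) *and* a measured rate at or above the
# display refresh rate trips the warning above.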
last_quote = time.time()
# chart isn't active/shown so skip render cycle and pause feed(s)
if chart.linked.isHidden():
chart.pause_all_feeds()
continue
ic = chart.view._ic
if ic:
chart.pause_all_feeds()
await ic.wait()
chart.resume_all_feeds()
# sync call to update all graphics/UX components.
graphics_update_cycle(ds)
def graphics_update_cycle(
ds: DisplayState,
wap_in_history: bool = False,
trigger_all: bool = False, # flag used by prepend history updates
prepend_update_index: Optional[int] = None,
) -> None:
# TODO: eventually optimize this whole graphics stack with ``numba``
# hopefully XD
chart = ds.chart
profiler = pg.debug.Profiler(
msg=f'Graphics loop cycle for: `{chart.name}`',
delayed=True,
disabled=not pg_profile_enabled(),
# disabled=True,
ms_threshold=ms_slower_then,
# ms_threshold=1/12 * 1e3,
)
# unpack multi-referenced components
vlm_chart = ds.vlm_chart
l1 = ds.l1
ohlcv = ds.ohlcv
array = ohlcv.array
vars = ds.vars
tick_margin = vars['tick_margin']
update_uppx = 16
for sym, quote in ds.quotes.items():
# compute the first available graphic's x-units-per-pixel
uppx = (vlm_chart or chart).view.x_uppx()
# NOTE: vlm may be written by the ``brokerd`` backend
# even though a tick sample is not emitted.
# TODO: show dark trades differently
# https://github.com/pikers/piker/issues/116
# NOTE: this used to be implemented in a dedicated
# "increment task": ``check_for_new_bars()`` but it doesn't
# make sense to do a whole task switch when we can just do
# this simple index-diff and all the fsp sub-curve graphics
# are diffed on each draw cycle anyway; so updates to the
# "curve" length is already automatic.
# increment the view position by the sample offset.
i_step = ohlcv.index
i_diff = i_step - vars['i_last']
vars['i_last'] = i_step
append_diff = i_step - vars['i_last_append']
# update the "last datum" (aka extending the flow graphic with
# new data) only if the number of unit steps is >= the number of
# such unit steps per pixel (aka uppx). Iow, if the zoom level
# is such that a datum(s) update to graphics wouldn't span
# to a new pixel, we don't update yet.
do_append = (append_diff >= uppx)
if do_append:
vars['i_last_append'] = i_step
do_rt_update = uppx < update_uppx
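# NOTE (sketch): a worked example of the append gate above,
# assuming uppx=4 (4 datums per on-screen pixel):
#   append_diff 1..3 -> do_append=False (no new pixel spanned),
#   append_diff 4    -> do_append=True and ``i_last_append`` snaps
#   forward; curves are only extended once per rendered pixel column.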
# print(
# f'append_diff:{append_diff}\n'
# f'uppx:{uppx}\n'
# f'do_append: {do_append}'
# )
# TODO: we should only run mxmn when we know
# an update is due via ``do_append`` above.
(
brange,
mx_in_view,
mn_in_view,
mx_vlm_in_view,
) = ds.maxmin()
l, lbar, rbar, r = brange
mx = mx_in_view + tick_margin
mn = mn_in_view - tick_margin
profiler('`ds.maxmin()` call')
liv = r >= i_step # the last datum is in view
if (
prepend_update_index is not None
and lbar > prepend_update_index
):
# on a history update (usually from the FSP subsys)
# if the segment of history that is being prepended
# isn't in view there is no reason to do a graphics
# update.
log.debug('Skipping prepend graphics cycle: frame not in view')
return
# don't real-time "shift" the curve to the
# left unless we get one of the following:
if (
(
# i_diff > 0 # no new sample step
do_append
# and uppx < 4 # chart is zoomed out very far
and liv
)
or trigger_all
):
# TODO: we should track and compute whether the last
# pixel in a curve should show new data based on uppx
# and then iff update curves and shift?
chart.increment_view(steps=i_diff)
if vlm_chart:
vlm_chart.increment_view(steps=i_diff)
profiler('view incremented')
ticks_frame = quote.get('ticks', ())
frames_by_type: dict[str, dict] = {}
lasts = {}
# build tick-type "frames" of tick sequences since
# likely the tick arrival rate is higher than our
# (throttled) quote stream rate.
for tick in ticks_frame:
price = tick.get('price')
ticktype = tick.get('type')
if ticktype == 'n/a' or price == -1:
# okkk..
continue
# keys are entered in oldest-event-inserted-first order
# since we iterate ``ticks_frame`` in standard order
# above. in other words the order of the keys is the order
# of tick events by type from the provider feed.
frames_by_type.setdefault(ticktype, []).append(tick)
# overwrites so the last tick per type is the entry
lasts[ticktype] = tick
# from pprint import pformat
# frame_counts = {
# typ: len(frame) for typ, frame in frames_by_type.items()
# }
# print(f'{pformat(frame_counts)}')
# print(f'framed: {pformat(frames_by_type)}')
# print(f'lasts: {pformat(lasts)}')
# TODO: eventually we want to separate out the utrade (aka
# dark vlm prices) here and show them as an additional
# graphic.
clear_types = _tick_groups['clears']
# XXX: if we wanted to iterate in "latest" (i.e. most
# current) tick first order as an optimization where we only
# update from the last tick from each type class.
# last_clear_updated: bool = False
# update ohlc sampled price bars
if (
do_rt_update
or do_append
or trigger_all
):
chart.update_graphics_from_flow(
chart.name,
# do_append=uppx < update_uppx,
do_append=do_append,
)
# NOTE: we always update the "last" datum
# since the current range should at least be updated
# to its max/min on the last pixel.
# iterate in FIFO order per tick-frame
for typ, tick in lasts.items():
price = tick.get('price')
size = tick.get('size')
# compute max and min prices (including bid/ask) from
# tick frames to determine the y-range for chart
# auto-scaling.
# TODO: we need a streaming minmax algo here, see def above.
if liv:
mx = max(price + tick_margin, mx)
mn = min(price - tick_margin, mn)
if typ in clear_types:
# XXX: if we only wanted to update graphics from the
# "current"/"latest received" clearing price tick
# once (see alt iteration order above).
# if last_clear_updated:
# continue
# last_clear_updated = True
# we only want to update graphics from the *last*
# tick event that falls under the "clearing price"
# set.
# update price sticky(s)
end = array[-1]
ds.last_price_sticky.update_from_data(
*end[['index', 'close']]
)
if wap_in_history:
# update vwap overlay line
chart.update_graphics_from_flow(
'bar_wap',
)
# L1 book label-line updates
# XXX: is this correct for ib?
# if ticktype in ('trade', 'last'):
# if ticktype in ('last',): # 'size'):
if typ in ('last',): # 'size'):
label = {
l1.ask_label.fields['level']: l1.ask_label,
l1.bid_label.fields['level']: l1.bid_label,
}.get(price)
if (
label is not None
and liv
):
label.update_fields(
{'level': price, 'size': size}
)
# TODO: on trades should we be knocking down
# the relevant L1 queue?
# label.size -= size
elif (
typ in _tick_groups['asks']
# TODO: instead we could check if the price is in the
# y-view-range?
and liv
):
l1.ask_label.update_fields({'level': price, 'size': size})
elif (
typ in _tick_groups['bids']
# TODO: instead we could check if the price is in the
# y-view-range?
and liv
):
l1.bid_label.update_fields({'level': price, 'size': size})
# check for y-range re-size
if (
((mx > vars['last_mx']) or (mn < vars['last_mn']))
and chart._static_yrange != 'axis'
and liv
):
main_vb = chart.view
if (
main_vb._ic is None
or not main_vb._ic.is_set()
):
# print(f'updating range due to mxmn')
main_vb._set_yrange(
# TODO: we should probably scale
# the view margin based on the size
# of the true range? This way you can
# slap in orders outside the current
# L1 (only) book range.
# range_margin=0.1,
yrange=(mn, mx),
)
# XXX: update this every draw cycle to make L1-always-in-view work.
vars['last_mx'], vars['last_mn'] = mx, mn
# run synchronous update on all linked flows
# TODO: should the "main" (aka source) flow be special?
for curve_name, flow in chart._flows.items():
# update any overlayed fsp flows
if curve_name != chart.data_key:
update_fsp_chart(
chart,
flow,
curve_name,
array_key=curve_name,
)
# even if we're downsampled bigly
# draw the last datum in the final
# px column to give the user the mx/mn
# range of that set.
if (
not do_append
# and not do_rt_update
and liv
):
flow.draw_last(
array_key=curve_name,
only_last_uppx=True,
)
# volume chart logic..
# TODO: can we unify this with the above loop?
if vlm_chart:
# always update y-label
ds.vlm_sticky.update_from_data(
*array[-1][['index', 'volume']]
)
if (
(
do_rt_update
or (do_append and liv)
)
or trigger_all
):
# TODO: make it so this doesn't have to be called
# once the $vlm is up?
vlm_chart.update_graphics_from_flow(
'volume',
# UGGGh, see ``maxmin()`` impl in `._fsp` for
# the overlayed plotitems... we need a better
# way to invoke a maxmin per overlay..
render=False,
# XXX: ^^^^ THIS IS SUPER IMPORTANT! ^^^^
# without this, since we disable the
# 'volume' (units) chart after the $vlm starts
# up we need to be sure to enable this
# auto-ranging otherwise there will be no handler
# connected to update accompanying overlay
# graphics..
)
profiler('`vlm_chart.update_graphics_from_flow()`')
if (
mx_vlm_in_view != vars['last_mx_vlm']
):
yrange = (0, mx_vlm_in_view * 1.375)
vlm_chart.view._set_yrange(
yrange=yrange,
)
profiler('`vlm_chart.view._set_yrange()`')
# print(f'mx vlm: {last_mx_vlm} -> {mx_vlm_in_view}')
vars['last_mx_vlm'] = mx_vlm_in_view
for curve_name, flow in vlm_chart._flows.items():
if (
curve_name != 'volume'
and flow.render
and (
    (liv and do_rt_update)
    or do_append
)
):
update_fsp_chart(
vlm_chart,
flow,
curve_name,
array_key=curve_name,
# do_append=uppx < update_uppx,
do_append=do_append,
)
# is this even doing anything?
# (pretty sure it's the real-time
# resizing from last quote?)
fvb = flow.plot.vb
fvb._set_yrange(
name=curve_name,
)
elif (
curve_name != 'volume'
and not do_append
and liv
and uppx >= 1
# even if we're downsampled bigly
# draw the last datum in the final
# px column to give the user the mx/mn
# range of that set.
):
# always update the last datum-element
# graphic for all flows
# print(f'drawing last {flow.name}')
flow.draw_last(array_key=curve_name)
async def display_symbol_data(
godwidget: GodWidget,
provider: str,
sym: str,
loglevel: str,
order_mode_started: trio.Event,
) -> None:
'''
Spawn a real-time updated chart for ``symbol``.
Spawned ``LinkedSplits`` chart widgets can remain up but hidden so
that multiple symbols can be viewed and switched between extremely
fast from a cached watch-list.
'''
sbar = godwidget.window.status_bar
loading_sym_key = sbar.open_status(
f'loading {sym}.{provider} ->',
group_key=True
)
# historical data fetch
# brokermod = brokers.get_brokermod(provider)
# ohlc_status_done = sbar.open_status(
# 'retrieving OHLC history.. ',
# clear_on_next=True,
# group_key=loading_sym_key,
# )
fqsn = '.'.join((sym, provider))
async with open_feed(
[fqsn],
loglevel=loglevel,
# limit to at least display's FPS
# avoiding needless Qt-in-guest-mode context switches
tick_throttle=_quote_throttle_rate,
) as feed:
ohlcv: ShmArray = feed.shm
bars = ohlcv.array
symbol = feed.symbols[sym]
fqsn = symbol.front_fqsn()
times = bars['time']
end = pendulum.from_timestamp(times[-1])
start = pendulum.from_timestamp(times[times != times[-1]][-1])
step_size_s = (end - start).seconds
tf_key = tf_in_1s[step_size_s]
# load in symbol's ohlc data
godwidget.window.setWindowTitle(
f'{fqsn} '
f'tick:{symbol.tick_size} '
f'step:{tf_key} '
)
linked = godwidget.linkedsplits
linked._symbol = symbol
# generate order mode side-pane UI
# A ``FieldsForm`` form to configure order entry
pp_pane: FieldsForm = mk_order_pane_layout(godwidget)
# add as next-to-y-axis singleton pane
godwidget.pp_pane = pp_pane
# create main OHLC chart
chart = linked.plot_ohlc_main(
symbol,
ohlcv,
sidepane=pp_pane,
)
chart.default_view()
chart._feeds[symbol.key] = feed
chart.setFocus()
# plot historical vwap if available
wap_in_history = False
# XXX: FOR SOME REASON THIS IS CAUSING HANGZ!?!
# if brokermod._show_wap_in_history:
# if 'bar_wap' in bars.dtype.fields:
# wap_in_history = True
# chart.draw_curve(
# name='bar_wap',
# shm=ohlcv,
# color='default_light',
# add_label=False,
# )
# size view to data once at outset
chart.cv._set_yrange()
# NOTE: we must immediately tell Qt to show the OHLC chart
# to avoid a race where the subplots get added/shown to
# the linked set *before* the main price chart!
linked.show()
linked.focus()
await trio.sleep(0)
vlm_chart: Optional[ChartPlotWidget] = None
async with trio.open_nursery() as ln:
# if available load volume related built-in display(s)
if has_vlm(ohlcv):
vlm_chart = await ln.start(
open_vlm_displays,
linked,
ohlcv,
)
# load (user's) FSP set (otherwise known as "indicators")
# from an input config.
ln.start_soon(
start_fsp_displays,
linked,
ohlcv,
loading_sym_key,
loglevel,
)
# start graphics update loop after receiving first live quote
ln.start_soon(
graphics_update_loop,
linked,
feed.stream,
ohlcv,
wap_in_history,
vlm_chart,
)
async with (
open_order_mode(
feed,
chart,
fqsn,
order_mode_started
)
):
# let Qt run to render all widgets and make sure the
# sidepanes line up vertically.
await trio.sleep(0)
linked.resize_sidepanes()
# NOTE: we pop the volume chart from the subplots set so
# that it isn't double rendered in the display loop
# above since we do a maxmin calc on the volume data to
# determine if auto-range adjustements should be made.
# linked.subplots.pop('volume', None)
# TODO: make this not so shit XD
# close group status
sbar._status_groups[loading_sym_key][1]()
# let the app run.. bby
# linked.graphics_cycle()
await trio.sleep_forever()

View File

@ -1,382 +0,0 @@
# piker: trading gear for hackers
# Copyright (C) Tyler Goodlet (in stewardship for piker0)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""
Higher level annotation editors.
"""
from dataclasses import dataclass, field
from typing import Optional
import pyqtgraph as pg
from pyqtgraph import ViewBox, Point, QtCore, QtGui
from pyqtgraph import functions as fn
from PyQt5.QtCore import QPointF
import numpy as np
from ._style import hcolor, _font
from ._lines import LevelLine
from ..log import get_logger
log = get_logger(__name__)
@dataclass
class ArrowEditor:
chart: 'ChartPlotWidget' # noqa
_arrows: dict = field(default_factory=dict)
def add(
self,
uid: str,
x: float,
y: float,
color='default',
pointing: Optional[str] = None,
) -> pg.ArrowItem:
"""Add an arrow graphic to view at given (x, y).
"""
angle = {
'up': 90,
'down': -90,
None: 180, # pointing to right (as in an alert)
}[pointing]
# scale arrow sizing to dpi-aware font
size = _font.font.pixelSize() * 0.8
arrow = pg.ArrowItem(
angle=angle,
baseAngle=0,
headLen=size,
headWidth=size/2,
tailLen=None,
pxMode=True,
# coloring
pen=pg.mkPen(hcolor('papas_special')),
brush=pg.mkBrush(hcolor(color)),
)
arrow.setPos(x, y)
self._arrows[uid] = arrow
# render to view
self.chart.plotItem.addItem(arrow)
return arrow
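# usage sketch (names hypothetical): an alert-style marker at bar
# index ``x`` and price ``y`` could be added via
#   editor.add(uid='oid-1', x=x, y=y, pointing=None)
# where ``pointing=None`` selects the right-facing (alert) angle.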
def remove(self, arrow) -> None:
self.chart.plotItem.removeItem(arrow)
@dataclass
class LineEditor:
'''The great editor of linez.
'''
chart: 'ChartPlotWidget' = None # type: ignore # noqa
_order_lines: dict[str, LevelLine] = field(default_factory=dict)
_active_staged_line: LevelLine = None
def stage_line(
self,
line: LevelLine,
) -> LevelLine:
"""Stage a line at the current chart's cursor position
and return it.
"""
# add a "staged" cursor-tracking line to view
# and cache it in a var
if self._active_staged_line:
self.unstage_line()
self._active_staged_line = line
return line
def unstage_line(self) -> LevelLine:
"""Inverse of ``.stage_line()``.
"""
# chart = self.chart._cursor.active_plot
# # chart.setCursor(QtCore.Qt.ArrowCursor)
cursor = self.chart.linked.cursor
# delete "staged" cursor tracking line from view
line = self._active_staged_line
if line:
cursor._trackers.remove(line)
line.delete()
self._active_staged_line = None
# show the crosshair y line and label
cursor.show_xhair()
def submit_line(
self,
line: LevelLine,
uuid: str,
) -> LevelLine:
staged_line = self._active_staged_line
if not staged_line:
raise RuntimeError("No line is currently staged!?")
# for now, until the submission response arrives
line.hide_labels()
# register for later lookup/deletion
self._order_lines[uuid] = line
return line
def commit_line(self, uuid: str) -> LevelLine:
"""Commit a "staged line" to view.
Submits the line graphic under the cursor as a (new) permanent
graphic in view.
"""
try:
line = self._order_lines[uuid]
except KeyError:
log.warning(f'No line for {uuid} could be found?')
return
else:
line.show_labels()
# TODO: other flashy things to indicate the order is active
log.debug(f'Level active for level: {line.value()}')
return line
def lines_under_cursor(self) -> list[LevelLine]:
"""Get the line(s) under the cursor position.
"""
# Delete any hoverable under the cursor
return self.chart.linked.cursor._hovered
def all_lines(self) -> tuple[LevelLine]:
return tuple(self._order_lines.values())
def remove_line(
self,
line: LevelLine = None,
uuid: str = None,
) -> Optional[LevelLine]:
'''Remove a line by reference or uuid.
If no lines or ids are provided remove all lines under the
cursor position.
'''
# try to look up line from our registry
line = self._order_lines.pop(uuid, line)
if line:
# if hovered remove from cursor set
cursor = self.chart.linked.cursor
hovered = cursor._hovered
if line in hovered:
hovered.remove(line)
# make sure the xhair doesn't get left off
# just because we never got a un-hover event
cursor.show_xhair()
log.debug(f'deleting {line} with oid: {uuid}')
line.delete()
else:
log.warning(f'Could not find line for {line}')
return line
class SelectRect(QtGui.QGraphicsRectItem):
def __init__(
self,
viewbox: ViewBox,
color: str = 'dad_blue',
) -> None:
super().__init__(0, 0, 1, 1)
# self.rbScaleBox = QtGui.QGraphicsRectItem(0, 0, 1, 1)
self.vb = viewbox
self._chart: 'ChartPlotWidget' = None # noqa
# override selection box color
color = QtGui.QColor(hcolor(color))
self.setPen(fn.mkPen(color, width=1))
color.setAlpha(66)
self.setBrush(fn.mkBrush(color))
self.setZValue(1e9)
self.hide()
self._label = None
label = self._label = QtGui.QLabel()
label.setTextFormat(0)  # plain text (Qt.PlainText == 0)
label.setFont(_font.font)
label.setMargin(0)
label.setAlignment(
QtCore.Qt.AlignLeft
# | QtCore.Qt.AlignVCenter
)
# proxy is created after containing scene is initialized
self._label_proxy = None
self._abs_top_right = None
# TODO: "swing %" might be handy here (data's max/min # % change)
self._contents = [
'change: {pchng:.2f} %',
'range: {rng:.2f}',
'bars: {nbars}',
'max: {dmx}',
'min: {dmn}',
# 'time: {nbars}m', # TODO: compute this per bar size
'sigma: {std:.2f}',
]
@property
def chart(self) -> 'ChartPlotWidget': # noqa
return self._chart
@chart.setter
def chart(self, chart: 'ChartPlotWidget') -> None: # noqa
self._chart = chart
chart.sigRangeChanged.connect(self.update_on_resize)
palette = self._label.palette()
# TODO: get bg color working
palette.setColor(
self._label.backgroundRole(),
# QtGui.QColor(chart.backgroundBrush()),
QtGui.QColor(hcolor('papas_special')),
)
def update_on_resize(self, vr, r):
"""Re-position measure label on view range change.
"""
if self._abs_top_right:
self._label_proxy.setPos(
self.vb.mapFromView(self._abs_top_right)
)
def mouse_drag_released(
self,
p1: QPointF,
p2: QPointF
) -> None:
"""Called on final button release for mouse drag with start and
end positions.
"""
self.set_pos(p1, p2)
def set_pos(
self,
p1: QPointF,
p2: QPointF
) -> None:
"""Set position of selection rect and accompanying label, move
label to match.
"""
if self._label_proxy is None:
# https://doc.qt.io/qt-5/qgraphicsproxywidget.html
self._label_proxy = self.vb.scene().addWidget(self._label)
start_pos = self.vb.mapToView(p1)
end_pos = self.vb.mapToView(p2)
# map to view coords and update area
r = QtCore.QRectF(start_pos, end_pos)
# old way; don't need right?
# lr = QtCore.QRectF(p1, p2)
# r = self.vb.childGroup.mapRectFromParent(lr)
self.setPos(r.topLeft())
self.resetTransform()
self.scale(r.width(), r.height())
self.show()
y1, y2 = start_pos.y(), end_pos.y()
x1, x2 = start_pos.x(), end_pos.x()
# TODO: heh, could probably use a max-min streaming algo here too
_, xmn = min(y1, y2), min(x1, x2)
ymx, xmx = max(y1, y2), max(x1, x2)
pchng = (y2 - y1) / y1 * 100
rng = abs(y1 - y2)
ixmn, ixmx = round(xmn), round(xmx)
nbars = ixmx - ixmn + 1
chart = self._chart
data = chart._flows[chart.name].shm.array[ixmn:ixmx]
if len(data):
std = data['close'].std()
dmx = data['high'].max()
dmn = data['low'].min()
else:
dmn = dmx = std = np.nan
# update label info
self._label.setText('\n'.join(self._contents).format(
pchng=pchng, rng=rng, nbars=nbars,
std=std, dmx=dmx, dmn=dmn,
))
# print(f'x2, y2: {(x2, y2)}')
# print(f'xmn, ymn: {(xmn, ymx)}')
label_anchor = Point(xmx + 2, ymx)
# XXX: in the drag bottom-right -> top-left case we don't
# want the label to overlay the box.
# if (x2, y2) == (xmn, ymx):
# # could do this too but needs to be added after coords transform
# # label_anchor = Point(x2, y2 + self._label.height())
# label_anchor = Point(xmn, ymn)
self._abs_top_right = label_anchor
self._label_proxy.setPos(self.vb.mapFromView(label_anchor))
# self._label.show()
def clear(self):
"""Clear the selection box from view.
"""
self._label.hide()
self.hide()

View File

@ -1,235 +0,0 @@
# piker: trading gear for hackers
# Copyright (C) Tyler Goodlet (in stewardship for piker0)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""
Qt event proxying and processing using ``trio`` mem chans.
"""
from contextlib import asynccontextmanager, AsyncExitStack
from typing import Callable
from pydantic import BaseModel
import trio
from PyQt5 import QtCore
from PyQt5.QtCore import QEvent, pyqtBoundSignal
from PyQt5.QtWidgets import QWidget
from PyQt5.QtWidgets import (
QGraphicsSceneMouseEvent as gs_mouse,
)
MOUSE_EVENTS = {
gs_mouse.GraphicsSceneMousePress,
gs_mouse.GraphicsSceneMouseRelease,
QEvent.MouseButtonPress,
QEvent.MouseButtonRelease,
# QtGui.QMouseEvent,
}
# TODO: maybe consider some constrained ints down the road?
# https://pydantic-docs.helpmanual.io/usage/types/#constrained-types
class KeyboardMsg(BaseModel):
'''Unpacked Qt keyboard event data.
'''
class Config:
arbitrary_types_allowed = True
event: QEvent
etype: int
key: int
mods: int
txt: str
def to_tuple(self) -> tuple:
return tuple(self.dict().values())
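# usage sketch: unpack in model field order, e.g.
#   event, etype, key, mods, txt = msg.to_tuple()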
class MouseMsg(BaseModel):
'''Unpacked Qt mouse event data.
'''
class Config:
arbitrary_types_allowed = True
event: QEvent
etype: int
button: int
# TODO: maybe add some methods to detect key combos? Or is that gonna be
# better with pattern matching?
# # ctl + alt as combo
# ctlalt = False
# if (QtCore.Qt.AltModifier | QtCore.Qt.ControlModifier) == mods:
# ctlalt = True
class EventRelay(QtCore.QObject):
'''
Relay Qt events over a trio memory channel for async processing.
'''
_event_types: set[QEvent] = set()
_send_chan: trio.abc.SendChannel = None
_filter_auto_repeats: bool = True
def eventFilter(
self,
source: QWidget,
ev: QEvent,
) -> None:
'''
Qt global event filter: return `False` to pass through and `True`
to filter event out.
https://doc.qt.io/qt-5/qobject.html#eventFilter
https://doc.qt.io/qtforpython/overviews/eventsandfilters.html#event-filters
'''
etype = ev.type()
# TODO: turn this on and see what we can filter by default (such
# as mouseWheelEvent).
# print(f'ev: {ev}')
if etype not in self._event_types:
return False
# XXX: we unpack here because apparently doing it
# after pop from the mem chan isn't showing the same
# event object? no clue wtf is going on there, likely
# something to do with Qt internals and calling the
# parent handler?
if etype in {QEvent.KeyPress, QEvent.KeyRelease}:
msg = KeyboardMsg(
event=ev,
etype=etype,
key=ev.key(),
mods=ev.modifiers(),
txt=ev.text(),
)
# TODO: is there a global setting for this?
if ev.isAutoRepeat() and self._filter_auto_repeats:
ev.ignore()
# filter out this event and stop its processing
# https://doc.qt.io/qt-5/qobject.html#installEventFilter
return True
# NOTE: the event object instance coming out
# the other side is mutated since Qt resumes event
# processing **before** running a ``trio`` guest mode
# tick, thus special handling or copying must be done.
elif etype in MOUSE_EVENTS:
# print(f'mouse event: {ev}')
msg = MouseMsg(
event=ev,
etype=etype,
button=ev.button(),
)
else:
msg = ev
# send event-msg to async handler
self._send_chan.send_nowait(msg)
# **do not** filter out this event
# and instead forward to the source widget
# https://doc.qt.io/qt-5/qobject.html#installEventFilter
return False
@asynccontextmanager
async def open_event_stream(
source_widget: QWidget,
event_types: set[QEvent] = {QEvent.KeyPress},
filter_auto_repeats: bool = True,
) -> trio.abc.ReceiveChannel:
# buffer a handful of events so the Qt thread never blocks
send, recv = trio.open_memory_channel(16)
kc = EventRelay()
kc._send_chan = send
kc._event_types = event_types
kc._filter_auto_repeats = filter_auto_repeats
source_widget.installEventFilter(kc)
try:
async with send:
yield recv
finally:
source_widget.removeEventFilter(kc)
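# usage sketch (assuming some ``QWidget`` instance ``w``):
#
#   async with open_event_stream(w, {QEvent.KeyPress}) as recv:
#       async for msg in recv:
#           ...  # each ``msg`` is a ``KeyboardMsg`` here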
@asynccontextmanager
async def open_signal_handler(
signal: pyqtBoundSignal,
async_handler: Callable,
) -> trio.abc.ReceiveChannel:
send, recv = trio.open_memory_channel(0)
def proxy_args_to_chan(*args):
send.send_nowait(args)
signal.connect(proxy_args_to_chan)
async def proxy_to_handler():
async for args in recv:
await async_handler(*args)
async with trio.open_nursery() as n:
n.start_soon(proxy_to_handler)
async with send:
yield
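# usage sketch (assuming ``btn.clicked`` is a ``pyqtBoundSignal``):
#
#   async def on_click(*args) -> None:
#       print(args)
#
#   async with open_signal_handler(btn.clicked, on_click):
#       await trio.sleep_forever()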
@asynccontextmanager
async def open_handlers(
source_widgets: list[QWidget],
event_types: set[QEvent],
async_handler: Callable[[QWidget, trio.abc.ReceiveChannel], None],
**kwargs,
) -> None:
async with (
trio.open_nursery() as n,
AsyncExitStack() as stack,
):
for widget in source_widgets:
event_recv_stream = await stack.enter_async_context(
open_event_stream(widget, event_types, **kwargs)
)
n.start_soon(async_handler, widget, event_recv_stream)
yield

View File

@ -1,188 +0,0 @@
# piker: trading gear for hackers
# Copyright (C) Tyler Goodlet (in stewardship for piker0)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""
Trio - Qt integration
Run ``trio`` in guest mode on top of the Qt event loop.
All global Qt runtime settings are mostly defined here.
"""
from typing import Tuple, Callable, Dict, Any
import platform
import traceback
# Qt specific
import PyQt5 # noqa
import pyqtgraph as pg
from pyqtgraph import QtGui
from PyQt5 import QtCore
# from PyQt5.QtGui import QLabel, QStatusBar
from PyQt5.QtCore import (
pyqtRemoveInputHook,
Qt,
QCoreApplication,
)
import qdarkstyle
from qdarkstyle import DarkPalette
# import qdarkgraystyle
import trio
from outcome import Error
from .._daemon import maybe_open_pikerd, _tractor_kwargs
from ..log import get_logger
from ._pg_overrides import _do_overrides
from . import _style
log = get_logger(__name__)
# pyqtgraph global config
# engage core tweaks that give us better response
# latency than the average pg user
_do_overrides()
# XXX: pretty sure none of this shit works on linux as per:
# https://bugreports.qt.io/browse/QTBUG-53022
# it seems to work on windows.. no idea wtf is up.
is_windows = False
if platform.system() == "Windows":
is_windows = True
# Proper high DPI scaling is available in Qt >= 5.6.0. This attribute
# must be set before creating the application
if hasattr(Qt, 'AA_EnableHighDpiScaling'):
QCoreApplication.setAttribute(Qt.AA_EnableHighDpiScaling, True)
if hasattr(Qt, 'AA_UseHighDpiPixmaps'):
QCoreApplication.setAttribute(Qt.AA_UseHighDpiPixmaps, True)
def run_qtractor(
func: Callable,
args: Tuple,
main_widget: QtGui.QWidget,
tractor_kwargs: Dict[str, Any] = {},
window_type: QtGui.QMainWindow = None,
) -> None:
# avoids annoying message when entering debugger from qt loop
pyqtRemoveInputHook()
app = QtGui.QApplication.instance()
if app is None:
app = PyQt5.QtWidgets.QApplication([])
# TODO: we might not need this if it's desired
# to cancel the tractor machinery on Qt loop
# close, however the details of doing that correctly
# currently seem tricky..
app.setQuitOnLastWindowClosed(False)
# XXX: lmfao, this is how you disable text edit cursor blinking..smh
app.setCursorFlashTime(0)
# This code is from Nathaniel, and I quote:
# "This is substantially faster than using a signal... for some
# reason Qt signal dispatch is really slow (and relies on events
# underneath anyway, so this is strictly less work)."
# source gist and credit to njs:
# https://gist.github.com/njsmith/d996e80b700a339e0623f97f48bcf0cb
REENTER_EVENT = QtCore.QEvent.Type(QtCore.QEvent.registerEventType())
class ReenterEvent(QtCore.QEvent):
pass
class Reenter(QtCore.QObject):
def event(self, event):
event.fn()
return False
reenter = Reenter()
def run_sync_soon_threadsafe(fn):
event = ReenterEvent(REENTER_EVENT)
event.fn = fn
app.postEvent(reenter, event)
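# NOTE: this is the ``trio`` guest-mode host hook: each call wraps
# ``fn`` in a custom Qt event posted to ``reenter``, whose
# ``event()`` handler above then invokes it on the main Qt thread.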
def done_callback(outcome):
if isinstance(outcome, Error):
exc = outcome.error
if isinstance(outcome.error, KeyboardInterrupt):
# make it kinda look like ``trio``
print("Terminated!")
else:
traceback.print_exception(type(exc), exc, exc.__traceback__)
app.quit()
# load dark theme
stylesheet = qdarkstyle.load_stylesheet(
qt_api='pyqt5',
palette=DarkPalette,
)
app.setStyleSheet(stylesheet)
# make window and exec
from . import _window
if window_type is None:
window_type = _window.MainWindow
window = window_type()
# set global app's main window singleton
_window._qt_win = window
# configure global DPI aware font sizes now that a screen
# should be active from which we can read a DPI.
_style._config_fonts_to_screen()
# hook into app focus change events
app.focusChanged.connect(window.on_focus_change)
instance = main_widget()
instance.window = window
# override tractor's defaults
tractor_kwargs.update(_tractor_kwargs)
# define tractor entrypoint
async def main():
async with maybe_open_pikerd(
**tractor_kwargs,
):
await func(*((instance,) + args))
# guest mode entry
trio.lowlevel.start_guest_run(
main,
run_sync_soon_threadsafe=run_sync_soon_threadsafe,
done_callback=done_callback,
# restrict_keyboard_interrupt_to_checkpoints=True,
)
window.main_widget = main_widget
window.setCentralWidget(instance)
if is_windows:
window.configure_to_desktop()
# actually render to screen
window.show()
app.exec_()

View File

@ -1,83 +0,0 @@
# piker: trading gear for hackers
# Copyright (C) Tyler Goodlet (in stewardship for piker0)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""
Feed status and controls widget(s) for embedding in a UI-pane.
"""
from __future__ import annotations
from textwrap import dedent
from typing import TYPE_CHECKING
# from PyQt5.QtCore import Qt
from ._style import _font, _font_small
# from ..calc import humanize
from ._label import FormatLabel
if TYPE_CHECKING:
from ._chart import ChartPlotWidget
from ..data.feed import Feed
from ._forms import FieldsForm
def mk_feed_label(
form: FieldsForm,
feed: Feed,
chart: ChartPlotWidget,
) -> FormatLabel:
'''
Generate a label from feed meta-data to be displayed
in a UI sidepane.
TODO: eventually buttons for changing settings over
a feed control protocol.
'''
status = feed.status
assert status
msg = dedent("""
actor: **{actor_name}**\n
|_ @**{host}:{port}**\n
""")
for key, val in status.items():
if key in ('host', 'port', 'actor_name'):
continue
msg += f'\n|_ {key}: **{{{key}}}**\n'
feed_label = FormatLabel(
fmt_str=msg,
# |_ streams: **{symbols}**\n
font=_font.font,
font_size=_font_small.px_size,
font_color='default_lightest',
)
# form.vbox.setAlignment(feed_label, Qt.AlignBottom)
# form.vbox.setAlignment(Qt.AlignBottom)
_ = chart.height() - (
form.height() +
form.fill_bar.height()
# feed_label.height()
)
feed_label.format(**feed.status)
return feed_label

File diff suppressed because it is too large

View File

@ -1,814 +0,0 @@
# piker: trading gear for hackers
# Copyright (C) Tyler Goodlet (in stewardship for pikers)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
'''
Text entry "forms" widgets (mostly for configuration and UI user input).
'''
from __future__ import annotations
from contextlib import asynccontextmanager
from functools import partial
from math import floor
from typing import (
Optional, Any, Callable, Awaitable
)
import trio
from PyQt5 import QtGui
from PyQt5.QtCore import QSize, QModelIndex, Qt, QEvent
from PyQt5.QtWidgets import (
QWidget,
QLabel,
QComboBox,
QLineEdit,
QHBoxLayout,
QVBoxLayout,
QFormLayout,
QProgressBar,
QSizePolicy,
QStyledItemDelegate,
QStyleOptionViewItem,
)
from ._event import open_handlers
from ._icons import mk_icons
from ._style import hcolor, _font, _font_small, DpiAwareFont
from ._label import FormatLabel
class Edit(QLineEdit):
def __init__(
self,
parent: QWidget,
# parent_chart: QWidget, # noqa
font: DpiAwareFont = _font,
width_in_chars: int = None,
) -> None:
# self.setContextMenuPolicy(Qt.CustomContextMenu)
# self.customContextMenuRequested.connect(self.show_menu)
# self.setStyleSheet(f"font: 18px")
self.dpi_font = font
# self.godwidget = parent_chart
if width_in_chars:
self._chars = int(width_in_chars)
x_size_policy = QSizePolicy.Fixed
else:
# char count which will be used to calculate
# the width of the input field.
self._chars: int = 6
# fit to surrounding frame width
x_size_policy = QSizePolicy.Expanding
super().__init__(parent)
# size it as we specify
# https://doc.qt.io/qt-5/qsizepolicy.html#Policy-enum
self.setSizePolicy(
x_size_policy,
QSizePolicy.Fixed,
)
self.setFont(font.font)
# witty bit of margin
self.setTextMargins(2, 2, 2, 2)
def sizeHint(self) -> QSize:
"""
Scale edit box to size of dpi aware font.
"""
psh = super().sizeHint()
dpi_font = self.dpi_font
psh.setHeight(dpi_font.px_size)
# make space for ``._chars: int`` number of characters in view
# TODO: somehow this math ain't right?
chars_w_pxs = dpi_font.boundingRect('0'*self._chars).width()
scale = round(dpi_font.scale())
psh.setWidth(int(chars_w_pxs * scale))
return psh
def set_width_in_chars(
self,
chars: int,
) -> None:
self._chars = chars
self.sizeHint()
self.update()
def focus(self) -> None:
self.selectAll()
self.show()
self.setFocus()
class FontScaledDelegate(QStyledItemDelegate):
'''
Super simple view delegate to render text in the same
font size as the search widget.
'''
def __init__(
self,
parent=None,
font: DpiAwareFont = _font,
) -> None:
super().__init__(parent)
self.dpi_font = font
def sizeHint(
self,
option: QStyleOptionViewItem,
index: QModelIndex,
) -> QSize:
# value = index.data()
# br = self.dpi_font.boundingRect(value)
# w, h = br.width(), br.height()
parent = self.parent()
if getattr(parent, '_max_item_size', None):
return QSize(*self.parent()._max_item_size)
else:
return super().sizeHint(option, index)
# NOTE: hack to display icons on RHS
# TODO: is there a way to set this stype option once?
# def paint(self, painter, option, index):
# # display icons on RHS
# # https://stackoverflow.com/a/39943629
# option.decorationPosition = QtGui.QStyleOptionViewItem.Right
# option.decorationAlignment = Qt.AlignRight | Qt.AlignVCenter
# QStyledItemDelegate.paint(self, painter, option, index)
class Selection(QComboBox):
def __init__(
self,
parent=None,
) -> None:
self._items: dict[str, int] = {}
super().__init__(parent=parent)
self.setSizeAdjustPolicy(QComboBox.AdjustToContents)
# make line edit expand to surrounding frame
self.setSizePolicy(
QSizePolicy.Expanding,
QSizePolicy.Fixed,
)
view = self.view()
view.setUniformItemSizes(True)
# TODO: this doesn't seem to work for the currently selected item?
self.setItemDelegate(FontScaledDelegate(self))
self.resize()
self._icons = mk_icons(
self.style(),
self.iconSize()
)
def set_style(
self,
color: str,
font_size: int,
) -> None:
self.setStyleSheet(
f"""QComboBox {{
color : {hcolor(color)};
font-size : {font_size}px;
}}
"""
)
def resize(
self,
char: str = 'W',
) -> None:
br = _font.boundingRect(str(char))
_, h = br.width(), int(br.height())
# TODO: something better than this monkey patch
view = self.view()
# XXX: see size policy settings of line edit
# view._max_item_size = w, h
self.setMinimumHeight(h) # at least one entry in view
view.setMaximumHeight(6*h) # limit to 6 items max in view
icon_size = round(h * 0.75)
self.setIconSize(QSize(icon_size, icon_size))
def set_items(
self,
keys: list[str],
) -> None:
'''
Write keys to the selection verbatim.
All other items are cleared beforehand.
'''
self.clear()
self._items.clear()
for i, key in enumerate(keys):
strkey = str(key)
self.insertItem(i, strkey)
# store map of entry keys to row indexes
self._items[strkey] = i
# compute max item size so that the weird
# "style item delegate" thing can then specify
# that size on each item...
keys.sort()
self.resize(keys[-1])
def set_icon(
self,
key: str,
icon_name: Optional[str],
) -> None:
self.setItemIcon(
self._items[key],
self._icons[icon_name],
)
def items(self) -> list[tuple[str, int]]:
return list(self._items.items())
# NOTE: in theory we can put icons on the RHS side with this hackery:
# https://stackoverflow.com/a/64256969
# def showPopup(self):
# print('show')
# QComboBox.showPopup(self)
# def hidePopup(self):
# # self.setItemDelegate(FontScaledDelegate(self.parent()))
# print('hide')
# QComboBox.hidePopup(self)
# slew of resources which helped get this where it is:
# https://stackoverflow.com/questions/20648210/qcombobox-adjusttocontents-changing-height
# https://stackoverflow.com/questions/3151798/how-do-i-set-the-qcombobox-width-to-fit-the-largest-item
# https://stackoverflow.com/questions/6337589/qlistwidget-adjust-size-to-content#6370892
# https://stackoverflow.com/questions/25304267/qt-resize-of-qlistview
# https://stackoverflow.com/questions/28227406/how-to-set-qlistview-rows-height-permanently
class FieldsForm(QWidget):
vbox: QVBoxLayout
form: QFormLayout
def __init__(
self,
parent=None,
) -> None:
super().__init__(parent)
# size it as we specify
self.setSizePolicy(
QSizePolicy.Expanding,
QSizePolicy.Expanding,
)
# XXX: not sure why we have to create this here exactly
# (instead of in the pane creation routine) but it's
# here and is managed by downstream layout routines.
# best guess is that you have to create layouts in order
# of hierarchy in order for things to display correctly?
# TODO: we may want to hand this *down* from some "pane manager"
# thing eventually?
self.vbox = QVBoxLayout(self)
# self.vbox.setAlignment(Qt.AlignVCenter)
self.vbox.setAlignment(Qt.AlignBottom)
self.vbox.setContentsMargins(3, 6, 3, 6)
self.vbox.setSpacing(0)
# split layout for the (<label>: |<widget>|) parameters entry
self.form = QFormLayout()
self.form.setAlignment(Qt.AlignTop | Qt.AlignLeft)
self.form.setContentsMargins(0, 0, 3, 0)
self.form.setSpacing(0)
self.form.setHorizontalSpacing(0)
self.vbox.addLayout(self.form, stretch=3)
self.labels: dict[str, QLabel] = {}
self.fields: dict[str, QWidget] = {}
self._font_size = _font_small.px_size - 2
self._max_item_width: (float, float) = 0, 0
def add_field_label(
self,
name: str,
font_size: Optional[int] = None,
font_color: str = 'default_lightest',
) -> QtGui.QLabel:
# add label to left of search bar
# self.label = label = QtGui.QLabel()
font_size = font_size or self._font_size - 1
self.label = label = FormatLabel(
fmt_str=name,
font=_font.font,
font_size=font_size,
font_color=font_color,
)
# for later lookup
self.labels[name] = label
return label
def add_edit_field(
self,
key: str,
label_name: str,
value: str,
readonly: bool = False,
) -> Edit:
# TODO: maybe a distinct layout per "field" item?
label = self.add_field_label(label_name)
edit = Edit(
parent=self,
# width_in_chars=6,
)
edit.setStyleSheet(
f"""QLineEdit {{
color : {hcolor('gunmetal')};
font-size : {self._font_size}px;
}}
"""
)
edit.setReadOnly(readonly)
edit.setText(str(value))
self.form.addRow(label, edit)
self.fields[key] = edit
return edit
def add_select_field(
self,
key: str,
label_name: str,
values: list[str],
) -> Selection:
# TODO: maybe a distinct layout per "field" item?
label = self.add_field_label(label_name)
select = Selection(self)
select.set_style(color='gunmetal', font_size=self._font_size)
select._key = key
select.set_items(values)
self.setSizePolicy(
QSizePolicy.Fixed,
QSizePolicy.Fixed,
)
select.show()
self.form.addRow(label, select)
self.fields[key] = select
return select
async def handle_field_input(
widget: QWidget,
recv_chan: trio.abc.ReceiveChannel,
form: FieldsForm,
on_value_change: Callable[[str, Any], Awaitable[bool]],
focus_next: QWidget,
) -> None:
async for kbmsg in recv_chan:
if kbmsg.etype in {QEvent.KeyPress, QEvent.KeyRelease}:
event, etype, key, mods, txt = kbmsg.to_tuple()
print(f'key: {kbmsg.key}, mods: {kbmsg.mods}, txt: {kbmsg.txt}')
# default controls set
ctl = False
if kbmsg.mods == Qt.ControlModifier:
ctl = True
if ctl and key in { # cancel and refocus
Qt.Key_C,
Qt.Key_Space, # i feel like this is the "native" one
Qt.Key_Alt,
}:
widget.clearFocus()
# normally the godwidget
focus_next.focus()
continue
# process field input
if key in (Qt.Key_Enter, Qt.Key_Return):
key = widget._key
value = widget.text()
await on_value_change(key, value)
def mk_form(
parent: QWidget,
fields_schema: dict,
font_size: Optional[int] = None,
) -> FieldsForm:
# TODO: generate components from model
# instead of schema dict (aka use an ORM)
form = FieldsForm(parent=parent)
form._font_size = font_size or _font_small.px_size
# generate sub-components from schema dict
for key, conf in fields_schema.items():
wtype = conf['type']
label = str(conf.get('label', key))
kwargs = conf.get('kwargs', {})
# plain (line) edit field
if wtype == 'edit':
w = form.add_edit_field(
key,
label,
conf['default_value'],
**kwargs,
)
# drop-down selection
elif wtype == 'select':
values = list(conf['default_value'])
w = form.add_select_field(
key,
label,
values,
**kwargs,
)
w._key = key
return form
@asynccontextmanager
async def open_form_input_handling(
form: FieldsForm,
focus_next: QWidget,
on_value_change: Callable[[str, Any], Awaitable[bool]],
) -> FieldsForm:
async with open_handlers(
list(form.fields.values()),
event_types={
QEvent.KeyPress,
},
async_handler=partial(
handle_field_input,
form=form,
focus_next=focus_next,
on_value_change=on_value_change,
),
# block key repeats?
filter_auto_repeats=True,
):
yield form
class FillStatusBar(QProgressBar):
'''
A status bar for fills up to a position limit.
'''
border_px: int = 2
slot_margin_px: int = 1
def __init__(
self,
approx_height_px: float,
width_px: float,
font_size: int,
parent=None
) -> None:
super().__init__(parent=parent)
self.approx_h = int(round(approx_height_px))
self.setMinimumHeight(self.approx_h)
self.setMaximumHeight(self.approx_h)
self.font_size = font_size
self.setFormat('') # label format
self.setMinimumWidth(int(width_px))
self.setMaximumWidth(int(width_px))
def set_slots(
self,
slots: int,
value: float,
) -> None:
self.setOrientation(Qt.Vertical)
h = self.height()
# TODO: compute "used height" thus far and mostly fill the rest
tot_slot_h, r = divmod(h, slots)
self.setStyleSheet(
f"""
QProgressBar {{
text-align: center;
font-size : {self.font_size - 2}px;
background-color: {hcolor('papas_special')};
color : {hcolor('papas_special')};
border: {self.border_px}px solid {hcolor('default_light')};
border-radius: 2px;
}}
QProgressBar::chunk {{
background-color: {hcolor('default_spotlight')};
color: {hcolor('bracket')};
border-radius: 2px;
}}
"""
)
# to set a discrete "block" per slot...
# XXX: couldn't get the discrete math to work here such
# that it was always correctly showing a discretized value
# up to the limit; not sure if it's the ``.setRange()``
# / ``.setValue()`` api or not but i was able to get something
# close by screwing with the divmod above, but after a large
# enough value it would always show fewer chunks than the
# correct value..
# margin: {self.slot_margin_px}px;
# height: {slot_height_px}px;
# margin-bottom: {slot_margin_px*2}px;
# margin-top: {slot_margin_px*2}px;
# color: #19232D;
# width: 10px;
self.setRange(0, slots)
self.setValue(value)
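# e.g. ``bar.set_slots(4, value=2)`` renders a half-filled bar of
# 4 discrete chunks given the range set just above.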
def mk_fill_status_bar(
parent_pane: QWidget,
form: FieldsForm,
pane_vbox: QVBoxLayout,
label_font_size: Optional[int] = None,
) -> (
# TODO: turn this into a composite?
QHBoxLayout,
QProgressBar,
QLabel,
QLabel,
QLabel,
):
# indent = 18
# bracket_val = 0.375 * 0.666 * w
# indent = bracket_val / (1 + 5/8)
# TODO: calc this height from the ``ChartnPane``
chart_h = round(parent_pane.height() * 5/8)
bar_h = chart_h * 0.375
# TODO: once things are sized to screen
bar_label_font_size = label_font_size or _font.px_size - 2
label_font = DpiAwareFont()
label_font._set_qfont_px_size(bar_label_font_size)
br = label_font.boundingRect(f'{3.32:.1f}% port')
_, h = br.width(), br.height()
# text_w = 8/3 * w
# PnL on lhs
bar_labels_lhs = QVBoxLayout()
left_label = form.add_field_label(
'{pnl:>+.2%} pnl',
font_size=bar_label_font_size,
font_color='gunmetal',
)
# size according to dpi scaled fonted contents to avoid
# resizes on magnitude changes (eg. 9 -> 10 %)
min_w = int(_font.boundingRect('1000.0M% pnl').width())
left_label.setMinimumWidth(min_w)
left_label.resize(
min_w,
left_label.size().height(),
)
bar_labels_lhs.addSpacing(int(5/8 * bar_h))
bar_labels_lhs.addWidget(
left_label,
# XXX: doesn't seem to actually push up against
# the status bar?
alignment=Qt.AlignRight | Qt.AlignTop,
)
# this hbox is added as a layout by the pane maker/caller
hbox = QHBoxLayout()
hbox.addLayout(bar_labels_lhs)
# hbox.addSpacing(indent) # push to right a bit
# config
# hbox.setSpacing(indent * 0.375)
hbox.setSpacing(0)
hbox.setAlignment(Qt.AlignTop | Qt.AlignLeft)
hbox.setContentsMargins(0, 0, 0, 0)
# TODO: use percentage str formatter:
# https://docs.python.org/3/library/string.html#grammar-token-precision
top_label = form.add_field_label(
'{limit}',
font_size=bar_label_font_size,
font_color='gunmetal',
)
bottom_label = form.add_field_label(
'x: {step_size}',
font_size=bar_label_font_size,
font_color='gunmetal',
)
bar = FillStatusBar(
approx_height_px=bar_h,
width_px=h * (1 + 1/6),
font_size=form._font_size,
parent=form
)
hbox.addWidget(bar, alignment=Qt.AlignLeft | Qt.AlignTop)
bar_labels_rhs_vbox = QVBoxLayout()
bar_labels_rhs_vbox.addWidget(
top_label,
alignment=Qt.AlignLeft | Qt.AlignTop
)
bar_labels_rhs_vbox.addWidget(
bottom_label,
alignment=Qt.AlignLeft | Qt.AlignBottom
)
hbox.addLayout(bar_labels_rhs_vbox)
# compute "chunk" sizes for fill-status-bar based on some static height
slots = 4
bar.set_slots(slots, value=0)
return hbox, bar, left_label, top_label, bottom_label
def mk_order_pane_layout(
parent: QWidget,
# accounts: dict[str, Optional[str]],
) -> FieldsForm:
font_size: int = _font.px_size - 2
# TODO: maybe just allocate the whole fields form here
# and expect an async ctx entry?
form = mk_form(
parent=parent,
fields_schema={
'account': {
'label': '**accnt**:',
'type': 'select',
'default_value': ['paper'],
},
'size_unit': {
'label': '**alloc**:',
'type': 'select',
'default_value': [
'$ size',
'# units',
# '% of port',
],
},
# 'disti_weight': {
# 'label': '**weighting**:',
# 'type': 'select',
# 'default_value': ['uniform'],
# },
'limit': {
'label': '**limit**:',
'type': 'edit',
'default_value': 5000,
},
'slots': {
'label': '**slots**:',
'type': 'edit',
'default_value': 4,
},
},
)
# top level pane layout
# XXX: see ``FieldsForm.__init__()`` for why we can't do basic
# config of the vbox here
vbox = form.vbox
# _, h = form.width(), form.height()
# print(f'w, h: {w, h}')
hbox, fill_bar, left_label, top_label, bottom_label = mk_fill_status_bar(
parent,
form,
pane_vbox=vbox,
label_font_size=font_size,
)
# TODO: would be nice to have some better way of reffing these over
# monkey patching...
form.fill_bar = fill_bar
form.left_label = left_label
form.bottom_label = bottom_label
form.top_label = top_label
# add pp fill bar + spacing
vbox.addLayout(hbox, stretch=3)
# TODO: handle resize events and appropriately scale this
# to the sidepane height?
# https://doc.qt.io/qt-5/layout.html#adding-widgets-to-a-layout
# vbox.setSpacing(_font.px_size * 1.375)
form.show()
return form

View File

@ -1,970 +0,0 @@
# piker: trading gear for hackers
# Copyright (C) Tyler Goodlet (in stewardship for pikers)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
'''
FSP UI and graphics components.
Financial signal processing cluster and real-time graphics management.
'''
from contextlib import asynccontextmanager as acm
from functools import partial
import inspect
from itertools import cycle
from typing import Optional, AsyncGenerator, Any
import numpy as np
from pydantic import create_model
import tractor
import pyqtgraph as pg
import trio
from trio_typing import TaskStatus
from ._axes import PriceAxis
from .._cacheables import maybe_open_context
from ..calc import humanize
from ..data._sharedmem import (
ShmArray,
_Token,
try_read,
)
from ._chart import (
ChartPlotWidget,
LinkedSplits,
)
from ._forms import (
FieldsForm,
mk_form,
open_form_input_handling,
)
from ..fsp._api import maybe_mk_fsp_shm, Fsp
from ..fsp import cascade
from ..fsp._volume import (
tina_vwap,
dolla_vlm,
flow_rates,
)
from ..log import get_logger
log = get_logger(__name__)
def has_vlm(ohlcv: ShmArray) -> bool:
# make sure that the instrument supports volume history
# (sometimes this is not the case for some commodities and
# derivatives)
vlm = ohlcv.array['volume']
return not bool(np.all(np.isin(vlm, -1)) or np.all(np.isnan(vlm)))
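# e.g. a feed whose volume column is all -1s or all NaNs (as with
# some forex/commodities providers) returns ``False`` here.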
def update_fsp_chart(
chart: ChartPlotWidget,
flow,
graphics_name: str,
array_key: Optional[str],
**kwargs,
) -> None:
shm = flow.shm
if not shm:
return
array = shm.array
last_row = try_read(array)
# guard against unreadable case
if not last_row:
log.warning(f'Read-race on shm array: {graphics_name}@{shm.token}')
return
# update graphics
# NOTE: this does a length check internally which allows it
# to stay above the last row check below..
chart.update_graphics_from_flow(
graphics_name,
array_key=array_key or graphics_name,
**kwargs,
)
# XXX: re: ``array_key``: fsp func names must be unique meaning we
# can't have duplicates of the underlying data even if multiple
# sub-charts reference it under different 'named charts'.
# read from last calculated value and update any label
last_val_sticky = chart._ysticks.get(graphics_name)
if last_val_sticky:
last = last_row[array_key or graphics_name]
last_val_sticky.update_from_data(-1, last)
@acm
async def open_fsp_sidepane(
linked: LinkedSplits,
conf: dict[str, dict[str, str]],
) -> FieldsForm:
schema = {}
assert len(conf) == 1 # for now
# add (single) selection widget
for name, config in conf.items():
schema[name] = {
'label': '**fsp**:',
'type': 'select',
'default_value': [name],
}
# add parameters for selection "options"
params = config.get('params', {})
for name, config in params.items():
default = config['default_value']
kwargs = config.get('widget_kwargs', {})
# add to ORM schema
schema.update({
name: {
'label': f'**{name}**:',
'type': 'edit',
'default_value': default,
'kwargs': kwargs,
},
})
sidepane: FieldsForm = mk_form(
parent=linked.godwidget,
fields_schema=schema,
)
# https://pydantic-docs.helpmanual.io/usage/models/#dynamic-model-creation
FspConfig = create_model(
'FspConfig',
name=name,
**params,
)
sidepane.model = FspConfig()
# just a logger for now until we get fsp configs up and running.
async def settings_change(
key: str,
value: str
) -> bool:
print(f'{key}: {value}')
return True
# TODO:
async with (
open_form_input_handling(
sidepane,
focus_next=linked.godwidget,
on_value_change=settings_change,
)
):
yield sidepane
@acm
async def open_fsp_actor_cluster(
names: list[str] = ['fsp_0', 'fsp_1'],
) -> AsyncGenerator[int, dict[str, tractor.Portal]]:
from tractor._clustering import open_actor_cluster
# profiler = pg.debug.Profiler(
# delayed=False,
# disabled=False
# )
async with open_actor_cluster(
count=2,
names=names,
modules=['piker.fsp._engine'],
) as cluster_map:
# profiler('started fsp cluster')
yield cluster_map
async def run_fsp_ui(
linkedsplits: LinkedSplits,
shm: ShmArray,
started: trio.Event,
target: Fsp,
conf: dict[str, dict],
loglevel: str,
# profiler: pg.debug.Profiler,
# _quote_throttle_rate: int = 58,
) -> None:
'''
Task for UI spawning around a ``LinkedSplits`` chart for fsp
related graphics/UX management.
This is normally spawned/called once for each entry in the fsp
config.
'''
name = target.name
# profiler(f'started UI task for fsp: {name}')
async with (
# side UI for parameters/controls
open_fsp_sidepane(
linkedsplits,
{name: conf},
) as sidepane,
):
await started.wait()
# profiler(f'fsp:{name} attached to fsp ctx-stream')
overlay_with = conf.get('overlay', False)
if overlay_with:
if overlay_with == 'ohlc':
chart = linkedsplits.chart
else:
chart = linkedsplits.subplots[overlay_with]
chart.draw_curve(
name=name,
shm=shm,
overlay=True,
color='default_light',
array_key=name,
**conf.get('chart_kwargs', {})
)
else:
# create a new sub-chart widget for this fsp
chart = linkedsplits.add_plot(
name=name,
shm=shm,
array_key=name,
sidepane=sidepane,
# curve by default
ohlc=False,
# settings passed down to ``ChartPlotWidget``
**conf.get('chart_kwargs', {})
)
# should **not** be the same sub-chart widget
assert chart.name != linkedsplits.chart.name
array_key = name
# profiler(f'fsp:{name} chart created')
# first UI update, usually from shm pushed history
update_fsp_chart(
chart,
chart._flows[array_key],
name,
array_key=array_key,
)
chart.linked.focus()
# TODO: figure out if we can roll our own `FillToThreshold` to
# get brush filled polygons for OS/OB conditions.
# ``pg.FillBetweenItems`` seems to be one technique using
# generic fills between curve types while ``PlotCurveItem`` has
# logic inside ``.paint()`` for ``self.opts['fillLevel']`` which
# might be the best solution?
# graphics = chart.update_from_array(chart.name, array[name])
# graphics.curve.setBrush(50, 50, 200, 100)
# graphics.curve.setFillLevel(50)
# if func_name == 'rsi':
# from ._lines import level_line
# # add moveable over-[sold/bought] lines
# # and labels only for the 70/30 lines
# level_line(chart, 20)
# level_line(chart, 30, orient_v='top')
# level_line(chart, 70, orient_v='bottom')
# level_line(chart, 80, orient_v='top')
chart.view._set_yrange()
# done() # status updates
# profiler(f'fsp:{func_name} starting update loop')
# profiler.finish()
# update chart graphics
# last = time.time()
# XXX: this currently doesn't loop since
# the FSP engine does **not** push updates atm
# since we do graphics update in the main loop
# in ``._display.
# async for value in stream:
# print(value)
# # chart isn't actively shown so just skip render cycle
# if chart.linked.isHidden():
# continue
# else:
# now = time.time()
# period = now - last
# if period <= 1/_quote_throttle_rate:
# # faster than display refresh rate
# print(f'fsp too fast: {1/period}')
# continue
# # run synchronous update
# update_fsp_chart(
# chart,
# shm,
# display_name,
# array_key=func_name,
# )
# # set time of last graphics update
# last = time.time()
class FspAdmin:
'''
Client API for orchestrating FSP actors and displaying
real-time graphics output.
'''
def __init__(
self,
tn: trio.Nursery,
cluster: dict[str, tractor.Portal],
linked: LinkedSplits,
src_shm: ShmArray,
) -> None:
self.tn = tn
self.cluster = cluster
self.linked = linked
self._rr_next_actor = cycle(cluster.items())
self._registry: dict[
tuple,
tuple[tractor.MsgStream, ShmArray]
] = {}
self._flow_registry: dict[_Token, str] = {}
self.src_shm = src_shm
def rr_next_portal(self) -> tractor.Portal:
name, portal = next(self._rr_next_actor)
return portal
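# NOTE: round-robin worker selection: ``cycle()`` endlessly
# repeats the cluster's (name, portal) pairs, e.g.
# fsp_0 -> fsp_1 -> fsp_0 -> ...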
async def open_chain(
self,
portal: tractor.Portal,
complete: trio.Event,
started: trio.Event,
fqsn: str,
dst_shm: ShmArray,
conf: dict,
target: Fsp,
loglevel: str,
) -> None:
'''
Task which opens a remote FSP endpoint in the managed
cluster and sleeps until signalled to exit.
'''
ns_path = str(target.ns_path)
async with (
portal.open_context(
# chaining entrypoint
cascade,
# data feed key
fqsn=fqsn,
# mems
src_shm_token=self.src_shm.token,
dst_shm_token=dst_shm.token,
# target
ns_path=ns_path,
loglevel=loglevel,
zero_on_step=conf.get('zero_on_step', False),
shm_registry=[
(token.as_msg(), fsp_name, dst_token.as_msg())
for (token, fsp_name), dst_token
in self._flow_registry.items()
],
) as (ctx, last_index),
ctx.open_stream() as stream,
):
# register output data
self._registry[
(fqsn, ns_path)
] = (
stream,
dst_shm,
complete
)
started.set()
# wait for graceful shutdown signal
async with stream.subscribe() as stream:
async for msg in stream:
info = msg.get('fsp_update')
if info:
# if the chart isn't hidden try to update
# the data on screen.
if not self.linked.isHidden():
log.debug(f'Re-syncing graphics for fsp: {ns_path}')
self.linked.graphics_cycle(
trigger_all=True,
prepend_update_index=info['first'],
)
else:
log.info(f'recved unexpected fsp engine msg: {msg}')
await complete.wait()
async def start_engine_task(
self,
target: Fsp,
conf: dict[str, dict[str, Any]],
worker_name: Optional[str] = None,
loglevel: str = 'info',
) -> (ShmArray, trio.Event):
fqsn = self.linked.symbol.front_fqsn()
# allocate an output shm array
key, dst_shm, opened = maybe_mk_fsp_shm(
fqsn,
target=target,
readonly=True,
)
self._flow_registry[
(self.src_shm._token, target.name)
] = dst_shm._token
# if not opened:
# raise RuntimeError(
# f'Already started FSP `{fqsn}:{func_name}`'
# )
portal = self.cluster.get(worker_name) or self.rr_next_portal()
complete = trio.Event()
started = trio.Event()
self.tn.start_soon(
self.open_chain,
portal,
complete,
started,
fqsn,
dst_shm,
conf,
target,
loglevel,
)
return dst_shm, started
async def open_fsp_chart(
self,
target: Fsp,
conf: dict, # yeah probably dumb..
loglevel: str = 'error',
) -> trio.Event:
shm, started = await self.start_engine_task(
target,
conf,
loglevel,
)
# init async
self.tn.start_soon(
partial(
run_fsp_ui,
self.linked,
shm,
started,
target,
conf=conf,
loglevel=loglevel,
)
)
return started
@acm
async def open_fsp_admin(
linked: LinkedSplits,
src_shm: ShmArray,
**kwargs,
) -> AsyncGenerator[dict, dict[str, tractor.Portal]]:
async with (
maybe_open_context(
# for now make a cluster per client?
acm_func=open_fsp_actor_cluster,
kwargs=kwargs,
) as (cache_hit, cluster_map),
trio.open_nursery() as tn,
):
if cache_hit:
log.info('re-using existing fsp cluster')
admin = FspAdmin(
tn,
cluster_map,
linked,
src_shm,
)
try:
yield admin
finally:
# terminate all tasks via signals
for key, entry in admin._registry.items():
_, _, event = entry
event.set()
async def open_vlm_displays(
linked: LinkedSplits,
ohlcv: ShmArray,
dvlm: bool = True,
task_status: TaskStatus[ChartPlotWidget] = trio.TASK_STATUS_IGNORED,
) -> ChartPlotWidget:
'''
Volume subchart displays.
Since "volume" is often included directly alongside OHLCV price
data, we don't really need a separate FSP-actor + shm array for it
since it's likely already directly adjacent to OHLC samples from the
data provider.
Further, only if volume data is detected (it sometimes isn't provided,
eg. forex, certain commodities markets) will volume-dependent FSPs
be spawned here.
'''
sig = inspect.signature(flow_rates.func)
params = sig.parameters
async with (
open_fsp_sidepane(
linked, {
'flows': {
# TODO: add support for dynamically changing these
'params': {
u'\u03BC' + '_type': {
'default_value': str(params['mean_type'].default),
},
'period': {
'default_value': str(params['period'].default),
# make widget un-editable for now.
'widget_kwargs': {'readonly': True},
},
},
}
},
) as sidepane,
open_fsp_admin(linked, ohlcv) as admin,
):
# TODO: support updates
# period_field = sidepane.fields['period']
# period_field.setText(
# str(period_param.default)
# )
# built-in vlm which we plot ASAP since it's
# usually data provided directly with OHLC history.
shm = ohlcv
chart = linked.add_plot(
name='volume',
shm=shm,
array_key='volume',
sidepane=sidepane,
# curve by default
ohlc=False,
# Draw vertical bars from zero.
# we do this internally ourselves since
# the curve item internals are pretty convoluted.
style='step',
)
# force 0 to always be in view
def multi_maxmin(
names: list[str],
) -> tuple[float, float]:
mx = 0
for name in names:
mxmn = chart.maxmin(name=name)
if mxmn:
ymax = mxmn[1]
if ymax > mx:
mx = ymax
return 0, mx
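# e.g. ``multi_maxmin(['volume'])`` pins the y-range floor at 0
# while tracking only the running max across the named curves.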
chart.view.maxmin = partial(multi_maxmin, names=['volume'])
# TODO: fix the x-axis label issue where if you put
# the axis on the left it's totally not lined up...
# show volume units value on LHS (for dinkus)
# chart.hideAxis('right')
# chart.showAxis('left')
# send back new chart to caller
task_status.started(chart)
# should **not** be the same sub-chart widget
assert chart.name != linked.chart.name
# sticky only on sub-charts atm
last_val_sticky = chart._ysticks[chart.name]
# read from last calculated value
value = shm.array['volume'][-1]
last_val_sticky.update_from_data(-1, value)
vlm_curve = chart.update_graphics_from_flow(
'volume',
# shm.array,
)
# size view to data once at outset
chart.view._set_yrange()
# add axis title
axis = chart.getAxis('right')
axis.set_title(' vlm')
if dvlm:
tasks_ready = []
# spawn and overlay $ vlm on the same subchart
dvlm_shm, started = await admin.start_engine_task(
dolla_vlm,
{ # fsp engine conf
'func_name': 'dolla_vlm',
'zero_on_step': True,
'params': {
'price_func': {
'default_value': 'chl3',
},
},
},
# loglevel,
)
tasks_ready.append(started)
# FIXME: we should error when starting the same fsp twice
# since it might collide with existing shm.. or wait, did we
# have this check before??
# profiler(f'created shm for fsp actor: {display_name}')
# wait for all engine tasks to startup
async with trio.open_nursery() as n:
for event in tasks_ready:
n.start_soon(event.wait)
# dolla vlm overlay
# XXX: the main chart already contains a vlm "units" axis
# so here we add an overlay with a y-range in
# $ liquidity-value units (normally a fiat like USD).
dvlm_pi = chart.overlay_plotitem(
'dolla_vlm',
index=0, # place axis on inside (nearest to chart)
axis_title=' $vlm',
axis_side='right',
axis_kwargs={
'typical_max_str': ' 100.0 M ',
'formatter': partial(
humanize,
digits=2,
),
},
)
# all to be overlayed curve names
fields = [
'dolla_vlm',
'dark_vlm',
]
# dvlm_rate_fields = [
# 'dvlm_rate',
# 'dark_dvlm_rate',
# ]
trade_rate_fields = [
'trade_rate',
'dark_trade_rate',
]
group_mxmn = partial(
multi_maxmin,
# keep both regular and dark vlm in view
names=fields,
# names=fields + dvlm_rate_fields,
)
# add custom auto range handler
dvlm_pi.vb._maxmin = group_mxmn
# use slightly less light (than bracket) gray
# for volume from "main exchange" and a more "bluey"
# gray for "dark" vlm.
vlm_color = 'i3'
dark_vlm_color = 'charcoal'
# add dvlm (step) curves to common view
def chart_curves(
names: list[str],
pi: pg.PlotItem,
shm: ShmArray,
step_mode: bool = False,
style: str = 'solid',
) -> None:
for name in names:
if 'dark' in name:
color = dark_vlm_color
elif 'rate' in name:
color = vlm_color
else:
color = 'bracket'
curve, _ = chart.draw_curve(
name=name,
shm=shm,
array_key=name,
overlay=pi,
color=color,
step_mode=step_mode,
style=style,
pi=pi,
)
# TODO: we need a better API to do this..
# specially store ref to shm for lookup in display loop
# since only a placeholder of `None` is entered in
# ``.draw_curve()``.
flow = chart._flows[name]
assert flow.plot is pi
chart_curves(
fields,
dvlm_pi,
dvlm_shm,
step_mode=True,
)
# spawn flow rates fsp **ONLY AFTER** the 'dolla_vlm' fsp is
# up since this one depends on it.
fr_shm, started = await admin.start_engine_task(
flow_rates,
{ # fsp engine conf
'func_name': 'flow_rates',
'zero_on_step': False,
},
# loglevel,
)
await started.wait()
# chart_curves(
# dvlm_rate_fields,
# dvlm_pi,
# fr_shm,
# )
# TODO: is there a way to "sync" the dual axes such that only
# one curve is needed?
# hide the original vlm curve since the $vlm one is now
# displayed and the curves are effectively the same minus
# liquidity events (well at least on low OHLC periods - 1s).
vlm_curve.hide()
chart.removeItem(vlm_curve)
vflow = chart._flows['volume']
vflow.render = False
# avoid range sorting on volume once disabled
chart.view.disable_auto_yrange()
# Trade rate overlay
# XXX: requires an additional overlay for
# a trades-per-period (time) y-range.
tr_pi = chart.overlay_plotitem(
'trade_rates',
# TODO: dynamically update period (and thus this axis?)
# title from user input.
axis_title='clears',
axis_side='left',
axis_kwargs={
'typical_max_str': ' 10.0 M ',
'formatter': partial(
humanize,
digits=2,
),
'text_color': vlm_color,
},
)
# add custom auto range handler
tr_pi.vb.maxmin = partial(
multi_maxmin,
# keep both regular and dark vlm in view
names=trade_rate_fields,
)
chart_curves(
trade_rate_fields,
tr_pi,
fr_shm,
# step_mode=True,
# dashed line to represent "individual trades" being
# more "granular" B)
style='dash',
)
for pi in (
dvlm_pi,
tr_pi,
):
for name, axis_info in pi.axes.items():
# lol this sux XD
axis = axis_info['item']
if isinstance(axis, PriceAxis):
axis.size_to_values()
# built-in vlm fsps
for target, conf in {
# tina_vwap: {
# 'overlay': 'ohlc', # overlays with OHLCV (main) chart
# 'anchor': 'session',
# },
}.items():
started = await admin.open_fsp_chart(
target,
conf,
)
async def start_fsp_displays(
linked: LinkedSplits,
ohlcv: ShmArray,
group_status_key: str,
loglevel: str,
) -> None:
'''
Create fsp charts from a config input attached to a local actor
compute cluster.
Pass target entrypoint and historical data via ``ShmArray``.
'''
linked.focus()
# TODO: eventually we'll support some kind of n-compose syntax
fsp_conf = {
# 'rsi': {
# 'func_name': 'rsi', # literal python func ref lookup name
# # map of parameters to place on the fsp sidepane widget
# # which should map to dynamic inputs available to the
# # fsp function at runtime.
# 'params': {
# 'period': {
# 'default_value': 14,
# 'widget_kwargs': {'readonly': True},
# },
# },
# # ``ChartPlotWidget`` options passthrough
# 'chart_kwargs': {
# 'static_yrange': (0, 100),
# },
# },
}
profiler = pg.debug.Profiler(
delayed=False,
disabled=False
)
async with (
# NOTE: this admin internally opens an actor cluster
open_fsp_admin(linked, ohlcv) as admin,
):
statuses = []
for target, conf in fsp_conf.items():
started = await admin.open_fsp_chart(
target,
conf,
)
done = linked.window().status_bar.open_status(
f'loading fsp, {target}..',
group_key=group_status_key,
)
statuses.append((started, done))
for fsp_loaded, status_cb in statuses:
await fsp_loaded.wait()
profiler(f'attached to fsp portal: {target}')
status_cb()
# blocks on nursery until all fsp actors complete

View File

@ -1,90 +0,0 @@
# piker: trading gear for hackers
# Copyright (C) Tyler Goodlet (in stewardship for pikers)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
'''
``QIcon`` hackery.
'''
from PyQt5.QtWidgets import QStyle
from PyQt5.QtGui import (
QIcon, QPixmap, QColor
)
from PyQt5.QtCore import QSize
from ._style import hcolor
# https://www.pythonguis.com/faq/built-in-qicons-pyqt/
# account status icons taken from built-in set
_icon_names: dict[str, str] = {
# these two seem to work for our mask hack done below to
# change the coloring.
'long_pp': 'SP_TitleBarShadeButton',
'short_pp': 'SP_TitleBarUnshadeButton',
'ready': 'SP_MediaPlay',
}
_icons: dict[str, QIcon] = {}
def mk_icons(
style: QStyle,
size: QSize,
) -> dict[str, QIcon]:
'''This helper is idempotent.
'''
global _icons, _icon_names
if _icons:
return _icons
_icons[None] = QIcon() # the "null" icon
# load account selection using current style
for name, icon_name in _icon_names.items():
stdpixmap = getattr(QStyle, icon_name)
stdicon = style.standardIcon(stdpixmap)
pixmap = stdicon.pixmap(size)
# fill hack from SO to change icon color:
# https://stackoverflow.com/a/38369468
out_pixmap = QPixmap(size)
out_pixmap.fill(QColor(hcolor('default_spotlight')))
out_pixmap.setMask(pixmap.createHeuristicMask())
# TODO: no idea why this doesn't work / looks like
# trash. Sure would be nice to just generate our own
# pixmaps on the fly..
# p = QPainter(out_pixmap)
# p.setOpacity(1)
# p.setBrush(QColor(hcolor('papas_special')))
# p.setPen(QColor(hcolor('default_lightest')))
# path = mk_marker_path(style='|<')
# p.scale(6, 6)
# # p.translate(0, 0)
# p.drawPath(path)
# p.save()
# p.end()
# del p
# icon = QIcon(out_pixmap)
icon = QIcon()
icon.addPixmap(out_pixmap)
_icons[name] = icon
return _icons
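# Illustrative usage sketch (hypothetical caller, not from this
# module): build the cache once a widget's ``QStyle`` is available,
# e.g.:
#
#   from PyQt5.QtCore import QSize
#
#   icons = mk_icons(widget.style(), QSize(16, 16))
#   widget.setWindowIcon(icons['ready'])
#
# repeat calls are cheap: the module-level ``_icons`` cache is
# returned as-is after the first build.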
View File

@ -1,947 +0,0 @@
# piker: trading gear for hackers
# Copyright (C) Tyler Goodlet (in stewardship for piker0)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""
Chart view box primitives
"""
from __future__ import annotations
from contextlib import asynccontextmanager
import time
from typing import Optional, Callable
import pyqtgraph as pg
# from pyqtgraph.GraphicsScene import mouseEvents
from PyQt5.QtWidgets import QGraphicsSceneMouseEvent as gs_mouse
from PyQt5.QtCore import Qt, QEvent
from pyqtgraph import ViewBox, Point, QtCore
from pyqtgraph import functions as fn
import numpy as np
import trio
from ..log import get_logger
from .._profile import pg_profile_enabled, ms_slower_then
# from ._style import _min_points_to_show
from ._editors import SelectRect
from . import _event
log = get_logger(__name__)
NUMBER_LINE = {
Qt.Key_1,
Qt.Key_2,
Qt.Key_3,
Qt.Key_4,
Qt.Key_5,
Qt.Key_6,
Qt.Key_7,
Qt.Key_8,
Qt.Key_9,
Qt.Key_0,
}
ORDER_MODE = {
Qt.Key_A,
Qt.Key_F,
Qt.Key_D,
}
async def handle_viewmode_kb_inputs(
view: 'ChartView',
recv_chan: trio.abc.ReceiveChannel,
) -> None:
order_mode = view.order_mode
# track edge triggered keys
# (https://en.wikipedia.org/wiki/Interrupt#Triggering_methods)
pressed: set[str] = set()
last = time.time()
trigger_mode: str
action: str
on_next_release: Optional[Callable] = None
# for quick key sequence-combo pattern matching
# we have a min_tap period and these should not
# ever be auto-repeats since we filter those at the
# event filter level prior to the above mem chan.
min_tap = 1/6
fast_key_seq: list[str] = []
fast_taps: dict[str, Callable] = {
'cc': order_mode.cancel_all_orders,
}
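# NOTE (illustrative): with the table above, tapping 'c' twice within
# ``min_tap`` seconds (1/6s) builds ``fast_key_seq == ['c', 'c']``,
# which matches the 'cc' entry in the press branch below and thus
# invokes ``order_mode.cancel_all_orders()``.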
async for kbmsg in recv_chan:
event, etype, key, mods, text = kbmsg.to_tuple()
log.debug(f'key: {key}, mods: {mods}, text: {text}')
now = time.time()
period = now - last
# reset mods
ctrl: bool = False
shift: bool = False
# press branch
if etype in {QEvent.KeyPress}:
pressed.add(key)
if (
# clear any old values not part of a "fast" tap sequence:
# presumes the period since last tap is longer than our
# min_tap period
fast_key_seq and period >= min_tap or
# don't support more than 2 key sequences for now
len(fast_key_seq) > 2
):
fast_key_seq.clear()
# capture key to fast tap sequence if we either
# have no previous keys or we do and the min_tap period is
# met
if (
not fast_key_seq or
period <= min_tap and fast_key_seq
):
fast_key_seq.append(text)
log.debug(f'fast keys seqs {fast_key_seq}')
# mods run through
if mods == Qt.ShiftModifier:
shift = True
if mods == Qt.ControlModifier:
ctrl = True
# SEARCH MODE #
# ctrl-<space>/<l> for "lookup", "search" -> open search tree
if (
ctrl and key in {
Qt.Key_L,
Qt.Key_Space,
}
):
view._chart.linked.godwidget.search.focus()
# esc and ctrl-c
if key == Qt.Key_Escape or (ctrl and key == Qt.Key_C):
# ctrl-c as cancel
# https://forum.qt.io/topic/532/how-to-catch-ctrl-c-on-a-widget/9
view.select_box.clear()
# cancel order or clear graphics
if key == Qt.Key_C or key == Qt.Key_Delete:
order_mode.cancel_orders_under_cursor()
# View modes
if key == Qt.Key_R:
# TODO: set this for all subplots
# edge triggered default view activation
view.chart.default_view()
if len(fast_key_seq) > 1:
# begin matches against sequences
func: Callable = fast_taps.get(''.join(fast_key_seq))
if func:
func()
fast_key_seq.clear()
# release branch
elif etype in {QEvent.KeyRelease}:
if on_next_release:
on_next_release()
on_next_release = None
if key in pressed:
pressed.remove(key)
# QUERY/QUOTE MODE #
if {Qt.Key_Q}.intersection(pressed):
view.linkedsplits.cursor.in_query_mode = True
else:
view.linkedsplits.cursor.in_query_mode = False
# SELECTION MODE
# --------------
if shift:
if view.state['mouseMode'] == ViewBox.PanMode:
view.setMouseMode(ViewBox.RectMode)
else:
view.setMouseMode(ViewBox.PanMode)
# Toggle position config pane
if (
ctrl and key in {
Qt.Key_P,
}
):
pp_pane = order_mode.current_pp.pane
if pp_pane.isHidden():
pp_pane.show()
else:
pp_pane.hide()
# ORDER MODE
# ----------
# live vs. dark trigger + an action {buy, sell, alert}
order_keys_pressed = ORDER_MODE.intersection(pressed)
if order_keys_pressed:
# show the pp size label
order_mode.current_pp.show()
# TODO: show pp config mini-params in status bar widget
# mode.pp_config.show()
if (
# 's' for "submit" to activate "live" order
Qt.Key_S in pressed or
ctrl
):
trigger_type: str = 'live'
else:
trigger_type: str = 'dark'
# order mode trigger "actions"
if Qt.Key_D in pressed: # for "damp eet"
action = 'sell'
elif Qt.Key_F in pressed: # for "fillz eet"
action = 'buy'
elif Qt.Key_A in pressed:
action = 'alert'
trigger_type = 'live'
order_mode.active = True
# XXX: order matters here for line style!
order_mode._trigger_type = trigger_type
order_mode.stage_order(
action,
trigger_type=trigger_type,
)
prefix = trigger_type + '-' if action != 'alert' else ''
view._chart.window().set_mode_name(f'{prefix}{action}')
elif (
(
Qt.Key_S in pressed or
order_keys_pressed or
Qt.Key_O in pressed
) and
key in NUMBER_LINE
):
# hot key to set order slots size.
# change edit field to current number line value,
# update the pp allocator bar, unhighlight the
# field when ctrl is released.
num = int(text)
pp_pane = order_mode.pane
pp_pane.on_ui_settings_change('slots', num)
edit = pp_pane.form.fields['slots']
edit.selectAll()
# un-highlight on ctrl release
on_next_release = edit.deselect
pp_pane.update_status_ui(pp_pane.order_mode.current_pp)
else: # none active
# hide pp label
order_mode.current_pp.hide_info()
# if none are pressed, remove "staged" level
# line under cursor position
order_mode.lines.unstage_line()
if view.hasFocus():
# update mode label
view._chart.window().set_mode_name('view')
order_mode.active = False
last = time.time()
async def handle_viewmode_mouse(
view: 'ChartView',
recv_chan: trio.abc.ReceiveChannel,
) -> None:
async for msg in recv_chan:
button = msg.button
# XXX: ugggh ``pyqtgraph`` has its own mouse events..
# so we can't override this easily.
# it's going to take probably some decent
# reworking of the mouseClickEvent() handler.
# if button == QtCore.Qt.RightButton and view.menuEnabled():
# event = mouseEvents.MouseClickEvent(msg.event)
# # event.accept()
# view.raiseContextMenu(event)
if (
view.order_mode.active and
button == QtCore.Qt.LeftButton
):
# when in order mode, submit execution
# msg.event.accept()
# breakpoint()
view.order_mode.submit_order()
class ChartView(ViewBox):
'''
Price chart view box with interaction behaviors you'd expect from
any interactive platform:
- zoom on mouse scroll that auto fits y-axis
- vertical scrolling on y-axis
- zoom on x to most recent in view datum
- zoom on right-click-n-drag to cursor position
'''
mode_name: str = 'view'
# "relay events" for making overlaid views work.
# NOTE: these MUST be defined here (and can't be monkey patched
# on later) due to signal construction requiring refs to be
# in place during the run of meta-class machinery.
mouseDragEventRelay = QtCore.Signal(object, object, object)
wheelEventRelay = QtCore.Signal(object, object, object)
event_relay_source: 'Optional[ViewBox]' = None
relays: dict[str, QtCore.Signal] = {}
def __init__(
self,
name: str,
parent: pg.PlotItem = None,
static_yrange: Optional[tuple[float, float]] = None,
**kwargs,
):
super().__init__(
parent=parent,
name=name,
# TODO: look into the default view padding
# support that might replace some of our
# ``ChartPlotWidget._set_yrange()``
# defaultPadding=0.,
**kwargs
)
# for "known y-range style"
self._static_yrange = static_yrange
self._maxmin = None
# enable mouse interaction (pan/zoom) on both axes
self.setMouseEnabled(
x=True,
y=True,
)
self.linkedsplits = None
self._chart: 'ChartPlotWidget' = None # noqa
# add our selection box annotator
self.select_box = SelectRect(self)
self.addItem(self.select_box, ignoreBounds=True)
self.mode = None
self.order_mode: bool = False
self.setFocusPolicy(QtCore.Qt.StrongFocus)
self._ic = None
def start_ic(
self,
) -> None:
'''
Signal the beginning of a click-drag interaction
to any interested task waiters.
'''
if self._ic is None:
self.chart.pause_all_feeds()
self._ic = trio.Event()
def signal_ic(
self,
*args,
) -> None:
'''
Signal the end of a click-drag interaction
to any waiters.
'''
if self._ic:
self._ic.set()
self._ic = None
self.chart.resume_all_feeds()
@asynccontextmanager
async def open_async_input_handler(
self,
) -> 'ChartView':
async with (
_event.open_handlers(
[self],
event_types={
QEvent.KeyPress,
QEvent.KeyRelease,
},
async_handler=handle_viewmode_kb_inputs,
),
_event.open_handlers(
[self],
event_types={
gs_mouse.GraphicsSceneMousePress,
},
async_handler=handle_viewmode_mouse,
),
):
yield self
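# A hedged usage sketch (caller names assumed from context, not
# verbatim from this repo): chart-setup code is expected to enter this
# ctx-manager so kb/mouse events get streamed to the async handlers
# defined above, e.g.:
#
#   async with chart.view.open_async_input_handler():
#       ...  # run the display loop; inputs handled in bg tasks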
@property
def chart(self) -> 'ChartPlotWidget': # type: ignore # noqa
return self._chart
@chart.setter
def chart(self, chart: 'ChartPlotWidget') -> None: # type: ignore # noqa
self._chart = chart
self.select_box.chart = chart
if self._maxmin is None:
self._maxmin = chart.maxmin
@property
def maxmin(self) -> Callable:
return self._maxmin
@maxmin.setter
def maxmin(self, callback: Callable) -> None:
self._maxmin = callback
def wheelEvent(
self,
ev,
axis=None,
relayed_from: ChartView = None,
):
'''
Override "center-point" location for scrolling.
This is an override of the ``ViewBox`` method simply changing
the center of the zoom to be the y-axis.
TODO: PR a method into ``pyqtgraph`` to make this configurable
'''
if axis in (0, 1):
mask = [False, False]
mask[axis] = self.state['mouseEnabled'][axis]
else:
mask = self.state['mouseEnabled'][:]
chart = self.linkedsplits.chart
# don't zoom more than the min points setting
l, lbar, rbar, r = chart.bars_range()
# vl = r - l
# if ev.delta() > 0 and vl <= _min_points_to_show:
# log.debug("Max zoom bruh...")
# return
# if (
# ev.delta() < 0
# and vl >= len(chart._flows[chart.name].shm.array) + 666
# ):
# log.debug("Min zoom bruh...")
# return
# actual scaling factor
s = 1.015 ** (ev.delta() * -1 / 20) # self.state['wheelScaleFactor'])
s = [(None if m is False else s) for m in mask]
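# worked example of the scale factor above: one wheel notch typically
# reports ``ev.delta() == 120`` so,
#   s = 1.015 ** (120 * -1 / 20) = 1.015 ** -6 ~= 0.915
# i.e. the view range shrinks ~8.5% (zoom in); scrolling the other way
# gives s ~= 1.093 (zoom out).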
if (
# zoom happened on axis
axis == 1
# if already in axis zoom mode then keep it
or self.chart._static_yrange == 'axis'
):
self.chart._static_yrange = 'axis'
self.setLimits(yMin=None, yMax=None)
# print(scale_y)
# pos = ev.pos()
# lastPos = ev.lastPos()
# dif = pos - lastPos
# dif = dif * -1
center = Point(
fn.invertQTransform(
self.childGroup.transform()
).map(ev.pos())
)
# scale_y = 1.3 ** (center.y() * -1 / 20)
self.scaleBy(s, center)
else:
# center = pg.Point(
# fn.invertQTransform(self.childGroup.transform()).map(ev.pos())
# )
# XXX: scroll "around" the right most element in the view
# which stays "pinned" in place.
# furthest_right_coord = self.boundingRect().topRight()
# yaxis = pg.Point(
# fn.invertQTransform(
# self.childGroup.transform()
# ).map(furthest_right_coord)
# )
# This seems like the most "intuitive" option, a hybrid of
# tws and tv styles
last_bar = pg.Point(int(rbar)) + 1
ryaxis = chart.getAxis('right')
r_axis_x = ryaxis.pos().x()
end_of_l1 = pg.Point(
round(
chart.cv.mapToView(
pg.Point(r_axis_x - chart._max_l1_line_len)
# QPointF(chart._max_l1_line_len, 0)
).x()
)
) # .x()
# self.state['viewRange'][0][1] = end_of_l1
# focal = pg.Point((last_bar.x() + end_of_l1)/2)
focal = min(
last_bar,
end_of_l1,
key=lambda p: p.x()
)
# focal = pg.Point(last_bar.x() + end_of_l1)
self._resetTarget()
self.scaleBy(s, focal)
# XXX: the order of the next 2 lines i'm pretty sure
# matters, we want the resize to trigger before the graphics
# update, but i gotta feelin that because this one is signal
# based (and thus not necessarily sync invoked right away)
# that calling the resize method manually might work better.
self.sigRangeChangedManually.emit(mask)
# XXX: without this it seems as though sometimes
# when zooming in from far out (and maybe vice versa?)
# the signal isn't being fired enough since if you pan
# just after you'll see further downsampling code run
# (pretty noticeable on the OHLC ds curve) but with this
# that never seems to happen? Only question is how much this
# "double work" is causing latency when these missing event
# fires don't happen?
self.maybe_downsample_graphics()
ev.accept()
def mouseDragEvent(
self,
ev,
axis: Optional[int] = None,
relayed_from: ChartView = None,
) -> None:
pos = ev.pos()
lastPos = ev.lastPos()
dif = pos - lastPos
dif = dif * -1
# NOTE: if axis is specified, event will only affect that axis.
button = ev.button()
# Ignore axes if mouse is disabled
mouseEnabled = np.array(self.state['mouseEnabled'], dtype=float)
mask = mouseEnabled.copy()
if axis is not None:
mask[1-axis] = 0.0
# Scale or translate based on mouse button
if button & (
QtCore.Qt.LeftButton | QtCore.Qt.MidButton
):
# zoom y-axis ONLY when click-n-drag on it
# if axis == 1:
# # set a static y range special value on chart widget to
# # prevent sizing to data in view.
# self.chart._static_yrange = 'axis'
# scale_y = 1.3 ** (dif.y() * -1 / 20)
# self.setLimits(yMin=None, yMax=None)
# # print(scale_y)
# self.scaleBy((0, scale_y))
# SELECTION MODE
if (
self.state['mouseMode'] == ViewBox.RectMode
and axis is None
):
# XXX: WHY
ev.accept()
down_pos = ev.buttonDownPos()
# This is the final position in the drag
if ev.isFinish():
self.select_box.mouse_drag_released(down_pos, pos)
ax = QtCore.QRectF(down_pos, pos)
ax = self.childGroup.mapRectFromParent(ax)
# this is the zoom transform cmd
self.showAxRect(ax)
# axis history tracking
self.axHistoryPointer += 1
self.axHistory = self.axHistory[
:self.axHistoryPointer] + [ax]
else:
print('drag finish?')
self.select_box.set_pos(down_pos, pos)
# update shape of scale box
# self.updateScaleBox(ev.buttonDownPos(), ev.pos())
self.updateScaleBox(
down_pos,
ev.pos(),
)
# PANNING MODE
else:
# XXX: WHY
ev.accept()
self.start_ic()
# if self._ic is None:
# self.chart.pause_all_feeds()
# self._ic = trio.Event()
if axis == 1:
self.chart._static_yrange = 'axis'
tr = self.childGroup.transform()
tr = fn.invertQTransform(tr)
tr = tr.map(dif*mask) - tr.map(Point(0, 0))
x = tr.x() if mask[0] == 1 else None
y = tr.y() if mask[1] == 1 else None
self._resetTarget()
if x is not None or y is not None:
self.translateBy(x=x, y=y)
self.sigRangeChangedManually.emit(self.state['mouseEnabled'])
if ev.isFinish():
self.signal_ic()
# self._ic.set()
# self._ic = None
# self.chart.resume_all_feeds()
# WEIRD "RIGHT-CLICK CENTER ZOOM" MODE
elif button & QtCore.Qt.RightButton:
if self.state['aspectLocked'] is not False:
mask[0] = 0
dif = ev.screenPos() - ev.lastScreenPos()
dif = np.array([dif.x(), dif.y()])
dif[0] *= -1
s = ((mask * 0.02) + 1) ** dif
tr = self.childGroup.transform()
tr = fn.invertQTransform(tr)
x = s[0] if mouseEnabled[0] == 1 else None
y = s[1] if mouseEnabled[1] == 1 else None
center = Point(tr.map(ev.buttonDownPos(QtCore.Qt.RightButton)))
self._resetTarget()
self.scaleBy(x=x, y=y, center=center)
self.sigRangeChangedManually.emit(self.state['mouseEnabled'])
# XXX: WHY
ev.accept()
# def mouseClickEvent(self, event: QtCore.QEvent) -> None:
# '''This routine is rerouted to an async handler.
# '''
# pass
def keyReleaseEvent(self, event: QtCore.QEvent) -> None:
'''This routine is rerouted to an async handler.
'''
pass
def keyPressEvent(self, event: QtCore.QEvent) -> None:
'''This routine is rerouted to an async handler.
'''
pass
def _set_yrange(
self,
*,
yrange: Optional[tuple[float, float]] = None,
range_margin: float = 0.06,
bars_range: Optional[tuple[int, int, int, int]] = None,
# flag to prevent triggering sibling charts from the same linked
# set from recursion errors.
autoscale_linked_plots: bool = False,
name: Optional[str] = None,
) -> None:
'''
Set the viewable y-range based on embedded data.
This adds auto-scaling like zoom on the scroll wheel such
that data always fits nicely inside the current view of the
data set.
'''
name = self.name
# print(f'YRANGE ON {name}')
profiler = pg.debug.Profiler(
msg=f'`ChartView._set_yrange()`: `{name}`',
disabled=not pg_profile_enabled(),
ms_threshold=ms_slower_then,
delayed=True,
)
set_range = True
chart = self._chart
# view has been set in 'axis' mode
# meaning it can be panned and zoomed
# arbitrarily on the y-axis:
# - disable autoranging
# - remove any y range limits
if chart._static_yrange == 'axis':
set_range = False
self.setLimits(yMin=None, yMax=None)
# static y-range has been set likely by
# a specialized FSP configuration.
elif chart._static_yrange is not None:
ylow, yhigh = chart._static_yrange
# range passed in by caller, usually a
# maxmin detection algos inside the
# display loop for re-draw efficiency.
elif yrange is not None:
ylow, yhigh = yrange
if set_range:
# XXX: only compute the mxmn range
# if none is provided as input!
if not yrange:
# flow = chart._flows[name]
yrange = self._maxmin()
if yrange is None:
log.warning(f'No yrange provided for {name}!?')
print(f"WTF NO YRANGE {name}")
return
ylow, yhigh = yrange
profiler(f'callback ._maxmin(): {yrange}')
# view margins: stay within a % of the "true range"
diff = yhigh - ylow
ylow = ylow - (diff * range_margin)
yhigh = yhigh + (diff * range_margin)
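# worked example of the margin math above: with ``yrange = (90, 110)``
# and the default 6% margin,
#   diff = 20 -> ylow = 90 - 1.2 = 88.8, yhigh = 110 + 1.2 = 111.2
# leaving the data to occupy ~89% of the visible vertical span.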
# XXX: this often needs to be unset
# to get different view modes to operate
# correctly!
self.setLimits(
yMin=ylow,
yMax=yhigh,
)
self.setYRange(ylow, yhigh)
profiler(f'set limits: {(ylow, yhigh)}')
profiler.finish()
def enable_auto_yrange(
self,
src_vb: Optional[ChartView] = None,
) -> None:
'''
Assign callback for rescaling y-axis automatically
based on data contents and ``ViewBox`` state.
'''
if src_vb is None:
src_vb = self
# splitter(s) resizing
src_vb.sigResized.connect(self._set_yrange)
# TODO: a smarter way to avoid calling this needlessly?
# 2 things i can think of:
# - register downsample-able graphics specially and only
# iterate those.
# - only register this when certain downsampleable graphics are
# "added to scene".
src_vb.sigRangeChangedManually.connect(
self.maybe_downsample_graphics
)
# mouse wheel doesn't emit XRangeChanged
src_vb.sigRangeChangedManually.connect(self._set_yrange)
# src_vb.sigXRangeChanged.connect(self._set_yrange)
# src_vb.sigXRangeChanged.connect(
# self.maybe_downsample_graphics
# )
def disable_auto_yrange(self) -> None:
self.sigResized.disconnect(
self._set_yrange,
)
self.sigRangeChangedManually.disconnect(
self.maybe_downsample_graphics
)
self.sigRangeChangedManually.disconnect(
self._set_yrange,
)
# self.sigXRangeChanged.disconnect(self._set_yrange)
# self.sigXRangeChanged.disconnect(
# self.maybe_downsample_graphics
# )
def x_uppx(self) -> float:
'''
Return the "number of x units" within a single
pixel currently being displayed for relevant
graphics items which are our children.
'''
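# e.g. a return value of 4.0 means 4 x-units (array indices) are
# squeezed into a single horizontal pixel; render code can use this
# ratio to decide how aggressively to downsample drawn points.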
graphics = [f.graphics for f in self._chart._flows.values()]
if not graphics:
return 0
for graphic in graphics:
xvec = graphic.pixelVectors()[0]
if xvec:
return xvec.x()
else:
return 0
def maybe_downsample_graphics(
self,
autoscale_overlays: bool = True,
):
profiler = pg.debug.Profiler(
msg=f'ChartView.maybe_downsample_graphics() for {self.name}',
disabled=not pg_profile_enabled(),
# XXX: important to avoid not seeing underlying
# ``.update_graphics_from_flow()`` nested profiling likely
# due to the way delaying works and garbage collection of
# the profiler in the delegated method calls.
ms_threshold=6,
# ms_threshold=ms_slower_then,
)
# TODO: a faster single-loop-iterator way of doing this XD
chart = self._chart
linked = self.linkedsplits
plots = linked.subplots | {chart.name: chart}
for chart_name, chart in plots.items():
for name, flow in chart._flows.items():
if (
not flow.render
# XXX: super important to be aware of this.
# or not flow.graphics.isVisible()
):
continue
# pass in no array which will read and render from the last
# passed array (normally provided by the display loop.)
chart.update_graphics_from_flow(
name,
use_vr=True,
)
# for each overlay on this chart auto-scale the
# y-range to max-min values.
if autoscale_overlays:
overlay = chart.pi_overlay
if overlay:
for pi in overlay.overlays:
pi.vb._set_yrange(
# TODO: get the range once up front...
# bars_range=br,
)
profiler('autoscaled linked plots')
profiler(f'<{chart_name}>.update_graphics_from_flow({name})')
View File

@ -1,296 +0,0 @@
# piker: trading gear for hackers
# Copyright (C) Tyler Goodlet (in stewardship for piker0)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""
Double auction top-of-book (L1) graphics.
"""
from typing import Tuple
import pyqtgraph as pg
from PyQt5 import QtCore, QtGui
from PyQt5.QtCore import QPointF
from ._axes import YAxisLabel
from ._style import hcolor
class LevelLabel(YAxisLabel):
"""Y-axis (vertically) oriented, horizontal label that sticks to
where it's placed despite chart resizing and supports displaying
multiple fields.
TODO: replace the rectangle-text part with our new ``Label`` type.
"""
_x_margin = 0
_y_margin = 0
# adjustment "further away from" anchor point
_x_offset = 9
_y_offset = 0
# fields to be displayed in the label string
_fields = {
'level': 0,
'level_digits': 2,
}
# default label template is just a y-level with configurable precision
_fmt_str = '{level:,.{level_digits}f} '
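# e.g. with the defaults above ``{'level': 1234.5, 'level_digits': 2}``
# renders as '1,234.50 ' (the comma is later swapped for a space by
# ``set_label_str()`` below).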
def __init__(
self,
chart,
parent,
color: str = 'bracket',
orient_v: str = 'bottom',
orient_h: str = 'left',
opacity: float = 0,
# makes order line labels offset from their parent axis
# such that they don't collide with the L1/L2 lines/prices
# that are displayed on the axis
adjust_to_l1: bool = False,
**axis_label_kwargs,
) -> None:
super().__init__(
chart,
parent=parent,
use_arrow=False,
opacity=opacity,
**axis_label_kwargs
)
# TODO: this is kinda kludgy
self._hcolor: pg.Pen = None
self.color: str = color
# orientation around axis options
self._orient_v = orient_v
self._orient_h = orient_h
self._adjust_to_l1 = adjust_to_l1
self._v_shift = {
'top': -1.,
'bottom': 0.,
'middle': 1 / 2.
}[orient_v]
self._h_shift = {
'left': -1.,
'right': 0.
}[orient_h]
self.fields = self._fields.copy()
# ensure default format fields are in correct
self.set_fmt_str(self._fmt_str, self.fields)
@property
def color(self):
return self._hcolor
@color.setter
def color(self, color: str) -> None:
self._hcolor = color
self._pen = self.pen = pg.mkPen(hcolor(color))
def update_on_resize(self, vr, r):
"""Tiis is a ``.sigRangeChanged()`` handler.
"""
self.update_fields(self.fields)
def update_fields(
self,
fields: dict = None,
) -> None:
"""Update the label's text contents **and** position from
a view box coordinate datum.
"""
self.fields.update(fields)
level = self.fields['level']
# map "level" to local coords
abs_xy = self._chart.mapFromView(QPointF(0, level))
self.update_label(
abs_xy,
self.fields,
)
def update_label(
self,
abs_pos: QPointF, # scene coords
fields: dict,
) -> None:
# write contents, type specific
h, w = self.set_label_str(fields)
if self._adjust_to_l1:
self._x_offset = self._chart._max_l1_line_len
self.setPos(QPointF(
self._h_shift * (w + self._x_offset),
abs_pos.y() + self._v_shift * h
))
# XXX: definitely need this!
self.update()
def set_fmt_str(
self,
fmt_str: str,
fields: dict,
) -> (str, str):
# test that new fmt str can be rendered
self._fmt_str = fmt_str
self.set_label_str(fields)
self.fields.update(fields)
return fmt_str, self.label_str
def set_label_str(
self,
fields: dict,
):
# use space as e3 delim
self.label_str = self._fmt_str.format(**fields).replace(',', ' ')
br = self.boundingRect()
h, w = br.height(), br.width()
return h, w
def size_hint(self) -> Tuple[None, None]:
return None, None
def draw(
self,
p: QtGui.QPainter,
rect: QtCore.QRectF
) -> None:
p.setPen(self._pen)
rect = self.rect
if self._orient_v == 'bottom':
lp, rp = rect.topLeft(), rect.topRight()
# p.drawLine(rect.topLeft(), rect.topRight())
elif self._orient_v == 'top':
lp, rp = rect.bottomLeft(), rect.bottomRight()
p.drawLine(
*map(int, [
lp.x(),
lp.y(),
rp.x(),
rp.y(),
])
)
def highlight(self, pen) -> None:
self._pen = pen
self.update()
def unhighlight(self):
self._pen = self.pen
self.update()
class L1Label(LevelLabel):
text_flags = (
QtCore.Qt.TextDontClip
| QtCore.Qt.AlignLeft
)
def set_label_str(
self,
fields: dict,
) -> None:
"""Make sure the max L1 line module var is kept up to date.
"""
h, w = super().set_label_str(fields)
# Set a global "max L1 label length" so we can
# look it up on order lines and adjust their
# labels not to overlap with it.
chart = self._chart
chart._max_l1_line_len: float = max(
chart._max_l1_line_len,
w
)
return h, w
class L1Labels:
"""Level 1 bid ask labels for dynamic update on price-axis.
"""
def __init__(
self,
chart: 'ChartPlotWidget', # noqa
digits: int = 2,
size_digits: int = 3,
font_size: str = 'small',
) -> None:
self.chart = chart
raxis = chart.getAxis('right')
kwargs = {
'chart': chart,
'parent': raxis,
'opacity': 1,
'font_size': font_size,
'fg_color': chart.pen_color,
'bg_color': chart.view_color,
}
fmt_str = (
' {size:.{size_digits}f} x '
'{level:,.{level_digits}f} '
)
fields = {
'level': 0,
'level_digits': digits,
'size': 0,
'size_digits': size_digits,
}
bid = self.bid_label = L1Label(
orient_v='bottom',
**kwargs,
)
bid.set_fmt_str(fmt_str=fmt_str, fields=fields)
bid.show()
ask = self.ask_label = L1Label(
orient_v='top',
**kwargs,
)
ask.set_fmt_str(fmt_str=fmt_str, fields=fields)
ask.show()
View File

@ -1,290 +0,0 @@
# piker: trading gear for hackers
# Copyright (C) Tyler Goodlet (in stewardship for piker0)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""
Non-shitty labels that don't re-invent the wheel.
"""
from inspect import isfunction
from typing import Callable, Optional, Any
import pyqtgraph as pg
from PyQt5 import QtGui, QtWidgets
from PyQt5.QtWidgets import QLabel, QSizePolicy
from PyQt5.QtCore import QPointF, QRectF, Qt
from ._style import (
DpiAwareFont,
hcolor,
_font,
)
class Label:
'''
A plain ol' "scene label" using an underlying ``QGraphicsTextItem``.
After hacking for many days on multiple "label" systems inside
``pyqtgraph`` yet again we're left writing our own since it seems
all of those are over complicated, ad-hoc, transform-mangling,
messes which can't accomplish the simplest things via their inputs
(such as pinning to the left hand side of a view box).
Here we do the simple thing where the label uses callables to figure
out the (x, y) coordinate "pin point": nice and simple.
This type is another effort (see our graphics) to start making
small, re-usable label components that can actually be used to build
production grade UIs...
'''
def __init__(
self,
view: pg.ViewBox,
fmt_str: str,
color: str = 'default_light',
x_offset: float = 0,
font_size: str = 'small',
opacity: float = 1,
fields: dict = {},
parent: pg.GraphicsObject = None,
update_on_range_change: bool = True,
) -> None:
vb = self.vb = view
self._fmt_str = fmt_str
self._view_xy = QPointF(0, 0)
self.scene_anchor: Optional[
Callable[..., QPointF]
] = None
self._x_offset = x_offset
txt = self.txt = QtWidgets.QGraphicsTextItem(parent=parent)
txt.setCacheMode(QtWidgets.QGraphicsItem.DeviceCoordinateCache)
vb.scene().addItem(txt)
# configure font size based on DPI
dpi_font = DpiAwareFont(
font_size=font_size,
)
dpi_font.configure_to_dpi()
txt.setFont(dpi_font.font)
txt.setOpacity(opacity)
# register viewbox callbacks
if update_on_range_change:
vb.sigRangeChanged.connect(self.on_sigrange_change)
self._hcolor: str = ''
self.color = color
self.fields = fields
self.orient_v = 'bottom'
self._anchor_func = self.txt.pos().x
# not sure if this makes a diff
self.txt.setCacheMode(QtWidgets.QGraphicsItem.DeviceCoordinateCache)
# TODO: edit and selection support
# https://doc.qt.io/qt-5/qt.html#TextInteractionFlag-enum
# self.setTextInteractionFlags(QtGui.Qt.TextEditorInteraction)
@property
def color(self) -> str:
return self._hcolor
@color.setter
def color(self, color: str) -> None:
self.txt.setDefaultTextColor(pg.mkColor(hcolor(color)))
self._hcolor = color
def update(self) -> None:
'''
Update this label either by invoking its user defined anchoring
function, or by positioning to the last recorded data view
coordinates.
'''
# move label in scene coords to desired position
anchor = self.scene_anchor
if anchor:
self.txt.setPos(anchor())
else:
# position based on last computed view coordinate
self.set_view_pos(self._view_xy.y())
def on_sigrange_change(self, vr, r) -> None:
return self.update()
@property
def w(self) -> float:
return self.txt.boundingRect().width()
def scene_br(self) -> QRectF:
txt = self.txt
return txt.mapToScene(
txt.boundingRect()
).boundingRect()
@property
def h(self) -> float:
return self.txt.boundingRect().height()
def vbr(self) -> QRectF:
return self.vb.boundingRect()
def set_x_anchor_func(
self,
func: Callable,
) -> None:
assert isinstance(func(), float)
self._anchor_func = func
def set_view_pos(
self,
y: float,
x: Optional[float] = None,
) -> None:
if x is None:
scene_x = self._anchor_func() or self.txt.pos().x()
x = self.vb.mapToView(QPointF(scene_x, scene_x)).x()
# get new (inside the) view coordinates / position
self._view_xy = QPointF(x, y)
# map back to the outer UI-land "scene" coordinates
s_xy = self.vb.mapFromView(self._view_xy)
if self.orient_v == 'top':
s_xy = QPointF(s_xy.x(), s_xy.y() - self.h)
# move label in scene coords to desired position
self.txt.setPos(s_xy)
assert s_xy == self.txt.pos()
@property
def fmt_str(self) -> str:
return self._fmt_str
@fmt_str.setter
def fmt_str(self, fmt_str: str) -> None:
self._fmt_str = fmt_str
def format(
self,
**fields: dict
) -> str:
out = {}
# this is hacky support for single depth
# calcs of field data from field data
# ex. to calculate a $value = price * size
for k, v in fields.items():
if isfunction(v):
out[k] = v(fields)
else:
out[k] = v
text = self._fmt_str.format(**out)
# for large numbers with a thousands place
text = text.replace(',', ' ')
self.txt.setPlainText(text)
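# illustrative use of the single-depth field-calc support above
# (hypothetical field names):
#
#   label.fmt_str = '{size} x {price} = {value}'
#   label.format(
#       price=101.5,
#       size=2,
#       # computed from sibling fields at format-time
#       value=lambda fields: fields['price'] * fields['size'],
#   )  # -> renders '2 x 101.5 = 203.0'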
def render(self) -> None:
self.format(**self.fields)
def show(self) -> None:
self.txt.show()
self.txt.update()
def hide(self) -> None:
self.txt.hide()
def delete(self) -> None:
self.vb.scene().removeItem(self.txt)
class FormatLabel(QLabel):
'''
Kinda similar to above but using the widget apis.
'''
def __init__(
self,
fmt_str: str,
font: QtGui.QFont,
font_size: int,
font_color: str,
parent=None,
) -> None:
super().__init__(parent)
# by default set the format string verbatim and expect user to
# call ``.format()`` later (presumably they'll notice the
# unformatted content if ``fmt_str`` isn't meant to be
# unformatted).
self.fmt_str = fmt_str
self.setText(fmt_str)
self.setStyleSheet(
f"""QLabel {{
color : {hcolor(font_color)};
font-size : {font_size}px;
}}
"""
)
self.setFont(_font.font)
self.setTextFormat(Qt.MarkdownText) # markdown
self.setMargin(0)
self.setSizePolicy(
QSizePolicy.Expanding,
QSizePolicy.Expanding,
)
self.setAlignment(
Qt.AlignVCenter | Qt.AlignLeft
)
self.setText(self.fmt_str)
def format(
self,
**fields: dict[str, Any],
) -> str:
out = self.fmt_str.format(**fields)
self.setText(out)
return out
View File

@ -1,803 +0,0 @@
# piker: trading gear for hackers
# Copyright (C) Tyler Goodlet (in stewardship for piker0)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""
Lines for orders, alerts, L2.
"""
from functools import partial
from math import floor
from typing import Optional, Callable
import pyqtgraph as pg
from pyqtgraph import Point, functions as fn
from PyQt5 import QtCore, QtGui, QtWidgets
from PyQt5.QtCore import QPointF
from ._annotate import qgo_draw_markers, LevelMarker
from ._anchors import (
vbr_left,
right_axis,
gpath_pin,
)
from ..calc import humanize
from ._label import Label
from ._style import hcolor, _font
# TODO: probably worth investigating if we can
# make .boundingRect() faster:
# https://stackoverflow.com/questions/26156486/determine-bounding-rect-of-line-in-qt
class LevelLine(pg.InfiniteLine):
def __init__(
self,
chart: 'ChartPlotWidget', # type: ignore # noqa
# style
color: str = 'default',
highlight_color: str = 'default_light',
dotted: bool = False,
# UX look and feel opts
always_show_labels: bool = False,
highlight_on_hover: bool = True,
hide_xhair_on_hover: bool = True,
only_show_markers_on_hover: bool = True,
use_marker_margin: bool = False,
movable: bool = True,
) -> None:
# TODO: at this point it's probably not worth the inheritance
# any more since we've reimplemented ``.paint()`` among other
# things..
super().__init__(
movable=movable,
angle=0,
# don't use the shitty ``InfLineLabel``
label=None,
)
self._chart = chart
self.highlight_on_hover = highlight_on_hover
self._dotted = dotted
self._hide_xhair_on_hover = hide_xhair_on_hover
# callback that can be assigned by user code
# to get updates from each level change
self._on_level_change: Callable[[float], None] = lambda y: None
self._marker = None
self.only_show_markers_on_hover = only_show_markers_on_hover
self.show_markers: bool = True # presuming the line is hovered at init
# should line go all the way to far end or leave a "margin"
# space for other graphics (eg. L1 book)
self.use_marker_margin: bool = use_marker_margin
if dotted:
self._style = QtCore.Qt.DashLine
else:
self._style = QtCore.Qt.SolidLine
self._hcolor: str = None
# the float y-value in the view coords
self.level: float = 0
# list of labels anchored at one of the 2 line endpoints
# inside the viewbox
self._labels: list[Label] = []
self._markers: list[(int, Label)] = []
# whenever this line is moved trigger label updates
self.sigPositionChanged.connect(self.on_pos_change)
# sets color to value triggering pen creation
self._hl_color = highlight_color
self.color = color
# TODO: for when we want to move groups of lines?
self._track_cursor: bool = False
self.always_show_labels = always_show_labels
self._on_drag_start = lambda l: None
self._on_drag_end = lambda l: None
self._y_incr_mult = 1 / chart.linked.symbol.tick_size
self._right_end_sc: float = 0
def txt_offsets(self) -> tuple[int, int]:
return 0, 0
@property
def color(self):
return self._hcolor
@color.setter
def color(self, color: str) -> None:
# set pens to new color
self._hcolor = color
pen = pg.mkPen(hcolor(color))
hoverpen = pg.mkPen(hcolor(self._hl_color))
pen.setStyle(self._style)
hoverpen.setStyle(self._style)
# set regular pen
self.setPen(pen)
# use slightly thicker highlight for hover pen
hoverpen.setWidth(2)
self.hoverPen = hoverpen
def on_pos_change(
self,
line: 'LevelLine', # noqa
) -> None:
"""Position changed handler.
"""
level = self.value()
self.update_labels({'level': level})
self.set_level(level, called_from_on_pos_change=True)
def update_labels(
self,
fields_data: dict,
) -> None:
for label in self._labels:
label.color = self.color
label.fields.update(fields_data)
label.render()
level = fields_data.get('level')
if level:
label.set_view_pos(y=level)
self.update()
def hide_labels(self) -> None:
for label in self._labels:
label.hide()
def show_labels(self) -> None:
for label in self._labels:
label.show()
def set_level(
self,
level: float,
called_from_on_pos_change: bool = False,
) -> None:
if not called_from_on_pos_change:
last = self.value()
# if the position hasn't changed then ``.update_labels()``
# will not be called by a non-triggered `.on_pos_change()`,
# so we need to call it manually to avoid mismatching
# label-to-line color when the line is updated but not
# from a "moved" event.
if level == last:
self.update_labels({'level': level})
self.setPos(level)
self.level = self.value()
self.update()
# invoke any user code
self._on_level_change(level)
def on_tracked_source(
self,
x: int,
y: float
) -> None:
'''Chart coordinates cursor tracking callback.
This is called by our ``Cursor`` type once this line is set to
track the cursor: for every movement this callback is invoked to
reposition the line with the current view coordinates.
'''
self.movable = True
self.set_level(y) # implicitly calls reposition handler
def mouseDragEvent(self, ev):
"""Override the ``InfiniteLine`` handler since we need more
detailed control and start end signalling.
"""
cursor = self._chart.linked.cursor
# hide y-crosshair
cursor.hide_xhair()
# highlight
self.currentPen = self.hoverPen
self.show_labels()
# XXX: normal tracking behavior pulled out from parent type
if self.movable and ev.button() == QtCore.Qt.LeftButton:
ev.accept()
if ev.isStart():
self.moving = True
down_pos = ev.buttonDownPos()
self.cursorOffset = self.pos() - self.mapToParent(down_pos)
self.startPosition = self.pos()
self._on_drag_start(self)
if not self.moving:
return
pos = self.cursorOffset + self.mapToParent(ev.pos())
# TODO: we should probably figure out a std api
# for this kind of thing given we already have
# it on the cursor system...
# round to nearest symbol tick
m = self._y_incr_mult
self.setPos(
QPointF(
self.pos().x(), # don't allow shifting horizontally
round(pos.y() * m) / m
)
)
self.sigDragged.emit(self)
if ev.isFinish():
self.moving = False
self.sigPositionChangeFinished.emit(self)
self._on_drag_end(self)
# This is the final position in the drag
if ev.isFinish():
# show y-crosshair again
cursor.show_xhair()
def delete(self) -> None:
"""Remove this line from containing chart/view/scene.
"""
scene = self.scene()
if scene:
for label in self._labels:
label.delete()
# gc managed labels?
self._labels.clear()
if self._marker:
self.scene().removeItem(self._marker)
# remove from chart/cursor states
chart = self._chart
cur = chart.linked.cursor
if self in cur._hovered:
cur._hovered.remove(self)
chart.plotItem.removeItem(self)
def mouseDoubleClickEvent(
self,
ev: QtGui.QMouseEvent,
) -> None:
# TODO: enter labels edit mode
print(f'double click {ev}')
def paint(
self,
p: QtGui.QPainter,
opt: QtWidgets.QStyleOptionGraphicsItem,
w: QtWidgets.QWidget
) -> None:
'''
Core paint which we override (yet again)
from pg..
'''
p.setRenderHint(p.Antialiasing)
# these are in viewbox coords
vb_left, vb_right = self._endPoints
vb = self.getViewBox()
line_end, marker_right, r_axis_x = self._chart.marker_right_points()
if self.show_markers and self.markers:
p.setPen(self.pen)
qgo_draw_markers(
self.markers,
self.pen.color(),
p,
vb_left,
vb_right,
marker_right,
)
# marker_size = self.markers[0][2]
self._maxMarkerSize = max([m[2] / 2. for m in self.markers])
# this seems slower when moving around
# order lines.. not sure wtf is up with that.
# for now we're just using it on the position line.
elif self._marker:
# TODO: make this label update part of a scene-aware-marker
# composed annotation
self._marker.setPos(
QPointF(marker_right, self.scene_y())
)
if hasattr(self._marker, 'label'):
self._marker.label.update()
elif not self.use_marker_margin:
# basically means **don't** shorten the line with normally
# reserved space for a direction marker but, leave small
# blank gap for style
line_end = r_axis_x - 10
line_end_view = vb.mapToView(Point(line_end, 0)).x()
# self.currentPen.setJoinStyle(QtCore.Qt.MiterJoin)
p.setPen(self.currentPen)
p.drawLine(
Point(vb_left, 0),
Point(line_end_view, 0)
)
self._right_end_sc = line_end
def hide(self) -> None:
super().hide()
if self._marker:
self._marker.hide()
# needed for ``order_line()`` lines currently
self._marker.label.hide()
def show(self) -> None:
super().show()
if self._marker:
self._marker.show()
# self._marker.label.show()
def scene_y(self) -> float:
return self.getViewBox().mapFromView(
Point(0, self.value())
).y()
def scene_endpoint(self) -> QPointF:
if not self._right_end_sc:
line_end, _, _ = self._chart.marker_right_points()
self._right_end_sc = line_end - 10
return QPointF(self._right_end_sc, self.scene_y())
def add_marker(
self,
path: QtWidgets.QGraphicsPathItem,
) -> QtWidgets.QGraphicsPathItem:
self._marker = path
self._marker.setPen(self.currentPen)
self._marker.setBrush(fn.mkBrush(self.currentPen.color()))
# add path to scene
self.getViewBox().scene().addItem(path)
# place to just-left of L1 labels
rsc = self._chart.pre_l1_xs()[0]
path.setPos(QPointF(rsc, self.scene_y()))
return path
def hoverEvent(self, ev):
'''
Mouse hover callback.
'''
cur = self._chart.linked.cursor
# hovered
if (not ev.isExit()) and ev.acceptDrags(QtCore.Qt.LeftButton):
# if already hovered we don't need to run again
if self.mouseHovering is True:
return
if self.only_show_markers_on_hover:
self.show_markers = True
if self._marker:
self._marker.show()
# highlight if so configured
if self.highlight_on_hover:
self.currentPen = self.hoverPen
if self not in cur._trackers:
# only disable cursor y-label updates
# if we're highlighting a line
cur._y_label_update = False
# add us to cursor state
cur.add_hovered(self)
if self._hide_xhair_on_hover:
cur.hide_xhair(
# set y-label to current value
y_label_level=self.value(),
just_vertical=True,
# fg_color=self._hcolor,
# bg_color=self._hcolor,
)
# if we want highlighting of labels
# it should be delegated into this method
self.show_labels()
self.mouseHovering = True
# un-hovered
else:
if self.mouseHovering is False:
return
cur._y_label_update = True
self.currentPen = self.pen
cur._hovered.remove(self)
if self.only_show_markers_on_hover:
self.show_markers = False
if self._marker:
self._marker.hide()
self._marker.label.hide()
if self not in cur._trackers:
cur.show_xhair(y_label_level=self.value())
if not self.always_show_labels:
self.hide_labels()
self.mouseHovering = False
self.update()
def level_line(
chart: 'ChartPlotWidget', # noqa
level: float,
# line style
dotted: bool = False,
color: str = 'default',
# ux
highlight_on_hover: bool = True,
# label fields and options
always_show_labels: bool = False,
add_label: bool = True,
orient_v: str = 'bottom',
**kwargs,
) -> LevelLine:
"""Convenience routine to add a styled horizontal line to a plot.
"""
hl_color = color + '_light' if highlight_on_hover else color
line = LevelLine(
chart,
color=color,
# lookup "highlight" equivalent
highlight_color=hl_color,
dotted=dotted,
# UX related options
highlight_on_hover=highlight_on_hover,
# when set to True the label is always shown instead of just on
# highlight (which is a privacy thing for orders)
always_show_labels=always_show_labels,
**kwargs,
)
chart.plotItem.addItem(line)
if add_label:
label = Label(
view=line.getViewBox(),
# by default we only display the line's level value
# in the label
fmt_str=('{level:,.{level_digits}f}'),
color=color,
)
# anchor to right side (of view ) label
label.set_x_anchor_func(
right_axis(
chart,
label,
side='left', # side of axis
offset=0,
avoid_book=False,
)
)
# add to label set which will be updated on level changes
line._labels.append(label)
label.orient_v = orient_v
line.update_labels({'level': level, 'level_digits': 2})
label.render()
# keep pp label details private until
# the user edge triggers "order mode"
line.hide_labels()
# activate/draw label
line.set_level(level)
return line
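# a minimal usage sketch (``chart`` instance assumed in scope):
#
#   line = level_line(
#       chart,
#       level=4200.0,
#       dotted=True,
#       # show the y-level label permanently, not just on hover
#       always_show_labels=True,
#   )
#   line.set_level(4250.0)  # reposition + re-render labels
#   line.delete()           # remove from scene when done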
def order_line(
chart,
level: float,
action: Optional[str] = 'buy', # buy or sell
marker_style: Optional[str] = None,
level_digits: Optional[float] = 3,
size: Optional[int] = 1,
size_digits: int = 1,
show_markers: bool = False,
submit_price: float = None,
orient_v: str = 'bottom',
**line_kwargs,
) -> LevelLine:
'''
Convenience routine to add a line graphic representing an order
execution submitted to the EMS via the chart's "order mode".
'''
line = level_line(
chart,
level,
add_label=False,
use_marker_margin=True,
**line_kwargs
)
font_size = _font.font.pixelSize()
# scale marker size with dpi-aware font size
marker_size = floor(1.375 * font_size)
orient_v = 'top' if action == 'sell' else 'bottom'
if action == 'alert':
label = Label(
view=line.getViewBox(),
color=line.color,
# completely different labelling for alerts
fmt_str='alert => {level}',
)
# for now, we're just duplicating the label contents i guess..
line._labels.append(label)
# anchor to left side of view / line
label.set_x_anchor_func(vbr_left(label))
label.fields = {
'level': level,
'level_digits': level_digits,
}
marker_size = marker_size * 0.666
else:
view = line.getViewBox()
# far-side label
label = Label(
view=view,
# display the order pos size, which is some multiple
# of the user defined base unit size
fmt_str=(
'{account_text}{size:.{size_digits}f}u{fiat_text}'
),
color=line.color,
)
label.set_x_anchor_func(vbr_left(label))
line._labels.append(label)
def maybe_show_fiat_text(fields: dict) -> str:
fiat_size = fields.get('fiat_size')
if not fiat_size:
return ''
return f' ~ ${humanize(fiat_size)}'
def maybe_show_account_name(fields: dict) -> str:
account = fields.get('account')
if not account:
return ''
return f'{account}: '
label.fields = {
'size': size,
'size_digits': 0,
'fiat_size': None,
'fiat_text': maybe_show_fiat_text,
'account': None,
'account_text': maybe_show_account_name,
}
label.orient_v = orient_v
label.render()
label.show()
if show_markers:
# add arrow marker on end of line nearest y-axis
marker_style = marker_style or {
'buy': '|<',
'sell': '>|',
'alert': 'v',
}[action]
# the old way which is still somehow faster?
marker = LevelMarker(
chart=chart,
style=marker_style,
get_level=line.value,
size=marker_size,
keep_in_view=False,
)
# XXX: this is our new approach but seems slower?
marker = line.add_marker(marker)
# XXX: DON'T COMMENT THIS!
# this fixes the artifact issue! .. of course, bounding rect stuff
line._maxMarkerSize = marker_size
assert line._marker is marker
assert not line.markers
# above we use ``QPathGraphicsItem``s directly to draw markers
# in scene coords instead of the way ``InfiniteLine`` does it
# internally: by resetting the graphics item transform
# intermittently inside ``.paint()`` which we've copied and
# separated as ``qgo_draw_markers()`` if we ever want to go back
# to it; likely we can remove this.
# manually append for later ``InfiniteLine.paint()`` drawing
# XXX: this was manually tested as faster than using the
# QGraphicsItem around a painter path.. probably needs further
# testing to figure out why tf that's true.
# line.markers.append((marker, 0, marker_size))
if action != 'alert':
# add a partial position label if we also added a level marker
pp_size_label = Label(
view=view,
color=line.color,
# this is "static" label
# update_on_range_change=False,
fmt_str='\n'.join((
'{slots_used:.1f}x',
)),
fields={
'slots_used': 0,
},
)
pp_size_label.render()
pp_size_label.show()
line._labels.append(pp_size_label)
# TODO: pretty sure one of the reasons these "label
# updates" are a bit "jittery" is because we aren't
# leveraging the "scene coordinates hierarchy" stuff:
# i.e. using some parent object as the coord "origin"
# which i presume would result in better pixel caching
# results? def something to dig into..
pp_size_label.scene_anchor = partial(
gpath_pin,
gpath=marker,
label=pp_size_label,
)
# XXX: without this the pp proportion label next the marker
# seems to lag? this is the same issue we had with position
# lines which we handle with ``.update_graphcis()``.
# marker._on_paint=lambda marker: pp_size_label.update()
marker._on_paint = lambda marker: pp_size_label.update()
marker.label = label
# sanity check
line.update_labels({'level': level})
return line
# TODO: should probably consider making this a more general
# purpose class method on the type?
def copy_from_order_line(
chart: 'ChartPlotWidget', # noqa
line: LevelLine
) -> LevelLine:
return order_line(
chart,
# label fields default values
level=line.value(),
marker_style=line._marker.style,
# LevelLine kwargs
color=line.color,
dotted=line._dotted,
show_markers=line.show_markers,
only_show_markers_on_hover=line.only_show_markers_on_hover,
)
View File

@ -1,250 +0,0 @@
# piker: trading gear for hackers
# Copyright (C) 2018-present Tyler Goodlet (in stewardship of piker0)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""
Super fast OHLC sampling graphics types.
"""
from __future__ import annotations
from typing import (
Optional,
TYPE_CHECKING,
)
import numpy as np
import pyqtgraph as pg
from PyQt5 import QtCore, QtGui, QtWidgets
from PyQt5.QtCore import QLineF, QPointF
from PyQt5.QtGui import QPainterPath
from .._profile import pg_profile_enabled, ms_slower_then
from ._style import hcolor
from ..log import get_logger
if TYPE_CHECKING:
from ._chart import LinkedSplits
log = get_logger(__name__)
def bar_from_ohlc_row(
row: np.ndarray,
# 0.5 is no overlap between arms, 1.0 is full overlap
w: float = 0.43
) -> tuple[QLineF]:
'''
Generate the minimal ``QLineF`` lines to construct a single
OHLC "bar" for use in the "last datum" of a series.
'''
open, high, low, close, index = row[
['open', 'high', 'low', 'close', 'index']]
# TODO: maybe consider using `QGraphicsLineItem` ??
# gives us a ``.boundingRect()`` on the objects which may make
# computing the composite bounding rect of the last bars + the
# history path faster since it's done in C++:
# https://doc.qt.io/qt-5/qgraphicslineitem.html
# high -> low vertical (body) line
if low != high:
hl = QLineF(index, low, index, high)
else:
# XXX: if we don't do this it renders a weird rectangle?
# see below for filtering this later...
hl = None
# NOTE: place the x-coord start as "middle" of the drawing range such
# that the open arm line-graphic is at the left-most-side of
# the index's range according to the view mapping coordinates.
# open line
o = QLineF(index - w, open, index, open)
# close line
c = QLineF(index, close, index + w, close)
return [hl, o, c]
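# worked example: for a structured row with
#   index=100, open=10, high=12, low=9, close=11
# and the default ``w = 0.43`` this returns,
#   hl: QLineF(100, 9, 100, 12)      # vertical high/low body
#   o:  QLineF(99.57, 10, 100, 10)   # left "open" arm
#   c:  QLineF(100, 11, 100.43, 11)  # right "close" arm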
class BarItems(pg.GraphicsObject):
'''
"Price range" bars graphics rendered from a OHLC sampled sequence.
'''
def __init__(
self,
linked: LinkedSplits,
plotitem: 'pg.PlotItem', # noqa
pen_color: str = 'bracket',
last_bar_color: str = 'bracket',
name: Optional[str] = None,
) -> None:
super().__init__()
self.linked = linked
# XXX: for the mega-lulz increasing width here increases draw
# latency... so probably don't do it until we figure that out.
self._color = pen_color
self.bars_pen = pg.mkPen(hcolor(pen_color), width=1)
self.last_bar_pen = pg.mkPen(hcolor(last_bar_color), width=2)
self._name = name
self.setCacheMode(QtWidgets.QGraphicsItem.DeviceCoordinateCache)
self.path = QPainterPath()
self._last_bar_lines: Optional[tuple[QLineF, ...]] = None
def x_uppx(self) -> int:
# we expect the downsample curve to report this.
return 0
def boundingRect(self):
# Qt docs: https://doc.qt.io/qt-5/qgraphicsitem.html#boundingRect
# TODO: Can we do rect caching to make this faster
# like `pg.PlotCurveItem` does? In theory it's just
# computing max/min stuff again like we do in the update loop
# anyway. Not really sure it's necessary since profiling already
# shows this method is faf.
# boundingRect _must_ indicate the entire area that will be
# drawn on or else we will get artifacts and possibly crashing.
# (in this case, QPicture does all the work of computing the
# bounding rect for us).
# apparently this a lot faster says the docs?
# https://doc.qt.io/qt-5/qpainterpath.html#controlPointRect
hb = self.path.controlPointRect()
hb_tl, hb_br = (
hb.topLeft(),
hb.bottomRight(),
)
# need to include last bar height or BR will be off
mx_y = hb_br.y()
mn_y = hb_tl.y()
last_lines = self._last_bar_lines
if last_lines:
body_line = self._last_bar_lines[0]
if body_line:
mx_y = max(mx_y, max(body_line.y1(), body_line.y2()))
mn_y = min(mn_y, min(body_line.y1(), body_line.y2()))
return QtCore.QRectF(
# top left
QPointF(
hb_tl.x(),
mn_y,
),
# bottom right
QPointF(
hb_br.x() + 1,
mx_y,
)
)
def paint(
self,
p: QtGui.QPainter,
opt: QtWidgets.QStyleOptionGraphicsItem,
w: QtWidgets.QWidget
) -> None:
profiler = pg.debug.Profiler(
disabled=not pg_profile_enabled(),
ms_threshold=ms_slower_then,
)
# p.setCompositionMode(0)
# TODO: one thing we could try here is pictures being drawn of
# a fixed count of bars such that based on the viewbox indices we
# only draw the "rounded up" number of "pictures worth" of bars
# as is necessary for what's in "view". Not sure if this will
# lead to any perf gains other than when zoomed in to fewer bars
# in view.
p.setPen(self.last_bar_pen)
if self._last_bar_lines:
p.drawLines(*tuple(filter(bool, self._last_bar_lines)))
profiler('draw last bar')
p.setPen(self.bars_pen)
p.drawPath(self.path)
profiler(f'draw history path: {self.path.capacity()}')
def draw_last_datum(
self,
path: QPainterPath,
src_data: np.ndarray,
render_data: np.ndarray,
reset: bool,
array_key: str,
fields: list[str] = [
'index',
'open',
'high',
'low',
'close',
],
) -> None:
# relevant fields
ohlc = src_data[fields]
# individual values from the last row
last_row = i, o, h, l, last = ohlc[-1]
# generate new lines objects for updatable "current bar"
self._last_bar_lines = bar_from_ohlc_row(last_row)
# assert i == graphics.start_index - 1
# assert i == last_index
body, larm, rarm = self._last_bar_lines
# XXX: is there a faster way to modify this?
rarm.setLine(rarm.x1(), last, rarm.x2(), last)
# writer is responsible for changing open on "first" volume of bar
larm.setLine(larm.x1(), o, larm.x2(), o)
if l != h: # noqa
if body is None:
body = self._last_bar_lines[0] = QLineF(i, l, i, h)
else:
# update body
body.setLine(i, l, i, h)
# XXX: pretty sure this is causing an issue where the
# bar has a large upward move right before the next
# sample and the body is getting set to None since the
# next bar is flat but the shm array index update wasn't
# read by the time this code runs. Iow we're doing this
# removal of the body for a bar index that is now out of
# date / from some previous sample. It's weird though
# because i've seen it do this to bars i - 3 back?
return ohlc['index'], ohlc['close']
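# A rough usage sketch (assumed call pattern; the full
# ``bar_from_ohlc_row()`` signature is truncated above): the 3
# ``QLineF`` segments it returns are unpacked in ``draw_last_datum()``
# as,
#
#   body, larm, rarm = bar_from_ohlc_row(last_row)
#   # body: vertical low -> high segment at x = index
#   # larm: left "open" arm spanning [index - w, index] at y = open
#   # rarm: right "close" arm spanning [index, index + w] at y = close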

View File

@ -1,129 +0,0 @@
# piker: trading gear for hackers
# Copyright (C) Tyler Goodlet
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""
micro-ORM for coupling ``pydantic`` models with Qt input/output widgets.
"""
from __future__ import annotations
from typing import (
Optional, Generic,
TypeVar, Callable,
Literal,
)
import enum
import sys
from pydantic import BaseModel, validator
from pydantic.generics import GenericModel
from PyQt5.QtWidgets import (
QWidget,
QComboBox,
)
from ._forms import (
# FontScaledDelegate,
Edit,
)
DataType = TypeVar('DataType')
class Field(GenericModel, Generic[DataType]):
widget_factory: Optional[
Callable[
[QWidget, 'Field'],
QWidget
]
]
value: Optional[DataType] = None
class Selection(Field[DataType], Generic[DataType]):
'''Model which maps to a finite set of drop down entries declared as
a ``dict[str, DataType]``.
'''
widget_factory = QComboBox
options: dict[str, DataType]
# value: DataType = None
@validator('value') # , always=True)
def set_value_first(
cls,
v: DataType,
values: dict[str, DataType],
) -> DataType:
'''If no initial value is set, use the first in
the ``options`` dict.
'''
# breakpoint()
options = values['options']
if v is None:
return next(iter(options.values()))
else:
assert v in options, f'{v} is not in {options}'
return v
# class SizeUnit(Enum):
# currency = '$ size'
# percent_of_port = '% of port'
# shares = '# shares'
# class Weighter(str, Enum):
# uniform = 'uniform'
class Edit(Field[DataType], Generic[DataType]):
'''An edit field which takes a number.
'''
# NOTE: this resolves to the *imported* ``._forms.Edit`` widget since
# the class name isn't bound until after this body executes.
widget_factory = Edit
class AllocatorPane(BaseModel):
account = Selection[str](
options=dict.fromkeys(
['paper', 'ib.paper', 'ib.margin'],
'paper',
),
)
allocate = Selection[str](
# options=list(Size),
options={
'$ size': 'currency',
'% of port': 'percent_of_port',
'# shares': 'shares',
},
# TODO: save/load from config and/or last session
# value='currency'
)
weight = Selection[str](
options={
'uniform': 'uniform',
},
# value='uniform',
)
size = Edit[float](value=1000)
slots = Edit[int](value=4)
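# A consumption sketch (hypothetical glue code, not part of this
# module): walk the pane's fields and let each ``Field``'s
# ``widget_factory`` decide which Qt input widget to build:
#
#   pane = AllocatorPane()
#   for name, field in pane:  # pydantic models iterate (name, value)
#       if isinstance(field, Selection):
#           combo = QComboBox()
#           combo.addItems(list(field.options))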

View File

@ -1,648 +0,0 @@
# piker: trading gear for hackers
# Copyright (C) Tyler Goodlet (in stewardship for pikers)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
'''
Charting overlay helpers.
'''
from typing import Callable, Optional
from pyqtgraph.Qt.QtCore import (
# QObject,
# Signal,
Qt,
# QEvent,
)
from pyqtgraph.graphicsItems.AxisItem import AxisItem
from pyqtgraph.graphicsItems.ViewBox import ViewBox
from pyqtgraph.graphicsItems.GraphicsWidget import GraphicsWidget
from pyqtgraph.graphicsItems.PlotItem.PlotItem import PlotItem
from pyqtgraph.Qt.QtCore import QObject, Signal, QEvent
from pyqtgraph.Qt.QtWidgets import QGraphicsGridLayout, QGraphicsLinearLayout
from ._interaction import ChartView
__all__ = ["PlotItemOverlay"]
# Define the layout "position" indices as to be passed
# to a ``QtWidgets.QGraphicsGridlayout.addItem()`` call:
# https://doc.qt.io/qt-5/qgraphicsgridlayout.html#addItem
# This was pulled from the internals of ``PlotItem.setAxisItem()``.
_axes_layout_indices: dict[str, tuple[int, int]] = {
# row incremented axes
'top': (1, 1),
'bottom': (3, 1),
# view is @ (2, 1)
# column incremented axes
'left': (2, 0),
'right': (2, 2),
}
# NOTE: To clarify this indexing, ``PlotItem.__init__()`` makes a grid
# with dimensions 4x3 and puts the ``ViewBox`` at position (2, 1) (aka
# row=2, col=1) in the grid layout since row (0, 1) is reserved for
# a title label and row 1 is for any potential "top" axis. Column 1
# is the "middle" (since 3 columns) and is where the plot/vb is placed.
class ComposedGridLayout:
'''
List-like interface to managing a sequence of overlayed
``PlotItem``s in the form:
| | | | | top0 | | | | |
| | | | | top1 | | | | |
| | | | | ... | | | | |
| | | | | topN | | | | |
| lN | ... | l1 | l0 | ViewBox | r0 | r1 | ... | rN |
| | | | | bottom0 | | | | |
| | | | | bottom1 | | | | |
| | | | | ... | | | | |
| | | | | bottomN | | | | |
Where the index ``i`` in the sequence specifies the index
``<axis_name>i`` in the layout.
The ``item: PlotItem`` passed to the constructor's grid layout is
used verbatim as the "main plot" whose view box is given precedence
for input handling. The main plot's axes are removed from its
layout and placed in the surrounding exterior layouts to allow for
re-ordering if desired.
'''
def __init__(
self,
item: PlotItem,
grid: QGraphicsGridLayout,
reverse: bool = False, # insert items to the "center"
) -> None:
self.items: list[PlotItem] = []
# self.grid = grid
self.reverse = reverse
# TODO: use a ``bidict`` here?
self._pi2axes: dict[
int,
dict[str, AxisItem],
] = {}
# TODO: better name?
# construct surrounding layouts for placing outer axes and
# their legends and title labels.
self.sides: dict[
str,
tuple[QGraphicsLinearLayout, list[AxisItem]]
] = {}
for name, pos in _axes_layout_indices.items():
layout = QGraphicsLinearLayout()
self.sides[name] = (layout, [])
layout.setContentsMargins(0, 0, 0, 0)
layout.setSpacing(0)
if name in ('top', 'bottom'):
orient = Qt.Vertical
elif name in ('left', 'right'):
orient = Qt.Horizontal
layout.setOrientation(orient)
self.insert(0, item)
# insert surrounding linear layouts into the parent pi's layout
# such that additional axes can be appended arbitrarily without
# having to expand or resize the parent's grid layout.
for name, (linlayout, axes) in self.sides.items():
# TODO: do we need this?
# axis should have been removed during insert above
index = _axes_layout_indices[name]
axis = item.layout.itemAt(*index)
if axis and axis.isVisible():
assert linlayout.itemAt(0) is axis
# item.layout.removeItem(axis)
item.layout.addItem(linlayout, *index)
layout = item.layout.itemAt(*index)
assert layout is linlayout
def _register_item(
self,
index: int,
plotitem: PlotItem,
) -> None:
for name, axis_info in plotitem.axes.items():
axis = axis_info['item']
# register this plot's (maybe re-placed) axes for lookup.
# print(f'inserting {name}:{axis} to index {index}')
self._pi2axes.setdefault(name, {})[index] = axis
# enter plot into list for index tracking
self.items.insert(index, plotitem)
def insert(
self,
index: int,
plotitem: PlotItem,
) -> int:
'''
Place item at index by inserting all axes into the grid
at list-order appropriate position.
'''
if index < 0:
raise ValueError('`insert()` only supports an index >= 0')
# add plot's axes in sequence to the embedded linear layouts
# for each "side" thus avoiding graphics collisions.
for name, axis_info in plotitem.axes.copy().items():
linlayout, axes = self.sides[name]
axis = axis_info['item']
if axis in axes:
# TODO: re-order using ``.pop()`` ?
raise ValueError(f'{axis} is already in {name} layout!?')
# linking sanity
axis_view = axis.linkedView()
assert axis_view is plotitem.vb
if (
not axis.isVisible()
# XXX: we never skip moving the axes for the *first*
# plotitem inserted (even if not shown) since we need to
# move all the hidden axes into linear sub-layouts for
# that "central" plot in the overlay. Also if we don't
# do it there's weird geometry calc offsets that make
# view coords slightly off somehow .. smh
and not len(self.items) == 0
):
continue
# XXX: Remove old axis? No, turns out we don't need this?
# DON'T unlink it since we want the original ``ViewBox``
# to still drive it B)
# popped = plotitem.removeAxis(name, unlink=False)
# assert axis is popped
# invert insert index for layouts which are
# not-left-to-right, top-to-bottom insert oriented
insert_index = index
if name in ('top', 'left'):
# invert so new axes stack "outside" of previous entries
insert_index = max(len(axes) - index, 0)
assert insert_index >= 0
linlayout.insertItem(insert_index, axis)
axes.insert(index, axis)
self._register_item(index, plotitem)
return index
def append(
self,
item: PlotItem,
) -> int:
'''
Append item's axes at indexes which put its axes "outside"
previously overlayed entries.
'''
# for left and bottom axes we have to first remove
# items and re-insert to maintain a list-order.
return self.insert(len(self.items), item)
def get_axis(
self,
plot: PlotItem,
name: str,
) -> Optional[AxisItem]:
'''
Retrieve the named axis for overlayed ``plot`` or ``None``
if axis for that name is not shown.
'''
index = self.items.index(plot)
named = self._pi2axes[name]
return named.get(index)
def pop(
self,
item: PlotItem,
) -> PlotItem:
'''
Remove item and restack all axes in list-order.
'''
raise NotImplementedError
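# A small usage sketch (assumed, mirroring how ``PlotItemOverlay``
# below drives this layout):
#
#   layout = ComposedGridLayout(root_pi, root_pi.layout)
#   layout.append(overlay_pi)  # axes stacked "outside" prior entries
#   axis = layout.get_axis(overlay_pi, 'right')  # None if not shown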
# Unimplemented features TODO:
# - 'A' (autobtn) should relay to all views
# - context menu single handler + relay?
# - layout unwind and re-pack for 'left' and 'top' axes
# - add labels to layout if detected in source ``PlotItem``
# UX nice-to-have TODO:
# - optional "focussed" view box support for view boxes
# that have custom input handlers (eg. you might want to
# scale the view to some "focussed" data view and have overlayed
# viewboxes only respond to relayed events.)
# - figure out how to deal with menu raise events for multi-viewboxes.
# (we might want to add a different menu which specs the name of the
# view box currently being handled?)
# - allow selection of a particular view box by interacting with its
# axis?
# TODO: we might want to enable some kind of manual flag to disable
# this method wrapping during type creation? As an example a user could
# definitively decide **not** to enable broadcasting support by
# setting something like ``ViewBox.disable_relays = True``?
def mk_relay_method(
signame: str,
slot: Callable[
[ViewBox,
'QEvent',
Optional[AxisItem]],
None,
],
) -> Callable[
[
ViewBox,
# lol, there isn't really a generic type thanks
# to the rewrite of Qt's event system XD
'QEvent',
'Optional[AxisItem]',
'Optional[ViewBox]', # the ``relayed_from`` arg we provide
],
None,
]:
def maybe_broadcast(
vb: 'ViewBox',
ev: 'QEvent',
axis: 'Optional[int]' = None,
relayed_from: 'ViewBox' = None,
) -> None:
'''
(soon to be) Decorator which makes an event handler
"broadcastable" to overlayed ``GraphicsWidget``s.
Adds relay signals based on the decorated handler's name
and conducts a signal broadcast of the relay signal if there
are consumers registered.
'''
# When no relay source has been set just bypass all
# the broadcast machinery.
if vb.event_relay_source is None:
ev.accept()
return slot(
vb,
ev,
axis=axis,
)
if relayed_from:
assert axis is None
# this is a relayed event and should be ignored (so it does not
# halt/short circuit the graphics scene loop). Further the
# surrounding handler for this signal must be allowed to execute
# and get processed by **this consumer**.
# print(f'{vb.name} rx relayed from {relayed_from.name}')
ev.ignore()
return slot(
vb,
ev,
axis=axis,
)
if axis is not None:
# print(f'{vb.name} handling axis event:\n{str(ev)}')
ev.accept()
return slot(
vb,
ev,
axis=axis,
)
elif (
relayed_from is None
and vb.event_relay_source is vb # we are the broadcaster
and axis is None
):
# Broadcast case: this is a source event which will be
# relayed to attached consumers and accepted after all
# consumers complete their own handling followed by this
# routine's processing. Sequence is,
# - pre-relay to all consumers *first* - ``.emit()`` blocks
# until all downstream relay handlers have run.
# - run the source handler for **this** event and accept
# the event
# Access the "bound signal" that is created
# on the widget type as part of instantiation.
signal = getattr(vb, signame)
# print(f'{vb.name} emitting {signame}')
# TODO/NOTE: we could also just bypass a "relay" signal
# entirely and instead call the handlers manually in
# a loop? This probably is a lot simpler and also doesn't
# have any downside, and allows not touching target widget
# internals.
signal.emit(
ev,
axis,
# passing this demarks a broadcasted/relayed event
vb,
)
# accept event so no more relays are fired.
ev.accept()
# call underlying wrapped method with an extra
# ``relayed_from`` value to denote that this is a relayed
# event handling case.
return slot(
vb,
ev,
axis=axis,
)
return maybe_broadcast
# XXX: :( can't define signals **after** class compile time
# so this is not really useful.
# def mk_relay_signal(
# func,
# name: str = None,
# ) -> Signal:
# (
# args,
# varargs,
# varkw,
# defaults,
# kwonlyargs,
# kwonlydefaults,
# annotations
# ) = inspect.getfullargspec(func)
# # XXX: generate a relay signal with 1 extra
# # argument for a ``relayed_from`` kwarg. Since
# # ``'self'`` is already ignored by signals we just need
# # to count the arguments since we're adding only 1 (and
# # ``args`` will capture that).
# numargs = len(args + list(defaults))
# signal = Signal(*tuple(numargs * [object]))
# signame = name or func.__name__ + 'Relay'
# return signame, signal
def enable_relays(
widget: GraphicsWidget,
handler_names: list[str],
) -> list[Signal]:
'''
Method override helper which enables relay of a particular
``Signal`` from some chosen broadcaster widget to a set of
consumer widgets which should operate their event handlers normally
but driven instead by signals "relayed" from the broadcaster.
Mostly useful for overlaying widgets that handle user input
that you want to overlay graphically. The target ``widget`` type must
define ``QtCore.Signal``s each with a `'Relay'` suffix for each
name provided in ``handler_names: list[str]``.
'''
signals = []
for name in handler_names:
handler = getattr(widget, name)
signame = name + 'Relay'
# ensure the target widget defines a relay signal
relay = getattr(widget, signame)
widget.relays[signame] = name
signals.append(relay)
method = mk_relay_method(signame, handler)
setattr(widget, name, method)
return signals
enable_relays(
ChartView,
['wheelEvent', 'mouseDragEvent']
)
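# For the call above to work the target widget type must (per the
# ``enable_relays()`` docstring) pre-declare one ``Signal`` per handler
# name, suffixed with 'Relay' and carrying the event, axis and
# ``relayed_from`` view box. A sketch of the assumed declaration:
#
#   class ChartView(ViewBox):
#       wheelEventRelay = Signal(object, object, object)
#       mouseDragEventRelay = Signal(object, object, object)
#       relays: dict[str, str] = {}  # signame -> handler name,
#                                    # filled in by ``enable_relays()``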
class PlotItemOverlay:
'''
A composite for managing overlaid ``PlotItem`` instances such that
you can make multiple graphics appear on the same graph with
separate (non-colliding) axes, and for applying ``ViewBox`` signal
broadcasting such that all overlaid items respond to input
simultaneously.
'''
def __init__(
self,
root_plotitem: PlotItem
) -> None:
self.root_plotitem: PlotItem = root_plotitem
vb = root_plotitem.vb
vb.event_relay_source = vb # TODO: maybe change name?
vb.setZValue(1000) # XXX: critical for scene layering/relaying
self.overlays: list[PlotItem] = []
self.layout = ComposedGridLayout(
root_plotitem,
root_plotitem.layout,
)
self._relays: dict[str, Signal] = {}
def add_plotitem(
self,
plotitem: PlotItem,
index: Optional[int] = None,
# TODO: we could also put the ``ViewBox.XAxis``
# style enum here?
# (0,), # link x
# (1,), # link y
# (0, 1), # link both
link_axes: tuple[int] = (),
) -> None:
index = index or len(self.overlays)
root = self.root_plotitem
# layout: QGraphicsGridLayout = root.layout
self.overlays.insert(index, plotitem)
vb: ViewBox = plotitem.vb
# mark this consumer overlay as ready to expect relayed events
# from the root plotitem.
vb.event_relay_source = root.vb
# TODO: some sane way to allow menu event broadcast XD
# vb.setMenuEnabled(False)
# TODO: inside the `maybe_broadcast()` (soon to be) decorator
# we need have checks that consumers have been attached to
# these relay signals.
if link_axes != (0, 1):
# wire up relay signals
for relay_signal_name, handler_name in vb.relays.items():
# print(handler_name)
# XXX: Signal class attrs are bound after instantiation
# of the defining type, so we need to access that bound
# version here.
signal = getattr(root.vb, relay_signal_name)
handler = getattr(vb, handler_name)
signal.connect(handler)
# link dim-axes to root if requested by user.
# TODO: solve more-then-wanted scaled panning on click drag
# which seems to be due to broadcast. So we probably need to
# disable broadcast when axes are linked in a particular
# dimension?
for dim in link_axes:
# link x and y axes to new view box such that the top level
# viewbox propagates to the root (and whatever other
# plotitem overlays that have been added).
vb.linkView(dim, root.vb)
# make overlaid viewbox impossible to focus since the top
# level should handle all input and relay to overlays.
# NOTE: this was solved with the `setZValue()` above!
# TODO: we will probably want to add a "focus" api such that
# a new "top level" ``PlotItem`` can be selected dynamically
# (and presumably the axes dynamically sorted to match).
vb.setFlag(
vb.GraphicsItemFlag.ItemIsFocusable,
False
)
vb.setFocusPolicy(Qt.NoFocus)
# append-compose into the layout all axes from this plot
self.layout.insert(index, plotitem)
plotitem.setGeometry(root.vb.sceneBoundingRect())
def size_to_viewbox(vb: 'ViewBox'):
plotitem.setGeometry(vb.sceneBoundingRect())
root.vb.sigResized.connect(size_to_viewbox)
# ensure the overlayed view is redrawn on each cycle
root.scene().sigPrepareForPaint.connect(vb.prepareForPaint)
# focus state sanity
vb.clearFocus()
assert not vb.focusWidget()
root.vb.setFocus()
assert root.vb.focusWidget()
# XXX: do we need this? Why would you build then destroy?
def remove_plotitem(self, plotItem: PlotItem) -> None:
'''
Remove this ``PlotItem`` from the overlayed set making it not shown
and unable to accept input.
'''
...
# TODO: i think this would be super hot B)
def focus_item(self, plotitem: PlotItem) -> PlotItem:
'''
Apply focus to a contained PlotItem thus making it the "top level"
item in the overlay, able to accept peripheral input from the user
and responsible for zoom and panning control via its ``ViewBox``.
'''
...
def get_axis(
self,
plot: PlotItem,
name: str,
) -> AxisItem:
'''
Retrieve the named axis for overlayed ``plot``.
'''
return self.layout.get_axis(plot, name)
def get_axes(
self,
name: str,
) -> list[AxisItem]:
'''
Retrieve all axes for all plots with ``name: str``.
If a particular overlay doesn't have a displayed named axis
then it is not delivered in the returned ``list``.
'''
axes = []
for plot in self.overlays:
axis = self.layout.get_axis(plot, name)
if axis:
axes.append(axis)
return axes
# TODO: i guess we need this if you want to detach existing plots
# dynamically? XXX: untested as of now.
def _disconnect_all(
self,
plotitem: PlotItem,
) -> list[Signal]:
'''
Disconnects all signals related to this widget for the given chart.
'''
disconnected = []
for pi, sig in self._relays.items():
QObject.disconnect(sig)
disconnected.append(sig)
return disconnected

View File

@ -1,236 +0,0 @@
# piker: trading gear for hackers
# Copyright (C) 2018-present Tyler Goodlet (in stewardship of piker0)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""
Super fast ``QPainterPath`` generation related operator routines.
"""
from __future__ import annotations
from typing import (
# Optional,
TYPE_CHECKING,
)
import numpy as np
from numpy.lib import recfunctions as rfn
from numba import njit, float64, int64 # , optional
# import pyqtgraph as pg
from PyQt5 import QtGui
# from PyQt5.QtCore import QLineF, QPointF
from ..data._sharedmem import (
ShmArray,
)
# from .._profile import pg_profile_enabled, ms_slower_then
from ._compression import (
ds_m4,
)
if TYPE_CHECKING:
from ._flows import Renderer
def xy_downsample(
x,
y,
uppx,
x_spacer: float = 0.5,
) -> tuple[np.ndarray, np.ndarray]:
# downsample whenever more than 1 pixel per datum can be shown.
# always refresh data bounds until we get diffing
# working properly, see above..
bins, x, y = ds_m4(
x,
y,
uppx,
)
# flatten output to 1d arrays suitable for path-graphics generation.
x = np.broadcast_to(x[:, None], y.shape)
x = (x + np.array(
[-x_spacer, 0, 0, x_spacer]
)).flatten()
y = y.flatten()
return x, y
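# Shape sketch (assuming ``ds_m4()`` emits one row of 4 M4 values per
# downsampled bin): for ``n`` bins ``y.shape == (n, 4)`` so both
# flattened outputs have length ``4 * n``, with each bin's x values
# spread as [x - 0.5, x, x, x + 0.5] around the bin center (using the
# default ``x_spacer``).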
@njit(
# TODO: for now need to construct this manually for readonly arrays, see
# https://github.com/numba/numba/issues/4511
# ntypes.tuple((float64[:], float64[:], float64[:]))(
# numba_ohlc_dtype[::1], # contiguous
# int64,
# optional(float64),
# ),
nogil=True
)
def path_arrays_from_ohlc(
data: np.ndarray,
start: int64,
bar_gap: float64 = 0.43,
) -> np.ndarray:
'''
Generate an array of lines objects from input ohlc data.
'''
size = int(data.shape[0] * 6)
x = np.zeros(
# data,
shape=size,
dtype=float64,
)
y, c = x.copy(), x.copy()
# TODO: report bug for assert @
# /home/goodboy/repos/piker/env/lib/python3.8/site-packages/numba/core/typing/builtins.py:991
for i, q in enumerate(data[start:], start):
# TODO: ask numba why this doesn't work..
# open, high, low, close, index = q[
# ['open', 'high', 'low', 'close', 'index']]
open = q['open']
high = q['high']
low = q['low']
close = q['close']
index = float64(q['index'])
istart = i * 6
istop = istart + 6
# x,y detail the 6 points which connect all vertexes of an ohlc bar
x[istart:istop] = (
index - bar_gap,
index,
index,
index,
index,
index + bar_gap,
)
y[istart:istop] = (
open,
open,
low,
high,
close,
close,
)
# specifies that the first edge is never connected to the
# prior bar's last edge thus providing a small "gap"/"space"
# between bars determined by ``bar_gap``.
c[istart:istop] = (1, 1, 1, 1, 1, 0)
return x, y, c
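# Worked example for a single bar with index=10, o=1, h=4, l=0, c=3
# and the default ``bar_gap=0.43``:
#   x -> (9.57, 10, 10, 10, 10, 10.43)
#   y -> (1, 1, 0, 4, 3, 3)
#   c -> (1, 1, 1, 1, 1, 0)
# i.e. open-arm, body low -> high, close-arm, with the trailing 0 in
# ``c`` breaking the connection into the next bar's first vertex.
# These arrays are (presumably) handed to something like
# ``pg.functions.arrayToQPath(x, y, connect=c)`` downstream to build
# the final ``QPainterPath``.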
def gen_ohlc_qpath(
r: Renderer,
data: np.ndarray,
array_key: str, # we ignore this
vr: tuple[int, int],
start: int = 0, # XXX: do we need this?
# 0.5 is no overlap between arms, 1.0 is full overlap
w: float = 0.43,
) -> tuple[np.ndarray, np.ndarray, np.ndarray]:
'''
More or less direct proxy to ``path_arrays_from_ohlc()``
but with closed in kwargs for line spacing.
'''
x, y, c = path_arrays_from_ohlc(
data,
start,
bar_gap=w,
)
return x, y, c
def ohlc_to_line(
ohlc_shm: ShmArray,
data_field: str,
fields: list[str] = ['open', 'high', 'low', 'close']
) -> tuple[
np.ndarray,
np.ndarray,
]:
'''
Convert an input struct-array holding OHLC samples into a pair of
flattened x, y arrays with the same size (datums wise) as the source
data.
'''
y_out = ohlc_shm.ustruct(fields)
first = ohlc_shm._first.value
last = ohlc_shm._last.value
# write pushed data to flattened copy
y_out[first:last] = rfn.structured_to_unstructured(
ohlc_shm.array[fields]
)
# generate a flat-interpolated x-domain
x_out = (
np.broadcast_to(
ohlc_shm._array['index'][:, None],
(
ohlc_shm._array.size,
# 4, # only ohlc
y_out.shape[1],
),
) + np.array([-0.5, 0, 0, 0.5])
)
assert y_out.any()
return (
x_out,
y_out,
)
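# Shape sketch: for a source shm array of ``n`` rows the outputs are
# ``y_out.shape == (n, 4)`` (one column per o/h/l/c field) and
# ``x_out.shape == (n, 4)`` with each index spread as
# [i - 0.5, i, i, i + 0.5], matching the flattening layout used by
# ``xy_downsample()`` above.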
def to_step_format(
shm: ShmArray,
data_field: str,
index_field: str = 'index',
) -> tuple[int, np.ndarray, np.ndarray]:
'''
Convert an input 1d shm array to a "step array" format
for use by path graphics generation.
'''
i = shm._array['index'].copy()
out = shm._array[data_field].copy()
x_out = np.broadcast_to(
i[:, None],
(i.size, 2),
) + np.array([-0.5, 0.5])
y_out = np.empty((len(out), 2), dtype=out.dtype)
y_out[:] = out[:, np.newaxis]
# start y at origin level
y_out[0, 0] = 0
return x_out, y_out
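# Worked example: for ``index = [0, 1, 2]`` and ``data = [5, 7, 6]``:
#   x_out -> [[-0.5, 0.5], [0.5, 1.5], [1.5, 2.5]]
#   y_out -> [[0, 5], [7, 7], [6, 6]]
# i.e. each sample becomes a flat, unit-wide "step" with the first y
# pinned to 0 so the path starts at the origin level.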

View File

@ -1,48 +0,0 @@
# piker: trading gear for hackers
# Copyright (C) Tyler Goodlet (in stewardship for piker0)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""
Customization of ``pyqtgraph`` core routines to speed up our use mostly
based on not requiring "scientific precision" for pixel perfect view
transforms.
"""
import pyqtgraph as pg
def invertQTransform(tr):
"""Return a QTransform that is the inverse of *tr*.
Raises an exception if tr is not invertible.
Note that this function is preferred over QTransform.inverted() due to
bugs in that method. (specifically, Qt has floating-point precision issues
when determining whether a matrix is invertible)
"""
# see https://doc.qt.io/qt-5/qtransform.html#inverted
# NOTE: if ``invertable == False``, ``qt_t`` is an identity
qt_t, invertable = tr.inverted()
return qt_t
def _do_overrides() -> None:
"""Dooo eeet.
"""
# we don't care about potential fp issues inside Qt
pg.functions.invertQTransform = invertQTransform

View File

@ -1,696 +0,0 @@
# piker: trading gear for hackers
# Copyright (C) Tyler Goodlet (in stewardship for piker0)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""
Position info and display
"""
from __future__ import annotations
from dataclasses import dataclass
from functools import partial
from math import floor, copysign
from typing import Optional
# from PyQt5.QtWidgets import QStyle
# from PyQt5.QtGui import (
# QIcon, QPixmap, QColor
# )
from pyqtgraph import functions as fn
from ._annotate import LevelMarker
from ._anchors import (
pp_tight_and_right, # wanna keep it straight in the long run
gpath_pin,
)
from ..calc import humanize, pnl, puterize
from ..clearing._allocate import Allocator, Position
from ..data._normalize import iterticks
from ..data.feed import Feed
from ._label import Label
from ._lines import LevelLine, order_line
from ._style import _font
from ._forms import FieldsForm, FillStatusBar, QLabel
from ..log import get_logger
log = get_logger(__name__)
_pnl_tasks: dict[str, bool] = {}
async def update_pnl_from_feed(
feed: Feed,
order_mode: OrderMode, # noqa
tracker: PositionTracker,
) -> None:
'''Real-time display of the current pp's PnL in the appropriate label.
Expects a non-net-zero pp; if the position is flat this task logs and
returns early.
'''
global _pnl_tasks
pp = order_mode.current_pp
live = pp.live_pp
key = live.symbol.key
log.info(f'Starting pnl display for {pp.alloc.account}')
if live.size < 0:
types = ('ask', 'last', 'last', 'dark_trade')
elif live.size > 0:
types = ('bid', 'last', 'last', 'dark_trade')
else:
log.info(f'No position (yet) for {tracker.alloc.account}@{key}')
return
# real-time update pnl on the status pane
try:
async with feed.stream.subscribe() as bstream:
# last_tick = time.time()
async for quotes in bstream:
# now = time.time()
# period = now - last_tick
for sym, quote in quotes.items():
for tick in iterticks(quote, types):
# print(f'{1/period} Hz')
size = order_mode.current_pp.live_pp.size
if size == 0:
# terminate this update task since we're
# no longer in a pp
order_mode.pane.pnl_label.format(pnl=0)
return
else:
# compute and display pnl status
order_mode.pane.pnl_label.format(
pnl=copysign(1, size) * pnl(
# live.avg_price,
order_mode.current_pp.live_pp.avg_price,
tick['price'],
),
)
# last_tick = time.time()
finally:
assert _pnl_tasks[key]
assert _pnl_tasks.pop(key)
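# Sign-convention sketch (assuming ``piker.calc.pnl(init, new)``
# returns the fractional change ``(new - init) / init``): for a short
# position (size < 0) ``copysign(1, size) == -1.0``, so a price drop
# from an avg price of 100 to 90 yields ``-1 * (90 - 100) / 100 ==
# +0.1``, i.e. a positive pnl as expected.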
@dataclass
class SettingsPane:
'''
Composite set of widgets plus an allocator model for configuring
order entry sizes and position limits per tradable instrument.
'''
# input fields
form: FieldsForm
# output fill status and labels
fill_bar: FillStatusBar
step_label: QLabel
pnl_label: QLabel
limit_label: QLabel
# encompassing high level namespace
order_mode: Optional['OrderMode'] = None # type: ignore # noqa
def set_accounts(
self,
names: list[str],
sizes: Optional[list[float]] = None,
) -> None:
combo = self.form.fields['account']
return combo.set_items(names)
def on_selection_change(
self,
text: str,
key: str,
) -> None:
'''
Called on any order pane drop down selection change.
'''
log.info(f'selection input {key}:{text}')
self.on_ui_settings_change(key, text)
def on_ui_settings_change(
self,
key: str,
value: str,
) -> bool:
'''
Called on any order pane edit field value change.
'''
mode = self.order_mode
# an account switch request
if key == 'account':
# hide details on the old selection
old_tracker = mode.current_pp
old_tracker.hide_info()
# re-assign the order mode tracker
account_name = value
tracker = mode.trackers.get(account_name)
# if selection can't be found (likely never discovered with
# a ``brokerd``) then error and switch back to the last
# selection.
if tracker is None:
sym = old_tracker.chart.linked.symbol.key
log.error(
f'Account `{account_name}` can not be set for {sym}'
)
self.form.fields['account'].setCurrentText(
old_tracker.alloc.account)
return
self.order_mode.current_pp = tracker
assert tracker.alloc.account == account_name
self.form.fields['account'].setCurrentText(account_name)
tracker.show()
tracker.hide_info()
self.display_pnl(tracker)
# load the new account's allocator
alloc = tracker.alloc
else:
tracker = mode.current_pp
alloc = tracker.alloc
size_unit = alloc.size_unit
# WRITE any settings to current pp's allocator
try:
if key == 'size_unit':
# implicit re-write of value if input
# is the "text name" of the units.
# yah yah, i know this is badd..
alloc.size_unit = value
else:
value = puterize(value)
if key == 'limit':
pp = mode.current_pp.live_pp
if size_unit == 'currency':
dsize = pp.dsize
if dsize > value:
log.error(
f'limit must be > current pp: {dsize}'
)
raise ValueError
alloc.currency_limit = value
else:
size = pp.size
if size > value:
log.error(
f'limit must be > current pp: {size}'
)
raise ValueError
alloc.units_limit = value
elif key == 'slots':
if value <= 0:
raise ValueError('slots must be > 0')
alloc.slots = int(value)
else:
log.error(f'Unknown setting {key}')
raise ValueError
log.info(f'settings change: {key}: {value}')
except ValueError:
log.error(f'Invalid value for `{key}`: {value}')
# READ out settings and update the status UI / settings widgets
suffix = {'currency': ' $', 'units': ' u'}[size_unit]
limit = alloc.limit()
# TODO: a reverse look up from the position to the equivalent
# account(s), if none then look to user config for default?
self.update_status_ui(pp=tracker)
step_size, currency_per_slot = alloc.step_sizes()
if size_unit == 'currency':
step_size = currency_per_slot
self.step_label.format(
step_size=str(humanize(step_size)) + suffix
)
self.limit_label.format(
limit=str(humanize(limit)) + suffix
)
# update size unit in UI
self.form.fields['size_unit'].setCurrentText(
alloc._size_units[alloc.size_unit]
)
self.form.fields['slots'].setText(str(alloc.slots))
self.form.fields['limit'].setText(str(limit))
# update of level marker size label based on any new settings
tracker.update_from_pp()
# TODO: maybe return a diff of settings so if we get an error we
# can have general input handling code to report it through the
# UI in some way?
return True
def update_status_ui(
self,
pp: PositionTracker,
) -> None:
alloc = pp.alloc
slots = alloc.slots
used = alloc.slots_used(pp.live_pp)
# calculate proportion of position size limit
# that exists and display in fill bar
# TODO: what should we do for fractional slot pps?
self.fill_bar.set_slots(
slots,
# TODO: how to show "partial" slots?
# min(round(prop * slots), slots)
min(used, slots)
)
self.update_account_icons({alloc.account: pp.live_pp})
def update_account_icons(
self,
pps: dict[str, Position],
) -> None:
form = self.form
accounts = form.fields['account']
for account_name, pp in pps.items():
icon_name = None
if pp.size > 0:
icon_name = 'long_pp'
elif pp.size < 0:
icon_name = 'short_pp'
accounts.set_icon(account_name, icon_name)
def display_pnl(
self,
tracker: PositionTracker,
) -> None:
'''Display the PnL for the current symbol and personal positioning (pp).
If a position is open start a background task which will
real-time update the pnl label in the settings pane.
'''
mode = self.order_mode
sym = mode.chart.linked.symbol
size = tracker.live_pp.size
feed = mode.quote_feed
pnl_value = 0
if size:
# last historical close price
last = feed.shm.array[-1][['close']][0]
pnl_value = copysign(1, size) * pnl(
tracker.live_pp.avg_price,
last,
)
# maybe start update task
global _pnl_tasks
if sym.key not in _pnl_tasks:
_pnl_tasks[sym.key] = True
self.order_mode.nursery.start_soon(
update_pnl_from_feed,
feed,
mode,
tracker,
)
# immediately display in status label
self.pnl_label.format(pnl=pnl_value)
def position_line(
chart: 'ChartPlotWidget', # noqa
size: float,
level: float,
color: str,
orient_v: str = 'bottom',
marker: Optional[LevelMarker] = None,
) -> LevelLine:
'''
Convenience routine to create a line graphic representing a "pp"
aka the acro for a,
"{piker, private, personal, puny, <place your p-word here>} position".
If ``marker`` is provided it will be configured appropriately for
the "direction" of the position.
'''
line = order_line(
chart,
level,
# TODO: could we maybe add a ``action=None`` which
# would be a mechanism to check a marker was passed in?
color=color,
highlight_on_hover=False,
movable=False,
hide_xhair_on_hover=False,
only_show_markers_on_hover=False,
always_show_labels=False,
# explicitly disable ``order_line()`` factory's creation
# of a level marker since we do it in this tracer thing.
show_markers=False,
)
if marker:
# configure marker to position data
if size > 0: # long
style = '|<' # point "up to" the line
elif size < 0: # short
style = '>|' # point "down to" the line
marker.style = style
# set marker color to same as line
marker.setPen(line.currentPen)
marker.setBrush(fn.mkBrush(line.currentPen.color()))
marker.level = level
marker.update()
marker.show()
# show position marker on view "edge" when out of view
vb = line.getViewBox()
vb.sigRangeChanged.connect(marker.position_in_view)
line.set_level(level)
return line
class PositionTracker:
'''
Track and display real-time positions for a single symbol
over multiple accounts on a single chart.
Graphically composed of a level line and marker as well as labels
for indicating current position information. Updates are made to the
corresponding "settings pane" for the chart's "order mode" UX.
'''
# inputs
chart: 'ChartPlotWidget' # noqa
alloc: Allocator
startup_pp: Position
live_pp: Position
# allocated
pp_label: Label
size_label: Label
line: Optional[LevelLine] = None
_color: str = 'default_lightest'
def __init__(
self,
chart: 'ChartPlotWidget', # noqa
alloc: Allocator,
startup_pp: Position,
) -> None:
self.chart = chart
self.alloc = alloc
self.startup_pp = startup_pp
self.live_pp = startup_pp.copy()
view = chart.getViewBox()
# literally the 'pp' (pee pee) label that's always in view
self.pp_label = pp_label = Label(
view=view,
fmt_str='pp',
color=self._color,
update_on_range_change=False,
)
# create placeholder 'up' level arrow
self._level_marker = None
self._level_marker = self.level_marker(size=1)
pp_label.scene_anchor = partial(
gpath_pin,
gpath=self._level_marker,
label=pp_label,
)
pp_label.render()
self.size_label = size_label = Label(
view=view,
color=self._color,
# this is "static" label
# update_on_range_change=False,
fmt_str='\n'.join((
':{slots_used:.1f}x',
)),
fields={
'slots_used': 0,
},
)
size_label.render()
size_label.scene_anchor = partial(
pp_tight_and_right,
label=self.pp_label,
)
@property
def pane(self) -> FieldsForm:
'''
Return handle to pp side pane form.
'''
return self.chart.linked.godwidget.pp_pane
def update_graphics(
self,
marker: LevelMarker
) -> None:
'''
Update all labels.
Meant to be called from the marker ``.paint()``
for immediate, lag free label draws.
'''
self.pp_label.update()
self.size_label.update()
def update_from_pp(
self,
position: Optional[Position] = None,
) -> None:
'''Update graphics and data from average price and size passed in our
EMS ``BrokerdPosition`` msg.
'''
# live pp updates
pp = position or self.live_pp
self.update_line(
pp.avg_price,
pp.size,
self.chart.linked.symbol.lot_size_digits,
)
# label updates
self.size_label.fields['slots_used'] = round(
self.alloc.slots_used(pp), ndigits=1)
self.size_label.render()
if pp.size == 0:
self.hide()
else:
self._level_marker.level = pp.avg_price
# these updates are critical to avoid lag on view/scene changes
self._level_marker.update() # trigger paint
self.pp_label.update()
self.size_label.update()
self.show()
# don't show side and status widgets unless
# order mode is "engaged" (which done via input controls)
self.hide_info()
def level(self) -> float:
if self.line:
return self.line.value()
else:
return 0
def show(self) -> None:
if self.live_pp.size:
self.line.show()
self.line.show_labels()
self._level_marker.show()
self.pp_label.show()
self.size_label.show()
def hide(self) -> None:
self.pp_label.hide()
self._level_marker.hide()
self.size_label.hide()
if self.line:
self.line.hide()
def hide_info(self) -> None:
'''Hide details (right now just size label?) of position.
'''
self.size_label.hide()
if self.line:
self.line.hide_labels()
# TODO: move into annoate module
def level_marker(
self,
size: float,
) -> LevelMarker:
if self._level_marker:
self._level_marker.delete()
# arrow marker
# scale marker size with dpi-aware font size
font_size = _font.font.pixelSize()
arrow_size = floor(1.375 * font_size)
if size > 0:
style = '|<'
elif size < 0:
style = '>|'
arrow = LevelMarker(
chart=self.chart,
style=style,
get_level=self.level,
size=arrow_size,
on_paint=self.update_graphics,
)
self.chart.getViewBox().scene().addItem(arrow)
arrow.show()
return arrow
def update_line(
self,
price: float,
size: float,
size_digits: int,
) -> None:
'''Update personal position level line.
'''
# do line update
line = self.line
if size:
if line is None:
# create and show a pp line
line = self.line = position_line(
chart=self.chart,
level=price,
size=size,
color=self._color,
marker=self._level_marker,
)
else:
line.set_level(price)
self._level_marker.level = price
self._level_marker.update()
# update LHS sizing label
line.update_labels({
'size': size,
'size_digits': size_digits,
'fiat_size': round(price * size, ndigits=2),
# TODO: per account lines on a single (or very related) symbol
'account': self.alloc.account,
})
line.show()
elif line: # remove pp line from view if it exists on a net-zero pp
line.delete()
self.line = None

View File

@ -1,989 +0,0 @@
# piker: trading gear for hackers
# Copyright (C) Tyler Goodlet (in stewardship for piker0)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""
qompleterz: embeddable search and complete using trio, Qt and fuzzywuzzy.
"""
# link set for hackzin on this stuff:
# https://doc.qt.io/qt-5/qheaderview.html#moving-header-sections
# https://doc.qt.io/qt-5/model-view-programming.html
# https://doc.qt.io/qt-5/modelview.html
# https://doc.qt.io/qt-5/qtreeview.html#selectedIndexes
# https://doc.qt.io/qt-5/qmodelindex.html#siblingAtColumn
# https://doc.qt.io/qt-5/qitemselectionmodel.html#currentIndex
# https://www.programcreek.com/python/example/108109/PyQt5.QtWidgets.QTreeView
# https://doc.qt.io/qt-5/qsyntaxhighlighter.html
# https://github.com/qutebrowser/qutebrowser/blob/master/qutebrowser/completion/completiondelegate.py#L243
# https://forum.qt.io/topic/61343/highlight-matched-substrings-in-qstyleditemdelegate
from collections import defaultdict
from contextlib import asynccontextmanager
from functools import partial
from typing import (
Optional, Callable,
Awaitable, Sequence,
Any, AsyncIterator
)
import time
# from pprint import pformat
from fuzzywuzzy import process as fuzzy
import trio
from trio_typing import TaskStatus
from PyQt5 import QtCore
from PyQt5 import QtWidgets
from PyQt5.QtCore import (
Qt,
QModelIndex,
QItemSelectionModel,
)
from PyQt5.QtGui import (
# QLayout,
QStandardItem,
QStandardItemModel,
)
from PyQt5.QtWidgets import (
QWidget,
QTreeView,
# QListWidgetItem,
# QAbstractScrollArea,
# QStyledItemDelegate,
)
from ..log import get_logger
from ._style import (
_font,
hcolor,
)
from ._forms import Edit, FontScaledDelegate
log = get_logger(__name__)
class CompleterView(QTreeView):
mode_name: str = 'search-nav'
# XXX: relevant docs links:
# - simple widget version of this:
# https://doc.qt.io/qt-5/qtreewidget.html#details
# - MVC high level instructional:
# https://doc.qt.io/qt-5/model-view-programming.html
# - MV tut:
# https://doc.qt.io/qt-5/modelview.html
# - custom header view (for doing stuff like we have in kivy?):
# https://doc.qt.io/qt-5/qheaderview.html#moving-header-sections
# TODO: selection model stuff for eventual aggregate feeds
# charting and mgmt;
# https://doc.qt.io/qt-5/qabstractitemview.html#setSelectionModel
# https://doc.qt.io/qt-5/qitemselectionmodel.html
# https://doc.qt.io/qt-5/modelview.html#3-2-working-with-selections
# https://doc.qt.io/qt-5/model-view-programming.html#handling-selections-of-items
# TODO: mouse extended handling:
# https://doc.qt.io/qt-5/qabstractitemview.html#entered
def __init__(
self,
parent=None,
labels: list[str] = [],
) -> None:
super().__init__(parent)
model = QStandardItemModel(self)
self.labels = labels
# a std "tabular" config
self.setItemDelegate(FontScaledDelegate(self))
self.setModel(model)
self.setAlternatingRowColors(True)
# TODO: size this based on DPI font
self.setIndentation(_font.px_size)
# self.setUniformRowHeights(True)
# self.setColumnWidth(0, 3)
# self.setVerticalBarPolicy(Qt.ScrollBarAlwaysOff)
# self.setSizeAdjustPolicy(QAbstractScrollArea.AdjustIgnored)
# ux settings
self.setSizePolicy(
QtWidgets.QSizePolicy.Expanding,
QtWidgets.QSizePolicy.Expanding,
)
self.setItemsExpandable(True)
self.setExpandsOnDoubleClick(False)
self.setAnimated(False)
self.setHorizontalScrollBarPolicy(Qt.ScrollBarAlwaysOff)
# column headers
model.setHorizontalHeaderLabels(labels)
self._font_size: int = 0 # pixels
async def on_pressed(self, idx: QModelIndex) -> None:
'''Mouse pressed on view handler.
'''
search = self.parent()
await search.chart_current_item(clear_to_cache=False)
search.focus()
def set_font_size(self, size: int = 18):
# print(size)
if size < 0:
size = 16
self._font_size = size
self.setStyleSheet(f"font: {size}px")
# def resizeEvent(self, event: 'QEvent') -> None:
# event.accept()
# super().resizeEvent(event)
def on_resize(self) -> None:
'''
Resize relay event from god.
'''
self.resize_to_results()
def resize_to_results(self):
model = self.model()
cols = model.columnCount()
# rows = model.rowCount()
col_w_tot = 0
for i in range(cols):
self.resizeColumnToContents(i)
col_w_tot += self.columnWidth(i)
win = self.window()
win_h = win.height()
edit_h = self.parent().bar.height()
sb_h = win.statusBar().height()
# TODO: probably make this more general / less hacky
# we should figure out the exact number of rows to allow
# inclusive of search bar and header "rows", in pixel terms.
# Eventually when we have an "info" widget below the results we
# will want space for it and likely terminating the results-view
# space **exactly on a row** would be ideal.
# if row_px > 0:
# rows = ceil(window_h / row_px) - 4
# else:
# rows = 16
# self.setFixedHeight(rows * row_px)
# self.resize(self.width(), rows * row_px)
# NOTE: if the height set here is **too large** then the resize
# event will perpetually trigger as the window causes some kind
# of recompute of callbacks.. so we have to ensure it's limited.
h = win_h - (edit_h + 1.666*sb_h)
assert h > 0
self.setFixedHeight(round(h))
# size to width of longest result seen thus far
# TODO: should we always dynamically scale to longest result?
if self.width() < col_w_tot:
self.setFixedWidth(col_w_tot)
self.update()
def is_selecting_d1(self) -> bool:
cidx = self.selectionModel().currentIndex()
return cidx.parent() == QModelIndex()
def previous_index(self) -> QModelIndex:
cidx = self.selectionModel().currentIndex()
one_above = self.indexAbove(cidx)
if one_above.parent() == QModelIndex():
# if the next node up's parent is the root we don't want to select
# the next node up since it's a top level node and we only
# select entries depth >= 2.
# see if one more up is not the root and we can select it.
two_above = self.indexAbove(one_above)
if two_above != QModelIndex():
return two_above
else:
return cidx
return one_above # just next up
def next_index(self) -> QModelIndex:
cidx = self.selectionModel().currentIndex()
one_below = self.indexBelow(cidx)
if one_below.parent() == QModelIndex():
# if the next node down's parent is the root we don't want to
# select it since it's a top level node and we only
# select entries depth >= 2.
# see if one more down is not the root and we can select it.
two_below = self.indexBelow(one_below)
if two_below != QModelIndex():
return two_below
else:
return cidx
return one_below # just next down
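# Tree-shape sketch for the depth rules above (hypothetical entries):
# sections sit at depth 1 under the invisible root, results at depth
# 2, and only depth >= 2 rows are ever selected:
#
#   root (QModelIndex())
#   |- 'cache'                      <- depth 1, never selected
#   |   |- 0 | 'xbtusd.kraken'      <- depth 2, selectable
#   |   `- 1 | 'eurusd.ib'
#   `- 'kraken'
#       `- 0 | 'xdgusd.kraken'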
def select_from_idx(
self,
idx: QModelIndex,
) -> QStandardItem:
'''
Select and return the item at index ``idx``.
'''
sel = self.selectionModel()
model = self.model()
sel.setCurrentIndex(
idx,
QItemSelectionModel.ClearAndSelect |
QItemSelectionModel.Rows
)
return model.itemFromIndex(idx)
def select_first(self) -> QStandardItem:
'''
Select the first depth >= 2 entry from the completer tree and
return its item.
'''
# ensure we're **not** selecting the first level parent node and
# instead its child.
model = self.model()
for idx, item in self.iter_d1():
if model.rowCount(idx) == 0:
continue
else:
return self.select_from_idx(self.indexBelow(idx))
def select_next(self) -> QStandardItem:
idx = self.next_index()
assert idx.isValid()
return self.select_from_idx(idx)
def select_previous(self) -> QStandardItem:
idx = self.previous_index()
assert idx.isValid()
return self.select_from_idx(idx)
def next_section(self, direction: str = 'down') -> QModelIndex:
cidx = start_idx = self.selectionModel().currentIndex()
# step up levels to depth == 1
while cidx.parent() != QModelIndex():
cidx = cidx.parent()
# move to next section in `direction`
op = {'up': -1, 'down': +1}[direction]
next_row = cidx.row() + op
nidx = self.model().index(next_row, cidx.column(), QModelIndex())
# do nothing, if there is no valid "next" section
if not nidx.isValid():
return self.select_from_idx(start_idx)
# go to next selectable child item
self.select_from_idx(nidx)
return self.select_next()
def iter_d1(
self,
) -> tuple[QModelIndex, QStandardItem]:
model = self.model()
isections = model.rowCount()
# much thanks to the following code to figure out breadth-first
# traversing from the root node:
# https://stackoverflow.com/a/33126689
for i in range(isections):
idx = model.index(i, 0, QModelIndex())
item = model.itemFromIndex(idx)
yield idx, item
def find_section(
self,
section: str,
) -> Optional[QModelIndex]:
'''
Find the *first* depth = 1 section matching ``section`` in
the tree and return its index.
'''
for idx, item in self.iter_d1():
if item.text() == section:
return idx
else:
# caller must expect this
return None
def clear_section(
self,
section: str,
status_field: Optional[str] = None,
) -> None:
'''Clear all result-rows from under the depth = 1 section.
'''
idx = self.find_section(section)
model = self.model()
if idx is not None:
if model.hasChildren(idx):
rows = model.rowCount(idx)
# print(f'removing {rows} from {section}')
assert model.removeRows(0, rows, parent=idx)
# remove section as well ?
# model.removeRow(i, QModelIndex())
if status_field is not None:
model.setItem(idx.row(), 1, QStandardItem(status_field))
else:
model.setItem(idx.row(), 1, QStandardItem())
self.resize_to_results()
return idx
else:
return None
def set_section_entries(
self,
section: str,
values: Sequence[str],
clear_all: bool = False,
) -> None:
'''
Set result-rows for depth = 1 tree section ``section``.
'''
model = self.model()
if clear_all:
# XXX: rewrite the model from scratch if caller requests it
model.clear()
model.setHorizontalHeaderLabels(self.labels)
section_idx = self.clear_section(section)
# if we can't find a section start adding to the root
if section_idx is None:
root = model.invisibleRootItem()
section_item = QStandardItem(section)
blank = QStandardItem('')
root.appendRow([section_item, blank])
else:
section_item = model.itemFromIndex(section_idx)
# values just needs to be sequence-like
for i, s in enumerate(values):
ix = QStandardItem(str(i))
item = QStandardItem(s)
# Add the item to the model
section_item.appendRow([ix, item])
self.expandAll()
# TODO: figure out if we can avoid this line in a better way
# such that "re-selection" doesn't happen tree-wise for each new
# sub-search:
# https://doc.qt.io/qt-5/model-view-programming.html#handling-selections-in-item-views
# XXX: THE BELOW LINE MUST BE CALLED.
# this stuff is super finicky and if not done right will cause
# Qt crashes out our buttz. it's required in order to get the
# view to show right after typing input.
self.select_first()
# TODO: the way we might be able to do this more sanely is,
# 1. for the currently selected item, when start rewriting
# a section figure out the row, column, parent "abstract"
# position in the tree view and store it
# 2. take that position and re-apply the selection to the new
# model/tree by looking up the new "equivalent element" and
# selecting
self.show_matches()
def show_matches(self) -> None:
self.show()
self.resize_to_results()
class SearchBar(Edit):
mode_name: str = 'search'
def __init__(
self,
parent: QWidget,
godwidget: QWidget,
view: Optional[CompleterView] = None,
**kwargs,
) -> None:
self.godwidget = godwidget
super().__init__(parent, **kwargs)
self.view: CompleterView = view
godwidget._widgets[view.mode_name] = view
def show(self) -> None:
super().show()
self.view.show_matches()
def unfocus(self) -> None:
self.parent().hide()
self.clearFocus()
if self.view:
self.view.hide()
class SearchWidget(QtWidgets.QWidget):
'''
Composed widget of ``SearchBar`` + ``CompleterView``.
Includes helper methods for item management in the sub-widgets.
'''
mode_name: str = 'search'
def __init__(
self,
godwidget: 'GodWidget', # type: ignore # noqa
columns: list[str] = ['src', 'symbol'],
parent=None,
) -> None:
super().__init__(parent or godwidget)
# size it as we specify
self.setSizePolicy(
QtWidgets.QSizePolicy.Fixed,
QtWidgets.QSizePolicy.Expanding,
)
self.godwidget = godwidget
self.vbox = QtWidgets.QVBoxLayout(self)
self.vbox.setContentsMargins(0, 4, 4, 0)
self.vbox.setSpacing(4)
# split layout for the (label:| search bar entry)
self.bar_hbox = QtWidgets.QHBoxLayout()
self.bar_hbox.setContentsMargins(0, 0, 0, 0)
self.bar_hbox.setSpacing(4)
# add label to left of search bar
self.label = label = QtWidgets.QLabel(parent=self)
label.setStyleSheet(
f"""QLabel {{
color : {hcolor('default_lightest')};
font-size : {_font.px_size - 2}px;
}}
"""
)
label.setTextFormat(3) # markdown
label.setFont(_font.font)
label.setMargin(4)
label.setText("search:")
label.show()
label.setAlignment(
QtCore.Qt.AlignVCenter
| QtCore.Qt.AlignLeft
)
self.bar_hbox.addWidget(label)
self.view = CompleterView(
parent=self,
labels=columns,
)
self.bar = SearchBar(
parent=self,
view=self.view,
godwidget=godwidget,
)
self.bar_hbox.addWidget(self.bar)
self.vbox.addLayout(self.bar_hbox)
self.vbox.setAlignment(self.bar, Qt.AlignTop | Qt.AlignRight)
self.vbox.addWidget(self.bar.view)
self.vbox.setAlignment(self.view, Qt.AlignTop | Qt.AlignLeft)
def focus(self) -> None:
if self.view.model().rowCount(QModelIndex()) == 0:
# fill cache list if nothing existing
self.view.set_section_entries(
'cache',
list(reversed(self.godwidget._chart_cache)),
clear_all=True,
)
self.bar.focus()
self.show()
def get_current_item(self) -> Optional[tuple[str, str]]:
'''Return the current completer tree selection as
a tuple ``(parent: str, child: str)`` if valid, else ``None``.
'''
model = self.view.model()
sel = self.view.selectionModel()
cidx = sel.currentIndex()
# TODO: get rid of this hard coded column -> 1
# and use the ``CompleterView`` schema/settings
# to figure out the desired field(s)
# https://doc.qt.io/qt-5/qstandarditemmodel.html#itemFromIndex
node = model.itemFromIndex(cidx.siblingAtColumn(1))
if node:
symbol = node.text()
try:
provider = node.parent().text()
except AttributeError:
# no text set
return None
# TODO: move this to somewhere non-search machinery specific?
if provider == 'cache':
symbol, _, provider = symbol.rpartition('.')
return provider, symbol
else:
return None
async def chart_current_item(
self,
clear_to_cache: bool = True,
) -> Optional[str]:
'''Attempt to load and switch the current selected
completion result to the affiliated chart app.
Return any loaded symbol.
'''
value = self.get_current_item()
if value is None:
return None
provider, symbol = value
chart = self.godwidget
log.info(f'Requesting symbol: {symbol}.{provider}')
await chart.load_symbol(
provider,
symbol,
'info',
)
# fully qualified symbol name (SNS i guess is what we're
# making?)
fqsn = '.'.join([symbol, provider]).lower()
if clear_to_cache:
self.bar.clear()
# Re-order the symbol cache on the chart to display in
# LIFO order. this is normally only done internally by
# the chart on new symbols being loaded into memory
chart.set_chart_symbol(fqsn, chart.linkedsplits)
self.view.set_section_entries(
'cache',
values=list(reversed(chart._chart_cache)),
# remove all other completion results except for cache
clear_all=True,
)
return fqsn
_search_active: trio.Event = trio.Event()
_search_enabled: bool = False
async def pack_matches(
view: CompleterView,
has_results: dict[str, set[str]],
matches: dict[(str, str), list[str]],
provider: str,
pattern: str,
search: Callable[..., Awaitable[dict]],
task_status: TaskStatus[
trio.CancelScope] = trio.TASK_STATUS_IGNORED,
) -> None:
log.info(f'Searching {provider} for "{pattern}"')
if provider != 'cache':
# insert provider entries with search status
view.set_section_entries(
section=provider,
values=[],
)
view.clear_section(provider, status_field='-> searchin..')
else: # for the cache just clear its entries and don't put a status
view.clear_section(provider)
with trio.CancelScope() as cs:
task_status.started(cs)
# ensure ^ status is updated
results = await search(pattern)
if provider != 'cache': # XXX: don't cache the cache results xD
matches[(provider, pattern)] = results
# print(f'results from {provider}: {results}')
has_results[pattern].add(provider)
if results:
# display completion results
view.set_section_entries(
section=provider,
values=results,
)
else:
view.clear_section(provider)
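# NOTE: a minimal caller sketch (provider/pattern values are
# illustrative): ``pack_matches()`` hands its ``trio.CancelScope``
# back through ``task_status`` so the spawner can cancel an in-flight
# provider search when newer input arrives:
#
#   async with trio.open_nursery() as n:
#       cs = await n.start(
#           pack_matches,
#           view, has_results, matches,
#           'kraken',  # provider
#           'xbt',  # pattern
#           search,
#       )
#       ...
#       cs.cancel()  # drop this provider's now-stale search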
async def fill_results(
search: SearchBar,
recv_chan: trio.abc.ReceiveChannel,
# kb debouncing pauses (bracket defaults)
min_pause_time: float = 0.01, # absolute min typing throttle
# max pause required before slow relay
max_pause_time: float = 6/16 + 0.001,
) -> None:
"""Task to search through providers and fill in possible
completion results.
"""
global _search_active, _search_enabled, _searcher_cache
bar = search.bar
view = bar.view
view.select_from_idx(QModelIndex())
last_text = bar.text()
repeats = 0
# cache of prior patterns to search results
matches = defaultdict(list)
has_results: defaultdict[str, set[str]] = defaultdict(set)
while True:
await _search_active.wait()
period = None
while True:
last_text = bar.text()
wait_start = time.time()
with trio.move_on_after(max_pause_time):
pattern = await recv_chan.receive()
period = time.time() - wait_start
log.debug(f'{pattern} after {period}')
# during fast multiple key inputs, wait until a pause
# (in typing) to initiate search
if period < min_pause_time:
log.debug(f'Ignoring fast input for {pattern}')
continue
text = bar.text()
# print(f'search: {text}')
if not text or text.isspace():
# print('idling')
_search_active = trio.Event()
break
if text == last_text:
repeats += 1
if not _search_enabled:
# print('search currently disabled')
break
already_has_results = has_results[text]
log.debug(f'Search req for {text}')
# issue multi-provider fan-out search request and place
# "searching.." statuses on outstanding results providers
async with trio.open_nursery() as n:
for provider, (search, pause) in (
_searcher_cache.copy().items()
):
# XXX: only conduct search on this backend if it's
# registered for the corresponding pause period AND
# it hasn't already been searched with the current
# input pattern (in which case just look up the old
# results).
if (period >= pause) and (
provider not in already_has_results
):
# TODO: it may make more sense TO NOT search the
# cache in a bg task since we know it's fully
# cpu-bound.
if provider != 'cache':
view.clear_section(
provider, status_field='-> searchin..')
await n.start(
pack_matches,
view,
has_results,
matches,
provider,
text,
search
)
else: # already has results for this input text
results = matches[(provider, text)]
# TODO really for the cache we need an
# invalidation signal so that we only re-search
# the cache once it's been mutated by the chart
# switcher.. right now we're just always
# re-searching its ``dict`` since it's easier
# but it also causes it to be slower than cached
# results from other providers on occasion.
if results and provider != 'cache':
view.set_section_entries(
section=provider,
values=results,
)
else:
view.clear_section(provider)
if repeats > 2 and period > max_pause_time:
_search_active = trio.Event()
repeats = 0
break
bar.show()
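# NOTE: the debounce core of ``fill_results()`` reduced to a
# standalone sketch (the name and structure here are illustrative,
# not part of this module): wait up to ``max_pause`` for the next
# keystroke and only relay a pattern once typing pauses for at least
# ``min_pause``.
async def debounced_patterns(
    recv_chan: trio.abc.ReceiveChannel,
    min_pause: float = 0.01,
    max_pause: float = 6/16 + 0.001,
):
    while True:
        start = time.time()
        pattern = None
        with trio.move_on_after(max_pause):
            pattern = await recv_chan.receive()
        if pattern is None:
            return  # input idled past the slow-relay window
        if (time.time() - start) >= min_pause:
            yield pattern  # typing paused: let a search round run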
async def handle_keyboard_input(
searchbar: SearchBar,
recv_chan: trio.abc.ReceiveChannel,
) -> None:
global _search_active, _search_enabled
# startup
bar = searchbar
search = searchbar.parent()
godwidget = search.godwidget
view = bar.view
view.set_font_size(bar.dpi_font.px_size)
send, recv = trio.open_memory_channel(16)
async with trio.open_nursery() as n:
# start a background multi-searcher task which receives
# patterns relayed from this keyboard input handler and
# async updates the completer view's results.
n.start_soon(
partial(
fill_results,
search,
recv,
)
)
async for kbmsg in recv_chan:
event, etype, key, mods, txt = kbmsg.to_tuple()
log.debug(f'key: {key}, mods: {mods}, txt: {txt}')
ctl = False
if mods == Qt.ControlModifier:
ctl = True
if key in (Qt.Key_Enter, Qt.Key_Return):
await search.chart_current_item(clear_to_cache=True)
_search_enabled = False
continue
elif not ctl and not bar.text():
# if nothing in search text show the cache
view.set_section_entries(
'cache',
list(reversed(godwidget._chart_cache)),
clear_all=True,
)
continue
# cancel and close
if ctl and key in {
Qt.Key_C,
Qt.Key_Space, # i feel like this is the "native" one
Qt.Key_Alt,
}:
search.bar.unfocus()
# kill the search and focus back on main chart
if godwidget:
godwidget.focus()
continue
if ctl and key in {
Qt.Key_L,
}:
# like url (link) highlight in a web browser
bar.focus()
# selection navigation controls
elif ctl and key in {
Qt.Key_D,
}:
view.next_section(direction='down')
_search_enabled = False
elif ctl and key in {
Qt.Key_U,
}:
view.next_section(direction='up')
_search_enabled = False
# selection navigation controls
elif (ctl and key in {
Qt.Key_K,
Qt.Key_J,
}) or key in {
Qt.Key_Up,
Qt.Key_Down,
}:
_search_enabled = False
if key in {Qt.Key_K, Qt.Key_Up}:
item = view.select_previous()
elif key in {Qt.Key_J, Qt.Key_Down}:
item = view.select_next()
if item:
parent_item = item.parent()
if parent_item and parent_item.text() == 'cache':
# if it's a cache item, switch and show it immediately
await search.chart_current_item(clear_to_cache=False)
elif not ctl:
# relay to completer task
_search_enabled = True
send.send_nowait(search.bar.text())
_search_active.set()
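# NOTE: a minimal caller-side wiring sketch (names illustrative): the
# Qt key-event relay feeds ``kbmsg``s into a memory channel whose
# receive end drives this handler:
#
#   send, recv = trio.open_memory_channel(16)
#   async with trio.open_nursery() as n:
#       n.start_soon(handle_keyboard_input, searchbar, recv)
#       # ... the Qt-side event filter then does ``send.send_nowait(kbmsg)``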
async def search_simple_dict(
text: str,
source: dict,
) -> list[str]:
# search routine can be specified as a function such
# as in the case of the current app's local symbol cache
matches = fuzzy.extractBests(
text,
source.keys(),
score_cutoff=90,
)
return [item[0] for item in matches]
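# NOTE: assuming ``fuzzy`` is ``fuzzywuzzy.process`` (as the call
# above implies), ``extractBests()`` returns scored pairs which
# ``search_simple_dict()`` strips down to bare keys; scores below are
# illustrative:
#
#   from fuzzywuzzy import process as fuzzy
#   fuzzy.extractBests('xbt', {'xbtusd': {}, 'ethusd': {}}.keys(), score_cutoff=90)
#   # -> [('xbtusd', 90)]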
# cache of provider names to (async search routine, pause period) pairs
_searcher_cache: dict[str, tuple[Callable[..., Awaitable], float]] = {}
@asynccontextmanager
async def register_symbol_search(
provider_name: str,
search_routine: Callable,
pause_period: Optional[float] = None,
) -> AsyncIterator[dict]:
global _searcher_cache
pause_period = pause_period or 0.1
# deliver search func to consumer
try:
_searcher_cache[provider_name] = (search_routine, pause_period)
yield search_routine
finally:
_searcher_cache.pop(provider_name)
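# NOTE: a minimal registration sketch (the provider name, routine and
# pause period are illustrative): a data backend would typically
# register its search routine for the life of its feed like so:
#
#   async with register_symbol_search(
#       provider_name='kraken',
#       search_routine=search_symbols,  # hypothetical backend coro
#       pause_period=0.0616,
#   ):
#       yield  # stay registered while the feed is up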

View File

@ -1,114 +0,0 @@
# piker: trading gear for hackers
# Copyright (C) 2018-present Tyler Goodlet (in stewardship of piker0)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""
Signalling graphics and APIs.
WARNING: this code likely doesn't work at all (yet)
since it was copied from another class that shouldn't
have had it.
"""
import numpy as np
import pyqtgraph as pg
from PyQt5 import QtCore, QtGui, QtWidgets
from .quantdom.charts import CenteredTextItem
from .quantdom.base import Quotes
from .quantdom.portfolio import Order, Portfolio
class SignallingApi(object):
def __init__(self, plotgroup):
self.plotgroup = plotgroup
self.chart = plotgroup.chart
def _show_text_signals(self, lbar, rbar):
signals = [
sig
for sig in self.signals_text_items[lbar:rbar]
if isinstance(sig, CenteredTextItem)
]
if len(signals) <= 50:
for sig in signals:
sig.show()
else:
for sig in signals:
sig.hide()
def _remove_signals(self):
self.chart.removeItem(self.signals_group_arrow)
self.chart.removeItem(self.signals_group_text)
del self.signals_text_items
del self.signals_group_arrow
del self.signals_group_text
self.signals_visible = False
def add_signals(self):
self.signals_group_text = QtWidgets.QGraphicsItemGroup()
self.signals_group_arrow = QtWidgets.QGraphicsItemGroup()
self.signals_text_items = np.empty(len(Quotes), dtype=object)
for p in Portfolio.positions:
x, price = p.id_bar_open, p.open_price
if p.type == Order.BUY:
y = Quotes[x].low * 0.99
pg.ArrowItem(
parent=self.signals_group_arrow,
pos=(x, y),
pen=self.plotgroup.long_pen,
brush=self.plotgroup.long_brush,
angle=90,
headLen=12,
tipAngle=50,
)
text_sig = CenteredTextItem(
parent=self.signals_group_text,
pos=(x, y),
pen=self.plotgroup.long_pen,
brush=self.plotgroup.long_brush,
text=(
'Buy at {:.%df}' % self.plotgroup.digits).format(
price),
valign=QtCore.Qt.AlignBottom,
)
text_sig.hide()
else:
y = Quotes[x].high * 1.01
pg.ArrowItem(
parent=self.signals_group_arrow,
pos=(x, y),
pen=self.plotgroup.short_pen,
brush=self.plotgroup.short_brush,
angle=-90,
headLen=12,
tipAngle=50,
)
text_sig = CenteredTextItem(
parent=self.signals_group_text,
pos=(x, y),
pen=self.plotgroup.short_pen,
brush=self.plotgroup.short_brush,
text=('Sell at {:.%df}' % self.plotgroup.digits).format(
price),
valign=QtCore.Qt.AlignTop,
)
text_sig.hide()
self.signals_text_items[x] = text_sig
self.chart.addItem(self.signals_group_arrow)
self.chart.addItem(self.signals_group_text)
self.signals_visible = True

View File

@ -1,313 +0,0 @@
# piker: trading gear for hackers
# Copyright (C) Tyler Goodlet (in stewardship for piker0)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
'''
Qt UI styling.
'''
from typing import Optional, Dict
import math
import pyqtgraph as pg
from PyQt5 import QtCore, QtGui
from PyQt5.QtCore import Qt, QCoreApplication
from qdarkstyle import DarkPalette
from ..log import get_logger
log = get_logger(__name__)
_magic_inches = 0.0666 * (1 + 6/16)
# chart-wide fonts specified in inches
_font_sizes: Dict[str, Dict[str, float]] = {
'hi': {
'default': _magic_inches,
'small': 0.9 * _magic_inches,
},
'lo': {
'default': 6.5 / 64,
'small': 6 / 64,
},
}
class DpiAwareFont:
def __init__(
self,
# TODO: move to config
name: str = 'Hack',
font_size: str = 'default',
# size_in_inches: Optional[float] = None,
) -> None:
self.name = name
self._qfont = QtGui.QFont(name)
self._font_size: str = font_size
self._qfm = QtGui.QFontMetrics(self._qfont)
self._font_inches: Optional[float] = None
self._screen = None
def _set_qfont_px_size(self, px_size: int) -> None:
self._qfont.setPixelSize(px_size)
self._qfm = QtGui.QFontMetrics(self._qfont)
@property
def screen(self) -> QtGui.QScreen:
from ._window import main_window
if self._screen is not None:
try:
self._screen.refreshRate()
except RuntimeError:
self._screen = main_window().current_screen()
else:
self._screen = main_window().current_screen()
return self._screen
@property
def font(self):
return self._qfont
def scale(self) -> float:
screen = self.screen
return screen.logicalDotsPerInch() / screen.physicalDotsPerInch()
@property
def px_size(self) -> int:
return self._qfont.pixelSize()
def configure_to_dpi(self, screen: Optional[QtGui.QScreen] = None):
"""Set an appropriately sized font size depending on the screen DPI.
If we end up needing to generalize this more here there are resources
listed in the script in ``snippets/qt_screen_info.py``.
"""
if screen is None:
screen = self.screen
# take the max since scaling can make things ugly in some cases
pdpi = screen.physicalDotsPerInch()
ldpi = screen.logicalDotsPerInch()
# XXX: this is needed on sway/wayland where you set
# ``QT_WAYLAND_FORCE_DPI=physical``
if ldpi == 0:
ldpi = pdpi
mx_dpi = max(pdpi, ldpi)
mn_dpi = min(pdpi, ldpi)
scale = round(ldpi/pdpi, ndigits=2)
if mx_dpi <= 97: # for low dpi use larger font sizes
inches = _font_sizes['lo'][self._font_size]
else: # hidpi use smaller font sizes
inches = _font_sizes['hi'][self._font_size]
dpi = mn_dpi
mult = 1.0
# No implicit DPI scaling was done by the DE so let's engage
# some hackery ad-hoc scaling shiat.
# dpi is likely somewhat scaled down so use slightly larger font size
if scale >= 1.1 and self._font_size:
# no idea why
if 1.2 <= scale:
mult = 1.0375
if scale >= 1.5:
mult = 1.375
# TODO: this multiplier should probably be determined from
# relative aspect ratios or something?
inches *= mult
# XXX: if additionally we detect a known DE scaling factor we
# also scale *up* our font size on top of the existing
# heuristical (aka no clue why it works) scaling from the block
# above XD
if (
hasattr(Qt, 'AA_EnableHighDpiScaling')
and QCoreApplication.testAttribute(Qt.AA_EnableHighDpiScaling)
):
inches *= round(scale)
# TODO: we might want to fiddle with incrementing font size by
# +1 for the edge cases above. it seems doing it via scaling is
# always going to hit that error in range mapping from inches:
# float to px size: int.
self._font_inches = inches
font_size = math.floor(inches * dpi)
log.debug(
f"screen:{screen.name()}\n"
f"pDPI: {pdpi}, lDPI: {ldpi}, scale: {scale}\n"
f"\nOur best guess font size is {font_size}\n"
)
# apply the size
self._set_qfont_px_size(font_size)
def boundingRect(self, value: str) -> QtCore.QRectF:
screen = self.screen
if screen is None:
raise RuntimeError("You must call .configure_to_dpi() first!")
unscaled_br = self._qfm.boundingRect(value)
return QtCore.QRectF(
0,
0,
unscaled_br.width(),
unscaled_br.height(),
)
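# NOTE: a worked example of the sizing arithmetic in
# ``configure_to_dpi()`` for a hypothetical 96/96 DPI screen (values
# illustrative):
#
#   mx_dpi = max(96, 96)                   # 96 <= 97 -> low-dpi table
#   inches = _font_sizes['lo']['default']  # 6.5/64 ~= 0.1016in
#   scale = round(96 / 96, 2)              # 1.0 -> no heuristic mult
#   font_size = math.floor(inches * 96)    # floor(9.75) -> 9px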
# use inches size to be cross-resolution compatible?
_font = DpiAwareFont()
_font_small = DpiAwareFont(font_size='small')
def _config_fonts_to_screen() -> None:
'configure global DPI aware font sizes'
global _font, _font_small
_font.configure_to_dpi()
_font_small.configure_to_dpi()
# TODO: re-compute font size when main widget switches screens?
# https://forum.qt.io/topic/54136/how-do-i-get-the-qscreen-my-widget-is-on-qapplication-desktop-screen-returns-a-qwidget-and-qobject_cast-qscreen-returns-null/3
# splitter widget config
_xaxis_at = 'bottom'
# charting config
CHART_MARGINS = (0, 0, 2, 2)
_min_points_to_show = 6
_tina_mode = False
def enable_tina_mode() -> None:
"""Enable "tina mode" to make everything look "conventional"
like your pet hedgehog always wanted.
"""
# white background (for tinas like our pal xb)
pg.setConfigOption('background', 'w')
def hcolor(name: str) -> str:
"""Hex color codes by hipster speak.
This is an internal set of color codes hand picked
for certain purposes.
"""
return {
# lives matter
'black': '#000000',
'erie_black': '#1B1B1B',
'licorice': '#1A1110',
'papas_special': '#06070c',
'svags': '#0a0e14',
# fifty shades
'original': '#a9a9a9',
'gray': '#808080', # like the kick
'grayer': '#4c4c4c',
'grayest': '#3f3f3f',
'cadet': '#91A3B0',
'marengo': '#91A3B0',
'gunmetal': '#91A3B0',
'battleship': '#848482',
# bluish
'charcoal': '#36454F',
# default bars
'bracket': '#666666', # like the logo
# work well for filled polygons which want a 'bracket' feel
# going light to dark
'davies': '#555555',
'i3': '#494D4F',
'jet': '#343434',
# from ``qdarkstyle`` palette
'default_darkest': DarkPalette.COLOR_BACKGROUND_1,
'default_dark': DarkPalette.COLOR_BACKGROUND_2,
'default': DarkPalette.COLOR_BACKGROUND_3,
'default_light': DarkPalette.COLOR_BACKGROUND_4,
'default_lightest': DarkPalette.COLOR_BACKGROUND_5,
'default_spotlight': DarkPalette.COLOR_BACKGROUND_6,
'white': '#ffffff', # for tinas and sunbathers
# blue zone
'dad_blue': '#326693', # like his shirt
'vwap_blue': '#0582fb',
'dodger_blue': '#1e90ff', # like the team?
'panasonic_blue': '#0040be', # from japan
# 'bid_blue': '#0077ea', # like the L1
'bid_blue': '#3094d9', # like the L1
'aquaman': '#39abd0',
# traditional
'tina_green': '#00cc00',
'tina_red': '#fa0000',
'cucumber': '#006400',
'cool_green': '#33b864',
'dull_green': '#74a662',
'hedge_green': '#518360',
# orders and alerts
'alert_yellow': '#e2d083',
'alert_yellow_light': '#ffe366',
# buys
# 'hedge': '#768a75',
# 'hedge': '#41694d',
# 'hedge': '#558964',
# 'hedge_light': '#5e9870',
'80s_neon_green': '#00b677',
# 'buy_green': '#41694d',
'buy_green': '#558964',
'buy_green_light': '#558964',
# sells
# technically "raspberry"
# 'sell_red': '#990036',
# 'sell_red': '#9A0036',
# brighter than above
# 'sell_red': '#8c0030',
'sell_red': '#b6003f',
# 'sell_red': '#d00048',
'sell_red_light': '#f85462',
# 'sell_red': '#f85462',
# 'sell_red_light': '#ff4d5c',
}[name]

View File

@ -1,305 +0,0 @@
# piker: trading gear for hackers
# Copyright (C) Tyler Goodlet (in stewardship for pikers)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""
Qt main window singletons and stuff.
"""
import os
import signal
import time
from typing import Callable, Optional, Union
import uuid
from pyqtgraph import QtGui
from PyQt5 import QtCore
from PyQt5.QtWidgets import QLabel, QStatusBar
from ..log import get_logger
from ._style import _font_small, hcolor
log = get_logger(__name__)
class MultiStatus:
bar: QStatusBar
statuses: list[str]
def __init__(self, bar, statuses) -> None:
self.bar = bar
self.statuses = statuses
self._to_clear: set = set()
self._status_groups: dict[str, tuple[set, Callable]] = {}
def open_status(
self,
msg: str,
final_msg: Optional[str] = None,
clear_on_next: bool = False,
group_key: Optional[Union[bool, str]] = False,
) -> Union[Callable[..., None], str]:
'''
Add a status to the status bar and return a close callback which
when called will remove the status ``msg``.
'''
for old_msg in self._to_clear:
try:
self.statuses.remove(old_msg)
except ValueError:
pass
self.statuses.append(msg)
def remove_msg() -> None:
try:
self.statuses.remove(msg)
except ValueError:
pass
self.render()
if final_msg is not None:
self.statuses.append(final_msg)
self.render()
self._to_clear.add(final_msg)
ret = remove_msg
# create a "status group" such that new `.open_status()`
# calls can be made passing in the returned group key.
# once all clear callbacks have been called from all statuses
# in the group, the final status msg to be removed will be
# the one provided when `group_key=True`; this way you can
# create a long-living status that completes once all
# sub-statuses have finished.
if group_key is True:
if clear_on_next:
ValueError("Can't create group status and clear it on next?")
# generate a key for a new "status group"
new_group_key = str(uuid.uuid4())
def pop_group_and_clear():
subs, final_clear = self._status_groups.pop(new_group_key)
assert not subs
return remove_msg()
self._status_groups[new_group_key] = (set(), pop_group_and_clear)
ret = new_group_key
elif group_key:
def pop_from_group_and_maybe_clear_group():
# remove the message for this sub-status
remove_msg()
# check to see if all other substatuses have cleared
group_tup = self._status_groups.get(group_key)
if group_tup:
subs, group_clear = group_tup
try:
subs.remove(msg)
except KeyError:
raise KeyError(f'no msg {msg} for group {group_key}!?')
if not subs:
group_clear()
group = self._status_groups.get(group_key)
if group:
group[0].add(msg)
ret = pop_from_group_and_maybe_clear_group
self.render()
if clear_on_next:
self._to_clear.add(msg)
return ret
def render(self) -> None:
'''
Display all open statuses to bar.
'''
if self.statuses:
self.bar.showMessage(f'{" ".join(self.statuses)}')
else:
self.bar.clearMessage()
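# NOTE: a minimal usage sketch of the "status group" protocol from
# ``open_status()`` (messages are illustrative):
#
#   status = main_window().status_bar  # a ``MultiStatus``
#   key = status.open_status('loading feeds..', group_key=True)
#   done_a = status.open_status('loading kraken..', group_key=key)
#   done_b = status.open_status('loading ib..', group_key=key)
#   done_a(); done_b()  # clearing the last sub-status pops the group msg too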
class MainWindow(QtGui.QMainWindow):
# XXX: for tiling wms this should scale
# with the alloted window size.
# TODO: detect for tiling and if untrue set some size?
size = (300, 500)
title = 'piker chart (ur symbol is loading bby)'
def __init__(self, parent=None):
super().__init__(parent)
# self.setMinimumSize(*self.size)
self.setWindowTitle(self.title)
self._status_bar: QStatusBar = None
self._status_label: QLabel = None
self._size: Optional[tuple[int, int]] = None
@property
def mode_label(self) -> QtGui.QLabel:
# init mode label
if not self._status_label:
self._status_label = label = QtGui.QLabel()
label.setStyleSheet(
f"""QLabel {{
color : {hcolor('gunmetal')};
}}
"""
# font-size : {font_size}px;
)
label.setTextFormat(3) # markdown
label.setFont(_font_small.font)
label.setMargin(2)
label.setAlignment(
QtCore.Qt.AlignVCenter
| QtCore.Qt.AlignRight
)
self.statusBar().addPermanentWidget(label)
label.show()
return self._status_label
def closeEvent(
self,
event: QtGui.QCloseEvent,
) -> None:
'''Cancel the root actor asap.
'''
# raising KBI seems to get intercepted by Qt so just use the system.
os.kill(os.getpid(), signal.SIGINT)
@property
def status_bar(self) -> QStatusBar:
# style and cache the status bar on first access
if not self._status_bar:
sb = self.statusBar()
sb.setStyleSheet((
f"color : {hcolor('gunmetal')};"
f"background : {hcolor('default_dark')};"
f"font-size : {_font_small.px_size}px;"
"padding : 0px;"
# "min-height : 19px;"
# "qproperty-alignment: AlignVCenter;"
))
self.setStatusBar(sb)
self._status_bar = MultiStatus(sb, [])
return self._status_bar
def set_mode_name(
self,
name: str,
) -> None:
self.mode_label.setText(f'mode:{name}')
def on_focus_change(
self,
last: QtGui.QWidget,
current: QtGui.QWidget,
) -> None:
log.info(f'widget focus changed from {last} -> {current}')
if current is not None:
# cursor left window?
name = getattr(current, 'mode_name', '')
self.set_mode_name(name)
def current_screen(self) -> QtGui.QScreen:
"""Get a frickin screen (if we can, gawd).
"""
app = QtGui.QApplication.instance()
for _ in range(3):
screen = app.screenAt(self.pos())
log.debug('trying to access QScreen...')
if screen is None:
time.sleep(0.5)
continue
break
else:
if screen is None:
# try for the first one we can find
screen = app.screens()[0]
assert screen, "Wow Qt is dumb as shit and has no screen..."
return screen
def configure_to_desktop(
self,
size: Optional[tuple[int, int]] = None,
) -> None:
'''
Explicitly size the window dimensions (for stacked window
managers).
For tina systems (like windoze) try to do a sane window size on
startup.
'''
# https://stackoverflow.com/a/18975846
if not size and not self._size:
app = QtGui.QApplication.instance()
geo = self.current_screen().geometry()
h, w = geo.height(), geo.width()
# use approx 2/3 of each screen dimension (~44% of area) by default
self._size = round(w * .666), round(h * .666)
self.resize(*size or self._size)
# singleton app per actor
_qt_win: QtGui.QMainWindow = None
def main_window() -> MainWindow:
'Return the actor-global Qt window.'
global _qt_win
assert _qt_win
return _qt_win

View File

@ -1,174 +0,0 @@
# piker: trading gear for hackers
# Copyright (C) 2018-present Tyler Goodlet (in stewardship of piker0)
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""
Console interface to UI components.
"""
import os
import click
import tractor
from ..cli import cli
from .. import watchlists as wl
from .._daemon import maybe_spawn_brokerd
_config_dir = click.get_app_dir('piker')
_watchlists_data_path = os.path.join(_config_dir, 'watchlists.json')
def _kivy_import_hack():
# Command line hacks to make it work.
# See the pkg mod.
from .kivy import kivy # noqa
@cli.command()
@click.option('--tl', is_flag=True, help='Enable tractor logging')
@click.option('--rate', '-r', default=3, help='Quote rate limit')
@click.option('--test', '-t', help='Test quote stream file')
@click.option('--dhost', '-dh', default='127.0.0.1',
help='Daemon host address to connect to')
@click.argument('name', nargs=1, required=True)
@click.pass_obj
def monitor(config, rate, name, dhost, test, tl):
"""Start a real-time watchlist UI
"""
# global opts
brokermod = config['brokermods'][0]
loglevel = config['loglevel']
log = config['log']
watchlist_from_file = wl.ensure_watchlists(_watchlists_data_path)
watchlists = wl.merge_watchlist(watchlist_from_file, wl._builtins)
tickers = watchlists[name]
if not tickers:
log.error(f"No symbols found for watchlist `{name}`?")
return
_kivy_import_hack()
from .kivy.monitor import _async_main
async def main():
async with maybe_spawn_brokerd(
brokername=brokermod.name,
loglevel=loglevel
) as portal:
# run app "main"
await _async_main(
name, portal, tickers,
brokermod, rate, test=test,
)
tractor.run(
main,
name='monitor',
loglevel=loglevel if tl else None,
rpc_module_paths=['piker.ui.kivy.monitor'],
debug_mode=True,
)
@cli.command()
# @click.option('--tl', is_flag=True, help='Enable tractor logging')
@click.option('--date', '-d', help='Contracts expiry date')
@click.option('--test', '-t', help='Test quote stream file')
@click.option('--rate', '-r', default=1, help='Quote rate limit')
@click.argument('symbol', required=True)
@click.pass_obj
def optschain(config, symbol, date, rate, test):
"""Start an option chain UI
"""
# global opts
loglevel = config['loglevel']
brokername = config['broker']
_kivy_import_hack()
from .kivy.option_chain import _async_main
async def main():
async with maybe_spawn_brokerd(
loglevel=loglevel
):
# run app "main"
await _async_main(
symbol,
brokername,
rate=rate,
loglevel=loglevel,
test=test,
)
tractor.run(
main,
name='kivy-options-chain',
)
@cli.command()
@click.option(
'--profile',
'-p',
default=None,
help='Enable pyqtgraph profiling'
)
@click.option(
'--pdb',
is_flag=True,
help='Enable tractor debug mode'
)
@click.argument('symbol', required=True)
@click.pass_obj
def chart(config, symbol, profile, pdb):
'''
Start a real-time charting UI
'''
# eg. ``--profile 3`` reports profiling for anything slower than 3 ms.
if profile is not None:
from .. import _profile
_profile._pg_profile = True
_profile.ms_slower_then = float(profile)
from ._app import _main
if '.' not in symbol:
click.echo(click.style(
f'symbol: {symbol} must have a {symbol}.<provider> suffix',
fg='red',
))
return
# global opts
brokernames = config['brokers']
tractorloglevel = config['tractorloglevel']
pikerloglevel = config['loglevel']
_main(
sym=symbol,
brokernames=brokernames,
piker_loglevel=pikerloglevel,
tractor_kwargs={
'debug_mode': pdb,
'loglevel': tractorloglevel,
'name': 'chart',
'enable_modules': [
'piker.clearing._client'
],
},
)

View File

@ -1,15 +0,0 @@
"""
Legacy kivy components.
"""
import os
import sys
# XXX clear all flags at import to avoid upsetting
# ol' kivy see: https://github.com/kivy/kivy/issues/4225
# though this is likely a ``click`` problem
sys.argv[1:] = []
# use the trio async loop
os.environ['KIVY_EVENTLOOP'] = 'trio'
import kivy
kivy.require('1.10.0')
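# NOTE: a minimal sketch of what the 'trio' eventloop selection
# enables, assuming the trio-capable kivy build this module expects
# (``root_widget`` is illustrative):
#
#   import trio
#   from kivy.app import async_runTouchApp
#   trio.run(async_runTouchApp, root_widget)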

View File

@ -1,299 +0,0 @@
"""
monitor: a real-time, sorted watchlist.
Launch with ``piker monitor <watchlist name>``.
(Currently there's a bunch of questrade specific stuff in here)
"""
from types import ModuleType, AsyncGeneratorType
from typing import List, Callable
import trio
import tractor
from kivy.uix.boxlayout import BoxLayout
from kivy.lang import Builder
from kivy.app import async_runTouchApp
from kivy.core.window import Window
from ...brokers.data import DataFeed
from .tabular import (
Row, TickerTable, _kv, _black_rgba, colorcode,
)
from ...log import get_logger
from .pager import PagerView
log = get_logger('monitor')
async def update_quotes(
nursery: trio._core._run.Nursery,
formatter: Callable,
widgets: dict,
agen: AsyncGeneratorType,
symbol_data: dict,
first_quotes: dict,
task_status: trio._core._run._TaskStatus = trio.TASK_STATUS_IGNORED,
):
"""Process live quotes by updating ticker rows.
"""
log.debug("Initializing UI update loop")
table = widgets['table']
flash_keys = {'low', 'high'}
async def revert_cells_color(cells):
await trio.sleep(0.3)
for cell in cells:
cell.background_color = _black_rgba
def color_row(row, record, cells):
hdrcell = row.get_cell('symbol')
chngcell = row.get_cell('%')
# determine daily change color
percent_change = record.get('%')
if percent_change is not None and percent_change != chngcell:
daychange = float(percent_change)
if daychange < 0.:
color = colorcode('red2')
elif daychange > 0.:
color = colorcode('forestgreen')
else:
color = colorcode('gray')
# if the cell has been "highlighted" make sure to change its color
if hdrcell.background_color != [0] * 4:
hdrcell.background_color = color
# update row header and '%' cell text color
chngcell.color = color
hdrcell.color = color
# briefly highlight bg of certain cells on each trade execution
unflash = set()
tick_color = None
last = cells.get('last')
if not last:
vol = cells.get('volume')
if not vol:
return # no trade exec took place
# flash gray on volume tick
# (means trade exec @ current price)
last = row.get_cell('last')
tick_color = colorcode('gray')
else:
tick_color = last.color
last.background_color = tick_color
unflash.add(last)
# flash the size cell
size = row.get_cell('size')
size.background_color = tick_color
unflash.add(size)
# flash all other cells
for key in flash_keys:
cell = cells.get(key)
if cell:
cell.background_color = cell.color
unflash.add(cell)
# revert flash state momentarily
nursery.start_soon(revert_cells_color, unflash)
# initial coloring
to_sort = set()
for quote in first_quotes:
row = table.get_row(quote['symbol'])
row.update(quote)
color_row(row, quote, {})
to_sort.add(row.widget)
table.render_rows(to_sort)
log.debug("Finished initializing update loop")
task_status.started()
# real-time cell update loop
async for quotes in agen: # new quotes data only
to_sort = set()
for symbol, quote in quotes.items():
row = table.get_row(symbol)
# don't red/green the header cell in ``row.update()``
quote.pop('symbol')
quote.pop('key')
# determine if sorting should happen
sort_key = table.sort_key
last = row.get_field(sort_key)
new = quote.get(sort_key, last)
if new != last:
to_sort.add(row.widget)
# update and color
cells = row.update(quote)
color_row(row, quote, cells)
if to_sort:
table.render_rows(to_sort)
log.debug("Waiting on quotes")
log.warn("Data feed connection dropped")
_widgets = {}
async def stream_symbol_selection():
"""An RPC async gen for streaming the symbol corresponding
value corresponding to the last clicked row.
Essentially of an event stream of clicked symbol values.
"""
global _widgets
table = _widgets['table']
send_chan, recv_chan = trio.open_memory_channel(0)
table._click_queues.append(send_chan)
try:
async with recv_chan:
async for symbol in recv_chan:
yield symbol
finally:
table._click_queues.remove(send_chan)
async def _async_main(
name: str,
portal: tractor._portal.Portal,
symbols: List[str],
brokermod: ModuleType,
loglevel: str = 'info',
rate: int = 3,
test: str = '',
) -> None:
'''Launch kivy app + all other related tasks.
This is started with cli cmd `piker monitor`.
'''
feed = DataFeed(portal, brokermod)
async with feed.open_stream(
symbols,
'stock',
rate=rate,
test=test,
) as (quote_gen, first_quotes):
first_quotes_list = list(first_quotes.copy().values())
quotes = list(first_quotes.copy().values())
# build out UI
Window.set_title(f"monitor: {name}\t(press ? for help)")
Builder.load_string(_kv)
box = BoxLayout(orientation='vertical', spacing=0)
# define bid-ask "stacked" cells
# (TODO: needs some rethinking and renaming for sure)
bidasks = brokermod._stock_bidasks
# add header row
headers = list(first_quotes_list[0].keys())
headers.remove('displayable')
header = Row(
{key: key for key in headers},
headers=headers,
bidasks=bidasks,
is_header=True,
size_hint=(1, None),
)
box.add_widget(header)
# build table
table = TickerTable(
cols=1,
size_hint=(1, None),
)
for ticker_record in first_quotes_list:
symbol = ticker_record['symbol']
table.append_row(
symbol,
Row(
ticker_record,
headers=('symbol',),
bidasks=bidasks,
no_cell=('displayable',),
table=table
)
)
table.last_clicked_row = next(iter(table.symbols2rows.values()))
# associate the col headers row with the ticker table even though
# they're technically wrapped separately in a containing BoxLayout
header.table = table
# mark the initial sorted column header as bold and underlined
sort_cell = header.get_cell(table.sort_key)
sort_cell.bold = sort_cell.underline = True
table.last_clicked_col_cell = sort_cell
# set up a pager view for large ticker lists
table.bind(minimum_height=table.setter('height'))
async def spawn_opts_chain():
"""Spawn an options chain UI in a new subactor.
"""
from .option_chain import _async_main
try:
async with tractor.open_nursery() as tn:
portal = await tn.run_in_actor(
'optschain',
_async_main,
symbol=table.last_clicked_row._last_record['symbol'],
brokername=brokermod.name,
loglevel=tractor.log.get_loglevel(),
)
except tractor.RemoteActorError:
# don't allow option chain errors to crash this monitor
# this is, like, the most basic of resiliency policies
log.exception(f"{portal.actor.name} crashed:")
async with trio.open_nursery() as nursery:
pager = PagerView(
container=box,
contained=table,
nursery=nursery,
# spawn an option chain on 'o' keybinding
kbctls={('o',): spawn_opts_chain},
)
box.add_widget(pager)
widgets = {
'root': box,
'table': table,
'box': box,
'header': header,
'pager': pager,
}
global _widgets
_widgets = widgets
nursery.start_soon(
update_quotes,
nursery,
brokermod.format_stock_quote,
widgets,
quote_gen,
feed._symbol_data_cache,
quotes
)
try:
await async_runTouchApp(widgets['root'])
finally:
# cancel remote data feed task
await quote_gen.aclose()
# cancel GUI update task
nursery.cancel_scope.cancel()

View File

@ -11,6 +11,7 @@ from kivy.properties import BooleanProperty, ObjectProperty
 from kivy.core.window import Window
 from kivy.clock import Clock
 from ...log import get_logger
@ -99,8 +100,7 @@ class MouseOverBehavior(object):
     # throttle at 10ms latency
     @triggered(timeout=0.01, interval=False)
     def _on_mouse_pos(cls, *args):
-        log.debug(
-            f"{cls} time since last call: {time.time() - cls._last_time}")
+        log.debug(f"{cls} time since last call: {time.time() - cls._last_time}")
         cls._last_time = time.time()
     # XXX: how to still do this at the class level?
     # don't proceed if I'm not displayed <=> If have no parent

View File

@ -61,10 +61,10 @@ class AsyncCallbackQueue(object):
         return self
     async def __anext__(self):
-        self.event = async_lib.Event()
+        self.event.clear()
         while not self.callback_result and not self.quit:
             await self.event.wait()
-            self.event = async_lib.Event()
+            self.event.clear()
         if self.callback_result:
             return self.callback_result.popleft()
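# NOTE on the hunk above: modern ``trio.Event`` has no ``.clear()``
# (it was removed upstream), so the deleted re-create-the-event style
# is the trio-portable pattern; ``.clear()`` only works where
# ``async_lib`` resolves to eg. ``asyncio.Event``.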

Some files were not shown because too many files have changed in this diff