Revert auto-gen readme and merge in auto-gen code blocks by hand for now

readme_pump
Tyler Goodlet 2021-02-21 23:52:41 -05:00
parent 92f4b402ad
commit 0e7db46631
5 changed files with 210 additions and 181 deletions

View File

@@ -1,92 +1,78 @@
tractor
=======
The Python async-native multi-core system *you always wanted*.

|gh_actions|

|docs|

.. _actor model: https://en.wikipedia.org/wiki/Actor_model
.. _trio: https://github.com/python-trio/trio
.. _multi-processing: https://en.wikipedia.org/wiki/Multiprocessing
.. _trionic: https://trio.readthedocs.io/en/latest/design.html#high-level-design-principles
.. _async sandwich: https://trio.readthedocs.io/en/latest/tutorial.html#async-sandwich
.. _structured concurrent: https://trio.discourse.group/t/concise-definition-of-structured-concurrency/228


``tractor`` is a `structured concurrent`_ "`actor model`_" built on trio_ and multi-processing_.

It is an attempt to pair trionic_ `structured concurrency`_ with
distributed Python. You can think of it as a ``trio``
*-across-processes* or simply as an opinionated replacement for the
stdlib's ``multiprocessing`` but built on async programming primitives
from the ground up.

Don't be scared off by this description. ``tractor`` **is just ``trio``**
but with nurseries for process management and cancel-able IPC.
If you understand how to work with ``trio``, ``tractor`` will give you
the parallelism you've been missing.

``tractor``'s nurseries let you spawn ``trio`` *"actors"*: new Python
processes which each run a ``trio`` scheduled task tree (also known as
an `async sandwich`_ - a call to ``trio.run()``). That is, each
"*Actor*" is a new process plus a ``trio`` runtime.

"Actors" communicate by exchanging asynchronous messages_ and avoid
sharing state. The intention of this model is to allow for highly
distributed software that, through the adherence to *structured
concurrency*, results in systems which fail in predictable and
recoverable ways.

The first step to grok ``tractor`` is to get the basics of ``trio`` down.
A great place to start is the `trio docs`_ and this `blog post`_.

.. _messages: https://en.wikipedia.org/wiki/Message_passing
.. _trio docs: https://trio.readthedocs.io/en/latest/
.. _blog post: https://vorpus.org/blog/notes-on-structured-concurrency-or-go-statement-considered-harmful/
.. _structured concurrency: https://vorpus.org/blog/notes-on-structured-concurrency-or-go-statement-considered-harmful/
.. _3 axioms: https://en.wikipedia.org/wiki/Actor_model#Fundamental_concepts
.. _unrequirements: https://en.wikipedia.org/wiki/Actor_model#Direct_communication_and_asynchrony
.. _async generators: https://www.python.org/dev/peps/pep-0525/
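
To make the "nursery + portal" workflow described above concrete, here is a
minimal sketch of spawning a single actor and calling a function inside it.
It only uses the ``open_nursery()``/``start_actor()``/``portal.run()`` calls
that appear in the worker-pool example further down; the ``greet()`` function
and the actor name are made up for illustration.

.. code:: python

    import tractor
    import trio


    async def greet(name: str) -> str:
        # runs inside the spawned sub-process ("actor")
        return f'hello {name} from a subprocess'


    async def main():
        async with tractor.open_nursery() as tn:

            # spawn a sub-actor (new process + trio runtime) which can
            # see this module's functions
            portal = await tn.start_actor(
                'greeter',
                enable_modules=[__name__],
            )

            # run ``greet()`` in the sub-actor and wait for its result
            print(await portal.run(greet, name='trio fans'))

            # tear the sub-actor down
            await tn.cancel()


    if __name__ == '__main__':
        trio.run(main)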
Install
-------
No PyPI release yet!

::

    pip install git+git://github.com/goodboy/tractor.git


Alluring Features
-----------------
- **It's just** ``trio``, but with SC applied to processes (aka "actors")
- Infinitely nestable process trees
- Built-in API for inter-process streaming
- A (first ever?) "native" multi-core debugger for Python using `pdb++`_
- (Soon to land) ``asyncio`` support allowing for "infected" actors where
  `trio` drives the `asyncio` scheduler via the astounding "`guest mode`_"
  (see the sketch just below this list)
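
The "guest mode" mentioned in the last bullet is a ``trio`` feature rather
than anything ``tractor``-specific (yet); a rough sketch of what it looks
like with a plain ``asyncio`` host loop (no ``tractor`` involved, and
``trio_main()`` is a made-up placeholder) is:

.. code:: python

    import asyncio

    import trio


    async def trio_main():
        # stand-in for the trio side of an "infected" actor
        await trio.sleep(1)
        return 'trio ran as a guest of asyncio'


    async def asyncio_main():
        loop = asyncio.get_running_loop()
        done = loop.create_future()

        # start a trio run which shares this thread with the asyncio loop
        trio.lowlevel.start_guest_run(
            trio_main,
            run_sync_soon_threadsafe=loop.call_soon_threadsafe,
            done_callback=lambda outcome: done.set_result(outcome),
        )

        # ``outcome`` is an ``outcome.Outcome``: ``unwrap()`` returns the
        # value or re-raises the error from the trio side
        print((await done).unwrap())


    if __name__ == '__main__':
        asyncio.run(asyncio_main())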

Example: self-destruct a process tree
-------------------------------------

.. code:: python

    """
    Run with a process monitor from a terminal using:
@@ -127,13 +113,154 @@ the "worker pool" pattern is a trivial special case:
    print('Zombies Contained')

The example you're probably after...
------------------------------------
It seems the initial query from most new users is "how do I make a worker
pool thing?".
``tractor`` is built to handle any SC process tree you can
imagine; the "worker pool" pattern is a trivial special case:
.. code:: python
"""
Demonstration of the prime number detector example from the
``concurrent.futures`` docs:
https://docs.python.org/3/library/concurrent.futures.html#processpoolexecutor-example
This uses no extra threads, fancy semaphores or futures; all we need
is ``tractor``'s channels.
"""
from contextlib import asynccontextmanager
from typing import List, Callable
import itertools
import math
import time
import tractor
import trio
from async_generator import aclosing
PRIMES = [
112272535095293,
112582705942171,
112272535095293,
115280095190773,
115797848077099,
1099726899285419,
]
def is_prime(n):
if n < 2:
return False
if n == 2:
return True
if n % 2 == 0:
return False
sqrt_n = int(math.floor(math.sqrt(n)))
for i in range(3, sqrt_n + 1, 2):
if n % i == 0:
return False
return True
@asynccontextmanager
async def worker_pool(workers=4):
"""Though it's a trivial special case for ``tractor``, the well
known "worker pool" seems to be the defacto "but, I want this
process pattern!" for most parallelism pilgrims.
Yes, the workers stay alive (and ready for work) until you close
the context.
"""
async with tractor.open_nursery() as tn:
portals = []
snd_chan, recv_chan = trio.open_memory_channel(len(PRIMES))
for i in range(workers):
# this starts a new sub-actor (process + trio runtime) and
# stores it's "portal" for later use to "submit jobs" (ugh).
portals.append(
await tn.start_actor(
f'worker_{i}',
enable_modules=[__name__],
)
)
async def _map(
worker_func: Callable[[int], bool],
sequence: List[int]
) -> List[bool]:
# define an async (local) task to collect results from workers
async def send_result(func, value, portal):
await snd_chan.send((value, await portal.run(func, n=value)))
async with trio.open_nursery() as n:
for value, portal in zip(sequence, itertools.cycle(portals)):
n.start_soon(
send_result,
worker_func,
value,
portal
)
# deliver results as they arrive
for _ in range(len(sequence)):
yield await recv_chan.receive()
# deliver the parallel "worker mapper" to user code
yield _map
# tear down all "workers" on pool close
await tn.cancel()
async def main():
async with worker_pool() as actor_map:
start = time.time()
async with aclosing(actor_map(is_prime, PRIMES)) as results:
async for number, prime in results:
print(f'{number} is prime: {prime}')
print(f'processing took {time.time() - start} seconds')
if __name__ == '__main__':
start = time.time()
trio.run(main)
print(f'script took {time.time() - start} seconds')
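
For comparison, the ``concurrent.futures`` version that the example's
docstring refers to boils down to roughly the following (a sketch adapted
from the linked stdlib docs, not part of this repo; it is synchronous and
process-pool based rather than built on ``trio`` channels):

.. code:: python

    """
    Rough stdlib equivalent of the pool above, adapted from:
    https://docs.python.org/3/library/concurrent.futures.html#processpoolexecutor-example
    """
    from concurrent.futures import ProcessPoolExecutor
    import math
    import time

    PRIMES = [
        112272535095293,
        112582705942171,
        112272535095293,
        115280095190773,
        115797848077099,
        1099726899285419,
    ]


    def is_prime(n):
        if n < 2:
            return False
        if n == 2:
            return True
        if n % 2 == 0:
            return False
        # check odd divisors up to sqrt(n)
        return all(n % i for i in range(3, math.isqrt(n) + 1, 2))


    def main():
        start = time.time()
        with ProcessPoolExecutor() as executor:
            # fan the work out over a pool of worker processes
            for number, prime in zip(PRIMES, executor.map(is_prime, PRIMES)):
                print(f'{number} is prime: {prime}')
        print(f'processing took {time.time() - start} seconds')


    if __name__ == '__main__':
        main()

Note that ``executor.map()`` yields results in input order, whereas the
``tractor`` pool above delivers each result as soon as its worker finishes.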
Feel like saying hi?
--------------------
This project is very much coupled to the ongoing development of
``trio`` (i.e. ``tractor`` gets most of its ideas from that brilliant
community). If you want to help, have suggestions or just want to
say hi, please feel free to reach us in our `matrix channel`_. If
matrix seems too hip, we're also mostly all in the `trio gitter
channel`_!
.. _trio gitter channel: https://gitter.im/python-trio/general
.. _matrix channel: https://matrix.to/#/!tractor:matrix.org
.. _pdb++: https://github.com/pdbpp/pdbpp
.. _guest mode: https://trio.readthedocs.io/en/stable/reference-lowlevel.html?highlight=guest%20mode#using-guest-mode-to-run-trio-on-top-of-other-event-loops
.. |gh_actions| image:: https://img.shields.io/endpoint.svg?url=https%3A%2F%2Factions-badge.atrox.dev%2Fgoodboy%2Ftractor%2Fbadge&style=popout-square
:target: https://actions-badge.atrox.dev/goodboy/tractor/goto
.. |docs| image:: https://readthedocs.org/projects/tractor/badge/?version=latest
:target: https://tractor.readthedocs.io/en/latest/?badge=latest
:alt: Documentation Status

View File

@@ -1,103 +0,0 @@
tractor
=======
The Python async-native multi-core system *you always wanted*.
|gh_actions|
|docs|
.. _actor model: https://en.wikipedia.org/wiki/Actor_model
.. _trio: https://github.com/python-trio/trio
.. _multi-processing: https://en.wikipedia.org/wiki/Multiprocessing
.. _trionic: https://trio.readthedocs.io/en/latest/design.html#high-level-design-principles
.. _async sandwich: https://trio.readthedocs.io/en/latest/tutorial.html#async-sandwich
.. _structured concurrent: https://trio.discourse.group/t/concise-definition-of-structured-concurrency/228
``tractor`` is a `structured concurrent`_ "`actor model`_" built on trio_ and multi-processing_.
It is an attempt to pair trionic_ `structured concurrency`_ with
distributed Python. You can think of it as a ``trio``
*-across-processes* or simply as an opinionated replacement for the
stdlib's ``multiprocessing`` but built on async programming primitives
from the ground up.
Don't be scared off by this description. ``tractor`` **is just ``trio``**
but with nurseries for process management and cancel-able IPC.
If you understand how to work with ``trio``, ``tractor`` will give you
the parallelism you've been missing.
``tractor``'s nurseries let you spawn ``trio`` *"actors"*: new Python
processes which each run a ``trio`` scheduled task tree (also known as
an `async sandwich`_ - a call to ``trio.run()``). That is, each
"*Actor*" is a new process plus a ``trio`` runtime.
"Actors" communicate by exchanging asynchronous messages_ and avoid
sharing state. The intention of this model is to allow for highly
distributed software that, through the adherence to *structured
concurrency*, results in systems which fail in predictable and
recoverable ways.
The first step to grok ``tractor`` is to get the basics of ``trio`` down.
A great place to start is the `trio docs`_ and this `blog post`_.
.. _messages: https://en.wikipedia.org/wiki/Message_passing
.. _trio docs: https://trio.readthedocs.io/en/latest/
.. _blog post: https://vorpus.org/blog/notes-on-structured-concurrency-or-go-statement-considered-harmful/
.. _structured concurrency: https://vorpus.org/blog/notes-on-structured-concurrency-or-go-statement-considered-harmful/
.. _3 axioms: https://en.wikipedia.org/wiki/Actor_model#Fundamental_concepts
.. _unrequirements: https://en.wikipedia.org/wiki/Actor_model#Direct_communication_and_asynchrony
.. _async generators: https://www.python.org/dev/peps/pep-0525/
Install
-------
No PyPI release yet!
::
pip install git+git://github.com/goodboy/tractor.git
Alluring Features
-----------------
- **It's just** ``trio``, but with SC applied to processes (aka "actors")
- Infinitely nestable process trees
- Built-in API for inter-process streaming
- A (first ever?) "native" multi-core debugger for Python using `pdb++`_
- (Soon to land) ``asyncio`` support allowing for "infected" actors where
`trio` drives the `asyncio` scheduler via the astounding "`guest mode`_"
The example you're probably after...
------------------------------------
It seems the initial query from most new users is "how do I make a worker
pool thing?".
``tractor`` is built to handle any SC process tree you can
imagine; the "worker pool" pattern is a trivial special case:
# TODO: workerpool example
Feel like saying hi?
--------------------
This project is very much coupled to the ongoing development of
``trio`` (i.e. ``tractor`` gets most of its ideas from that brilliant
community). If you want to help, have suggestions or just want to
say hi, please feel free to reach us in our `matrix channel`_. If
matrix seems too hip, we're also mostly all in the `trio gitter
channel`_!
.. _trio gitter channel: https://gitter.im/python-trio/general
.. _matrix channel: https://matrix.to/#/!tractor:matrix.org
.. _pdb++: https://github.com/pdbpp/pdbpp
.. _guest mode: https://trio.readthedocs.io/en/stable/reference-lowlevel.html?highlight=guest%20mode#using-guest-mode-to-run-trio-on-top-of-other-event-loops
.. |gh_actions| image:: https://img.shields.io/endpoint.svg?url=https%3A%2F%2Factions-badge.atrox.dev%2Fgoodboy%2Ftractor%2Fbadge&style=popout-square
:target: https://actions-badge.atrox.dev/goodboy/tractor/goto
.. |docs| image:: https://readthedocs.org/projects/tractor/badge/?version=latest
:target: https://tractor.readthedocs.io/en/latest/?badge=latest
:alt: Documentation Status

View File

@@ -69,6 +69,12 @@ Alluring Features
  `trio` drives the `asyncio` scheduler via the astounding "`guest mode`_"

Example: self-destruct a process tree
-------------------------------------

.. literalinclude:: ../../examples/parallelism/we_are_processes.py
   :language: python

The example you're probably after...
------------------------------------
It seems the initial query from most new users is "how do I make a worker

@@ -77,9 +83,8 @@ pool thing?".
``tractor`` is built to handle any SC process tree you can
imagine; the "worker pool" pattern is a trivial special case:

.. literalinclude:: ../../examples/parallelism/concurrent_actors_primes.py
   :language: python

Feel like saying hi?

View File

@@ -35,9 +35,9 @@ release = '0.0.0a0.dev0'
 # extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
 # ones.
 extensions = [
-    # 'sphinx.ext.autodoc',
-    # 'sphinx.ext.intersphinx',
-    # 'sphinx.ext.todo',
+    'sphinx.ext.autodoc',
+    'sphinx.ext.intersphinx',
+    'sphinx.ext.todo',
     'sphinxcontrib.restbuilder',
 ]

View File

@@ -1,4 +1,4 @@
 #!/bin/bash
 sphinx-build -b rst ./github_readme ./
-mv _sphinx_readme.rst README.rst
+mv _sphinx_readme.rst _README.rst