https://github.com/compas-dev/compas_cloud

COMPAS Remote Procedure Calls using websockets


Science Score: 36.0%

This score indicates how likely this project is to be science-related, based on the following indicators (a hedged aggregation sketch follows the list):

  • CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
    Found .zenodo.json file
  • DOI references
  • Academic publication links
  • Committers with academic emails
    2 of 5 committers (40.0%) from academic institutions
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (12.7%) to scientific vocabulary
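
For illustration only, a minimal sketch of how an indicator-based score like this could be aggregated, assuming equal weights per indicator; the actual formula and weights behind the 36.0% figure are not documented on this page, and the 0.0 values below are illustrative assumptions for indicators the page does not annotate:

```python
# hypothetical equal-weight aggregation of the indicators listed above;
# the real scoring formula behind the 36.0% figure is not documented here
indicators = {
    'citation_cff_file': 0.0,        # assumed not found
    'codemeta_json_file': 1.0,       # found
    'zenodo_json_file': 1.0,         # found
    'doi_references': 0.0,           # assumed not found
    'publication_links': 0.0,        # assumed not found
    'academic_committers': 2 / 5,    # 2 of 5 committers from academic institutions
    'institutional_owner': 0.0,      # assumed not found
    'joss_paper_metadata': 0.0,      # assumed not found
    'vocabulary_similarity': 0.127,  # low similarity to scientific vocabulary
}

score = sum(indicators.values()) / len(indicators)
print(f'science score: {score:.1%}')
```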

Keywords

remote-procedure-calls sessions websockets

Keywords from Contributors

form-finding labels polyhedral-diagrams graphic-statics viewer cad thrust-network-analysis funicular-structures rbe prd
Last synced: 5 months ago

Repository

COMPAS Remote Procedure Calls using websockets

Basic Info
  • Host: GitHub
  • Owner: compas-dev
  • License: mit
  • Language: Python
  • Default Branch: main
  • Homepage:
  • Size: 237 KB
Statistics
  • Stars: 6
  • Watchers: 4
  • Forks: 2
  • Open Issues: 9
  • Releases: 13
Topics
remote-procedure-calls sessions websockets
Created over 6 years ago · Last pushed over 1 year ago
Metadata Files
Readme Changelog License

README.md

compas_cloud

compas_cloud is a further development of the compas.rpc module. It uses websockets instead of RESTful APIs to allow bi-directional communication between front-end programs such as Rhino, Grasshopper, RhinoVault2, Blender, or web-based viewers, which are implemented in different environments including CPython, IronPython, and JavaScript. It also allows saving certain variables to the backend inside a user session to avoid the overhead created by redundant data transfers.

Installation

Install from source

```bash
git clone https://github.com/BlockResearchGroup/compas_cloud.git
cd compas_cloud
pip install -e .
```
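
The package is also published on conda-forge (see the Packages section further down). Assuming a conda environment, installation from that channel should look like the following; the channel and package name are taken from the conda-forge listing below:

```bash
conda install -c conda-forge compas_cloud
```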

Install for Rhino

```bash
python -m compas_rhino.install -p compas_cloud
```

Using Proxy

Running the server:

  1. Start it from the command line: `python -m compas_cloud.server`
  2. Alternatively, the proxy will automatically start a server in the background if there isn't one to connect to. A server started this way keeps running in the background and is reconnected to if a new proxy is created later (see the sketch below).
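
As a minimal sketch of this behaviour from Python, using only the Proxy calls that also appear in the server-control example further down:

```python
from compas_cloud import Proxy

# creating a Proxy connects to an already running server,
# or starts one in the background if none is found
proxy = Proxy()

# verify that the proxy is connected to a healthy server
print(proxy.check())

# shut the background server down when it is no longer needed
proxy.shutdown()
```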

Basic Usage

One of the main purposes of compas_cloud is to allow the use of full COMPAS functionality in more restricted environments such as IronPython. The following example shows how to call a numpy-based COMPAS function through a proxy, which can be run in software like Rhino:
basic.py

```python
from compas_cloud import Proxy
from compas.geometry import Translation

proxy = Proxy()
# create a proxy function
transform_points_numpy = proxy.function('compas.geometry.transform_points_numpy')

pts = [[0, 0, 0], [1, 0, 0]]
T = Translation([100, 0, 0]).matrix
result = transform_points_numpy(pts, T)  # call the function through the proxy
print(result)
# will print: [[100.0, 0.0, 0.0], [101.0, 0.0, 0.0]]
```

Caching

compas_cloud allows caching data or function outputs on the server side instead of sending them to the front end every time. This can vastly improve performance for long iterative operations that involve large amounts of input and output data.

caching.py

```python
from compas_cloud import Proxy
from compas.geometry import Translation

# CACHING INPUT PARAMETERS

proxy = Proxy()
# create a proxy function
transform_points_numpy = proxy.function('compas.geometry.transform_points_numpy')

pts = [[0, 0, 0], [1, 0, 0]]
pts_cache = proxy.cache(pts)  # cache the object on the server side and return its reference
print(pts_cache)  # will print: {'cached': some_unique_id}

T = Translation([100, 0, 0]).matrix
result = transform_points_numpy(pts_cache, T)  # call the function through the proxy
print(result)  # will print: [[100.0, 0.0, 0.0], [101.0, 0.0, 0.0]]

# CACHING RETURNED DATA

# this function will now return a cache object instead of the actual data
transform_points_numpy = proxy.function('compas.geometry.transform_points_numpy', cache=True)

pts = [[0, 0, 0], [1, 0, 0]]
pts_cache = proxy.cache(pts)
print(pts_cache)  # will print: {'cached': some_unique_id}

T = Translation([100, 0, 0]).matrix
result_cache = transform_points_numpy(pts_cache, T)  # call the function through the proxy
print(result_cache)  # will print: {'cached': some_unique_id}

result = proxy.get(result_cache)  # fetch the actual data behind the cache object
print(result)  # will print: [[100.0, 0.0, 0.0], [101.0, 0.0, 0.0]]
```

Server control

Users can restart, check, or shut down a connected server from the proxy with the commands in the following example:

server_control.py

```python
from compas_cloud import Proxy
import time

print("\n starting a new Proxy, which by default starts a server in the background")
proxy = Proxy(background=True)
time.sleep(3)

print("\n restarting the background server and opening a new one in a prompt console")
proxy.background = False
proxy.restart()
time.sleep(3)

print("\n checking whether the proxy is healthily connected to the server")
print(proxy.check())
time.sleep(3)

print("\n shutting down the server and quitting the program")
proxy.shutdown()
time.sleep(3)
```

Other Examples

A benchmark test comparing pure Python against numpy with caching, transforming 10k points 100 times (a hedged sketch of such a comparison follows the command):

```bash
python examples/benchmark.py
```
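
The bundled script is the reference; below is only a rough sketch of what such a comparison can look like, built from the `proxy.function`, `proxy.cache`, and `proxy.get` calls introduced above. The actual contents of `examples/benchmark.py` may differ.

```python
import time

from compas.geometry import Translation, transform_points
from compas_cloud import Proxy

proxy = Proxy()
# numpy-based function on the server, returning cached references instead of data
transform_points_numpy = proxy.function('compas.geometry.transform_points_numpy', cache=True)

pts = [[float(i), 0.0, 0.0] for i in range(10000)]
T = Translation([1, 0, 0]).matrix

# pure Python, entirely local
start = time.time()
result = pts
for _ in range(100):
    result = transform_points(result, T)
print('pure python:', time.time() - start)

# numpy through the proxy, keeping intermediate results cached on the server
start = time.time()
result_cache = proxy.cache(pts)
for _ in range(100):
    result_cache = transform_points_numpy(result_cache, T)
result = proxy.get(result_cache)
print('numpy + caching:', time.time() - start)
```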

Iterative plotting example with callbacks:
```bash
python examples/dr_numpy.py
```

Using non-COMPAS packages like numpy with IronPython: run `examples/example_numpy.py` in Rhino.

Using Sessions (currently only works on macOS/Linux)

compas_cloud.Sessions is a task-manager class that helps execute a batch of long-lasting tasks such as FEA and DEM simulations. It creates a queue of tasks and a collection of workers that execute the tasks in parallel and save the program logs to their corresponding locations. Sessions can be run either locally or on a background server through a Proxy.

Examples

Running Sessions Locally:

```bash
python examples/sessions_local.py
```

```python
from compas_cloud import Sessions

# define a pseudo task that will take a few seconds to finish
def func(a):
    import time
    for i in range(a):
        time.sleep(1)
        print('slept ', i, 's')

# initiate a session object, specifying where the logs will be stored and the number of workers
# if no log_path is given, all logs are streamed to the terminal and not saved
# the default worker_num equals the number of CPUs available on the computer
s = Sessions(log_path=None, worker_num=4)

# add several tasks to the session using different parameters
s.add_task(func, 1)
s.add_task(func, 2)
s.add_task(func, 3)
s.add_task(func, 4)
s.add_task(func, 5)

# kick off the tasks and start listening to the events when tasks start or finish
s.start()
s.listen()
```

You should see logs like the following:

```
{'waiting': 5, 'running': 0, 'failed': 0, 'finished': 0, 'total': 5} ________ START
{'waiting': 5, 'running': 0, 'failed': 0, 'finished': 0, 'total': 5} ________ using 4 workers
{'waiting': 5, 'running': 0, 'failed': 0, 'finished': 0, 'total': 5} ________ worker 58884 started
{'waiting': 4, 'running': 1, 'failed': 0, 'finished': 0, 'total': 5} ________ task-0: started
{'waiting': 4, 'running': 1, 'failed': 0, 'finished': 0, 'total': 5} ________ worker 58885 started
{'waiting': 4, 'running': 1, 'failed': 0, 'finished': 0, 'total': 5} ________ task-0: streaming log to temp/task-0.log
{'waiting': 3, 'running': 2, 'failed': 0, 'finished': 0, 'total': 5} ________ task-1: started
...

{'waiting': 0, 'running': 0, 'failed': 0, 'finished': 5, 'total': 5} ________ task-4: finished
{'waiting': 0, 'running': 0, 'failed': 0, 'finished': 5, 'total': 5} ________ worker 58884 terminated
{'waiting': 0, 'running': 0, 'failed': 0, 'finished': 5, 'total': 5} ________ FINISHED
```

Running Sessions With Proxy:

```bash
python examples/sessions_remote.py
```

```python
from compas_cloud import Proxy

# define a pseudo task that will take a few seconds to finish
def func(a):
    import time
    for i in range(a):
        time.sleep(1)
        print('slept ', i, 's')

# initiate a Sessions object through a Proxy that connects to a background server
p = Proxy()
s = p.Sessions()

# add several tasks to the session using different parameters
s.add_task(func, 1)
s.add_task(func, 2)
s.add_task(func, 3)
s.add_task(func, 4)
s.add_task(func, 5)

# kick off the tasks and start listening to the events when tasks start or finish
s.start()
s.listen()
```

You should see the same logs as in the example above.

Owner

  • Name: compas-dev
  • Login: compas-dev
  • Kind: organization

COMPAS - an open source computational framework for research and collaboration in AEC


Committers

Last synced: almost 3 years ago

All Time
  • Total Commits: 142
  • Total Committers: 5
  • Avg Commits per committer: 28.4
  • Development Distribution Score (DDS): 0.092
Past Year
  • Commits: 9
  • Committers: 1
  • Avg Commits per committer: 9.0
  • Development Distribution Score (DDS): 0.0
Top Committers
Name Email Commits
Li l****0@g****m 129
brgcode v****e@a****h 6
Tom Van Mele v****t@e****h 3
dependabot[bot] 4****]@u****m 2
tkmmark m****m@g****m 2

Issues and Pull Requests

Last synced: 7 months ago

All Time
  • Total issues: 12
  • Total pull requests: 9
  • Average time to close issues: about 2 months
  • Average time to close pull requests: 19 days
  • Total issue authors: 4
  • Total pull request authors: 5
  • Average comments per issue: 1.17
  • Average comments per pull request: 0.0
  • Merged pull requests: 7
  • Bot issues: 0
  • Bot pull requests: 2
Past Year
  • Issues: 0
  • Pull requests: 1
  • Average time to close issues: N/A
  • Average time to close pull requests: about 14 hours
  • Issue authors: 0
  • Pull request authors: 1
  • Average comments per issue: 0
  • Average comments per pull request: 0.0
  • Merged pull requests: 0
  • Bot issues: 0
  • Bot pull requests: 0
Top Authors
Issue Authors
  • tkmmark (7)
  • Licini (2)
  • brgcode (2)
  • ZacZhangzhuo (1)
Pull Request Authors
  • Licini (3)
  • petrasvestartas (2)
  • dependabot[bot] (2)
  • tkmmark (2)
  • brgcode (1)
Top Labels
Issue Labels
bug (1) Priority: Low (1) Type: Infrastructure (1)
Pull Request Labels
dependencies (2)

Packages

  • Total packages: 1
  • Total downloads: unknown
  • Total dependent packages: 1
  • Total dependent repositories: 0
  • Total versions: 3
conda-forge.org: compas_cloud
  • Versions: 3
  • Dependent Packages: 1
  • Dependent Repositories: 0
Rankings
Dependent packages count: 28.8%
Dependent repos count: 34.0%
Average: 42.9%
Forks count: 54.2%
Stargazers count: 54.5%
Last synced: 6 months ago

Dependencies

requirements-dev.txt pypi
  • attrs >=17.4 development
  • autopep8 * development
  • bump2version >=0.5.11 development
  • check-manifest >=0.36 development
  • doc8 * development
  • flake8 * development
  • graphviz * development
  • invoke >=0.14 development
  • ipykernel * development
  • ipython >=5.8 development
  • isort * development
  • m2r * development
  • nbsphinx * development
  • pydocstyle * development
  • pytest >=3.2 development
  • sphinx >=3.4 development
  • sphinx_compas_theme >=0.13 development
  • twine * development
requirements.txt pypi
  • autobahn ==20.12.3
  • compas *
  • websockets ==9.1
.github/workflows/build.yml actions
  • actions/checkout v2 composite
  • actions/setup-python v2 composite
  • compas-dev/compas-actions.build v1.1.1 composite
.github/workflows/pr-checks.yml actions
  • Zomzog/changelog-checker v1.2.0 composite
  • actions/checkout v1 composite
.github/workflows/release.yml actions
  • actions/checkout v2 composite
  • actions/create-release v1 composite
  • actions/setup-python v2 composite
  • compas-dev/compas-actions.build v1.1.1 composite
  • compas-dev/compas-actions.publish v1.0.0 composite
  • mindsers/changelog-reader-action v2 composite
setup.py pypi