
Support asynchronous send API #6

Open
rshivane opened this issue Sep 27, 2018 · 0 comments

@rshivane (Contributor) commented:

It would be very convenient to have an asynchronous send client that buffers data points and sends them as a batch, as in the sample below.

The async client could take a parameter (max_datapoints_in_mem) that controls the maximum number of data points to keep in memory; a send is triggered when this limit is reached.

It could take another parameter (max_wait_time) that controls the maximum time to buffer points locally before sending; a send is triggered when this interval elapses. A rough implementation sketch follows the usage sample below.

import random
import time

from apptuit import Apptuit, DataPoint

# AsyncSender is the proposed class; max_wait_time is in seconds
client = AsyncSender(Apptuit(token=token), max_datapoints_in_mem=10000, max_wait_time=5)

metrics = ["proc.cpu.percent", "node.memory.bytes", "network.send.bytes", "network.receive.bytes", "node.load.avg"]
tags = {"host": "localhost", "ip": "127.0.0.1"}
while True:
    curtime = int(time.time())
    for metric in metrics:
        client.send(DataPoint(metric, tags, curtime, random.random()))
    time.sleep(60)
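
A minimal sketch of how such an AsyncSender could work, using the class name and parameters proposed above. The threading approach and the assumption that Apptuit's send() accepts a list of DataPoint objects are illustrative, not a commitment to a particular design:

import threading
import time

class AsyncSender:
    """Buffers data points and sends them as a batch when either the
    buffer-size limit or the wait-time limit is reached (sketch only)."""

    def __init__(self, client, max_datapoints_in_mem=10000, max_wait_time=5):
        self.client = client
        self.max_datapoints_in_mem = max_datapoints_in_mem
        self.max_wait_time = max_wait_time
        self._buffer = []
        self._lock = threading.Lock()
        # Background thread enforces the max_wait_time flush trigger
        flusher = threading.Thread(target=self._flush_periodically, daemon=True)
        flusher.start()

    def send(self, datapoint):
        batch = None
        with self._lock:
            self._buffer.append(datapoint)
            if len(self._buffer) >= self.max_datapoints_in_mem:
                batch, self._buffer = self._buffer, []
        if batch:
            # Flush outside the lock so other senders are not blocked;
            # assumes the underlying client accepts a list of DataPoints
            self.client.send(batch)

    def _flush_periodically(self):
        while True:
            time.sleep(self.max_wait_time)
            with self._lock:
                batch, self._buffer = self._buffer, []
            if batch:
                self.client.send(batch)

A real implementation would also need error handling and retries for failed batches, plus an explicit flush/close method to push any remaining points on shutdown.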
rshivane added the enhancement label on Sep 27, 2018
abhinav-upadhyay self-assigned this on Sep 27, 2018