Use DAQ API instead of direct database calls #38
base: master
Conversation
…ications to ensure they were successful and throw if they weren't
Awesome! On point 1: we typically run cax in a cron-job style so that if cax updates, the new one is run. Therefore, the timeouts shouldn't be a problem. On point 2: is the line number right? Are you referring to: Line 82 in 906536f
The main issue I was trying to avoid is dropping data that was already dropped. However, within clear you're doing a pull, so even if you have two cax instances running, the first will remove the entry and the second will simply have no operation to perform, even though the search matched. Can I get the API keys from you? I'll test at Chicago and Datamanager. @XeBoris do you want to test this there after I do my tests, or just let it deploy there?
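The concurrency argument above can be sketched in a few lines. This is an illustration of the logic only, not the actual cax/DAQ API: the point is that the "pull" reports whether anything was actually removed, so a second concurrent instance acting on the same matched run becomes a no-op.

```python
def pull_location(doc, host):
    """Remove `host` from a run document's data locations.

    Returns True only if an entry was actually removed. Because the
    removal and the check are one operation, a second cax instance
    that matched the same run simply finds nothing left to pull.
    (`doc` / `host` names are illustrative, not the real cax schema.)
    """
    before = len(doc["data"])
    doc["data"] = [d for d in doc["data"] if d["host"] != host]
    return len(doc["data"]) < before


# First instance removes the entry; the second sees no-op:
run_doc = {"data": [{"host": "midway"}, {"host": "stockholm"}]}
pull_location(run_doc, "midway")   # True: entry removed
pull_location(run_doc, "midway")   # False: nothing left to remove
```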
API keys sent for midway. We can decide later how we want to organize keys; for now, I guess one per site.
If it works for you, I don't see a reason why it should not work for me at PDC. But I can do a test before deploying it to Stockholm. Just notify me after your test, before deploying, and I will kill my own update manager so that I can test cax first.
@XeBoris yeah, you will need keys. Maybe it's good to issue one per analysis site? I'm not sure. Send me a PM when you want to test and I'll generate a user and API key for you.
# Conflicts: # cax/tasks/checksum.py # cax/tasks/clear.py
# Conflicts: # cax/tasks/checksum.py # cax/tasks/clear.py # cax/tasks/process.py
Hmm... I created a conflict. How do I fix the clear conflict with your API, @coderdj? Then we can merge.
I made a few changes to process.py so that cax-process works with API calls (there were a few minor errors due to _process() being defined outside of the ProcessBatchQueue class). I successfully tested it on one run. I'm not sure about the unresolved conflicts that still seem to be present.
This update replaces most (maybe all) pymongo query and update calls with API methods over the DAQ gateway. It adds a new class (api.py) that requires API credentials to function: you must set API_USER and API_KEY as environment variables, and failing to provide these will raise an exception. The whole point of this is to be able to run cax at non-whitelisted sites, like the GRID.
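The credential check described above might look roughly like the following. The API_USER and API_KEY variable names are from this PR; the function name and the exception type are illustrative assumptions, not the actual api.py code.

```python
import os


def load_api_credentials():
    """Read DAQ API credentials from the environment.

    API_USER / API_KEY are the variables this PR requires; raising
    here (exception type is an assumption) stops cax from silently
    falling back to direct database access at non-whitelisted sites.
    """
    user = os.environ.get("API_USER")
    key = os.environ.get("API_KEY")
    if not user or not key:
        raise RuntimeError(
            "API_USER and API_KEY environment variables must be set "
            "to talk to the runs database via the DAQ API."
        )
    return user, key
```

A deployment would typically export these in the cron job's environment (e.g. `export API_USER=... API_KEY=...` before invoking cax), one key per site as discussed above.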
This has been tested for basic transfers only. It would be nice to have some help making sure processing works correctly too.
There are two points I was unsure of.