
Feat/mymagicai #10750

Merged: 10 commits into run-llama:main on Feb 19, 2024

Conversation

vitali87 (Contributor)

Description

MyMagic AI provides batch inference at scale. This PR wraps MyMagic AI's API as a new LlamaIndex integration.
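A rough usage sketch of what the wrapper might look like (the import path, constructor arguments, and call shape below are illustrative assumptions, not the confirmed API of this PR):

```python
# Illustrative sketch only: names and parameters are assumptions.
from llama_index.llms.mymagic import MyMagicAI  # assumed import path for the new package

llm = MyMagicAI(
    api_key="your-personal-access-token",  # MyMagic AI token (placeholder)
    storage_provider="s3",                 # assumed: where the batch inputs live
    bucket_name="my-input-bucket",         # assumed: bucket holding the input files
)

# Assumed call shape: run a batch completion over the files in the bucket.
response = llm.complete("Summarize each document in one sentence.")
print(response)
```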

Type of Change

  • New feature (non-breaking change which adds functionality)
  • This change requires a documentation update

How Has This Been Tested?

  • Called the integrated MyMagic API with test user
  • Added new notebook (that tests end-to-end)

Suggested Checklist:

  • I have performed a self-review of my own code
  • I have made corresponding changes to the documentation
  • I have added Google Colab support for the newly added notebooks.
  • My changes generate no new warnings
  • I ran make format; make lint to appease the lint gods

@dosubot dosubot bot added the size:XL This PR changes 500-999 lines, ignoring generated files. label Feb 15, 2024

Collaborator

The notebook has a few error outputs

You can install the package with pip install -e <path to root dir of mymagic> and then make sure your notebook venv is the same env where you installed it.

If you can't get that to work, let's remove the notebook outputs altogether
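(A quick, generic way to confirm the notebook kernel is using the environment where the editable install landed; the module name below is an assumption about the new package:)

```python
# Run inside the notebook to check which interpreter the kernel uses
# and whether the locally installed package is visible from it.
import sys

print(sys.executable)  # should point into the venv where `pip install -e ...` was run

try:
    import llama_index.llms.mymagic  # assumed module name for the new integration
    print("mymagic integration found at:", llama_index.llms.mymagic.__file__)
except ImportError as exc:
    print("not visible from this kernel:", exc)
```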

vitali87 (Contributor Author)

OK, I will try to make it work with the venv, and if that fails, I will remove the outputs.

vitali87 (Contributor Author)

Actually, since users need their personal access token to run the notebook, there is no point in showing any output. Let's go with the second option, i.e. removing the outputs from the notebook.
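(For reference, one way to strip the outputs before committing is with nbformat; a minimal sketch, with the notebook path as a placeholder. `jupyter nbconvert --clear-output --inplace <notebook>` achieves the same thing.)

```python
import nbformat

path = "mymagic.ipynb"  # placeholder: path to the integration notebook
nb = nbformat.read(path, as_version=4)

# Drop outputs and execution counts from every code cell.
for cell in nb.cells:
    if cell.cell_type == "code":
        cell["outputs"] = []
        cell["execution_count"] = None

nbformat.write(nb, path)
```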

@logan-markewich (Collaborator)

I'm not able to contribute to this PR, but are you able to install pants and run pants tailor :: ? This will insert the two missing BUILD files whose absence is causing CI to fail.

(We need to update the CLI package init tool to insert these 😅)

@vitali87 (Contributor Author)

Sure, I will install and run pants.
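(For reference, the fix is just the command quoted above run from the repository root; the Python wrapper below is purely illustrative and equivalent to typing the command in a shell:)

```python
import subprocess

# Run `pants tailor ::` from the repo root to generate any missing BUILD files.
subprocess.run(["pants", "tailor", "::"], check=True, cwd=".")
```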

@dosubot dosubot bot added size:L This PR changes 100-499 lines, ignoring generated files. and removed size:XL This PR changes 500-999 lines, ignoring generated files. labels Feb 19, 2024
@logan-markewich logan-markewich enabled auto-merge (squash) February 19, 2024 14:45
@logan-markewich logan-markewich merged commit 069f2c1 into run-llama:main Feb 19, 2024
8 checks passed
@dosubot dosubot bot added the lgtm This PR has been approved by a maintainer label Feb 19, 2024
Dominastorm pushed a commit to uptrain-ai/llama_index that referenced this pull request Feb 28, 2024
anoopshrma pushed a commit to anoopshrma/llama_index that referenced this pull request Mar 2, 2024
Izukimat pushed a commit to Izukimat/llama_index that referenced this pull request Mar 29, 2024