
Mech tool template

A template for developing mech tools on the Olas Network. Find the documentation here.

What is a mech?

A mech is an autonomous service that listens for on-chain requests and performs the required actions in exchange for a small payment. These requests are usually LLM requests (although they can be other generic jobs). The request metadata is stored on IPFS, and its hash is written to a smart contract that also handles the payment. You can think of a mech as an on-demand brain for your applications.

Why do we need mechs?

Mechs implement various AI-oriented tools and pay for access to private APIs such as the OpenAI API. A mech acts as a central hub or library where your applications can make LLM requests, avoiding the need to pay for multiple APIs or implement different API interfaces. Think of it as a generic interface to multiple LLMs and smart tools.

What does the mech request-response flow look like?

Mech request-response flow

What a mech tool looks like

Mech tools are just Python scripts that are dynamically loaded and executed by the mech. This also means that these scripts can be run locally for testing.
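
For illustration, dynamic loading boils down to importing the tool module at runtime and calling its entry point. A rough sketch (not the mech's actual loader; the module path and prompt are hypothetical placeholders):

import importlib

# Rough illustration only: load a tool module by its dotted path and call its run()
tool_module = importlib.import_module("packages.my_author.customs.my_tool.my_tool")
response, prompt, transaction, cost = tool_module.run(prompt="What is 2 + 2?")
print(response)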

The only requirement a mech tool needs to meet is to implement the following function, which is the one the mech will call:

from typing import Any, Dict, Optional, Tuple

def run(**kwargs) -> Tuple[Optional[str], Optional[Dict[str, Any]], Any, Any]:
    """Run the task.

    Returns:
    - Response to send to the user
    - [Optional] Prompt sent to the model
    - [Optional] Transaction generated by the tool to be executed by the mech
    - [Optional] Cost calculation object
    """

If your tool requires an API key or any other secret, it will be passed in as a keyword argument (kwarg). The mech owner will need to configure the mech service so that this API key is available.
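
As a concrete illustration, a minimal (hypothetical) tool that reads its prompt and a secret from the kwargs could look like this; the key name my_api_key and the echo logic are placeholders, not part of the template:

from typing import Any, Dict, Optional, Tuple


def run(**kwargs) -> Tuple[Optional[str], Optional[Dict[str, Any]], Any, Any]:
    """Echo the prompt back, showing how inputs and secrets arrive as kwargs."""
    prompt = kwargs.get("prompt")
    my_api_key = kwargs.get("my_api_key")  # hypothetical secret configured by the mech owner

    if prompt is None:
        return "No prompt provided", None, None, None

    response = f"Received prompt: {prompt} (api key set: {my_api_key is not None})"
    return response, prompt, None, None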

System requirements

How to run the local tool examples

Create a virtual environment with all development dependencies:

poetry shell
poetry install

Run the demo calculator tool:

python scripts/run_calculator_request.py

Run the prediction tool:

  1. Create a free OpenRouter account and get an API key.

  2. Create a .env file:

    cp sample.env .env

  3. In the .env file, fill in OPENROUTER_API_KEY with your API key (see the sketch after this list).

  4. Run the tool:

    python scripts/run_prediction_tool.py
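
For reference, this is roughly how a script can pick up the key from .env, assuming the python-dotenv package is available (the repository's scripts may load it differently):

import os

from dotenv import load_dotenv  # assumes python-dotenv is installed

load_dotenv()  # reads .env from the current working directory
openrouter_api_key = os.environ["OPENROUTER_API_KEY"]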

How to interact with already deployed mech tools

  1. Prepare your private key. You can either export an existing key from a wallet such as MetaMask and save it to a file called ethereum_private_key.txt:

    echo -n YOUR_PRIVATE_KEY > ethereum_private_key.txt

    Or create a new key (you will need to send some funds to it) by running:

    aea generate-key ethereum
  2. Ensure that you have some funds in your wallet (e.g., 0.05 xDAI if you're running on Gnosis Chain).

  3. Send a mech request using the mechx CLI:

    mechx interact "write a short poem" 6 --key ethereum_private_key.txt --tool openai-gpt-3.5-turbo --chain-config gnosis --confirm on-chain
  4. Send another mech request programmatically:

    python scripts/mech_request.py
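
A rough sketch of what such a programmatic request can look like, assuming the mech-client Python API exposes an interact helper mirroring the CLI options above (check scripts/mech_request.py for the exact call and argument names used in this repo):

from mech_client.interact import ConfirmationType, interact  # assumed API, mirroring the mechx CLI

# Hypothetical parameters mirroring the CLI example above
result = interact(
    prompt="write a short poem",
    agent_id=6,
    tool="openai-gpt-3.5-turbo",
    chain_config="gnosis",
    private_key_path="ethereum_private_key.txt",
    confirmation_type=ConfirmationType.ON_CHAIN,
)
print(result)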

Develop your own tool

  1. Create your tool under packages/<your_author_handle>/customs/<your_tool_name> using the examples as reference.

  2. Calculate your tool hashes. The first time you run this command, you will be asked to add your tool to either the dev or third_party packages section. Use dev:

    autonomy packages lock
    
  3. Test your tool by running it as a Python script and ensuring it does what it is intended to do (see the sketch after this list).

  4. Open a PR against the mech repository and tag an engineer for review.

  5. Once your PR has been approved, merged and deployed to the Mech, you will be able to interact with your tool using the mech-client.
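
For step 3, a minimal way to exercise your tool locally is to import its run() function and call it directly; the import path and prompt below are hypothetical placeholders:

# Hypothetical local test: adjust the import path to match your package layout
from packages.your_author_handle.customs.your_tool_name.your_tool_name import run

response, prompt, transaction, cost = run(prompt="test prompt")
print(response)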