Commit

Reinit
SpyderRex committed Sep 14, 2024
0 parents commit cae1a86
Showing 256 changed files with 14,095 additions and 0 deletions.
11 changes: 11 additions & 0 deletions .env.template
@@ -0,0 +1,11 @@
GROQ_API_KEY=

SERPAPI_API_KEY=

SQUADAI_STORAGE_DIR=Storage

GROQ_MODEL_NAME=llama-3.1-70b-versatile

WOLFRAM_ALPHA_APPID=

FIRECRAWL_API_KEY=
9 changes: 9 additions & 0 deletions .gitignore
@@ -0,0 +1,9 @@
__pycache__/

.env

Workspace/

venv/

find_os.py
21 changes: 21 additions & 0 deletions LICENSE.txt
@@ -0,0 +1,21 @@
MIT License

Copyright (c) 2024 Spyder Rex

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
73 changes: 73 additions & 0 deletions README.md
@@ -0,0 +1,73 @@
# SquadAI

SquadAI is an autonomous agent program based on CrewAI, but it is intended to be used as a standalone program like AutoGPT rather than as a package. It uses the open-source Llama3 model via the Groq API rather than OpenAI's models.

## Features
- **Llama3 Model Integration**: Utilizes the Llama3 model via Groq API, providing a free alternative to other AI models.
- **Lightweight Design**: Built to be simple and easy to understand, making it accessible to developers at any level.
- **Open-Source Focus**: Aiming to attract contributors to help develop and enhance the project.
- **LangChain Tool Access**: Agents can use LangChain tools, which are initialized through the tool_reg module.

## Getting Started

### Prerequisites
Ensure you have Python installed on your system. You can check by running:
```bash
python --version
```
or
```bash
python3 --version
```

You will also need to go to Groq Cloud and get a Groq API key. Rename .env.template to .env and add your API key. Do the same for the WolframAlpha API key, and consider adding a SerpApi API key to the .env file as well. You will also need a Firecrawl API key for the web scraper tool. As this project grows, more API keys will probably be needed, but I intend to keep everything free and open source. A minimal setup is sketched below.
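
For example, a minimal setup (a sketch assuming a Unix-like shell; the key values are placeholders) looks like this:
```bash
cp .env.template .env
# Then edit .env and fill in your keys, for example:
# GROQ_API_KEY=your-groq-key
# WOLFRAM_ALPHA_APPID=your-wolfram-appid
# SERPAPI_API_KEY=your-serpapi-key
# FIRECRAWL_API_KEY=your-firecrawl-key
```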

### Installation
1. Clone the Repository:
```bash
git clone https://github.com/SpyderRex/SquadAI.git
cd SquadAI
```

2. Install the Requirements:
Install the necessary dependencies using pip:
```bash
pip install -r requirements.txt
```

## Usage
To run SquadAI, simply execute the following command in your terminal:
```bash
python3 main.py
```
You will be prompted to provide a goal, and SquadAI will then work toward completing it. Under the hood, main.py sends your goal to the Llama3 model via the Groq API to generate the squad configuration; a simplified sketch of that call is shown below.
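
This sketch is adapted from main.py and is illustrative only (it assumes GROQ_API_KEY is set in your .env):
```python
import os
from dotenv import load_dotenv
from groq import Groq

load_dotenv()
client = Groq(api_key=os.getenv("GROQ_API_KEY"))

# Same call pattern main.py uses to generate the squad configuration
response = client.chat.completions.create(
    model="llama-3.1-70b-versatile",
    messages=[{"role": "user", "content": "Create a SquadAI configuration for the following goal: <your goal>"}],
)
print(response.choices[0].message.content)
```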

Alternatively, you can create a project in the same way that crewAI does:
```bash
python3 -m squadai create squad test_squad
```

## squadai_tools
The original crewAI project has a separate package, crewai-tools, that must be installed on its own. I have added this functionality within the project itself, in a module called squadai_tools. This is separate from the tool_reg directory, which initializes the LangChain tools for the agents; the sketch below illustrates the two import paths.
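
As an illustrative sketch (import paths taken from main.py; the set of available tools may differ):
```python
# Bundled, crewai-tools-style tools live in squadai.squadai_tools
from squadai.squadai_tools import FileWriterTool

# LangChain-backed tools are looked up through the tool_reg registry;
# importing the tool modules (as main.py does) presumably registers them
from tool_reg import tool_registry
import tool_reg.search_tools

file_writer = FileWriterTool()
duckduckgo = tool_registry.get("duckduckgo")
```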

## Contributing
Contributions are what make the open-source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.

Obviously this is a work in progress and an experiment in autonomous agent programs using free, open-source models. More tools and functionality will be added as the project grows.

1. Fork the Project
2. Create your Feature Branch (`git checkout -b feature/AmazingFeature`)
3. Commit your Changes (`git commit -m 'Add some AmazingFeature'`)
4. Push to the Branch (`git push origin feature/AmazingFeature`)
5. Open a Pull Request

## License
Distributed under the MIT License. See `LICENSE.txt` for more information.

## Contact
Spyder Rex - [email protected]

Project Link: https://github.com/SpyderRex/SquadAI

## Donating
If you wish to donate financially to this project, you can do so [here](https://www.paypal.com/donate/?hosted_button_id=N8HR4SN2J6FPG).
207 changes: 207 additions & 0 deletions main.py
@@ -0,0 +1,207 @@
import os
import json
from typing import List, Dict, Any
from squadai import Agent, Task, Squad, Process
from squadai.squadai_tools import FileWriterTool
from langchain_community.tools import DuckDuckGoSearchRun
from langchain_community.tools import WikipediaQueryRun
from langchain_community.utilities import WikipediaAPIWrapper
from groq import Groq
from dotenv import load_dotenv
from tool_reg import tool_registry
import tool_reg.search_tools
import tool_reg.file_tools
import tool_reg.info_tools
from tool_reg.scrape_tools import BrowserTools

load_dotenv()

# Initialize tools
duckduckgo_tool = tool_registry.get("duckduckgo")
wikipedia_tool = tool_registry.get("wikipedia")
wolframalpha_tool = tool_registry.get("wolframalpha")
write_file_tool = tool_registry.get("write_file")
read_file_tool = tool_registry.get("read_file")
list_directory_tool = tool_registry.get("list_directory")
copy_file_tool = tool_registry.get("copy_file")
delete_file_tool = tool_registry.get("delete_file")
file_search_tool = tool_registry.get("file_search")
move_file_tool = tool_registry.get("move_file")
scrape_tool = BrowserTools.scrape_and_summarize_website

# Set up Groq API (make sure to set your API key in the environment variables)
groq_api_key = os.getenv("GROQ_API_KEY")
client = Groq(api_key=groq_api_key)

def get_squad_config(user_prompt: str) -> Dict[str, Any]:
"""
Use Groq's API to generate a SquadAI configuration based on the user's prompt.
"""
with open("system_message.txt", "r") as f:
system_message = f.read()

response = client.chat.completions.create(
model="llama-3.1-70b-versatile",
messages=[
{"role": "system", "content": system_message},
{"role": "user", "content": f"Create a SquadAI configuration for the following goal: {user_prompt}"}
]
)

llm_response = response.choices[0].message.content
print("Raw LLM response:", llm_response)

# Remove backticks if present
llm_response = llm_response.strip('`')
if llm_response.startswith('json'):
llm_response = llm_response[4:].strip()

try:
config = json.loads(llm_response)
except json.JSONDecodeError:
print("Error: Invalid JSON. Attempting to fix...")
config = fix_json(llm_response)

return config

def fix_json(invalid_json: str) -> Dict[str, Any]:
"""
Attempt to fix invalid JSON by sending it back to the LLM for correction.
"""
system_message = """
The following JSON is invalid. Please correct any syntax errors and return a valid JSON object.
Only respond with the corrected JSON, nothing else.
"""

response = client.chat.completions.create(
model="llama-3.1-70b-versatile",
messages=[
{"role": "system", "content": system_message},
{"role": "user", "content": invalid_json}
]
)

corrected_json = response.choices[0].message.content
corrected_json = corrected_json.strip('`')
if corrected_json.startswith('json'):
corrected_json = corrected_json[4:].strip()

try:
return json.loads(corrected_json)
except json.JSONDecodeError:
raise ValueError("Unable to generate valid JSON configuration. Please try again with a different prompt.")

def create_agent(agent_config: Dict[str, Any]) -> Agent:
"""
Create an Agent instance from a configuration dictionary.
"""
tools = []
if "duckduckgo_tool" in agent_config["tools"]:
tools.append(duckduckgo_tool)
if "wikipedia_tool" in agent_config["tools"]:
tools.append(wikipedia_tool)
if "wolframalpha_tool" in agent_config["tools"]:
tools.append(wolframalpha_tool)
if "write_file_tool" in agent_config["tools"]:
tools.append(write_file_tool)
if "read_file_tool" in agent_config["tools"]:
tools.append(read_file_tool)
if "list_directory_tool" in agent_config["tools"]:
tools.append(list_directory_tool)
if "copy_file_tool" in agent_config["tools"]:
tools.append(copy_file_tool)
if "delete_file_tool" in agent_config["tools"]:
tools.append(delete_file_tool)
if "file_search_tool" in agent_config["tools"]:
tools.append(file_search_tool)
if "move_file_tool" in agent_config["tools"]:
tools.append(move_file_tool)
if "scrape_tool" in agent_config["tools"]:
tools.append(scrape_tool)

return Agent(
role=agent_config["role"],
goal=agent_config["goal"],
backstory=agent_config["backstory"],
verbose=agent_config["verbose"],
allow_delegation=agent_config["allow_delegation"],
tools=tools
)

def create_task(task_config: Dict[str,Any], agents: List[Agent]) -> Task:
"""
Create a Task instance from a configuration dictionary and a list of available agents.
"""
agent = next(agent for agent in agents if agent.role == task_config["agent"])
return Task(
description=task_config["description"],
expected_output=task_config["expected_output"],
agent=agent
)

def create_squad(squad_config: Dict[str, Any], agents: List[Agent], tasks: List[Task]) -> Squad:
"""
Create a Squad instance from configuration dictionary, a list of available agents, and a list of tasks.
"""
squad_agents = [next(agent for agent in agents if agent.role == role) for role in squad_config["agents"]]
squad_tasks = [next(task for task in tasks if task.description == desc) for desc in squad_config["tasks"]]

manager = None
if squad_config["process"] == "hierarchical":
manager = Agent(
role="Project Manager",
goal="Efficiently manage the squad and ensure high-quality task completion",
backstory="You're an experienced project manager, skilled in overseeing complex projects and guiding teams to success. Your role is to coordinate the efforts of the squad members, ensuring that each task is completed on time and to the highest standard.",
allow_delegation=True,
verbose=True
)

return Squad(
name=squad_config["name"],
agents=squad_agents,
tasks=squad_tasks,
process=Process.sequential if squad_config["process"] == "sequential" else Process.hierarchical,
memory=True,
embedder={
"provider": "cohere",
"config": {
"model": "embed-english-v3.0", "vector_dimension": 1024
}
},
verbose=squad_config["verbose"],
manager_agent=manager
)

def run_squad(config: Dict[str, Any], user_prompt: str) -> str:
"""
Run the squad based on the configuration.
"""
agents = [create_agent(agent_config) for agent_config in config["agents"]]
tasks = [create_task(task_config, agents) for task_config in config["tasks"]]
squads = [create_squad(squad_config, agents, tasks) for squad_config in config["squads"]]

result = "Running squads:\n"
for squad in squads:
squad_result = squad.kickoff()
result += f"\n{squad.name}: {squad_result}"
return result

def run_dynamic_squad(user_prompt: str) -> str:
"""
Run a dynamically created squadAI based on the user's prompt.
"""
config = get_squad_config(user_prompt)
return run_squad(config, user_prompt)

def main():
user_prompt = input("Enter your goal for squadAI: ")
try:
result = run_dynamic_squad(user_prompt)
print(result)
except Exception as e:
print(f"An error occurred: {str(e)}")
print("Please try again with a different prompt or check your configuration.")


if __name__ == "__main__":
main()