
ONNX Support #3562

Merged — 91 commits, merged Jul 31, 2023

Commits (changes from all commits)
67d05d2
chore: Update model config type names
blessedcoolant Jun 17, 2023
76dd749
chore: Rebuild API
blessedcoolant Jun 17, 2023
c8dfa49
fix: Update missing name types to new names
blessedcoolant Jun 17, 2023
7a66856
wip: Update Linear UI Txt2Img and Img2Img Graphs
blessedcoolant Jun 17, 2023
16dc78f
Generate config names for openapi
StAlKeR7779 Jun 17, 2023
0f3b7d2
chore: Rebuild API with new Model API names
blessedcoolant Jun 17, 2023
ce4110b
wip: Add 2.x Models to the Model List
blessedcoolant Jun 17, 2023
dc669d1
Add name, base_mode, type fields to model info
StAlKeR7779 Jun 17, 2023
24673fd
chore: Rebuild API - base_model and type added
blessedcoolant Jun 17, 2023
bf0577c
fix: 2.1 models breaking generation
blessedcoolant Jun 17, 2023
61c426f
feat: Enable 2.x Model Generation in Linear UI
blessedcoolant Jun 17, 2023
4133d77
wip: Move Model Selector to own file
blessedcoolant Jun 17, 2023
28373db
cleanup: Updated model slice names to be more descriptive
blessedcoolant Jun 18, 2023
f0bf32c
Merge branch 'main' into model-manager-ui-30
blessedcoolant Jun 18, 2023
e0c105f
feat: Port Schedulers to Mantine
blessedcoolant Jun 18, 2023
9634c96
revert: getModels to receivedModels
blessedcoolant Jun 18, 2023
7c9a939
fix: Unserialization key issue
blessedcoolant Jun 18, 2023
809ec71
fix: Remove type from Model type name
blessedcoolant Jun 18, 2023
9fda21c
Revert "feat: Port Schedulers to Mantine"
blessedcoolant Jun 18, 2023
91016d8
Merge branch 'main' into model-manager-ui-30
blessedcoolant Jun 18, 2023
17e2a35
fix: merge conflicts
blessedcoolant Jun 18, 2023
d493152
Merge branch 'main' into model-manager-ui-30
blessedcoolant Jun 19, 2023
b0c4451
Merge branch 'main' into model-manager-ui-30
blessedcoolant Jun 19, 2023
cfe81b5
fix: Adjust the Schedular select width
blessedcoolant Jun 19, 2023
85b4b35
tweal: UI colors
blessedcoolant Jun 19, 2023
7df7a95
Merge branch 'main' into model-manager-ui-30
blessedcoolant Jun 19, 2023
82b73c5
Remove default model logic
StAlKeR7779 Jun 20, 2023
4cefe37
Rename format to model_format(still named format when work with config)
StAlKeR7779 Jun 20, 2023
46dc751
Update model format field to use enums
StAlKeR7779 Jun 20, 2023
92c86fd
Set model type to const value in openapi schema, add model format enu…
StAlKeR7779 Jun 20, 2023
4d337f6
ONNX Model/runtime first implementation
StAlKeR7779 Jun 20, 2023
7759b3f
Small refactor
StAlKeR7779 Jun 21, 2023
6c7668a
Update onnx model structure, change code according
StAlKeR7779 Jun 22, 2023
bb85608
Merge branch 'main' into feat/onnx
blessedcoolant Jun 22, 2023
0327eae
chore: Regen API
blessedcoolant Jun 22, 2023
524888b
Merge branch 'main' into feat/onnx
brandonrising Jul 13, 2023
bd7b599
Testing onnx in new ui updates
brandonrising Jul 14, 2023
9111216
Fix syntax err
brandonrising Jul 16, 2023
932112b
testing being super wasteful with data
brandonrising Jul 16, 2023
bcce70f
Testing different session opts, added timings for testing
brandonrising Jul 17, 2023
35d5ef9
Emit step completions
brandonrising Jul 18, 2023
869f418
Setup onnx on linear text2image
brandonrising Jul 18, 2023
e201ad2
Switch to io_binding for run, testing different session options
brandonrising Jul 19, 2023
487455e
Add model_type to the model state object
brandonrising Jul 19, 2023
ee7b36c
Merge branch 'main' into onnx-testing
brandonrising Jul 19, 2023
f4e52fa
Fix as part of merging main in
brandonrising Jul 19, 2023
9e65470
Setup dist
brandonrising Jul 19, 2023
8699fd7
Fix invoke UI graphs for onnx
brandonrising Jul 19, 2023
a28ab65
Setup dist folder
brandonrising Jul 19, 2023
e8299d0
Comment out erroniously removed del statement, comment out opt tests
brandonrising Jul 19, 2023
43b6a07
io binding seems to be massively resource intensive compared to sessi…
brandonrising Jul 19, 2023
8f61413
Setup dist folder
brandonrising Jul 19, 2023
6aab8f1
Fix issue from merge
brandonrising Jul 19, 2023
23f4a4e
Fix dist
brandonrising Jul 19, 2023
4e90376
Allow passing in of precision, use available providers if none provided
brandonrising Jul 20, 2023
ba1a934
Fix Lora typings
brandonrising Jul 20, 2023
ce08aa3
Allow controlnet passthrough for now
brandonrising Jul 20, 2023
7875004
Pass in dim overrides
brandonrising Jul 21, 2023
c16da75
Merge branch 'main' into feat/onnx
brandonrising Jul 26, 2023
861c0fe
Correct issues caused by merging main
brandonrising Jul 26, 2023
f26a423
Fix merge issue
brandonrising Jul 26, 2023
4d732e0
Remove onnx models from img2img and unified canvas
brandonrising Jul 26, 2023
024f92f
Add onnx models to the model manager UI
brandonrising Jul 27, 2023
4ebde01
Allow deleting onnx models in model manager ui
brandonrising Jul 27, 2023
eb1ba8d
Merge branch 'main' into feat/onnx
brandonrising Jul 27, 2023
d2a46b4
Fix dist and schema after merge
brandonrising Jul 27, 2023
989d3d7
Remove onnx changes from canvas img2img, inpaint, and linear image2image
brandonrising Jul 27, 2023
81d8fb8
Removed things no longer needed in main
brandonrising Jul 27, 2023
33245b3
Removed things no longer needed in main
brandonrising Jul 27, 2023
57271ad
Move onnx to optional dependencies
brandonrising Jul 27, 2023
f7bb4c3
Remove more files no longer needed in main
brandonrising Jul 27, 2023
a491e32
This is no longer needed
brandonrising Jul 27, 2023
918a0de
Always install onnx
brandonrising Jul 27, 2023
5971693
Remove TensorRT support at the current time until we validate it work…
brandonrising Jul 27, 2023
bfdc8c8
Testing caching onnx sessions
brandonrising Jul 27, 2023
1ea9ba8
Release session if applying ti or lora
brandonrising Jul 27, 2023
dc11481
Just install onnxruntime by default
brandonrising Jul 28, 2023
2b7b3dd
Run python black
brandonrising Jul 28, 2023
da751da
Merge branch 'main' into feat/onnx
brandonrising Jul 28, 2023
a2aa66f
Run Python black
brandonrising Jul 28, 2023
8935ae0
Fix issues caused by merge
brandonrising Jul 28, 2023
390ce9f
Fix onnx installer
brandonrising Jul 28, 2023
d3f6c7f
Remove onnxruntime
brandonrising Jul 28, 2023
1bbf2f2
Update installer
brandonrising Jul 29, 2023
6ca0c38
Merge branch 'main' into feat/onnx
brandonrising Jul 29, 2023
f5ac73b
Merge branch 'main' into feat/onnx
brandonrising Jul 31, 2023
1bafbaf
Regen schema and rebuild frontend after merging main
brandonrising Jul 31, 2023
f784e84
Some cleanup after the merge
brandonrising Jul 31, 2023
af4fd32
Merge branch 'main' into feat/onnx
brandonrising Jul 31, 2023
aeac557
Run python black, point out that onnx is an alpha feature in the inst…
brandonrising Jul 31, 2023
746afcd
Merge branch 'main' into feat/onnx
hipsterusername Jul 31, 2023
7 changes: 5 additions & 2 deletions installer/lib/installer.py
@@ -455,7 +455,7 @@ def get_torch_source() -> (Union[str, None], str):
     device = graphical_accelerator()
 
     url = None
-    optional_modules = None
+    optional_modules = "[onnx]"
     if OS == "Linux":
         if device == "rocm":
             url = "https://download.pytorch.org/whl/rocm5.4.2"
@@ -464,7 +464,10 @@ def get_torch_source() -> (Union[str, None], str):
 
     if device == "cuda":
         url = "https://download.pytorch.org/whl/cu117"
-        optional_modules = "[xformers]"
+        optional_modules = "[xformers,onnx-cuda]"
+    if device == "cuda_and_dml":
+        url = "https://download.pytorch.org/whl/cu117"
+        optional_modules = "[xformers,onnx-directml]"
 
     # in all other cases, Torch wheels should be coming from PyPi as of Torch 1.13
 
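For context on what these extras mean in practice: the (url, optional_modules) pair returned by get_torch_source() ends up driving a pip install of the application with the selected extras and Torch wheel index. A minimal sketch of that step, under the assumption that the package is installed as "invokeai" via a plain pip invocation (the installer's real command line may differ):

    import subprocess
    import sys
    from typing import Optional

    def install_with_extras(url: Optional[str], optional_modules: Optional[str]) -> None:
        """Hypothetical sketch of consuming get_torch_source()'s result."""
        # e.g. optional_modules == "[xformers,onnx-cuda]" for a CUDA system,
        #      "[xformers,onnx-directml]" when the user picked "cuda_and_dml".
        package = "invokeai" + (optional_modules or "")
        cmd = [sys.executable, "-m", "pip", "install", package]
        if url is not None:
            # CUDA/ROCm Torch wheels come from the index chosen above.
            cmd += ["--extra-index-url", url]
        subprocess.check_call(cmd)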
6 changes: 5 additions & 1 deletion installer/lib/messages.py
@@ -167,6 +167,10 @@ def graphical_accelerator():
         "an [gold1 b]NVIDIA[/] GPU (using CUDA™)",
         "cuda",
     )
+    nvidia_with_dml = (
+        "an [gold1 b]NVIDIA[/] GPU (using CUDA™, and DirectML™ for ONNX) -- ALPHA",
+        "cuda_and_dml",
+    )
     amd = (
         "an [gold1 b]AMD[/] GPU (using ROCm™)",
         "rocm",
@@ -181,7 +185,7 @@ def graphical_accelerator():
     )
 
     if OS == "Windows":
-        options = [nvidia, cpu]
+        options = [nvidia, nvidia_with_dml, cpu]
     if OS == "Linux":
         options = [nvidia, amd, cpu]
     elif OS == "Darwin":
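The new "cuda_and_dml" choice matters because ONNX inference runs on whichever ONNX Runtime execution provider the installed extra supplies (onnx-cuda → CUDA, onnx-directml → DirectML). A rough, illustrative sketch of the "use available providers if none provided" pattern referenced in the commit log (not the PR's actual model-loading code):

    from typing import List, Optional

    import onnxruntime as ort

    def make_onnx_session(model_path: str, providers: Optional[List[str]] = None) -> ort.InferenceSession:
        """Create an ONNX Runtime session, falling back to whatever providers are installed."""
        if providers is None:
            # e.g. ["CUDAExecutionProvider", "CPUExecutionProvider"] with onnxruntime-gpu,
            #      ["DmlExecutionProvider", "CPUExecutionProvider"] with onnxruntime-directml.
            providers = ort.get_available_providers()
        opts = ort.SessionOptions()
        opts.graph_optimization_level = ort.GraphOptimizationLevel.ORT_ENABLE_ALL
        return ort.InferenceSession(model_path, sess_options=opts, providers=providers)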
8 changes: 8 additions & 0 deletions invokeai/app/invocations/compel.py
@@ -1,6 +1,14 @@
from typing import Literal, Optional, Union, List, Annotated
from pydantic import BaseModel, Field
import re

from .baseinvocation import BaseInvocation, BaseInvocationOutput, InvocationContext, InvocationConfig
from .model import ClipField

from ...backend.util.devices import torch_dtype
from ...backend.stable_diffusion.diffusion import InvokeAIDiffuserComponent
from ...backend.model_management import BaseModelType, ModelType, SubModelType, ModelPatcher

import torch
from compel import Compel, ReturnedEmbeddingsType
from compel.prompt_parser import Blend, Conjunction, CrossAttentionControlSubstitute, FlattenedPrompt, Fragment
1 change: 1 addition & 0 deletions invokeai/app/invocations/latent.py
@@ -24,6 +24,7 @@
 )
 from ...backend.stable_diffusion.diffusion.shared_invokeai_diffusion import PostprocessingSettings
 from ...backend.stable_diffusion.schedulers import SCHEDULER_MAP
+from ...backend.model_management import ModelPatcher
 from ...backend.util.devices import choose_torch_device, torch_dtype, choose_precision
 from ..models.image import ImageCategory, ImageField, ResourceOrigin
 from .baseinvocation import BaseInvocation, BaseInvocationOutput, InvocationConfig, InvocationContext
1 change: 1 addition & 0 deletions invokeai/app/invocations/model.py
@@ -53,6 +53,7 @@ class MainModelField(BaseModel):
 
     model_name: str = Field(description="Name of the model")
     base_model: BaseModelType = Field(description="Base model")
+    model_type: ModelType = Field(description="Model Type")
 
 
 class LoRAModelField(BaseModel):
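With model_type carried on MainModelField, the frontend can state explicitly whether a selected model is a standard checkpoint/diffusers model or an ONNX one when it builds a graph. A self-contained, hypothetical sketch of the same pattern (the enum members below are illustrative stand-ins; the real ModelType and BaseModelType enums come from invokeai.backend.model_management):

    from enum import Enum

    from pydantic import BaseModel, Field

    class BaseModelType(str, Enum):  # illustrative stand-in
        StableDiffusion1 = "sd-1"
        StableDiffusion2 = "sd-2"

    class ModelType(str, Enum):  # illustrative stand-in
        Main = "main"
        ONNX = "onnx"

    class MainModelField(BaseModel):
        model_name: str = Field(description="Name of the model")
        base_model: BaseModelType = Field(description="Base model")
        model_type: ModelType = Field(description="Model Type")

    # Downstream graph-building code can now branch on the model type:
    field = MainModelField(model_name="stable-diffusion-v1-5", base_model="sd-1", model_type="onnx")
    assert field.model_type is ModelType.ONNX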