
Images not being tagged - Hoarder working fine otherwise #939

Open
r3l1990 opened this issue Jan 26, 2025 · 2 comments
Labels
question Further information is requested

Comments


r3l1990 commented Jan 26, 2025

Weird one; I'm assuming it's down to my model choice, but for some reason, when I add an image to Hoarder, no tags are generated for it. Everything else works fine.

Here is my log:

s6-rc: info: service legacy-services: stopping
s6-rc: info: service legacy-services successfully stopped
s6-rc: info: service legacy-cont-init: stopping
s6-rc: info: service svc-workers: stopping
s6-rc: info: service svc-web: stopping
s6-rc: info: service legacy-cont-init successfully stopped
s6-rc: info: service fix-attrs: stopping
s6-rc: info: service fix-attrs successfully stopped
s6-rc: info: service svc-web successfully stopped
s6-rc: info: service svc-workers successfully stopped
s6-rc: info: service init-db-migration: stopping
s6-rc: info: service init-db-migration successfully stopped
s6-rc: info: service s6rc-oneshot-runner: stopping
s6-rc: info: service s6rc-oneshot-runner successfully stopped
s6-rc: info: service s6rc-oneshot-runner: starting
s6-rc: info: service s6rc-oneshot-runner successfully started
s6-rc: info: service fix-attrs: starting
s6-rc: info: service init-db-migration: starting
s6-rc: info: service fix-attrs successfully started
s6-rc: info: service legacy-cont-init: starting
s6-rc: info: service legacy-cont-init successfully started
s6-rc: info: service init-db-migration successfully started
s6-rc: info: service svc-workers: starting
s6-rc: info: service svc-web: starting
s6-rc: info: service svc-workers successfully started
s6-rc: info: service svc-web successfully started
s6-rc: info: service legacy-services: starting
s6-rc: info: service legacy-services successfully started
(node:121) [DEP0040] DeprecationWarning: The punycode module is deprecated. Please use a userland alternative instead.
(Use node --trace-deprecation ... to show where the warning was created)
2025-01-26T13:33:25.956Z warn: Got an exception from ollama, will still attempt to deserialize the response we got so far: Error: Failed to create new sequence: failed to process inputs: unable to make llava embedding from image

2025-01-26T13:33:25.956Z error: [inference][4724] inference job failed: Error: [inference][4724] The model ignored our prompt and didn't respond with the expected JSON: {}. Here's a sneak peak from the response:
Error: [inference][4724] The model ignored our prompt and didn't respond with the expected JSON: {}. Here's a sneak peak from the response:
at inferTags (/app/apps/workers/openaiWorker.ts:6:4172)
at process.processTicksAndRejections (node:internal/process/task_queues:105:5)
at async Object.runOpenAI [as run] (/app/apps/workers/openaiWorker.ts:6:6694)
at async Runner.runOnce (/app/apps/workers/node_modules/.pnpm/[email protected][email protected]/node_modules/liteque/dist/runner.js:2:2578)
2025-01-26T13:33:25.987Z info: [inference][4724] Starting an inference job for bookmark with id "hadu1r3ca2p9p3mu2v7u095k"
2025-01-26T13:33:26.000Z warn: Got an exception from ollama, will still attempt to deserialize the response we got so far: Error: Failed to create new sequence: failed to process inputs: unable to make llava embedding from image

2025-01-26T13:33:26.000Z error: [inference][4724] inference job failed: Error: [inference][4724] The model ignored our prompt and didn't respond with the expected JSON: {}. Here's a sneak peak from the response:
Error: [inference][4724] The model ignored our prompt and didn't respond with the expected JSON: {}. Here's a sneak peak from the response:
at inferTags (/app/apps/workers/openaiWorker.ts:6:4172)
at process.processTicksAndRejections (node:internal/process/task_queues:105:5)
at async Object.runOpenAI [as run] (/app/apps/workers/openaiWorker.ts:6:6694)
at async Runner.runOnce (/app/apps/workers/node_modules/.pnpm/[email protected][email protected]/node_modules/liteque/dist/runner.js:2:2578)
2025-01-26T13:33:26.013Z info: [inference][4724] Starting an inference job for bookmark with id "hadu1r3ca2p9p3mu2v7u095k"
2025-01-26T13:33:26.026Z warn: Got an exception from ollama, will still attempt to deserialize the response we got so far: Error: Failed to create new sequence: failed to process inputs: unable to make llava embedding from image

2025-01-26T13:33:26.027Z error: [inference][4724] inference job failed: Error: [inference][4724] The model ignored our prompt and didn't respond with the expected JSON: {}. Here's a sneak peak from the response:
Error: [inference][4724] The model ignored our prompt and didn't respond with the expected JSON: {}. Here's a sneak peak from the response:
at inferTags (/app/apps/workers/openaiWorker.ts:6:4172)
at process.processTicksAndRejections (node:internal/process/task_queues:105:5)
at async Object.runOpenAI [as run] (/app/apps/workers/openaiWorker.ts:6:6694)
at async Runner.runOnce (/app/apps/workers/node_modules/.pnpm/[email protected][email protected]/node_modules/liteque/dist/runner.js:2:2578)
2025-01-26T13:33:26.042Z info: [inference][4724] Starting an inference job for bookmark with id "hadu1r3ca2p9p3mu2v7u095k"
2025-01-26T13:33:26.054Z warn: Got an exception from ollama, will still attempt to deserialize the response we got so far: Error: Failed to create new sequence: failed to process inputs: unable to make llava embedding from image

2025-01-26T13:33:26.055Z error: [inference][4724] inference job failed: Error: [inference][4724] The model ignored our prompt and didn't respond with the expected JSON: {}. Here's a sneak peak from the response:
Error: [inference][4724] The model ignored our prompt and didn't respond with the expected JSON: {}. Here's a sneak peak from the response:
at inferTags (/app/apps/workers/openaiWorker.ts:6:4172)
at process.processTicksAndRejections (node:internal/process/task_queues:105:5)
at async Object.runOpenAI [as run] (/app/apps/workers/openaiWorker.ts:6:6694)
at async Runner.runOnce (/app/apps/workers/node_modules/.pnpm/[email protected][email protected]/node_modules/liteque/dist/runner.js:2:2578)
2025-01-26T14:00:00.169Z info: [feed] Scheduling feed refreshing jobs ...
2025-01-26T14:09:37.059Z info: Received SIGTERM, shutting down ...
2025-01-26T14:09:37.059Z info: Shutting down crawler, openai, tidyAssets, video, feed, assetPreprocessing, webhook and search workers ...
Running db migration script
▲ Next.js 14.2.21

✓ Starting...
✓ Ready in 381ms

@hoarder/[email protected] start:prod /app/apps/workers
tsx index.ts

2025-01-26T14:09:56.677Z info: Workers version: nightly
2025-01-26T14:09:56.692Z info: [crawler] Loading adblocker ...
2025-01-26T14:09:56.758Z info: [Crawler] Browser connect on demand is enabled, won't proactively start the browser instance
2025-01-26T14:09:56.758Z info: Starting crawler worker ...
2025-01-26T14:09:56.758Z info: Starting inference worker ...
2025-01-26T14:09:56.759Z info: Starting search indexing worker ...
2025-01-26T14:09:56.759Z info: Starting tidy assets worker ...
2025-01-26T14:09:56.760Z info: Starting video worker ...
2025-01-26T14:09:56.760Z info: Starting feed worker ...
2025-01-26T14:09:56.761Z info: Starting asset preprocessing worker ...
2025-01-26T14:09:56.761Z info: Starting webhook worker ...
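
The repeated Ollama exception ("Failed to create new sequence: failed to process inputs: unable to make llava embedding from image") suggests the model never produces an image embedding, so the worker gets an empty response back and the JSON parsing fails. A quick way to check whether the model can handle image input at all, outside of Hoarder, is to call Ollama's generate API directly (a minimal sketch, assuming a default Ollama install on port 11434 and a base64-encoded test image pasted in place of the placeholder):

curl http://localhost:11434/api/generate \
  -d '{"model": "llava-phi3", "prompt": "Describe this image.", "images": ["<base64-encoded image>"], "stream": false}'

If this fails with the same "unable to make llava embedding" error, the problem is with the local model files or Ollama version rather than with Hoarder.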

r3l1990 changed the title from "Images not being tagged - Hoarder wirking fine otherwise" to "Images not being tagged - Hoarder working fine otherwise" on Jan 26, 2025
@MohamedBassem
Collaborator

Which model are you using for image tagging?

MohamedBassem added the "question" (Further information is requested) label on Jan 26, 2025
Author

r3l1990 commented Jan 26, 2025

llava-phi3 - to be honest, it's the same one I've had set up since I started using this app, so it may be time for an update :)
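
If it helps anyone else hitting this, re-pulling the model (or switching to another vision-capable one) and pointing the worker at it looks roughly like this (a sketch only, assuming the standard Ollama CLI and Hoarder's Ollama-related environment variables; double-check the exact variable names against the docs):

# re-pull the current model to pick up any fixes to its files
ollama pull llava-phi3
# or pull a different vision-capable model to try instead
ollama pull llava

# then point Hoarder's workers at it via the container environment, e.g.
OLLAMA_BASE_URL=http://ollama:11434
INFERENCE_IMAGE_MODEL=llava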
