Weird one. I'm assuming it's down to my model choice, but for some reason when I add an image to Hoarder, no tags are generated for it. Everything else works fine.
Here is my log:
s6-rc: info: service legacy-services: stopping
s6-rc: info: service legacy-services successfully stopped
s6-rc: info: service legacy-cont-init: stopping
s6-rc: info: service svc-workers: stopping
s6-rc: info: service svc-web: stopping
s6-rc: info: service legacy-cont-init successfully stopped
s6-rc: info: service fix-attrs: stopping
s6-rc: info: service fix-attrs successfully stopped
s6-rc: info: service svc-web successfully stopped
s6-rc: info: service svc-workers successfully stopped
s6-rc: info: service init-db-migration: stopping
s6-rc: info: service init-db-migration successfully stopped
s6-rc: info: service s6rc-oneshot-runner: stopping
s6-rc: info: service s6rc-oneshot-runner successfully stopped
s6-rc: info: service s6rc-oneshot-runner: starting
s6-rc: info: service s6rc-oneshot-runner successfully started
s6-rc: info: service fix-attrs: starting
s6-rc: info: service init-db-migration: starting
s6-rc: info: service fix-attrs successfully started
s6-rc: info: service legacy-cont-init: starting
s6-rc: info: service legacy-cont-init successfully started
s6-rc: info: service init-db-migration successfully started
s6-rc: info: service svc-workers: starting
s6-rc: info: service svc-web: starting
s6-rc: info: service svc-workers successfully started
s6-rc: info: service svc-web successfully started
s6-rc: info: service legacy-services: starting
s6-rc: info: service legacy-services successfully started
(node:121) [DEP0040] DeprecationWarning: The punycode module is deprecated. Please use a userland alternative instead.
(Use node --trace-deprecation ... to show where the warning was created)
2025-01-26T13:33:25.956Z warn: Got an exception from ollama, will still attempt to deserialize the response we got so far: Error: Failed to create new sequence: failed to process inputs: unable to make llava embedding from image
2025-01-26T13:33:25.956Z error: [inference][4724] inference job failed: Error: [inference][4724] The model ignored our prompt and didn't respond with the expected JSON: {}. Here's a sneak peak from the response:
Error: [inference][4724] The model ignored our prompt and didn't respond with the expected JSON: {}. Here's a sneak peak from the response:
at inferTags (/app/apps/workers/openaiWorker.ts:6:4172)
at process.processTicksAndRejections (node:internal/process/task_queues:105:5)
at async Object.runOpenAI [as run] (/app/apps/workers/openaiWorker.ts:6:6694)
at async Runner.runOnce (/app/apps/workers/node_modules/.pnpm/[email protected][email protected]/node_modules/liteque/dist/runner.js:2:2578)
2025-01-26T13:33:25.987Z info: [inference][4724] Starting an inference job for bookmark with id "hadu1r3ca2p9p3mu2v7u095k"
2025-01-26T13:33:26.000Z warn: Got an exception from ollama, will still attempt to deserialize the response we got so far: Error: Failed to create new sequence: failed to process inputs: unable to make llava embedding from image
2025-01-26T13:33:26.000Z error: [inference][4724] inference job failed: Error: [inference][4724] The model ignored our prompt and didn't respond with the expected JSON: {}. Here's a sneak peak from the response:
Error: [inference][4724] The model ignored our prompt and didn't respond with the expected JSON: {}. Here's a sneak peak from the response:
at inferTags (/app/apps/workers/openaiWorker.ts:6:4172)
at process.processTicksAndRejections (node:internal/process/task_queues:105:5)
at async Object.runOpenAI [as run] (/app/apps/workers/openaiWorker.ts:6:6694)
at async Runner.runOnce (/app/apps/workers/node_modules/.pnpm/[email protected][email protected]/node_modules/liteque/dist/runner.js:2:2578)
2025-01-26T13:33:26.013Z info: [inference][4724] Starting an inference job for bookmark with id "hadu1r3ca2p9p3mu2v7u095k"
2025-01-26T13:33:26.026Z warn: Got an exception from ollama, will still attempt to deserialize the response we got so far: Error: Failed to create new sequence: failed to process inputs: unable to make llava embedding from image
2025-01-26T13:33:26.027Z error: [inference][4724] inference job failed: Error: [inference][4724] The model ignored our prompt and didn't respond with the expected JSON: {}. Here's a sneak peak from the response:
Error: [inference][4724] The model ignored our prompt and didn't respond with the expected JSON: {}. Here's a sneak peak from the response:
at inferTags (/app/apps/workers/openaiWorker.ts:6:4172)
at process.processTicksAndRejections (node:internal/process/task_queues:105:5)
at async Object.runOpenAI [as run] (/app/apps/workers/openaiWorker.ts:6:6694)
at async Runner.runOnce (/app/apps/workers/node_modules/.pnpm/[email protected][email protected]/node_modules/liteque/dist/runner.js:2:2578)
2025-01-26T13:33:26.042Z info: [inference][4724] Starting an inference job for bookmark with id "hadu1r3ca2p9p3mu2v7u095k"
2025-01-26T13:33:26.054Z warn: Got an exception from ollama, will still attempt to deserialize the response we got so far: Error: Failed to create new sequence: failed to process inputs: unable to make llava embedding from image
2025-01-26T13:33:26.055Z error: [inference][4724] inference job failed: Error: [inference][4724] The model ignored our prompt and didn't respond with the expected JSON: {}. Here's a sneak peak from the response:
Error: [inference][4724] The model ignored our prompt and didn't respond with the expected JSON: {}. Here's a sneak peak from the response:
at inferTags (/app/apps/workers/openaiWorker.ts:6:4172)
at process.processTicksAndRejections (node:internal/process/task_queues:105:5)
at async Object.runOpenAI [as run] (/app/apps/workers/openaiWorker.ts:6:6694)
at async Runner.runOnce (/app/apps/workers/node_modules/.pnpm/[email protected][email protected]/node_modules/liteque/dist/runner.js:2:2578)
2025-01-26T14:00:00.169Z info: [feed] Scheduling feed refreshing jobs ...
2025-01-26T14:09:37.059Z info: Received SIGTERM, shutting down ...
2025-01-26T14:09:37.059Z info: Shutting down crawler, openai, tidyAssets, video, feed, assetPreprocessing, webhook and search workers ...
Running db migration script
▲ Next.js 14.2.21
✓ Starting...
✓ Ready in 381ms
2025-01-26T14:09:56.677Z info: Workers version: nightly
2025-01-26T14:09:56.692Z info: [crawler] Loading adblocker ...
2025-01-26T14:09:56.758Z info: [Crawler] Browser connect on demand is enabled, won't proactively start the browser instance
2025-01-26T14:09:56.758Z info: Starting crawler worker ...
2025-01-26T14:09:56.758Z info: Starting inference worker ...
2025-01-26T14:09:56.759Z info: Starting search indexing worker ...
2025-01-26T14:09:56.759Z info: Starting tidy assets worker ...
2025-01-26T14:09:56.760Z info: Starting video worker ...
2025-01-26T14:09:56.760Z info: Starting feed worker ...
2025-01-26T14:09:56.761Z info: Starting asset preprocessing worker ...
2025-01-26T14:09:56.761Z info: Starting webhook worker ...
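For context on the model-choice hypothesis above: the log shows Ollama rejecting the image input itself ("unable to make llava embedding from image") before the worker ever gets usable output, so the JSON-parsing error that follows is a downstream symptom. As a minimal sketch (variable names follow Hoarder's configuration docs; the model values here are illustrative assumptions, not a recommendation), the image-tagging model is configured separately from the text model:

```
# .env fragment (sketch; model names are assumptions)
OLLAMA_BASE_URL=http://ollama:11434
INFERENCE_TEXT_MODEL=llama3.1   # handles text bookmarks (working in this log)
INFERENCE_IMAGE_MODEL=llava     # handles images (the failing path in this log)
```

If `INFERENCE_IMAGE_MODEL` points at a model without vision support, or at a llava build that cannot process the submitted image, only image tagging fails while everything else keeps working, which matches the behaviour described above.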