Get the rotating dots when I hit CTRL, then this error #298

Open
sgjohnson1981 opened this issue Aug 25, 2024 · 2 comments

INFO:     Started server process [51254]
INFO:     Waiting for application startup.
INFO:     Application startup complete.
INFO:     Uvicorn running on http://0.0.0.0:10101 (Press CTRL+C to quit)
INFO:     ('127.0.0.1', 60558) - "WebSocket /" [accepted]
INFO:     connection open
Connected to server.
Hold CTRL to speak to the assistant. Press 'CTRL-C' to quit.


Could not load library libcudnn_ops_infer.so.8. Error: libcudnn_ops_infer.so.8: cannot open shared object file: No such file or directory


--- SENT ERROR: ---


Traceback (most recent call last):
  File "/home/user/.cache/pypoetry/virtualenvs/01os-jX8ul_1R-py3.11/lib/python3.11/site-packages/interpreter/core/async_core.py", line 460, in receive_input
    await async_interpreter.input(data)
  File "/media/user/8E0C30BE0C30A2DF/code/LLMs/01/software/source/server/async_server.py", line 67, in new_input
    content = self.stt.text()
              ^^^^^^^^^^^^^^^
  File "/home/user/.cache/pypoetry/virtualenvs/01os-jX8ul_1R-py3.11/lib/python3.11/site-packages/RealtimeSTT/audio_recorder.py", line 894, in text
    return self.transcribe()
           ^^^^^^^^^^^^^^^^^
  File "/home/user/.cache/pypoetry/virtualenvs/01os-jX8ul_1R-py3.11/lib/python3.11/site-packages/RealtimeSTT/audio_recorder.py", line 845, in transcribe
    status, result = self.parent_transcription_pipe.recv()
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/multiprocessing/connection.py", line 249, in recv
    buf = self._recv_bytes()
          ^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/multiprocessing/connection.py", line 413, in _recv_bytes
    buf = self._recv(4)
          ^^^^^^^^^^^^^
  File "/usr/lib/python3.11/multiprocessing/connection.py", line 382, in _recv
    raise EOFError
EOFError




--- (ERROR ABOVE) ---


Client wants to disconnect, that's fine..
INFO:     connection closed
INFO:     ('127.0.0.1', 55156) - "WebSocket /" [accepted]
INFO:     connection open
Connected to server.




--- SENT ERROR: ---


Traceback (most recent call last):
  File "/home/user/.cache/pypoetry/virtualenvs/01os-jX8ul_1R-py3.11/lib/python3.11/site-packages/interpreter/core/async_core.py", line 460, in receive_input
    await async_interpreter.input(data)
  File "/media/user/8E0C30BE0C30A2DF/code/LLMs/01/software/source/server/async_server.py", line 67, in new_input
    content = self.stt.text()
              ^^^^^^^^^^^^^^^
  File "/home/user/.cache/pypoetry/virtualenvs/01os-jX8ul_1R-py3.11/lib/python3.11/site-packages/RealtimeSTT/audio_recorder.py", line 894, in text
    return self.transcribe()
           ^^^^^^^^^^^^^^^^^
  File "/home/user/.cache/pypoetry/virtualenvs/01os-jX8ul_1R-py3.11/lib/python3.11/site-packages/RealtimeSTT/audio_recorder.py", line 844, in transcribe
    self.parent_transcription_pipe.send((self.audio, self.language))
  File "/usr/lib/python3.11/multiprocessing/connection.py", line 205, in send
    self._send_bytes(_ForkingPickler.dumps(obj))
  File "/usr/lib/python3.11/multiprocessing/connection.py", line 403, in _send_bytes
    self._send(header)
  File "/usr/lib/python3.11/multiprocessing/connection.py", line 367, in _send
    n = write(self._handle, buf)
        ^^^^^^^^^^^^^^^^^^^^^^^^
BrokenPipeError: [Errno 32] Broken pipe

[Errno 32] Broken pipe


--- (ERROR ABOVE) ---






--- SENT ERROR: ---


Traceback (most recent call last):
  File "/home/user/.cache/pypoetry/virtualenvs/01os-jX8ul_1R-py3.11/lib/python3.11/site-packages/interpreter/core/async_core.py", line 460, in receive_input
    await async_interpreter.input(data)
  File "/media/user/8E0C30BE0C30A2DF/code/LLMs/01/software/source/server/async_server.py", line 67, in new_input
    content = self.stt.text()
              ^^^^^^^^^^^^^^^
  File "/home/user/.cache/pypoetry/virtualenvs/01os-jX8ul_1R-py3.11/lib/python3.11/site-packages/RealtimeSTT/audio_recorder.py", line 894, in text
    return self.transcribe()
           ^^^^^^^^^^^^^^^^^
  File "/home/user/.cache/pypoetry/virtualenvs/01os-jX8ul_1R-py3.11/lib/python3.11/site-packages/RealtimeSTT/audio_recorder.py", line 844, in transcribe
    self.parent_transcription_pipe.send((self.audio, self.language))
  File "/usr/lib/python3.11/multiprocessing/connection.py", line 205, in send
    self._send_bytes(_ForkingPickler.dumps(obj))
  File "/usr/lib/python3.11/multiprocessing/connection.py", line 403, in _send_bytes
    self._send(header)
  File "/usr/lib/python3.11/multiprocessing/connection.py", line 367, in _send
    n = write(self._handle, buf)
        ^^^^^^^^^^^^^^^^^^^^^^^^
BrokenPipeError: [Errno 32] Broken pipe

[Errno 32] Broken pipe


--- (ERROR ABOVE) ---

Desktop (please complete the following information):

  • OS: Debian 12.6
  • Python Version: 3.11
  • Running --profile local with ollama/llama3.1:8b on desktop
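
The EOFError and BrokenPipeError above look like downstream symptoms: the transcription child process likely dies when the faster-whisper/CTranslate2 backend used by RealtimeSTT cannot load libcudnn_ops_infer.so.8, and the parent then fails on the now-closed pipe. A minimal sketch, assuming a standard CPython environment on this machine, to confirm whether the cuDNN 8 library is even visible to the dynamic loader:

```python
# Minimal check: can this environment's dynamic loader find cuDNN 8?
# The library name is taken from the error above; nothing project-specific here.
import ctypes

try:
    ctypes.CDLL("libcudnn_ops_infer.so.8")
    print("libcudnn_ops_infer.so.8 loaded OK")
except OSError as exc:
    # Typical fixes: install the cuDNN 8 runtime, or add its directory
    # (e.g. a pip-installed nvidia-cudnn-cu11 'lib' folder) to LD_LIBRARY_PATH.
    print(f"cuDNN 8 not found: {exc}")
```

If that load fails, the missing-library message, the dead transcription process, and the pipe errors are all one root cause.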
sgjohnson1981 (Author) commented:

Accounts above have been blocked and reported for spam/phishing.

sgjohnson1981 (Author) commented:

After yesterday's update I pulled everything and rebuilt the Poetry environment. Now I'm getting a completely different error.

Starting server...
Starting client...
ALSA lib pcm.c:2666:(snd_pcm_open_noupdate) Unknown PCM cards.pcm.rear
ALSA lib pcm.c:2666:(snd_pcm_open_noupdate) Unknown PCM cards.pcm.center_lfe
ALSA lib pcm.c:2666:(snd_pcm_open_noupdate) Unknown PCM cards.pcm.side
ALSA lib pcm_route.c:877:(find_matching_chmap) Found no matching channel map
ALSA lib pcm_route.c:877:(find_matching_chmap) Found no matching channel map
ALSA lib pcm_route.c:877:(find_matching_chmap) Found no matching channel map
ALSA lib pcm_route.c:877:(find_matching_chmap) Found no matching channel map
Cannot connect to server socket err = No such file or directory
Cannot connect to server request channel
jack server is not running or cannot be started
JackShmReadWritePtr::~JackShmReadWritePtr - Init not done for -1, skipping unlock
JackShmReadWritePtr::~JackShmReadWritePtr - Init not done for -1, skipping unlock
Cannot connect to server socket err = No such file or directory
Cannot connect to server request channel
jack server is not running or cannot be started
JackShmReadWritePtr::~JackShmReadWritePtr - Init not done for -1, skipping unlock
JackShmReadWritePtr::~JackShmReadWritePtr - Init not done for -1, skipping unlock
ALSA lib pcm_oss.c:397:(_snd_pcm_oss_open) Cannot open device /dev/dsp
ALSA lib pcm_oss.c:397:(_snd_pcm_oss_open) Cannot open device /dev/dsp
ALSA lib pcm_a52.c:1001:(_snd_pcm_a52_open) a52 is only for playback
ALSA lib confmisc.c:160:(snd_config_get_card) Invalid field card
ALSA lib pcm_usb_stream.c:482:(_snd_pcm_usb_stream_open) Invalid card 'card'
ALSA lib confmisc.c:160:(snd_config_get_card) Invalid field card
ALSA lib pcm_usb_stream.c:482:(_snd_pcm_usb_stream_open) Invalid card 'card'
Cannot connect to server socket err = No such file or directory
Cannot connect to server request channel
jack server is not running or cannot be started
JackShmReadWritePtr::~JackShmReadWritePtr - Init not done for -1, skipping unlock
JackShmReadWritePtr::~JackShmReadWritePtr - Init not done for -1, skipping unlock
Loading llama3.1:8b...

Loading...
Loading...
Loading...
Loading...
Model loaded.


▌ Local model set to ollama/llama3.1:8b, Local TTS set to coqui.

Loading...
Loading...
Exception in thread Thread-3 (run):
Traceback (most recent call last):
  File "/usr/lib/python3.11/threading.py", line 1038, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.11/threading.py", line 975, in run
    self._target(*self._args, **self._kwargs)
  File "/media/user/8E0C30BE0C30A2DF/code/LLMs/01/software/source/clients/light-python/client.py", line 100, in run
    device.start()
  File "/media/user/8E0C30BE0C30A2DF/code/LLMs/01/software/source/clients/light-python/client.py", line 94, in start
    asyncio.run(self.main())
  File "/usr/lib/python3.11/asyncio/runners.py", line 190, in run
    return runner.run(main)
           ^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/asyncio/runners.py", line 118, in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/asyncio/base_events.py", line 653, in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
  File "/media/user/8E0C30BE0C30A2DF/code/LLMs/01/software/source/clients/light-python/client.py", line 87, in main
    await self.connect_with_retry()
  File "/media/user/8E0C30BE0C30A2DF/code/LLMs/01/software/source/clients/light-python/client.py", line 38, in connect_with_retry
    raise Exception("Failed to connect to the server after multiple attempts")
Exception: Failed to connect to the server after multiple attempts
Error loading model for checkpoint /media/user/8E0C30BE0C30A2DF/code/LLMs/01/software/models/v2.0.2: Cannot re-initialize CUDA in forked subprocess. To use CUDA with multiprocessing, you must use the 'spawn' start method
RealTimeSTT: root - ERROR - Error initializing main coqui engine model: Cannot re-initialize CUDA in forked subprocess. To use CUDA with multiprocessing, you must use the 'spawn' start method
Traceback (most recent call last):
  File "/home/user/.cache/pypoetry/virtualenvs/01-jX8ul_1R-py3.11/lib/python3.11/site-packages/RealtimeTTS/engines/coqui_engine.py", line 502, in _synthesize_worker
    tts = load_model(checkpoint, tts)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/user/.cache/pypoetry/virtualenvs/01-jX8ul_1R-py3.11/lib/python3.11/site-packages/RealtimeTTS/engines/coqui_engine.py", line 486, in load_model
    tts.to(torch_device)
  File "/home/user/.cache/pypoetry/virtualenvs/01-jX8ul_1R-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1173, in to
    return self._apply(convert)
           ^^^^^^^^^^^^^^^^^^^^
  File "/home/user/.cache/pypoetry/virtualenvs/01-jX8ul_1R-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py", line 779, in _apply
    module._apply(fn)
  File "/home/user/.cache/pypoetry/virtualenvs/01-jX8ul_1R-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py", line 779, in _apply
    module._apply(fn)
  File "/home/user/.cache/pypoetry/virtualenvs/01-jX8ul_1R-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py", line 779, in _apply
    module._apply(fn)
  File "/home/user/.cache/pypoetry/virtualenvs/01-jX8ul_1R-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py", line 804, in _apply
    param_applied = fn(param)
                    ^^^^^^^^^
  File "/home/user/.cache/pypoetry/virtualenvs/01-jX8ul_1R-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1159, in convert
    return t.to(
           ^^^^^
  File "/home/user/.cache/pypoetry/virtualenvs/01-jX8ul_1R-py3.11/lib/python3.11/site-packages/torch/cuda/__init__.py", line 279, in _lazy_init
    raise RuntimeError(
RuntimeError: Cannot re-initialize CUDA in forked subprocess. To use CUDA with multiprocessing, you must use the 'spawn' start method
Process Process-1:
Traceback (most recent call last):
  File "/usr/lib/python3.11/multiprocessing/process.py", line 314, in _bootstrap
    self.run()
  File "/usr/lib/python3.11/multiprocessing/process.py", line 108, in run
    self._target(*self._args, **self._kwargs)
  File "/home/user/.cache/pypoetry/virtualenvs/01-jX8ul_1R-py3.11/lib/python3.11/site-packages/RealtimeTTS/engines/coqui_engine.py", line 502, in _synthesize_worker
    tts = load_model(checkpoint, tts)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/user/.cache/pypoetry/virtualenvs/01-jX8ul_1R-py3.11/lib/python3.11/site-packages/RealtimeTTS/engines/coqui_engine.py", line 486, in load_model
    tts.to(torch_device)
  File "/home/user/.cache/pypoetry/virtualenvs/01-jX8ul_1R-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1173, in to
    return self._apply(convert)
           ^^^^^^^^^^^^^^^^^^^^
  File "/home/user/.cache/pypoetry/virtualenvs/01-jX8ul_1R-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py", line 779, in _apply
    module._apply(fn)
  File "/home/user/.cache/pypoetry/virtualenvs/01-jX8ul_1R-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py", line 779, in _apply
    module._apply(fn)
  File "/home/user/.cache/pypoetry/virtualenvs/01-jX8ul_1R-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py", line 779, in _apply
    module._apply(fn)
  File "/home/user/.cache/pypoetry/virtualenvs/01-jX8ul_1R-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py", line 804, in _apply
    param_applied = fn(param)
                    ^^^^^^^^^
  File "/home/user/.cache/pypoetry/virtualenvs/01-jX8ul_1R-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1159, in convert
    return t.to(
           ^^^^^
  File "/home/user/.cache/pypoetry/virtualenvs/01-jX8ul_1R-py3.11/lib/python3.11/site-packages/torch/cuda/__init__.py", line 279, in _lazy_init
    raise RuntimeError(
RuntimeError: Cannot re-initialize CUDA in forked subprocess. To use CUDA with multiprocessing, you must use the 'spawn' start method
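
The RuntimeError here is PyTorch's standard guard: CUDA cannot be re-initialized in a process created with fork, so any worker that touches the GPU has to be started with the 'spawn' start method. A minimal sketch of that pattern with generic PyTorch code (not the RealtimeTTS/01 code itself):

```python
# Minimal sketch of the fix named in the error: start CUDA-using workers
# with the 'spawn' start method instead of the default 'fork' on Linux.
import multiprocessing as mp

import torch


def worker() -> None:
    # CUDA is initialized only inside the spawned child process.
    device = "cuda" if torch.cuda.is_available() else "cpu"
    model = torch.nn.Linear(4, 4).to(device)
    print("worker parameters live on", next(model.parameters()).device)


if __name__ == "__main__":
    ctx = mp.get_context("spawn")  # avoids "re-initialize CUDA in forked subprocess"
    p = ctx.Process(target=worker)
    p.start()
    p.join()
```

In this trace the fork happens inside RealtimeTTS's Coqui engine worker (Process-1 running _synthesize_worker), so the same principle applies there; the sketch only illustrates the mechanism the error message asks for.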
