Merge branch 'SciSharp:master' into Development #114
Annotations
1 error and 30 warnings
Test (macos-release)
Process completed with exit code 1.
|
Test (linux-release):
LLama/Native/SafeLLamaContextHandle.cs#L203
'NativeApi.llama_eval(SafeLLamaContextHandle, int*, int, int)' is obsolete: 'use llama_decode() instead'
|
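The obsolete-API warning above is for the raw token-pointer overload of llama_eval; llama.cpp's replacement, llama_decode(), consumes a batch of tokens instead. A minimal sketch of what that migration tends to look like; the batch type and Decode wrapper used here are hypothetical stand-ins for illustration, since only llama_decode() itself is named by the warning.

```csharp
// Hedged sketch: batch-based evaluation replacing llama_eval.
// `LLamaBatch` and `ctx.Decode(...)` are hypothetical stand-ins for the newer
// batch API; the warning only states that llama_decode() is the successor.
int EvalTokens(SafeLLamaContextHandle ctx, int[] tokens, int nPast)
{
    var batch = new LLamaBatch();
    for (var i = 0; i < tokens.Length; i++)
    {
        // Request logits only for the final token, the usual generation pattern.
        batch.Add(tokens[i], nPast + i, logits: i == tokens.Length - 1);
    }
    return ctx.Decode(batch); // internally submits the batch via llama_decode
}
```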
Test (linux-release):
LLama/LLamaInstructExecutor.cs#L109
Possible null reference argument for parameter 'data' in 'Task InstructExecutor.LoadState(ExecutorBaseState data)'.
|
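The LoadState warnings are the standard CS8604 diagnostic: a possibly-null value is passed where the ExecutorBaseState data parameter is declared non-nullable. A self-contained sketch of the usual guard, using hypothetical type and method names that only mirror the call site:

```csharp
using System;
using System.IO;
using System.Text.Json;
using System.Threading.Tasks;

// Hypothetical shapes standing in for the executor types named in the warnings.
class ExecutorBaseState { }

class ExecutorSketch
{
    public Task LoadState(ExecutorBaseState data) => Task.CompletedTask;

    public Task LoadStateFromFile(string path)
    {
        // Deserialize can return null, which is what triggers CS8604 at the call site.
        var state = JsonSerializer.Deserialize<ExecutorBaseState>(File.ReadAllText(path));

        // Guarding (or throwing) before the call removes the possible-null argument.
        if (state is null)
            throw new InvalidOperationException($"Failed to deserialize executor state from '{path}'.");

        return LoadState(state);
    }
}
```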
Test (linux-release):
LLama/LLamaInteractExecutor.cs#L92
Possible null reference argument for parameter 'data' in 'Task InteractiveExecutor.LoadState(ExecutorBaseState data)'.
|
Test (linux-release):
LLama/LLamaExecutorBase.cs#L297
'LLamaContext.DeTokenize(IReadOnlyList<int>)' is obsolete: 'Use a `StreamingTokenDecoder` instead'
|
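LLamaContext.DeTokenize is flagged as superseded by StreamingTokenDecoder, which accumulates tokens and decodes them incrementally so multi-byte characters split across tokens are not mangled. A rough sketch of the replacement; beyond the class name given in the warning, the constructor and member names used here are assumptions.

```csharp
using System.Collections.Generic;

// Hedged sketch of the suggested replacement for LLamaContext.DeTokenize.
// The StreamingTokenDecoder member names below are assumed for illustration.
string Detokenize(LLamaContext context, IReadOnlyList<int> tokens)
{
    var decoder = new StreamingTokenDecoder(context);
    foreach (var token in tokens)
        decoder.Add(token);   // feed tokens incrementally
    return decoder.Read();    // read back the decoded text
}
```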
Test (linux-release):
LLama/LLamaInteractExecutor.cs#L137
'IReadOnlyListExtensions.TokensEndsWithAnyString<TTokens>(TTokens, IList<string>?, SafeLlamaModelHandle, Encoding)' is obsolete: 'Use an Antiprompt processor instead'
|
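TokensEndsWithAnyString re-scans the tail of the token list on every iteration; the warning points toward an antiprompt processor that is fed the decoded text as it streams and reports once any stop string has been produced. A hedged sketch, assuming an AntipromptProcessor whose Add(string) returns true on a match; only the "Antiprompt processor" naming comes from the warning itself.

```csharp
using System.Collections.Generic;

// Hedged sketch: stop-string detection on streamed text instead of raw token lists.
// AntipromptProcessor, SetAntiprompts and Add(string) are assumed member names.
static IEnumerable<string> StreamUntilAntiprompt(
    IEnumerable<string> decodedPieces, IEnumerable<string> antiprompts)
{
    var processor = new AntipromptProcessor();
    processor.SetAntiprompts(antiprompts);

    foreach (var piece in decodedPieces)
    {
        yield return piece;
        if (processor.Add(piece))   // true once any antiprompt has been emitted
            yield break;            // stop generation
    }
}
```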
Test (linux-release):
LLama/LLamaInteractExecutor.cs#L133
This async method lacks 'await' operators and will run synchronously. Consider using the 'await' operator to await non-blocking API calls, or 'await Task.Run(...)' to do CPU-bound work on a background thread.
|
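The CS1998 warnings mean the flagged methods are declared async but never await anything, so they run synchronously while still paying the async state-machine cost. Two common resolutions, sketched with a hypothetical method body:

```csharp
using System.Threading.Tasks;

class AsyncWarningFixes
{
    // Option 1: drop 'async' and return a completed task when nothing is awaited.
    public Task DoWorkAsync()
    {
        // ... synchronous work ...
        return Task.CompletedTask;
    }

    // Option 2: keep 'async' only when something is genuinely awaited.
    public async Task DoWorkForRealAsync()
    {
        await Task.Delay(1); // placeholder for a real non-blocking call
    }
}
```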
Test (linux-release):
LLama/LLamaInstructExecutor.cs#L154
'IReadOnlyListExtensions.TokensEndsWithAnyString<TTokens>(TTokens, IList<string>?, SafeLlamaModelHandle, Encoding)' is obsolete: 'Use an Antiprompt processor instead'
|
Test (linux-release):
LLama/LLamaInstructExecutor.cs#L150
This async method lacks 'await' operators and will run synchronously. Consider using the 'await' operator to await non-blocking API calls, or 'await Task.Run(...)' to do CPU-bound work on a background thread.
|
Test (linux-release):
LLama/LLamaInstructExecutor.cs#L210
Possible null reference argument for parameter 'filename' in 'void StatefulExecutorBase.SaveSessionFile(string filename)'.
|
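Same CS8604 shape as the LoadState warnings, this time for the session file path passed to SaveSessionFile(string filename). A self-contained guard sketch with hypothetical property and method names:

```csharp
using System;

class SessionSaveSketch
{
    // Hypothetical stand-in for StatefulExecutorBase.SaveSessionFile(string filename).
    public void SaveSessionFile(string filename) { /* ... */ }

    // The path typically comes from a nullable config/state field, hence the warning.
    public string? SessionFilePath { get; set; }

    public void SaveSession()
    {
        // Guard before the call (or make the parameter nullable) to satisfy the analyzer.
        if (string.IsNullOrEmpty(SessionFilePath))
            throw new InvalidOperationException("No session file path configured.");

        SaveSessionFile(SessionFilePath);
    }
}
```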
Test (linux-release):
LLama/LLamaInteractExecutor.cs#L189
Possible null reference argument for parameter 'filename' in 'void StatefulExecutorBase.SaveSessionFile(string filename)'.
|
Test (windows-release):
LLama/Native/SafeLLamaContextHandle.cs#L203
'NativeApi.llama_eval(SafeLLamaContextHandle, int*, int, int)' is obsolete: 'use llama_decode() instead'
|
Test (windows-release):
LLama/LLamaInteractExecutor.cs#L92
Possible null reference argument for parameter 'data' in 'Task InteractiveExecutor.LoadState(ExecutorBaseState data)'.
|
Test (windows-release):
LLama/LLamaInteractExecutor.cs#L137
'IReadOnlyListExtensions.TokensEndsWithAnyString<TTokens>(TTokens, IList<string>?, SafeLlamaModelHandle, Encoding)' is obsolete: 'Use an Antiprompt processor instead'
|
Test (windows-release):
LLama/LLamaInteractExecutor.cs#L133
This async method lacks 'await' operators and will run synchronously. Consider using the 'await' operator to await non-blocking API calls, or 'await Task.Run(...)' to do CPU-bound work on a background thread.
|
Test (windows-release):
LLama/LLamaInteractExecutor.cs#L189
Possible null reference argument for parameter 'filename' in 'void StatefulExecutorBase.SaveSessionFile(string filename)'.
|
Test (windows-release):
LLama/LLamaInteractExecutor.cs#L159
This async method lacks 'await' operators and will run synchronously. Consider using the 'await' operator to await non-blocking API calls, or 'await Task.Run(...)' to do CPU-bound work on a background thread.
|
Test (windows-release):
LLama/LLamaInstructExecutor.cs#L109
Possible null reference argument for parameter 'data' in 'Task InstructExecutor.LoadState(ExecutorBaseState data)'.
|
Test (windows-release):
LLama/LLamaInstructExecutor.cs#L154
'IReadOnlyListExtensions.TokensEndsWithAnyString<TTokens>(TTokens, IList<string>?, SafeLlamaModelHandle, Encoding)' is obsolete: 'Use an Antiprompt processor instead'
|
Test (windows-release):
LLama/LLamaInstructExecutor.cs#L150
This async method lacks 'await' operators and will run synchronously. Consider using the 'await' operator to await non-blocking API calls, or 'await Task.Run(...)' to do CPU-bound work on a background thread.
|
Test (windows-release):
LLama/LLamaInstructExecutor.cs#L210
Possible null reference argument for parameter 'filename' in 'void StatefulExecutorBase.SaveSessionFile(string filename)'.
|
Test (macos-release):
LLama/Native/SafeLLamaContextHandle.cs#L203
'NativeApi.llama_eval(SafeLLamaContextHandle, int*, int, int)' is obsolete: 'use llama_decode() instead'
|
Test (macos-release):
LLama/LLamaInteractExecutor.cs#L92
Possible null reference argument for parameter 'data' in 'Task InteractiveExecutor.LoadState(ExecutorBaseState data)'.
|
Test (macos-release):
LLama/LLamaInstructExecutor.cs#L109
Possible null reference argument for parameter 'data' in 'Task InstructExecutor.LoadState(ExecutorBaseState data)'.
|
Test (macos-release):
LLama/LLamaInteractExecutor.cs#L137
'IReadOnlyListExtensions.TokensEndsWithAnyString<TTokens>(TTokens, IList<string>?, SafeLlamaModelHandle, Encoding)' is obsolete: 'Use an Antiprompt processor instead'
|
Test (macos-release):
LLama/LLamaInteractExecutor.cs#L133
This async method lacks 'await' operators and will run synchronously. Consider using the 'await' operator to await non-blocking API calls, or 'await Task.Run(...)' to do CPU-bound work on a background thread.
|
Test (macos-release):
LLama/LLamaExecutorBase.cs#L297
'LLamaContext.DeTokenize(IReadOnlyList<int>)' is obsolete: 'Use a `StreamingTokenDecoder` instead'
|
Test (macos-release):
LLama/LLamaInstructExecutor.cs#L154
'IReadOnlyListExtensions.TokensEndsWithAnyString<TTokens>(TTokens, IList<string>?, SafeLlamaModelHandle, Encoding)' is obsolete: 'Use an Antiprompt processor instead'
|
Test (macos-release):
LLama/LLamaInstructExecutor.cs#L150
This async method lacks 'await' operators and will run synchronously. Consider using the 'await' operator to await non-blocking API calls, or 'await Task.Run(...)' to do CPU-bound work on a background thread.
|
Test (macos-release):
LLama/LLamaExecutorBase.cs#L359
Non-nullable property 'Embeds' must contain a non-null value when exiting constructor. Consider declaring the property as nullable.
|
Test (macos-release):
LLama/LLamaExecutorBase.cs#L362
Non-nullable property 'EmbedInps' must contain a non-null value when exiting constructor. Consider declaring the property as nullable.
|
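The two CS8618 warnings say the Embeds and EmbedInps properties are declared non-nullable but can leave the constructor unset. A self-contained sketch of the standard fixes; the element types are assumptions, since only the property names appear in the warnings.

```csharp
using System.Collections.Generic;

class ExecutorStateSketch
{
    // Fix 1: initialize the non-nullable property so it is non-null when the constructor exits.
    public List<int> Embeds { get; set; } = new();

    // Fix 2 (alternative): declare the property nullable if "not yet set" is a legitimate state.
    public List<int>? EmbedInps { get; set; }
}
```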