Details about package llama-cpp
Name: llama-cpp
Uploader: Mathieu Baudier <mbaudier@argeo.org> (Debian QA page)
Description:
- libllama - Inference of LLMs in pure C/C++ (shared library)
- llama-cpp-cli - Inference of LLMs in pure C/C++ (CLI)
- llama-cpp-server - Inference of LLMs in pure C/C++ (server)
- llama-cpp-quantize - Inference of LLMs in pure C/C++ (quantize)
- libllama-dev - Inference of LLMs in pure C/C++ (development files)
- llama-cpp-dev - Inference of LLMs in pure C/C++ (common static library)
Package uploads
Upload #1
Information
Version: 0.0.4719-1
Uploaded: 2025-02-17 13:11
Source package: llama-cpp_0.0.4719-1.dsc
Distribution: unstable
Section: science
Priority: optional
Homepage: https://github.com/ggerganov/llama.cpp
Changelog
llama-cpp (0.0.4719-1) unstable; urgency=medium

  * Update upstream
QA information
- Package uses debhelper-compat (Debhelper compatibility level 13)
- Watch file is not present (see the debian/watch sketch after this list)
- Package is not native (Format: 3.0 (quilt))
- "Maintainer" email is the same as the uploader
- Package has lintian warnings (llama-cpp source):
W: mismatched-override
- source-is-missing [examples/server/webui/index.html] [debian/source/lintian-overrides:4]
I: debian-watch-file-is-missing
I: file-contains-fixme-placeholder
- FIXME [debian/rules:21]
I: installable-field-mirrors-source (see the debian/control sketch after this list)
- (in section for libllama) Priority [debian/control:12]
- (in section for libllama-dev) Priority [debian/control:47]
- (in section for llama-cpp-cli) Priority [debian/control:23]
- (in section for llama-cpp-dev) Priority [debian/control:55]
- (in section for llama-cpp-quantize) Priority [debian/control:39]
- (in section for llama-cpp-server) Priority [debian/control:31]
I: missing-prerequisite-for-pyproject-backend
- poetry.core.masonry.api (does not satisfy python3-poetry-core:any, pybuild-plugin-pyproject:any) [pyproject.toml:39]
I: out-of-date-standards-version
- 4.5.1 (released 2020-11-17) (current is 4.7.0)
P: source-contains-prebuilt-java-object
- [examples/llama.android/gradle/wrapper/gradle-wrapper.jar]
P: source-contains-prebuilt-javascript-object
- [examples/server/public_legacy/index.js]
- [examples/server/public_legacy/system-prompts.js]
X: upstream-metadata-file-is-missing (see the DEP-12 sketch after this list)
X: very-long-line-length-in-source-file
- 1054728 > 512 [models/ggml-vocab-command-r.gguf:17831]
- 1136 > 512 [examples/gritlm/gritlm.cpp:195]
- 1138 > 512 [models/ggml-vocab-roberta-bpe.gguf.out:46]
- 1198 > 512 [models/ggml-vocab-llama-spm.gguf.out:46]
- 1198 > 512 [models/ggml-vocab-phi-3.gguf.out:46]
- 1245 > 512 [poetry.lock:1118]
- 1256 > 512 [examples/server/webui/src/components/Header.tsx:125]
- 128742 > 512 [models/ggml-vocab-llama-spm.gguf:2009]
- 129441 > 512 [models/ggml-vocab-phi-3.gguf:2063]
- 130593 > 512 [models/ggml-vocab-bert-bge.gguf:3636]
- 136617 > 512 [models/ggml-vocab-deepseek-coder.gguf:1886]
- 1536 > 512 [prompts/dan-modified.txt:1]
- 1579 > 512 [models/ggml-vocab-chameleon.gguf.out:46]
- 1661 > 512 [prompts/dan.txt:1]
- 1763 > 512 [examples/server/public_legacy/system-prompts.js:48]
- 205861 > 512 [models/ggml-vocab-refact.gguf:2496]
- 206025 > 512 [models/ggml-vocab-starcoder.gguf:2434]
- 207776 > 512 [models/ggml-vocab-gpt-2.gguf:3658]
- 2090 > 512 [models/templates/deepseek-ai-DeepSeek-R1-Distill-Llama-8B.jinja:1]
- 2090 > 512 [models/templates/deepseek-ai-DeepSeek-R1-Distill-Qwen-32B.jinja:1]
- 209930 > 512 [models/ggml-vocab-gpt-neox.gguf:3303]
- 209930 > 512 [models/ggml-vocab-mpt.gguf:3303]
- 23078 > 512 [examples/server/public_legacy/index.js:1]
- 267123 > 512 [models/ggml-vocab-falcon.gguf:4587]
- 364703 > 512 [models/ggml-vocab-baichuan.gguf:2117]
- 4191 > 512 [tests/test-chat-template.cpp:218]
- 467504 > 512 [models/ggml-vocab-deepseek-llm.gguf:5878]
- 523 > 512 [examples/tts/tts-outetts.py:141]
- 530 > 512 [docs/backend/OPENCL.md:16]
- 534 > 512 [docs/development/token_generation_performance_tips.md:20]
- 543375 > 512 [models/ggml-vocab-llama-bpe.gguf:7855]
- 546 > 512 [ggml/src/ggml-kompute/kompute/README.md:511]
- 560 > 512 [models/templates/CohereForAI-c4ai-command-r-plus-tool_use.jinja:117]
- 575 > 512 [examples/tts/tts.cpp:534]
- 577 > 512 [examples/llava/MobileVLM-README.md:185]
- 584 > 512 [ggml/src/ggml-kompute/kompute/examples/pi4_mesa_build/README.md:9]
- 585 > 512 [examples/chat-13B.bat:42]
- 590 > 512 [examples/server/README.md:433]
- 591 > 512 [docs/backend/CANN.md:18]
- 621201 > 512 [models/ggml-vocab-qwen2.gguf:7400]
- 673 > 512 [ggml/src/ggml-kompute/kompute/CODE_OF_CONDUCT.md:111]
- 695 > 512 [examples/main/README.md:317]
- 703 > 512 [common/json.hpp:2742]
- 703 > 512 [examples/llama-bench/README.md:257]
- 733 > 512 [grammars/README.md:299]
- 736 > 512 [examples/server/public_legacy/index-new.html:860]
- 744 > 512 [models/templates/CohereForAI-c4ai-command-r7b-12-2024-tool_use.jinja:91]
- 790 > 512 [models/ggml-vocab-llama-bpe.gguf.out:46]
- 795 > 512 [models/templates/meetkai-functionary-medium-v3.1.jinja:28]
- 799 > 512 [models/ggml-vocab-deepseek-r1-qwen.gguf.out:46]
- 799 > 512 [models/ggml-vocab-qwen2.gguf.out:46]
- 816 > 512 [models/ggml-vocab-bert-bge.gguf.out:46]
- 821933 > 512 [models/ggml-vocab-aquila.gguf:2933]
- 849 > 512 [models/ggml-vocab-mpt.gguf.out:46]
- 851 > 512 [ggml/src/ggml-vulkan/ggml-vulkan.cpp:3793]
- 853 > 512 [models/ggml-vocab-refact.gguf.out:46]
- 854 > 512 [models/ggml-vocab-starcoder.gguf.out:46]
- 860 > 512 [models/ggml-vocab-command-r.gguf.out:46]
- 870 > 512 [models/ggml-vocab-deepseek-llm.gguf.out:46]
- 928 > 512 [models/ggml-vocab-falcon.gguf.out:46]
- 940 > 512 [models/ggml-vocab-deepseek-coder.gguf.out:46]
- 9726 > 512 [docs/development/llama-star/idea-arch.key:2914]
- 982 > 512 [models/ggml-vocab-gpt-2.gguf.out:46]
O: source-is-missing
- [examples/server/public_legacy/index-new.html]
- [examples/server/public_legacy/index.js]
- [examples/server/public_legacy/system-prompts.js]
W: mismatched-override
- No VCS field present
- Package is not in Debian
- d/copyright is in DEP5 format (Upstream Contact: https://github.com/ggerganov/llama.cpp/issues; Licenses: MIT)
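The QA check and the debian-watch-file-is-missing tag above both point at the absent debian/watch. A minimal sketch follows; it assumes upstream's bNNNN release tags on GitHub correspond to the 0.0.NNNN upstream version used by this package, which would need to be verified before use.

    version=4
    # Assumption: upstream tag bNNNN maps to upstream version 0.0.NNNN
    opts=uversionmangle=s/^/0.0./,filenamemangle=s/.+\/b(\d\S+)\.tar\.gz/llama-cpp-$1.tar.gz/ \
      https://github.com/ggerganov/llama.cpp/tags .*/b(\d\S+)\.tar\.gz

Running uscan --verbose against such a file would show whether the tag pattern and version mangling actually match what GitHub serves.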
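For the upstream-metadata-file-is-missing tag, a debian/upstream/metadata file in DEP-12 (YAML) format could look roughly like the following; the URLs are inferred from the Homepage and Upstream Contact fields above and should be double-checked.

    ---
    Bug-Database: https://github.com/ggerganov/llama.cpp/issues
    Bug-Submit: https://github.com/ggerganov/llama.cpp/issues/new
    Repository: https://github.com/ggerganov/llama.cpp.git
    Repository-Browse: https://github.com/ggerganov/llama.cpp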
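The installable-field-mirrors-source tags flag Priority fields in the binary stanzas of debian/control that merely repeat the value already set in the source stanza, so removing those lines should silence them. A trimmed-down sketch (most fields omitted for brevity):

    Source: llama-cpp
    Section: science
    Priority: optional

    Package: libllama
    Architecture: any
    # No Priority field needed here: it would only mirror the value
    # set in the source stanza above.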
Comments
Hi, please review and rectify the issues noted above. Some of the work done to fix ggml may also apply here.
Needs work. Phil Wyett, Feb. 19, 2025, 7:37 a.m.