Actions: ggerganov/llama.cpp

Pull Request Labeler

6,828 workflow runs

Vulkan: Destroy Vulkan instance on exit
Pull Request Labeler #6629: Pull request #10989 opened by 0cc4m
December 26, 2024 19:17 15s
vulkan: Use push constant offset to handle misaligned descriptors
Pull Request Labeler #6628: Pull request #10987 opened by jeffbolznv
December 26, 2024 17:10 25s
examples, ggml : fix GCC compiler warnings
Pull Request Labeler #6627: Pull request #10983 synchronize by peter277
December 26, 2024 12:12 14s
examples, ggml : fix GCC compiler warnings
Pull Request Labeler #6626: Pull request #10983 opened by peter277
December 26, 2024 12:08 19s
llamafile_sgemm API - INT8 implementation
Pull Request Labeler #6625: Pull request #10912 synchronize by amritahs-ibm
December 26, 2024 10:12 16s
ggml : fix undefined reference to std::filesystem(#10978)
Pull Request Labeler #6624: Pull request #10979 synchronize by Clauszy
December 26, 2024 07:08 19s
ggml : fix undefined reference to std::filesystem(#10978)
Pull Request Labeler #6623: Pull request #10979 opened by Clauszy
December 26, 2024 05:25 17s
server : add OAI compat for /v1/completions
Pull Request Labeler #6622: Pull request #10974 synchronize by ngxson
December 25, 2024 16:03 13s
server : add OAI compat for /v1/completions
Pull Request Labeler #6621: Pull request #10974 synchronize by ngxson
December 25, 2024 15:52 17s
server : add OAI compat for /v1/completions
Pull Request Labeler #6620: Pull request #10974 synchronize by ngxson
December 25, 2024 13:41 19s
server : add OAI compat for /v1/completions
Pull Request Labeler #6619: Pull request #10974 opened by ngxson
December 25, 2024 12:49 14s
Removed unnecessary iteration of batch n_tokens on sequence embedding…
Pull Request Labeler #6618: Pull request #10972 opened by Emreerdog
December 25, 2024 11:21 15s
DO NOT MERGE Add olmo2 tokenizer to convert script (leaving open for discussion)
Pull Request Labeler #6617: Pull request #10535 synchronize by bartowski1182
December 25, 2024 01:01 13s
Introduce Graph Profiler
Pull Request Labeler #6616: Pull request #9659 synchronize by max-krasnyansky
December 24, 2024 23:38 17s
Cosine similarity is undefined when any vector is zero.
Pull Request Labeler #6615: Pull request #10968 opened by AndyM3
December 24, 2024 17:13 11s
more perfo with llamafile tinyblas on x86_64.
Pull Request Labeler #6614: Pull request #10714 synchronize by Djip007
December 24, 2024 16:41 5m 47s
server : add support for "encoding_format": "base64" to the */embeddings endpoints
Pull Request Labeler #6613: Pull request #10967 synchronize by ngxson
December 24, 2024 16:00 13s
server: allow filtering llama server response fields
Pull Request Labeler #6612: Pull request #10940 synchronize by ngxson
December 24, 2024 15:29 13s
server : add support for "encoding_format": "base64" to the */embeddings endpoints
Pull Request Labeler #6611: Pull request #10967 opened by elk-cloner
December 24, 2024 15:06 14s
more perfo with llamafile tinyblas on x86_64.
Pull Request Labeler #6610: Pull request #10714 synchronize by Djip007
December 24, 2024 14:10 19s
more perfo with llamafile tinyblas on x86_64.
Pull Request Labeler #6609: Pull request #10714 synchronize by Djip007
December 24, 2024 14:02 15s
llama : the WPM vocabs use the CLS token as BOS
Pull Request Labeler #6608: Pull request #10930 synchronize by ggerganov
December 24, 2024 07:44 37m 26s
llama : refactor src/llama.cpp
Pull Request Labeler #6607: Pull request #10902 synchronize by ggerganov
December 24, 2024 07:43 10m 43s
fix: ggml: fix vulkan-shaders-gen build
Pull Request Labeler #6606: Pull request #10448 synchronize by sparkleholic
December 24, 2024 04:06 15s
fix: ggml: fix vulkan-shaders-gen build
Pull Request Labeler #6605: Pull request #10448 synchronize by sparkleholic
December 24, 2024 04:03 14s