Actions: huggingface/candle

Continuous integration

3,093 workflow runs
Add Support for mixed quantized BitNet Architecture Inference
Continuous integration #6246: Pull request #2683 synchronize by JoseCarlosGarcia95
January 1, 2025 12:14 Action required JoseCarlosGarcia95:feat/qbitnet
Add Support for mixed quantized BitNet Architecture Inference
Continuous integration #6245: Pull request #2683 synchronize by JoseCarlosGarcia95
January 1, 2025 11:33 Action required JoseCarlosGarcia95:feat/qbitnet
UniPC for diffusion sampling
Continuous integration #6244: Pull request #2684 synchronize by nicksenger
December 31, 2024 19:15 Action required nicksenger:feat/unipc
UniPC for diffusion sampling
Continuous integration #6243: Pull request #2684 synchronize by nicksenger
December 31, 2024 18:42 Action required nicksenger:feat/unipc
Update the hf-hub dependency to 0.4.0. (#2691)
Continuous integration #6242: Commit b12c7c2 pushed by LaurentMazare
December 31, 2024 18:07 11m 42s main
Update the hf-hub dependency to 0.4.0.
Continuous integration #6241: Pull request #2691 synchronize by LaurentMazare
December 31, 2024 18:00 8m 57s hf-hub-0.4.0
Actually remove the default hf-hub cache path for glm. (#2696)
Continuous integration #6240: Commit 94ffc2e pushed by LaurentMazare
December 31, 2024 10:00 10m 47s main
Actually remove the default hf-hub cache path for glm.
Continuous integration #6239: Pull request #2696 opened by LaurentMazare
December 31, 2024 09:57 10m 2s glm-remove-default-cache
Use the default hf-hub cache for glm. (#2695)
Continuous integration #6238: Commit 7354afc pushed by LaurentMazare
December 31, 2024 09:55 10m 34s main
Add Support for BitNet Architecture Inference
Continuous integration #6237: Pull request #2664 synchronize by JoseCarlosGarcia95
December 31, 2024 09:54 Action required JoseCarlosGarcia95:feat/bitnet-support
Use the default hf-hub cache for glm.
Continuous integration #6236: Pull request #2695 opened by LaurentMazare
December 31, 2024 09:53 10m 33s glm-hub-cache
Add Support for mixed quantized BitNet Architecture Inference
Continuous integration #6235: Pull request #2683 synchronize by JoseCarlosGarcia95
December 31, 2024 09:40 Action required JoseCarlosGarcia95:feat/qbitnet
Add Support for mixed quantized BitNet Architecture Inference
Continuous integration #6234: Pull request #2683 synchronize by JoseCarlosGarcia95
December 31, 2024 09:36 Action required JoseCarlosGarcia95:feat/qbitnet
Flash-Attn upgrade / SoftCap Candle-FlashAttn [3/n] (#2690)
Continuous integration #6233: Commit 2a705e6 pushed by LaurentMazare
December 31, 2024 09:04 10m 11s main
Flash-Attn upgrade / SoftCap Candle-FlashAttn [2/n] (#2689)
Continuous integration #6232: Commit a594ef6 pushed by LaurentMazare
December 31, 2024 08:41 8m 57s main
Flash-Attn upgrade / SoftCap Candle-FlashAttn [1/n] (#2688)
Continuous integration #6230: Commit 71cd6d5 pushed by LaurentMazare
December 31, 2024 08:32 10m 33s main
Streamline the glm4 example. (#2694)
Continuous integration #6229: Commit d60eba1 pushed by LaurentMazare
December 31, 2024 08:21 9m 42s main
Streamline the glm4 example.
Continuous integration #6228: Pull request #2694 opened by LaurentMazare
December 31, 2024 08:18 10m 14s glm-prompt
Fix a cuda warning. (#2693)
Continuous integration #6227: Commit e38e2a8 pushed by LaurentMazare
December 31, 2024 08:06 9m 24s main
Fix a cuda warning.
Continuous integration #6226: Pull request #2693 opened by LaurentMazare
December 31, 2024 07:59 8m 52s cuda-fix-warning
Update the hf-hub dependency to 0.4.0.
Continuous integration #6225: Pull request #2691 synchronize by LaurentMazare
December 30, 2024 16:56 9m 58s hf-hub-0.4.0
Update the hf-hub dependency to 0.4.0.
Continuous integration #6224: Pull request #2691 opened by LaurentMazare
December 30, 2024 16:52 10m 9s hf-hub-0.4.0
Add Support for mixed quantized BitNet Architecture Inference
Continuous integration #6223: Pull request #2683 synchronize by JoseCarlosGarcia95
December 30, 2024 16:31 Action required JoseCarlosGarcia95:feat/qbitnet
Update README.org (#2670)
Continuous integration #6221: Commit 460616f pushed by LaurentMazare
December 30, 2024 10:32 10m 13s main