this post was submitted on 21 Aug 2023
33 points (92.3% liked)

cross-posted from: https://lemmy.world/post/3549390

stable-diffusion.cpp

Introducing stable-diffusion.cpp, a pure C/C++ inference engine for Stable Diffusion! This is a really awesome implementation to help speed up home inference of diffusion models.

Aimed at developers and AI enthusiasts, the repository offers text-to-image and image-to-image generation with several quantization options and memory-efficient, accelerated CPU inference.


Key Features:

  • Efficient Implementation: Plain C/C++ built on the ggml framework, working the same way as llama.cpp.
  • Multiple Precision Support: 16-bit and 32-bit float weights, plus 4-bit, 5-bit, and 8-bit integer quantization.
  • Optimized Performance: Memory-efficient CPU inference with AVX, AVX2, and AVX512 support on x86.
  • Versatile Modes: Original txt2img and img2img modes, plus negative prompt support.
  • Cross-Platform Compatibility: Runs on Linux, macOS, and Windows.

Getting Started

Cloning, building, and running are straightforward, and the repo provides detailed examples for both text-to-image and image-to-image generation. With a range of precision options and clear usage guidelines, it is easy to adapt to your own project.

git clone --recursive https://github.com/leejet/stable-diffusion.cpp
cd stable-diffusion.cpp
If you have already cloned the repository, you can update it to the latest code with:
cd stable-diffusion.cpp
git pull origin master
git submodule update
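
To get from a fresh clone to a generated image, the overall flow looks roughly like the sketch below. Treat it as a sketch only: it assumes a standard CMake build, an sd binary under ./bin, and -m/-p/-o flags for the model, prompt, and output path, and it skips converting a Stable Diffusion checkpoint to the ggml format the binary expects, so check the repo's README and sd --help for the exact commands and options.

mkdir build && cd build
cmake ..
cmake --build . --config Release
# hypothetical invocation: generate a 512x512 image from a text prompt
# (the model path is a placeholder for a checkpoint converted to ggml format)
./bin/sd -m ../models/your-ggml-model-f16.bin -p "a photo of a cat sitting on a windowsill" -o output.png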

More Details

  • Plain C/C++ implementation based on ggml, working in the same way as llama.cpp
  • 16-bit, 32-bit float support
  • 4-bit, 5-bit and 8-bit integer quantization support
  • Accelerated memory-efficient CPU inference
    • Only requires ~2.3GB when using txt2img with fp16 precision to generate a 512x512 image
  • AVX, AVX2 and AVX512 support for x86 architectures
  • Original txt2img and img2img modes (a usage sketch follows this list)
  • Negative prompt support
  • stable-diffusion-webui-style tokenizer (not all features yet, only token weighting for now)
  • Sampling method
    • Euler A
  • Supported platforms
    • Linux
    • macOS
    • Windows
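
As a rough illustration of the img2img mode and negative prompt support listed above, an invocation might look like the one below. The --mode, -n (negative prompt), -i (input image), and --strength flags are assumptions used for illustration rather than confirmed option names, so consult sd --help for the real ones.

# hypothetical img2img run with a negative prompt
# (--strength is assumed to control how strongly the input image is altered)
./bin/sd --mode img2img -m ../models/your-ggml-model-f16.bin -p "cat with blue eyes" -n "blurry, low quality" -i ./input.png -o ./img2img_output.png --strength 0.4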

This is a really exciting repo. I'll be honest, I'm not as well versed in what's going on in diffusion inference, but I do know that more efficient and effective ways of running these models are always welcome among people who use diffusers frequently, especially those who need to multitask and keep some performance headroom.

top 5 comments
JackGreenEarth@lemm.ee 5 points 1 year ago

Does this run faster than the python model?

olicvb@lemmy.ca 3 points 1 year ago (last edited 1 year ago)

Got a 1.42s generation with the C++ one and 2.1s with auto1111's SD (note: my torch install is outdated).

Though I'm having trouble finding the generated image 😅.

All with the same generation settings, on a 5800X CPU and a 3080 12GB.

JackGreenEarth@lemm.ee 0 points 1 year ago

I'd love a 2s generation; it usually takes about 60s with my 1660 Ti 4GB.

Fubarberry@sopuli.xyz 0 points 1 year ago

Are you sure that it's using your GPU? That seems way slower than I'd expect.

TimeSquirrel@kbin.social 2 points 1 year ago

Nice to see C++ getting some love. The kiddies and their Python have taken over.