
[Feature]: Add development support for Dev Containers #898

Open
rrmistry opened this issue Aug 12, 2024 · 6 comments

@rrmistry

Background & Description

With advancements in Dev Containers, it is now possible to create a reliable, consistent development environment declaratively.

The value of implementing this is:

  1. Automate environment setup
  2. Make environment setup independent of OS, IDE, platform/architecture, etc. by following open specifications
  3. Provide a reliable, consistent environment across developers (eliminating "it works on my machine" issues)

API & Usage

No response

How to implement

With VS Code, add a Dev Container configuration that declares the environment dependencies, tooling, setup, etc.

Keeping that configuration in the same repo means the setup is version-controlled and reliably reproducible.
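For illustration, a minimal devcontainer.json along these lines could live at .devcontainer/devcontainer.json in the repo. This is only a sketch: the container name and post-create command are assumptions, not an official LLamaSharp configuration; the base image is Microsoft's published .NET 8 dev container image.

```json
{
    // Hypothetical sketch; not an official LLamaSharp config.
    "name": "LLamaSharp Dev",

    // Microsoft's published .NET 8 dev container base image.
    "image": "mcr.microsoft.com/devcontainers/dotnet:8.0",

    // Assumed convenience step: restore NuGet packages on creation.
    "postCreateCommand": "dotnet restore"
}
```

VS Code (with the Dev Containers extension) picks this file up automatically and offers to reopen the repo inside the container.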

@SignalRT
Collaborator

I would like to understand the problem you want to solve:

As I understand it, the project doesn't contain external dependencies other than llama.cpp, and we will still need GitHub pipelines to compile the binaries for all platforms. Dev Containers will not solve that.

Docker containers (to my knowledge) limit GPU usage to NVIDIA + CUDA; the other GPU alternatives will not use the GPU when executing inside a Docker container.

Dev Containers is the technology Microsoft is pushing, but I would like to understand the value you expect it to bring to this project.

@rrmistry
Author

Because I've been fighting with setup issues and haven't managed to get it going yet.

[screenshot: error output]

Besides, our company's hosting is containerized, and having a self-hosted AI agent is key for us. This means we need LLamaSharp to run containerized.

Having a dev container allows us to:

  • Quickly get a working environment in which to test the product
  • Understand the components, dependencies, and tooling necessary to host LLamaSharp in production
  • Deploy in Kubernetes/OpenShift and use standard monitoring, security, and scaling patterns

This has less to do with Microsoft tooling and more to do with using vendor-agnostic industry standards.

@martindevans
Member

The issue you're getting there is because the DLLs/SOs cannot be loaded, probably because there's a missing dependency. Usually that's because of a GPU issue (e.g. bad drivers, incompatible hardware, missing cudart etc), which wouldn't be fixed by containerising it.

I don't know anything about dev containers, so feel free to tell me if anything I'm saying here is rubbish!

@AsakusaRinne
Collaborator

I don't know much about dev containers, but I'm pretty sure LLamaSharp can be integrated into a Docker container (CPU or CUDA). Could you please tell us more about why a dev container instead of a Docker container?

@rrmistry
Author

> I didn't know much about dev container but I'm pretty sure LLamaSharp can be integrated into a docker container (CPU or CUDA). Could you please tell more about why dev container instead of docker container?

Docker containers are the real goal for the production setup. Dev containers are just an intermediate step to get there, making development and customization of LLamaSharp much easier.

It is really well illustrated in this visual:

[image: dev container use case]

The value add here is to reduce inconsistencies between development setup and production setup.

I'm still not able to figure out what I'm missing to get CPU-based inference going with LLamaSharp and the 3.1-8B model.

A simple way to reproduce my problem is to:

  1. Create a fresh WSL instance or Linux VM / container
  2. Install dotnet 8
  3. Clone LLamaSharp repo (this repo)
  4. Run the LLamaSharp.Example project via CLI: dotnet run --project LLamaSharp.Example/LLamaSharp.Example.csproj --framework net8.0

@martindevans
Member

Oh, was the error you showed above from using the CPU backend? If so, everything I said about GPU/driver issues is not relevant!
