- cross-posted to:
- technology@lemmy.world
AFAIK every NAS just uses unauthenticated connections to pull containers, and I’m not sure how many even allow you to log in (which would raise the limit to a whopping 40 pulls per hour).
So hopefully systems like /r/unRAID handle the throttling gracefully when clicking “update all”.
Anyone have ideas on how to set up a local docker hub proxy to keep the most common containers on-site instead of hitting docker hub every time?
Is there a project that acts like a registry — proxying requests with a TTL — that you can also push images to?
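For the client side of that setup, Docker can be told to try a local mirror before Docker Hub via the `registry-mirrors` setting in `daemon.json`; a minimal sketch, assuming your mirror runs at `localhost:5000` (the URL is a placeholder):

```json
{
  "registry-mirrors": ["http://localhost:5000"]
}
```

Drop that into `/etc/docker/daemon.json` and restart the Docker daemon; pulls for Docker Hub images will then hit the mirror first and only fall through to Hub on a cache miss.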
Almost all of them. Forgejo already handles containers, for example.
How? I was looking for this (although not very thoroughly)
[Edit] found it https://forgejo.org/docs/v1.21/user/packages/container/
A pull-through cache / proxy is what you’re looking for.
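The official `registry` image supports this directly: set `REGISTRY_PROXY_REMOTEURL` to Docker Hub and it caches every image pulled through it. A minimal sketch, assuming port 5000 and a host path of `/srv/registry` (both placeholders):

```shell
# Run a local pull-through cache of Docker Hub on port 5000.
# Cached layers are persisted under /srv/registry on the host.
docker run -d --restart=always --name hub-cache \
  -p 5000:5000 \
  -v /srv/registry:/var/lib/registry \
  -e REGISTRY_PROXY_REMOTEURL=https://registry-1.docker.io \
  registry:2
```

One caveat: in proxy mode the registry is pull-only, so if you also want to push your own images you’d run a second, non-proxy instance for that.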
Artifactory is mandatory in some industries because it keeps every version of your images forever, so you can build your projects reliably without an internet connection.
I think most self-hosted Git+CI/CD platforms have a container registry as a feature, but I’m not aware of a service that is just a standalone registry.
It’s easy to overlook because of the generic name, but this is pretty much that: https://hub.docker.com/_/registry
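For the push side, that same image also runs as a plain private registry; a minimal sketch (no auth or TLS configured, so local use only — port and image names are just examples):

```shell
# Start a bare local registry on port 5000
docker run -d --restart=always --name local-registry \
  -p 5000:5000 registry:2

# Re-tag an image and push it into the local registry
docker tag alpine:latest localhost:5000/alpine:latest
docker push localhost:5000/alpine:latest
```

Anything pushed this way can then be pulled back as `localhost:5000/alpine:latest` without touching Docker Hub at all.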
Edit: forgot there’s jfrog artifactory as well