I use apt-cacher-ng. Most of my use case is caching packages for Docker image builds, as I build up to 200+ images daily. In reality, aggressive image caching means I don't actually build anywhere near that many each day, but the stats are impressive: 8.1 GB fetched from the internet versus 108 GB served from the acng instance, according to the recent-history stats page.
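For Docker builds, the usual approach is to drop an APT proxy config into the image before the first apt-get run. A minimal sketch; the cache address is a placeholder for your own apt-cacher-ng instance:

```shell
# Generate the one-line APT proxy config that points apt at apt-cacher-ng.
# In a Dockerfile you would write this to /etc/apt/apt.conf.d/01proxy
# before the first apt-get step. Host/port below are placeholders.
printf 'Acquire::http::Proxy "http://192.168.1.2:3142";\n' > 01proxy
cat 01proxy
```

Every apt-get in the build then fetches through the cache, so repeated builds hit the local store instead of the mirrors.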
At the beginning of the COVID pandemic, I was attempting to mirror all of my favorite distro repos, just in case of societal collapse.
jigdo: the entire repositories of your favorite Debian architectures, for the win.
I want to look into apt-cacher-ng for learning purposes, to stop the tens of VMs in my homelab from adding load to the official Debian repos, and also to check whether there's a way to have it mirror only a list of "approved" packages.
saw a huge time improvement even though I have a good internet connection
Note that for best performance you should use https://deb.debian.org/
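Besides transparent proxy mode, apt-cacher-ng can also be addressed directly in sources.list by prefixing the mirror URL with the cache host. A sketch; the hostname and port here are assumptions for your setup:

```shell
# sources.list entry that routes fetches through the cache by URL prefix.
# "apt-cache.lan:3142" is a placeholder for your apt-cacher-ng instance.
echo 'deb http://apt-cache.lan:3142/deb.debian.org/debian bookworm main' > cache-sources.snippet
cat cache-sources.snippet
```

This form is handy when you can't (or don't want to) set an APT proxy on every client.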
Semi-related I have set up a personal APT repository on gitlab pages: https://nodiscc.gitlab.io/toolbox/ (I think Ubuntu users would call that a "PPA"). It uses aptly and a homegrown Makefile/Gitlab CI-based build system (sources/build tools are linked from the page). I wouldn't recommend this exact setup for critical production needs, but it works.
And... you can also convert the ISO files into a hosted repository on your network using Apache:
apt install apache2 build-essential
mkdir /var/www/html/packages
Now, create additional directories under /var/www/html/packages/ to hold packages for your systems' architectures. For example, create a directory "amd64". You can keep multiple directories and serve packages for different architectures at the same time.
mkdir /var/www/html/packages/amd64
Copying all DEB files from Debian installation media
Mount the first CD/DVD and copy all .deb packages from it to the /var/www/html/packages/amd64/ directory.
mount /dev/cdrom /media/cdrom
find /media/cdrom/pool/ -name "*.deb" -exec cp {} /var/www/html/packages/amd64 \;
After copying all deb files, unmount the first DVD using the following command.
umount /media/cdrom
Again mount all remaining CD/DVD one by one and copy the .deb files as shown above.
To verify the files, navigate to http://192.168.1.150/packages/amd64/ in your browser. You will see all the packages from your Debian DVDs. Here 192.168.1.150 is my Debian server's IP address.
Create Catalog file
Switch to your repository directory, i.e. /var/www/html/packages/amd64/:
cd /var/www/html/packages/amd64/
and enter the following command to create a catalog file for APT's use. This step is required so that APT and Synaptic Manager can fetch packages from the local repository; otherwise the packages in it will not be visible to them.
dpkg-scanpackages . /dev/null | gzip -9c > Packages.gz
This command will scan all deb files and create the local repository in your Debian server. This may take a while depending upon the number of packages in your local repository folder. Be patient or grab a cup of coffee.
Sample output:
dpkg-scanpackages: warning: Packages in archive but missing from override file:
dpkg-scanpackages: warning: accountsservice acl acpi acpi-support-base acpid adduser adwaita-icon-theme apache2-bin apg apt apt-listchanges apt-offline apt-utils aptitude aptitude-common aptitude-doc-en aspell aspell-en at at-spi2-core avahi-daemon
[...]
xserver-xorg-video-neomagic xserver-xorg-video-nouveau xserver-xorg-video-openchrome xserver-xorg-video-r128 xserver-xorg-video-radeon xserver-xorg-video-savage xserver-xorg-video-siliconmotion xserver-xorg-video-sisusb xserver-xorg-video-tdfx xserver-xorg-video-trident xserver-xorg-video-vesa xserver-xorg-video-vmware xterm xwayland xz-utils yelp yelp-xsl zenity zenity-common zlib1g
dpkg-scanpackages: info: Wrote 1151 entries to output Packages file.
Please note that whenever you add a new deb file to this repository, you should re-run the above command to regenerate the catalog file.
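Since the catalog has to be rebuilt after every new .deb, it's handy to wrap the command in a small helper script. A sketch using the paths from the example above; adjust them to your layout:

```shell
# Write a helper that regenerates Packages.gz; run it by hand or from cron
# whenever new .deb files land in the repository directory.
cat > refresh-repo.sh <<'EOF'
#!/bin/sh
set -e
cd /var/www/html/packages/amd64/
dpkg-scanpackages . /dev/null | gzip -9c > Packages.gz
EOF
chmod +x refresh-repo.sh
```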
Done! We created the catalog file.
Configure Server sources list
After creating the catalog file, go to your server (local) system and open the /etc/apt/sources.list file:
nano /etc/apt/sources.list
Comment out all lines and add your APT repository location as shown below.
deb file:/var/www/html/packages/amd64/ /
Configure Clients
After creating the catalog file, go to your client systems. Open /etc/apt/sources.list file.
vim /etc/apt/sources.list
Add the server repository location as shown below. Comment out all sources list except the local repository.
deb http://192.168.1.150/packages/amd64/ /
Note: Put a space between deb and http://192.168.1.150/packages/amd64/ and /.
The DVDs are fine for offline use, but I don't know how to keep them updated. They'd probably take up loads of space too, since I guess they're roughly equivalent to a full repo mirror.
So what are you using for a local repository mirror? apt-mirror or ftpsync? I usually keep ISOs for the architectures that interest me using jigdo, as it can update them later on.
ISOs are harder to maintain for sure but they're more standalone and might survive adversities better.
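For reference, apt-mirror is driven by /etc/apt/mirror.list. A minimal sketch of that config; the base path and suite names are assumptions, not a tested setup:

```shell
# Write a minimal apt-mirror config; install as /etc/apt/mirror.list and
# adjust base_path, suite, and components to taste.
cat > mirror.list <<'EOF'
set base_path /var/spool/apt-mirror
deb http://deb.debian.org/debian bookworm main contrib
clean http://deb.debian.org/debian
EOF
```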
I use it with Kubuntu. Doing apt update is now much faster. I did some testing and found a good public mirror, so I could max out my connection (100 Mbit) with about 15 ms latency to the server. But I think the problem was that there are so many small files. Running nala to fetch the files in parallel helps, of course. With apt-cacher-ng I don't need nala at all: the low latency and gigabit connection to my server make access fast. I just need to find a good way to fill it with new updates.
A second problem is figuring out whether anything can be done to speed up apt upgrade itself, which I guess is not possible. A workaround with snapshots and sending diffs doesn't sound efficient either, even on older hardware.
apt update: 4 seconds vs 16 seconds.
apt upgrade --download-only: 10 seconds vs 84 seconds.
Did you know you can use the ISO files as repositories? It's easier in some situations.
- Create the folders (mountpoint) to mount the ISO files
sudo mkdir -p /media/repo_1
sudo mkdir -p /media/repo_2
sudo mkdir -p /media/repo_3
- Mount the ISO files
sudo mount -o loop ~/Downloads/debian-8.0.0-amd64-DVD-1.iso /media/repo_1/
sudo mount -o loop ~/Downloads/debian-8.0.0-amd64-DVD-2.iso /media/repo_2/
sudo mount -o loop ~/Downloads/debian-8.0.0-amd64-DVD-3.iso /media/repo_3/
- Edit the /etc/apt/sources.list file to add the repositories
vim /etc/apt/sources.list
deb file:///media/repo_1/ jessie main contrib
deb file:///media/repo_2/ jessie main contrib
deb file:///media/repo_3/ jessie main contrib
- Run
sudo apt-get update
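The per-ISO steps above can be scripted. This sketch only generates the sources.list lines; the sudo mkdir/mount commands are unchanged from above, and the suite/components match the Debian 8 example:

```shell
# Emit one "deb file://..." line per mounted ISO; append these lines to
# /etc/apt/sources.list after mounting each image at /media/repo_N.
for i in 1 2 3; do
  echo "deb file:///media/repo_$i/ jessie main contrib"
done > iso-sources.snippet
cat iso-sources.snippet
```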