AWS updates on the cheap

Gentoo Build Server/Compute

Intro and Prep

Gentoo compiles everything. After I uploaded my custom Gentoo image, I realized that updating on a 634MB, 1-virtual-CPU machine might be a bad idea: I would hit the I/O limit or the CPU limit and actually have to pay money, ugh. Picking up from rich0’s blog, let’s assume your first instance is up and running, has been for a while, and now needs an update; keep the chrooted environment you created in step 7.

Exposing your update server

Now you need an Apache web server set up. It does not need to be exposed to the internet, but that makes things easier. Go into the Apache document root and create a symlink into the chrooted environment, specifically to where emerge and quickpkg leave their package files. In my case I did this:
ln -s /home/binserver/bounce/home/portage/distfiles/ bounce-bin

Where /home/binserver/bounce is my chrooted environment and /home/portage/distfiles/ is my PKGDIR from make.conf. If you left PKGDIR at its default, the packages land in usr/portage/packages inside your chroot. Next we need to expose the portage tree of the update server via rsync. First, install rsync on the same box as your Apache server. Second, share the portage tree of your chroot as a module named gentoo-portage (the name the EC2 make.conf will sync against). For me that is the following configuration in rsyncd.conf:

[gentoo-portage]
path = /home/binserver/bounce/usr/portage
comment = Bin server for outpost.

This is an important point: it makes your EC2 instance sync against your update server instead of the official Gentoo tree, which keeps your generated package versions in sync with the EC2 portage tree. Your update server should now be ready to go. If your internet connection allows rsync and HTTP, make sure they are mapped to the right ports.
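Before pointing EC2 at it, the symlink half of the setup can be rehearsed with throwaway paths; everything under /tmp below is a placeholder, not the real chroot or document root:

```shell
# Rehearse the document-root -> PKGDIR symlink with placeholder paths.
mkdir -p /tmp/demo/chroot/home/portage/distfiles /tmp/demo/htdocs
cd /tmp/demo/htdocs
# ln -s takes the target first, then the link name.
ln -s /tmp/demo/chroot/home/portage/distfiles bounce-bin
readlink bounce-bin
```

Apache then serves whatever quickpkg drops into the chroot's PKGDIR without any copying, which is the whole point of the symlink.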

Lastly, chroot to the update environment, mount the special file systems and update. For me that process is:

cd <chroot env>
mount -t proc none <chroot env>/proc
mount -o bind /dev <chroot env>/dev
chroot <chroot env> /bin/bash
emerge --sync
emerge -avuNDb world
cd /usr/portage
find /usr/portage -maxdepth 2 -type d | sed -e 's/\/usr\/portage\/.*\///g' | xargs -P1 quickpkg --include-config=y
chown -R apache:apache /home/portage/distfiles
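That find/sed pipeline reduces each category/package directory to a bare package name for quickpkg to resolve. A dry run against a mock tree (placeholder path, quickpkg left out) shows what it emits:

```shell
# Mock stand-in for /usr/portage (placeholder path).
mkdir -p /tmp/fakeportage/app-editors/vim /tmp/fakeportage/app-shells/bash
# The same pipeline as above, minus quickpkg.
find /tmp/fakeportage -maxdepth 2 -type d | sed -e 's/\/tmp\/fakeportage\/.*\///g'
```

The depth-2 entries come out as bare names (vim, bash); the tree root and the category directories slip through unmodified, so expect quickpkg to grumble about those before moving on to the real packages.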

Now we are ready to update.

The EC2 image

Now point the make.conf on your EC2 image at your update server. Look up the update server’s external IP, then in make.conf:

PORTAGE_BINHOST="http://<external ip>/binhost/bounce-bin"
SYNC="rsync://<external ip>/gentoo-portage"

Now on the EC2 image:

emerge --sync
emerge -avuNDGK world

This should download all your updates and install them. Hopefully you are now up to date!

Isolated Compute Server

You have two options: tar, or ssh port-forwarding magic. If you choose the tar option, you are basically copying the compute server’s portage tree and package directory up to the EC2 instance. If you use the ssh forwarding method, you are basically substituting localhost for the <external ip> in the make.conf example.

SSH Forwarding

This method is preferred: if you use the tar method and forget a package, you need to move another tarball up to the EC2 server, whereas with ssh tunnelling you just run emerge again. For ssh forwarding to work, an ssh server needs to be running on the EC2 instance, the EC2 instance must have an IP addressable from your compute server (public or internal VPN), and the ssh port must be allowed through all firewalls between the compute server and the EC2 instance. Once the prerequisites are met you can do the following:

compute server> ssh -R 873:localhost:873 -R 80:localhost:80 root@ec2instance
emerge --sync
emerge -avuNDGK world
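With the tunnel up, the make.conf on the EC2 instance simply swaps localhost in for the external IP from earlier:

```
PORTAGE_BINHOST="http://localhost/binhost/bounce-bin"
SYNC="rsync://localhost/gentoo-portage"
```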

Tar method

On the compute server:

cd /
tar cvf /tmp/portage.tar /usr/portage /home/portage/distfiles

Then move the tarball up to the EC2 server. On the EC2 instance:

cd /
tar xvf portage.tar
emerge -avuNDK world
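The reason extracting from / puts everything back in place: GNU tar strips the leading / from member names when it creates an archive, so the stored paths are relative. A throwaway demo (placeholder path):

```shell
# GNU tar drops the leading "/" from member names on create.
mkdir -p /tmp/tardemo/usr/portage
tar cf /tmp/tardemo/portage.tar /tmp/tardemo/usr/portage 2>/dev/null
tar tf /tmp/tardemo/portage.tar   # member path has no leading slash
```

So unpacking from / recreates the original absolute layout, tree and distfiles included.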