Any way to avoid repeatedly fetching files from SVN?

Mojca Miklavec mojca at macports.org
Mon Jan 21 02:23:02 PST 2013


Hello,

While writing a new Portfile for some software whose sources are
fetched from SVN (I was working on a port for Root 6), I realised
that all 480 MB of sources are fetched again after every single tiny
change to the Portfile (which is a bit problematic when I'm not
working on a super-fast network). I can do it once, but I cannot
afford to re-fetch everything for every single option that I want to
try out.

Is there any way to optimise this, or any interest in doing so? I'm
not sure how, but here are some (possibly bad) ideas:
a) doing the checkout into the same location where the tar.gz files
would normally be stored
($HOME/.macports/opt/local/var/macports/distfiles/...), and making
sure that the working copy is clean the next time we try to use it
(using "svn status" or similar tricks to verify that no files have
been deleted or changed); when changing the revision this would also
mean significantly less traffic, since "svn update" only transfers
the delta (see the first sketch below)
b) making sure that the timestamps during checkout are set to the
commit times (not the checkout times), tar-gzipping the contents and
computing a checksum of the resulting file (this only works if the
checksum doesn't depend on the time when the tar.gz is made; see the
second sketch below)
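
For idea (a), I imagine something along these lines (a rough shell
sketch; the cache path, revision number and repository URL are made
up for illustration, and this is of course not what base currently
does):

    CACHE=/opt/local/var/macports/distfiles/svn-cache/root6
    REV=48000
    URL=http://root.cern.ch/svn/root/trunk

    # refuse to reuse the checkout if anything was modified,
    # deleted or left behind (clean "svn status" prints nothing)
    if [ -d "$CACHE/.svn" ] && [ -n "$(svn status "$CACHE")" ]; then
        echo "cached checkout is dirty, discarding it" >&2
        rm -rf "$CACHE"
    fi

    if [ -d "$CACHE/.svn" ]; then
        # reusing the checkout: only the delta to $REV is transferred
        svn update -r "$REV" "$CACHE"
    else
        svn checkout -r "$REV" "$URL" "$CACHE"
    fi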
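
And for idea (b), roughly the following (again with made-up revision
and URL; whether the resulting checksum really comes out identical
across tar implementations, e.g. GNU tar vs. the bsdtar shipped with
OS X, would still need testing):

    # export with commit-time timestamps instead of "now"
    svn export -r 48000 \
        --config-option config:miscellany:use-commit-times=yes \
        http://root.cern.ch/svn/root/trunk root-r48000

    # archive with a stable file order; "gzip -n" omits the
    # timestamp that gzip would otherwise embed in its header
    find root-r48000 -print | LC_ALL=C sort \
        | tar -cf - --no-recursion -T - \
        | gzip -n > root-r48000.tar.gz

    openssl rmd160 root-r48000.tar.gz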

As a workaround I changed the Portfile to use a local tar.gz file,
but if other developers want to help me develop the port, everyone
has to repeat the same process and apply the same local modifications
to the Portfile, which quickly becomes tedious.
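
The workaround looks roughly like this (a hypothetical snippet; the
path and distname are placeholders, and the checksums are dummies
that every developer has to regenerate for their own tarball):

    # replace the svn fetch with a locally created tarball
    fetch.type      standard
    master_sites    file:///Users/mojca/src/tarballs
    distname        root-r48000
    checksums       rmd160  0000000000000000000000000000000000000000 \
                    sha256  0000000000000000000000000000000000000000000000000000000000000000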

Thank you,
    Mojca

