Any way to avoid repeatedly fetching files from SVN?
Joshua Root
jmr at macports.org
Mon Jan 21 04:26:58 PST 2013
On 2013-1-21 21:23, Mojca Miklavec wrote:
> Hello,
>
> When trying to write a new Portfile for some software whose sources
> are fetched from SVN (I was working on a port for Root 6), I realised
> that 480 MB of sources need to be fetched for every single tiny change
> in the Portfile (which is a bit problematic when I'm not working on a
> super-fast network). I can do it once, but I cannot afford to fetch it
> again for every single option that I want to try out.
Well, the best option is not to fetch from a VCS if at all possible. The
sources don't get mirrored, among other issues. Even if the sources are
only available through a VCS, you can always export a tarball.
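For example (the URL, revision, and names here are made up for
illustration):

    # Export a pristine tree (no .svn metadata) at a pinned revision,
    # then pack it so the port can use an ordinary distfile with
    # checksums and mirroring.
    svn export -r 48062 https://example.org/svn/project/trunk project-1.0
    tar -czf project-1.0.tar.gz project-1.0

Then host the tarball somewhere fetchable and point master_sites at it.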
> Is there any way or interest to optimise this? I'm not sure how, but
> some (possibly bad) ideas:
> a) doing a checkout to the same location where tar.gz files would
> normally be stored
> ($HOME/.macports/opt/local/var/macports/distfiles/...), and making
> sure that the repository is clean the next time we try to use it
> (using "svn status" or other tricks to make sure that no files are
> deleted or changed); when changing the revision, this would also mean
> significantly less traffic
This is <https://trac.macports.org/ticket/16373>.
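In shell terms, (a) would amount to something like this untested
sketch (the path, URL, and revision are placeholders):

    # Reuse a cached checkout in distfiles if it exists and is pristine.
    distdir=/opt/local/var/macports/distfiles/project   # hypothetical path
    if [ -d "$distdir/.svn" ]; then
        cd "$distdir"
        # Discard any local modifications before reusing the tree.
        [ -n "$(svn status)" ] && svn revert -R .
        svn update -r 48062    # fetches only the delta, not all 480 MB
    else
        svn checkout -r 48062 https://example.org/svn/project/trunk "$distdir"
    fi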
> b) making sure that timestamps during checkout are set to commit times
> (not checkout times), tar-gzipping the contents, and computing some
> kind of checksum on the resulting file (assuming the checksum doesn't
> depend on the time when the tar.gz is made)
I don't really understand what you're proposing here.
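If the idea is to make the resulting tarball byte-identical across
checkouts, so its checksum can be verified like an ordinary distfile's,
that would take something like the following (an untested sketch; it
assumes a GNU tar new enough to have --sort, which the BSD tar shipped
with OS X lacks):

    # In ~/.subversion/config, have svn set file mtimes to commit times:
    #   [miscellany]
    #   use-commit-times = yes

    # Pack deterministically: fixed file ordering and ownership, and no
    # timestamp embedded in the gzip header.
    tar --sort=name --owner=0 --group=0 --numeric-owner \
        -cf - project-1.0 | gzip -n > project-1.0.tar.gz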
- Josh