[cvsnt] Performance Note

Matt Schuckmann matthew_schuckmann at amat.com
Wed Apr 20 19:39:25 BST 2005




Warning: running rlog on large binary files can be extremely slow, and I would
imagine that other operations on such files would also be slow.

We've got a 400 MB binary ,v file (curse you, MS Access); the actual checked-out
file is only 8.5 MB and it has ~63 revisions. I've found that running the
following command on it takes upwards of 50 minutes:

cvs -q rlog -R -S -rFIX4438 database/calhistory.mdb

It looks like the server is loading the entire file into memory before
finally deciding that the branch tag doesn't exist on the file. I'm guessing
that cvs is memory-mapping the file before reading it, or something like that.
The memory usage on the server goes from about 120 MB to almost 1 GB while
it's servicing the command.

Part of the problem could be that this system only has 500 MB of physical
memory, not enough to hold the file, so there could be extra disk thrashing
as the file gets paged in and out of virtual memory.
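
If the goal is just to find out whether the tag exists on the file at all, a
cheaper probe might be rlog's standard -h option, which prints only the RCS
header (including the symbolic names) instead of the full revision history.
I'm assuming here that CVSNT honours -h the same way stock CVS does and that
the server only needs to read the header portion of the ,v file:

cvs -q rlog -h database/calhistory.mdb

If FIX4438 shows up under "symbolic names:" in that output, the tag is on the
file; if not, there's no need to walk all the deltas.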

I'm thinking that I can speed this up by getting rid of some unneeded
revisions on this file, using cvs admin -o.
Any other ideas?
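
For concreteness, this is roughly the admin -o run I have in mind, done from a
checked-out working copy. The revision range here is just a placeholder, and
since -orev1:rev2 permanently deletes revisions rev1 through rev2 (inclusive)
from the ,v file, I'd rehearse it on a copy of the repository first:

cd database
cvs admin -o1.2:1.40 calhistory.mdb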

The server is a Pentium III 766 MHz with 500 MB of memory, running Win 2k.
Client and server are both running CVSNT 2.05.01 (Travis) Build 1927.

On a happier note, I was very impressed that running this same command over
all of our other 8000 files (360 MB, mostly text files) takes less than 2
minutes.

Matt S.




