I have been evaluating different version control programs for my company for the last month or two. My current evaluation is that all of them are broken, although I hope to discover a good solution someday soon.
Serena's Version Manager (PVCS)
I have spent a lot of time with Version Manager, and it has a number of deficiencies. Unfortunately, this is the product my company uses. They originally switched to VM because it has client apps on Mac, Unix, and Windows. However, there are now a number of products that support all three, so that is no longer an advantage.
For some reason, they released 7 versions of their product before they figured out that they shouldn't send two or three copies of the entire archive between the client and server while doing a transaction. They claim this is a form of reliability. Hrm. Version 8 now sends smaller amounts of data, which is good, but it is still slower than any other product I tried. They say if you care about speed, you should use their INET server, where you access your data through Internet Explorer (and only Internet Explorer; no other browser is supported), with a limited feature set.
It does share files between projects quite nicely, and does that better than most. It defaults to the lock-modify-unlock model, but can be used in copy-modify-merge as well, although I wouldn't trust their diff tool at all. It doesn't support Unicode. It cannot track deletions or moves of files; if you do move a file using their GUI tool, it moves it incorrectly, and if you have to recreate the project because something got screwed up, it moves the file back to its old location. It cannot integrate with Visual Studio if you use multiple projects in one Visual Studio solution. Their archive files, while there is a nice one-to-one ratio of archive files to work files, are in a binary format, and I have had a number of archive files destroyed; AnswerLine (their technical support) had no explanation of why it happened, nor any way to recover the data.
There was a large change from version 7.5 to version 8.0, and the upgrade is not smooth at all. I spent 12 hours yesterday (after spending a couple of weeks with the new product) trying to convert a couple of projects for real production use. I ended up spending a half hour with AnswerLine trying to get one project upgraded. It looked like it worked, but it ended up being broken, still referring to the old unconverted data, so when I removed that "old" data, everything stopped working. Their command-line tools do not parse spaces and other "funny" characters in a consistent way, so escaping them is close to impossible. Thus, their claim that their tool does not require a consultant to set up the server is false. Remember, this upgrade was exactly that, an upgrade between versions of their own software, and they are unable to figure out how to do it correctly. Their solution is to spend days finding the shared files, copying them over specially, and recreating the shares. It is unacceptable for me to have the database down for a week to upgrade it.

Subversion (SVN)
I have only used Subversion a little, but server setup is quite easy and fast. They put a high priority on data integrity, and I have read from a number of long-term users that it hasn't ever had an unrecoverable error. They use journaling with a Berkeley database by default, and it supposedly can recreate any transaction, so unless you get a magnet near your hard drive, you should be all right. The developers brag that no one has ever lost any data with their product.
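From what I have read, the recovery step is a single command; this is just a sketch, and the repository path is a placeholder.

# Replay the Berkeley DB journal to bring a repository back to a
# consistent state after a crash (the path is an example).
svnadmin recover /var/svn/myrepos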
It also supports hot backups, which most products do not. A hot backup is when you can back up the server without shutting it down.
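From what I have read, the hot backup is also a single command; the paths here are made up.

# Copy a live repository, including its Berkeley DB files, without
# stopping the server (both paths are examples).
svnadmin hotcopy /var/svn/myrepos /backups/myrepos-copy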
There are two big problems that keep me from using it. The first is that it is unable to share files between projects: you have to put your shared code in a separate directory (svn:externals), and even then it treats that code differently, so you have to explicitly take action on those directories, rather than recursively acting on everything like the rest of the directories.
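For the record, this is roughly what the svn:externals workaround looks like; the directory name and URL are made up for illustration.

# Declare ./shared as a pointer to a separate repository location.
svn propset svn:externals "shared http://svn.example.com/repos/shared/trunk" .
svn commit -m "wire in shared code as an external"
svn update                  # pulls the external down into ./shared

# The catch: a commit in the parent does not recurse into ./shared, so
# changes to the shared code have to be committed from inside it.
cd shared && svn commit -m "change to shared code"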
The second is that it has trouble on platforms with case-insensitive file systems, such as Windows. It can easily erase valid data if the case of a file changes. Since Windows will sometimes change the case, such as capitalizing the first letter, this is an easy problem to run into. Since they track file moves and deletions, presumably you can get the data back, but I am sure it is a pain to have to do that more than once or twice.
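One workaround I have seen suggested is to fix the name directly in the repository and then check out a fresh working copy; the URLs below are placeholders.

# A URL-to-URL move is committed immediately, so a Windows working copy
# never has to hold both spellings of the name at once.
svn move -m "fix case of file name" \
    http://svn.example.com/repos/trunk/Foo.c \
    http://svn.example.com/repos/trunk/foo.c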
It will support file locking shortly, but currently uses the copy-modify-merge model, which makes some people nervous. It uses a database format, which means you need their tools to get your data back. Most products do the same, so they are all making recovery harder together.
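To be fair, their tools do give you an escape hatch from the database format; from what I have read, something like this (the paths are examples) writes the whole history out as a portable dump stream.

# Dump the entire history, then load it into a brand-new repository.
svnadmin dump /var/svn/myrepos > myrepos.dump
svnadmin load /var/svn/newrepos < myrepos.dump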
It supports binary diffs, which is excellent. Most other tools only support text diffs, and they treat files with unknown extensions as binary by default, so you have to tell them explicitly when such a file is really text.
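In Subversion's case, whether a file is merged and diffed as text or treated as opaque binary is controlled by a property rather than by the extension; a small example (the file name is made up).

# Mark a file as binary so Subversion never tries line-based merges on it.
svn propset svn:mime-type application/octet-stream logo.png
svn propget svn:mime-type logo.png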
It is extremely fast, at the expense of client disk space. I did some tests with a project that was 100MB: with Version Manager, it took 200 seconds to check the server to see if there were any updates (when there weren't any); under SVN, it took 4 seconds. Pretty impressive. It actually doesn't use the network for some operations, such as checking to see what files have been modified locally, reverting your changes back to the latest version, and I think a couple more. It tracks file deletions and moves, so that when you go back to get an old version of the project, it will have the correct files and directories in place. CVS and PVCS are unable to do that and silently fail.
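The no-network operations I mentioned are the everyday ones; these all run against the pristine copies cached in the working copy, so there is no round-trip to the server (the file name is a placeholder).

svn status            # what have I changed locally?
svn diff foo.c        # how did I change it?
svn revert foo.c      # throw my local change away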

Concurrent Versions System (CVS)
Old standby. I have used it quite a bit for personal use, but never with more than a couple of developers, and it has worked quite well for my purposes. I don't care about branching or renaming/moving/file-sharing, which are the things lots of people complain about. I think it might be a little weak for a larger group of developers, although I do know some groups that use it and are perfectly happy with it.
It is pretty good for speed, although most of my work has been local. I did use it over AFS a while back and the performance was pretty poor, but that might have been my AFS client's fault. I have used it over a secure connection to SourceForge and it has worked pretty well.
Commercial companies always try to scare you by telling you how bad it is about corrupting files, but I haven't ever heard a user say that. It has one archive file per "real" file, so you can go into the CVSROOT and modify the files as you like. It isn't entirely easy, but one could imagine editing the archive file by hand if it contains text. If you were backing up a Word document, though, it would probably be hard to recover your data if it did break.
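To illustrate the one-archive-per-file point: the repository is just a directory of RCS ",v" files, and for text files the archive itself is readable text (the paths below are examples).

ls /usr/local/cvsroot/myproject
#   main.c,v   README,v
head /usr/local/cvsroot/myproject/main.c,v   # plain text you could hand-edit in a pinch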

SourceSafe (VSS)
The database can be easily corrupted without your knowledge, due to bugs in the client. I have run the "analyze and fix" utility each time, and the data was recovered, at least as far as I can tell without doing exhaustive checking. People generally say it has a 2GB limit for the database. It has this weird hashtable/filesystem way of storing the files, which I just don't like. I used it at my last company and it was pretty good. If I had known about branching then, it would have been a little better, although I hear it doesn't support anything fancy.
Contrary to popular opinion, having Microsoft's Universal Subscription does not give you a license to use SourceSafe for free. It is slow over a VPN connection, but SourceGear makes a SourceOffSite utility that is supposed to be really good, although in the time I have administered it (about a month, for two users), it has crashed a couple of times.

BitKeeper
I haven't used this at all yet. People generally say it is really good, but really expensive, and the owner has made some people in the open source community mad, due to the restrictive (albeit pseudo-free) license. I have a request in for a quote, but haven't heard back from them yet.

Perforce
The biggest gripe I have with Perforce is that you have to get used to a different mindset of how you think about your projects. I have not used it enough, or read enough about it, to be able to say anything more.

SourceVault
I have only tried out their demo server, so I can't comment too much on it. It seems nice, and they say it is better than VSS. I haven't looked up customer reviews yet.
Posted by Jon Daley on October 15, 2004, 2:42 pm
Comments

Hi,
I have been evaluating SCMs for the last month. I needed a low-cost, feature-packed SCM tool; I guess I was comparing against Perforce.

CVS, the old workhorse, is a proven thing, but it requires a lot of effort from developers to merge and resolve conflicts, branch merges are a pain, and it does not do binary diffs.

I tried Subversion recently. Server installation is quite easy; I picked VisualSVN for the server and TortoiseSVN for the client. It all went in well, and I can tag, branch, and diff. My problem starts when I need a Perforce/TFS kind of feature that notifies people when a checkin, merge, etc. happens. I looked on Google and found subversionnotify.com, and downloaded the plugin, but it is not clear how to make it work. Has anyone worked with it? Please lend me your experiences.

Hope to hear soon.

Posted by harman on January 9, 2009, 10:05 pm

You don't need anything fancy to do notifications - simply use the supplied scripts in the "hooks" directory. You can get as fancy as you would like. For Windows, you use the post-commit.bat (or post-commit.exe) file. If you don't have an email program handy on Windows, I have used blat.exe before, and that has worked well.

I have only used the post-commit hook, and I am mostly using the default script (on Linux), with a modification or two. Most of the commit emails just need to go to me, but for one tree I also send them to someone else.

#!/bin/sh
# post-commit hook: mail a summary of each commit
REPOS="$1"
REV="$2"
FROM="--from svn@xxx.com"

# get the top-level directory being worked on
DIR=`svnlook dirs-changed "$REPOS" --revision "$REV" | cut -f1 -d/ | sort -u`

# commits to the blogfuse tree also go to a second address
if [ "x$DIR" = "xblogfuse" ] ; then
    TO="xxx@gmail.com svn@xxx.com"
else
    TO="svn@xxx.com"
fi

# $FROM and $TO are left unquoted on purpose so they expand to separate arguments
/usr/lib/subversion/hook-scripts/commit-email.pl "$REPOS" "$REV" $FROM $TO >> post-commit.log 2>&1
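To put a script like this into service (the paths and revision number below are examples), save it as "post-commit" in the repository's hooks directory, make it executable, and test it by hand against an existing revision.

cp post-commit /var/svn/myrepos/hooks/post-commit
chmod +x /var/svn/myrepos/hooks/post-commit
# run it manually once to check that the email goes out
/var/svn/myrepos/hooks/post-commit /var/svn/myrepos 123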

Posted by jondaley on January 10, 2009, 12:23 am

Hi, I ran the Test Director setup. After that I opened http://abc/TDBIN/start_a.htm, but the page just shows "Download Test Run Scheduler in progress..." and "Download Execution Flow in progress...". I have already tried the directory security page for permissions, but to no effect. Kindly help me out, it is very urgent. Thanks a lot.

Posted by sanjeet kumar thakur on January 20, 2009, 2:19 am

The lack of any good IDE integration (other than Java IDEs) and the fact that everyone recommends those disgusting Explorer extensions (which slow Explorer down and are often a source of instability) pretty much rules Subversion out for me. I'll stay on Visual SourceSafe, thank you.

Ankh is not good enough. They can bash VSS all they want, but at least it knows how to work with an IDE. It's not 1985 anymore. IDE integration is pretty much compulsory, as should be evident from 99.99% of commercial source control systems providing it.

A lot of the open source VCS developers have the mindset that anything is better than nothing, and that just doesn't fly with a lot of Windows developers. I want to right-click on a file in my IDE's project manager to "Commit", not open another Explorer window to do the same thing. That is [obviously] far less productive.

Posted by Nate on March 27, 2009, 7:26 am

I agree that the Tortoise GUI is not any good.

I've heard good things about Eclipse's integration, and also integration into VC6. I haven't heard if there is any integration into the newer Microsoft compilers.

SmartSVN and another one (FastSVN?) both look good, but are standalone clients.

For me, I am on a command line all day long, so not using a GUI doesn't bother me; I actually prefer it, since it is so much faster than reaching for a mouse.

I value stability over a GUI, so VSS is definitely out, in favor of any other source control software I've ever used. True, it doesn't fail often, but once every two or three years is enough for me to never want to use it again. And I've only used it with small teams (3-4 people); I have to assume it fails more often with more checkins - i.e. the failure rate should really be calculated per-checkin or per-byte, rather than per-day.

Posted by jondaley on March 27, 2009, 10:16 am