[GLLUG] Perl Concurrent Transactions

Jeremy Bowers jerf@jerf.org
Wed, 17 Jul 2002 06:52:58 -0500


chuck@europa.affordablehost.com wrote:
> - Perl 6 developers are working on this problem.
> - Modules like FastCGI or similar handle concurrent transactions.
> - Of course relational databases came up repeatedly.
> Only the last one, relational databases, I have to ignore: Perl just 
> hands the problem off to the RDBMS, and his question revolves only 
> around Perl and flat data files.

I'm going to assume there's more reasoning behind this than you've 
shared. An RDBMS is a standard solution to some aspects of this problem.
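
Just to show what you'd be giving up: with an RDBMS, each request wraps 
its update in a transaction and the database server arbitrates 
concurrent writers for you. A quick sketch with DBI (the connection 
details, table, and column are made up; it assumes a transactional 
backend):

    use DBI;

    my $dbh = DBI->connect('dbi:Pg:dbname=demo', 'user', 'pass',
                           { RaiseError => 1, AutoCommit => 0 });

    my $widget_id = 42;    # made-up record being edited

    eval {
        # Two requests doing this at once are serialized by the server;
        # neither update is lost.
        $dbh->do('UPDATE widgets SET hits = hits + 1 WHERE id = ?',
                 undef, $widget_id);
        $dbh->commit;
    };
    if ($@) {
        $dbh->rollback;    # deadlock, error, etc.; retry or report
    }
    $dbh->disconnect;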

> Submitting at the same millisecond may be rare, but he's right that 
> our group has to face the problem. Any ideas or experiences here?
> Essentially the problem is two users trying to update a data file at the 
> same time.

Answering this correctly really requires more information. Is this a 
large file, with sparse updates reasonably guaranteed to be distant from 
each other? Is it a binary file or a text file? Is the web server 
interface a strict requirement, or an old (1996) convenience that might 
be better served by different technology now?

What do you do now, or plan to do, about conflicts, where two people 
update overlapping pieces? Do you just take the later update? Is there a 
merge process? Are there ever any conflicts at all?
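
For what it's worth, if the answer is just "take the later update", the 
usual Perl baseline for flat files is advisory locking with flock(), 
which at least keeps two CGI processes from interleaving halves of the 
file. A minimal sketch, assuming a smallish text file and a made-up 
path:

    use Fcntl qw(:flock);

    my $file = '/var/data/records.txt';    # hypothetical data file

    # Open read/write and take an exclusive lock; a second process
    # blocks right here until the first one has finished rewriting.
    open(my $fh, '+<', $file) or die "open $file: $!";
    flock($fh, LOCK_EX)       or die "flock $file: $!";

    my @lines = <$fh>;      # read the current contents under the lock
    # ... apply this user's changes to @lines ...

    seek($fh, 0, 0)  or die "seek: $!";
    truncate($fh, 0) or die "truncate: $!";
    print {$fh} @lines;

    close($fh) or die "close: $!";    # closing releases the lock

Keep in mind flock() is advisory (every writer has to play along) and 
is unreliable over NFS, which is part of why I keep pointing at tools 
that already solve this.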

A large text file with sparse updates screams for CVS. You'll only be 
using a fraction of the power of CVS, and depending on usage you may 
need to periodically clean out the repository, but it should be OK. 
(Other version control systems may work as well, but I don't know much 
about them right now.) Web interfaces for that exist and can be written, 
but you might as well use the tools directly. Merging and locking should 
be handled by the tool.
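
A rough sketch of driving it from a script rather than by hand (the 
file name and log message are made up, and it assumes an already 
checked-out working copy with cvs on the PATH):

    # Pull in everyone else's changes first. CVS merges non-overlapping
    # edits itself and marks real conflicts with a leading "C ".
    my $file = 'notes.txt';    # made-up file in the working copy
    my $out  = `cvs update $file 2>&1`;
    die "conflict in $file, resolve by hand:\n$out" if $out =~ /^C /m;

    # ... make this user's edits to $file ...

    # Commit. A failure here usually means someone committed in the
    # meantime; update (merging their work) and try again.
    system('cvs', 'commit', '-m', 'edit via web form', $file) == 0
        or die "cvs commit failed: $?";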

Binary files are the worst; there may not be a clean solution in some 
situations.