
Sage Line 50 limits transaction volumes

We have been approached by a prospect who is having reporting issues when it comes to their accounts. Now this is something we excel at, so we will be able to help them out.

However, during our initial conversation, they advised that due to the number of transactions being posted through Sage, every six months they perform a year end close down in order for the system to keep working!

I have not yet seen the data set to check whether previous years have been purged, but has anyone else come across this problem before, where the number of transactions causes Line 50 to crash? If so, has anyone identified a way round the issue?

The prospect has approx 200,000 transactions being posted across ledgers on an annual basis, so if Line 50 is not able to cope then ultimately the accounts package will have to be upgraded. Year end is not until February, so I am aiming to keep Line 50 in place until such time as we can do a migration at year end.

Any help greatly appreciated

Paul Smalley



03rd Nov 2010 12:02

Sage 50 Transaction Limits

Sage 50 uses what is, these days, an archaic flat file database, so transaction volumes are an issue, albeit, purely on a technical level, the database can handle 4 billion transactions.  Whilst I've not seen large volumes of data (and I have seen data sets with close to a million transactions in them) cause Sage 50 to crash as such, the program slows down massively, especially when it comes to reporting.
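To illustrate why a flat file struggles as volumes grow (a toy sketch only - Sage's real file format is proprietary, and the CSV layout and ACCOUNT_REF field below are invented for the example): with no index, every filtered report has to re-read the entire file, so run time grows linearly with the transaction count:

```python
import csv

def transactions_for_account(path, account_ref):
    """Scan the whole flat file to find one account's transactions.

    O(n) in the total number of stored transactions - there is no
    index to consult, so 1,000,000 rows means 1,000,000 reads for
    every single report filter.
    """
    matches = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            if row["ACCOUNT_REF"] == account_ref:
                matches.append(row)
    return matches

# A SQL backend (as in Sage 200) would satisfy the same query from an
# index, touching only the matching rows rather than the whole file.
```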

The article that JC linked to is incorrect with regards to the maximum number of transactions: the limit is not, nor has it ever been, around 100,000 transactions.

They won't be performing a year end close down as such; they will be running the Clear Audit Trail routine to remove paid and reconciled transactions from the system, thereby reducing the number of stored transactions.  Depending on the reports they are running, this in itself will cause problems, as the removed transactions are no longer available for reports.
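Conceptually the routine does something like the following (a minimal sketch of the idea only, with made-up dict-based transactions - not Sage's actual code): fully paid, bank-reconciled transactions up to a cutoff date are purged, and anything purged is simply gone as far as detailed reports are concerned:

```python
from datetime import date

def clear_audit_trail(transactions, cutoff):
    """Keep only transactions that are unpaid, unreconciled,
    or dated after the cutoff; everything else is purged."""
    kept = [
        t for t in transactions
        if not (t["paid"] and t["bank_reconciled"] and t["date"] <= cutoff)
    ]
    removed = len(transactions) - len(kept)
    return kept, removed

txns = [
    {"date": date(2010, 3, 1), "paid": True,  "bank_reconciled": True},
    {"date": date(2010, 9, 1), "paid": False, "bank_reconciled": False},
]
kept, removed = clear_audit_trail(txns, date(2010, 6, 30))
print(f"purged {removed}, kept {len(kept)}")  # purged 1, kept 1
```

Which is exactly why it hurts reporting: once a transaction is purged there is nothing left for a detailed report to read.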

As for a solution, there isn't anything more they can do within Sage 50 and they should really be looking to upgrade.  Sage 100 (which was Sage 50 with a SQL database) would have been an ideal solution but Sage have just cancelled it so, within the Sage range, the next possibility is Sage 200, albeit it is a magnitude more expensive than Sage 50...



Thanks (0)
03rd Nov 2010 15:58

Doesn't crash, just becomes so slow it's unusable ....

@johndon68 '.. transaction volumes are an issue albeit, purely on a technical level, the database can handle 4 billion transactions ..' - purely at a technical level - Eh! Technically bumble bees should not be able to fly!

Don't think anyone said it crashed per se - however, if it takes an excessive length of time to perform a task then it is effectively useless; no-one really wants to wait indefinitely whilst the software does its job. Not sure your stated transaction levels accord with users' experience or Sage's own admissions by Jamie Saul, accounts technical lead for Sage UK (see Slow Performance - Remedies).

10,000 customer records would seem to be a great deal less than the 1m/4bn you have outlined, and by such a margin that it is very difficult to reconcile your figures with either those experienced by users or those acknowledged by Sage as being a problem level


Thanks (0)
03rd Nov 2010 16:33

Sage Slow


You mention customer accounts a lot in your reply; note that I didn't mention customer accounts at all in my post, I mentioned transactions...

With regards to v2010, there was a specific bug (actually fixed in Update 2) relating to the way that v2010 handled customer delivery addresses that caused Sage to be very slow with even moderate numbers of customer records specifically.

I've seen literally thousands of Sage 50 data sets over the years, a large number of which have had tens of thousands of records of all kinds.

I completely agree that Sage 50 will slow down with large volumes of data, but the 100,000 transaction limit mentioned in your original link is still a number that someone has plucked out of the air; it is not, nor has it ever been, a physical limit.  Whether the slow down is a problem also depends on the reports you are running - for example, the standard P&L and Balance Sheet should take around the same time to run whether you have 1,000 or 100,000 transactions, as they don't get their data from the individual transactions.
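To make that concrete (a rough sketch with assumed data structures - field names like "period_totals" and the transaction type codes are illustrative, not Sage's internals): a report built from nominal ledger period totals touches a few hundred records regardless of volume, whereas a detail-level report has to walk every transaction:

```python
def pl_from_nominal_totals(nominal_accounts, period):
    """Standard P&L style: cost scales with the number of nominal
    accounts (typically a few hundred), not with transaction volume."""
    return sum(
        acc["period_totals"][period]
        for acc in nominal_accounts
        if acc["category"] in ("sales", "purchases", "overheads")
    )

def pl_from_transactions(transactions, period):
    """Detail-level report style: must touch every stored transaction,
    so run time grows with the audit trail itself."""
    return sum(
        t["net"]
        for t in transactions
        if t["period"] == period and t["type"] in ("SI", "PI")
    )
```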


Thanks (0)
03rd Nov 2010 17:36

one transaction = records in many tables

The whole description of "transactions" is itself something of a misnomer - we should probably be referring to records. In this respect a transaction has a one-to-many relationship with records (i.e. one transaction can spawn master, detail (possibly many), nominal/GL, VAT, etc. records, depending upon the transaction type, and so on)

So to be able to accommodate 4bn transactions we are talking about a huge number of actual records (upwards of 4bn * 3+ = a minimum of 12bn records). If each record is 100 bytes then 10 records = 1KB, so 12bn records works out at over a terabyte of data - I seriously doubt that the Sage db could handle that lot in a sensible time frame

The original question mentioned about 200,000 transactions a year, which probably boils down to roughly 600,000-1,000,000 records, and with this in mind it is not surprising that Sage is struggling
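Putting rough numbers on that (all the constants here - records per transaction, bytes per record - are the rough assumptions from the paragraphs above, not measured figures):

```python
RECORDS_PER_TXN = 3      # master + detail + nominal posting, the minimum fan-out
BYTES_PER_RECORD = 100   # assumed average record size

def storage_estimate(transactions):
    """Back-of-envelope record count and storage for a transaction volume."""
    records = transactions * RECORDS_PER_TXN
    return records, records * BYTES_PER_RECORD

# The theoretical 4bn-transaction ceiling:
recs, size = storage_estimate(4_000_000_000)
print(f"{recs:,} records, ~{size / 1e12:.1f} TB")  # 12,000,000,000 records, ~1.2 TB

# The prospect's annual volume:
recs, size = storage_estimate(200_000)
print(f"{recs:,} records, ~{size / 1e6:.0f} MB")   # 600,000 records, ~60 MB
```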

Of course a P&L report is relatively fast because it only looks at totals for the P&L-related nominal accounts - some of the other reports, which require detail-level information, will just kill the program

Essentially for the volume of data going through the program they are probably using the wrong tool

Thanks (0)
03rd Nov 2010 18:12

I agree...

"Essentially for the volume of data going through the program they are probably using the wrong tool"

Agreed and it's exactly this sort of customer that Sage 100 would have been ideal for had it not been given the chop...


Thanks (0)