
Sage Cloud Line50 Large File Import Errors (Audit Trail)


Anyone else having problems importing large accounts data into Sage Cloud Line50? 

I have an Audit Trail I'm trying to import of just over 600,000 lines.  Yeah ok, it's a bit big, but that's not the problem. 

The first error I get is on line 8414.  This is right in the middle of a batch of SI lines with nothing wrong with any of the lines. 

The second is in line 18762.  Non-existent Purchase Ledger Supplier.  The supplier has appeared before in the Audit Trail and it didn't choke on it then, so why now? 

I extracted these error lines, together with a few lines either side into a small dummy import file, which would have imported correctly, had I not put a fake record in at the end so that Sage rejected it.

Now 18762 is roughly double 8414. Not in accounting terms it isn't, but if you look at the number of characters in the file at each of those points, the second is about double the first (approximately 1,459,000 bytes). This doesn't equate to any "magic number" that I can see, e.g. 2 to the power of something.

Anyone else experiencing the same issue? 

In the meantime, I'm going to have to chop up my import into chunks of about a megabyte each, making sure that the file does not split up journal entries. 
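That chunking step can be sketched in Python. Note this is only an illustration of the idea, not Sage's own format: `is_group_start` is a hypothetical predicate saying where it is safe to split (for instance, any line that is not a JC/JD continuation), so a multi-line journal entry never straddles two files:

```python
def split_csv(lines, is_group_start, chunk_bytes=1_000_000):
    """Yield lists of CSV lines, each roughly chunk_bytes long,
    breaking only at lines where is_group_start(line) is True so
    grouped entries (e.g. a journal's JD/JC lines) stay together."""
    chunk, size = [], 0
    for line in lines:
        if size >= chunk_bytes and is_group_start(line):
            yield chunk
            chunk, size = [], 0
        chunk.append(line)
        size += len(line.encode("utf-8"))
    if chunk:
        yield chunk
```

Each chunk can then be written out as its own import file; the split point only moves a few lines past the 1 MB mark, to the next safe boundary.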

 

Replies (8)


By jonathan.kempson
13th Apr 2020 15:37

I'm afraid it doesn't help solve your problem, but I'd be surprised if Sage tested import files of anything like that size. It would be a surprising number of records to import into Sage 200, never mind Sage 50.

You sound as if you know what you're doing, so apologies for a couple of naive questions: have you checked whether, and by how much, memory use increases during the import? Have you looked at the files using something like Notepad++, which shows you non-standard characters? Also, you don't say what the error was when the SI row was rejected.

Thanks (1)
Replying to jonathan.kempson:
By Ken Moorhouse
13th Apr 2020 21:50

Thank you for your quick response. No need to apologise for your questions, sometimes it's a case of not being able to see the wood for the trees. Sometimes in the course of preparing a question I see the stupidity of it before I get to post, but no joy on this one.

Memory is something I've not checked; however, the results I'm getting are reproducible: the same error on the same line, every time. The thing is that before this I was having a problem with unmatched journals (that's another story), which produced the error messages I would have expected, so the parsing mechanism seems to be working correctly in that respect (where the parser has to remember the running total between input lines). I've created a Nominal Code and funnelled balancing entries to it so that the JCs and JDs now match. Those errors have now gone.
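The balancing-entry fix described above could look something like this in Python. The row layout (type, nominal code, amount) and the suspense nominal "9998" are assumptions for illustration, not Sage's actual file format:

```python
from decimal import Decimal

def add_balancing_row(rows, suspense="9998"):
    """Given journal rows as (type, nominal, amount) tuples,
    append a JD or JC to a suspense nominal so that total
    journal debits equal total journal credits."""
    debits = sum(a for t, _, a in rows if t == "JD")
    credits = sum(a for t, _, a in rows if t == "JC")
    diff = debits - credits
    if diff > 0:
        rows.append(("JC", suspense, diff))
    elif diff < 0:
        rows.append(("JD", suspense, -diff))
    return rows
```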

I use a hex editor to check for spurious characters and there are none. The error message I get is as follows:-

Error in record 8414: Type: Illegal Transaction Type
Error in record 8414: Account Reference: Reference to non-existent Customer/Supplier/Bank Record

As there are a bunch of Sales Invoice lines for the same customer, with the same Nominal Code, and 8414 is in the middle of that bunch, I'm taking the error message with a pinch of salt.
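For anyone without a hex editor to hand, the same spurious-character check can be done with a short Python script that flags any byte outside printable ASCII (tab, CR and LF excepted):

```python
def find_odd_bytes(path):
    """Return (line, column, hex byte) for every byte that is
    not printable ASCII, tab, CR or LF -- the kind of character
    a hex-editor check is looking for."""
    hits = []
    with open(path, "rb") as f:
        for lineno, raw in enumerate(f, start=1):
            for col, b in enumerate(raw, start=1):
                if not (32 <= b <= 126 or b in (9, 10, 13)):
                    hits.append((lineno, col, hex(b)))
    return hits
```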

Thanks (0)
By DouglasN
14th Apr 2020 10:40

The techy bit behind this is beyond me, but do the offending lines contain any unusual characters like ! or " that might be [***] it up?

Other than that, can you not just import the 599,998 lines and then manually input or separately import the 2 lines? You said they imported OK in a different import; what happens if you move them to a different place in the data, to the end of the SI batches, or to the bottom of the file?

Thanks (0)
Replying to DouglasN:
By Ken Moorhouse
14th Apr 2020 11:04

There are lots of other errors I've got to deal with, I'm just picking them off one at a time.

Thanks (0)
Replying to Ken Moorhouse:
By DouglasN
14th Apr 2020 11:16

Ugh, good luck!

Thanks (0)
Replying to DouglasN:
By Ken Moorhouse
14th Apr 2020 13:41

Thanks!
Made a breakthrough by rolling up all the Sales Invoice lines into one line per invoice per tax code (OK, later on I'll need to run a check to make sure invoice totals tally). That's reduced the import to 230,300 records, with 165 errors. (I have a feeling the max errors Sage counts up to is 1023.) Next problem: SDs. May have the same problem further on though...
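The roll-up described above amounts to a group-and-sum over (invoice reference, tax code). A minimal Python sketch, assuming each row carries an invoice reference, tax code, net amount and tax amount (the real audit-trail columns will differ):

```python
from collections import defaultdict
from decimal import Decimal

def roll_up(rows):
    """Collapse invoice lines to one row per (invoice ref,
    tax code), summing the net and tax amounts."""
    totals = defaultdict(lambda: [Decimal("0"), Decimal("0")])
    for inv, tax_code, net, tax in rows:
        totals[(inv, tax_code)][0] += net
        totals[(inv, tax_code)][1] += tax
    return [(inv, tc, net, tax)
            for (inv, tc), (net, tax) in sorted(totals.items())]
```

Invoice totals can then be re-checked by summing the rolled-up rows per invoice and comparing against the originals.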

Thanks (0)
Replying to Ken Moorhouse:
By Ken Moorhouse
15th Apr 2020 08:49

Mystery Solved!
When I was viewing the CSV in a spreadsheet to pinpoint the line with the problem, I was assuming I could take the row number in the spreadsheet as gospel.
Though I was getting an error when loading the CSV into the spreadsheet (OpenOffice), saying not all rows could be loaded, I assumed (HaHaHaHa) that the dropped lines would be removed from the end of the file, not 4,373 lines in. Getting my parser to tack line numbers onto the end of each import line proved there was a discontinuity of about 600 lines at that point. Duh!
So I've now got to get my parser to act as a viewer as well: type in a line number and display the offending entry.
So the suggestion (jonathan.kempson) about memory was spot on, albeit at a different point from the one perhaps envisaged.
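The parser-as-viewer idea needs only a few lines of Python: look up a 1-based line number directly in the file, so the importer's reported record number can be checked without trusting a spreadsheet's row numbering:

```python
def show_line(path, lineno):
    """Return the raw text of the given 1-based line of a file,
    reading it directly so no rows can be silently dropped the
    way a spreadsheet with a row limit might drop them."""
    with open(path, "rb") as f:
        for i, raw in enumerate(f, start=1):
            if i == lineno:
                return raw.decode("utf-8", errors="replace").rstrip("\r\n")
    raise IndexError(f"file has fewer than {lineno} lines")
```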

Thanks (0)
By ara.martirossian
16th Apr 2020 11:31

Hi Ken

I would be interested to see whether our Excel2Sage import module (www.red-it.co.uk/excel2sage) does a better job of importing this large file. We have the ability to pre-process the file, looking for 'oddities' and fixing them before they are imported into Sage.

Can you please contact me if you are interested and we can look at this together.

Thanks and regards
Ara Martirossian FCA
07808 318813
[email protected]

Thanks (0)
