The Journal Import concurrent request completed with the error ORA-04030 (out of process memory), and Journal Import has a huge number of records to process. In this scenario, we can split the journal import and delete process into two steps:

1. Run the program from the command line in NODEL (no delete) mode:

$GL_TOP/bin/GLLEZL / 0 Y NODEL

2. Run the Journal Import Delete program at a later time (when there is no time crunch) to delete all processed records.

To run Journal Import from the operating system, use the following command:

$GL_TOP/bin/GLLEZL username/passwd 0 Y no 1 N "" "" N N 2

where:

0 : request id
no : interface_run_id
1 : set of books id
N : Post Errors to Suspense (Y or N)
"" : start and end date (note: there is no space between the double quotes)
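Because GLLEZL takes a long string of positional arguments, it is easy to put a value in the wrong slot. A small wrapper like the sketch below can make each position explicit. This is only an illustration: the function name, parameter names, and sample values are my own placeholders, and the argument order simply mirrors the command shown above; always confirm the exact syntax for your E-Business Suite release before running it.

```shell
# Sketch only: assemble the GLLEZL command line from named variables so
# each positional argument is documented. Replace the sample values with
# ones valid for your environment before executing the result.
build_gllezl_cmd() {
  login="$1"        # APPS username/password
  request_id="$2"   # request id (0 when launched from the OS)
  run_id="$3"       # interface_run_id
  sob_id="$4"       # set of books id
  suspense="$5"     # Y or N: post errors to suspense
  # The trailing "" "" are the start and end dates (empty = no range);
  # the remaining N N 2 are kept exactly as in the command above.
  printf '$GL_TOP/bin/GLLEZL %s %s Y %s %s %s "" "" N N 2\n' \
    "$login" "$request_id" "$run_id" "$sob_id" "$suspense"
}

# Example: print (not run) the command for a hypothetical run id 1234.
build_gllezl_cmd 'username/passwd' 0 1234 1 N
```

The function only prints the command, so you can inspect it before pasting it into a shell on the concurrent-manager node.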