Upload Log file (.csv) restrictions

Hi everyone,

I’m currently using version 7.15 of the Community Edition. I have to implement a solution for the company I work for that uploads a CSV file into Apromore in an automated way, so I wrote some code (Selenium Java) to automate this process. I got it working, but I ran into some problems, so I need your insight.

The process is straightforward enough:
log in to the Apromore site (which lives on our server) → open the folder of preference → select the file to upload → do the column mappings (Case ID, Start/End Timestamps, Activity), etc.
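
For context, here is a stripped-down sketch of the flow I automated. Every URL and element locator below is a placeholder, not Apromore’s real markup, so treat it as an illustration rather than working code against your instance:

```java
import java.time.Duration;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.openqa.selenium.support.ui.ExpectedConditions;
import org.openqa.selenium.support.ui.WebDriverWait;

public class ApromoreCsvUpload {
    public static void main(String[] args) {
        WebDriver driver = new ChromeDriver();
        WebDriverWait wait = new WebDriverWait(driver, Duration.ofSeconds(60));
        try {
            // 1. Log in (placeholder URL and locators)
            driver.get("https://apromore.mycompany.example/");
            driver.findElement(By.id("username")).sendKeys("myUser");
            driver.findElement(By.id("password")).sendKeys("myPassword");
            driver.findElement(By.id("loginButton")).click();

            // 2. Open the target folder and start the upload dialog
            wait.until(ExpectedConditions.elementToBeClickable(
                    By.linkText("MyFolder"))).click();
            driver.findElement(By.id("uploadButton")).click();

            // 3. Hand the CSV to the file input
            //    (the standard Selenium file-upload technique)
            driver.findElement(By.cssSelector("input[type='file']"))
                  .sendKeys("C:\\logs\\eventlog.csv");

            // 4. Column mapping (Case ID, Activity, Start/End
            //    timestamps) is driven the same way from here on
        } finally {
            driver.quit();
        }
    }
}
```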

The problem comes when the automation uploads a “big” file. The site becomes unresponsive and spends a long time working with the file, with no error message. Are there any restrictions on file size, number of CSV rows, or anything else?

Everything runs smoothly when I use “small” files, but I need to take some measures to make the solution reliable enough to handle bigger files, for example by waiting more patiently for the server, as in the sketch below.
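
One measure I plan to try is replacing fixed sleeps with an explicit wait that has a generous timeout and slow polling, so the script tolerates the longer processing time of large files. This continues from the sketch above, and the completion locator is again a placeholder:

```java
// Give the server up to 30 minutes to finish processing a large CSV,
// polling every 10 seconds instead of hammering the page.
// By.id("uploadCompleteMessage") is a placeholder locator.
WebDriverWait longWait = new WebDriverWait(driver, Duration.ofMinutes(30));
longWait.pollingEvery(Duration.ofSeconds(10))
        .until(ExpectedConditions.visibilityOfElementLocated(
                By.id("uploadCompleteMessage")));
```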

I’m looking forward to your advice, information, etc.

Thanks a lot!!

Apromore 7.15 is an old version of Apromore and has capacity restrictions. On a server with 32 GB of RAM, Apromore 7.15 can handle event logs of at most 1 million rows (possibly more if the log has very few columns).
IMPORTANT: Make sure that the Xmx parameter used by the Java Virtual Machine is set to about 80% of the server’s memory; for example, if the server has 32 GB, set the Xmx parameter to 25 GB.
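For instance, if your Apromore startup script honours a JAVA_OPTS-style environment variable (whether yours does is an assumption; check your own launch mechanism), the setting would look like this. The -Xmx flag itself is standard JVM syntax:

```sh
# Allocate a 25 GB heap on a 32 GB server (~80% of physical RAM).
# JAVA_OPTS is an assumed variable name; adapt to however your
# Apromore JVM is actually launched (script, service unit, etc.).
export JAVA_OPTS="-Xmx25g"
```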
Newer versions of Apromore (based on Apromore Core) perform better; for example, Apromore Core v8.3 can handle event logs of up to 2 million rows on a 32 GB server (and proportionally more if the memory is increased).
The new Apromore series (Apromore 9), to be released at the end of July, will be based on a Big Data engine and will have a capacity of up to 200 million rows (when deployed on a cluster of 6-8 servers), or up to 20 million rows in a single-server configuration.