When converting plain text to MOBI, the time used grows much faster than the text does. It is roughly:
T = k * n ^ 2
where T is the time used, n is the total number of lines in the text file, and k is a constant between 2 and 3 on my system.
In my case, each line of the text file is converted to a <p> </p> paragraph. If, for each <p> </p>, Calibre tries to find its parent during conversion, presumably by searching every line before that <p>..., then n * (n + 1) / 2 searches need to be done in total, which might be an explanation.
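To illustrate the hypothesis above (this is only a sketch of the suspected pattern, not Calibre's actual code; the function names are made up), compare a converter that rescans all earlier paragraphs for each new one against one that does not:

```python
def convert_quadratic(lines):
    # Hypothetical: for each new <p>, scan every earlier paragraph
    # to locate its parent -> n*(n+1)/2 steps total, i.e. O(n^2).
    paras = []
    for line in lines:
        for _ in paras:  # simulated parent search over all prior paragraphs
            pass
        paras.append("<p>%s</p>" % line)
    return paras

def convert_linear(lines):
    # Keeping a direct reference to the current parent removes the
    # per-paragraph scan -> O(n) total work.
    return ["<p>%s</p>" % line for line in lines]
```

Doubling n roughly quadruples the work in the first version but only doubles it in the second, matching the T = k * n^2 behaviour observed.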
May I suggest adding performance tuning to future development?