When I trained HTS (ver. 2.1.1) with my own English DB, I got “ERROR [+5105] AllocBlock: Cannot allocate block data of 500000 bytes” during re-estimation of the full-context models. The DB is about 750 MB. The error did not occur when I trained on a DB under 600 MB.
I read a question about a similar problem on the query page. The answer was that “the upper memory limit per process on 32-bit Linux is 3 GB.” My machine has 3 GB of RAM and a dual-core CPU.
I think the problem might be solved by switching to 64-bit Linux, but I would prefer to solve it without installing a 64-bit system. Is there any other solution?
Or is there some other problem with my setup?
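For reference, here is a quick sketch of the checks I can run to confirm the kernel word size and the per-process limits (these are generic Linux commands, not anything specific to HTS; the interpretation in the comments is my assumption about the 3 GB limit):

```shell
# Is the running kernel 32-bit or 64-bit?  "x86_64" means 64-bit.
uname -m

# Word size of the default userland binaries: prints 32 or 64.
# A 32-bit userland keeps the ~3 GB per-process address-space cap
# even on a machine with 3 GB of physical RAM.
getconf LONG_BIT

# Per-process virtual memory limit in KB ("unlimited" or a number).
ulimit -v
```

If `ulimit -v` reports a finite value, raising it (or setting it to `unlimited`) might let the re-estimation get further, but it cannot lift the 32-bit address-space ceiling itself.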
Thank you in advance for your help.