lack of internal memory
Hi.
We use the Personal edition of Datameer, version 5.5.6.
Lately we have been having massive problems with Datameer's loading time. Launching Datameer takes about 10 minutes, and you can watch it eat up the computer's memory: before launch about 4 GB are free, afterwards only about 100 MB remain. When opening a folder in Datameer, it takes a few minutes until the folder's workbooks are displayed.
Can anybody help?
Thanks,
Simon
-
The size of the "das-data" folder is about 264 GB.
While Datameer is starting up, one error and one warning appear and repeat a few times.
ERROR:
ERROR [2015-10-30 06:16:30.711] HousekeepingService thread-1 - Error occurred.
org.hibernate.HibernateException: IOException occurred reading text
at org.hibernate.type.descriptor.java.DataHelper.extractString(DataHelper.java:83)
at org.hibernate.type.descriptor.java.StringTypeDescriptor.wrap(StringTypeDescriptor.java:91)
at org.hibernate.type.descriptor.java.StringTypeDescriptor.wrap(StringTypeDescriptor.java:40)
at org.hibernate.type.descriptor.sql.ClobTypeDescriptor$4.doExtract(ClobTypeDescriptor.java:104)
at org.hibernate.type.descriptor.sql.BasicExtractor.extract(BasicExtractor.java:64)
at org.hibernate.type.AbstractStandardBasicType.nullSafeGet(AbstractStandardBasicType.java:254)
at org.hibernate.type.AbstractStandardBasicType.nullSafeGet(AbstractStandardBasicType.java:250)
at org.hibernate.type.AbstractStandardBasicType.nullSafeGet(AbstractStandardBasicType.java:230)
at org.hibernate.type.AbstractStandardBasicType.hydrate(AbstractStandardBasicType.java:331)
at org.hibernate.persister.entity.AbstractEntityPersister.hydrate(AbstractEntityPersister.java:2283)
at org.hibernate.loader.Loader.loadFromResultSet(Loader.java:1527)
at org.hibernate.loader.Loader.instanceNotYetLoaded(Loader.java:1455)
at org.hibernate.loader.Loader.getRow(Loader.java:1355)
at org.hibernate.loader.Loader.getRowFromResultSet(Loader.java:611)
at org.hibernate.loader.Loader.doQuery(Loader.java:829)
at org.hibernate.loader.Loader.doQueryAndInitializeNonLazyCollections(Loader.java:274)
at org.hibernate.loader.Loader.doList(Loader.java:2542)
at org.hibernate.loader.Loader.listIgnoreQueryCache(Loader.java:2276)
at org.hibernate.loader.Loader.list(Loader.java:2271)
at org.hibernate.loader.criteria.CriteriaLoader.list(CriteriaLoader.java:119)
at org.hibernate.impl.SessionImpl.list(SessionImpl.java:1716)
at org.hibernate.impl.CriteriaImpl.list(CriteriaImpl.java:347)
at datameer.dap.conductor.persistence.dao.ReadOnlyDaoImpl.getByIds(ReadOnlyDaoImpl.java:82)
at datameer.dap.conductor.job.JobExecutionService.deleteJobExecutionsLessThenStopDate(JobExecutionService.java:370)
at datameer.dap.conductor.service.HousekeepingService.deleteOutdatedExecutions(HousekeepingService.java:262)
at datameer.dap.conductor.service.HousekeepingService.executeInTransaction(HousekeepingService.java:148)
at datameer.dap.conductor.service.HousekeepingService.executeInTransaction(HousekeepingService.java:129)
at datameer.dap.conductor.job.SingleThreadedTransactionController$1.call(SingleThreadedTransactionController.java:48)
at datameer.dap.conductor.job.SingleThreadedTransactionController$1.call(SingleThreadedTransactionController.java:39)
at datameer.dap.conductor.persistence.TransactionHandler.execute(TransactionHandler.java:118)
at datameer.dap.conductor.persistence.TransactionHandler.executeInNewTransaction(TransactionHandler.java:99)
at datameer.dap.conductor.job.SingleThreadedTransactionController.execute(SingleThreadedTransactionController.java:39)
at datameer.dap.conductor.job.SingleThreadedController.executeAndLogMetrics(SingleThreadedController.java:140)
at datameer.dap.conductor.job.SingleThreadedController.loop(SingleThreadedController.java:116)
at datameer.dap.conductor.job.SingleThreadedController$2$1.run(SingleThreadedController.java:88)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at datameer.dap.common.security.DatameerSecurityService.runAsUser(DatameerSecurityService.java:100)
at datameer.dap.common.security.DatameerSecurityService.runAsUser(DatameerSecurityService.java:135)
at datameer.dap.conductor.job.SingleThreadedController$2.run(SingleThreadedController.java:84)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.IOException: org.hsqldb.HsqlException: lob is no longer valid
at org.hsqldb.lib.java.JavaSystem.toIOException(Unknown Source)
at org.hsqldb.types.ClobInputStream.read(Unknown Source)
at org.hsqldb.types.ClobInputStream.read(Unknown Source)
at org.hibernate.type.descriptor.java.DataHelper.extractString(DataHelper.java:75)
... 41 more
Caused by: org.hsqldb.HsqlException: lob is no longer valid
at org.hsqldb.error.Error.error(Unknown Source)
at org.hsqldb.error.Error.error(Unknown Source)
at org.hsqldb.persist.LobManager.getBytesNormal(Unknown Source)
at org.hsqldb.persist.LobManager.getChars(Unknown Source)
at org.hsqldb.Session.performLOBOperation(Unknown Source)
at org.hsqldb.Session.execute(Unknown Source)
at org.hsqldb.types.ClobDataID.getChars(Unknown Source)
at org.hsqldb.types.ClobInputStream.readIntoBuffer(Unknown Source)
WARNING:
WARN [2015-10-30 08:22:55.113] JobScheduler thread-1 - Job DapJobExecution{id=802154, type=NORMAL, status=ERROR} completed with status ERROR.
After startup, these messages appear:
[30.10.15 09:27:49] Data Meer: WARN [2015-10-30 08:22:55.113] JobScheduler thread-1 - Job DapJobExecution{id=802154, type=NORMAL, status=ERROR} completed with status ERROR.
[30.10.15 09:28:32] Data Meer: ERROR [2015-10-30 08:48:31.519] JobScheduler thread-1 - Job 802155 failed with exception.
java.lang.OutOfMemoryError: GC overhead limit exceeded
[30.10.15 09:29:10] Data Meer: WARN [2015-10-30 09:20:28.650] qtp136606306-797 - Too many sessions are open (1/1).
-
With respect to the "OutOfMemoryError: GC overhead limit exceeded" message and your description, it is recommended to increase the heap size slightly,
and, with reference to our earlier communication, to optimize the scheduling of your data imports:
https://documentation.datameer.com/documentation/current/Importing+Data
The first option makes more memory available to the application, while the second decreases memory consumption.
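As a rough way to double-check what heap limit the JVM actually ends up with after you change the setting (a minimal sketch, not Datameer-specific; -Xmx is the standard JVM flag, and the class name here is only for illustration):

import java.lang.management.ManagementFactory;
import java.lang.management.MemoryMXBean;

// Minimal sketch: prints the heap limit the JVM was started with and the heap
// currently in use. Run it with the same -Xmx value you configure for the
// Datameer conductor, e.g.:  java -Xmx4g HeapCheck
public class HeapCheck {
    public static void main(String[] args) {
        long maxBytes = Runtime.getRuntime().maxMemory();        // effective -Xmx
        MemoryMXBean mem = ManagementFactory.getMemoryMXBean();
        long usedBytes = mem.getHeapMemoryUsage().getUsed();     // heap in use right now
        System.out.printf("max heap:  %d MB%n", maxBytes / (1024 * 1024));
        System.out.printf("used heap: %d MB%n", usedBytes / (1024 * 1024));
    }
}

If the reported max heap does not reflect the value you configured, the option is being set in the wrong place or overridden at startup.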