by Deizreal » Mon Sep 29, 2014 3:34 pm
I trust this finds you well.
I solved the previous problem we were facing, so feel free to close that topic.
We currently have a database of 249 tables, around 49.9GB of hard disk space, which we are importing into the Kognitio database with the following setup:
76GB of RAM in total with 200GB of hard drive space.
While we are loading the CSVs into the Kognitio schema we created, memory use reaches 60.61GB and the rest of the tables are rejected with a message that there is not enough memory available for the next table.
Can you please provide me with further information about this?
What would be the most efficient way to load these tables? From our tests, the 76GB of RAM we have should be more than enough for the tables we are loading.
by markc » Mon Sep 29, 2014 3:47 pm
Kognitio reserves a share of RAM for query workspace: with the 70% limit on table and view images, 76GB of RAM allocated to Kognitio leaves only 0.7 × 76GB = 53.2GB available for your table and view images.
I'm not sure why you see 60.61GB in use before you get errors about there not being enough memory; if you capture the output from running the $f2 and then $f7 queries in wxsubmit while the system is in this state, that might explain it.
To avoid this problem you can either provide more memory if you really need all this data to be in RAM, or drop the images of some of the tables if you don't need them to always be in RAM.
The documentation explains how to use the various sorts of memory images, including putting predicates on views and/or tables so that only certain rows are placed into memory.
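As an illustration of that approach, the image DDL looks roughly like the following (the table and view names here are made up, and the exact syntax should be checked against the documentation for your version):

```sql
-- Free RAM by dropping the image of a table that does not need to be resident
drop table image myschema.big_history;

-- Keep only the rows that are actually queried in RAM, via a view image
create view myschema.recent_orders as
    select * from myschema.orders
    where order_date >= date '2014-01-01';
create view image myschema.recent_orders;
```

A view image like this holds just the rows matching the predicate in memory, while the rest of the table stays on disk.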
by Deizreal » Mon Sep 29, 2014 4:27 pm
I will give it a try; as you say, I missed the part about the reserved memory, and I apologize for asking such a novice question.
I will pass my feedback, and yours, on to the person I report to about this.
But thank you for the speedy reply and assistance :)
by markc » Mon Sep 29, 2014 4:38 pm
However, if you were using version 7 before, the most likely causes of loading stopping after about 60GB of RAM had been used are either that you were using hashed images and so saw skew, or that your RAM stores only get about 60GB of the 76GB on the platform (some RAM is reserved for Linux and other applications).
Again, running the $f2 and $f7 queries in wxsubmit will show that.
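For reference, a session for capturing that output might look like this (the -s/-u options and the prompt shown are assumptions about a typical wxsubmit invocation, not taken from this thread; adjust for your install):

```
$ wxsubmit -s mysystem -u sys
> $f2;
> $f7;
```

Saving the output of both queries while the loads are failing gives the memory-usage picture needed to diagnose where the RAM is going.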
Moving to version 8 will actually make you run out of RAM somewhat sooner, as version 8 has the 70% limit for table/view images and version 7's limit is somewhat higher.
by Deizreal » Mon Sep 29, 2014 5:18 pm
Well, for the loading we have left most of it at the defaults with the loader, and we created a bat file that handles each CSV we load.
Also, for the tables we took a direct copy from our MySQL database and translated it into SQL, which I think is also where our problem lies.
I am home at the moment, so tomorrow morning I will upgrade the lot to version 8, give the queries you suggested a shot, and look at everything else so that we have it running as smoothly as we can get it.
I was told that the data we have been able to load onto our test setup will suffice for our tests, but I would just like to be prepared and also have a better understanding of it all, so...
Gonna get the coffee going, and I've got lots of reading to go through tonight :)
Once again thanks for the assistance, Mark, and I will let you know what happens as we go.
by markc » Mon Oct 06, 2014 8:59 am
If you have more RAM than that you can constrain the RAM available to Kognitio on each server by using wxviconf to add a memsize setting to the [system] section of the config file. Note that the memsize parameter sets the RAM PER NODE, rather than across the whole system.
So if you have a single node system with 256GB RAM you could edit the config file with wxviconf to add the following setting, which would let the system work with the free licence:
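The setting itself appears to have been lost from this quote; based on the surrounding text it would look something like this (the 128GB value is the free-licence limit mentioned later in the thread, and the exact suffix/units that memsize accepts should be checked in the documentation):

```
[system]
memsize=128G
```

Remember this is per node, so on a multi-node system each server would be capped at this amount.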
by Deizreal » Mon Oct 06, 2014 9:05 am
I'm being told to really push the envelope on this, so I am just looking over the in-depth options before any orders and the like are made.
I wasn't aware that memsize is set per node; really handy to know ^.^
So, to my understanding, the space reserved for the OS and workspace is then excluded from the 128GB total?
by markc » Mon Oct 06, 2014 9:29 am
Note that you should NOT set memsize to more memory than your node really has! It should only be used to reduce the amount of RAM used by Kognitio, to either allow the free licence to be used, or to allow more non-Kognitio software to run on the same nodes.