Forum: Discussions specific to version 8.2

Getting out of memory error while running query on external Hive table created in Kognitio

by sadesh » Wed Jan 30, 2019 7:52 am

Hi,

I have created an external table in Kognitio from a Hive external table. The table contains 10,000 records. When I run a select query with a LIMIT clause, it gives an out of memory error. I created the cluster with 16 GB of RAM and 4 cores, with 1 container.

Kindly help to resolve this issue.

Regards,
Sadesh Jayraj

by markm » Wed Jan 30, 2019 11:44 am

Hi Sadesh,

There should be no problem running that query on a 16GB container and I can't replicate the problem here (I tried with an 85,000,000 record table).

Could you please send us the exact error message returned and the diagnose output for the query (e.g. "diagnose select * from <ext_table> limit 100")?

How is that table stored in Hive (ORC, Parquet, text etc) and is it one file or several?

Thanks,
Mark

by sadesh » Thu Jan 31, 2019 9:43 am

Hi Mark,

Thanks for your help.

I have created the Kognitio external table from a Hive external table. It is a Parquet table. The table has 212 columns and 10,000 records.

Even when selecting a single row I am facing the OOM error. I did not find any error text in the Kognitio logs. Does it require any YARN memory configuration?

Or do I need to tune some OS-level parameter?

As suggested, I have attached the output of the "diagnose select <>" command. Please take a look.

Regards,
Sadesh Jayraj

by markm » Thu Jan 31, 2019 10:54 am

Hi Sadesh,

Thank you for the diagnose output - we will use that to replicate your problem here.

In the meantime, could you please re-run the query but select a small number of columns just to check basic access - e.g.

select FLATFILE_NAME, CC_ITEM_INDEX, SRCTATYPE, SRCTAID, SUBPROCESS from zar001_parquet_hive_spark_05121949 limit 100;

We think the OOM problem may be related to the combination of the number and width of the columns - do you know the maximum width of a row?
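For illustration only: if, say, 200 of those columns were declared as VARCHAR(1000), a single fully-populated row could approach 200 KB, so buffering even a few thousand such rows quickly runs to hundreds of megabytes - the real figure depends on your actual column definitions.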

Regards,
Mark

by sadesh » Thu Jan 31, 2019 4:09 pm

Hi Mark,

I have followed your suggestions. I can't say exactly what the width of an entire row is for this scenario, as we have 212 columns in the table and we want to check performance for the full table.

Meanwhile I tried to run the select command with a limited set of columns, and I got the error below:

08S01: [Kognitio][WX2 Driver] Communications link failure


Regards,
Sadesh Jayraj

by andym » Thu Jan 31, 2019 6:53 pm

Hi Sadesh

We are trying to reproduce the problem you are having in-house but would like some more information. Can you please run a metadata scan on the parquet file you are trying to load so we can find out a bit more about it? You can download the parquet tools jarfile from our website here:

https://kognitio.com/downloads/parquet- ... .0-kog.jar

This can be put on the edge node where you ran the kodoop command and used to report on the parquet file with a command like this:

hadoop jar /local/path/to/parquet-tools-1.10.0-kog.jar meta /hdfs/path/to/parquet/file

Please run this command against the parquet file you are trying to load and send us the results.

Thanks in advance,

Andy MacLean

by andym » Thu Jan 31, 2019 7:04 pm

Hi Sadesh

It would also be helpful if you could provide us with some extra information about the 'Communications link failure' error above. Kognitio keeps a log of incidents which can cause this behaviour and you can see these by running a command like this:

kodoop incidents NNNN list

(where NNNN is the name of your cluster)

This will give output similar to this:
$kodoop incidents andyhdp list
Kognitio Analytical Platform software for Hadoop ver80202rel180914.
(c)Copyright Kognitio Ltd 2001-2018.

Found 1 items
wxdb-crash.T_2019-01-31_18_23_11_GMT

A report on the incident can be generated by taking the incident ID (the line starting wxdb-) for the most recent incident and running a command like this:

kodoop incidents NNNN short IIII

For example:

$kodoop incidents andyhdp short wxdb-crash.T_2019-01-31_18_23_11_GMT
Kognitio Analytical Platform software for Hadoop ver80202rel180914.
(c)Copyright Kognitio Ltd 2001-2018.

The report is andyhdp_wxdb-crash.T_2019-01-31_18_23_11_GMT_short_report.tar.gz

Please have a look at the incident list for your Kognitio cluster and send us the short report for the most recent incident if any are present.

You may want to send this and the metadata report from the previous comment privately by e-mail instead of attaching them here, in which case you can send mail to helpdesk@kognitio.com with the files attached.

Thanks in advance,

Andy MacLean

by sadesh » Fri Feb 01, 2019 7:08 am

Hi Andy,

Thanks for your kind suggestions.

I have executed the given commands on my in-house setup and the results are as given below:

kodoop incidents hdpko list
Kognitio Analytical Platform software for Hadoop ver80202rel180914.
(c)Copyright Kognitio Ltd 2001-2018.

Found 5 items
wxdb-crash.T_2019-01-30_19_43_19_GMT
wxdb-crash.T_2019-01-31_09_37_57_GMT
wxdb-crash.T_2019-01-31_15_32_21_GMT
wxdb-crash.T_2019-01-31_15_50_43_GMT
wxdb-crash.T_2019-01-31_15_59_32_GMT

I have found 5 incidents but am unable to generate a report for any of them. Every time I get the same message:

kodoop incidents hdpko short wxdb-crash.T_2019-01-31_15_59_32_GMT
Kognitio Analytical Platform software for Hadoop ver80202rel180914.
(c)Copyright Kognitio Ltd 2001-2018.

No incident found with id wxdb-crash.T_2019-01-31_15_59_32_GMT

How can I find the actual location of all the generated incidents, so that I can share them with you if required?

Also, I will share the output of the parquet-tools jar (which includes the Parquet file metadata) separately to the given email address.

Are you able to reproduce the same issue on your setup?

Please look into this and advise.

Regards,
Sadesh Jayraj

by markm » Fri Feb 01, 2019 10:33 am

Hi Sadesh,

Andy will get back to you about the incident reports, but I just wanted to let you know that we have managed to reproduce the out of memory error. It turned out to be a problem in the client tool, not the server, and engineering are working on a fix for it now.

Regards,
Mark

by andym » Fri Feb 01, 2019 11:42 am

sadesh wrote:
Fri Feb 01, 2019 7:08 am
Hi Andy,

Thanks for your kind suggestions.

I have executed the given commands on my in-house setup and the results are as given below:

...

I have found 5 incidents but am unable to generate a report for any of them.

...

How can I find the actual location of all the generated incidents, so that I can share them with you if required?
Hi Sadesh

That's unusual. Each Incident gets written out to a folder inside HDFS which will be named .kodoop-clusters/<clustername>/dumps/<incident>. So for me:

$ hadoop fs -ls .kodoop-clusters/andyhdp/dumps
Found 1 items
drwxr-xr-x - kodoop hdfs 0 2019-01-31 18:38 .kodoop-clusters/andyhdp/dumps/wxdb-crash.T_2019-01-31_18_23_11_GMT

The incident folder contains a lot of different files, though, and the 'incident not found' error comes from not being able to find the edge node logfile (a file with an @ in the name), which is the short report. Perhaps you can 'hadoop fs -ls' one of the incident folders and post the file list here?
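For example, using your cluster name and one of the incident IDs from your list above, that would be something like:

hadoop fs -ls .kodoop-clusters/hdpko/dumps/wxdb-crash.T_2019-01-31_15_59_32_GMT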

Thanks in advance,

Andy MacLean

by markm » Fri Feb 01, 2019 11:50 am

Hi Sadesh,

We have an alternative query tool that you can use while we get a fix out for the OOM error - this is a Java-based tool and it requires Java 8 or later.

You can download it from http://kognitio.com/downloads/KogSQL.jar and run it with the following command:

java -jar KogSQL.jar -s <server address> -u <user> -p <password>

If you run "java -jar KogSQL.jar" you will get the command line help.

When running the application, help is available by entering "help;" and you can exit using ctrl-d or "quit;".

If you want to run a SQL query directly from the command line use the following method:

echo "<SQL Query>" | java -jar KogSQL.jar -s <server address> -u <user> -p <password> -

Regards,
Mark

by sadesh » Sat Feb 02, 2019 5:45 pm

Hi Mark and Andy,

Thank you for your continuous support and suggestions.

I have downloaded KogSQL.jar from the given link. Using it, I tried to fetch a single column from the external table mentioned above, but I got a Kognitio JDBC Java exception while reading the Spark schema.

Query:
select FLATFILE_NAME from zar001_parquet_hive_spark_05121949 limit 100;

HY000 [Kognitio][JDBC Driver] PI0183: Module JAVA: com.kognitio.javad.connector.KECException: org.apache.parquet.io.InvalidRecordException: flatfile_name not found in message spark_schema

Meanwhile, I have created an external Hive table from a CSV file, but the session keeps terminating whenever I try to fetch some columns from that Hive external table.

select FLATFILE_NAME from zar001_kognitio_test limit 100;

08006 [Kognitio][JDBC Driver] Session has been aborted: Connection unexpectedly terminated while waiting for reply from server.

I have also observed a WX2 dump:

kodoop server hdpko status
Kognitio Analytical Platform software for Hadoop ver80202rel180914.
(c)Copyright Kognitio Ltd 2001-2018.

Current state: Booted.
Goal state: Booted.
Second Goal State: UNKNOWN.
Status: Performing a debug dump after ERROR: WXDB crash detected by monitor..
Current operation: .

I tried to attach the WX2 crash dump file, but files with the wx2 extension are not allowed to be uploaded here. I will share the full dump folder with you at the given email address. Some log text is attached for your reference.

Regards,
Sadesh Jayraj
Attachments: WX2_crash_dump.docx (23.58 KiB)

by sadesh » Tue Feb 05, 2019 5:16 am

Hi Team,

Waiting for your suggestion.

For your information, I have noticed that while running the select query below, the container gets killed due to this error:

>select FLATFILE_NAME from zar001_parquet_hive_spark_05121949 limit 100;

Container [pid=6330,containerID=container_e33_1542606969813_0102_01_000002] is running beyond physical memory limits. Current usage: 16.0 GB of 16 GB physical memory used; 24.8 GB of 33.6 GB virtual memory used. Killing container.
Dump of the process-tree for container_e33_1542606969813_0102_01_000002 :
|- PID PPID PGRPID SESSID CMD_NAME USER_MODE_TIME(MILLIS) SYSTEM_TIME(MILLIS) VMEM_USAGE(BYTES) RSSMEM_USAGE(PAGES) FULL_CMD_LINE
|- 6330 6328 6330 6330 (bash) 0 0 118022144 369 /bin/bash -c python ./infra/agent/slider-agent/agent/main.py --label container_e33_1542606969813_0102_01_000002___kodoop_container --zk-quorum prdiqip3w06.bss.jio.com:2181 --zk-reg-path /registry/users/kognitio/services/org-apache-slider/kognitio-hdpko > /DATA/hadoop/yarn/log/application_1542606969813_0102/container_e33_1542606969813_0102_01_000002/slider-agent.out 2>&1
|- 30585 30358 30358 6330 (wx2-linker) 242 209 589864960 132042

To resolve this issue, I followed a suggestion given on the forum and tried to create a Parquet connector with max_connectors_per_node 5, but it gives an error:

CREATE CONNECTOR test_parquet command 'java-ext-table.plugin' target 'connector_mode PLUGIN, max_connectors_per_node 5,connector_type PARQUET, uri_location "hdfs://prdiqip3w05.bss.jio.com:8020"';
Query 1 retcode = -1 ---- 0:00.1 0:00.1 0:00.1
HY000: [Kognitio][WX2 Driver][localhost:6550] ET011A: External connector command not found


Thanks and Regards,
Sadesh Jayraj

by markc » Tue Feb 05, 2019 8:06 am

Sadesh,

The problem is that the Parquet connector does not match filenames case-insensitively - it assumes that the names will always be lower-case, which is what normally happens in Hive.

We are preparing a fix for this, and will make that available so you can continue testing.

As that is the underlying problem, any attempts to e.g. increase the number of threads used won't be helpful.

Regards,
Mark

by markc » Thu Feb 07, 2019 10:20 am

Sadesh,

8.2.2rel190206 contains a fix for the case-sensitivity issue with the Parquet connector.

Can you download that from the Kognitio All Downloads section of the website, and follow the instructions at https://kognitio.com/documentation/late ... p-versions to upgrade to that version?

by sadesh » Mon Feb 11, 2019 6:11 am

Hi Mark and Andy,

Thanks for your kind help and support.

I have upgraded my cluster from ver80202rel180914 to ver80202rel190206, and after the upgrade the already-defined cluster works. I tried to fetch some columns from the table mentioned above; it reads the table columns properly and the case sensitivity issue has been fixed. However, when I try to fetch all records (10,000) with all columns I am facing the OOM error again. Kindly add your inputs so I can resolve it.

Regards,
Sadesh Jayraj

by sadesh » Mon Feb 11, 2019 6:58 am

Hi,

While using the kodoop SQL command line I get the OOM error. I have tried KogSQL.jar to select rows from the table mentioned above, and in that case I did not face any OOM issue.
I will try to load bulk data into the same table and let you know if I have any further queries.

Regards,
Sadesh Jayraj

by markc » Mon Feb 11, 2019 8:00 am

Sadesh,

Thanks for the feedback - your observations are in line with our expectations, as there were two problems:

1) the issue with wxsubmit giving an OOM error when fetching from a table with many potentially-large VARCHAR columns. This was a quirk in how wxsubmit allocates memory for row retrieval, but KogSQL.jar does not suffer from the same problem (which is why we made that available to you). We have a fix for wxsubmit which will be in a future version.

2) the issue with case sensitivity for Parquet filenames. We fixed this in the patch that we made available last week, so we don't expect you to see that problem on any systems using that patch or later versions.

Regards,
Mark

by sadesh » Tue Feb 12, 2019 10:47 am

Hi All,

1. Yesterday we upgraded the kodoop cluster from ver80202rel180914 to ver80202rel190206 to resolve the case sensitivity issue for Parquet filenames. After upgrading the server, the SQL query to fetch data from the Parquet file worked fine.

But today I am facing the same issue while accessing the same Parquet table:

>>select FLATFILE_NAME from zar001_parquet_hive_spark_05121949 limit 10;

HY000 [Kognitio][JDBC Driver] PI0183: Module JAVA: com.kognitio.javad.connector.KECException: org.apache.parquet.io.InvalidRecordException: flatfile_name not found in message spark_schema {

As per the guidance, it should work on cluster version ver80202rel190206:

>>kodoop server hdpko status
Kognitio Analytical Platform software for Hadoop ver80202rel190206.
(c)Copyright Kognitio Ltd 2001-2018.

I did not perform any config changes after upgrading the server. What should I do to resolve this?


2. Meanwhile, I have created a Kognitio external table from a Hive table with 77 million records. But when selecting only a single column like flatfile_name with limit 10, the server becomes unresponsive:

>> select FLATFILE_NAME from zar001_kognitio limit 10;


Regards,
Sadesh Jayraj

by GraemeCole » Tue Feb 12, 2019 11:18 am

sadesh wrote:
Tue Feb 12, 2019 10:47 am
Hi All,

1. Yesterday we upgraded the kodoop cluster from ver80202rel180914 to ver80202rel190206 to resolve the case sensitivity issue for Parquet filenames. After upgrading the server, the SQL query to fetch data from the Parquet file worked fine.

But today I am facing the same issue while accessing the same Parquet table:

>>select FLATFILE_NAME from zar001_parquet_hive_spark_05121949 limit 10;

HY000 [Kognitio][JDBC Driver] PI0183: Module JAVA: com.kognitio.javad.connector.KECException: org.apache.parquet.io.InvalidRecordException: flatfile_name not found in message spark_schema {
Hi Sadesh,

Is that the whole error message, or was there a Parquet schema definition after the "{" character? If so, could you please post the whole error message? It could be that one or more of the Parquet files does not have a field called FLATFILE_NAME.
sadesh wrote:
Tue Feb 12, 2019 10:47 am
2. Meanwhile, I have created a Kognitio external table from a Hive table with 77 million records. But when selecting only a single column like flatfile_name with limit 10, the server becomes unresponsive:

>> select FLATFILE_NAME from zar001_kognitio limit 10;
Could you run "explain zar001_kognitio;" and send us the result please? Also, can you tell us whether this Hive table's data is stored as Parquet, CSV or something else?

Thanks,

Graeme

by sandesh » Tue Feb 12, 2019 1:24 pm

Hi Graeme,

I have attached the error and the explain table output.

The Hive table data is stored as CSV.

Regards,
Sandesh

by sandesh » Tue Feb 12, 2019 1:28 pm

Hi Graeme,
I have attached the error message and the explain table output.

We have only one Parquet file, and the Hive table data is saved in CSV format.

Regards,
Sandesh

by sadesh » Tue Feb 12, 2019 2:18 pm

Hi,

GraemeCole wrote:
Is that the whole error message, or was there a Parquet schema definition after the "{" character? If so, could you please post the whole error message? It could be that one or more of the Parquet files does not have a field called FLATFILE_NAME.

I have created the external Hive table on a single Parquet file only; there are no multiple files. I have included the whole error message in the attached file.

GraemeCole wrote:
Could you run "explain zar001_kognitio;" and send us the result please?

Please refer to the attachment for that. Table zar001_kognitio is an external table stored as CSV (TEXTFILE).

Thanks in advance.

Regards,
Sadesh Jayraj
Attachments: Error Message.docx (15.56 KiB)

by GraemeCole » Tue Feb 12, 2019 4:02 pm

Thanks, Sadesh.

Looking further up the thread, it appears that the query "select FLATFILE_NAME from zar001_parquet_hive_spark_05121949 limit 10;" started working when you applied the upgrade to ver80202rel190206. But you say that you're now getting the original issue again with the same query on the same table on the same system - is this correct? If so, what has changed on that system since it last worked?

I'm just wondering if the server has somehow been downgraded back to ver80202rel180914. Could you please run the following query on the system which is showing the "flatfile_name not found" problem, and send us the result:

select version, patch_level from sys.ipe_system;

This will tell us what version of the server is running.

Graeme

by sadesh » Wed Feb 13, 2019 4:55 am

Hi Graeme,

Looking further up the thread, it appears that the query "select FLATFILE_NAME from zar001_parquet_hive_spark_05121949 limit 10;" started working when you applied the upgrade to ver80202rel190206. But you say that you're now getting the original issue again with the same query on the same table on the same system - is this correct? - Yes, that is correct.
If so, what has changed on that system since it last worked? - We just mapped the Hive table zar001_kognitio in Kognitio, which has 0.7 million records.

> select version, patch_level from sys.ipe_system;
VERSION | PATCH_LEVEL
---------+-------------
80202 | rel180914

From the above we can see that we are on the old version, but at the same time, if we get the status of the server it shows the newer version:

[kognitio@prdiqip1w58 packages]$ kodoop server hdpko status
Kognitio Analytical Platform software for Hadoop ver80202rel190206.
(c)Copyright Kognitio Ltd 2001-2018.

Current state: Booted.
Goal state: Booted.
Second Goal State: UNKNOWN.
Status: OK.
Current operation: .


Kindly let us know what could be the next step.

Regards,
Sandesh

by markm » Wed Feb 13, 2019 10:33 am

Hi Sadesh,

We think this is a problem with an auto restart running immediately after an upgrade. Could you please stop and then restart the Kognitio cluster:

kodoop cluster hdpko stop

Wait for it to finish and then:

kodoop cluster hdpko start

This will start the cluster and then start the server in the background.

You can then check the version as before.

Could you also please tar up the ~/kodoop/logs directory and upload it to the SFTP server we set up for you before?
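Something along these lines should produce a single archive to upload (the archive name is just a suggestion):

tar czf kodoop_logs.tar.gz -C ~ kodoop/logs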

Thanks,
Mark

by sadesh » Thu Feb 14, 2019 4:44 am

Hi Mark,

After the cluster restart, it is reflecting the correct version.

But we are unable to run a query on the table with 0.7 million records and are getting the error below:

> select FLATFILE_NAME from zar001_kognitio limit 10;
08006 [Kognitio][JDBC Driver] Session has been aborted: Connection unexpectedly terminated while waiting for reply from server.

Regards,
Sandesh

by Simon.Darkin » Thu Feb 14, 2019 10:26 am

Sandesh,

I have just sent an email to Sadesh, cc'ing you and your colleagues, explaining how to capture additional debug information if the system should crash when running 'select FLATFILE_NAME from zar001_kognitio limit 10'.

Regards

Simon

by sadesh » Thu Feb 21, 2019 11:10 am

Hi Simon,

Do we have any update on this?

Regards,
Sandesh

by Simon.Darkin » Thu Feb 21, 2019 1:45 pm

sadesh wrote:
Thu Feb 21, 2019 11:10 am
Hi Simon,

Do we have any update on this?

Regards,
Sandesh

Sandesh,

I’m not sure if you saw my last post dated 14th Feb at 10:26? I sent Sadesh some instructions on what to do if the system errors with…

‘08006 [Kognitio][JDBC Driver] Session has been aborted: Connection unexpectedly terminated while waiting for reply from server.’

Those instructions involve capturing some information and sending it to Kognitio so that we can investigate further. Is that something you or Sadesh can do?

Regards

Simon

by Simon.Darkin » Mon Feb 25, 2019 4:30 pm

Sandesh,

We believe Kognitio on Hadoop version 8.2.3 fixes the issue that you encounter when running ‘SELECT FLATFILE_NAME FROM ZAR001_KOGNITIO LIMIT 10’, so can you download the latest version of 8.2.3 using the following link and then upgrade the software, please?

# software download
https://kognitio.com/all-downloads/#All ... n%20Hadoop

# upgrade instructions
https://kognitio.com/documentation/late ... -edge-node

Regards

Simon