Information and discussion related to the Kognitio on Hadoop product
Hortonworks + Kognitio
by tspann » Thu Oct 20, 2016 3:23 pm
I am going to install this on my OpenStack CentOS 7.2 HDP 2.5 cluster to try it out.
I'd also like to use it for my meetup:
http://meetup.com/futureofdata-princeton
Any tips or suggestions?
Can I run queries from Hive? Spark SQL? Zeppelin? the Hive CLI? Beeline?
Re: Hortonworks + Kognitio
by markc » Thu Oct 20, 2016 8:08 pm
Reading http://www.kognitio.com/forums/Getting% ... Hadoop.pdf is a good starting point.
Then you can download the Kognitio client tools and run, for example, Kognitio Console against the system to submit SQL. By default, the edge node you launch the Kognitio YARN application from listens on port 6550 for ODBC connections. So if you install the Kognitio clients and create a DSN that specifies the edge node's IP address, you can then run SQL. You could also use third-party tools like Tableau, MicroStrategy, etc. to connect to your Kognitio application and query data stored in the Hadoop cluster.
The general Kognitio documentation explains how to, for example, create external tables, which let you access data stored in HDFS. Ideally you'd then create in-memory images of these for best performance - see the general documentation on create view images.
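As a rough illustration of the DSN step above, a unixODBC-style entry might look like the following. The driver path, DSN name, and edge-node IP here are placeholders, and the exact key names accepted by the Kognitio ODBC driver may differ - check the client-tools documentation:

```
# ~/.odbc.ini -- illustrative entry only; driver path, DSN name,
# and IP are assumptions, not values from this thread
[KOGNITIO]
Driver      = /path/to/kognitio/odbc/driver.so
Description = Kognitio on Hadoop (edge node)
Server      = 192.0.2.10
Port        = 6550
```

With a DSN like this in place, Kognitio Console or any ODBC-capable tool (Tableau, MicroStrategy, etc.) can connect by selecting the `KOGNITIO` data source.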
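The external-table-plus-image pattern Mark describes might be sketched roughly as below. The connector name, table definition, and HDFS path are hypothetical, and Kognitio's exact DDL syntax should be taken from the general documentation rather than from this sketch:

```sql
-- Hypothetical example: expose HDFS data as an external table,
-- then build an in-memory image of a view over it for performance.
-- Connector name, columns, and path are placeholders.
create external table sales (
    id      int,
    amount  decimal(10,2)
)
from my_hdfs_connector target 'file "/data/sales/*.csv"';

create view sales_v as select * from sales;
create view image sales_v;   -- pins the view's data in memory
```

Queries against `sales_v` would then be served from RAM rather than re-reading HDFS each time.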
Mark.