
Hadoop connector runtime issues

by markc » Fri May 24, 2013 1:10 pm

This topic lists a number of common runtime errors seen when using the Hadoop connector, along with advice on how to address them:

ET010F - invalid external data record

To diagnose the problem, look at the serverdbg* files under `wxlogd smd` from around the time the ET010F error was returned; these contain more detail on the underlying cause.

The most common cause of this error is a mismatch in the encoding scheme used for the data. By default this is typically UTF-8, so if a file is encoded differently (e.g. as LATIN1), that information needs to be added to the target string used for the external table. The attribute name to use is character_set, so adding "character_set LATIN1" to the target string would resolve the problem in this case.
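
As a rough sketch, the external table definition might look something like the following. The connector name, columns and file path here are just placeholders for illustration, and the exact comma-separated layout of the target string may vary by version; the point is the character_set attribute:

create external table sales_ext (
    id     int,
    region char(100)
)
from hdfs_con
target 'file /data/sales.csv, character_set LATIN1';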

Another possible cause is that the data is too long for the target column - in this case error 0x202 will appear. You will need to make the target table column bigger (e.g. if it is currently a char(100), change it to a char(1000), or whatever size is required for the data to be loaded).
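
For example (again with placeholder names - this only sketches the shape of the change), you could drop and recreate the external table with the wider column:

drop table sales_ext;
create external table sales_ext (
    id     int,
    region char(1000)    -- widened from char(100) to fit the incoming data
)
from hdfs_con
target 'file /data/sales.csv';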

If error 0x110B is returned, this indicates a mismatch in the data. This might be because the data is not well-formed (in which case it cannot be loaded), or because the data contains a null field indicator. In the latter case, add the following to the target string (assuming the null field indicator is "X"):

fmt_null_value "X"
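
Putting it together, an illustrative definition might look like this - connector name and path are placeholders, and check the quoting rules for your version:

create external table sales_ext (
    id     int,
    region char(1000)
)
from hdfs_con
target 'file /data/sales.csv, fmt_null_value "X"';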

Re: Hadoop connector runtime issues

by markc » Wed May 29, 2013 12:47 pm

The topic linked below has an attachment giving more detail on the formatting options available for handling various data issues:

http://www.kognitio.com/forums/viewtopic.php?f=2&t=6
