Hive external table requiring write access

I am trying to load a dataset stored in HDFS (a text file) into Hive for analysis. I create an external table as follows:

CREATE EXTERNAL TABLE myTable(field1 STRING...) ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t' STORED AS TEXTFILE LOCATION '/user/myusername/datasetlocation';

This works fine, but it requires write access to the HDFS location. Why is that?

In general, how should I load text data that I don't have write access to? Is there a read-only table type?

Edit: I noticed an issue reported against Hive regarding this question. It does not appear to have been resolved.

+5

3 answers

I have no real solution for this, but as a workaround I found that

CREATE TEMPORARY EXTERNAL TABLE

works without write permission; the difference is that the table disappears at the end of your session.
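
For reference, a minimal sketch of the workaround, reusing the DDL from the question (this assumes a Hive version that accepts the TEMPORARY keyword, i.e. Hive 0.14 or later):

-- Session-scoped table definition; the HDFS data itself is untouched
CREATE TEMPORARY EXTERNAL TABLE myTable(field1 STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
STORED AS TEXTFILE
LOCATION '/user/myusername/datasetlocation';

Since the table is external, dropping it (or ending the session) should only remove the metadata, not the underlying files.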

0

Partially answering my own question:

It seems Hive simply does not allow this at the moment. But here is an interesting fact: Hive does not require write access to the files themselves, only to the folder. For example, you can have a folder with 777 permissions while the files inside it that Hive accesses remain read-only, e.g. 644.
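
A sketch of what that looks like in practice, using the path from the question (the permission values are just the ones mentioned above):

# Make the directory itself writable (777)...
hadoop fs -chmod 777 /user/myusername/datasetlocation
# ...while the data files inside stay read-only (644)
hadoop fs -chmod 644 /user/myusername/datasetlocation/*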

+3

If you need write access to the HDFS location:

hadoop dfs -chmod 777 /folder_name

This grants all users full permissions on that particular folder. (Note that hadoop dfs is deprecated; hdfs dfs or hadoop fs is the current form.)
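
If 777 is too permissive, a narrower variant is possible, assuming the user running the queries shares a group with the data owner (the group name "hive" here is hypothetical):

# Hand group ownership to the (hypothetical) hive group
hadoop fs -chgrp hive /user/myusername/datasetlocation
# Grant write to owner and group only, not to everyone
hadoop fs -chmod 775 /user/myusername/datasetlocation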

+1

Source: https://habr.com/ru/post/1262830/

