How to save a file in Hadoop with Python
Question:
I am starting to learn Hadoop, but I need to save a lot of files into it using Python. I cannot work out what I am doing wrong. Can anyone help me?
Below is my code. I think that hdfs_path
is correct, because I did not change it in the settings when installing. pythonfile.txt
is on my desktop (so is the Python script, which I run from the command line).
Code:
import hadoopy
import os

hdfs_path = 'hdfs://localhost:9000/python'

def main():
    # write (key, value) pairs of (file name, file contents) to HDFS
    hadoopy.writetb(hdfs_path, [('pythonfile.txt', open('pythonfile.txt').read())])

main()
Output
If I run the code above, I get what looks like a file at /python:

IMac-van-Brian:Desktop Brian$ $HADOOP_HOME/bin/hadoop dfs -ls /python
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it
14/10/28 11:30:05 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
-rw-r--r--   1 Brian supergroup        236 2014-10-28 11:30 /python
Answer:
I have a feeling that you are writing into a file named /python, while you want it to be the directory in which the file is stored.
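That matches how hadoopy works: as far as I know, hadoopy.writetb serializes its (key, value) pairs into a single SequenceFile at exactly the path you give it, so hdfs://localhost:9000/python became a 236-byte file rather than a directory. A minimal sketch of building a per-file destination instead (the helper name and the directory URI are illustrative assumptions, not part of the hadoopy API):

```python
import posixpath

def writetb_target(hdfs_dir, local_name):
    """Build a destination URI that includes the file name, so each
    hadoopy.writetb call gets its own SequenceFile path instead of
    every call overwriting the same one. Hypothetical helper."""
    # HDFS URIs always use forward slashes, so posixpath is safe here
    return posixpath.join(hdfs_dir, local_name)

# e.g. pass this as hdfs_path to hadoopy.writetb:
target = writetb_target('hdfs://localhost:9000/python', 'pythonfile.txt')
print(target)  # hdfs://localhost:9000/python/pythonfile.txt
```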
What does

hdfs dfs -cat /python

show? If it shows the contents of your file, then all you need to do is edit your hdfs_path to include the file name (you should delete /python first with -rm). Otherwise, use pydoop (pip install pydoop) and do this:
import pydoop.hdfs as hdfs

from_path = '/tmp/infile.txt'
to_path = 'hdfs://localhost:9000/python/outfile.txt'
hdfs.put(from_path, to_path)