
python - Loading a local file with sc.textFile

Reposted · Author: 行者123 · Updated: 2023-11-30 22:49:56

I am trying to load a local file as follows:

File = sc.textFile('file:///D:/Python/files/tit.csv')
File.count()

Full traceback:

IllegalArgumentException                  Traceback (most recent call last)
<ipython-input-72-a84ae28a29dc> in <module>()
----> 1 File.count()

/databricks/spark/python/pyspark/rdd.pyc in count(self)
1002 3
1003 """
-> 1004 return self.mapPartitions(lambda i: [sum(1 for _ in i)]).sum()
1005
1006 def stats(self):

/databricks/spark/python/pyspark/rdd.pyc in sum(self)
993 6.0
994 """
--> 995 return self.mapPartitions(lambda x: [sum(x)]).fold(0, operator.add)
996
997 def count(self):

/databricks/spark/python/pyspark/rdd.pyc in fold(self, zeroValue, op)
867 # zeroValue provided to each partition is unique from the one provided
868 # to the final reduce call
--> 869 vals = self.mapPartitions(func).collect()
870 return reduce(op, vals, zeroValue)
871

/databricks/spark/python/pyspark/rdd.pyc in collect(self)
769 """
770 with SCCallSiteSync(self.context) as css:
--> 771 port = self.ctx._jvm.PythonRDD.collectAndServe(self._jrdd.rdd())
772 return list(_load_from_socket(port, self._jrdd_deserializer))
773

/databricks/spark/python/lib/py4j-0.9-src.zip/py4j/java_gateway.py in __call__(self, *args)
811 answer = self.gateway_client.send_command(command)
812 return_value = get_return_value(
--> 813 answer, self.gateway_client, self.target_id, self.name)
814
815 for temp_arg in temp_args:

/databricks/spark/python/pyspark/sql/utils.pyc in deco(*a, **kw)
51 raise AnalysisException(s.split(': ', 1)[1], stackTrace)
52 if s.startswith('java.lang.IllegalArgumentException: '):
---> 53 raise IllegalArgumentException(s.split(': ', 1)[1], stackTrace)
54 raise
55 return deco

IllegalArgumentException: u'java.net.URISyntaxException: Expected scheme-specific part at index 2: D:'

What is going wrong? I followed the usual approach, e.g. "load a local file to spark using sc.textFile()" or "How to load local file in sc.textFile, instead of HDFS". Those examples are for Scala, but as far as I can tell the same should apply to Python.

But:

val File = 'D:\\\Python\\files\\tit.csv'


File "<ipython-input-132-2a3878e0290d>", line 1
    val File = 'D:\\\Python\\files\\tit.csv'
        ^
SyntaxError: invalid syntax

Best answer

Update: there appears to be a known problem with ':' in Hadoop paths...

filenames with ':' colon throws java.lang.IllegalArgumentException

https://issues.apache.org/jira/browse/HDFS-13

Path should handle all characters

https://issues.apache.org/jira/browse/HADOOP-3257

In this Q&A, someone managed to overcome the problem with Spark 2.0:

Spark 2.0: Relative path in absolute URI (spark-warehouse)


There are several distinct issues in the question:

1) Accessing a local file on Windows from Python

File = sc.textFile('file:///D:/Python/files/tit.csv')
File.count()

Could you try:

import os
inputfile = sc.textFile(os.path.normpath("file://D:/Python/files/tit.csv"))
inputfile.count()
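As a side note not taken from the original answer: if the goal is a well-formed `file://` URI for Spark, Python's `pathlib` can build one directly from a Windows-style path. This is a sketch of the URI construction only; the `sc.textFile` call itself is not re-tested here.

```python
from pathlib import PureWindowsPath

# PureWindowsPath applies Windows path rules on any host OS,
# and as_uri() produces a properly formed file:// URI.
uri = PureWindowsPath(r'D:\Python\files\tit.csv').as_uri()
print(uri)  # file:///D:/Python/files/tit.csv
```

The resulting string could then be passed to `sc.textFile(uri)` instead of hand-building the `file://` prefix.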

os.path.normpath(path)

Normalize a pathname by collapsing redundant separators and up-level references so that A//B, A/B/, A/./B and A/foo/../B all become A/B. This string manipulation may change the meaning of a path that contains symbolic links. On Windows, it converts forward slashes to backward slashes. To normalize case, use normcase().

https://docs.python.org/2/library/os.path.html#os.path.normpath

The output is:

>>> os.path.normpath("file://D:/Python/files/tit.csv")
'file:\\D:\\Python\\files\\tit.csv'
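Note that `os.path.normpath` is platform-dependent: the backslash output above is the Windows behavior. As a small illustration (my addition, not part of the original answer), the `ntpath` and `posixpath` modules, which `os.path` aliases on Windows and POSIX respectively, reproduce either behavior on any host:

```python
import ntpath
import posixpath

# ntpath applies Windows rules: forward slashes become backslashes,
# which mangles the 'file://' scheme prefix as seen above.
print(ntpath.normpath("file://D:/Python/files/tit.csv"))  # file:\D:\Python\files\tit.csv

# posixpath collapses redundant separators and up-level references.
print(posixpath.normpath("A//B/./C/../D"))  # A/B/D
```

This also shows why `normpath` is a questionable tool here: it treats the `file://` URI as an ordinary path and destroys the scheme separator.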

2) Scala code tested in Python:

val File = 'D:\\\Python\\files\\tit.csv'
SyntaxError: invalid syntax

This code cannot run in Python because it is Scala: `val` is a Scala keyword, not valid Python syntax.
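For completeness (my addition, not part of the original answer), the Python equivalent of that Scala line is a plain assignment; a raw string keeps the backslashes literal without doubling them:

```python
# Python has no `val` keyword; a plain assignment works.
# The r'' prefix (raw string) stops backslashes acting as escape characters.
file_path = r'D:\Python\files\tit.csv'
print(file_path)  # D:\Python\files\tit.csv
```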

Regarding "python - Loading a local file with sc.textFile", there is a similar question on Stack Overflow: https://stackoverflow.com/questions/39552235/
