Getting an error when launching pyspark — can anyone help?
The following output appears:
Python 2.7.10 (default, Feb 24 2016, 16:02:27)
[GCC 4.4.6 20120305 (Red Hat 4.4.6-4)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
16/06/20 19:58:01 WARN SparkConf:
SPARK_WORKER_INSTANCES was detected (set to '3').
This is deprecated in Spark 1.0+.
Please instead use:
- ./spark-submit with --num-executors to specify the number of executors
- Or set SPARK_EXECUTOR_INSTANCES
- spark.executor.instances to configure the number of instances in the spark config.
16/06/20 19:58:02 WARN AbstractHandler: No Server set for org.spark_project.jetty.server.handler.ErrorHandler@5f242a67
Traceback (most recent call last):
File "/home/hadoop/soft/spark-2.0.0-preview-bin-hadoop2.7/python/pyspark/shell.py", line 38, in <module>
sc = SparkContext()
File "/home/hadoop/soft/spark-2.0.0-preview-bin-hadoop2.7/python/pyspark/context.py", line 115, in __init__
conf, jsc, profiler_cls)
File "/home/hadoop/soft/spark-2.0.0-preview-bin-hadoop2.7/python/pyspark/context.py", line 176, in _do_init
self._accumulatorServer = accumulators._start_update_server()
File "/home/hadoop/soft/spark-2.0.0-preview-bin-hadoop2.7/python/pyspark/accumulators.py", line 259, in _start_update_server
server = AccumulatorServer(("localhost", 0), _UpdateRequestHandler)
File "/usr/local/lib/python2.7/SocketServer.py", line 420, in __init__
self.server_bind()
File "/usr/local/lib/python2.7/SocketServer.py", line 434, in server_bind
self.socket.bind(self.server_address)
File "/usr/local/lib/python2.7/socket.py", line 228, in meth
return getattr(self._sock,name)(*args)
socket.gaierror: [Errno -2] Name or service not known
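The final `socket.gaierror` comes from PySpark's accumulator server trying to bind to `("localhost", 0)`, and it means the hostname "localhost" itself fails to resolve on this machine (commonly a missing `127.0.0.1 localhost` entry in `/etc/hosts`). A minimal sketch, independent of Spark, that checks whether this is the cause (the `check_localhost` helper name is mine, not from Spark):

```python
import socket

def check_localhost():
    # PySpark's accumulators.py does essentially:
    #   AccumulatorServer(("localhost", 0), ...)
    # which ends in socket.bind(("localhost", 0)). If "localhost" cannot
    # be resolved, bind/getaddrinfo raises the same
    # socket.gaierror: [Errno -2] Name or service not known
    try:
        socket.getaddrinfo("localhost", 0)
        return True
    except socket.gaierror as e:
        print("localhost does not resolve:", e)
        return False

if __name__ == "__main__":
    if check_localhost():
        print("localhost resolves; the problem is elsewhere")
```

If this prints the same gaierror, the usual fix is to ensure `/etc/hosts` contains a line like `127.0.0.1   localhost` on the node where pyspark is started.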
1 reply
MarsJ - 大数据玩家~DS answered on 2016-06-21
Upvoted by: BIWORK
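Separately from the crash, the `SPARK_WORKER_INSTANCES was detected (set to '3')` warning in the log above suggests migrating to the newer settings. A hedged sketch of the three alternatives the warning lists (the application file name and the value 3 are taken from the log; paths are the standard Spark layout):

```shell
# Option 1: pass the executor count to spark-submit directly
# (my_app.py is a placeholder for your application)
./bin/spark-submit --num-executors 3 my_app.py

# Option 2: export SPARK_EXECUTOR_INSTANCES in conf/spark-env.sh
export SPARK_EXECUTOR_INSTANCES=3

# Option 3: set the property in conf/spark-defaults.conf
# spark.executor.instances   3
```

Any one of these removes the deprecation warning; it is unrelated to the gaierror, which is a hostname-resolution problem.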