Error when starting pyspark, please help?

The output is:

Python 2.7.10 (default, Feb 24 2016, 16:02:27)
[GCC 4.4.6 20120305 (Red Hat 4.4.6-4)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
16/06/20 19:58:01 WARN SparkConf:
SPARK_WORKER_INSTANCES was detected (set to '3').
This is deprecated in Spark 1.0+.
Please instead use:
 - ./spark-submit with --num-executors to specify the number of executors
 - Or set SPARK_EXECUTOR_INSTANCES
 - spark.executor.instances to configure the number of instances in the spark config.
16/06/20 19:58:02 WARN AbstractHandler: No Server set for org.spark_project.jetty.server.handler.ErrorHandler@5f242a67
Traceback (most recent call last):
  File "/home/hadoop/soft/spark-2.0.0-preview-bin-hadoop2.7/python/pyspark/shell.py", line 38, in <module>
    sc = SparkContext()
  File "/home/hadoop/soft/spark-2.0.0-preview-bin-hadoop2.7/python/pyspark/context.py", line 115, in __init__
    conf, jsc, profiler_cls)
  File "/home/hadoop/soft/spark-2.0.0-preview-bin-hadoop2.7/python/pyspark/context.py", line 176, in _do_init
    self._accumulatorServer = accumulators._start_update_server()
  File "/home/hadoop/soft/spark-2.0.0-preview-bin-hadoop2.7/python/pyspark/accumulators.py", line 259, in _start_update_server
    server = AccumulatorServer(("localhost", 0), _UpdateRequestHandler)
  File "/usr/local/lib/python2.7/SocketServer.py", line 420, in __init__
    self.server_bind()
  File "/usr/local/lib/python2.7/SocketServer.py", line 434, in server_bind
    self.socket.bind(self.server_address)
  File "/usr/local/lib/python2.7/socket.py", line 228, in meth
    return getattr(self._sock,name)(*args)
socket.gaierror: [Errno -2] Name or service not known
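
The traceback shows that pyspark tries to bind its accumulator server to ("localhost", 0), and the gaierror means the name "localhost" itself cannot be resolved on this machine. A minimal sketch to reproduce the failing call outside Spark (a standalone diagnostic, not part of pyspark):

# Standalone check: attempt the same bind pyspark's AccumulatorServer performs.
# If "localhost" does not resolve, this raises the same
# socket.gaierror: [Errno -2] Name or service not known
import socket

s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
try:
    s.bind(("localhost", 0))  # port 0 lets the OS pick a free port
    print("localhost resolved, bound to port %d" % s.getsockname()[1])
finally:
    s.close()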

MarsJ (Big Data player ~DS) answered on 2016-06-21:

pyspark binds its accumulator server to "localhost", so that name must resolve to 127.0.0.1. In /etc/hosts, do not comment out the line "127.0.0.1   localhost localhost.localdomain localhost4 localhost4.localdomain4"; keeping that entry in place solves the problem.
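
For reference, the relevant entries in /etc/hosts on a Red Hat style system normally look like the lines below (the exact aliases can vary; what matters is that localhost maps to 127.0.0.1 and is not commented out):

127.0.0.1   localhost localhost.localdomain localhost4 localhost4.localdomain4
::1         localhost localhost.localdomain localhost6 localhost6.localdomain6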
