Hadoop cluster deployment reference: click to view

    Spark cluster deployment reference: click to view

    I was recently testing a spark-python script on a cluster I had set up myself, and ran into the following error:

    [master@slave1 spark]$ bin/pyspark 
    Python 2.6.6 (r266:84292, Jul 23 2015, 15:22:56) 
    [GCC 4.4.7 20120313 (Red Hat 4.4.7-11)] on linux2
    Type "help", "copyright", "credits" or "license" for more information.
    /opt/spark/python/pyspark/sql/context.py:477: DeprecationWarning: HiveContext is deprecated in Spark 2.0.0. Please use SparkSession.builder.enableHiveSupport().getOrCreate() instead.
      DeprecationWarning)
    Setting default log level to "WARN".
    To adjust logging level use sc.setLogLevel(newLevel).
    16/08/01 02:33:38 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    Traceback (most recent call last):
      File "/opt/spark/python/pyspark/shell.py", line 43, in <module>
        spark = SparkSession.builder\
      File "/opt/spark/python/pyspark/sql/session.py", line 169, in getOrCreate
        sc = SparkContext.getOrCreate(sparkConf)
      File "/opt/spark/python/pyspark/context.py", line 294, in getOrCreate
        SparkContext(conf=conf or SparkConf())
      File "/opt/spark/python/pyspark/context.py", line 115, in __init__
        conf, jsc, profiler_cls)
      File "/opt/spark/python/pyspark/context.py", line 174, in _do_init
        self._accumulatorServer = accumulators._start_update_server()
      File "/opt/spark/python/pyspark/accumulators.py", line 259, in _start_update_server
        server = AccumulatorServer(("localhost", 0), _UpdateRequestHandler)
      File "/usr/lib64/python2.6/SocketServer.py", line 412, in __init__
        self.server_bind()
      File "/usr/lib64/python2.6/SocketServer.py", line 423, in server_bind
        self.socket.bind(self.server_address)
      File "<string>", line 1, in bind
    socket.gaierror: [Errno -3] Temporary failure in name resolution
    >>> num = sc.parallelize([1,2,3,4])
    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
    NameError: name 'sc' is not defined
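
    The decisive frame in the traceback is `AccumulatorServer(("localhost", 0), _UpdateRequestHandler)`: pyspark's `accumulators._start_update_server()` starts a small TCP server for accumulator updates and binds it to the name `localhost`, so if that name cannot be resolved the bind raises `socket.gaierror`, the SparkContext is never created, and that is why `sc` is undefined afterwards. A minimal sketch that reproduces the same failure outside Spark (my own illustration, not code from the post):

    import socket

    # Mirrors what pyspark's accumulators._start_update_server() does:
    # bind a TCP server socket to ("localhost", <ephemeral port>).
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    try:
        s.bind(("localhost", 0))   # raises socket.gaierror here if
        print(s.getsockname())     # "localhost" cannot be resolved
    finally:
        s.close()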

    At first I had no idea where to start and could not find the cause. When I came back to the problem a day later, a closer look showed it was a socket problem, i.e. a communication problem, so after some googling I finally found the reason:

    Cause:

    ssh cannot log in to localhost: running `ssh localhost` reports the same name-resolution error, which shows that passwordless login to the machine itself is broken.

    This is because I had cleared the contents of the /etc/hosts file while configuring the Hadoop cluster, which is what left `ssh localhost` unable to connect.

    Solution:

    Add the following lines to /etc/hosts:

    127.0.0.1   localhost localhost.localdomain localhost4 localhost4.localdomain4
    ::1         localhost localhost.localdomain localhost6 localhost6.localdomain6
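
    After saving the file, it is worth confirming that `localhost` resolves again before restarting Spark; a quick check (my own sketch, not from the original post):

    import socket

    # Should print 127.0.0.1 once /etc/hosts maps localhost again;
    # raises socket.gaierror if the entry is still missing.
    print(socket.gethostbyname("localhost"))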


    Start pyspark again, or run the .py file, and everything works.
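
    As a final sanity check, the statement that failed in the shell session above should now succeed (the values are just the ones from that session):

    >>> num = sc.parallelize([1, 2, 3, 4])
    >>> num.collect()
    [1, 2, 3, 4]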