python - AttributeError: module 'functools' has no attribute 'wraps'

Tags: python python-3.x anaconda python-import importerror

I am trying to test third-party code with Anaconda 4.2 / Python 3.5. When I execute the tests, I get the following exception:

Traceback (most recent call last):
  File "pyspark/sql/tests.py", line 25, in <module>
    import subprocess
  File "/home/user/anaconda3/lib/python3.5/subprocess.py", line 364, in <module>
    import signal
  File "/home/user/anaconda3/lib/python3.5/signal.py", line 3, in <module>
    from functools import wraps as _wraps
  File "/home/user/anaconda3/lib/python3.5/functools.py", line 22, in <module>
    from types import MappingProxyType
  File "/home/user/Spark/spark-2.1.0-bin-hadoop2.7/python/pyspark/sql/types.py", line 22, in <module>
    import calendar
  File "/home/user/anaconda3/lib/python3.5/calendar.py", line 10, in <module>
    import locale as _locale
  File "/home/user/anaconda3/lib/python3.5/locale.py", line 108, in <module>
    @functools.wraps(_localeconv)
AttributeError: module 'functools' has no attribute 'wraps'

Normally I would assume that some module is shadowing a built-in one, but as far as I can tell that is not the problem here:

  • I logged the module path (functools.__file__) from inside the test, and it produced the expected path. The paths shown in the traceback do not look odd either (see the diagnostic sketch after this list).
  • To rule out a corrupted module, I tested a fresh Anaconda installation.
  • When I run the tests from an IPython shell (%run pyspark/sql/tests.py) with the same configuration and paths, the problem disappears.
  • functools.wraps can be imported in a shell started in the same directory with the same configuration.
  • When I replace the Python 3 environment with a Python 2 environment, the problem goes away.
  • I cannot reproduce the problem in an environment created with virtualenv.
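A minimal diagnostic sketch of the check from the first bullet (the expected paths are just the ones from this setup). Printing __file__ for types and functools shows exactly which files get imported and whether something on sys.path shadows the standard library:

# Diagnostic sketch: show where Python actually resolves these modules from.
# sys.path[0] is the directory of the script being executed, so a sibling
# types.py can shadow the standard-library module of the same name.
import sys
print(sys.path[:3])        # the first entries win the import lookup

import types
print(types.__file__)      # if this points into pyspark/sql/, the stdlib is shadowed

import functools           # fails as in the tracebacks here while types is shadowed
print(functools.__file__)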

With a different version of the same project I get:

Traceback (most recent call last):
  File "pyspark/sql/tests.py", line 25, in <module>
    import pydoc
  File "/home/user/anaconda3/lib/python3.5/pydoc.py", line 55, in <module>
    import importlib._bootstrap
  File "/home/user/anaconda3/lib/python3.5/importlib/__init__.py", line 57, in <module>
    import types
  File "/home/user/Spark/spark-1.6.3-bin-hadoop2.6/python/pyspark/sql/types.py", line 22, in <module>
    import calendar
  File "/home/user/anaconda3/lib/python3.5/calendar.py", line 10, in <module>
    import locale as _locale
  File "/home/user/anaconda3/lib/python3.5/locale.py", line 19, in <module>
    import functools
  File "/home/user/anaconda3/lib/python3.5/functools.py", line 22, in <module>
    from types import MappingProxyType
ImportError: cannot import name 'MappingProxyType'

Am I missing something obvious here?

Edit:

A Dockerfile that can be used to reproduce the problem:

FROM debian:latest

RUN apt-get update
RUN apt-get install -y wget bzip2
RUN wget https://repo.continuum.io/archive/Anaconda3-4.2.0-Linux-x86_64.sh
RUN bash Anaconda3-4.2.0-Linux-x86_64.sh -b -p /anaconda3
RUN wget ftp://ftp.piotrkosoft.net/pub/mirrors/ftp.apache.org/spark/spark-2.1.0/spark-2.1.0-bin-hadoop2.7.tgz
RUN tar xf spark-2.1.0-bin-hadoop2.7.tgz
ENV PATH /anaconda3/bin:$PATH
ENV SPARK_HOME /spark-2.1.0-bin-hadoop2.7
ENV PYTHONPATH $PYTHONPATH:$SPARK_HOME/python/lib/py4j-0.10.4-src.zip:$SPARK_HOME/python
WORKDIR /spark-2.1.0-bin-hadoop2.7
RUN python python/pyspark/sql/tests.py

Best answer

I suspect this happens because the Python 3 functools module contains the import from types import MappingProxyType and, instead of resolving that module from ${CONDA_PREFIX}/lib/python3.5/types.py, it tries to import it from the sql directory: ${SPARK_HOME}/python/pyspark/sql/types.py. The Python 2 functools module does not have this import and therefore does not raise the error.
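The mechanism behind this is that running a file with python path/to/script.py prepends the script's own directory to sys.path, so a sibling module named types.py wins over the standard library. A minimal sketch of that behaviour with a hypothetical layout (an empty types.py placed next to the demo script):

# shadow_demo.py -- hypothetical demo of stdlib shadowing.
# Create an empty file named types.py in the same directory and run:
#     python shadow_demo.py
# (Assumes types/functools have not already been imported at interpreter
#  startup, which holds for the Python 3.5 interpreter in the tracebacks above.)
import sys
print(sys.path[0])          # the script's directory, searched before the stdlib

import types
print(types.__file__)       # resolves to the local ./types.py, not the stdlib

try:
    import functools        # functools needs types.MappingProxyType
except ImportError as exc:
    print(exc)              # cannot import name 'MappingProxyType'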

A way around this is to somehow import the required types module first, before invoking the script. As a proof of concept:

(root) ~/condaexpts$ PYTHONPATH=$SPARK_HOME/python/lib/py4j-0.10.4-src.zip:$SPARK_HOME/python python
Python 3.5.2 |Anaconda 4.2.0 (64-bit)| (default, Jul  2 2016, 17:53:06) 
[GCC 4.4.7 20120313 (Red Hat 4.4.7-1)] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import types
>>> import os
>>> sqltests=os.environ['SPARK_HOME'] + '/python/pyspark/sql/tests.py'
>>> exec(open(sqltests).read())
.....Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
17/01/30 05:59:43 WARN SparkContext: Support for Java 7 is deprecated as of Spark 2.0.0
17/01/30 05:59:44 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

...

----------------------------------------------------------------------
Ran 128 tests in 372.565s

Also note that there is nothing conda-specific here. The same thing can be seen in a plain virtualenv (with python3):

~/condaexpts$ virtualenv -p python3 venv
Running virtualenv with interpreter /usr/bin/python3
Using base prefix '/usr'
New python executable in venv/bin/python3
Also creating executable in venv/bin/python
Installing setuptools, pip...done.

~/condaexpts$ source venv/bin/activate

(venv)~/condaexpts$ python --version
Python 3.4.3

(venv)~/condaexpts$ python $WORKDIR/python/pyspark/sql/tests.py                                                                                                                                      
Traceback (most recent call last):
  File "/home/ubuntu/condaexpts/spark-2.1.0-bin-hadoop2.7/python/pyspark/sql/tests.py", line 26, in <module>
    import pydoc
  File "/usr/lib/python3.4/pydoc.py", line 59, in <module>
    import importlib._bootstrap
  File "/home/ubuntu/condaexpts/venv/lib/python3.4/importlib/__init__.py", line 40, in <module>
    import types
  File "/home/ubuntu/condaexpts/spark-2.1.0-bin-hadoop2.7/python/pyspark/sql/types.py", line 22, in <module>
    import calendar
  File "/usr/lib/python3.4/calendar.py", line 10, in <module>
    import locale as _locale
  File "/home/ubuntu/condaexpts/venv/lib/python3.4/locale.py", line 20, in <module>
    import functools
  File "/home/ubuntu/condaexpts/venv/lib/python3.4/functools.py", line 22, in <module>
    from types import MappingProxyType
ImportError: cannot import name 'MappingProxyType'
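
For completeness, the same workaround as in the proof of concept above can be put into a small launcher file (hypothetical name run_sql_tests.py, assuming SPARK_HOME and PYTHONPATH are set as in the Dockerfile), so that the stdlib types module is bound before the shadowing directory is ever consulted:

# run_sql_tests.py -- hypothetical launcher applying the workaround above:
# import the stdlib types module first, then execute the test file, so that
# functools finds types.MappingProxyType already loaded in sys.modules.
import os
import types  # resolves to the stdlib here, since sys.path[0] is this file's directory

tests = os.path.join(os.environ['SPARK_HOME'], 'python', 'pyspark', 'sql', 'tests.py')
exec(compile(open(tests).read(), tests, 'exec'))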

Original question on Stack Overflow: https://stackoverflow.com/questions/41926351/
