Jetson Orin Nano v6.0 + TensorFlow 2.15.0+nv24.05 GPU Installation

  • 1. Background
  • 2. Steps
    • 2.1 Step 1: System installation
    • 2.2 Step 2: nvidia-jetpack installation
    • 2.3 Step 3: jtop installation
    • 2.4 Step 4: h5py installation
    • 2.5 Step 5: tensorflow installation
    • 2.6 Step 6: jupyterlab installation
  • 3. Testing
  • 4. References
  • 5. Appendix
    • 5.1 Installing tensorflow==2.15.0+nv24.05 directly - "Failed to build h5py"
    • 5.2 Installing h5py directly - "Failed to build h5py"

1. Background

  1. Jetson Orin Nano on Linux 36.2 (JetPack 6.0DP) has a bug in its TensorFlow support that breaks certain scenarios, so the 6.0DP (Developer Preview) release is not recommended; see: Jammy@Jetson Orin Nano - TensorFlow GPU installation.
  2. NVIDIA's support for third-party libraries such as TensorFlow is limited, possibly because of internal business priorities and under-resourced engineering, and the released builds still come with a number of installation problems.

That said, NVIDIA does allocate some resources to shipping and validating these open-source stacks, which reflects how much interest this area attracts.

This article therefore collects the steps needed to install the TensorFlow 2.15.0+nv24.05 GPU build on a Jetson Orin Nano running JetPack 6.0.

2. Steps

2.1 Step 1: System installation

For details, see:

  • Linux 36.3@Jetson Orin Nano system installation
  • Linux 36.2@Jetson Orin Nano basic environment setup

2.2 Step 2: nvidia-jetpack installation

Note: nvidia-jetpack is not installed by default.

$ sudo apt update
$ sudo apt install nvidia-jetpack
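
After the install, a quick sanity check (a minimal sketch; exact package names and version strings depend on the flashed image) is to ask the package manager what it actually put on the system:

$ apt show nvidia-jetpack           # should report a 6.x JetPack meta-package
$ dpkg -l | grep nvidia-l4t-core    # confirms the underlying L4T packages are present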

2.3 Step 3: jtop installation

jtop is used to check how the nvidia-jetpack components were installed.

$ sudo apt update
$ sudo apt install python3-pip
$ sudo pip3 install -U jetson-stats
$ sudo systemctl restart jtop.service
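
Once jetson-stats is installed, the JetPack components can be reviewed either interactively or from the command line (a small sketch; a re-login or reboot may be needed before jtop can reach its service):

$ jtop              # interactive dashboard; the INFO page lists CUDA, cuDNN and TensorRT versions
$ jetson_release    # one-shot summary printed by jetson-stats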

2.4 Step 4: h5py installation

Note: this step is essential. Skipping it leads to the "Failed to build h5py" errors described in the appendix. The workaround has been known since the Jetson Nano, yet the problem still shows up on the Jetson Orin Nano: Failed to build wheel for h5py , in JETSON NANO.

$ sudo apt-get install python3-pip
$ sudo apt-get install libhdf5-serial-dev hdf5-tools libhdf5-dev 
$ sudo pip3 install cython
$ sudo pip3 install h5py
Collecting h5py
  Using cached h5py-3.11.0.tar.gz (406 kB)
  Installing build dependencies ... done
  Getting requirements to build wheel ... done
  Installing backend dependencies ... done
  Preparing metadata (pyproject.toml) ... done
Requirement already satisfied: numpy>=1.17.3 in /usr/lib/python3/dist-packages (from h5py) (1.21.5)
Building wheels for collected packages: h5py
  Building wheel for h5py (pyproject.toml) ... done
  Created wheel for h5py: filename=h5py-3.11.0-cp310-cp310-linux_aarch64.whl size=6906150 sha256=e91885c8ae20d8207e79bd0aee4f794338ba8df1bd4634a8d41926c2f230697e
  Stored in directory: /root/.cache/pip/wheels/54/6c/66/4f9de317fb7a5505a348881fc3666b289fde493612707458a3
Successfully built h5py
Installing collected packages: h5py
Successfully installed h5py-3.11.0
WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv
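
With the wheel built against the system HDF5, a quick import test (a minimal sketch) confirms h5py is usable before moving on to TensorFlow:

$ python3 -c "import h5py; print(h5py.__version__, h5py.version.hdf5_version)"
# expected: 3.11.0 followed by the HDF5 version pulled in by libhdf5-dev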

2.5 Step 5: tensorflow installation

The install prints quite a few messages; the important one is the first:

  1. tensorflow-2.15.0+nv24.5 was installed successfully

Successfully installed MarkupSafe-2.1.5 absl-py-2.1.0 astunparse-1.6.3 cachetools-5.3.3 flatbuffers-24.3.25 gast-0.5.4 google-auth-2.29.0 google-auth-oauthlib-1.2.0 google-pasta-0.2.0 grpcio-1.64.0 keras-2.15.0 libclang-18.1.1 ml-dtypes-0.2.0 numpy-1.26.4 opt-einsum-3.3.0 protobuf-4.25.3 pyasn1-0.6.0 pyasn1-modules-0.4.0 requests-oauthlib-2.0.0 rsa-4.9 tensorboard-2.15.2 tensorboard-data-server-0.7.2 tensorflow-2.15.0+nv24.5 tensorflow-estimator-2.15.0 tensorflow-io-gcs-filesystem-0.37.0 termcolor-2.4.0 werkzeug-3.0.3 wrapt-1.14.1

  2. pip's dependency resolver may report a conflict

ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
onnx-graphsurgeon 0.3.12 requires onnx, which is not installed.

  3. a friendly reminder about running pip as root (via sudo)

WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv

$ sudo pip3 install --extra-index-url https://developer.download.nvidia.com/compute/redist/jp/v60 tensorflow==2.15.0+nv24.05
Looking in indexes: https://pypi.org/simple, https://developer.download.nvidia.com/compute/redist/jp/v60
Collecting tensorflow==2.15.0+nv24.05
  Using cached https://developer.download.nvidia.cn/compute/redist/jp/v60/tensorflow/tensorflow-2.15.0%2Bnv24.05-cp310-cp310-linux_aarch64.whl (465.5 MB)
Collecting absl-py>=1.0.0 (from tensorflow==2.15.0+nv24.05)
  Using cached absl_py-2.1.0-py3-none-any.whl.metadata (2.3 kB)
Collecting astunparse>=1.6.0 (from tensorflow==2.15.0+nv24.05)
  Using cached astunparse-1.6.3-py2.py3-none-any.whl.metadata (4.4 kB)
Collecting flatbuffers>=23.5.26 (from tensorflow==2.15.0+nv24.05)
  Using cached flatbuffers-24.3.25-py2.py3-none-any.whl.metadata (850 bytes)
Collecting gast!=0.5.0,!=0.5.1,!=0.5.2,>=0.2.1 (from tensorflow==2.15.0+nv24.05)
  Using cached gast-0.5.4-py3-none-any.whl.metadata (1.3 kB)
Collecting google-pasta>=0.1.1 (from tensorflow==2.15.0+nv24.05)
  Using cached google_pasta-0.2.0-py3-none-any.whl.metadata (814 bytes)
Requirement already satisfied: h5py>=2.9.0 in /usr/local/lib/python3.10/dist-packages (from tensorflow==2.15.0+nv24.05) (3.11.0)
Collecting libclang>=13.0.0 (from tensorflow==2.15.0+nv24.05)
  Using cached libclang-18.1.1-py2.py3-none-manylinux2014_aarch64.whl.metadata (5.2 kB)
Collecting ml-dtypes~=0.2.0 (from tensorflow==2.15.0+nv24.05)
  Using cached ml_dtypes-0.2.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl.metadata (20 kB)
Collecting numpy<2.0.0,>=1.23.5 (from tensorflow==2.15.0+nv24.05)
  Using cached numpy-1.26.4-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl.metadata (62 kB)
Collecting opt-einsum>=2.3.2 (from tensorflow==2.15.0+nv24.05)
  Using cached opt_einsum-3.3.0-py3-none-any.whl.metadata (6.5 kB)
Requirement already satisfied: packaging in /usr/local/lib/python3.10/dist-packages (from tensorflow==2.15.0+nv24.05) (24.0)
Collecting protobuf!=4.21.0,!=4.21.1,!=4.21.2,!=4.21.3,!=4.21.4,!=4.21.5,<5.0.0dev,>=3.20.3 (from tensorflow==2.15.0+nv24.05)
  Using cached protobuf-4.25.3-cp37-abi3-manylinux2014_aarch64.whl.metadata (541 bytes)
Requirement already satisfied: setuptools in /usr/local/lib/python3.10/dist-packages (from tensorflow==2.15.0+nv24.05) (70.0.0)
Requirement already satisfied: six>=1.12.0 in /usr/lib/python3/dist-packages (from tensorflow==2.15.0+nv24.05) (1.16.0)
Collecting termcolor>=1.1.0 (from tensorflow==2.15.0+nv24.05)
  Using cached termcolor-2.4.0-py3-none-any.whl.metadata (6.1 kB)
Requirement already satisfied: typing-extensions>=3.6.6 in /usr/local/lib/python3.10/dist-packages (from tensorflow==2.15.0+nv24.05) (4.12.0)
Collecting wrapt<1.15,>=1.11.0 (from tensorflow==2.15.0+nv24.05)
  Using cached wrapt-1.14.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl.metadata (6.7 kB)
Collecting tensorflow-io-gcs-filesystem>=0.23.1 (from tensorflow==2.15.0+nv24.05)
  Using cached tensorflow_io_gcs_filesystem-0.37.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl.metadata (14 kB)
Collecting grpcio<2.0,>=1.24.3 (from tensorflow==2.15.0+nv24.05)
  Using cached grpcio-1.64.0-cp310-cp310-manylinux_2_17_aarch64.whl.metadata (3.3 kB)
Collecting tensorboard<2.16,>=2.15 (from tensorflow==2.15.0+nv24.05)
  Using cached tensorboard-2.15.2-py3-none-any.whl.metadata (1.7 kB)
Collecting tensorflow-estimator<2.16,>=2.15.0 (from tensorflow==2.15.0+nv24.05)
  Using cached tensorflow_estimator-2.15.0-py2.py3-none-any.whl.metadata (1.3 kB)
Collecting keras<2.16,>=2.15.0 (from tensorflow==2.15.0+nv24.05)
  Using cached keras-2.15.0-py3-none-any.whl.metadata (2.4 kB)
Requirement already satisfied: wheel<1.0,>=0.23.0 in /usr/local/lib/python3.10/dist-packages (from astunparse>=1.6.0->tensorflow==2.15.0+nv24.05) (0.43.0)
Collecting google-auth<3,>=1.6.3 (from tensorboard<2.16,>=2.15->tensorflow==2.15.0+nv24.05)
  Using cached google_auth-2.29.0-py2.py3-none-any.whl.metadata (4.7 kB)
Collecting google-auth-oauthlib<2,>=0.5 (from tensorboard<2.16,>=2.15->tensorflow==2.15.0+nv24.05)
  Using cached google_auth_oauthlib-1.2.0-py2.py3-none-any.whl.metadata (2.7 kB)
Requirement already satisfied: markdown>=2.6.8 in /usr/lib/python3/dist-packages (from tensorboard<2.16,>=2.15->tensorflow==2.15.0+nv24.05) (3.3.6)
Requirement already satisfied: requests<3,>=2.21.0 in /usr/local/lib/python3.10/dist-packages (from tensorboard<2.16,>=2.15->tensorflow==2.15.0+nv24.05) (2.32.2)
Collecting tensorboard-data-server<0.8.0,>=0.7.0 (from tensorboard<2.16,>=2.15->tensorflow==2.15.0+nv24.05)
  Using cached tensorboard_data_server-0.7.2-py3-none-any.whl.metadata (1.1 kB)
Collecting werkzeug>=1.0.1 (from tensorboard<2.16,>=2.15->tensorflow==2.15.0+nv24.05)
  Using cached werkzeug-3.0.3-py3-none-any.whl.metadata (3.7 kB)
Collecting cachetools<6.0,>=2.0.0 (from google-auth<3,>=1.6.3->tensorboard<2.16,>=2.15->tensorflow==2.15.0+nv24.05)
  Using cached cachetools-5.3.3-py3-none-any.whl.metadata (5.3 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.6.3->tensorboard<2.16,>=2.15->tensorflow==2.15.0+nv24.05)
  Using cached pyasn1_modules-0.4.0-py3-none-any.whl.metadata (3.4 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.6.3->tensorboard<2.16,>=2.15->tensorflow==2.15.0+nv24.05)
  Using cached rsa-4.9-py3-none-any.whl.metadata (4.2 kB)
Collecting requests-oauthlib>=0.7.0 (from google-auth-oauthlib<2,>=0.5->tensorboard<2.16,>=2.15->tensorflow==2.15.0+nv24.05)
  Using cached requests_oauthlib-2.0.0-py2.py3-none-any.whl.metadata (11 kB)
Requirement already satisfied: charset-normalizer<4,>=2 in /usr/local/lib/python3.10/dist-packages (from requests<3,>=2.21.0->tensorboard<2.16,>=2.15->tensorflow==2.15.0+nv24.05) (3.3.2)
Requirement already satisfied: idna<4,>=2.5 in /usr/lib/python3/dist-packages (from requests<3,>=2.21.0->tensorboard<2.16,>=2.15->tensorflow==2.15.0+nv24.05) (3.3)
Requirement already satisfied: urllib3<3,>=1.21.1 in /usr/lib/python3/dist-packages (from requests<3,>=2.21.0->tensorboard<2.16,>=2.15->tensorflow==2.15.0+nv24.05) (1.26.5)
Requirement already satisfied: certifi>=2017.4.17 in /usr/lib/python3/dist-packages (from requests<3,>=2.21.0->tensorboard<2.16,>=2.15->tensorflow==2.15.0+nv24.05) (2020.6.20)
Collecting MarkupSafe>=2.1.1 (from werkzeug>=1.0.1->tensorboard<2.16,>=2.15->tensorflow==2.15.0+nv24.05)
  Using cached MarkupSafe-2.1.5-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl.metadata (3.0 kB)
Collecting pyasn1<0.7.0,>=0.4.6 (from pyasn1-modules>=0.2.1->google-auth<3,>=1.6.3->tensorboard<2.16,>=2.15->tensorflow==2.15.0+nv24.05)
  Using cached pyasn1-0.6.0-py2.py3-none-any.whl.metadata (8.3 kB)
Requirement already satisfied: oauthlib>=3.0.0 in /usr/lib/python3/dist-packages (from requests-oauthlib>=0.7.0->google-auth-oauthlib<2,>=0.5->tensorboard<2.16,>=2.15->tensorflow==2.15.0+nv24.05) (3.2.0)
Using cached absl_py-2.1.0-py3-none-any.whl (133 kB)
Using cached astunparse-1.6.3-py2.py3-none-any.whl (12 kB)
Using cached flatbuffers-24.3.25-py2.py3-none-any.whl (26 kB)
Using cached gast-0.5.4-py3-none-any.whl (19 kB)
Using cached google_pasta-0.2.0-py3-none-any.whl (57 kB)
Using cached grpcio-1.64.0-cp310-cp310-manylinux_2_17_aarch64.whl (5.4 MB)
Using cached keras-2.15.0-py3-none-any.whl (1.7 MB)
Using cached libclang-18.1.1-py2.py3-none-manylinux2014_aarch64.whl (23.8 MB)
Using cached ml_dtypes-0.2.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (1.0 MB)
Using cached numpy-1.26.4-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (14.2 MB)
Using cached opt_einsum-3.3.0-py3-none-any.whl (65 kB)
Using cached protobuf-4.25.3-cp37-abi3-manylinux2014_aarch64.whl (293 kB)
Using cached tensorboard-2.15.2-py3-none-any.whl (5.5 MB)
Using cached tensorflow_estimator-2.15.0-py2.py3-none-any.whl (441 kB)
Using cached tensorflow_io_gcs_filesystem-0.37.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (4.8 MB)
Using cached termcolor-2.4.0-py3-none-any.whl (7.7 kB)
Using cached wrapt-1.14.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (78 kB)
Using cached google_auth-2.29.0-py2.py3-none-any.whl (189 kB)
Using cached google_auth_oauthlib-1.2.0-py2.py3-none-any.whl (24 kB)
Using cached tensorboard_data_server-0.7.2-py3-none-any.whl (2.4 kB)
Using cached werkzeug-3.0.3-py3-none-any.whl (227 kB)
Using cached cachetools-5.3.3-py3-none-any.whl (9.3 kB)
Using cached MarkupSafe-2.1.5-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (26 kB)
Using cached pyasn1_modules-0.4.0-py3-none-any.whl (181 kB)
Using cached requests_oauthlib-2.0.0-py2.py3-none-any.whl (24 kB)
Using cached rsa-4.9-py3-none-any.whl (34 kB)
Using cached pyasn1-0.6.0-py2.py3-none-any.whl (85 kB)
Installing collected packages: libclang, flatbuffers, wrapt, termcolor, tensorflow-io-gcs-filesystem, tensorflow-estimator, tensorboard-data-server, pyasn1, protobuf, numpy, MarkupSafe, keras, grpcio, google-pasta, gast, cachetools, astunparse, absl-py, werkzeug, rsa, requests-oauthlib, pyasn1-modules, opt-einsum, ml-dtypes, google-auth, google-auth-oauthlib, tensorboard, tensorflow
  Attempting uninstall: protobuf
    Found existing installation: protobuf 3.12.4
    Uninstalling protobuf-3.12.4:
      Successfully uninstalled protobuf-3.12.4
  Attempting uninstall: numpy
    Found existing installation: numpy 1.21.5
    Uninstalling numpy-1.21.5:
      Successfully uninstalled numpy-1.21.5
  Attempting uninstall: MarkupSafe
    Found existing installation: MarkupSafe 2.0.1
    Uninstalling MarkupSafe-2.0.1:
      Successfully uninstalled MarkupSafe-2.0.1
  Attempting uninstall: gast
    Found existing installation: gast 0.5.2
    Uninstalling gast-0.5.2:
      Successfully uninstalled gast-0.5.2
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
onnx-graphsurgeon 0.3.12 requires onnx, which is not installed.
Successfully installed MarkupSafe-2.1.5 absl-py-2.1.0 astunparse-1.6.3 cachetools-5.3.3 flatbuffers-24.3.25 gast-0.5.4 google-auth-2.29.0 google-auth-oauthlib-1.2.0 google-pasta-0.2.0 grpcio-1.64.0 keras-2.15.0 libclang-18.1.1 ml-dtypes-0.2.0 numpy-1.26.4 opt-einsum-3.3.0 protobuf-4.25.3 pyasn1-0.6.0 pyasn1-modules-0.4.0 requests-oauthlib-2.0.0 rsa-4.9 tensorboard-2.15.2 tensorboard-data-server-0.7.2 tensorflow-2.15.0+nv24.5 tensorflow-estimator-2.15.0 tensorflow-io-gcs-filesystem-0.37.0 termcolor-2.4.0 werkzeug-3.0.3 wrapt-1.14.1
WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv
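
Two optional follow-ups, offered as hedged suggestions rather than required steps: confirm which TensorFlow wheel pip ended up with, and, only if the onnx-graphsurgeon conflict reported above matters for your workflow, install the missing onnx package:

$ pip3 show tensorflow | grep -i version   # should report 2.15.0+nv24.5, i.e. the NVIDIA build
$ sudo pip3 install onnx                   # optional; silences the onnx-graphsurgeon dependency conflict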

2.6 Step 6: jupyterlab installation

$ sudo pip3 install jupyterlab
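
To reach JupyterLab from another machine on the LAN (a common headless-Jetson setup), it can be started bound to all interfaces; the port below is just one possible choice:

$ jupyter lab --ip=0.0.0.0 --port=8888 --no-browser
# then open http://<jetson-ip>:8888 from a browser on the development PC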

3. Testing

The v6.0DP release exhibits the "Inconsistency of NVIDIA 2.15.0+nv24.03 v.s. Colab v.s. Tensorflow Documentation" issue.

The Jetson Orin Nano v6.0 + TensorFlow 2.15.0+nv24.05 GPU build does not show this problem; this was verified on the device (the original screenshots are not reproduced here).

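A minimal re-check of GPU support (a sketch, not the original post's test script) is to ask TensorFlow for its visible devices and run a small op:

$ python3 -c "import tensorflow as tf; print(tf.__version__); print(tf.config.list_physical_devices('GPU'))"
$ python3 -c "import tensorflow as tf; print(tf.reduce_sum(tf.random.normal([1000, 1000])))"
# the first command should list one GPU device; the second should execute without CUDA/cuDNN errors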

4. References

【1】Linux 36.2@Jetson Orin Nano: Hello AI World!
【2】ubuntu22.04@Jetson Orin Nano: OpenCV installation
【3】ubuntu22.04@Jetson Orin Nano: CSI IMX219 installation
【4】ubuntu22.04@Jetson Orin Nano: installing & configuring a VNC server

5. Appendix

5.1 Installing tensorflow==2.15.0+nv24.05 directly - "Failed to build h5py"

$ sudo pip3 install --extra-index-url https://developer.download.nvidia.com/compute/redist/jp/v60 tensorflow==2.15.0+nv24.05
[sudo] password for daniel:
Looking in indexes: https://pypi.org/simple, https://developer.download.nvidia.com/compute/redist/jp/v60
Collecting tensorflow==2.15.0+nv24.05
  Downloading https://developer.download.nvidia.cn/compute/redist/jp/v60/tensorflow/tensorflow-2.15.0%2Bnv24.05-cp310-cp310-linux_aarch64.whl (465.5 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 465.5/465.5 MB 2.0 MB/s eta 0:00:00
Collecting wrapt<1.15,>=1.11.0
  Using cached wrapt-1.14.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (78 kB)
Collecting absl-py>=1.0.0
  Using cached absl_py-2.1.0-py3-none-any.whl (133 kB)
Requirement already satisfied: packaging in /usr/local/lib/python3.10/dist-packages (from tensorflow==2.15.0+nv24.05) (24.0)
Requirement already satisfied: setuptools in /usr/lib/python3/dist-packages (from tensorflow==2.15.0+nv24.05) (59.6.0)
Collecting protobuf!=4.21.0,!=4.21.1,!=4.21.2,!=4.21.3,!=4.21.4,!=4.21.5,<5.0.0dev,>=3.20.3
  Using cached protobuf-4.25.3-cp37-abi3-manylinux2014_aarch64.whl (293 kB)
Collecting gast!=0.5.0,!=0.5.1,!=0.5.2,>=0.2.1
  Using cached gast-0.5.4-py3-none-any.whl (19 kB)
Collecting grpcio<2.0,>=1.24.3
  Using cached grpcio-1.64.0-cp310-cp310-manylinux_2_17_aarch64.whl (5.4 MB)
Collecting google-pasta>=0.1.1
  Using cached google_pasta-0.2.0-py3-none-any.whl (57 kB)
Collecting astunparse>=1.6.0
  Using cached astunparse-1.6.3-py2.py3-none-any.whl (12 kB)
Collecting opt-einsum>=2.3.2
  Using cached opt_einsum-3.3.0-py3-none-any.whl (65 kB)
Collecting keras<2.16,>=2.15.0
  Using cached keras-2.15.0-py3-none-any.whl (1.7 MB)
Collecting tensorflow-estimator<2.16,>=2.15.0
  Using cached tensorflow_estimator-2.15.0-py2.py3-none-any.whl (441 kB)
Collecting numpy<2.0.0,>=1.23.5
  Using cached numpy-1.26.4-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (14.2 MB)
Collecting flatbuffers>=23.5.26
  Using cached flatbuffers-24.3.25-py2.py3-none-any.whl (26 kB)
Requirement already satisfied: six>=1.12.0 in /usr/lib/python3/dist-packages (from tensorflow==2.15.0+nv24.05) (1.16.0)
Collecting libclang>=13.0.0
  Using cached libclang-18.1.1-py2.py3-none-manylinux2014_aarch64.whl (23.8 MB)
Collecting ml-dtypes~=0.2.0
  Using cached ml_dtypes-0.2.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (1.0 MB)
Collecting termcolor>=1.1.0
  Using cached termcolor-2.4.0-py3-none-any.whl (7.7 kB)
Collecting tensorboard<2.16,>=2.15
  Using cached tensorboard-2.15.2-py3-none-any.whl (5.5 MB)
Collecting h5py>=2.9.0
  Using cached h5py-3.11.0.tar.gz (406 kB)
  Installing build dependencies ... done
  Getting requirements to build wheel ... done
  Installing backend dependencies ... done
  Preparing metadata (pyproject.toml) ... done
Collecting tensorflow-io-gcs-filesystem>=0.23.1
  Using cached tensorflow_io_gcs_filesystem-0.37.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (4.8 MB)
Requirement already satisfied: typing-extensions>=3.6.6 in /usr/local/lib/python3.10/dist-packages (from tensorflow==2.15.0+nv24.05) (4.12.0)
Requirement already satisfied: wheel<1.0,>=0.23.0 in /usr/lib/python3/dist-packages (from astunparse>=1.6.0->tensorflow==2.15.0+nv24.05) (0.37.1)
Requirement already satisfied: markdown>=2.6.8 in /usr/lib/python3/dist-packages (from tensorboard<2.16,>=2.15->tensorflow==2.15.0+nv24.05) (3.3.6)
Requirement already satisfied: requests<3,>=2.21.0 in /usr/local/lib/python3.10/dist-packages (from tensorboard<2.16,>=2.15->tensorflow==2.15.0+nv24.05) (2.32.2)
Collecting google-auth-oauthlib<2,>=0.5
  Using cached google_auth_oauthlib-1.2.0-py2.py3-none-any.whl (24 kB)
Collecting werkzeug>=1.0.1
  Using cached werkzeug-3.0.3-py3-none-any.whl (227 kB)
Collecting tensorboard-data-server<0.8.0,>=0.7.0
  Using cached tensorboard_data_server-0.7.2-py3-none-any.whl (2.4 kB)
Collecting google-auth<3,>=1.6.3
  Using cached google_auth-2.29.0-py2.py3-none-any.whl (189 kB)
Collecting rsa<5,>=3.1.4
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting pyasn1-modules>=0.2.1
  Using cached pyasn1_modules-0.4.0-py3-none-any.whl (181 kB)
Collecting cachetools<6.0,>=2.0.0
  Using cached cachetools-5.3.3-py3-none-any.whl (9.3 kB)
Collecting requests-oauthlib>=0.7.0
  Using cached requests_oauthlib-2.0.0-py2.py3-none-any.whl (24 kB)
Requirement already satisfied: charset-normalizer<4,>=2 in /usr/local/lib/python3.10/dist-packages (from requests<3,>=2.21.0->tensorboard<2.16,>=2.15->tensorflow==2.15.0+nv24.05) (3.3.2)
Requirement already satisfied: idna<4,>=2.5 in /usr/lib/python3/dist-packages (from requests<3,>=2.21.0->tensorboard<2.16,>=2.15->tensorflow==2.15.0+nv24.05) (3.3)
Requirement already satisfied: certifi>=2017.4.17 in /usr/lib/python3/dist-packages (from requests<3,>=2.21.0->tensorboard<2.16,>=2.15->tensorflow==2.15.0+nv24.05) (2020.6.20)
Requirement already satisfied: urllib3<3,>=1.21.1 in /usr/lib/python3/dist-packages (from requests<3,>=2.21.0->tensorboard<2.16,>=2.15->tensorflow==2.15.0+nv24.05) (1.26.5)
Collecting MarkupSafe>=2.1.1
  Using cached MarkupSafe-2.1.5-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (26 kB)
Collecting pyasn1<0.7.0,>=0.4.6
  Using cached pyasn1-0.6.0-py2.py3-none-any.whl (85 kB)
Requirement already satisfied: oauthlib>=3.0.0 in /usr/lib/python3/dist-packages (from requests-oauthlib>=0.7.0->google-auth-oauthlib<2,>=0.5->tensorboard<2.16,>=2.15->tensorflow==2.15.0+nv24.05) (3.2.0)
Building wheels for collected packages: h5py
  Building wheel for h5py (pyproject.toml) ... error
  error: subprocess-exited-with-error

  × Building wheel for h5py (pyproject.toml) did not run successfully.
  │ exit code: 1
  ╰─> [7 lines of output]
      running bdist_wheel
      running build
      running build_ext
      Loading library to get build settings and version: libhdf5.so
      error: Unable to load dependency HDF5, make sure HDF5 is installed properly
      Library dirs checked: []
      error: libhdf5.so: cannot open shared object file: No such file or directory
      [end of output]

  note: This error originates from a subprocess, and is likely not a problem with pip.
  ERROR: Failed building wheel for h5py
Failed to build h5py
ERROR: Could not build wheels for h5py, which is required to install pyproject.toml-based projects
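
This failure is exactly why Step 4 installs the HDF5 development packages first. A recovery sketch (the HDF5_DIR path is an assumption for Ubuntu 22.04 on aarch64 and may differ on other images):

$ sudo apt-get install libhdf5-serial-dev hdf5-tools libhdf5-dev
$ sudo pip3 install cython h5py
# if the build still cannot find libhdf5.so, point h5py at the serial HDF5 install explicitly:
$ export HDF5_DIR=/usr/lib/aarch64-linux-gnu/hdf5/serial
$ sudo -E pip3 install h5py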

5.2 Installing h5py directly - "Failed to build h5py"

(Same root cause as 5.1: the HDF5 development libraries are missing; Step 4 installs them up front.)

$ sudo pip3 install h5py
Collecting h5py
  Using cached h5py-3.11.0.tar.gz (406 kB)
  Installing build dependencies ... done
  Getting requirements to build wheel ... done
  Installing backend dependencies ... done
  Preparing metadata (pyproject.toml) ... done
Requirement already satisfied: numpy>=1.17.3 in /usr/lib/python3/dist-packages (from h5py) (1.21.5)
Building wheels for collected packages: h5py
  Building wheel for h5py (pyproject.toml) ... error
  error: subprocess-exited-with-error

  × Building wheel for h5py (pyproject.toml) did not run successfully.
  │ exit code: 1
  ╰─> [75 lines of output]
      running bdist_wheel
      running build
      running build_py
      creating build
      creating build/lib.linux-aarch64-cpython-310
      creating build/lib.linux-aarch64-cpython-310/h5py
      copying h5py/ipy_completer.py -> build/lib.linux-aarch64-cpython-310/h5py
      copying h5py/__init__.py -> build/lib.linux-aarch64-cpython-310/h5py
      copying h5py/h5py_warnings.py -> build/lib.linux-aarch64-cpython-310/h5py
      copying h5py/version.py -> build/lib.linux-aarch64-cpython-310/h5py
      creating build/lib.linux-aarch64-cpython-310/h5py/_hl
      copying h5py/_hl/files.py -> build/lib.linux-aarch64-cpython-310/h5py/_hl
      copying h5py/_hl/group.py -> build/lib.linux-aarch64-cpython-310/h5py/_hl
      copying h5py/_hl/selections.py -> build/lib.linux-aarch64-cpython-310/h5py/_hl
      copying h5py/_hl/compat.py -> build/lib.linux-aarch64-cpython-310/h5py/_hl
      copying h5py/_hl/datatype.py -> build/lib.linux-aarch64-cpython-310/h5py/_hl
      copying h5py/_hl/__init__.py -> build/lib.linux-aarch64-cpython-310/h5py/_hl
      copying h5py/_hl/filters.py -> build/lib.linux-aarch64-cpython-310/h5py/_hl
      copying h5py/_hl/attrs.py -> build/lib.linux-aarch64-cpython-310/h5py/_hl
      copying h5py/_hl/dataset.py -> build/lib.linux-aarch64-cpython-310/h5py/_hl
      copying h5py/_hl/vds.py -> build/lib.linux-aarch64-cpython-310/h5py/_hl
      copying h5py/_hl/dims.py -> build/lib.linux-aarch64-cpython-310/h5py/_hl
      copying h5py/_hl/base.py -> build/lib.linux-aarch64-cpython-310/h5py/_hl
      copying h5py/_hl/selections2.py -> build/lib.linux-aarch64-cpython-310/h5py/_hl
      creating build/lib.linux-aarch64-cpython-310/h5py/tests
      copying h5py/tests/test_h5d_direct_chunk.py -> build/lib.linux-aarch64-cpython-310/h5py/tests
      copying h5py/tests/test_objects.py -> build/lib.linux-aarch64-cpython-310/h5py/tests
      copying h5py/tests/test_group.py -> build/lib.linux-aarch64-cpython-310/h5py/tests
      copying h5py/tests/test_file2.py -> build/lib.linux-aarch64-cpython-310/h5py/tests
      copying h5py/tests/test_h5o.py -> build/lib.linux-aarch64-cpython-310/h5py/tests
      copying h5py/tests/conftest.py -> build/lib.linux-aarch64-cpython-310/h5py/tests
      copying h5py/tests/test_dimension_scales.py -> build/lib.linux-aarch64-cpython-310/h5py/tests
      copying h5py/tests/test_dataset_swmr.py -> build/lib.linux-aarch64-cpython-310/h5py/tests
      copying h5py/tests/test_attrs_data.py -> build/lib.linux-aarch64-cpython-310/h5py/tests
      copying h5py/tests/test_selections.py -> build/lib.linux-aarch64-cpython-310/h5py/tests
      copying h5py/tests/test_file.py -> build/lib.linux-aarch64-cpython-310/h5py/tests
      copying h5py/tests/test_h5.py -> build/lib.linux-aarch64-cpython-310/h5py/tests
      copying h5py/tests/test_h5z.py -> build/lib.linux-aarch64-cpython-310/h5py/tests
      copying h5py/tests/test_completions.py -> build/lib.linux-aarch64-cpython-310/h5py/tests
      copying h5py/tests/test_dtype.py -> build/lib.linux-aarch64-cpython-310/h5py/tests
      copying h5py/tests/test_dataset_getitem.py -> build/lib.linux-aarch64-cpython-310/h5py/tests
      copying h5py/tests/test_base.py -> build/lib.linux-aarch64-cpython-310/h5py/tests
      copying h5py/tests/__init__.py -> build/lib.linux-aarch64-cpython-310/h5py/tests
      copying h5py/tests/test_filters.py -> build/lib.linux-aarch64-cpython-310/h5py/tests
      copying h5py/tests/test_attrs.py -> build/lib.linux-aarch64-cpython-310/h5py/tests
      copying h5py/tests/test_h5pl.py -> build/lib.linux-aarch64-cpython-310/h5py/tests
      copying h5py/tests/common.py -> build/lib.linux-aarch64-cpython-310/h5py/tests
      copying h5py/tests/test_h5t.py -> build/lib.linux-aarch64-cpython-310/h5py/tests
      copying h5py/tests/test_dataset.py -> build/lib.linux-aarch64-cpython-310/h5py/tests
      copying h5py/tests/test_h5f.py -> build/lib.linux-aarch64-cpython-310/h5py/tests
      copying h5py/tests/test_file_image.py -> build/lib.linux-aarch64-cpython-310/h5py/tests
      copying h5py/tests/test_ros3.py -> build/lib.linux-aarch64-cpython-310/h5py/tests
      copying h5py/tests/test_errors.py -> build/lib.linux-aarch64-cpython-310/h5py/tests
      copying h5py/tests/test_datatype.py -> build/lib.linux-aarch64-cpython-310/h5py/tests
      copying h5py/tests/test_h5p.py -> build/lib.linux-aarch64-cpython-310/h5py/tests
      copying h5py/tests/test_attribute_create.py -> build/lib.linux-aarch64-cpython-310/h5py/tests
      copying h5py/tests/test_slicing.py -> build/lib.linux-aarch64-cpython-310/h5py/tests
      copying h5py/tests/test_big_endian_file.py -> build/lib.linux-aarch64-cpython-310/h5py/tests
      copying h5py/tests/test_file_alignment.py -> build/lib.linux-aarch64-cpython-310/h5py/tests
      copying h5py/tests/test_dims_dimensionproxy.py -> build/lib.linux-aarch64-cpython-310/h5py/tests
      creating build/lib.linux-aarch64-cpython-310/h5py/tests/data_files
      copying h5py/tests/data_files/__init__.py -> build/lib.linux-aarch64-cpython-310/h5py/tests/data_files
      creating build/lib.linux-aarch64-cpython-310/h5py/tests/test_vds
      copying h5py/tests/test_vds/test_virtual_source.py -> build/lib.linux-aarch64-cpython-310/h5py/tests/test_vds
      copying h5py/tests/test_vds/test_lowlevel_vds.py -> build/lib.linux-aarch64-cpython-310/h5py/tests/test_vds
      copying h5py/tests/test_vds/__init__.py -> build/lib.linux-aarch64-cpython-310/h5py/tests/test_vds
      copying h5py/tests/test_vds/test_highlevel_vds.py -> build/lib.linux-aarch64-cpython-310/h5py/tests/test_vds
      copying h5py/tests/data_files/vlen_string_dset_utc.h5 -> build/lib.linux-aarch64-cpython-310/h5py/tests/data_files
      copying h5py/tests/data_files/vlen_string_s390x.h5 -> build/lib.linux-aarch64-cpython-310/h5py/tests/data_files
      copying h5py/tests/data_files/vlen_string_dset.h5 -> build/lib.linux-aarch64-cpython-310/h5py/tests/data_files
      running build_ext
      Loading library to get build settings and version: libhdf5.so
      error: Unable to load dependency HDF5, make sure HDF5 is installed properly
      Library dirs checked: []
      error: libhdf5.so: cannot open shared object file: No such file or directory
      [end of output]

  note: This error originates from a subprocess, and is likely not a problem with pip.
  ERROR: Failed building wheel for h5py
Failed to build h5py
ERROR: Could not build wheels for h5py, which is required to install pyproject.toml-based projects
