Anaconda environment command examples

Launch the Anaconda Powershell Prompt command line.

List the environments

(base) PS C:\Users\Administrator> conda env list
# conda environments:
#
base                  *  G:\ProgramData\anaconda3
MoneyprinterTurbo        G:\ProgramData\anaconda3\envs\MoneyprinterTurbo
pytorch-gpu              G:\ProgramData\anaconda3\envs\pytorch-gpu
transformer_gpt          G:\ProgramData\anaconda3\envs\transformer_gpt
videoEnv                 G:\ProgramData\anaconda3\envs\videoEnv

Create an environment

(base) PS C:\Users\Administrator> conda create -n media_crawler python=3.9
Retrieving notices: ...working... done
Channels:
 - defaults
Platform: win-64
Collecting package metadata (repodata.json): done
Solving environment: done

## Package Plan ##

  environment location: G:\ProgramData\anaconda3\envs\media_crawler

  added / updated specs:
    - python=3.9


The following packages will be downloaded:

    package                    |            build
    ---------------------------|-----------------
    pip-23.3.1                 |   py39haa95532_0         2.8 MB
    python-3.9.19              |       h1aa4202_0        19.5 MB
    setuptools-68.2.2          |   py39haa95532_0         933 KB
    wheel-0.41.2               |   py39haa95532_0         126 KB
    ------------------------------------------------------------
                                           Total:        23.4 MB

The following NEW packages will be INSTALLED:

  ca-certificates    pkgs/main/win-64::ca-certificates-2024.3.11-haa95532_0
  openssl            pkgs/main/win-64::openssl-3.0.13-h2bbff1b_0
  pip                pkgs/main/win-64::pip-23.3.1-py39haa95532_0
  python             pkgs/main/win-64::python-3.9.19-h1aa4202_0
  setuptools         pkgs/main/win-64::setuptools-68.2.2-py39haa95532_0
  sqlite             pkgs/main/win-64::sqlite-3.41.2-h2bbff1b_0
  tzdata             pkgs/main/noarch::tzdata-2024a-h04d1e81_0
  vc                 pkgs/main/win-64::vc-14.2-h21ff451_1
  vs2015_runtime     pkgs/main/win-64::vs2015_runtime-14.27.29016-h5e58377_2
  wheel              pkgs/main/win-64::wheel-0.41.2-py39haa95532_0


Proceed ([y]/n)? y


Downloading and Extracting Packages:

Preparing transaction: done
Verifying transaction: done
Executing transaction: done
#
# To activate this environment, use
#
#     $ conda activate media_crawler
#
# To deactivate an active environment, use
#
#     $ conda deactivate
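
conda pauses at the Proceed ([y]/n)? confirmation shown above. As an optional aside (not part of the original transcript), the same environment can be created non-interactively by adding the -y flag:

(base) PS C:\Users\Administrator> conda create -n media_crawler python=3.9 -y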

Activate the environment

(base) PS C:\Users\Administrator> conda activate media_crawler
(media_crawler) PS C:\Users\Administrator> conda env list
# conda environments:
#
base                     G:\ProgramData\anaconda3
MoneyprinterTurbo        G:\ProgramData\anaconda3\envs\MoneyprinterTurbo
media_crawler         *  G:\ProgramData\anaconda3\envs\media_crawler
pytorch-gpu              G:\ProgramData\anaconda3\envs\pytorch-gpu
transformer_gpt          G:\ProgramData\anaconda3\envs\transformer_gpt
videoEnv                 G:\ProgramData\anaconda3\envs\videoEnv

Install the project dependencies. If an error occurs during installation, simply rerun the command.


(media_crawler) PS C:\Users\Administrator> pip install -r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt
Collecting httpx==0.24.0 (from -r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 1))
  Downloading httpx-0.24.0-py3-none-any.whl.metadata (8.1 kB)
Collecting Pillow==9.5.0 (from -r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 2))
  Downloading Pillow-9.5.0-cp39-cp39-win_amd64.whl.metadata (9.7 kB)
Collecting playwright==1.42.0 (from -r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 3))
  Downloading playwright-1.42.0-py3-none-win_amd64.whl.metadata (3.5 kB)
Collecting tenacity==8.2.2 (from -r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 4))
  Downloading tenacity-8.2.2-py3-none-any.whl.metadata (1.1 kB)
Collecting tornado (from -r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 5))
  Using cached tornado-6.4-cp38-abi3-win_amd64.whl.metadata (2.6 kB)
Collecting PyExecJS==1.5.1 (from -r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 6))
  Downloading PyExecJS-1.5.1.tar.gz (13 kB)
  Preparing metadata (setup.py) ... done
Collecting opencv-python==4.7.0.72 (from -r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 7))
  Downloading opencv_python-4.7.0.72-cp37-abi3-win_amd64.whl.metadata (18 kB)
Collecting tortoise-orm==0.20.0 (from -r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 8))
  Downloading tortoise_orm-0.20.0-py3-none-any.whl.metadata (10 kB)
Collecting aiomysql==0.2.0 (from -r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 9))
  Downloading aiomysql-0.2.0-py3-none-any.whl.metadata (11 kB)
Collecting aerich==0.7.2 (from -r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 10))
  Downloading aerich-0.7.2-py3-none-any.whl.metadata (8.5 kB)
Collecting redis~=4.6.0 (from -r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 11))
  Downloading redis-4.6.0-py3-none-any.whl.metadata (8.3 kB)
Collecting pydantic==2.5.2 (from -r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 12))
  Downloading pydantic-2.5.2-py3-none-any.whl.metadata (65 kB)
     ---------------------------------------- 65.2/65.2 kB 18.0 kB/s eta 0:00:00
Collecting aiofiles~=23.2.1 (from -r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 13))
  Downloading aiofiles-23.2.1-py3-none-any.whl.metadata (9.7 kB)
Collecting certifi (from httpx==0.24.0->-r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 1))
  Using cached certifi-2024.2.2-py3-none-any.whl.metadata (2.2 kB)
Collecting httpcore<0.18.0,>=0.15.0 (from httpx==0.24.0->-r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 1))
  Downloading httpcore-0.17.3-py3-none-any.whl.metadata (18 kB)
Collecting idna (from httpx==0.24.0->-r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 1))
  Using cached idna-3.6-py3-none-any.whl.metadata (9.9 kB)
Collecting sniffio (from httpx==0.24.0->-r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 1))
  Using cached sniffio-1.3.1-py3-none-any.whl.metadata (3.9 kB)
Collecting greenlet==3.0.3 (from playwright==1.42.0->-r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 3))
  Downloading greenlet-3.0.3-cp39-cp39-win_amd64.whl.metadata (3.9 kB)
Collecting pyee==11.0.1 (from playwright==1.42.0->-r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 3))
  Downloading pyee-11.0.1-py3-none-any.whl.metadata (2.7 kB)
Collecting six>=1.10.0 (from PyExecJS==1.5.1->-r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 6))
  Using cached six-1.16.0-py2.py3-none-any.whl.metadata (1.8 kB)
Collecting numpy>=1.17.0 (from opencv-python==4.7.0.72->-r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 7))
  Downloading numpy-1.26.4-cp39-cp39-win_amd64.whl.metadata (61 kB)
     ---------------------------------------- 61.0/61.0 kB 25.6 kB/s eta 0:00:00
Collecting aiosqlite<0.18.0,>=0.16.0 (from tortoise-orm==0.20.0->-r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 8))
  Downloading aiosqlite-0.17.0-py3-none-any.whl.metadata (4.1 kB)
Collecting iso8601<2.0.0,>=1.0.2 (from tortoise-orm==0.20.0->-r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 8))
  Downloading iso8601-1.1.0-py3-none-any.whl.metadata (9.8 kB)
Collecting pypika-tortoise<0.2.0,>=0.1.6 (from tortoise-orm==0.20.0->-r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 8))
  Downloading pypika_tortoise-0.1.6-py3-none-any.whl.metadata (2.2 kB)
Collecting pytz (from tortoise-orm==0.20.0->-r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 8))
  Using cached pytz-2024.1-py2.py3-none-any.whl.metadata (22 kB)
Collecting PyMySQL>=1.0 (from aiomysql==0.2.0->-r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 9))
  Downloading PyMySQL-1.1.0-py3-none-any.whl.metadata (4.4 kB)
Collecting click (from aerich==0.7.2->-r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 10))
  Using cached click-8.1.7-py3-none-any.whl.metadata (3.0 kB)
Collecting dictdiffer (from aerich==0.7.2->-r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 10))
  Downloading dictdiffer-0.9.0-py2.py3-none-any.whl.metadata (4.8 kB)
Collecting tomlkit (from aerich==0.7.2->-r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 10))
  Downloading tomlkit-0.12.4-py3-none-any.whl.metadata (2.7 kB)
Collecting annotated-types>=0.4.0 (from pydantic==2.5.2->-r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 12))
  Using cached annotated_types-0.6.0-py3-none-any.whl.metadata (12 kB)
Collecting pydantic-core==2.14.5 (from pydantic==2.5.2->-r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 12))
  Downloading pydantic_core-2.14.5-cp39-none-win_amd64.whl.metadata (6.6 kB)
Collecting typing-extensions>=4.6.1 (from pydantic==2.5.2->-r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 12))
  Downloading typing_extensions-4.11.0-py3-none-any.whl.metadata (3.0 kB)
Collecting async-timeout>=4.0.2 (from redis~=4.6.0->-r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 11))
  Using cached async_timeout-4.0.3-py3-none-any.whl.metadata (4.2 kB)
Collecting h11<0.15,>=0.13 (from httpcore<0.18.0,>=0.15.0->httpx==0.24.0->-r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 1))
  Using cached h11-0.14.0-py3-none-any.whl.metadata (8.2 kB)
Collecting anyio<5.0,>=3.0 (from httpcore<0.18.0,>=0.15.0->httpx==0.24.0->-r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 1))
  Using cached anyio-4.3.0-py3-none-any.whl.metadata (4.6 kB)
Collecting colorama (from click->aerich==0.7.2->-r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 10))
  Using cached colorama-0.4.6-py2.py3-none-any.whl.metadata (17 kB)
Collecting exceptiongroup>=1.0.2 (from anyio<5.0,>=3.0->httpcore<0.18.0,>=0.15.0->httpx==0.24.0->-r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 1))
  Using cached exceptiongroup-1.2.0-py3-none-any.whl.metadata (6.6 kB)
Downloading httpx-0.24.0-py3-none-any.whl (75 kB)
   ---------------------------------------- 75.3/75.3 kB 21.1 kB/s eta 0:00:00
Downloading Pillow-9.5.0-cp39-cp39-win_amd64.whl (2.5 MB)
   -------------------------------- ------- 2.1/2.5 MB 20.3 kB/s eta 0:00:23
ERROR: Exception:
Traceback (most recent call last):
  File "G:\ProgramData\anaconda3\envs\media_crawler\lib\site-packages\pip\_vendor\urllib3\response.py", line 438, in _error_catcher
    yield
  File "G:\ProgramData\anaconda3\envs\media_crawler\lib\site-packages\pip\_vendor\urllib3\response.py", line 561, in read
    data = self._fp_read(amt) if not fp_closed else b""
  File "G:\ProgramData\anaconda3\envs\media_crawler\lib\site-packages\pip\_vendor\urllib3\response.py", line 527, in _fp_read
    return self._fp.read(amt) if amt is not None else self._fp.read()
  File "G:\ProgramData\anaconda3\envs\media_crawler\lib\site-packages\pip\_vendor\cachecontrol\filewrapper.py", line 98, in read
    data: bytes = self.__fp.read(amt)
  File "G:\ProgramData\anaconda3\envs\media_crawler\lib\http\client.py", line 463, in read
    n = self.readinto(b)
  File "G:\ProgramData\anaconda3\envs\media_crawler\lib\http\client.py", line 507, in readinto
    n = self.fp.readinto(b)
  File "G:\ProgramData\anaconda3\envs\media_crawler\lib\socket.py", line 704, in readinto
    return self._sock.recv_into(b)
  File "G:\ProgramData\anaconda3\envs\media_crawler\lib\ssl.py", line 1275, in recv_into
    return self.read(nbytes, buffer)
  File "G:\ProgramData\anaconda3\envs\media_crawler\lib\ssl.py", line 1133, in read
    return self._sslobj.read(len, buffer)
socket.timeout: The read operation timed out

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "G:\ProgramData\anaconda3\envs\media_crawler\lib\site-packages\pip\_internal\cli\base_command.py", line 180, in exc_logging_wrapper
    status = run_func(*args)
  File "G:\ProgramData\anaconda3\envs\media_crawler\lib\site-packages\pip\_internal\cli\req_command.py", line 245, in wrapper
    return func(self, options, args)
  File "G:\ProgramData\anaconda3\envs\media_crawler\lib\site-packages\pip\_internal\commands\install.py", line 377, in run
    requirement_set = resolver.resolve(
  File "G:\ProgramData\anaconda3\envs\media_crawler\lib\site-packages\pip\_internal\resolution\resolvelib\resolver.py", line 179, in resolve
    self.factory.preparer.prepare_linked_requirements_more(reqs)
  File "G:\ProgramData\anaconda3\envs\media_crawler\lib\site-packages\pip\_internal\operations\prepare.py", line 552, in prepare_linked_requirements_more
    self._complete_partial_requirements(
  File "G:\ProgramData\anaconda3\envs\media_crawler\lib\site-packages\pip\_internal\operations\prepare.py", line 467, in _complete_partial_requirements
    for link, (filepath, _) in batch_download:
  File "G:\ProgramData\anaconda3\envs\media_crawler\lib\site-packages\pip\_internal\network\download.py", line 183, in __call__
    for chunk in chunks:
  File "G:\ProgramData\anaconda3\envs\media_crawler\lib\site-packages\pip\_internal\cli\progress_bars.py", line 53, in _rich_progress_bar
    for chunk in iterable:
  File "G:\ProgramData\anaconda3\envs\media_crawler\lib\site-packages\pip\_internal\network\utils.py", line 63, in response_chunks
    for chunk in response.raw.stream(
  File "G:\ProgramData\anaconda3\envs\media_crawler\lib\site-packages\pip\_vendor\urllib3\response.py", line 622, in stream
    data = self.read(amt=amt, decode_content=decode_content)
  File "G:\ProgramData\anaconda3\envs\media_crawler\lib\site-packages\pip\_vendor\urllib3\response.py", line 587, in read
    raise IncompleteRead(self._fp_bytes_read, self.length_remaining)
  File "G:\ProgramData\anaconda3\envs\media_crawler\lib\contextlib.py", line 137, in __exit__
    self.gen.throw(typ, value, traceback)
  File "G:\ProgramData\anaconda3\envs\media_crawler\lib\site-packages\pip\_vendor\urllib3\response.py", line 443, in _error_catcher
    raise ReadTimeoutError(self._pool, None, "Read timed out.")
pip._vendor.urllib3.exceptions.ReadTimeoutError: HTTPSConnectionPool(host='files.pythonhosted.org', port=443): Read timed out.
(media_crawler) PS C:\Users\Administrator> pip install -r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt
Collecting httpx==0.24.0 (from -r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 1))
  Using cached httpx-0.24.0-py3-none-any.whl.metadata (8.1 kB)
Collecting Pillow==9.5.0 (from -r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 2))
  Using cached Pillow-9.5.0-cp39-cp39-win_amd64.whl.metadata (9.7 kB)
Collecting playwright==1.42.0 (from -r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 3))
  Using cached playwright-1.42.0-py3-none-win_amd64.whl.metadata (3.5 kB)
Collecting tenacity==8.2.2 (from -r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 4))
  Using cached tenacity-8.2.2-py3-none-any.whl.metadata (1.1 kB)
Collecting tornado (from -r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 5))
  Using cached tornado-6.4-cp38-abi3-win_amd64.whl.metadata (2.6 kB)
Collecting PyExecJS==1.5.1 (from -r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 6))
  Using cached PyExecJS-1.5.1.tar.gz (13 kB)
  Preparing metadata (setup.py) ... done
Collecting opencv-python==4.7.0.72 (from -r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 7))
  Using cached opencv_python-4.7.0.72-cp37-abi3-win_amd64.whl.metadata (18 kB)
Collecting tortoise-orm==0.20.0 (from -r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 8))
  Using cached tortoise_orm-0.20.0-py3-none-any.whl.metadata (10 kB)
Collecting aiomysql==0.2.0 (from -r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 9))
  Using cached aiomysql-0.2.0-py3-none-any.whl.metadata (11 kB)
Collecting aerich==0.7.2 (from -r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 10))
  Using cached aerich-0.7.2-py3-none-any.whl.metadata (8.5 kB)
Collecting redis~=4.6.0 (from -r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 11))
  Using cached redis-4.6.0-py3-none-any.whl.metadata (8.3 kB)
Collecting pydantic==2.5.2 (from -r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 12))
  Using cached pydantic-2.5.2-py3-none-any.whl.metadata (65 kB)
Collecting aiofiles~=23.2.1 (from -r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 13))
  Using cached aiofiles-23.2.1-py3-none-any.whl.metadata (9.7 kB)
Collecting certifi (from httpx==0.24.0->-r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 1))
  Using cached certifi-2024.2.2-py3-none-any.whl.metadata (2.2 kB)
Collecting httpcore<0.18.0,>=0.15.0 (from httpx==0.24.0->-r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 1))
  Using cached httpcore-0.17.3-py3-none-any.whl.metadata (18 kB)
Collecting idna (from httpx==0.24.0->-r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 1))
  Using cached idna-3.6-py3-none-any.whl.metadata (9.9 kB)
Collecting sniffio (from httpx==0.24.0->-r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 1))
  Using cached sniffio-1.3.1-py3-none-any.whl.metadata (3.9 kB)
Collecting greenlet==3.0.3 (from playwright==1.42.0->-r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 3))
  Using cached greenlet-3.0.3-cp39-cp39-win_amd64.whl.metadata (3.9 kB)
Collecting pyee==11.0.1 (from playwright==1.42.0->-r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 3))
  Using cached pyee-11.0.1-py3-none-any.whl.metadata (2.7 kB)
Collecting six>=1.10.0 (from PyExecJS==1.5.1->-r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 6))
  Using cached six-1.16.0-py2.py3-none-any.whl.metadata (1.8 kB)
Collecting numpy>=1.17.0 (from opencv-python==4.7.0.72->-r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 7))
  Using cached numpy-1.26.4-cp39-cp39-win_amd64.whl.metadata (61 kB)
Collecting aiosqlite<0.18.0,>=0.16.0 (from tortoise-orm==0.20.0->-r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 8))
  Using cached aiosqlite-0.17.0-py3-none-any.whl.metadata (4.1 kB)
Collecting iso8601<2.0.0,>=1.0.2 (from tortoise-orm==0.20.0->-r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 8))
  Using cached iso8601-1.1.0-py3-none-any.whl.metadata (9.8 kB)
Collecting pypika-tortoise<0.2.0,>=0.1.6 (from tortoise-orm==0.20.0->-r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 8))
  Using cached pypika_tortoise-0.1.6-py3-none-any.whl.metadata (2.2 kB)
Collecting pytz (from tortoise-orm==0.20.0->-r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 8))
  Using cached pytz-2024.1-py2.py3-none-any.whl.metadata (22 kB)
Collecting PyMySQL>=1.0 (from aiomysql==0.2.0->-r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 9))
  Using cached PyMySQL-1.1.0-py3-none-any.whl.metadata (4.4 kB)
Collecting click (from aerich==0.7.2->-r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 10))
  Using cached click-8.1.7-py3-none-any.whl.metadata (3.0 kB)
Collecting dictdiffer (from aerich==0.7.2->-r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 10))
  Using cached dictdiffer-0.9.0-py2.py3-none-any.whl.metadata (4.8 kB)
Collecting tomlkit (from aerich==0.7.2->-r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 10))
  Using cached tomlkit-0.12.4-py3-none-any.whl.metadata (2.7 kB)
Collecting annotated-types>=0.4.0 (from pydantic==2.5.2->-r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 12))
  Using cached annotated_types-0.6.0-py3-none-any.whl.metadata (12 kB)
Collecting pydantic-core==2.14.5 (from pydantic==2.5.2->-r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 12))
  Using cached pydantic_core-2.14.5-cp39-none-win_amd64.whl.metadata (6.6 kB)
Collecting typing-extensions>=4.6.1 (from pydantic==2.5.2->-r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 12))
  Using cached typing_extensions-4.11.0-py3-none-any.whl.metadata (3.0 kB)
Collecting async-timeout>=4.0.2 (from redis~=4.6.0->-r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 11))
  Using cached async_timeout-4.0.3-py3-none-any.whl.metadata (4.2 kB)
Collecting h11<0.15,>=0.13 (from httpcore<0.18.0,>=0.15.0->httpx==0.24.0->-r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 1))
  Using cached h11-0.14.0-py3-none-any.whl.metadata (8.2 kB)
Collecting anyio<5.0,>=3.0 (from httpcore<0.18.0,>=0.15.0->httpx==0.24.0->-r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 1))
  Using cached anyio-4.3.0-py3-none-any.whl.metadata (4.6 kB)
Collecting colorama (from click->aerich==0.7.2->-r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 10))
  Using cached colorama-0.4.6-py2.py3-none-any.whl.metadata (17 kB)
Collecting exceptiongroup>=1.0.2 (from anyio<5.0,>=3.0->httpcore<0.18.0,>=0.15.0->httpx==0.24.0->-r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt (line 1))
  Using cached exceptiongroup-1.2.0-py3-none-any.whl.metadata (6.6 kB)
Using cached httpx-0.24.0-py3-none-any.whl (75 kB)
Downloading Pillow-9.5.0-cp39-cp39-win_amd64.whl (2.5 MB)
   ---------------------------------------- 2.5/2.5 MB 17.5 kB/s eta 0:00:00
Downloading playwright-1.42.0-py3-none-win_amd64.whl (29.4 MB)
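
The first run above aborted with a ReadTimeoutError while downloading from files.pythonhosted.org; rerunning the same command reuses the files pip has already cached and only re-fetches what is missing, which is why the retry gets further. If timeouts keep occurring, a longer socket timeout, more retries, and a nearby PyPI mirror usually help. A minimal sketch (the Tsinghua mirror URL is only an example; any reachable PyPI mirror will do):

(media_crawler) PS C:\Users\Administrator> pip install -r G:\project\VideoSpider\MediaCrawler-main\MediaCrawler-main\requirements.txt --default-timeout=120 --retries 10 -i https://pypi.tuna.tsinghua.edu.cn/simple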

Deactivate the environment:

conda deactivate
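
For completeness, two commonly used follow-up commands (not shown in the original walkthrough): removing an environment that is no longer needed, and exporting one to a YAML file so it can be recreated elsewhere.

conda env remove -n media_crawler
conda env export -n media_crawler > environment.yml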

PyCharm environment setup
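
A rough outline of this step (menu names vary slightly between PyCharm versions): open File > Settings > Project > Python Interpreter, choose Add Interpreter > Add Local Interpreter, select Conda Environment > Use existing environment, and pick media_crawler so the project runs inside the environment created above.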
