Open WebUI – A ChatGPT-Style User Interface for Locally Deployed Large Language Models

About Open WebUI:

Open WebUI is an open-source project, modeled on the ChatGPT interface, that provides a graphical front end for local large language models, making it very convenient to test and interact with models running on your own machine. It can connect to local backends (both Ollama and OpenAI-compatible APIs) as well as to remote servers. Docker deployment is straightforward, and the feature set is rich: code syntax highlighting, math formula rendering, web browsing, preset prompts, local RAG integration, conversation tagging, model downloading, chat history, voice support, and more.

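For reference, the Docker deployment mentioned above usually comes down to a single command. The sketch below follows the project's README at the time of writing; the host port (3000), the volume name, and the image tag are all adjustable, and it assumes Ollama is running on the same host:

# map host port 3000 to the container's 8080 and persist data in the "open-webui" volume
docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui \
  --restart always \
  ghcr.io/open-webui/open-webui:main

Once the container is up, the UI is served at http://localhost:3000.
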
Official site: https://openwebui.com

GitHub: open-webui/open-webui – User-friendly WebUI for LLMs (formerly Ollama WebUI)

Features:

💻 Intuitive interface: the chat interface is closely inspired by ChatGPT, designed to give users a friendly, easy-to-use experience.

📱 Responsive design: enjoy a consistent, smooth experience whether on a desktop computer or a mobile device.

⚡ Fast responses: enjoy quick and efficient response performance.

🚀 Easy setup: seamless installation using Docker or Kubernetes (via kubectl, kustomize, or helm) for a hassle-free start.

✅ Code syntax highlighting: code is rendered with syntax highlighting for clearer, more readable output.

✍️✂️ Full Markdown and LaTeX support: fully integrated Markdown and LaTeX capabilities enrich your interactions with the LLM.

📚 Local RAG integration: step into the next generation of chat interaction with built-in Retrieval Augmented Generation (RAG) support, which weaves document handling seamlessly into the chat flow. Simply load a document into the chat or add files to the document library, then access their contents via the # command. This feature is still in alpha and is being refined continuously to ensure stability and performance.

To summarize, the key points boil down to the following three:

  • Open WebUI is a versatile, intuitive open-source user interface. Paired with Ollama, it acts as a web UI that gives users a private, self-hosted ChatGPT experience.
  • Open WebUI integrates Retrieval Augmented Generation (RAG), allowing users to supply documents, websites, videos, and similar material as context for the AI to consult when answering, yielding more accurate responses.
  • The accuracy of document-based Q&A can be improved by tuning the Top K value and refining the RAG template prompt (an illustrative template follows this list).
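
For illustration only, the kind of RAG prompt template that can be refined looks roughly like the one below. The [context] and [query] placeholders are assumptions on my part; check the document settings of your Open WebUI version for the exact placeholder names it expects:

Use the following context to answer the question. If the answer is not
contained in the context, say you don't know rather than guessing.

<context>
[context]
</context>

Question: [query]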

Q: What about the security of Open WebUI? In particular, the first use requires signing up — where does that registration information go?

Open WebUI itself is an open-source web front end; it does not ship your data off to any third party, but acts as an intermediary between the browser-side interface and the backend model server.

The sign-up on first use asks you to register as the admin user. This ensures that even if your Open WebUI instance is ever exposed to external access, your data remains secure.

It is worth noting that everything stays local. Open WebUI does not collect your data: when you sign up, all of the information stays on your own server and never leaves your device.

Your privacy and security are top priorities, and your data remains under your control at all times.

Reference: 📋 FAQ | Open WebUI

Q: Why am I asked to sign up? Where are my data being sent to?

A: We require you to sign up to become the admin user for enhanced security. This ensures that if the Open WebUI is ever exposed to external access, your data remains secure. It's important to note that everything is kept local. We do not collect your data. When you sign up, all information stays within your server and never leaves your device. Your privacy and security are our top priorities, ensuring that your data remains under your control at all times.

Installing Open WebUI:

So far I have only done the installation on Linux. During the process I mainly followed this CSDN article:

Linux installation reference: “ollama+open-webui,本地部署自己的大模型” (ollama + open-webui: deploy your own large model locally)

Its steps are very detailed, including the errors hit along the way; you can generally resolve everything by following the article step by step. A condensed outline of the overall flow is sketched below.

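As a rough orientation only — a minimal sketch of the from-source install flow those articles walk through, assuming the repository layout at the time of writing. The required Node and Python versions depend on the Open WebUI release you check out (my own environment used Python 3.8, as the logs further down show):

# clone the repository and build the frontend
git clone https://github.com/open-webui/open-webui.git
cd open-webui
npm i            # install frontend dependencies
npm run build    # build the static frontend assets

# set up and start the backend
cd backend
conda create -n open-webui python=3.8 -y
conda activate open-webui
pip install -r requirements.txt
bash start.sh    # serves the UI on port 8080 by default
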
There is another article that is also very detailed; if you are installing on Windows, I recommend consulting it:

Windows installation reference: “本机部署大语言模型:Ollama和OpenWebUI实现各大模型的人工智能自由” (deploying large language models locally with Ollama and Open WebUI)

In addition, on my CentOS system I ran into the following errors during installation:

(open-webui) [root@master open-webui]# npm i
node: /lib64/libm.so.6: version `GLIBC_2.27' not found (required by node)
node: /lib64/libstdc++.so.6: version `GLIBCXX_3.4.20' not found (required by node)
node: /lib64/libstdc++.so.6: version `CXXABI_1.3.9' not found (required by node)
node: /lib64/libstdc++.so.6: version `GLIBCXX_3.4.21' not found (required by node)
node: /lib64/libc.so.6: version `GLIBC_2.28' not found (required by node)
node: /lib64/libc.so.6: version `GLIBC_2.25' not found (required by node)

Fix: see the CSDN post “node: /lib64/libm.so.6: version `GLIBC_2.27' not found 问题解决方案” for a step-by-step solution.
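
For context, these messages mean the prebuilt node binary was linked against a newer glibc/libstdc++ than the stock CentOS system provides. Before applying the fix from that post, it can help to check what the system actually ships — standard CentOS tooling, nothing Open WebUI-specific:

strings /lib64/libc.so.6 | grep ^GLIBC_        # glibc symbol versions available on the system
strings /lib64/libstdc++.so.6 | grep GLIBCXX   # libstdc++ ABI versions available
ldd --version                                  # the system glibc release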

Also, during the open-webui installation the process needs to reach 'https://huggingface.co', and that connection failed with an error.

I worked around this simply by setting an environment variable: export HF_ENDPOINT=https://hf-mirror.com (pointing huggingface_hub at the HF-Mirror mirror site).
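
Note that export only affects the current shell session. If you want the mirror setting to persist across sessions (assuming bash), one option is to append it to the shell profile:

# append the mirror endpoint to the shell profile and reload it
echo 'export HF_ENDPOINT=https://hf-mirror.com' >> ~/.bashrc
source ~/.bashrc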

Below is a snapshot of the open-webui startup I ran after the conda virtual environment had been created and npm install had completed successfully, for reference:

(base) [root@master backend]# conda activate open-webui
(open-webui) [root@master backend]# bash start.sh 
No WEBUI_SECRET_KEY provided
Loading WEBUI_SECRET_KEY from .webui_secret_key
USER_AGENT environment variable not set, consider setting it to identify your requests.
No sentence-transformers model found with name sentence-transformers/all-MiniLM-L6-v2. Creating a new one with mean pooling.
Traceback (most recent call last):
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/urllib3/connection.py", line 196, in _new_conn
    sock = connection.create_connection(
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/urllib3/util/connection.py", line 85, in create_connection
    raise err
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/urllib3/util/connection.py", line 73, in create_connection
    sock.connect(sa)
OSError: [Errno 101] Network is unreachable

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/urllib3/connectionpool.py", line 789, in urlopen
    response = self._make_request(
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/urllib3/connectionpool.py", line 490, in _make_request
    raise new_e
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/urllib3/connectionpool.py", line 466, in _make_request
    self._validate_conn(conn)
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/urllib3/connectionpool.py", line 1095, in _validate_conn
    conn.connect()
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/urllib3/connection.py", line 615, in connect
    self.sock = sock = self._new_conn()
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/urllib3/connection.py", line 211, in _new_conn
    raise NewConnectionError(
urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPSConnection object at 0x7f8572048eb0>: Failed to establish a new connection: [Errno 101] Network is unreachable

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/requests/adapters.py", line 667, in send
    resp = conn.urlopen(
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/urllib3/connectionpool.py", line 843, in urlopen
    retries = retries.increment(
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/urllib3/util/retry.py", line 519, in increment
    raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='huggingface.co', port=443): Max retries exceeded with url: /sentence-transformers/all-MiniLM-L6-v2/resolve/main/config.json (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f8572048eb0>: Failed to establish a new connection: [Errno 101] Network is unreachable'))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/huggingface_hub/file_download.py", line 1722, in _get_metadata_or_catch_error
    metadata = get_hf_file_metadata(url=url, proxies=proxies, timeout=etag_timeout, headers=headers)
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/huggingface_hub/utils/_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/huggingface_hub/file_download.py", line 1645, in get_hf_file_metadata
    r = _request_wrapper(
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/huggingface_hub/file_download.py", line 372, in _request_wrapper
    response = _request_wrapper(
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/huggingface_hub/file_download.py", line 395, in _request_wrapper
    response = get_session().request(method=method, url=url, **params)
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/requests/sessions.py", line 589, in request
    resp = self.send(prep, **send_kwargs)
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/requests/sessions.py", line 703, in send
    r = adapter.send(request, **kwargs)
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/huggingface_hub/utils/_http.py", line 66, in send
    return super().send(request, *args, **kwargs)
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/requests/adapters.py", line 700, in send
    raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: (MaxRetryError("HTTPSConnectionPool(host='huggingface.co', port=443): Max retries exceeded with url: /sentence-transformers/all-MiniLM-L6-v2/resolve/main/config.json (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f8572048eb0>: Failed to establish a new connection: [Errno 101] Network is unreachable'))"), '(Request ID: 430abcfa-0ffb-419d-a853-40caed43b5c8)')

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/transformers/utils/hub.py", line 399, in cached_file
    resolved_file = hf_hub_download(
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/huggingface_hub/utils/_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/huggingface_hub/file_download.py", line 1221, in hf_hub_download
    return _hf_hub_download_to_cache_dir(
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/huggingface_hub/file_download.py", line 1325, in _hf_hub_download_to_cache_dir
    _raise_on_head_call_error(head_call_error, force_download, local_files_only)
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/huggingface_hub/file_download.py", line 1826, in _raise_on_head_call_error
    raise LocalEntryNotFoundError(
huggingface_hub.utils._errors.LocalEntryNotFoundError: An error happened while trying to locate the file on the Hub and we cannot find the requested files in the local cache. Please check your connection and try again or make sure your Internet connection is on.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/root/miniconda3/envs/open-webui/bin/uvicorn", line 8, in <module>
    sys.exit(main())
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/click/core.py", line 1157, in __call__
    return self.main(*args, **kwargs)
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/click/core.py", line 1078, in main
    rv = self.invoke(ctx)
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/click/core.py", line 1434, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/click/core.py", line 783, in invoke
    return __callback(*args, **kwargs)
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/uvicorn/main.py", line 410, in main
    run(
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/uvicorn/main.py", line 577, in run
    server.run()
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/uvicorn/server.py", line 65, in run
    return asyncio.run(self.serve(sockets=sockets))
  File "/root/miniconda3/envs/open-webui/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "uvloop/loop.pyx", line 1517, in uvloop.loop.Loop.run_until_complete
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/uvicorn/server.py", line 69, in serve
    await self._serve(sockets)
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/uvicorn/server.py", line 76, in _serve
    config.load()
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/uvicorn/config.py", line 434, in load
    self.loaded_app = import_from_string(self.app)
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/uvicorn/importer.py", line 19, in import_from_string
    module = importlib.import_module(module_str)
  File "/root/miniconda3/envs/open-webui/lib/python3.8/importlib/__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
  File "<frozen importlib._bootstrap>", line 991, in _find_and_load
  File "<frozen importlib._bootstrap>", line 975, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 671, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 843, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/root/open-webui/backend/main.py", line 25, in <module>
    from apps.rag.main import app as rag_app
  File "/root/open-webui/backend/apps/rag/main.py", line 85, in <module>
    embedding_functions.SentenceTransformerEmbeddingFunction(
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/chromadb/utils/embedding_functions.py", line 83, in __init__
    self.models[model_name] = SentenceTransformer(
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/sentence_transformers/SentenceTransformer.py", line 299, in __init__
    modules = self._load_auto_model(
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/sentence_transformers/SentenceTransformer.py", line 1324, in _load_auto_model
    transformer_model = Transformer(
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/sentence_transformers/models/Transformer.py", line 53, in __init__
    config = AutoConfig.from_pretrained(model_name_or_path, **config_args, cache_dir=cache_dir)
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/transformers/models/auto/configuration_auto.py", line 934, in from_pretrained
    config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/transformers/configuration_utils.py", line 632, in get_config_dict
    config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/transformers/configuration_utils.py", line 689, in _get_config_dict
    resolved_config_file = cached_file(
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/transformers/utils/hub.py", line 442, in cached_file
    raise EnvironmentError(
OSError: We couldn't connect to 'https://huggingface.co' to load this file, couldn't find it in the cached files and it looks like sentence-transformers/all-MiniLM-L6-v2 is not the path to a directory containing a file named config.json.
Checkout your internet connection or see how to run the library in offline mode at 'https://huggingface.co/docs/transformers/installation#offline-mode'.

(open-webui) [root@master backend]# export HF_ENDPOINT=https://hf-mirror.com
(open-webui) [root@master backend]# bash start.sh 
No WEBUI_SECRET_KEY provided
Loading WEBUI_SECRET_KEY from .webui_secret_key
USER_AGENT environment variable not set, consider setting it to identify your requests.
modules.json: 100%|██████████████████████████████████████████████████████████████████████████████████████| 349/349 [00:00<00:00, 95.8kB/s]
config_sentence_transformers.json: 100%|█████████████████████████████████████████████████████████████████| 116/116 [00:00<00:00, 25.3kB/s]
README.md: 10.7kB [00:00, 19.1MB/s]
sentence_bert_config.json: 100%|███████████████████████████████████████████████████████████████████████| 53.0/53.0 [00:00<00:00, 25.2kB/s]
config.json: 612B [00:00, 283kB/s]                                                                                                        
model.safetensors: 100%|█████████████████████████████████████████████████████████████████████████████| 90.9M/90.9M [00:16<00:00, 5.67MB/s]
tokenizer_config.json: 100%|█████████████████████████████████████████████████████████████████████████████| 350/350 [00:00<00:00, 71.3kB/s]
vocab.txt: 210kB [00:17, 30.4kB/s]Error while downloading from https://hf-mirror.com/sentence-transformers/all-MiniLM-L6-v2/resolve/main/vocab.txt: HTTPSConnectionPool(host='hf-mirror.com', port=443): Read timed out.
Trying to resume download...
vocab.txt: 232kB [00:00, 24.2MB/s]
vocab.txt: 214kB [00:29, 7.34kB/s]
tokenizer.json: 466kB [00:03, 155kB/s] 
special_tokens_map.json: 100%|███████████████████████████████████████████████████████████████████████████| 112/112 [00:00<00:00, 50.8kB/s]
1_Pooling/config.json: 100%|█████████████████████████████████████████████████████████████████████████████| 190/190 [00:00<00:00, 32.2kB/s]
INFO:     Started server process [71959]
INFO:     Waiting for application startup.
Intialized router with Routing strategy: simple-shuffle

Routing fallbacks: None

Routing context window fallbacks: None

Router Redis Caching=None

#------------------------------------------------------------#
#                                                            #
#           'It would help me if you could add...'            #
#        https://github.com/BerriAI/litellm/issues/new        #
#                                                            #
#------------------------------------------------------------#

 Thank you for using LiteLLM! - Krrish & Ishaan



Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new


Intialized router with Routing strategy: simple-shuffle

Routing fallbacks: None

Routing context window fallbacks: None

Router Redis Caching=None
INFO:     Application startup complete.
INFO:     Uvicorn running on http://0.0.0.0:8080 (Press CTRL+C to quit)
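
Once the log shows Uvicorn listening on 0.0.0.0:8080, the UI should be reachable in a browser at http://<server-ip>:8080, and the first account you register becomes the admin. A quick sanity check from the server itself, assuming curl is installed:

curl -s -o /dev/null -w '%{http_code}\n' http://localhost:8080   # should print 200 once the UI is up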
