MindSpore (昇思) Study Notes 2-03: LLM Principles and Practice -- Sentiment Classification with GPT on MindSpore

Abstract:

How to use the openai-gpt model for sentiment classification in the MindSpore AI framework, step by step.

The notebook did not run through successfully (the model download failed), so the later steps remain unverified.

1. Environment Setup

%%capture captured_output
# MindSpore 2.2.14 is pre-installed in the lab environment; change the version number below if you need a different release
!pip uninstall mindspore -y
!pip install -i https://pypi.mirrors.ustc.edu.cn/simple mindspore==2.2.14
# This case was adapted for mindnlp 0.3.1; if it fails to run, pin the version with `!pip install mindnlp==0.3.1`
!pip install mindnlp==0.3.1
!pip install jieba
%env HF_ENDPOINT=https://hf-mirror.com

Output:

Looking in indexes: https://pypi.tuna.tsinghua.edu.cn/simple
Requirement already satisfied: mindnlp==0.3.1 in /home/nginx/miniconda/envs/jupyter/lib/python3.9/site-packages (0.3.1)
Requirement already satisfied: mindspore in /home/nginx/miniconda/envs/jupyter/lib/python3.9/site-packages (from mindnlp==0.3.1) (2.2.14)
Requirement already satisfied: tqdm in /home/nginx/miniconda/envs/jupyter/lib/python3.9/site-packages (from mindnlp==0.3.1) (4.66.4)
Requirement already satisfied: requests in /home/nginx/miniconda/envs/jupyter/lib/python3.9/site-packages (from mindnlp==0.3.1) (2.32.3)
Requirement already satisfied: datasets in /home/nginx/miniconda/envs/jupyter/lib/python3.9/site-packages (from mindnlp==0.3.1) (2.20.0)
Requirement already satisfied: evaluate in /home/nginx/miniconda/envs/jupyter/lib/python3.9/site-packages (from mindnlp==0.3.1) (0.4.2)
Requirement already satisfied: tokenizers in /home/nginx/miniconda/envs/jupyter/lib/python3.9/site-packages (from mindnlp==0.3.1) (0.19.1)
Requirement already satisfied: safetensors in /home/nginx/miniconda/envs/jupyter/lib/python3.9/site-packages (from mindnlp==0.3.1) (0.4.3)
Requirement already satisfied: sentencepiece in /home/nginx/miniconda/envs/jupyter/lib/python3.9/site-packages (from mindnlp==0.3.1) (0.2.0)
Requirement already satisfied: regex in /home/nginx/miniconda/envs/jupyter/lib/python3.9/site-packages (from mindnlp==0.3.1) (2024.5.15)
Requirement already satisfied: addict in /home/nginx/miniconda/envs/jupyter/lib/python3.9/site-packages (from mindnlp==0.3.1) (2.4.0)
Requirement already satisfied: ml-dtypes in /home/nginx/miniconda/envs/jupyter/lib/python3.9/site-packages (from mindnlp==0.3.1) (0.4.0)
Requirement already satisfied: pyctcdecode in /home/nginx/miniconda/envs/jupyter/lib/python3.9/site-packages (from mindnlp==0.3.1) (0.5.0)
Requirement already satisfied: jieba in /home/nginx/miniconda/envs/jupyter/lib/python3.9/site-packages (from mindnlp==0.3.1) (0.42.1)
Requirement already satisfied: pytest==7.2.0 in /home/nginx/miniconda/envs/jupyter/lib/python3.9/site-packages (from mindnlp==0.3.1) (7.2.0)
Requirement already satisfied: attrs>=19.2.0 in /home/nginx/miniconda/envs/jupyter/lib/python3.9/site-packages (from pytest==7.2.0->mindnlp==0.3.1) (23.2.0)
Requirement already satisfied: iniconfig in /home/nginx/miniconda/envs/jupyter/lib/python3.9/site-packages (from pytest==7.2.0->mindnlp==0.3.1) (2.0.0)
Requirement already satisfied: packaging in /home/nginx/miniconda/envs/jupyter/lib/python3.9/site-packages (from pytest==7.2.0->mindnlp==0.3.1) (23.2)
Requirement already satisfied: pluggy<2.0,>=0.12 in /home/nginx/miniconda/envs/jupyter/lib/python3.9/site-packages (from pytest==7.2.0->mindnlp==0.3.1) (1.5.0)
Requirement already satisfied: exceptiongroup>=1.0.0rc8 in /home/nginx/miniconda/envs/jupyter/lib/python3.9/site-packages (from pytest==7.2.0->mindnlp==0.3.1) (1.2.0)
Requirement already satisfied: tomli>=1.0.0 in /home/nginx/miniconda/envs/jupyter/lib/python3.9/site-packages (from pytest==7.2.0->mindnlp==0.3.1) (2.0.1)
Requirement already satisfied: filelock in /home/nginx/miniconda/envs/jupyter/lib/python3.9/site-packages (from datasets->mindnlp==0.3.1) (3.15.3)
Requirement already satisfied: numpy>=1.17 in /home/nginx/miniconda/envs/jupyter/lib/python3.9/site-packages (from datasets->mindnlp==0.3.1) (1.26.4)
Requirement already satisfied: pyarrow>=15.0.0 in /home/nginx/miniconda/envs/jupyter/lib/python3.9/site-packages (from datasets->mindnlp==0.3.1) (16.1.0)
Requirement already satisfied: pyarrow-hotfix in /home/nginx/miniconda/envs/jupyter/lib/python3.9/site-packages (from datasets->mindnlp==0.3.1) (0.6)
Requirement already satisfied: dill<0.3.9,>=0.3.0 in /home/nginx/miniconda/envs/jupyter/lib/python3.9/site-packages (from datasets->mindnlp==0.3.1) (0.3.8)
Requirement already satisfied: pandas in /home/nginx/miniconda/envs/jupyter/lib/python3.9/site-packages (from datasets->mindnlp==0.3.1) (2.2.2)
Requirement already satisfied: xxhash in /home/nginx/miniconda/envs/jupyter/lib/python3.9/site-packages (from datasets->mindnlp==0.3.1) (3.4.1)
Requirement already satisfied: multiprocess in /home/nginx/miniconda/envs/jupyter/lib/python3.9/site-packages (from datasets->mindnlp==0.3.1) (0.70.16)
Requirement already satisfied: fsspec<=2024.5.0,>=2023.1.0 in /home/nginx/miniconda/envs/jupyter/lib/python3.9/site-packages (from fsspec[http]<=2024.5.0,>=2023.1.0->datasets->mindnlp==0.3.1) (2024.5.0)
Requirement already satisfied: aiohttp in /home/nginx/miniconda/envs/jupyter/lib/python3.9/site-packages (from datasets->mindnlp==0.3.1) (3.9.5)
Requirement already satisfied: huggingface-hub>=0.21.2 in /home/nginx/miniconda/envs/jupyter/lib/python3.9/site-packages (from datasets->mindnlp==0.3.1) (0.23.4)
Requirement already satisfied: pyyaml>=5.1 in /home/nginx/miniconda/envs/jupyter/lib/python3.9/site-packages (from datasets->mindnlp==0.3.1) (6.0.1)
Requirement already satisfied: charset-normalizer<4,>=2 in /home/nginx/miniconda/envs/jupyter/lib/python3.9/site-packages (from requests->mindnlp==0.3.1) (3.3.2)
Requirement already satisfied: idna<4,>=2.5 in /home/nginx/miniconda/envs/jupyter/lib/python3.9/site-packages (from requests->mindnlp==0.3.1) (3.7)
Requirement already satisfied: urllib3<3,>=1.21.1 in /home/nginx/miniconda/envs/jupyter/lib/python3.9/site-packages (from requests->mindnlp==0.3.1) (2.2.2)
Requirement already satisfied: certifi>=2017.4.17 in /home/nginx/miniconda/envs/jupyter/lib/python3.9/site-packages (from requests->mindnlp==0.3.1) (2024.6.2)
Requirement already satisfied: protobuf>=3.13.0 in /home/nginx/miniconda/envs/jupyter/lib/python3.9/site-packages (from mindspore->mindnlp==0.3.1) (5.27.1)
Requirement already satisfied: asttokens>=2.0.4 in /home/nginx/miniconda/envs/jupyter/lib/python3.9/site-packages (from mindspore->mindnlp==0.3.1) (2.0.5)
Requirement already satisfied: pillow>=6.2.0 in /home/nginx/miniconda/envs/jupyter/lib/python3.9/site-packages (from mindspore->mindnlp==0.3.1) (10.3.0)
Requirement already satisfied: scipy>=1.5.4 in /home/nginx/miniconda/envs/jupyter/lib/python3.9/site-packages (from mindspore->mindnlp==0.3.1) (1.13.1)
Requirement already satisfied: psutil>=5.6.1 in /home/nginx/miniconda/envs/jupyter/lib/python3.9/site-packages (from mindspore->mindnlp==0.3.1) (5.9.0)
Requirement already satisfied: astunparse>=1.6.3 in /home/nginx/miniconda/envs/jupyter/lib/python3.9/site-packages (from mindspore->mindnlp==0.3.1) (1.6.3)
Requirement already satisfied: pygtrie<3.0,>=2.1 in /home/nginx/miniconda/envs/jupyter/lib/python3.9/site-packages (from pyctcdecode->mindnlp==0.3.1) (2.5.0)
Requirement already satisfied: hypothesis<7,>=6.14 in /home/nginx/miniconda/envs/jupyter/lib/python3.9/site-packages (from pyctcdecode->mindnlp==0.3.1) (6.104.2)
Requirement already satisfied: six in /home/nginx/miniconda/envs/jupyter/lib/python3.9/site-packages (from asttokens>=2.0.4->mindspore->mindnlp==0.3.1) (1.16.0)
Requirement already satisfied: wheel<1.0,>=0.23.0 in /home/nginx/miniconda/envs/jupyter/lib/python3.9/site-packages (from astunparse>=1.6.3->mindspore->mindnlp==0.3.1) (0.43.0)
Requirement already satisfied: aiosignal>=1.1.2 in /home/nginx/miniconda/envs/jupyter/lib/python3.9/site-packages (from aiohttp->datasets->mindnlp==0.3.1) (1.3.1)
Requirement already satisfied: frozenlist>=1.1.1 in /home/nginx/miniconda/envs/jupyter/lib/python3.9/site-packages (from aiohttp->datasets->mindnlp==0.3.1) (1.4.1)
Requirement already satisfied: multidict<7.0,>=4.5 in /home/nginx/miniconda/envs/jupyter/lib/python3.9/site-packages (from aiohttp->datasets->mindnlp==0.3.1) (6.0.5)
Requirement already satisfied: yarl<2.0,>=1.0 in /home/nginx/miniconda/envs/jupyter/lib/python3.9/site-packages (from aiohttp->datasets->mindnlp==0.3.1) (1.9.4)
Requirement already satisfied: async-timeout<5.0,>=4.0 in /home/nginx/miniconda/envs/jupyter/lib/python3.9/site-packages (from aiohttp->datasets->mindnlp==0.3.1) (4.0.3)
Requirement already satisfied: typing-extensions>=3.7.4.3 in /home/nginx/miniconda/envs/jupyter/lib/python3.9/site-packages (from huggingface-hub>=0.21.2->datasets->mindnlp==0.3.1) (4.11.0)
Requirement already satisfied: sortedcontainers<3.0.0,>=2.1.0 in /home/nginx/miniconda/envs/jupyter/lib/python3.9/site-packages (from hypothesis<7,>=6.14->pyctcdecode->mindnlp==0.3.1) (2.4.0)
Requirement already satisfied: python-dateutil>=2.8.2 in /home/nginx/miniconda/envs/jupyter/lib/python3.9/site-packages (from pandas->datasets->mindnlp==0.3.1) (2.9.0.post0)
Requirement already satisfied: pytz>=2020.1 in /home/nginx/miniconda/envs/jupyter/lib/python3.9/site-packages (from pandas->datasets->mindnlp==0.3.1) (2024.1)
Requirement already satisfied: tzdata>=2022.7 in /home/nginx/miniconda/envs/jupyter/lib/python3.9/site-packages (from pandas->datasets->mindnlp==0.3.1) (2024.1)

[notice] A new release of pip is available: 24.1 -> 24.1.1
[notice] To update, run: python -m pip install --upgrade pip
Looking in indexes: https://pypi.tuna.tsinghua.edu.cn/simple
Requirement already satisfied: jieba in /home/nginx/miniconda/envs/jupyter/lib/python3.9/site-packages (0.42.1)

[notice] A new release of pip is available: 24.1 -> 24.1.1
[notice] To update, run: python -m pip install --upgrade pip
env: HF_ENDPOINT=https://hf-mirror.com
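
Before importing anything else, it can be worth confirming that the pinned MindSpore version is actually the one loaded and setting the execution device explicitly. This is a minimal sketch, not part of the original notebook; pick the device_target that matches your hardware:

import mindspore

print(mindspore.__version__)                   # expected: 2.2.14, as pinned above
mindspore.set_context(device_target="Ascend")  # or "GPU" / "CPU" on other hardware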

Import os, the MindSpore dataset and nn modules, and the mindnlp _legacy engine and metrics:

import os

import mindspore
from mindspore.dataset import text, GeneratorDataset, transforms
from mindspore import nn

from mindnlp.dataset import load_dataset

from mindnlp._legacy.engine import Trainer, Evaluator
from mindnlp._legacy.engine.callbacks import CheckpointCallback, BestModelCallback
from mindnlp._legacy.metrics import Accuracy

Output:

Building prefix dict from the default dictionary ...
Dumping model to file cache /tmp/jieba.cache
Loading model cost 1.027 seconds.
Prefix dict has been built successfully.

2. Load the Training and Test Datasets

imdb_ds = load_dataset('imdb', split=['train', 'test'])
imdb_train = imdb_ds['train']
imdb_test = imdb_ds['test']

Output:

Downloading readme:-- 7.81k/? [00:00<00:00, 478kB/s]
Downloading data: 100%---------------- 21.0M/21.0M [00:09<00:00, 2.43MB/s]
Downloading data: 100%---------------- 20.5M/20.5M [00:10<00:00, 1.95MB/s]
Downloading data: 100%---------------- 42.0M/42.0M [00:16<00:00, 2.69MB/s]
Generating train split: 100%---------------- 25000/25000 [00:00<00:00, 102317.15 examples/s]
Generating test split: 100%---------------- 25000/25000 [00:00<00:00, 130128.57 examples/s]
Generating unsupervised split: 100%---------------- 50000/50000 [00:00<00:00, 140883.29 examples/s]

imdb_train.get_dataset_size()

Output:

25000
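
Before writing the preprocessing pipeline, it helps to peek at one raw record. The sketch below uses the standard MindSpore dataset iterator API (it is not in the original notebook); the column names 'text' and 'label' are the ones the preprocessing code below relies on:

print(imdb_train.get_col_names())                        # expected: ['text', 'label']
sample = next(imdb_train.create_dict_iterator(output_numpy=True))
print(sample['label'], str(sample['text'])[:200])        # label plus the start of the review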

3. Preprocess the Dataset

import numpy as np

def process_dataset(dataset, tokenizer, max_seq_len=512, batch_size=4, shuffle=False):
    is_ascend = mindspore.get_context('device_target') == 'Ascend'

    def tokenize(text):
        # On Ascend, pad every sample to max_seq_len so all batches have a static shape;
        # on GPU/CPU, dynamic shapes are fine and padding is deferred to padded_batch below.
        if is_ascend:
            tokenized = tokenizer(text, padding='max_length', truncation=True, max_length=max_seq_len)
        else:
            tokenized = tokenizer(text, truncation=True, max_length=max_seq_len)
        return tokenized['input_ids'], tokenized['attention_mask']

    if shuffle:
        dataset = dataset.shuffle(batch_size)

    # map dataset: tokenize the text column and cast the label column to int32
    dataset = dataset.map(operations=[tokenize], input_columns="text",
                          output_columns=['input_ids', 'attention_mask'])
    dataset = dataset.map(operations=transforms.TypeCast(mindspore.int32),
                          input_columns="label", output_columns="labels")

    # batch dataset
    if is_ascend:
        dataset = dataset.batch(batch_size)
    else:
        dataset = dataset.padded_batch(batch_size,
                                       pad_info={'input_ids': (None, tokenizer.pad_token_id),
                                                 'attention_mask': (None, 0)})

    return dataset

from mindnlp.transformers import GPTTokenizer

# tokenizer
gpt_tokenizer = GPTTokenizer.from_pretrained('openai-gpt')

# add special tokens: <bos>, <eos> and <pad>
special_tokens_dict = {
    "bos_token": "<bos>",
    "eos_token": "<eos>",
    "pad_token": "<pad>",
}
num_added_toks = gpt_tokenizer.add_special_tokens(special_tokens_dict)

Output:

The tokenizer files failed to download: the log below shows repeated read timeouts from hf-mirror.com, so the problem is the mirror connection rather than OpenAI shutting down a service. A possible workaround is sketched after the log.

[From this point on, the original notebook could not be executed.]

100%---------------- 25.0/25.0 [00:00<00:00, 2.39kB/s]
 ---------------- 533k/0.00 [00:35<00:00, 49.3kB/s]
Failed to download: HTTPSConnectionPool(host='hf-mirror.com', port=443): Read timed out.
Retrying... (attempt 0/5)
 ---------------- 263k/0.00 [00:08<00:00, 57.6kB/s]
 ---------------- 378k/0.00 [00:41<00:00, 5.35kB/s]
Failed to download: HTTPSConnectionPool(host='hf-mirror.com', port=443): Read timed out.
Retrying... (attempt 0/5)
 ---------------- 69.6k/0.00 [00:01<00:00, 35.7kB/s]
 ---------------- 684k/0.00 [00:45<00:00, 8.49kB/s]
Failed to download: HTTPSConnectionPool(host='hf-mirror.com', port=443): Read timed out.
Retrying... (attempt 0/5)
 ---------------- 559k/0.00 [00:36<00:00, 27.3kB/s]
 ---------------- 656/? [00:00<00:00, 62.5kB/s]
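
One way to work around the flaky mirror is to fetch the openai-gpt files once up front and then load everything from a local directory. This is an unverified sketch: it assumes the huggingface-cli tool that ships with the already-installed huggingface-hub package, and that mindnlp's from_pretrained accepts a local path the way transformers does.

# run once in a notebook cell; this honors the HF_ENDPOINT mirror set earlier
# (the repo may also be available as openai-community/openai-gpt)
!huggingface-cli download openai-gpt --local-dir ./openai-gpt

# then point from_pretrained at the local copy instead of the hub name
from mindnlp.transformers import GPTTokenizer
gpt_tokenizer = GPTTokenizer.from_pretrained('./openai-gpt')

The GPTForSequenceClassification.from_pretrained call later in the notebook would be changed to the same local path.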

# split train dataset into train and valid datasets
imdb_train, imdb_val = imdb_train.split([0.7, 0.3])
dataset_train = process_dataset(imdb_train, gpt_tokenizer, shuffle=True)
dataset_val   = process_dataset(imdb_val, gpt_tokenizer)
dataset_test  = process_dataset(imdb_test, gpt_tokenizer)

next(dataset_train.create_tuple_iterator())

Output:

[Tensor(shape=[4, 512], dtype=Int64, value=
 [[   11,   250,    15 ...     3,   242,     3],
  [    5,    23,     5 ... 40480, 40480, 40480],
  [   14,     3,     5 ...   243,     8, 18073],
  [    7,   250,     3 ... 40480, 40480, 40480]]),
 Tensor(shape=[4, 512], dtype=Int64, value=
 [[1, 1, 1 ... 1, 1, 1],
  [1, 1, 1 ... 0, 0, 0],
  [1, 1, 1 ... 1, 1, 1],
  [1, 1, 1 ... 0, 0, 0]]),
 Tensor(shape=[4], dtype=Int32, value= [0, 1, 0, 1])]
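
The runs of 40480 in input_ids, paired with zeros in attention_mask, are the padded positions. As a quick sanity check (hypothetical, not in the original notebook), the id of the newly added <pad> token should match that value, since openai-gpt's original vocabulary holds 40478 entries and <pad> was the third special token added:

print(gpt_tokenizer.pad_token_id)    # expected: 40480, matching the padded positions above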

from mindnlp.transformers import GPTForSequenceClassification
from mindspore.experimental.optim import Adam

# set up the GPT sequence-classification model and define parameters for training
model = GPTForSequenceClassification.from_pretrained('openai-gpt', num_labels=2)
model.config.pad_token_id = gpt_tokenizer.pad_token_id
# enlarge the embedding table to cover the 3 special tokens added to the tokenizer
model.resize_token_embeddings(model.config.vocab_size + 3)

optimizer = nn.Adam(model.trainable_params(), learning_rate=2e-5)

metric = Accuracy()

# define callbacks to save checkpoints
ckpoint_cb = CheckpointCallback(save_path='checkpoint', ckpt_name='gpt_imdb_finetune', epochs=1, keep_checkpoint_max=2)
best_model_cb = BestModelCallback(save_path='checkpoint', ckpt_name='gpt_imdb_finetune_best', auto_load=True)

trainer = Trainer(network=model, train_dataset=dataset_train,
                  eval_dataset=dataset_train, metrics=metric,
                  epochs=1, optimizer=optimizer, callbacks=[ckpoint_cb, best_model_cb],
                  jit=False)

Output:

100%----------------  457M/457M [04:06<00:00, 2.87MB/s]
100%----------------  74.0/74.0 [00:00<00:00, 4.28kB/s]

4. Training

trainer.run(tgt_columns="labels")

5. Evaluation

evaluator = Evaluator(network=model, eval_dataset=dataset_test, metrics=metric)
evaluator.run(tgt_columns="labels")
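
Had the downloads succeeded and fine-tuning finished, a natural final step would be classifying a single review with the fine-tuned model. The sketch below is hypothetical and unverified (the notebook never got this far); it assumes the mindnlp model follows the usual transformers-style convention of taking input_ids plus an attention_mask and returning logits:

import mindspore
from mindspore import Tensor

def predict_sentiment(review, model, tokenizer, max_seq_len=512):
    """Classify one review; IMDB convention: 0 = negative, 1 = positive."""
    model.set_train(False)
    enc = tokenizer(review, truncation=True, max_length=max_seq_len)
    input_ids = Tensor([enc['input_ids']], mindspore.int32)
    attention_mask = Tensor([enc['attention_mask']], mindspore.int32)
    outputs = model(input_ids, attention_mask=attention_mask)
    # mindnlp models may return a tuple or an output object with .logits
    logits = outputs.logits if hasattr(outputs, 'logits') else outputs[0]
    return int(logits.argmax(-1).asnumpy()[0])

print(predict_sentiment("This movie was a complete waste of time.", model, gpt_tokenizer))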
