跟着深度学习好书实践tensorflow神经网络

前言

2024 年诺贝尔物理学奖授予了约翰·霍普菲尔德 (John Hopfield)和图灵奖得主、AI教父杰弗里·辛顿(Geoffrey  Hinton),"以表彰他们利用人工神经网络进行机器学习的奠基性发现和发明"。 

辛顿在接受电话采访时表示,我没有想到(I have no idea that will happen)。

被誉为“AI教父”的杰佛瑞·埃佛勒斯·辛顿(Geoffrey Everest Hinton,1947-),在70多岁时,成为图灵奖和诺贝尔物理学奖双料得主。

远在1986年,辛顿与David Rumelhart和Ronald Williams共同发表了一篇题为“通过反向传播误差来学习”(Learning representations by back-propagating errors)的论文。

[1]Home Page of Geoffrey Hinton,https://www.cs.toronto.edu/~hinton/

[2]David E. Rumelhart, Geoffrey E. Hinton und Ronald J. Williams. Learning representations by back-propagating errors., Nature (London) 323, S. 533-536,http://www.cs.utoronto.ca/~hinton/absps/naturebp.pdf

三位科学家,并不是第一个提出这种“反向传播”方法的人。但他们将反向传播算法应用于多层神经网络并且证明了这种方法对机器学习行之有效。他们的论文也证明了,神经网络中的多个隐藏层可以学习任何函数,从而解决了闵斯基等书中提出的单层感知机存在的问题。

同一时期,辛顿与 David Ackley 和 Terry Sejnowski 共同发明了玻尔兹曼机。辛顿1986年有关反向传播算法和波尔兹曼机的两篇重要文章,研究者们兴趣盎然,他们凭借自身的信念,排除嘈杂的干扰而自得其乐,江湖貌似平静但暗流涌动,为人工智能春天之到来做好了准备。正是应了一句名言:“大隐隐于市”。

Hinton在1986年提出的通过反向传播来训练深度网络理论,标志着深度学习发展的一大转机,为近年来人工智能的发展奠定了基础。更实际点说,今天谷歌中通过语音识别进行图片检索、在手机上把语音转化为文字的技术的实现,大部分功劳要归于Hinton博士的研究。他的研究,彻底改变了人工智能,乃至整个人类发展的轨迹。

《深度学习的数学》是一本介绍深度学习数学原理的书籍。本书由日本作者涌井良幸和涌井贞美合著,由杨瑞龙翻译成中文。

作者: [日]涌井良幸 / [日]涌井贞美
出版社: 人民邮电出版社 2019
出品方: 图灵教育
原作名: ディープラーニングがわかる数学入門
译者: 杨瑞龙
出版年: 2019-4
页数: 236
定价: 69.00元
装帧: 平装
丛书: 图灵程序设计丛书·程序员的数学
ISBN/ISSN:978-7-115-50934-5
载体形态:225页 :图 ;21cm
中图分类号:TP181

内容简介

《深度学习的数学》基于丰富的图示和具体示例,通俗易懂地介绍了深度学习相关的数学知识。本书主要分为四个部分:线性代数、微积分、概率论和信息论。每一部分都详细介绍了相关的数学知识,并结合深度学习的应用进行讲解。

在线性代数部分,本书讲解了向量、矩阵、线性变换等基本概念,以及它们在深度学习中的应用,如神经网络的表示和运算。

在微积分部分,本书讲解了导数、积分、链式法则等基本概念,以及它们在深度学习中的应用,如优化算法和反向传播算法。

在概率论部分,本书讲解了概率、随机变量、期望等基本概念,以及它们在深度学习中的应用,如概率图模型和贝叶斯推断。

在信息论部分,本书讲解了信息熵、互信息、条件熵等基本概念,以及它们在深度学习中的应用,如信息论视角下的优化和学习。

第1章介绍神经网络的概况;

第2章介绍理解神经网络所需的数学基础知识;

第3章介绍神经网络的最优化;

第4章介绍神经网络和误差反向传播法;

第5章介绍深度学习和卷积神经网络。

书中使用Excel进行理论验证,帮助读者直观地体验深度学习的原理。

作者和译者简介

涌井良幸

1950年生于东京,毕业于东京教育大学(现筑波大学)数学系,现为自由职业者。

著有《用Excel学深度学习》(合著)、《统计学有什么用?》等。

涌井贞美

1952年生于东京,完成东京大学理学系研究科硕士课程,现为自由职业者。

著有《用Excel学深度学习》(合著)、《图解贝叶斯统计入门》等。

译者简介:杨瑞龙

1982年生,2008年北京大学数学科学学院硕士毕业,软件开发者,从事软件行业10年。2013年~2016年赴日工作3年,从2016年开始在哆嗒数学网公众号发表《数学上下三万年》等多篇翻译作品。

目录

第1章 神经网络的思想


1 - 1 神经网络和深度学习  2
1 - 2 神经元工作的数学表示  6
1 - 3 激活函数:将神经元的工作一般化  12
1 - 4 什么是神经网络  18
1 - 5 用恶魔来讲解神经网络的结构  23
1 - 6 将恶魔的工作翻译为神经网络的语言  31
1 - 7 网络自学习的神经网络  36


第2章 神经网络的数学基础


2 - 1 神经网络所需的函数  40
2 - 2 有助于理解神经网络的数列和递推关系式  46
2 - 3 神经网络中经常用到的Σ符号  51
2 - 4 有助于理解神经网络的向量基础  53
2 - 5 有助于理解神经网络的矩阵基础  61
2 - 6 神经网络的导数基础  65
2 - 7 神经网络的偏导数基础  72
2 - 8 误差反向传播法必需的链式法则  76
2 - 9 梯度下降法的基础:多变量函数的近似公式  80
2 - 10 梯度下降法的含义与公式  83
2 - 11 用Excel 体验梯度下降法  91
2 - 12 最优化问题和回归分析  94


第3章 神经网络的最优化


3 - 1 神经网络的参数和变量  102
3 - 2 神经网络的变量的关系式  111
3 - 3 学习数据和正解  114
3 - 4 神经网络的代价函数  119
3 - 5 用Excel体验神经网络  127


第4章 神经网络和误差反向传播法


4 - 1 梯度下降法的回顾  134
4 - 2 神经单元误差  141
4 - 3 神经网络和误差反向传播法  146
4 - 4 用Excel体验神经网络的误差反向传播法  153


第5章 深度学习和卷积神经网络


5 - 1 小恶魔来讲解卷积神经网络的结构  168
5 - 2 将小恶魔的工作翻译为卷积神经网络的语言  174
5 - 3 卷积神经网络的变量关系式  180
5 - 4 用Excel体验卷积神经网络  193
5 - 5 卷积神经网络和误差反向传播法  200
5 - 6 用Excel体验卷积神经网络的误差反向传播法  212


附录


A 训练数据(1)  222
B 训练数据(2)  223
C 用数学式表示模式的相似度  225

小试牛刀

使用清华镜像开源获取anaconda安装软件

https://mirrors.tuna.tsinghua.edu.cn/anaconda/archive/Anaconda3-2024.06-1-Windows-x86_64.exe

安装成功后,打开Prompt,配置环境

因为TensorFlow可能需要使用某些特定版本的库,而这些库与您的系统上的其他应用程序可能存在冲突,使用虚拟环境可以隔离TensorFlow和其他应用程序之间的库,从而避免冲突,所以先为TensorFlow创建一个虚拟环境。

在Anaconda Prompt终端中,运行以下命令以创建名为“py39tf210_env”的虚拟环境:

conda create --name py39tf210_env

激活虚拟环境
在创建虚拟环境之后,您需要激活该虚拟环境。在Anaconda Prompt终端中,运行以下命令:

conda activate py39tf210_env

查看当前已有环境,激活的环境
conda env list

安装TensorFlow,打开anaconda prompt,然后输入在里面输入以下命令:

conda install pip

更新

python -m pip install --upgrade pip

nvidia显卡驱动下载
https://www.nvidia.cn/content/DriverDownloads/confirmation.php?url=/Windows/460.89/460.89-desktop-win10-64bit-international-dch-whql.exe&lang=cn&type=GeForce


 

cuda下载
https://developer.nvidia.com/cuda-toolkit
https://developer.nvidia.com/cuda-11.2.0-download-archive

cudnn下载
https://developer.nvidia.com/rdp/cudnn-archive

下载的cudnn文件夹,将bin、include、lib合并到CUDA对应版本文件夹

在文件夹NVIDIA GPU Computing Toolkit\CUDA\v11.2\extras\demo_suite CMD命令窗口,检查CUDA安装是否成功

执行bandwidthTest.exe

Microsoft Windows [版本 10.0.19045.5011]
(c) Microsoft Corporation。保留所有权利。

C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.2\extras\demo_suite>bandwidthTest.exe
[CUDA Bandwidth Test] - Starting...
Running on...

 Device 0: NVIDIA GeForce GTX 1660 Ti
 Quick Mode

 Host to Device Bandwidth, 1 Device(s)
 PINNED Memory Transfers
   Transfer Size (Bytes)        Bandwidth(MB/s)
   33554432                     6315.2

 Device to Host Bandwidth, 1 Device(s)
 PINNED Memory Transfers
   Transfer Size (Bytes)        Bandwidth(MB/s)
   33554432                     6322.3

 Device to Device Bandwidth, 1 Device(s)
 PINNED Memory Transfers
   Transfer Size (Bytes)        Bandwidth(MB/s)
   33554432                     249160.5

Result = PASS

NOTE: The CUDA Samples are not meant for performance measurements. Results may vary when GPU Boost is enabled.

执行deviceQuery.exe

Microsoft Windows [版本 10.0.19045.5011]
(c) Microsoft Corporation。保留所有权利。


C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.2\extras\demo_suite>deviceQuery.exe
deviceQuery.exe Starting...

 CUDA Device Query (Runtime API) version (CUDART static linking)

Detected 1 CUDA Capable device(s)

Device 0: "NVIDIA GeForce GTX 1660 Ti"
  CUDA Driver Version / Runtime Version          12.0 / 11.2
  CUDA Capability Major/Minor version number:    7.5
  Total amount of global memory:                 6144 MBytes (6442123264 bytes)
  (24) Multiprocessors, ( 64) CUDA Cores/MP:     1536 CUDA Cores
  GPU Max Clock rate:                            1590 MHz (1.59 GHz)
  Memory Clock rate:                             6001 Mhz
  Memory Bus Width:                              192-bit
  L2 Cache Size:                                 1572864 bytes
  Maximum Texture Dimension Size (x,y,z)         1D=(131072), 2D=(131072, 65536), 3D=(16384, 16384, 16384)
  Maximum Layered 1D Texture Size, (num) layers  1D=(32768), 2048 layers
  Maximum Layered 2D Texture Size, (num) layers  2D=(32768, 32768), 2048 layers
  Total amount of constant memory:               zu bytes
  Total amount of shared memory per block:       zu bytes
  Total number of registers available per block: 65536
  Warp size:                                     32
  Maximum number of threads per multiprocessor:  1024
  Maximum number of threads per block:           1024
  Max dimension size of a thread block (x,y,z): (1024, 1024, 64)
  Max dimension size of a grid size    (x,y,z): (2147483647, 65535, 65535)
  Maximum memory pitch:                          zu bytes
  Texture alignment:                             zu bytes
  Concurrent copy and kernel execution:          Yes with 2 copy engine(s)
  Run time limit on kernels:                     Yes
  Integrated GPU sharing Host Memory:            No
  Support host page-locked memory mapping:       Yes
  Alignment requirement for Surfaces:            Yes
  Device has ECC support:                        Disabled
  CUDA Device Driver Mode (TCC or WDDM):         WDDM (Windows Display Driver Model)
  Device supports Unified Addressing (UVA):      Yes
  Device supports Compute Preemption:            Yes
  Supports Cooperative Kernel Launch:            Yes
  Supports MultiDevice Co-op Kernel Launch:      No
  Device PCI Domain ID / Bus ID / location ID:   0 / 1 / 0
  Compute Mode:
     < Default (multiple host threads can use ::cudaSetDevice() with device simultaneously) >

deviceQuery, CUDA Driver = CUDART, CUDA Driver Version = 12.0, CUDA Runtime Version = 11.2, NumDevs = 1, Device0 = NVIDIA GeForce GTX 1660 Ti
Result = PASS

NVIDIA的nvidia-smi(系统管理接口,监控GPU状态)提供GPU实时性能数据和管理功能

NVIDIA的nvcc-V(CUDA编译器版本信息获取工具)主要用于检查编译器兼容

在CMD窗口使用指令nvidia-smi 查看

Microsoft Windows [版本 10.0.19045.5011]
(c) Microsoft Corporation。保留所有权利。

C:\Users\admin>nvidia-smi
Fri Oct 11 15:19:13 2024
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 528.79       Driver Version: 528.79       CUDA Version: 12.0     |
|-------------------------------+----------------------+----------------------+
| GPU  Name            TCC/WDDM | Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|                               |                      |               MIG M. |
|===============================+======================+======================|
|   0  NVIDIA GeForce ... WDDM  | 00000000:01:00.0 Off |                  N/A |
| N/A   59C    P0    24W /  80W |      0MiB /  6144MiB |      0%      Default |
|                               |                      |                  N/A |
+-------------------------------+----------------------+----------------------+

+-----------------------------------------------------------------------------+
| Processes:                                                                  |
|  GPU   GI   CI        PID   Type   Process name                  GPU Memory |
|        ID   ID                                                   Usage      |
|=============================================================================|
|  No running processes found                                                 |
+-----------------------------------------------------------------------------+

 在CMD窗口使用指令nvcc -V 查看

Microsoft Windows [版本 10.0.19045.5011]
(c) Microsoft Corporation。保留所有权利。

C:\Users\admin>nvcc -V
nvcc: NVIDIA (R) Cuda compiler driver
Copyright (c) 2005-2020 NVIDIA Corporation
Built on Mon_Nov_30_19:15:10_Pacific_Standard_Time_2020
Cuda compilation tools, release 11.2, V11.2.67
Build cuda_11.2.r11.2/compiler.29373293_0

pip install -i https://pypi.tuna.tsinghua.edu.cn/simple tensorflow==2.10.0
pip install -i https://pypi.tuna.tsinghua.edu.cn/simple tensorflow-gpu==2.10.1

注意:本机 Windows 上的 GPU 支持仅适用于 2.10 或更早版本,从 TF 2.11 开始,Windows 不支持 CUDA 构建。要在 Windows 上使用 TensorFlow GPU,您需要在 WSL2 中构建/安装 TensorFlow 或将 tensorflow-cpu 与 TensorFlow-DirectML-Plugin 一起使用。

CPU版本和GPU版本的区别主要在于运行速度,GPU版本运行速度更快,所以如果电脑显卡支持cuda,推荐安装gpu版本的。CPU版本,无需额外准备,CPU版本一般电脑都可以安装,无需额外准备显卡的内容。GPU版本,需要提前下载 cuda 和 cuDNN。

安装前 一定 要查看自己电脑的环境配置,然后查询Tensorflow-gpuPython、 cuda 、 cuDNN 版本关系,要 一 一对应。

tensorflow版本从2.x开始不再区分CPU版和GPU版,因此在软件配置正确的情况下,是可以找到GPU设备的。Tensorflow 2.10是最后一个在本地windows上支持GPU的版本。从2.11版本开始,需要在windows WLS2(适用于 Linux 的 Windows 子系统)上安装才能使用GPU。所以要在native-windows上使用GPU,就只能安装2.10.0版本及以下的版本,或者安装老版的tensorflow-gpu。

tensorflow gpu cuda cudnn对应表格
https://tensorflow.google.cn/install/source_windows?hl=en#gpu


tensorflow:支持 CPU 和 GPU 的最新稳定版(适用于 Ubuntu 和 Windows)
tf-nightly:预览 build(不稳定)。Ubuntu 和 Windows 均包含 GPU 支持。
旧版 TensorFlow
对于 TensorFlow 1.x,CPU 和 GPU 软件包是分开的:

tensorflow==1.15:仅支持 CPU 的版本
tensorflow-gpu==1.15:支持 GPU 的版本(适用于 Ubuntu 和 Windows)
系统要求
Python 3.6–3.9
若要支持 Python 3.9,需要使用 TensorFlow 2.5 或更高版本。
若要支持 Python 3.8,需要使用 TensorFlow 2.2 或更高版本。
pip 19.0 或更高版本(需要 manylinux2010 支持)
Ubuntu 16.04 或更高版本(64 位)
macOS 10.12.6 (Sierra) 或更高版本(64 位)(不支持 GPU)
macOS 要求使用 pip 20.3 或更高版本
Windows 7 或更高版本(64 位)
适用于 Visual Studio 2015、2017 和 2019 的 Microsoft Visual C++ 可再发行软件包
GPU 支持需要使用支持 CUDA® 的卡(适用于 Ubuntu 和 Windows)
注意:必须使用最新版本的 pip,才能安装 TensorFlow 2。
硬件要求
从 TensorFlow 1.6 开始,二进制文件使用 AVX 指令,这些指令可能无法在旧版 CPU 上运行。

安装numpy

pip install -i https://pypi.tuna.tsinghua.edu.cn/simple numpy==1.20.3

安装pytorch

Commands for Versions >= 1.0.0
v2.4.0
Conda
OSX
# conda
conda install pytorch==2.4.0 torchvision==0.19.0 torchaudio==2.4.0 -c pytorch
Linux and Windows
# CUDA 11.8
conda install pytorch==2.4.0 torchvision==0.19.0 torchaudio==2.4.0  pytorch-cuda=11.8 -c pytorch -c nvidia
# CUDA 12.1
conda install pytorch==2.4.0 torchvision==0.19.0 torchaudio==2.4.0 pytorch-cuda=12.1 -c pytorch -c nvidia
# CUDA 12.4
conda install pytorch==2.4.0 torchvision==0.19.0 torchaudio==2.4.0 pytorch-cuda=12.4 -c pytorch -c nvidia
# CPU Only
conda install pytorch==2.4.0 torchvision==0.19.0 torchaudio==2.4.0 cpuonly -c pytorch

(base) C:\Users\>conda install pytorch==2.4.0 torchvision==0.19.0 torchaudio==2.4.0 pytorch-cuda=12.4 -c pytorch -c nvidia
Collecting package metadata (current_repodata.json): done
Solving environment: failed with initial frozen solve. Retrying with flexible solve.
Collecting package metadata (repodata.json): done
Solving environment: -
The environment is inconsistent, please check the package plan carefully
The following packages are causing the inconsistency:

  - defaults/win-64::anaconda==2021.11=py39_0
  - defaults/win-64::astropy==4.3.1=py39hc7d831d_0
  - defaults/win-64::bkcharts==0.2=py39haa95532_0

测试配置

import tensorflow as tf
from tensorflow.keras import layers, models
import torch
import os

#将TF_ENABLE_ONEDNN_OPTS设置成1或以上。0代表显示所有信息,1表示不显示info,2表示不显示warning,3表示不显示error。
os.environ['TF_CPP_MIN_LOG_LEVEL'] = '2'

print('TensorFlow版本',tf.__version__)
# 查看torch当前版本号
print('torch当前版本号',torch.__version__)  
# 编译当前版本的torch使用的cuda版本号
print('torch使用的cuda版本号',torch.version.cuda)  
# 查看当前cuda是否有可用的Torch,如果输出True,则表示存在/成功安装
print('当前cuda是否有可用的Torch',torch.cuda.is_available())  

# 获取TensorFlow的构建信息
build = tf.sysconfig.get_build_info()

# 打印CUDA的版本号(如果已安装)
print(build['cuda_version'])

# 打印cuDNN的版本号(如果已安装)
print(build['cudnn_version'])

print('GPU:是否已经编译了CUDA支持', tf.test.is_built_with_cuda())
print('GPU:当前GPU设备名称', tf.test.gpu_device_name())

# 输出可用的GPU数量
print("Num GPUs Available: ", len(tf.config.experimental.list_physical_devices('GPU')))

print("Num CPUs Available: ", len(tf.config.list_physical_devices('CPU')))

#默认情况下,TensorFlow 会映射进程可见的所有 GPU(取决于 CUDA_VISIBLE_DEVICES)的几乎全部内存。
#这是为了减少内存碎片,更有效地利用设备上相对宝贵的 GPU 内存资源
gpus = tf.config.list_physical_devices('GPU')

if gpus:
  # Restrict TensorFlow to only allocate 1GB of memory on the first GPU
  try:
    tf.config.set_logical_device_configuration(
        gpus[0],
        [tf.config.experimental.VirtualDeviceConfiguration(memory_limit=1024*6)])
    logical_gpus = tf.config.list_logical_devices('GPU')
    print(len(gpus), "Physical GPUs,", len(logical_gpus), "Logical GPUs")
  except RuntimeError as e:
    # Virtual devices must be set before GPUs have been initialized
    print(e)

import timeit
 
#指定在cpu上运行
def cpu_run():
    with tf.device('/cpu:0'):
        cpu_a = tf.random.normal([10000, 1000])
        cpu_b = tf.random.normal([1000, 2000])
        c = tf.matmul(cpu_a, cpu_b)
    return c
 
#指定在gpu上运行 
def gpu_run():
    with tf.device('/gpu:0'):
        gpu_a = tf.random.normal([10000, 1000])
        gpu_b = tf.random.normal([1000, 2000])
        c = tf.matmul(gpu_a, gpu_b)
    return c

cpu_time = timeit.timeit(cpu_run, number=10)
gpu_time = timeit.timeit(gpu_run, number=10)
print("cpu:", cpu_time, "  gpu:", gpu_time)

主程序

输出概率
Softmax
线性层
加法和归一化
前馈网络
残差连接与层归一化
加权和归一化
多头注意力
编码器自注意力:标记互相看
查询、键、值由编码器的状态计算得出
位置编码
输入嵌入
输入
前馈网络:在获取其他标记的信息后,花点时间思考和加工这些信息
编码器自注意力:目标标记查看源查询——来自解码器的状态;键和值来自编码器的状态
解码器自注意力(掩蔽):标记查看前面的标记
查询、键、值由解码器的状态计算得出
位置编码
输出嵌入
输出(右移)

 Torch和TensorFlow是两个流行的深度学习框架,都可以用于构建和训练神经网络模型。它们都提供了强大的计算图和自动求导功能,以及一系列工具和库来简化深度学习模型的开发和训练过程。Torch的计算图是动态的,这意味着可以在运行时进行修改和调整。而TensorFlow的计算图是静态的,需要在构建之后才能执行。这使得Torch更适合于动态的计算流程和实验性的研究,而TensorFlow更适合于大规模的生产环境和优化计算性能。Torch在早期得到了广泛的应用和支持,而TensorFlow则在谷歌的大力推动下逐渐成为了业界的主流深度学习框架之一。这也导致了它们在生态系统和社区支持方面的差异。

# -*- coding: utf-8 -*-
"""
Spyder Editor

This is a temporary script file.

y=sin(x) 神经网络的设计与训练
"""

import torch
from torch import nn
from matplotlib import pyplot as plt
import numpy as np

#神经网络类 network,继承nn.Module类
class Network(nn.Module):
    #类的初始化函数
    #init函数传入1个参数m,m代表了隐藏层神经元的数量
    def __init__(self,m):
        super().__init__()
        #layer1 是输入层与隐藏层之间的线性层,大小是1*m
        self.layer1 = nn.Linear(1,m)
        #layer2 是隐藏层与输出层之间的线性层,大小是m*1
        self.layer2 = nn.Linear(m,1)
        
    #前向传播计算函数,函数传入输入数据x
    def forward(self,x):
        x = self.layer1(x) #先进入layer1层,计算结果
        x = torch.sigmoid(x) #使用sigmoid激活函数
        return self.layer2(x) #返回layer2层的计算结果
        
    
# 在main函数中实现模型的测试代码
if __name__ == '__main__':
    '''
    model = Network(6) #创建模型,隐藏层个数传入6
    print(model) #打印model,可以看到模型的结果
    #接着使用循环,遍历模型中的参数
    for name,param in model.named_parameters():
        #打印参数名name和参数的尺寸param.data.shape
        print(f"{name}:{param.data.shape}")
    #定义一个100*1大小的张量
    #代表了100个输入数据,每个数据包括1个特征值,也就是sin(x)中的x
    x = torch.zeros([100,1])
    h = model(x) #将x输入至模型model,得到预测结果h,h即为sin(x)
    print(f"x:{x.shape}")
    print(f"h:{h.shape}")
    '''
    
    #神经网络模拟正弦函数,是一个回归任务
    #训练使用的数据,可以直接使用正弦函数进行构造

    #数据的生成代码
    #使用np.arrange 生成一个从0到1,步长为0.01
    # 含有100个数据点的数组,作为正弦函数的输入数据
    x = np.arange(0.0, 1.0, 0.01)
    # 将0到1的x,乘以2倍PI,从单位间隔转换为弧度值
    # 将x映射到正弦函数的一个完整周期上,并计算正弦值
    y = np.sin(2*np.pi*x)
    # 将x和y通过reshape函数转为100乘以1的数组
    # 也就是100个(x,y)坐标,代表100个训练数据
    x = x.reshape(100,1)
    y = y.reshape(100,1)
    #将(x,y)组成的数据点,画在画板上
    plt.scatter(x,y)
    
    x = torch.Tensor(x) # 训练前将数据x和y转化为tensor张量
    y = torch.Tensor(y)
    
    model = Network(3) # 定义神经网络
    # 隐藏层的神经元数量,可以尝试6 、 10 、 32 等数据,观察实验结果
    
    criterion = nn.MSELoss() # 创建均方误差损失函数
    # Adam优化器optimizer
    optimizer = torch.optim.Adam(model.parameters(), lr=0.001)
    # 完成这些必要的变量声明后,就可以进入神经网络的循环迭代
    
    #神经网络训练代码
    # 使用批量梯度下降算法
    # 每次迭代,都会基于全部样本计算损失值,执行梯度下降
    # 循环的轮数定为1万,就可以使模型达到收敛
    for epoch in range(1000):
        # 在循环中,包括了5个步骤:
        h = model(x) #1.计算模型的预测值h
        loss=criterion(h,y) #2.计算预测h和标签y之间的损失loss
        loss.backward() #3.使用backward计算梯度
        optimizer.step() #4.使用optimizer.step更新参数
        optimizer.zero_grad() #5.将梯度清零
        
        #这5个步骤,是使用pytorch框架训练模型的定式
        
        #每迭代1000次,就打印一次模型的损失,用于观察训练的过程
        if epoch % 1000 == 0:
            #其中loss.item是损失的标量值
            print(f"After { epoch } iterations, the loss is {loss.item()}")
    
    h = model(x) # 完成训练后,使用模型预测输入x,得到预测结果h
    x = x.data.numpy()
    h = h.data.numpy()
    plt.scatter(x,h) # 将预测点(x,h)打印在屏幕
    plt.show()
    

# -*- coding: utf-8 -*-
"""
Created on Tue Oct 15 17:57:10 2024

"""

import tensorflow as tf
print(tf.__file__)
#https://github.com/tensorflow/tensorflow

from tensorflow.examples.tutorials.mnist import input_data
mnist = input_data.read_data_sets("MNIST_data", one_hot=True)
print ( ' 输入数据:', mnist.train.images)
print ( ' 输入数据打shape :', mnist.train.images.shape)
import pylab
im = mnist.train.images[1]
im = im.reshape(-1 ,28)
pylab.imshow(im)
pylab.show()
# -*- coding: utf-8 -*-
"""
Created on Tue Oct 15 17:51:23 2024

"""

#coding: utf-8
import os
import tensorflow as tf
from tensorflow.examples.tutorials.mnist import input_data
from PIL import Image

'''
函数功能:按照bmp格式提取mnist数据集中的图片
参数介绍:
    mnist_dir   mnist数据集存储的路径
    save_dir    提取结果存储的目录
'''
def extract_mnist(mnist_dir, save_dir):
    rows = 28
    cols = 28

    # 加载mnist数据集
    # one_hot = True为默认打开"独热编码"
    mnist = input_data.read_data_sets(mnist_dir, one_hot=False)
    # 获取训练图片数量
    shape = mnist.train.images.shape
    images_train_count = shape[0]
    pixels_count_per_image = shape[1]
    # 获取训练标签数量=训练图片数量
    # 关闭"独热编码"后,labels的类型为[7 3 4 ... 5 6 8]
    labels = mnist.train.labels
    labels_train_count = labels.shape[0]

    if (images_train_count == labels_train_count):
        print("训练集共包含%d张图片,%d个标签" % (images_train_count, labels_train_count))
        print("每张图片包含%d个像素" % (pixels_count_per_image))
        print("数据类型为", mnist.train.images.dtype)

        # mnist图像数值的范围为[0,1], 需将其转换为[0,255]
        for current_image_id in range(images_train_count):
            for i in range(pixels_count_per_image):
                if mnist.train.images[current_image_id][i] != 0:
                    mnist.train.images[current_image_id][i] = 255

            if ((current_image_id + 1) % 50) == 0:
                print("已转换%d张,共需转换%d张" %
                      (current_image_id + 1, images_train_count))

        # 创建train images的保存目录, 按标签保存
        for i in range(10):
            dir = "%s/%s" % (save_dir, i)
            print(dir)
            if not os.path.exists(dir):
                os.mkdir(dir)

        # indices = [0, 0, 0, ..., 0]用来记录每个标签对应的图片数量
        indices = [0 for x in range(0, 10)]
        for i in range(images_train_count):
            new_image = Image.new("L", (cols, rows))
            # 遍历new_image 进行赋值
            for r in range(rows):
                for c in range(cols):
                    new_image.putpixel(
                        (r, c), int(mnist.train.images[i][c + r * cols]))

            # 获取第i张训练图片对应的标签
            label = labels[i]
            image_save_path = "%s/%s/%s.bmp" % (save_dir, label,
                                                indices[label])
            indices[label] += 1
            new_image.save(image_save_path)

            # 打印保存进度
            if ((i + 1) % 50) == 0:
                print("图片保存进度: 已保存%d张,共需保存%d张" % (i + 1, images_train_count))
    else:
        print("图片数量与标签数量不一致!")


if __name__ == '__main__':
    mnist_dir = "./mnist_Data"
    save_dir = "./mnist_Data_TrainImages"
    extract_mnist(mnist_dir, save_dir)
    
    
# -*- coding: utf-8 -*-
"""
Created on Tue Oct 15 15:30:10 2024
设计、实现标准前馈神经网络
输入层 隐藏层 输出层 sofrmax p0+p1+...+pn=1
神经网络的设计和实现
训练数据的准备和处理
模型的训练和测试流程

MNIST是一个手写体数字的图片数据集,该数据集来由美国国家标准与技术研究所(National Institute of Standards and Technology (NIST))发起整理,一共统计了来自250个不同的人手写数字图片,其中50%是高中生,50%来自人口普查局的工作人员。
该数据集的收集目的是希望通过算法,实现对手写数字的识别。
1998年,Yan LeCun 等人发表了论文《Gradient-Based Learning Applied to Document Recognition》,
首次提出了LeNet-5 网络,利用上述数据集实现了手写字体的识别。
https://yann.lecun.com/exdb/mnist/

在PyTorch中构建一个简单的卷积神经网络,并使用MNIST数据集训练它识别手写数字。 MNIST包含70,000张手写数字图像: 60,000张用于培训,10,000张用于测试。
图像是灰度(即通道数为1),28x28像素,并且居中的,以减少预处理和加快运行。
PyTorch是一个非常流行的深度学习框架。但是与其他框架不同的是,PyTorch具有动态执行图,意味着计算图是动态创建的。

"""
import torch
from torch import nn
from torchvision import transforms
from torchvision import datasets
from torch.utils.data import DataLoader
import torch.optim as optim

# 定义神经网络Network
class Network(nn.Module):
    def __init__(self):
        super().__init__()
        # 线性层1,输入层和隐藏层之间的线性层
        self.layer1 = nn.Linear(784,256)
        # 线性层2,隐藏层和输出层之间的线性层
        self.layer2 = nn.Linear(256,10)
        
    # 在前向传播,forward函数中,输入为图像x
    def forward(self,x):
        x = x.view(-1, 28*28) #使用 view函数,将x展平
        x = self.layer1(x) # 将x输入至layer1
        x = torch.relu(x) # 使用relu激活
        return self.layer2(x) # 输入至layer2计算结果
        # 这里我们没有直接定义softmax层
        # 这是因为后面会使用CrossRntropyLoss损失函数
        # 在这个损失函数中,会实现softmax的计算

#训练数据的准备和处理
#MNIST数据集,可以从torchvision.datasets中获取
#TFDS 存在于两个软件包中:
#pip install tensorflow-datasets:稳定版,数月发行一次。
#pip install tfds-nightly:每天发行,包含最近版本的数据集。
#https://tensorflow.google.cn/datasets/overview?hl=zh-cn
#https://tensorflow.google.cn/datasets/catalog/overview
#train_dataset = datasets.MNIST(root='./MNIST',train=True,transform=data_tf,download=True)
#train 60000个数据,用作训练
#tesst 10000个数据,用作测试

#一共4个文件,训练集、训练集标签、测试集、测试集标签

#文件名称	大小	内容
#train-labels-idx1-ubyte.gz	9,681 kb	55000张训练集,5000张验证集
#train-labels-idx1-ubyte.gz	29 kb	训练集图片对应的标签
#t10k-images-idx3-ubyte.gz	1,611 kb	10000张测试集
#t10k-labels-idx1-ubyte.gz	5 kb	测试集图片对应的标签

#如果直接下载该数据集的话,下载下来的是.gz格式的数据
#手动提取Mnist数据集中的图片,并把它按照常用的格式存储 在代码中用到了两个第三方的包,分别为tensorflow、PIL
#conda install tensorflow-gpu
#conda install Pillow
#https://github.com/tensorflow/tensorflow 压缩包解压,把里面的tensorflow下的examples文件夹直接复制过来

# 数据处理流程
if __name__ == '__main__':
    device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")
    print(torch.cuda.device_count(),torch.cuda.get_device_name(0),torch.cuda.current_device())
    #optimizer加载参数时,tensor默认在CPU上,当你使用GPU训练时就会报以上错误。
    # 实现图像的预处理pipeline
    transform = transforms.Compose([
        transforms.Grayscale(num_output_channels=1), #转换为单通道灰度图 
        transforms.ToTensor() # 转换为张量
    ])
    # 使用ImageFolder函数,读取数据文件夹,构建数据集dataset
    # 这个函数会将保存数据的文件夹的名字,作为数据的标签,组织数据
    # 例如:对于名字为“3”的文件夹
    # 会将“3”作为文件夹中图像数据的标签,和图像匹配,用于后续的训练,使用起来非常的方便
    train_dataset =  datasets.ImageFolder(root="./mint_image/train", transform=transform)
    #读取测试数据集
    test_dataset =  datasets.ImageFolder(root="./mint_image/test", transform=transform)
    # 打印他们的长度
    print("train_dataset length:", len(train_dataset))
    print("test_dataset length:", len(test_dataset))
    
    #使用train_loader,实现小批量的数据读取
    #这里设置小批量的大小,batch_size=64,也就是每个批次,包括64个数据
    train_loader = DataLoader(train_dataset, batch_size=64, shuffle=True)
    #打印train_loader的长度
    print("train_loader length:", len(train_loader))
    #60000个训练数据,如果每个小批量,读入64个样本,那么60000个数据会被分割成938组数据
    #计算938*64=60032,这说明最后一个组,会不够64个数据
    
    # 循环遍历train_loader
    # 每一次循环,都会取出64个图像数据,作为一个小批量batch
    for batch_idx, (data, label) in enumerate(train_loader):
        if batch_idx == 2: #打印前2个batch观察
           break
        print("batch_idx:",batch_idx)
        print("data.shape:", data.shape) # 数据尺寸 torch.Size([64, 1, 28, 28]) 每组数据包括64个图像,每个图像有1个灰色通道,图像的尺寸是28*28
        print("label:", label.shape) #图像中的数字
        print(label)
    
    #模型的训练和测试
    #在使用pytorch训练模型时,需要创建三个对象
    model = Network() #1. 模型本身,它就是我们设计的神经网络
    #整个模型的参数、缓存、依赖的模块 放入device
    model.to(device)
    optimizer = optim.Adam(model.parameters()) #2. 优化器,优化模型中的参数
    criterion = nn.CrossEntropyLoss() #3. 损失函数,分类问题,使用交叉熵损失误差
    
    #进入模型的循环迭代
    for epoch in range(10): #外层循环,代表整个训练数据集的遍历次数
        #整个训练集要循环多少轮,是10次、20次或者100次都是可能的
        
        #内循环使用train_loader,进行小批量的数据读取
        for batch_idx, (inputs, target) in enumerate(train_loader):
            #内层每循环一次,就会进行一次梯度下降算法
            #包括5个步骤:
            #inputs, target = data
            inputs, target = inputs.to(device), target.to(device)
            # forward + backward + update
            outputs = model(inputs) #1.计算神经网络的前向传播结果
            loss = criterion(outputs, target) #2. 计算output和标签label之间的损失loss
            loss.backward() #3. 使用backward计算梯度
            optimizer.step() #4. 使用optimizer.step更新参数
            optimizer.zero_grad() #5.将梯度清零
            #这5个步骤,是使用pytorch框架训练模型的定式
            
            #每迭代100个小批量,就打印一次模型的损失,观察训练的过程
            if batch_idx % 100 == 0:
                print(f"Epoch {epoch + 1}/10" f"|Batch {batch_idx}/{len(train_loader)}" f"|Loss: {loss.item():.4f}")
                
    torch.save(model.state_dict(), './mnist.pth') # 保存模型
    
    #对象模型进行测试  map_location=lambda storage, loc: storage.cuda(0) map_location=torch.device('cuda:0') , map_location=device
    #model.load_state_dict(torch.load('./mnist.pth', map_location=device )) #加载刚刚训练好的模型文件
    
    checkpoint = torch.load("./mnist.pth", map_location=device)  # Load all tensors onto the CPU
    model.load_state_dict(checkpoint)
    #optimizer.load_state_dict(checkpoint)
    
    #for k, v in optimizer.state.items():    # key is Parameter, val is a dict {key='momentum_buffer':tensor(...)}
    #    if 'momentum_buffer' not in v:
    #        continue
    #    optimizer.state[k]['momentum_buffer'] = optimizer.state[k]['momentum_buffer'].cuda()


    # 增加以下几行代码,将optimizer里的tensor数据全部转到GPU上
    #for state in optimizer.state.values():
    #    for k, v in state.items():
    #        if torch.is_tensor(v):
    #            state[k] = v.cuda()


    right = 0 # 保存正确识别的数量
    for i,(x,y) in enumerate(test_dataset):
        output = model(x) #将其中的数据x输入到模型
        predict = output.argmax(1).item() #选择概率最大的作为预测结果
        #对比预测值predict和真实标签y
        if predict == y:
            right += 1
        else:
            #将识别错误的样例打印出来
            img_path = test_dataset.samples[i][0]
            print(f"wrong case: predict = {predict} y= {y} img_path = {img_path}")
            
    #计算测试效果
    sample_num = len(test_dataset)
    acc = right*1.0/sample_num
    print("test accuracy = %d/%d = %.31f" % (right, sample_num, acc))
    
    

train_dataset length: 55000
test_dataset length: 63
train_loader length: 860
batch_idx: 0
data.shape: torch.Size([64, 1, 28, 28])
label: torch.Size([64])
tensor([5, 7, 5, 5, 1, 3, 0, 3, 6, 6, 2, 5, 5, 2, 7, 0, 1, 3, 4, 4, 7, 9, 0, 6,
        6, 8, 3, 1, 6, 7, 7, 6, 6, 2, 0, 3, 0, 4, 7, 5, 1, 9, 1, 3, 2, 0, 1, 7,
        6, 5, 4, 6, 7, 4, 7, 6, 3, 2, 6, 2, 0, 0, 2, 4])
batch_idx: 1
data.shape: torch.Size([64, 1, 28, 28])
label: torch.Size([64])
tensor([3, 1, 5, 1, 1, 9, 3, 4, 2, 7, 5, 6, 6, 5, 1, 9, 5, 2, 8, 8, 7, 9, 4, 8,
        4, 1, 0, 7, 3, 9, 0, 2, 8, 4, 0, 7, 1, 9, 8, 7, 9, 3, 5, 1, 3, 9, 9, 4,
        7, 8, 4, 4, 5, 4, 0, 0, 5, 1, 4, 7, 1, 0, 4, 8])
batch_idx: 2
data.shape: torch.Size([64, 1, 28, 28])
label: torch.Size([64])
tensor([8, 0, 8, 1, 9, 1, 0, 4, 2, 3, 7, 6, 3, 4, 1, 1, 5, 9, 6, 1, 7, 2, 8, 2,
        6, 5, 6, 0, 4, 3, 9, 8, 8, 8, 3, 6, 6, 0, 1, 4, 7, 4, 5, 0, 4, 9, 5, 8,
        1, 0, 8, 3, 6, 3, 0, 6, 3, 2, 3, 1, 6, 2, 9, 4])
Epoch 1/10|Batch 0/860|Loss: 2.3214
Epoch 1/10|Batch 100/860|Loss: 0.3560
Epoch 1/10|Batch 200/860|Loss: 0.1577
......
Epoch 9/10|Batch 300/860|Loss: 0.0234
Epoch 9/10|Batch 400/860|Loss: 0.0211
Epoch 9/10|Batch 500/860|Loss: 0.0067
Epoch 9/10|Batch 600/860|Loss: 0.0033
Epoch 9/10|Batch 700/860|Loss: 0.0301
Epoch 9/10|Batch 800/860|Loss: 0.0306
Epoch 10/10|Batch 0/860|Loss: 0.0074
Epoch 10/10|Batch 100/860|Loss: 0.0015
Epoch 10/10|Batch 200/860|Loss: 0.0197
Epoch 10/10|Batch 300/860|Loss: 0.0299
Epoch 10/10|Batch 400/860|Loss: 0.0309
Epoch 10/10|Batch 500/860|Loss: 0.0609
Epoch 10/10|Batch 600/860|Loss: 0.0226
Epoch 10/10|Batch 700/860|Loss: 0.0720
Epoch 10/10|Batch 800/860|Loss: 0.0091

train_dataset length: 55000
test_dataset length: 63
wrong case: predict = 3 y= 2 img_path = ./mint_image/test\2\0.bmp
wrong case: predict = 3 y= 2 img_path = ./mint_image/test\2\10.bmp
wrong case: predict = 3 y= 2 img_path = ./mint_image/test\2\6.bmp
test accuracy = 60/63 = 0.9523809523809523280846178749925
# -*- coding: utf-8 -*-
"""
Created on Tue Oct 15 17:06:39 2024
在PyTorch中构建一个简单的卷积神经网络,并使用MNIST数据集训练它识别手写数字。 MNIST包含70,000张手写数字图像: 60,000张用于培训,10,000张用于测试。
图像是灰度(即通道数为1),28x28像素,并且居中的,以减少预处理和加快运行。
PyTorch是一个非常流行的深度学习框架。但是与其他框架不同的是,PyTorch具有动态执行图,意味着计算图是动态创建的。

"""

import torch
import torchvision
from torch.utils.data import DataLoader
import torch.nn as nn #torch.nn层中包含可训练的参数
import torch.nn.functional as F
import torch.optim as optim
import matplotlib.pyplot as plt
#注意下面两行在matplotlib使用上出错时,加上可不出错
import os
os.environ['KMP_DUPLICATE_LIB_OK'] = 'TRUE'

n_epochs = 3 #epoch的数量定义了将循环整个训练数据集的次数
batch_size_train = 64 #每次投喂的样本数量
batch_size_test = 1000
learning_rate = 0.01
momentum = 0.5 #优化器的超参数
log_interval = 10
random_seed = 1
torch.manual_seed(random_seed) #对于可重复的实验,须为任何使用随机数产生的东西设置随机种子
#训练集数据
train_loader = torch.utils.data.DataLoader(
  torchvision.datasets.MNIST('./data/', train=True, download=True, #加载该数据集(download=True)
                             transform=torchvision.transforms.Compose([
                               torchvision.transforms.ToTensor(),
                               torchvision.transforms.Normalize(
                                 (0.1307,), (0.3081,))
                             ])), #Normalize()转换使用的值0.1307和0.3081是该数据集的全局平均值和标准偏差,这里将它们作为给定值
  batch_size=batch_size_train, shuffle=True)
#测试集数据
test_loader = torch.utils.data.DataLoader(
  torchvision.datasets.MNIST('./data/', train=False, download=True,
                             transform=torchvision.transforms.Compose([
                               torchvision.transforms.ToTensor(),
                               torchvision.transforms.Normalize(
                                 (0.1307,), (0.3081,))
                             ])),
  batch_size=batch_size_test, shuffle=True) #使用size=1000对这个数据集进行测试
#查看一批测试数据由什么组成
examples = enumerate(test_loader) #enumerate指循环,类似for
batch_idx, (example_data, example_targets) = next(examples) #example_targets是图片实际对应的数字标签,example_data是指图片本身数据
print(example_targets)
print(example_data.shape) #输出torch.Size([1000, 1, 28, 28]),意味着我们有1000个例子的28x28像素的灰度(即没有rgb通道)

#定义卷积神经网络
class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        # batch*1*28*28(每次会送入batch个样本,输入通道数1(黑白图像),图像分辨率是28x28)
        # 下面的卷积层Conv2d的第一个参数指输入通道数,第二个参数指输出通道数(即用了几个卷积核),第三个参数指卷积核的大小
        self.conv1 = nn.Conv2d(1, 10, kernel_size=5) #因为图像为黑白的,所以输入通道为1,此时输出数据大小变为28-5+1=24.所以batchx1x28x28 -> batchx10x24x24
        self.conv2 = nn.Conv2d(10, 20, kernel_size=5) #第一个卷积层的输出通道数等于第二个卷积层是输入通道数。
        self.conv2_drop = nn.Dropout2d() #在前向传播时,让某个神经元的激活值以一定的概率p停止工作,可以使模型泛化性更强,因为它不会太依赖某些局部的特征
        self.fc1 = nn.Linear(320, 50) #由于下部分前向传播处理后,输出数据为20x4x4=320,传递给全连接层。# 输入通道数是320,输出通道数是50
        self.fc2 = nn.Linear(50, 10)#输入通道数是50,输出通道数是10,(即10分类(数字1-9),最后结果需要分类为几个就是几个输出通道数)。全连接层(Linear):y=x乘A的转置+b
    def forward(self, x):
        x = F.relu(F.max_pool2d(self.conv1(x), 2)) # batch*10*24*24 -> batch*10*12*12(2*2的池化层会减半,步长为2)(激活函数ReLU不改变形状)
        x = F.relu(F.max_pool2d(self.conv2_drop(self.conv2(x)), 2)) #此时输出数据大小变为12-5+1=8(卷积核大小为5)(2*2的池化层会减半)。所以 batchx10x12x12 -> batchx20x4x4。
        x = x.view(-1, 320) # batch*20*4*4 -> batch*320
        x = F.relu(self.fc1(x)) #进入全连接层
        x = F.dropout(x, training=self.training) #减少遇到过拟合问题,dropout层是一个很好的规范模型。
        x = self.fc2(x)
        #计算log(softmax(x))
        return F.log_softmax(x)
#初始化网络和优化器
#如果我们使用GPU进行训练,应使用例如network.cuda()将网络参数发送给GPU。将网络参数传递给优化器之前,将它们传输到适当的设备很重要,否则优化器无法以正确的方式跟踪它们。
network = Net()
optimizer = optim.SGD(network.parameters(), lr=learning_rate,
                      momentum=momentum)
train_losses = []
train_counter = []
test_losses = []
test_counter = [i*len(train_loader.dataset) for i in range(n_epochs + 1)]
#每个epoch对所有训练数据进行一次迭代。加载单独批次由DataLoader处理
#训练函数
def train(epoch):
    network.train() #在训练模型时会在前面加上
    for batch_idx, (data, target) in enumerate(train_loader):
        optimizer.zero_grad() #使用optimizer.zero_grad()手动将梯度设置为零,因为PyTorch在默认情况下会累积梯度
        output = network(data) #生成网络的输出(前向传递)
        loss = F.nll_loss(output, target) #计算输出(output)与真值标签(target)之间的负对数概率损失
        loss.backward() #对损失反向传播
        optimizer.step() #收集一组新的梯度,并使用optimizer.step()将其传播回每个网络参数
        if batch_idx % log_interval == 0: #log_interval=10,每10次投喂后输出一次
            print('Train Epoch: {} [{}/{} ({:.0f}%)]\tLoss: {:.6f}'.format(
                epoch, batch_idx * len(data), len(train_loader.dataset),
                       100. * batch_idx / len(train_loader), loss.item()))
            train_losses.append(loss.item()) #添加进训练损失列表中
            train_counter.append(
                (batch_idx * 64) + ((epoch - 1) * len(train_loader.dataset)))
            #神经网络模块以及优化器能够使用.state_dict()保存和加载它们的内部状态。这样,如果需要,我们就可以继续从以前保存的状态dict中进行训练——只需调用.load_state_dict(state_dict)。
            torch.save(network.state_dict(), './model.pth')
            torch.save(optimizer.state_dict(), './optimizer.pth')


train(1)

#测试函数。总结测试损失,并跟踪正确分类的数字来计算网络的精度。
def test():
    network.eval() #在测试模型时在前面使用
    test_loss = 0
    correct = 0
    with torch.no_grad(): #使用上下文管理器no_grad(),我们可以避免将生成网络输出的计算结果存储在计算图(计算过程的构建,以便梯度反向传播等操作)中。(with是使用的意思)
        for data, target in test_loader:
            output = network(data) #生成网络的输出(前向传递)
            # 将一批的损失相加
            test_loss += F.nll_loss(output, target, size_average=False).item() #NLLLoss 的输入是一个对数概率向量和一个目标标签
            pred = output.data.max(1, keepdim=True)[1] ## 找到概率最大的下标
            correct += pred.eq(target.data.view_as(pred)).sum() #预测正确的数量相加
    test_loss /= len(test_loader.dataset)
    test_losses.append(test_loss)
    print('\nTest set: Avg. loss: {:.4f}, Accuracy: {}/{} ({:.0f}%)\n'.format(
        test_loss, correct, len(test_loader.dataset),
        100. * correct / len(test_loader.dataset)))

test()

#我们将在循环遍历n_epochs之前手动添加test()调用,以使用随机初始化的参数来评估我们的模型。
for epoch in range(1, n_epochs + 1):
  train(epoch)
  test()

#评估模型的性能,画损失曲线
fig = plt.figure()
plt.plot(train_counter, train_losses, color='blue')
plt.scatter(test_counter, test_losses, color='red')
plt.legend(['Train Loss', 'Test Loss'], loc='upper right')
plt.xlabel('number of training examples seen')
plt.ylabel('negative log likelihood loss')
plt.show()

#输出自己找的测试图片,比较模型的输出。
examples = enumerate(test_loader)
batch_idx, (example_data, example_targets) = next(examples)
with torch.no_grad():
  output = network(example_data)
fig1 = plt.figure()
for i in range(6):
  plt.subplot(2,3,i+1)
  plt.tight_layout()
  plt.imshow(example_data[i][0], cmap='gray', interpolation='none')
  plt.title("Prediction: {}".format(
    output.data.max(1, keepdim=True)[1][i].item()))
  plt.xticks([])
  plt.yticks([])
plt.show()

#继续对网络进行训练,并看看如何从第一次培训运行时保存的state_dicts中继续进行训练。我们将初始化一组新的网络和优化器。
continued_network = Net()
continued_optimizer = optim.SGD(network.parameters(), lr=learning_rate,
                                momentum=momentum)

network_state_dict = torch.load('model.pth') #见左侧项目列表,有该文件
continued_network.load_state_dict(network_state_dict) #使用.load_state_dict(),我们现在可以加载网络的内部状态,并在最后一次保存它们时优化它们。
optimizer_state_dict = torch.load('optimizer.pth') #见左侧项目列表,有该文件
continued_optimizer.load_state_dict(optimizer_state_dict)
#同样,运行一个训练循环应该立即恢复我们之前的训练。为了检查这一点,我们只需使用与前面相同的列表来跟踪损失值
for i in range(4,9):
  test_counter.append(i*len(train_loader.dataset))
  train(i)
  test()
#我们再次看到测试集的准确性从一个epoch到另一个epoch有了(运行更慢的,慢的多了)提高。
#输出自己找的测试图片,比较模型的输出。
examples = enumerate(test_loader)
batch_idx, (example_data, example_targets) = next(examples)
with torch.no_grad():
  output = network(example_data)
fig1 = plt.figure()
for i in range(6):
  plt.subplot(2,3,i+1)
  plt.tight_layout()
  plt.imshow(example_data[i][0], cmap='gray', interpolation='none')
  plt.title("Prediction: {}".format(
    output.data.max(1, keepdim=True)[1][i].item()))
  plt.xticks([])
  plt.yticks([])
plt.show()

Train Epoch: 7 [0/60000 (0%)]	Loss: 0.191760
Train Epoch: 7 [640/60000 (1%)]	Loss: 0.294734
Train Epoch: 7 [1280/60000 (2%)]	Loss: 0.191830
Train Epoch: 7 [1920/60000 (3%)]	Loss: 0.261667
Train Epoch: 7 [2560/60000 (4%)]	Loss: 0.246295
Train Epoch: 7 [3200/60000 (5%)]	Loss: 0.051389
Train Epoch: 7 [3840/60000 (6%)]	Loss: 0.091725
Train Epoch: 7 [4480/60000 (7%)]	Loss: 0.091675
Train Epoch: 7 [5120/60000 (9%)]	Loss: 0.228122
Train Epoch: 7 [5760/60000 (10%)]	Loss: 0.169875
Train Epoch: 7 [6400/60000 (11%)]	Loss: 0.240313
Train Epoch: 7 [7040/60000 (12%)]	Loss: 0.418751
Train Epoch: 7 [7680/60000 (13%)]	Loss: 0.282468
Train Epoch: 7 [8320/60000 (14%)]	Loss: 0.441290
Train Epoch: 7 [8960/60000 (15%)]	Loss: 0.399027
Train Epoch: 7 [9600/60000 (16%)]	Loss: 0.102721
Train Epoch: 7 [10240/60000 (17%)]	Loss: 0.203118
Train Epoch: 7 [10880/60000 (18%)]	Loss: 0.190707
Train Epoch: 7 [11520/60000 (19%)]	Loss: 0.208380
Train Epoch: 7 [12160/60000 (20%)]	Loss: 0.179324
Train Epoch: 7 [12800/60000 (21%)]	Loss: 0.087058
Train Epoch: 7 [13440/60000 (22%)]	Loss: 0.162107
Train Epoch: 7 [14080/60000 (23%)]	Loss: 0.320501
Train Epoch: 7 [14720/60000 (25%)]	Loss: 0.500689
Train Epoch: 7 [15360/60000 (26%)]	Loss: 0.124898
Train Epoch: 7 [16000/60000 (27%)]	Loss: 0.231137
Train Epoch: 7 [16640/60000 (28%)]	Loss: 0.214216
Train Epoch: 7 [17280/60000 (29%)]	Loss: 0.126741
Train Epoch: 7 [17920/60000 (30%)]	Loss: 0.128205
Train Epoch: 7 [18560/60000 (31%)]	Loss: 0.121784
Train Epoch: 7 [19200/60000 (32%)]	Loss: 0.289838
Train Epoch: 7 [19840/60000 (33%)]	Loss: 0.118274
Train Epoch: 7 [20480/60000 (34%)]	Loss: 0.150952
Train Epoch: 7 [21120/60000 (35%)]	Loss: 0.249013
Train Epoch: 7 [21760/60000 (36%)]	Loss: 0.266434
Train Epoch: 7 [22400/60000 (37%)]	Loss: 0.095815
Train Epoch: 7 [23040/60000 (38%)]	Loss: 0.085854
Train Epoch: 7 [23680/60000 (39%)]	Loss: 0.194914
Train Epoch: 7 [24320/60000 (41%)]	Loss: 0.049003
Train Epoch: 7 [24960/60000 (42%)]	Loss: 0.300659
Train Epoch: 7 [25600/60000 (43%)]	Loss: 0.187070
Train Epoch: 7 [26240/60000 (44%)]	Loss: 0.139188
Train Epoch: 7 [26880/60000 (45%)]	Loss: 0.185233
Train Epoch: 7 [27520/60000 (46%)]	Loss: 0.323797
Train Epoch: 7 [28160/60000 (47%)]	Loss: 0.149383
Train Epoch: 7 [28800/60000 (48%)]	Loss: 0.096734
Train Epoch: 7 [29440/60000 (49%)]	Loss: 0.097670
Train Epoch: 7 [30080/60000 (50%)]	Loss: 0.158953
Train Epoch: 7 [30720/60000 (51%)]	Loss: 0.046962
Train Epoch: 7 [31360/60000 (52%)]	Loss: 0.373680
Train Epoch: 7 [32000/60000 (53%)]	Loss: 0.146498
Train Epoch: 7 [32640/60000 (54%)]	Loss: 0.106851
Train Epoch: 7 [33280/60000 (55%)]	Loss: 0.238737
Train Epoch: 7 [33920/60000 (57%)]	Loss: 0.188395
Train Epoch: 7 [34560/60000 (58%)]	Loss: 0.240837
Train Epoch: 7 [35200/60000 (59%)]	Loss: 0.196060
Train Epoch: 7 [35840/60000 (60%)]	Loss: 0.347056
Train Epoch: 7 [36480/60000 (61%)]	Loss: 0.148283
Train Epoch: 7 [37120/60000 (62%)]	Loss: 0.276109
Train Epoch: 7 [37760/60000 (63%)]	Loss: 0.169987
Train Epoch: 7 [38400/60000 (64%)]	Loss: 0.228216
Train Epoch: 7 [39040/60000 (65%)]	Loss: 0.136284
Train Epoch: 7 [39680/60000 (66%)]	Loss: 0.092286
Train Epoch: 7 [40320/60000 (67%)]	Loss: 0.232601
Train Epoch: 7 [40960/60000 (68%)]	Loss: 0.135339
Train Epoch: 7 [41600/60000 (69%)]	Loss: 0.230948
Train Epoch: 7 [42240/60000 (70%)]	Loss: 0.211363
Train Epoch: 7 [42880/60000 (71%)]	Loss: 0.173132
Train Epoch: 7 [43520/60000 (72%)]	Loss: 0.183898
Train Epoch: 7 [44160/60000 (74%)]	Loss: 0.144288
Train Epoch: 7 [44800/60000 (75%)]	Loss: 0.406860
Train Epoch: 7 [45440/60000 (76%)]	Loss: 0.092411
Train Epoch: 7 [46080/60000 (77%)]	Loss: 0.109340
Train Epoch: 7 [46720/60000 (78%)]	Loss: 0.070295
Train Epoch: 7 [47360/60000 (79%)]	Loss: 0.094628
Train Epoch: 7 [48000/60000 (80%)]	Loss: 0.140481
Train Epoch: 7 [48640/60000 (81%)]	Loss: 0.136578
Train Epoch: 7 [49280/60000 (82%)]	Loss: 0.171077
Train Epoch: 7 [49920/60000 (83%)]	Loss: 0.295319
Train Epoch: 7 [50560/60000 (84%)]	Loss: 0.221483
Train Epoch: 7 [51200/60000 (85%)]	Loss: 0.059724
Train Epoch: 7 [51840/60000 (86%)]	Loss: 0.266256
Train Epoch: 7 [52480/60000 (87%)]	Loss: 0.134226
Train Epoch: 7 [53120/60000 (88%)]	Loss: 0.132452
Train Epoch: 7 [53760/60000 (90%)]	Loss: 0.311764
Train Epoch: 7 [54400/60000 (91%)]	Loss: 0.372495
Train Epoch: 7 [55040/60000 (92%)]	Loss: 0.134466
Train Epoch: 7 [55680/60000 (93%)]	Loss: 0.149578
Train Epoch: 7 [56320/60000 (94%)]	Loss: 0.108037
Train Epoch: 7 [56960/60000 (95%)]	Loss: 0.168167
Train Epoch: 7 [57600/60000 (96%)]	Loss: 0.173183
Train Epoch: 7 [58240/60000 (97%)]	Loss: 0.205244
Train Epoch: 7 [58880/60000 (98%)]	Loss: 0.052629
Train Epoch: 7 [59520/60000 (99%)]	Loss: 0.322954

Test set: Avg. loss: 0.0584, Accuracy: 9802/10000 (98%)

Train Epoch: 8 [0/60000 (0%)]	Loss: 0.214194
Train Epoch: 8 [640/60000 (1%)]	Loss: 0.070404
Train Epoch: 8 [1280/60000 (2%)]	Loss: 0.144778
Train Epoch: 8 [1920/60000 (3%)]	Loss: 0.412123
Train Epoch: 8 [2560/60000 (4%)]	Loss: 0.138327
Train Epoch: 8 [3200/60000 (5%)]	Loss: 0.293890
Train Epoch: 8 [3840/60000 (6%)]	Loss: 0.392861
Train Epoch: 8 [4480/60000 (7%)]	Loss: 0.192090
Train Epoch: 8 [5120/60000 (9%)]	Loss: 0.102100
Train Epoch: 8 [5760/60000 (10%)]	Loss: 0.277190
Train Epoch: 8 [6400/60000 (11%)]	Loss: 0.191931
Train Epoch: 8 [7040/60000 (12%)]	Loss: 0.122084
Train Epoch: 8 [7680/60000 (13%)]	Loss: 0.096803
Train Epoch: 8 [8320/60000 (14%)]	Loss: 0.366435
Train Epoch: 8 [8960/60000 (15%)]	Loss: 0.149501
Train Epoch: 8 [9600/60000 (16%)]	Loss: 0.232998
Train Epoch: 8 [10240/60000 (17%)]	Loss: 0.208888
Train Epoch: 8 [10880/60000 (18%)]	Loss: 0.168597
Train Epoch: 8 [11520/60000 (19%)]	Loss: 0.176016
Train Epoch: 8 [12160/60000 (20%)]	Loss: 0.155226
Train Epoch: 8 [12800/60000 (21%)]	Loss: 0.409431
Train Epoch: 8 [13440/60000 (22%)]	Loss: 0.089147
Train Epoch: 8 [14080/60000 (23%)]	Loss: 0.249908
Train Epoch: 8 [14720/60000 (25%)]	Loss: 0.161371
Train Epoch: 8 [15360/60000 (26%)]	Loss: 0.228113
Train Epoch: 8 [16000/60000 (27%)]	Loss: 0.231048
Train Epoch: 8 [16640/60000 (28%)]	Loss: 0.213986
Train Epoch: 8 [17280/60000 (29%)]	Loss: 0.189937
Train Epoch: 8 [17920/60000 (30%)]	Loss: 0.174937
Train Epoch: 8 [18560/60000 (31%)]	Loss: 0.127933
Train Epoch: 8 [19200/60000 (32%)]	Loss: 0.217749
Train Epoch: 8 [19840/60000 (33%)]	Loss: 0.191027
Train Epoch: 8 [20480/60000 (34%)]	Loss: 0.173072
Train Epoch: 8 [21120/60000 (35%)]	Loss: 0.040515
Train Epoch: 8 [21760/60000 (36%)]	Loss: 0.159429
Train Epoch: 8 [22400/60000 (37%)]	Loss: 0.233260
Train Epoch: 8 [23040/60000 (38%)]	Loss: 0.140640
Train Epoch: 8 [23680/60000 (39%)]	Loss: 0.102482
Train Epoch: 8 [24320/60000 (41%)]	Loss: 0.080744
Train Epoch: 8 [24960/60000 (42%)]	Loss: 0.222218
Train Epoch: 8 [25600/60000 (43%)]	Loss: 0.250175
Train Epoch: 8 [26240/60000 (44%)]	Loss: 0.072187
Train Epoch: 8 [26880/60000 (45%)]	Loss: 0.150023
Train Epoch: 8 [27520/60000 (46%)]	Loss: 0.123127
Train Epoch: 8 [28160/60000 (47%)]	Loss: 0.392271
Train Epoch: 8 [28800/60000 (48%)]	Loss: 0.122911
Train Epoch: 8 [29440/60000 (49%)]	Loss: 0.469280
Train Epoch: 8 [30080/60000 (50%)]	Loss: 0.114386
Train Epoch: 8 [30720/60000 (51%)]	Loss: 0.174965
Train Epoch: 8 [31360/60000 (52%)]	Loss: 0.381519
Train Epoch: 8 [32000/60000 (53%)]	Loss: 0.250100
Train Epoch: 8 [32640/60000 (54%)]	Loss: 0.176469
Train Epoch: 8 [33280/60000 (55%)]	Loss: 0.324991
Train Epoch: 8 [33920/60000 (57%)]	Loss: 0.082569
Train Epoch: 8 [34560/60000 (58%)]	Loss: 0.164245
Train Epoch: 8 [35200/60000 (59%)]	Loss: 0.428066
Train Epoch: 8 [35840/60000 (60%)]	Loss: 0.334527
Train Epoch: 8 [36480/60000 (61%)]	Loss: 0.262608
Train Epoch: 8 [37120/60000 (62%)]	Loss: 0.150066
Train Epoch: 8 [37760/60000 (63%)]	Loss: 0.259629
Train Epoch: 8 [38400/60000 (64%)]	Loss: 0.157193
Train Epoch: 8 [39040/60000 (65%)]	Loss: 0.242240
Train Epoch: 8 [39680/60000 (66%)]	Loss: 0.073488
Train Epoch: 8 [40320/60000 (67%)]	Loss: 0.131683
Train Epoch: 8 [40960/60000 (68%)]	Loss: 0.120093
Train Epoch: 8 [41600/60000 (69%)]	Loss: 0.097674
Train Epoch: 8 [42240/60000 (70%)]	Loss: 0.130673
Train Epoch: 8 [42880/60000 (71%)]	Loss: 0.155427
Train Epoch: 8 [43520/60000 (72%)]	Loss: 0.054707
Train Epoch: 8 [44160/60000 (74%)]	Loss: 0.203655
Train Epoch: 8 [44800/60000 (75%)]	Loss: 0.171834
Train Epoch: 8 [45440/60000 (76%)]	Loss: 0.111149
Train Epoch: 8 [46080/60000 (77%)]	Loss: 0.124253
Train Epoch: 8 [46720/60000 (78%)]	Loss: 0.106826
Train Epoch: 8 [47360/60000 (79%)]	Loss: 0.303563
Train Epoch: 8 [48000/60000 (80%)]	Loss: 0.079353
Train Epoch: 8 [48640/60000 (81%)]	Loss: 0.234413
Train Epoch: 8 [49280/60000 (82%)]	Loss: 0.138271
Train Epoch: 8 [49920/60000 (83%)]	Loss: 0.185210
Train Epoch: 8 [50560/60000 (84%)]	Loss: 0.226690
Train Epoch: 8 [51200/60000 (85%)]	Loss: 0.176387
Train Epoch: 8 [51840/60000 (86%)]	Loss: 0.286560
Train Epoch: 8 [52480/60000 (87%)]	Loss: 0.113581
Train Epoch: 8 [53120/60000 (88%)]	Loss: 0.075368
Train Epoch: 8 [53760/60000 (90%)]	Loss: 0.062795
Train Epoch: 8 [54400/60000 (91%)]	Loss: 0.242043
Train Epoch: 8 [55040/60000 (92%)]	Loss: 0.299757
Train Epoch: 8 [55680/60000 (93%)]	Loss: 0.148714
Train Epoch: 8 [56320/60000 (94%)]	Loss: 0.133206
Train Epoch: 8 [56960/60000 (95%)]	Loss: 0.203755
Train Epoch: 8 [57600/60000 (96%)]	Loss: 0.287291
Train Epoch: 8 [58240/60000 (97%)]	Loss: 0.321706
Train Epoch: 8 [58880/60000 (98%)]	Loss: 0.067667
Train Epoch: 8 [59520/60000 (99%)]	Loss: 0.121717

Test set: Avg. loss: 0.0560, Accuracy: 9825/10000 (98%)

## **1. 环境配置**

- pandas1.3.4
- tensorflow2.10.0
- tensorflow-gpu2.10.1
- python3.9

## **2. 运行配置**

- CPU/GPU均可
- 最小内存要求
    - 特征/样本生成:3G
    - 模型训练及评估:6G

- 耗时
    - 测试环境:内存8G,CPU 2.3 GHz 双核Intel Core i5
    - 特征/样本生成:226 s
    - 模型训练及评估:740 s 
    
## **3. 目录结构**

- comm.py: 数据集生成
- baseline.py: 模型训练,评估,提交
- evaluation.py: uauc 评估
- data/: 数据,特征,模型
    - wechat_algo_data1/: 初赛数据集
    - feature/: 特征
    - offline_train/:离线训练数据集
    - online_train/:在线训练数据集
    - evaluate/:评估数据集
    - submit/:在线预估结果提交
    - model/: 模型文件

## **4. 运行流程**
- 新建data目录,下载比赛数据集,放在data目录下并解压,得到wechat_algo_data1目录
- 生成特征/样本:python comm.py (自动新建data目录下用于存储特征、样本和模型的各个目录)
- 训练离线模型:python baseline.py offline_train 
- 评估离线模型:python baseline.py evaluate  (生成data/evaluate/submit_${timestamp}.csv)
- 训练在线模型:python baseline.py online_train 
- 生成提交文件:python baseline.py submit  (生成data/submit/submit_${timestamp}.csv)
- 评估代码: evaluation.py

## **5. 模型及特征**
- 模型:[Wide & Deep](https://dl.acm.org/doi/pdf/10.1145/2988450.2988454)
- 参数:
    - batch_size: 128
    - emded_dim: 10
    - num_epochs: 1
    - learning_rate: 0.1
- 特征:
    - dnn 特征: userid, feedid, authorid, bgm_singer_id, bgm_song_id
    - linear 特征:videoplayseconds, device,用户/feed 历史行为次数
  
## **6. 模型结果**

|stage  |weight_uauc |read_comment|like|click_avatar|forward| 
|:---- |:----  |:----  |:----  |:----  |:----|
| 离线  | 0.657003 |0.626822 |0.633864  |0.735366 |0.690416 | 
| 在线  | 0.607908| 0.577496 |0.588645  |0.682383  |0.638398 | 
   
## **7. 相关文献**
* Cheng, Heng-Tze, et al. "Wide & deep learning for recommender systems." Proceedings of the 1st workshop on deep learning for recommender systems. 2016.

# 训练集
"user_action.csv"

useridfeediddate_deviceread_commentcommentlikeplaystayclick_avatarforwardfollowfavorite
8714741100150053660000
8739161100025015330000
8502821100075013020000
81139111001375051910000
827349110002508000000
83028711000014960000
81151110002509760000
869745110002508170000
82245111000020140000

"feed_info.csv"

feedidauthoridvideoplaysecondsdescriptionocrasrbgm_song_idbgm_singer_idmanual_keyword_listmachine_keyword_listmanual_tag_listmachine_tag_listdescription_charocr_charasr_char
43549616538104741 122649 8109 117252 65632 23463 118668 45861 8109 142955 27736 21751 112151 116906 32715 93520 32714 80461 8109 93563 102383 10952 48706 12885 68441 93563 8097 134820 55911 80449 79213 23233 13997 53706 104690 6994139499 59421 82007 142955 27736 83577 52394 112151 116906 93520 80461 82007 90327 28303 133091 82007 118636 99614 22694 93541 94993 82007 133091 45443 118749 82007 17273 26295 755 82007 79531 45861 60826 12240 99614 16648 117253 82007 6488 128286 16917 82007 66297 6994 146483 6994 6994 6994 92811 6478 123078 107293 82007 116352 47035 90992 111660 93115 7660 6994 91023 6994 91023 6994 117843 118696 125334 49255 140964 78844 101640 117843 82007 133091 92811 12240 123078 25794 82007 67209 140959 105807 27736 79314 68476 82007 46310 73451 133091 99617 8041 60574 150245 142561 90217 118696 125334 27736 82007 90992 99617 82007 46310 6994 73451 133091 99617 8041 23648 150245 106770 78844 133091 82007 502 94016 102311 37630 55139 93566 82007 55633 140609 27736 21763 16194 82007 3030 140609 27736 21763 16194 82007 110142 94993 136991 56344 11145 27736 83577 82007 12240 54654 25794 82007 121844 127586 12666 82007 287 826 122748 27736 140959 109451 82007 44399 136492 978 88576 5976 25794142955 27736 83577 103956 32010 34170 89740 90327 8109 28303 31798 12214 19942 99614 22694 93541 94993 133091 84618 118749 8109 1556 26295 79531 4724 12240 99614 16648 117253 79314 93536 120507 6486 128286 122159 46310 131225 6478 123078 140147 118668 14312 8109 118668 25794 99656 125334 49255 140964 101640 134892 40003 46310 131225 12240 123078 25794 44506 116906 25794 9844 27736 112516 51703 142561 146692 125334 27736 75770 4438 133091 122013 94016 29200 55139 93566 8109 115201 93566 29288 3030 100909 27736 21763 16194 8109 86461 119958 8109 101594 10143 27736 83577 12240 31336 85335 8109 12852 26661 4438 49448 12205 27736 140959 93524 8109 134871 37630 55139 93566 27736 15998 65632 99614 77895 26564 129634 33988 27736 53964 116914 78817 4438193561170315506;7715;1758226334;219;25209;7715;1854181;269;159;6269 0.8525666;81 0.8525666;8 1.1e-07;306 0.0;207 6.31e-06;10 0.2740430226439 5247 6426 3827 1882 26018 20744 22204 30024 24307 10436 1882 2203 26439 6243 21632 3713 15640 25926 7357 20823 7356 17870 1882 20857 3653 11877 24307 21492 17178 11341 10043 20833 15357 20857 1872 12681 5043 17859 17859 17531 15208 2597 3161 22626 27055 22439 2007725926 8491 13394 2203 26439 6243 33054 16435 11945 15640 25926 20823 17870 2874 25926 32378 20860 20268 27664 2820 13505 22179 5043 20838 21147 20268 27664 10326 31491 27464 20744 15640 29786 17597 199 20857 26054 10436 13681 2841 22179 3701 22474 1531 28575 3805 14875 32495 10517 11575 2200 20831 27464 24513 17596 25782 24307 22539 20268 24815 20752 1801 20292 20292 26208 26456 32499 20268 17549 22539 20292 17442 22626 26208 20268 27664 10517 11575 2841 27464 8481 22806 25926 4196 23482 6243 17596 15382 10517 14121 20268 27664 22181 6155 17602 13616 33234 3882 1531 20049 26456 32499 6243 20268 22181 10517 14121 20268 27664 22181 6155 17602 5344 33234 29751 29959 17442 20268 27664 4153 22241 21456 5033 22741 12837 5061 17870 20860 12604 3546 6243 22546 11662 3588 716 3546 6243 22546 11662 3588 24513 21147 32499 5043 12785 10086 22439 6243 33054 16435 2841 21468 22474 8481 26641 8481 2874 2923 66 219 32386 6243 25926 4196 24364 10043 11575 11877 29786 7305 19565 1341 84812203 26439 6243 33054 16435 16307 17070 24908 25920 7778 19891 2874 25926 1882 32378 20860 2227 3899 2820 13505 28798 22179 5043 20838 21147 20268 27664 18757 31491 27464 1882 17577 5966 29786 17597 20857 26054 10436 8740 2841 22179 3701 22474 17596 20833 29095 12762 1529 28575 27209 10517 11575 28798 2200 20831 27464 17531 66 24307 11388 6946 1882 24307 8481 22214 32499 20268 17549 22539 20292 22626 30042 11789 27664 10517 11575 28798 2841 27464 8481 10080 25926 8481 2206 6243 17596 10134 11796 3882 1531 14070 26456 32499 6243 10130 15911 969 20268 27664 2239 21498 21456 5033 20857 7799 5061 17870 20860 1882 7799 27464 20860 24513 1520 716 22446 6243 22546 11662 3588 1882 24513 20863 30038 15360 1882 22610 26084 26183 6243 33054 16435 2841 2066 21465 22474 5596 1882 15036 31177 21633 8512 969 30035 15921 2813 6243 25926 4196 20827 1882 30026 12837 5061 17870 20860 6243 17855 15360 20744 22179 17181 5967 24307 11574 11581 14292 6243 28598 6259 25930 17420 969
7743293866035753 27736 146603 73055 11794 101761 11794 8109 16933 52236 96307 12240 31279 137786 46760 8109 95030 7626 35990 8105 93533 8106 134820 35990 5349635753 146603 73055 11794 101761 67496 16933 52236 12240 99614 137786 46760 22893 99893 149481 90267 32443 83722 49523 44399 30521 73055 11794 101761 11794 4430 26692 32395 4431 85868 20248 109427 77075 24343146739 14368 79290 79213 47366 8109 33194 11989 31279 139153 7739 27736 97114 111182 8109 32443 83722 49523 44399 83265 73055 11794 11794 12240 12220 9949 45012 73055 25794 66512 8109 13547 79213 117050 121114 12240 23840 25794 79682 27736 5251 4438 32443 83722 110142 85502 8109 23509 12220 149649 133296 25794 118749 8109 134878 79213 47366 8109 73757 9844 106949 102036 113903 100901 8109 11989 104154 116819 147841 45012 22454 25794 118749 4438 71331 135636 27736 9949 147789 107673 125472 8109 92687 37527 50006 9844 102383 55805 49985 4438 86458 59637 12240 45012 32443 83722 23729 25794 79303 113003 8109 32443 83722 35339 72615 4101 27736 59394 57464 94842 112151 62197 93563 106949 25794 8109 90290 36623 93541 66151 8109 81787 12240 37630 45121 96353 27736 56338 96353 93541 66151 136194 22682 8109 121528 142314 85907 25794 45121 96353 27736 56338 8109 56338 146823 27736 107962 129369 56522 18710 8109 121274 139898 56981 27736 112151 8109 96307 96353 27736 105115 75625 12127 11989 80297 116819 44399 118814 121428 57464 141821 117050 8109 49523 104109 8109 4101 23944 92857 57464 25794 4438 75625 106972 59637 137786 18422 25794 4438 135636 27736 85592 114620 61803 8109 134878 79213 47366 110142 128739 42526 129746 57464 8109 96150 22454 19019 93541 93536 44352 8109 9844 77895 9844 121428 22414 25794 110142 149596 100491 8109 93324 56522 27013 118480 22682 145811 78276 32440 85907 25794 118668 86339 90810 2517 2517 47082 107673 45061 72615 108277 8109 40621 106949 144795 8109 134449 25794 8109 62436 117780 8109 121577 137786 24885 85056 146492 61803 4438 96439 116906 75625 12240 107973 25794 94171 8109 107673 77905 9973 109024 25794 110142 135558 63991 56280 8109 6510 6510 93541 56280 12240 63991 25794 118668 107962 4438 73757 9844 135558 107962 145339 38594 27736 31877 8109 106949 30426 72615 105251 75625 12127 56344 45012 85051 25794 38594 8109 26597 93541 66151 137550 107973 71826 35380 135636 27736 85592 107673 132715 58694 27736 53807 146823 8109 121274 79213 117050 121114 8109 29802 26564 44398 7409 77601 75625 12127 122650 25794 8109 38594 96099 44137 8109 150623 147516 62339 23632 7307 57464 56344 47366 25794 110142 93536 46071 27736 45829 8109 34517 15099 25794 6851 40277 142518 27736 8109 77895 62690 27736 56976 37527 93563 45412 100901 52611 10818 4438 64771 4438 147565 142518 9844 125334 93563 148254 52611 25794 128896 100901 79290 66989 44388199;18322;424324078;19924194;267;159;6267 0.99293476;194 0.9929347631010 32495 6243 13923 15360 30483 2709 26084 2709 1882 6976 1119 32191 66 21460 2841 30031 2220 11445 20077 30055 21492 1882 10436 1216 9019 20016 106 12187 1878 20831 1879 12681 106 12187 28535 1138331010 32495 13923 15360 30483 2709 26084 15160 6976 1119 32191 66 2841 22179 11445 20077 30055 21492 5104 5329 28798 33075 20070 7261 24307 5318 24513 10083 10043 32066 17278 30483 2709 26084 2709 961 1018 28161 7235 962 8890 27393 9781 14241 55137259 20851 5061 26207 17573 17531 15117 20072 1882 12584 15360 9119 29499 30031 2220 1487 1487 20857 28704 6243 12787 23108 5061 25630 1882 7261 24307 5318 24513 10083 10043 7261 19631 30483 2709 2709 2841 22717 16567 19020 10226 30483 8481 29242 23786 1882 12762 29499 17531 10045 24513 12371 28798 2841 26084 14040 8481 20857 2818 6243 32280 969 7261 24307 5318 24513 3907 32376 1882 6267 23705 22717 27061 17571 33330 8481 31491 27464 1882 30031 17531 15117 20072 1882 16435 2206 23786 2682 32028 15366 32280 22439 1882 9119 29499 7305 13842 25897 23492 27788 10226 4967 8481 31491 27464 969 23804 27464 19800 66 6243 16567 19020 7685 30038 23963 24513 6491 1882 11670 6439 8512 11383 2206 3653 11877 12669 11364 969 24513 20859 15360 2841 10226 7261 24307 5318 20078 8481 17585 25121 1882 7261 24307 5318 3779 19565 16174 30749 9202 6243 15268 10134 28798 31238 11371 15640 14040 20857 23786 8481 1882 20085 11598 11354 20838 11792 5329 1882 28161 12678 2841 12837 17770 9119 21492 6243 25926 12860 21492 20838 11792 5329 14040 13660 5033 1882 9416 16174 12836 18765 19777 20072 8481 17770 9119 21492 6243 25926 12860 1882 25926 12860 32562 6243 5270 25926 10434 18895 12836 14049 11296 1882 25123 20851 31075 12960 6243 15640 1882 21460 21492 6243 20824 8512 8486 25298 15360 9119 29499 21456 26322 25897 10043 11792 27050 28798 21800 10045 1882 24513 10083 13394 8512 1882 30749 9202 5416 20860 31376 28798 8481 969 8486 23801 15360 11445 20077 30055 32376 8481 969 19800 66 6243 2820 10424 19791 29781 4947 13394 1882 30031 17531 15117 20072 24513 28680 24307 7261 19631 2223 28798 1882 20857 1541 4967 20857 2239 20838 20833 32505 8512 1882 2206 17181 2206 27050 4944 8481 24513 16133 22351 1882 9416 20087 12836 5414 6269 6426 23786 5033 32333 17264 7259 19777 20072 8481 24307 13016 22351 30043 95 25926 28798 25926 28798 24513 7305 23963 3826 2585 16174 23786 10045 1882 5009 11383 23786 32066 1882 29922 8481 1882 16323 25616 70 1882 32505 28798 11445 20077 7771 21474 18881 32500 4947 13394 969 16435 18605 25926 8486 2841 27464 20072 8481 8486 32562 26446 1882 23963 17190 14223 19804 13858 14040 8481 24513 4956 30328 26543 12771 1882 1541 1541 20838 12771 2841 30328 26543 8481 24307 5270 25926 969 16435 2206 4956 5270 25926 26493 6426 27464 6243 22439 3879 1882 23786 10038 18876 16174 23343 8486 25298 15360 12785 10226 18876 8481 6426 27464 1882 5990 20838 11792 5329 7799 32336 27464 20072 17076 32562 28703 25306 25299 19800 66 6243 2820 10424 23963 9201 10222 12955 24307 6243 25274 15360 32562 1882 25123 20851 17531 10045 24513 12371 28798 1882 19800 2220 5967 10042 22629 32221 16174 8486 25298 15360 27343 8481 1882 6426 27464 32028 32376 26444 12650 1882 5990 15640 20825 8481 22179 26467 30055 12653 67 2233 28798 12785 15117 20072 8481 24513 20833 10230 5311 6243 17173 20090 1882 2206 2223 16265 14743 8481 1202 27976 9187 31605 6243 1882 17181 16650 6243 24442 8512 20857 11680 17278 22439 1211 14040 27061 4948 969 30031 15640 969 1804 18895 31605 2206 32499 20857 21494 24937 1211 14040 8481 28161 25920 22439 17573 4913 29499 969
12921299615114413 107973 117252 27736 41035 32715 125374 121275 32714 134820 130882 134820 121275102881135411696;6926202;23;160;623 0.64771646;202 0.6477164627077 10050 27464 20072 26018 6243 21492 14750 7357 31937 5036 11720 12949 7356 12681 19901 11379 12681 11720 12949
34538125771860473 9864 8109 135558 108803 89307 37630 6481 40061 93566 13997 49752 88822 3599039654 95201 22675 49521 135558 108803 89307 37630 6481 40061 935667446 47082 37630 99660 9844 113002 9844 113002 9844 27 39909 78557 30819 95343 17745 54593 879 84541 33616 56196 93542 95463 8109 26564 99569 443824435136455454;1197;3727328;13;159;6267 0.21298289;191 0.21298289;8 0.34298885;306 0.0;207 0.03609685;10 0.2691571721489 24672 2223 1882 4956 1545 10424 19791 12837 1525 9116 20860 3161 11278 19642 106 121879015 21209 5026 11192 4956 1545 10424 19791 12837 1525 9116 2086020825 13395 14765 24513 7305 12837 10337 8885 2206 21457 2206 21457 2206 19634 5318 19634 19634 5318 5318 26439 21465 31376 30749 27298 12358 255 2841 15640 17766 20833 12743 20839 21292 1882 5967 20857 17446 12836 969
75414718516105860 4691 134820 55911 80449 79213 23233 13997 53706 10469073055 66447 88970104002 4438 104002 4438234281073017083;41238058;219;21639;15621;2520981;269;159;6269 0.86163938;81 0.8616393825782 19583 7768 3907 14068 12681 5043 17859 17859 17531 15208 2597 3161 22626 27055 22439 2007730483 23721 1518 22741 2511823112 969 23112 969
34539259543123046 116187 8092 22091 77895 11717 93522 100860 146036 56882 134820 125215 14332 1482 699461498 70163 1428 123046 116187 44971 3063 32144 54606 113005 36385 86055 32144 39671 127146 113003 116906 28486 123046 116187 96158 6994 150134 59080 11109 139400 35425 49880 109355 125215 14332 1482 13818 40059 14332 50550 28467 145820 79577 101112 61794 57498 14332 51973 13818 40059 28467 145820 79577147566 45843 10952 25290 79030 11717 34198 99319 25146 4438 107652 11717 3871 8097 111993 27736 4438 58982 27736 8109 104424 84541 123078 22662 8109 29504 85051 25146 4438 34517 3871 8097 29640 45312 93522 100860 146036 69024 33602 2078 23233 4438 143358 90843 5070 36854 105393 104154 146036 27736 28379 8109 90270 51131 70959 118278 8109 92220 106952 4438 107673 5070 101258 110142 93536 73300 27736 69024 87887 8109 62339 104154 146036 8109 58123 53102 33602 131560 27736 69024 33847 4438 22091 34198 77895 107652 11717 82242 4438 121388 44381654117296529;2666;26954529;929845;239;159;169239 0.68520623;45 0.68520623;8 0.0;306 0.0;207 0.07740904;10 0.2794248527442 1543 31605 1868 20857 22179 17181 7261 8512 20825 10434 12836 4947 10090 15208 8311 12681 32987 1528 6439 20838 1237113842 6439 32379 25274 8847 1251 27442 1543 31605 10198 14040 18883 17188 7264 12371 25123 6426 22827 20831 24364 17188 7264 7585 31372 32556 25121 25926 6439 27442 1543 31605 22959 1528 6439 20838 12371 17585 20861 9114 1528 6439 8512 24307 6428 32336 12371 17187 20831 6439 13922 7305 19912 1528 6439 16269 17591 17585 20861 9114 6428 32336 12371 1718715371 6498 31376 30024 24307 21492 10083 20072 22639 26464 7261 8512 7799 31491 23654 13505 15360 969 10043 8493 7261 8512 11582 1872 21477 26656 6243 969 1514 7261 6243 1882 26080 30743 2841 15640 27464 30420 1882 15360 32505 18876 13505 15360 969 2206 2223 11582 1872 10222 18895 66 8512 20825 10434 12836 4947 10090 11704 13811 7613 27949 11677 15208 2597 969 2203 25121 26018 2223 1119 19785 24307 27314 10086 7305 13842 4947 10090 6243 31936 14292 1882 20072 24940 25121 25121 31239 26322 1882 5111 20833 23787 969 23963 1119 6332 21474 24513 20833 21474 23472 6243 11704 13811 95 12750 1882 22179 26467 7305 13842 4947 10090 1882 11715 32183 31376 11352 7613 12837 7802 6243 11704 13811 7613 26054 969 20857 22179 7799 17181 10043 8493 7261 8512 20825 3146 969 22392 24930 969
101310811515106872 36623 26564 8109 110838 148383 26564 33895 110690 133005 134820 118599 134820 768269837 110838 148383 26564 33895 110690 133005117252 44854 26564 110142 93536 91254 8109 9837 28467 96353 12240 31279 148383 26564 33895 110690 110838 93566 287 50242 16892 110838 93566 287 114678 8109 24802 40682 134913 111660 8109 33895 102364 148254 128784 33895 19695 55795 8109 110838 93566 132192 53514 88911 16650 3871 8097 141540 40075 8109 9844 50006 26564 33895 110690 8097193671053218351;13433;25057;11002;14687;10664;19485;128659;6237 0.90296459;246 0.9029645920831 11704 11598 11354 5967 1882 1217 23805 10083 14040 5967 12194 13395 14765 6153 1514 12681 4948 2875 12681 27061 15142200 1217 23805 10083 14040 5967 12194 13395 14765 6153 151426018 10155 5967 24513 20833 5311 32214 1882 2200 6428 21492 2841 30031 2220 10083 14040 5967 12194 13395 14765 1217 23805 20860 66 11443 3792 1217 23805 20860 66 24815 23472 1882 22351 16075 15506 17070 30055 24815 1882 12194 22779 21494 24937 28704 12194 3588 24815 12661 1882 1217 23805 20860 19074 3882 19944 32392 11463 3703 11582 1872 25118 9419 9126 1882 2206 11383 5967 12194 13395 14765 1872
137691169625144240 54435 37630 117252 27736 40670 134820 144240 65651 6510 134820 13088263733 106137 31785 73044 25794 65020 90843 107673 287 25341 25794 27438 12237 76067 51259 46674 26892 115928 1181 143307 82007 287 62219 93536 16407 80067 82007 287 62219 93536 74110 82007 3881 144240 99614 54748 82007 114039 90223 6476 26564 74075 93541 72364 82007 144240 127395 37630 26564 117252 40670123075 11342 79301 23908 110100 112151 110677 107673 112164 32444 140205 106949 4438 78106 4438 135187 24334 84163 46310 37527 11819 8109 5070 79301 33988 79682 110202 44388800;11061;157529;617 0.74826479;37 0.7482647912783 22473 5277 10040 12837 26018 6243 8311 32562 12681 12783 22473 21456 1541 12681 19901 1137920857 15640 7261 23039 30060 32499 5009 8481 7259 15640 26018 2223 23963 66 10156 8484 8481 26084 8488 2839 11342 17196 1553 15631 9419 22744 13842 11357 5009 31075 26464 8134 20753 66 14057 20833 3653 17791 66 14057 20833 3653 9201 26464 2818 12783 22473 22179 19074 15366 2203 22430 20085 12712 1520 5967 12787 12787 20838 7578 19804 12783 22473 18895 12837 5967 26018 8311 3256230191 2606 17583 5391 20861 15640 12836 4895 15360 23963 24930 7262 31134 23786 969 3797 1119 969 5505 1525 5505 18657 10517 8512 2726 1882 1119 17583 11581 14292 20857 2818 32394 22179 969
6311953360102036 39661 66989 110142 118715 93522 24335 8109 16650 33804 25794 90843 27736 5251 8097 79072 27736 108358 22882 26564 89316 102383 27736 10751 78511 4438 43127 80607 45012 26564 110142 93536 111883 27736 141185 43768 49085 134820 127696 27736 106096 80607 134820 116237 125923 134820 43127 80607 134820 72032 107016 134820122344 140404 55785 26564 93563 102383 27736 79072 102291 90843 31279 117347 116237 287 49575 56976 82007 123078 97133 44399 9870 97713 82007 110690 56196 116237 27736 49575 56976 122222 82007 117252 45012 26564 109888 110142 93536 59964 27736 30904 82007 287 110142 93536 37630 55019 45443 121478 27736 12885 50664 37527 82007 9844 147516 59287 110142 93536 83089 9844 12240 106949 82007 45443 25794 86458 82007 883 145095 27736 148386 18836 25794 57356 82007 12885 83089 27736 37527 82007 9844 51772 82007 9844 59287 25794 12885 121478 82007 146739 83099 25794 110142 121697 5251 82007 64771 90270 25794 85868 75625 113003 82007 34517 27736 121478 12240 77084 25794 82007 80029 44517 24143 6994 53640 112151 9844 82007 59720 116466 148375 9844 45012 25794 32066 118787 82007 29641 9844 59287 25794 12885 83089 4430 26597 111993 93566 82007 22694 118787 78844 82007 90843 128579 83089 35227 112151 140147 136294 109716 27736 82007 72106 27736 28512 112151 110690 122222 82007 12885 78783 37630 79030 113003 118278 82007 51819 287 110142 93536 43556 82007 37630 25907 27736 31877 22681 9844 17296 82007 32440 33616 31877 109527 9844 27736 5251 93563 118709 82007 9844 37630 33616 31877 84166 93565 25794 44399 113957 116237 82007 44399 113957 141185 27736 116237 58180 27736 31877 82007 90270 25794 9844 18225 59287 33616 83089 27736 3187790843 31279 117347 116237 8109 287 49575 56976 123078 97133 44399 9870 97713 8109 80029 110690 56196 116237 27736 49575 56976 3871 8097 117252 45012 26564 109888 93536 59964 27736 30904 106519 8109 287 110142 93536 37630 55019 45443 121478 27736 12885 50664 37527 8109 34517 147516 59287 110142 93536 83089 8109 9844 12240 106949 50835 45443 25794 86458 883 145095 27736 148386 12240 18836 25794 4438 57356 12885 83089 27736 37527 8109 80029 9844 3871 12240 51772 8109 96307 59287 25794 12885 121478 8109 146739 83099 25794 110142 121697 5251 4438 64771 90270 25794 85868 75625 113003 8109 34517 27736 121478 12240 77084 25794 4438 80029 44517 24143 3871 8109 7275 112151 9844 50835 8109 148375 9844 45012 25794 32066 50835 8109 142720 50006 118787 8109 29641 9844 59287 25794 12885 83089 8109 26597 111993 93566 22694 118787 78844 90843 128579 83089 35227 112151 140147 136294 109716 27736 4438 80029 72106 27736 140623 112151 110690 3871 8097 12885 78783 37630 79030 113003 118278 3871 8109 51819 287 110142 93536 43556 37630 25907 27736 31877 22681 9844 17296 8109 32440 33616 31877 8109 109527 9844 27736 5251 93563 118709 8109 150623 3871 8109 9844 135558 5251 20484 25794 33616 43556 8109 9844 37630 33616 31877 84166 93565 25794 44399 56543 116237 8109 44399 113957 141185 27736 116237 58180 27736 31877 4438 90270 25794 9844 18225 59287 33616 83089 27736 31877 8109 44399 113957 116237 19590 77895 9844 59287 25794 12885 83089 8109 44399 112151 28512 78844 56265 27736 109716 8109 32440 9844 18836 27736 116237 65632 54606 58180 4438 73757 12885 116237 58180 27736 31877 8109 54606 80767 110690 37216 3871 8097 84541 9844 27736 121478 54606 77084 8109 29802 90843 61724 4438 36562 90843 54606 142856 90843 27736 141185 27736 121478 8109 150623 133643 90843 106949 45443 78844 141185 92811 27736 84166 93565 84634 116237 27736 49984 8109 80029 12885 37216 49862 65632 54606 86645 90843 105491 4438 101263 125923 59964 93563 72104 443817892;184289;6235 0.36773422;204 0.367734222682 32028 12649 7461 4913 29499 24513 24368 20825 5506 1882 3703 10337 6269 8481 26018 2223 6243 32280 1872 32556 32280 6243 18654 27775 5095 5967 19800 3653 11877 6243 10042 1119 20061 32510 969 32556 20093 18621 11458 10226 5967 24513 20833 1528 28714 6243 12653 27779 32562 1793 1873 12681 33347 16046 6243 2727 20845 18621 11458 12681 18659 15360 11670 20095 12681 32556 20093 18621 11458 12681 32505 18944 22474 19074 1268112676 3901 31177 12653 5967 20857 3653 11877 6243 32556 32280 31177 16332 26018 2223 30031 2220 26054 18659 15360 66 22439 31380 24442 27464 28535 28335 10043 13421 8484 22867 13395 14765 12743 18659 15360 6243 22439 31380 24442 27234 26018 10226 5967 20821 24513 20833 10080 21477 6243 33081 15360 66 24513 20833 12837 27072 13924 28703 10326 26084 17278 6243 10043 20833 25121 22359 8512 2206 20825 8481 22629 20072 24513 20833 9413 12594 2206 2841 23786 10326 8481 24513 20859 258 31239 20859 6243 5909 24937 32028 12668 8481 12676 12652 10043 20833 9413 12594 6243 8512 2206 12762 20857 1519 29499 2206 22629 20072 8481 10043 20833 26084 17278 7259 20851 3770 20072 8481 24513 24307 8886 32280 30031 15640 20072 8481 8890 8486 25121 2206 2223 6243 26084 17278 2841 117 8498 8481 17766 10083 20860 23786 32499 22495 15640 2206 6454 20825 2206 10226 8481 24513 8493 6426 19780 29499 5033 2206 22629 20072 8481 10043 20833 9413 12594 961 5990 21477 26656 20860 5043 6426 19780 17442 26018 2223 32197 20072 9413 12594 21456 31380 15640 17531 66 6454 12762 1522 1021 6243 10080 26195 6243 6454 15640 13395 14765 27234 10043 20833 2820 12372 15360 12837 22639 26464 25121 26322 31069 10222 66 24513 20833 68 12783 12837 19988 20845 6243 22439 3879 5032 2206 3875 32280 7259 17766 20833 22439 3879 16025 29499 2206 6243 32280 20857 26464 2206 12837 17766 20833 22439 3879 18659 20859 8481 10043 15513 18659 15360 10043 15513 12653 27779 6243 18659 15360 22474 22241 6243 22439 3879 20072 8481 2206 15117 1119 22629 20072 17766 20833 9413 12594 6243 22439 387926018 2223 30031 2220 26054 18659 15360 1882 66 22439 31380 24442 27464 28535 28335 10043 13421 8484 22867 1882 17766 13395 14765 12743 18659 15360 6243 22439 31380 24442 11582 1872 26018 10226 5967 20821 20833 10080 21477 6243 33081 15360 23664 1882 66 24513 20833 12837 27072 13924 28703 10326 26084 17278 6243 10043 20833 25121 22359 8512 1882 2206 2223 20825 8481 22629 20072 24513 20833 9413 12594 1882 2206 2841 23786 11602 10326 8481 24513 20859 258 31239 20859 6243 5909 24937 2841 32028 12668 8481 969 12676 12652 10043 20833 9413 12594 6243 8512 1882 17766 2206 11582 2841 12762 20857 1519 29499 1882 21460 22629 20072 8481 10043 20833 26084 17278 1882 7259 20851 3770 20072 8481 24513 24307 8886 32280 969 30031 15640 20072 8481 8890 8486 25121 1882 2206 2223 6243 26084 17278 2841 117 8498 8481 969 17766 10083 20860 23786 11582 1882 32499 30957 15640 2206 11602 1882 6454 20825 2206 10226 8481 24513 8493 11602 1882 5095 28712 11383 6426 19780 1882 29499 5033 2206 22629 20072 8481 10043 20833 9413 12594 1882 5990 21477 26656 20860 5043 6426 19780 17442 26018 2223 32197 20072 9413 12594 21456 31380 15640 17531 66 6454 12762 1522 1021 6243 969 17766 10080 26195 6243 31238 15640 13395 14765 11582 1872 10043 20833 2820 12372 15360 12837 22639 26464 25121 26322 11582 1882 31069 10222 66 24513 20833 68 12783 12837 19988 20845 6243 22439 3879 5032 2206 3875 32280 1882 7259 17766 20833 22439 3879 1882 16025 29499 2206 6243 32280 20857 26464 1882 5990 15640 11582 1882 2206 4956 32280 3875 10226 8481 17766 20833 68 12783 1882 2206 12837 17766 20833 22439 3879 18659 20859 8481 10043 12846 18659 15360 1882 10043 15513 12653 27779 6243 18659 15360 22474 22241 6243 22439 3879 969 20072 8481 2206 15117 1119 22629 20072 17766 20833 9413 12594 6243 22439 3879 1882 10043 15513 18659 15360 19791 10436 17181 2206 22629 20072 8481 10043 20833 9413 12594 1882 10043 15640 6454 17442 12762 6243 1522 1021 1882 7259 2206 32028 12668 6243 18659 15360 20744 12371 22474 22241 969 16435 10043 20833 18659 15360 22474 22241 6243 22439 3879 1882 12371 10436 14040 13395 14765 10235 12762 11582 1872 2841 15640 2206 6243 26084 17278 12371 117 8498 1882 19800 2220 26018 2223 18881 27347 969 20085 8512 26018 2223 12371 18876 27464 26018 2223 6243 12653 27779 6243 26084 17278 1882 5990 15640 5323 12762 26018 2223 23786 10326 17442 12653 27779 10517 11575 6243 18659 20859 18768 18659 15360 6243 33219 1882 17766 10043 20833 10235 12762 73 10218 20744 12371 6426 20072 26018 2223 11536 20860 969 4739 6439 11670 20095 10080 21477 20857 16035 969

"feed_embeddings.csv"

feedidfeed_embedding
46022-0.02032269 0.06095614 0.11057708 0.03385210 0.02812547 -0.08518990 0.05559204 -0.00109777 -0.00646856 -0.04048987 0.05000276 0.03058990 0.02377489 0.01612881 0.00667492 -0.01905753 0.02873693 -0.03817699 0.06316159 -0.02236949 0.04154284 -0.01927267 -0.06563771 0.08156607 -0.04581382 -0.05931602 0.01337736 0.07015406 -0.01570191 -0.01832109 -0.01408037 -0.07717960 -0.00888649 -0.01240795 0.06539840 0.01043406 0.04577293 0.02424406 -0.00729460 -0.02975632 0.00499715 0.06271300 0.03864457 -0.01238171 -0.02206316 0.00645974 -0.08267149 0.02221970 -0.05874122 0.07345319 0.04663751 -0.06113898 -0.02436930 0.00578540 0.03339976 0.00486011 -0.00548112 -0.00866814 0.08987114 -0.00129464 0.04020681 0.01253866 0.02508013 0.04873040 0.02622130 -0.12279262 0.02532234 0.00593781 -0.01332844 -0.08224472 -0.00781064 0.05548930 -0.08653537 0.07381740 -0.03475798 0.00520029 0.00493612 -0.05922305 -0.03604246 0.01421230 0.01137579 -0.00903295 -0.00506969 0.06808347 0.00031668 0.00321970 0.01076905 0.04734666 -0.04765892 -0.02314426 0.01742046 0.01082754 -0.04263207 0.01978174 0.07064336 -0.13424741 -0.00581700 0.04816765 0.01221545 0.04083176 -0.02402344 -0.05684857 0.09058131 -0.01992483 -0.00160424 0.01903352 0.01355337 0.02915311 0.00215676 0.08514176 0.04068467 -0.03306572 0.02835644 -0.09085545 0.06172447 0.04430490 -0.03951653 -0.00200954 -0.04779271 0.00963933 0.00386438 -0.00415530 0.04054871 0.02923424 -0.05307896 0.04201009 -0.04722787 -0.01842144 0.00009333 0.04985205 0.04255443 0.00911923 -0.01072426 0.02350518 0.01895704 0.02341363 -0.04879943 -0.02468463 0.03283368 -0.04267694 -0.01028677 0.02750140 0.02792602 0.02738203 0.00373658 0.02880585 0.01674607 -0.00738857 0.03368308 -0.06989074 0.06081523 0.02620446 0.02515519 0.06889910 -0.03786224 -0.07981806 0.03638101 -0.07057864 -0.05443394 -0.02865824 -0.02918431 0.02051153 -0.00332493 -0.01165866 -0.04812939 -0.01120594 0.01178872 -0.01323979 0.07307810 0.00476284 -0.05055879 -0.04914421 -0.00481287 -0.06710012 0.01469112 0.02320142 0.06550282 -0.00914748 -0.03589254 0.05905405 0.10123876 -0.00419901 -0.00719613 0.05544743 0.10254011 0.00782380 0.02079490 -0.02809303 -0.04530121 -0.03898184 -0.02739525 0.01066480 0.00833534 0.12200668 0.04381459 -0.00817765 -0.04889630 0.05482227 -0.08672274 -0.03184219 -0.01717448 -0.00146739 -0.00987157 0.03455915 -0.04361616 -0.00062902 0.03544388 -0.03759692 -0.05021265 0.05029952 0.01289930 -0.03469002 -0.00912714 0.00672391 0.01649949 -0.00577535 -0.05752452 0.00975136 0.02605979 0.04716295 -0.01831437 0.01437550 0.01546246 0.02756327 0.05089093 -0.03236460 0.01081885 -0.03754948 0.02966969 -0.03720146 0.07741702 -0.02197685 0.06275516 -0.03059018 -0.00424720 -0.01130959 0.00484924 -0.00575983 0.02731662 0.00062399 -0.04465419 -0.00074388 0.08434938 -0.01008605 0.00764129 -0.03048825 -0.05511282 0.02064251 -0.08207756 0.01726147 0.03612636 -0.00789190 0.00014601 -0.00796945 0.04558631 -0.02367557 0.00444732 -0.09099491 -0.00729158 -0.02615191 0.03486659 0.01161077 0.04520666 0.00065168 -0.03243519 0.07814760 0.01661739 -0.03682500 0.00981577 -0.09054305 -0.04702920 0.02616067 0.01455957 -0.00353303 0.02884221 -0.02081287 -0.07629243 0.00695370 -0.01778692 0.00880174 0.08003455 0.00398634 0.00012975 0.04273194 -0.08689152 -0.03103559 -0.07127035 0.09543968 -0.05054901 0.05314960 -0.06365775 0.03675173 -0.04571718 -0.02463496 0.03205013 -0.00552466 0.00456964 -0.00594977 0.04317423 -0.03427084 -0.00055008 -0.05752844 -0.03839415 0.03413538 0.01788029 -0.07257541 0.08720824 0.04316966 0.00323391 0.04534198 -0.06934435 -0.08984338 -0.02866630 -0.04099227 0.10802484 -0.05618636 0.01704556 -0.07502606 0.02171447 0.03243683 0.00787064 -0.01571300 -0.02152908 -0.01138533 -0.02238571 -0.03149838 -0.00156173 0.01275216 -0.01831292 -0.00598939 -0.01588922 0.01261405 0.00619349 -0.02560474 0.00613529 0.00123962 -0.08659820 -0.05092514 0.10997511 0.01421863 -0.00524529 0.00019514 -0.07556754 -0.04444754 -0.01348028 0.07491199 -0.02326861 -0.12794608 0.02975268 -0.03449291 -0.01024262 -0.01155580 -0.03674039 -0.01398400 -0.05561178 -0.00155903 -0.02296476 0.00109307 -0.02060547 0.03902683 0.06736176 0.00648715 -0.06171996 0.06332462 -0.05589305 0.02302027 0.00375052 -0.00503274 -0.04439599 -0.00869162 0.01229169 0.02595357 -0.04152213 0.01606401 -0.03725438 -0.05983813 0.01130613 -0.01860004 0.02045351 -0.00935699 -0.00286893 -0.06380587 -0.00323672 0.02018174 -0.05697639 0.01796653 -0.01225796 -0.04377607 0.02099486 -0.03662959 0.08018362 -0.03193632 0.02948860 -0.02269475 -0.00158489 0.01692997 -0.01128647 -0.02006533 0.00535052 0.04321480 0.01764181 -0.03647076 0.02001253 0.03082337 -0.02983007 -0.04337493 0.00000814 -0.03990178 0.01942642 0.03087643 -0.00010357 0.03358505 -0.08878423 -0.00656996 0.01361455 0.04418071 0.05161112 0.00361041 -0.08816525 0.00123013 0.07253796 -0.00680497 0.00461921 -0.06001453 0.03364732 -0.00767394 0.00199417 -0.03522309 0.16066504 0.03234567 0.11663093 -0.01790307 0.01066366 0.01568888 0.00870564 0.03152613 0.08003417 -0.02711316 0.02694526 0.04068552 -0.00444693 -0.04228260 -0.07975496 0.01086247 0.02282070 0.01170601 0.01554572 -0.02712515 -0.01338211 -0.05108256 0.05718218 0.06600715 -0.01412682 -0.01988017 -0.05348738 -0.02588307 -0.00519316 -0.04381202 -0.01363191 0.02466319 0.02176631 -0.02100345 0.05002431 0.00223020 -0.02882733 -0.04819504 0.05396529 0.03474168 -0.00557452 -0.06588931 -0.00704136 0.13441491 -0.02744192 0.00871559 0.00270329 0.01987642 -0.09018298 -0.03253672 -0.00472016 0.03399897 -0.04667177 -0.06853016 0.05842008 0.01362774 0.01524105 0.05741464 -0.01563466 -0.05132842 -0.00704558 0.03052486 0.07656473 0.03421598 0.00667690 0.00198740 0.17558958 -0.08450387 -0.06857902 -0.08434431 -0.06336001 -0.04416615 0.03017968 0.08363450 0.07305276 -0.00548235 -0.00748979 0.03120062 -0.07124944 -0.03179765 0.04072666 -0.01122775 0.00182425 0.01854352 
73903-0.07594238 0.01796364 -0.00135112 -0.00333468 -0.03906913 -0.01201122 0.01489581 -0.02089913 0.10003719 -0.07392261 0.03225327 0.02473335 0.00636177 0.02102265 0.03089060 -0.03134924 -0.06404098 0.03373779 -0.00518042 -0.01271136 0.01948555 0.05098340 0.00550926 -0.02428887 0.05736289 -0.02302925 0.04962537 -0.01309316 0.09409212 -0.02051228 0.04507539 0.03339470 -0.00257009 -0.02399968 0.05928390 0.03032476 0.04379846 0.01120587 -0.05540381 0.00846365 -0.04584219 -0.00396154 0.06241690 -0.03641150 0.01623077 0.01645221 -0.02737627 0.03496832 -0.02385960 -0.00398438 0.04775199 0.02285005 -0.07606190 -0.05024003 0.02827910 0.03891960 0.02728591 0.00345305 0.00702850 0.02369766 0.02937021 -0.02049670 0.01032636 0.05047403 -0.01418479 -0.00077433 -0.03624297 0.00867193 -0.03669184 0.02765483 -0.01740871 0.00542626 0.01084013 0.02200013 0.01794571 0.04306297 -0.00947919 0.04025915 -0.01774670 -0.01912778 -0.06145070 0.03126000 -0.00115523 0.00936637 0.02967164 -0.00809643 -0.04341362 0.06411163 0.05899989 -0.04464117 0.02456213 0.02848955 -0.01131580 0.01889319 -0.04771464 -0.07052191 -0.00488702 0.01804709 -0.07264372 0.04023656 0.07296780 -0.05345747 -0.02736894 -0.07020248 0.00509500 0.03921197 -0.02438912 0.08465929 0.04449910 -0.02722020 -0.03655417 0.05407390 -0.00085665 -0.00137844 0.07106167 0.06253675 0.02042741 -0.01186015 0.01316830 -0.00522864 0.09923287 0.00769629 0.00978996 -0.01202288 -0.01736288 0.01604834 0.01374294 -0.00028096 0.01195977 0.03993271 -0.00335022 -0.03831121 0.00423142 -0.01571903 -0.01704898 -0.05133960 0.01814442 -0.04913471 0.06358683 0.03993648 0.00467288 0.06941103 0.09235100 0.00120684 0.07162999 0.01684064 0.03822704 0.01895683 -0.01702017 -0.05076078 0.06184896 -0.00318958 0.02996226 0.08431702 -0.01691770 0.01826590 0.04895123 -0.03844851 -0.03934265 -0.13275942 -0.04281111 -0.02111810 0.00653309 0.05832545 -0.02759578 -0.01613747 -0.01467544 -0.11005738 0.05112905 0.00735669 -0.00278276 -0.03568284 0.05494421 -0.07528665 0.04417328 0.01837460 -0.02152363 -0.01051528 -0.04102257 0.01246542 -0.00352169 0.00923514 0.03614680 -0.01109485 0.15965936 0.05147836 0.02165487 -0.02121574 -0.03660213 -0.00712492 0.10324964 -0.03717653 -0.00228646 0.06687082 -0.01103984 -0.04261708 0.01131162 -0.01651539 0.00761495 -0.00010661 0.02812144 0.03702230 -0.03794552 0.00630495 0.02418205 -0.10343902 -0.02352784 -0.08566369 0.03009162 -0.02595068 -0.00153899 -0.00667142 -0.02194335 0.03773309 -0.08472937 -0.01057322 0.08704079 0.00841769 -0.06334057 0.02984183 0.02412351 -0.03310562 -0.07210101 0.00758023 0.03608061 0.01817781 -0.01168917 -0.05662557 0.00728231 0.04109305 -0.01028993 -0.00190037 -0.04285161 -0.10830160 -0.06046910 -0.00576138 0.00256882 -0.02608991 0.00048509 -0.05069792 0.02510072 0.05438080 -0.00955379 0.05120897 0.00500637 -0.00768774 -0.04581336 -0.04661058 -0.01873002 -0.01965985 -0.00910656 0.00771759 -0.00771715 -0.03558020 -0.04408149 0.02434779 -0.04449471 -0.08640025 0.02633034 0.01401446 -0.02991109 0.02091301 -0.03193769 -0.09023648 -0.05315933 0.07306558 -0.00987630 -0.00071834 0.07606030 -0.01181938 0.03104783 -0.13166402 -0.05278255 0.02717610 -0.01386143 -0.00120414 0.04412623 -0.05894787 -0.00322600 0.08652246 -0.00465762 -0.00498337 -0.00419991 -0.02606772 -0.00287455 -0.04445483 -0.05578917 -0.00053750 -0.06597452 -0.01892062 0.00540181 0.01114137 -0.01598815 0.01749417 -0.03687771 0.00244985 -0.00424908 -0.01056018 -0.02285592 0.02766569 -0.01757519 -0.03392760 0.00654466 0.04481994 -0.06923404 -0.13472672 0.05695025 0.02140952 -0.01614774 0.02533490 0.00352784 -0.10389028 -0.01526039 -0.02067529 0.05742448 -0.02637330 -0.11930270 -0.06080859 0.06210210 0.03539850 -0.00330420 0.04358843 0.07506454 -0.02390368 0.02569561 -0.06499826 -0.00932904 -0.04585499 -0.02361907 0.05483912 0.02089226 -0.06230016 0.02582444 0.05294023 -0.02427315 0.00876689 -0.04757544 -0.05130978 0.02829998 0.03379386 -0.09588508 -0.01591926 -0.00793483 -0.04615004 -0.01350085 0.01324183 -0.05063310 -0.12004694 -0.00352318 -0.02353628 -0.00883490 0.04822904 -0.03084642 0.01207361 -0.11640972 -0.04809754 -0.00874826 -0.04017000 -0.05314135 0.04821342 -0.00916475 -0.01807761 -0.01738140 0.05130200 0.04112475 0.02604895 -0.02793798 0.04032456 0.04344140 0.06855219 -0.00710810 -0.08756772 0.01323290 -0.04575055 0.04221840 -0.01758523 -0.02041554 0.07255618 -0.05273480 -0.04463623 -0.00139858 -0.03228387 0.04875416 0.03410862 -0.01784754 -0.09118838 0.04816244 0.03647121 0.03280392 -0.01059617 0.00824945 -0.04179013 0.02109089 -0.03092263 0.01613336 0.09325067 -0.03590599 -0.01734307 0.01095122 -0.03508218 0.01687572 0.00061620 0.01432052 0.00864924 -0.01262275 0.01801399 -0.01999735 -0.05717986 -0.03808663 -0.02342012 -0.01238015 0.05014532 -0.01998086 -0.06673498 0.06625552 0.02109602 0.04559500 -0.08954698 -0.03191192 -0.00314834 0.06807643 0.02350906 0.02105173 0.02014902 0.02614738 -0.06609744 0.01724842 -0.04266182 0.05469267 0.08569905 -0.04022294 0.04291619 0.00057342 0.00926134 0.00666743 -0.03847331 0.04774343 -0.01349877 0.08759391 0.01585761 -0.01754360 0.02176344 0.01442944 0.05781641 0.02179542 -0.01367151 0.03730481 -0.09979834 -0.04965018 -0.03559049 0.06669184 0.02107971 -0.03719990 -0.10976494 -0.00270014 -0.06443203 -0.00047266 -0.02647014 0.00775410 0.02393967 0.00252653 -0.04271706 0.04699518 0.04753019 -0.03757812 -0.02424217 -0.01526279 0.07269544 -0.02139632 0.02975084 -0.00554142 0.11163556 -0.05954265 -0.03654151 -0.00439670 -0.03055683 -0.03316990 -0.03963906 0.00710597 -0.01831547 -0.01334572 -0.01227175 0.00590593 0.00681747 -0.02885551 0.00143084 -0.01485525 -0.01837045 0.03039640 0.01359720 0.00265866 0.07406741 0.04616953 0.07222846 0.14089309 0.03624927 0.01569271 -0.11507455 -0.02494325 -0.00028331 -0.01674029 0.10012484 0.01175136 0.00419004 0.06434142 0.09482542 -0.07808147 0.00388887 -0.01404470 -0.08724269 -0.04499206 0.01471738 
88646-0.05067272 -0.08208735 -0.01929738 -0.01501827 -0.00038959 -0.05159938 0.04332055 -0.06904563 0.03221740 -0.01554368 -0.04385469 -0.04407806 -0.03588175 0.02280679 0.04877719 -0.05029437 -0.05559462 -0.06515324 0.01651813 -0.03672857 0.02289440 0.05664971 0.00267147 0.01296334 -0.02044016 -0.01513546 -0.00557246 0.09786832 0.02527305 -0.00212141 0.02741688 0.00330064 0.03280130 0.02488577 0.06983393 0.02414365 -0.00699104 -0.05710818 -0.08229160 0.00634324 0.02985863 0.02082054 0.03869525 0.00932222 -0.00132692 0.04198465 -0.03334956 0.00274552 -0.03820707 -0.00294017 0.09021682 0.08084838 -0.05117399 -0.05813904 0.01381184 0.02252370 -0.00412344 0.01588899 0.07759049 0.03403556 -0.05906681 -0.00111482 0.01572437 0.02370655 0.01353970 0.00149892 -0.01267412 0.02101260 -0.02727676 0.10750418 -0.02083897 0.02577888 -0.01548656 -0.02014923 -0.00316759 0.05028935 -0.02036578 0.01221025 0.00280789 -0.01625615 -0.01102243 0.02724736 -0.03714441 0.09324453 -0.02119380 -0.01338365 0.04334483 -0.02932577 -0.06010614 0.02194520 0.03120819 -0.01482157 -0.03873977 -0.01533248 -0.00963177 -0.01252772 -0.00039460 0.03576810 -0.04661172 0.07073739 0.00179699 -0.00422548 -0.02734251 -0.09876736 -0.03294269 -0.05283048 0.02841063 0.05473662 0.00447362 -0.01955818 0.01687199 0.00014156 -0.03243348 -0.00205641 0.00416805 0.03032403 0.01797360 0.01752640 0.00058520 0.05141484 0.00782514 0.02332065 0.03152848 0.04186667 -0.02847480 0.13816127 -0.01465796 0.01733425 0.02162299 0.00623077 -0.04240157 0.00416671 -0.04847397 0.03362853 0.08098639 0.08238877 0.01758221 -0.02829522 0.00292246 0.03479441 0.00646806 0.03178821 0.03734975 0.01424364 0.00510850 -0.02481154 -0.00945966 -0.01229870 -0.00807575 -0.02648347 0.09113201 -0.00054640 0.03403364 0.04573727 -0.01453612 0.04103000 0.00288156 -0.05675283 -0.08565611 -0.15289034 -0.06677265 0.00817633 -0.05832835 -0.04557263 0.01269871 0.02849567 0.00548222 -0.04823745 0.05614936 0.04187132 0.06379262 -0.01708682 0.05551881 -0.01173416 0.07920032 0.08374719 0.01194961 -0.01805772 -0.00998983 0.03047612 0.04270288 0.01911803 -0.00924963 -0.08956198 0.09270053 0.11791071 0.00849549 0.03105891 -0.01103668 0.01702955 -0.01122448 -0.05961910 0.01652621 0.00057317 -0.01856926 0.02637315 -0.02020464 0.04472414 -0.03286382 0.02657255 0.00147497 -0.00067629 -0.09543887 -0.00249321 -0.01151598 -0.03414627 -0.02798822 0.06249873 0.02083975 -0.01326988 0.09504215 -0.03234911 -0.01475788 0.02202436 -0.02956435 -0.01595682 -0.00700036 -0.03818225 -0.09613942 0.01852354 0.01484412 0.01826848 -0.00083936 -0.02246628 0.00487860 0.05256017 0.04753419 -0.07606748 0.03038303 0.02106338 0.08291823 -0.04721445 -0.06168294 -0.04429012 -0.04113692 -0.05070617 0.00836526 0.03422128 -0.03487058 -0.05357457 0.05885691 -0.05330305 -0.09594003 -0.04186359 0.01314490 -0.09390945 -0.05882730 -0.06852514 0.00770191 -0.02330212 -0.06024780 0.00524850 -0.00756898 -0.08717562 -0.01065575 0.01552774 0.04514596 -0.06014437 0.02875090 -0.08696876 -0.05121149 0.01852734 -0.03024030 -0.05966379 -0.03166589 0.10331713 0.05049342 0.05431195 -0.01650844 -0.05612279 0.07912651 -0.00688766 0.04886195 -0.00109441 -0.02652496 0.05183651 0.02098495 -0.00946550 -0.02004380 0.01288475 0.05019947 -0.00623552 -0.01943335 -0.02776803 -0.05049729 -0.03476553 0.00108621 0.02969190 -0.03436214 -0.02236490 -0.00299161 0.07844786 -0.02880765 0.01946403 0.02197720 -0.02051305 -0.04761739 -0.02085033 0.06450456 -0.01000122 -0.01297451 -0.03051789 -0.02004598 -0.02013790 -0.00778489 -0.07837914 0.05264473 0.02386641 -0.00521187 0.00831060 -0.05376568 -0.07892095 -0.02595938 0.01397101 -0.03515410 -0.10096491 0.03852309 -0.04627219 0.02494148 0.00661065 0.05126872 0.03203677 0.00516666 -0.05976383 -0.04571061 -0.08782940 0.07726248 -0.02641325 -0.00424595 -0.01154765 0.06586535 -0.04642367 0.01712613 -0.01171286 0.10162871 -0.02173328 -0.04728069 -0.04989777 0.00027009 -0.05248197 -0.01328100 0.04856213 0.00398691 0.03588153 0.01072570 0.01163169 -0.10279609 -0.06096014 -0.05584699 -0.01753491 -0.05164653 -0.08219001 0.00514196 -0.06556301 -0.03053525 -0.00977338 0.02978731 -0.00141753 0.05970217 0.00102642 0.01849302 -0.00698365 0.01149928 0.09602358 -0.00351073 0.02882254 0.03827009 -0.02282917 0.03779576 0.15468474 -0.11025730 -0.02862621 -0.07783020 -0.06059508 -0.02221277 0.03151442 -0.01336000 0.02886356 0.08447710 -0.03982925 0.06348084 -0.01323500 0.03718908 0.01447825 -0.00809639 0.01723031 0.07902426 -0.03872322 0.02486964 -0.06994900 -0.01042738 0.04361902 0.02555341 -0.07352958 -0.01616972 -0.01092017 -0.02154877 0.02134314 0.02289392 -0.01619562 0.02275624 -0.03685661 0.01703184 -0.02428205 -0.05660282 0.02329825 0.00507871 -0.02636531 0.02199334 -0.03053039 -0.05941364 -0.04405103 -0.01501020 0.02694536 0.00604961 -0.03611446 0.05026260 -0.04243067 -0.03871452 0.02247872 0.02932918 -0.01438194 0.00233183 -0.06440627 0.02348668 -0.00683752 -0.00473228 -0.04439731 -0.00492754 0.05812232 0.00526671 0.04157448 0.05099698 -0.02342957 0.01801392 -0.01495086 -0.02705456 -0.02886024 0.00730425 0.04916168 0.02388517 -0.08112455 0.06157391 -0.03021661 -0.01776803 0.02443523 -0.01849309 -0.00439963 -0.01062151 -0.04164270 0.08100840 0.00270997 -0.06587024 -0.05623777 0.05757860 0.00952956 0.02478462 0.03403945 -0.08940522 0.07115985 -0.01537533 -0.05602503 -0.03390697 -0.05229597 -0.10166371 -0.05614155 0.05552727 0.00870884 -0.01562445 -0.03196934 -0.03055804 0.06933524 -0.01804990 -0.00094585 -0.00094608 -0.01044984 -0.03193475 -0.08523086 0.05971082 -0.00406854 0.02691225 -0.04862738 0.00678359 -0.01883859 -0.00781526 -0.03199244 -0.00756633 -0.03194518 0.05981876 0.00701445 -0.00264011 0.03570542 0.05560002 0.06968141 0.12561211 0.04800880 -0.00018306 -0.08792247 -0.00888074 -0.04746253 -0.03623943 0.07785048 0.02343371 -0.07968921 0.05832693 0.03426208 -0.00908421 -0.02558041 -0.00875034 -0.00356540 0.01355158 0.02154511 
24381-0.06976026 0.00218324 0.04416835 0.06146711 -0.08802648 0.04781765 -0.01343574 0.02662652 -0.00566320 -0.07438503 0.02168855 0.09547312 -0.09235452 0.03707227 0.04599776 -0.07710495 0.03154327 -0.01152408 0.01705810 0.00433115 0.01251920 0.04381652 0.03282749 -0.03651747 -0.04062123 0.02172578 0.07138524 0.06655575 -0.02012912 0.01931259 0.03702487 0.00377967 0.09054788 -0.01276819 -0.03662274 -0.03092973 -0.10463640 0.03742385 -0.04511465 -0.00137044 -0.01738933 0.01966510 0.04260730 -0.03541566 -0.02692642 0.08264069 -0.00804367 -0.01709807 -0.03342787 -0.01235364 0.10267100 0.03991142 -0.00312930 -0.03877714 0.02467329 0.00329064 -0.00269561 -0.01247059 0.00921096 0.03916454 0.00727561 0.05462626 -0.00468312 -0.00676825 -0.01367853 -0.01563967 -0.00566356 0.01456867 0.00297191 -0.04921877 -0.02920209 0.03204035 -0.02432196 0.04994351 -0.01145576 0.03147586 0.04159772 -0.09104280 0.05640835 -0.03123120 0.04984039 0.00614818 0.00240011 -0.00175141 0.01413581 -0.00761210 -0.00589388 -0.04541255 0.04155206 -0.02562555 0.00860481 -0.03384613 -0.02769881 0.04847504 0.04732028 0.03670042 0.12572764 0.04275893 -0.02191357 0.00751449 0.04227966 0.01502067 -0.03812968 0.00797641 -0.01231233 -0.07936236 0.03172009 0.04092912 -0.04726245 0.00271729 -0.00927808 0.08806325 -0.00020820 0.05464160 0.02534782 0.01967836 -0.08988339 0.00277929 -0.03235507 -0.01442254 -0.00354769 0.03040205 0.02761286 0.06472635 -0.04729231 0.08809100 -0.02281209 -0.03781455 0.01087369 -0.06492377 -0.00855082 -0.00935954 -0.04065703 0.11785305 -0.00412939 0.02864560 0.00199895 -0.05000700 0.00960819 0.04066486 -0.01282146 0.04144263 0.05483418 0.02461074 -0.00063384 -0.01987657 0.00900903 -0.03675061 0.03972876 0.04800663 0.01029220 0.04944750 -0.05399984 0.07828584 0.02150703 -0.04113849 -0.04751937 0.06800048 -0.01154776 -0.11071070 0.07606068 -0.06392493 0.04885723 0.04448138 0.01598859 -0.00959315 0.02609779 0.04796583 0.06041780 0.01369766 -0.01777397 -0.04670265 0.00645441 -0.04251963 0.01154961 -0.07980359 -0.01536035 -0.01440947 0.03041165 -0.01177324 0.03769876 0.00414102 -0.00718055 0.04699631 0.05773848 0.00137174 -0.02352696 0.01502391 0.04876222 0.04142653 0.03637256 0.00113255 0.06124074 -0.01852301 -0.01008971 -0.06970693 0.03264251 -0.00770095 -0.02064445 0.02137140 0.00677546 -0.00194725 0.01303248 0.00369940 0.05259701 -0.04591543 -0.00744313 0.12415449 0.03568517 -0.01115335 -0.06815468 -0.04036563 0.01522625 -0.00380189 0.03228394 -0.01426455 0.02031737 0.00222382 0.01180171 0.07148228 -0.00890457 0.02939647 -0.01482999 -0.01581176 -0.03179030 0.00785457 -0.04811956 -0.04265208 0.00394609 -0.00035084 -0.06269354 0.01481573 -0.00071081 -0.02613206 -0.01887515 0.00782885 0.00780991 -0.02132148 0.07817438 -0.05174704 -0.03988666 -0.09277907 -0.00650688 -0.02383569 0.00898849 0.02855145 -0.09905262 -0.07400704 0.00318121 -0.03964020 0.00128410 0.00762135 -0.00544089 -0.04221756 0.05788989 -0.07409884 0.06747039 0.00034842 -0.02361764 -0.02622271 0.00591738 -0.01755980 0.01159817 0.00564966 0.03780129 0.12698118 0.01244724 -0.01856184 -0.05593989 0.06361351 -0.07931933 -0.05598297 0.02817590 0.09853566 -0.06137826 -0.02951090 -0.00653074 -0.02443819 0.04416737 0.04111494 0.10179107 -0.01447860 0.05524287 -0.05422387 0.05606660 -0.07081358 0.01133882 0.05109034 0.04709164 0.00118926 0.01161053 0.00169213 0.01802946 -0.01665777 0.03726530 -0.07358429 -0.06225288 0.01903516 0.03937712 -0.00555917 -0.01178581 -0.06033428 0.01355083 0.02018455 -0.00667757 -0.07514291 -0.02465495 0.03228199 0.02735765 0.04020798 0.06114768 -0.08927497 0.05173398 -0.01956077 0.00206590 -0.18074416 -0.04737336 -0.01236977 -0.00110956 -0.03452112 -0.06471319 0.00086102 0.04203147 -0.06652445 0.00103072 -0.06698009 -0.00321923 -0.00237965 0.05829265 0.02951203 -0.02043612 -0.01001843 0.07455344 0.01637289 0.00109878 -0.01945443 0.08986950 -0.04711286 0.03173281 0.03609861 0.00151383 -0.06016019 0.03734135 0.02266889 -0.00710800 -0.03851509 -0.02615848 -0.08146857 -0.01277088 -0.08973767 0.00477035 0.04157642 0.05376767 -0.03273620 -0.01802989 0.04373518 0.01736315 0.06393924 0.06745569 -0.01763436 0.03566597 0.04573693 -0.07596673 0.04549777 -0.00284135 0.02855426 0.00664145 -0.04350558 0.03263379 0.02844778 0.06919816 0.02619349 0.03321384 -0.06994061 0.01965462 -0.03840573 -0.01430757 0.04053975 -0.02806237 -0.02697347 0.03171200 0.01906998 -0.02705118 0.02604625 0.08320018 0.01900660 0.06610989 0.01520345 0.01544921 0.05111990 -0.01087462 0.01571349 0.04037810 -0.02743763 -0.03269254 -0.08658236 -0.08115023 -0.02964170 0.04630625 -0.02734743 0.00740089 -0.01678552 -0.00222486 -0.04692337 0.00338239 -0.04560305 -0.07189275 0.04676084 0.04173818 -0.04221216 -0.03913644 0.06087866 0.04798429 0.00062538 -0.02445265 0.02459259 -0.00014691 -0.13288546 -0.05750576 -0.01172028 0.03516504 -0.00613891 -0.00123729 -0.06385314 0.03276839 -0.01654129 -0.00054981 0.06363380 0.08163938 0.08857244 0.00468470 0.00100642 -0.06707665 0.00085849 0.02294280 0.00565519 0.06094836 -0.03446708 0.03700726 -0.02988978 -0.05028405 -0.01043994 -0.09191372 0.00833913 0.00123370 0.02646804 0.00303689 -0.06872938 0.06453871 -0.01017338 -0.03585217 -0.03600917 -0.03226072 -0.03230839 -0.11672989 -0.00522379 -0.01944697 -0.03038329 -0.02547538 0.03207330 0.01471470 -0.03840258 0.02898888 0.01858050 0.00095290 0.03026697 -0.01445681 0.02311156 0.02406912 -0.01372257 0.00612717 0.02878629 -0.03906660 0.00386148 0.01713467 -0.03746939 -0.05601564 0.00984880 0.05796926 -0.02153150 -0.02660482 -0.00374927 0.02039821 0.05556843 -0.06947033 0.01642538 0.00778102 0.01902090 -0.00111177 0.01075591 0.08197337 0.04430728 0.01454500 -0.00536242 0.10277201 -0.04519478 -0.00497946 -0.09375971 -0.10082474 -0.02819098 -0.01215009 0.09097216 -0.00691723 -0.06623649 0.00745853 0.05694120 -0.03261917 0.01345738 -0.03879289 -0.03219145 -0.05809024 0.00072706 
41542-0.04981736 -0.03523079 0.03022859 0.06672543 -0.00030098 0.01966337 0.02291727 -0.03563147 0.09084728 0.02357664 0.01229000 -0.00848378 0.00768520 0.03055391 0.01151930 0.00703397 -0.02476040 0.00086094 -0.01835941 0.02819020 -0.01292117 0.04141713 -0.01065153 -0.07067223 0.02861673 0.07761225 0.04779736 0.04068311 0.13641451 -0.01317122 0.04244301 -0.00652088 -0.03145161 0.02770749 -0.06039192 -0.05771347 -0.02798739 -0.00716540 -0.11880331 -0.02071778 -0.10007018 0.01768678 0.04053475 -0.03648228 -0.00632726 0.05773903 -0.01387891 0.04685591 -0.00334297 0.02311109 0.02326881 -0.03307109 -0.03598138 0.02706265 0.00619586 0.01987820 0.03164059 0.01072968 0.00397140 0.03477823 0.00229738 0.10549998 0.01104600 -0.06039140 -0.04951116 0.04871347 -0.02151998 0.01638046 0.02148634 -0.06809484 -0.04264098 0.06449823 0.03830655 0.00744388 0.04136178 0.08076154 0.04756954 -0.01755306 0.03848363 -0.02460161 -0.03423927 -0.05078312 0.00785587 -0.05104299 0.02053348 0.03730534 -0.02738640 0.08294257 -0.04002080 -0.02269992 0.02810474 0.00137104 -0.05580587 0.01768873 0.01206631 0.00532649 -0.04427547 0.03659799 0.00179368 0.01833197 0.01750747 -0.04040471 0.01445586 0.04947934 -0.02380049 -0.02934153 0.00540512 0.04113558 -0.03676968 -0.05387433 -0.04018908 -0.00670923 -0.00185785 0.01611192 0.00580857 0.00476019 -0.03271099 0.00095737 0.00770680 -0.02097254 0.05530684 0.01760110 0.00036022 0.02197772 0.05808390 0.04263783 -0.00161840 -0.01034918 -0.01408333 0.08218169 0.06700993 0.03157150 0.04825121 0.06083612 0.00684755 0.01881643 0.02228334 -0.02052055 -0.02029077 0.03714403 -0.02143680 0.07481048 -0.00571484 0.02376693 0.07818246 -0.01826737 0.01224601 -0.01641475 -0.03133288 -0.03895621 0.07565148 0.00625855 -0.06683341 0.10094142 -0.02059562 0.03847144 -0.05275005 -0.04920644 -0.02366334 -0.08038885 0.00765233 0.02506812 0.03665168 0.05760499 0.00456442 0.03224757 0.02431640 -0.05972792 0.02237566 -0.00501968 0.01696394 0.03051774 0.07507446 -0.01799663 -0.00153062 0.01935643 -0.06444901 -0.01440222 0.00925573 0.04241448 0.01856150 0.01113309 -0.04312521 -0.00684659 0.06548994 -0.02531961 0.07762957 -0.00629510 -0.06184380 0.01719425 0.04235665 -0.02845081 0.06214572 0.03555688 0.03938865 0.08130477 -0.04005970 0.02836971 -0.05993198 0.00271415 0.00291518 0.00197635 -0.01281938 0.02651005 0.03213095 -0.06048923 0.05453788 0.06902868 0.01827468 -0.04376782 0.02107076 -0.03608887 -0.00714318 0.01625970 -0.04107542 -0.01628905 -0.06560226 -0.01645081 0.01562575 0.00481891 0.02081241 0.05557044 -0.03536456 0.04230977 -0.02744777 0.00276074 -0.07135648 0.02455526 -0.00114013 0.04883128 -0.00469685 -0.01948103 0.00297213 -0.00613558 -0.04076257 -0.08137619 0.00229047 -0.00774974 -0.06068910 -0.09017048 0.04110863 -0.03128371 0.07471178 0.03012746 0.00725192 -0.04773902 -0.08067133 -0.03159777 -0.00090438 -0.04613575 0.03745152 0.00694226 0.00811698 0.00678230 0.01007449 0.02066128 -0.01161191 -0.05003368 -0.05144330 0.00713184 0.01852412 0.01592403 -0.00389690 -0.12742576 0.05915622 0.06805997 -0.00303192 0.02861650 -0.03432684 0.01366198 -0.02413278 -0.13034360 0.00401501 0.01464029 0.01103082 0.02309472 0.01183400 0.03311311 -0.02013417 0.06128745 -0.00356655 -0.01679007 0.03416852 -0.04177640 0.00590760 -0.03823905 0.04348926 -0.01379515 -0.06347029 -0.02066045 -0.06840926 0.03199758 -0.02425123 0.03031467 0.09384187 0.01290346 0.04946197 -0.04625419 -0.00263892 0.02766909 -0.07183763 -0.03262762 0.00626133 -0.00060665 0.01797323 -0.06051299 -0.00737015 0.03088329 -0.02512267 0.00569060 -0.02034966 -0.05502877 -0.02828220 0.02881666 -0.06205872 -0.01703288 0.02487692 -0.02286132 -0.00548175 0.04357373 0.03930883 -0.00875252 0.04354928 -0.04466012 -0.02937136 -0.01792646 0.01871741 -0.02790425 0.00333339 0.12592641 -0.01645747 0.01710511 0.07419778 -0.03316149 -0.01018261 0.01295162 -0.00095855 -0.02509455 0.07935519 0.02519948 -0.06521688 -0.02026445 -0.03150653 -0.02555520 -0.04487810 -0.02492919 0.01594514 -0.03730559 -0.06526088 -0.03405840 0.00591265 0.06014728 0.06763497 -0.03814585 0.01706978 -0.00114699 0.06499374 -0.02412197 -0.03860181 0.04069721 -0.06025385 0.03554636 -0.01251849 -0.07059138 0.00738881 0.01863135 0.04211282 0.08578721 0.06457201 -0.04724048 -0.12247586 -0.04019832 -0.01151634 0.00645844 0.00083840 -0.04912537 -0.01855712 -0.06042539 -0.04632937 -0.08690026 0.04152036 -0.01319686 0.02066666 0.03496332 -0.04125759 -0.06409210 0.00478754 -0.07133327 0.09597256 -0.10851935 -0.05655646 0.00666128 0.04312337 -0.01018878 -0.02674392 -0.00295556 -0.04649903 0.04384313 0.04256339 -0.05090790 0.05891019 0.03733234 0.00882729 -0.01240546 -0.00117765 0.00898283 -0.04560826 -0.02983981 -0.03299675 -0.02978668 -0.05246192 0.02147628 0.02913087 -0.04922287 -0.06511413 0.03167628 -0.06518428 -0.10348988 -0.05818617 -0.00713415 0.04589852 0.06003338 0.00271603 -0.04638266 0.02581900 -0.07689984 0.05069113 -0.04766549 0.01301876 0.02267186 0.03561099 0.03326007 0.00513241 -0.05465193 0.05673033 0.00516836 0.04853170 -0.03099295 0.05218850 0.04214777 -0.09330077 -0.04836918 -0.04232829 0.11933929 0.05358133 0.01568396 0.04920753 -0.04646454 -0.10706607 0.02759984 0.10169235 -0.02252438 -0.03998820 -0.10273618 -0.02820694 -0.09753069 -0.03388422 -0.03330630 -0.00230481 -0.00750686 0.00740075 0.01326383 0.02728580 0.02697254 -0.13450603 -0.06520446 0.00865226 0.01663697 0.01653050 -0.04008555 0.00785957 0.02292715 -0.05456672 0.02927435 -0.09762313 -0.02044516 0.00140773 -0.03388416 0.00593047 -0.04835204 0.04397551 -0.07838688 0.01585215 -0.00715363 0.03982727 0.00847803 0.04875366 0.04217406 -0.00263410 -0.02198016 0.05792144 0.07221700 -0.01811677 -0.00154049 0.12578644 0.00206930 0.06023760 -0.00663873 0.10446887 -0.00318279 -0.03281548 0.05756688 0.04689809 0.03599035 0.06021099 0.02612458 -0.02004343 0.03585856 0.02193722 -0.02176018 -0.05639330 -0.03533638 
15622-0.10077595 -0.01764622 0.03897166 -0.04593844 0.01715661 -0.01724917 0.05979870 -0.03779739 0.06112895 -0.03714003 -0.01747213 0.03550101 0.03549486 0.02301705 0.03522169 0.02766289 -0.10954344 -0.00996705 -0.01207141 0.01616751 -0.00594770 -0.02212696 0.04324117 0.03493432 0.04528690 0.07106131 0.04496280 -0.00137855 0.02615937 -0.04857089 0.03201089 0.01065133 -0.00264098 0.02623129 0.03010847 -0.01963877 -0.03820573 -0.04266864 -0.06319275 -0.03104514 0.02164459 -0.04475407 0.02004438 0.00063482 -0.03696389 0.02641538 -0.05460966 0.01438061 -0.04702732 -0.02384697 0.04432992 -0.02317565 0.01696911 0.02526354 0.00100748 -0.02739243 0.06817608 0.00886543 0.08216880 0.02078659 -0.01944580 0.04670368 -0.06817745 -0.05232233 0.06741227 -0.00659829 -0.04054372 0.01921687 -0.03544956 0.02925787 -0.00061197 0.03561329 -0.02611852 -0.01733857 0.03494352 0.02274538 0.02186846 0.05647455 -0.01926573 0.01394409 0.04355384 0.01633004 0.01456675 0.00547861 0.04754227 0.01417749 0.03147627 0.00589992 -0.05165641 0.03779242 0.01805400 0.00736993 0.01858380 0.04496987 0.07722866 -0.03112209 -0.02212571 0.01958328 0.01565128 -0.00898597 0.00487760 -0.00028045 -0.02034017 0.04953356 0.01037415 0.00388479 -0.01906893 0.02227630 0.04361451 -0.03807955 0.01122388 -0.00640271 -0.05738836 -0.06342605 0.11281389 0.02271564 -0.12616387 -0.00146480 -0.04450425 -0.06976276 -0.05054686 -0.01441088 -0.00625695 0.02573087 -0.00045465 0.00031225 -0.02211127 0.06533451 0.02892488 -0.01335599 0.02052295 -0.02565089 -0.03270254 -0.02820109 -0.06470260 0.07514989 -0.01008122 -0.14610088 -0.07731757 0.01849148 -0.02519377 -0.02881871 0.04113417 0.03058241 0.03846078 -0.05015965 0.01127251 0.03865880 -0.03493000 -0.02077643 0.02471538 0.00536382 0.03929131 0.03474134 0.05212680 0.02170489 -0.00732015 0.02323760 -0.02753505 -0.03820569 -0.01040388 0.05131797 0.01251523 0.02936913 -0.00498380 0.00295161 -0.03355404 -0.07108469 0.07748483 0.01047008 -0.00629038 -0.05549324 0.05217467 -0.02027778 0.00288454 0.00028972 -0.06506178 -0.01029082 -0.02428472 0.03202320 0.06117219 0.01407381 0.07406264 -0.01677448 0.02347546 0.00290846 0.02927946 0.00318344 -0.02756743 0.00518570 0.07233469 -0.01359824 0.02174240 0.05364293 0.02129060 -0.07934825 -0.04972514 0.06908653 -0.08892917 -0.04282847 0.02565719 0.01964669 0.02390179 0.02854536 0.05642953 0.00473894 0.03130713 -0.01156917 0.00622047 -0.00396764 0.06650575 -0.04409991 0.03087972 0.02166977 -0.02686060 -0.01389002 0.10787538 0.01470033 -0.06489301 0.01131147 0.06789917 0.08328723 0.00557111 0.02139988 0.05140057 -0.00089897 0.04618318 0.05607765 0.05289426 -0.01402436 0.05421532 0.02617088 -0.04728034 -0.12259685 0.00181465 -0.01598282 0.00883653 -0.00707197 0.00133121 -0.06365845 -0.04248215 -0.04894380 -0.06721459 -0.09744681 0.00976786 0.02112900 -0.04218952 0.00535939 -0.05199514 -0.05377065 -0.01273961 0.00637163 -0.00756737 -0.06972533 -0.03315687 -0.04409007 -0.00673932 -0.07885808 0.02253972 0.00601300 0.00229478 -0.01802232 0.04075433 -0.02321278 -0.05848870 0.11555482 -0.02051594 -0.02956766 0.05137520 -0.02453582 0.07049322 0.00487816 0.01258533 0.01428876 0.01864668 0.00673575 -0.11830001 -0.03494203 -0.01364057 0.01655483 0.09163812 -0.00595110 0.09425673 -0.02785284 -0.05341133 0.04528777 0.06082650 0.01369985 -0.04676093 -0.04503294 -0.01224785 -0.03740545 0.02475120 0.01902106 0.00249790 -0.04608558 -0.01587335 -0.02804688 -0.01134505 0.01679708 0.10868669 0.01123369 0.02897749 -0.05112009 -0.01213556 0.01704037 -0.01090442 0.07456494 -0.00255987 0.02802987 -0.03963285 -0.07553171 -0.02597808 -0.09332813 -0.02910363 -0.06109467 -0.01911068 -0.03024473 0.04114794 -0.01673407 0.01505944 0.03913342 0.09728760 -0.01331302 0.00339827 -0.04410557 0.03635599 0.00465614 -0.01046778 0.05222218 0.06230601 -0.04704642 0.05744518 -0.00867526 0.09345699 -0.02479477 -0.01577406 -0.02095005 0.03791982 0.03658821 -0.08004243 -0.02404878 0.09153602 -0.04390615 -0.02314288 -0.05602105 0.00917619 -0.09038343 -0.01054854 0.01727604 0.00669125 0.04703075 0.05101731 -0.00429961 -0.09432101 -0.07005186 -0.00014367 0.01749841 0.04953068 -0.10045879 0.04350243 0.00120873 -0.06810024 0.02077506 -0.03276649 0.02903390 0.00999591 0.02578974 -0.05908985 0.12364946 -0.05587182 -0.05107064 0.01972298 -0.14602314 0.02494878 0.03675459 -0.01356766 0.07296481 0.06803182 -0.00996698 0.02724778 -0.03901640 -0.02758852 0.01797506 -0.03225545 -0.02876210 0.00190813 -0.02109239 0.01237545 -0.03982310 -0.05151303 -0.02054208 -0.03279356 -0.00500910 -0.01079896 -0.01028488 -0.07631302 0.04030930 0.03283675 -0.05346812 0.01829912 -0.03149472 -0.00255483 -0.01964856 -0.05334250 0.03200415 0.03385820 -0.01257729 -0.00202375 -0.02696989 -0.04441279 0.00016843 -0.02676055 0.00814042 -0.03601800 -0.00277185 -0.09727464 -0.07432201 -0.05490141 0.00452304 0.04285873 0.04971673 0.02412065 0.02657832 0.03510480 0.00021786 -0.00013168 0.02440122 0.03219159 0.04272934 -0.03114944 -0.05215216 -0.03116945 -0.01391140 -0.01229583 -0.05648375 0.04910595 -0.01421858 0.00775983 0.01187761 0.01183383 -0.06034710 -0.04190185 -0.04022317 -0.04921464 0.08250491 -0.00023635 0.01695916 0.03012884 0.05041142 -0.00979740 0.04941523 -0.00979293 -0.04546975 -0.00380605 -0.05676183 0.02125451 -0.06880636 0.02648977 0.04247684 -0.06954201 0.03594662 -0.01749657 0.01573619 0.01236190 0.05272565 0.05031823 0.00311998 -0.01850791 0.01307265 0.05896016 -0.10224137 0.02495505 0.02851322 -0.03594624 0.00006710 -0.04761799 -0.06989659 0.03456368 -0.03663195 0.04932302 -0.02924916 0.00616188 -0.03272359 0.03311888 0.02461507 -0.01706909 0.05088578 0.00893492 0.03148218 0.01971190 0.05185060 -0.02357786 0.04407164 0.07015008 0.04876547 0.05720074 -0.20487998 0.00916799 0.00531352 0.02047223 0.10116648 -0.02378386 -0.03984476 0.00395958 0.09875207 -0.00288817 -0.01514586 0.01724260 -0.00717012 -0.00976571 0.03885644 
5662-0.07481042 -0.07795808 0.00200225 -0.02691050 0.04032356 0.00544188 0.03345906 -0.01210623 -0.05094478 -0.01011372 0.00928495 0.06455493 -0.01628177 0.00683069 0.04937563 -0.02426798 -0.01913817 0.07332795 -0.00432356 0.00951646 0.03109348 -0.00304769 -0.01294729 -0.04937670 0.02020634 0.01530457 -0.01620364 0.10960737 0.04464034 -0.05533797 0.02183248 0.04007893 -0.04780277 -0.06541044 0.13708206 0.05229149 0.00561611 0.00501599 -0.09977020 -0.02937608 -0.02729098 -0.00255994 0.07508475 -0.01254012 0.03428023 0.01129059 -0.03605476 0.03131250 -0.01905847 0.03362918 0.04133528 0.02226851 0.01002462 -0.08760533 0.03982676 0.04306858 0.04424097 0.01103925 0.03878894 0.02294944 -0.00507399 -0.03710074 -0.02151143 0.00696277 -0.20888136 -0.05163523 -0.01338594 0.01101028 -0.03383495 -0.06073909 0.03138209 0.03180885 -0.02334775 -0.08296543 0.01179959 0.10986104 0.03497428 0.01863089 -0.01878699 -0.07511693 0.00967114 -0.00641112 -0.00186784 0.01378434 0.05402472 0.01193703 -0.07445439 0.06642651 -0.02066271 -0.04945132 0.04599987 -0.03247464 -0.04802235 0.03282341 -0.03039316 -0.07645986 -0.01534522 0.00185074 -0.02842486 -0.03018460 0.06154222 -0.05994174 0.07469024 -0.04080152 -0.01949113 -0.03545883 0.02512607 0.06529456 -0.03923621 -0.02956725 -0.02109394 0.05152923 0.04706535 0.04051257 -0.01407118 0.00439225 0.05527385 0.00905298 0.03177425 0.00739897 0.07558198 0.05195332 0.00874550 0.01579458 0.05740421 0.01489965 0.03442402 -0.01966100 -0.11079875 0.03749343 0.03309089 -0.00983582 -0.00847581 0.02337476 0.07750838 -0.01680996 -0.02612243 -0.04461695 0.04456292 0.06422945 -0.03777782 -0.00408998 0.07768466 0.00404511 0.07050098 0.01381875 0.04422034 0.07029543 0.00188416 -0.01653483 0.01768759 0.06585320 -0.05133628 0.02525376 -0.03940448 -0.00140157 0.01197156 -0.14719902 -0.01929884 -0.07041793 -0.00277766 0.08728318 0.01368253 0.00638401 -0.01884442 -0.03935904 -0.03757090 -0.05187243 -0.00711886 -0.02280837 0.08199846 -0.04404419 0.03620578 -0.02510313 -0.00536323 0.06997829 -0.06921232 -0.01808530 -0.08192538 0.02012730 0.00588885 0.00354770 0.01209278 0.03306425 0.09657508 0.03939836 -0.00482717 0.01105486 0.03244160 0.02419541 0.01470234 -0.03442108 -0.00870148 0.12111341 0.01850129 -0.01957795 0.09593202 0.01429143 -0.01004524 0.01066626 -0.00071947 -0.05590606 -0.00061303 -0.03597185 0.01494904 -0.04369875 -0.01431281 0.03209020 0.01719157 -0.02749063 -0.00591506 -0.00341371 -0.02820386 0.00563605 -0.01766135 -0.00885180 0.04672120 -0.01866090 0.01512683 0.01843148 0.00754286 -0.05643564 -0.02229051 0.08877443 0.05585444 0.02498850 -0.00563955 -0.05077721 0.01681151 -0.00558324 -0.02991302 -0.05565681 -0.05033796 -0.02569001 -0.00886576 0.03020136 0.00256372 0.02868697 0.03744049 0.01462925 0.04495399 -0.04914272 0.01666810 0.00080938 0.00982121 0.02082802 0.03977990 0.00357262 -0.02307586 0.04731507 -0.00722630 -0.00821326 -0.00104018 -0.01794118 0.03590079 -0.00222719 0.05400464 -0.06398989 0.04721796 0.01572986 -0.00135212 -0.00283136 -0.02055034 -0.03666658 0.00725762 0.08834614 0.01020187 0.01175435 0.01136601 -0.04059976 0.07445820 -0.07346130 -0.01454344 -0.13287018 -0.04144371 0.00152673 -0.02542982 0.00629165 0.00997239 0.00283387 -0.00901667 -0.01079064 -0.00642682 0.01842677 0.02275406 0.03467805 -0.07888445 0.00528642 -0.04198223 0.01810223 0.00683585 0.01413780 -0.05085241 0.03158130 -0.01671802 0.01435870 0.06272135 -0.00489597 0.00552149 0.08069784 -0.00722099 -0.03006122 0.01434083 0.06537599 -0.02129289 -0.03117963 0.05077285 0.03426811 0.04968524 0.00642489 -0.05452867 -0.02769263 0.01262887 0.02819299 -0.01007086 -0.11335675 -0.08043556 -0.06150444 0.06045083 0.01132705 -0.02369783 -0.06133309 0.02198013 -0.06488293 -0.02567840 0.01737386 0.00176647 0.02993281 -0.03205926 0.09181374 0.00501892 0.00968029 0.05627492 0.01761724 0.05473281 -0.00841956 -0.02770885 -0.08599132 0.04046011 0.00252136 -0.01443783 0.03955487 -0.01984111 0.03503551 -0.00034599 -0.06417800 0.02153868 -0.10001611 0.01320206 0.01837191 -0.02859158 -0.00596939 0.06091147 -0.04501381 -0.05909812 0.02787513 -0.00017074 -0.00750669 -0.00481435 0.03483991 0.02275487 -0.00086138 -0.00318088 0.03991235 0.00886443 0.03223671 -0.01961277 0.07144739 0.06326114 0.07366882 0.05579618 -0.02130010 -0.06872407 -0.01441203 -0.04608161 0.06678264 -0.05240959 -0.04473044 -0.05481362 -0.02421057 0.04787275 -0.06119354 0.04684405 0.05515959 -0.05591045 -0.00263127 0.01465968 0.04260743 0.04155996 -0.02791581 -0.03885797 0.00710681 0.02215751 0.06913906 0.05621504 -0.06333020 0.00634881 0.03713811 0.00151589 -0.03993011 0.05278664 0.03189396 0.01737582 -0.04287079 0.05966168 0.02064750 0.05430415 -0.05173280 0.02901277 -0.00416220 0.03426950 0.05609505 0.02531153 -0.02037930 -0.00332107 0.00156219 0.01466211 -0.03188524 -0.04051834 -0.01024604 0.04927251 0.06441749 0.01025029 -0.03039305 0.09683338 -0.07585467 0.00510045 -0.02308706 0.01887594 0.02556677 -0.02539501 0.02111723 -0.03779904 -0.01183934 0.04242468 -0.05498742 -0.01706607 -0.02918535 -0.03149786 -0.00571373 -0.08749980 -0.00553531 -0.07772376 -0.00408681 0.05393391 0.04985627 0.00586312 -0.03878295 0.01500628 -0.03905934 0.04830775 0.01117055 0.01583104 -0.07254234 -0.06682532 -0.01346314 0.06959802 -0.05600014 0.02270764 -0.04664322 0.03144433 -0.13097124 -0.05031828 0.01819804 -0.01938991 0.01154715 -0.00335476 0.07507017 -0.02217146 0.04066833 0.03285878 0.06230157 -0.03070414 -0.05740002 -0.01543020 0.01794121 -0.01528005 0.02679652 0.02363936 0.06344450 0.07312031 -0.03037498 -0.03991330 -0.02777413 -0.01222805 0.01865029 -0.01180727 -0.04550942 0.03266982 0.01097341 0.04061085 0.05912388 0.00600307 -0.01372733 0.07364934 -0.02231321 0.00959624 -0.07055099 -0.03582703 -0.02205818 0.04485050 0.09064922 0.05014514 0.09315441 0.05807687 0.02651793 -0.07465132 0.00465676 -0.00259060 0.02690965 -0.09346714 0.05347962 
4282-0.00903127 0.04316609 0.06915708 0.00960549 0.01916564 -0.01514438 0.02965164 -0.04167099 0.03862788 0.04373322 -0.00456831 -0.01259478 0.03631695 0.02802984 0.00289256 0.00066767 -0.08812149 -0.03915885 0.05616649 -0.01926076 -0.05445179 -0.00657667 0.01537631 0.02857632 -0.00819393 0.09596666 -0.02460978 0.13060594 -0.00844228 -0.03072776 0.01230512 0.05687456 0.02597649 0.05627955 0.01898597 -0.01880583 0.01685439 -0.01791026 -0.04021660 -0.04468232 -0.05049175 -0.02204122 0.05229895 0.02032817 -0.03468572 -0.00416433 -0.00905828 0.05669087 0.00508007 -0.01179797 0.03117993 -0.03024335 -0.02867362 -0.09918139 0.02543271 -0.01835479 0.02462383 0.01025784 0.10080586 0.00025000 -0.01003117 0.06163957 0.02932849 0.01560816 -0.00486537 0.01408069 0.01805966 0.01516353 0.01069927 -0.02524294 -0.04135707 0.03045223 -0.02805525 -0.02667883 0.01205113 0.03296710 0.03719697 0.00666454 -0.01394613 0.01212304 0.01032014 0.00427745 -0.01559353 0.08151059 0.00321537 -0.00810544 -0.03299880 0.02968478 -0.01964008 0.01742565 0.01090426 0.03443343 0.02953870 0.01292127 -0.00535001 0.01622304 0.02781838 0.00840477 -0.04294309 0.03864337 0.02148384 -0.06760403 0.00326613 0.03449891 -0.01632030 -0.05313466 -0.06300806 0.02982151 0.08845006 -0.06819438 0.03629580 0.08296005 0.04479488 -0.07194778 0.09291059 0.02341176 -0.08119912 -0.01726885 -0.01478981 0.06347169 -0.01931552 0.00080891 0.00471002 0.04470418 -0.01088715 0.01522035 0.02218189 -0.03088054 0.01327704 0.04804837 0.00529879 0.00303797 -0.02763543 0.08114835 0.05717234 -0.00489206 -0.00213967 -0.05576681 0.00584363 0.03380935 -0.02134592 0.02763159 0.00370033 0.01539475 0.07329556 0.05901274 0.01391970 0.02007359 0.00851044 -0.00840571 0.06407212 -0.00494526 -0.04436928 0.06086227 0.04601342 -0.02075799 0.02746396 -0.10451899 -0.02641456 -0.08299040 -0.02223831 -0.02073294 0.03166678 -0.07079055 -0.05497809 -0.00863400 0.03608105 -0.06649498 0.02160925 -0.01433923 0.03099657 -0.11613587 0.04247981 -0.04791323 -0.02295493 0.03740236 -0.01512987 -0.00882280 -0.00759828 0.05236638 0.00013015 0.01960558 0.01489542 0.00101063 0.03431015 0.03255732 -0.00621583 0.02171322 -0.00620052 -0.01379464 0.01804462 -0.02436776 0.04110052 0.06701160 0.01187931 0.01082937 -0.03313740 0.05237625 -0.06263541 0.01004621 0.03106170 0.06966904 -0.07077425 -0.00143905 0.04201651 -0.05218826 0.04660216 0.04229749 0.02982539 -0.05933816 0.05218807 -0.03911893 -0.06735226 0.02587622 -0.07080126 -0.00662739 -0.04262061 -0.03886572 0.04926097 -0.05631799 0.05139576 0.05310157 -0.06192862 0.00352267 -0.04801565 0.03042394 -0.01591711 -0.01515486 0.00507385 0.03167430 0.02161484 -0.05195755 -0.01831361 -0.06365529 0.01129516 -0.01882964 0.00536713 0.02837457 0.01928650 0.02266110 0.00939083 -0.01219596 0.04101212 -0.03142467 0.00614299 0.00659223 0.00099334 0.00522329 -0.02122286 0.00054399 -0.00092841 -0.01812272 -0.00702653 -0.02193055 -0.00500453 -0.05474861 0.04310890 -0.07602999 0.01567750 0.04805655 -0.04169892 -0.04508436 -0.06170204 -0.06185959 -0.01388332 0.18199724 -0.00936134 0.05786855 0.08489013 -0.03256229 0.00394733 -0.02325131 0.06120018 -0.02251186 -0.03602190 0.00131691 -0.07060770 -0.01221873 -0.02399611 0.07358214 0.04019866 -0.00326866 0.04335294 0.06311903 -0.07118770 0.09792703 -0.00981261 -0.02696795 0.04544138 -0.00755275 0.05030270 0.02551402 -0.03096679 -0.02888710 0.01118596 -0.00648245 -0.06917538 -0.02396419 -0.03631739 0.05115647 -0.03122600 0.00866200 -0.05147762 -0.04041262 -0.02061390 -0.13569707 -0.00769729 0.02015784 0.18081670 0.00732535 -0.08614466 -0.09121008 0.03328920 -0.00381860 -0.05157326 -0.01001716 -0.02695707 -0.05788578 0.05114140 0.04417703 0.06320920 0.02054757 -0.01887682 0.04864321 -0.03961994 -0.03267161 0.02691967 -0.00474240 -0.07702942 0.04006559 -0.04267555 -0.05899692 0.03445410 -0.02179551 0.05939317 -0.00636574 0.01552286 -0.04403363 0.00514814 0.02471109 -0.07852975 0.00500714 0.03497127 0.03808928 -0.05893566 0.02970855 -0.07096495 -0.08102437 -0.08015988 0.03481125 0.00795918 -0.02760761 0.01546949 -0.05463671 -0.11590298 0.00999488 0.04758542 -0.03415129 -0.00740669 0.01664788 0.02255535 0.00019125 0.03466651 -0.12302116 -0.04027446 0.02222289 -0.00858284 0.09924518 -0.01894474 0.05901397 0.05184980 0.03189304 -0.01585304 -0.12255783 0.01207843 0.01112537 0.00510177 0.04617463 0.03055007 -0.02099600 -0.02600590 -0.06170241 0.01031900 0.01588962 -0.00927418 0.01478515 0.08047587 -0.04035753 0.04989618 -0.04869871 -0.00520638 0.00620961 0.06876758 -0.02385233 -0.02962158 -0.07586117 -0.04877428 0.02145823 -0.00935541 -0.03073761 -0.00572721 0.00403258 0.00671624 -0.00137905 -0.06518776 0.04855777 0.01845553 -0.01728557 -0.03768322 0.00136335 -0.02407115 -0.00250525 -0.03445590 -0.07403798 0.02541868 -0.06297617 -0.00793388 -0.04338126 -0.06625023 0.02340345 0.01773512 0.03793674 0.00491334 -0.06294013 0.00686726 0.02785482 -0.07971381 0.02454787 0.10509032 -0.01356206 -0.01861604 0.01259061 -0.02214314 -0.04394299 -0.04477680 -0.01251140 0.04530635 -0.01332550 0.05746071 0.02116239 -0.09578930 0.02925906 -0.01818376 -0.01644669 0.02003562 0.03152321 0.01701428 -0.06877145 -0.04107065 -0.00780931 0.03962371 0.03223195 -0.01938912 -0.04353359 0.03834623 -0.02971848 0.02071184 -0.00378199 -0.05211677 0.03399373 -0.07574547 0.02549582 -0.06761892 0.02813936 -0.08034644 -0.01871441 0.05064533 0.03244065 -0.01288659 -0.02713079 0.00795434 0.06803750 0.02286598 -0.01878288 -0.00663703 0.04326674 0.03361154 0.00422193 -0.00541131 -0.09335815 0.02540452 -0.09333566 0.05162109 -0.00357104 -0.02863346 -0.02880555 0.04942382 -0.00089737 0.02561399 0.02729169 0.03233086 0.04193081 -0.01261151 -0.00694437 0.10361009 0.07850079 0.05223433 -0.11244854 -0.03216882 -0.02988031 -0.00635606 0.04359118 -0.01305064 0.05243715 0.01376491 0.10783723 -0.00645204 -0.01518039 0.00708853 -0.09104434 -0.00564568 0.00705375 
26337-0.04625623 -0.01046129 0.08407867 -0.00102488 0.06333590 0.03994971 0.08286036 -0.03473618 0.07110319 -0.00772092 0.01315318 0.07544615 0.02812765 0.02049128 0.01954390 0.02456552 -0.04465071 -0.01934096 0.01167572 -0.02486398 -0.00191727 0.08632897 0.01942188 0.02901788 -0.01836357 -0.03687128 0.01238594 0.00075737 0.02900137 0.05016002 0.01657377 0.01700787 0.04215423 0.04405766 -0.05064804 -0.08699464 0.00865757 -0.07791178 0.08572436 -0.00790919 -0.03327719 -0.01676544 0.09952445 -0.03514675 0.02925982 -0.00002492 0.02403941 -0.05986428 0.02720507 0.01199124 0.04862058 -0.01654831 0.03490607 0.03965839 0.01934014 -0.00237648 0.01706071 -0.01359189 0.04266198 0.02444263 -0.02317169 0.10890672 -0.05049801 -0.07933866 -0.00915650 0.03836134 0.00621163 0.00502972 -0.04482129 0.01227826 -0.04462347 -0.06601290 0.03413775 0.03319748 -0.02864893 0.01440914 -0.05850990 0.06258553 0.06660639 0.06950049 -0.02280644 0.01028244 -0.01086982 0.02187049 0.02051910 0.00307342 0.05231592 -0.02170844 -0.06425349 0.02995417 0.00582020 -0.03442686 -0.01669109 0.05979537 0.01856443 -0.01395106 -0.03546472 0.00542357 0.00990463 0.03591567 -0.03960750 0.05715822 -0.01810165 -0.04238323 -0.07623175 -0.09086762 -0.02432463 0.12401557 -0.00305387 0.03700843 -0.02092052 0.01343347 -0.10243340 0.00444908 0.03725882 0.04567586 -0.08686863 0.03856544 -0.01964884 0.06120284 -0.06160072 0.06201145 0.00307295 0.02530710 -0.05169893 0.00725241 -0.05454800 0.01743854 0.07508831 0.04695931 -0.00227198 0.03886732 -0.03521545 0.04107918 -0.01923262 0.00981871 0.05086960 -0.00593367 0.04689453 0.04367822 0.04585448 -0.04482629 -0.01276843 0.02337302 0.07687317 -0.03765109 0.02666192 -0.02679216 -0.07264565 -0.03969557 0.06501877 0.06200853 -0.05626744 0.02408600 -0.02980445 -0.01825473 0.02832121 -0.00045506 0.00874741 -0.06573182 0.03730764 -0.03081020 -0.01457115 0.02039694 -0.04445390 -0.02406654 -0.02697277 -0.02655384 0.03696438 -0.01736440 -0.02327907 -0.05725414 0.05448800 -0.04040415 -0.03471107 -0.04531373 -0.00629886 0.00170909 -0.03978240 -0.02199096 -0.03941992 0.00850512 0.00557959 -0.00385029 -0.01575266 -0.01324082 0.00265176 -0.01940331 -0.04640437 0.04100333 0.07701855 0.01815463 0.06237967 0.04952684 -0.03971957 -0.02541417 0.06277507 0.01861660 -0.04596417 0.02350248 -0.05747897 -0.03055525 0.00598221 0.04412434 0.02074752 -0.07398371 0.01377497 -0.01844117 -0.04895420 -0.05584618 -0.01801054 0.04289067 0.03015543 -0.01675289 -0.01533137 -0.00798681 0.02712036 -0.02890862 -0.01648619 -0.01919997 0.01584245 0.00914985 -0.03906718 0.01811947 -0.04028627 0.01540859 -0.09291590 0.04691931 -0.01144527 0.02722243 0.00631417 -0.03611672 -0.00207586 0.03183465 0.01447198 -0.02720656 0.00311262 0.01358519 -0.01768105 -0.06818248 -0.07026363 0.04220725 0.01563935 0.03774305 0.00546485 0.01760084 0.01712732 0.05283617 -0.02119944 -0.09729784 -0.01160961 -0.00226528 0.00228361 -0.03254309 0.01707827 -0.03043022 -0.10299757 -0.05094879 0.02425831 -0.00355704 0.10707334 -0.00053576 0.02729356 0.00084593 0.07320160 0.14263286 -0.01009675 0.03055519 -0.00371649 -0.04643294 -0.01979426 -0.07637622 -0.09936619 0.06036915 0.00276245 0.06294693 -0.07097097 -0.05578409 -0.04505128 -0.03035136 0.04228539 -0.00141903 0.12060711 -0.00117660 -0.01382081 0.03997000 -0.03499091 -0.04934507 -0.01265479 0.04935357 0.05638725 0.05751546 0.00188572 0.02878160 -0.01070151 -0.07113525 0.01362490 0.01492100 0.01968378 -0.06246135 0.02276803 0.07418473 -0.03766726 0.05716049 0.00894559 -0.08444819 0.03740668 -0.02071392 -0.01066765 0.02912099 -0.07753864 -0.02807910 -0.00059477 0.04379735 -0.01833879 -0.05781115 -0.02306593 -0.12833962 -0.01132700 0.00043332 0.00257051 0.03555497 -0.04141807 -0.01489299 0.04796924 -0.01890572 0.01598846 0.04329584 0.01310057 0.10577817 -0.01981361 -0.02564441 0.03488779 -0.05417980 0.05422264 -0.07412330 0.04594867 -0.00021197 0.05756741 -0.02416278 0.07442671 -0.07296880 0.00578909 0.03120154 -0.02525996 -0.10727894 -0.00922724 -0.03089080 -0.01099734 0.00411152 0.01555896 -0.04203807 0.10058553 -0.04675984 -0.10714402 0.05093826 0.02926494 -0.02081728 -0.00729950 0.02160104 0.02766119 0.04699481 -0.07011296 -0.00523427 -0.04952104 0.03863125 -0.03070737 0.01310245 -0.03063107 -0.06485168 0.00553488 -0.00803247 -0.01921815 0.02247080 -0.00982030 -0.10513162 0.01794683 0.06531870 -0.02805996 0.01312310 0.02652609 0.06482637 0.05824970 -0.03478849 -0.07496799 0.04272895 -0.02852358 0.05656609 0.00454572 0.04394037 -0.04370771 -0.02451568 0.03000730 -0.03822605 0.02595372 -0.01245839 0.04209964 0.00992974 0.02053569 -0.03851036 0.02718321 -0.01058342 -0.02044339 -0.02939100 -0.01937112 0.05849671 -0.05092414 0.03183075 -0.00557423 0.00362723 0.08008144 0.00160273 0.03769433 -0.05383030 -0.03931151 -0.01455422 0.07187381 0.00338787 -0.06378862 0.02159593 0.05940752 -0.05459800 0.03299179 -0.01048890 -0.03459304 -0.05829837 -0.06199726 0.04610071 0.04733562 0.05601728 0.05026469 0.04634680 -0.03348678 -0.00660666 -0.01435589 0.02473231 0.05401890 -0.03287175 0.00733055 0.03569730 0.05239975 -0.03714010 0.00983087 0.00963711 -0.07282051 0.00712978 0.01197440 -0.01428354 -0.00000964 -0.04204410 -0.05032920 0.01509099 -0.04764790 0.00645663 0.04084183 0.00231274 0.03463424 -0.06453673 -0.07960699 0.06469877 0.02176066 -0.09742584 -0.02178211 0.00928854 -0.12112571 -0.02051966 -0.03950721 0.02019501 0.00753530 -0.05773720 -0.00585445 -0.00409730 -0.03304575 0.00041452 -0.01587901 -0.00391938 -0.02983808 -0.14276297 0.03593273 -0.05425276 0.06543014 -0.01580369 -0.03310246 -0.01813433 -0.01330116 0.02138102 0.05228459 0.02069326 -0.02252935 -0.01197431 0.04346422 -0.01321351 -0.04852602 0.04417197 0.02779470 0.01449795 0.01670827 -0.07340848 0.03534355 -0.04339332 -0.08006717 0.08272849 -0.00104691 -0.07832366 0.01218867 0.00667058 -0.08910172 -0.02372465 -0.02286686 -0.04001194 0.00500422 0.01011096 

# 测试集
"test_a.csv"

useridfeediddevice
14298672271
68356918642
499251046572
60529237382
131482690381
52981336361
55058228532
781681002221
135784789822
# coding: utf-8 comm.py 
import os
import time
import logging 
LOG_FORMAT = "%(asctime)s - %(levelname)s - %(message)s" 
logging.basicConfig(level=logging.INFO, format=LOG_FORMAT) 
logger = logging.getLogger(__file__)
import numpy as np
import pandas as pd

# 存储数据的根目录
ROOT_PATH = "./data"
# 比赛数据集路径
DATASET_PATH = os.path.join(ROOT_PATH, "wechat_algo_data1")
# 训练集
USER_ACTION = os.path.join(DATASET_PATH, "user_action.csv")
FEED_INFO = os.path.join(DATASET_PATH, "feed_info.csv")
FEED_EMBEDDINGS = os.path.join(DATASET_PATH, "feed_embeddings.csv")
# 测试集
TEST_FILE = os.path.join(DATASET_PATH, "test_a.csv")
END_DAY = 15
SEED = 2021

# 初赛待预测行为列表
ACTION_LIST = ["read_comment", "like", "click_avatar",  "forward"]
# 复赛待预测行为列表
# ACTION_LIST = ["read_comment", "like", "click_avatar",  "forward", "comment", "follow", "favorite"]
# 用于构造特征的字段列表
FEA_COLUMN_LIST = ["read_comment", "like", "click_avatar",  "forward", "comment", "follow", "favorite"]
# 每个行为的负样本下采样比例(下采样后负样本数/原负样本数)
ACTION_SAMPLE_RATE = {"read_comment": 0.2, "like": 0.2, "click_avatar": 0.2, "forward": 0.1, "comment": 0.1, "follow": 0.1, "favorite": 0.1}

# 各个阶段数据集的设置的最后一天
STAGE_END_DAY = {"online_train": 14, "offline_train": 12, "evaluate": 13, "submit": 15}
# 各个行为构造训练数据的天数
ACTION_DAY_NUM = {"read_comment": 5, "like": 5, "click_avatar": 5, "forward": 5, "comment": 5, "follow": 5, "favorite": 5}


def create_dir():
    """
    创建所需要的目录
    """
    # 创建data目录
    if not os.path.exists(ROOT_PATH):
        print('Create dir: %s'%ROOT_PATH)
        os.mkdir(ROOT_PATH)
    # data目录下需要创建的子目录
    need_dirs = ["offline_train", "online_train", "evaluate", "submit",
                 "feature", "model", "model/online_train", "model/offline_train"]
    for need_dir in need_dirs:
        need_dir = os.path.join(ROOT_PATH, need_dir)
        if not os.path.exists(need_dir):
            print('Create dir: %s'%need_dir)
            os.mkdir(need_dir)


def check_file():
    '''
    检查数据文件是否存在
    '''
    paths = [USER_ACTION, FEED_INFO, TEST_FILE]
    flag = True
    not_exist_file = []
    for f in paths:
        if not os.path.exists(f):
            not_exist_file.append(f)
            flag = False
    return flag, not_exist_file


def statis_data():
    """
    统计特征最大,最小,均值
    """
    paths = [USER_ACTION, FEED_INFO, TEST_FILE]
    pd.set_option('display.max_columns', None)
    for path in paths:
        df = pd.read_csv(path)
        print(path + " statis: ")
        print(df.describe())
        print('Distinct count:')
        print(df.nunique())


def statis_feature(start_day=1, before_day=7, agg='sum'):
    """
    统计用户/feed 过去n天各类行为的次数
    :param start_day: Int. 起始日期
    :param before_day: Int. 时间范围(天数)
    :param agg: String. 统计方法
    """
    history_data = pd.read_csv(USER_ACTION)[["userid", "date_", "feedid"] + FEA_COLUMN_LIST]
    feature_dir = os.path.join(ROOT_PATH, "feature")
    for dim in ["userid", "feedid"]:
        print(dim)
        user_data = history_data[[dim, "date_"] + FEA_COLUMN_LIST]
        res_arr = []
        for start in range(start_day, END_DAY-before_day+1):
            temp = user_data[((user_data["date_"]) >= start) & (user_data["date_"] < (start + before_day))]
            temp = temp.drop(columns=['date_'])
            temp = temp.groupby([dim]).agg([agg]).reset_index()
            temp.columns = list(map(''.join, temp.columns.values))
            temp["date_"] = start + before_day
            res_arr.append(temp)
        dim_feature = pd.concat(res_arr)
        feature_path = os.path.join(feature_dir, dim+"_feature.csv")
        print('Save to: %s'%feature_path)
        dim_feature.to_csv(feature_path, index=False)


def generate_sample(stage="offline_train"):
    """
    对负样本进行下采样,生成各个阶段所需样本
    :param stage: String. Including "online_train"/"offline_train"/"evaluate"/"submit"
    :return: List of sample df
    """
    day = STAGE_END_DAY[stage]
    if stage == "submit":
        sample_path = TEST_FILE
    else:
        sample_path = USER_ACTION
    stage_dir = os.path.join(ROOT_PATH, stage)
    df = pd.read_csv(sample_path)
    df_arr = []
    if stage == "evaluate":
        # 线下评估
        col = ["userid", "feedid", "date_", "device"] + ACTION_LIST
        df = df[df["date_"] == day][col]
        file_name = os.path.join(stage_dir, stage + "_" + "all" + "_" + str(day) + "_generate_sample.csv")
        print('Save to: %s'%file_name)
        df.to_csv(file_name, index=False)
        df_arr.append(df)
    elif stage == "submit":
        # 线上提交
        file_name = os.path.join(stage_dir, stage + "_" + "all" + "_" + str(day) + "_generate_sample.csv")
        df["date_"] = 15
        print('Save to: %s'%file_name)
        df.to_csv(file_name, index=False)
        df_arr.append(df)
    else:
        # 线下/线上训练
        # 同行为取按时间最近的样本
        for action in ACTION_LIST:
            df = df.drop_duplicates(subset=['userid', 'feedid', action], keep='last')
        # 负样本下采样
        for action in ACTION_LIST:
            action_df = df[(df["date_"] <= day) & (df["date_"] >= day - ACTION_DAY_NUM[action] + 1)]
            df_neg = action_df[action_df[action] == 0]
            df_pos = action_df[action_df[action] == 1]
            df_neg = df_neg.sample(frac=ACTION_SAMPLE_RATE[action], random_state=SEED, replace=False)
            df_all = pd.concat([df_neg, df_pos])
            col = ["userid", "feedid", "date_", "device"] + [action]
            file_name = os.path.join(stage_dir, stage + "_" + action + "_" + str(day) + "_generate_sample.csv")
            print('Save to: %s'%file_name)
            df_all[col].to_csv(file_name, index=False)
            df_arr.append(df_all[col])
    return df_arr


def concat_sample(sample_arr, stage="offline_train"):
    """
    基于样本数据和特征,生成特征数据
    :param sample_arr: List of sample df
    :param stage: String. Including "online_train"/"offline_train"/"evaluate"/"submit"
    """
    day = STAGE_END_DAY[stage]
    # feed信息表
    feed_info = pd.read_csv(FEED_INFO)
    feed_info = feed_info.set_index('feedid')
    # 基于userid统计的历史行为的次数
    user_date_feature_path = os.path.join(ROOT_PATH, "feature", "userid_feature.csv")
    user_date_feature = pd.read_csv(user_date_feature_path)
    user_date_feature = user_date_feature.set_index(["userid", "date_"])
    # 基于feedid统计的历史行为的次数
    feed_date_feature_path = os.path.join(ROOT_PATH, "feature", "feedid_feature.csv")
    feed_date_feature = pd.read_csv(feed_date_feature_path)
    feed_date_feature = feed_date_feature.set_index(["feedid", "date_"])

    for index, sample in enumerate(sample_arr):
        features = ["userid", "feedid", "device", "authorid", "bgm_song_id", "bgm_singer_id",
                    "videoplayseconds"]
        if stage == "evaluate":
            action = "all"
            features += ACTION_LIST
        elif stage == "submit":
            action = "all"
        else:
            action = ACTION_LIST[index]
            features += [action]
        print("action: ", action)
        sample = sample.join(feed_info, on="feedid", how="left", rsuffix="_feed")
        sample = sample.join(feed_date_feature, on=["feedid", "date_"], how="left", rsuffix="_feed")
        sample = sample.join(user_date_feature, on=["userid", "date_"], how="left", rsuffix="_user")
        feed_feature_col = [b+"sum" for b in FEA_COLUMN_LIST]
        user_feature_col = [b+"sum_user" for b in FEA_COLUMN_LIST]
        sample[feed_feature_col] = sample[feed_feature_col].fillna(0.0)
        sample[user_feature_col] = sample[user_feature_col].fillna(0.0)
        sample[feed_feature_col] = np.log(sample[feed_feature_col] + 1.0)
        sample[user_feature_col] = np.log(sample[user_feature_col] + 1.0)
        features += feed_feature_col
        features += user_feature_col

        sample[["authorid", "bgm_song_id", "bgm_singer_id"]] += 1  # 0 用于填未知
        sample[["authorid", "bgm_song_id", "bgm_singer_id", "videoplayseconds"]] = \
            sample[["authorid", "bgm_song_id", "bgm_singer_id", "videoplayseconds"]].fillna(0)
        sample["videoplayseconds"] = np.log(sample["videoplayseconds"] + 1.0)

        sample[["authorid", "bgm_song_id", "bgm_singer_id"]] = \
            sample[["authorid", "bgm_song_id", "bgm_singer_id"]].astype(int)
        file_name = os.path.join(ROOT_PATH, stage, stage + "_" + action + "_" + str(day) + "_concate_sample.csv")
        print('Save to: %s'%file_name)
        sample[features].to_csv(file_name, index=False)


def main():
    t = time.time()
    statis_data()
    logger.info('Create dir and check file')
    create_dir()
    flag, not_exists_file = check_file()
    if not flag:
        print("请检查目录中是否存在下列文件: ", ",".join(not_exists_file))
        return
    logger.info('Generate statistic feature')
    statis_feature()
    for stage in STAGE_END_DAY:
        logger.info("Stage: %s"%stage)
        logger.info('Generate sample')
        sample_arr = generate_sample(stage)
        logger.info('Concat sample with feature')
        concat_sample(sample_arr, stage)
    print('Time cost: %.2f s'%(time.time()-t))


if __name__ == "__main__":
    main()

程序运行结果

./data\wechat_algo_data1\user_action.csv statis: 
             userid        feedid         date_        device  read_comment  \
count  7.317882e+06  7.317882e+06  7.317882e+06  7.317882e+06  7.317882e+06   
mean   1.249679e+05  5.669863e+04  7.801455e+00  1.765396e+00  3.501587e-02   
std    7.239444e+04  3.278194e+04  4.063833e+00  4.237514e-01  1.838199e-01   
min    8.000000e+00  0.000000e+00  1.000000e+00  1.000000e+00  0.000000e+00   
25%    6.133000e+04  2.814900e+04  4.000000e+00  2.000000e+00  0.000000e+00   
50%    1.256370e+05  5.682500e+04  8.000000e+00  2.000000e+00  0.000000e+00   
75%    1.878630e+05  8.522900e+04  1.100000e+01  2.000000e+00  0.000000e+00   
max    2.502360e+05  1.128710e+05  1.400000e+01  2.000000e+00  1.000000e+00   

            comment          like          play          stay  click_avatar  \
count  7.317882e+06  7.317882e+06  7.317882e+06  7.317882e+06  7.317882e+06   
mean   4.046253e-04  2.580487e-02  2.631760e+04  3.101158e+04  7.533327e-03   
std    2.011123e-02  1.585528e-01  6.477679e+04  1.013239e+05  8.646720e-02   
min    0.000000e+00  0.000000e+00  0.000000e+00  0.000000e+00  0.000000e+00   
25%    0.000000e+00  0.000000e+00  2.017000e+03  5.189000e+03  0.000000e+00   
50%    0.000000e+00  0.000000e+00  1.328900e+04  1.782900e+04  0.000000e+00   
75%    0.000000e+00  0.000000e+00  3.600000e+04  4.133900e+04  0.000000e+00   
max    1.000000e+00  1.000000e+00  3.855337e+07  8.262444e+07  1.000000e+00   

            forward        follow      favorite  
count  7.317882e+06  7.317882e+06  7.317882e+06  
mean   3.821188e-03  7.211103e-04  1.342465e-03  
std    6.169754e-02  2.684381e-02  3.661506e-02  
min    0.000000e+00  0.000000e+00  0.000000e+00  
25%    0.000000e+00  0.000000e+00  0.000000e+00  
50%    0.000000e+00  0.000000e+00  0.000000e+00  
75%    0.000000e+00  0.000000e+00  0.000000e+00  
max    1.000000e+00  1.000000e+00  1.000000e+00  
Distinct count:
userid           20000
feedid           96564
date_               14
device               2
read_comment         2
comment              2
like                 2
play            201721
stay            220343
click_avatar         2
forward              2
follow               2
favorite             2
dtype: int64
./data\wechat_algo_data1\feed_info.csv statis: 
              feedid       authorid  videoplayseconds   bgm_song_id  \
count  106444.000000  106444.000000     106444.000000  53462.000000   
mean    56443.543704    9488.196639         34.446545  12556.594946   
std     32582.304899    5384.927792        277.086122   7319.430812   
min         0.000000       0.000000          2.000000      0.000000   
25%     28243.750000    4844.750000         14.000000   6137.000000   
50%     56419.500000    9584.000000         26.000000  12518.500000   
75%     84659.250000   14122.000000         54.000000  18965.750000   
max    112871.000000   18788.000000      59960.000000  25158.000000   

       bgm_singer_id  
count   53462.000000  
mean     8795.041338  
std      5023.494371  
min         0.000000  
25%      4589.250000  
50%      8614.000000  
75%     13218.000000  
max     17499.000000  
Distinct count:
feedid                  106444
authorid                 18789
videoplayseconds            76
description              99526
ocr                      76148
asr                      70969
bgm_song_id              25159
bgm_singer_id            17500
manual_keyword_list      49460
machine_keyword_list     54800
manual_tag_list           3314
machine_tag_list        105819
description_char         99416
ocr_char                 75760
asr_char                 70969
dtype: int64
./data\wechat_algo_data1\test_a.csv statis: 
2024-10-10 17:39:46,152 - INFO - Create dir and check file
2024-10-10 17:39:46,168 - INFO - Generate statistic feature
              userid         feedid         device
count  421985.000000  421985.000000  421985.000000
mean   124695.072128   57071.963089       1.752171
std     72321.066783   32436.065649       0.431752
min        25.000000       0.000000       1.000000
25%     61122.000000   29674.000000       2.000000
50%    126473.000000   57114.000000       2.000000
75%    188144.000000   84952.000000       2.000000
max    250224.000000  112871.000000       2.000000
Distinct count:
userid     9757
feedid    35157
device        2
dtype: int64
Create dir: ./data\offline_train
Create dir: ./data\online_train
Create dir: ./data\evaluate
Create dir: ./data\submit
Create dir: ./data\feature
Create dir: ./data\model
Create dir: ./data\model/online_train
Create dir: ./data\model/offline_train
userid
Save to: ./data\feature\userid_feature.csv
feedid
Save to: ./data\feature\feedid_feature.csv
2024-10-10 17:40:06,186 - INFO - Stage: online_train
2024-10-10 17:40:06,186 - INFO - Generate sample
Save to: ./data\online_train\online_train_read_comment_14_generate_sample.csv
Save to: ./data\online_train\online_train_like_14_generate_sample.csv
Save to: ./data\online_train\online_train_click_avatar_14_generate_sample.csv
Save to: ./data\online_train\online_train_forward_14_generate_sample.csv
2024-10-10 17:40:25,009 - INFO - Concat sample with feature
action:  read_comment
Save to: ./data\online_train\online_train_read_comment_14_concate_sample.csv
action:  like
Save to: ./data\online_train\online_train_like_14_concate_sample.csv
action:  click_avatar
Save to: ./data\online_train\online_train_click_avatar_14_concate_sample.csv
action:  forward
Save to: ./data\online_train\online_train_forward_14_concate_sample.csv
2024-10-10 17:41:04,241 - INFO - Stage: offline_train
2024-10-10 17:41:04,241 - INFO - Generate sample
Save to: ./data\offline_train\offline_train_read_comment_12_generate_sample.csv
Save to: ./data\offline_train\offline_train_like_12_generate_sample.csv
Save to: ./data\offline_train\offline_train_click_avatar_12_generate_sample.csv
Save to: ./data\offline_train\offline_train_forward_12_generate_sample.csv
2024-10-10 17:41:22,671 - INFO - Concat sample with feature
action:  read_comment
Save to: ./data\offline_train\offline_train_read_comment_12_concate_sample.csv
action:  like
Save to: ./data\offline_train\offline_train_like_12_concate_sample.csv
action:  click_avatar
Save to: ./data\offline_train\offline_train_click_avatar_12_concate_sample.csv
action:  forward
Save to: ./data\offline_train\offline_train_forward_12_concate_sample.csv
2024-10-10 17:41:57,547 - INFO - Stage: evaluate
2024-10-10 17:41:57,547 - INFO - Generate sample
Save to: ./data\evaluate\evaluate_all_13_generate_sample.csv
2024-10-10 17:42:03,660 - INFO - Concat sample with feature
action:  all
Save to: ./data\evaluate\evaluate_all_13_concate_sample.csv
2024-10-10 17:42:16,306 - INFO - Stage: submit
2024-10-10 17:42:16,306 - INFO - Generate sample
Save to: ./data\submit\submit_all_15_generate_sample.csv
2024-10-10 17:42:17,025 - INFO - Concat sample with feature
action:  all
Save to: ./data\submit\submit_all_15_concate_sample.csv
Time cost: 172.95 s

参见:

press-physicsprize2024.pdf (nobelprize.org)

Home Page of Geoffrey Hinton

http://www.cs.utoronto.ca/~hinton/absps/naturebp.pdf

https://dl.acm.org/doi/pdf/10.1145/2988450.2988454

https://www.annualreviews.org/content/journals/10.1146/annurev-conmatphys-031113-133924

Install TensorFlow 2

分析开源机器学习框架TensorFlow_tensorflow 开源模型-CSDN博客

10.卷积神经网络(基础篇)_哔哩哔哩_bilibili

简介:使用TensorFlow实现python简版神经网络模型_tensorflow 网络模型-CSDN博客

深度学习之父Hinton:下一代神经网络 | deep learning resource

【中英/讲座】Geoffrey Hinton:数字智能会取代生物智能吗?_哔哩哔哩_bilibili

赠书 | 诺奖得主辛顿:一场竞拍开启的AI新时代_新浪财经_新浪网

刚刚,诺贝尔物理学奖破天荒颁给「AI教父」!Hinton成首位图灵奖诺贝尔物理学奖双料得主

获诺奖的AI教父辛顿:曾提醒AI或威胁人类生存—新闻—科学网

“人工智能教父” 杰弗里・辛顿:《60 分钟》访谈_哔哩哔哩_bilibili

docs.google.com

Polo Club of Data Science @ Georgia Tech: Human-Centered AI, Deep Learning Interpretation & Visualization, Cybersecurity, Large Graph Visualization and Mining | Georgia Tech | Atlanta, GA 30332, United States

本文来自互联网用户投稿,该文观点仅代表作者本人,不代表本站立场。本站仅提供信息存储空间服务,不拥有所有权,不承担相关法律责任。如若转载,请注明出处:/a/893492.html

如若内容造成侵权/违法违规/事实不符,请联系我们进行投诉反馈qq邮箱809451989@qq.com,一经查实,立即删除!

相关文章

Github 2024-10-18Java开源项目日报Top9

根据Github Trendings的统计,今日(2024-10-18统计)共有9个项目上榜。根据开发语言中项目的数量,汇总情况如下: 开发语言项目数量Java项目8非开发语言项目1Python项目1HTML项目1《Hello 算法》:动画图解、一键运行的数据结构与算法教程 创建周期:476 天协议类型:OtherStar…

Mysql(4)—数据库索引

一、关于索引 1.1 简介 数据库索引是数据库管理系统中用于提高数据检索效率的数据结构。索引类似于书籍中的索引&#xff0c;它允许用户快速找到数据&#xff0c;而不需要扫描整个表。 ​ ‍ MySQL索引的建立对于MySQL的高效运行是很重要的&#xff0c;索引可以大大提高My…

java幂等控制问题

&#x1f3c6;本文收录于《全栈Bug调优(实战版)》专栏&#xff0c;主要记录项目实战过程中所遇到的Bug或因后果及提供真实有效的解决方案&#xff0c;希望能够助你一臂之力&#xff0c;帮你早日登顶实现财富自由&#x1f680;&#xff1b;同时&#xff0c;欢迎大家关注&&am…

游戏逆向基础-跳出游戏线程发包

附加游戏后下断点 bp ws2_32.send send函数断下后&#xff0c;可以看到数据地址每次都一样 可以说明这个游戏是线程发包&#xff0c;所以直接在数据窗口中转到这个地址&#xff0c;然后对这个地址下硬件写入断点。 下了硬件写入断点后可以一层一层往上面跟&#xff0c;确定写…

集合框架07:LinkedList使用

1.视频链接&#xff1a;13.14 LinkedList使用_哔哩哔哩_bilibilihttps://www.bilibili.com/video/BV1zD4y1Q7Fw?spm_id_from333.788.videopod.episodes&vd_sourceb5775c3a4ea16a5306db9c7c1c1486b5&p142.LinkedList集合的增删改查操作 package com.yundait.Demo01;im…

汽车行业焕新潮流涌动,联众优车以优质服务响应市场变化

随着消费者环保意识的改变及新能源汽车市场的快速发展&#xff0c;我国新能源汽车领域正掀起一股新的消费热潮&#xff0c;而旧车的合理处置问题也随之成为社会各界关注的焦点。今年4月末&#xff0c;商务部、财政部等七大部委携手颁布了《老旧汽车置换补贴实施指南》(以下简称…

学会组装、调试、维修无人机后从事飞手工作技术优势分析

学会组装、调试、维修无人机后从事飞手工作&#xff0c;将带来显著的技术优势&#xff0c;这些优势不仅提升了飞手的综合能力&#xff0c;也增强了其在行业中的竞争力。以下是对这些技术优势的详细分析&#xff1a; 一、深入理解无人机结构与功能 1. 结构认知&#xff1a;通过…

RabbitMQ 作为消息中间件,实现了支付消息的异步发送和接收, 同步和异步相比 响应速度具体比较

在支付场景中&#xff0c;使用 RabbitMQ 实现消息的异步发送和接收与同步处理相比&#xff0c;响应速度和整体系统性能会有显著的不同。以下是同步和异步方式在响应速度上的详细比较&#xff1a; 1. 同步处理方式 在同步模式下&#xff0c;支付消息的处理流程通常是&#xf…

exchange邮件系统ADFS双因素认证技术方案

exchange作为微软公司推出的邮件系统&#xff0c;在企业界有着广泛的应用&#xff0c;通常情况下&#xff0c;exchange为邮箱用户提供的认证方式是基于AD的静态密码认证&#xff0c;虽然微软在AD认证上已经做了大量的安全性优化&#xff0c;但是由于是静态密码方式认证&#xf…

医院信息化与智能化系统(1)

医院信息化与智能化系统(1) 这里只描述对应过程&#xff0c;和可能遇到的问题及解决办法以及对应的参考链接&#xff0c;并不会直接每一步详细配置 1、 MySQL准备 创建并初始化user数据库&#xff0c;后续为验证mybatis-plus(后续简称mp) 2、确认idea配置 在新版IDEA中需要…

5.计算机网络_抓包工具wireshark

安装 Linux中安装wireshark&#xff1a; sudo apt-get install wireshark Linux中执行wireshark&#xff1a; sudo wireshark 使用 注意&#xff1a;只有与外网交互的数据才可以被wireshark抓到&#xff0c;本机回环的数据不会被抓到 实验内容&#xff1a; 使用nc命令…

爬虫(反调试)

其实就是一种给页面反爬机制&#xff0c;一般页面用不到。 万能解决反调试方法&#xff1a;

数据结构 -- 排序算法

一 排序 1.1 排序的概念 所谓排序&#xff0c;就是一种使一串数据记录&#xff0c;按照其中的某个或某些关键字的大小&#xff0c;递增或递减地组织起来的操作。 从排序方式上&#xff0c;排序算法一般被分为比较排序和非比较排序。从比较排序的内容上&#xff0c;它一般被分为…

页面局部使用vue等框架其它部分用JQuery进行交互

这个需求是原有django在网页需要定制一个人员签到信息。状态有三种&#xff0c;在岗&#xff0c;下班。好吧两种。但是你想 1&#xff0c;1.这是两次、共四个可能&#xff0c;00&#xff0c; 10&#xff0c;01&#xff0c;11.其中00是在家。10是在岗。01是。不签到只签退&#…

程序员转行方向推荐

对于程序员转行方向的推荐&#xff0c;可以基于当前的技术趋势、市场需求以及程序员的个人技能和兴趣来综合考虑。以下是一些推荐的转行方向&#xff1a; 伴随着社会的发展&#xff0c;网络安全被列为国家安全战略的一部分&#xff0c;因此越来越多的行业开始迫切需要网安人员…

Web Storage:数据储存机制

前言 在HTML5之前&#xff0c;开发人员一般是通过使用Cookie在客户端保存一些简单的信息的。在HTML5发布后&#xff0c;提供了一种新的客户端本地保存数据的方法&#xff0c;那就是Web Storage&#xff0c;它也被分为&#xff1a;LocalStorage和SessionStorage&#xff0c;它允…

配合工具,快速学习与体验electron增量更新

有任何问题&#xff0c;都可以私信博主&#xff0c;共同探讨学习。 正文开始 前言一、如何使用源码1.1 下载代码1.2 下载资源1.3 运行项目 二、如何使用工具2.1 打包新版本更新包2.2 创建nginx文件服务器2.3 在文件服务器保存软件更新包 三、如何测试更新3.1本地运行低版本3.2 …

centos 8.4学习小结

1.权限委派 2.vim快捷方式 2.1非正常关闭文本处理方式 2.2快捷方式 2.3TAB键补齐安装包 [ rootcloud Packages]# rpm -ivh bash-completion-2.7-5.el8.noarch.rpm 2.4#history 查询历史记录 [rootcloud ~]# vim /etc/profile HISTSIZE1000&#xff08;默认保存1000条历史记…

C++求日期差值题目

C日期差值题目&#xff08;牛客网&#xff09;题目超链接 仅个人思路不是最优解 仔细阅读地题目&#xff0c;要求输入连续的两串数字表示两个日期 所以我感觉日期类不太方便&#xff08;也许是我实力不允许&#xff09; cin使用起来就不太方便&#xff0c;我这里选择使用sca…

Spark的安装配置及集群搭建

Spark的本地安装配置&#xff1a; 我们用scala语言编写和操作spark&#xff0c;所以先要完成scala的环境配置 1、先完成Scala的环境搭建 下载Scala插件&#xff0c;创建一个Maven项目&#xff0c;导入Scala依赖和插件 scala依赖 <dependency><groupId>org.scal…