GAMES101-Assignment3


Reference articles:
1. "GAMES101 assignment framework: common problems explained" (《GAMES101》作业框架问题详解)
2. "GAMES101: Assignment 3 (pipeline analysis, depth interpolation, libpng warning, bilinear interpolation, etc.)"
3. "GAMES101 Assignment 3 (bonus part): normal-mapping principles and rendering-pipeline framework analysis"

File list:
CMakeLists.txt (project build configuration; cmake uses it to configure, build, and test)
OBJ_Loader.h (loads 3D models)
global.hpp (defines the global constant PI)
rasterizer.hpp (rasterizer header)
Texture.hpp (declares the texture width/height and getColor, which fetches a texel color from texture coordinates)
Triangle.hpp (triangle header; defines the triangle's attributes)
Shader.hpp (declares color, normal, texture, and texture coordinates; supports texture mapping; defines fragment_shader_payload, which bundles the parameters a fragment shader may need)
rasterizer.cpp (creates the render window and does the drawing)
Texture.cpp (currently an empty file)
Triangle.cpp (draws the triangle)
main.cpp


1. Assignment requirements

2. Implementation and results

2.1 rasterize_triangle in rasterizer.cpp


barycentric

Initialize the colors, normals, texture coordinates, and shading-point positions of the triangle's three vertices; interpolate each attribute inside the triangle with the barycentric formula, and pass the interpolated attributes to fragment_shader_payload.

//Screen space rasterization
void rst::rasterizer::rasterize_triangle(const Triangle& t, const std::array<Eigen::Vector3f, 3>& view_pos) 
{
    // TODO: From your HW3, get the triangle rasterization code.
    // TODO: Inside your rasterization loop:
    //    * v[i].w() is the vertex view space depth value z.
    //    * Z is interpolated view space depth for the current pixel
    //    * zp is depth between zNear and zFar, used for z-buffer

    // float Z = 1.0 / (alpha / v[0].w() + beta / v[1].w() + gamma / v[2].w());
    // float zp = alpha * v[0].z() / v[0].w() + beta * v[1].z() / v[1].w() + gamma * v[2].z() / v[2].w();
    // zp *= Z;

    // TODO: Interpolate the attributes:
    // auto interpolated_color
    // auto interpolated_normal
    // auto interpolated_texcoords
    // auto interpolated_shadingcoords

    // Use: fragment_shader_payload payload( interpolated_color, interpolated_normal.normalized(), interpolated_texcoords, texture ? &*texture : nullptr);
    // Use: payload.view_pos = interpolated_shadingcoords;
    // Use: Instead of passing the triangle's color directly to the frame buffer, pass the color to the shaders first to get the final color;
    // Use: auto pixel_color = fragment_shader(payload);
    auto v = t.toVector4();
    // Bounding box of the triangle
    // INT_MAX: "infinity" represented by the largest int
    int BoxMin_X =INT_MAX,BoxMin_Y=INT_MAX;
    // INT_MIN: "-infinity" represented by the smallest int
    int BoxMax_X =INT_MIN,BoxMax_Y=INT_MIN;
    //iterate to find bound
    for (int i = 0; i < 3; i++)
    {
        BoxMin_X=std::min(BoxMin_X,(int)v[i][0]);
        BoxMin_Y=std::min(BoxMin_Y,(int)v[i][1]);
        BoxMax_X=std::max(BoxMax_X,(int)v[i][0]);
        BoxMax_Y=std::max(BoxMax_Y,(int)v[i][1]);
    }
    //iterate pixel inside of bounding box
    for (int i = BoxMin_X; i <= BoxMax_X; i++)
    {
        for (int j = BoxMin_Y; j <= BoxMax_Y; j++)
        {
            //float x=i+0.5,y=j+0.5;
            //check whether the center of the current pixel is inside the triangle
            if (insideTriangle(i,j,t.v))//t.v holds the three vertices v[0],v[1],v[2]
            {
                //interpolated depth value
                auto[alpha, beta, gamma] = computeBarycentric2D(i+0.5, j+0.5, t.v);
                float w_reciprocal = 1.0/(alpha / v[0].w() + beta / v[1].w() + gamma / v[2].w());
                float z_interpolated = alpha * v[0].z() / v[0].w() + beta * v[1].z() / v[1].w() + gamma * v[2].z() / v[2].w();
                z_interpolated *= w_reciprocal;
                //compare the interpolated depth against the depth buffer
                //(negated because the framework stores positive depths)
                if (-z_interpolated < depth_buf[get_index(i,j)])//get buffer index of pixel(x,y)
                {   
                    //color interpolate
                    auto interpolated_color = interpolate(alpha,beta,gamma,t.color[0],t.color[1],t.color[2],1);
                    //normal vector interpolate
                    auto interpolated_normal = interpolate(alpha,beta,gamma,t.normal[0],t.normal[1],t.normal[2],1).normalized();
                    //texture coordinates interpolate
                    auto interpolated_texcoords = interpolate(alpha,beta,gamma,t.tex_coords[0],t.tex_coords[1],t.tex_coords[2],1);
                    //shading point coordinates interpolate
                    auto interpolated_shadingcoords = interpolate(alpha,beta,gamma,view_pos[0],view_pos[1],view_pos[2],1);
                    //interpolated attributes send to fragment_shader_payload
                    fragment_shader_payload payload(interpolated_color,interpolated_normal.normalized(),interpolated_texcoords,texture ? &*texture :nullptr);
                    //send the original view-space coordinates to view_pos
                    payload.view_pos = interpolated_shadingcoords;
                    //update depth buffer
                    depth_buf[get_index(i,j)] = -z_interpolated;
                    //update frame buffer
                    frame_buf[get_index(i,j)] = fragment_shader(payload);
                    //write the shaded color to the current pixel
                    set_pixel({i,j},frame_buf[get_index(i,j)]);
                    //set_pixel({i,j,1},t.getColor());//previous assignment: flat vertex color; (x,y,1) is the homogeneous coord of a point
                }
            }
        }
    }
}

2.2 get_projection_matrix in main.cpp

The projection part of the MVP transformation.

Eigen::Matrix4f get_projection_matrix(float eye_fov, float aspect_ratio, float zNear, float zFar)
{
    // TODO: Use the same projection matrix from the previous assignments
    Eigen::Matrix4f projection = Eigen::Matrix4f::Identity();

    // TODO: Implement this function
    // Create the projection matrix for the given parameters.
    // Then return it.

    // n=zNear f=zFar
    // derive r, l, t, b from eye_fov and aspect_ratio:
    // tan(eye_fov/2)=t/|n|, aspect_ratio=r/t
    // t=|n|tan(eye_fov/2), r=t*aspect_ratio, l=-r, b=-t
    eye_fov=(eye_fov/180.0)*MY_PI;//degrees to radians (must happen before calling tan)
    float t=std::fabs(zNear)*tan(eye_fov/2),b=-t;
    float r=aspect_ratio*t,l=-r;
    //perspective projection M_{projection}= M_{orthographics}*M_{persp to ortho}
    Eigen::Matrix4f M_persp_to_ortho,M_ortho,M_scale,M_translate;
    M_persp_to_ortho << zNear,0,0,0,
                        0,zNear,0,0,
                        0,0,zNear+zFar,-zNear*zFar,
                        0,0,1,0;
    //M_{orthographics}=scale*translate
    M_scale << 2/(r-l),0,0,0,
                0,2/(t-b),0,0,
                0,0,2/(zNear-zFar),0,
                0,0,0,1;
    M_translate << 1,0,0,-(r+l)/2,
                    0,1,0,-(t+b)/2,
                    0,0,1,-(zNear+zFar)/2,
                    0,0,0,1;
    M_ortho = M_scale*M_translate;
    projection = M_ortho*M_persp_to_ortho*projection;
    return projection;
    
}

Activate the corresponding shader in main.cpp.

Compile and run:

./Rasterizer output.png normal

2.3 phong_fragment_shader in main.cpp

Blinn-Phong reflectance model: implement the reflectance model used by phong_fragment_shader correctly.

Eigen::Vector3f phong_fragment_shader(const fragment_shader_payload& payload)
{
    Eigen::Vector3f ka = Eigen::Vector3f(0.005, 0.005, 0.005);
    Eigen::Vector3f kd = payload.color;
    Eigen::Vector3f ks = Eigen::Vector3f(0.7937, 0.7937, 0.7937);

    auto l1 = light{{20, 20, 20}, {500, 500, 500}};
    auto l2 = light{{-20, 20, 0}, {500, 500, 500}};

    std::vector<light> lights = {l1, l2};
    Eigen::Vector3f amb_light_intensity{10, 10, 10};
    Eigen::Vector3f eye_pos{0, 0, 10};

    float p = 150;//p value control range of specular

    Eigen::Vector3f color = payload.color;
    Eigen::Vector3f point = payload.view_pos;
    Eigen::Vector3f normal = payload.normal;

    Eigen::Vector3f result_color = {0, 0, 0};
    Eigen::Vector3f l,v,dif,h,spe,amb;
    for (auto& light : lights)
    {
        // TODO: For each light source in the code, calculate what the *ambient*, *diffuse*, and *specular* 
        // components are. Then, accumulate that result on the *result_color* object.

        //light vector l
        l = (light.position - point).normalized();
        //view vector v
        v = (eye_pos - point).normalized();
        //squared distance r^2 from light to shading point
        //r2 = (light.position - point).dot(light.position - point);
        //diffuse L_d = kd(I/r^2)max(0, n·l)
        dif = kd.cwiseProduct(light.intensity / (light.position - point).dot(light.position - point))*std::fmax(0, normal.dot(l));//normal is the n vector
        //half vector h = (v+l)/||v+l||
        h = (v+l).normalized();
        //specular L_s = ks(I/r^2)max(0, n·h)^p
        spe = ks.cwiseProduct(light.intensity / ((light.position - point).dot(light.position - point)))*pow(std::fmax(0,normal.dot(h)),p);
        //ambient La=kaIa
        amb = ka.cwiseProduct(amb_light_intensity);
        result_color += (dif + spe + amb);
    }

    return result_color * 255.f;
}

Activate the corresponding shader in main.cpp.

Compile and run:

./Rasterizer output.png phong

2.4 texture_fragment_shader in main.cpp

Texture mapping: copy the phong_fragment_shader code into texture_fragment_shader and implement texture mapping on top of it.
In phong_fragment_shader, color = payload.color;
in texture_fragment_shader, color = texture_color.

Eigen::Vector3f texture_fragment_shader(const fragment_shader_payload& payload)
{
    Eigen::Vector3f return_color = {0, 0, 0};
    if (payload.texture)
    {
        // TODO: Get the texture value at the texture coordinates of the current fragment
        // (clamping tex_coords to [0,1] here avoids the out-of-range access some models trigger)
        return_color = payload.texture->getColor(payload.tex_coords.x(),payload.tex_coords.y());
    }
    Eigen::Vector3f texture_color;
    texture_color << return_color.x(), return_color.y(), return_color.z();
    //ambient term coefficient ka
    Eigen::Vector3f ka = Eigen::Vector3f(0.005, 0.005, 0.005);
    //diffuse reflection coefficient kd
    Eigen::Vector3f kd = texture_color / 255.f;
    //specular term coefficient ks
    Eigen::Vector3f ks = Eigen::Vector3f(0.7937, 0.7937, 0.7937);
    //light's position and intensity
    auto l1 = light{{20, 20, 20}, {500, 500, 500}};
    auto l2 = light{{-20, 20, 0}, {500, 500, 500}};

    std::vector<light> lights = {l1, l2};
    Eigen::Vector3f amb_light_intensity{10, 10, 10};
    Eigen::Vector3f eye_pos{0, 0, 10};

    float p = 150;
    //color of texture
    Eigen::Vector3f color = texture_color;
    //coord of shading point
    Eigen::Vector3f point = payload.view_pos;
    //normal vector of shading point
    Eigen::Vector3f normal = payload.normal;

    Eigen::Vector3f result_color = {0, 0, 0};
    Eigen::Vector3f l,v,dif,h,spe,amb;
    for (auto& light : lights)
    {
        // TODO: For each light source in the code, calculate what the *ambient*, *diffuse*, and *specular* 
        // components are. Then, accumulate that result on the *result_color* object.

        //light vector l
        l = (light.position - point).normalized();
        //view vector v
        v = (eye_pos - point).normalized();
        //squared distance r^2 from light to shading point
        //r2 = (light.position - point).dot(light.position - point)
        //diffuse L_d = kd(I/r^2)max(0, n·l)
        dif = kd.cwiseProduct(light.intensity / ((light.position - point).dot(light.position - point)))*std::fmax(0, normal.dot(l));//normal is the n vector
        //half vector h = (v+l)/||v+l||
        h = (v+l).normalized();
        //specular L_s = ks(I/r^2)max(0, n·h)^p
        spe = ks.cwiseProduct(light.intensity / ((light.position - point).dot(light.position - point)))*pow(std::fmax(0,normal.dot(h)),p);
        //ambient La=kaIa
        amb = ka.cwiseProduct(amb_light_intensity);
        result_color += (dif + spe + amb);
    }

    return result_color * 255.f;
}

Activate the corresponding shader in main.cpp.

Compile and run:

./Rasterizer output.png texture

2.5 bump mapping in main.cpp


Eigen::Vector3f bump_fragment_shader(const fragment_shader_payload& payload)
{
    
    Eigen::Vector3f ka = Eigen::Vector3f(0.005, 0.005, 0.005);
    Eigen::Vector3f kd = payload.color;
    Eigen::Vector3f ks = Eigen::Vector3f(0.7937, 0.7937, 0.7937);

    auto l1 = light{{20, 20, 20}, {500, 500, 500}};
    auto l2 = light{{-20, 20, 0}, {500, 500, 500}};

    std::vector<light> lights = {l1, l2};
    Eigen::Vector3f amb_light_intensity{10, 10, 10};
    Eigen::Vector3f eye_pos{0, 0, 10};

    float p = 150;

    Eigen::Vector3f color = payload.color; 
    Eigen::Vector3f point = payload.view_pos;
    Eigen::Vector3f normal = payload.normal;


    float kh = 0.2, kn = 0.1;

    // TODO: Implement bump mapping here
    // Let n = normal = (x, y, z)
    // Vector t = (x*y/sqrt(x*x+z*z),sqrt(x*x+z*z),z*y/sqrt(x*x+z*z))
    // Vector b = n cross product t
    // Matrix TBN = [t b n]
    // dU = kh * kn * (h(u+1/w,v)-h(u,v))
    // dV = kh * kn * (h(u,v+1/h)-h(u,v))
    // Vector ln = (-dU, -dV, 1)
    // Normal n = normalize(TBN * ln)
    //------------------------
    //perturb the normal in 3D
    //original surface normal n(p)=(0,0,1)
    //derivative at p are 
    //dp/du=c1*[h(u+1)-h(u)]
    //dp/dv=c2*[h(v+1)-h(v)]
    //perturbed normal is then n(p)=(-dp/du,-dp/dv,1).normalized()
    float x=normal.x(),y=normal.y(),z=normal.z();
    Eigen::Vector3f t{x*y / std::sqrt(x*x+z*z), std::sqrt(x*x+z*z), z*y / std::sqrt(x*x+z*z)};
    Eigen::Vector3f b = normal.cross(t);
    Eigen::Matrix3f TBN;
    TBN << t.x(),b.x(),normal.x(),
           t.y(),b.y(),normal.y(),
           t.z(),b.z(),normal.z();
    float u = payload.tex_coords.x(),v = payload.tex_coords.y();
    float w = payload.texture->width,h = payload.texture->height;
    float dU = kh*kn*(payload.texture->getColor(u+1.0f/w,v).norm() - payload.texture->getColor(u,v).norm());
    float dV = kh*kn*(payload.texture->getColor(u,v+1.0f/h).norm() - payload.texture->getColor(u,v).norm());
    Eigen::Vector3f perturbed_n{-dU,-dV,1.0f};
    normal = TBN * perturbed_n;
    //-------------------------
    Eigen::Vector3f result_color = {0, 0, 0};
    result_color = normal.normalized();
    return result_color * 255.f;
}


Compile and run:

./Rasterizer output.png bump

2.6 displacement mapping in main.cpp

Eigen::Vector3f displacement_fragment_shader(const fragment_shader_payload& payload)
{
    
    Eigen::Vector3f ka = Eigen::Vector3f(0.005, 0.005, 0.005);
    Eigen::Vector3f kd = payload.color;
    Eigen::Vector3f ks = Eigen::Vector3f(0.7937, 0.7937, 0.7937);

    auto l1 = light{{20, 20, 20}, {500, 500, 500}};
    auto l2 = light{{-20, 20, 0}, {500, 500, 500}};

    std::vector<light> lights = {l1, l2};
    Eigen::Vector3f amb_light_intensity{10, 10, 10};
    Eigen::Vector3f eye_pos{0, 0, 10};

    float p = 150;

    Eigen::Vector3f color = payload.color; 
    Eigen::Vector3f point = payload.view_pos;
    Eigen::Vector3f normal = payload.normal;

    float kh = 0.2, kn = 0.1;
    
    // TODO: Implement displacement mapping here
    // Let n = normal = (x, y, z)
    // Vector t = (x*y/sqrt(x*x+z*z),sqrt(x*x+z*z),z*y/sqrt(x*x+z*z))
    // Vector b = n cross product t
    // Matrix TBN = [t b n]
    // dU = kh * kn * (h(u+1/w,v)-h(u,v))
    // dV = kh * kn * (h(u,v+1/h)-h(u,v))
    // Vector ln = (-dU, -dV, 1)
    // Position p = p + kn * n * h(u,v)
    // Normal n = normalize(TBN * ln)
    //------------------------
    //perturb the normal in 3D
    //original surface normal n(p)=(0,0,1)
    //derivative at p are 
    //dp/du=c1*[h(u+1)-h(u)]
    //dp/dv=c2*[h(v+1)-h(v)]
    //perturbed normal is then n(p)=(-dp/du,-dp/dv,1).normalized()
    float x=normal.x(),y=normal.y(),z=normal.z();
    Eigen::Vector3f t{x*y / std::sqrt(x*x+z*z), std::sqrt(x*x+z*z), z*y / std::sqrt(x*x+z*z)};
    Eigen::Vector3f b = normal.cross(t);
    Eigen::Matrix3f TBN;
    TBN << t.x(),b.x(),normal.x(),
           t.y(),b.y(),normal.y(),
           t.z(),b.z(),normal.z();
    float u = payload.tex_coords.x(),v = payload.tex_coords.y();
    float w = payload.texture->width,h = payload.texture->height;
    float dU = kh*kn*(payload.texture->getColor(u+1.0f/w,v).norm() - payload.texture->getColor(u,v).norm());
    float dV = kh*kn*(payload.texture->getColor(u,v+1.0f/h).norm() - payload.texture->getColor(u,v).norm());
    Eigen::Vector3f perturbed_n{-dU,-dV,1.0f};
    point += (kn * normal * payload.texture->getColor(u,v).norm());//displace along the original normal, per the TODO order
    normal = (TBN * perturbed_n).normalized();
    //-------------------------
    Eigen::Vector3f result_color = {0, 0, 0};
    Eigen::Vector3f light_vec,view_vec,dif,half_vec,spe,amb;
    for (auto& light : lights)
    {
        // TODO: For each light source in the code, calculate what the *ambient*, *diffuse*, and *specular* 
        // components are. Then, accumulate that result on the *result_color* object.
        //light vector l
        light_vec = (light.position - point).normalized();
        //view vector v
        view_vec = (eye_pos - point).normalized();
        //squared distance r^2 from light to shading point
        //r2 = (light.position - point).dot(light.position - point);
        //diffuse L_d = kd(I/r^2)max(0, n·l)
        dif = kd.cwiseProduct(light.intensity / (light.position - point).dot(light.position - point))*std::fmax(0, normal.dot(light_vec));//normal is the n vector
        //half vector h = (v+l)/||v+l||
        half_vec = (view_vec+light_vec).normalized();
        //specular L_s = ks(I/r^2)max(0, n·h)^p
        spe = ks.cwiseProduct(light.intensity / ((light.position - point).dot(light.position - point)))*pow(std::fmax(0,normal.dot(half_vec)),p);
        //ambient La=kaIa
        amb = ka.cwiseProduct(amb_light_intensity);
        result_color += (dif + spe + amb);
    }

    return result_color * 255.f;
}


Compile and run:

./Rasterizer output.png displacement

