Sharing: 3D Object Modeling with WebGL

Interface Preview

Code Structure

The model data is similar to CT (Computed Tomography) imaging: precisely collimated X-ray beams, gamma rays, ultrasound and the like, paired with highly sensitive detectors, sweep around a region of an object and capture one cross-sectional scan after another. The result is a stack of 2D slices that, taken together, describe a 3D volume.
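The ray-casting shader later in this article assumes exactly this kind of data: 256 cross-sectional Z slices packed into a single 16x16 tile atlas (one PNG per model). As a rough illustration of that layout — not part of the original demo — the following JavaScript mirrors the lookup the fragment shader performs to find a slice inside the atlas:

// Illustrative sketch only: map a normalized volume coordinate (u, v, w in [0,1])
// to the 2D UV inside a 16x16 slice atlas — the same arithmetic used by
// sampleAs3DTexture in the second-pass fragment shader below.
function atlasUV(u, v, w) {
    var slice = Math.floor(w * 255.0);         // which of the 256 Z slices
    var col = slice % 16;                      // column of the tile in the atlas
    var row = Math.floor((255 - slice) / 16);  // row of the tile (slices are stored top-down)
    return {
        x: u / 16 + col / 16,
        y: v / 16 + row / 16
    };
}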

Coordinate Systems

Rendering Pipeline

The rendering pipeline is the process of getting the previously prepared model onto the screen. A 3D rendering pipeline takes raw data describing 3D objects as vertices, processes it, computes the resulting fragments, and finally renders them as pixels on the screen.
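As a minimal sketch of that flow (illustrative names only, not code from this demo): once the vertex buffers and a compiled shader program are bound, a single draw call pushes everything through the pipeline — vertex shader, rasterization into fragments, fragment shader, pixels.

// Minimal per-frame sketch: one draw call runs the whole pipeline.
function drawFrame(gl, program, vertexCount) {
    gl.useProgram(program);                       // the compiled vertex + fragment shader pair
    gl.clear(gl.COLOR_BUFFER_BIT);                // start from an empty framebuffer
    gl.drawArrays(gl.TRIANGLES, 0, vertexCount);  // vertices -> fragments -> pixels
}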

Shaders

Shaders are written in GLSL, the OpenGL Shading Language: a special language with a C-like syntax that executes directly inside the graphics pipeline. There are two kinds of shaders — vertex shaders and fragment shaders. The former transforms shapes into actual 3D drawing coordinates; the latter computes the final rendered color and other attributes.

Unlike JavaScript, GLSL is strongly typed and has a great deal of math for vectors and matrices built into the language. Shaders can become very complicated very quickly, but writing a simple one is not hard.
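For instance, this small, purely illustrative GLSL fragment (written here as a JavaScript template string, the same style used later in this article) shows the typed vectors and built-in functions involved — normalize, dot and mix are exactly what the ray-casting shader below relies on:

// Illustrative only: GLSL's typed vectors and built-in vector math.
const GLSL_MATH_EXAMPLE = `
    precision mediump float;
    void main() {
        vec3 a = vec3(1.0, 0.0, 0.0);
        vec3 b = normalize(vec3(1.0, 1.0, 0.0)); // built-in vector math
        float d = dot(a, b);                     // scalar result
        vec3 c = mix(a, b, 0.5);                 // linear interpolation, as used by the ray caster
        gl_FragColor = vec4(c, d);
    }
`;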

A shader is essentially a function that draws something to the screen. Shaders run on the GPU, which is heavily optimized for these operations, so much of the work is offloaded from the CPU, leaving it free to run your own code.

A vertex shader operates on coordinates in 3D space and is called once for every vertex. Its job is to set the gl_Position variable — a special built-in global variable that stores the position of the current vertex.
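A minimal vertex shader looks like the following sketch. It is written in the style of the shaders used later in this article: position, projectionMatrix and modelViewMatrix are the attribute/uniforms that Three.js injects automatically into a ShaderMaterial, so a raw-WebGL version would have to declare and supply them itself.

// Minimal vertex shader sketch: called once per vertex, writes gl_Position.
const MINIMAL_VERTEX_SHADER = `
    void main() {
        gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
    }
`;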

A fragment (or pixel) shader defines the RGBA color of each pixel being processed — it is called once per fragment. Its job is to set the gl_FragColor variable, another built-in GLSL variable.
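A minimal fragment shader — again only a sketch — simply writes a constant color for every fragment it is called on:

// Minimal fragment shader sketch: called once per fragment, writes gl_FragColor.
const MINIMAL_FRAGMENT_SHADER = `
    precision mediump float;
    void main() {
        gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0); // opaque red
    }
`;

The actual shaders used by the two render passes of this demo follow.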

        <script id="fragmentShaderFirstPass" type="x-shader/x-fragment">varying vec3 worldSpaceCoords;void main(){//The fragment's world space coordinates as fragment output.gl_FragColor = vec4( worldSpaceCoords.x , worldSpaceCoords.y, worldSpaceCoords.z, 1 );}</script><script id="vertexShaderFirstPass" type="x-shader/x-vertex">varying vec3 worldSpaceCoords;void main(){//Set the world space coordinates of the back faces vertices as output.worldSpaceCoords = position + vec3(0.5, 0.5, 0.5); //move it from [-0.5;0.5] to [0,1]gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );}</script><script id="fragmentShaderSecondPass" type="x-shader/x-fragment">varying vec3 worldSpaceCoords;varying vec4 projectedCoords;uniform sampler2D tex, cubeTex, transferTex;uniform float steps;uniform float alphaCorrection;// The maximum distance through our rendering volume is sqrt(3).// The maximum number of steps we take to travel a distance of 1 is 512.// ceil( sqrt(3) * 512 ) = 887// This prevents the back of the image from getting cut off when steps=512 & viewing diagonally.const int MAX_STEPS = 887;//Acts like a texture3D using Z slices and trilinear filtering.vec4 sampleAs3DTexture( vec3 texCoord ){vec4 colorSlice1, colorSlice2;vec2 texCoordSlice1, texCoordSlice2;//The z coordinate determines which Z slice we have to look for.//Z slice number goes from 0 to 255.float zSliceNumber1 = floor(texCoord.z  * 255.0);//As we use trilinear we go the next Z slice.float zSliceNumber2 = min( zSliceNumber1 + 1.0, 255.0); //Clamp to 255//The Z slices are stored in a matrix of 16x16 of Z slices.//The original UV coordinates have to be rescaled by the tile numbers in each row and column.texCoord.xy /= 16.0;texCoordSlice1 = texCoordSlice2 = texCoord.xy;//Add an offset to the original UV coordinates depending on the row and column number.texCoordSlice1.x += (mod(zSliceNumber1, 16.0 ) / 16.0);texCoordSlice1.y += floor((255.0 - zSliceNumber1) / 16.0) / 16.0;texCoordSlice2.x += (mod(zSliceNumber2, 16.0 ) / 16.0);texCoordSlice2.y += floor((255.0 - zSliceNumber2) / 16.0) / 16.0;//Get the opacity value from the 2D texture.//Bilinear filtering is done at each texture2D by default.colorSlice1 = texture2D( cubeTex, texCoordSlice1 );colorSlice2 = texture2D( cubeTex, texCoordSlice2 );//Based on the opacity obtained earlier, get the RGB color in the transfer function texture.colorSlice1.rgb = texture2D( transferTex, vec2( colorSlice1.a, 1.0) ).rgb;colorSlice2.rgb = texture2D( transferTex, vec2( colorSlice2.a, 1.0) ).rgb;//How distant is zSlice1 to ZSlice2. 
Used to interpolate between one Z slice and the other.float zDifference = mod(texCoord.z * 255.0, 1.0);//Finally interpolate between the two intermediate colors of each Z slice.return mix(colorSlice1, colorSlice2, zDifference) ;}void main( void ) {//Transform the coordinates it from [-1;1] to [0;1]vec2 texc = vec2(((projectedCoords.x / projectedCoords.w) + 1.0 ) / 2.0,((projectedCoords.y / projectedCoords.w) + 1.0 ) / 2.0 );//The back position is the world space position stored in the texture.vec3 backPos = texture2D(tex, texc).xyz;//The front position is the world space position of the second render pass.vec3 frontPos = worldSpaceCoords;//The direction from the front position to back position.vec3 dir = backPos - frontPos;float rayLength = length(dir);//Calculate how long to increment in each step.float delta = 1.0 / steps;//The increment in each direction for each step.vec3 deltaDirection = normalize(dir) * delta;float deltaDirectionLength = length(deltaDirection);//Start the ray casting from the front position.vec3 currentPosition = frontPos;//The color accumulator.vec4 accumulatedColor = vec4(0.0);//The alpha value accumulated so far.float accumulatedAlpha = 0.0;//How long has the ray travelled so far.float accumulatedLength = 0.0;//If we have twice as many samples, we only need ~1/2 the alpha per sample.//Scaling by 256/10 just happens to give a good value for the alphaCorrection slider.float alphaScaleFactor = 25.6 * delta;vec4 colorSample;float alphaSample;//Perform the ray marching iterationsfor(int i = 0; i < MAX_STEPS; i++){//Get the voxel intensity value from the 3D texture.colorSample = sampleAs3DTexture( currentPosition );//Allow the alpha correction customization.alphaSample = colorSample.a * alphaCorrection;//Applying this effect to both the color and alpha accumulation results in more realistic transparency.alphaSample *= (1.0 - accumulatedAlpha);//Scaling alpha by the number of steps makes the final color invariant to the step size.alphaSample *= alphaScaleFactor;//Perform the composition.accumulatedColor += colorSample * alphaSample;//Store the alpha accumulated so far.accumulatedAlpha += alphaSample;//Advance the ray.currentPosition += deltaDirection;accumulatedLength += deltaDirectionLength;//If the length traversed is more than the ray length, or if the alpha accumulated reaches 1.0 then exit.if(accumulatedLength >= rayLength || accumulatedAlpha >= 1.0 )break;}gl_FragColor  = accumulatedColor;}</script><script id="vertexShaderSecondPass" type="x-shader/x-vertex">varying vec3 worldSpaceCoords;varying vec4 projectedCoords;void main(){worldSpaceCoords = (modelMatrix * vec4(position + vec3(0.5, 0.5,0.5), 1.0 )).xyz;gl_Position = projectionMatrix *  modelViewMatrix * vec4( position, 1.0 );projectedCoords =  projectionMatrix * modelViewMatrix * vec4( position, 1.0 );}</script>

Building the 3D Scene

<script>
if ( ! Detector.webgl ) Detector.addGetWebGLMessage();

var container, stats;
var camera, sceneFirstPass, sceneSecondPass, renderer;
var clock = new THREE.Clock();
var rtTexture, transferTexture;
var cubeTextures = ['bonsai', 'foot', 'teapot'];
var histogram = [];
var guiControls;
var materialSecondPass;

init();
//animate();

function init() {
    //Parameters that can be modified.
    guiControls = new function() {
        this.model = 'bonsai';
        this.steps = 256.0;
        this.alphaCorrection = 1.0;
        this.color1 = "#00FA58";
        this.stepPos1 = 0.1;
        this.color2 = "#CC6600";
        this.stepPos2 = 0.7;
        this.color3 = "#F2F200";
        this.stepPos3 = 1.0;
    };

    container = document.getElementById( 'container' );

    camera = new THREE.PerspectiveCamera( 40, window.innerWidth / window.innerHeight, 0.01, 3000.0 );
    camera.position.z = 2.0;

    controls = new THREE.OrbitControls( camera, container );
    controls.center.set( 0.0, 0.0, 0.0 );

    //Load the 2D texture containing the Z slices.
    THREE.ImageUtils.crossOrigin = 'anonymous'; // allow cross-origin loading of the texture images
    cubeTextures['bonsai'] = THREE.ImageUtils.loadTexture('./images/bonsai.raw.png');
    cubeTextures['teapot'] = THREE.ImageUtils.loadTexture('./images/teapot.raw.png');
    cubeTextures['foot'] = THREE.ImageUtils.loadTexture('./images/foot.raw.png');

    //Don't let it generate mipmaps to save memory and apply linear filtering to prevent use of LOD.
    cubeTextures['bonsai'].generateMipmaps = false;
    cubeTextures['bonsai'].minFilter = THREE.LinearFilter;
    cubeTextures['bonsai'].magFilter = THREE.LinearFilter;

    cubeTextures['teapot'].generateMipmaps = false;
    cubeTextures['teapot'].minFilter = THREE.LinearFilter;
    cubeTextures['teapot'].magFilter = THREE.LinearFilter;

    cubeTextures['foot'].generateMipmaps = false;
    cubeTextures['foot'].minFilter = THREE.LinearFilter;
    cubeTextures['foot'].magFilter = THREE.LinearFilter;

    var transferTexture = updateTransferFunction();

    var screenSize = new THREE.Vector2( window.innerWidth, window.innerHeight );
    rtTexture = new THREE.WebGLRenderTarget( screenSize.x, screenSize.y,
        { minFilter: THREE.LinearFilter,
          magFilter: THREE.LinearFilter,
          wrapS: THREE.ClampToEdgeWrapping,
          wrapT: THREE.ClampToEdgeWrapping,
          format: THREE.RGBFormat,
          type: THREE.FloatType,
          generateMipmaps: false } );

    var materialFirstPass = new THREE.ShaderMaterial( {
        vertexShader: document.getElementById( 'vertexShaderFirstPass' ).textContent,
        fragmentShader: document.getElementById( 'fragmentShaderFirstPass' ).textContent,
        side: THREE.BackSide
    } );

    materialSecondPass = new THREE.ShaderMaterial( {
        vertexShader: document.getElementById( 'vertexShaderSecondPass' ).textContent,
        fragmentShader: document.getElementById( 'fragmentShaderSecondPass' ).textContent,
        side: THREE.FrontSide,
        uniforms: {
            tex: { type: "t", value: rtTexture },
            cubeTex: { type: "t", value: cubeTextures['bonsai'] },
            transferTex: { type: "t", value: transferTexture },
            steps: { type: "1f", value: guiControls.steps },
            alphaCorrection: { type: "1f", value: guiControls.alphaCorrection }
        }
    });

    sceneFirstPass = new THREE.Scene();
    sceneSecondPass = new THREE.Scene();

    var boxGeometry = new THREE.BoxGeometry(1.0, 1.0, 1.0);
    boxGeometry.doubleSided = true;

    var meshFirstPass = new THREE.Mesh( boxGeometry, materialFirstPass );
    var meshSecondPass = new THREE.Mesh( boxGeometry, materialSecondPass );

    sceneFirstPass.add( meshFirstPass );
    sceneSecondPass.add( meshSecondPass );

    renderer = new THREE.WebGLRenderer();
    container.appendChild( renderer.domElement );

    stats = new Stats();
    stats.domElement.style.position = 'absolute';
    stats.domElement.style.top = '0px';
    container.appendChild( stats.domElement );

    var gui = new dat.GUI();
    var modelSelected = gui.add(guiControls, 'model', [ 'bonsai', 'foot', 'teapot' ] );
    gui.add(guiControls, 'steps', 0.0, 512.0);
    gui.add(guiControls, 'alphaCorrection', 0.01, 5.0).step(0.01);
    modelSelected.onChange(function(value) { materialSecondPass.uniforms.cubeTex.value = cubeTextures[value]; } );

    //Setup transfer function steps.
    var step1Folder = gui.addFolder('Step 1');
    var controllerColor1 = step1Folder.addColor(guiControls, 'color1');
    var controllerStepPos1 = step1Folder.add(guiControls, 'stepPos1', 0.0, 1.0);
    controllerColor1.onChange(updateTextures);
    controllerStepPos1.onChange(updateTextures);

    var step2Folder = gui.addFolder('Step 2');
    var controllerColor2 = step2Folder.addColor(guiControls, 'color2');
    var controllerStepPos2 = step2Folder.add(guiControls, 'stepPos2', 0.0, 1.0);
    controllerColor2.onChange(updateTextures);
    controllerStepPos2.onChange(updateTextures);

    var step3Folder = gui.addFolder('Step 3');
    var controllerColor3 = step3Folder.addColor(guiControls, 'color3');
    var controllerStepPos3 = step3Folder.add(guiControls, 'stepPos3', 0.0, 1.0);
    controllerColor3.onChange(updateTextures);
    controllerStepPos3.onChange(updateTextures);

    step1Folder.open();
    step2Folder.open();
    step3Folder.open();

    onWindowResize();
    window.addEventListener( 'resize', onWindowResize, false );
}

function imageLoad(url) {
    var image = document.createElement('img');
    image.crossOrigin = '';
    image.src = url;

    var loader = new THREE.TextureLoader();
    // Load a resource. Note: the texture is only available after the load callback has run.
    let textureAssets = null;
    loader.load(
        // resource URL (TextureLoader expects a URL, so pass the image's src)
        image.src,
        // Function called when the resource is loaded
        function ( texture ) {
            // do something with the texture
            texture.wrapS = THREE.RepeatWrapping;
            texture.wrapT = THREE.RepeatWrapping;
            texture.offset.x = 90 / (2 * Math.PI);
            textureAssets = texture;
        },
        // Function called when the download progresses
        function ( xhr ) {
            console.log( (xhr.loaded / xhr.total * 100) + '% loaded' );
        },
        // Function called when the download errors
        function ( xhr ) {
            console.log( 'An error happened' );
        }
    );
    return textureAssets; // still null until the load callback has fired
}

function updateTextures(value) {
    materialSecondPass.uniforms.transferTex.value = updateTransferFunction();
}

function updateTransferFunction() {
    var canvas = document.createElement('canvas');
    canvas.height = 20;
    canvas.width = 256;

    var ctx = canvas.getContext('2d');
    var grd = ctx.createLinearGradient(0, 0, canvas.width - 1, canvas.height - 1);
    grd.addColorStop(guiControls.stepPos1, guiControls.color1);
    grd.addColorStop(guiControls.stepPos2, guiControls.color2);
    grd.addColorStop(guiControls.stepPos3, guiControls.color3);
    ctx.fillStyle = grd;
    ctx.fillRect(0, 0, canvas.width - 1, canvas.height - 1);

    var img = document.getElementById("transferFunctionImg");
    img.src = canvas.toDataURL();
    img.style.width = "256px";
    img.style.height = "128px";

    transferTexture = new THREE.Texture(canvas);
    transferTexture.wrapS = transferTexture.wrapT = THREE.ClampToEdgeWrapping;
    transferTexture.needsUpdate = true;

    return transferTexture;
}

function onWindowResize( event ) {
    camera.aspect = window.innerWidth / window.innerHeight;
    camera.updateProjectionMatrix();
    renderer.setSize( window.innerWidth, window.innerHeight );
}

function animate() {
    requestAnimationFrame( animate );
    render();
    stats.update();
}

function render() {
    var delta = clock.getDelta();

    //Render first pass and store the world space coords of the back face fragments into the texture.
    renderer.render( sceneFirstPass, camera, rtTexture, true );

    //Render the second pass and perform the volume rendering.
    renderer.render( sceneSecondPass, camera );

    materialSecondPass.uniforms.steps.value = guiControls.steps;
    materialSecondPass.uniforms.alphaCorrection.value = guiControls.alphaCorrection;
}
//Leandro R Barbagallo - 2015 - lebarba at gmail.com

const VSHADER_SOURCE = `
    attribute vec4 a_Position;
    attribute vec2 a_uv;
    varying vec2 v_uv;
    void main() {
        gl_Position = a_Position;
        v_uv = a_uv;
    }
`;

const FSHADER_SOURCE = `
    precision mediump float;
    // Declare samplers. sampler2D is a data type, just like vec2.
    uniform sampler2D u_Sampler;
    uniform sampler2D u_Sampler2;
    varying vec2 v_uv;
    void main() {
        // texture2D(sampler2D sampler, vec2 coord) - built-in GLSL function that reads the texel at texture coordinate coord from the texture bound to sampler.
        vec4 color = texture2D(u_Sampler, v_uv);
        vec4 color2 = texture2D(u_Sampler2, v_uv);
        gl_FragColor = color * color2;
    }
`;

function main() {
    const canvas = document.getElementById('webgl');
    const gl = canvas.getContext("webgl");
    if (!gl) {
        console.log('Failed to get the rendering context for WebGL');
        return;
    }
    if (!initShaders(gl, VSHADER_SOURCE, FSHADER_SOURCE)) {
        console.log('Failed to initialize shaders.');
        return;
    }
    gl.clearColor(0.0, 0.5, 0.5, 1.0);

    // Coordinates of the quad's four vertices.
    const verticesOfPosition = new Float32Array([
        // Bottom-left corner first, counter-clockwise.
        -0.5, -0.5,
         0.5, -0.5,
         0.5,  0.5,
        -0.5,  0.5,
    ]);

    // The four texture (UV) coordinates.
    const uvs = new Float32Array([
        // Bottom-left first, counter-clockwise, matching the vertex order.
        0.0, 0.0,
        1.0, 0.0,
        1.0, 1.0,
        0.0, 1.0
    ]);

    initVertexBuffers(gl, verticesOfPosition);
    initUvBuffers(gl, uvs);
    initTextures(gl);
    initMaskTextures(gl);
}

// Initialize the textures. The plural "s" is because more than one image can be applied.
function initTextures(gl) {
    // Create the image element.
    const img = new Image();
    // Request CORS permission to avoid the cross-origin image problem:
    // "The image element contains cross-origin data, and may not be loaded."
    img.crossOrigin = "";
    img.src = "./images/bonsai.raw.png";
    img.onload = () => {
        // Create the texture object.
        const texture = gl.createTexture();
        // Get the sampler location.
        const u_Sampler = gl.getUniformLocation(gl.program, 'u_Sampler');
        if (!u_Sampler) {
            console.log('Failed to get the storage location of u_Sampler');
            return false;
        }
        // pixelStorei - image preprocessing: flip the Y axis of the image data (the image file itself is unchanged).
        gl.pixelStorei(gl.UNPACK_FLIP_Y_WEBGL, true);
        // Activate texture unit 0.
        gl.activeTexture(gl.TEXTURE0);
        // Bind the texture object.
        gl.bindTexture(gl.TEXTURE_2D, texture);
        // Configure the texture parameters.
        gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
        gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
        gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
        gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.MIRRORED_REPEAT);
        // Assign the image to the texture object.
        gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGB, gl.RGB, gl.UNSIGNED_BYTE, img);
        // Pass the texture unit to the fragment shader.
        gl.uniform1i(u_Sampler, 0);

        gl.clear(gl.COLOR_BUFFER_BIT);
        gl.drawArrays(gl.TRIANGLE_FAN, 0, 4);
    };
}

// Initialize the mask texture on the second texture unit.
function initMaskTextures(gl) {
    const img = new Image();
    img.src = "./images/teapot.raw.png";
    img.onload = () => {
        // Create the texture object.
        const texture = gl.createTexture();
        // Get the sampler location.
        const u_Sampler = gl.getUniformLocation(gl.program, 'u_Sampler2');
        if (!u_Sampler) {
            console.log('Failed to get the storage location of u_Sampler2');
            return false;
        }
        // pixelStorei - flip the Y axis of the image data.
        gl.pixelStorei(gl.UNPACK_FLIP_Y_WEBGL, true);
        // Activate texture unit 1.
        gl.activeTexture(gl.TEXTURE1);
        // Bind the texture object.
        gl.bindTexture(gl.TEXTURE_2D, texture);
        // Configure the texture parameters.
        gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
        gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
        // Assign the image to the texture object.
        gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGB, gl.RGB, gl.UNSIGNED_BYTE, img);
        // Pass the texture unit to the fragment shader.
        gl.uniform1i(u_Sampler, 1);

        gl.clear(gl.COLOR_BUFFER_BIT);
        gl.drawArrays(gl.TRIANGLE_FAN, 0, 4);
    };
}

function initVertexBuffers(gl, positions) {
    const vertexBuffer = gl.createBuffer();
    if (!vertexBuffer) {
        console.log('Failed to create the vertex buffer object');
        return -1;
    }
    gl.bindBuffer(gl.ARRAY_BUFFER, vertexBuffer);
    gl.bufferData(gl.ARRAY_BUFFER, positions, gl.STATIC_DRAW);

    const a_Position = gl.getAttribLocation(gl.program, 'a_Position');
    if (a_Position < 0) {
        console.log('Failed to get the storage location of a_Position');
        return -1;
    }
    gl.vertexAttribPointer(a_Position, 2, gl.FLOAT, false, 0, 0);
    gl.enableVertexAttribArray(a_Position);
}

function initUvBuffers(gl, uvs) {
    const uvsBuffer = gl.createBuffer();
    if (!uvsBuffer) {
        console.log('Failed to create the uv buffer object');
        return -1;
    }
    gl.bindBuffer(gl.ARRAY_BUFFER, uvsBuffer);
    gl.bufferData(gl.ARRAY_BUFFER, uvs, gl.STATIC_DRAW);

    const a_uv = gl.getAttribLocation(gl.program, 'a_uv');
    if (a_uv < 0) {
        console.log('Failed to get the storage location of a_uv');
        return -1;
    }
    gl.vertexAttribPointer(a_uv, 2, gl.FLOAT, false, 0, 0);
    gl.enableVertexAttribArray(a_uv);
}
</script>

Environment Setup


You only need to focus on the shader code. Three.js and other 3D libraries abstract a lot away for you — if you wanted to build this example in raw WebGL, you would have to write a lot of extra code just to get it running. To start writing WebGL shaders you don't need much, only the following three steps:

1. Make sure you are using a modern browser with good WebGL support, such as the latest Firefox or Chrome.
2. Create a directory to hold your experiments.
3. Copy the minified Three.js library into that directory (a minimal page skeleton is sketched below).
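A minimal page skeleton for this demo might look like the following. File names are illustrative; besides three.min.js, the code above also expects the Detector, OrbitControls, Stats and dat.GUI helpers, plus the container div and transfer-function img it looks up by id.

<!DOCTYPE html>
<html>
  <head>
    <meta charset="utf-8">
    <title>WebGL volume rendering</title>
  </head>
  <body>
    <div id="container"></div>
    <img id="transferFunctionImg">

    <script src="three.min.js"></script>
    <script src="Detector.js"></script>
    <script src="OrbitControls.js"></script>
    <script src="stats.min.js"></script>
    <script src="dat.gui.min.js"></script>

    <!-- the shader <script> blocks and the main <script> from the sections above go here -->
  </body>
</html>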

See also:

LearnOpenGL - Coordinate Systems

WebGL model view projection - Web APIs | MDN

GLSL Shaders - Game development | MDN

Explaining basic 3D theory - Game development | MDN

Lebarba - WebGL Volume Rendering made easy
