Problem Description
I'm using an image in a canvas element as a texture in Three.js, performing image manipulations on the canvas using JavaScript, and then calling needsUpdate() on the texture. This works, but it's quite slow.
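For reference, a minimal sketch of that canvas-based approach, assuming an existing renderer, scene, and camera; processImage is a placeholder for whatever per-pixel work is being done:

// slow approach: manipulate pixels on a 2D canvas, then re-upload the texture
var canvas = document.createElement( 'canvas' );
var ctx = canvas.getContext( '2d' );
var canvasTexture = new THREE.Texture( canvas );
var material = new THREE.MeshBasicMaterial( { map: canvasTexture } );

function animate() {
    var pixels = ctx.getImageData( 0, 0, canvas.width, canvas.height );
    processImage( pixels.data );        // hypothetical CPU-side image manipulation
    ctx.putImageData( pixels, 0, 0 );

    canvasTexture.needsUpdate = true;   // forces a full re-upload to the GPU every frame
    renderer.render( scene, camera );
    requestAnimationFrame( animate );
}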
I'd like to perform the image calculations in a fragment shader instead. I've found many examples which almost do this:
- Shader materials: http://mrdoob.github.io/three.js/examples/webgl_shader2.html This example shows image manipulations performed in a fragment shader, but that shader is functioning as the fragment shader of an entire material. I only want to use the shader on a texture, and then use the texture as a component of a second material.
- Render to texture: https://threejsdoc.appspot.com/doc/three.js/examples/webgl_rtt.html This shows rendering the entire scene to a WebGLRenderTarget and using that as the texture in a material. I only want to pre-process an image, not render an entire scene.
- Effects composer: http://www.airtightinteractive.com/demos/js/shaders/preview/ This shows applying shaders as a post-process to the entire scene.
Edit: Here's another one:
- Render to another scene: http://relicweb.com/webgl/rt.html This example, referenced in Three.js Retrieve data from WebGLRenderTarget (water sim), uses a second scene with its own orthographic camera to render a dynamic texture to a WebGLRenderTarget, which is then used as a texture in the primary scene. I guess this is a special case of the first "render to texture" example listed above, and would probably work for me, but seems over-complicated.
As I understand it, ideally I'd be able to make a new framebuffer object with its own fragment shader, render it on its own, and use its output as a texture uniform for another material's fragment shader. Is this possible?
Edit 2: It looks like I might be asking something similar to this: Shader Materials and GL Framebuffers in THREE.js ...though the question doesn't appear to have been resolved.
Recommended Answer
Render to texture and Render to another scene as listed above are the same thing, and are the technique you want. To explain:
In vanilla WebGL the way you do this kind of thing is by creating a framebuffer object (FBO) from scratch, binding a texture to it, and rendering it with the shader of your choice. Concepts like "scene" and "camera" aren't involved, and it's kind of a complicated process. Here's an example:
http://learningwebgl.com/blog/?p=1786
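As a rough sketch of what that raw-WebGL setup involves (gl is an assumed WebGLRenderingContext, width and height are assumed dimensions, and error checking is omitted):

// raw WebGL: create an FBO backed by a texture, then render into it
var targetTexture = gl.createTexture();
gl.bindTexture( gl.TEXTURE_2D, targetTexture );
gl.texImage2D( gl.TEXTURE_2D, 0, gl.RGBA, width, height, 0, gl.RGBA, gl.UNSIGNED_BYTE, null );
gl.texParameteri( gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR );

var fbo = gl.createFramebuffer();
gl.bindFramebuffer( gl.FRAMEBUFFER, fbo );
gl.framebufferTexture2D( gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, targetTexture, 0 );

// ...draw a full-screen quad with your image-processing shader program here...

gl.bindFramebuffer( gl.FRAMEBUFFER, null );  // later draws go back to the screen
// targetTexture can now be sampled by another shader program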
But this also happens to be essentially what Three.js does when you use it to render a scene with a camera: the renderer outputs to a framebuffer, which in its basic usage goes straight to the screen. So if you instruct it to render to a new WebGLRenderTarget instead, you can use whatever the camera sees as the input texture of a second material. All the complicated stuff is still happening, but behind the scenes, which is the beauty of Three.js. :)
So: To replicate a WebGL setup of an FBO containing a single rendered texture, as mentioned in the comments, just make a new scene containing an orthographic camera and a single plane with a material using the desired texture, then render to a new WebGLRenderTarget using your custom shader:
// new render-to-texture scene
myScene = new THREE.Scene();

// you may need to modify these parameters
var renderTargetParams = {
    minFilter: THREE.LinearFilter,
    stencilBuffer: false,
    depthBuffer: false
};

myImage = THREE.ImageUtils.loadTexture( 'path/to/texture.png',
    new THREE.UVMapping(), function() { myCallbackFunction(); } );

imageWidth = myImage.image.width;
imageHeight = myImage.image.height;

// create buffer
myTexture = new THREE.WebGLRenderTarget( imageWidth, imageHeight, renderTargetParams );

// custom RTT materials
myUniforms = {
    colorMap: { type: "t", value: myImage }
};
myTextureMat = new THREE.ShaderMaterial({
    uniforms: myUniforms,
    vertexShader: document.getElementById( 'my_custom_vs' ).textContent,
    fragmentShader: document.getElementById( 'my_custom_fs' ).textContent
});

// Setup render-to-texture scene
myCamera = new THREE.OrthographicCamera( imageWidth / - 2, imageWidth / 2, imageHeight / 2, imageHeight / - 2, -10000, 10000 );
var myTextureGeo = new THREE.PlaneGeometry( imageWidth, imageHeight );
myTextureMesh = new THREE.Mesh( myTextureGeo, myTextureMat );
myTextureMesh.position.z = -100;
myScene.add( myTextureMesh );

renderer.render( myScene, myCamera, myTexture, true );
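The my_custom_vs and my_custom_fs script elements referenced above aren't shown in the original answer; as one possible illustration, a passthrough vertex shader plus a fragment shader doing a simple image operation (grayscale) might look like this:

<script id="my_custom_vs" type="x-shader/x-vertex">
    varying vec2 vUv;
    void main() {
        vUv = uv;
        gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
    }
</script>

<script id="my_custom_fs" type="x-shader/x-fragment">
    uniform sampler2D colorMap;
    varying vec2 vUv;
    void main() {
        // example image operation: convert to grayscale
        vec4 c = texture2D( colorMap, vUv );
        float g = dot( c.rgb, vec3( 0.299, 0.587, 0.114 ) );
        gl_FragColor = vec4( vec3( g ), c.a );
    }
</script>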
Once you've rendered the new scene, myTexture will be available for use as a texture in another material in your main scene. Note that you may want to trigger the first render with the callback function in the loadTexture() call, so that it won't try to render until the source image has loaded.
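A possible shape for that callback, assuming a main scene named mainScene and the variables from the snippet above; note that with the older Three.js API used in this answer the render target itself is passed as the map, whereas newer releases expect myTexture.texture and renderer.setRenderTarget():

// called once the source image has finished loading
function myCallbackFunction() {
    // render the pre-processing scene into the target
    renderer.render( myScene, myCamera, myTexture, true );

    // use the result as a texture on a material in the main scene
    // (on newer Three.js versions, use myTexture.texture here instead)
    var finalMat = new THREE.MeshBasicMaterial( { map: myTexture } );
    var box = new THREE.Mesh( new THREE.BoxGeometry( 100, 100, 100 ), finalMat );
    mainScene.add( box );
}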