unity3d - Problems porting a GLSL Shadertoy shader to Unity
I'm trying to port a shadertoy.com shader (the atmospheric scattering sample, interactive demo, code) to Unity. The shader is written in GLSL, and I have to start the editor with C:\Program Files\Unity\Editor>Unity.exe -force-opengl to make it render the shader (otherwise a "This shader cannot run on this GPU" error comes up), but that's not the problem right now. The problem is porting the shader to Unity.
The functions for the scattering etc. are identical and "runnable" in my ported shader; the only thing is that the mainImage() function manages the camera, the light directions and the ray direction itself. This of course has to be changed so that Unity's camera position, view direction, light sources and directions are used instead.
The main function of the original looks like this:
void mainImage( out vec4 fragColor, in vec2 fragCoord ) {
    // default ray dir
    vec3 dir = ray_dir( 45.0, iResolution.xy, fragCoord.xy );

    // default ray origin
    vec3 eye = vec3( 0.0, 0.0, 2.4 );

    // rotate camera
    mat3 rot = rot3xy( vec2( 0.0, iGlobalTime * 0.5 ) );
    dir = rot * dir;
    eye = rot * eye;

    // sun light dir
    vec3 l = vec3( 0, 0, 1 );

    vec2 e = ray_vs_sphere( eye, dir, R );
    if ( e.x > e.y ) {
        discard;
    }

    vec2 f = ray_vs_sphere( eye, dir, R_INNER );
    e.y = min( e.y, f.x );

    vec3 I = in_scatter( eye, dir, e, l );

    fragColor = vec4( I, 1.0 );
}
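The camera setup in mainImage is plain vector math and can be checked outside the shader. The following is a minimal Python sketch of ray_dir under the same conventions as the code above (45-degree vertical field of view, camera looking down -Z); it is an editor's illustration, not part of the original shader:

```python
import math

DEG_TO_RAD = math.pi / 180.0

def ray_dir(fov, size, pos):
    # Same construction as the shader's ray_dir: offset the pixel from the
    # screen center, derive the -Z depth from the vertical field of view,
    # then normalize to get a view-space ray direction.
    xy = (pos[0] - size[0] * 0.5, pos[1] - size[1] * 0.5)
    cot_half_fov = math.tan((90.0 - fov * 0.5) * DEG_TO_RAD)
    z = size[1] * 0.5 * cot_half_fov
    length = math.sqrt(xy[0] * xy[0] + xy[1] * xy[1] + z * z)
    return (xy[0] / length, xy[1] / length, -z / length)
```

For the center pixel, the offset is zero and the ray points straight down the -Z axis, which is a quick way to verify a port produces sensible directions.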
I've read through the documentation of this function and how it's supposed to work at https://www.shadertoy.com/howto :
Image shaders implement the mainImage() function in order to generate procedural images by computing a color for each pixel. This function is expected to be called once per pixel, and it is the responsibility of the host application to provide the right inputs to it, get the output color and assign it to the screen pixel. The prototype is:
void mainImage( out vec4 fragColor, in vec2 fragCoord );
where fragCoord contains the pixel coordinates for which the shader needs to compute a color. The coordinates are in pixel units, ranging from 0.5 to resolution-0.5, over the rendering surface, where the resolution is passed to the shader through the iResolution uniform (see below).
The resulting color is gathered in fragColor as a four-component vector, the last component being ignored by the client. The result is gathered as an "out" variable in prevision of a future addition of multiple render targets.
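That calling convention can be sketched in host code. The following Python loop is a hypothetical stand-in for what a host like Shadertoy does (not its actual implementation): call mainImage once per pixel, with fragCoord at the pixel center so coordinates run from 0.5 to resolution-0.5:

```python
def main_image_stub(frag_coord, resolution):
    # Hypothetical stand-in for a shader's mainImage: returns an RGBA tuple.
    return (frag_coord[0] / resolution[0],
            frag_coord[1] / resolution[1],
            0.0,
            1.0)  # last component is ignored by the host

def render(width, height):
    image = []
    for y in range(height):
        row = []
        for x in range(width):
            frag_coord = (x + 0.5, y + 0.5)  # pixel center, 0.5 .. resolution-0.5
            row.append(main_image_stub(frag_coord, (float(width), float(height))))
        image.append(row)
    return image
```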
So in that function there are references to iGlobalTime to make the camera rotate with time, and references to iResolution for the resolution. I've embedded the shader in a Unity shader and tried to fix and wire up dir, eye and l so that it works with Unity, but I'm completely stuck. I get some sort of picture which looks "related" to the original shader: (top is the original, bottom the current Unity state)
I'm not a shader professional; I only know some basics of OpenGL and, for the most part, write game logic in C#. All I could do was look at other shader examples and at how to get data about the camera, light sources etc. in this code, but as you can see, nothing works out, really.
I've copied the skeleton code for the shader from https://en.wikibooks.org/wiki/glsl_programming/unity/specular_highlights and some vectors from http://forum.unity3d.com/threads/glsl-shader.39629/ .
I hope someone can point me in some direction on how to fix this shader / correctly port it to Unity. Below is the current shader code; to reproduce it, create a new shader in a blank project, copy the code inside, make a new material, assign the shader to that material, then add a sphere, add the material to it and add a directional light.
Shader "Unlit/AtmoFragShader" {
    Properties {
        _MainTex("Base (RGB)", 2D) = "white" {}
        _LC("LC", Color) = (1,0,0,0) /* stuff for testing the shader, not all used */
        _LP("LP", Vector) = (1,1,1,1)
    }

    SubShader {
        Tags { "Queue" = "Geometry" } //Is this the right queue?

        Pass {
            //Tags{ "LightMode" = "ForwardBase" }
            GLSLPROGRAM

            /* begin the port by copying in the constants */
            // math const
            const float PI = 3.14159265359;
            const float DEG_TO_RAD = PI / 180.0;
            const float MAX = 10000.0;

            // scatter const
            const float K_R = 0.166;
            const float K_M = 0.0025;
            const float E = 14.3; // light intensity
            const vec3 C_R = vec3(0.3, 0.7, 1.0); // 1 / wavelength ^ 4
            const float G_M = -0.85; // Mie g

            const float R = 1.0; /* radius of the sphere? this should be set from the geometry or something.. */
            const float R_INNER = 0.7;
            const float SCALE_H = 4.0 / (R - R_INNER);
            const float SCALE_L = 1.0 / (R - R_INNER);

            const int NUM_OUT_SCATTER = 10;
            const float FNUM_OUT_SCATTER = 10.0;

            const int NUM_IN_SCATTER = 10;
            const float FNUM_IN_SCATTER = 10.0;

            /* begin functions. these are outside the defines because they should be accessible to everyone. */

            // angle : pitch, yaw
            mat3 rot3xy(vec2 angle) {
                vec2 c = cos(angle);
                vec2 s = sin(angle);

                return mat3(
                    c.y, 0.0, -s.y,
                    s.y * s.x, c.x, c.y * s.x,
                    s.y * c.x, -s.x, c.y * c.x
                );
            }

            // ray direction
            vec3 ray_dir(float fov, vec2 size, vec2 pos) {
                vec2 xy = pos - size * 0.5;

                float cot_half_fov = tan((90.0 - fov * 0.5) * DEG_TO_RAD);
                float z = size.y * 0.5 * cot_half_fov;

                return normalize(vec3(xy, -z));
            }

            // ray intersects sphere
            // e = -b +/- sqrt( b^2 - c )
            vec2 ray_vs_sphere(vec3 p, vec3 dir, float r) {
                float b = dot(p, dir);
                float c = dot(p, p) - r * r;

                float d = b * b - c;
                if (d < 0.0) {
                    return vec2(MAX, -MAX);
                }
                d = sqrt(d);

                return vec2(-b - d, -b + d);
            }

            // Mie
            // g : ( -0.75, -0.999 )
            //      3 * ( 1 - g^2 )               1 + c^2
            // F = ----------------- * -------------------------------
            //      2 * ( 2 + g^2 )     ( 1 + g^2 - 2 * g * c )^(3/2)
            float phase_mie(float g, float c, float cc) {
                float gg = g * g;

                float a = (1.0 - gg) * (1.0 + cc);

                float b = 1.0 + gg - 2.0 * g * c;
                b *= sqrt(b);
                b *= 2.0 + gg;

                return 1.5 * a / b;
            }

            // Rayleigh
            // g : 0
            // F = 3/4 * ( 1 + c^2 )
            float phase_rayleigh(float cc) {
                return 0.75 * (1.0 + cc);
            }

            float density(vec3 p) {
                return exp(-(length(p) - R_INNER) * SCALE_H);
            }

            float optic(vec3 p, vec3 q) {
                vec3 step = (q - p) / FNUM_OUT_SCATTER;
                vec3 v = p + step * 0.5;

                float sum = 0.0;
                for (int i = 0; i < NUM_OUT_SCATTER; i++) {
                    sum += density(v);
                    v += step;
                }
                sum *= length(step) * SCALE_L;

                return sum;
            }

            vec3 in_scatter(vec3 o, vec3 dir, vec2 e, vec3 l) {
                float len = (e.y - e.x) / FNUM_IN_SCATTER;
                vec3 step = dir * len;

                vec3 p = o + dir * e.x;
                vec3 v = p + dir * (len * 0.5);

                vec3 sum = vec3(0.0);
                for (int i = 0; i < NUM_IN_SCATTER; i++) {
                    vec2 f = ray_vs_sphere(v, l, R);
                    vec3 u = v + l * f.y;

                    float n = (optic(p, v) + optic(v, u)) * (PI * 4.0);

                    sum += density(v) * exp(-n * (K_R * C_R + K_M));

                    v += step;
                }
                sum *= len * SCALE_L;

                float c = dot(dir, -l);
                float cc = c * c;

                return sum * (K_R * C_R * phase_rayleigh(cc) + K_M * phase_mie(G_M, c, cc)) * E;
            }
            /* end functions */

            /* vertex shader begins here */
            #ifdef VERTEX
            const float SpecularContribution = 0.3;
            const float DiffuseContribution = 1.0 - SpecularContribution;

            uniform vec4 _LP;
            varying vec2 TextureCoordinate;
            varying float LightIntensity;
            varying vec4 someOutput;

            /* transient stuff */
            varying vec3 eyeOutput;
            varying vec3 dirOutput;
            varying vec3 lOutput;
            varying vec2 eOutput;

            /* lighting stuff */
            // i.e. one
            #include "UnityCG.glslinc"
            uniform vec3 _WorldSpaceCameraPos; // camera position in world space
            uniform mat4 _Object2World; // model matrix
            uniform mat4 _World2Object; // inverse model matrix
            uniform vec4 _WorldSpaceLightPos0; // direction or position of light source
            uniform vec4 _LightColor0; // color of light source (from "Lighting.cginc")

            void main() {
                /* code from the example shader */
                gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;

                vec3 ecPosition = vec3(gl_ModelViewMatrix * gl_Vertex);
                vec3 tnorm = normalize(gl_NormalMatrix * gl_Normal);
                vec3 lightVec = normalize(_LP.xyz - ecPosition);
                vec3 reflectVec = reflect(-lightVec, tnorm);
                vec3 viewVec = normalize(-ecPosition);

                /* copied from https://en.wikibooks.org/wiki/glsl_programming/unity/specular_highlights for testing stuff */
                //I have no idea what I'm doing, but it computes the vectors I need
                mat4 modelMatrix = _Object2World;
                mat4 modelMatrixInverse = _World2Object; // the unity_Scale.w
                // is unnecessary because we normalize the vectors

                vec3 normalDirection = normalize(vec3(
                    vec4(gl_Normal, 0.0) * modelMatrixInverse));
                vec3 viewDirection = normalize(vec3(
                    vec4(_WorldSpaceCameraPos, 1.0) - modelMatrix * gl_Vertex));
                vec3 lightDirection;
                float attenuation;

                if (0.0 == _WorldSpaceLightPos0.w) // directional light?
                {
                    attenuation = 1.0; // no attenuation
                    lightDirection = normalize(vec3(_WorldSpaceLightPos0));
                }
                else // point or spot light
                {
                    vec3 vertexToLightSource = vec3(_WorldSpaceLightPos0
                        - modelMatrix * gl_Vertex);
                    float distance = length(vertexToLightSource);
                    attenuation = 1.0 / distance; // linear attenuation
                    lightDirection = normalize(vertexToLightSource);
                }

                /* test port */
                // default ray dir
                //that's the direction of the camera here?
                vec3 dir = viewDirection; //normalDirection;//viewDirection;// tnorm;//lightVec;//lightDirection;//normalDirection; //lightVec;//tnorm;//ray_dir(45.0, iResolution.xy, fragCoord.xy);

                // default ray origin
                //I think they mean the position of the camera here?
                vec3 eye = vec3(_WorldSpaceCameraPos); //vec3(_WorldSpaceLightPos0); //// vec3(0.0, 0.0, 0.0); //_WorldSpaceCameraPos;//ecPosition; //vec3(0.0, 0.0, 2.4);

                // rotate camera is not needed, removed

                // sun light dir
                //I think they mean the direction of our directional light?
                vec3 l = lightDirection;//_LightColor0.xyz; //lightDirection; //normalDirection;//normalize(vec3(_WorldSpaceLightPos0));//lightVec;// vec3(0, 0, 1);

                /* this computes the intersection of the ray and the sphere.. is it needed here?*/
                vec2 e = ray_vs_sphere(eye, dir, R);

                /* copy the stuff so that we can use it in the fragment shader; "discard" is only allowed in the fragment shader, so the rest has to be computed there */
                eOutput = e;
                eyeOutput = eye;
                dirOutput = dir;
                lOutput = dir;
            }
            #endif

            #ifdef FRAGMENT

            uniform sampler2D _MainTex;
            varying vec2 TextureCoordinate;
            uniform vec4 _LC;
            varying float LightIntensity;

            /* transient port */
            varying vec3 eyeOutput;
            varying vec3 dirOutput;
            varying vec3 lOutput;
            varying vec2 eOutput;

            void main() {
                /* real fragment */

                if (eOutput.x > eOutput.y) {
                    //discard;
                }

                vec2 f = ray_vs_sphere(eyeOutput, dirOutput, R_INNER);
                vec2 e = eOutput;
                e.y = min(e.y, f.x);

                vec3 I = in_scatter(eyeOutput, dirOutput, eOutput, lOutput);

                gl_FragColor = vec4(I, 1.0);

                /*vec4 c2;
                c2.x = 1.0;
                c2.y = 1.0;
                c2.z = 0.0;
                c2.w = 1.0;
                gl_FragColor = c2;*/
                //gl_FragColor = c;
            }
            #endif

            ENDGLSL
        }
    }
}
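While debugging a port like this, it helps to reproduce the pure math functions on the CPU and compare values. Below is an editor's Python sketch (an illustration, not part of the shader) that directly translates ray_vs_sphere and the two phase functions, using the MAX sentinel from the constants above:

```python
import math

MAX = 10000.0  # same "missed the sphere" sentinel as in the shader

def ray_vs_sphere(p, d, r):
    # Ray/sphere intersection against a sphere centered at the origin;
    # returns (near, far) distances along the ray, or (MAX, -MAX) on a miss
    # (note near > far then, which is what the shader's e.x > e.y test checks).
    b = p[0] * d[0] + p[1] * d[1] + p[2] * d[2]
    c = p[0] * p[0] + p[1] * p[1] + p[2] * p[2] - r * r
    disc = b * b - c
    if disc < 0.0:
        return (MAX, -MAX)
    disc = math.sqrt(disc)
    return (-b - disc, -b + disc)

def phase_rayleigh(cc):
    # F = 3/4 * (1 + c^2)
    return 0.75 * (1.0 + cc)

def phase_mie(g, c, cc):
    # Mie phase function for g in (-0.75, -0.999), as in the shader comment
    gg = g * g
    a = (1.0 - gg) * (1.0 + cc)
    b = 1.0 + gg - 2.0 * g * c
    b *= math.sqrt(b)
    b *= 2.0 + gg
    return 1.5 * a / b
```

For example, with the original's eye at (0, 0, 2.4) looking down -Z at the unit sphere, the ray enters at distance 1.4 and exits at 3.4, which a correct port should reproduce.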
Any help is appreciated; sorry for the long post and explanations.
EDIT: I found out that the radius of the sphere has an influence on things; a sphere with scale 2.0 in every direction gives a much better result. However, the picture is still completely independent of the viewing angle of the camera and of the lights, and nowhere near the Shadertoy version.
You're trying to render a 2D texture onto a sphere, but the original has a different approach: what it is trying to do is apply the shader to a plane crossed by a sphere.
For general purposes, this article shows how to convert from Shadertoy to Unity3D.
These are the steps included there:
- Replace the iGlobalTime shader input ("shader playback time in seconds") with _Time.y
- Replace iResolution.xy ("viewport resolution in pixels") with _ScreenParams.xy
- Replace vec2 types with float2, mat2 with float2x2 etc.
- Replace vec3(1) shortcut constructors, in which all elements have the same value, with explicit float3(1,1,1)
- Replace texture2D with tex2D
- Replace atan(x,y) with atan2(y,x) <- note the parameter ordering!
- Replace mix() with lerp()
- Replace *= with mul()
- Remove the third (bias) parameter from texture2D lookups
- mainImage(out vec4 fragColor, in vec2 fragCoord) is the fragment shader function, equivalent to float4 mainImage(float2 fragCoord : SV_POSITION) : SV_Target
- UV coordinates in GLSL have 0 at the top and increase downwards; in HLSL 0 is at the bottom and increases upwards, so you may need to use uv.y = 1 - uv.y at some point.
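Two of the items above are pure math and easy to sanity-check outside any shader. A small Python sketch (an illustration only, with plain tuples standing in for vectors) of the mix/lerp equivalence and the uv.y flip:

```python
def lerp(a, b, t):
    # HLSL's lerp(a, b, t) computes the same thing as GLSL's mix(a, b, t):
    # linear interpolation between a and b by factor t.
    return a + t * (b - a)

def flip_v(uv):
    # Vertical flip for the UV convention difference in the last item:
    # uv.y = 1 - uv.y
    return (uv[0], 1.0 - uv[1])
```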
About your question:
Tags { "Queue" = "Geometry" } //is this the right queue?
The queue refers to the order in which objects are rendered, and Geometry is one of the first; if you want your shader to run on top of everything, use Overlay, for example. This topic is covered here:
- Background - this render queue is rendered before any others. It is used for skyboxes and the like.
- Geometry (default) - this is used for most objects. Opaque geometry uses this queue.
- AlphaTest - alpha-tested geometry uses this queue. It's a separate queue from Geometry since it's more efficient to render alpha-tested objects after all solid ones are drawn.
- Transparent - this render queue is rendered after Geometry and AlphaTest, in back-to-front order. Anything alpha-blended (i.e. shaders that don't write to the depth buffer) should go here (glass, particle effects).
- Overlay - this render queue is meant for overlay effects. Anything rendered last should go here (e.g. lens flares).