MENTAL RAY PHENOMENA

What are phenomena?

Phenomena are collections of shader nodes that are connected together but appear as a single node, similar to grouping Maya nodes. You can expose your own inputs on the phenomenon interface, which can considerably simplify working with the shader.

This example relies on several Maya shaders such as the sampler info and the Maya surface shader. For this reason the phenomenon will only work in Maya, but it should work in all Maya versions.

A nice property of phenomena is that you can combine shader elements that would otherwise be complicated to combine, or possible only with C shader programming.

For example, the mental ray ambient occlusion shader has no input for a bump map; with a phenomenon you can add one. Or suppose you need an output shader for a special effect: in Maya you would have to attach it to the active camera, but with a phenomenon you can add an output statement that automatically applies to the current camera.

Step by step creating a simple phenomenon (maya only)

For several Maya versions now, you have been able to create phenomena directly from within Maya. But for a better understanding, and to explain things more generally, I have chosen the text-only approach; to be honest, I think it is easier than the Maya phenomizer interface.

For this little example I want to create a depth shader from Maya nodes: a simple setup that lets you set the near and far plane for a linear black-and-white depth image (useful, for example, for depth-of-field effects). Let's see how we can do this in Maya.

This is the scene we want to color:

First we get the position of the current point from a sampler info node. The samplerInfo→pointCamera attribute gives us the coordinates of the point along the negative z-axis. The negative sign is inconvenient, so we correct it with a multiplication by -1.

As you can see, because the distance of our point ranges from, say, 8 up to 100, our colors get those same values. But color channels can only use values from 0 to 1. To fix this we use a setRange node; with the oldMin and oldMax values you control how the ramp is displayed.
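
The setRange node performs a per-channel linear remap. Here is a minimal Python sketch of that formula as I understand it; the parameter names mirror the Maya attributes, and this is only an illustration, not mental ray code:

```python
def set_range(value, old_min, old_max, new_min=0.0, new_max=1.0):
    """Linearly remap value from [old_min, old_max] to [new_min, new_max],
    the way Maya's setRange node remaps each channel."""
    t = (value - old_min) / (old_max - old_min)
    t = max(0.0, min(1.0, t))  # clamp, so colors stay in the valid 0..1 range
    return new_min + t * (new_max - new_min)

# A point 8 units from the camera maps to black, 100 units maps to white:
print(set_range(8.0, 8.0, 100.0))    # 0.0
print(set_range(100.0, 8.0, 100.0))  # 1.0
print(set_range(54.0, 8.0, 100.0))   # 0.5
```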

This is the shader we want to rebuild as a phenomenon.

Okay, every phenomenon is defined by the phenomenon declaration:

declare phenomenon
end declare

Well, this phenomenon is quite useless; you need a few more things. Let's first define what type of phenomenon we have, that is, what the output of the phenomenon will be. You can choose between geometry, color, scalar, boolean, or other self-defined structures, for example if you want your new node to offer not only a color connection but some others too. But first let's define it:

declare phenomenon
  color "depth_phen" (
  		)
  version 1
  apply material
end declare

As you can see, I've given the phenomenon the name "depth_phen", but you can choose an arbitrary name. With the word "color" I tell mental ray to expect that the output of this shader is a color value consisting of RGBA. The "apply material" line is useful for Maya, which then knows where to plug this shader in.

Now we can load the shader in our application, but we still don't have any interface or functionality. So let's define what we want to have in our interface:

declare phenomenon
  color "depth_phen" (
		scalar "near",	 #:default 20.0 shortname "near"
		scalar "far"	 #:default 100.0 shortname "far"
  		)
  version 1
  apply material
end declare

Now the shader appears with the inputs we need.

We give the default values 20 and 100 to provide initial values. The value after "shortname" matters only in Maya: it defines what the connection plug will be called internally and, much more importantly, in a Maya ASCII file. This way you always know how shaders are connected; without this definition, the connections are called "S001" or something similar.

Of course, if we render an image with this shader it is completely black, because all resulting values of our depth_phen have their default values, which means 0. Now it gets a little more complicated.

The phenomenon needs a starting point for the walk through the network. This starting point is called "root". So let's define it:

declare phenomenon
  color "depth_phen" (
		scalar "near",	 #:default 20.0 shortname "near"
		scalar "far"	 #:default 100.0 shortname "far"
  		)

   shader "depthSurface" "maya_surfaceshader" (
		"outColor" 1. 0. 0.,
		"outTransparency" 0. 0. 0.,
		"outMatteOpacity" 1. 1. 1.,
		"outGlowColor" 0. 0. 0.
		)

  
  root = "depthSurface"
  
  version 1
  apply material
end declare

Hey! You didn't say anything about this surface shader. Yes, you are right. But the root needs something to operate on, and since we use Maya shaders, the normal way to create a simple, non-shaded constant surface is with a surface shader.

So what should happen? First "root" is evaluated. It points to a surface shader, and this surface shader is evaluated with an outColor of red (1, 0, 0). So the result should be a red image. Why do I write "should"? Because it does not happen, due to a Maya peculiarity: Maya's built-in mental ray shaders do not use a simple "color" as their output but a "struct", a user-defined output structure. Because root expects a color, it does not know how to evaluate this, which causes the following error:

PHEN 0.6 error 051018: phenomenon “depth_phen” root not found

What now? One shader that provides a color output instead of a struct is the shading group shader. If we use it, we get this:

declare phenomenon
    color "depth_phen" (
		scalar "near",	 #:default 20.0 shortname "near"
		scalar "far"	 #:default 100.0 shortname "far"
            )

   shader "depthSurface" "maya_surfaceshader" (
		"outColor" 1. 0. 0.,
		"outTransparency" 0. 0. 0.,
		"outMatteOpacity" 1. 1. 1.,
		"outGlowColor" 0. 0. 0.
		)

   shader "shadingEngine" "maya_shadingengine" (
		"surfaceShader" = "depthSurface.outColor",
		"cutAwayOpacity" 0.,
		"alphaMode" 0
		)

    root = "shadingEngine"
         
    version 1
    apply material
end declare

Now the root looks into the shading engine. There it evaluates the surface shader and sees a new kind of assignment: with the "=", the output of another shader is used as the input for this element. Here the "outColor" of our "depthSurface" shader is evaluated, and the color is used for further calculation. Now we have our first simple network: two connected shaders.

From this point we can go straight toward our goal. What else do we need?

First we need to know how far away our point is. This is done by a sampler info node, which provides in its pointCamera attribute the xyz position of the point relative to the camera.

A sampler info is defined like this:

shader "samplerInfo"
	"maya_samplerinfo" ()

shader "samplerInfoPointCamera"
	"maya_vector_to_xyz" (
		"vector" = "samplerInfo.pointCamera"
		)

The "maya_vector_to_xyz" is needed to be able to separate the individual components of the vector.

Now we have a point in space, but we only need the z-coordinate. Unfortunately we cannot use it directly, because the value in the view direction is negative (don't ask me why, but it is). So we have to get rid of the minus sign, which can be done with a multiply divide node:

shader "samplerInfoPointCamera"
	"maya_vector_to_xyz" (
		"vector" = "samplerInfo.pointCamera"
		)

shader "multiplyDivideC"
	"maya_xyz_to_vector" (
		"x" = "samplerInfoPointCamera.z",
		"y" = "samplerInfoPointCamera.z",
		"z" = "samplerInfoPointCamera.z"
		)

shader "multiplyDivide"
	"maya_multdiv" (
		"operation" 1,
		"input1" = "multiplyDivideC",
		"input2" -1. -1. -1.
		)

Now we should have a positive value for the z direction. But we cannot use these values directly as a shader, because they are real-world distances; for shader values we need a range from 0 to 1. And as you can imagine, this is where our interface comes in:

shader "nearXYZ"
	"maya_xyz_to_vector" (
		"x" = interface "near",
		"y" = interface "near",
		"z" = interface "near"
		)

shader "farXYZ"
	"maya_xyz_to_vector" (
		"x" = interface "far",
		"y" = interface "far",
		"z" = interface "far"
		)

shader "setRange"
	"maya_setrange" (
		"value" = "multiplyDivide.output",
		"min"  0.,
		"max" 1.,
		"oldMin"  = "nearXYZ",
		"oldMax" = "farXYZ"
		)
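
To sanity-check the network built so far, here is a small Python sketch that mirrors the chain numerically (negate the camera-space z, then remap [near, far] to [0, 1]). The sample values are illustrative, not taken from the scene:

```python
def depth_value(point_camera_z, near=20.0, far=100.0):
    """Mirror the phenomenon's network: the multiplyDivide node flips the
    sign of the camera-space z, the setRange node remaps [near, far] to [0, 1]."""
    distance = point_camera_z * -1.0      # multiplyDivide, input2 = -1 -1 -1
    t = (distance - near) / (far - near)  # setRange: oldMin=near, oldMax=far
    return max(0.0, min(1.0, t))          # clamp into the valid color range

# A point 60 units in front of the camera has pointCamera.z = -60:
print(depth_value(-60.0))   # 0.5
print(depth_value(-20.0))   # 0.0  -> black at the near plane
print(depth_value(-100.0))  # 1.0  -> white at the far plane
```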

Almost done. We can set the near and far planes and everything should be converted into a range from 0 to 1. So let's put it all together:

declare phenomenon
    color "depth_phen" (
		scalar "near",	 #:default 20.0 shortname "near"
		scalar "far"	 #:default 100.0 shortname "far"
            )

   shader "samplerInfo"
	"maya_samplerinfo" ()

   shader "samplerInfoPointCamera"
	"maya_vector_to_xyz" (
		"vector" = "samplerInfo.pointCamera"
		)

  shader "multiplyDivideC"
	"maya_xyz_to_vector" (
		"x" = "samplerInfoPointCamera.z",
		"y" = "samplerInfoPointCamera.z",
		"z" = "samplerInfoPointCamera.z"
		)

 shader "multiplyDivide"
	"maya_multdiv" (
		"operation" 1,
		"input1" = "multiplyDivideC",
		"input2" -1. -1. -1.
		)


shader "nearXYZ"
	"maya_xyz_to_vector" (
		"x" = interface "near",
		"y" = interface "near",
		"z" = interface "near"
		)

shader "farXYZ"
	"maya_xyz_to_vector" (
		"x" = interface "far",
		"y" = interface "far",
		"z" = interface "far"
		)

shader "setRange"
	"maya_setrange" (
		"value" = "multiplyDivide.output",
		"min"  0.,
		"max" 1.,
		"oldMin"  = "nearXYZ",
		"oldMax" = "farXYZ"
		)
   shader "depthSurface" "maya_surfaceshader" (
		"outColor" 1. 0. 0.,
		"outTransparency" 0. 0. 0.,
		"outMatteOpacity" 1. 1. 1.,
		"outGlowColor" 0. 0. 0.
		)

   shader "shadingEngine" "maya_shadingengine" (
		"surfaceShader" = "depthSurface.outColor",
		"cutAwayOpacity" 0.,
		"alphaMode" 0
		)

    root = "shadingEngine"
         
    version 1
    apply material
end declare

So we are finally done. Almost... if you try to use this, you will see... well, nothing. Why is that? Because this shader seems to be loaded before all the other shaders, so it does not find the Maya shaders. Fortunately there is a small workaround: we can simply include Maya's declaration files like this:

$include "C:/....your path here.../maya2008/mentalray/include/mayabase.mi"

...

Now it should work. It does, but the image is red!

Why? Because we still have a small error in our phenomenon:

...
shader "setRange"
	"maya_setrange" (
		"value" = "multiplyDivide.output",
		"min"  0.,
		"max" 1.,
		"oldMin"  = "nearXYZ",
		"oldMax" = "farXYZ"
		)
...

The min/max values take colors as input, but we provided only one value. The other components are therefore automatically set to 0, which results in a color of 1.0 0.0 0.0, i.e. pure red. Change it to:

...
shader "setRange"
	"maya_setrange" (
		"value" = "multiplyDivide.output",
		"min"  0.0 0.0 0.0,
		"max" 1.0 1.0 1.0 ,
		"oldMin"  = "nearXYZ",
		"oldMax" = "farXYZ"
		)
...

And finally the result should be black and white.

 

Last Updated on Friday, 19 March 2010 17:04