

Why do different render engines generate different Z passes?


I've been using Blender to generate depth maps via the Z pass. I noticed that the Z passes generated by different render engines differ, which confused me a bit. My impression is that the Z pass generated by the Cycles renderer gives the distance from each pixel to the camera center, while Blender Render gives the orthogonal distance to the camera plane. Do I understand this correctly? If so, is there a way to change this behavior in either render mode?



As an example, the bottom area of the model is a flat surface at which the camera points perpendicularly. Below are the nodes I use to normalize and visualize the Z-pass data (the Viewer node is used to save the depth map).



[Image: nodes used to normalize and view the Z pass]



Z pass with Cycles render:

[Image: Z pass with Cycles render]

Z pass with Blender Render (same value everywhere in the flat bottom area):

[Image: Z pass with Blender Render]
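If Cycles does store the radial distance along each pixel's viewing ray, it can be converted to the planar (orthogonal) depth that Blender Render appears to produce, since every pixel's ray makes a known angle with the camera axis. Below is a minimal NumPy sketch of that conversion, not a built-in Blender feature; the resolution, focal length, and sensor width are hypothetical stand-ins for your camera settings.

    import numpy as np

    # Hypothetical camera settings -- substitute your own render/camera values.
    width, height = 960, 540          # render resolution in pixels
    focal_mm, sensor_mm = 35.0, 32.0  # Blender-style focal length and sensor width

    # Focal length expressed in pixels.
    focal_px = focal_mm / sensor_mm * width

    # Per-pixel offsets from the image center.
    xs = np.arange(width) - (width - 1) / 2.0
    ys = np.arange(height) - (height - 1) / 2.0
    u, v = np.meshgrid(xs, ys)

    # For a pinhole camera the viewing ray of a pixel is (u, v, focal_px), so
    # cos(angle to the optical axis) = focal_px / |(u, v, focal_px)|, and
    # planar depth = radial distance * that cosine.
    cos_theta = focal_px / np.sqrt(u**2 + v**2 + focal_px**2)

    radial = np.random.uniform(1.0, 10.0, (height, width))  # stand-in for a Cycles Z pass
    planar = radial * cos_theta

Going the other way, from planar to radial depth, divides by the same cosine term.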










Tags: cycles, rendering, blender-render, render-passes






asked Mar 12 at 16:12 by DingLuo

• Read this related link: Precision of z coordinate – cegaton, Mar 12 at 17:54

• Also related: Cycles generates distorted depth – cegaton, Mar 12 at 17:57


























1 Answer

When rendering a Z pass you are essentially creating a depth map from the camera's point of view. The issue is that there is a potentially infinite range of distances to represent, while a traditional 8-bit image offers only 256 shades of gray to map them to.



Depth can go from zero right at the camera (it is unlikely anything sits that close) out to whichever visible object is most distant. But there may also be a sky or "background" at a theoretically infinite distance.



There are several possible ways to map these shades of gray onto the distance progression, each with its own advantages.



It can be a linear mapping, where detail is distributed evenly across the whole image, but there can also be logarithmic mappings that emphasize detail in certain parts of the picture (see the sketch after the following list).



• You may want more detail at close range, where the image focus is likely to reside.

• The scene may require more detail at large distances if you are rendering a landscape or a distant view.

• You may want to use it for a mist pass, which requires detail at medium range.
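As an illustration of those trade-offs, here is a small NumPy sketch, my own addition rather than part of the answer's workflow, that maps a raw depth buffer into the 0..1 gray range either linearly or logarithmically; near and far are hypothetical clipping distances.

    import numpy as np

    def normalize_depth(z, near=0.1, far=100.0, log_scale=False):
        """Map raw depth values into the 0..1 gray range."""
        z = np.clip(z, near, far)  # tame the effectively infinite background
        if log_scale:
            # Logarithmic: spends more of the gray range on close distances.
            return np.log(z / near) / np.log(far / near)
        # Linear: detail is spread evenly over the whole near..far range.
        return (z - near) / (far - near)

    depth = np.random.uniform(0.5, 500.0, (540, 960))  # stand-in for a Z pass
    linear_gray = normalize_depth(depth)
    log_gray = normalize_depth(depth, log_scale=True)

A mist-style emphasis at medium range could similarly be obtained by running the linear result through an S-shaped curve.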

As far as I know, I would expect both Cycles and Blender Render to use the same "true distance to sensor", not a virtual orthographic plane passing through the sensor, but I may be wrong.



If that is indeed the case, or if you require a specific color progression or a custom mapping of values, you can construct your own artificial Z pass.



You can do so by making a basic emission shader with a circular black-to-white gradient mapped to the camera object.



Moving the camera should update the gradient's position. You can scale the gradient to accommodate your desired distance range, and drive it through a Color Ramp for a non-linear progression. A bpy sketch of this setup follows.
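Here is a minimal bpy sketch of such a material, assuming a scene camera named "Camera"; the node choices (a Spherical gradient driven by camera-object texture coordinates) are my reading of the setup described above, not code from the answer itself.

    import bpy

    # Build an emission material whose brightness encodes distance to the camera.
    mat = bpy.data.materials.new("FakeZPass")
    mat.use_nodes = True
    nodes, links = mat.node_tree.nodes, mat.node_tree.links
    nodes.clear()

    coord = nodes.new("ShaderNodeTexCoord")
    coord.object = bpy.data.objects["Camera"]   # gradient follows the camera

    grad = nodes.new("ShaderNodeTexGradient")
    grad.gradient_type = 'SPHERICAL'            # radial falloff around the camera

    ramp = nodes.new("ShaderNodeValToRGB")      # Color Ramp: remap, invert, or curve

    emit = nodes.new("ShaderNodeEmission")
    out = nodes.new("ShaderNodeOutputMaterial")

    links.new(coord.outputs["Object"], grad.inputs["Vector"])
    links.new(grad.outputs["Color"], ramp.inputs["Fac"])
    links.new(ramp.outputs["Color"], emit.inputs["Color"])
    links.new(emit.outputs["Emission"], out.inputs["Surface"])

Note that the Spherical gradient is white at the camera and fades to black one unit away, so scale the coordinates (for example with a Mapping node) to span your scene's depth range, and flip the Color Ramp if you prefer the usual near-dark convention. Applying this material to everything, for instance through a render layer's material override, and rendering then yields the artificial depth image.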






answered Mar 12 at 16:58 (edited Mar 12 at 18:22) by Duarte Farrajota Ramos



















