Real-time 2D manipulation of plausible 3D appearance using shading and geometry buffers

Carlos Jorge Zubiaga Peña

To cite this version: Carlos Jorge Zubiaga Peña. Real-time 2D manipulation of plausible 3D appearance using shading and geometry buffers. Other [cs.OH]. Université de Bordeaux, 2016. English. NNT: 2016BORD0178. HAL Id: tel-01486698, https://theses.hal.science/tel-01486698, submitted on 10 Mar 2017.

Thesis presented at Université de Bordeaux, École Doctorale de Mathématiques et d'Informatique, by Carlos Jorge Zubiaga Peña, to obtain the degree of Doctor, speciality: Computer Science.

Defense date: 7 November 2016

Examining committee:
Diego Gutierrez, Professor, Universidad de Zaragoza (Reviewer)
Daniel Sýkora, Associate Professor, Czech Technical University in Prague (Reviewer)
Pascal Guitton, Professor, Université de Bordeaux (President)
David Vanderhaeghe, Associate Professor (Maître de Conférences), Université de Toulouse (Examiner)
Xavier Granier, Professor, Institut d'Optique (Examiner)
Pascal Barla, Research Scientist (Chargé de recherche), Inria (Advisor)

2016

Abstract

Traditional artists paint directly on a canvas and create plausible appearances of real-world scenes. In contrast, Computer Graphics artists define objects in a virtual scene (3D meshes, materials and light sources) and use complex algorithms (rendering) to reproduce their appearance. On the one hand, painting techniques allow appearance to be defined freely. On the other hand, rendering techniques allow the different elements that compose the scene to be modified separately and dynamically.

In this thesis we present a middle-ground approach to manipulating appearance: we offer 3D-like manipulation abilities while working in 2D space. We first study the impact of materials on shading, treating materials as band-pass filters of lighting. We present a small set of local statistical relationships between material/lighting and shading. These relationships are used to mimic modifications of material or lighting from an artist-created image of a sphere. Techniques known as LitSpheres/MatCaps use such images to transfer their appearance to arbitrarily shaped objects. Our technique demonstrates the possibility of mimicking 3D-like modifications of light and material from an input 2D artwork. We then present a different technique to modify the third element involved in the visual appearance of an object: its geometry. In this case we use rendered images as input, alongside 3D information of the scene output in so-called auxiliary buffers. We are able to recover geometry-independent shading for each object surface, assuming no spatial variations of lighting over each recovered surface. The recovered shading can be used to modify the local shape of the object arbitrarily and interactively, without the need to re-render the scene.
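Since the manipulation techniques in this thesis build on LitSphere/MatCap shading, a minimal sketch of the generic MatCap lookup may help fix ideas: an artist-painted sphere image is transferred to an arbitrary object by indexing it with a view-space normal buffer. This is only the standard transfer step that LitSphere/MatCap tools share, not the thesis pipeline itself; the array names and shapes below are assumptions.

```python
import numpy as np

def apply_matcap(matcap, normals):
    """Transfer the appearance painted on a sphere image (MatCap) to an
    arbitrary object, given its view-space normal buffer.

    matcap  : (H, W, 3) float array, artist-painted sphere image.
    normals : (h, w, 3) float array, unit view-space normals per pixel
              (zero where there is no geometry).
    Returns an (h, w, 3) shaded image.
    """
    H, W, _ = matcap.shape
    # The classic MatCap lookup: the x/y components of the view-space
    # normal index directly into the painted sphere.
    u = (normals[..., 0] * 0.5 + 0.5) * (W - 1)
    v = (0.5 - normals[..., 1] * 0.5) * (H - 1)   # flip y for image rows
    iu = np.clip(u.round().astype(int), 0, W - 1)
    iv = np.clip(v.round().astype(int), 0, H - 1)
    shaded = matcap[iv, iu]
    # Mask out background pixels (zero-length normals).
    mask = np.linalg.norm(normals, axis=-1) > 1e-6
    return shaded * mask[..., None]
```

The thesis goes further by decomposing and editing such sphere images; the lookup above is only the final appearance-transfer step.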
Keywords: Appearance, shading, pre-filtered environment map, MatCap, Compositing

Résumé

Traditional artists paint directly on a canvas and create plausible appearances of scenes that resemble the real world. In contrast, Computer Graphics artists define objects in a virtual scene (3D meshes, materials and light sources) and use complex algorithms (rendering) to reproduce their appearance. On the one hand, painting techniques allow appearance to be defined freely. On the other hand, rendering techniques allow the different elements that define appearance to be modified separately and dynamically.

In this thesis we present an intermediate approach to manipulating appearance, which allows certain 3D manipulations while working in 2D space. We first study the impact of materials on shading, treating materials as band-pass filters of lighting. We then present a small set of local statistical relationships between material/lighting and shading. These relationships are used to mimic modifications of material or lighting in an artist-created image of a sphere. Techniques known as LitSpheres/MatCaps use this kind of image to transfer their appearance to arbitrarily shaped objects. Our technique demonstrates that 3D modifications of light and material can be mimicked from a 2D image. We then present a different technique to modify the third element involved in the visual appearance of an object, its geometry. In this case, rendered images are used as input together with auxiliary images that contain 3D information about the scene. We recover geometry-independent shading for each surface, which requires the assumption that lighting does not vary spatially over each surface. The recovered shading can be used to modify the local shape of the object arbitrarily and interactively, without the need to re-render the scene.

Mots-clés: Appearance, shading, pre-filtered environment maps, MatCap, Compositing

Host laboratory: LaBRI

Contents

1 Introduction
  1.1 Context
    1.1.1 Painting
    1.1.2 Rendering
    1.1.3 Compositing
    1.1.4 Summary
  1.2 Problem statement
  1.3 Contributions

2 Related Work
  2.1 Shading and reflectance
  2.2 Inverse rendering
  2.3 Pre-filtered lighting
  2.4 Appearance manipulation
  2.5 Visual perception
  2.6 Summary
3 Statistical Analysis
  3.1 BRDF slices
    3.1.1 View-centered parametrization
    3.1.2 Statistical reflectance radiance model
  3.2 Fourier analysis
    3.2.1 Local Fourier analysis
    3.2.2 Relationships between moments
  3.3 Measured material analysis
    3.3.1 Moments of scalar functions
    3.3.2 Choice of domain
    3.3.3 BRDF slice components
    3.3.4 Moment profiles
    3.3.5 Fitting & correlation
  3.4 Discussion

4 Dynamic Appearance Manipulation of MatCaps
  4.1 Appearance model
    4.1.1 Definitions
    4.1.2 Energy estimation
    4.1.3 Variance estimation
  4.2 MatCap decomposition
    4.2.1 Low-/High-frequency separation
    4.2.2 Spherical mapping & reconstruction
  4.3 Appearance manipulation
    4.3.1 Lighting manipulation
    4.3.2 Material manipulation
  4.4 Results and comparisons
  4.5 Discussion

5 Local Shape Editing at the Compositing Stage
  5.1 Reconstruction
    5.1.1 Diffuse component
    5.1.2 Reflection component
  5.2 Recompositing
  5.3 Experimental results
  5.4 Discussion and future work

6 Conclusions
  6.1 Discussion
    6.1.1 Non-radially symmetric and anisotropic materials
    6.1.2 Shading components
    6.1.3 Filling-in of missing shading
    6.1.4 Visibility and inter-reflections
  6.2 Future work
    6.2.1 Extended statistical analysis
    6.2.2 Spatially-varying shading
    6.2.3 New applications

A Skewness and Kurtosis Analysis of Measured BRDFs

Chapter 1

Introduction

One of the main goals of image creation in Computer Graphics is to obtain a picture which conveys a specific appearance. We first introduce the two general approaches to image creation in Section 1.1: either directly painting the image in 2D, or rendering a 3D scene. We also present middle-ground approaches which work in 2D on images containing 3D information. It is important to note that our work takes place in this middle ground. We define our goal in Section 1.2 as 'granting 3D-like control over image appearance in 2D space'. This goal emerges from the limitations of existing techniques for manipulating 3D appearance in existing 2D images: painted images lack any kind of 3D information, while only partial geometric information can be output by rendering. In either case, the available information is not enough to fully control 3D appearance. Finally, in Section 1.3 we present the contributions brought by this thesis.

1.1 Context

Image creation can be done using different techniques. They can be gathered into two main groups, depending on whether they work in the 2D image plane or in a 3D scene. On the one hand, traditional painting and modern digital painting software work directly in 2D by assigning colors to a plane. On the other hand, artists create 3D scenes by defining and placing objects and light sources; the 3D scene is then captured into an image by a rendering engine which simulates the process of taking a picture. There also exist techniques in between that use 3D information associated with 2D images to create or modify the colors of the image.

1.1.1 Painting

Traditional artists create images of observed or imagined real-world scenes by painting. These techniques are based on the deposition of colored paint onto a solid surface. Artists may use different kinds of pigments or paints, as well as different tools to apply them, from brushes to sprays or even body parts. Our perception of the depicted scene depends on intensity and color variations across the planar surface of the canvas. Generated images may be abstract or symbolic, but we are interested in the ones that can be considered natural or realistic. Artists are capable of depicting plausible appearances of the different elements that compose a scene. The complexity of reality is well captured by the design of object shape and color. Artists achieve good impressions of a variety of materials under different lighting environments. This can be seen in Figure 1.1, where different objects are shown, ranging from organic to hand-crafted.

Figure 1.1: Still-life is a work of art depicting inanimate subjects (Bodegón by Francisco de Zurbarán; Still-life by Pieter Claesz; Attributes of Music by Anne Vallayer-Coster). Artists are able to achieve a convincing appearance from which we can infer the material of the different objects.

Nowadays, painting techniques have been integrated into computer systems. Classical physical tools, like brushes or pigments, have been translated into digital ones (Figure 1.2). Moreover, digital systems provide a large set of useful techniques such as layers, selections, simple shapes, etc. They also provide a set of image-based operators that allow artists to manipulate color in more complex ways, like texturing, embossing or blurring. Despite the differences, both classical painting and modern digital systems share the idea of working directly in image space.
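As an illustration of what such an image-based operator amounts to, the following is a minimal box-blur sketch in Python/NumPy: each pixel is replaced by the average of its square neighbourhood. It is only an example of the kind of pixel-level color manipulation these packages expose; the function name and parameters are illustrative, not taken from any particular software.

```python
import numpy as np

def box_blur(image, radius=2):
    """Naive box blur: every pixel becomes the average of its
    (2*radius+1)^2 neighbourhood, applied per channel.

    image : (H, W, 3) float array in [0, 1].
    """
    k = 2 * radius + 1
    # Replicate the border so the output keeps the same size.
    padded = np.pad(image, ((radius, radius), (radius, radius), (0, 0)),
                    mode="edge")
    out = np.zeros_like(image)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + image.shape[0], dx:dx + image.shape[1]]
    return out / (k * k)
```

Real packages implement such filters far more efficiently (separable or GPU-based), but the principle is the same one the text makes: pure per-pixel color manipulation, with no 3D information involved.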
(a) ArtRage™ (b) Photoshop™
Figure 1.2: Computer systems offer a complete set of tools to create images directly in image space. They provide virtual versions of traditional painting tools, such as different kinds of brushes or pigments, as can be seen in the interface of ArtRage™ (a). On the right (b) is the interface of one of the most well-known image editing packages, Photoshop™. They also provide tools that could not exist in traditional painting, like working with layers, different kinds of selections, or image modifications such as scaling or rotation.

Artists are able to depict appearances that look plausible, in the sense that they look real even if they are not physically correct. Although we may perceive the painted objects as if they were or could be real, artists do not control physical processes; they only manipulate colors, either by painting them or by performing image-based operations. They use color variations to represent objects made of different materials and how these objects would behave under different illuminations. The use of achromatic variations is called shading; it is used to convey volume or light-source variations (Figure 1.3), as well as material effects. Shading may also correspond to variations of color, so we can refer to shading in either a colored or a grey-scale image.

Figure 1.3: Shading refers to depicting depth in 3D models or illustrations by varying levels of darkness. It makes it possible to perceive volume and infer lighting direction. Images are property of Kristy Kate, http://kristykate.com/.

In real life, the perceived color variations of an object result from the interaction between lighting, the object's material properties and its shape. Although these interactions are difficult to understand, artists are able to give good impressions of materials and of how they would look under given illumination conditions. However, once a digital painting is created it cannot be modified afterwards: shape, material and lighting cannot be manipulated.

1.1.2 Rendering

Contrary to 2D techniques, computer graphics provides an environment where artists define a scene based on physical 3D elements and their properties. Artists manipulate objects and light sources: they control an object's shape (Fig. 1.4b) and material (Fig. 1.4c) and the type of light sources (Fig. 1.4a), as well as their positions. When artists are satisfied with the scene definition, they select a point of view to take a snapshot of the scene and obtain an image as a result.

(a) Light sources (b) 3D meshes (c) Materials
Figure 1.4: A 3D scene is composed of lights and objects, where lights may vary in type (a): ambient, point, directional, area, etc. Objects are defined by their geometry, given as (b) 3D meshes, and by (c) their materials.

The creation of 2D images from a 3D scene is called rendering. Rendering engines are software frameworks that use light transport to shade a scene. The theory of light transport defines how light is emitted from the light sources, how it interacts with the different objects of the scene, and finally how it is captured on a 2D image plane. In practice, light rays are traced from the point of view, one or more per pixel of the image. When a ray reaches an object surface it is either reflected, transmitted or absorbed, see Figure 1.5a. Rays continue their path until they reach a light source or disappear through absorption, loss of energy, or a limit on the number of reflections/refractions. Rays can also be traced from the light sources instead.
Rendering engines usually mix both techniques by tracing rays from both directions, as shown in Figure 1.6.
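To make the eye-ray side of this process concrete, the following is a deliberately minimal sketch in Python: one primary ray per pixel, a single sphere and point light, diffuse shading plus one mirror term, and a recursion limit standing in for absorption and energy loss. The scene, constants and structure are illustrative assumptions, not the rendering engines discussed in this thesis.

```python
import numpy as np

# Tiny illustrative scene: one sphere, one point light.
SPHERE_C = np.array([0.0, 0.0, -3.0])
SPHERE_R = 1.0
ALBEDO = np.array([0.8, 0.3, 0.3])
REFLECTIVITY = 0.3
LIGHT_P = np.array([2.0, 2.0, 0.0])

def hit_sphere(o, d):
    """Distance to the sphere along the ray o + t*d (unit d), or None."""
    oc = o - SPHERE_C
    b = np.dot(oc, d)
    disc = b * b - (np.dot(oc, oc) - SPHERE_R ** 2)
    if disc < 0.0:
        return None
    t = -b - np.sqrt(disc)
    return t if t > 1e-4 else None

def trace(o, d, depth=0):
    """Follow a ray until it misses, or stop after a few bounces."""
    if depth > 2:                          # limited number of reflections
        return np.zeros(3)
    t = hit_sphere(o, d)
    if t is None:
        return np.array([0.1, 0.1, 0.2])   # background stands in for the environment
    p = o + t * d
    n = (p - SPHERE_C) / SPHERE_R
    l = LIGHT_P - p
    l = l / np.linalg.norm(l)
    diffuse = ALBEDO * max(0.0, np.dot(n, l))      # light reflected toward the eye
    r = d - 2.0 * np.dot(d, n) * n                 # mirror reflection direction
    return (1 - REFLECTIVITY) * diffuse + REFLECTIVITY * trace(p, r, depth + 1)

def render(w=64, h=64):
    img = np.zeros((h, w, 3))
    for y in range(h):
        for x in range(w):
            # One primary ray per pixel, shot from the point of view.
            d = np.array([(x + 0.5) / w - 0.5, 0.5 - (y + 0.5) / h, -1.0])
            img[y, x] = trace(np.zeros(3), d / np.linalg.norm(d))
    return img

if __name__ == "__main__":
    print(render().shape)   # (64, 64, 3) float image
```

Production renderers of course handle many materials, light types and sampling strategies, and may trace from the lights as well; the point here is only the per-pixel recursive structure described above.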
