Unity Shader Graph Procedural Planet Tutorial


  • Map 3D Simplex noise to a sphere,
  • Generate planet terrains by using the noise for vertex displacement,
  • Map textures to the terrain based on the height of the terrain,
  • Use a gradient and noise to map different textures to different biomes,
  • Add different noise settings for each biome,
  • Add different noise types to choose for each biome with enums.

0. Introduction

Welcome!!!!

In this tutorial we will create a procedural planet shader using Unity’s node-based shader creation tool Shader Graph.
By using home-made custom function nodes we will be able to neatly map 3D Simplex and Cellular noise to the surface of a sphere and use it to create cool random (coloured) noise patterns:

We will use the noise to displace the vertices of the (high-poly) sphere to generate terrain on top of the sphere:

We will also use the noise to blend between multiple terrain textures to make it look instantly game-ee:

We will generate different biomes using noise to sample a gradient, for both the textures and the displacement of the terrain:

And we will also use the biome gradient to generate different noise settings and noise types for each biome!:


By doing this tutorial you will learn:

  • How to map 3D Simplex noise to a sphere.
  • How to generate terrain on a sphere with noise.
  • How to use vertex displacement.
  • How to blend textures over the height of the terrain.
  • How to create a noise gradient to map different textures to different biomes.
  • How to blend two gradients together so that we can overcome the max 8 gradient key limitation.
  • How to implement different noise settings for each biome.
  • How to implement different noise types for each biome.

There is a lot to cover: many properties to balance, nodes to connect and textures to find if we want to have a lot of variety, so the steps are going to be big and the explanations kinda brief.. But we’re gonna get ‘er done, boys ‘n girls!!… Wicked..

This tutorial was written with Unity 2019.4.9f1 LTS using the URP (Universal Render Pipeline) but should also work with the HDRP and with Unity 2020.x.x and later releases.

0.1 Tutorial Materials

CubeSphere

Because this shader uses vertex displacement to generate terrain height on a sphere we cannot use the default Unity sphere: it doesn’t have a high enough poly count to show detailed terrain.
UV-spheres also have most of their vertices bunched around the poles, so it is better to use a so-called CubeSphere (a subdivided and then ‘spherized’ cube mesh) for this shader. CubeSpheres have a much more even distribution of vertices, giving us a much more uniform-looking displacement.
You can create a CubeSphere yourself in Blender or any other 3D modelling application (make sure it is smooth shaded) or you can download the one used for this tutorial from the links down below:

For the .fbx mesh Unity doesn’t have to convert the units so make sure Convert Units is unchecked in the import settings.

Custom Nodes

Because Shader Graph doesn’t have a three-dimensional noise node by default this shader uses custom Simplex3D and later Cellular3D noise nodes. (After importing the custom nodes into your project you can find them in the Shader Graph nodes creation menu under Custom/Procedural/Noise):

If you just want to copy/paste the code and make the custom function nodes yourself, or if you want to see my other home-brewed nodes, have a look at this post:
Unity Shader Graph Custom Function Nodes Collection

Textures

The textures used for this tutorial were mostly downloaded from the Unity Asset Store. For cartoony/hand drawn-looking textures you can search the Asset Store or the internet for ‘stylized’ or ‘hand-painted’ textures. Also make sure that the textures are all tile-able/seamless.

I personally really like the stylized textures made by LOWPOLY: https://assetstore.unity.com/publishers/16677

1.0 Mapping 3D Simplex Noise to a Sphere

This Shader Graph uses the vertex Position in object space as input for the position of the simplex noise and then uses the noise for vertex displacement.
To map a coloured gradient to the displacement we can sample the gradient with the Length of the Position, but because the Length of the Position without any displacement is already 1 (for a sphere with a radius of 1), we first subtract 1 from the length before sampling the gradient. Then we divide the result by the Displacement Amplitude to evenly distribute the gradient over the height of the displacement:
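
In rough HLSL terms this is what the node graph boils down to (just a sketch; the function and parameter names below are mine, not actual nodes, and it assumes the Simplex3D node returns a value in the 0..1 range):

// Sketch only: assumes a unit sphere (radius 1).
float3 DisplaceVertex(float3 positionOS, float displacementAmplitude)
{
    float noise = Simplex3D(positionOS); // 3D noise sampled at the object-space vertex position
    // Push the vertex outwards along its own direction (for a unit sphere this is also its normal).
    return positionOS + positionOS * noise * displacementAmplitude;
}

// Subtract the base radius (1) from the length of the displaced position and divide by the amplitude,
// so the result runs from 0 (no displacement) to 1 (maximum displacement) for sampling the gradient.
float GradientSampleValue(float3 displacedPositionOS, float displacementAmplitude)
{
    return (length(displacedPositionOS) - 1.0) / displacementAmplitude;
}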

If you want the Main Preview window to show the planet correctly make sure to right-click on the Main Preview window and select the custom high-resolution CubeSphere mesh!

2.0 Texture Mapping

Instead of sampling a colour gradient we Lerp from texture to texture over the height of the displacement (the Length of the vertex Position), blending each texture in based on the displacement height.
To control where the textures start and end we can use different black & white gradients. The first gradient has a black key on 17.5% and a white key on 22.5%, leaving 5% for the blending in between.
The second gradient has black on 37.5% and white on 42.5% and so on..:

Using gradients like this is great because it makes it visually very clear what is happening. The downside is that we cannot expose gradients so in step 2.2 we’ll switch to using SmoothStep nodes instead…

2.1 Adding normal maps

We can use exactly the same method that we used for mapping Albedo textures to the planet for the normal maps (and the same method could also be applied to Metallic, Smoothness and Occlusion maps):

Make sure to set the Type of the Sample Texture 2D nodes to Normal and also set the normal map texture properties Mode to Bump on the BlackBoard:

If the textures that you’re using didn’t come with normal maps included you can let Unity create normal maps for you. All you have to do is duplicate the Albedo textures in the Project View and mark them as normal maps in the Import settings, where you can also let Unity create the normal maps from greyscale. These auto-generated normal maps are usually not as superb as hand-crafted, fine-tuned normal maps, but definitely better than nothing!

2.2 Texture Blend Start/End and Noise Power properties

We can use SmoothStep nodes instead of gradients for the texture blending and we can use a Power node instead of a gradient to control the steepness of the displacement. This way we can use exposed Vector2 properties for the start and end of the texture blending and an exposed Vector1 for the Noise Power.
Not only do we give the user of the shader control over those properties this way, we can also decide what the values should be on a per-material basis without having to create multiple shader graph variants, which is cool if we’re going to procedurally generate different planets later on!:
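
In HLSL terms this is roughly what the SmoothStep blend and the Noise Power shaping do (a sketch with made-up names, not the code Shader Graph generates):

// height01: the 0..1 height value from the Length/Subtract/Divide chain above
// blendRange: exposed Vector2 property, x = Texture Blend Start, y = Texture Blend End
float3 BlendTexturesByHeight(float3 lowerTexture, float3 upperTexture, float height01, float2 blendRange)
{
    float blend = smoothstep(blendRange.x, blendRange.y, height01); // 0 below the start, 1 above the end
    return lerp(lowerTexture, upperTexture, blend);
}

// noisePower: exposed Vector1 property; raising the 0..1 noise to a power > 1
// flattens the low areas and keeps only the peaks steep.
float ShapeDisplacement(float noise01, float noisePower)
{
    return pow(noise01, noisePower);
}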

3.0 Adding detail to the terrain with layered noise

To get a more detailed noise and terrain we can use layered noise instead of single noise.
The Layered Simplex3D node uses the same noise function as the non-layered Simplex3D node but instead of sampling the noise function directly the Layered Simplex3D node uses an Evaluate function that samples the noise multiple times in a loop, multiplying the frequency and amplitude of the layer with each iteration:

float4 EvaluateLayeredNoise(float3 p, float3 offset, float strength, int octaves, float baseRoughness, float roughness, float persistence){
    float4 noiseVector = 0.0;           // accumulated noise (value plus gradient, packed in a float4)
    float frequency = baseRoughness;    // scale of the first octave
    float amplitude  = 1.0;             // contribution of the first octave

    for(int i=0; i<octaves; i++){
        float4 n = snoise_grad(p * frequency + offset); // sample the Simplex noise at the current frequency
        noiseVector += (n+1.0) * 0.5 * amplitude;       // remap from -1..1 to 0..1 and add the weighted octave
        frequency *= roughness;                         // each octave gets smaller features
        amplitude *= persistence;                       // and contributes less to the total
    }
    return noiseVector * strength;
}

So Layered Simplex3D noise is basically the same noise layered multiple times, with each layer/octave adding more detail to the overall noise structure. This way we can have shapes within shapes within even smaller shapes; for example: octave one could be mountains, octave two large boulders and octave three small rocks.
The amount of layers the noise generates is controlled by the number of Octaves, the Base Roughness controls the base scale of the noise, the Roughness controls the scale of each successive layer and the Persistence controls how much each octave contributes to the overall structure of the noise map:

Keep in mind that very detailed noise also needs a very high-resolution mesh to show properly. It’s no use having very detailed noise with many octaves if the mesh resolution is low, so it is important to find a balance!

Here you can see the difference in one octave of Simplex noise and layered simplex noise with two and three octaves:


4.0 Generating biomes with noise and gradients

To create biomes for our planet we can use the separate RGB channels of a gradient that we sample over the y-axis of the Position for three different biomes. We can use Red for the north and south pole, Green for forest and Blue for desert biomes.
We could sample the gradient with the y-Position directly but then we would get biomes with straight horizontal edges. To make the biomes and the transitions from biome to biome more random looking we can use noise and multiply the y-Position by it before sampling the biome gradient.
To control how rough the biome edges are and to control the noise sampling position we can use properties to scale and offset the biome gradient noise, much in the same way that we did for the terrain noise:
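
As a rough HLSL sketch (the names are invented here, and SampleBiomeGradient is just a hypothetical stand-in for the Sample Gradient node):

// Sketch: compute the R/G/B biome weights for a vertex on a unit sphere.
// positionOS.y runs roughly from -1 (south pole) to +1 (north pole).
float3 BiomeWeights(float3 positionOS, float biomeNoiseScale, float3 biomeNoiseOffset)
{
    // Noise used to roughen the biome borders; scale and offset are exposed properties.
    float n = Simplex3D(positionOS * biomeNoiseScale + biomeNoiseOffset);
    float y = positionOS.y * n; // multiply the y-Position by the noise before sampling the gradient
    // Hypothetical stand-in for the Sample Gradient node:
    // returns the R/G/B biome weights (poles / forest / desert) for a 0..1 input.
    return SampleBiomeGradient(saturate(y * 0.5 + 0.5));
}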

When we have the biome gradient ready we can Split the output into its separate RGB components and Multiply the textures from the first biome with the Red channel, the textures from the second biome with the Green channel and the textures from the third biome with the Blue channel. Then we just add them all up.
The same method also applies to the normal maps:
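
The Split/Multiply/Add part, sketched in HLSL (invented names again; the same function works for the blended normal maps):

// biomeWeights: the RGB output of the biome gradient (R = polar, G = forest, B = desert).
// Each input colour is that biome's already height-blended albedo (or normal).
float3 BlendBiomes(float3 biomeWeights, float3 polarColor, float3 forestColor, float3 desertColor)
{
    return polarColor  * biomeWeights.r
         + forestColor * biomeWeights.g
         + desertColor * biomeWeights.b;
}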

The complete graph now looks like this:

As you can see all three biomes use their own separate textures and normal maps so new Texture, Normal map, Texture Tiling, Texture Offset and Texture Blend Start/End properties have to be added to the blackboard for the two new biomes. The properties of the already existing biome have to be renamed.

Every texture in the graph uses its own texture sampler/Sample Texture 2D node but there is a limit of 16 texture samplers that we can use at once. There are two ways around this limitation, by using Texture2D Arrays (which would have to be created by scripting) or by using a Sampler State.
By connecting a Sampler State property node to multiple texture samplers in the graph we’re basically telling Unity to use the same texture sampler instance for all the textures that are being sampled with samplers that use the same Sampler State:
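
For reference, this is roughly what sharing one sampler between several textures looks like in plain HLSL (not the code Shader Graph generates, and the texture names are hypothetical; it just shows the idea behind the Sampler State node):

Texture2D _GrassTexture;
Texture2D _RockTexture;
Texture2D _SnowTexture;
SamplerState _SharedLinearRepeatSampler; // a single sampler instance reused for every texture

float3 SampleWithSharedSampler(float2 uv)
{
    // Three textures, but only one of the 16 available sampler slots is used.
    float3 grass = _GrassTexture.Sample(_SharedLinearRepeatSampler, uv).rgb;
    float3 rock  = _RockTexture.Sample(_SharedLinearRepeatSampler, uv).rgb;
    float3 snow  = _SnowTexture.Sample(_SharedLinearRepeatSampler, uv).rgb;
    return (grass + rock + snow) / 3.0; // placeholder combine; the graph blends by height instead
}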

New textures have to be found for the polar and desert biomes as well. Each biome can use 5 different textures but we can use some of the same textures as we use for the forest biome as well. For the polar biome the example shader uses a blue ice texture and it uses a snow texture a couple of times. The desert uses a rippled sand texture plus the dirt and sand textures from the forest biome.

To learn more about Sampler States see the Shader Graph Sampler State Node documentation:
https://docs.unity3d.com/Packages/com.unity.shadergraph@6.9/manual/Sampler-State-Node.html
and the Using Sampler States page from the Unity manual:
https://docs.unity3d.com/Manual/SL-SamplerStates.html

To learn more about Texture 2D Arrays see the Unity Texture 2D Array manual:
https://docs.unity3d.com/Manual/class-Texture2DArray.html
and the Texture2DArray scripting API:
https://docs.unity3d.com/ScriptReference/Texture2DArray.html

4.1 Double Biome Gradient

Because we are limited to 8 keys per gradient it is not really possible to generate more than four biome zones on the planet and still have short transitions between them. That would be fine if we only needed four zones, but it would be nicer if we could have five, for example two polar biomes, two forest biomes and one desert biome in the centre.
To get past the 8 key limit we can use two gradients, one for the top half of the planet and the other for the bottom half:

If we set up both gradients with keys on exactly the same positions but opposite, we can still use a high Biomes Edge Noise Strength to get nice random biome regions without visual artefacts:
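
One way to read that setup, sketched in HLSL (the two Sample Gradient nodes are stood in for by hypothetical helpers; the graph wires this up with nodes rather than a branch):

// Sketch: each gradient only has to describe half of the planet, so together they can
// hold more biome zones than a single 8-key gradient allows.
float3 DoubleGradientBiomeWeights(float yNoisy01)
{
    float3 topWeights    = SampleTopBiomeGradient(yNoisy01);    // keys laid out for the northern half
    float3 bottomWeights = SampleBottomBiomeGradient(yNoisy01); // mirrored keys for the southern half
    return yNoisy01 >= 0.5 ? topWeights : bottomWeights;        // pick the gradient for this hemisphere
}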

To debug the biome gradient, temporarily plug it into the Color input of an Unlit Master node and right-click on it to set it active:

5.0 Biome Noise Settings

Now that we have different textures for each biome it makes a lot of sense to add different noise properties for each biome as well!

To generate noise with different settings for each biome we start with the vertex displacement function at the top, which is going to be for the polar biome only. We Multiply the displacement by the R channel of the biome gradient (using a Split node) before adding the vertex Position:

Now we have to create new noise and vertex displacement functions, with new separate properties, for the other two biomes as well.
The vertex displacement for the forest biome is multiplied by the G channel of the biome gradient. (Notice that we don’t need to add the Position twice):

The Vertex Displacement for the desert biome is multiplied by the B channel of the biome gradient:

To blend the three biome displacement functions we can simply Add them together:
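
Sketched in HLSL (names invented), the combined displacement is just the weighted sum of the three biome displacements, added to the vertex Position once:

// Each biome's displacement is computed with its own noise settings and then weighted
// by its channel of the biome gradient, so it only shows up where that biome is visible.
float3 CombineBiomeDisplacement(float3 positionOS, float3 biomeWeights,
                                float polarDisplacement, float forestDisplacement, float desertDisplacement)
{
    float totalDisplacement = polarDisplacement  * biomeWeights.r
                            + forestDisplacement * biomeWeights.g
                            + desertDisplacement * biomeWeights.b;
    return positionOS + positionOS * totalDisplacement; // the Position is added only once
}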

To make the textures for the forest and desert biomes work with their separate displacement functions, make sure to divide the length by the Forest Biome Displacement Amplitude for the forest biome textures and by the Desert Biome Displacement Amplitude for the desert biome textures:

Because we now have separate noise and displacement functions for each biome we also have to take that into account for the Occlusion, so we create two new Occlusion functions and blend the three together in the same way as with the textures and normal maps:

The complete graph with different noise settings for each biome now looks like this:

6.0 Multiple Biome Noise Types

Cool!! Our planets are really starting to look like something now but wouldn’t it be great to have some completely different noise types that we can pick for each biome? I think it would!.. By using exposed Enum Keywords we can switch between different noise nodes in our graph:

(At the time of writing the enum keyword functionality of Shader Graph is kind of buggy; sometimes it works and sometimes it doesn’t, so use this at your own risk. I’ve found that sometimes after creating an enum it would work until I changed a reference name or reference suffix of one of the enum entries, so it might take a couple of retries before it works. My advice for when you want to use enums is to keep the default reference name and reference suffix and to not use too many Enum Keywords, because there is a limit to how many shader variants Unity can create.
Also, sometimes it helps to disconnect and reconnect the enum nodes after changing stuff, or to completely recreate the enum keyword property on the blackboard.)

For the example I’ve chosen to use the F2 minus F1 value of the Cellular 3D noise, because I like that it looks like cracked ground or ice, but there are many different ways to use Cellular or Simplex noise for different Biome Noise Types… So experiment with it!:
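
Assuming the custom Cellular3D node returns the distances to the two nearest feature points (F1 and F2), the value used here is simply their difference; a hedged sketch:

// F2 - F1 is close to zero along the cell borders and larger inside the cells,
// which reads as cracked ground or ice once it drives the displacement.
float CellularCrackNoise(float3 positionOS, float scale, float3 offset)
{
    float2 f = Cellular3D(positionOS * scale + offset); // assumed to return float2(F1, F2)
    return f.y - f.x;
}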

The complete graph with different noise types for each biome:

7.0 Finally

Nice Work!! We just created our own personal procedural tiny planet shader!.. I mean who doesn’t like tiny planets right??.. Maybe now our parents will understand!!..

I really do hope that you’ve enjoyed this tutorial and if you did then you’re more than welcome to try my other tutorials. Maybe have a look at my previous tutorials about making a cool Starfield Shader Graph or a complete Skybox Shader Graph plus day/night cycle C# script over here:

Finally.. What would you like to learn next in a future tutorial? Or what did you miss in this one? Feel free to add any suggestions that you might have in the comments!

(In the planned part two of this procedural planet tutorial we will write a C# script to generate a multitude of planets that all have different properties and terrains using only a single material with the procedural planet shader, by using material instancing..So stick around!)

BUY ME A COFFEE

Donate $5 to buy me a coffee so I have the fuel I need to keep producing great tutorials!


Downloads

Unity Shader Graph Procedural Planet Tutorial Package
Shader Graph Terrain Slope Mapping Comment Package


24 comments

  1. Why do you use 3D simplex noise if you only use 2D textures – I am confused.. isn’t 3D noise used for volumetric stuff… surely planet terrain is done only at the 2D level?


    • 2D noise would be fine if the terrain was a flat surface, but using 2D noise on the surface of a (volumetric) sphere without getting obvious stretching and seams is much harder. Think of the difference between trying to gift wrap a ball with a flat piece of paper with a pattern on it (2D noise) and sculpting or scooping a sphere out of a block of Swiss cheese where the holes are the pattern (3D noise). You wouldn’t see any seams in the pattern of the holes in the cheese ball, but getting the pattern on a flat piece of paper to line up when wrapped around a ball is much harder.
      The 3D noise in the shader is used for the shape of the terrain, and the height/strength of the terrain/noise for the visibility of the 2D textures that are just UV-mapped onto it.

      The 2D terrain textures just use the object’s UVs, but since it is a CubeSphere with the UV mapping of a cube there is still one seam in the 2D textures at the top or bottom face of the ‘cube’.. It is not perfect, but it is procedural and I think a good balance. I chose to use 2D textures for the looks, ease of use and because they’re ‘cheap’ to use, but it would be even better to procedurally generate the textures by sampling the 3D noise with the object position as well!


      • Ah I see, one thing I am trying to work out how to do is manipulate noise to generate planets on this kinda scale:

        I can do basic noise manipulations but I have no idea how to learn to gain more control over things like continents/islands and blends between biomes on that kinda scale.

        If I am generating planets from a distance (and thus don’t need vertex displacement) do I still need 3D noise or would 2D noise be sufficient?


        • Looks really cool what the video shows! Don’t know what the actual scale of the planets in the video is, but it looks large probably because the noise uses a lot of octaves, so you’d need layered noise for that kind of detail.
          For the Albedo color you can multiply the RGB channels by different amounts of noise. For the emissive lights etc you can generate an emission map and to add some pseudo depth to the terrain you can generate a normal map, all with the same noise but with different ‘levels’.
          2D-noise is not tile-able around the surface of a sphere so I wouldn’t know how to get rid of the seams.
          I’m no expert on biomes yet, this shader was the first time I got to that point :) but it is basically just multiplying different kinds of noise, or the same noise with different settings, by some other noise map for the biomes. You could repeat it as a pattern and have biomes within biomes as well:

          You can get more control over continents and islands etcetera by using different settings for the layered noise, like the roughness, base roughness or persistence, and by using other gradients, smoothstep or lerp nodes to ‘cut off’ the noise at different levels. It is a lot of fine-tuning, but you can store different noise settings you like in a script and use those as a starting point for each biome.


  2. Thanks for the reply and information. Would love to learn more about the art of manipulating noise like this, seems you understand it a lot better than I do!


    • This looks pretty cool! There are not many resources on spherical terrain shaders. Is there a download link to the final graph?
      Also, it would be cool (and more realistic) to have a different texture on steep slopes. Any suggestions on how to include that?


      • Hi Kenan. Thanks, I’m glad you like it!!😃
        A download link with the complete planet shader is now added to the bottom of this tutorial! Been working on it for a while now but organising all of the dependencies and tutorial examples has become a bit of a pain.. Your questions reminded me that it’s about time that I finished it 🙂 I’ve also included a water shader that I’m currently also writing a tutorial for. It’s still in development but will be updated in the future.

        
        To your second question: I’ve been trying to figure it out for you but I’m afraid the answer is.. yeah.. but no.. With Shader Graph it is really easy to add a slope texture to an already pre-made flat terrain mesh in the fragment stage, by using the normal vector like this:

        Adding a slope texture to a shader graph that also does vertex displacement in the vertex stage is much harder, because (due to current limitations of Shader Graph, I’m guessing) we cannot easily get the displaced vertex normals back from the vertex stage after the displacement and use them in the fragment stage again. (This is no problem in a normal vertex to fragment shader, see: https://www.rastertek.com/tertut14.html )
        
        Because the normals returned from the Normal Vector node are the normals from before the vertex displacement, we would have to somehow reconstruct the displaced normals ourselves in the fragment stage of a shader graph, which may be possible in a way, but I haven’t succeeded with that yet ☹️

        While the slope is easy to calculate for a flat terrain, things are (as always) more difficult for a spherical terrain, but if you have a pre-made planet mesh then you can use this method to get the terrain slope for a spherical terrain mesh:
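
        A minimal sketch of that idea (assuming a pre-made mesh whose normals already include the terrain, and invented names): compare the surface normal with the local ‘up’, which on a sphere is the direction from the planet centre to the fragment.

        // Sketch: slope factor for a spherical terrain, 0 on flat ground, approaching 1 on steep cliffs.
        float SphericalSlope(float3 positionWS, float3 normalWS, float3 planetCenterWS)
        {
            float3 up = normalize(positionWS - planetCenterWS);   // local 'up' on a sphere
            return 1.0 - saturate(dot(normalize(normalWS), up));  // use this to blend in a slope texture
        }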

        So to my knowledge, if you have a pre-made planet or terrain mesh it’s no problem to get the slope with a shader graph, just not when the shader graph is also doing the displacement..
        I hope it helps!

        Cheers, Tim

        Terrain slope mapping package:
        https://timcoster.files.wordpress.com/2021/04/coster-graphics-shadergraph-terrainshaders.unitypackage.zip


        References:
        https://www.rastertek.com/tertut14.html
        https://answers.unity.com/questions/1399619/how-do-i-find-slope-on-a-spherical-surface.html
        https://blenderartists.org/t/slope-dependent-textures-on-a-sphere/588624/6


  3. Cheers Tim! Thanks for the download link and the exhaustive reply!
    In my case, the mesh is deformed through script, so I’ll try to modify the shader according to your suggestions.
    Enjoy the coffee!


  4. If you were to add a player walking around this planet, how would you know which biome they are currently walking in? Or more specifically how would you know what texture the player is standing on? This way the player could say mine rock and get rock, or excavate sand and get sand.


    • Hey Valk! Good question,..
      Off the top of my head: I guess if you calculate the noise in the same way as the shadergraph but from within a C# script then you can get the noise value for the player’s position on the surface of the planet and use it to figure out the biome and the texture that the player is standing on.
      This isn’t really easy to do I’m afraid but it would be necessary to calculate the noise in C# anyway, in order to generate displaced mesh colliders for the planet as well. I will try to figure something out for you this week if I can, and post another comment:)..

      (An easier method btw (depending on the type of game) would be to manually place trigger colliders or different quad tiles that can be ray casted against to get the biome and terrain type:) )


      • I have figured out how to calculate the noise C# side with a 3D Simplex Noise script from Sebastian Lague’s planet tutorials. As for textures I feel like all the URP nodes need to be recreated C# side so all the data is in one place. The only thing that is passed shader side is the texture that is created in C#. The shader will only consist of one triplanar texture connected directly to base color (and normals / metallic etc if you so choose)


        • Hey Valk! I had a few busy days so couldn’t reply sooner.. Sorry about the comments thing, don’t know if it was the max nested comments setting on my blog (which I increased) or the browser cookies slowing things down..

          I don’t know if it is necessary to recreate every node on the C# side but I’m not entirely sure, so I will have to try it out myself. It may be better like you say to keep most data in one place.
          
          For translating shader code to C# I recommend using Unity.Mathematics like this:

          
          using Unity.Mathematics;
          using static Unity.Mathematics.math;

          Then you get all the shader data types and you can use math.step(), math.abs() and math.floor() functions etc.
          I just converted the 3D Simplex noise from HLSL to C# with it relatively easily.
I’m trying out some things with it now so I’ll try to post something before the end of the week 🙂


          • I figured out the biomes as you can see in the picture above.

            So there are two ways to do this and only one way I like. The first way is to create a Texture2D in C# of width 2 and length numBiomes. Say there are 3 biomes, red, green and blue. Then there will be 3 strips of colors on the texture. Then you feed coherent noise through all the vertices, and make sure the texture or shader is set to sample bilinear and not point, otherwise the transition between biomes will be too sharp. (The shader has nodes that allow you to read the mesh UVs and this is how you feed data from C# to the shader.) This way causes a very annoying issue; for example let’s say you transition between red and blue biomes, note that green is in between red and blue, and so if your vertices are shared on the mesh you will see this green border between the red and blue biomes. That’s why this way sucks, plus it’s annoying to sample the right colors of red, green and blue from the texture because if you’re off by just a little you will get orange instead of red.

            The other way to do this is to completely ignore UVs and just set the Color[] of the mesh.colors. Just loop through all the vertices and set the desired vertices to red, green or blue. Add some noise to get more organic boundaries between biomes. There is a node in Shader Graph called Vertex Color which can see all of the mesh colors set from C#. And you’re pretty much done. This is where I’m at right now. I just need to work on blending the colors together.

            Right now the terrain noise is not based on the biomes but I think I should make it based on the biomes in the future. Then I can work on for example rivers from high elevation in warm biomes flowing to the ocean.

            Thanks for posting this article, I would never have known that you could simply split the RGB values from a texture and combine them with other textures. And as you have mentioned yeah you can achieve more biomes by say splitting the planet into 2 sections and so I’m probably going to do that as well. Maybe add a north and south pole biome as well as biomes on the left side of the planet and biomes on the right side of the planet. I’m not sure.


            • Hey Valk! Very cool to see what you’re creating! Thanks for showing me! I see you’ve already found multiple ways of doing the biomes 🙂 I think there are always pros and cons to all of the different mechanics, so take your time to experiment with them individually I would say.
              One voice in my head always says not to worry about biomes too much because creating biomes is just multiplying noise by some other noise 🙂
              
              You can also go further with the pattern by multiplying the biomes again by some other noise to create biomes inside of biomes, which I’ve shown in one of the previous comments.
              
              I’ve chosen to use an RGB gradient for the biomes for the tutorial because it’s easy, but I guess that you could maybe use multiple gradients, lerps, blends and smooth steps etcetera to create different biomes instead of one gradient for three, or in your case maybe one texture per biome. Maybe that way it will be easier to get good overlapping biomes, but I haven’t tried that out yet!
              Rivers are a really cool idea so definitely worth adding I think. About left and right side biomes, you can just use a gradient over the r/x instead of y for the poles:)



              About your original question: I think I figured out a nice way to get both the texture index and the biome by a position on the planet surface. The link below is for the package which contains a demo scene. If you press play and aim with your mouse at the generated planet then you’ll see the texture index and biome name that the RayCast from camera to mouse is currently hitting on the canvas. This info will allow you to figure out exactly what texture of what biome the player is aiming at :D

              My plan is to fine tune it a little bit further and then write a tutorial about it. It’s a little bit messy right now but I don’t want to keep you waiting forever :)



              Hope it helps and lemme know if it works!

              https://timcoster.files.wordpress.com/2021/05/coster-graphics-shadergraph-proceduralplanettutorial-pt2.unitypackage.zip


  5. For some reason my comments are not being displayed. I’m going to try posting without my name. (I’m Valk)


  6. idk anymore there just seems to be like a huge delay in seeing my comments being posted…


  7. Any news on if you’re going to make something that generates a collider for the planet?


    • Alright I’ve looked through and found the script for the terrain collider (tutorial 2 project files), and is there any way for it to automatically get the data it needs? Also, I’m not sure which code to remove to make it only generate colliders


  8. Hi Tim! first of all, thank you very much for sharing this all with us! It is amazing and helped me a lot! I am struggling now to get the colliders to match the deformations made in the sphere… Any suggestions on how we could achieve that? thank you!


  9. Awesome tutorial! I couldn’t get a planet shader WITH TEXTURES to work. Tried multiple tutorials out there but none worked for me, yours just happens to be the simplest and most efficient one! However, it worked using only the 3D model of a planet made in Blender, because I keep getting an error when trying to use the custom nodes to displace the CubeSphere mesh in Unity: “Couldn’t open include file” and “Invalid expression at(…)”. Which version of Unity did you use? Could you shine a light for an amateur game dev? Thanks 🙂

