
Chord Love Privacy Policy

Effective date: October 20, 2018

Andy Gryc, sole proprietor of ByteBlacksmith (“us”, “we”, or “our”), provides and operates the Chord Love mobile application.

This page informs you of our policies regarding the collection, use, and disclosure of personal data when you use Chord Love and the choices you have associated with that data.

We use your data to provide and improve Chord Love. By using Chord Love, you agree to the collection and use of information in accordance with this policy. Unless otherwise defined in this Privacy Policy, the terms used in this Privacy Policy have the same meanings as in our Terms and Conditions.

Definitions

  • Service is the Chord Love mobile application operated by Byte Blacksmith / Andy Gryc.
  • Personal Data means data about a living individual who can be identified from those data (or from those and other information either in our possession or likely to come into our possession).
  • Usage Data is data collected automatically, generated by the use of Chord Love.
  • Cookies are small files stored on your device (computer or mobile device).

Information Collection and Use

We collect several different types of information for various purposes to provide and improve Chord Love for you.

Types of Data Collected

Personal Data

Chord Love does not ask for nor collect any personally identifiable information that can be used to contact or identify you (“Personal Data”).

Usage Data

When you access Chord Love by or through a mobile device, we may collect certain information automatically, including, but not limited to, the type of mobile device you use, your mobile device unique ID, the IP address of your mobile device, your mobile operating system, the type of mobile Internet browser you use, unique device identifiers and other diagnostic data (“Usage Data”).

Tracking & Cookies Data

Although we do not currently use tracking technology to observe your usage of the Chord Love app, we reserve the right to track your activity in Chord Love and hold certain information.

Use of Data

Chord Love uses the collected data for various purposes:

  • To provide and maintain Chord Love
  • To notify you about changes to Chord Love
  • To allow you to participate in interactive features of Chord Love when you choose to do so
  • To provide customer care and support
  • To provide analysis or valuable information so that we can improve Chord Love
  • To monitor the usage of Chord Love
  • To detect, prevent and address technical issues

Transfer Of Data

Your information may be transferred to — and maintained on — computers located outside of your state, province, country or other governmental jurisdiction where the data protection laws may differ from those of your jurisdiction.

If you are located outside Canada and choose to provide information to us, please note that we transfer the data, including Personal Data, to Canada and process it there.

Your consent to this Privacy Policy followed by your submission of such information represents your agreement to that transfer.

Chord Love will take all steps reasonably necessary to ensure that your data is treated securely and in accordance with this Privacy Policy, and no transfer of your Personal Data will take place to an organization or a country unless there are adequate controls in place, including the security of your data and other personal information.

Disclosure Of Data

Legal Requirements

Chord Love may disclose your Personal Data in the good faith belief that such action is necessary:

  • To comply with a legal obligation
  • To protect and defend the rights or property of Chord Love
  • To prevent or investigate possible wrongdoing in connection with Chord Love
  • To protect the personal safety of users of Chord Love or the public
  • To protect against legal liability

Security of Data

The security of your data is important to us, but remember that no method of transmission over the Internet or method of electronic storage is 100% secure. While we strive to use commercially acceptable means to protect your Personal Data, we cannot guarantee its absolute security.

Service Providers

We may employ third party companies and individuals to facilitate our Service (“Service Providers”), to provide the Service on our behalf, to perform Service-related services or to assist us in analyzing how our Service is used.

These third parties have access to your Personal Data only to perform these tasks on our behalf and are obligated not to disclose or use it for any other purpose.

Links to Other Sites

Our Service may contain links to other sites that are not operated by us. If you click a third party link, you will be directed to that third party’s site. We strongly advise you to review the Privacy Policy of every site you visit.

We have no control over and assume no responsibility for the content, privacy policies or practices of any third party sites or services.

Children’s Privacy

Our Service does not address anyone under the age of 18 (“Children”).

We do not knowingly collect personally identifiable information from anyone under the age of 18. If you are a parent or guardian and you are aware that your Child has provided us with Personal Data, please contact us. If we become aware that we have collected Personal Data from children without verification of parental consent, we take steps to remove that information from our servers.

Changes to This Privacy Policy

We may update our Privacy Policy from time to time. We will notify you of any changes by posting the new Privacy Policy on this page.

We will let you know via email and/or a prominent notice on our Service prior to the change becoming effective, and we will update the “effective date” at the top of this Privacy Policy.

You are advised to review this Privacy Policy periodically for any changes. Changes to this Privacy Policy are effective when they are posted on this page.

Contact Us

If you have any questions about this Privacy Policy, please contact us:

  • By email: info@byteblacksmith.com

Chord Love released

Chord Love got approved on the Apple App Store last night after a surprisingly easy and quick App review – go to the Chord Love app page to read more about it.

The worst part of this wasn’t finishing the program, it was just committing to the “paperwork” steps needed to wrap it up and submit it. The app has been fundamentally done for months, but I’ve been lazy.

Finally getting it out into the world means that I didn’t get to add all the features on my pages-long wish list. The iPad version is marginally functional but looks like a dog’s breakfast. And I really did want to create a formal test plan. But now I can get some feedback to see what you think about the app. What do you think I should fix or add to make Chord Love even better?

Building the Earth

The most difficult shader in Ablative Air is the Earth, but it’s also the most fun to discuss.

Earth’s surface.

Like the Milky Way, the Earth is represented by a sphere, or as close as we can get to one by creating a finely faceted polyhedron. But where the Milky Way is a single texture applied to the inside of a sphere, the Earth is several textures blended together and applied to the outside of a sphere. The first layer is the Earth’s surface. A number of people have created public domain earth images from NASA satellite imagery, so we leverage the effort of those fine folks.

earth-view

What does that look like wrapped around the globe?

Pretty sterile–we’re missing clouds. This is relatively straightforward–we take a second texture and add it on top. Why not just add the clouds to the original image? Because I want them to move independently of the surface. I just slide them across the surface so they’re not “accurate”, but the moving clouds definitely add to the appeal, even if they’re not meteorologically sound. (I did consider creating a cloud-based animation that would show real weather patterns, but I decided against it–too much effort and time, and very few would likely notice.)

earth-cloudmask-2
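In code, the sliding cloud layer boils down to a wrap-around texture coordinate plus an additive blend. Here’s a minimal C++ sketch of the idea (the function names and the simple additive blend are my illustration, not the actual shader):

```cpp
#include <cmath>
#include <algorithm>

// Slide the cloud layer horizontally: offset the u coordinate by
// speed*time and wrap around so the clouds circle the globe forever.
double cloudU(double u, double time, double speed) {
    double shifted = u + speed * time;
    return shifted - std::floor(shifted);   // keep in [0,1), like GLSL fract()
}

// Add the cloud intensity on top of the surface color, clamped to 1.0.
double blendCloud(double surface, double cloud) {
    return std::min(1.0, surface + cloud);
}
```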


Okay, now we have to add in the night side. It’s one thing to make it dark, but we also need to add in night-side city lights. Thankfully the fine folks at NASA have created images of the night side of the earth minus clouds that show the city lights nicely. To save on texture RAM, I don’t use a color image, but just intensity so I know how bright the lights are at any one point on the earth. However, to make it more realistic than just using white, I use a slightly yellowish hue–the same color as the spectrum of sodium lights, which form the majority of light pollution.

earth-nightlighting

One final touch on the lighting. In closeups of the rotating earth, it looked too artificial for the lights to suddenly pop on and off as soon as night came. So I look for the day/night transition and, to emulate twilight, fade the lights gently in and out near the day/night boundary–the good people of earth waking up or starting their nightlife.
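Both night-side tricks–the sodium-yellow tint and the twilight fade across the terminator–come down to a couple of lines each. A hedged C++ sketch (the tint values and the twilight width are illustrative guesses, not the game’s tuned numbers):

```cpp
#include <algorithm>

// Smoothstep as in GLSL, written out so it also works with edge0 > edge1.
double smooth01(double edge0, double edge1, double x) {
    double t = std::clamp((x - edge0) / (edge1 - edge0), 0.0, 1.0);
    return t * t * (3.0 - 2.0 * t);
}

// sunDot = dot(surface normal, direction to sun): +1 at noon, -1 at midnight.
// Lights are fully on deep in the night and fade out across the terminator.
double cityLightFactor(double sunDot, double twilightWidth = 0.1) {
    return smooth01(twilightWidth, -twilightWidth, sunDot);
}

// Tint the single-channel light intensity with a sodium-ish yellow
// (values are illustrative) instead of plain white.
void sodiumTint(double intensity, double rgb[3]) {
    const double tint[3] = {1.00, 0.93, 0.70};
    for (int i = 0; i < 3; i++) rgb[i] = intensity * tint[i];
}
```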

Are we done? Nope. The earth has a good proportion of water, so we want specular highlights (bright areas) where the sun reflects off the ocean or lakes. We need another map of the earth’s water, and we use this as a boolean gloss map: if there is a white pixel, we apply a specular highlight because it’s a watery surface, and if it’s black, we don’t. We compute the reflection angle of the sun, and spread out the highlight as that angle deviates from the user’s viewing angle. Sounds hard, but it’s not too difficult.

The specular highlighting looks especially good when you can see the sunlight reflecting off the ocean, stopping when it crosses land, but then picking up bright reflections again when the sun crosses over large inland lakes or seas (like the Great Lakes, or Caspian Sea).

earth-watermask
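The water-gated specular is the classic reflect-and-power trick, masked by the gloss map. Here’s a sketch of the logic with the vector math written out (the shininess exponent is an illustrative value, not the game’s):

```cpp
#include <cmath>
#include <algorithm>
#include <array>

using Vec3 = std::array<double, 3>;

double dot3(const Vec3& a, const Vec3& b) {
    return a[0]*b[0] + a[1]*b[1] + a[2]*b[2];
}

// Reflect the (unit) direction-to-sun about the (unit) surface normal.
Vec3 reflect3(const Vec3& sun, const Vec3& n) {
    double d = 2.0 * dot3(sun, n);
    return { d*n[0] - sun[0], d*n[1] - sun[1], d*n[2] - sun[2] };
}

// Specular highlight, applied only where the water mask says "water".
// The highlight falls off as the reflection deviates from the view
// direction; the exponent controls how tight the sun glint is.
double oceanSpecular(const Vec3& sunDir, const Vec3& normal,
                     const Vec3& viewDir, bool waterMask,
                     double shininess = 60.0) {
    if (!waterMask) return 0.0;                       // land: no gloss
    Vec3 r = reflect3(sunDir, normal);
    return std::pow(std::max(dot3(r, viewDir), 0.0), shininess);
}
```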

Again, to save on texture memory, I combine the watery gloss map, night-side city lights, and cloud texture into a single texture, using the red (for clouds), green (for night lights), and blue (for the water map) channels. I load the single combined texture, then pull the individual values apart in the shader when I need to apply the specific coloring or highlights. (Just for fun, here’s what the resulting texture looks like–kinda like an Earth-style Picasso.)

earth-channels
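The packing scheme is simple enough to sketch: each mask becomes one 8-bit channel of a texel, and the shader divides by 255 to recover each value. The struct and function names here are mine, purely for illustration:

```cpp
#include <cstdint>

struct RGB { std::uint8_t r, g, b; };

// Pack the three single-channel maps into one texel:
// red = clouds, green = night lights, blue = water mask.
RGB packMasks(std::uint8_t cloud, std::uint8_t lights, std::uint8_t water) {
    return { cloud, lights, water };
}

// What the shader does on the other side: pull each channel
// back out as a normalized [0,1] value.
void unpackMasks(RGB texel, double& cloud, double& lights, double& water) {
    cloud  = texel.r / 255.0;
    lights = texel.g / 255.0;
    water  = texel.b / 255.0;
}
```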

All the techniques applied for a complex but beautiful earth shader…

Wait a second! Something’s still wrong… The atmosphere is extremely thin compared to the planet, but it does have an effect, especially when the sun is behind the planet. So I added one final touch: I put a thin shell around the planet and tuned its color depending on the angle of the sun to the planet’s limb (edge). Doing a real atmospheric refraction calculation is very complex due to the different compositions and temperatures of the layers of the atmosphere, so I completely cheat here. I put into the shader something that reacts to the angle of the sun, fades from white to blue moving from the planet surface towards outer space, and adds transparency, then just played with the fine tuning of the values until I was happy with the look of the result.

Instead of building another spherical shell around the planet with a new shader, I implemented another massive cheat. In the earth shader I calculate the surface normal: when that normal is just shy of perpendicular to the viewer (that is, very close to the outer ring defining the edge of the sphere), I switch from calculating a surface pixel to calculating an atmospheric pixel. The net effect is that I don’t need to compute a whole new sphere when I only want the very edge of the sphere to be affected anyway. The atmospheric ring does keep a small number of earth surface pixels from being displayed, since those pixels are replaced by the atmosphere effect. This very slightly distorts the planet edge, since the rotating surface land comes into view slightly more abruptly than it normally would, but it isn’t really noticeable even when you’re looking for it.
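The limb test itself is cheap: compare the surface normal against the view direction, and when they’re almost perpendicular, shade atmosphere instead of surface. A tiny sketch (the threshold value is an illustrative guess):

```cpp
// dot(surface normal, direction to viewer) is ~1 at the center of the
// visible disk and ~0 right at the limb. When it drops below a small
// threshold, switch from the surface branch to the atmosphere branch.
enum class Shade { Surface, Atmosphere };

Shade pickBranch(double normalDotView, double limbThreshold = 0.15) {
    return (normalDotView < limbThreshold) ? Shade::Atmosphere
                                           : Shade::Surface;
}
```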

I’ve finally come to grok shaders: it’s not how you do it, it’s how it looks that’s important. You don’t need to mimic reality, you just need to fool the eye.

Here’s a snapshot of the game in progress to show the final result.

Ablative Air_capture_03

Building the Milky Way

Continuing on my multi-part blog about rendering space, now I’ll talk about building the Milky Way.

Rendering the galaxy was much simpler than rendering the stars. Basically, put a texture of the Milky Way on the inside of a sphere, and you’re done!

Okay, not a sphere, but a rough approximation of a sphere. Start with a cube, and since the GPU will only draw triangles, construct each square facet from two triangles. Subdivide each triangle into two more triangles, and then subdivide all those triangles into two more triangles. Now turn it into a pseudo-sphere by normalizing all the vectors to be a distance of 1 from the center. Presto–a 48-sided polyhedron, or “sphere”.
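The construction above can be sketched in a few lines of C++ (the names are mine, and splitting each triangle at the midpoint of one edge is one reasonable reading of the subdivision described):

```cpp
#include <array>
#include <cmath>
#include <vector>

struct V3 { double x, y, z; };
using Tri = std::array<V3, 3>;

V3 unit(V3 v) {
    double l = std::sqrt(v.x*v.x + v.y*v.y + v.z*v.z);
    return { v.x/l, v.y/l, v.z/l };
}
V3 mid(V3 a, V3 b) { return { (a.x+b.x)/2, (a.y+b.y)/2, (a.z+b.z)/2 }; }

// Split each triangle into two at the midpoint of its first edge.
std::vector<Tri> subdivide(const std::vector<Tri>& in) {
    std::vector<Tri> out;
    for (const Tri& t : in) {
        V3 m = mid(t[0], t[1]);
        out.push_back({ t[0], m, t[2] });
        out.push_back({ m, t[1], t[2] });
    }
    return out;
}

// Cube (12 triangles) -> subdivide twice (48) -> normalize onto the sphere.
std::vector<Tri> buildPseudoSphere() {
    V3 c[8];
    for (int i = 0; i < 8; i++)
        c[i] = { i&1 ? 1.0 : -1.0, i&2 ? 1.0 : -1.0, i&4 ? 1.0 : -1.0 };
    static const int f[6][4] = { {0,1,3,2}, {4,5,7,6}, {0,1,5,4},
                                 {2,3,7,6}, {0,2,6,4}, {1,3,7,5} };
    std::vector<Tri> tris;
    for (const auto& q : f) {
        tris.push_back({ c[q[0]], c[q[1]], c[q[2]] });
        tris.push_back({ c[q[0]], c[q[2]], c[q[3]] });
    }
    tris = subdivide(subdivide(tris));
    for (Tri& t : tris)
        for (V3& v : t) v = unit(v);   // push every vertex onto the unit sphere
    return tris;
}
```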

The Milky Way gets plastered on the inside surface of the sphere, since we’re viewing it from the inside. We use a great image from the European Southern Observatory. This gets cut up into a cube map so we can use it within our shader.

Add noise to the render so we break up the pixelation.  See my GLSL random number blog for more details on this and the noise function itself.

And we’re done!  Much simpler than the stars. The earth won’t be so simple.

(Here’s my dirty little secret–I never checked the Milky Way texture against the actual stars, so I’m quite certain they are misaligned. It was on my list of things to check for the longest time, but I had so much else to finish that I just dropped it at some point. My bad.)

Building the Stars

This is the first part of a multi-part blog about how I created various visual features of Ablative Air.

Much of the visual appeal to the game comes from the realistically rendered space and earth graphics. While those features aren’t essential to game play, they do make the game fun to look at, as well as make for an interesting story (at least so my friends tell me).  I’ll start with the stars, because it was the first thing I started with on the game.

There are about 6000 stars visible to the naked eye. There are really only three qualities of a star we can assess with our eyes: position (referred to by astronomers through right ascension and declination), brightness (referred to by astronomers as apparent magnitude) and color (referred to by astronomers as color). Let’s start here: The Bright Star Catalogue, 5th Revised Ed., Hoffleit D., Warren Jr W.H., Astronomical Data Center, National Space Science Data Center, 1991. This is an astronomical dump of the most common facts known about each visible star, plus a few more that aren’t visible to the naked eye but are still relatively bright. The catalogue lists about 9000 stars, so I’ll use all of them for the game. (Stands to reason that you would be able to see more stars in space without that pesky atmosphere in the way.)

A star’s position is easy to determine from the catalogue: convert from right ascension and declination to spherical coordinates, then into Cartesian coordinates for the game. I may have lost you there on the details, but suffice it to say that it’s straightforward math, and hence quite boring, so I won’t spend any time talking about it.
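For the curious, that conversion is just the standard spherical-to-Cartesian formulas, with right ascension as the azimuth and declination as the elevation (angles in radians; the axis convention here is my choice, not necessarily the game’s):

```cpp
#include <array>
#include <cmath>

// Right ascension/declination (radians) to a unit vector on the
// celestial sphere: declination is the angle above the equatorial
// plane, right ascension the angle around it.
std::array<double, 3> raDecToCartesian(double ra, double dec) {
    return { std::cos(dec) * std::cos(ra),
             std::cos(dec) * std::sin(ra),
             std::sin(dec) };
}
```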

Much more interesting is the star’s color.

You’d be excused for thinking all the stars in the game are white. They’re not. The effect is subtle, but the game stars are represented with the same colors as we would see stars on earth. Here are a couple examples.

Taurus has a red “eye”, the star Aldebaran, the bright star nearest the upper left corner. When people say it’s a red star, they’re mostly being poetic. I’d agree that it’s not really that red.

Taurus

Orion in the opening image is another constellation with a recognizably colored star. You’ll see that Betelgeuse is red(ish), just like it’s supposed to be.

Pleiades is pretty easy to spot, and a kind of bluish color.

Pleiades

Star color basically comes from the temperature of the star. There’s a lot of complicated stuff to do with black body radiation and light wavelength, but Mitchell Charity from MIT has done all the hard work already, and I reproduce here the table of his that I started from.

  O5(V)     157 180 255   #9db4ff
  B1(V)     162 185 255   #a2b9ff
  B3(V)     167 188 255   #a7bcff
  B5(V)     170 191 255   #aabfff
  B8(V)     175 195 255   #afc3ff
  A1(V)     186 204 255   #baccff
  A3(V)     192 209 255   #c0d1ff
  A5(V)     202 216 255   #cad8ff
  F0(V)     228 232 255   #e4e8ff
  F2(V)     237 238 255   #edeeff
  F5(V)     251 248 255   #fbf8ff
  F8(V)     255 249 249   #fff9f9
  G2(V)     255 245 236   #fff5ec
  G5(V)     255 244 232   #fff4e8
  G8(V)     255 241 223   #fff1df
  K0(V)     255 235 209   #ffebd1
  K4(V)     255 215 174   #ffd7ae
  K7(V)     255 198 144   #ffc690
  M2(V)     255 190 127   #ffbe7f
  M4(V)     255 187 123   #ffbb7b
  M6(V)     255 187 123   #ffbb7b

I’ve done a little expanding and interpolating on Mitchell’s list to map stellar classifications onto the basic star colors like so:

# Colors are little-endian words: 0xbbggrr (or Blue Green Red), opposite of normal RGB order
     'O5': 0xffb09b,     'O6': 0xffb8a2,     'O7': 0xffb19d,     'O8': 0xffb19d,     'O9': 0xffb29a,
     'B0': 0xffb29c,     'B1': 0xffb6a0,     'B2': 0xffb4a0,     'B3': 0xffb9a5,     'B4': 0xffb8a4,
     'B5': 0xffbfaa,     'B6': 0xffbdac,     'B7': 0xffbfad,     'B8': 0xffc3b1,     'B9': 0xffc6b5,
     'A0': 0xffc9b9,     'A1': 0xffc7b5,     'A2': 0xffcbbb,     'A3': 0xffcfbf,     'A4': 0xffd2cf,
     'A5': 0xffd7ca,     'A6': 0xffd4c7,     'A7': 0xffd5c8,     'A8': 0xffded5,     'A9': 0xffe0db,
     'F0': 0xffe5e0,     'F1': 0xffeae6,     'F2': 0xffefec,     'F3': 0xffe8e6,     'F4': 0xffe2e0,
     'F5': 0xfff7f8,     'F6': 0xfff1f4,     'F7': 0xfff3f6,     'F8': 0xfcf7ff,     'F9': 0xfcf7ff,
     'G0': 0xfcf8ff,     'G1': 0xf8f7ff,     'G2': 0xf2f5ff,     'G3': 0xebf3ff,     'G4': 0xe5f1ff,
     'G5': 0xeaf4ff,     'G6': 0xebf4ff,     'G7': 0xebf4ff,     'G8': 0xdeedff,     'G9': 0xddefff,
     'K0': 0xddeeff,     'K1': 0xbce0ff,     'K2': 0xc4e3ff,     'K3': 0xc3deff,     'K4': 0xb5d8ff,
     'K5': 0xa1d2ff,     'K6': 0x95ccff,     'K7': 0x8ec7ff,     'K8': 0xaed1ff,     'M0': 0x8bc3ff,
     'M1': 0x8eccff,     'M2': 0x83c4ff,     'M3': 0x81ceff,     'M4': 0x7fc9ff,     'M5': 0x6fccff,
     'M6': 0x78c8ff,     'M7': 0x80c4ff,     'M8': 0x6dc6ff,

The letter code is the stellar classification–roughly how big and old the star is.

If you look at Mitchell’s colors, based on pure mathematics, they are far more garish than what we actually see. There are no carnival reds or blues in my night sky! If stars were big enough to show a disk, they might be the colors that Mitchell has computed, but visually they’re so tiny that they’re just points. We need to account for the mechanics of the eye.

The human eye has cells called rods, which are responsible for black-and-white vision, and cones, which are responsible for color vision. Rods are also far more numerous and far more sensitive in low light, which is why at night we see mostly in shades of gray. We perceive low-level ambient light as being slightly bluish.

Dim objects like stars will also tend to have little color because we mostly perceive them with rods and not cones, and so they will look mostly white, maybe with just a hint of blue. To mimic this effect, I alter the color of the star based on its magnitude. The very brightest stars end up with a good proportion of the “true” star color, and as the brightness decreases, I wash out a greater proportion of the red and green components of the color. The dimmer the star, the whiter it appears. This mimics how the eye perceives the color of the pinpricks of stars, which is why it’s hard to find colored stars in the game–just like in the night sky.
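One way to express that wash-out (my illustrative weighting, not necessarily the game’s exact formula): compute a 0-to-1 “color strength” from the apparent magnitude, then lerp the star color toward white as the strength drops.

```cpp
#include <algorithm>

// Apparent magnitude: smaller = brighter (Sirius is about -1.5;
// the naked-eye limit is around +6). Map the brightest stars to
// full color strength and the dimmest to none.
double colorStrength(double magnitude,
                     double brightMag = -1.5, double dimMag = 6.0) {
    return std::clamp((dimMag - magnitude) / (dimMag - brightMag), 0.0, 1.0);
}

// Lerp each channel toward white: dim stars lose their color
// bias and end up looking white.
void washOut(double rgb[3], double strength) {
    for (int i = 0; i < 3; i++)
        rgb[i] = (1.0 - strength) + rgb[i] * strength;
}
```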

That takes care of color.

Next to solve is showing a star’s brightness. This would seem rather simple–bright stars are a bright pixel, dim stars are a dim pixel. After all, a star is so tiny it’s only a single pixel as far as the display is concerned, right? I tried that approach first, and it looked awful.

Why?

We don’t see a steady point when on earth. We see the twinkling of a star as its light is bent through the atmosphere. While a twinkle shader would be a fun shader to write, my game shows the stars from outer space without any atmosphere, so the stars must be steady. But we earthbound humans are used to seeing that twinkle. We’re also used to seeing images of stars through a telescope. Both of those things influence what we expect. We expect spikes and a halo around bright stars. Stars don’t really have them, but they look much more like what we expect when they’re there. Take a look at this picture of the Pleiades.

Pleiades_large

Although the dim stars just look like points, look at the bright stars of the cluster. If you look carefully, you’ll see there are two levels of halo around the bright stars, and diffraction spikes at the four cardinal points (at 0, 90, 180, and 270 degrees). Those are artifacts of telescope construction, but we’re so used to them, and they look so natural, that I emulate them. Again for comparison, here’s my rendered Pleiades (without the nebula gas), up close.

pleiades-closeup

Remember that these stars are actually quite small and dim, so they won’t appear as blue as they do in the telescope. Dim stars are points, bright stars show diffraction spikes and halo effects.

What does the halo/spike effect look like up close?

big-star

Position: CHECK. Color: CHECK. Magnitude: CHECK.

Hey–wait a minute. The spikes are actually two pixels tall and wide. Why so big?

There’s just one last thing to address, and that’s temporal aliasing. Before I wrote my star shader, I didn’t know what that was. I found out rather quickly that it’s a problem that must be addressed.

Temporal aliasing is an artifact caused by a star moving across a display with discrete pixel locations. Sweep the view across the sky when you’re using single-pixel stars, and it will look like the stars are “jumping” around. This happens because the relative distance between computed star locations changes depending on the exact angle. So if you slide the viewport over a tiny bit, floating point round-off might make two nearby stars appear to jiggle with respect to each other. This effect is difficult to show in a movie, but trust me–it’s very distracting, and it completely destroys the illusion of reality, not to mention smoothness.

So the last tweak to the star shader was to compute a sub-pixel location for the star, and draw the star features (including the spikes) corresponding to the subpixel. That means that if the star location is actually at the center of the pixel we’ll get single pixel spikes. But if it’s exactly between two pixels, we get a blurred two-pixel wide spike. Although this doesn’t look perfect in a highly magnified view like above, it looks absolutely beautiful when the stars are in motion like they are in the game.
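In one dimension, the sub-pixel trick is just linear weight-splitting: the fractional part of the star’s position decides how its intensity is shared between the two neighboring pixels, so total brightness is conserved while the apparent position moves smoothly. A sketch (names are mine):

```cpp
#include <cmath>

// Split a star's intensity between the two pixels straddling its
// continuous position x. frac = 0 puts everything in the left pixel;
// frac = 0.5 gives the blurred two-pixel case described above.
void splitSubpixel(double x, double intensity,
                   int& leftPixel, double& leftWeight, double& rightWeight) {
    leftPixel = (int)std::floor(x);
    double frac = x - leftPixel;
    leftWeight  = intensity * (1.0 - frac);
    rightWeight = intensity * frac;
}
```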

Star shader: DONE!


What “Gravity” gets right (SPOILER ALERT)

If you plan to see Gravity, and you don’t like spoiled endings, STOP READING NOW.

Now that I’ve put in my required disclaimers, we can talk freely. I saw Gravity last night with some friends, and I loved it. What’s not to love about an epic 3D visual extravaganza set in space? Especially one with the lovely Ms Bullock battling orbital debris.

As you might expect, in the process of creating Ablative Air I’ve done quite a bit of research on orbital debris. I’d say I’m as close to an expert on the subject as one might reasonably claim as a layperson. There were a couple of the things that rang my “scientific credibility” alarm bells.

  • Debris wiping out all communication with ground control. Houston should be able to reach the astronauts for a window at least once every 90 minutes, when they’re within visual range of ground antennas. Not to mention that comms sats are usually in geostationary orbit, thousands of kilometers above where the destruction belt was occurring.
  • All the stations are a little too conveniently close to each other. You could stretch your imagination for the proximity of the Chinese station and the ISS if they’re in nearly identical orbits, since one of the stations is actually hypothetical. However, the Hubble orbits a couple hundred km higher than the ISS in real life, and it’s pretty darn impossible for a jet pack to traverse that distance. They certainly aren’t going to be in visual range.
  • Basketball-size dents showing on the inside of the Chinese station when struck by debris? Being hit by a chunk moving at 11,000 km/hour is not going to leave a dent; it’s going to vapourize a massive chunk of the station wall and turn our heroine into a tiny cloud of orbiting ashes.

I choose to excuse Alfonso Cuarón for a few of the liberties he’s taken, because it is a movie, and because there is so much that he does get right. The zero gravity motion is impeccable. Action and reaction for everything, just like it should be. The visuals of the earth, stars, atmosphere, and aurora are beautiful, and very much in line with the research I’ve done for Ablative Air. And most impressive: the absolute silence of space. It’s something that few other directors have ever dared to try. I love when the escape ship undocks from the exploding space station and everything goes instantly silent. Cinematically wonderful and absolutely truthful.

I can sympathize with Alfonso. There are lots of little liberties that I’ve taken with Ablative Air too. I struggled throughout development to balance scientific accuracy against the fun factor, just like Alfonso must have. Who wants to make a game/movie that’s scientifically accurate but absolutely zero fun to play/watch?

There’s one thing that I removed from the prototype of Ablative Air that Alfonso built his entire movie around: a runaway debris collision chain, otherwise known as “Kessler syndrome”. When two pieces of debris collide, they create even more debris. And when enough debris exists at a particular altitude, there will be so many collisions that more debris is created than can be cleared by natural orbital decay. In effect, this becomes an untraversable shell of bullets zipping around the planet. This was first proposed by NASA scientist Donald Kessler in 1978, and it has been verified in simulation models.
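The runaway is easy to see even in a toy model: treat collisions as proportional to the square of the debris count (more debris, more chances to meet), with each collision adding fragments and orbital decay removing a fixed fraction per step. All the constants below are made up purely for illustration:

```cpp
// One step of a toy Kessler model: collisions ~ n^2 spawn new
// fragments, decay removes a fixed fraction. Above a critical
// density the n^2 term wins and the count runs away.
double kesslerStep(double n, double collisionRate = 1e-5,
                   double fragmentsPerCollision = 4.0,
                   double decayFraction = 0.01) {
    double collisions = collisionRate * n * n;
    return n + collisions * fragmentsPerCollision - decayFraction * n;
}
```

With these numbers the critical count works out to 250 pieces: below it the population decays away, above it every step produces more debris than the last.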

I originally did emulate the Kessler syndrome within Ablative Air, exactly as Mr. Kessler predicted.  Not really on purpose–it spontaneously emerged out of my debris simulation. Originally when debris trajectories intersected they would collide, conserving the total mass of the two chunks but fragmenting into many smaller independent pieces. I found in playing this early prototype of the game that once you hit a certain threshold, play became impossible. Collisions were happening continually, and by the time you got rid of one chunk of debris, fifty more would be in its place. I tried putting a cap on the total debris count, or arbitrarily ending the game when the debris density got too high.

Neither of these was a very good fix. It was just too frustrating to play with collision-generated debris. Once you got anywhere near the critical debris density, any space station would be obliterated in a spray of debris within a second by a dense fog of rapidly moving projectiles. (Of course, when Gravity got to this “massive obliteration” stage near the end, it made me recall my own mini-simulations and I broke out into a big grin.)

My solution, as implemented in the final game, was to test debris only for a collision with the space station, and no longer test for collisions between any two arbitrary pieces of debris. I really struggled with cutting out debris collisions since I knew it wasn’t technically accurate, but I made the difficult choice to remove them because it made a much more enjoyable game. Sometimes you just have to sacrifice reality for fun.

I didn’t get Kessler syndrome right, but Gravity did. So I can forgive the other places where Gravity goes scientifically soft. I have a renewed well of sympathy for making the hard choices.

Trials of Aerogel; or Using hard math to avoid harder math; or Learning to love quaternions

The Ablative Air debris-removing weapons are all derived from scientific proposals for how to remove orbital debris: lasers, aerogel, and physical collection via automated satellites. (Electrical tethers are another proposal, but they seemed too boring from a gameplay standpoint. Sorry tether fans.)

In my original game design, I had always planned on implementing the big three: lasers, aerogel, and satellites. Lasers were easy–I did that first, and they were the only weapon for a while. Hunter/Seeker – a little satellite that goes around collecting up debris – was written almost completely on a particularly long layover in the Frankfurt airport. But aerogel I left to the bitter end, only implementing it about two weeks before my gold candidate. Why?

Because the math sucks. Or should I say, I wasn’t thinking about things properly. You see, I was still thinking in ordinary 3D cartesian coordinates. Pooh on that! My breakthrough came by thinking like a quaternion.

If you’re not familiar with quaternions, they’re 4-dimensional complex numbers, and they’re extraordinarily good for representing rotations. Plenty of other 3D game authoring sites talk about them at length. All my orbital mechanics, view manipulation, and tracking is done with quaternion math, and I’ve needed quaternions from the very beginning of the game, so I had to learn about them pretty early on.

Quaternions are not terrifically natural to my mind, so I don’t always think of them even when they’re the perfect tool for the problem at hand. What problem was I trying to solve?

Quite simply, I needed circular decals placed on the atmospheric surface. They spread and become more diffuse until they evaporate. The screen shot shows two blobs of it: a younger one near the center, and an older, bigger (and dimmer) one to the left of the station.

Simple, no?

Seemed so–all I needed to do was calculate all the points on a sphere that were a certain radius from a given point. So I got out paper & pencil, and started drawing out the math. And realized that my simple 2D circle calculations wouldn’t quite cut it. So then I figured–ah hah! It’s as simple as having a plane intersecting a sphere! So I looked up great (and lesser) circle calculations, and implemented some of those equations. But I still didn’t get it quite right. So then I thought–wait! I know–it’s just a cylinder intersecting a sphere!

You see the pattern here. I couldn’t find exactly the formula I needed on the Internet, and although I could create a system of equations to represent the problem, I realized I was too lazy (or too rusty, or both) to do all the math.

Maybe I could have slogged through all the math eventually, but stepping away from the problem gave me the solution. I was already using quaternions for spherical rotations–couldn’t I use them here?

Most certainly I could.

What was a page full of intimidating coordinate transforms and trig boiled down to two very simple quaternion operations:

1) Rotate a point from the center of the decal to the edge.

2) Rotate the edge point completely around, tracing the outline of the circle (like you would with a drafting compass).

That’s it. Easy peasy!

After having that Eureka moment, I almost jumped naked out of my bath and ran through the streets of Athens. (Well, not actually, but I was pretty stoked to have solved a bugger of a problem with no real “math” to speak of.)

The actual function is almost as simple; after that mental breakthrough it took me all of 5 minutes to get working. You’ll note that I save the center point and the first edge point twice since I’m creating a GL fan to draw the graphic. With my Aerogel implementation I don’t bother with a spherical 3D patch that conforms to the surface of the sphere. It’s good enough for my purposes that I create just a circle. Since the decal sits on top of the transparent atmosphere, it won’t ever intersect the earth surface. Creating a spherical patch with quaternions should be almost as simple–you just need to trace out each ray as well as the outer edge. I leave that as an exercise for the reader.

void Aerogel::setVerticesFromRadius(float radius) {
    // Save the center point as the first point of the fan
    vertexArray[0] = location;

    // Create a normal to the location, based on any
    // arbitrary vector (here we use the x-axis)
    Vector3 norm = crossProduct(location, Vector3(1,0,0));

    // Create a quaternion that can rotate our center point
    // to the edge of the circle (radius is an angle here:
    // on the unit sphere, angular radius equals arc length)
    Quaternion q;
    q.setToRotateAboutAxis(norm, radius);

    // Rotate our center point out to the edge:
    // this is the start of tracing our circle
    Vector3 p = rotate(location, q);

    // Create a new rotation that rotates around
    // the axis (the original point), scribing a circle
    q.setToRotateAboutAxis(location, (2.0*M_PI)/AerogelSegments);

    // Now it's easy: save our points by tracing
    // around the edge and rotating to the next point
    for (int i=0; i<AerogelSegments; i++) {
        vertexArray[i+1] = p;
        p = rotate(p,q);
    }
    // Save the first edge point again to close the circle
    vertexArray[AerogelVertices-1] = vertexArray[1];
}
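For anyone who wants to cheat on that exercise, here’s a rough sketch of the spherical-patch variant. My Vector3 and Quaternion helpers aren’t shown in this post, so this standalone version rolls its own axis rotation with Rodrigues’ formula (which is what a quaternion rotation works out to anyway); the names here, like sphericalPatch, are for illustration and aren’t from the Aerogel code.

```cpp
#include <cmath>
#include <vector>

// Minimal stand-ins for the Vector3 helpers used above.
struct Vec3 { float x, y, z; };

static Vec3 cross(const Vec3& a, const Vec3& b) {
    return { a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x };
}

static Vec3 normalize(const Vec3& v) {
    float len = std::sqrt(v.x*v.x + v.y*v.y + v.z*v.z);
    return { v.x/len, v.y/len, v.z/len };
}

// Rotate point p around axis by angle (radians), Rodrigues-style.
static Vec3 rotateAboutAxis(const Vec3& p, const Vec3& axis, float angle) {
    float c = std::cos(angle), s = std::sin(angle);
    Vec3 k = normalize(axis);
    Vec3 kxp = cross(k, p);
    float kdp = k.x*p.x + k.y*p.y + k.z*p.z;
    return { p.x*c + kxp.x*s + k.x*kdp*(1.0f - c),
             p.y*c + kxp.y*s + k.y*kdp*(1.0f - c),
             p.z*c + kxp.z*s + k.z*kdp*(1.0f - c) };
}

// Build a spherical patch as concentric rings of points around
// 'center' (a point on the unit sphere), out to an angular radius.
std::vector<Vec3> sphericalPatch(const Vec3& center, float radius,
                                 int rings, int segments) {
    const float kTwoPi = 6.2831853f;
    std::vector<Vec3> pts;
    pts.push_back(center);
    Vec3 norm = cross(center, Vec3{1, 0, 0});
    for (int r = 1; r <= rings; r++) {
        // Rotate the center out to this ring's angular radius...
        Vec3 p = rotateAboutAxis(center, norm, radius * r / rings);
        // ...then sweep it around the center axis, same as the fan.
        for (int i = 0; i < segments; i++) {
            pts.push_back(p);
            p = rotateAboutAxis(p, center, kTwoPi / segments);
        }
    }
    return pts;
}
```

Stitch adjacent rings together with triangle strips and you have a patch that hugs the sphere; the single fan in setVerticesFromRadius is essentially the one-ring case plus the center.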

Improvements to the canonical one-liner GLSL rand() for OpenGL ES 2.0

Long title, agreed. If you have a short attention span, I’ll save you right here and give you the goods.

(1) Don’t use this:

float rand(vec2 co)
{
   return fract(sin(dot(co.xy,vec2(12.9898,78.233))) * 43758.5453);
}

(2) Do use this:

highp float rand(vec2 co)
{
    highp float a  = 12.9898;
    highp float b  = 78.233;
    highp float c  = 43758.5453;
    highp float dt = dot(co.xy, vec2(a, b));
    highp float sn = mod(dt, 3.14);
    return fract(sin(sn) * c);
}

Version 1 is found in dozens of places on the internet. Try doing a search for “GLSL rand”, and among all the Perlin noise routines you’ll find the little gem in (1) tossed out many times. I don’t know the person who first wrote it, but hats off to him/her. It relies on the fact that sin(x) oscillates slowly, but sin(<huge multiplier>*x) oscillates extremely quickly. So quickly that sampling the sine function at every fragment location effectively gives you “random” numbers.
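If you want to poke at the behaviour outside a shader, the one-liner ports straight to the CPU. This is my own C++ stand-in (fract and dot spelled out by hand), not shader code, but it shows the trick: nearby inputs land on far-apart points of the amplified sine wave.

```cpp
#include <cmath>

// CPU stand-in for the canonical GLSL one-liner, kept in 32-bit
// floats to stay close to shader precision.
// GLSL's fract(x) is x - floor(x); dot() is expanded inline.
float glslRand(float x, float y) {
    float dt = x * 12.9898f + y * 78.233f;  // dot(co.xy, vec2(12.9898, 78.233))
    float s  = std::sin(dt) * 43758.5453f;
    return s - std::floor(s);               // fract()
}
```

Sampling it at a handful of neighbouring “fragment” coordinates (say, glslRand(0.50f, 0.5f), glslRand(0.51f, 0.5f), and so on) gives a spread of values in [0, 1) with no obvious pattern.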

Of course, this is all dependent on the GPU implementation and the quality of the sin calculation. On the BlackBerry Z10 (model STL100-2, using a Qualcomm Adreno 225), version 1 of the code works perfectly fine. I use this function to introduce background roughness into my Milky Way rendering for Ablative Air. The Milky Way image is rendered on the inside of a sphere. Without the noise, the texturing of the background looks crudely pixelated. With the noise, it almost looks like a Hubble photo. Here are two examples–not the exact same region, but they give you an idea of the difference.

milkyway_with-noise

Fig 1. Milky Way with noise


milkyway_no-noise

Fig 2. Milky Way without noise.


So, what’s the problem? I ported this renderer onto the Z30 which uses a slightly different GPU. The GPU in both cases is an Adreno (225 vs 320), but I suspect that this problem could occur on many different GPUs, as the random function relies on “problematic” behaviour.

And guess what? On the Adreno 320, my noise function completely disappeared. The output became mostly white (rand() returning 1.0) with black lines running through it at periodic intervals. The effect is somewhat difficult to see underneath the Milky Way texture unless you magnify the image a lot. I saw it initially by hooking my game up to an HDMI-equipped TV. In the samples below, I’ve removed the Milky Way texture to make it incredibly easy to spot. I’m just showing a texture built exclusively with the noise function.

goodGPU-noise_function

Fig 3. Pure noise texture – Adreno 225 GPU


badGPU-noise_function

Fig 4. Pure noise texture – Adreno 320 GPU

As you can see, figure 4 is a little less random than you might like for noise. That is, it’s not random by any stretch of the imagination.

How has our trusty little GLSL random function failed us?

I had encountered a similar issue before in one of my other GLSL ports. My shader for Ablative Air’s laser “scattering” effect also relies on sin(), so I immediately suspected the sin function was overflowing on the Adreno 320 GPU. This is purely guesswork, but I think the implementation for that GPU does not precondition the sin() input to a reasonable 2PI range first. It just applies a Taylor series (or some other approximation) to the input value without any range check. The Adreno 320 implementation is likely a scooch faster, but it’s definitely less accurate than the earlier Adreno 225, certainly for this purpose.
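That guess is easy to sanity-check on the CPU, because the failure mode is really about floating point spacing, not about sin() itself. This little helper is mine (it isn’t anything from the shader): it just asks whether two different values survive the round trip into a 32-bit float. At large magnitudes they don’t, and on mobile GPUs where mediump may be a 16-bit float, the same collapse kicks in at much smaller magnitudes.

```cpp
// Returns true when x and y round to the same 32-bit float, i.e.
// their difference is below one ulp at that magnitude. Once the
// sine argument grows this large, neighbouring fragments can feed
// sin() the exact same value and the "randomness" vanishes.
bool collapses(double x, double y) {
    return static_cast<float>(x) == static_cast<float>(y);
}
```

At 2^24 the float ulp is already 2.0, so collapses(16777216.0, 16777216.9) is true while collapses(1.0, 1.9) obviously isn’t; a half-precision float hits the same wall by 2048.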

My fix may be a little bit overkill, but it’s safe: force all constants to high precision, and use mod to get the sin input value within a reasonable range first. This version shouldn’t rely on any particular quirks of the GPU’s sin implementation. Here’s the result of my fix running on the Adreno 320: perfectly acceptable noise, just like we wanted.
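Ported to the CPU the same way as before (again my own C++ stand-ins for GLSL’s mod and fract, not the shader itself), the fixed version’s sine argument is confined to roughly [0, 3.14), comfortably inside the range where any sin approximation should behave:

```cpp
#include <cmath>

// CPU stand-in for the improved shader function. GLSL's mod(x, y)
// is x - y * floor(x / y), which keeps the sine argument small no
// matter how large the dot product grows.
float fixedRand(float x, float y) {
    float dt = x * 12.9898f + y * 78.233f;
    float sn = dt - 3.14f * std::floor(dt / 3.14f);  // mod(dt, 3.14)
    float s  = std::sin(sn) * 43758.5453f;
    return s - std::floor(s);                        // fract()
}
```

Using 3.14 rather than a full-precision pi does introduce a tiny wrap-around seam once per period, but it’s far too small to matter for noise.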

Adreno-with-fixed-noise

Fig 5. Adreno 320 GPU with improved noise function


In fact, my tweaks to this beautiful little algorithm probably make it a little cleaner than the original function, even when running on the original Adreno 225. If you look closely at the noise in Figure 3, you can see tiny stripes of regular-looking data peeking out. Squint up your eyes “snake eye” style and look near the bottom centre of the image, and you’ll see three vertically stacked diagonal slashes. It’s a little more apparent when the texture is in motion, but those patterns aren’t visible in Figure 5 using the improved noise function. I suspect that the Adreno 225 GPU, while it provides a more robust sin() implementation, is still subject to floating point inaccuracies from the lack of high precision. That’s why I’ve kept the “highp” in there, just in case.

Developer log

This blog is where I share things I’ve learned while writing my game. If you might be interested in what I’ve learned about OpenGL ES, quaternions, or BB10 development, you’re in the right place. If you don’t know what those things are, don’t care, or those terms make you dry heave, I’d recommend heading over to my Facebook page instead. That page is for fans of the game, where I post hints and tips about gameplay–generally items of broad interest.

And we’re off!

Just finished posting my application submission this morning…  Let’s see how long it takes to get approved!