Author: Matt Reagan

SpriteKit Text Disintegration
Recently I wanted to create an animated title sequence for Dust using SpriteKit in which the text would disintegrate, looking as though each letter was breaking apart in the wind. I tried several approaches and went through multiple iterations before finally settling (for now at least) on the effect shown below. I also discovered a few interesting things about SpriteKit's performance capabilities in the process. In this post I'll describe how the effect was implemented and some of the earlier approaches which didn't pan out.

The Final Effect

This effect is rendered dynamically with SpriteKit using about 20,000 individual nodes comprising the particles of the text. Each node has its own SKAction animation assigned to it. Because each individual animation is randomized the effect is unique each time and can be controlled at an extremely granular level. SpriteKit is able to handle all of this out of the box, with no optimization, at 58-60fps, which is pretty darn impressive.

Attempt 1 - Initial approach

Initially, I started with the easiest approach I could think of, which was to leverage SKEmitterNode and attempt to 'fake' the disintegration. I kept things simple and put a single-letter SKLabelNode on the screen. I then created an emitter in Xcode's editor (using the Spark template as a starting point) to provide the particle effect itself.

I combined the two by running a set of simple SKActions which would present the text node and the disabled emitter (having previously set its particleBirthRate to 0), and then run an action which simultaneously faded the text while briefly turning on the emitter. This achieved a fairly simplistic result which, although not completely terrible, was not convincing.
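That first approach can be sketched roughly as below. This is a minimal reconstruction, not the original code: the emitter file name, font size, birth rate, and timings are all placeholder values.

```swift
import SpriteKit

// Sketch of the 'fake' disintegration: a label plus a pre-built
// emitter, with the emitter disabled until the fade begins.
func runFakeDisintegration(in scene: SKScene) {
    let label = SKLabelNode(text: "D")
    label.fontSize = 96
    label.position = CGPoint(x: scene.frame.midX, y: scene.frame.midY)
    scene.addChild(label)

    // Emitter built in Xcode's particle editor (Spark template as a base).
    guard let emitter = SKEmitterNode(fileNamed: "Spark") else { return }
    emitter.particleBirthRate = 0   // disabled until triggered
    emitter.position = label.position
    scene.addChild(emitter)

    // Fade the text while briefly switching the emitter on.
    let burst = SKAction.sequence([
        SKAction.run { emitter.particleBirthRate = 400 },
        SKAction.wait(forDuration: 0.3),
        SKAction.run { emitter.particleBirthRate = 0 }
    ])
    label.run(SKAction.fadeOut(withDuration: 0.5))
    emitter.run(burst)
}
```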

Attempt 2 - Minor Improvements

Modifying the SKAction sequences by slightly scaling and adjusting the text as part of the animation and moving the emitter helped a bit. I then added code which could apply this overall sequence of actions for each letter of an arbitrary string. Adding a little randomization of the timing for each letter helped things visually. This was the result:
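The per-letter staggering might look something like this sketch (one label node per character, each with a small random delay; the layout math and durations are illustrative):

```swift
import SpriteKit

// One SKLabelNode per character, each starting its dissolve after a
// small random delay so the letters don't animate in lockstep.
func addStaggeredLetters(_ text: String, to scene: SKScene) {
    var x = scene.frame.midX - CGFloat(text.count) * 20
    for character in text {
        let letter = SKLabelNode(text: String(character))
        letter.fontSize = 48
        letter.position = CGPoint(x: x, y: scene.frame.midY)
        scene.addChild(letter)
        x += 40

        let delay = Double.random(in: 0...0.4)
        letter.run(.sequence([
            .wait(forDuration: delay),
            .group([.fadeOut(withDuration: 0.6),
                    .scale(to: 1.1, duration: 0.6)])
        ]))
    }
}
```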


Again, not terrible, but it leaves a lot to be desired.

Attempt 3 - Different approach

I knew that what I really wanted to do was legitimately break apart the letters into their actual particles, which could be individually controlled. This was going to require a different approach, however. I didn't want to deal with the font glyphs directly for this early proof-of-concept, and SKLabelNode's API is somewhat limited, so that was off the table as well. It didn't really matter how the text was represented, however; I just needed some way to define the individual parts of the glyphs as a 2D grid of nodes - which is exactly what a bitmap is. So the next approach leveraged a texture image representing the string. It's simple, but a lot more fun:
  1. Draw the text into a bitmap (or use a premade texture asset; in my case I drew the string into an NSBitmapImageRep)
  2. Define the resolution, or 'node-size to pixel-size' value
  3. Read the pixel color values of the bitmap
  4. If the pixel is non-white, create a SpriteKit node at a position based on the X,Y coordinates
  5. Apply the SKAction to each particle node to create the desired effect
Essentially, this just recreates the bitmap within the scene using individual SKNodes.
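The steps above can be sketched as follows. This is my own reconstruction, not the original implementation: the function names, the brightness threshold, and the "wind" drift values are all assumptions, and the bitmap is assumed to be RGB so the brightness accessor is valid.

```swift
import SpriteKit
import AppKit

// Steps 2-4: walk the bitmap on a grid (one sample per node) and
// create a small SKSpriteNode for every non-white pixel.
// `resolution` is the node-size-to-pixel-size value from step 2.
func particleNodes(from bitmap: NSBitmapImageRep,
                   resolution: Int = 2) -> [SKSpriteNode] {
    var nodes: [SKSpriteNode] = []
    let nodeSize = CGSize(width: resolution, height: resolution)

    for y in stride(from: 0, to: bitmap.pixelsHigh, by: resolution) {
        for x in stride(from: 0, to: bitmap.pixelsWide, by: resolution) {
            guard let color = bitmap.colorAt(x: x, y: y) else { continue }
            // Assumes an RGB bitmap; treat near-white as background.
            if color.brightnessComponent < 0.99 {
                let node = SKSpriteNode(color: .black, size: nodeSize)
                // Flip Y: the bitmap's origin is top-left, SpriteKit's is bottom-left.
                node.position = CGPoint(x: x, y: bitmap.pixelsHigh - y)
                nodes.append(node)
            }
        }
    }
    return nodes
}

// Step 5: each node gets its own randomized 'wind' action, which is
// what makes the effect unique on every run.
func applyWindAction(to node: SKSpriteNode) {
    let drift = CGVector(dx: CGFloat.random(in: 40...120),
                         dy: CGFloat.random(in: -20...40))
    node.run(.sequence([
        .wait(forDuration: Double.random(in: 0...1.0)),
        .group([.move(by: drift, duration: 1.5),
                .fadeOut(withDuration: 1.5)])
    ]))
}
```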

SKShapeNode vs. SKSpriteNode

Initially I leveraged SKShapeNode, but I found that no matter how I adjusted those shape nodes the animation performance was abysmal. I then tried creating tiny SKSpriteNodes of a flat color using +spriteNodeWithColor:size:, which provided significantly better performance out of the box. (As an aside, it might be worth mentioning that SKShapeNode has a ton of issues, even as of macOS 10.12. That class deserves a blog post all on its own.)

Reading in the bitmap

Again I went with the lowest-cost solution for this proof-of-concept, which wound up being trivial thanks to AppKit. I simply use NSBitmapImageRep and iterate over the pixels, obtaining the color value via -colorAtX:y:. This is definitely not the most robust or performant approach, but for the purposes of this test it was sufficient.

It was helpful in this case to use NSBitmapImageRep directly, rather than attempting to work at a higher level with the more abstract NSImage class. Drawing into an NSImage allows the system to automatically create and manage the underlying representations, but they may not always be what you expect. For example, in some cases you'll find that the NSImage is backed by an NSCGImageSnapshotRep, a private NSImageRep subclass that doesn't offer the convenience of -colorAtX:y:.
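Drawing the string into an explicit NSBitmapImageRep might look like the sketch below (the pixel format, font, and white-background fill are my own choices for illustration):

```swift
import AppKit

// Draw a string into an explicit NSBitmapImageRep so colorAt(x:y:)
// is guaranteed to be available, rather than letting NSImage pick
// the backing representation for us.
func bitmap(for text: String, size: NSSize) -> NSBitmapImageRep? {
    guard let rep = NSBitmapImageRep(bitmapDataPlanes: nil,
                                     pixelsWide: Int(size.width),
                                     pixelsHigh: Int(size.height),
                                     bitsPerSample: 8,
                                     samplesPerPixel: 4,
                                     hasAlpha: true,
                                     isPlanar: false,
                                     colorSpaceName: .calibratedRGB,
                                     bytesPerRow: 0,
                                     bitsPerPixel: 0) else { return nil }

    NSGraphicsContext.saveGraphicsState()
    NSGraphicsContext.current = NSGraphicsContext(bitmapImageRep: rep)
    // White background, black text - matching the non-white pixel test.
    NSColor.white.setFill()
    NSRect(origin: .zero, size: size).fill()
    (text as NSString).draw(at: .zero, withAttributes: [
        .font: NSFont.boldSystemFont(ofSize: 72),
        .foregroundColor: NSColor.black
    ])
    NSGraphicsContext.restoreGraphicsState()
    return rep
}
```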

Other Effects

Going this route of managing the individual particles based on an input texture provides a lot of power and flexibility in the appearance of the animation. Because you can easily calculate the position and color of each particle node, you can also do other interesting effects such as assembling the final composite from a randomly-scattered set of pixels. (For an example of this, check out the video below.)
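Since each node's final position and color are already known, the reverse effect is just a matter of running the animation toward the target instead of away from it. A hedged sketch (the scatter ranges and durations are placeholder values):

```swift
import SpriteKit

// Reverse of the disintegration: start each particle node at a random
// scatter position, then animate it to its computed target position.
// `target` would come from the bitmap pass described earlier.
func assemble(_ node: SKSpriteNode, at target: CGPoint, in scene: SKScene) {
    node.position = CGPoint(x: CGFloat.random(in: 0...scene.frame.width),
                            y: CGFloat.random(in: 0...scene.frame.height))
    node.alpha = 0
    scene.addChild(node)
    node.run(.group([
        .fadeIn(withDuration: 0.8),
        .move(to: target, duration: Double.random(in: 0.8...1.6))
    ]))
}
```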

Performance

Even without any optimization, the animation performs surprisingly well. SpriteKit handles 24,000 nodes animating individually without much of a hiccup. This could undoubtedly be improved by using a lower resolution (larger/fewer nodes), or grouping sets of nodes together for fewer animations, etc. It would also be interesting to experiment with different rendering settings and properties for the scene and the SKSpriteNodes themselves. But for this simple test I was surprised at the rendering speed. Chances are I will be using this effect in some form for the opening titles of Dust.

Video Demo