Integration Guide

RenderReduce is a framework that can be easily integrated into an Xcode project without external tools or packaging requirements.

To include RenderReduce functionality into your app:

  1. Copy the RenderReduce.framework directory into the project
  2. Copy dummy.cpp into the project so that the C++ runtime is linked

Both RenderReduce.framework and dummy.cpp should then be added to the Xcode project. Note that adding dummy.cpp to a Swift based project may display a dialog like "Would you like to configure an Objective-C bridging header?"; select "Don't Create", since the compiled framework already includes a bridging header.

For complete working Objective-C and Swift examples, see the SpriteKitFireAnimation and SKSwiftCompare demo projects.

Once the framework has been added, use the @import statement for an Objective-C based project:

  // GameScene.m
  
  #import "GameScene.h"
  
  @import RenderReduce;
  
  @implementation GameScene
  ...

A Swift based project should make use of the import statement, as follows:

  // GameScene.swift
  
  import SpriteKit
  import RenderReduce

  class GameScene: SKScene {
  ...

The RenderReduce framework provides two classes that implement the texture compression functionality: RRTexture and RRNode. The RRTexture class compresses an existing PNG image and exposes the result as an SKTexture. The texture object is then passed to the RRNode class to create a SpriteKit node. A simplified Objective-C implementation might look like:

  // GameScene.m
  
  - (void) didMoveToView:(SKView*)view {
    // Compression options: "filename" names the PNG image to be encoded
    NSDictionary *options = @{
      @"render": @"reduce",
      @"filename": @"forest_2048_1536_65536_fs.png"
    };
    
    NSMutableDictionary *results = [NSMutableDictionary dictionary];
  
    // Compress the PNG and wrap the compressed data as an RRTexture
    RRTexture *texture = [RRTexture encodeTexture:options results:results];
    
    // Create a SpriteKit node that renders the compressed texture
    SKSpriteNode *background = [RRNode makeSpriteNode:view texture:texture];
    
    ...

The code above both compresses the texture and creates a SpriteKit node. However, this simplified approach may not be optimal in all cases.

In a real application, the slower texture compression step may need to be done at app startup time. For example, scanning and compressing a very large 2000x2000 texture could take a second or more, so moving the compression step out of didMoveToView can make a specific screen display more quickly. Take care, however, not to process a large number of textures on the main thread or in the app startup methods, since this can hurt launch time and memory usage.

See the SpriteKitFireAnimation demo for an example of non-trivial loading of a large number of textures in a secondary thread.
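
As a rough illustration (a minimal sketch, assuming encodeTexture:results: can safely be invoked from a secondary thread, as the SpriteKitFireAnimation demo suggests), the slow encode step can be dispatched to a background queue and the node attached to the scene back on the main thread:

  // GameScene.m
  
  - (void) didMoveToView:(SKView*)view {
    NSDictionary *options = @{
      @"render": @"reduce",
      @"filename": @"forest_2048_1536_65536_fs.png"
    };
    
    // Run the expensive encode step off the main thread
    dispatch_async(dispatch_get_global_queue(QOS_CLASS_UTILITY, 0), ^{
      NSMutableDictionary *results = [NSMutableDictionary dictionary];
      RRTexture *texture = [RRTexture encodeTexture:options results:results];
      
      // SpriteKit node creation and scene updates happen on the main thread
      dispatch_async(dispatch_get_main_queue(), ^{
        SKSpriteNode *background = [RRNode makeSpriteNode:view texture:texture];
        [self addChild:background];
      });
    });
  }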

RenderReduce APIs:

  // RRTexture.h
  
  @interface RRTexture : NSObject
  
    // size of the original texture in pixels
  
    @property (nonatomic, assign) CGSize pixelSize;
  
    // texture size in points
  
    @property (nonatomic, assign) CGSize pointSize;
  
    // SpriteKit texture object that contains the compressed texture
  
    @property (nonatomic, retain) SKTexture *skTexture;
  
    // SpriteKit compiled shader instance
  
    @property (nonatomic, retain) SKShader *shader;
  
    // Encode a compressed texture as a NSData and prepare a
    // compiled texture shader object.
  
    + (RRTexture*) encodeTexture:(NSDictionary*)options
      results:(NSMutableDictionary*)results;
  
  @end

  // RRNode.h
  
  @interface RRNode : NSObject
  
    // The makeSpriteNode API creates a SKSpriteNode that joins
    // a render node to the compressed texture representation.
  
    + (SKSpriteNode*) makeSpriteNode:(SKView*)skView
      texture:(RRTexture*)texture;
  
    // Update the texture and shader objects associated with a Node.
    // This method will update the texture and the shader and possibly
    // rerun node validation logic.
  
    + (BOOL) updateSpriteNode:(SKView*)skView
      texture:(RRTexture*)texture
      node:(SKSpriteNode*)node;

  @end

Typically, a texture is associated with a node when a new SKSpriteNode is created. In some cases, for example when animating frames, the updateSpriteNode API can be used to update the association between an existing node and a new texture. See the SpriteKitFireAnimation demo for an example of this type of usage.
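
As a minimal sketch of this pattern (the showFrame method and its frames argument are hypothetical helpers, not part of the framework API), an existing node could be pointed at a previously encoded frame like this:

  // GameScene.m
  
  // Point an existing sprite node at a previously encoded frame.
  // The frames array holds RRTexture objects that were encoded
  // earlier, for example at startup or on a secondary thread.
  
  - (void) showFrame:(NSUInteger)frameIndex
            frames:(NSArray<RRTexture*>*)frames
              node:(SKSpriteNode*)node
  {
    RRTexture *texture = frames[frameIndex];
    
    if (![RRNode updateSpriteNode:self.view texture:texture node:node]) {
      NSLog(@"updateSpriteNode failed for frame %lu", (unsigned long)frameIndex);
    }
  }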


Colorspace Reduction:

Everyone has seen GIF images online. These still or animated images make use of a very small colorspace that is limited to just 256 colors. The small colorspace leads to good compression, but at the cost of significant quality reduction and jaggy edges due to the lack of partial transparency in the GIF file format.

Typically, when an image is reduced to just 256 colors there is significant quality loss and banding where similar colors merge. In some special cases, reducing down to 256 colors still works well, but normally a larger table is needed to maintain quality. The RenderReduce framework supports variable sized tables for either 24 or 32 BPP images (with partial transparency), so the original images can be compressed with different table sizes to determine which size vs. quality tradeoff works best for the game art.

The ImageMagick command line tool makes this task simple.

Note that the minimal static ImageMagick binary provided here should be installed on Mac OS X systems. If you plan to install from another source, make sure to install version 7.0.2 or newer, since earlier versions contain a known bug related to alpha channels.

Extract the archive into ~/Install (or some other directory) and update the PATH environment variable for your user by adding the following line to the ~/.bash_profile script:

  export PATH=${HOME}/Install/ImageMagick/bin:${PATH}
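
After opening a new shell, one can verify that the expected install is being picked up by printing the version:

  > convert -version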

Since RenderReduce supports variable sized colortables, images can be generated with different table sizes and compared in terms of quality. This is a significant improvement over simple 256 color formats like GIF or PNG8.

Images generated with a limited colortable may be smaller than the original data, but the on-disk size of a PNG file is sometimes inconsistent. The true value of a limited size colortable is seen when the compressed texture data is loaded on the iOS device: runtime memory savings should be significant once the colortable reduced image is processed with the RenderReduce framework. Note that the maximum memory savings of 75% is possible with the minimal table size of 256 colors. For example, a 2048x1536 image stored at 32 BPP occupies about 12 MB of memory, while the same pixels stored as 8 bit indexes into a 256 entry colortable need only about 3 MB.

Reduce with ImageMagick

A colortable with just 256 colors can be generated via:

  > convert TreeFog.png -colors 256 -dither FloydSteinberg -quantize transparent TreeFog_fs_256.png

A colortable with 1024 colors and Floyd-Steinberg dithering can be generated via:

  > convert TreeFog.png -colors 1024 -dither FloydSteinberg -quantize transparent TreeFog_fs_1024.png

An even higher quality 16K table without dithering can be generated via:

  > convert TreeFog.png -colors 16384 TreeFog_16384.png

The maximum realistic colortable of roughly 16 bits (65534 colors) without dithering can be generated via:

  > convert TreeFog.png -colors 65534 TreeFog_65534.png

The original artist will need to review the colortable quantized results to ensure that proper quality is retained in the output images. The variable sized tables supported by RenderReduce represent a significant advantage over simple 256 color formats like GIF or PNG8; a bit of additional review is needed, but this should not be a significant problem. The SKSwiftCompare example is useful for comparing lossless to lossy results on an actual iOS device.