Instagram: A Meta Experience

Project Type
Speculative Design
Project Year
2018
Skills
Trend Analysis, UX Research, Mind Mapping, Personas & Scenario Development
Tools
Miro, Coggle
Deliverables
Collaborative case study done with Market Gravity, Toronto.

A speculative direct-to-consumer experience imagining Instagram in the metaverse. The concept of sharing is re-imagined for a user who prioritizes experiences and living through meaningful moments, and user profiles are reinterpreted as one's digital legacy.

TREND THEME
Augmented Life = Amplified experiences through Effortless interactions

The word augment means to make something better or greater by adding to it.

With the way technology has been seamlessly blending into our lifestyles, it's easy to say that it improves our lives in many ways. Newer innovations in science and technology mean newer, more fascinating trends in the field of augmented lifestyles.

THE CHALLENGE

To celebrate the launch of their new space alongside NUIT BLANCHE, RCHIVE was looking for a cohesive visual package that could support a live experience while being produced on a very tight turnaround. The deliverables included brand-forward logo animations and video posters, a digital fashion runway animation built from photogrammetry showcasing RCHIVE's own clothing line, and a hologram installation that could run reliably in the venue.

Success meant two things:

• Visuals that felt like part of NUIT BLANCHE while pushing the mediums.

• A setup stable enough for multi-screen projection to run for the entirety of NUIT BLANCHE.

The timeline required a workflow that prioritized fast iteration, an engaging yet consistent look, and production-ready exports.


    THE CHALLENGE

    There were two main challenges being explored:

    (a) What are the limits of ChatGPT's ability to translate an ask from one context or medium to another?

    (b) How might we be able to use generative AI to optimize workflows for animation and video editing?

    Approach

    Challenge (a)

    The ChatGPT challenge was explored through two types of references: cultural and procedural. The approach was to use GPT's ability to write code, particularly Python.

    The generated Python script is then run in Blender to build a scene and its animations from the code. The prompts were based on the following:

    • Cultural References: Using GPT to generate scenes with code inspired by a given artist. In this case, as a personal preference and also as a great example of simple forms and colors in 3D space, I chose the contemporary minimalist Dan Flavin.


      No additional information besides the prompt was provided. This was also to understand GPT's capacity to comprehend cultural context based on its own knowledge of the subject.
    • Procedural References: Using GPT to generate scenes with code inspired by two well-known algorithms: the Sierpiński Triangle and Perlin Noise.

    The Sierpiński Triangle is a fractal created by repeatedly subdividing an equilateral triangle and removing the central one, yielding a nested, self-similar pattern. This visual recursion illustrates how a simple rule can produce increasingly intricate designs at each level of iteration.
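As an illustrative aside (plain Python, not part of the project's Blender scripts), the same self-similar pattern can be approximated with the classic "chaos game", which repeatedly jumps halfway toward a randomly chosen corner of the triangle:

```python
import random

# Chaos-game sketch of the Sierpinski triangle: repeatedly move halfway
# toward a randomly chosen corner; the visited points trace out the fractal.
def chaos_game(n_points, seed=0):
    rng = random.Random(seed)  # fixed seed so the sketch is reproducible
    corners = [(0.0, 0.0), (1.0, 0.0), (0.5, 0.866)]
    x, y = 0.25, 0.25  # arbitrary starting point inside the triangle
    points = []
    for _ in range(n_points):
        cx, cy = rng.choice(corners)
        x, y = (x + cx) / 2, (y + cy) / 2  # halfway toward the chosen corner
        points.append((x, y))
    return points

pts = chaos_game(5000)  # scatter-plot these points to see the triangle emerge
```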

    Perlin noise is a procedural function that generates smoothly varying random patterns by interpolating local gradients. Its natural-looking randomness makes it ideal for creating realistic textures and landscapes in computer graphics.
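A minimal one-dimensional sketch of the gradient-interpolation idea (plain Python; the hashing function here is an arbitrary stand-in, not Perlin's original permutation table):

```python
import math

def gradient(i):
    # Arbitrary deterministic "hash" mapping lattice point i to a slope in [-1, 1).
    return math.sin(i * 127.1) * 43758.5453 % 2 - 1

def fade(t):
    # Perlin's easing curve 6t^5 - 15t^4 + 10t^3, which smooths the blend.
    return t * t * t * (t * (t * 6 - 15) + 10)

def noise1d(x):
    i = math.floor(x)
    f = x - i
    # Blend the contributions of the gradients at the two surrounding lattice points.
    a = gradient(i) * f
    b = gradient(i + 1) * (f - 1)
    return a + (b - a) * fade(f)
```

Because adjacent samples share lattice gradients, nearby inputs produce similar outputs; that "smoothly varying" quality is what makes the noise useful for terrain and textures.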

    Challenge (b)

    The second challenge was based on the discovery of a tool, FlowFrames.

    The intent was to see the capabilities, performance and advantages of this tool for independent animators to optimize their workflow.

    OUTCOME (A)

    Cultural context: Dan Flavin

    v1:

    Prompt: Create a script for blender: 1 row of 10 tubelights spaced out equally all lying parrallel to the Y axis inspired by artist Dan Flavin. Make the tubelights move in a wave pattern. Give the tubelights an emissive material that cycles through all the colours sequentially through the tubelights

    import bpy
    import math

    # Function to create a tube light with emissive material
    def create_tube_light(name, location, color):
        bpy.ops.mesh.primitive_cylinder_add(vertices=32, radius=0.1, depth=2, end_fill_type='NGON', location=location, rotation=(math.pi/2, 0, 0))
        tube = bpy.context.object
        tube.name = name

        # Add emission shader
        mat = bpy.data.materials.new(name="EmissionMaterial_{}".format(name))
        mat.use_nodes = True
        mat.node_tree.nodes.remove(mat.node_tree.nodes.get('Principled BSDF'))

        emission = mat.node_tree.nodes.new(type='ShaderNodeEmission')
        emission.inputs["Strength"].default_value = 5
        emission.inputs["Color"].default_value = (*color, 1.0)

        output = mat.node_tree.nodes.get('Material Output')
        mat.node_tree.links.new(output.inputs[0], emission.outputs[0])

        tube.data.materials.append(mat)
        return tube, emission

    # Create tube lights
    num_tube_lights = 10
    spacing = 3
    tube_lights = []

    # RGB colors for the tube lights
    colors = [
        (1, 0, 0), (1, 0.5, 0), (1, 1, 0), (0.5, 1, 0),
        (0, 1, 0), (0, 1, 0.5), (0, 1, 1), (0, 0.5, 1),
        (0, 0, 1), (0.5, 0, 1)
    ]

    for i in range(num_tube_lights):
        x = i * spacing
        color = colors[i % len(colors)]

        tube_light, emission_node = create_tube_light("Tube_{}".format(i), (x, 0, 0), color)
        tube_lights.append((tube_light, emission_node))

    # Animate tube lights in a wave pattern
    for frame in range(0, 375):
        bpy.context.scene.frame_set(frame)

        for i, (tube_light, emission_node) in enumerate(tube_lights):
            x = i * spacing

            # Slow down the wave animation by 50%
            z = 2 * math.sin(x * 0.5 + frame * 0.025)
            tube_light.location.z = z
            tube_light.keyframe_insert(data_path="location", index=2, frame=frame)

            # Change color
            color_index = (i + (frame // 20)) % len(colors)
            emission_node.inputs["Color"].default_value = (*colors[color_index], 1.0)
            emission_node.id_data.keyframe_insert(data_path='nodes["Emission"].inputs[0].default_value', frame=frame)

    v2:

    Prompt: Update the script to make it a seamless loop for 15 seconds at 25 frames per second. Spread out colors more evenly amongst tube lights and make the transitions between colours smoother

    The animation below shows v1 beside v2.

    import bpy
    import math

    # Function to create a tube light with emissive material
    def create_tube_light(name, location, color):
        bpy.ops.mesh.primitive_cylinder_add(vertices=32, radius=0.1, depth=2, end_fill_type='NGON', location=location, rotation=(math.pi/2, 0, 0))
        tube = bpy.context.object
        tube.name = name

        # Add emission shader
        mat = bpy.data.materials.new(name="EmissionMaterial_{}".format(name))
        mat.use_nodes = True
        mat.node_tree.nodes.remove(mat.node_tree.nodes.get('Principled BSDF'))

        emission = mat.node_tree.nodes.new(type='ShaderNodeEmission')
        emission.inputs["Strength"].default_value = 5
        emission.inputs["Color"].default_value = (*color, 1.0)

        output = mat.node_tree.nodes.get('Material Output')
        mat.node_tree.links.new(output.inputs[0], emission.outputs[0])

        tube.data.materials.append(mat)
        return tube, emission

    # Create tube lights
    num_tube_lights = 10
    spacing = 3
    tube_lights = []

    # RGB colors for the tube lights
    colors = [
        (1, 0, 0), (1, 0.5, 0), (1, 1, 0), (0.5, 1, 0),
        (0, 1, 0), (0, 1, 1), (0, 0.5, 1), (0, 0, 1)
    ]

    # Animation settings
    fps = 25
    total_seconds = 15
    total_frames = fps * total_seconds

    # Wave settings
    wave_length = total_frames  # A wave completes in 15 seconds
    amplitude = 2

    for i in range(num_tube_lights):
        x = i * spacing
        color = colors[i % len(colors)]

        tube_light, emission_node = create_tube_light("Tube_{}".format(i), (x, 0, 0), color)
        tube_lights.append((tube_light, emission_node))

    # Animate tube lights in a wave pattern
    for frame in range(total_frames + 1):
        bpy.context.scene.frame_set(frame)

        for i, (tube_light, emission_node) in enumerate(tube_lights):
            x = i * spacing

            # Move in a wave formation
            z = amplitude * math.sin(2 * math.pi * (frame / wave_length + x / (num_tube_lights * spacing)))
            tube_light.location.z = z
            tube_light.keyframe_insert(data_path="location", index=2, frame=frame)

            # Smoothly change color
            color_phase = 2 * math.pi * (i / num_tube_lights + frame / total_frames)
            r = math.sin(color_phase) * 0.5 + 0.5
            g = math.sin(color_phase + 2 * math.pi / 3) * 0.5 + 0.5
            b = math.sin(color_phase + 4 * math.pi / 3) * 0.5 + 0.5
            emission_node.inputs["Color"].default_value = (r, g, b, 1.0)
            emission_node.id_data.keyframe_insert(data_path='nodes["Emission"].inputs[0].default_value', frame=frame)
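As a quick sanity check (illustrative, reusing the constants from the script above), the v2 wave returns exactly to its starting value after the full frame range, which is what makes the 15-second clip loop seamlessly:

```python
import math

# Same constants and wave formula as the v2 script.
fps, total_seconds = 25, 15
total_frames = fps * total_seconds  # 375 frames
amplitude, spacing, num_tube_lights = 2, 3, 10
wave_length = total_frames

def wave_z(i, frame):
    x = i * spacing
    return amplitude * math.sin(2 * math.pi * (frame / wave_length + x / (num_tube_lights * spacing)))

# Frame 0 and frame 375 give identical Z positions for every tube light.
seamless = all(abs(wave_z(i, 0) - wave_z(i, total_frames)) < 1e-9 for i in range(num_tube_lights))
```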

    v3:

    Prompt: Update the script to make it a seamless loop for 15 seconds at 25 frames per second. Spread out colors more evenly amongst tube lights and make the transitions between colours smoother.

    Along with the updated code, I added Array Modifiers on the X and Y axes with a count of 5 to create a sheet of 25 lights per colour. I wanted to expand the visual and create a sense of scale the way Flavin did in some of his work.

    import bpy
    import math

    # Function to create a tube light with emissive material
    def create_tube_light(name, location, color):
        bpy.ops.mesh.primitive_cylinder_add(vertices=32, radius=0.1, depth=2, end_fill_type='NGON', location=location, rotation=(math.pi/2, 0, 0))
        tube = bpy.context.object
        tube.name = name

        # Add emission shader
        mat = bpy.data.materials.new(name="EmissionMaterial_{}".format(name))
        mat.use_nodes = True
        mat.node_tree.nodes.remove(mat.node_tree.nodes.get('Principled BSDF'))

        emission = mat.node_tree.nodes.new(type='ShaderNodeEmission')
        emission.inputs["Strength"].default_value = 5
        emission.inputs["Color"].default_value = (*color, 1.0)

        output = mat.node_tree.nodes.get('Material Output')
        mat.node_tree.links.new(output.inputs[0], emission.outputs[0])

        tube.data.materials.append(mat)
        return tube, emission

    # Create tube lights
    num_tube_lights = 10
    spacing = 3
    tube_lights = []

    # RGB colors for the tube lights
    colors = [
        (1, 0, 0), (1, 0.5, 0), (1, 1, 0), (0.5, 1, 0),
        (0, 1, 0), (0, 1, 1), (0, 0.5, 1), (0, 0, 1)
    ]

    # Animation settings
    fps = 25
    total_seconds = 15
    total_frames = fps * total_seconds

    # Wave settings
    wave_length = total_frames  # A wave completes in 15 seconds
    amplitude = 2

    for i in range(num_tube_lights):
        x = i * spacing
        color = colors[i % len(colors)]

        tube_light, emission_node = create_tube_light("Tube_{}".format(i), (x, 0, 0), color)
        tube_lights.append((tube_light, emission_node))

    # Animate tube lights in a wave pattern
    for frame in range(total_frames + 1):
        bpy.context.scene.frame_set(frame)

        for i, (tube_light, emission_node) in enumerate(tube_lights):
            x = i * spacing

            # Move in a wave formation
            z = amplitude * math.sin(2 * math.pi * (frame / wave_length + x / (num_tube_lights * spacing)))
            tube_light.location.z = z
            tube_light.keyframe_insert(data_path="location", index=2, frame=frame)

            # Smoothly change color
            color_phase = 2 * math.pi * (i / num_tube_lights + frame / total_frames)
            r = math.sin(color_phase) * 0.5 + 0.5
            g = math.sin(color_phase + 2 * math.pi / 3) * 0.5 + 0.5
            b = math.sin(color_phase + 4 * math.pi / 3) * 0.5 + 0.5
            emission_node.inputs["Color"].default_value = (r, g, b, 1.0)
            emission_node.id_data.keyframe_insert(data_path='nodes["Emission"].inputs[0].default_value', frame=frame)

    Procedural context: Sierpiński

    After a lot of trial and error, I was able to get the code below to do what I had intended, resulting in the video alongside. The prompt below is a good place to start to recreate the animation.

    Prompt: Consider yourself an expert Python developer for Blender. I want a single script that creates a 3D Sierpinski pyramid fractal with the following requirements:

    1. Equilateral & Equiangular Pyramid
        - The base must be an equilateral triangle with angles of 60 degrees.  
        - The height of each pyramid must equal its base side length.  
    2. Fractal Details
        - Use a Sierpinski pyramid structure with a fractal recursion level of 6.  
        - Ensure that each recursion subdivides correctly, placing smaller pyramids around the base and one on top.
    3. Flattening & Animation  
        - At the start of the animation (first frame), the entire fractal should be completely flat in 2D (all vertices at Z=0).  
        - Over time (from the first frame to some end frame), it should transition into the full 3D shape.  
        - Each smaller pyramid and all of their vertices must also start at Z=0, then animate to their correct 3D positions.  
        - Animate both the object’s overall Z-position and each vertex’s Z-coordinate, so we see an unfolding effect.
    4. Implementation Details
        - Clear any existing mesh objects before creating the new fractal.  
        - Use Blender’s keyframe system (`keyframe_insert`) to animate all vertices and the objects.  
        - Provide the entire Python script in one piece of code, using Blender’s `bpy` and `mathutils` modules.

    Please give me the complete code that accomplishes all these points in Blender.

    Code + Result

    import bpy
    import mathutils
    import math

    def create_pyramid(size, location):
        height = size  # Set height equal to the base length

        # Define vertices for an equilateral pyramid
        verts = [
            mathutils.Vector((-size/2, size*math.sqrt(3)/6, 0)),
            mathutils.Vector((size/2, size*math.sqrt(3)/6, 0)),
            mathutils.Vector((0, -size*math.sqrt(3)/3, 0)),
            mathutils.Vector((0, 0, height))
        ]

        edges = []
        faces = [(0, 1, 2), (0, 1, 3), (1, 2, 3), (2, 0, 3)]

        mesh_data = bpy.data.meshes.new("pyramid")
        mesh_data.from_pydata(verts, edges, faces)
        mesh_data.update()

        obj = bpy.data.objects.new("Pyramid", mesh_data)
        bpy.context.collection.objects.link(obj)
        obj.location = location

        return obj

    def sierpinski_pyramid(level, size, location):
        if level == 0:
            return [create_pyramid(size, location)]

        half_size = size / 2
        height_offset = half_size
        offset = mathutils.Vector((0, 0, height_offset))

        pyramids = []
        for i in range(3):
            angle = i * math.radians(120)
            rotation_matrix = mathutils.Matrix.Rotation(angle, 4, 'Z')
            new_location = location + rotation_matrix @ mathutils.Vector((half_size, 0, 0))
            pyramids.extend(sierpinski_pyramid(level - 1, half_size, new_location))

        pyramids.extend(sierpinski_pyramid(level - 1, half_size, location + offset))
        return pyramids

    # Clear existing mesh objects
    bpy.ops.object.select_all(action='DESELECT')
    bpy.ops.object.select_by_type(type='MESH')
    bpy.ops.object.delete()

    # Set the level of the fractal
    level = 6  # Increase the fractal level to 6
    size = 2.0
    location = mathutils.Vector((0, 0, 0))

    pyramids = sierpinski_pyramid(level, size, location)

    # Set the start and end frame for the animation
    start_frame = 10
    end_frame = 300

    # Animate each pyramid individually
    for pyramid in pyramids:
        bpy.context.view_layer.objects.active = pyramid
        bpy.ops.object.origin_set(type='ORIGIN_CENTER_OF_MASS', center='BOUNDS')

        # Animate the whole pyramid
        original_location = pyramid.location.copy()
        pyramid.location.z = 0
        pyramid.keyframe_insert(data_path="location", index=2, frame=start_frame)
        pyramid.location = original_location
        pyramid.keyframe_insert(data_path="location", index=2, frame=end_frame)

        # Animate individual vertices
        mesh = pyramid.data
        for vertex in mesh.vertices:
            original_z = vertex.co.z
            vertex.co.z = 0
            vertex.keyframe_insert(data_path="co", index=2, frame=start_frame)
            vertex.co.z = original_z
            vertex.keyframe_insert(data_path="co", index=2, frame=end_frame)

    bpy.context.scene.frame_end = end_frame
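One thing worth noting about the recursion level (an illustrative calculation, not part of the generated script): each level replaces one pyramid with four smaller ones, three around the base and one on top, so the object count grows as 4^level:

```python
# Each recursion step spawns 4 sub-pyramids (3 around the base + 1 on top),
# so a level-n Sierpinski pyramid contains 4**n individual mesh objects.
def pyramid_count(level):
    return 4 ** level

count = pyramid_count(6)  # 4096 objects at the level used above
```

This is why higher recursion levels quickly become heavy to animate: every object and every vertex gets its own keyframes.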

    Procedural context: Perlin

    Again, after some troubleshooting, I was able to get code that created the scenes below. To recreate this scene or something similar with GPT, I would recommend starting with the prompt below.

    Prompt: Consider yourself an expert Python developer working with Blender (bpy). I need a single script that does the following:

    1. Scene and Frame Setup  
      • The animation should be 450 frames long, at 30 FPS.  
      • Set `bpy.context.scene.render.fps = 30` and `bpy.context.scene.frame_end = 450`.
    2. General Requirements
      • We're creating three circles, each made up of small spheres (diameter ~0.1, or radius ~0.05).
      • Each sphere should have an emissive material whose color will change over time.
      • The spheres should also move along the Z-axis using noise or randomness (e.g., Perlin noise) up to ±0.3.
      • The script should keyframe both color changes and the Z-axis movement.
    3. First Circle
      • Placed at `[0, 0, 0]`.
      • Contains 36 spheres arranged in a circle of radius 1.0 on the XY plane.
      • They should cycle smoothly through all colors (think of it as going around the entire color wheel).
      • Spheres are 50% smaller than 0.1 diameter if possible (so radius ~0.05).
      • The Z-axis motion should be keyed (so each sphere’s Z position changes over frames).
    4. Second Circle
      • Placed 1 unit above the first circle on the Z-axis (`[0, 0, 1]`).
      • Has 20 spheres, arranged in a circle scaled down by 30% (so radius ~0.7).
      • The colors should cycle through a double-split complementary harmony set.
      • Each sphere also moves with random or Perlin noise along Z, keyed over frames.
    5. Third Circle
      • Placed 1 unit above the second circle (`[0, 0, 2]`).
      • Has 12 spheres, scaled down by another 30% (radius ~0.49).
      • The color harmony used should be a square harmony.
      • Again, each sphere moves along Z with noise (±0.3) keyed across 450 frames.
    6. Implementation Details
      • Use a helper function to create each “circle” of points, including:
        • Generating a mesh and object (with `bmesh` if needed).
        • Spawning spheres around the circle, each assigned an emission material.
        • Keyframing the sphere colors so they smoothly cycle (using HSV or other transformations).
        • Keyframing the Z-axis movement via Perlin or random noise.
      • Ensure the final script links objects to the scene, updates the view layer, and clearly shows the color cycling and motion when played.
    7. Materials and Animation
      • The color changes for each circle must reflect the specified color harmony. For example:
        • First circle: Cycle through all colors evenly.
        • Second circle: Double-split complementary harmony.
        • Third circle: Square harmony.
      • Each color animation must be keyframed across the entire frame range (frame 0 or 1 through 450).
      • The Z-location of each sphere must also be keyframed from frame 0 or 1 through 450, using noise up to ±0.3.

    Please provide the complete Blender Python script (using `bpy`, `bmesh`, `random`, `mathutils`, etc.) that fulfills these requirements. Make sure to keyframe both the emissive color changes and the Z-axis noise-based movement for all three circles of spheres.
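The color-harmony terms in the prompt can be made concrete as hue offsets on the HSV color wheel. The helper below is a hypothetical sketch (the double-split offsets shown are one common interpretation, not the exact values the generated script used):

```python
import colorsys

# Hypothetical helper: build RGB colors from a base hue plus harmony offsets,
# with all hues expressed on the [0, 1) HSV color wheel.
def harmony_colors(base_hue, offsets):
    return [colorsys.hsv_to_rgb((base_hue + o) % 1.0, 1.0, 1.0) for o in offsets]

# Square harmony: four hues spaced 90 degrees (0.25 of the wheel) apart.
square = harmony_colors(0.0, [0.0, 0.25, 0.5, 0.75])

# One reading of double-split complementary: two hues flanking the base and
# two flanking its complement (offsets of 1/12 of the wheel, i.e. 30 degrees).
double_split = harmony_colors(0.0, [1/12, 11/12, 0.5 - 1/12, 0.5 + 1/12])
```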

    Code + Result

    import bpy
    import colorsys
    import math
    import bmesh
    import random
    import mathutils

    # set frames and fps
    bpy.context.scene.render.fps = 30
    bpy.context.scene.frame_end = 450

    # "noise" scale
    NOISE_SCALE = 0.3

    # helper referenced below but missing from the generated output; this
    # minimal definition keyframes a material's emission colour at a frame
    def animate_material(mat, frame, color):
        emission = mat.node_tree.nodes.get('Emission')
        emission.inputs[0].default_value = (*color, 1.0)
        emission.inputs[0].keyframe_insert(data_path="default_value", frame=frame)

    # function to create a circle with points and colors based on harmony
    def create_colored_circle(name, points, radius, location, harmonies):
        # create mesh and object
        mesh = bpy.data.meshes.new(name)
        obj = bpy.data.objects.new(name, mesh)

        # link object to scene
        bpy.context.collection.objects.link(obj)

        # create bmesh
        bm = bmesh.new()

        spheres = []
        materials = []
        for i in range(points):
            # calculate position
            angle = 2.0 * math.pi * (i / points)
            pos = [radius * math.cos(angle), radius * math.sin(angle), 0]

            # create vertex
            v = bm.verts.new(pos)

            # create material
            mat = bpy.data.materials.new(name=name + str(i))
            mat.use_nodes = True
            mat.node_tree.nodes.remove(mat.node_tree.nodes.get('Principled BSDF'))

            emission_node = mat.node_tree.nodes.new(type='ShaderNodeEmission')
            emission_node.inputs[1].default_value = 1.0  # Strength

            output_node = mat.node_tree.nodes.get('Material Output')
            mat.node_tree.links.new(emission_node.outputs[0], output_node.inputs[0])

            materials.append(mat)

            # create sphere
            bpy.ops.mesh.primitive_uv_sphere_add(radius=0.05, location=pos)
            sphere = bpy.context.object
            sphere.data.materials.append(mat)
            spheres.append(sphere)

        # animate material colors
        for frame in range(bpy.context.scene.frame_end + 1):
            for i, mat in enumerate(materials):
                t = frame / bpy.context.scene.frame_end
                harmony = harmonies[i % len(harmonies)]
                hue = (t + harmony) % 1.0
                color = colorsys.hsv_to_rgb(hue, 1.0, 1.0)
                animate_material(mat, frame, color)

        # animate sphere positions
        for frame in range(bpy.context.scene.frame_end + 1):
            for i, sphere in enumerate(spheres):
                t = frame / bpy.context.scene.frame_end
                # mathutils.noise.noise expects a vector; include the circle's base Z
                sphere.location.z = location[2] + mathutils.noise.noise(mathutils.Vector((t + i / points, 0.0, 0.0))) * NOISE_SCALE
                sphere.keyframe_insert(data_path="location", frame=frame)

        # update mesh
        bm.to_mesh(mesh)
        bm.free()

        # move circle to location
        obj.location = location

    # create circles
    create_colored_circle('circle1', 36, 1.0, [0, 0, 0], [i / 36 for i in range(36)])  # all colors
    create_colored_circle('circle2', 20, 0.7, [0, 0, 1], [0, 0.25, 0.5, 0.75])  # double-split complementary
    create_colored_circle('circle3', 12, 0.49, [0, 0, 2], [0, 0.33, 0.67])  # square

    # refresh scene
    bpy.context.view_layer.update()

    OUTCOME (B)

    Workflow optimization

    I found an AI interpolation tool called FlowFrames. It uses AI to interpolate between frames and upscale a video's frame rate: it fills in the gaps between existing frames, stitches everything together, and renders a video at a higher frame rate, with the option of slowing the footage down for a slow-motion effect. Below is the before and after of the upscaling:

    This tool can really help independent animators, regardless of their 3D modeling or animation software. In this case, working in Blender, I rendered the animation at 15 frames per second with 4096 samples, intentionally pushing the limits with reflections and lighting.
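The "filling in the gaps" idea can be illustrated with a deliberately naive sketch (FlowFrames itself relies on AI optical-flow models that track motion rather than blending pixels, so this only shows the shape of the problem):

```python
# Naive in-between frame: average two frames pixel by pixel. Real interpolators
# estimate motion instead of blending, but the contract is the same:
# two real frames in, one synthetic middle frame out.
def midpoint_frame(frame_a, frame_b):
    return [
        [(pa + pb) // 2 for pa, pb in zip(row_a, row_b)]
        for row_a, row_b in zip(frame_a, frame_b)
    ]

# Two tiny 2x2 grayscale "frames" and their synthesized midpoint:
a = [[0, 0], [0, 0]]
b = [[100, 100], [100, 100]]
mid = midpoint_frame(a, b)  # [[50, 50], [50, 50]]
```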

    Next, to reduce the choppiness of the animation, I used the upscaler to bring it up to 60 fps. With 32 GB of RAM configured on my machine, this took about 3 minutes.

    Overall render time was 70-75% less than expected when using this process to upscale to frame rates of up to 120 FPS.
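The savings follow from simple arithmetic (illustrative numbers matching the setup above): if only the native 15 fps frames are path-traced and the interpolator synthesizes the rest, most of the target frames are never rendered at all.

```python
# Fraction of target-frame-rate frames that never need to be path-traced
# when rendering natively at native_fps and interpolating up to target_fps.
def frames_skipped(native_fps, target_fps):
    return 1 - native_fps / target_fps

sixty = frames_skipped(15, 60)        # 0.75 -> 75% of frames synthesized
one_twenty = frames_skipped(15, 120)  # 0.875 -> 87.5% synthesized
```

This back-of-the-envelope figure is broadly in line with the observed 70-75% reduction, with interpolation overhead accounting for the difference.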

    INSIGHTS

    After all the experimenting, these were some of my takeaways from using AI as a designer/creative:

    1. AI as Co-Creator
      • AI isn't replacing creativity but augmenting it, providing unexpected visuals, approaches and concepts.
      • “Happy accidents” can arise from generative glitches or style mismatches, spurring new creative directions.
    2. Rapid Iteration vs. Render Time
      • While AI speeds up concept-to-prototype work, the final high-quality outputs tend to require significant processing.
      • Refining or re-rendering usually needs to happen multiple times, but these tools help balance quick exploration against real computational constraints.
    3. Prompt Intention & Narrative Matters
      • Purely AI-driven outputs feel hollow without deliberate narrative or direction.
      • The synergy of contextual knowledge, strong descriptive vocabulary, and real-world references often creates more engaging outputs.
    4. Potential & Perils
      • It's easy to get lost in endless AI prompts or style transfers; the key skill lies in curating and shaping the final piece.
      • There are real ethical and aesthetic considerations, especially around copyright and plagiarism. There is a fine line between homage and appropriation, and ensuring credit is given where it's due is key.

    THE CHALLENGE

    Enhancing the Toronto Raptors App to deliver a more intuitive, engaging, and user-friendly experience by improving navigation, real-time game tracking, and fan interaction.

    The Toronto Raptors mobile app lacked an intuitive and engaging user experience, making it difficult for fans to access real-time game updates, team statistics, and arena information efficiently. Key usability issues included:

    • A cluttered menu with redundant navigation items.
    • Inefficient information hierarchy, making it difficult for users to find relevant information.
    • A lack of interactivity and personalization, reducing overall fan engagement.

    INSIGHTS

    My redesign was informed by user research and competitive analysis of leading sports apps to improve usability, accessibility, and fan engagement.

    User Research:
    Key Challenges

    Interviews with Raptors fans revealed major usability issues:

    • Difficult Score & Stats Access
      • Fans rely on the app for scores, standings, and game summaries, but these features are buried under multiple screens.
    • Confusing Navigation
      • The menu is cluttered, and standings are easier to find on the NBA app. The gradient UI reduces readability.
    • Game-Day Friction
      • Attendees struggle to locate arena maps, seating, and food ordering, with many unaware of key features.
    • Limited Community Engagement
      • Fans prefer passive updates (highlights, real-time commentary) over chat features. The app lacks social-media-style engagement.

    Competitive Analysis:
    Best Practices from Other Apps

    To inform the redesign, I analyzed industry-leading sports apps (e.g., NBA Official, F1) to identify UX improvements for real-time sports tracking and fan engagement:

    • Persistent Score Tracking – Key game updates should be always accessible.
    • Card-Based Stats – Breaking down player data improves readability, especially for new fans.
    • Game-Day Utility – Contextual arena info (seating, food orders) should surface only when relevant.
    • Simplified Navigation – Reducing menu options improves user flow and accessibility.

    Redesign Takeaways

    • Prioritize instant access to scores, standings, and play-by-play updates.
    • Streamline navigation and improve UI readability.
    • Enhance game-day experience with relevant arena info.
    • Introduce social-media-inspired engagement features.

    APPROACH

    To solve the usability and engagement challenges identified in research, I followed a user-centered, iterative design process, focusing on navigation simplification, interaction improvements, and branding refinements.

    Wireframing

    I explored different layout structures to simplify navigation and ensure real-time updates were easily accessible:

    Home Page Wireframe

    • Home Page Redesign – Introduced a fixed scorecard with swipeable past, live, and future games, allowing users to track scores effortlessly.
    • Navigation Optimization – Reduced the menu from six to four buttons, streamlining access to core features.
    • Scrolling Video Feed – Added a scrolling feed of video highlights.

    Arena Wireframe

    • Arena Page Improvements – Resized banners for better readability, and prioritized key game-day functions (seating maps, food orders) at the top.
    • Dynamic Game-Day Mode – Implemented a context-aware layout that adapts based on game status. On game days, the arena page prioritizes real-time updates, seating maps, and food ordering, while on non-game days, it highlights upcoming events and ticket purchasing options.

    Team Stats Wireframe

    • Standings Accessibility – Repositioned standings to be more prominent, addressing user frustration with finding this information.
    • Swipeable Team & Game Stats – Implemented card-based layouts with clear spacing and tooltips to make stats more digestible, especially for casual fans.

    Widget Wireframe

    • With the introduction of widgets on iOS, I wireframed widgets that let users personalize their game-tracking experience.
    • Small, medium and large widget options allow for more customisation of the user's home screen.
    • Game-flow graphics allow for an at-a-glance summary of the game.

    Visual & Branding Refinements

    To align the redesign with the Raptors’ We The North identity, we incorporated logo, colors, typography, and mood board inspiration to ensure a cohesive and visually compelling experience:

    Logo & Branding

    The primary Raptors logo and emblem were integrated throughout the UI, reinforcing team identity and maintaining a strong visual connection to the brand.

    Primary Logo
    Primary Emblem
    Additional Emblems

    Colours

    The official Raptors color palette was applied strategically across the interface to ensure high contrast and strong visual impact.

    RED

    PANTONE:
    PMS 200 C

    HEX COLOR:
    #CE1141

    RGB:
    (206, 17, 65)

    CMYK:
    (0, 100, 65, 15)

    BLACK


    PANTONE:
    PMS BLACK 6C

    HEX COLOR:
    #000000

    RGB:
    (0, 0, 0)

    CMYK:
    (30, 0, 0, 100)

    SILVER


    PANTONE:
    PMS COOL GRAY 8C

    HEX COLOR:
    #A1A1A4


    RGB:
    (161, 161, 164)

    CMYK:
    (0, 1, 0, 43)

    GOLD


    PANTONE:
    PMS 872 C

    HEX COLOR:
    #B4975A

    RGB:
    (180, 151, 90)

    CMYK:
    (20, 30, 70, 15)
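    The hex and RGB values in the palette above can be cross-checked with a few lines of code; this is a generic conversion sketch for verification, not part of the brand guide itself:

    ```python
    def hex_to_rgb(hex_color):
        """Convert a '#RRGGBB' string to an (R, G, B) tuple of ints."""
        h = hex_color.lstrip("#")
        return tuple(int(h[i:i + 2], 16) for i in range(0, 6, 2))

    # Raptors palette values from the brand table above.
    palette = {"RED": "#CE1141", "SILVER": "#A1A1A4", "GOLD": "#B4975A"}
    ```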

    Typography

    Millionaire serves as the primary signature font for branding elements, reinforcing the bold and dynamic Raptors aesthetic.

    Roboto was selected for UI readability, ensuring a modern, clean, and legible text experience for stats, menus, and real-time updates.

    The secondary signature font is Chinese Rocks, used to reinforce the bold, rebellious, and distinctive branding of the Toronto Raptors' slogan, "We the North".

    Mood Board Influence

    The mood board guided the visual direction, drawing inspiration from urban Toronto culture, the energy of the city, and the electricity of live games.


    The goal of the final UI was to reflect a sleek, high-energy aesthetic while maintaining usability and accessibility.

    OUTCOME

    Home Page

    Team Page

    Home Page - Live Game
    Home Page - Past Game
    Home Page - Next Game
    Team - Previous Game Stats
    Team - Stats & Standings
    Team Page - Roster

    Arena Page & Simplified Side Menu

    Widgets

    Live Game - Small
    Next Game - Small
    Next Game - Medium
    Live Game - Medium
    Live Game - Large
    Next Game - Large
    • Created high-fidelity Figma prototypes to test usability and refine interactions before finalizing the redesign.




    Below are some mockups of the widgets on the home screen:

    THE CHALLENGE

    Systems are being developed as one-size-fits-all.
    Our systems need to start thinking the way we think instead.

    INSIGHTS

    As new breakthroughs are made in brain-computer interaction and neuroscience, the way we interface with cyberspace is going to change completely. Interfaces have been developed based on an aggregate of user behaviours: everyone thinks and communicates differently, yet we are all required to use the same aggregate-based interface archetype. As we better understand how our brains work, so should our devices.

    Now → Will be (+5–7 years) → Will happen (+10–15 years)

    At present we interface with the cloud by way of "data mediums" – our daily devices – which we interact with and which communicate with the cloud to return a response. As devices become closer and more connected to us, the way we interact with the cloud will also change.

    Current Interfaces
    Future Interfaces

    Instead of using devices to interact with the cloud, we may eventually be able to interface directly through our cognition. The resulting service will need to operate proactively and provide responses with intrinsic value in order to keep pace.

    The above diagram maps out all current services that provide users with a proactive service that utilises intrinsic data. The value opportunity here is a proactive digital assistant that provides intrinsically relevant responses to user queries.

    Approach

    Personality, temperament, moods, and thoughts are all factors that influence each other over time, but at the core of it is the thought process. The diagram below is a representation of approach versus affect. Until technologies are capable of understanding human thought, one way to start unpacking such a complicated system is from the outside in, beginning with key personality traits.

    There are many different techniques that are employed to determine personality traits. One such method is by using the Myers-Briggs Type Indicator (MBTI). It categorises personalities based on the following four spectrums:

    Where does one get their energy?
    - Introversion (I) vs Extroversion (E)

    How do they take in information?
    - Sensing (S) vs Intuition (N)

    How do they base their decisions?
    - Thinking (T) vs Feeling (F)

    How do they organise their world?
    - Judging, or structure (J), vs Perceiving, or flexibility (P)

    Depending on the letters of a personality type, it can be further categorised as a Sentinel, Analyst, Explorer, or Diplomat.

    Sentinels

    Sentinels are helpful and exceptionally pragmatic, embracing and creating order, security, and stability wherever they go. Individuals in this category are generally persevering, meticulous, and traditional. They succeed in logistical or administrative fields, particularly those that depend on clear processes and rules. These character types like to adhere to their plans and don't avoid taking on challenges – however, they can be inflexible and reluctant to accept different points of view.

    Diplomats

    Diplomats prioritize empathy and cooperation, and do well in areas such as diplomacy and counseling. They are imaginative and often take on the role of peacekeeper in their workplace or social circles. Diplomats are warm, empathic, and influential individuals, but can struggle to make purely rational decisions in tough situations.

    ISTJ
    ISFJ
    INFJ
    INTJ
    ISTP
    ISFP
    INFP
    INTP
    ESTP
    ESFP
    ENFP
    ENTP
    ESTJ
    ESFJ
    ENFJ
    ENTJ

    Analysts

    These personality types are inclined toward rationality and impartiality, excelling in fields of science, technology, and intellectual debate. Other character traits include being independent, open-minded, strong-willed, and imaginative, as well as approaching situations from a utilitarian perspective. Their priorities often lie in what works rather than what satisfies everybody. These traits make Analysts excellent strategic thinkers.

    Explorers

    Explorers are utilitarian, practical, and excel at thinking on their feet. They are masters of craft and technique, applying their skills from multiple points of view – from mastering a discipline to pursuing new ones. These character types are indispensable in emergencies, craft, and sales. Their qualities can also push them towards taking on risky challenges or zeroing in exclusively on sensory experiences.
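    The four role groups follow a simple rule over the type letters: intuitive (N) types split into Analysts or Diplomats on the Thinking/Feeling axis, while sensing (S) types split into Sentinels or Explorers on the Judging/Perceiving axis. A minimal sketch of that mapping, for illustration only:

    ```python
    def mbti_role(mbti_type):
        """Map a four-letter MBTI type to its role group (illustrative)."""
        t = mbti_type.upper()
        if t[1] == "N":  # intuitive types split on Thinking vs Feeling
            return "Analysts" if t[2] == "T" else "Diplomats"
        # sensing types split on Judging vs Perceiving
        return "Sentinels" if t[3] == "J" else "Explorers"
    ```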

    OUTCOME

    NEO is a two-fold cloud service.
    INFRASTRUCTURE AS A SERVICE: Virtual Private Cloud

    First, it is an Infrastructure as a Service (IaaS): a virtual private cloud that allows for secure private storage of data as well as running cloud-based applications – typically a service offered only to enterprise customers. Given the growth of cloud storage demand amongst everyday users over the last few years, however, companies should prepare to provide enterprise-level cloud services to consumers in the coming years.

    SOFTWARE AS A SERVICE: NEO - DIGITAL BUTLER

    Second, it is a Software as a Service (SaaS): an interface in the form of an advanced virtual assistant, modeled to respond and access information based on the user's contextual psychographic data as well as personality indicators. The result is interactions synthesized through digital empathy for the user.

    Value As Delivery

    The following is a Value As Delivery model which maps out the value being delivered, revenue, costs, key partners and participants in the platform.

    COST BENEFIT

    What is the potential market value of the service to a web company like Amazon?

    As of Q4 2020, Amazon Web Services led the global cloud services market with a 31% share – approximately $44.2 billion in revenue.

    Current enterprise cost rate of a VPC-NAT: ~$0.045/hour

    Cost for 16 hours a day:

    $0.72/day
    $21.60/month
    $259.20/year

    As of 2021, Amazon reported over 146.8 million Prime members.
    Multiplying the annual service cost by the current user base,
    Amazon's potential market value for this service is valued at

    $ 38 Billion
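    The figures above can be reproduced with straightforward arithmetic; every input below comes from the numbers quoted in this section:

    ```python
    # Reproducing the cost-benefit arithmetic from the section above.
    hourly_rate = 0.045              # $/hour, enterprise VPC-NAT rate
    hours_per_day = 16

    daily = hourly_rate * hours_per_day        # ≈ $0.72 per day
    monthly = daily * 30                       # ≈ $21.60 per month
    yearly = monthly * 12                      # ≈ $259.20 per year

    prime_members = 146_800_000                # Prime membership reported in 2021
    potential_market = yearly * prime_members  # ≈ $38 billion
    ```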

    THE CHALLENGE

    Instagram was created specifically to share pictures. However, questions to consider are...

    Why do we take pictures?

    What value will Instagram bring to the meta-verse?

    Why do we want to remember?

    INSIGHTS

    Mind Map / Brainstorm

    Using Augmented Life as a starting point, associated opportunities were mapped out in a brainstorming session:

    Following the brainstorming session, interviews were conducted to understand what makes an experience valuable and/or memorable. Mapping the findings along positive versus negative sentiment and memorable versus non-memorable moments gave us insight into where the opportunity lies.

    Defining a Legacy from a Moment

    As we realise the permanence of the digital age, the value of digital legacy is becoming more apparent. Addressing the building blocks of a legacy can also determine what defines a valuable moment.

    A unique moment occurs when there is intrinsic value found in any one of these characteristics.
    Memories consist of multiple moments, linked by sequence or value.
    Stories are built by sharing a collection of these memories to create a narrative.
    Legacies are made up of a collection of experiences or stories.
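    The moment → memory → story → legacy hierarchy reads as a simple nested data model. The class and field names below are illustrative assumptions, not a real Instagram schema:

    ```python
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Moment:
        description: str
        has_intrinsic_value: bool = False   # what makes a moment "unique"

    @dataclass
    class Memory:
        moments: List[Moment] = field(default_factory=list)   # by sequence or value

    @dataclass
    class Story:
        memories: List[Memory] = field(default_factory=list)  # shared as a narrative

    @dataclass
    class Legacy:
        stories: List[Story] = field(default_factory=list)    # a collection of stories
    ```

    Each level only aggregates the one below it, which mirrors how the concept model builds a legacy out of individually captured moments.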

    Approach

    Building the vision

    Understanding Instagram's current vision model was key in developing the Augmented vision model.

    PHILOSOPHY

    PRESENT

    To inspire creativity

    Augmented

    To capture memories

    PURPOSE

    PRESENT

    Take better mobile pictures and share them quickly & efficiently.

    Augmented

    Capture and share intrinsic moments intuitively

    VALUES

    PRESENT

    Community first. Inspire creativity. Simplicity matters.

    Augmented

    Community first. Inspire creativity. Simplicity matters.

    RITUAL

    PRESENT

    Browsing feed. Upload pictures. Post a story.​

    Augmented

    Capture a moment. Create a legacy.
    Reflect on memories.

    OUTCOME

    In Conclusion...

    Why do we take pictures?

    To manifest a recollectable moment of intrinsic value

    What value will Instagram bring to the meta-verse?

    Memories

    Why do we want to remember?

    To either learn or to enjoy

    It is important to understand the historical and personal value of the legacy.

    The Memoir page provides the user a more personalized and in-depth understanding of the historical and emotional value connected to the assets. As a result, the assets become part of not only a physical realm but also an emotional one. The connection between memoirs and assets creates a more valuable experience for the user and gives them the initiative to continue and care for the legacy of the benefactor.

    Providing the user awareness of their options, services, and next steps.

    The Financial Doula is a service provided through a digital platform, giving the user the freedom to take the next steps at their own pace and time. Millennials and younger generations prefer quick access to information, so a digital application meets these needs. The Doula is also an educational resource, providing a financial glossary to aid the user in furthering their understanding of the process.

    Merging a digital platform with a physical space creates a link between generations.

    The Serene space provides baby boomers a physical platform to easily exchange their legacy. It has been noted that older generations like baby boomers prefer face-to-face interaction when it comes to their finances. Therefore, Serene brings two generations together in a peaceful and welcoming environment to openly converse about the beginning or continuation of a legacy.

    What will your legacy be?

    THE CHALLENGE

    By studying the routine of habitual smokers,

    How might we deliver an experience that...

    emulates stress relief?

    is immersive?

    is familiar but relaxing?

    INSIGHTS

    Insights were synthesized from interviews by developing an experience map of a habitual smoker:

    Approach

    1. Create visual narrative
    2. 3D scene development
    3. Storyboarding
    4. Animate
    5. Convert to VR experience

    OUTCOME

    FLOW MAP

    1. Put on headset
    2. Walk around for spatial calibration
    3. Initiate experience
    4. Interact with meditative experience

    THE CHALLENGE

    How might we...

    improve the subscription service based environment?

    give more control to the subscription users?

    simplify the management of services?

    INSIGHTS

    Personas & Themes

    Key themes that were of value to specific demographics:

    Albert
    Young Professional

    While most users browse freely, others are more decisive and like to maintain accountability.

    Bella
    Student

    Subscription packages are more approachable at student, family, and shared rates.

    Watson
    Independent Business Owner

    Users have trouble juggling multiple subscriptions and often contemplate if they are worth the cost.

    Maya
    New mother

    A single subscription is often shared widely amongst friends and family members.

    Empathy Map

    Insights were further mapped to understand what the user says, thinks, does, and feels.

    OUTCOME

    The following were the service features that were developed from the research:

    Wallet

    The landing page allows users to see all their subscriptions at a glance, along with how many others share each account and whether services are being split.

    SPLIT SERVICES

    Users can see who their subscriptions are being split with.

    SPLIT INVOICES

    Transparency on how much each user pays for the split services.

    SHARED SERVICES

    Users can see who the primary account holder is on shared accounts, such as a family subscription, since only one user is being billed.

    Feedback

    After prototyping, feedback provided some valuable critique:

    What worked

    - The app communicates wanted information clearly.


    - The idea is original and could be seamlessly integrated into any user's life.


    - Navigation is smooth and for the most part intuitive.

    What didn't work

    - "I'd like to see a direct link between Splyt and a service in use."


    - Streaming businesses have no immediate financial incentive.


    - The primary account holder should be more visible and made clearer.

    Other Projects

    Get in touch...

    Have an interesting project that you'd like to collaborate on? I'm always excited to work with others. Feel free to reach out for any feedback, questions, opportunities or even just to say hello.
