
Emerging Technologies: Portfolio

Campus Wayfinder showcase

Alternate YouTube link – https://youtu.be/_vrT1ESJ0qk

Campus Wayfinder link – https://campuswayfinderar.netlify.app

5-minute production video

Logo

Final Posters

Secondary poster, made for an alternate route.

Early Concept Poster Mockup

Code

<!doctype html>
<html lang="en">
<head>
  <meta charset="utf-8" />
  <meta name="viewport" content="width=device-width, initial-scale=1, viewport-fit=cover" />
  <title>Campus Wayfinder</title>

  <style>
    /* simple palette */
    :root{
      --navy:#0b1230;
      --lime:#b8ff3b;
      --text:#ffffff;
      --card:rgba(11,18,48,.78);
      --border:rgba(255,255,255,.10);
      --shadow:0 12px 30px rgba(0,0,0,.35);
    }
    html,body{margin:0;height:100%;background:#000;font-family:system-ui,-apple-system,Segoe UI,Roboto,Arial}
    *{box-sizing:border-box}

    /* camera */
    #cam{
      position:fixed; inset:0;
      width:100vw; height:100vh;
      object-fit:cover;
      background:#000;
    }

    /* splash */
    #splash{
      position:fixed; inset:0;
      display:flex; flex-direction:column;
      justify-content:center; align-items:center;
      gap:18px;
      background:linear-gradient(180deg,var(--navy),#040812);
      z-index:50;
      transition:opacity .6s ease;
    }
    #splash.hide{opacity:0; pointer-events:none; visibility:hidden}
    #splash img{width:min(260px,64vw); height:auto; filter:drop-shadow(0 14px 28px rgba(0,0,0,.35))}
    #spinner{
      width:28px;height:28px;border-radius:50%;
      border:3px solid rgba(184,255,59,.25);
      border-top-color:var(--lime);
      animation:spin 1s linear infinite;
    }
    @keyframes spin{to{transform:rotate(360deg)}}

    /* menu */
    #menu{
      position:fixed;
      left:50%; top:52%;
      transform:translate(-50%,-50%);
      width:min(380px,92vw);
      background:var(--card);
      border:1px solid var(--border);
      border-radius:22px;
      padding:18px 16px;
      color:var(--text);
      z-index:40;
      backdrop-filter:blur(10px);
      box-shadow:var(--shadow);
      opacity:0;
      pointer-events:none;
      transition:opacity .35s ease;
    }
    #menu.show{opacity:1; pointer-events:auto}
    #menu h1{margin:6px 0 6px; font-size:20px}
    #menu p{margin:0 0 12px; font-size:13px; opacity:.9; line-height:1.35}

    .btnRow{display:flex; gap:10px}
    button{
      appearance:none; border:0;
      border-radius:16px;
      padding:12px 14px;
      font-weight:800;
      cursor:pointer;
      transition:transform .06s ease, filter .12s ease, background .12s ease;
      flex:1;
    }
    button:active{transform:translateY(1px)}
    .btnPrimary{background:var(--lime); color:#061024}
    .btnPrimary:active{filter:brightness(.88)}
    .btnGhost{background:rgba(255,255,255,.12); color:var(--text)}
    .btnGhost:active{background:rgba(255,255,255,.18)}

    /* HUD */
    #hud{
      position:fixed; inset:0;
      z-index:20;
      pointer-events:none;
      display:flex; flex-direction:column;
      padding:12px 12px calc(14px + env(safe-area-inset-bottom));
      gap:10px;
      opacity:0;
      transition:opacity .3s ease;
    }
    #hud.on{opacity:1}

    #topCard{
      align-self:center;
      width:min(520px,94vw);
      background:var(--card);
      border:1px solid var(--border);
      border-radius:20px;
      padding:12px 14px;
      color:var(--text);
      backdrop-filter:blur(10px);
      box-shadow:var(--shadow);
    }
    #topTitle{font-weight:900; font-size:16px; margin:0}
    #topSub{margin:2px 0 0; font-size:12px; opacity:.88}

    /* passing toast */
    #toast{
      align-self:center;
      width:min(520px,94vw);
      overflow:hidden;
      border-radius:999px;
      background:rgba(184,255,59,.92);
      color:#061024;
      box-shadow:var(--shadow);
      padding:10px 14px;
      font-weight:900;
      font-size:13px;
      white-space:nowrap;
      opacity:0;
      transform:translateX(-20px);
    }
    .toastIn{animation:toastIn .35s ease forwards}
    .toastOut{animation:toastOut .35s ease forwards}
    @keyframes toastIn{to{opacity:1;transform:translateX(0)}}
    @keyframes toastOut{to{opacity:0;transform:translateX(20px)}}

    /* center arrow */
    #center{
      flex:1;
      display:flex;
      flex-direction:column;
      align-items:center;
      justify-content:center;
      gap:14px;
    }
    #arrowWrap{
      width:180px; height:180px;
      display:grid; place-items:center;
      position:relative;
    }
    #semi{
      position:absolute;
      width:210px; height:105px;
      border-radius:210px 210px 0 0;
      background:rgba(11,18,48,.55);
      border:1px solid var(--border);
      bottom:-28px;
      backdrop-filter:blur(10px);
    }
    #arrow{
      width:90px; height:90px;
      transform:rotate(0deg);
      transition:transform .08s linear;
      filter:drop-shadow(0 14px 30px rgba(0,0,0,.35));
    }
    #arrow path{fill:var(--lime)}

    #distancePill{
      background:var(--card);
      border:1px solid var(--border);
      border-radius:999px;
      padding:10px 14px;
      color:var(--text);
      font-weight:900;
      font-size:13px;
      backdrop-filter:blur(10px);
      box-shadow:var(--shadow);
      display:inline-flex;
      align-items:center;
      max-width:min(520px,94vw);
      white-space:nowrap;
    }
    #distanceText{
      overflow:hidden;
      text-overflow:ellipsis;
      max-width:100%;
    }

    #footer{
      align-self:center;
      width:min(520px,94vw);
      display:flex;
      gap:10px;
      pointer-events:auto;
    }
    #footer button{border-radius:16px; padding:10px 12px; font-weight:900}
  </style>
</head>

<body>
  <video id="cam" autoplay playsinline muted></video>

  <div id="splash">
    <img src="./assets/logo.svg" alt="Campus Wayfinder" />
    <div id="spinner"></div>
  </div>

  <div id="menu">
    <h1>Campus Wayfinder</h1>
    <p>Allow <b>Camera</b>, <b>Location</b> and <b>Motion</b> when asked.</p>
    <div class="btnRow">
      <button id="startBtn" class="btnPrimary">Start Navigation</button>
      <button id="resetBtn" class="btnGhost">Reset</button>
    </div>
  </div>

  <div id="hud">
    <div id="topCard">
      <p id="topTitle">Destination: Spoons</p>
      <p id="topSub">Estimated time: 5 minutes</p>
    </div>

    <div id="toast"></div>

    <div id="center">
      <div id="arrowWrap">
        <svg id="arrow" viewBox="0 0 100 100">
          <path d="M50 7 L76 55 H60 V92 H40 V55 H24 Z"></path>
        </svg>
        <div id="semi"></div>
      </div>

      <div id="distancePill">
        <span id="distanceText">Distance to destination: -- m</span>
      </div>
    </div>

    <div id="footer">
      <button id="stopBtn" class="btnGhost" style="flex:1">Stop</button>
    </div>
  </div>

<script>
  // route points (arrows aim to next waypoint)
  const ROUTE = [
    { name:"Start (Poster)",      lat:53.7698512154046,  lon:-0.36838928710517393 },
    { name:"Waypoint 2",          lat:53.7702119222983,  lon:-0.3683582932499405  },
    { name:"Waypoint 3",          lat:53.77063949566153, lon:-0.36830550551962915 },
    { name:"Waypoint 4",          lat:53.771121892644885,lon:-0.3682456100146112  },
    { name:"Waypoint 5",          lat:53.77167677372432, lon:-0.36812691354069216 },
    { name:"Waypoint 6",          lat:53.77162317714752, lon:-0.3671756992473263  },
    { name:"Destination (Spoons)",lat:53.77173654295845, lon:-0.36701474664587863 }
  ];
  const DEST = ROUTE[ROUTE.length - 1];

  // buildings (for "you are passing" message)
  const LANDMARKS = [
    { name:"Larkin Building",           lat:53.770298999999994, lon:-0.368293 },
    { name:"Chemistry Building",        lat:53.77096705860541,  lon:-0.3679494882103711 },
    { name:"Brynmor Jones Library",     lat:53.771161,          lon:-0.368523 },
    { name:"Robert Blackburn Building", lat:53.77138270374397,  lon:-0.36848045910014093 },
    { name:"Hardy Building",            lat:53.77142592119492,  lon:-0.3681547880130722 },
    { name:"Canham Turner Building",    lat:53.77172530620078,  lon:-0.36811243578327435 },
    { name:"Gulbenkian Centre",         lat:53.77169643461599,  lon:-0.367372587707564 },
    { name:"Student Hub (SU • SPAR • Wetherspoons)", lat:53.77169299417767, lon:-0.36694099722237183 }
  ];

  // values I tweaked while testing
  const ARRIVE_RADIUS = 18;      // metres
  const LANDMARK_RADIUS = 25;    // metres
  const LANDMARK_COOLDOWN = 45000;

  // elements
  const cam = document.getElementById('cam');
  const splash = document.getElementById('splash');
  const menu = document.getElementById('menu');
  const hud = document.getElementById('hud');
  const toast = document.getElementById('toast');
  const arrow = document.getElementById('arrow');
  const distanceText = document.getElementById('distanceText');

  const startBtn = document.getElementById('startBtn');
  const resetBtn = document.getElementById('resetBtn');
  const stopBtn = document.getElementById('stopBtn');

  // show menu after splash
  setTimeout(() => {
    splash.classList.add('hide');
    menu.classList.add('show');
  }, 5000);

  // state
  let watchId = null;
  let routeIndex = 0;
  let userPos = null;
  let heading = null; // 0..360
  let running = false;
  const lastToast = new Map();

  // helpers
  function toRad(d){ return d * Math.PI / 180; }
  function toDeg(r){ return r * 180 / Math.PI; }

  function dist(a,b){
    const R = 6371000;
    const dLat = toRad(b.lat - a.lat);
    const dLon = toRad(b.lon - a.lon);
    const lat1 = toRad(a.lat);
    const lat2 = toRad(b.lat);
    const x = Math.sin(dLat/2)**2 + Math.cos(lat1)*Math.cos(lat2)*Math.sin(dLon/2)**2;
    return 2 * R * Math.atan2(Math.sqrt(x), Math.sqrt(1-x));
  }

  function bearing(a,b){
    const lat1 = toRad(a.lat), lat2 = toRad(b.lat);
    const dLon = toRad(b.lon - a.lon);
    const y = Math.sin(dLon) * Math.cos(lat2);
    const x = Math.cos(lat1)*Math.sin(lat2) - Math.sin(lat1)*Math.cos(lat2)*Math.cos(dLon);
    return (toDeg(Math.atan2(y,x)) + 360) % 360;
  }

  function norm(d){
    d = d % 360;
    return d < 0 ? d + 360 : d;
  }

  function formatM(m){
    if (!isFinite(m)) return '--';
    if (m < 1000) return Math.round(m) + ' m';
    return (m/1000).toFixed(2) + ' km';
  }

  function showToast(msg){
    toast.textContent = msg;
    toast.classList.remove('toastIn','toastOut');
    void toast.offsetWidth;
    toast.classList.add('toastIn');
    setTimeout(() => {
      toast.classList.remove('toastIn');
      toast.classList.add('toastOut');
    }, 2200);
  }

  async function startCamera(){
    const stream = await navigator.mediaDevices.getUserMedia({
      video: { facingMode: { ideal:"environment" } },
      audio: false
    });
    cam.srcObject = stream;
  }

  async function askMotion(){
    // iOS Safari requires an explicit permission request, otherwise
    // compass heading events may never fire.
    try{
      if (typeof DeviceOrientationEvent !== "undefined" &&
          typeof DeviceOrientationEvent.requestPermission === "function"){
        await DeviceOrientationEvent.requestPermission();
      }
      if (typeof DeviceMotionEvent !== "undefined" &&
          typeof DeviceMotionEvent.requestPermission === "function"){
        await DeviceMotionEvent.requestPermission();
      }
    }catch(e){
      // ignored: navigation still runs, but the arrow may not rotate
    }
  }

  function startCompass(){
    window.addEventListener('deviceorientation', (e) => {
      let h = null;
      if (typeof e.webkitCompassHeading === 'number'){
        h = e.webkitCompassHeading;
      } else if (typeof e.alpha === 'number'){
        h = norm(360 - e.alpha);
      }
      if (h !== null && isFinite(h)){
        heading = h;
        if (running) tick();
      }
    }, { passive:true });
  }

  function startGPS(){
    watchId = navigator.geolocation.watchPosition(
      (p) => {
        userPos = { lat:p.coords.latitude, lon:p.coords.longitude };
        if (running) tick();
      },
      () => { /* position errors ignored; the HUD simply stops updating */ },
      { enableHighAccuracy:true, maximumAge:1000, timeout:12000 }
    );
  }

  function checkLandmarks(){
    if (!userPos) return;
    for (const lm of LANDMARKS){
      const d = dist(userPos, lm);
      if (d <= LANDMARK_RADIUS){
        const last = lastToast.get(lm.name) || 0;
        if (Date.now() - last > LANDMARK_COOLDOWN){
          lastToast.set(lm.name, Date.now());
          showToast("You are passing: " + lm.name);
        }
      }
    }
  }

  function tick(){
    if (!userPos) return;

    // always show distance to final destination
    const dDest = dist(userPos, DEST);
    distanceText.textContent = "Distance to destination: " + formatM(dDest);

    // waypoint advance
    const wp = ROUTE[routeIndex];
    const dWp = dist(userPos, wp);
    if (dWp <= ARRIVE_RADIUS && routeIndex < ROUTE.length - 1){
      routeIndex++;
    }

    // arrow points to current waypoint
    const target = ROUTE[routeIndex];
    const b = bearing(userPos, target);
    const rel = (heading == null) ? b : norm(b - heading);

    arrow.style.transform = `rotate(${rel}deg)`;

    checkLandmarks();
  }

  async function startNav(){
    try{
      await startCamera();
    }catch(e){
      alert("Camera permission is required.");
      return;
    }

    await askMotion(); // if user blocks it, arrow still points but won't rotate properly

    if (!navigator.geolocation){
      alert("Geolocation not supported.");
      return;
    }

    startCompass();
    startGPS();

    running = true;
    hud.classList.add('on');
    menu.classList.remove('show');
    tick();
  }

  function resetNav(){
    routeIndex = 0;
    lastToast.clear();
    tick();
  }

  function stopNav(){
    running = false;
    hud.classList.remove('on');
    menu.classList.add('show');
    if (watchId !== null){
      navigator.geolocation.clearWatch(watchId);
      watchId = null;
    }
    // release the camera so the recording indicator turns off
    if (cam.srcObject){
      cam.srcObject.getTracks().forEach(t => t.stop());
      cam.srcObject = null;
    }
  }

  startBtn.addEventListener('click', startNav);
  resetBtn.addEventListener('click', resetNav);
  stopBtn.addEventListener('click', stopNav);
</script>
</body>
</html>

Emerging Technology Project Reflective Log

This portfolio shows the design and development of Campus Wayfinder, an emerging-technology prototype designed to help new university students find their way around an unfamiliar campus. The idea was to create an experience where a student could scan a QR code on a poster and instantly access a browser-based augmented reality (AR) navigation system. The experience uses the phone’s camera, GPS, and motion sensors to guide the user towards a destination using on-screen arrows and contextual labels for nearby buildings.

The project, unfortunately, was not as straightforward as this. Throughout development, I encountered several technical and platform-related challenges that forced me to rethink my approach. Rather than seeing this as a failure, the process became an important part of the project, as it required problem solving, research, and adapting my design to real-world constraints. This log documents that journey: the difficulties I faced, why I changed direction, and how I still achieved the aims of the assignment.

Initial Platform Choice and Complications

At the start of the project, I planned to build Campus Wayfinder using 8th Wall, a well-known WebAR platform that supports image targets, world tracking, and a Visual Positioning System (VPS). My original idea was to place 3D arrows directly onto the campus environment so that users could see them anchored to the ground as they walked.

However, once I started using 8th Wall, I quickly ran into several limitations that made it difficult to continue. Access to VPS features is restricted and requires a paid subscription. Although it is possible to create locations on the map, the free version does not allow full testing or deployment of custom VPS locations. This meant that even though I could design the experience conceptually, I could not properly test or demonstrate it on campus in real-world conditions.

8th Wall also relies heavily on app keys and hosted builds, with limited support for local testing. These restrictions made it hard to iterate quickly, make changes while physically on campus, and, as a result, simulate the Campus Wayfinder experience. These issues were not caused by a lack of understanding or an incorrect setup on my part, but by the limitations of the platform when used without a paid plan.

At this stage, I had to make an important decision: either reduce the scope of the project or rethink my approach entirely. I chose to take a step back and explore alternative ways of achieving the same goal.

Changing Direction and Rethinking the Approach

The goal of this assignment was not to use a specific AR platform, but to create a project that explores emerging technologies, demonstrates technical understanding, and shows thoughtful design. I realised that the core of my project was not tied to 8th Wall itself, but to the idea of combining real-world movement with digital guidance. After some research, I found that modern mobile browsers already provide access to many of the features this type of experience needs, such as camera access, GPS tracking, and device-orientation sensors.

This led me to change direction and build the project using plain HTML, CSS, and JavaScript instead. With these, I was able to recreate much of the intended functionality without relying on a restricted platform. The experience could still be launched by scanning a QR code, show a live camera feed, respond to the user’s location and movement, and guide them towards a destination. The transition was made easier by my previous experience with HTML, CSS, and JavaScript from earlier projects. That background meant I was already comfortable building layouts, handling interactions, and debugging issues, which allowed me to focus on solving higher-level problems rather than learning everything from scratch. I also worked through several video tutorials and blog posts to better understand geolocation, device orientation, and mobile browser permissions.

Use of Emerging Technologies

The final version of Campus Wayfinder does not use native AR frameworks such as ARKit or ARCore; instead, it applies key principles of augmented reality. The experience overlays digital information onto the real world through a live camera feed and reacts to the user’s physical movement using GPS and compass data.

This type of WebAR-style experience is an emerging approach because it prioritises accessibility and ease of use: users do not need to download an app or own a specific device. Everything runs directly in the browser, which aligns with current trends towards lightweight, accessible emerging technologies.

User Experience Design Decisions

User experience was a major focus of the project. New students are often in a rush, unfamiliar with their surroundings, and may already feel overwhelmed. Because of this, I designed the experience to be as simple and clear as possible. 

The experience starts with a short splash screen, followed by a simple menu that explains which permissions are needed and why. Rather than presenting multiple options, the interface guides the user towards one clear action: starting navigation.

Originally, I wanted to place arrows directly onto the ground using world-locked AR. However, due to browser limitations on iOS, this approach was unreliable. Instead, I designed a HUD-style directional arrow that rotates based on the user’s compass heading. While this is technically simpler, it works consistently and is easy to understand, which ultimately makes for a better user experience.
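
The arrow rotation reduces to one small calculation, restated here from the source code above: rotate the arrow by the bearing to the target minus the device's compass heading, wrapped into the 0 to 360 range.

```javascript
// Restating the HUD arrow maths from the source above: the on-screen arrow
// is rotated by (bearing to target) minus (compass heading), wrapped so
// CSS rotate() always receives a positive angle.
function norm(d) {
  d = d % 360;
  return d < 0 ? d + 360 : d;
}

function arrowRotation(bearingToTarget, compassHeading) {
  return norm(bearingToTarget - compassHeading);
}
```

For example, if the waypoint lies due north (bearing 0) and the phone faces east (heading 90), the arrow rotates to 270 degrees, pointing to the user's left.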

To help users feel more confident about where they are, the system also displays temporary messages when they pass key landmarks, such as buildings on campus. This reinforces real-world context and helps users gradually learn the layout of the campus.

Technical Challenges and How I Solved Them

One of the biggest technical challenges was dealing with GPS accuracy. GPS data can be unreliable, especially around buildings. To solve this, I avoided relying on exact coordinates and instead used distance ranges. When the user enters a certain radius around a checkpoint, the navigation updates smoothly rather than requiring perfect accuracy.
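
This radius check is part of tick() in the source above; isolated here for clarity, using the same ARRIVE_RADIUS value:

```javascript
// The checkpoint logic from the prototype, in isolation: advance to the
// next waypoint once the user is inside ARRIVE_RADIUS metres of the current
// one, instead of expecting GPS to report an exact coordinate.
const ARRIVE_RADIUS = 18; // metres, as in the prototype

function nextIndex(routeIndex, distanceToWaypoint, routeLength) {
  if (distanceToWaypoint <= ARRIVE_RADIUS && routeIndex < routeLength - 1) {
    return routeIndex + 1; // close enough: move on to the next waypoint
  }
  return routeIndex; // keep pointing at the current waypoint
}
```

The second condition means the final waypoint never advances past the end of the route, so the arrow keeps pointing at the destination on arrival.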

Another major challenge involved device orientation on iOS. Accessing compass data requires explicit permission, and the browser will not always prompt the user automatically. At first, this caused the directional arrow to stop responding. After researching the issue, I implemented direct permission requests for both motion and orientation sensors and added fallback logic to handle different browser behaviours. This process helped me gain a much better understanding of how mobile browsers manage sensor data.

Design Development and Decision-Making

A key focus throughout the development of Campus Wayfinder was ensuring that design decisions were driven by user needs and real-world context rather than purely aesthetic choices. From the early stages of the project, the aim was to create a navigation experience that felt intuitive, minimal, and suitable for use while moving through a physical environment.

One of the most important design decisions was the choice to prioritise visual guidance over text-based instructions. Traditional wayfinding systems, such as campus signage and public transport environments, highlight the effectiveness of simple directional symbols and high-contrast visuals. These systems rely on quick recognition rather than detailed information, which inspired the use of arrows, icons, and minimal copy throughout the interface. This approach helped reduce cognitive load and allowed users to focus on their surroundings rather than their screens.

Colour and contrast played a significant role in ensuring usability. The chosen colour palette was designed to remain visible in outdoor lighting conditions, while also aligning with an approachable, familiar visual identity (University of Hull colours). High contrast between background elements and interactive components was used to improve legibility, particularly when the experience is used on the move. Rounded UI elements were intentionally chosen to create a friendly and accessible tone, reinforcing the idea that the system is designed to assist rather than overwhelm the user.

Typography was another key consideration. Mukta was chosen because its clear, legible letterforms maintain readability at varying distances and screen sizes. Hierarchy was carefully applied so that essential information, such as direction and distance, always took priority over secondary content. This hierarchy ensured that users could quickly interpret the interface without needing to stop or concentrate on the screen for extended periods.

Iteration played a central role in shaping the final design. Early concepts explored more complex AR interactions, including placing 3D arrows directly into the environment. However, technical constraints and accessibility considerations resulted in a redesign of the interface. Instead of treating these limitations as setbacks, the design was adapted to focus on a HUD-style navigation system that still responded to the user’s movement and direction. This shift allowed the project to remain functional, accessible, and aligned with the original design goals.

Design tools such as Figma were used extensively to explore layout, spacing, and interaction flow before development began. This helped ensure consistency across screens and allowed design decisions to be tested visually before being implemented in code. Adobe Illustrator was used to develop the project’s branding and logo, reinforcing a cohesive visual identity across both physical posters and the digital interface.

Overall, the design process behind Campus Wayfinder demonstrates a balance between creativity and practicality. Each design decision was made with research and user context in mind, resulting in an experience that is both visually considered and functionally effective.

Software Proficiency

The main development was done using HTML, CSS, and JavaScript. My previous experience with these technologies allowed me to quickly prototype ideas and make changes as the project evolved. I combined this existing knowledge with online tutorials and blogs to overcome more complex challenges, particularly those related to mobile sensors and permissions.

CSS played a large role in the project, especially for animations and transitions. Sliding menus, fading headers, and animated arrows were all handled using CSS rather than heavy JavaScript, which helped keep performance smooth on mobile devices.

To host and test the Campus Wayfinder project, I used Netlify, a hosting platform well suited to static web projects. Netlify allowed me to deploy the project quickly by simply dragging and dropping my HTML, CSS, JavaScript, and asset files into the platform, which automatically generated a live URL for testing and sharing. This deployment process was particularly helpful during development, as it allowed me to test the experience directly on my phone using real camera, GPS, and motion sensor data. The platform also removed the need for complex server configuration, allowing me to focus on refining the interaction design and user experience rather than spending time managing backend infrastructure.

For design, I used Figma to plan layouts, spacing, and interactions before implementing them in code. This helped me visualise the experience and maintain a consistent visual style.

I also used Adobe Illustrator to design the Campus Wayfinder logo. During development, I noticed that the logo text was not displaying correctly on the website. I fixed this by converting the text to outlines before exporting the SVG, ensuring it looked consistent across all devices.

Ethical Considerations

Because Campus Wayfinder uses camera access, location data, and motion sensors, ethical considerations were important. The experience only requests permissions when they are needed and explains clearly why they are required.

No user data is stored, tracked, or shared. The system does not run in the background and does not include analytics or third-party tracking. This follows principles of user consent, transparency, and data minimisation.

Accessibility was also considered. By using a browser-based approach and avoiding app installation, the experience is more accessible to a wider range of users and devices.

Forward Thinking and Future Development

Campus Wayfinder aligns with current and future trends in emerging technologies, particularly WebAR and location-based experiences. As browser support for advanced AR features improves, this type of project could evolve to include world-locked AR elements, indoor navigation, or more advanced spatial mapping.

Future improvements could include dynamic routes, accessibility-focused navigation modes, or integration with native AR frameworks where available. This project shows how emerging technologies can be explored realistically while still leaving room for future expansion.

Conclusion

Campus Wayfinder represents a practical exploration of emerging technologies through design, experimentation, and adaptation. Although my original plan was limited by platform constraints, changing direction allowed me to create a functional and well-considered prototype that still meets the aims of the assignment.

The project demonstrates technical problem-solving, user-centred design, ethical awareness, and forward-thinking development. Most importantly, it reflects the reality of working with emerging technologies: understanding limitations, adapting ideas, and finding effective solutions within those constraints.


Exploring Emerging Technologies and Immersive Design

360 Content

For the 360-degree content task, I designed and rendered a fully immersive 3D environment in Blender, choosing to create a prison cell scene. I wanted to go beyond simply following the tutorial or example provided, so I built several of the main assets myself, including the bed, window bars, and smaller objects around the cell. I chose this confined setting deliberately: working with a limited space meant I could concentrate on the core principles of 3D environments, namely lighting, atmosphere, and realism.

Process of adding iron bars to the model

From the beginning, my goal was to make the space feel believable but also emotionally heavy. I used basic compositional and lighting principles to give the scene a distinctive tone. The main light source was a sun light, providing ‘daylight’ filtering through the barred window and casting shadows across the floor. I paired this with a ceiling point light to simulate the artificial glow often seen in real prison cells. The combination helped to soften the darker areas whilst still maintaining a slightly moody atmosphere. Getting the balance right between natural and artificial light took a lot of trial and error, but it also helped me better understand how light direction and intensity can completely alter the emotional tone of a 3D environment. The wall and floor textures were sourced free online; I looked for worn, rustic textures to reflect realism.

Blender screenshot showing the positioning of the Sun asset

When it came to camera placement, I applied immersive UX principles related to embodiment and proximity (Sahu, 2025). I set the viewer’s perspective at around human height, giving them the sense of physically standing inside the cell. This choice was really about evoking empathy and discomfort, two feelings that are quite impactful when exploring themes like imprisonment and confinement.

I realised during the process that even without any animated characters, dialogue, or narrative, the environment itself can tell a story. The positioning of objects, the texture of the walls and the way the lights have been set up all work together to create a mood. This is what’s known as environmental storytelling, and it’s something I’ve grown to appreciate much more after completing this task.

This 360 environment could easily fit within a story sphere or interactive documentary, as it invites reflection on broader social issues. I can imagine using it in an educational or social justice context, where viewers could explore the emotional reality of imprisonment in a safe but thought-provoking way. If I were to develop it further, I’d love to add interactive elements such as moving assets, ambient sounds (like footsteps or cell doors closing) or even a brief narrative voiceover to make the experience more dynamic. This task definitely improved my confidence in Blender, particularly in lighting and atmosphere design, and gave me a foundation of skills I can build on for future projects.

WebVR

https://framevr.io/aplacetoremember

For the WebVR task, I created an interactive virtual gallery using FrameVR, a web-based engine that enables users to explore and present in 3D spaces directly through their browsers. I chose this environment to showcase two of my previous projects—one with a 3D model, and the other with a logo design sequence, allowing me to present my work in a more engaging and interactive way than a traditional portfolio.

Cab-E section of the environment
Cab-E Logo Evolution
THRIVE Section of the environment

FrameVR supports a variety of media formats such as videos, images, and 3D models (Benham and Budiu, 2022). This made it possible for me to combine static and moving content to form a multi-media experience. I arranged my assets using spatial hierarchy, giving each project its own dedicated section with clear sightlines so visitors could intuitively know how to navigate.

THRIVE video Ad

A big part of this task involved thinking like a curator rather than just a designer. I had to decide where users would spawn, how they would move through the space, and what they would see first. These decisions were guided by principles of immersive UX and spatial narrative (Sahu, 2025). I added a few interactive features such as clickable videos and a magnifying glass tool so users could zoom in on smaller details. Additionally, I imported a 3D model of an energy drink can from a previous project and applied a gentle rotational animation. It’s a small detail, but it helped to bring a sense of movement to the gallery without becoming a distraction. This subtle use of motion reflects Eriksen’s (2023) point that animation should enhance engagement rather than overwhelm it.

Example of using the magnifying glass feature

Creating this gallery taught me that digital portfolios can be more than a collection of images: they can become immersive storytelling spaces. I noticed that giving users the freedom to move around and explore encouraged curiosity. Some visitors might spend longer on one project than another, which makes the experience more personal. It’s also interesting how translating 2D work into a 3D environment changes its impact. For instance, my flat logo design felt more substantial when placed in a three-dimensional setting.

If I were to continue developing this, I’d like to turn it into an online exhibition space that could be used for professional showcases or even client presentations. The biggest challenge was balancing visual aesthetics with usability—making sure users could navigate easily without feeling confused or distracted. In this task I learned a lot about how digital spaces can convey identity and personality just like a physical gallery would.

Augmented Reality & UX

For the Augmented Reality (AR) task, I decided to take a more playful approach by designing a mobile-based prototype in 8th Wall, where users can throw a virtual ball at 3D objects to score points. My main inspiration came from gesture-based games like Pokémon GO, where the player flicks the screen to throw a ball at a target.

A screenshot showing the depth and assets of my project in 8th Wall

The interaction is simple: users tap or swipe on their phone screen to throw the virtual ball, while moving their device to aim at the target objects. For this task, though, I focused on the physics and the basics of the prototype. I paid particular attention to user safety and ergonomics, which is a key part of UX design. Since the experience uses the device’s camera to overlay digital content onto the real world, users can still see their surroundings, which helps prevent motion sickness or accidental collisions. This aligns with Eriksen’s (2023) recommendations around maintaining freedom of movement in immersive UX.

On the technical side, I modelled the target objects (bowling pins) in Blender and exported them as .glb files into 8th Wall. Setting up the physics interactions was by far the most challenging part. Initially, the pins simply fell through the floor because the system couldn’t detect their collision boundaries. After some troubleshooting, I fixed this by applying a capsule collider, which allowed the virtual ball and pins to interact realistically. This problem-solving process gave me a much better understanding of how physics simulation works in AR environments, and specifically in 8th Wall.
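To see why the capsule collider fixed the fall-through, it helps to look at the geometry a capsule test reduces to: the pin becomes a line segment (the capsule's core) with a radius, and a collision is just a distance check against that segment. A simplified sketch of that maths in plain JavaScript (not 8th Wall's internal code; all names and dimensions here are illustrative):

```javascript
// Closest point on segment AB to point P, used by the capsule test below.
function closestPointOnSegment(a, b, p) {
  const ab = { x: b.x - a.x, y: b.y - a.y, z: b.z - a.z };
  const ap = { x: p.x - a.x, y: p.y - a.y, z: p.z - a.z };
  const lenSq = ab.x * ab.x + ab.y * ab.y + ab.z * ab.z;
  // Project P onto AB and clamp to [0, 1] so we stay on the segment
  let t = lenSq === 0 ? 0 : (ap.x * ab.x + ap.y * ab.y + ap.z * ab.z) / lenSq;
  t = Math.max(0, Math.min(1, t));
  return { x: a.x + ab.x * t, y: a.y + ab.y * t, z: a.z + ab.z * t };
}

// A capsule (segment + radius) hits a sphere when the distance from the
// sphere's centre to the segment is within the two radii combined.
function capsuleHitsSphere(capA, capB, capRadius, center, sphereRadius) {
  const c = closestPointOnSegment(capA, capB, center);
  const dx = center.x - c.x, dy = center.y - c.y, dz = center.z - c.z;
  const r = capRadius + sphereRadius;
  return dx * dx + dy * dy + dz * dz <= r * r;
}

// A ball grazing the pin's core is a hit; the same ball further away misses.
const pinBottom = { x: 0, y: 0, z: 0 };
const pinTop = { x: 0, y: 0.4, z: 0 };
console.log(capsuleHitsSphere(pinBottom, pinTop, 0.05, { x: 0.1, y: 0.2, z: 0 }, 0.08)); // true
console.log(capsuleHitsSphere(pinBottom, pinTop, 0.05, { x: 0.5, y: 0.2, z: 0 }, 0.08)); // false
```

A mesh with no collider (or an unsuitable one) effectively has no such boundary, which is why the pins sailed straight through the floor until the capsule shape was applied.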

Bowling Pin model created in Blender
Manual Capsule Collider settings

The prototype demonstrated how AR can be used not just for entertainment but also for social and physical engagement. I could see this concept evolving into a multiplayer mobile game where users compete in real-world settings, turning any open space into a bowling alley. Visually, I made sure the scene had bright colours and clear contrasts to keep the focus on the action. This minimalist approach is something I’d like to carry forward if I were to continue the project, as it maintains accessibility while enhancing immersion.

Overall, this project deepened my appreciation for user experience design in AR, particularly the importance of balancing realism with usability. AR’s strength lies in its ability to merge the digital and physical worlds, and even a simple prototype like this shows how creative and interactive it can be.

VR Art

The VR Art task, created using Open Brush, was one of the most creatively freeing parts of this series of tasks. Working in a virtual 3D drawing space offered a completely different way of thinking about art and storytelling. Instead of being limited to a flat canvas, I could draw in mid-air, surrounding myself with designs and vibrant colours.

At first, I focused on getting used to the controls and experimenting with different brushes and effects. I started by sketching simple objects (a house, a car, a few abstract shapes), mainly to test how perspective worked in three-dimensional space. One of the biggest challenges I encountered was depth alignment. Without a physical reference point, it’s surprisingly easy to misjudge distances, leading to parts of the drawing overlapping in strange ways. Over time, however, I developed a better sense of spatial awareness and hand-eye coordination, both of which are crucial for immersive design work (Chang, 2025).

The second time I used the VR headset, I decided to draw a house again, this time trying to improve the depth alignment and build a small scene. I used the virtual table as a level, so the floor was guaranteed to be straight.

Timelapse of me creating a house

What I found most interesting about this task was the emotional experience of creating in VR. Being inside the artwork gave me a sense of presence that’s hard to achieve in traditional digital work. Mistakes felt less frustrating because they could be fixed instantly, and I had no limit on where I could draw. This sense of embodiment and freedom aligns closely with what Polydorou (2024) describes as the connection between immersion and emotional engagement in VR storytelling. This exercise really reinforced how VR art can merge creative intuition with UX design principles, encouraging artists and designers to think spatially rather than flatly.

Conclusion

Across all four tasks (360 Content, WebVR, AR & UX, and VR Art) I’ve learned how immersive technologies reshape the way stories can be told and how experiences are designed. Each tool brought new challenges but also new ways to connect creativity with technical understanding. These tasks have strengthened my confidence in using immersive tools and have shown me how technology can be used to create meaningful, engaging, and emotional experiences. Looking ahead, I’d like to continue exploring one of these four areas, AR & UX, and use it as the basis for my final project for this module. I believe AR is the perfect tool for me as a graphic designer because it allows me to combine my 2D skills and illustrations from Photoshop, Illustrator, and similar tools with my 3D skills in 8th Wall to create interactive material.

References

Benham, S. and Budiu, R., 2022. The usability of augmented reality (AR) [online]. Nielsen Norman Group. Available at: https://www.nngroup.com/articles/ar-ux-guidelines/ [Accessed 25 October 2025].

Chang, S., 2025. The impact of digital storytelling on presence, immersion and user experience in VR [online]. MDPI Sensors. Available at: https://www.mdpi.com/1424-8220/25/9/2914 [Accessed 25 October 2025].

Eriksen, M., 2023. 6 UX design principles for augmented-reality development [online]. UXmatters. Available at: https://www.uxmatters.com/mt/archives/2023/03/6-ux-design-principles-for-augmented-reality-development.php [Accessed 25 October 2025].

Polydorou, D., 2024. Immersive storytelling experiences: a design methodology [online]. Digital Creativity, Taylor & Francis Online. Available at: https://www.tandfonline.com/doi/full/10.1080/14626268.2024.2389886 [Accessed 25 October 2025].

Sahu, S., 2025. Designing for the immersive world: a UX designer’s guide to AR, VR, and XR [online]. UX Planet. Available at: https://uxplanet.org/designing-for-the-immersive-world-a-ux-designers-guide-to-ar-vr-and-xr-c2414802be59 [Accessed 25 October 2025].

Yang, S., 2023. Storytelling and user experience in the cultural metaverse [online]. Heliyon, Elsevier. Available at: https://www.sciencedirect.com/science/article/pii/S2405844023019667 [Accessed 25 October 2025].

Part Two: Research Proposal

Introduction

The Campus Wayfinder project proposes the creation of an interactive AR navigation tool designed specifically for new students, visitors, and staff at the University of Hull. The system aims to enhance spatial orientation and user experience by providing a digitally assisted guide that overlays navigation information directly onto the physical environment. This project combines emerging AR technology with principles of accessibility and visual communication, which are key elements of graphic design. The proposed prototype will use 8th Wall, a web-based AR platform, and Blender for 3D modelling to produce an engaging navigational experience.

The research investigates how AR can bridge the gap between digital wayfinding and physical exploration, turning traditional campus maps into interactive tools that promote engagement and confidence amongst new users. This project also considers broader implications of ethical data use, accessibility, and inclusivity in AR design. Through design thinking, iterative testing, and UX principles, the project aims to demonstrate how we as graphic designers can apply emerging technologies innovatively and responsibly.

Research Overview

The goal of the Campus Wayfinder is to create an immersive and user-friendly AR experience that helps users navigate the University of Hull campus more easily. The project brings together graphic design, UX design, and emerging technology, demonstrating how traditional design can evolve into interactive, spatial media.

Purpose and Context

Navigating a new university campus can be overwhelming, especially for first-year students and visitors during open days or events. While static maps and signposts provide guidance, they often lack interactivity and accessibility, and can easily become outdated. The purpose of the AR Campus Wayfinder is to reimagine the wayfinding process by introducing an AR experience that provides 3D visual directions, real-time orientation cues, and spatial awareness through a smartphone or tablet camera.

The AR experience would be activated by scanning a poster or QR code placed around campus (e.g. at entrances or key buildings). Once activated, users would see an AR overlay featuring a 3D model of the campus built in Blender, complete with markers, arrows, and animated pathways leading to destinations such as the library, student union, or lecture halls.

UX principles such as embodied cognition have been built into this design, enhancing the user’s understanding through physical interaction with the environment (Benham and Budiu, 2022). The idea is to empower users with spatial awareness by combining real-world visuals with digital augmentation, allowing for intuitive, immersive navigation rather than abstract map-reading.

Scope and Aims

The Campus Wayfinder will function as a design-led prototype demonstrating how UX principles and graphic design can be applied to a university environment. The project aims to:

  • Explore the relationship between spatial UX design and physical navigation. 
  • Develop a 3D visual model of the campus using Blender, exported as .glb files for AR integration. 
  • Prototype interactive elements including directional arrows, information panels and animated routes. 
  • Help students become more familiar with their campus and reduce the anxiety and disorientation caused by becoming lost or not knowing where to go.

The final prototype will not include real-time GPS data, as this is not achievable using 8th Wall. Instead, it will conceptually simulate how a full system could function through carefully designed interactivity, clear visuals, and animated transitions.

User Experience (UX) Considerations

The primary audience includes new students, visitors, and staff members who are unfamiliar with the campus layout. The secondary audience includes future students exploring the university during open days or events. The design will prioritise:

  • Ease of use: The application requires no installation and should load instantly via a QR code. 
  • Visual clarity: The use of university brand colours and simple icons will not only ensure immediate recognition but also reduce cognitive load. 
  • Interactivity: Users can tap on buildings to reveal names and information or select a destination to visualise a suggested route. 
  • Spatial comprehension: The 3D map will be slightly elevated and have a slight isometric perspective, helping users to understand layout and distance. 

The system will aim to foster a sense of orientation and belonging, encouraging exploration rather than frustration. 

Ethical considerations

Because the project involves interactive media and environmental data, ethics are critical in emerging tech design:

  • Privacy: The AR system will operate locally, without collecting user data or tracking location. 
  • Safety: The passthrough camera view allows users to remain aware of their surroundings whilst navigating.
  • Accessibility: Colours and text will meet contrast standards so they remain legible in outdoor environments. 
  • Testing consent: Any user testing will be voluntary and anonymised, with clear participant consent forms.
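The accessibility point above can be verified rather than eyeballed, using the WCAG 2.1 contrast-ratio formula. A small sketch of that calculation, taking the navy and lime from the prototype's splash-screen palette as example values:

```javascript
// WCAG 2.1 relative luminance: linearise each sRGB channel, then weight.
function relativeLuminance(hex) {
  const [r, g, b] = [1, 3, 5].map(i => {
    const c = parseInt(hex.slice(i, i + 2), 16) / 255;
    return c <= 0.03928 ? c / 12.92 : ((c + 0.055) / 1.055) ** 2.4;
  });
  return 0.2126 * r + 0.7152 * g + 0.0722 * b;
}

// Contrast ratio: (lighter + 0.05) / (darker + 0.05), ranging 1:1 to 21:1.
function contrastRatio(hexA, hexB) {
  const [hi, lo] = [relativeLuminance(hexA), relativeLuminance(hexB)]
    .sort((x, y) => y - x);
  return (hi + 0.05) / (lo + 0.05);
}

// Navy background vs lime foreground from the prototype's palette
console.log(contrastRatio('#0b1230', '#b8ff3b').toFixed(1)); // comfortably above the 7:1 AAA threshold
```

Running checks like this against every foreground/background pairing in the interface is a cheap way to guarantee outdoor legibility before any user testing happens.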

Inspiration and Case Studies

This project draws inspiration from two main applications. The first is Marks & Spencer’s (M&S) in-store AR wayfinder app, which uses augmented reality and interface overlays to guide customers through retail spaces with ease. The second is Google Maps Live View, which integrates directional arrows into navigation and lets the user view a location in first person through their smartphone. Both are prime examples of how spatial information can be communicated in intuitive ways.

Project Plan 

Methodology and Approach

The research adopts a Design Thinking and Agile methodology. Design thinking enables creative exploration through empathy, ideation, prototyping and testing, whilst Agile methods such as Scrum structure progress into measurable sprints. A sprint is a short, fixed-length period during which a team works to complete a set of prioritised tasks and deliver a tangible, potentially releasable product increment.

User Stories (Scrum Format) 

  1. As a new student, I want to scan a QR code to instantly access an AR map so I can find my way around campus. 
  2. As a visitor, I want to explore the 3D models and be able to identify key buildings without the need to download an app. 
  3. As a student with accessibility needs, I want clear visual and audio cues that make it easier to navigate without reading smaller text. 
  4. As a staff member, I want to be able to recommend this AR tool to help visitors find my department easily.
  5. As a designer, I want to showcase how AR can enhance physical spaces through immersive storytelling.
Milestone | Tasks | Timeframe | Deliverables
Research & Ideation | Analyse AR navigation examples and competitors | Weeks 1–2 | Research summary, moodboards
Concept Development | Sketch wireframes, plan user journey, poster mockups | Weeks 3–4 | Concept sketches, storyboards
3D Modelling | Model campus and create directional assets | Weeks 5–6 | .glb models and materials
8th Wall Prototype | Build AR experience, import 3D models | Weeks 7–8 | Working prototype
Testing & Iteration | Conduct informal tests with 3–5 students; improve based on feedback | Week 9 | UX testing report and improvement to-do list
Final Presentation | Record screen capture of AR in use; create supporting visuals | Week 10 | Finalised prototype and presentation slides

Task board and Workflow

Tasks will be tracked via a Trello board divided into ‘To-Do’, ‘In Progress’, ‘Testing’, and ‘Complete’ columns. Each milestone will have specific subtasks, e.g. ‘Export .glb model from Blender’.

Tools and Technologies

These are the tools and applications most likely to be used to complete this project:

  • 8th Wall: For AR development and scripting interactions
  • Blender: For creating optimised 3D assets and exporting .glb models.
  • Adobe Illustrator and Photoshop: For poster design and interface graphics.
  • Trello: For workflow management.
  • Figma: For prototyping UI overlays and visual testing. 

UX and Design Principles

The AR Wayfinder will be guided by UX and spatial design principles: 

  • Visibility of interaction confirmation: Provide feedback when users interact with an element, e.g. a highlighted building animation.
  • Recognition over recall: Present all key destinations on the map rather than making users rely on memory. 
  • Minimalist design: Use clear, colour-coded arrows and icons without unnecessary clutter. 
  • Consistency and branding: Ensure colours and typefaces align with the University of Hull’s identity system. 

Anticipated Challenges

  • Managing to achieve accurate spatial alignment between the 3D models and the real-world environment.
  • Ensuring performance is stable on different devices.
  • Ensuring diverse user groups are considered when designing for accessibility. 

The camera will be anchored at an eye-level perspective, ensuring the user feels present within the environment rather than perceiving separate, detached models (Eriksen, 2023). This design choice supports immersive UX theory and enhances user embodiment within the digital-physical interface.

Concept Storyboard

Storyboard

Visual Style and Tone

The AR interface will follow a minimalist and modern visual identity with vibrant greens and blues, referencing the University of Hull’s branding. Arrows and route lines will use glowing gradients for clarity in outdoor light. Typography will prioritise legibility, using bold sans-serif fonts such as Helvetica or Poppins. To maintain visual interest, subtle animations will be added without causing distraction or motion sickness.

User Journey

The user journey is designed to be smooth and instant, requiring no installations or logins. QR codes are now ubiquitous, so the simplicity of scanning and interacting with them encourages repeat engagement and reduces technological barriers, a crucial UX factor for first-time AR users.

Reflection and Emerging Tech Justification

Why AR?

Augmented reality is one of the most promising emerging technologies for spatial communication, as it can seamlessly combine the digital and physical worlds. For graphic designers, AR can help shift flat 2D work into spatial experiences, merging visual communication, interactivity, and immersion.

Innovation and Trends

The innovation of this project comes from its accessibility and its purpose. Unlike high-budget corporate AR tools, this project uses web-based AR, meaning no downloads are required, making it practical for public use. This reflects a growing trend in UX design: frictionless access (Eriksen, 2023).

AR’s relevance as an emerging design tool lies not only in how it continues to redefine the way information is communicated, consumed, and visualised, but also in its growing integration within educational and navigational contexts. The project also addresses sustainable design thinking by replacing printed maps with digital overlays.

Design Thinking

After researching design thinking strategies, I chose the Design Thinking framework, a non-linear, iterative methodology for creative problem-solving that focuses on understanding user needs through five stages: empathise, define, ideate, prototype, and test. This framework will form the foundation of this project’s research.

  • Empathise: This stage involves understanding the challenges new students face navigating campus. This insight gave me the idea to create a digital orientation aid rather than a traditional map that can become easily outdated. 
  • Define: The project’s goal is to reduce spatial confusion through accessible visual storytelling.
  • Ideate: I explored multiple concepts, including gamified campus tours and AR treasure hunts, but they had already been done before. The wayfinder was chosen for its practicality and its inclusive potential. 
  • Prototype: Blender was chosen to design the environment, and 8th Wall was chosen because it enables quick iteration. Animation and interaction will be refined based on student feedback and usability principles. 
  • Test: Feedback sessions will assess clarity, visual comfort, and intuitiveness. 

Success Criteria

The success of the Campus Wayfinder will be measured by:

  • Ease of use: Users navigate intuitively without instruction.
  • Engagement: Users spend longer exploring due to interactive features. 
  • Accessibility: The design supports diverse users effectively. 
  • Reliability: The AR functions smoothly across different devices and environments. 

Future Development

With further development, the prototype could integrate real-time GPS and indoor positioning systems, expanding to support live navigation. It could also include voice-assisted guidance for users with visual impairments, improving the accessibility of the system. The concept could later be extended as part of a university-wide welcome campaign, not only reinforcing student engagement through emerging media but also bringing improved accessibility and usability to universities across the country.

Conclusion

This project proposal demonstrates how graphic designers can apply emerging technologies in meaningful ways. By using AR to enhance real-world navigation, the Campus Wayfinder bridges digital communication and physical experience. UX design, 3D modelling, and AR development have all been combined to create a practical and innovative solution for students and visitors. In developing the project, design thinking will guide every stage, from mapping user empathy and prototyping to user testing and refinement. This ensures that innovation is always driven by human needs. The wayfinder is not only a digital navigational tool but also a reflection of the future of graphic design, where interactivity, accessibility and storytelling coexist. As this technology continues to evolve, we as designers must adapt, refine, and reimagine how information can be delivered. This project embodies that evolution, turning a traditional campus map into a dynamic, immersive, and interactive experience.

References

Benham, S. and Budiu, R., 2022. The role of spatial cognition in AR navigation design [online]. Nielsen Norman Group. Available at: https://www.nngroup.com/articles/ar-navigation-design [Accessed 24 October 2025].

Eriksen, M., 2023. Designing frictionless experiences in augmented reality [online]. UX Collective. Available at: https://uxdesign.cc/frictionless-ux-ar [Accessed 24 October 2025].