
002_CWRK: Development Log

Research & Context

This project focuses on sustainability within the outdoor clothing industry, particularly how people treat clothing once it becomes worn or damaged. Although brands like The North Face promote durability and long-lasting products, there is still a tendency for people to replace items rather than repair them. This contributes to unnecessary waste and goes against the idea of sustainability that these brands often promote.

Patagonia Worn Wear Logo
The North Face Campaign Posters

A key influence for this project was Patagonia’s “Worn Wear” campaign (Patagonia, 2023), which encourages people to repair and reuse clothing rather than buying new products. I was also influenced by The North Face’s campaign style, particularly their use of bold typography and simple layouts, where imagery is the main focus. These references helped shape the direction of my work, as I wanted the outcome to feel relevant to the industry while still communicating a clear environmental message.

Conceptual Development

Initial layout sketches exploring composition and text placement

At the beginning of the project, I explored more literal ideas around repair, such as showing stitching or patching in detail. However, these ideas felt quite difficult to execute effectively using the resources I had available. Because of this, I shifted towards a more conceptual approach.

Visual concept sketches testing layering and background treatments

The idea developed into focusing on the value of clothing, leading to the concept that garments are “worth repairing”. Instead of showing the repair process directly, the work highlights the contrast between damaged and repaired states. This allows the audience to understand the message without needing a detailed explanation.

This shift helped simplify the overall direction and made the outcome feel more like a real campaign rather than a demonstration. It also made the message clearer and more direct, which is important in advertising.

Early campaign idea showing before and after structure

Experimentation & Prototyping

Throughout this project, AI was used as a supporting tool to help develop and visualise ideas. It was particularly useful in generating base imagery for the campaign, providing a canvas to build from when resources were limited. To support this, I created my own visual content by photographing a North Face jacket that I own, then used AI tools to extend it by generating a full figure wearing the jacket, allowing me to place the product within a more realistic outdoor context. This allowed me to focus on composition, layout, and concept rather than production constraints.

I experimented with a range of visual approaches, including double exposure, contour line overlays, and layered compositions such as torn paper effects. These were used to explore how to add more character and depth to the designs while still keeping the message clear.

Early version of billboard poster

Some of these ideas worked well, but others made the designs feel too busy and overcomplicated. This helped me realise the importance of keeping things simple, especially when creating campaign visuals.

I also explored different formats, including video and social media content. However, due to limitations with stock footage and a lack of access to the right equipment, I wasn’t able to fully produce a video outcome.

Instead, I created a moodboard to show what the video advert would look like if I had the time and resources to make it. This allowed me to still develop the idea visually, focusing on the tone, pacing, and overall feel of the advert.

This process helped me adapt my ideas and move towards a more refined and controlled outcome.

User Testing & Feedback

Feedback was mainly gathered through peer discussions and informal critiques. Early feedback suggested that some of the designs felt too simple and lacked visual interest. In response, I introduced additional elements such as textures and layering to add more depth.

Web page design Figma file

I also received feedback on the web page design, particularly around how users interact with the content. To improve this, I introduced a simple hover interaction where elements change colour and slightly expand when the mouse is placed over them. This made the page feel more responsive and engaging, while still keeping the overall design clean and consistent.
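As a rough sketch of how this hover interaction could be implemented on the live page, the CSS below animates a colour change and a slight expansion on hover. The `.card` class name and the accent colour are placeholders for illustration, not the actual names used in the Figma design:

```css
/* Hypothetical page element; the class name and colours are placeholders. */
.card {
  background: #ffffff;
  transition: transform 0.2s ease, background-color 0.2s ease;
}

/* On hover: change colour and slightly expand. */
.card:hover {
  background-color: #b8ff3b; /* placeholder accent colour */
  transform: scale(1.04);    /* subtle expansion */
}
```

In practice the transition duration and scale factor would be tuned so the effect feels responsive without distracting from the content.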

Hover interaction increasing engagement and responsiveness

Another key piece of feedback was around readability. It was suggested that certain key words should be highlighted using colour to break up the text and make it easier to read. I applied this by subtly changing the colour of important words, which helped improve hierarchy and made the content clearer without overwhelming the design.

However, further feedback showed that adding too many visual elements could make the work feel cluttered. This led me to refine the designs by removing anything unnecessary and focusing more on the main idea.

Overall, people responded best to the clearer, more minimal designs where the message was easy to understand. The addition of subtle interactive elements and improved text hierarchy helped make the work more engaging while still supporting the overall concept.

Informed Design Decisions & Direction

The final outcome is a series of campaign posters and supporting digital content that aim to encourage people to repair their clothing rather than replace it. The designs use strong imagery and minimal text to communicate the message clearly and quickly.

A key decision was to keep everything consistent and focused, rather than trying to include too many ideas. This helped the work feel more professional and closer to real campaign design. The visual style is influenced by existing outdoor campaigns, particularly those by The North Face, but also reflects my own approach to layout and composition.

In terms of sustainability, the project promotes a shift in behaviour by encouraging people to extend the lifespan of their clothing. This helps reduce waste and supports more responsible consumption.

Overall, the project shows how a simple idea can be developed through experimentation and feedback into a clear and effective campaign. It also demonstrates how design can be used to influence behaviour, encouraging people to make more sustainable choices through visual communication.

References

Patagonia (2023) Worn Wear. Available at: https://www.patagonia.com/worn-wear/ (Accessed: 1 May 2026).

The North Face (2024) Brand Campaigns. Available at: https://www.thenorthface.co.uk/ (Accessed: 1 May 2026).


001_PRES: Presentation of Proposal

Task 1: Otl Aicher’s Munich 1972 Olympics

A key example of graphic design for good from the post-war period is the visual identity created by Otl Aicher for the 1972 Munich Olympics. What makes this project important is not just how clean and modern it looks, but what it was trying to achieve socially and politically.

After the Second World War, Germany needed to rebuild its international reputation. The 1972 Olympics gave West Germany the opportunity to present itself as peaceful, democratic and forward-thinking. Aicher’s design played a major role in shaping that message. Instead of relying on heavy national symbols, he created a calm, clear and structured visual system that felt open and accessible, aligning with modernist principles of clarity and function.

Example of Aicher’s Pictograms

The pictograms are arguably the most recognisable part of the project. Aicher designed simple icons to represent each sport, reducing them to geometric forms. They did not rely on language, which meant they could be understood by visitors from all over the world. This is where the idea of “design for good” becomes clear, as the system made the Games easier to navigate and more inclusive. It showed how graphic design can remove communication barriers and improve accessibility in a public setting.

Guidelines and Standards for the Olympiad Munich 1972

Colour choices were also meaningful. Aicher avoided red and black because of their association with Germany’s Nazi past. Instead, he used a palette of light blues and greens, which created a softer and more optimistic atmosphere. This was not just an aesthetic decision, but a conscious attempt to distance modern Germany from its history and reshape how the country was perceived internationally.

Another important factor was consistency. Aicher developed strict guidelines so that everything, including posters, tickets and signage, followed the same visual language. This use of a structured system, supported by a grid, created clarity and reliability. It also set a precedent for future large-scale design projects, particularly in wayfinding and branding systems used in airports and public spaces today.

Aicher’s work on different media

Even though the Games were later overshadowed by tragedy, Aicher’s design still stands as an example of how graphic communication can have real societal impact. It helped reposition a nation and promote democratic values through visual means.

Overall, this project shows that post-war graphic design was not just about style. It became a tool for rebuilding identity, encouraging openness and shaping how a country was seen globally, which is what makes it a strong example of design for good.

Task 2: The Climate Clock

A contemporary example of graphic design for good is the Climate Clock, first launched in 2020 by Gan Golan and Andrew Boyd. The project displays a live digital countdown showing how much time remains to limit global warming before reaching critical levels. What makes it effective is how it turns complex data into something visual, immediate and emotionally engaging.

The first clock was installed in Union Square, New York. Large, bold numbers were projected onto the side of a building, counting down in real time. The design is extremely simple, using only typography and numbers, but that simplicity is what gives it impact. There are no distracting visuals or detailed explanations. The message is direct and urgent, making it easy for a wide audience to understand.

Although it appears to be a piece of public art, it functions as contemporary graphic design. It uses typography, scale, hierarchy and digital display systems to communicate information clearly in a public space. This reflects how graphic communication has expanded beyond print into technological and interactive environments. Instead of existing on a page, the design occupies urban space and demands attention.

What makes the Climate Clock a clear example of design for good is its purpose. It is not promoting a product or brand, but instead translating climate science into a format that is accessible to the general public. Climate reports can often feel distant or difficult to engage with, but a countdown creates a sense of urgency. Watching the time decrease makes the issue feel immediate and real, bridging the gap between data and public understanding.

Technology is central to its impact. The clock runs using live climate data, meaning the numbers are constantly changing. This reinforces the idea that the crisis is ongoing. The project has also been replicated in cities around the world and widely shared online, showing how contemporary graphic design operates across both physical and digital platforms. Social media platforms such as Instagram have helped the clock become a recognisable symbol of climate urgency.

In terms of societal impact, the Climate Clock has contributed to keeping climate change in public conversation. It has appeared in protests, educational contexts and media coverage. While it may not directly create policy change, it influences how people think about the issue and encourages a sense of collective responsibility.

Overall, the Climate Clock shows how graphic design today can move beyond aesthetics and function as a form of public communication. Through the use of bold typography and real-time data, it makes an invisible crisis visible, which is what makes it a strong example of contemporary design for good.

Task 3: Collaborative Workshop – Screen Time vs Green Time

For the collaborative workshop, Jess and I worked together using Figma and a FigJam board to brainstorm ideas. We decided to focus on the 14–29 age group, as this audience is heavily influenced by digital culture, gaming, and social media platforms such as TikTok, Instagram, and BeReal. We wanted to create something that would feel relevant and slightly humorous, as this would be more effective for this age group.

FigJam Collab Board

Our campaign idea, “Touch Grass (Literally)”, is inspired by the popular online phrase “go touch grass.” Instead of using it negatively, we chose to reframe it in a more playful and positive way, encouraging people to actually step outside. The humour comes from taking the phrase literally and turning it into a real-world action, while still highlighting the need to take a break from screens.

One of our main ideas was to design interactive posters backed with fake grass, so when someone touches the poster, they are physically engaging with the message. We also discussed using environmentally friendly chalk spray to stencil “Touch Grass (Literally)” onto existing patches of grass in public areas, helping guide people towards outdoor spaces. These outcomes focus on interaction rather than just visual communication, allowing the benefits of being outdoors to be experienced rather than explained.

To support this, we planned a time-sensitive app inspired by BeReal, which would send random notifications prompting users to go outside and take a photo touching real grass. This links directly to existing digital habits while encouraging small behavioural changes. Overall, the campaign aims to reduce screen time and promote simple, everyday interaction with outdoor environments.

Task 4: Major Project Brief

Categories
Professional Portfolio Design, Year 3

Negotiated Project

https://www.youtube.com/watch?v=UhVJ6LPcELA

Brand Guidelines

References

Adobe Stock asset #573492246. Adobe Stock. Available at: https://stock.adobe.com/templates/smartphone-app-mockup/573492246 (Accessed: 26 February 2026).

Adobe Stock asset #1773109223. Adobe Stock. Available at: https://stock.adobe.com/templates/coffee-cup-pouring-moment-mockup/1773109223 (Accessed: 26 February 2026).

Adobe Stock asset #940918483. Adobe Stock. Available at: https://stock.adobe.com/templates/cafe-poster-mockup-with-generative-ai/940918483 (Accessed: 26 February 2026).

Adobe Stock asset #1607303672. Adobe Stock. Available at: https://stock.adobe.com/templates/coffee-shop-t-shirt-mockup-with-back-view-of-person/1607303672 (Accessed: 26 February 2026).

Adobe Stock asset #977266744. Adobe Stock. Available at: https://stock.adobe.com/templates/front-view-wooden-storefron-mockup/977266744 (Accessed: 26 February 2026).

Adobe Stock asset #829209659. Adobe Stock. Available at: https://stock.adobe.com/templates/kitchen-apron-mockup/829209659 (Accessed: 26 February 2026).

Adobe Stock asset #1559039763. Adobe Stock. Available at: https://stock.adobe.com/templates/hands-holding-cafe-menu-mockup-with-coffee-and-cookie/1559039763 (Accessed: 26 February 2026).

Categories
PJ3 - Design Portfolio Project Three

Edecks Website Redesign

eDecks Logo

The live design brief involved a collaboration with a UK-based company called eDecks, which specialises in the sale of decking, timber, fencing, and roofing products through its online platform. Known for catering to both amateur DIYers and professional landscapers, eDecks provides a wide range of outdoor improvement materials. The company operates primarily online and serves customers across the UK. This project was conducted in partnership with the Marketing students at the University of Hull, forming a cross-disciplinary collaboration between design and marketing.

The target audience identified by the marketing team includes UK homeowners aged 28 to 60, many of whom are invested in maintaining or improving their outdoor spaces. Within this broader group, three key customer groups were identified:

  • Gardeners – typically seeking decorative or functional enhancements to their gardens.
  • DIY Hobbyists – often interested in user-friendly, ready-to-install kits.
  • Small-scale Landscapers – looking for reliable supplies for residential jobs.

The purpose of the design brief was to reduce the seasonal sales slump by engaging customers year-round, encouraging them to continue planning and purchasing for their outdoor spaces even in the off-peak seasons. This aligns with the company’s broader goals of maintaining profitability and visibility throughout the entire year.

The marketing students proposed several key ideas to support this aim. One of their suggestions was the integration of an AI-based design tool into the eDecks website called Evergreen Design Bot. This interactive tool would allow users to visualise their outdoor spaces across different seasons, using eDecks products. For example, customers could upload a photo of their garden and see how it might look in winter, complete with decking, fencing, or lighting. This feature not only enhances the user experience but also encourages out-of-season purchases by helping customers envision year-round utility.

Additional ideas included “How To” digital leaflets and engaging social media content to promote the new tool. Our design response to this brief involved creating visuals for these deliverables, designing a new homepage layout, and prototyping designs to promote the AI tool’s features. Our work supported the marketing strategy by ensuring that the digital presentation was appealing, functional, and brand-aligned.

3.) Project Development

Our group project centred on designing elements to advertise the AI tool the marketing team proposed, while modernising the visual branding and digital presence of eDecks. This included a redesigned website, promotional banners, social media content, and how-to leaflets. Our group worked collaboratively to divide tasks according to our strengths, resulting in a final outcome that reflects a fresh, user-friendly identity and promotes the new Evergreen AI tool at a professional standard.

How-to leaflets
Youtube Channel Mockup
Current eDecks website

Redesigns

Design 1
Design 2

I also designed website banners for promotion.

Vertical banner

Additionally, I produced a vertical banner ad designed to run on external websites.

Example of the vertical banner on another website

5.) Live Design Brief Portfolio Video (5 minutes)

link – https://youtu.be/fQMA2mbW35E

Categories
PSAD:WT

Cab-E

Research MoodBoard – Cab-E Online

  • This colour palette makes people aware that the company provides an eco-conscious yet forward-thinking service.
  • Using clean and approachable typography ensures the brand feels both user-friendly and innovative.
  • Balancing imagery of nature, technology and people humanises the brand and positions the company as innovative.
  • Imagery focused on people also builds an emotional connection, and dynamic visuals can emphasise the forward momentum the brand provides.

Cab-E Online Style Guide

  • The inclusion of electric vehicles reinforces CabEonline’s core identity as an eco-friendly and sustainable transportation alternative. It also highlights the company’s innovation and leadership in modern mobility, setting it apart from traditional taxi firms.
  • Featuring families, professionals, and individuals creates a wide demographic reach that promotes inclusivity throughout the company. This shows that CabEonline caters to all types of customers and builds an emotional connection with diverse audiences. Smiling customers in comfortable settings also build trust and confidence in the service, demonstrating user satisfaction and reliability.
  • Including Hull landmarks like the Humber Bridge roots the company in the local community, creating both relevance to residents and a sense of trust. Adding urban settings to the environments section also shows that the service is optimised for busy cities, appealing to people who navigate city environments, including both locals and tourists.
  • Poppins is a clean and modern sans-serif font with an approachable and simple feel. This ensures its readability across digital and physical formats, which is crucial for a company relying heavily on a mobile app. The bold weights of Poppins convey confidence and authority, suggesting that CabEonline leads the taxi industry. The italic style adds a sense of movement, symbolising progress and forward motion, which is effective for headlines and taglines as it reinforces the notion of speed and efficiency.
  • The slogans are clear, concise, and impactful, using positive and eco-focused language. They resonate with a broad audience as they appeal to both practical (affordable, convenient) and emotional (green, future) aspects.
  • The colour palette balances eco-conscious tones (greens) with modern and professional tones (blues and teal), making it ideal for both the environmental focus and the technological side of the brand. The harmonious mix maintains a premium and forward-thinking feel, appealing to a wide audience.

Logo

 

App – Wireframes

rough example 1
rough example 2

App – Final HQ.

I believe this design would benefit the company’s agenda: ‘CabEonline’ positions itself as superior to other firms, so booking a taxi needs to be as quick as possible. On opening the app you can see your location straight away and, next to it, how long it will take to find a taxi. This helps keep customer reviews positive, because people generally don’t mind waiting if they are told how long the wait will be. For returning customers it is even quicker to get a taxi, as below you can tap one of your pre-saved destinations and order a taxi without having to type the address in and waste time.

For new customers it is just as easy: tapping ‘Where to?’ allows you to type your address straight away and add stops if needed. ‘CabEonline’ is also all about efficiency, which the app supports by allowing you to prebook your taxi so it automatically arrives later without you doing anything. The confirmation screen also shows the route the taxi will take, which will be the quickest and safest route. This comforts passengers, who know they will get there the quickest way but can also check the exact route if they don’t feel 100% safe. To make payment as easy as possible, the app supports phone payments like ‘Apple Pay’, saving the time of entering card details.

The actual design of the app would benefit the company’s agenda as the colours are taken from the palette created, meaning the company’s online presence will match across the board. Using the bright green to highlight key elements like the ‘Confirm booking’ button and other CTAs improves the overall readability of the app, resulting in faster booking times and higher ratings from customers. ‘CabEonline’ has a very broad demographic, meaning the design of the app must be easy to use for most ages (e.g. 18+), and I think the design has successfully done that, as it follows a simple but effective style throughout.

Banners.

Video Ads.

Categories
Adv Design Portfolio Advanced Web design

Corner House – Rebrand.

Artboard 3
Artboard 4
Artboard 5
Artboard 7
Artboard 9
Artboard 3_1
Artboard 5_1
Artboard 7_1
Artboard 8
Artboard 9_1
Artboard 1_1
Artboard 8_1
Artboard 1_2
Artboard 2
Artboard 3_2
Artboard 4_1
Artboard 5_3
Artboard 43
Artboard 44
Artboard 45

Categories
Emerging Technologies Portfolio (Emerging Technologies)

Emerging Technologies: Portfolio

Campus Wayfinder showcase

Alternate Youtube Link – https://youtu.be/_vrT1ESJ0qk

Campus Wayfinder Link – https://campuswayfinderar.netlify.app

5 minute production video

Logo

Final Posters

Secondary Poster that was made for an alternate route.

Early Concept Poster Mockup

Code

<!doctype html>
<html lang="en">
<head>
  <meta charset="utf-8" />
  <meta name="viewport" content="width=device-width, initial-scale=1, viewport-fit=cover" />
  <title>Campus Wayfinder</title>

  <style>
    /* simple palette */
    :root{
      --navy:#0b1230;
      --lime:#b8ff3b;
      --text:#ffffff;
      --card:rgba(11,18,48,.78);
      --border:rgba(255,255,255,.10);
      --shadow:0 12px 30px rgba(0,0,0,.35);
    }
    html,body{margin:0;height:100%;background:#000;font-family:system-ui,-apple-system,Segoe UI,Roboto,Arial}
    *{box-sizing:border-box}

    /* camera */
    #cam{
      position:fixed; inset:0;
      width:100vw; height:100vh;
      object-fit:cover;
      background:#000;
    }

    /* splash */
    #splash{
      position:fixed; inset:0;
      display:flex; flex-direction:column;
      justify-content:center; align-items:center;
      gap:18px;
      background:linear-gradient(180deg,var(--navy),#040812);
      z-index:50;
      transition:opacity .6s ease;
    }
    #splash.hide{opacity:0; pointer-events:none; visibility:hidden}
    #splash img{width:min(260px,64vw); height:auto; filter:drop-shadow(0 14px 28px rgba(0,0,0,.35))}
    #spinner{
      width:28px;height:28px;border-radius:50%;
      border:3px solid rgba(184,255,59,.25);
      border-top-color:var(--lime);
      animation:spin 1s linear infinite;
    }
    @keyframes spin{to{transform:rotate(360deg)}}

    /* menu */
    #menu{
      position:fixed;
      left:50%; top:52%;
      transform:translate(-50%,-50%);
      width:min(380px,92vw);
      background:var(--card);
      border:1px solid var(--border);
      border-radius:22px;
      padding:18px 16px;
      color:var(--text);
      z-index:40;
      backdrop-filter:blur(10px);
      box-shadow:var(--shadow);
      opacity:0;
      pointer-events:none;
      transition:opacity .35s ease;
    }
    #menu.show{opacity:1; pointer-events:auto}
    #menu h1{margin:6px 0 6px; font-size:20px}
    #menu p{margin:0 0 12px; font-size:13px; opacity:.9; line-height:1.35}

    .btnRow{display:flex; gap:10px}
    button{
      appearance:none; border:0;
      border-radius:16px;
      padding:12px 14px;
      font-weight:800;
      cursor:pointer;
      transition:transform .06s ease, filter .12s ease, background .12s ease;
      flex:1;
    }
    button:active{transform:translateY(1px)}
    .btnPrimary{background:var(--lime); color:#061024}
    .btnPrimary:active{filter:brightness(.88)}
    .btnGhost{background:rgba(255,255,255,.12); color:var(--text)}
    .btnGhost:active{background:rgba(255,255,255,.18)}

    /* HUD */
    #hud{
      position:fixed; inset:0;
      z-index:20;
      pointer-events:none;
      display:flex; flex-direction:column;
      padding:12px 12px calc(14px + env(safe-area-inset-bottom));
      gap:10px;
      opacity:0;
      transition:opacity .3s ease;
    }
    #hud.on{opacity:1}

    #topCard{
      align-self:center;
      width:min(520px,94vw);
      background:var(--card);
      border:1px solid var(--border);
      border-radius:20px;
      padding:12px 14px;
      color:var(--text);
      backdrop-filter:blur(10px);
      box-shadow:var(--shadow);
    }
    #topTitle{font-weight:900; font-size:16px; margin:0}
    #topSub{margin:2px 0 0; font-size:12px; opacity:.88}

    /* passing toast */
    #toast{
      align-self:center;
      width:min(520px,94vw);
      overflow:hidden;
      border-radius:999px;
      background:rgba(184,255,59,.92);
      color:#061024;
      box-shadow:var(--shadow);
      padding:10px 14px;
      font-weight:900;
      font-size:13px;
      white-space:nowrap;
      opacity:0;
      transform:translateX(-20px);
    }
    .toastIn{animation:toastIn .35s ease forwards}
    .toastOut{animation:toastOut .35s ease forwards}
    @keyframes toastIn{to{opacity:1;transform:translateX(0)}}
    @keyframes toastOut{to{opacity:0;transform:translateX(20px)}}

    /* center arrow */
    #center{
      flex:1;
      display:flex;
      flex-direction:column;
      align-items:center;
      justify-content:center;
      gap:14px;
    }
    #arrowWrap{
      width:180px; height:180px;
      display:grid; place-items:center;
      position:relative;
    }
    #semi{
      position:absolute;
      width:210px; height:105px;
      border-radius:210px 210px 0 0;
      background:rgba(11,18,48,.55);
      border:1px solid var(--border);
      bottom:-28px;
      backdrop-filter:blur(10px);
    }
    #arrow{
      width:90px; height:90px;
      transform:rotate(0deg);
      transition:transform .08s linear;
      filter:drop-shadow(0 14px 30px rgba(0,0,0,.35));
    }
    #arrow path{fill:var(--lime)}

    #distancePill{
      background:var(--card);
      border:1px solid var(--border);
      border-radius:999px;
      padding:10px 14px;
      color:var(--text);
      font-weight:900;
      font-size:13px;
      backdrop-filter:blur(10px);
      box-shadow:var(--shadow);
      display:inline-flex;
      align-items:center;
      max-width:min(520px,94vw);
      white-space:nowrap;
    }
    #distanceText{
      overflow:hidden;
      text-overflow:ellipsis;
      max-width:100%;
    }

    #footer{
      align-self:center;
      width:min(520px,94vw);
      display:flex;
      gap:10px;
      pointer-events:auto;
    }
    #footer button{border-radius:16px; padding:10px 12px; font-weight:900}
  </style>
</head>

<body>
  <video id="cam" autoplay playsinline muted></video>

  <div id="splash">
    <img src="./assets/logo.svg" alt="Campus Wayfinder" />
    <div id="spinner"></div>
  </div>

  <div id="menu">
    <h1>Campus Wayfinder</h1>
    <p>Allow <b>Camera</b>, <b>Location</b> and <b>Motion</b> when asked.</p>
    <div class="btnRow">
      <button id="startBtn" class="btnPrimary">Start Navigation</button>
      <button id="resetBtn" class="btnGhost">Reset</button>
    </div>
  </div>

  <div id="hud">
    <div id="topCard">
      <p id="topTitle">Destination: Spoons</p>
      <p id="topSub">Estimated time: 5 minutes</p>
    </div>

    <div id="toast"></div>

    <div id="center">
      <div id="arrowWrap">
        <svg id="arrow" viewBox="0 0 100 100">
          <path d="M50 7 L76 55 H60 V92 H40 V55 H24 Z"></path>
        </svg>
        <div id="semi"></div>
      </div>

      <div id="distancePill">
        <span id="distanceText">Distance to destination: -- m</span>
      </div>
    </div>

    <div id="footer">
      <button id="stopBtn" class="btnGhost" style="flex:1">Stop</button>
    </div>
  </div>

<script>
  // route points (arrows aim to next waypoint)
  const ROUTE = [
    { name:"Start (Poster)",      lat:53.7698512154046,  lon:-0.36838928710517393 },
    { name:"Waypoint 2",          lat:53.7702119222983,  lon:-0.3683582932499405  },
    { name:"Waypoint 3",          lat:53.77063949566153, lon:-0.36830550551962915 },
    { name:"Waypoint 4",          lat:53.771121892644885,lon:-0.3682456100146112  },
    { name:"Waypoint 5",          lat:53.77167677372432, lon:-0.36812691354069216 },
    { name:"Waypoint 6",          lat:53.77162317714752, lon:-0.3671756992473263  },
    { name:"Destination (Spoons)",lat:53.77173654295845, lon:-0.36701474664587863 }
  ];
  const DEST = ROUTE[ROUTE.length - 1];

  // buildings (for "you are passing" message)
  const LANDMARKS = [
    { name:"Larkin Building",           lat:53.770298999999994, lon:-0.368293 },
    { name:"Chemistry Building",        lat:53.77096705860541,  lon:-0.3679494882103711 },
    { name:"Brynmor Jones Library",     lat:53.771161,          lon:-0.368523 },
    { name:"Robert Blackburn Building", lat:53.77138270374397,  lon:-0.36848045910014093 },
    { name:"Hardy Building",            lat:53.77142592119492,  lon:-0.3681547880130722 },
    { name:"Canham Turner Building",    lat:53.77172530620078,  lon:-0.36811243578327435 },
    { name:"Gulbenkian Centre",         lat:53.77169643461599,  lon:-0.367372587707564 },
    { name:"Student Hub (SU • SPAR • Wetherspoons)", lat:53.77169299417767, lon:-0.36694099722237183 }
  ];

  // values I tweaked while testing
  const ARRIVE_RADIUS = 18;      // metres
  const LANDMARK_RADIUS = 25;    // metres
  const LANDMARK_COOLDOWN = 45000;

  // elements
  const cam = document.getElementById('cam');
  const splash = document.getElementById('splash');
  const menu = document.getElementById('menu');
  const hud = document.getElementById('hud');
  const toast = document.getElementById('toast');
  const arrow = document.getElementById('arrow');
  const distanceText = document.getElementById('distanceText');

  const startBtn = document.getElementById('startBtn');
  const resetBtn = document.getElementById('resetBtn');
  const stopBtn = document.getElementById('stopBtn');

  // show menu after splash
  setTimeout(() => {
    splash.classList.add('hide');
    menu.classList.add('show');
  }, 5000);

  // state
  let watchId = null;
  let routeIndex = 0;
  let userPos = null;
  let heading = null; // 0..360
  let running = false;
  const lastToast = new Map();

  // helpers
  function toRad(d){ return d * Math.PI / 180; }
  function toDeg(r){ return r * 180 / Math.PI; }

  function dist(a,b){
    const R = 6371000;
    const dLat = toRad(b.lat - a.lat);
    const dLon = toRad(b.lon - a.lon);
    const lat1 = toRad(a.lat);
    const lat2 = toRad(b.lat);
    const x = Math.sin(dLat/2)**2 + Math.cos(lat1)*Math.cos(lat2)*Math.sin(dLon/2)**2;
    return 2 * R * Math.atan2(Math.sqrt(x), Math.sqrt(1-x));
  }

  function bearing(a,b){
    const lat1 = toRad(a.lat), lat2 = toRad(b.lat);
    const dLon = toRad(b.lon - a.lon);
    const y = Math.sin(dLon) * Math.cos(lat2);
    const x = Math.cos(lat1)*Math.sin(lat2) - Math.sin(lat1)*Math.cos(lat2)*Math.cos(dLon);
    return (toDeg(Math.atan2(y,x)) + 360) % 360;
  }

  function norm(d){
    d = d % 360;
    return d < 0 ? d + 360 : d;
  }

  function formatM(m){
    if (!isFinite(m)) return '--';
    if (m < 1000) return Math.round(m) + ' m';
    return (m/1000).toFixed(2) + ' km';
  }

  function showToast(msg){
    toast.textContent = msg;
    toast.classList.remove('toastIn','toastOut');
    void toast.offsetWidth;
    toast.classList.add('toastIn');
    setTimeout(() => {
      toast.classList.remove('toastIn');
      toast.classList.add('toastOut');
    }, 2200);
  }

  async function startCamera(){
    const stream = await navigator.mediaDevices.getUserMedia({
      video: { facingMode: { ideal:"environment" } },
      audio: false
    });
    cam.srcObject = stream;
  }

  async function askMotion(){
    // iOS safari needs this or heading might be null
    try{
      if (typeof DeviceOrientationEvent !== "undefined" &&
          typeof DeviceOrientationEvent.requestPermission === "function"){
        await DeviceOrientationEvent.requestPermission();
      }
    }catch(e){}
  }

  let compassListening = false;
  function startCompass(){
    if (compassListening) return; // avoid stacking duplicate listeners if navigation restarts
    compassListening = true;
    window.addEventListener('deviceorientation', (e) => {
      let h = null;
      if (typeof e.webkitCompassHeading === 'number'){
        h = e.webkitCompassHeading;
      } else if (typeof e.alpha === 'number'){
        h = norm(360 - e.alpha);
      }
      if (h !== null && isFinite(h)){
        heading = h;
        if (running) tick();
      }
    }, { passive:true });
  }

  function startGPS(){
    watchId = navigator.geolocation.watchPosition(
      (p) => {
        userPos = { lat:p.coords.latitude, lon:p.coords.longitude };
        if (running) tick();
      },
      () => {},
      { enableHighAccuracy:true, maximumAge:1000, timeout:12000 }
    );
  }

  function checkLandmarks(){
    if (!userPos) return;
    for (const lm of LANDMARKS){
      const d = dist(userPos, lm);
      if (d <= LANDMARK_RADIUS){
        const last = lastToast.get(lm.name) || 0;
        if (Date.now() - last > LANDMARK_COOLDOWN){
          lastToast.set(lm.name, Date.now());
          showToast("You are passing: " + lm.name);
        }
      }
    }
  }

  function tick(){
    if (!userPos) return;

    // always show distance to final destination
    const dDest = dist(userPos, DEST);
    distanceText.textContent = "Distance to destination: " + formatM(dDest);

    // waypoint advance
    const wp = ROUTE[routeIndex];
    const dWp = dist(userPos, wp);
    if (dWp <= ARRIVE_RADIUS && routeIndex < ROUTE.length - 1){
      routeIndex++;
    }

    // arrow points to current waypoint
    const target = ROUTE[routeIndex];
    const b = bearing(userPos, target);
    const rel = (heading == null) ? b : norm(b - heading);

    arrow.style.transform = `rotate(${rel}deg)`;

    checkLandmarks();
  }

  async function startNav(){
    try{
      await startCamera();
    }catch(e){
      alert("Camera permission is required.");
      return;
    }

    await askMotion(); // if user blocks it, arrow still points but won't rotate properly

    if (!navigator.geolocation){
      alert("Geolocation not supported.");
      return;
    }

    startCompass();
    startGPS();

    running = true;
    hud.classList.add('on');
    menu.classList.remove('show');
    tick();
  }

  function resetNav(){
    routeIndex = 0;
    lastToast.clear();
    tick();
  }

  function stopNav(){
    running = false;
    hud.classList.remove('on');
    menu.classList.add('show');
    if (watchId !== null){
      navigator.geolocation.clearWatch(watchId);
      watchId = null;
    }
    // release the camera so the recording indicator turns off when navigation stops
    if (cam.srcObject){
      cam.srcObject.getTracks().forEach(t => t.stop());
      cam.srcObject = null;
    }
  }

  startBtn.addEventListener('click', startNav);
  resetBtn.addEventListener('click', resetNav);
  stopBtn.addEventListener('click', stopNav);
</script>
</body>
</html>

Emerging Technology Project Reflective Log

This portfolio documents the design and development of Campus Wayfinder, an emerging-technology prototype designed to help new university students find their way around an unfamiliar campus. The idea was to create an experience where a student could scan a QR code on a poster and instantly access a browser-based augmented reality (AR) navigation system. The experience uses the phone's camera, GPS, and motion sensors to guide the user towards a destination using on-screen arrows and contextual labels for nearby buildings.

The project, unfortunately, was not as straightforward as this. Throughout development, I encountered several technical and platform-related challenges that forced me to rethink my approach. Rather than seeing this as a failure, the process became an important part of the project, as it required problem solving, research, and adapting my design to real-world constraints. This log documents that journey: the difficulties I faced, why I changed direction, and how I still achieved the aims of the assignment.

Initial Platform Choice and Complications

At the start of the project, I planned to build the Campus Wayfinder using 8th Wall, as it is a well-known WebAR platform that supports image targets, world tracking and Visual Positioning System (VPS). My original idea was to place 3D arrows directly onto the campus environment so that users could see them anchored to the ground as they walked. 

However, once I started using 8th Wall, I quickly ran into several limitations that made it difficult to continue. Access to VPS features is restricted and requires a paid subscription. Although it is possible to create locations on the map, the free version does not allow full testing or deployment of custom VPS locations. This meant that even though I could design the experience conceptually, I could not properly test or demonstrate it on campus in real-world conditions.

8th Wall also relies heavily on App Keys and hosted builds, with limited support for local testing. These restrictions made it hard to quickly test ideas, make changes while physically on campus, and, as a result, simulate the Campus Wayfinder properly. These issues were not caused by a lack of understanding or incorrect setup on my part, but by the limitations of the platform when used without a paid plan.

At this stage, I had to make an important decision. I could either reduce the scope of the project or rethink my approach entirely. So, I chose to take a step back and explore alternative ways of achieving the same goal. 

Changing Direction and Rethinking the Approach

The goal of this assignment was not to use a specific AR platform, but to create a project that explores emerging technologies, demonstrates technical understanding, and shows thoughtful design. I realised that the core of my project was not tied to 8th Wall itself, but to the idea of combining real-world movement with digital guidance. After some research, I found that modern mobile browsers already provide access to many of the features needed for this type of experience, such as camera access, GPS tracking, and device orientation sensors.

This led me to change direction and build the project using HTML, CSS, and JavaScript instead. By using these, I was able to recreate much of the intended functionality without relying on a restricted platform. The experience could still be launched by scanning a QR code, show a live camera feed, respond to the user's location and movement, and guide them towards a destination. The transition was made easier by my previous experience using HTML, CSS, and JavaScript in earlier projects. Having this background meant I was already comfortable building layouts, handling interactions, and debugging issues, which allowed me to focus on solving higher-level problems rather than learning everything from scratch. I also watched several video tutorials and read blog posts to better understand geolocation, device orientation, and mobile browser permissions.

Use of Emerging Technologies

The final version of Campus Wayfinder does not use native AR frameworks like ARKit or ARCore; instead, it applies key principles of augmented reality. The experience overlays digital information onto the real world through a live camera feed and reacts to the user's physical movement using GPS and compass data.

This WebAR-style experience is an emerging approach because it prioritises accessibility and ease of use: users do not need to download an app or own a specific device. Instead, everything runs directly in the browser, which aligns with current trends towards lightweight, accessible emerging technologies.

User Experience Design Decisions

User experience was a major focus of the project. New students are often in a rush, unfamiliar with their surroundings, and may already feel overwhelmed. Because of this, I designed the experience to be as simple and clear as possible. 

The experience starts with a short splash screen, followed by a simple menu that explains which permissions are needed and why. Rather than presenting multiple options, the interface guides the user towards one clear action: starting navigation.

Originally, I wanted to place arrows directly onto the ground using world-locked AR. However, due to browser limitations on iOS, this approach was unreliable. Instead, I designed a HUD-style directional arrow that rotates based on the user's compass heading. While this is technically simpler, it works consistently and is easy to understand, which ultimately makes for a better user experience.
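The rotation itself comes down to one subtraction: the bearing from the user to the target, minus the device heading, wrapped into the 0–360° range. A minimal sketch of the same calculation the tick function performs (the `arrowAngle` name is illustrative, not from the build):

```javascript
// Bearing from point a to point b, in degrees clockwise from north
function bearing(a, b) {
  const toRad = d => d * Math.PI / 180;
  const lat1 = toRad(a.lat), lat2 = toRad(b.lat);
  const dLon = toRad(b.lon - a.lon);
  const y = Math.sin(dLon) * Math.cos(lat2);
  const x = Math.cos(lat1) * Math.sin(lat2) -
            Math.sin(lat1) * Math.cos(lat2) * Math.cos(dLon);
  return ((Math.atan2(y, x) * 180 / Math.PI) + 360) % 360;
}

// On-screen arrow angle: target bearing relative to where the phone points
function arrowAngle(targetBearing, deviceHeading) {
  return ((targetBearing - deviceHeading) % 360 + 360) % 360;
}
```

Because the result is relative to the phone, the arrow keeps pointing at the waypoint even as the user turns on the spot.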

To help users feel more confident about where they are, the system also displays temporary messages when they pass key landmarks, such as buildings on campus. This reinforces real-world context and helps users gradually learn the layout of the campus.

Technical Challenges and How I Solved Them

One of the biggest technical challenges was dealing with GPS accuracy. GPS data can be unreliable, especially around buildings. To solve this, I avoided relying on exact coordinates and instead used distance ranges. When the user enters a certain radius around a checkpoint, the navigation updates smoothly rather than requiring perfect accuracy.
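The radius check itself is small: the haversine distance between two coordinates, compared against a threshold. A minimal sketch, mirroring the `dist` helper and the 18 m `ARRIVE_RADIUS` used in the final build:

```javascript
// Haversine distance in metres between two {lat, lon} points
function dist(a, b) {
  const R = 6371000; // mean Earth radius in metres
  const toRad = d => d * Math.PI / 180;
  const dLat = toRad(b.lat - a.lat);
  const dLon = toRad(b.lon - a.lon);
  const x = Math.sin(dLat / 2) ** 2 +
            Math.cos(toRad(a.lat)) * Math.cos(toRad(b.lat)) * Math.sin(dLon / 2) ** 2;
  return 2 * R * Math.atan2(Math.sqrt(x), Math.sqrt(1 - x));
}

// A checkpoint counts as "reached" once the user is anywhere inside its radius,
// so a few metres of GPS jitter cannot block progress along the route.
function reachedCheckpoint(userPos, checkpoint, radius = 18) {
  return dist(userPos, checkpoint) <= radius;
}
```

Tuning the radius was a trade-off: too small and GPS drift near buildings stops the route advancing; too large and waypoints trigger early.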

Another major challenge involved device orientation on iOS. Accessing compass data requires explicit permission, and the browser will not always prompt the user automatically. At first, this caused the directional arrow to stop responding. After researching the issue, I implemented direct permission requests for both motion and orientation sensors and added fallback logic to handle different browser behaviours. This process helped me gain a much better understanding of how mobile browsers manage sensor data.
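The fallback itself can be isolated as a small pure function: iOS Safari exposes a ready-made `webkitCompassHeading`, while other browsers only provide `alpha`, which runs in the opposite direction. A sketch of the normalisation used in the orientation handler (`compassHeading` is an illustrative name; the event objects in the example are plain mocks):

```javascript
// Convert a deviceorientation-style event into a compass heading (0–360 degrees),
// or null when the event carries no usable sensor data.
function compassHeading(e) {
  if (typeof e.webkitCompassHeading === 'number') {
    return e.webkitCompassHeading;              // iOS Safari: already a true heading
  }
  if (typeof e.alpha === 'number') {
    return ((360 - e.alpha) % 360 + 360) % 360; // other browsers: alpha runs anticlockwise
  }
  return null;                                  // permission blocked or sensor unavailable
}
```

Returning `null` rather than a default angle lets the rest of the app detect a blocked sensor and fall back to showing the absolute bearing instead.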

Design Development and Decision-Making

A key focus throughout the development of Campus Wayfinder was ensuring that design decisions were driven by user needs and real-world context rather than purely aesthetic choices. From the early stages of the project, the aim was to create a navigation experience that felt intuitive, minimal, and suitable for use while moving through a physical environment.

One of the most important design decisions was the choice to prioritise visual guidance over text-based instructions. Traditional wayfinding systems, such as campus signage and public transport environments, highlight the effectiveness of simple directional symbols and high-contrast visuals. These systems rely on quick recognition rather than detailed information, which inspired the use of arrows, icons, and minimal copy throughout the interface. This approach helped reduce cognitive load and allowed users to focus on their surroundings rather than their screens.

Colour and contrast played a significant role in ensuring usability. The chosen colour palette was designed to remain visible in outdoor lighting conditions, while also aligning with an approachable, familiar visual identity (University of Hull colours). High contrast between background elements and interactive components was used to improve legibility, particularly when the experience is used on the move. Rounded UI elements were intentionally chosen to create a friendly and accessible tone, reinforcing the idea that the system is designed to assist rather than overwhelm the user.

Typography was another key consideration. Mukta was chosen because its clear, legible letterforms maintain readability at varying distances and screen sizes. Hierarchy was carefully applied so that essential information, such as direction and distance, always took priority over secondary content. This ensured that users could quickly interpret the interface without needing to stop or concentrate on the screen for extended periods.

Iteration played a central role in shaping the final design. Early concepts explored more complex AR interactions, including placing 3D arrows directly into the environment. However, technical constraints and accessibility considerations resulted in a redesign of the interface. Instead of treating these limitations as setbacks, the design was adapted to focus on a HUD-style navigation system that still responded to the user’s movement and direction. This shift allowed the project to remain functional, accessible, and aligned with the original design goals.

Design tools such as Figma were used extensively to explore layout, spacing, and interaction flow before development began. This helped ensure consistency across screens and allowed design decisions to be tested visually before being implemented in code. Adobe Illustrator was used to develop the project’s branding and logo, reinforcing a cohesive visual identity across both physical posters and the digital interface.

Overall, the design process behind Campus Wayfinder demonstrates a balance between creativity and practicality. Each design decision was made with research and user context in mind, resulting in an experience that is both visually considered and functionally effective.

Software Proficiency

The main development was done using HTML, CSS, and JavaScript. My previous experience with these technologies allowed me to quickly prototype ideas and make changes as the project evolved. I combined this existing knowledge with online tutorials and blogs to overcome more complex challenges, particularly those related to mobile sensors and permissions.

CSS played a large role in the project, especially for animations and transitions. Sliding menus, fading headers, and animated arrows were all handled using CSS rather than heavy JavaScript, which helped keep performance smooth on mobile devices.

To host and test the Campus Wayfinder project, I used Netlify, a web-based hosting platform. Netlify allowed me to deploy the project quickly by simply dragging and dropping my HTML, CSS, JavaScript, and asset files into the platform, which automatically generated a live URL for testing and sharing. This deployment process was particularly helpful during development, as it allowed me to test the experience directly on my phone using real camera, GPS, and motion sensor data. The platform also removed the need for complex server configuration, allowing me to focus on refining the interaction design and user experience rather than spending time managing backend infrastructure.

For design, I used Figma to plan layouts, spacing, and interactions before implementing them in code. This helped me visualise the experience and maintain a consistent visual style.

I also used Adobe Illustrator to design the Campus Wayfinder logo. During development, I noticed that the logo text was not displaying correctly on the website. I fixed this by converting the text to outlines before exporting the SVG, ensuring it looked consistent across all devices.

Ethical Considerations

Because Campus Wayfinder uses camera access, location data, and motion sensors, ethical considerations were important. The experience only requests permissions when they are needed and explains clearly why they are required.

No user data is stored, tracked, or shared. The system does not run in the background and does not include analytics or third-party tracking. This follows principles of user consent, transparency, and data minimisation.

Accessibility was also considered. By using a browser-based approach and avoiding app installation, the experience is more accessible to a wider range of users and devices.

Forward Thinking and Future Development

Campus Wayfinder aligns with current and future trends in emerging technologies, particularly WebAR and location-based experiences. As browser support for advanced AR features improves, this type of project could evolve to include world-locked AR elements, indoor navigation, or more advanced spatial mapping.

Future improvements could include dynamic routes, accessibility-focused navigation modes, or integration with native AR frameworks where available. This project shows how emerging technologies can be explored realistically while still leaving room for future expansion.

Conclusion

Campus Wayfinder represents a practical exploration of emerging technologies through design, experimentation, and adaptation. Although my original plan was limited by platform constraints, changing direction allowed me to create a functional and well-considered prototype that still meets the aims of the assignment.

The project demonstrates technical problem-solving, user-centred design, ethical awareness, and forward-thinking development. Most importantly, it reflects the reality of working with emerging technologies: understanding limitations, adapting ideas, and finding effective solutions within those constraints.

Categories
Professional Portfolio Design Year 3

Professional Portfolio: Proposal

Proposal Brief

5 minute Presentation

Creative Boom Article Review

This Creative Boom article explores how different designers responded to a branding challenge to create a visual identity for a fictional coastal music and arts festival. The article showcases diverse design responses, each using typography, colour, and illustration to reflect the emotional experience of the event. It highlights that successful branding for music and cultural venues goes beyond visual aesthetics: it should evoke an emotional connection and create a sense of place.

A key insight from this article is the importance of storytelling through design. Many participants combined retro visual styles with modern graphic design to represent the festival's sound and movement. This demonstrates how visual identity can translate music and atmosphere into a cohesive design language. The article makes it clear that effective branding for music-related environments should feel authentic and immersive, giving audiences a taste of the experience before they arrive.

This article supports my proposed rebrand of Solos Music Cafe, as both projects share a focus on creating a music-centred cultural experience rather than simply a logo or a visual system. Like the festival branding examples, my cafe rebrand looks to merge sound, culture, and community into one cohesive identity. It reinforces the idea that good branding should invite people to participate and connect emotionally; in my case, by supporting local musicians and offering an engaging cafe environment. The article's examples of blending retro style with modern design resonate strongly with my project. Solos Music Cafe will include nostalgic musical elements, such as vinyl records, whilst introducing a modern twist through interactive features such as the NFC sticker. This technology allows customers to engage directly with the cafe's music, reflecting how branding can evolve alongside audience behaviour and technology.

In summary, the article validates my approach by demonstrating that branding for music and cultural spaces should combine visual storytelling, emotional design, and modern interactivity to build a meaningful experience for its audience.

References

Creative Boom. (2024) Boom Brief #3: How you met our challenge to brand a coastal music & arts festival. Available at: https://www.creativeboom.com/inspiration/boom-brief-3-how-you-met-our-challenge-to-brand-a-coastal-festival/ (Accessed: 3 November 2025).

Categories
Emerging Technologies

Exploring Emerging Technologies and Immersive Design

360 Content

For the 360-degree content task, I designed and rendered a fully immersive 3D environment in Blender, choosing to create a prison cell scene. I wanted to go beyond simply following the tutorial or example provided, so I built several of the main assets myself, including the bed, window bars, and smaller objects around the cell. I chose this confined setting deliberately because working with a limited space meant I could concentrate on the core principles of 3D environments: lighting, atmosphere, and realism.

Process of adding iron bars to the model

From the beginning, my goal was to make the space feel believable but also emotionally heavy. I used basic compositional and lighting principles to give the scene a distinctive tone. The main light source was a sunlight asset, providing 'daylight' filtering through the barred window, which cast shadows across the floor. I paired this with a ceiling point light to simulate the artificial glow often seen in real prison cells. The combination helped to soften the darker areas whilst still maintaining a slightly moody atmosphere. Getting the balance right between natural and artificial light took a lot of trial and error, but it also helped me better understand how light direction and intensity can completely alter the emotional tone of a 3D environment. The wall and floor textures were sourced for free online; I looked for worn, rustic textures for both to reflect realism.

Blender screenshot showing the positioning of the Sun asset

When it came to camera placement, I applied immersive UX principles related to embodiment and proximity (Sahu, 2025). I set the viewer’s perspective at around human height, giving them the sense of physically standing inside the cell. This choice was really about evoking empathy and discomfort, two feelings that are quite impactful when exploring themes like imprisonment and confinement.

I realised during the process that even without any animated characters, dialogue, or narrative, the environment itself can tell a story. The positioning of objects, the texture of the walls and the way the lights have been set up all work together to create a mood. This is what’s known as environmental storytelling, and it’s something I’ve grown to appreciate much more after completing this task.

This 360 environment could easily fit within a story sphere or interactive documentary, as it invites reflection on broader social issues. I can imagine using it in an educational or social justice context, where viewers could explore the emotional reality of imprisonment in a safe but thought-provoking way. If I were to develop it further, I’d love to add interactive elements such as moving assets, ambient sounds (like footsteps or cell doors closing) or even a brief narrative voiceover to make the experience more dynamic. This task definitely improved my confidence in Blender, particularly in lighting and atmosphere design, and gave me a foundation of skills I can build on for future projects.

WebVR

https://framevr.io/aplacetoremember

For the WebVR task, I created an interactive virtual gallery using FrameVR, a web-based engine that enables users to explore and present in 3D spaces directly through their browsers. I chose this environment to showcase two of my previous projects—one with a 3D model, and the other with a logo design sequence, allowing me to present my work in a more engaging and interactive way than a traditional portfolio.

Cab-E section of the environment
Cab-E Logo Evolution
THRIVE Section of the environment

FrameVR supports a variety of media formats such as videos, images, and 3D models (Benham and Budiu, 2022). This made it possible for me to combine static and moving content to form a multi-media experience. I arranged my assets using spatial hierarchy, giving each project its own dedicated section with clear sightlines so visitors could intuitively know how to navigate.

THRIVE video Ad

A big part of this task involved thinking like a curator rather than just a designer. I had to decide where users would spawn, how they would move through the space, and what they would see first. These decisions were guided by principles of immersive UX and spatial narrative (Sahu, 2025). I added a few interactive features such as clickable videos and a magnifying glass tool so users could zoom in on smaller details. Additionally, I imported a 3D model of an energy drink can from a previous project and applied a gentle rotational animation. It’s a small detail, but it helped to bring a sense of movement to the gallery without becoming a distraction. This subtle use of motion reflects Eriksen’s (2023) point that animation should enhance engagement rather than overwhelm it.

Example of using the magnifying glass feature

Creating this gallery taught me that digital portfolios can be more than a collection of images: they can become immersive storytelling spaces. I noticed that giving users the freedom to move around and explore encouraged curiosity. Some visitors might spend longer looking at one project than another, which makes the experience more personal. It’s also interesting how translating 2D work into 3D environments changes its impact. For instance, my flat logo design felt more substantial when placed in a three-dimensional setting.

If I were to continue developing this, I’d like to turn it into an online exhibition space that could be used for professional showcases or even client presentations. The biggest challenge was balancing visual aesthetics with usability—making sure users could navigate easily without feeling confused or distracted. In this task I learned a lot about how digital spaces can convey identity and personality just like a physical gallery would.

Augmented Reality & UX

For the Augmented Reality (AR) task, I decided to take a more playful approach by designing a mobile-based prototype in 8th Wall, where users can throw a virtual ball at 3D objects to score points. My main inspiration came from gesture-based games like Pokémon GO, where the user throws a ball to catch creatures and progress.

A screenshot to show the depth and assets of my project in 8th Wall

The interaction would be simple: users tap or swipe on their phone screen to throw the virtual ball, while moving their device to aim at the target objects. For this task, however, I focused on the physics and the basic mechanics of the prototype. I paid particular attention to user safety and ergonomics, which is a key part of UX design. Since the experience uses the device’s camera to overlay digital content onto the real world, users can still see their surroundings, which helps prevent motion sickness or accidental collisions. This aligns with Eriksen’s (2023) recommendations around maintaining freedom of movement in immersive UX.

On the technical side, I modelled the target objects (bowling pins) in Blender and exported them as .glb files into 8th Wall. Setting up the physics interactions was by far the most challenging part. Initially, the pins simply fell through the floor because the system couldn’t detect their collision boundaries. After some troubleshooting, I fixed this by applying a capsule collider, which allowed the virtual ball and pins to interact realistically. This problem-solving process gave me a much better understanding of how physics simulation works in AR environments, and in 8th Wall specifically.

Bowling Pin model created in Blender
Manual Capsule Collider settings
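8th Wall's physics system handles the collision maths internally, but the geometry a capsule collider tests is straightforward: a capsule is a line segment with a radius, and the ball overlaps it when the distance from the ball's centre to that segment is within the two radii combined. A minimal Python sketch of that test (the pin dimensions below are illustrative, not taken from my project):

```python
from math import sqrt

def point_segment_distance(p, a, b):
    """Shortest distance from 3D point p to the line segment a-b."""
    ax, ay, az = a; bx, by, bz = b; px, py, pz = p
    abx, aby, abz = bx - ax, by - ay, bz - az
    apx, apy, apz = px - ax, py - ay, pz - az
    ab_len_sq = abx*abx + aby*aby + abz*abz
    # Project p onto the segment, clamping to the endpoints.
    t = 0.0 if ab_len_sq == 0 else max(0.0, min(1.0, (apx*abx + apy*aby + apz*abz) / ab_len_sq))
    cx, cy, cz = ax + t*abx, ay + t*aby, az + t*abz
    return sqrt((px - cx)**2 + (py - cy)**2 + (pz - cz)**2)

def sphere_hits_capsule(ball_center, ball_radius, cap_bottom, cap_top, cap_radius):
    """The ball collides when its centre is within both radii of the capsule's axis."""
    return point_segment_distance(ball_center, cap_bottom, cap_top) <= ball_radius + cap_radius

# Hypothetical pin: capsule axis from (0, 0, 0) to (0, 0.3, 0), radius 0.05 m.
print(sphere_hits_capsule((0.1, 0.15, 0.0), 0.06, (0, 0, 0), (0, 0.3, 0), 0.05))  # True: overlap
print(sphere_hits_capsule((0.5, 0.15, 0.0), 0.06, (0, 0, 0), (0, 0.3, 0), 0.05))  # False: too far
```

Without a collider like this, the floor and pins have no boundaries to test against, which is exactly why the pins initially fell straight through.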

The prototype demonstrated how AR can be used not just for entertainment but also for social and physical engagement. I could see this concept evolving into a multiplayer mobile game where users compete in real-world settings, turning any open space into a bowling alley. Visually, I made sure the scene had bright colours and clear contrasts to keep the focus on the action. This minimalist approach is something I’d like to carry forward if I were to continue this, as it maintains accessibility while enhancing immersion.

Overall, this project deepened my appreciation for user experience design in AR—particularly the importance of balancing realism with usability. AR’s strength lies in its ability to merge the digital and physical worlds, and even a simple prototype task like this shows how creative and interactive it can be.

VR Art

The VR Art task, created using Open Brush, was one of the most creatively freeing parts of this series of tasks. Working in a virtual 3D drawing space offered a completely different way of thinking about art and storytelling. Instead of being limited to a flat canvas, I could draw in mid-air, surrounding myself with designs and vibrant colours.

At first, I focused on getting used to the controls and experimenting with different brushes and effects. I started by sketching simple objects—a house, a car, a few abstract shapes—mainly to test how perspective worked in three-dimensional space. One of the biggest challenges I encountered was depth alignment. Without a physical reference point, it’s surprisingly easy to misjudge distances, leading to parts of the drawing overlapping in strange ways. However, over time, I developed a better sense of spatial awareness and hand-eye coordination, which are both crucial for immersive design work (Chang, 2025).

The second time I had access to the VR headset, I decided to draw a house again, this time trying to improve the depth alignment and build a small scene. I used a table as a level reference so the floor was guaranteed to be straight.

Timelapse of me creating a house

What I found most interesting about this task was the emotional experience of creating in VR. Being inside the artwork gave me a sense of presence that’s hard to achieve in traditional digital work. Mistakes felt less frustrating because they could be fixed instantly, and I had no limit on where I could draw. This sense of embodiment and freedom aligns closely with what Polydorou (2024) describes as the connection between immersion and emotional engagement in VR storytelling. This exercise really reinforced how VR art can merge creative intuition with UX design principles, encouraging artists and designers to think spatially rather than flatly.

Conclusion

Across all four tasks—360 Content, WebVR, AR & UX, and VR Art—I’ve learned how immersive technologies reshape the way stories can be told and how experiences are designed. Each tool brought new challenges but also new ways to connect creativity with technical understanding. These tasks have strengthened my confidence in using immersive tools and have shown me how technology can be used to create meaningful, engaging, and emotional experiences. Looking ahead, I’d like to continue exploring one of these four areas, AR & UX, and use it to help me create a final project for this module. I believe AR is the perfect tool for me as a graphic designer because it allows me to combine my 2D skills and illustrations from Photoshop and Illustrator with my 3D work in 8th Wall to create interactive material.

References

Benham, S. and Budiu, R., 2022. The usability of augmented reality (AR) [online]. Nielsen Norman Group. Available at: https://www.nngroup.com/articles/ar-ux-guidelines/ [Accessed 25 October 2025].

Chang, S., 2025. The impact of digital storytelling on presence, immersion and user experience in VR [online]. MDPI Sensors. Available at: https://www.mdpi.com/1424-8220/25/9/2914 [Accessed 25 October 2025].

Eriksen, M., 2023. 6 UX design principles for augmented-reality development [online]. UXmatters. Available at: https://www.uxmatters.com/mt/archives/2023/03/6-ux-design-principles-for-augmented-reality-development.php [Accessed 25 October 2025].

Polydorou, D., 2024. Immersive storytelling experiences: a design methodology [online]. Digital Creativity, Taylor & Francis Online. Available at: https://www.tandfonline.com/doi/full/10.1080/14626268.2024.2389886 [Accessed 25 October 2025].

Sahu, S., 2025. Designing for the immersive world: a UX designer’s guide to AR, VR, and XR [online]. UX Planet. Available at: https://uxplanet.org/designing-for-the-immersive-world-a-ux-designers-guide-to-ar-vr-and-xr-c2414802be59 [Accessed 25 October 2025].

Yang, S., 2023. Storytelling and user experience in the cultural metaverse [online]. Heliyon, Elsevier. Available at: https://www.sciencedirect.com/science/article/pii/S2405844023019667 [Accessed 25 October 2025].

Part Two: Research Proposal

Introduction

The Campus Wayfinder project proposes the creation of an interactive AR navigation tool designed specifically for new students, visitors, and staff at the University of Hull. The system aims to enhance spatial orientation and user experience by providing a digitally assisted guide that overlays navigation information directly onto the physical environment. This project combines emerging AR technology with principles of accessibility and visual communication, which are key elements within graphic design. The proposed prototype will utilise 8th Wall, a web-based AR platform, and Blender for 3D modelling to produce an engaging navigational experience.

The research investigates how AR can bridge the gap between digital wayfinding and physical exploration, turning traditional campus maps into interactive tools that promote engagement and confidence amongst new users. This project also considers broader implications of ethical data use, accessibility, and inclusivity in AR design. Through design thinking, iterative testing, and UX principles, the project aims to demonstrate how we as graphic designers can apply emerging technologies innovatively and responsibly.

Research Overview

The goal of the Campus Wayfinder is to create an immersive and user-friendly AR experience that helps users navigate the University of Hull campus more easily. This project combines graphic design, UX design, and emerging technology, demonstrating how traditional design can evolve into interactive, spatial media.

Purpose and Context

Navigating a new university campus can be overwhelming, especially for first-year students and visitors during open days or events. While static maps and signposts provide guidance, they often lack interactivity and accessibility, and can easily become outdated. The purpose of the AR Campus Wayfinder is to reimagine the wayfinding process by introducing an AR experience that provides 3D visual directions, real-time orientation cues, and spatial awareness through a smartphone or tablet camera.

The AR experience would be activated by scanning a poster or QR code placed around campus (e.g. at entrances or key buildings). Once activated, users would see an AR overlay featuring a 3D model of the campus built in Blender, complete with markers, arrows, and animated pathways leading to destinations such as the library, student union, or lecture halls.
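The animated pathways imply a route between a start point and a destination. Although this prototype would hard-code its routes, a full system could compute them with a standard shortest-path search over a graph of campus waypoints. A hedged Python sketch, with invented building names and walking distances purely for illustration:

```python
import heapq

# Hypothetical walking distances (metres) between campus landmarks;
# a real deployment would survey these, since this design has no live GPS.
CAMPUS = {
    "Main Entrance": {"Library": 120, "Student Union": 200},
    "Library": {"Main Entrance": 120, "Lecture Hall": 90},
    "Student Union": {"Main Entrance": 200, "Lecture Hall": 150},
    "Lecture Hall": {"Library": 90, "Student Union": 150},
}

def shortest_route(graph, start, goal):
    """Dijkstra's algorithm: returns (total_distance, list_of_stops)."""
    queue = [(0, start, [start])]
    visited = set()
    while queue:
        dist, node, path = heapq.heappop(queue)
        if node == goal:
            return dist, path
        if node in visited:
            continue
        visited.add(node)
        for neighbour, step in graph[node].items():
            if neighbour not in visited:
                heapq.heappush(queue, (dist + step, neighbour, path + [neighbour]))
    return float("inf"), []

print(shortest_route(CAMPUS, "Main Entrance", "Lecture Hall"))
# prints (210, ['Main Entrance', 'Library', 'Lecture Hall'])
```

The returned list of stops is exactly what the AR overlay would need in order to draw its sequence of arrows and animated path segments.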

UX principles such as embodied cognition have been built into this design, enhancing the user’s understanding through physical interaction with the environment (Benham and Budiu, 2022). The idea is to empower users with spatial awareness by combining real-world visuals with digital augmentation — allowing for intuitive, immersive navigation rather than abstract map-reading.

Scope and Aims

The Campus Wayfinder will function as a design-led prototype demonstrating how UX principles and graphic design can be applied to a university environment. The project aims to:

  • Explore the relationship between spatial UX design and physical navigation. 
  • Develop a 3D visual model of the campus using Blender, exported as .glb files for AR integration. 
  • Prototype interactive elements including directional arrows, information panels and animated routes. 
  • Help students become more familiar with their campus and reduce the anxiety and disorientation caused by getting lost or not knowing where to go.

The final prototype will not include real-time GPS data, as this is not achievable using 8th Wall. Instead, it will conceptually simulate how a full system could function through carefully designed interactivity, clear visuals, and animated transitions.

User Experience (UX) Considerations

The primary audience includes new students, visitors, and staff members who are unfamiliar with the campus layout. The secondary audience includes future students exploring the university during open days or events. The design will prioritise:

  • Ease of use: The application requires no installation and should load instantly via a QR code. 
  • Visual clarity: The use of university brand colours and simple icons will not only ensure immediate recognition but also reduce cognitive load. 
  • Interactivity: Users can tap on buildings to reveal names and information or select a destination to visualise a suggested route. 
  • Spatial comprehension: The 3D map will be slightly elevated with an isometric perspective, helping users to understand layout and distance. 

The system will aim to foster a sense of orientation and belonging, encouraging exploration rather than frustration. 

Ethical considerations

Because the project involves interactive media and environmental data, ethics are critical in emerging-tech design:

  • Privacy: The AR system will operate locally, without collecting user data or tracking location. 
  • Safety: The passthrough camera view allows users to remain aware of their surroundings whilst navigating.
  • Accessibility: Colours and text will meet contrast standards so they remain legible in outdoor environments. 
  • Testing consent: Any user testing will be voluntary and anonymised, with clear participant consent forms.
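The contrast point can be checked numerically: WCAG 2.1 defines a contrast ratio based on relative luminance, with 4.5:1 as the minimum for normal text. A small Python script using the published WCAG formula could vet candidate colour pairs before they reach the prototype:

```python
def relative_luminance(rgb):
    """WCAG 2.1 relative luminance from 0-255 sRGB values."""
    def channel(c):
        c = c / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """WCAG contrast ratio; >= 4.5 passes AA for normal text."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# White text on black gives the maximum possible ratio of 21:1.
print(round(contrast_ratio((255, 255, 255), (0, 0, 0)), 1))  # prints 21.0
```

Outdoor legibility is harsher than a screen test suggests, so aiming well above 4.5:1 for the route arrows would be sensible.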

Inspiration and Case Studies

This project draws inspiration from two main applications. The first is Marks & Spencer’s (M&S) in-store AR Wayfinder app, which uses augmented reality interface overlays to guide customers through retail spaces with ease. The second is Google Maps Live View, which integrates directional arrows into navigation and lets the user view a location in first person through their smartphone. Both demonstrate how spatial information can be communicated in intuitive ways.

Project Plan 

Methodology and Approach

The research adopts a combined Design Thinking and Agile methodology. Design thinking enables creative exploration through empathy, ideation, prototyping, and testing, whilst Agile methods (Scrum) structure progress into measurable sprints. A sprint is a short, fixed-length period during which a team works to complete a set of prioritised tasks and deliver a tangible, potentially releasable product increment.

User Stories (Scrum Format) 

  1. As a new student, I want to scan a QR code to instantly access an AR map so I can find my way around campus. 
  2. As a visitor, I want to explore the 3D models and be able to identify key buildings without the need to download an app. 
  3. As a student with accessibility needs, I want clear visual and audio cues that make it easier to navigate without reading smaller text. 
  4. As a staff member, I want to be able to recommend this AR tool to help visitors find my department easily.
  5. As a designer, I want to showcase how AR can enhance physical spaces through immersive storytelling.

Milestone | Tasks | Timeframe | Deliverables
Research & Ideation | Analyse AR navigation examples and competitors | Weeks 1-2 | Research summary, moodboards
Concept Development | Sketch wireframes, plan user journey, poster mockups | Weeks 3-4 | Concept sketches, storyboards
3D Modelling | Model campus and create direction assets | Weeks 5-6 | .glb models and materials
8th Wall Prototype | Build AR experience, import 3D models | Weeks 7-8 | Working prototype
Testing & Iteration | Conduct informal tests with 3-5 students, improve based on feedback | Week 9 | UX testing report and improvement to-do list
Final Presentation | Record screen capture of AR in use, create any supporting visuals | Week 10 | Finalised prototype and presentation slides

Task board and Workflow

Tasks will be tracked via a Trello board divided into ‘To-Do’, ‘In Progress’, ‘Testing’, and ‘Complete’ columns. Each milestone will have specific subtasks, e.g. ‘Export .glb model from Blender’.

Tools and Technologies

These are the tools and applications that will most likely be used to complete this project:

  • 8th Wall: For AR development and scripting interactions
  • Blender: For creating optimised 3D assets and exporting .glb models.
  • Adobe Illustrator and Photoshop: For poster design and interface graphics.
  • Trello: For workflow management.
  • Figma: For prototyping UI overlays and visual testing. 

UX and Design Principles

The AR Wayfinder will be guided by UX and spatial design principles: 

  • Visibility of interaction confirmation: Provide feedback when users interact with an element, e.g. a highlighted building animation.
  • Recognition over Recall: Present all key destinations on the map rather than making users rely on memory. 
  • Minimalist Design: Use clear, colour-coded arrows and icons without unnecessary clutter. 
  • Consistency and Branding: Ensure colours and typefaces align with the University of Hull’s identity system. 

Anticipated Challenges

  • Achieving accurate spatial alignment between the 3D models and the real-world environment.
  • Ensuring performance is stable across different devices.
  • Accounting for diverse user groups when designing for accessibility. 

The camera will be anchored at eye-level perspective, ensuring the user feels present within the environment rather than perceiving separate, detached models (Eriksen, 2023). This design choice supports immersive UX theory and enhances user embodiment within the digital-physical interface.

Concept Storyboard

Storyboard

Visual Style and Tone

The AR interface will follow a minimalist, modern visual identity with vibrant greens and blues, referencing the University of Hull’s branding. Arrows and route lines will use glowing gradients for clarity in outdoor light. Typography will prioritise legibility, using bold sans-serif fonts such as Helvetica or Poppins. To maintain visual interest, subtle animations will be added without causing distraction or motion sickness.

User Journey

The user journey is designed to be smooth and instant, requiring no installations or logins. QR codes are ubiquitous in today’s world, so the simplicity of scanning and interacting with them encourages repeat engagement and reduces technological barriers, a crucial UX factor for first-time AR users.

Reflection and Emerging Tech Justification

Why AR?

Augmented reality is one of the most promising emerging technologies for spatial communication, as it can seamlessly combine the digital and physical worlds. For graphic designers, AR can shift flat 2D work into spatial experiences, merging visual communication, interactivity, and immersion.

Innovation and Trends

The innovation of this project comes from its accessibility and its purpose. Unlike high-budget corporate AR tools, this project uses web-based AR with no downloads required, making it practical for public use. This reflects a growing trend in UX design: frictionless access (Eriksen, 2023).

AR continues to redefine how information is communicated, consumed, and visualised, and its integration within educational and navigational contexts highlights its relevance as an emerging design tool. The project also addresses sustainable design thinking by replacing printed maps with digital overlays.

Design Thinking

After researching design strategies, I settled on the Design Thinking framework: a non-linear, iterative methodology for creative problem-solving that focuses on understanding user needs through five stages: empathise, define, ideate, prototype, and test. This framework will form the foundation of this project’s research.

  • Empathise: This stage involves understanding the challenges new students face navigating campus. This insight gave me the idea to create a digital orientation aid rather than a traditional map that can easily become outdated. 
  • Define: The project’s goal is to reduce spatial confusion through accessible visual storytelling.
  • Ideate: I explored multiple concepts, including gamified campus tours and AR treasure hunts, but they had already been done before. The wayfinder was chosen for its practicality and its inclusive potential. 
  • Prototype: Blender was chosen to design the environment, and 8th Wall because it enables quick iteration. Animation and interaction will be refined based on student feedback and usability principles. 
  • Test: Feedback sessions will assess clarity, visual comfort, and intuitiveness. 

Success Criteria

The success of the Campus Wayfinder will be measured by:

  • Ease of use: Users navigate intuitively without instruction.
  • Engagement: Users spend longer exploring due to interactive features. 
  • Accessibility: The design supports diverse users effectively. 
  • Reliability: The AR functions smoothly across different devices and environments. 

Future Development

With further development, the prototype could integrate real-time GPS and indoor positioning systems, expanding to support live navigation. It could also include voice-assisted guidance for users with visual impairments, improving the system’s accessibility. The concept could later be extended as part of a university-wide welcome campaign, not only reinforcing student engagement through emerging media but also improving accessibility and usability for universities across the country.

Conclusion

This project proposal demonstrates how graphic designers can apply emerging technologies in meaningful ways. By using AR to enhance real-world navigation, the Campus Wayfinder bridges digital communication and physical experience. UX design, 3D modelling, and AR development are combined to create a practical, innovative solution for students and visitors. In developing the project, design thinking will guide every stage, from mapping user empathy and prototyping to user testing and refinement, ensuring that innovation is always driven by human needs. The wayfinder is not only a digital navigational tool but also a reflection of the future of graphic design, where interactivity, accessibility, and storytelling coexist. As this technology continues to evolve, we as designers must adapt, refine, and reimagine how information is delivered. This project embodies that evolution, turning a traditional campus map into a dynamic, immersive, and interactive experience.

References

Benham, S. and Budiu, R., 2022. The role of spatial cognition in AR navigation design [online]. Nielsen Norman Group. Available at: https://www.nngroup.com/articles/ar-navigation-design [Accessed 24 October 2025].

Eriksen, M., 2023. Designing frictionless experiences in augmented reality [online]. UX Collective. Available at: https://uxdesign.cc/frictionless-ux-ar [Accessed 24 October 2025].