Author's Note: This repository has been edited to combine the efforts of two modules: 60.003 Product Design Studio and 60.002 AI Applications in Design.

Airlines like Scoot have long faced issues with excessive hand-carry baggage among their customers. As part of the SUTD DAI pillar's 60.003 Product Design Studio, we were tasked with designing a means of preventing late detection of excessive hand-carry baggage, so as to reduce disruptions to boarding procedures.
Our solution, Hoverfly, is equipped with an Intel RealSense 3D camera and load sensors. Hoverfly accurately measures weight and dimensions, ensuring early detection of oversized carry-ons at the boarding gate.
From ideation to final prototyping, the product design process is documented in the slide deck linked below.
Due to NDA restrictions, we are unable to provide the relevant code documentation.
Attachments: Slide Deck
Attachments: Report | Slide Deck
As part of the SUTD DAI pillar's 60.002 AI Applications in Design module, we were tasked with identifying opportunities for design improvement using data-driven text analytics and/or image processing. We were advised to focus on a specific product and to benchmark against competitors where appropriate.
Our end result is a series of Jupyter notebooks with a clear AI workflow for identifying design opportunities.
The workflow is as follows:
- Initial Text Query with ChatGPT
- Initial Image Contextual Query with ChatGPT
- Data Collection and Pre-processing
- Finding Top Words for Data Analysis
- Identification of Design Opportunities with Classified Sentiment Analysis
- Design Justification with Design Structure Matrix
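As a sketch of the "Finding Top Words" step above, here is a minimal word-frequency count using only the Python standard library. The stop-word list and tokeniser are illustrative; the notebooks may use fuller ones (e.g. NLTK's).

```python
import re
from collections import Counter

# Illustrative stop-word list; the notebooks likely use a fuller one
STOP_WORDS = {"the", "a", "an", "is", "it", "and", "to", "of", "in",
              "was", "were", "but", "i"}

def top_words(reviews, n=10):
    """Return the n most common non-stop-words across a list of review strings."""
    words = []
    for text in reviews:
        # Lowercase and keep alphabetic tokens only
        words += [w for w in re.findall(r"[a-z']+", text.lower())
                  if w not in STOP_WORDS]
    return Counter(words).most_common(n)

reviews = [
    "The legroom was cramped and the seat was uncomfortable",
    "Cramped seat but the crew were friendly",
]
print(top_words(reviews, 3))  # 'cramped' and 'seat' each appear twice
```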
The notebooks are meant to be modular and replicable for any product of the user's choice. To link back to our 60.003 research, we chose to identify opportunities for Scoot's Boeing 787s.
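For the "Classified Sentiment Analysis" step, the notebooks presumably use a proper sentiment model; as a toy stand-in, a lexicon-based classifier looks like this (the word lists below are made up for illustration):

```python
# Illustrative sentiment lexicons (not the actual model used in the notebooks)
POSITIVE = {"friendly", "comfortable", "spacious", "clean", "smooth"}
NEGATIVE = {"cramped", "uncomfortable", "delayed", "noisy", "dirty"}

def classify_sentiment(review):
    """Label a review positive/negative/neutral by counting lexicon hits."""
    words = set(review.lower().split())
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"
```

Grouping reviews by label this way lets negative clusters (e.g. "cramped") point at candidate design opportunities.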
Step 1: Clone the repo
git clone https://github.com/hunchongtan/AID_Project_2.git
Step 2: Set up the .env file by copying the .env.example file
cp .env.example .env
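After copying, fill in .env with the keys from Step 3. The variable names come from this README; the values here are placeholders:

```shell
# .env (placeholder values: replace with your own keys; never commit this file)
OPENAI_API_KEY=sk-your-key-here
GOOGLE_API_KEY=your-google-api-key
REDDIT_API_ID=your-personal-use-script
REDDIT_API_KEY=your-secret-token
REDDIT_API_USER=your-reddit-username
REDDIT_API_PW=your-reddit-password
```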
Step 3: Retrieve the following API keys from their respective websites:
OPENAI_API_KEY:
- Sign in to OpenAI and go to https://platform.openai.com/docs/overview
- Expand the left panel (below the OpenAI logo) > Click on API keys > Click on + Create new secret key
- Fill out the required details (refer to the first/left picture): Name: any_name_for_your_app > Click on Create secret key
- Copy the API key as OPENAI_API_KEY (refer to the second/right picture)


GOOGLE_API_KEY:
- Sign in to Google and go to https://console.cloud.google.com/
- Expand the left panel > Click on APIs & Services > Click on Enabled APIs & services > CREATE PROJECT
- Fill out the required details (refer to the first/left picture): Project name: any_name_for_your_app > CREATE
- Expand the left panel again > Click on Enabled APIs & services > Click on ENABLE APIS AND SERVICES (on top) > Scroll down and click on YouTube Data API v3
- Click on Enable > CREATE CREDENTIALS (top right) > Select Public data > Click on Next
- Copy the API Key as GOOGLE_API_KEY (refer to the second/right picture)


REDDIT_API_ID, REDDIT_API_KEY, REDDIT_API_USER, REDDIT_API_PW:
- Sign in to Reddit and go to https://www.reddit.com/prefs/apps
- Click the are you a developer? create an app... / create another app... button
- Fill out the required details (refer to the first/left picture): Name: any_name_for_your_app, select script, description: any_description, about url: leave empty, redirect url: your_socials_link, then click create app
- Copy the personal use script as REDDIT_API_ID and the secret token as REDDIT_API_KEY (refer to the second/right picture)
- Copy your Reddit username and password as REDDIT_API_USER and REDDIT_API_PW respectively
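With all six keys in .env, a quick sanity check confirms the environment is set up. The key names are taken from this README; tools like python-dotenv's load_dotenv() are commonly used to load .env into the environment first, but this sketch uses only the standard library:

```python
import os

# The six keys this README asks for
REQUIRED_KEYS = [
    "OPENAI_API_KEY",
    "GOOGLE_API_KEY",
    "REDDIT_API_ID",
    "REDDIT_API_KEY",
    "REDDIT_API_USER",
    "REDDIT_API_PW",
]

def missing_keys(keys=REQUIRED_KEYS):
    """Return the names of required keys that are unset or empty."""
    return [k for k in keys if not os.getenv(k)]

# Run this before the notebooks to catch setup mistakes early
if missing_keys():
    print("Missing:", ", ".join(missing_keys()))
```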


Working with Sample Data:
If you wish to verify the workflow with sample data (the authors explored Scoot's 787), copy the support and others folders from the sample_data folder to the main folder.
Working with your own Data:
If you wish to try a different product, remember to clear all outputs in the Jupyter notebooks before running the code.
- Follow the instructions in the Jupyter notebooks.
- Edit the data in all the TO DO sections.
- Run the code in the RUN AS INTENDED (DO NOT CHANGE ANYTHING) sections without editing.
- Have fun learning the workflow!
Our work was made possible with the help of the DAI professors and Professor Edwin Koh's research paper on LLM-generated DSMs.
Special thanks to the team behind this project:
- Delphine Sim Yingting (1006986)
- Tan Hun Chong (1006643)
- Cyan Koh Shi-An (1007230)
- Lim Sophie (1007487)
- Lim Ying Xuan (1006960)
- Tan Ze Lin (1007054)