Module 3 Activity Research

Weekly Activity

Minh Vo


Module 3

Module 3 focuses on turning our earlier prototypes into something that actually works in real conditions. This is where I moved from testing ideas on screen to connecting real sensors, refining the UI, and making the physical model respond to live data. Most of the work involved hands-on testing, fixing issues, and improving how the digital and physical parts communicate.

Workshop 1

Workshop 1 was a bodystorming exercise where our team acted out a clinic visit to understand real doctor–patient communication. By role-playing as doctors, nurses, and patients, we noticed how things like eye contact, tone, posture, and even using a phone for symptom checks affected trust and comfort. The photos capture moments from routine checkups to sudden “emergency” scenarios, helping us see how interaction design can make healthcare conversations clearer and less stressful.

For our bodystorming activity, we recreated a clinic visit. Valerii and I acted as nurses, Keegan took on the role of observer and note-taker (or head nurse), and Yiyang, Henry, and another classmate played the patients. This scene shows us setting up the consultation and explaining how the exercise would run.

As a nurse, I greeted the patient and asked about their symptoms, maintaining eye contact and using a calm tone to help the patient feel comfortable sharing their concerns. During the consultation, I took notes on the patient's symptoms and medical history, and I explained the next steps clearly so the patient understood what to expect. At one point, a patient experienced a sudden 'emergency' scenario: I had to quickly assess the situation, provide reassurance, and call for additional help while keeping the patient calm. After the consultation, I gave the patient instructions for follow-up care and answered any remaining questions to make sure they felt supported and informed.

Activity 1: My Research

The first screenshot captures the very beginning of my prototype testing process. Before I could experiment with any interactions, I needed to organize all the UI elements and confirm which variables would eventually connect to the Arduino sensor. Seeing everything laid out like this helped me understand what I already had control over, such as the screen layout, the temperature and humidity labels, and the overall flow of information. It also made me more aware of the unknowns I still needed to figure out, especially how these elements would behave once real data started coming in. This setup became the foundation for the rest of my testing work.

The next step was testing how the prototype responds to different temperature ranges. I wasn't fully sure the conditional logic would update the UI the way I intended, so creating these rules helped me explore that uncertainty. I set up conditions for cold, pleasant, and hot states, each connected to changes in text, colour, and icons. While building this, I realized how many small decisions go into a dynamic feedback system and how easily something can break if a value doesn't behave as expected. This step helped me move from simply designing the interface to understanding how it reacts when data changes.

This screenshot shows my temperature logic being triggered in the prototype. I tested the cold state by manually setting a low temperature value, which activated the red alert message at the top of the screen. Even though the temperature and humidity numbers still display '0', this is intentional: at this stage I am only testing the local logic for the temperature status and the alert message rather than pulling actual sensor data. Seeing the warning state appear confirmed that the conditions I set were working correctly. It also made me think more about how the message, colour, and layout communicate urgency to the user. Testing this locally allowed me to refine the interaction before connecting it to live values later.

This screenshot shows the pleasant temperature state in my prototype. After switching the test value to fall within the balanced range, the background shifted to a green tone and the alert message updated to a calm, reassuring statement. As in the previous test, the indoor temperature and humidity numbers remain at zero on purpose because I am still testing the local logic for the status message and colour changes rather than displaying actual sensor data. Seeing the pleasant state appear confirmed that my conditional setup works across multiple ranges, not just the extreme cases. This helped me understand how the interface communicates comfort and balance, and it allowed me to compare how each state feels from the user's perspective.

The final screenshot shows the hot temperature state after I tested the upper range of my conditions. When I set the test variable to a high value, the background changed to a strong red tone and the alert message updated to warn the user about elevated temperatures. The temperature and humidity values again remain at zero on purpose because I am only testing the local logic for the temperature status and alert message, not the live sensor data. Seeing the system respond correctly to the hot range confirmed that all three temperature states function the way I intended. It also made me think more about how urgency is communicated visually and whether the transition into this state feels clear and intuitive for the user.
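To make the three ranges concrete, below is a minimal sketch of the same cold/pleasant/hot classification, written as Arduino-style C++ rather than ProtoPie conditions. The cutoff values (18 and 26 degrees) and the alert strings are placeholders chosen for illustration, not the exact numbers from my prototype.

// A minimal sketch of the cold/pleasant/hot logic in Arduino-style C++.
// The cutoffs (18 and 26 degrees Celsius) are placeholder values, not
// the exact ranges from my ProtoPie conditions.

enum TempState { COLD, PLEASANT, HOT };

const float COLD_MAX     = 18.0;  // below this, show the cold alert
const float PLEASANT_MAX = 26.0;  // above this, show the hot alert

TempState classify(float tempC) {
  if (tempC < COLD_MAX)      return COLD;
  if (tempC <= PLEASANT_MAX) return PLEASANT;
  return HOT;
}

void setup() {
  Serial.begin(9600);
}

void loop() {
  float testTemp = 30.0;  // manually set test value, like in my local tests

  switch (classify(testTemp)) {
    case COLD:     Serial.println("ALERT: temperature is too low");  break;
    case PLEASANT: Serial.println("Conditions are balanced");        break;
    case HOT:      Serial.println("ALERT: temperature is too high"); break;
  }
  delay(2000);
}

Writing the rules this way also highlights the design decision behind them: each reading maps to exactly one state, so the UI never has to resolve two competing temperature alerts at once.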

Activity 2: My Research

When I connected Yiyang's Arduino sensor to ProtoPie for live testing, the interface immediately received real temperature and humidity values. The sensor detected a high temperature and very dry air at the same time, which caused both alerts to activate together. This moment revealed one of my key Activity 2 unknowns: how the UI handles multiple real-time conditions that overlap. Unlike the controlled testing in Activity 1, the live data shifted slightly with each reading, which made the alert system feel more dynamic but also more sensitive. This helped me evaluate whether the messaging hierarchy, timing, and visual feedback still work when the prototype interacts with real sensor behaviour rather than simulated values.

In the next test, the live Arduino data shifted to high temperature and very high humidity, which immediately activated the alerts for heat and oppressive air. Seeing both warnings appear together helped me check how the UI handles rapid changes in sensor values. This confirmed that the system responds accurately, but it also showed me that I need to refine how multiple alerts are displayed so the screen doesn't feel overwhelming when conditions change quickly.

While testing with the live Arduino sensor, I noticed that the humidity value didn't always update smoothly in ProtoPie. Even though the mapping looked correct, the number sometimes froze or jumped in ways that didn't match the real sensor readings. This made me realize that the issue wasn't the hardware but how ProtoPie processed rapid changes in incoming data. It helped me understand that I need to adjust how the UI handles fast or fluctuating values so the interface remains stable and reliable.

Finally, the sensor was placed close to the lamp model to test how quickly the interface reacts when the temperature and humidity change in real time. As the readings shifted, the background colour and alert messages on the phone updated instantly. Live sensor data updated the phone UI and the lamp model at the same time, confirming that the Arduino, ProtoPie Connect, and the interface were communicating correctly. This test helped me see how the whole system responds together and where small delays or inconsistencies might still occur.
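One fix I explored for the frozen and jumping humidity values is to smooth the readings on the Arduino side before they ever reach ProtoPie. The sketch below is one possible version, assuming a DHT22 sensor on pin 2 (via the Adafruit DHT library) and ProtoPie Connect's serial convention of "messageId||value" lines; Yiyang's actual sensor, wiring, and message names may differ.

// One way to steady jumpy humidity values before they reach ProtoPie:
// an exponential moving average computed on the Arduino.

#include <DHT.h>

#define DHTPIN  2      // data pin for the DHT22 (assumed wiring)
#define DHTTYPE DHT22

DHT dht(DHTPIN, DHTTYPE);

const float ALPHA = 0.2;      // smoothing factor: lower values = steadier output
float smoothedHumidity = -1;  // -1 marks "no reading yet"

void setup() {
  Serial.begin(9600);
  dht.begin();
}

void loop() {
  float h = dht.readHumidity();
  if (!isnan(h)) {
    if (smoothedHumidity < 0) smoothedHumidity = h;  // seed with the first reading
    smoothedHumidity = ALPHA * h + (1 - ALPHA) * smoothedHumidity;

    // ProtoPie Connect serial message, assumed format: "messageId||value"
    Serial.print("humidity||");
    Serial.println(smoothedHumidity, 1);  // one decimal place keeps the UI steady
  }
  delay(2000);  // the DHT22 needs roughly two seconds between reads
}

With ALPHA at 0.2, each new reading only moves the displayed value a fifth of the way toward the raw measurement, so a single noisy sample no longer makes the number jump.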

Project 3 Final Prototype

The final prototype brings together the Arduino sensor, the lamp model, and the ProtoPie interface into one fully working system. The lamp now reacts instantly to temperature and humidity changes detected by the sensor, shifting its light and sending live data to the app. On the phone, the UI updates in real time with new values, colour changes, and alert messages that match the current conditions. After all the testing, the whole system feels much more stable and responsive. Seeing the lamp glow red at the same moment the app displays warnings shows that the digital and physical parts are finally communicating the way I intended.
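As a summary of how the pieces fit together, here is a compact sketch of the whole loop: read the sensor, tint the lamp, and push both values to the phone UI. The RGB LED pins, the DHT22 wiring, the thresholds, and the message names are again assumptions for illustration rather than the exact build.

// Compact sketch of the final system: sensor in, lamp colour out,
// live values forwarded to the ProtoPie interface over serial.
// Pins, thresholds, and message names are assumed, not the exact build.

#include <DHT.h>

#define DHTPIN    2   // DHT22 data pin (assumed wiring)
#define DHTTYPE   DHT22
#define RED_PIN   9   // RGB LED pins driving the lamp (assumed)
#define GREEN_PIN 10
#define BLUE_PIN  11

DHT dht(DHTPIN, DHTTYPE);

void setLamp(int r, int g, int b) {
  analogWrite(RED_PIN, r);
  analogWrite(GREEN_PIN, g);
  analogWrite(BLUE_PIN, b);
}

void setup() {
  Serial.begin(9600);
  dht.begin();
}

void loop() {
  float t = dht.readTemperature();
  float h = dht.readHumidity();
  if (isnan(t) || isnan(h)) return;  // skip failed reads

  // Same three states as Activity 1, now driving the physical lamp.
  if (t > 26.0)      setLamp(255, 0, 0);  // hot: lamp glows red with the alert
  else if (t < 18.0) setLamp(0, 0, 255);  // cold: lamp glows blue
  else               setLamp(0, 255, 0);  // pleasant: lamp glows green

  // Push both values to the phone UI via ProtoPie Connect (assumed format).
  Serial.print("temperature||"); Serial.println(t, 1);
  Serial.print("humidity||");    Serial.println(h, 1);
  delay(2000);
}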

The final prototype in action with the lamp and mobile interface updating instantly based on real temperature and humidity readings.