This project was part of a course in Animal-Computer Interaction, which focuses on designing for non-human users rather than the typical human-centered approach. Together with three classmates, I explored the challenge of designing for a specific animal, which required us to step outside our usual perspective and understand the world from a non-human point of view.
Our solution was an interactive prototype aimed at preventing bird-window collisions, combining behavioral insights with practical design considerations. The project challenged us to apply user-centered design principles in an entirely new context.
Tools and techniques
Brainstorming, 20 questions, interviews, observations, sketching, Crazy 8s, lo-fi/hi-fi prototyping, Wizard of Oz, evaluation, ISO standards.
Duration
Nov 2024 - Jan 2025
The process began with selecting an animal to focus our design on. We used brainstorming with post-its to generate ideas collaboratively, with each team member contributing an animal and related concepts. This method helped us explore a wide range of possibilities and identify promising directions for the project.
We decided to focus on the problem of bird-window collisions, as we found extensive research on the issue but few practical solutions. To refine our approach, we presented our choice to the class and used the 20-questions method, where the audience provided 20 questions to guide our next steps. This helped us identify key challenges and opportunities for our design solution.
"Which types of birds are most affected by this?"
"How can we know that the solution works? How do we measure and test it?"
"What solutions are there right now?"
"How are we going to find people to interview and collect data from?"
"When do birds fly in the most?"
We began by conducting in-depth research on the problem.
One team member spoke with residents in her grandparents’ neighborhood who had experienced collisions.
Another visited a bird sanctuary to observe and learn from birdwatchers about bird behavior.
A third member explored online communities to understand people’s experiences.
I contacted an ornithology lab in Falsterbo specializing in bird research. The lab shared its knowledge of window collisions and referred me to a biology professor at Lund University, whom I interviewed to gain further insights.
All findings were then compiled and organized on a FigJam board, allowing us to identify patterns and key challenges for our design solution.
Birds do not perceive glass as a solid structure; reflections of trees, for example, make them see the window as something they can fly through.
Collisions often occur when birds are stressed, e.g. when they are hunting or being hunted.
The majority of documented deaths are among small birds.
Having curtains or potted plants in the window prevents the mirror effect.
Window decals have some effect, but only if they are placed close together and cover the entire window.
People don't want things on the windows preventing their view outside.
Green lasers have proven to be effective, as birds perceive them as a solid structure (similar to a pipe or pole), which scares them away without causing harm.
There is limited documentation on which bird species collide and when.
To explore the design space, we examined existing solutions and user-driven approaches using the "Six Thinking Hats" method, drawing on insights from our interviews and from discussions in the Facebook group we had explored earlier. This gave us an overview of what worked, where shortcomings existed, and which opportunities could be developed further. Building on these insights, we moved into ideation through brainstorming and sketching to start shaping potential solutions.
With an initial concept in place, we began refining specific components of the prototype. Based on sketches and earlier insights, we identified the needs for a camera, a deterrent mechanism, and a way to detect birds nearby.
Several alternatives were explored, weighing pros and cons in relation to both birds and homeowners. Ultimately, a green laser was chosen as the deterrent, as it proved both effective and less intrusive. To enable accurate bird detection and identification, we integrated the Merlin Bird ID app, powered by eBird’s extensive database.
During prototyping, we applied Houde & Hill’s framework to explore role, look & feel, and implementation, iterating through storyboards, sketches, and a high-fidelity prototype. This process helped us better understand user interaction, though it also highlighted the challenge of testing within ACI, where expert feedback could have added valuable perspectives.
The camera detects nearby birds, and the images are processed by an integrated AI system that identifies the species against a database.
Each detection is logged, building comprehensive documentation of which species collide and when.
When bird activity is detected, a green laser is activated to scare the bird away without harming it.
The device is small and placed near the window, making it invisible from inside the house and leaving the residents' view unaffected. (A simplified sketch of this detection-and-deterrent loop is shown below.)
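To make the intended behavior of the prototype concrete, here is a minimal sketch of the detection-and-deterrent loop in Python. All device interfaces and function names (capture_frame, classify_species, laser.pulse) are hypothetical placeholders, not real Merlin Bird ID or eBird APIs; the sketch only illustrates the flow we designed: detect, log, deter.

```python
# Hypothetical sketch of the prototype's main loop; not production code.
import csv
from datetime import datetime

LOG_FILE = "collision_risk_log.csv"  # assumed location for the species log

def run_detection_loop(camera, classifier, laser):
    """Watch the window area, log each detection, and trigger the deterrent."""
    while True:
        frame = camera.capture_frame()                     # image from window-mounted camera
        species, confidence = classifier.classify_species(frame)
        if species is None:
            continue                                       # no bird in frame

        # Log which species approached and when, building the documentation
        # of collision-prone species over time.
        with open(LOG_FILE, "a", newline="") as f:
            csv.writer(f).writerow([datetime.now().isoformat(), species, confidence])

        # Pulse the green laser briefly to scare the bird away without harm.
        laser.pulse(duration_s=2)
```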
Evaluation and testing were carried out continuously throughout the development process. The low-fidelity prototype was tested by assessing individual components, such as verifying that the green laser effectively startled birds and validating the Merlin Bird ID app’s accuracy on different images.
High-fidelity prototypes were evaluated using a Wizard of Oz approach, with a moderator simulating automated bird detection.
For safety reasons, the laser was later excluded from the final prototype, and moving birds were simulated using printed cutouts. Following ISO recommendations, evaluations were iterative, user-centered, and carefully documented, enabling us to refine the prototypes step by step while keeping both functionality and user experience in focus.
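As a rough illustration of how the Wizard of Oz setup maps onto the same loop, the sketch below swaps the automated classifier for a human moderator who types in what they see. The class and method names are hypothetical and mirror the placeholder interface used in the earlier sketch.

```python
# Hypothetical Wizard of Oz stand-in for the AI classifier during hi-fi testing.
class WizardOfOzClassifier:
    """The moderator acts as the 'wizard', simulating automated bird detection."""

    def classify_species(self, frame):
        answer = input("Moderator - species seen (blank for none): ").strip()
        if not answer:
            return None, 0.0        # no bird reported for this frame
        return answer, 1.0          # moderator input treated as a confident detection
```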
The final product