UI/UX Designer


A step-by-step account of our team designing the UI and HUD for the mobile puzzle game SciBeaver Adventures. The article describes all of the major stages of drafting and refining the user interface: from gathering references, designing the layout and grouping of the interface elements, to interaction design and visual storytelling. This UI played an integral part in the success of the project: based on our data, players spent roughly 95% of their in-game time interacting with the interface described below.


Team & Stakeholders

3x Developers, Publisher

Educational puzzle

SciBeaver was a puzzle game with a lifelike light physics simulation for iOS (phones and tablets). To complete the game you had to develop the skill of guiding light through the level by operating optic lenses, prisms, mirrors and other physical elements. The game also told the story of a scientist who was captured by terrorists and escaped using his knowledge of optics and physics.

The game had a steep learning curve: it taught optics and required managing many interactive elements at the same time

The game had a steep learning curve, with a lot of interactive elements available on the screen at the same time. To be successful you had to not only understand how optics works, but also manage all the elements on the screen, sometimes changing their positions, angles, and combinations several times to complete a single level. So we needed to make every possible effort to decrease that complexity, especially during onboarding, and create a UI to match that kind of highly interactive gameplay.

Experience Analysis

I started by analyzing the gameplay and listing all the features that our UI design had to incorporate. I listed both UI elements (buttons, switches, interactive game objects, etc.) and actions a player could perform (pushing, dragging, spinning, swiping, etc.). Then I went through each listed element and added more description and detail: whether an element needed to provide any feedback, whether it had different states, or simply my thoughts on how it might be designed, essentially starting a rough UI design document. This list evolved until there was a solid prototype in place. After that I simply recorded things that needed to be changed or added for the next iteration.
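Such an inventory can also be kept as structured data rather than free-form notes, which makes it easy to query for gaps between iterations. A minimal sketch of the idea (the element names and fields here are illustrative, not our actual design document):

```python
from dataclasses import dataclass


@dataclass
class UIElement:
    """One entry of a rough UI design document."""
    name: str
    actions: list        # gestures the player can perform on it
    states: list         # visual states the element can be in
    feedback: str = ""   # how the element responds to the player
    notes: str = ""      # open design thoughts for the next iteration


inventory = [
    UIElement("mirror", ["drag", "rotate"], ["idle", "selected"],
              feedback="beam updates while rotating"),
    UIElement("pause button", ["tap"], ["default", "pressed"],
              feedback="opens pause popup over the play area"),
]

# Quick check between iterations: which elements still need feedback defined?
missing_feedback = [e.name for e in inventory if not e.feedback]
```

A list like this doubles as a checklist when moving from paper to digital prototypes: any element with an empty field is a known open question.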

I started by listing all the required UI and interactive elements that had to be a part of the experience

In SciBeaver you needed to place optical elements on the screen in a sequence that would allow the light to travel from the start of the level to the exit door. Once you had the light near the door, you could use the super lens to burn the door and move to the next level. As I've said before, to be successful you had to not only understand how optics works, but also manage all the elements on the screen, sometimes changing their positions, angles, and combinations several times to complete a single level. My first task was to ease adoption and the learning curve, so I decided to start by gathering references from similar games.


I collected references from games that had similar gameplay or used similar concepts to ours: a free screen to drag objects around, elements that could be taken on and off the screen, etc. Luckily, we had a decent enough understanding of our audience to know what other games of this type they enjoyed.



The first step was to logically separate areas on the screen that would contain UI elements of different nature. The placement of the groups of elements was determined based on usability tests, industry standards and common sense. For example, all buttons that were not directly a part of the game and caused a popup window to open over the play area, like the pause button or hints, were placed at the top, alongside level progress information such as the number of bulbs collected, friends saved, the timer, the level number, etc. The buttons were placed on the sides, following the best usability practices of a mobile UI layout, while non-interactive information was pushed to the centre top.


All available optical elements were placed at the bottom left. This decision was made with right-handed people in mind: when a player needed to drag an element to the play area, she most frequently used the forefinger of her right hand, in some cases covering up the lower right part of the screen completely.

Elements of similar nature were combined in groups and then assigned places on the screen based on the results of the usability tests

The battery and the super lens activation button were placed in the lower right part of the screen, since it felt natural and comfortable to press a huge button that destroyed things with your right thumb. Also, this button was most often pressed close to level completion, as a final action before winning or losing. Its placement reinforced that context of use: after a lot of interactions you would finally rest your hand on the button and logically get the result of all your previous efforts.

The main character and the door were logically placed at opposite ends of the screen, facing each other. That was consistent across all the levels, as an additional reminder of the player's goal and the main problem of the game. Plus, it visually marked the level's start and finish.


To test and get a feeling for how the UI might work, I started with a paper mockup of the gameplay, taking a blank piece of paper for a screen and, step by step, adding different game and UI elements. We also wanted to bring a little fun to the whole process, so the paper prototype ended up being a fully playable and adjustable level.

The paper prototype allowed us to determine the best ways to interact with numerous game objects

The paper prototype almost precisely mimicked the future digital version, with the majority of the optic lenses and other game elements being separate pieces of paper so they could be dragged around the play area. This allowed us to better understand the experience we were building. My main priorities at this stage were two things. First, to understand the best way to move, rotate and adjust the optic elements. This was important because, as I've mentioned before, this was the main part of the gameplay and players would be managing those objects constantly. And second, to figure out what kind of feedback was required for the changing states of the game objects. This was more a task for the imagination, since the paper prototype was not interactive and just helped to create and think through various possible scenarios.

The UI came together gradually, through continuous testing, adjusting various combinations and documenting the outcomes. The paper prototype was a great starting point and we learnt a great deal from it, but at some point it became ineffective and we moved on to creating a digital one, based on everything we had learned on paper.

Reinforcing the Right Mental Model

I wanted the UI to also help the player build the right mental model of how game elements worked and how they were connected. We had already added a lot of direct explanations and signifiers, like arrows, tutorial animations, hints, loading screen messages, pre-level pop-ups, etc. In addition, we provided indirect explanations, like short story videos at the beginning of the game to set the stage and explain what kind of world the player was getting into. Yet I still wanted the UI to visually explain itself, tell a story about its elements and, through those details, help the players create a better mental model of how objects on the screen functioned and were connected with each other.

The UI had to visually explain itself, the elements had to tell a story, and through those details set expectations, explain behaviours and help players create a better mental model of the game world

To start off, we enhanced the feeling that the game world was alive and welcomed interaction. This was done through the use of animations on various game objects and parts of the background. If a player tapped the main character, he would see a popup describing the steps he needed to take to complete each level, just as if the player had poked him for a hint.


Light Particles

Next we turned our attention to how light was displayed. Initially it was just a semi-transparent white gradient, which felt static and boring. One could guess it was light, since it emerged from a lamp, but that was the only cue. We decided to add particles of dust that would float through the light and give players the feeling that they were dealing with real light, not just a white gradient.
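A common way to build this kind of effect is to scatter a handful of slow-drifting particles along each beam segment. A rough sketch of the idea, not our engine's code, with all coordinates and rates invented for illustration:

```python
import random


def spawn_dust(beam_start, beam_end, count=20, spread=4.0):
    """Scatter dust particles along a straight horizontal beam segment.

    Each particle gets a random position along the beam, a small
    perpendicular offset and a slow drift velocity, so the dust appears
    to float inside the light rather than sit on a line.
    """
    (x0, y0), (x1, y1) = beam_start, beam_end
    particles = []
    for _ in range(count):
        t = random.random()                      # position along the beam
        offset = random.uniform(-spread, spread)  # spread across beam width
        particles.append({
            "x": x0 + (x1 - x0) * t,
            "y": y0 + (y1 - y0) * t + offset,
            "vx": random.uniform(-0.2, 0.2),     # slow, aimless drift
            "vy": random.uniform(-0.2, 0.2),
        })
    return particles


dust = spawn_dust((0.0, 100.0), (300.0, 100.0))
```

When the beam is recomputed (for instance after a mirror is rotated), the particles are simply respawned along the new segments, so the dust always stays inside the light.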


Dynamic Light Feedback

An important part of the sign and feedback design was the real-time change of the light beam depending on the position of lenses and mirrors. At first we had a much easier option, from the programming point of view: light that would change only after a player finished adjusting the optical element. Although it was easier for us, it did not help players perceive the white gradient as real light and the lenses as real optic elements. It also gave no feedback to the player on the proper way of adjusting an optic element, resulting in an unsatisfying experience. In the end, our programmer created a much more advanced light algorithm with real-time calculations to give players immediate feedback and help them believe they were actually dealing with real light.
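The heart of such real-time feedback is recomputing the beam every frame from the current element positions. For a flat mirror this boils down to the standard vector reflection formula r = d − 2(d·n)n. A minimal 2D sketch of that one step (an illustration of the technique, not the game's actual algorithm):

```python
import math


def reflect(d, n):
    """Reflect direction vector d off a surface with unit normal n.

    Implements r = d - 2*(d.n)*n, the standard mirror reflection.
    Run every frame while the player rotates a mirror, this makes
    the beam follow the element in real time.
    """
    dot = d[0] * n[0] + d[1] * n[1]
    return (d[0] - 2 * dot * n[0], d[1] - 2 * dot * n[1])


# A beam travelling right hits a 45-degree mirror whose unit normal
# points up-left: the beam is redirected straight up.
n = (-math.sqrt(0.5), math.sqrt(0.5))
r = reflect((1.0, 0.0), n)   # approximately (0.0, 1.0)
```

Chaining this over every mirror the beam hits, each frame, is what lets the light visibly bend while an element is still being dragged or rotated.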


Super Lens Sign & Feedback Design

The Super Lens was also a hard object to explain to the player, especially how it functioned and the relation between the amount of light it was receiving and how powerful it was. The first challenge was to communicate the current state of the Super Lens. We added a lot of small details for that: different ON and OFF pictures, and colour-coded power bars on its sides that showed how powerful the lens was based on the amount of light it was receiving. We also added a commonly known on/off sign on the button to explain that the lens needed human input to work, and made different button states so you could clearly see when the lens was active. In addition, we made the available charge of the lens look like a battery, so players would easily understand the concept of draining a battery dry and the fact that once the battery was empty the Super Lens would no longer work. I also added visual cues, like wires running from the battery to the button, to visually connect the two elements and make it obvious that they worked together.
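The mental model we were teaching can be summarised as a tiny state machine: power scales with incoming light, the lens only works while switched on, and the battery drains while it is active. A hypothetical sketch of that model (the class, names and rates are invented for illustration, not taken from the game's code):

```python
class SuperLens:
    """Toy model of the Super Lens behaviour the UI had to explain:
    power scales with incoming light, the battery drains while the
    lens is on, and an empty battery means no power at all."""

    def __init__(self, charge=100.0):
        self.charge = charge    # battery level shown to the player, 0..100
        self.active = False     # toggled by the on/off button

    def power(self, incoming_light):
        """Burning power: proportional to light received,
        zero when switched off or when the battery is empty."""
        if not self.active or self.charge <= 0:
            return 0.0
        return incoming_light

    def tick(self, dt, drain_rate=5.0):
        """Drain the battery only while the lens is active."""
        if self.active:
            self.charge = max(0.0, self.charge - drain_rate * dt)


lens = SuperLens()
lens.active = True
lens.tick(dt=1.0)   # one second of use drains 5.0 units of charge
```

Every UI detail listed above maps onto one of these rules: the power bars show `power()`, the battery art shows `charge`, and the button states show `active`.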



It’s a fun “edutainment” game that makes learning about light reflection and refraction fun

SciBeaver received very positive reviews both from the press and from the players. Through proper UI & HUD design, close attention to detail, and numerous tests, adjustments and constant enhancements of the onboarding process and tutorial, we managed to achieve a smooth learning curve and significantly decrease the perceived complexity of a demanding puzzle game.