It's hard to be a good human assistant while knowing nothing about the boss's mind, and the same is true for a virtual assistant. Omni imitates how humans come to understand each other's wishes: by learning from behaviour and experience. With the user's permission, Omni records and searches through the digital data the user creates, such as the places they have visited and their social posts and collections. Omni then analyzes this data, building an ever richer model of the user's characteristics, preferences and habits over time. With this deep understanding, Omni can eventually "read the user's mind".
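To make this idea concrete, here is a minimal sketch of how on-device profiling might work. Everything in it is hypothetical, the event types, field names and scoring are illustrative assumptions rather than part of the actual design.

```python
# A minimal sketch of the profiling idea, assuming hypothetical event types
# (place visits, social posts, collections) logged locally with user permission.
from collections import Counter
from dataclasses import dataclass, field


@dataclass
class UserProfile:
    """Rolling model of the user's characteristics, preferences and habits."""
    place_visits: Counter = field(default_factory=Counter)
    topic_interests: Counter = field(default_factory=Counter)

    def observe(self, event: dict) -> None:
        # Each observed event nudges the profile; nothing leaves the device.
        if event["type"] == "place_visit":
            self.place_visits[event["place"]] += 1
        elif event["type"] in ("social_post", "collection"):
            for topic in event.get("topics", []):
                self.topic_interests[topic] += 1

    def top_preferences(self, n: int = 3) -> list[str]:
        return [topic for topic, _ in self.topic_interests.most_common(n)]


profile = UserProfile()
profile.observe({"type": "place_visit", "place": "climbing gym"})
profile.observe({"type": "social_post", "topics": ["bouldering", "coffee"]})
print(profile.top_preferences())
```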
Omni is built on AR so that it can observe the situation around the user: because it sees, it can react quickly and accurately. Omni can also be operated through gesture controls for situations where the user can't, or prefers not to, talk. Gestures are captured by a smart band worn on the wrist with muscle sensors, so the user is not limited by the display device's sensing range.
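As a rough illustration of the gesture channel, the sketch below maps recognized band gestures to assistant commands. The gesture labels and commands are invented for illustration, and the band's classifier is simply assumed to turn raw muscle-sensor signals into discrete labels.

```python
# Minimal sketch: mapping band gestures to assistant commands.
# Gesture names and commands are hypothetical placeholders.
GESTURE_COMMANDS = {
    "pinch": "confirm",
    "fist": "dismiss",
    "swipe_left": "previous_suggestion",
    "swipe_right": "next_suggestion",
}


def handle_gesture(gesture_label: str) -> str | None:
    """Translate a recognized gesture into an Omni command, if any."""
    command = GESTURE_COMMANDS.get(gesture_label)
    if command is None:
        return None  # unknown gesture: ignore rather than guess
    return command


print(handle_gesture("pinch"))  # -> "confirm"
```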
Our research showed that people naturally gaze at the person they wish to rely on. We thought this would make an efficient and charming interaction between the user and Omni, so we built it into the design: to activate Omni, the user simply gazes at its avatar in the top right of their view. Microexpressions accompanying the gaze also give Omni valuable feedback, for example whether the user is satisfied or not.
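A gaze trigger like this usually needs a dwell time so that a passing glance doesn't activate the assistant. Here is a minimal sketch of that logic under assumed inputs; the eye-tracking samples and the 0.8-second threshold are hypothetical, not a specification.

```python
# Minimal sketch of gaze-dwell activation, assuming hypothetical eye-tracking
# samples that report whether the gaze currently rests on Omni's avatar.
DWELL_SECONDS = 0.8  # assumed threshold for illustration; would be tuned in testing


class GazeActivator:
    def __init__(self, dwell_seconds: float = DWELL_SECONDS):
        self.dwell_seconds = dwell_seconds
        self.gaze_started_at: float | None = None

    def update(self, on_avatar: bool, timestamp: float) -> bool:
        """Return True once the gaze has rested on the avatar long enough."""
        if not on_avatar:
            self.gaze_started_at = None  # glance broken: reset the dwell timer
            return False
        if self.gaze_started_at is None:
            self.gaze_started_at = timestamp
        return (timestamp - self.gaze_started_at) >= self.dwell_seconds


activator = GazeActivator()
print(activator.update(True, 0.0))  # False: the gaze just landed on the avatar
print(activator.update(True, 0.9))  # True: dwell threshold reached
```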
Omni is always ready in the background, observing the surrounding environment, the user's behaviour and their physiological data. It also listens to the user, learning from the pieces of their life. When the user has a need, Omni reacts immediately, deciding what to do and how to assist based on its previous observations, then using AR to guide and indicate directly within the user's visual field. Omni can also respond proactively: for example, if the user has been feeling low for a long time, Omni will find an indirect, comforting way to cheer them up.
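One such proactive rule could look like the sketch below. The daily mood score, its threshold and the seven-day window are assumptions made purely to illustrate the behaviour described above.

```python
# Minimal sketch of one proactive rule, assuming a hypothetical daily mood
# score (e.g. derived from physiological data and behaviour) between 0 and 1.
LOW_MOOD_THRESHOLD = 0.4   # assumed values for illustration only
LOW_MOOD_DAYS = 7


def should_comfort(daily_mood_scores: list[float]) -> bool:
    """Trigger an indirect, comforting intervention after a prolonged low stretch."""
    recent = daily_mood_scores[-LOW_MOOD_DAYS:]
    return (
        len(recent) == LOW_MOOD_DAYS
        and all(score < LOW_MOOD_THRESHOLD for score in recent)
    )


if should_comfort([0.3, 0.2, 0.35, 0.3, 0.25, 0.3, 0.2]):
    print("Suggest a walk to the user's favourite café instead of asking directly.")
```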
Inspired by our low-fidelity tests and interviews with experienced Siri users, I designed this UX flow. Users used to have no recourse against a disappointing virtual assistant, but not any more: Omni lets the user "fire" the current assistant if its learning progress is poor. The system marks this as an important warning and learns from it to improve its experience and knowledge of the user, while a new assistant (a self-updated version wrapped in a new name and avatar) is "hired" in its place.
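The key point of the flow is that the persona changes while the learned knowledge and the warning survive. The sketch below expresses that idea; all names and fields in it are hypothetical.

```python
# Minimal sketch of the "fire / hire" flow: the learned knowledge is kept and
# flagged, while the assistant's identity (name, avatar) is replaced.
from dataclasses import dataclass


@dataclass
class Assistant:
    name: str
    avatar: str
    knowledge: dict        # the learned user model, carried forward
    warnings: list[str]    # important warnings that shape future behaviour


def fire_and_hire(current: Assistant, new_name: str, new_avatar: str) -> Assistant:
    """Retire the current persona but keep, and learn from, its knowledge."""
    current.warnings.append("fired: learning progress judged poor by the user")
    return Assistant(
        name=new_name,
        avatar=new_avatar,
        knowledge=current.knowledge,   # knowledge survives the "firing"
        warnings=current.warnings,     # the warning is kept as a lesson
    )


omni_v1 = Assistant("Nova", "nova.png", knowledge={"likes": ["coffee"]}, warnings=[])
omni_v2 = fire_and_hire(omni_v1, new_name="Iris", new_avatar="iris.png")
```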
In real life, the best human assistants understand their bosses' true thoughts, and so they offer truly satisfying assistance. Omni imitates this, learning the user's characteristics to figure out their true desires and assisting like a close friend who knows them by heart. Omni focuses on the user's subjective satisfaction rather than objective correctness.
Bring hundreds of audience members into the live e-sports game in an interactive, enjoyable way; research the varied desires of potential users while they watch live games.
Build game mechanics and in-game interactions that are attractive, interesting, playable and immersive alongside the live e-sports game.
Enrich the players' game experience with visual, physical and psychological feedback and interactions between them and the game.
Instead of leaving e-sports fans watching from outside the game itself, Taunt wants to draw the audience into the live game, interacting with other fans and spectators and making the whole experience more playable and "game-like".
My responsibilities during the internship were substantial. First, I was fully involved in designing the first-ever open beta product (launched in late August 2018); then I provided a refined UX design for our next possible iteration, to make the game more interactive and interesting.
When I joined as the UX Design intern in June 2018, the team had just decided to redesign the product and its whole UX, with a plan to launch the first-ever open beta in app stores in late August. I worked mainly with Kevin Hanna, co-designing the whole product and its microinteractions at a fast, start-up-paced cadence of iteration. By the end of my internship we had launched the application together, and I designed a refined UX for the product; some of its features and designs have since been built into the latest update for a better in-game experience.
Omni knows that sometimes, logical correctness isn't the correctness that matters.
The project gave me and the team a lot of inspiration about what AI and future virtual assistants can be, and about how we should make proper use of our digital data and identity. The balance is genuinely hard to strike, as not everyone feels safe making their private data "open", even to a virtual assistant that computes everything on a local database. However, the experience design here persuasively resolves much of the friction we used to face: a well-prepared educational design and clarity about how the data will be used help a great deal.