My designer colleagues at eureka, Inc. and I formed a team to participate in UI Battle. We chose Apple Music and redesigned its search UI, adding a new feature, Feeling, after discovering shortcomings in its music search experience.
UX Researcher · UI Designer · Presenter
My designer colleagues at eureka, Inc. and I formed a team to participate in UI Battle, hosted by Recruit Holdings, one of Japan's largest enterprises and the owner of Indeed. Feeling, the new feature we added when redesigning the Apple Music search UI, allows users to search for music that matches how they feel at the moment.

The screen first reads the user's location information, then recommends music based on what other people around them are listening to. If the user wants to listen to another song in the same genre, they simply tap the next button. The keywords shown at the top help users put their mood into words; if one of those keywords matches their feelings, they can tap it and a search result appears. If the user wants to try a different genre, they can tap the cross button.

The algorithm refines its recommendations through machine learning, so users can find the music they are looking for even without a clear idea in mind.
We first brainstormed all the possible situations in which people search for music, then mapped those situations onto Apple Music's existing features to find which ones it does not address.
Apple Music's “Library” is where users store their own digital files. “For You” helps users look for music in the categories they like. “Browse” lets users search for the genres they love among the new hits. “Radio” provides recommendations from other people and lets users search for music by genre. Finally, “Search” helps users search with keywords they already have in mind.
We realized that Apple Music does not address the situation in which users have no specific category or artist in mind, making it difficult to find music that matches their current mood. In other words, Apple Music's “Search” is helpful only when users already have a rough idea of what kind of music they are looking for, not when they are trying to find songs based on their feelings, which our research found to be the most common music-search situation.
At first, it was difficult to come up with a solution for capturing users' abstract, intangible emotions, because users cannot easily describe what kind of mood they are in or what kind of music they feel like listening to.
After brainstorming solutions, we realized that a user's contextual information carries most of what is needed to search for music by mood and feeling, since it covers weather, hobbies, location, and activity. Weather often influences people's emotions: we might want light-hearted music on a sunny day. Hobbies and background also make a difference: users who grew up in Tokyo are more likely to listen to Shibuya-Roppongi style music. Location matters as well: on a trip to New York, one might want to hear the hit songs in New York City. Activity information is the hardest to capture, but, for instance, while a Marvel hit movie is in theaters, people who go to the cinema might want to listen to its soundtrack.
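The idea of combining these signals can be sketched in code. This is a minimal, hypothetical illustration: the signal names, weights, and genre mappings below are all invented for the example, and the real feature would instead draw on what other listeners nearby are playing.

```python
# Hypothetical mapping from each contextual signal to genre scores.
# Every entry here is illustrative, not real Apple Music data.
SIGNAL_GENRE_SCORES = {
    ("weather", "sunny"): {"light-hearted pop": 2.0},
    ("location", "New York"): {"NYC hits": 3.0},
    ("activity", "saw a Marvel movie"): {"film soundtracks": 3.0},
    ("background", "grew up in Tokyo"): {"Shibuya-Roppongi style": 1.5},
}

def recommend_genres(context):
    """Score genres by summing contributions from each observed signal."""
    scores = {}
    for signal in context.items():
        for genre, weight in SIGNAL_GENRE_SCORES.get(signal, {}).items():
            scores[genre] = scores.get(genre, 0.0) + weight
    # Highest-scoring genres first.
    return sorted(scores, key=scores.get, reverse=True)

print(recommend_genres({"weather": "sunny", "location": "New York"}))
# A sunny day in New York ranks "NYC hits" above "light-hearted pop".
```

In practice the weights would not be hand-written; they would be learned from listening behavior, which is what the machine-learning refinement described later stands in for.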
To redesign the “Search” function into a feeling-based music search feature, I changed the magnifying glass icon on the “Search” tab to a smiley face icon and renamed the tab from “Search” to “Feeling,” indicating that this feature lets users search for music that matches their feelings at the moment.
After the user taps the “Feeling” tab, the screen reads the user's contextual information and then recommends music based on what other people around them are listening to at the moment. If the user likes the type of music recommended but wants another song in the same genre, they simply tap the “Next” button. The keywords shown at the top help users put their mood into words; if one of those keywords matches their feelings, they can tap it and a search result based on that keyword appears. If the user does not like the recommended music and wants to try a different kind, they can tap the “Cross” button. The algorithm refines its recommendations through machine learning.
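The interaction loop above, "Next" keeps the genre, "Cross" switches it, can be sketched as a toy session. The genre names and song lists are made up for the example; a real implementation would feed this next/cross feedback into the learning step rather than just cycling through a fixed catalog.

```python
# Invented catalog for illustration only.
CATALOG = {
    "upbeat": ["Song A", "Song B", "Song C"],
    "mellow": ["Song D", "Song E"],
    "soundtrack": ["Song F", "Song G"],
}

class FeelingSession:
    """Toy model of the Feeling tab's next/cross interaction loop."""

    def __init__(self, starting_genre):
        self.genre = starting_genre
        self.index = 0

    def current_song(self):
        songs = CATALOG[self.genre]
        return songs[self.index % len(songs)]

    def next(self):
        # User likes the genre but wants another song in it.
        self.index += 1
        return self.current_song()

    def cross(self):
        # User rejects the genre; move to a different one.
        others = [g for g in CATALOG if g != self.genre]
        self.genre = others[0]
        self.index = 0
        return self.current_song()

session = FeelingSession("upbeat")
print(session.current_song())  # Song A
print(session.next())          # Song B (same genre)
print(session.cross())         # Song D (first song of a different genre)
```

Each tap is an implicit rating of the current genre, which is exactly the signal a learned recommender would train on.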
The original keyword search function is embedded in the other tabs: scrolling down the page makes the keyword search box appear at the top of the screen.
We hope that with this new UI, users will be able to find the music they are looking for even if they do not have a clear idea in mind.