P&G Artwork

Exploring an AI image recognition solution to differentiate package designs
Project Overview
I participated in this project as a UI/UX designer through Sogeti.
P&G, a long-term client of Sogeti, needed a solution to help its employees organize and manage the large quantity of package designs across different brand collections.
Team Member
SME / BA
PM/Architect 
Scrum Master
Senior Data Scientist x 3
Senior FSD + Azure
Data Scientist x 7
Python WebAPI Developer x 2
QA x 2
UI/UX Designer
Timeline
11 Wks for Phase 1
My Role
UI/UX designer of Phase 1
Agile Software Development
For this project, the team followed an Agile development process. As the designer, I adapted to an Agile UX process alongside the development team and the client. Over the 11 weeks, we were able to deliver MVP versions of all 3 business cases for Phase 1.
The scrum master and project manager hosted scrum meetings every morning at 8 AM, where the whole team went through the user story list together and synced on progress. UX design and discussion were most active at the beginning of the process; during development, I mostly iterated on the design and supported the developers from the design perspective.
•  Agile software development
•  Agile UX design process
•  Go through user stories in scrum meetings
Project Kickoff
P&G has a large number of packaging designs across its different sub-brands, and each collection has its own variants and sizes. Therefore they needed a visual solution to help them mark all the different properties on a large quantity of package artworks.
Our team developed an AI image recognition solution for this problem. For the first phase, we were able to build and train the model on 450+ image samples from P&G.
•  Use Case 1       Final Art VS. Template
•  Use Case 2       Lineup Compare
•  Use Case 3       New Concept VS. Predecessors
Discover
3 Use Cases
The initial product meeting classified user demands into 3 use cases: compare the final art against the template to re-verify the final design; compare lineup package designs within the same collection to ensure style consistency; and compare a new concept against its predecessors for the same product to see the evolution.
We built and trained the model for the first phase on 450+ image samples from P&G. The training data focused on feeding the model component examples such as labels, logos and tags, so the system could recognize these elements and compare them apples to apples.
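As an illustration only (the recognition model itself was built by our data science team), here is a minimal Python sketch of the component-level, apples-to-apples comparison idea. The `detect_components` function is a hypothetical stand-in for the trained model, not the actual pipeline.

```python
from dataclasses import dataclass

@dataclass
class Component:
    kind: str          # e.g. "label", "logo", "tag"
    bbox: tuple        # (x, y, width, height) in pixels
    confidence: float  # model confidence score

def detect_components(image_path: str) -> list[Component]:
    # Placeholder for the trained recognition model; the real version returns
    # the labels, logos and tags it finds in the artwork.
    return []

def compare_artworks(path_a: str, path_b: str) -> dict:
    """Pair components of the same kind so they are compared apples to apples."""
    a = detect_components(path_a)
    b = detect_components(path_b)
    kinds = {c.kind for c in a} | {c.kind for c in b}
    report = {}
    for kind in kinds:
        in_a = [c for c in a if c.kind == kind]
        in_b = [c for c in b if c.kind == kind]
        report[kind] = {
            "count_a": len(in_a),
            "count_b": len(in_b),
            "match": len(in_a) == len(in_b),
        }
    return report
```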
Use case 1
Final Art VS. Template
Compare the final art against the template to re-verify the final design.
Use case 2
Lineup Compare
Compare lineup package designs within the same collection to ensure style consistency.
Use case 3
New Concept VS. Predecessors
Compare a new concept against its predecessors for the same product to see the evolution.
Design
User Flow
Based on the information we gathered from users, the UX designer, business analyst and stakeholders were able to form a general user flow in the early conversations.
As the diagram shows, the flow is mostly linear and straightforward. The whole design is dedicated to creating a simple and effective experience for users.
Testing
Problem 1_Waiting Time
During the development phase, we realized that the large amount of visual tags to be compared caused long waiting times on the user end: each comparison submitted takes at least 30 minutes to complete in the backend.
On top of that, all comparison results were temporary, so if the browser was closed accidentally during the 30-minute wait, the user had to start all over again.
Discover
Brainstorming
How might we manage user waiting time?
The developers and the UX designer went back and forth brainstorming possible solutions. We talked through many options, such as email notifications and a user sign-in portal.
After comparing all the options from business, development and user perspectives, the team finally settled on adding a result list that allows users to retrieve their comparisons from the data cloud at any time.
Design
Solution 1_Request Queue
Each comparison submitted is added to a request queue and processed in the backend, and the results are persisted to the data cloud rather than kept only in the browser session.
A result list gives users access to every comparison they have submitted, so they can close the browser during the 30-minute wait and come back to the finished result at any time.
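To make the idea concrete, here is a minimal Python sketch of a request queue with persisted results, under my own assumptions about the backend (the names below are illustrative, not the actual implementation; the real results live in the data cloud rather than in memory).

```python
import queue
import threading
import uuid

def run_compare(payload: dict) -> dict:
    # Placeholder for the ~30-minute backend comparison job.
    return {"status": "done", "payload": payload}

pending = queue.Queue()            # submitted comparison requests, processed in order
results: dict[str, dict] = {}      # job_id -> result; persisted to cloud storage in the real system

def worker() -> None:
    # Background worker: long comparisons run here instead of in the user's browser session.
    while True:
        job_id, payload = pending.get()
        results[job_id] = run_compare(payload)
        pending.task_done()

def submit_compare(payload: dict) -> str:
    # Returns immediately with a job id; the user can close the browser and come back later.
    job_id = str(uuid.uuid4())
    pending.put((job_id, payload))
    return job_id

def get_result_list() -> list[str]:
    # Backs the result-list screen: every comparison that has finished so far.
    return list(results.keys())

threading.Thread(target=worker, daemon=True).start()
```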
Discover
Problem 2_Refine Use Case
Now let us dive deep into use case 2, which involved the most testing and discussion between the UX designer, the business analyst and the stakeholders. We constantly coordinated on feature priorities, the development phase plan and the budget.
The original simple linear flow did not fulfill all the user needs.
As the draft prototype shows, use case 2 is fairly linear. During the first round of usability testing and user interviews, we discovered the user pain points to address in this iteration.
Discover
User Research
During the user research process, we discovered…
🔍 Insight 01.
In real-life use cases, users usually need to run cross comparisons within a set of 10-20 files over and over again.
🔍 Insight 02.
Each file has 2 properties, and only files that share the same property are worth comparing.
🔍 Insight 03.
Which files go into a comparison depends on the property: for example, the size B comparison uses file 456, the variant 2 comparison uses file 258, and so on (a small sketch of this grouping rule follows below).
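A tiny illustrative Python sketch of the grouping rule behind Insights 02 and 03 (the property names and file ids here are made up for the example, not P&G data): files are only grouped into a comparable set when they share the same value for the chosen property.

```python
from collections import defaultdict

# Hypothetical file pool: each file carries two properties, e.g. size and variant.
files = [
    {"id": "file_1", "size": "A", "variant": "1"},
    {"id": "file_2", "size": "B", "variant": "1"},
    {"id": "file_3", "size": "B", "variant": "2"},
    {"id": "file_4", "size": "A", "variant": "2"},
]

def comparable_groups(files: list[dict], prop: str) -> dict[str, list[str]]:
    """Only files sharing the same value for a property are worth comparing."""
    groups = defaultdict(list)
    for f in files:
        groups[f[prop]].append(f["id"])
    # Keep only groups with at least two files: a single file has nothing to compare against.
    return {value: ids for value, ids in groups.items() if len(ids) > 1}

print(comparable_groups(files, "size"))     # {'A': ['file_1', 'file_4'], 'B': ['file_2', 'file_3']}
print(comparable_groups(files, "variant"))  # {'1': ['file_1', 'file_2'], '2': ['file_3', 'file_4']}
```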
Design
Refine Use Case
How might we elevate the compare experience?
🔬 Prioritized Plotting
The team conducted a prioritization plotting activity. During the discussion, we coordinated with our client on feature priorities, the development phase plan and the budget.
Because the portal is for in-house employees, the prioritization plot's 2 axes are technology / project budget and user preference. I demonstrated some possible directions to support the discussion with the stakeholders.
💡 Brainstorming
One of the most discussed points was the clickable table VS. the manual table.
I demonstrated a clickable table option, which requires the backend to auto-identify the file properties and auto-populate the table. The clickable property tabs allow users to submit comparisons in no time. As a UX designer, I believe this is the most visual and intuitive way for users to accomplish the comparison job.
However, for the MVP phase of the app, we needed to keep the long-term goal in mind while developing what was feasible as well as usable for the current phase. That is why we came up with the in-progress iteration shown on the right side.
Design
User Flow
Problem
These use case details led us to the pain points of our current flow. First and foremost, repeating the file upload process for every single comparison is unacceptable. Beyond that, users had to manually track all the comparisons and file properties, which made the error rate very high.
💡 Solution
After the back and forth with the stakeholders, the team agreed on a direction. The features we could not include in the first phase were kept as long-term goals and documented in notes.
Design
Problem 2_Solution
An embedded table feature allows users to fetch local files into a file pool. From the same file pool, comparison selections can be reused.
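A rough Python sketch of the file-pool idea, under my own assumptions about the data model (the class and method names are illustrative): files are fetched into the pool once, and each comparison simply references file ids from that pool, so selections can be reused without repeating the upload step.

```python
class FilePool:
    """Files are fetched once; comparisons reference them by id instead of re-uploading."""

    def __init__(self) -> None:
        self._files: dict[str, str] = {}   # file_id -> local path (or cloud URI)
        self._next_id = 1

    def add(self, path: str) -> str:
        file_id = f"file_{self._next_id}"
        self._next_id += 1
        self._files[file_id] = path
        return file_id

    def build_compare(self, file_ids: list[str]) -> dict:
        # A comparison is just a selection of pooled files; the same ids can be
        # reused for the next comparison without uploading anything again.
        missing = [fid for fid in file_ids if fid not in self._files]
        if missing:
            raise KeyError(f"unknown file ids: {missing}")
        return {"files": [self._files[fid] for fid in file_ids]}

# Usage: fetch the files once, then run several comparisons against the same pool.
pool = FilePool()
ids = [pool.add(p) for p in ["front_A.png", "front_B.png", "front_C.png"]]
size_compare = pool.build_compare(ids[:2])
variant_compare = pool.build_compare(ids[1:])
```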
Design
Hi-Fi Prototype
💻 Desktop

Use Case 2 Re-design

An embedded table feature allows users to track all the files and comparison submissions easily.
Retrospective
Takeaways
📂 Business
Product cycle and phase play an important role throughout design.
Designing for a mature digital product and designing for an MVP require completely different strategies for design and development. The design always adapts to the phase.
🎨 Design
Understanding the actual, literal use case of your user is the key. There is no such thing as too much detail.
Every interaction between the end user and the digital product comes from a user purpose. Thoroughly understanding the intent, the logic and all the details helps us as designers elevate the experience.
Process Deck