Qindle's data-driven design principles: Monitor and Automate

Monitor & Automate – ‘How will data and AI change Innovation and Design?’ Part II

By Taco Schmidt, Managing Director at Qindle.

Data and AI are changing the future of work: they are critical inputs to decision making, and in some cases can replace human labour entirely. However, McKinsey predicts that fewer than 5% of occupations can be fully automated with today’s AI advances. How will that influence innovation and design services?

In the first article of this series, I explored the impact of AI-enabled data applications on innovation and design services today. Though innovation, design and engineering teams are already using such tools to automate, accelerate and enhance their existing workflows, these are often limited to repetitive analytical tasks.

While this is useful, I believe there are bigger opportunities for data and AI in the design process that are captured in six principles: Monitor, Automate, Customize, Synthesize, Reproduce and Create. These principles are foundational to Qindle’s data-driven design vision.

This is article two in a four-part series. Here, I will go deeper into the first two principles of data-driven design: Monitor and Automate.

Principle 1 – Monitor

Principle 1: Monitor of Qindle’s data-driven design vision

This principle is perhaps the most recognizable: it became famous when people realized they could train software to spot a cat amongst billions of human pictures on social media.

It then found its way to multiple online commerce applications, mostly focused on recognizing and categorizing objects from uploaded photos (fashion, consumer electronics etc.) so consumers can easily shop online, filter their preferences and find the right products.

JD is the second-largest business-to-consumer e-commerce company in China. With over 65 million active users, its rapid growth is fueled in part by POP, an open e-commerce platform that allows independent stores to upload photos and videos of products.

These independent stores upload terabytes of video, picture and text data every day; in pictures alone, 100 million items are uploaded daily. It’s JD’s responsibility to ensure that the uploaded pictures and videos don’t contain inappropriate content. AI helps JD identify and filter this content using inference-based video content filtering.

This is clearly useful for online platforms, but how does it help innovation and design communities?

Social media provides valuable data and context for understanding consumer behaviour. There are entire blogs dedicated to specific use cases and their accompanying product pain points. AI can help harvest this data and flag behaviour that matches insights developed by design and engineering teams within the conventional cycle of ideation, prototyping, and user testing.
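As a minimal sketch of what such flagging could look like, the snippet below matches social posts against pain-point themes a design team has already identified. The post texts, theme names and keyword phrases are all invented for illustration; a real system would use trained language models rather than keyword lookup.

```python
# Illustrative sketch (not a production pipeline): flag social posts whose
# text matches pain-point themes a design team has already identified.
# Themes, phrases and posts below are invented for illustration.

PAIN_POINT_THEMES = {
    "battery": ["battery dies", "won't charge", "drains fast"],
    "grip": ["slips out", "hard to hold", "dropped it"],
}

def flag_posts(posts):
    """Return (post, matched_themes) pairs for posts matching any theme."""
    flagged = []
    for post in posts:
        text = post.lower()
        themes = [
            theme
            for theme, phrases in PAIN_POINT_THEMES.items()
            if any(phrase in text for phrase in phrases)
        ]
        if themes:
            flagged.append((post, themes))
    return flagged

posts = [
    "Love the design, but the battery drains fast on cold days.",
    "Great colour options this year!",
    "It slips out of my hand whenever I wear gloves.",
]
for post, themes in flag_posts(posts):
    print(themes, "->", post)
```

Even this toy version shows the shape of the idea: the insight (the theme) comes from the design team, while the machine does the large-scale matching.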

AI video start-up Valossa, a spin-off from the University of Oulu in Finland, applies this video content monitoring principle to live face and expression analytics. Its video analysis software scans faces in videos for micro-expressions and tracks facial gestures for behavioural insights. Valossa’s vision is to develop AI that understands video the way a human does.

Recognizing consumer behaviour through video content could deliver consumer behaviour research and validation on an unprecedented scale, enabling vastly more qualitative research with fewer resources.

Principle 2 – Automate

Principle 2: Automate of Qindle’s data-driven design vision

McKinsey estimates that at least ~18% of activities could be automated with today’s technology. This estimate is based on an assumption that 30% of activities in ~60% of all occupations can be automated.
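The headline figure follows from simple arithmetic, assuming the automatable activities are spread evenly across those occupations:

```python
# McKinsey's assumption: ~30% of activities in ~60% of occupations
# can be automated. Under a uniform spread, the share of all work
# activities that could be automated is the product of the two.
share_of_activities = 0.30
share_of_occupations = 0.60
automatable = share_of_activities * share_of_occupations
print(f"{automatable:.0%}")  # prints "18%"
```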

Intelligent personal assistants like Siri, Google Assistant, Alexa or Cortana are all based on the principle of automating daily personal tasks. Even some professional tasks may be (semi-)automated: meet PathAI, an AI-powered technology that assists pathologists in making rapid and accurate diagnoses and in identifying personalised therapies.

Design and engineering teams use AI-enabled tools to automate, accelerate and enhance existing research and creation processes, for example, automated audio transcription of design research interviews or AI flow control in a plastic injection moulding tool.

Design teams working with AI can iterate faster. The power of AI lies in the speed at which it can analyse vast amounts of data and suggest design adjustments or options. A designer can then cherry-pick and approve adjustments based on consumer-data insights, improving both productivity and decision-making.
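The cherry-picking workflow can be sketched in a few lines: an AI system proposes candidate adjustments, each scored against consumer-data insights, and the designer reviews only the top-ranked ones. The adjustment names, scores and cut-off below are invented for illustration.

```python
# Illustrative sketch: AI-suggested design adjustments, each scored
# against consumer-data insights (names and scores are invented).
candidate_adjustments = [
    {"change": "rounder grip profile", "consumer_score": 0.91},
    {"change": "matte finish", "consumer_score": 0.62},
    {"change": "larger power button", "consumer_score": 0.84},
    {"change": "thinner bezel", "consumer_score": 0.47},
]

def shortlist(adjustments, top_k=2):
    """Rank suggested adjustments by consumer score; keep the best few
    for the designer to review and approve."""
    ranked = sorted(adjustments, key=lambda a: a["consumer_score"], reverse=True)
    return ranked[:top_k]

for adj in shortlist(candidate_adjustments):
    print(f"{adj['change']} (score {adj['consumer_score']})")
```

The design choice here is the division of labour: the machine ranks at scale, while the human keeps final approval.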

AI-assisted creative skills will become an integral part of future design skill-sets. Automation will help designers to focus on testing functional design elements, rather than investing time in polishing pixels or smoothing NURBS. See ‘Wireframing Automation and Artificial Intelligence for UX Design’ for a more detailed case on the automation of wireframing.

This is article 2 of 4 in my series on how data-driven, AI-enabled tools will change creative industries. Part 1 is the introduction, ‘How will data and AI change Innovation and Design?’; Part 3 covers the principles ‘Customize’ and ‘Synthesize’; and Part 4 covers the principles ‘Reproduce’ and ‘Create’.
This article was first published on Taco Schmidt’s LinkedIn Pulse on June 11, 2021.