AI Is Changing How Products Actually Work

For a long time, interface design has been centred around screens, with teams defining layouts, mapping user journeys and ensuring each step in a process was clear and predictable. That model worked well when systems behaved in fixed, linear ways, where users took an action and the platform responded in a structured sequence.
That is becoming less representative of how modern products operate. As AI becomes more embedded, interactions are no longer confined to a single interface: a task might begin in a chat, trigger a workflow behind the scenes and reappear somewhere else as an output or recommendation. The interaction moves across the system, even if the user does not.
Many products are still designed with a screen-first mindset, with AI added into existing flows rather than reshaping them. The result is a disconnect between what the system is capable of and how it is experienced, where the underlying platform may be adaptive and dynamic, but the product still expects users to move through it step by step.
In reality, most users are not thinking about screens or navigation structures; they are focused on getting something done. They want to complete a task, make a decision and move forward without needing to understand how the system is organised underneath. That requires a shift from designing individual interfaces to designing connected systems of interaction.
This has practical implications for how products are structured: where interactions begin, how they progress and what context needs to persist as they move across different touchpoints. It also requires a more deliberate approach to control, particularly as systems begin to take actions on behalf of users rather than simply responding to them.
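One way to picture that more deliberate approach to control, purely as an illustration: actions a system takes on a user's behalf can be classified by impact, with higher-impact actions held for explicit approval rather than executed immediately. Every name here is invented for the sketch, not drawn from any particular product.

```python
from enum import Enum

class Impact(Enum):
    LOW = 1   # safe to automate, e.g. drafting a summary
    HIGH = 2  # needs explicit user approval, e.g. placing an order

def run_action(action, impact, approved=False):
    """Execute low-impact actions automatically; gate high-impact ones
    behind a user decision instead of running them straight away."""
    if impact is Impact.HIGH and not approved:
        return "pending_approval"
    return action()

# A high-impact action waits for the user; a low-impact one just runs.
assert run_action(lambda: "sent", Impact.HIGH) == "pending_approval"
assert run_action(lambda: "sent", Impact.HIGH, approved=True) == "sent"
assert run_action(lambda: "drafted", Impact.LOW) == "drafted"
```

The point of the sketch is that control is a property of the system's design, not of any one screen: the approval step can surface wherever the user happens to be.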
In most products, this shows up clearly. A user starts something in one place, then has to repeat or re-enter information somewhere else because the context has not carried through. Different parts of the system behave inconsistently because they have been updated at different times, and automation is introduced in isolated steps rather than across the full task. The result is a product that requires more effort than it should.
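As a rough illustration of context carrying through, here is a minimal sketch of a single task context shared across surfaces, so information captured in one place (say, a chat) is readable in another (say, a dashboard) without being re-entered. The class and method names are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class TaskContext:
    """One shared context per task, rather than per screen."""
    task_id: str
    data: dict = field(default_factory=dict)
    history: list = field(default_factory=list)

    def capture(self, surface: str, key: str, value):
        """Record a piece of user input along with where it came from."""
        self.data[key] = value
        self.history.append((surface, key))

    def view(self, surface: str) -> dict:
        """Every surface reads the same accumulated state."""
        return dict(self.data)

# A task starts in chat and continues in a dashboard without re-entry.
ctx = TaskContext(task_id="order-42")
ctx.capture("chat", "delivery_date", "2025-06-01")
ctx.capture("dashboard", "quantity", 12)
assert ctx.view("dashboard")["delivery_date"] == "2025-06-01"
```

The design choice being illustrated is simply that state belongs to the task, not to the touchpoint that happened to collect it.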
At Studio Graphene, this shift tends to become most visible as platforms scale, where interfaces originally designed for linear workflows begin to struggle as automation and intelligence are introduced. Trying to layer AI into those structures usually adds complexity rather than simplifying things. Stepping back and designing the interaction as a connected system from the outset tends to produce something that is easier to use, easier to adapt and much closer to how work actually happens.







