How will the rise of AI affect my job as a UI/UX designer?

Tim Walsh

Product Designer

There has been a ton of news recently about artificial intelligence (AI). And a lot of the conversations, at least the ones that garner attention, often focus on the idea that this movement will negatively impact human beings: that robots will take over all of our jobs, that humans will become irrelevant, sentenced to a life of serving the machines, and that we are ultimately creating the next evolutionarily superior being.

I’m a designer. I focus on user experience and interface design. Recently I’ve been very curious about AI. Could it in fact consume all jobs? Could it take mine? This post explores what it might look like if AI were to replace a UI/UX designer.

DISCLAIMER - This is in no way a statement of how much I know about artificial intelligence, how it’s built, or what the future holds, but rather an evaluation of what its potential MAY look like. For the sake of this post, I’m going to refer to my mega-brilliant AI counterpart as HAL (ahem ahem).

As a consultancy, we work with people who approach us with a problem. I think most of our work can be distilled into two types of projects. In one, a company already has a product and wants improvements and additions to an existing design. In the other, a company has an idea and wants us to help create it. In either case, business development will do their thing and create contracts, agreements, etc. Design gets on board during the initial kickoff meeting, after the deal is made and agreements are reached. So, for the sake of this post, that’s where I’ll start.


Kickoff Meeting

What I do: Before a kickoff meeting, I prepare by reading through any and all existing documents that we’ve received. I compile questions, create to-do lists, make schedules, and prepare exercises, all to help move the discussion along. This helps establish where the problem lies and supports coming up with a game plan for the solution, all the while keeping features in mind.

What HAL does: As long as all documentation is available in a digital platform, HAL is able to digest this information instantly. HAL can listen to the conversation, interjecting where necessary. HAL keeps things on schedule and on topic, making sure that the correct conversations (goals, stakeholders, user groups, platform, etc.) happen. HAL can ask questions, pivoting the conversation and moving it in ways that help yield a better product. HAL doesn’t need bathroom breaks, but understands that some people might, and allows time for that.



Research

What I do: I’ll take all of the information that was gathered during the discovery period and begin researching. I search for similar companies or competitors, look at and experience their products, and research branding to help make decisions on things like color, logo, and typography.

What HAL does: HAL scours the digital world, accumulating all related matter that is of most value to the project. This happens instantly.

One thing I often have to do is sign up for a service to access the product. Would AI be able to do this, taking action on a confirmation email, etc.? In that case, would AI have its own email? (EXISTENTIAL CRISIS CLOSING IN.)


Identifying Users & Creating Flows

What I do: This is a vital part of the experience. I’ll outline user types and user choices as a step-by-step process. This is often done using flow maps or site maps: essentially, anything that allows representation of a product experience at a high level.

What HAL does: HAL understands the goal of the app and knows the parameters to abide by in order to achieve that goal. From research and client feedback, HAL creates a list of user types, outlining all conceivable routes for those users while focusing on business goals and desired features.
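Outlining "all conceivable routes" through a product is something a machine could do mechanically if the flow map were expressed as data. Here is a minimal sketch, assuming a hypothetical onboarding flow modeled as a dictionary of screen-to-next-screens (screen names and the `all_routes` helper are my own illustration, not anything from a real tool):

```python
def all_routes(flow, start, goal, path=None):
    """Enumerate every route a user could take through a flow map,
    modeled as a dict of screen -> list of next screens (assumed acyclic)."""
    path = (path or []) + [start]
    if start == goal:
        return [path]
    routes = []
    for nxt in flow.get(start, []):
        # Recurse into each branching choice the user could make.
        routes.extend(all_routes(flow, nxt, goal, path))
    return routes

# Hypothetical onboarding flow
flow = {
    "landing": ["sign_up", "log_in"],
    "sign_up": ["profile"],
    "log_in": ["profile"],
    "profile": ["dashboard"],
}

for route in all_routes(flow, "landing", "dashboard"):
    print(" -> ".join(route))
```

A designer does this same enumeration by hand on a whiteboard; the point is only that the exercise is exhaustible once the flow is written down.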


Sketching & Wireframing

What I do: I begin sketching wireframes and layouts based on the features that were defined in the flows. This will generally be informed by previous work that we’ve completed or trends in popular web or app design. The more complicated or unique experiences require these initial sketches to be pretty low fidelity, and they will likely be a dialogue between design and client. This is done to move rapidly, avoid making any concrete decisions too early on, and keep options flexible.

What HAL does: Knowing the overall purpose of the product, HAL can generate wireframes and layouts, keeping the product goal in mind at all times. HAL can utilize client responses, making decisions based on their positive and negative feedback. Just imagine: instant turnaround.

(Side note: Does time between phases make something feel more substantial? Does a longer wait imply better work?)



Styling

What I do: Once wireframes have been approved, the structure of the app is pretty much in place. Decisions around styling come from a culmination of previous discussions around branding preferences, research, competitors, and trends.

What HAL does: HAL references every digital product ever created, identifying trends that are popular in particular industries. HAL then presents options, highlighting decisions made (I chose this color because it corresponds to this percent of the industry, other similar products identify user groups who respond to this layout, etc.).



User Testing

What I do: User testing can begin once the product has been styled. This is a vital part of the experience because it gives me the opportunity to test whether my decisions work for users and whether the product functions efficiently and as expected.

What HAL does: HAL identifies issues that users are running into and self-corrects accordingly. For more complicated experiences, HAL presents several designs, monitoring users across the board and ultimately selecting the one that performs best, then propagating it to all users. At this point, HAL operates without the feedback of the client. HAL understands what success and failure mean, and makes decisions accordingly.
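The "present several designs, keep the winner" behavior described above resembles a multi-armed bandit, a well-known technique for balancing exploration of variants against exploitation of the current best performer. Below is a minimal epsilon-greedy sketch under assumed variant names A/B/C; the function names and success-rate bookkeeping are my own illustration, not a description of any real product:

```python
import random

def choose_design(stats, epsilon=0.1):
    """Pick a design variant: explore a random variant with probability
    epsilon, otherwise exploit the best observed success rate."""
    if random.random() < epsilon:
        return random.choice(list(stats))
    # Success rate = successes / trials; untried variants count as rate 0.
    return max(stats, key=lambda d: stats[d]["successes"] / max(stats[d]["trials"], 1))

def record_outcome(stats, design, success):
    """Update the running tally after observing one user session."""
    stats[design]["trials"] += 1
    stats[design]["successes"] += int(success)

# Hypothetical variants with running tallies
stats = {d: {"trials": 0, "successes": 0} for d in ("A", "B", "C")}
record_outcome(stats, "B", True)
record_outcome(stats, "B", True)
record_outcome(stats, "A", False)
print(choose_design(stats, epsilon=0))
```

With `epsilon=0` the selection is purely greedy; a real system would keep some exploration so a variant is never written off on thin data.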

Final Thoughts

After writing this, the biggest areas where I imagine AI could improve on human capability are the time it takes to create something, the sheer access to and understanding of information, and the ability to parse through massive amounts of data, finding trends to inform decisions.

Now, obviously, what I’ve outlined above is far from where we stand with our current computing technology; and even then, it is likely not realistic, nor will it shake out exactly as I’ve described. I simply wanted to envision for myself what it might look like if a computer were to replace my current role. I guess the takeaway would be that it’s not that difficult to imagine.

In addition, I’d like to leave you with my version of ‘the darkest timeline.’ AI identifies problem, creates solution. Humans remain blissfully unaware.
