I Built The World's First (& Meanest) AI Design Stakeholder

Lesson learned: some things simply don't need to be built.

The conversation about AI in design has been daunting, with the threat of job automation looming over designers. But what if we collectively pushed the opposite way? What if, instead of AI making our designs, we made designs for AI? And what if that AI critiqued us in a closed environment, protecting us from ridicule in front of a live audience?

That's where Director Delta comes to save the day. With UX competencies and image recognition on designs, Director Delta provides succinct and intentional feedback in order to drive higher-caliber design... or, that's what it should do. In its infantile state, Director Delta 1.0 is a rude, arrogant, and critical bot that homes in on specific heuristic issues. Director Delta often needs hand-holding from a designer in order to fully understand the context of a given UI. But other than that, Director Delta is supercharged with the knowledge and capabilities of GPT-4, packaged into API calls from two simple 5KB Python files.

Check out my video:

Why an "AI client?"

The idea is more a case study than anything that would have a direct application. I remember watching YouTube videos when generative AI was kicking off and seeing content creators wreak havoc with their own robots and assistants. It was inspiring to try and bring that into a UX lens; hesitantly, the only way I could build something meaningful was to dust off the "Python for Dummies" book I got back during my career discovery. My experimentation soon devolved into a silly idea: an assistant that aims to obliterate a designer's self-esteem while still providing necessary critique.

But it still gives context to a neat question: how do the capabilities of AI serve better decision-making within UX and design? Realistically, all of this can be done with a ChatGPT conversation using the correct prompt. However, we at shannadige.com like to keep things chaotic and unnecessarily complex.

Why is it called "Director Delta?"

I asked ChatGPT to create the name.

How does it all work?

Director Delta can provide feedback on your designs, giving its honest and brutal critique before offering recommendations for improvement. Director Delta also serves as a challenge generator, providing you with a business context and problem that can be solved through the use of digital design.

You can use Director Delta in two available formats:

  1. Available on ChatGPT as a public GPT model. Requires a Plus subscription or higher.
  2. Available for free on GitHub as a messy Python script. Python and OpenAI API knowledge required.


Engaging with Director Delta, both in ChatGPT and through the Python script, allows any designer to continue the context of a given session based on previously stored conversations. There is a lot of complexity behind the Python script to get it working correctly, but it does follow the simple feedback loop common to LLM chatbots (sketched in code after the list below):

  1. Designer provides a prompt (via microphone) and attaches an image of the design, which is sent to OpenAI.
  2. The gpt-4-vision model returns a response based on the prompt query and the image, in the format defined by Director Delta's "directives" (i.e., the set of instructions provided to the model).
  3. To retain memory, a second call to OpenAI summarizes the conversation and appends it to an array, using gpt-3.5-turbo.
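To make that loop concrete, here is a minimal sketch of the two OpenAI calls, assuming the openai v1 Python client. This is not the actual client-command.py; the DIRECTIVES text, the critique() helper, and the way the summary cap is applied are illustrative stand-ins.

```python
# Sketch of the Director Delta feedback loop (illustrative, not client-command.py).
import base64
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Stand-in for the real "directives" that keep Director Delta in character.
DIRECTIVES = "You are Director Delta, a blunt, arrogant design director. Critique the attached UI."
conversation_summaries = []  # rolling memory of past exchanges

def critique(prompt_text: str, image_path: str) -> str:
    # 1. Package the designer's (transcribed) prompt and the design image.
    with open(image_path, "rb") as f:
        image_b64 = base64.b64encode(f.read()).decode()

    # 2. Ask the vision model for feedback, in character, with prior summaries as context.
    response = client.chat.completions.create(
        model="gpt-4-vision-preview",
        max_tokens=500,
        messages=[
            {"role": "system",
             "content": DIRECTIVES + "\nPrevious context: " + " ".join(conversation_summaries)},
            {"role": "user", "content": [
                {"type": "text", "text": prompt_text},
                {"type": "image_url",
                 "image_url": {"url": f"data:image/png;base64,{image_b64}"}},
            ]},
        ],
    )
    feedback = response.choices[0].message.content

    # 3. Summarize the exchange with a cheaper model and store it as memory,
    #    keeping only the most recent summaries to limit token spend.
    summary = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user",
                   "content": f"Summarize this critique in two sentences:\n{prompt_text}\n{feedback}"}],
    ).choices[0].message.content
    conversation_summaries.append(summary)
    del conversation_summaries[:-3]

    return feedback
```

Each call folds the stored summaries back into the next system message, which is what lets the bot "remember" earlier critiques without resending the full transcript.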

The flowchart to this day still doesn't make sense to me, only because I created it AFTER creating the script.

Some disclaimers regarding this script:

  • Download and use at your own risk, follow data compliance rules as you normally would, and read your signed agreements with third-party services (Python, OpenAI, ElevenLabs, etc.) - I am not responsible for anything!
  • You are free to use, abuse, and modify this script however you'd like, for personal or commercial use.
  • Yes, I understand the script code is not perfect. Feel free to tinker around and send me a pull request if you want to see improvements.
  • The script does not confidently provide or reflect an expert UX opinion, and should not be reliably used for professional business decisions.
  • The "script" refers to two scripts - one called "client-command.py" which is the primary script, and "send_audio.py" as a sister script for transcribing microphone input.
  • In the demo video, I edited the recording across two identical conversations in order to build the content. Additionally, there are some moments where the GPT incorrectly calls out user needs and business context, which can always be remedied by prompting the tool.
  • Transcribing text uses the Python repo of Whisper, OpenAI's speech recognition model (see the sketch after this list). It's an amazing and pretty nimble technology - read more here.
  • System prompts for Director Delta can be modified as needed - they are what keep Director Delta in character.
  • On average, each call of this script cost me less than $0.02. We used "gpt-4-vision-preview" to get image recognition capabilities, which in its current state is pretty expensive in comparison to "gpt-3.5-turbo."
  • Conversation summaries are currently capped at a ceiling of 3 to save on token spend, but you can always increase or decrease this number (see Step 4 in the script).
  • ElevenLabs is a totally optional augmentation of this code; you can delete it if you feel it's entirely unnecessary (which it definitely is, but it's funny if you find the weirdest voice).
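Since a couple of the bullets above touch on transcription, here is a similarly rough sketch of what send_audio.py's job looks like. The article doesn't detail how audio is captured, so the sounddevice/soundfile recording step is an assumption; the transcription itself uses the open-source whisper package.

```python
# Sketch of microphone capture + Whisper transcription (illustrative, not send_audio.py).
import sounddevice as sd
import soundfile as sf
import whisper

def transcribe_prompt(seconds: int = 10, wav_path: str = "prompt.wav") -> str:
    # Record a short clip from the default microphone at 16 kHz mono.
    samplerate = 16000
    recording = sd.rec(int(seconds * samplerate), samplerate=samplerate,
                       channels=1, dtype="float32")
    sd.wait()
    sf.write(wav_path, recording, samplerate)

    # Run the local Whisper model and return the transcribed text
    # (Whisper needs ffmpeg installed to load the audio file).
    model = whisper.load_model("base")
    result = model.transcribe(wav_path)
    return result["text"]
```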

Do you see a future for Director Delta?

For Director Delta specifically, absolutely not. But the idea and use case of having an AI assistant as a co-pilot, rather than a human, sounds like a refreshing approach for preserving as much of the craft of design as possible. Director Delta might not always have the right answer, and it's our human intuition that has the ability to assess and say, "wait a second, that's a stupid thing to critique or recommend." I could totally see this type of prompting as a plugin for Figma or as a mediator for design discussions.

That said, much of GPT-4 is limited to its dataset and general use cases. I did not have the time or interest to train and fine-tune a model to accurately perform heuristic evaluations or accurately flag WCAG contrast ratios, because that requires an immense amount of collecting and tagging interfaces with the right parameters. Maybe I'll do that someday when the weather isn't so snowy.

I do hope though that this is inspiring enough for any designer, UXer or UIer to think of AI in a different light and start building solutions that will help grow the practice. So to my fellow designers - happy hunting! ■
