Module 5: Testing, Iterating & Saving

From the course: Design Your Personal AI Brain – Conspire With Yourself: AI Systems for Personal Growth


Welcome to Module 5!

Now that your system is loaded with rich context, useful references, and your tone of voice, it’s time to put it to the test.

This is where your personal AI brain goes from setup to real-world use. We’ll walk through how to test it, assess the results, iterate, and save what works — so your system gets better and better over time.


Why Testing Is Important

Building your system is just the first step — but now it needs to prove itself in action.

And that means giving it real work to do, not just a throwaway test prompt.

You’re about to start collaborating with your AI system, not just using it.


Use Real, Relevant Tasks

Don’t just type “write a blog post” or “summarise this” — go deeper.

Pick a real task from your actual work or creative life. For example:

  • “Write a response to this tricky email in my tone”

  • “Turn this article into a carousel using Gamma-style slides”

  • “Analyse this job listing and show how I align based on my Project Brain”

When the context is real, the feedback you give your AI is more meaningful.


Assess the Response

Once your system replies, don’t just accept it blindly — take a moment to reflect.

Ask:

  • Did this sound like me?

  • Did it use the right knowledge or examples?

  • Was it genuinely helpful or just filler?

  • What worked well? What felt off?

Treat this like reviewing a junior team member’s work — it’s a chance to guide and refine.


The Iteration Loop

Here’s a simple feedback cycle to follow:

  1. Run a task using your system

  2. Reflect on what was strong or weak

  3. Tweak something — the prompt, the context, or the instructions

  4. Try again

  5. Save any version that feels strong

This is how your system becomes more consistent and tuned to your preferences.


Save What Works

When you get an output that feels just right, save it.

  • Create a folder for great outputs by use case

  • Label and tag them (e.g. “Voice_Nailed_JobResponse” or “GoodCarouselStructure”)

  • Use these examples as new context in future prompts

You’re building a library of wins that you and your AI system can learn from.
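If you’d rather script this library than manage it by hand, here’s a minimal Python sketch of the folder-and-label idea above. The folder name `ai_brain_library` and the helper `save_output` are illustrative choices, not part of the course:

```python
from pathlib import Path

def save_output(text: str, use_case: str, label: str,
                root: str = "ai_brain_library") -> Path:
    """Save a strong AI output into a per-use-case folder, named by its label."""
    folder = Path(root) / use_case           # e.g. ai_brain_library/job_responses
    folder.mkdir(parents=True, exist_ok=True)
    path = folder / f"{label}.md"            # e.g. Voice_Nailed_JobResponse.md
    path.write_text(text, encoding="utf-8")
    return path

# Example: file the email draft that nailed your tone
saved = save_output("Hi Sam, thanks for flagging this...",
                    "job_responses", "Voice_Nailed_JobResponse")
print(saved)  # prints the saved file's path
```

Saved files like these can then be pasted back into future prompts as fresh context.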


Bonus Tip: Ask Your AI for Feedback

You can even ask your AI system to help you improve its own output. Try prompts like:

  • “How could you revise this to match my tone more closely?”

  • “What’s missing that would make this more actionable?”

  • “Compare this to my writing sample — what’s different?”

These kinds of meta-prompts help your system become even more insightful.
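If you keep a writing sample on file, the comparison meta-prompt above can be assembled rather than retyped each time. A small Python sketch; `build_feedback_prompt` is a hypothetical helper, not something the course provides:

```python
def build_feedback_prompt(draft: str, writing_sample: str) -> str:
    """Assemble a meta-prompt asking the AI to compare its draft to your real writing."""
    return (
        "Compare the DRAFT below to my WRITING SAMPLE. "
        "List what's different in tone, rhythm, and word choice, "
        "then suggest how you could revise the draft to match me more closely.\n\n"
        f"DRAFT:\n{draft}\n\n"
        f"WRITING SAMPLE:\n{writing_sample}"
    )

prompt = build_feedback_prompt(
    "Dear Sir, I write to enquire about the position...",
    "Hey! Quick question about the role you posted...",
)
# Paste `prompt` into your AI system and review its comparison
```

The same pattern works for the other meta-prompts: swap the instruction line and keep the draft-plus-reference structure.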


Your Task: Trial and Tune

Pick one real task that’s relevant to your current work or goals.

Then:

✅ Prompt your system to help with it
✅ Reflect and assess the result
✅ Make one change (prompt, context, or tone input)
✅ Re-run the task
✅ Save the best version
✅ Keep a note of what you learned

Remember: you’re not just testing — you’re training by collaborating.


Up Next…

In the final module, we’ll look at how to maintain and expand your AI system over time — so it grows with you and continues to support your evolving goals and priorities.

This is where your single “AI brain” becomes a full ecosystem of support.

Let’s finish strong!