AI + UX Research Walkthrough -> For Bitcoiners learning to run local AI models
We're running local AI models because public AI models aren't necessarily safe places to sift through our data. A lot of Bitcoiners are very privacy focused, so it's important to run local models so that the data we're processing never gets shared with a third party.
Here are the steps I go through to create high-quality output analysis of data using AI:
---
### 1. Prepare the Raw Data
Preparing the data might sound overly complicated, but it's not.
The problem AI has is recognizing that you've done, for example, 12 interviews. Without clear labels, it might only pick up three of those interviews.
You need to create a single document and label every interview: copy and paste each transcript and head it with Person 1, Person 2, Person 3, Person 4, and so on.
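The labeling step above can be automated. Here's a minimal sketch that stitches plain-text transcripts into one labeled document; the `*.txt` naming pattern and directory layout are assumptions, so adapt them to however you store your transcripts.

```python
# Sketch: combine interview transcripts into one labeled document,
# so the AI can see "Person 1", "Person 2", ... as distinct data points.
# Assumes each transcript is a plain-text .txt file (hypothetical layout).
from pathlib import Path

def build_labeled_document(transcript_dir: str, out_file: str) -> int:
    """Concatenate every transcript, prefixing each with a Person N label.

    Returns the number of interviews labeled.
    """
    paths = sorted(Path(transcript_dir).glob("*.txt"))
    sections = []
    for i, path in enumerate(paths, start=1):
        sections.append(f"--- Person {i} ---\n{path.read_text()}")
    Path(out_file).write_text("\n\n".join(sections))
    return len(paths)
```

Sorting the paths keeps the Person numbering stable between runs, which matters later when you tell the model to use "Person 1 through Person 12" explicitly.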
### 2. Provide Context to the AI Model
You're going to be providing context to the AI model. For this step you can use a public AI, because you're only creating the prompt - this is all preparation. You're not processing the interview data yet.
What I usually do is take my UX research plan and pop it into the AI model, giving it context on what I'm about to process. Now it knows the reason, the goals, and the vision.
### 3. Prepare the Prompt
I say to the AI model: "Use the data that I've given you, and I want you to prepare a prompt based on the goals and vision of the UX research sprint that I've done."
Then it's going to prepare a prompt. That's the prompt you're going to use, and you're going to take that prompt into the local AI model running on your PC.
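If it helps to see the shape of that request, here's a tiny sketch of a helper that wraps a research plan in the kind of meta-prompt described above. The exact wording is an illustrative assumption, not a fixed formula - adjust it to your own sprint.

```python
# Sketch: wrap a UX research plan in a meta-prompt that asks a (public) AI
# to generate the analysis prompt you'll later run on your local model.
# The phrasing here is an assumption -- tune it to your own goals.

def make_meta_prompt(research_plan: str) -> str:
    return (
        "Here is my UX research plan:\n\n"
        f"{research_plan}\n\n"
        "Based on the goals and vision of this research sprint, prepare a "
        "prompt I can give to a local AI model to analyze my labeled "
        "interview transcripts (Person 1, Person 2, and so on)."
    )
```

Save the generated prompt to a text file so you can paste it unchanged into the local model.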
### 4. Process Data in Bite-Sized Pieces
You can do it in two ways:
- Take the prompt, paste it in, attach the PDF, and click run.
- Process the data in bite-sized chunks: use the full prompt with the first half of the data, then use the full prompt again with the second half.
Local AI models become less capable the further down the chat you go, because the conversation fills up their context window (their token budget). So I tend to put the prompt and all of the data right at the beginning.
Something really crucial: say "I want you to use all of the data that I've provided - Person 1, Person 2, Person 3, Person 4, and so on."
If you don't do this, the AI model will not pick up on the fact that it needs to use every data point you provided. In my experience, it will only use three of them.
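The chunking approach above can be sketched as a small helper that repeats the full prompt at the top of every chunk, so the model never loses the instructions. The character budget here is an illustrative stand-in for your model's real context window - tune it to the model you run locally.

```python
# Sketch: split labeled interview sections (Person 1, Person 2, ...) into
# chunks that fit a local model's context, repeating the full prompt in
# each chunk. The 4000-character default is an assumption, not a rule.

def build_chunked_prompts(prompt: str, sections: list[str],
                          budget: int = 4000) -> list[str]:
    """Group labeled sections into prompts that stay under the budget."""
    chunks, current, size = [], [], len(prompt)
    for section in sections:
        # Start a new chunk when adding this section would blow the budget.
        if current and size + len(section) > budget:
            chunks.append(prompt + "\n\n" + "\n\n".join(current))
            current, size = [], len(prompt)
        current.append(section)
        size += len(section)
    if current:
        chunks.append(prompt + "\n\n" + "\n\n".join(current))
    return chunks
```

Because every chunk restates the prompt (including the "use all of the data" instruction), each run starts fresh near the top of the context window instead of deep in a degraded chat.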
That's a quick summary -> Any questions, shoot them this way! And I'm very happy to learn from others who are also using AI to process UX research.
** This post was created using an AI voice tool and its summarizing feature: I recorded audio notes, then had AI transcribe and summarize them, with my own language as the core foundation.