Professor Prompt Template
Research question refinement prompt for Professors
A structured AI prompt template that configures any large language model to act as an experienced academic supervisor refining a research question. Paste it into ChatGPT, Claude, or Gemini to get professional-quality research question refinement output every time.
Why Research question refinement prompts matter for Professors
When you ask an AI model a vague question, you get a vague answer. The most common mistake professors make is treating an AI like a search engine — sending a single-sentence request and hoping for a structured, expert-level response. It rarely works.
A well-structured research question refinement prompt does three things: it gives the AI a specific expert role to adopt (in this case, an experienced academic supervisor refining a research question), it provides the contextual framing needed to understand your situation, and it specifies exactly how the output should be organised. The result is output you can actually use — not output you need to spend thirty minutes editing into something useful.
The key principle behind this template: A strong research question is specific, feasible, and original. Test for scope creep, vagueness, and whether the answer is already out there. That philosophy shapes every element of the prompt structure below.
What this prompt generates
When you use this template, the AI will organise its research question refinement response around 5 structured sections. Each one is designed to give you immediately usable output — not generic advice you need to interpret.
- Original question (as given)
- Issues identified — scope, clarity, originality
- Refined question options (2–3 variants)
- Recommended option with rationale
- Potential methodological implications
The output will be written in an incisive, scholarly, and mentoring tone — calibrated for the audience and decisions a professor typically faces.
Example prompt
Here is what a prompt built by this template looks like. You provide a short description of your situation; the template handles the role, framing, and output format automatically.
You are an experienced academic supervisor refining a research question. A strong research question is specific, feasible, and original. Test for scope creep, vagueness, and whether the answer is already out there.
My task: Research question refinement. Context: I am designing a 12-week undergraduate module on behavioural economics for second-year students.
Please structure your response using these sections: Original question (as given), Issues identified, Refined question options (2–3 variants), Recommended option with rationale, Potential methodological implications.
Paste a prompt like this into ChatGPT (GPT-4o), Claude (3.5 Sonnet or higher), or Gemini Advanced and you will receive a structured, expert-level research question refinement document — not a paragraph of generalities.
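If you script your AI workflows rather than pasting prompts by hand, the role–principle–task–sections assembly shown above can be sketched in a few lines of Python. This is an illustrative reconstruction built from the example prompt on this page, not PromptEvolution's actual implementation; the function and variable names are hypothetical.

```python
# Hypothetical sketch of how a template like this assembles a prompt.
# ROLE, PRINCIPLE, and SECTIONS are copied from the example above;
# build_prompt() itself is an assumption, not PromptEvolution's real code.

ROLE = "an experienced academic supervisor refining a research question"
PRINCIPLE = (
    "A strong research question is specific, feasible, and original. "
    "Test for scope creep, vagueness, and whether the answer is already out there."
)
SECTIONS = [
    "Original question (as given)",
    "Issues identified",
    "Refined question options (2–3 variants)",
    "Recommended option with rationale",
    "Potential methodological implications",
]

def build_prompt(task: str, context: str) -> str:
    """Combine the expert role, guiding principle, your task and context,
    and the required output sections into one paste-ready prompt."""
    return (
        f"You are {ROLE}. {PRINCIPLE}\n"
        f"My task: {task}. Context: {context}\n"
        "Please structure your response using these sections: "
        + ", ".join(SECTIONS) + "."
    )

prompt = build_prompt(
    "Research question refinement",
    "I am designing a 12-week undergraduate module on behavioural "
    "economics for second-year students.",
)
print(prompt)
```

The output string matches the example prompt above line for line, which is the point: the template is just a fixed frame around your task and context.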
How to use this template on PromptEvolution
PromptEvolution builds and refines this prompt for you automatically. You do not need to copy and edit template text manually.
- Open the prompt builder.
- Select “Professor” from the profession dropdown.
- Choose “Research question refinement” from the task list.
- Add your context in the text area — describe what you are working on for research question refinement, any constraints, and your target audience.
- Click Generate to get an optimised, context-enriched prompt ready to paste into any AI model.
- Copy and use the output directly in ChatGPT, Claude, Gemini, or any other LLM.
Tips for sharper research question refinement results
- Be specific in the context field. The more detail you provide about your situation — the audience, constraints, and what you have already tried — the more targeted the output will be.
- Name your constraints explicitly. If you have a word limit, a deadline, a particular format requirement, or a stakeholder audience, include it. Constraints help the AI prioritise.
- Iterate, do not start over. If the first output is close but not quite right, paste it back in with a note on what to change rather than generating from scratch.
- Use the full output. Each section in the structured output exists for a reason. If a section does not apply to you, trim it — but read it first. It often surfaces an angle you had not considered.
Frequently asked questions
Which AI models work best with this research question refinement prompt?
This template is designed to work with any instruction-following large language model. In practice, GPT-4o, Claude 3.5 Sonnet or later, and Gemini 1.5 Pro all produce strong results. GPT-4o and Claude tend to follow the structured output format most reliably. If you are on a free plan, GPT-4o mini and Claude Haiku can still produce useful output — the depth of each section will be shallower, but the structure will hold.
Can I customise the output sections?
Yes. The five sections above are the default template, but you can instruct the AI to add, remove, or rename sections by appending a note to the prompt. For example: “Replace the Potential methodological implications section with a risks and assumptions table.” The model will adapt its structure accordingly. PromptEvolution’s context field is also a good place to specify format preferences before the prompt is generated.
Is this better than writing my own research question refinement prompt from scratch?
For most professors, yes — especially for tasks you do not run every day. Writing a strong prompt from scratch requires knowing which output sections matter, what role framing to use, and how to phrase the context to avoid ambiguity. This template encodes best-practice answers to all three questions, derived from how an experienced academic supervisor would actually approach research question refinement. If you run this task daily, you will likely want to refine the template over time — but this is a strong starting point.
Does PromptEvolution store my context or outputs?
PromptEvolution does not store your prompts or context on its servers. The context you enter is used only to generate the prompt in your current session and is not logged, sold, or used to train AI models.
Try this prompt template now
Open the prompt builder, select Professor, choose Research question refinement, and get your optimised prompt in seconds.