
Eight key considerations when prompting AI


Johannesburg, 29 Apr 2024
Karlien Rust, National Marketing Manager for Corporate at CTU Training Solutions.

In the rapidly evolving landscape of artificial intelligence (AI), the ability to communicate effectively with large language models (LLMs) has emerged as a critical skill. Whether you are interacting with ChatGPT, Google Gemini or Microsoft Copilot, mastering the art of crafting prompts is essential for harnessing the full potential of AI capabilities.

CTU Training Solutions recently hosted a webinar presented by Professor Johan Steyn, a human-centred AI specialist, who shared invaluable insights into navigating the complexities of AI communication.

He talks about the importance of articulating your requirements properly to AI language models in order to get the desired output. “The ability to express yourself clearly, whether spoken or written, is becoming an increasingly important skill with the advent of AI, particularly when using LLMs.”

Here, we distil Prof Steyn’s expertise into eight key considerations for optimising AI interactions.

Talking to AI

Prompting is the process of providing input text to the AI model to elicit a specific response or output. This input text, called a prompt, acts as a guide or instruction to the model, telling it what type of information or response is being sought. “Effective prompting can affect the quality, relevance and specificity of the model’s output, and it can be crafted creatively to maximise the model’s performance on various tasks, such as generating content, answering questions or completing tasks.”
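
To illustrate how a prompt guides the model’s output, the snippet below sends a single instruction to a language model programmatically. It is a minimal sketch, assuming the OpenAI Python SDK (v1+) and an API key in the environment; the model name and prompt wording are illustrative placeholders, not part of Prof Steyn’s material.

```python
# Minimal sketch of sending a prompt to a language model.
# Assumes the OpenAI Python SDK (v1+) and an OPENAI_API_KEY environment
# variable; the model name and prompt text are illustrative placeholders.
from openai import OpenAI

client = OpenAI()

# The prompt is the instruction that tells the model what is being sought.
prompt = (
    "Summarise the key risks of adopting AI in a corporate training "
    "environment. Respond in five bullet points, in a formal tone."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[{"role": "user", "content": prompt}],
)

print(response.choices[0].message.content)
```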

Professor Johan Steyn, human-centred AI specialist.

“People are often frustrated when using ChatGPT and other platforms, but the way they interact with them is often inadequate. There are so many things we can tweak when we interact with these platforms that will give you better results. The trick here is to reprompt the AI if you don’t get what you asked for. Be specific, ask for bullet points and not paragraphs, for example. And keep prompting until you get the answer you’re looking for.”
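
To make the reprompting idea concrete, the sketch below keeps the conversation history and adds a follow-up instruction asking for bullet points instead of paragraphs. Again it assumes the OpenAI Python SDK; the message contents and model name are illustrative assumptions.

```python
# Sketch of reprompting: keep the conversation history and add a
# follow-up instruction that tightens the request (bullet points, not
# paragraphs). Assumes the OpenAI Python SDK (v1+); contents are illustrative.
from openai import OpenAI

client = OpenAI()

messages = [
    {"role": "user", "content": "Explain how AI can improve staff onboarding."},
]
first = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
messages.append({"role": "assistant", "content": first.choices[0].message.content})

# The first answer came back as long paragraphs, so reprompt with a more
# specific instruction rather than accepting the output as-is.
messages.append({
    "role": "user",
    "content": "Rewrite that as five concise bullet points, each under 20 words.",
})
second = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
print(second.choices[0].message.content)
```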

Here are some of Prof Steyn’s tips on how to speak to LLMs effectively; a short sketch combining them follows the list:

  1. Clear task definition: Start prompts with action verbs and define clear end goals. Understand the importance of directive language in AI prompting and practise formulating concise tasks.
  2. Contextual mastery: Provide relevant and comprehensive background information for nuanced AI responses. Craft prompts embedded with rich contexts tailored to specific scenarios. Include links to URLs you find relevant.
  3. Using exemplars: Incorporate specific examples or frameworks to guide AI responses. By providing exemplars, users steer AI models towards producing outputs aligned with desired formats or structures. Include links or PDFs as examples.
  4. Persona crafting: Assign roles or personas for more targeted AI responses. Practise assigning and using different personas to shape AI interactions.
  5. Formatting for clarity and impact: Specify output formats for organised and impactful AI responses. Whether requiring data tables or narrative texts, defining formatting preferences ensures clarity and usability of AI-generated content.
  6. Tone setting: Specify the desired tone and style of AI responses to align with communication goals. From formal to casual, setting the appropriate tone enhances the relevance and effectiveness of AI-generated outputs.
  7. Response length calibration: Define the desired length of AI responses for concise and relevant AI outputs.
  8. Explore plugins: Augment AI functionality by leveraging plugins to access external data sources or perform specialised tasks. Integrating plugins expands the scope of AI capabilities, enabling enhanced functionality tailored to specific user requirements. Look for plugins that will make your life easier, such as the ability to upload a file instead of copying and pasting its contents.
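
As a way of tying these considerations together, here is a short sketch that assembles them into one structured prompt. It is purely illustrative: the function name, fields and sample values are assumptions for demonstration, not a template from the webinar, and item 8 (plugins) is a platform feature rather than part of the prompt text itself.

```python
# Illustrative prompt builder combining the considerations above: task,
# context, exemplar, persona, format, tone and length. The function name,
# fields and sample values are assumptions for demonstration only.
def build_prompt(task, context, exemplar, persona, output_format, tone, length):
    return "\n".join([
        f"Act as {persona}.",                       # 4. persona crafting
        f"Task: {task}",                            # 1. clear task definition
        f"Context: {context}",                      # 2. contextual mastery
        f"Use this as an example of the style I want: {exemplar}",  # 3. exemplars
        f"Format the answer as {output_format}.",   # 5. formatting
        f"Write in a {tone} tone.",                 # 6. tone setting
        f"Keep the response to roughly {length}.",  # 7. response length
    ])

prompt = build_prompt(
    task="Draft an internal announcement about a new AI upskilling programme",
    context="A training provider rolling out AI courses to corporate clients",
    exemplar="the previous announcement (uploaded separately)",
    persona="an experienced corporate communications manager",
    output_format="a short heading followed by bullet points",
    tone="formal but approachable",
    length="150 words",
)
print(prompt)
```

Printing the assembled prompt shows all of the considerations in one place before it is pasted into ChatGPT, Gemini or Copilot.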

He adds that it may be necessary to opt for the paid version of an LLM to access all of its functionality, such as certain plugins, although users should start with (and try to master) the basic free version before attempting to use the full spectrum of functionality.

Professor Steyn stresses the importance of adopting a systematic approach to AI communication, rooted in clarity, specificity and adaptability. “Don’t just trust what it’s giving back to you; make sure you prompt as accurately as you can, defining what you want to do, how long it must be and how you want it back, and reprompt as many times as needed to get the required output. All of these will improve the quality of the output. Remember, AI can’t replace human expertise, but it can offer unprecedented productivity gains.”

In conclusion, Karlien Rust, National Marketing Manager for Corporate at CTU Training Solutions, mentions that the institution offers a series of AI-focused courses, such as the Microsoft Azure AI Mastery Bundle. To find out more about how you can upskill to use AI effectively, visit CTU Training Solutions’ website.
