GenAI prompt training: is it worth it for your business?

GenAI tools are only as good as the prompts users enter. Organisations must train their employees to get the most from the technology

Lecturer Helps Scholar With Project, Advising On Their Work. Teacher Giving Lesson To Diverse Multiethnic Group Of Female And Male Students In College Room, Teaching New Academic Skills On A Computer.

Even as employees express concerns about generative AI impacting their careers, many use tools such as ChatGPT and Gemini in their daily workflows. Without the proper training, that could be a problem.

Users ask ChatGPT and similar tools what they want via text inputs – or ‘prompts’ – and the platforms use a probabilistic algorithm to return what they think the answer might be.

There’s no denying the success of such tools. Since OpenAI released ChatGPT, the GenAI platform has acquired 180.5 million users as of September 2024, drawn to its simplicity. According to Cypher Learning, a quarter of employees use GenAI at work – whether their boss has signed off on it or not.

This could be a compliance nightmare. Staff might use the various AI assistants to work on confidential documents. The overly trusting, or those under intense time pressures, might turn to consumer GenAI as a source of information without properly scrutinising its output. There are other dangers: these platforms can spit out convincing “hallucinations”, or return results with baked-in bias.

So how can organisations empower their employees to make the most of GenAI, while avoiding these pitfalls? As with developing any new skill, training is essential – but there still isn’t enough of it. The Cypher report found that in organisations that have approved the use of GenAI, 57% of employees barely utilise the technology, mostly because they haven’t received training on the “prompt engineering” needed to use the platforms.

It’s therefore clear that companies need to focus on training, ensuring their employees use GenAI prompts effectively. However, perspectives differ on the ideal prompt training programme.

The case for prompt training

For a start, how much time should your organisation dedicate to prompt training? Lots, according to Harshul Asnani, president and head of the Europe business at IT services company Tech Mahindra, who thinks prompt training is critical.

“It’s the difference between a good or poor-quality outcome using AI,” Asnani says.

GenAI is trained on large language models (LLMS), so it’s essential to be able to use language commands to narrow your focus for a particular outcome, he notes.

“Giving examples and refining responses helps teams get the inspiration or suggestions they’re looking for in the shortest amount of time, which is crucial to increasing efficiency and productivity.”

His company set about starting a GenAI training scheme after partnering with Microsoft, bringing the Copilot suite of AI chatbot apps into Tech Mahindra.

Training has to cover the fundamentals, but also less-straightforward factors such as clarity and context, creativity, iteration and validation

Tech Mahindra carved training into two distinct groups: developers and information workers, the latter including sales teams, project managers and senior leaders.

Prompt engineering was a large part of this training. The education began with courses on the foundational concepts of GenAI, followed by case studies, test assignments and responsible AI content covering data privacy, security guardrails and ensuring sensitive information isn’t inadvertently disclosed.

The organisation has already trained more than 45,000 employees in the basic principles of AI and automation, with another 15,000 trained in GenAI specifically. Tech Mahindra plans to upskill all its IT staff with AI training by 2025.

Different levels of GenAI prompt training

After training staff on the fundamentals and establishing sandbox environments to encourage hands-on experimentation, employers can dive deeper into more advanced techniques.

Tech services business Infosys decided to ensure its entire workforce of more than 300,000 employees was “AI-ready”, equipped with the basic knowledge of what GenAI is, how it works and the process of crafting effective prompts with meaningful results.

Infosys CTO Rafee Tarafdar says the company took a three-tiered approach to training, splitting staff into general consumers of AI, who would receive basic training; builders, who would use GenAI to create new products or applications; and finally masters, who would specialise in deep GenAI knowledge for their specific domains.

More advanced AI techniques might include training employees in parameters such as “temperature”, which in the parlance of LLMs essentially means variability in results: ie high temperatures allow for more creativity and randomness in the response, while low temperatures set safer and more predictable responses.

The company encourages employees to experiment with other elements such as personas: what kind of person do users want the model to consider in its outputs?

Going deeper still, developers using GenAI might need training in how to check and validate the output of the code generated. Security specialists might investigate prompt injection attacks, where GenAI is turned against itself and “tricked” to provide answers outside of its security guardrails, in order to audit the safety and efficacy of the models in use.

How prompt training differs to traditional software training

One of the most important factors across the board is to facilitate ongoing learning.

“Prompts that worked for one model will not work for another model,” says Tarafdar. “So you need to continuously learn and see how to work across models and across versions.”

That learning looks a little different than for traditional software. With typical business software such as a content management system or a customer relationship management tool, the application will always work in the same way. Button X achieves task Y, for instance, or “this is how to set up a workflow.”

Every person thinks and asks questions differently, so prompt engineering training
needs to address both diversity of logic as well as diversity of learning styles

By comparison, GenAI can be inconsistent. It is only as effective as the prompts issued to it. The user must know how to get the best from the app. It is a little more like the effective use of search, with its requirements for critical thinking and trial and error to find what you need.

“Every person thinks and asks questions differently, so prompt engineering training needs to address both diversity of logic as well as diversity of learning styles to be effective and enriching,” says Kathy Diaz, chief people officer at Cognizant, an IT services and consulting firm.

“Prompt engineering is a mix of art and science, so training has to cover the fundamentals of prompting but also enable learners to think through less straightforward factors such as clarity and context, creativity, iteration and validation to achieve optimal results,” Diaz says.

This means hands-on tasks are essential. Cognizant, for instance, acquired 25,000 licences for Microsoft Copilot and set up sandbox environments so that employees could freely experiment and observe how outputs varied.

At the same time, employees receive access to practical exercises, quizzes and discussions. The idea is to combine theoretical knowledge with concrete, real-world application of their skills.

“A blended learning experience is critical to engage diverse learning styles and create deeper understanding of concepts,” Diaz says. “Along with e-learning courses, we have subject matter experts to deepen conversation, provide examples and clarify concepts.”

The future of GenAI training: prompts or outcomes?

Continuous training and upskilling should always be a priority, especially around emerging technologies such as AI. But Graham Glass, founder and CEO of Cypher Learning, urges some caution in terms of training everyone on certain aspects of the technology, such as prompt engineering.

“There’s a bit of a knee-jerk reaction from companies who think everyone needs to get specific prompt training when that isn’t really the case,” Glass argues. “While it’ll be a big part of some jobs – for those who need to get under the hood and work directly with the AI – your average worker won’t need to do that.”

Glass expects that by 2025, the need for users to learn complex prompt design or prompt engineering will “largely disappear” as AI technologies become more deeply embedded into everyday applications.

Peter van der Putten, director of the AI lab at Pega, says employees in an enterprise context won’t just use chatbot-style interfaces. In many cases, GenAI will be embedded into tools, processes and workflows, with prompts hidden for end users.

“It will be more important to train employees on general GenAI principles: for instance, the fact that the output can be hallucinating or biased,” he says, adding that generally, teams should also avoid sharing sensitive information materials with GenAI. Instead, organisations should focus their training on how GenAI is being used by existing tools and systems. 

Joshua Wöhle is CEO and co-founder of AI skills platform Mindstone. He thinks most organisations would benefit from an outcomes-based approach. In his company’s AI training programme, only one of nine hours is spent on prompt engineering, with the rest spent on discussing use cases.

“We’ve found that once people understand how AI can help them in their day-to-day, the tech stuff just clicks,” Wöhle says. “It’s about making AI relevant to what they’re already doing.”

He advises businesses should examine where the AI skills gaps are and set some clear goals. They could make learning about AI part of the workday, rather than an extracurricular activity. And it’s helpful to mix up the styles of learning – sometimes individual and sometimes in a group.

Organisations can monitor the success of this outcomes-based approach to training by examining team KPIs. Are sales going up? Are IT teams sorting out issues faster?

Views on precisely how to train the workforce on GenAI might differ, but the experts agree that it’s a worthwhile endeavour to demystify the tooling and ensure staff make the most of the rapidly developing technology.