Prompt engineering is the process of guiding generative artificial intelligence (generative AI) solutions toward producing desired outputs. A thoughtful approach to creating prompts is necessary to bridge the gap between raw queries and meaningful AI-generated responses. By crafting effective prompts, engineers can significantly improve the quality and relevance of outputs, solving for both specific and general cases. This reduces the need for manual review and post-generation editing, ultimately saving time and effort in achieving the desired outcomes.
Many organizations have launched their own gen AI tools. Morgan Stanley has launched a gen AI tool to help its financial advisers better apply insights from the company’s 100,000-plus research reports. The government of Iceland has partnered with OpenAI to work on preserving the Icelandic language. Enterprise software company Salesforce has integrated gen AI technology into its popular customer relationship management (CRM) platform. And McKinsey’s own tool, Lilli, provides streamlined, impartial search and synthesis of vast stores of knowledge to bring the best insights, capabilities, and technology solutions to clients. Whatever the tool, it’s essential to experiment with different ideas and test the AI prompts to see the results.
The bank decides to build a solution that accesses a gen AI foundation model through an API (or application programming interface, which is code that helps two pieces of software talk to each other). The tool scans documents and can quickly provide synthesized answers to questions asked by relationship managers (RMs). To make sure RMs receive the most accurate answers possible, the bank trains them in prompt engineering.
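As a rough sketch of what such an API-based workflow could look like (the endpoint URL, request fields, and response shape below are placeholders, not any specific provider's actual API):

```python
import requests  # HTTP client for calling a hosted foundation model

# Placeholder endpoint and key; a real deployment would use the provider's documented API.
API_URL = "https://api.example-provider.com/v1/generate"
API_KEY = "YOUR_API_KEY"

def answer_rm_question(question: str, document_text: str) -> str:
    """Send a relationship manager's question plus supporting document text to the model."""
    prompt = (
        "You are an assistant for bank relationship managers.\n"
        "Using only the document below, answer the question concisely and point to the relevant section.\n\n"
        f"Document:\n{document_text}\n\nQuestion: {question}\nAnswer:"
    )
    response = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"prompt": prompt, "max_tokens": 300},  # request fields are assumptions
        timeout=30,
    )
    response.raise_for_status()
    return response.json().get("text", "")  # response shape is also an assumption
```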
Effective prompt engineering offers several benefits:

- The prompts can be reused across diverse processes and business units.
- It can enhance a model’s creative abilities in various scenarios.
- It reduces the need for manual review and post-generation editing, saving time and effort.
McKinsey estimates that gen AI tools could create value from increased productivity of up to 4.7 percent of the banking industry’s annual revenues, and prompt engineering has a role to play in helping banks capture this value. More broadly, gen AI could enable labor productivity growth of up to 0.6 percent annually through 2040, but that depends on how fast organizations are able to adopt the technology and effectively redeploy workers’ time. Employees with skills that stand to be automated will need support in learning new skills, and some will need support with changing occupations.

Whatever the use case, balance simplicity and complexity in your prompt to avoid vague, unrelated, or unexpected answers. A prompt that is too simple may lack context, while a prompt that is too complex may confuse the AI.
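To make that concrete, the invented snippet below contrasts a prompt that is too simple with one that balances context and brevity:

```python
# Too simple: no audience, scope, or format, so the answer could go anywhere.
vague_prompt = "Tell me about gen AI in banking."

# Balanced: adds context and constraints without burying the request in detail.
balanced_prompt = (
    "In three bullet points, summarize how generative AI could raise productivity "
    "in retail banking, and note one risk banks should watch for."
)
```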
Prompt iteration strategies
Prompt engineers bridge the gap between your end users and the large language model. They identify scripts and templates that your users can customize and complete to get the best result from the language models. These engineers experiment with different types of inputs to build a prompt library that application developers can reuse in different scenarios. Large language models (LLMs) are highly flexible and can perform a wide range of tasks.
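A minimal sketch of such a reusable template, assuming plain Python string formatting rather than any particular prompting framework:

```python
# A small, reusable prompt template that application developers can fill in per use case.
SUMMARY_TEMPLATE = (
    "You are a {role}.\n"
    "Summarize the following {document_type} for {audience} in {num_points} bullet points:\n\n"
    "{content}"
)

def build_summary_prompt(role: str, document_type: str, audience: str,
                         num_points: int, content: str) -> str:
    """Fill the template so the same prompt structure can serve different business units."""
    return SUMMARY_TEMPLATE.format(
        role=role,
        document_type=document_type,
        audience=audience,
        num_points=num_points,
        content=content,
    )

# Example: the same template reused for a legal and a marketing scenario.
legal_prompt = build_summary_prompt("compliance analyst", "contract", "the legal team", 5, "...")
marketing_prompt = build_summary_prompt("copywriter", "product brief", "new customers", 3, "...")
```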
These professionals are also tasked with training and fine-tuning emerging AI tools, such as OpenAI’s ChatGPT, Google’s Bard, DALL-E, Midjourney, and Stable Diffusion, to deliver precise and relevant responses to people’s questions. Keep in mind that you may need experience in engineering, development, and coding to be a strong candidate for a prompt engineering role. In addition to earning credentials, consider taking prompt engineering courses. These can be a great way to learn in-demand skills in a structured format and, in some cases, with the support of the course instructor.

When writing a prompt, you can also phrase the instruction as a question or give the model a “role,” as in the example below.
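Both variants here are invented for illustration; they show the same request phrased first as a question and then as a role assignment:

```python
# Same task, two framings: a direct question versus a role-based instruction.
question_prompt = "What are the three most important factors to consider when refinancing a mortgage?"

role_prompt = (
    "You are an experienced mortgage adviser. Explain to a first-time homeowner "
    "the three most important factors to consider when refinancing, in plain language."
)
```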
McKinsey’s latest research suggests that gen AI is poised to boost performance across sales and marketing, customer operations, software development, and more. In the process, gen AI could add up to $4.4 trillion annually to the global economy, across sectors from banking to life sciences.

Prompt engineering can also be used to enhance a model’s creative abilities in various scenarios. For instance, in decision-making scenarios, you could prompt a model to list all possible options, evaluate each option, and recommend the best solution, as in the sketch below. The more creative and open-minded you are, the better your results will be.
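The following is only an invented illustration of such a decision-making prompt; the scenario and wording are assumptions, not drawn from any specific deployment:

```python
# A prompt that walks the model through options, evaluation, and a recommendation.
decision_prompt = (
    "Our team needs to choose a database for a new analytics application.\n"
    "1. List all reasonable options.\n"
    "2. Evaluate each option's strengths and weaknesses for analytics workloads.\n"
    "3. Recommend the best option and briefly justify the choice."
)
```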
Those working with image generators should know art history, photography, and film terminology. Those generating language-based content may need to know various narrative styles or literary theories. In addition to a breadth of communication skills, prompt engineers need to understand generative AI tools and the deep learning frameworks that guide their decision-making. Prompt engineers can employ the following advanced techniques to improve the model’s understanding and output quality.
Developers use prompt engineering to design robust and effective prompting techniques that interface with LLMs and other tools, and the examples below illustrate more of the techniques prompt engineers use to improve their AI models’ natural language processing (NLP) tasks. With well-engineered prompts, users avoid trial and error and still receive coherent, accurate, and relevant responses from AI tools, often from the very first prompt.
As generative AI becomes more accessible, organizations are discovering new and innovative ways to use prompt engineering to solve real-world problems. Getting good outputs isn’t rocket science, but it can take patience and iteration. Just as when you’re asking a human for something, providing specific, clear instructions with examples is more likely to result in good outputs than vague ones. In September 2023, Morgan Stanley launched an AI assistant using GPT-4, with the aim of helping tens of thousands of wealth managers find and synthesize massive amounts of data from the company’s internal knowledge base. The model combines search and content creation so wealth managers can find and tailor information for any client at any moment.
An artificial intelligence (AI) prompt engineer is an expert in creating text-based prompts or cues that can be interpreted and understood by large language models and generative AI tools. In contrast to traditional computer engineers who write code, prompt engineers use written language to evaluate AI systems for idiosyncrasies. Anna Bernstein, a 29-year-old prompt engineer at generative AI firm Copy.ai in New York, is one of the few people already working in this new field. Her role involves writing text-based prompts that she feeds into the back end of AI tools so they can do things such as generate a blog post or sales email with the proper tone and accurate information. She doesn’t need to write any technical code to do this; instead, she types instructions to the AI model to help refine responses. Researchers use prompt engineering to improve the capacity of LLMs on a wide range of common and complex tasks, such as question answering and arithmetic reasoning.
In healthcare, prompt engineers instruct AI systems to summarize medical data and develop treatment recommendations. Effective prompts help AI models process patient data and provide accurate insights and recommendations. Prompt engineers can also ask a model to iteratively critique and improve its own output. For example, imagine a user prompts a model, “Write a short essay on literature.” The model might draft an essay, critique it for lacking specific examples, and rewrite the essay to include them. This process repeats until the essay is deemed satisfactory or a stop criterion is met, as in the sketch below. Critical thinking applications require the language model to solve complex problems. To do so, the model analyzes information from different angles, evaluates its credibility, and makes reasoned decisions.
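This is a minimal sketch of that draft-critique-rewrite loop, assuming a hypothetical generate() function standing in for whatever model call is actually used; the satisfaction check is deliberately crude:

```python
def generate(prompt: str) -> str:
    """Placeholder for a call to the language model; any provider's completion API could sit here."""
    raise NotImplementedError

def refine_essay(topic: str, max_rounds: int = 3) -> str:
    """Draft an essay, then repeatedly critique and rewrite it until it passes or a stop criterion is hit."""
    essay = generate(f"Write a short essay on {topic}.")
    for _ in range(max_rounds):  # stop criterion: a fixed number of revision rounds
        critique = generate(
            "Critique the following essay, focusing on whether it includes specific examples:\n\n" + essay
        )
        if "no issues" in critique.lower():  # crude satisfaction check; a real one would be more robust
            break
        essay = generate(
            f"Rewrite the essay below to address this critique:\n\nCritique:\n{critique}\n\nEssay:\n{essay}"
        )
    return essay
```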
Chatbot developers can ensure the AI understands user queries and provides meaningful answers by crafting effective prompts. One such prompt engineering technique is to include a hint or cue, such as desired keywords, to guide the language model toward the desired output, as in the example below. Prompt engineering gives developers more control over users’ interactions with the AI.
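As an invented illustration, a cued prompt for a support chatbot might embed the desired keywords directly:

```python
# The keyword cue steers the model toward specific terms without dictating the full answer.
cued_prompt = (
    "A customer asks how to reset their password. Write a short, friendly reply.\n"
    "Include the keywords: 'account settings', 'verification email', 'within 15 minutes'."
)
```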