TDS Archive

An archive of data science, data analytics, data engineering, machine learning, and artificial intelligence writing from the former Towards Data Science Medium publication.

Photo by Emiliano Vittoriosi on Unsplash

Prompt Engineering Guide

Principles, Techniques, and Applications to Harness the Power of Prompts in LLMs as a Data Analyst

7 min read · May 7, 2023

Large Language Models (LLMs) are on the rise, driven by the popularity of OpenAI's ChatGPT, which took the internet by storm. As a practitioner in the data field, I look for ways to make the best use of this technology in my work, especially in the insightful yet practical work of a Data Analyst.

LLMs can solve tasks without additional model training via "prompting" techniques, in which the problem is presented to the model as a text prompt. Getting to "the right prompt" is important to ensure the model provides high-quality, accurate results for the tasks assigned. A minimal sketch of what this looks like in practice follows below.
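To make this concrete, here is a minimal sketch of presenting an analyst-style task to a model entirely as text, assuming the OpenAI Python client (v1+ interface); the model name, the sample data, and the task wording are illustrative assumptions, not from the original article.

```python
# A minimal sketch of "prompting": the task is described entirely in text,
# with no additional model training. Assumes the OpenAI Python client (v1+)
# and an OPENAI_API_KEY in the environment; the model name is illustrative.
from openai import OpenAI

client = OpenAI()

prompt = (
    "You are a data analyst. Summarize the key trend in this weekly sales data "
    "and flag any anomaly:\n"
    "Week 1: 120, Week 2: 135, Week 3: 128, Week 4: 310"
)

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # illustrative model choice
    messages=[{"role": "user", "content": prompt}],
    temperature=0,  # keep the answer deterministic for analysis tasks
)

print(response.choices[0].message.content)
```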

In this article, I will share the principles of prompting, techniques for building prompts, and the roles Data Analysts can play in this "prompting era".

What is prompt engineering?

Quoting Ben Lorica from Gradient Flow, “prompt engineering is the art of crafting effective input prompts to elicit the desired output from foundation models.” It’s the iterative process of developing prompts that can effectively leverage the capabilities of existing generative AI models to accomplish specific objectives.
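As a rough illustration of that iterative process (the wording below is my own, not from the article), a first-pass prompt might only state the task, while a refined version adds a role, context, and an expected output format:

```python
# Illustrative only: one iteration of refining a prompt.

# First attempt: vague, so the audience and output format are left to the model.
draft_prompt = "Analyze this churn data."

# Refined: adds a role, the objective, and the expected output structure.
refined_prompt = (
    "You are a data analyst reporting to a marketing lead. "
    "Given the monthly churn rates below, identify the two months with the "
    "largest increases and suggest one hypothesis for each, as a bulleted list.\n"
    "Jan: 2.1%, Feb: 2.3%, Mar: 4.0%, Apr: 3.9%, May: 5.2%, Jun: 5.1%"
)
```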

Written by Olivia Tanuwidjaja

Analytics geek🤓, playing with data and beyond 🚀. Views are my own.
