Generative AI Glossary

Renswick Delver
3 min read · Jun 24, 2024


This blog contains a collection of GenAI buzzwords and their brief descriptions.

A — E

  1. Auto-encoders: Neural networks used for learning efficient representations of data.
  2. Computer Vision: AI’s ability to interpret and understand visual information.
  3. Ethical AI: Considerations and practices to ensure AI systems are used responsibly and ethically.
  4. Embeddings: In generative AI, embeddings transform text into numerical vectors that capture meaning, enabling machines to compare and process language — for example, word embeddings used to improve search relevance in recommendation systems.
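To make the embeddings entry concrete, here is a toy sketch. The 4-dimensional vectors below are invented for illustration (real embedding models produce vectors with hundreds of dimensions); they are compared with cosine similarity, the standard relatedness measure:

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 means same direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hand-made toy vectors, invented purely for illustration.
embeddings = {
    "king":  [0.9, 0.8, 0.1, 0.2],
    "queen": [0.9, 0.7, 0.2, 0.2],
    "apple": [0.1, 0.2, 0.9, 0.8],
}

print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # close to 1: related words
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # much lower: unrelated words
```

Related words end up with nearby vectors, which is exactly what semantic search and vector databases (covered below) exploit.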

F — J

  1. Fine-tuning: Fine-tuning involves training a pre-trained model on a specific dataset to improve its performance on particular tasks, like customizing GPT-3 for legal document drafting.
  2. GPT (Generative Pre-trained Transformer): A family of powerful language models capable of generating coherent text, such as GPT-4 writing creative content or summarizing articles.
  3. Hallucinations: In generative AI, hallucinations refer to instances where models generate plausible but incorrect or nonsensical information, like ChatGPT inventing fictitious historical events.
  4. Hugging Face: Hugging Face is a company and open-source community known for its wide variety of libraries, models, and tools, including the popular Transformers library used for NLP, audio, and computer vision.
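The fine-tuning entry above can be illustrated with a deliberately tiny stand-in: instead of a real pre-trained transformer, a one-parameter linear model whose "pre-trained" weights get a few gradient-descent steps on a small task-specific dataset. All numbers here are invented for illustration — real fine-tuning updates millions of parameters, but the principle of starting from learned weights rather than from scratch is the same:

```python
# "Pre-trained" weights, assumed learned elsewhere on a large dataset.
pretrained_w, pretrained_b = 2.0, 0.0

def predict(w, b, x):
    return w * x + b

def fine_tune(w, b, data, lr=0.05, epochs=500):
    """Stochastic gradient descent on squared error, starting from the
    pre-trained weights instead of random initialization."""
    for _ in range(epochs):
        for x, y in data:
            err = predict(w, b, x) - y
            w -= lr * err * x   # gradient of 0.5 * err**2 w.r.t. w
            b -= lr * err       # gradient of 0.5 * err**2 w.r.t. b
    return w, b

# Task-specific data follows y = 2x + 1; the pre-trained slope is already
# right, so fine-tuning mostly just learns the new offset.
task_data = [(1.0, 3.0), (2.0, 5.0), (3.0, 7.0)]
w, b = fine_tune(pretrained_w, pretrained_b, task_data)
print(round(w, 2), round(b, 2))  # approaches w = 2.0, b = 1.0
```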

L — P

  1. Large Language Model (LLM): A Large Language Model like GPT-4 can generate diverse and complex text outputs, ranging from technical documentation to creative writing. Examples include the GPT series (GPT-4o, GPT-4, GPT-3.5 Turbo, etc.), Gemini, and Llama.
  2. Natural Language Processing (NLP): AI’s ability to understand and generate human language.
  3. Open Source LLM: Open source LLMs are publicly available large language models that developers can customize and deploy for various applications, such as GPT-2 and Llama (both distributed through Hugging Face). See Hugging Face above for more context.
  4. Prompt Engineering: Prompt Engineering involves crafting effective prompts to guide generative AI models, like designing questions to get concise and relevant answers from ChatGPT.
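Prompt engineering often comes down to assembling a prompt from named parts — role, task, constraints. The sketch below shows one common hand-rolled pattern for doing that; the function name and structure are this post's own invention, not an official API of any model provider:

```python
def build_prompt(role, task, constraints=(), examples=()):
    """Assemble a structured prompt string from named parts."""
    parts = [f"You are {role}.", f"Task: {task}"]
    if constraints:
        parts.append("Constraints:")
        parts.extend(f"- {c}" for c in constraints)
    if examples:
        parts.append("Examples:")
        parts.extend(examples)
    return "\n".join(parts)

prompt = build_prompt(
    role="a customer-support assistant",
    task="Summarize the customer's issue in one sentence.",
    constraints=["Be concise.", "Do not invent details."],
)
print(prompt)
```

Keeping the pieces separate like this makes it easy to iterate on one part of the prompt (say, the constraints) without rewriting the rest.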

— Few-Shot Learning: Few-shot learning allows models to generalize from a few examples, enabling LLMs to adapt to new tasks with minimal data — for example, giving ChatGPT just a couple of sample sentences so it can generate a product description for a new gadget. (One-shot/zero-shot: including one example, or none, in the prompt.)
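A few-shot prompt is just labeled examples formatted ahead of the new input, so the model can infer the task pattern from them. A minimal sketch (the sentiment task and example texts are invented for illustration):

```python
def few_shot_prompt(examples, query):
    """Format (text, label) examples ahead of a new input. An empty
    examples list turns this into a zero-shot prompt."""
    blocks = [f"Review: {text}\nSentiment: {label}" for text, label in examples]
    blocks.append(f"Review: {query}\nSentiment:")
    return "\n\n".join(blocks)

examples = [
    ("Great battery life and a sharp screen.", "positive"),
    ("Stopped working after two days.", "negative"),
]
print(few_shot_prompt(examples, "Surprisingly sturdy for the price."))
```

The prompt ends right after "Sentiment:", inviting the model to complete the pattern with a label.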

Q — Z

  1. Reinforcement Learning: Learning through trial and error, rewarding good outcomes.
  2. Retrieval Augmented Generation (RAG): RAG combines retrieval mechanisms with generative models to provide accurate information, such as retrieving facts to enhance chatbot responses. For example, a customer service chatbot using RAG to pull up recent order information to answer customer queries.
  3. Semantic Search: Semantic Search uses AI to understand the meaning behind queries, improving search relevance even if the exact words aren’t used.
  4. Tokenization: Tokenization is the process of breaking text into smaller units (tokens) for processing by AI models, such as splitting sentences into words or characters.
  5. Transformers: Transformers are the foundational architecture behind many generative models, including GPT-3, enabling efficient processing of sequential data for tasks like translation and text generation.
  6. Transfer Learning: Transfer Learning applies knowledge from one domain to another. For example, transferring the knowledge from a language model trained on Wikipedia to improve its performance in drafting medical reports.
  7. Vector Database: A Vector Database stores embeddings to facilitate fast and accurate search and retrieval, such as improving the speed of semantic searches in large document repositories — e.g., Pinecone, ChromaDB.
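Several of the entries above — tokenization, semantic search, RAG, and vector databases — fit together in one retrieval pipeline. The sketch below uses a naive whitespace tokenizer and a bag-of-words vector as a crude stand-in for a learned embedding model (real systems use subword tokenizers and neural embeddings, and a vector database instead of an in-memory list):

```python
import math

def tokenize(text):
    """Naive whitespace tokenizer; production models use subword tokenizers."""
    return text.lower().split()

def embed(tokens, vocab):
    """Toy bag-of-words vector over a fixed vocabulary, length-normalized.
    A learned embedding model captures meaning far better than word counts."""
    vec = [float(tokens.count(word)) for word in vocab]
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

def retrieve(query, docs, k=1):
    """Rank documents by cosine similarity to the query vector -- the kind
    of lookup a vector database performs before a RAG system answers."""
    vocab = sorted({w for d in docs for w in tokenize(d)} | set(tokenize(query)))
    q = embed(tokenize(query), vocab)
    scored = sorted(
        docs,
        key=lambda d: sum(a * b for a, b in zip(q, embed(tokenize(d), vocab))),
        reverse=True,
    )
    return scored[:k]

docs = [
    "order 123 shipped on monday",
    "our refund policy lasts 30 days",
    "the cafe opens at 8am",
]
print(retrieve("what is the refund policy", docs))  # the refund document ranks first
```

In a full RAG setup, the retrieved passage would then be inserted into the prompt so the LLM can ground its answer in it.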

