Go to Course: https://www.coursera.org/learn/intro-generative-ai-course-snowflake
How to build applications to implement common AI tasks such as summarization, translation, sentiment analysis, and text classification
How to select a foundation model, including how to decide between the smaller or larger model sizes within a model family
How to perform prompt engineering and inference programmatically with foundation model families including Llama, Mistral, and Anthropic (a short SQL sketch follows these objectives)
How to fine-tune a foundation model to distill the capability of a larger model into a smaller one, or to train a model to respond in a desired style
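To give a feel for what this looks like in practice, here is a minimal sketch of sending the same prompt to a smaller and a larger model within one family through the Cortex COMPLETE function; the model identifiers are illustrative and should be checked against the models currently available in your Snowflake region.

    -- Minimal sketch: the same prompt sent to a smaller and a larger model in one family.
    -- Model identifiers are illustrative; check current Cortex model availability.
    SELECT SNOWFLAKE.CORTEX.COMPLETE(
        'llama3.1-8b',
        'In one sentence, explain why a support call about a broken zipper should be routed to the returns team.'
    ) AS smaller_model_response;

    SELECT SNOWFLAKE.CORTEX.COMPLETE(
        'llama3.1-70b',
        'In one sentence, explain why a support call about a broken zipper should be routed to the returns team.'
    ) AS larger_model_response;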
Introduction to GenAI on Snowflake
In this module you will be introduced to important generative AI concepts and the Snowflake capabilities used to implement them. You will set up your environment to get started implementing AI use cases with Snowflake. You will also build a simple AI application to analyze unstructured text data from call transcripts, including: loading a dataset from an AWS S3 bucket into a Snowflake table; prompting foundation models to summarize transcripts in JSON format; and building a Streamlit UI for the application.
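A minimal sketch of the pieces this first application touches appears below; the stage, table, and column names are hypothetical placeholders, and the model identifier is illustrative.

    -- Hedged sketch of the Module 1 flow: load call transcripts from S3,
    -- then prompt a foundation model to summarize each one as JSON.
    -- Stage, table, and column names are hypothetical; the model ID is illustrative.
    CREATE OR REPLACE STAGE call_transcripts_stage
      URL = 's3://<your-bucket>/call-transcripts/'
      FILE_FORMAT = (TYPE = CSV FIELD_OPTIONALLY_ENCLOSED_BY = '"' SKIP_HEADER = 1);

    CREATE OR REPLACE TABLE call_transcripts (transcript STRING);

    COPY INTO call_transcripts FROM @call_transcripts_stage;

    SELECT SNOWFLAKE.CORTEX.COMPLETE(
        'mistral-large',
        CONCAT('Summarize this call transcript as JSON with keys ',
               '"product", "issue", and "resolution": ', transcript)
    ) AS summary_json
    FROM call_transcripts;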
Snowflake Cortex’s LLM-Based Functions
In this module you will learn how to use the Cortex LLM functions to accomplish many AI tasks, including: how to implement common generative AI use cases such as summarization, translation, sentiment analysis, and text classification with the Cortex task-specific functions; how to implement other generative AI use cases using the Llama, Mistral, and Anthropic families of models with prompt engineering and the Cortex COMPLETE function; how to select an LLM for your use case, including when to choose the larger or smaller models within a model family; how to use the Cortex helper functions to estimate token count and cost; and how to test your LLM calls for potential errors without incurring the associated inference cost.
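The calls covered in this module might look roughly like the sketch below; the sample review text, category labels, and model identifier are made up, and the exact argument forms should be confirmed against the Cortex documentation.

    -- Hedged sketch of the Cortex task-specific and helper functions.
    -- The review text, labels, and model identifier are illustrative.
    SET review = 'The product arrived broken and support never replied to my emails.';

    SELECT SNOWFLAKE.CORTEX.SUMMARIZE($review)                    AS summary,
           SNOWFLAKE.CORTEX.TRANSLATE($review, 'en', 'fr')        AS french_translation,
           SNOWFLAKE.CORTEX.SENTIMENT($review)                    AS sentiment_score,  -- roughly -1 (negative) to 1 (positive)
           SNOWFLAKE.CORTEX.CLASSIFY_TEXT($review,
               ['shipping damage', 'billing', 'feature request']) AS category;

    -- Helper functions: estimate tokens before paying for inference, and
    -- TRY_COMPLETE returns NULL instead of raising an error when a call fails.
    SELECT SNOWFLAKE.CORTEX.COUNT_TOKENS('mistral-large', $review) AS prompt_tokens,
           SNOWFLAKE.CORTEX.TRY_COMPLETE('mistral-large',
               CONCAT('Draft a short apology email for this complaint: ', $review)) AS draft_or_null;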
Customize LLM responses with Cortex Fine-Tuning
In this module you will learn how to fine-tune an LLM to improve performance for your use case. You will understand how to distill the capabilities of a large model into a smaller one. You will also learn: how Parameter-Efficient Fine-Tuning (PEFT) can lower training data requirements and reduce cost; how to generate training data and split it into training and evaluation datasets; how to fine-tune a foundation model, Mistral-7b, to respond in a specific style using the Cortex FINETUNE function and the no-code Snowflake AI/ML Studio; how to test your fine-tuned model using the Cortex COMPLETE function; and how to build and share a simple AI application in Python for your fine-tuned model using Streamlit.
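A rough shape of that fine-tuning workflow in SQL is sketched below, assuming training and evaluation splits that each expose prompt and completion columns; the table names, the fine-tuned model name, and the job-status step are hypothetical placeholders, so check the FINETUNE documentation for the exact argument forms.

    -- Hedged sketch of Module 3: fine-tune mistral-7b on prompt/completion pairs,
    -- then call the tuned model with COMPLETE. Table and model names are hypothetical.
    SELECT SNOWFLAKE.CORTEX.FINETUNE(
        'CREATE',
        'support_tone_mistral_7b',                               -- name for the fine-tuned model
        'mistral-7b',                                            -- base model to fine-tune
        'SELECT prompt, completion FROM support_training_data',  -- training split
        'SELECT prompt, completion FROM support_eval_data'       -- evaluation split
    );

    -- Check the job with the ID returned by the CREATE call (placeholder shown here).
    SELECT SNOWFLAKE.CORTEX.FINETUNE('DESCRIBE', '<job_id>');

    -- Once the job succeeds, the tuned model is called like any other model.
    SELECT SNOWFLAKE.CORTEX.COMPLETE(
        'support_tone_mistral_7b',
        'Reply in our support voice to: "My order never arrived and I need it by Friday."'
    ) AS tuned_response;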
This course introduces learners to generative AI and how to implement common AI use cases using Snowflake. The course starts with an introduction to important concepts, the setup of the learner environment, and the building of a simple application. It then covers how to use the Cortex LLM functions to accomplish many common AI tasks, and ends with fine-tuning foundation models to perform specific tasks. This course is for anyone looking to skill up on AI.