Create a text generator in PyTorch from scratch | Free Generative AI Course
Free Generative AI Udemy Course
Description
In this course, the primary objective is to build a text generator from scratch using next-token prediction. To accomplish this, we will use the open-source BookCorpus dataset. By the end of the course, we will have a solid understanding of how to build a text generator and how to implement the components needed to train a model and generate text.
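As a rough illustration of what next-token prediction means in practice, the sketch below shifts a token sequence by one position so the model learns to predict each token from the tokens that precede it; the token ids are made up for the example.

```python
import torch

# A minimal sketch of next-token prediction targets: for a sequence of
# token ids, the model sees tokens[:-1] and is asked to predict tokens[1:].
tokens = torch.tensor([12, 7, 93, 4, 56])  # hypothetical token ids

inputs = tokens[:-1]   # [12, 7, 93, 4]
targets = tokens[1:]   # [7, 93, 4, 56]

# During training, the model outputs a distribution over the vocabulary at
# each position, and the loss compares that distribution against `targets`.
print(inputs, targets)
```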
One of the first things we will learn is how to load data for our model. We will explore various techniques for batching data and discuss why certain batching methods work better than others, as sketched below. We will also cover how to preprocess and clean the data to ensure that it is suitable for training our model.
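One simple batching strategy, sketched below under the assumption of a pre-tokenized corpus, is to slice the token stream into fixed-length blocks so every batch has the same shape; the `BlockDataset` class and the block size of 128 are illustrative choices, not values taken from the course.

```python
import torch
from torch.utils.data import Dataset, DataLoader

class BlockDataset(Dataset):
    """Slices a long stream of token ids into fixed-length training blocks."""

    def __init__(self, token_ids, block_size=128):
        self.token_ids = token_ids
        self.block_size = block_size

    def __len__(self):
        # Each item needs block_size inputs plus one extra token for the targets.
        return (len(self.token_ids) - 1) // self.block_size

    def __getitem__(self, idx):
        start = idx * self.block_size
        chunk = self.token_ids[start : start + self.block_size + 1]
        x = torch.tensor(chunk[:-1], dtype=torch.long)  # inputs
        y = torch.tensor(chunk[1:], dtype=torch.long)   # next-token targets
        return x, y

# Hypothetical pre-tokenized corpus; in the course this would come from BookCorpus.
token_ids = list(range(10_000))
loader = DataLoader(BlockDataset(token_ids), batch_size=32, shuffle=True)
```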
After loading and preprocessing the data, we will delve into the process of training a model. We will learn about the architecture of a typical text generation model and the different types of layers that can be used. We will also cover topics such as loss functions and optimization algorithms and explore the impact that these have on our model's performance.
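To make these training ingredients concrete, here is a minimal sketch of a small next-token model and a single training step; the LSTM-based `TinyLM` architecture, the AdamW optimizer, and the hyperparameters are placeholder choices rather than the exact setup used in the course.

```python
import torch
import torch.nn as nn

class TinyLM(nn.Module):
    """A tiny language model: embedding -> LSTM -> vocabulary logits."""

    def __init__(self, vocab_size=10_000, embed_dim=128, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.rnn = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, vocab_size)

    def forward(self, x):
        out, _ = self.rnn(self.embed(x))
        return self.head(out)  # (batch, seq_len, vocab_size)

model = TinyLM()
criterion = nn.CrossEntropyLoss()                       # standard loss for next-token prediction
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-4)

x = torch.randint(0, 10_000, (8, 64))                   # dummy input batch
y = torch.randint(0, 10_000, (8, 64))                   # dummy next-token targets

logits = model(x)
loss = criterion(logits.reshape(-1, logits.size(-1)), y.reshape(-1))
optimizer.zero_grad()
loss.backward()
optimizer.step()
```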
Once we have trained our model, we will move on to generating text using our newly trained text generator. We will explore various approaches for generating text, such as random sampling, greedy decoding, and beam search. We will also discuss how to tune the hyperparameters of our model to achieve better results.
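The sketch below illustrates two of these decoding strategies, greedy decoding and random sampling with a temperature, assuming a model that maps a batch of token ids to per-position logits like the earlier `TinyLM` sketch; beam search is left out for brevity.

```python
import torch

@torch.no_grad()
def generate(model, prompt_ids, max_new_tokens=50, greedy=True, temperature=1.0):
    """Autoregressively extend prompt_ids (shape: 1 x seq_len) one token at a time."""
    ids = prompt_ids.clone()
    for _ in range(max_new_tokens):
        logits = model(ids)[:, -1, :]                        # logits for the last position
        if greedy:
            next_id = logits.argmax(dim=-1, keepdim=True)    # greedy: most likely token
        else:
            probs = torch.softmax(logits / temperature, dim=-1)
            next_id = torch.multinomial(probs, num_samples=1)  # random sampling
        ids = torch.cat([ids, next_id], dim=1)
    return ids

# Example call: generate(model, torch.tensor([[1, 2, 3]]), greedy=False, temperature=0.8)
```

Greedy decoding always picks the single most likely next token, while sampling draws from the predicted distribution; a temperature below 1.0 sharpens that distribution and a temperature above 1.0 flattens it.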
Finally, we will create a small app that can run in the browser to showcase our text generator. We will discuss various front-end frameworks such as React and Vue.js and explore how to integrate our model into a web application.
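One way such an app could talk to the model, sketched here purely as an assumption since the course may serve the model differently, is a small HTTP endpoint that the React or Vue.js front end calls; `run_generator` is a hypothetical helper wrapping tokenization, the trained model, and the decoding sketched earlier.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/generate", methods=["POST"])
def generate_endpoint():
    # The browser front end sends a JSON body such as {"prompt": "Once upon a time"}.
    prompt = request.get_json().get("prompt", "")
    # run_generator is a hypothetical helper combining tokenization,
    # the trained model, and a decoding strategy.
    text = run_generator(prompt)
    return jsonify({"text": text})

if __name__ == "__main__":
    app.run(port=8000)
```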
Overall, this course will provide us with a comprehensive understanding of how to build a text generator from scratch and the tools and techniques required to accomplish this task.