Title | Hyperparameter Optimization in Machine Learning
Subtitle | Make Your Machine Learning and Deep Learning Models More Efficient
Author | Tanay Agrawal
Video | http://file.papertrans.cn/431/430671/430671.mp4
Overview | Covers state-of-the-art techniques for hyperparameter tuning. Covers implementation of advanced Bayesian optimization techniques, from machine learning algorithms to complex deep learning frameworks. Expla
Description | Dive into hyperparameter tuning of machine learning models and focus on what hyperparameters are and how they work. This book discusses different techniques of hyperparameter tuning, from the basics to advanced methods. It is a step-by-step guide to hyperparameter optimization, starting with what hyperparameters are and how they affect different aspects of machine learning models. It then goes through some basic (brute-force) algorithms of hyperparameter optimization. Further, the author addresses the problem of time and memory constraints using distributed optimization methods. Next, the book discusses Bayesian optimization for hyperparameter search, which learns from its previous history. The book covers frameworks such as Hyperopt and Optuna, which implement sequential model-based global optimization (SMBO) algorithms, focusing on aspects such as the creation of search spaces and distributed optimization with these libraries. Hyperparameter Optimization in Machine Learning creates an understanding of how these algorithms work and how you can use them in real-life data science problems. The final chapter summarizes the role of
Publication date | Book 2021
Keywords | Artificial Intelligence; Machine Learning; Python; Hyper Parameter Optimization; Hyperparameter Tuning; Se
Edition | 1
DOI | https://doi.org/10.1007/978-1-4842-6579-6
ISBN (softcover) | 978-1-4842-6578-9
ISBN (ebook) | 978-1-4842-6579-6
Copyright | Tanay Agrawal 2021
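As a minimal sketch of the SMBO-style hyperparameter search the description refers to, the following Optuna snippet defines a small search space and lets the optimizer learn from previous trials. The objective function and the hyperparameter ranges (learning_rate, n_layers) are illustrative assumptions, not examples taken from the book; in practice the objective would train a model and return a validation score.

# Illustrative sketch only: toy objective stands in for real model training.
import optuna

def objective(trial):
    # Define a search space by sampling hyperparameters from the trial object.
    learning_rate = trial.suggest_float("learning_rate", 1e-4, 1e-1, log=True)
    n_layers = trial.suggest_int("n_layers", 1, 5)
    # Stand-in for a validation metric; replace with actual training/evaluation.
    return (learning_rate - 0.01) ** 2 + (n_layers - 3) ** 2

study = optuna.create_study(direction="minimize")
study.optimize(objective, n_trials=50)  # each trial is informed by earlier results
print(study.best_params)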