This book offers a pioneering exploration of classification-based derivative-free optimization (DFO), providing researchers and professionals in artificial intelligence, machine learning, AutoML, and optimization with a robust framework for addressing complex, large-scale problems where gradients are unavailable. By bridging theoretical foundations with practical implementations, it fills critical gaps in the field, making it an indispensable resource for both academic and industrial audiences.

The book introduces innovative frameworks such as sampling-and-classification (SAC) and sampling-and-learning (SAL), which underpin cutting-edge algorithms like Racos and SRacos. These methods are designed to excel in challenging optimization scenarios, including high-dimensional search spaces, noisy environments, and parallel computing. A dedicated section on the ZOOpt toolbox provides practical tools for implementing these algorithms effectively. The book’s structure moves from foundational principles and algorithmic development to advanced topics and real-world applications, such as hyperparameter tuning, neural architecture search, and algorithm selection in AutoML.
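To give a flavor of the sampling-and-classification idea, here is a minimal, self-contained sketch: in each iteration, the best-evaluated solutions are labeled positive, a crude "classifier" (here simply the axis-aligned bounding box of the positives) is learned, and the next batch is sampled mostly from that learned region. This is only an illustrative toy under our own simplifying assumptions, not the book's Racos or SRacos algorithm.

```python
import random

def sac_minimize(f, bounds, budget=500, sample_size=20, positive_size=4):
    """Toy sampling-and-classification loop for minimizing f over a box.

    bounds: list of (low, high) pairs, one per dimension.
    Returns the best (solution, value) pair found within the budget.
    """
    dim = len(bounds)

    def sample(region):
        # Draw one solution uniformly from an axis-aligned region.
        return [random.uniform(lo, hi) for lo, hi in region]

    # Initial batch: uniform sampling over the whole search space.
    evaluated = [(x, f(x)) for x in (sample(bounds) for _ in range(sample_size))]
    spent = sample_size
    best = min(evaluated, key=lambda p: p[1])

    while spent < budget:
        # Classification step: label the best few samples as positive.
        evaluated.sort(key=lambda p: p[1])
        positives = [x for x, _ in evaluated[:positive_size]]
        # "Learn" a classifier: the bounding box of the positive samples.
        region = [(min(p[d] for p in positives), max(p[d] for p in positives))
                  for d in range(dim)]
        # Sampling step: mostly exploit the learned region,
        # with occasional global exploration.
        batch = []
        for _ in range(sample_size):
            x = sample(region) if random.random() < 0.9 else sample(bounds)
            batch.append((x, f(x)))
            spent += 1
        evaluated = evaluated[:positive_size] + batch
        best = min(batch + [best], key=lambda p: p[1])

    return best
```

For instance, `sac_minimize(lambda x: sum(t * t for t in x), [(-1.0, 1.0)] * 2, budget=2000)` quickly concentrates sampling near the origin. The algorithms in the book replace the bounding-box heuristic with principled classification models and come with theoretical query-complexity guarantees.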

Readers will benefit from a comprehensive yet concise presentation of modern DFO methods, gaining theoretical insights and practical tools to enhance their research and problem-solving capabilities. A foundational understanding of machine learning, probability theory, and algorithms is recommended for readers to fully engage with the material.


Introduction.- Preliminaries.- Framework.- Theoretical Foundation.- Basic Algorithm.- Optimization in Sequential Mode.- Optimization in High-Dimensional Search Space.- Optimization under Noise.- Optimization with Parallel Computing.

- Provides a comprehensive guide to derivative-free optimization, covering high-dimensional, noisy, and parallel computing settings
- Presents a novel classification-based optimization framework for efficient hyperparameter tuning
- Covers the practical ZOOpt toolbox for easy implementation of automated machine learning optimization

Product details

ISBN
9789819659289
Published
2025-08-05
Publisher
Springer Nature Switzerland AG
Height
235 mm
Width
155 mm
Audience level
Research, P, UP, 06, 05
Language
English
Format
Hardcover

Biographical note

Yang Yu is a professor at Nanjing University, specializing in artificial intelligence, machine learning, and optimization. His research focuses on derivative-free optimization, AutoML, and reinforcement learning. Prof. Yu has an extensive publication record in leading journals and conferences, including Artificial Intelligence, IEEE Transactions on Pattern Analysis and Machine Intelligence, ICML, NeurIPS, IJCAI, and AAAI. He is a co-author of the book Evolutionary Learning: Advances in Theories and Algorithms (Springer, 2019). His work has introduced foundational frameworks and algorithms in classification-based optimization, notably Racos and SRacos, and contributed to the development of the optimization toolbox ZOOpt, widely utilized in academic and industrial research.

Hong Qian is an associate professor at East China Normal University, with expertise in optimization algorithms, machine learning, and computational intelligence. His research focuses on developing scalable derivative-free optimization techniques with theoretical guarantees for high-dimensional problems, as well as large language models for optimization. Dr. Qian has published extensively in prominent venues such as ICML, NeurIPS, AAAI, and IEEE Transactions on Evolutionary Computation, and has contributed to advancements in sampling-and-classification frameworks and their applications in machine learning and optimization tasks.

Yi-Qi Hu is an AI technical expert at Huawei Co., Ltd., with expertise in machine learning, optimization algorithms, and on-device large language models. His work focuses on developing machine learning systems that leverage derivative-free optimization techniques. Dr. Hu has published extensively in prominent venues such as AAAI and IJCAI and has contributed to advancements in derivative-free optimization-based AutoML systems.