Automated Machine Learning. Current State and Development Prospects

Author(s)

DOI:

https://doi.org/10.15407/intechsys.2025.02.003

Keywords:

automated machine learning, democratization of artificial intelligence, data-driven artificial intelligence, deep reinforcement learning, transfer learning

Abstract

Automated machine learning is examined as an AI-based answer to the need to automate the end-to-end process of applying machine learning, i.e. the design of machine learning pipelines: sequences of steps that transform raw data into a trained model suitable for deployment in practice. Human involvement in this loop should be substantially reduced or, ideally, eliminated altogether. Directions and trends for the further development of artificial intelligence and automated machine learning are also discussed.
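The pipeline-design problem described in the abstract can be sketched, for illustration only, as an automated search over a fixed sequence of preprocessing and modeling steps. The sketch below uses scikit-learn's `Pipeline` with random search over hyperparameters (in the spirit of Bergstra & Bengio, 2012, cited in the references); the dataset, step choices, and search space are arbitrary assumptions for the example, not the method of this paper.

```python
# Minimal AutoML-style sketch: raw data -> scaling -> dimensionality
# reduction -> classifier, with hyperparameters chosen automatically
# by randomized search rather than by a human in the loop.
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import RandomizedSearchCV
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_iris(return_X_y=True)

# The pipeline: each step transforms the data produced by the previous one.
pipe = Pipeline([
    ("scale", StandardScaler()),
    ("pca", PCA()),
    ("clf", LogisticRegression(max_iter=1000)),
])

# Hyperparameter search space over the pipeline steps (illustrative values).
param_space = {
    "pca__n_components": [2, 3, 4],
    "clf__C": [0.01, 0.1, 1.0, 10.0],
}

# Random search with cross-validation selects a configuration automatically.
search = RandomizedSearchCV(pipe, param_space, n_iter=8, cv=3, random_state=0)
search.fit(X, y)
print(search.best_params_, search.best_score_)
```

Full AutoML systems such as Auto-WEKA or TPOT (also cited in the references) additionally search over which steps and algorithms to include, not just their hyperparameters.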

References

Oursatyev O., Data Research in Industrial Data Mining Projects in the Big Data Generation Era. Control Systems and Computers, 2023, Issue 3, 33–54. [In Ukrainian: Дослідження даних у промислових data-mining-проєктах в епоху генерації великих даних. ] https://doi.org/10.15407/csc.2023.03.033

Elliott T. What Is Artificial Intelligence Called? – Innovation Evangelism. URL: https://timoelliott.com/blog/2017/06/what-is-artificial-intelligence-called.html [Accessed: 11 June 2019].

Schlesinger M., Hlavac V. Ten Lectures on Statistical and Structural Pattern Recognition. Computational Imaging and Vision. Kluwer Academic Publishers, Dordrecht, Boston London, 2002, Vol. 24, 520 p. https://doi.org/10.1007/978-94-017-3217-8

GitHub. Awesome-AutoML-Papers. What is AutoML? URL: https://github.com/hibayesian/awesome-automl-papers [Accessed: 3 March 2024]

Guyon I., et al. Design of the 2015 ChaLearn AutoML Challenge. URL: http://haralick.org/ML/automl_ijcnn15.pdf [Accessed: 01 Apr. 2024]

Guyon I. et al., Analysis of the AutoML Challenge Series 2015–2018. URL: https://link.springer.com/chapter/10.1007/978-3-030-05318-5_10 [Accessed: 01 Apr. 2024]

Domhan T., Springenberg J.T., Hutter F. Speeding Up Automatic Hyperparameter Optimization of Deep Neural Networks by Extrapolation of Learning Curves. 24th International Conference on Artificial Intelligence IJCAI 2015, Buenos Aires, Argentina, 3460–3468.

Bengio, Y. Gradient-Based Optimization of Hyperparameters. Neural Computation, 2000, Vol. 12 (8), 1889–1900. https://doi.org/10.1162/089976600300015187

Bergstra, J., Bengio, Y., Random Search for Hyper-Parameter Optimization. Journal of Machine Learning Research, 2012, Vol. 13, 281–305. URL: http://www.jmlr.org/papers/volume13/bergstra12a/bergstra12a.pdf

Budjac R., Nikmon M., Schreiber P., Zahradníková B., Janáčová, D. Automated machine learning overview. Research Papers Faculty of Materials Science and Technology Slovak University of Technology, 2019, Vol. 27 (45), 107–112. https://doi.org/10.2478/rput-2019-0033

Snoek J., Larochelle H., Adams R.P. Practical Bayesian Optimization of Machine Learning Algorithms. arXiv preprint arXiv:1206.2944, 2012, 1–9. https://doi.org/10.48550/arXiv.1206.2944

Bengio Y., Ducharme R., Vincent P., Janvin C. A Neural Probabilistic Language Model. Journal of Machine Learning Research, 2003, Vol. 3, 1137–1155. URL: https://dl.acm.org/doi/10.5555/944919.944966

Hutter F., et al. Sequential Model-Based Optimization for General Algorithm Configuration. Learning and intelligent optimization: 5th international conference, LION 5, Rome, Italy, 2011, 1–15. URL: https://ml.informatik.uni-freiburg.de/papers/11-LION5SMAC.pdf

Olson R., Moore J. TPOT: A Tree-based Pipeline Optimization Tool for Automating Machine Learning. JMLR: Workshop and Conference Proceedings AutoML Workshop (ICML2016), 2016, 66–74. URL: http://proceedings.mlr.press/v64/olson_tpot_2016.pdf

Li Liam. Towards Efficient Automated Machine Learning. 2020, 184 p. URL: https://www.ml.cmu.edu/research/phd-dissertation-pdfs/thesis_li_liam.pdf

Thornton C., Hutter F. et al. Auto-WEKA: Combined Selection and Hyperparameter Optimization of Classification Algorithms. 19th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (KDD '13), 2013, 847–855. https://doi.org/10.1145/2487575.2487629

Feurer M. et al. Practical Automated Machine Learning for the AutoML Challenge 2018. ICML 2018 AutoML Workshop. URL: https://ml.informatik.uni-freiburg.de/papers/18-AUTOML-AutoChallenge.pdf

Bergstra J., et al. Algorithms for Hyper-Parameter Optimization. Part of Advances in Neural Information Processing Systems (NIPS 2011), 2011, Vol. 24, 1-9. URL: https://papers.nips.cc/paper_files/paper/2011/file/86e8f7ab32cfd12577bc2619bc635690-Paper.pdf

Sun L. et al. Automatic Neural Network Search Method for Open Set Recognition. 2019 IEEE International Conference on Image Processing (ICIP), 2019. https://doi.org/10.1109/ICIP.2019.8803605

Jamieson K., Talwalkar A. Non-stochastic Best Arm Identification and Hyperparameter Optimization, Feb 2015, 1–13. URL: https://arxiv.org/pdf/1502.07943.pdf

Li L., Jamieson K., Rostamizadeh A., Talwalkar A. Hyperband: A Novel Bandit-Based Approach to Hyperparameter Optimization. Journal of Machine Learning Research, 2018, Vol. 18, 1‒52. URL: https://arxiv.org/pdf/1603.06560.pdf

Li L., Jamieson K., Rostamizadeh A., Gonina E., et al. A System for Massively Parallel Hyperparameter Tuning. 3-rd MLSys Conference, Austin, TX, USA, 2020, 1–17. URL: https://arxiv.org/pdf/1810.05934.pdf

Jin Y. et al. Multi-Objective Machine Learning. Berlin Heidelberg: Springer, 2006, Vol. 16, 657 p. URL: https://link.springer.com/content/pdf/bfm%3A978-3-540-33019-6%2F1.pdf

Jin Y., Sendhoff B. Pareto-Based Multiobjective Machine Learning: An Overview and Case Studies. IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, 2008, Vol. 38 (3), 397–415. https://doi.org/10.1109/TSMCC.2008.919172

Parmentier L., Nicol O., Jourdan L., Kessaci M. TPOT-SH: A Faster Optimization Algorithm to Solve the AutoML Problem on Large Datasets. IEEE 31st International Conference on Tools with Artificial Intelligence (ICTAI), 2019, 471–478. https://doi.org/10.1109/ICTAI.2019.00072

Oliveira M. 3 Reasons Why AutoML Won’t Replace Data Scientists Yet. 2019. URL: https://www.kdnuggets.com/2019/03/why-automl-wont-replace-data-scientists.html

Adopting a data science process framework. URL: https://microsoft.github.io/azureml-ops-accelerator/1-MLOpsFoundation/2-SkillsRolesAndResponsibilities/1-AdoptingDSProcess.html [Accessed 6 Jun. 2025]

Berthold M. Principles of Guided Analytics. 2018. URL: https://www.knime.com/blog/principles-of-guided-analytics

Tamagnini P., Schmid S., Dietz C. How to Automate Machine Learning. 2019. URL: https://www.knime.com/blog/how-to-automate-machine-learning

LeDell E. The different flavors of AutoML. 2018. URL: https://www.h2o.ai/blog/the-different-flavors-of-automl/

Krizhevsky A., Sutskever I., Hinton G. Imagenet classification with deep convolutional neural networks. NIPS, 2012, 1097–1105.

Piatetsky G. Main 2021 Developments and Key 2022 Trends in AI, Data Science, Machine Learning Technology. KDnuggets, December 10, 2021. URL: https://www.kdnuggets.com/2021/12/trends-ai-data-science-ml-technology.html

Guyon I. HUMANIA: Artificial Intelligence for All. URL: https://guyon.chalearn.org/projects/humania

Gritsenko V.I., Schlesinger M.I. Interaction of pattern recognition, machine thinking and learning problems. International Scientific Technical Journal «Problems of Control and Informatics», Issue 3, 108–136. URL: http://jnas.nbuv.gov.ua/article/UJRN-0001259001 [In Russian].

AI Glossary. Automated Machine Learning Automl, December 24, 2023. URL: https://www.larksuite.com/en_us/topics/ai-glossary/automated-machine-learning-automl

LeCun Y., Bengio Y., Hinton G. Deep learning. Nature, 2015, Vol. 521 (7553), 436–444. https://doi.org/10.1038/nature14539

Alom M. Z. et al. A state-of-the-art survey on deep learning theory and architectures. Electronics, 2019, Vol. 8 (3), 292‒357. https://doi.org/10.3390/electronics8030292

Deng J. et al. Imagenet: A large-scale hierarchical image database. IEEE Conference on Computer Vision and Pattern Recognition, 2009, 248–255. https://doi.org/10.1109/CVPR.2009.5206848

Sun C. et al. Revisiting unreasonable effectiveness of data in deep learning era. IEEE international conference on computer vision, 2017, 843–852. https://doi.org/10.1109/ICCV.2017.97

Assunçao F. et al. Evolving the topology of large scale deep neural networks. European Conference on Genetic Programming, Springer, 2018, 19–34. https://doi.org/10.1007/978-3-319-77553-1_2

Cubuk, Ekin D. et al. Autoaugment: Learning augmentation policies from data. CoRR, 2018, Vol. 1, 1-14. https://doi.org/10.48550/arXiv.1805.09501

Vapnik V. Statistical Learning Theory. Wiley-Interscience publication, Wiley, 1998, 736 p. URL: https://books.google.fr/books?id=GowoAQAAMAAJ [Accessed: 01 Apr. 2024]

Downloads

Published

2025-07-17

How to Cite

Oursatyev, O., Volkov, O., & Tkalia, V. (2025). Automated Machine Learning. Current State and Development Prospects. Information Technologies and Systems, 2(2), 3–33. https://doi.org/10.15407/intechsys.2025.02.003

Issue

Section

Theory of Information Technologies and Systems Design