Profile
- Affiliation: School of Computer Science and Technology (School of Software)
- Gender: Male
- Email: zhuangyang@suda.edu.cn
- Professional Title: Associate Professor
- Office Address: Room 541, Science and Engineering Laboratory Building, Tiancizhuang Campus, Soochow University
- Alma Mater: Xiamen University
- Degree: Doctor of Engineering
- Postal Code: 215000
- Education: Doctoral graduate
Education
- Ph.D., September 2014 - December 2018, Signal and Information Processing, Xiamen University; Doctor of Engineering conferred December 2018
Work Experience
- May 2019 - June 2021, Sun Yat-sen University, postdoctoral research, Postdoctoral Fellow
- August 2021 - present, Soochow University, teaching and research, Associate Professor
Biography
Zhuang Yang, male, Ph.D., is an Associate Professor and an Outstanding Young Scholar of Soochow University. He received his Doctor of Engineering degree in Signal and Information Processing from the School of Information Science and Technology, Xiamen University, in 2018. From May 2019 to June 2021 he carried out postdoctoral research at the School of Electronics and Communication Engineering, Sun Yat-sen University, during which he was supported by the General Program of the China Postdoctoral Science Foundation. In August 2021 he joined the School of Computer Science and Technology at Soochow University, where he takes part in the research and teaching activities of the Department of Computer Science and the machine learning group. He is the principal investigator of a National Natural Science Foundation of China Young Scientists Fund project (2024-2026) and a Natural Science Foundation of Jiangsu Province Youth Fund project (2023-2026). His research interests include, but are not limited to, machine learning, statistical learning, data mining, and optimization theory and its applications. In recent years he has published more than twenty SCI-indexed papers as first author in international journals including IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI, CCF A), Journal of Machine Learning Research (JMLR, CCF A), IEEE Transactions on Neural Networks and Learning Systems (TNNLS, CCF B), Information Sciences (CCF B), IEEE Transactions on Big Data (TBD), Knowledge-Based Systems, Expert Systems with Applications, and Neurocomputing.
He serves as a youth editorial board member of the Journal of Nanjing University of Science and Technology, and as a reviewer for international and domestic journals including, but not limited to, IEEE Transactions on Circuits and Systems I, IEEE Transactions on Big Data, Pattern Recognition, Signal Processing, Neural Networks, Numerical Algorithms, and Computer Science (计算机科学).
News
[2025.05] Appointed as a youth editorial board member of the Journal of Nanjing University of Science and Technology
[2025.01] A paper from the lab was accepted by IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI)
[2024.06] A paper from the lab was accepted by IEEE Transactions on Computational Social Systems
Admissions note:
Up to five graduate student openings are available each year (the quota is generally not filled; 2-3 students is ideal)!
Expectations for students:
I do not expect an outstanding undergraduate record, but I do expect you to be diligent, eager to learn, and consistently rigorous about your studies; to be positive, down-to-earth, and energetic, with the tenacity and drive to pursue excellence!
Not accepted: students with a "lying flat" mindset or a slacking mindset!
Research Areas
Machine learning, statistical learning, data mining, and optimization theory and applications.
Courses Taught
Research Projects
- 1. China Postdoctoral Science Foundation General Program, 2019-2021, Principal Investigator
- 2. National Natural Science Foundation of China, Young Scientists Fund, 2024-2026, Principal Investigator
- 3. Natural Science Foundation of Jiangsu Province, Youth Fund, 2023-2026, Principal Investigator
Publications
- 1. Zhuang Yang. Variance reduced optimization with implicit gradient transport. Knowledge-Based Systems, 2021, 212: 106626.
- 2. Zhuang Yang. Fast automatic step size selection for zeroth-order nonconvex stochastic optimization. Expert Systems with Applications, 2021, 174: 114749.
- 3. Zhuang Yang. On the step size selection in variance-reduced algorithm for nonconvex optimization. Expert Systems with Applications, 2021, 169: 114336.
- 4. Zhuang Yang, Zengping Chen, Cheng Wang. Accelerating mini-batch SARAH by step size rules. Information Sciences, 2021, 558: 157-173.
- 5. Zhuang Yang, Zengping Chen, Cheng Wang. An accelerated stochastic variance-reduced method for machine learning problems. Knowledge-Based Systems, 2020, 198: 105941.
- 6. Zhuang Yang, Cheng Wang, Zhemin Zhang. Accelerated stochastic gradient descent with step size selection rules. Signal Processing, 2019, 159: 171-186.
- 7. Zhuang Yang, Cheng Wang, Zhemin Zhang. Mini-batch algorithms with online step size. Knowledge-Based Systems, 2019, 165: 228-240.
- 8. Zhuang Yang, Cheng Wang, Yu Zang. Mini-batch algorithms with Barzilai–Borwein update step. Neurocomputing, 2018, 314: 177-185. (The BB step-size rule is sketched after this list.)
- 9. Zhuang Yang, Cheng Wang, Zhemin Zhang. Random Barzilai–Borwein step size for mini-batch algorithms. Engineering Applications of Artificial Intelligence, 2018, 72: 124-135.
- 10. Zhuang Yang. Adaptive stochastic conjugate gradient for machine learning. Expert Systems with Applications, 2022: 117719.
- 11. Zhuang Yang. Large-scale machine learning with fast and stable stochastic conjugate gradient. Computers & Industrial Engineering, 2022, 173.
- 12. Zhuang Yang, Li Ma. Adaptive step size rules for stochastic optimization in large-scale learning. Statistics and Computing, 2023, 33(2).
- 13. Zhuang Yang. Painless Stochastic Conjugate Gradient for Large-Scale Machine Learning. IEEE Transactions on Neural Networks and Learning Systems, 2023: 1-14.
- 14. Zhuang Yang. Adaptive Powerball Stochastic Conjugate Gradient for Large-Scale Learning. IEEE Transactions on Big Data, 2023: 1-11.
- 15. Zhuang Yang. Improved Powered Stochastic Optimization Algorithms for Large-Scale Machine Learning. Journal of Machine Learning Research, 2023, 24: 1-29.
- 16. Zhuang Yang. Stochastic variance reduced gradient with hyper-gradient for non-convex large-scale learning. Applied Intelligence, 2023: 1-15.
- 17. Zhuang Yang, Xiaotian Li. Powered stochastic optimization with hypergradient descent for large-scale learning systems. Expert Systems with Applications, 2023.
- 18. Zhuang Yang. SARAH-M: A fast Stochastic Recursive Gradient Descent Algorithm via Momentum. Expert Systems with Applications, 2023, 238: 1-15.
- 19. Zhuang Yang. The Powerball Method With Biased Stochastic Gradient Estimation for Large-Scale Learning Systems. IEEE Transactions on Computational Social Systems, 2024: 1-13.
- 20. Zhuang Yang. Adaptive Biased Stochastic Optimization. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2025: 1-15.
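Several of the papers above (for example, entries 8 and 9) study Barzilai–Borwein (BB) step sizes for mini-batch stochastic gradient methods. The sketch below is a minimal, self-contained illustration of the BB1 rule inside a mini-batch SGD loop, assuming only NumPy; it is not code from any of the listed papers (which add safeguards such as smoothing, randomization, and variance reduction), and the function names and data here are hypothetical.

```python
import numpy as np

def minibatch_sgd_bb(stoch_grad, x0, n_steps=100, eta0=0.1, eps=1e-10):
    """Mini-batch SGD with a Barzilai-Borwein (BB1) step size.

    stoch_grad(x) returns a mini-batch stochastic gradient at x.
    Illustrative sketch only, not the exact method of any paper above.
    """
    x_prev = np.asarray(x0, dtype=float)
    g_prev = stoch_grad(x_prev)
    x = x_prev - eta0 * g_prev            # one plain SGD step to initialize BB
    for _ in range(n_steps):
        g = stoch_grad(x)
        s, y = x - x_prev, g - g_prev     # iterate and gradient differences
        denom = float(s @ y)
        # BB1 rule: eta = <s, s> / <s, y>; fall back to eta0 when the
        # stochastic curvature estimate is too small or negative
        eta = float(s @ s) / denom if denom > eps else eta0
        x_prev, g_prev = x, g
        x = x - eta * g
    return x

# Hypothetical usage: mini-batch least squares on random data
rng = np.random.default_rng(0)
A, b = rng.normal(size=(1000, 20)), rng.normal(size=1000)

def stoch_grad(x, batch=32):
    idx = rng.integers(0, len(b), size=batch)
    Ai, bi = A[idx], b[idx]
    return Ai.T @ (Ai @ x - bi) / batch

x_hat = minibatch_sgd_bb(stoch_grad, np.zeros(20))
```

The BB step adapts the learning rate from curvature estimated along the trajectory instead of a hand-tuned schedule; the guard on the denominator is one simple way to keep the step positive when the stochastic curvature estimate is unreliable.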