Rethinking Language Model Scaling under Transferable Hypersphere Optimization
Authors: not listed
Journal: not listed
Year: -
Category: -
Country: -
📝 Abstract
Scaling laws for large language models depend critically on the optimizer and parameterization. Existing hyperparameter transfer laws are mainly developed for first-order optimizers, and they do not structurally prevent training instability at scale. Recent hypersphere optimization methods constrain weight matrices to a fixed-norm hypersphere, offering a promising alternative for more stable scaling. We introduce HyperP (Hypersphere Parameterization), the first framework for transferring optimal […]
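The constraint the abstract describes, keeping each weight matrix on a fixed-norm hypersphere, can be illustrated with a minimal sketch. This is an assumption-laden illustration, not the paper's HyperP method: the function name `hypersphere_step`, the plain SGD update, and the use of the Frobenius norm are all choices made here for concreteness. The idea shown is that after every first-order step, the weights are rescaled back to a constant norm.

```python
import numpy as np

# Minimal sketch of hypersphere-constrained training (illustrative only,
# not the HyperP algorithm from the paper): take an ordinary gradient
# step, then retract the weight matrix onto a sphere of fixed Frobenius
# norm so its magnitude can never drift during training.

def hypersphere_step(W, grad, lr, target_norm):
    """One SGD step followed by projection back to ||W||_F = target_norm."""
    W = W - lr * grad                           # plain first-order update
    return W * (target_norm / np.linalg.norm(W))  # rescale onto the sphere

rng = np.random.default_rng(0)
W = rng.standard_normal((4, 4))
target = np.linalg.norm(W)                      # fix the norm at initialization

for _ in range(10):
    grad = rng.standard_normal(W.shape)         # stand-in for a real loss gradient
    W = hypersphere_step(W, grad, lr=0.1, target_norm=target)

print(np.linalg.norm(W))                        # remains equal to `target`
```

Because the retraction restores the norm after every update, weight magnitudes are bounded by construction, which is the structural stability property the abstract attributes to hypersphere optimization.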
📊 Article Statistics
Basic stats: 9 views · 0 downloads · 0 citations