DeepSeek, a division of High-Flyer Quant, releases its second-generation MoE model
DIYuan | 2024-05-07 17:47
[数据猿 Digest] DeepSeek, a division of High-Flyer Quant, has released its second-generation MoE model.

On May 6, quant fund giant High-Flyer Quant announced via its official Weibo account that DeepSeek, its new organization exploring AGI (artificial general intelligence), has officially open-sourced its second-generation MoE model, DeepSeek-V2. The DeepSeek-V2 API is priced at 1 yuan per million input tokens and 2 yuan per million output tokens (32K context), only about one percent of the price of GPT-4-Turbo.
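At flat per-million-token rates like those quoted above, the cost of a call follows directly from its token counts. A minimal sketch (the helper name is illustrative, and the default rates are simply the article's quoted figures, in the same currency unit as the rates you pass in):

```python
def estimate_cost(input_tokens: int, output_tokens: int,
                  input_rate: float = 1.0, output_rate: float = 2.0) -> float:
    """Estimate API cost given per-million-token input/output rates.

    The result is in whatever currency unit the rates are quoted in.
    """
    return (input_tokens * input_rate + output_tokens * output_rate) / 1_000_000

# Example: a request with 10,000 input tokens and 2,000 output tokens
cost = estimate_cost(10_000, 2_000)  # (10_000 * 1 + 2_000 * 2) / 1e6 = 0.014
```

At these rates, even a million-token workload costs on the order of single-digit currency units, which is the basis of the roughly-one-percent-of-GPT-4-Turbo comparison.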
Source: DIYuan
