DeepSeek, an AGI research organization under quantitative fund High-Flyer (幻方量化), releases its second-generation MoE model

[DataYuan Digest] DeepSeek, an AGI research organization under quantitative fund High-Flyer (幻方量化), has released its second-generation MoE model.

May 6 news: private equity giant High-Flyer (幻方量化) announced on its official Weibo account that DeepSeek, its newly established organization for exploring AGI (artificial general intelligence), has officially open-sourced its second-generation MoE model, DeepSeek-V2. The DeepSeek-V2 API is priced at 1 yuan per million input tokens and 2 yuan per million output tokens (32K context), roughly one percent of the price of GPT-4-Turbo.
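To make the pricing concrete, here is a minimal sketch (Python, not part of the original announcement) of what a single API call would cost at the quoted rates. The two constants simply restate the per-million-token prices above; the function name and example token counts are hypothetical.

```python
# Illustrative cost estimate at the DeepSeek-V2 API rates quoted above:
# 1 yuan per million input tokens, 2 yuan per million output tokens
# (32K context). Rates are taken from the article; everything else
# (function name, example token counts) is assumed for illustration.

INPUT_YUAN_PER_MILLION = 1.0   # yuan per 1,000,000 input tokens
OUTPUT_YUAN_PER_MILLION = 2.0  # yuan per 1,000,000 output tokens

def request_cost_yuan(input_tokens: int, output_tokens: int) -> float:
    """Cost in yuan of one API call at the quoted per-million-token rates."""
    return (input_tokens * INPUT_YUAN_PER_MILLION
            + output_tokens * OUTPUT_YUAN_PER_MILLION) / 1_000_000

if __name__ == "__main__":
    # Example: a 3,000-token prompt with a 1,000-token completion.
    print(f"{request_cost_yuan(3_000, 1_000):.4f} yuan")  # -> 0.0050 yuan
```

Even a request that fills the full 32K context would, at these rates, cost only a few hundredths of a yuan on the input side, which is what drives the roughly one-percent-of-GPT-4-Turbo comparison.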


Source: DataYuan (数据猿)

