DeepSeek has launched an upgraded V3.1 model, tuned to support Chinese-made chips and to process requests faster. The DeepSeek-V3.1 model's UE8M0 FP8 precision format targets forthcoming domestic chips, though the specific chip models and manufacturers remain undisclosed. FP8, or 8-bit floating point, is a numerical format that lets AI models run more efficiently, using less memory and computing faster than higher-precision formats.
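The article does not detail how UE8M0 works, but the name is commonly read as an unsigned, exponent-only 8-bit format (8 exponent bits, 0 mantissa bits), matching the E8M0 scale format in the OCP Microscaling spec, where each byte encodes a power-of-two scale. A minimal sketch under that assumption (the bias of 127 and the reserved NaN code are taken from that spec, not from DeepSeek's announcement):

```python
import math

E8M0_BIAS = 127  # assumed bias, per the OCP Microscaling E8M0 scale format


def encode_ue8m0(scale: float) -> int:
    """Encode a positive power-of-two scale as an 8-bit exponent-only value."""
    if scale <= 0:
        raise ValueError("UE8M0 encodes positive scales only (no sign bit)")
    e = round(math.log2(scale)) + E8M0_BIAS
    if not 0 <= e <= 254:  # 255 is reserved for NaN in the MX spec
        raise ValueError("scale outside representable range")
    return e


def decode_ue8m0(byte: int) -> float:
    """Decode an 8-bit exponent back to the power-of-two scale it represents."""
    return 2.0 ** (byte - E8M0_BIAS)
```

Because each value fits in a single byte, storing weights or per-block scales this way takes a quarter of the memory of 32-bit floats, which is the efficiency gain FP8 formats are designed to deliver.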
V3.1 features a hybrid inference architecture, operating in both reasoning and non-reasoning modes. Users can switch between modes with a 'deep thinking' button on the company's app and web platform, both of which now run V3.1. Starting September 6, the company will adjust pricing for the model's API, which lets developers integrate its AI models into other apps and web products.
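For API users, the equivalent of the app's 'deep thinking' toggle is selecting between two model endpoints in an OpenAI-compatible chat-completions request. A minimal sketch that only builds the request payload; the model names `deepseek-chat` and `deepseek-reasoner` reflect DeepSeek's public API documentation but should be checked against the current docs:

```python
def build_request(prompt: str, deep_thinking: bool) -> dict:
    """Build a chat-completions payload, choosing the reasoning model when
    'deep thinking' is on, mirroring the toggle in DeepSeek's app."""
    model = "deepseek-reasoner" if deep_thinking else "deepseek-chat"
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }


# Example: the same prompt routed to either mode.
fast = build_request("Summarise this article.", deep_thinking=False)
slow = build_request("Prove this step by step.", deep_thinking=True)
```

Keeping the payload construction separate from the HTTP call makes the mode switch a one-line change for developers integrating the API into their own products.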
DeepSeek gained recognition for AI models that compete with Western counterparts such as OpenAI's ChatGPT while costing less to operate. The company's focus on domestic chip compatibility signals a move towards supporting China's emerging semiconductor ecosystem, amid efforts to replace US technology.