List of DeepSeek-V3.2-Exp Integrations

This is a list of platforms and tools that integrate with DeepSeek-V3.2-Exp, last updated in October 2025.

  • 1
    Hugging Face

    Hugging Face

    Empowering AI innovation through collaboration, models, and tools.
    Hugging Face is an AI platform where developers, researchers, and businesses collaborate on machine learning projects. It hosts an extensive collection of pre-trained models, datasets, and tools for problems in natural language processing, computer vision, and beyond. Through open-source projects such as Transformers and Diffusers, Hugging Face accelerates AI development and makes machine learning accessible to a broader audience, and its community-driven approach fosters continuous improvement in AI applications. A minimal sketch of loading DeepSeek-V3.2-Exp through the Transformers library appears after this list.
  • 2
    DeepSeek-V3.1-Terminus

    DeepSeek

    Unlock enhanced language generation with unparalleled performance stability.
    DeepSeek-V3.1-Terminus is an enhanced release of the V3.1 architecture that incorporates user feedback to improve output reliability, consistency, and agent performance. The update noticeably reduces mixed Chinese-and-English output and other unintended anomalies, producing more polished language generation, and it overhauls both the code-agent and search-agent subsystems for stronger, more consistent results across benchmarks. The model is released as open source, with its weights available on Hugging Face. Its architecture matches that of DeepSeek-V3, preserving compatibility with existing deployment setups, and updated inference demonstrations are provided for users to explore its capabilities. At 685 billion parameters, the model ships in several tensor formats, including FP8, BF16, and F32, letting developers choose the format best suited to their hardware and resource constraints; a precision-selection sketch appears at the end of this list.
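Since Hugging Face both hosts DeepSeek model weights and maintains the Transformers library mentioned above, a minimal loading sketch may be useful. This is a sketch only: the repo id shown is an assumption to be verified on the hub, and a model of this scale realistically requires multi-GPU or hosted inference rather than a single-machine run.

```python
# Minimal sketch, assuming the Hugging Face Transformers library and the
# repo id "deepseek-ai/DeepSeek-V3.2-Exp" (verify on the hub before use).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "deepseek-ai/DeepSeek-V3.2-Exp"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",       # shard layers across available accelerators
    trust_remote_code=True,  # DeepSeek repos ship custom modeling code
)

prompt = "Summarize what a mixture-of-experts model is."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```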
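Because the Terminus entry highlights the choice among FP8, BF16, and F32 tensor formats, the sketch below shows how that choice is commonly expressed when loading through Transformers. The repo id is again an assumption, and FP8 inference generally requires a dedicated serving engine rather than a plain dtype flag, so BF16 is shown as the portable option.

```python
# Minimal sketch, assuming the repo id "deepseek-ai/DeepSeek-V3.1-Terminus".
# BF16 halves memory relative to F32 with little quality loss; FP8 cuts it
# further but typically needs hardware and kernel support beyond this flag.
import torch
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "deepseek-ai/DeepSeek-V3.1-Terminus",  # assumed repo id
    torch_dtype=torch.bfloat16,            # or torch.float32 for full precision
    device_map="auto",
    trust_remote_code=True,
)
```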