Microsoft continues to push the boundaries of AI innovation by introducing DeepSeek R1 7B & 14B distilled models for Copilot+ PCs via Azure AI Foundry.
The availability of these models reinforces Microsoft's commitment to providing fast, efficient AI capabilities for real-world applications.
Copilot+ PCs powered by Qualcomm Snapdragon X, Intel Core Ultra 200V, and AMD Ryzen will support the new models.
Running 7B and 14B parameter reasoning models on NPUs marks a milestone in AI democratization, enabling users to run powerful machine learning models directly on their PCs.
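To see why fitting models of this size on an NPU is notable, a back-of-the-envelope calculation of the memory needed just to hold the weights is useful. The sketch below is illustrative only; real on-device footprints vary with the architecture, quantization scheme, and runtime overhead:

```python
def weight_memory_gb(n_params: float, bits_per_weight: int) -> float:
    """Approximate memory to hold the model weights alone, in GB."""
    return n_params * bits_per_weight / 8 / 1e9

# Rough weight footprints for the 7B and 14B distilled models
# at common precisions (16-bit, 8-bit, 4-bit).
for params in (7e9, 14e9):
    for bits in (16, 8, 4):
        print(f"{params / 1e9:.0f}B @ {bits}-bit: "
              f"{weight_memory_gb(params, bits):.1f} GB")
```

At 16-bit precision a 14B model needs roughly 28 GB for weights alone, while at 4-bit it drops to about 7 GB, which is why low-bit formats are central to running such models on consumer hardware.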
NPUs in Copilot+ PCs are designed for efficient, sustained AI inference, offloading these workloads so the CPU and GPU stay free for other tasks while delivering strong performance per watt.
Efficient inference has become crucial for language models, and especially for reasoning models, which spend additional compute generating intermediate tokens before answering; faster, lower-power token generation translates directly into better response quality and latency across tasks.
Microsoft's research investments have led to innovations like Phi Silica, enabling low-bit inference on NPUs and opening the door to advanced on-device scenarios such as model fine-tuning.
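Low-bit inference means storing and computing with weights at reduced numeric precision. The following is a minimal, self-contained sketch of symmetric 4-bit quantization to convey the idea; it is not Phi Silica's actual scheme, and the per-tensor scaling shown here is a simplification of the per-group methods production runtimes typically use:

```python
def quantize_int4(values):
    """Symmetric per-tensor quantization to the signed 4-bit range [-7, 7]."""
    # One scale factor maps the largest magnitude onto the int4 range.
    scale = max(abs(v) for v in values) / 7 or 1.0
    q = [max(-7, min(7, round(v / scale))) for v in values]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float values from the 4-bit integers."""
    return [x * scale for x in q]

weights = [0.12, -0.5, 0.33, 0.07, -0.28]
q, scale = quantize_int4(weights)
approx = dequantize(q, scale)
```

Each weight is reduced to one of 15 integer levels plus a shared scale, cutting memory and bandwidth roughly 4x versus 16-bit at the cost of a small, bounded rounding error per value.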
Developers can download the AI Toolkit VS Code extension to access and run the DeepSeek models and to experiment with the different model variants on Copilot+ PCs.
Copilot+ PCs offer local compute capabilities alongside Azure services, facilitating training and fine-tuning of models on-device while leveraging the cloud for intensive workloads.
This integration of cloud and edge computing signifies a new paradigm of continuous compute, empowering developers to create innovative AI solutions.