Supermicro SuperMinute: X12 4-Way MP System
Supermicro introduces a new high-performance X12 generation 4-socket SuperServer®, optimized for the new 3rd Gen Intel® Xeon® Scalable processors and Intel® Optane™ persistent memory 200 series, bringing superior performance to mission-critical workloads in enterprise, cloud-scale, and hybrid environments.
The new AS-2124GQ-NART server features the power of NVIDIA A100 Tensor Core GPUs and the HGX A100 4-GPU baseboard. The system supports PCI-E Gen 4 for fast CPU-GPU connection and high-speed networking expansion cards.
Supermicro’s highly configurable Outdoor Edge Systems, powered by Intel®, give data center and telco operators unprecedented deployment options that rival those of industry competitors.
The modular design of Supermicro’s FatTwin Servers allows the company to rapidly scale to support growing business needs
Supermicro is collaborating with T-Systems (Deutsche Telekom) on EdgAIR, an Edge Computing platform that delivers low latency for IoT applications at enterprise facilities.
Supermicro offers the latest cost-effective ToR Ethernet switching technology in a range of traditional, fully featured models that integrate hardware and software into a complete solution. These standard switches provide connectivity from 1 Gbps to 100 Gbps for a range of compute and storage systems and support the most popular industry-standard Layer 2 and Layer 3 Ethernet switch features.
Supermicro supports open networking through its line of "bare metal" Ethernet switches. A "best of breed" approach of matching the choice of hardware with the choice of operating system is enabled by the Open Network Install Environment (ONIE). Supermicro's ONIE-based switches fully support port speeds ranging from 1 Gbps to 100 Gbps.
Meet all your AI/ML application needs with Supermicro optimized GPU server solutions
AI is helping to solve some of the world’s most complex problems. Solving these enormous challenges requires the computation of large amounts of data and highly optimized AI models running at scale. NVIDIA GPU Cloud (NGC) is the GPU-accelerated software hub for optimized AI and HPC. Supermicro’s NGC-Ready systems make it easy and efficient to run large workloads with a complete end-to-end NVIDIA Tensor Core GPU-accelerated hardware and software stack.
Time to market is key to success for today’s AI development. By using Supermicro-designed NVIDIA GPU systems with the latest AI stack installed and supported, data scientists and AI developers can start testing and training their AI models for product development and research right away.
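For illustration only, the short Python sketch below shows the kind of smoke test a developer might run inside an NGC PyTorch container on an NGC-Ready Supermicro GPU system to confirm that the Tensor Core GPUs are visible before training begins; the script itself is an assumption, not part of the NGC software stack.

```python
# Minimal sketch: verify that the NVIDIA GPU stack is visible to PyTorch,
# e.g. inside an NGC PyTorch container (container image and tag assumed).
import torch


def main():
    if not torch.cuda.is_available():
        raise SystemExit("No CUDA-capable GPU detected; check drivers and container runtime.")

    # Report each visible GPU (e.g. NVIDIA A100 on an HGX A100 baseboard).
    for idx in range(torch.cuda.device_count()):
        print(f"GPU {idx}: {torch.cuda.get_device_name(idx)}")

    # Run a small matrix multiply on the first GPU as a smoke test.
    device = torch.device("cuda:0")
    a = torch.randn(1024, 1024, device=device)
    b = torch.randn(1024, 1024, device=device)
    c = a @ b
    torch.cuda.synchronize()
    print("Smoke test OK, result norm:", c.norm().item())


if __name__ == "__main__":
    main()
```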
This white paper is intended to help an organization deploy an on-premises SUSE CaaS Platform cluster on Supermicro’s BigTwin™, Ultra and SuperStorage systems with the 2nd Generation Intel® Xeon® Scalable processor to support the latest Kubernetes-compatible workloads.
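As a hedged sketch (not taken from the white paper itself), the Python snippet below uses the open-source kubernetes client to check that the nodes of a newly deployed cluster report Ready; the kubeconfig location is assumed to be the default ~/.kube/config.

```python
# Minimal sketch: list cluster nodes and their Ready status after deployment
# (assumes the `kubernetes` Python client and a kubeconfig exported from the cluster).
from kubernetes import client, config


def main():
    # Load credentials from the default kubeconfig (~/.kube/config).
    config.load_kube_config()
    v1 = client.CoreV1Api()

    # Print each node and whether its Ready condition is True.
    for node in v1.list_node().items:
        ready = next(
            (c.status for c in node.status.conditions if c.type == "Ready"),
            "Unknown",
        )
        print(f"{node.metadata.name}: Ready={ready}")


if __name__ == "__main__":
    main()
```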
Supermicro’s new 10U GPU server using NVIDIA HGX-2 GPU baseboards helps a leading Belgian research team improve calculation times by 10X, which leads to faster AI development
Company targets more than $13M in savings per year in total energy costs across the entire data center
Supermicro’s X11 generation Building Block Solutions and SuperServer systems support customers' evolving needs for cloud computing, web hosting, big data and high-performance applications
Supermicro's X11 generation systems and Building Block Solutions provide a foundation for the company's real-time data analytics for SQL data warehouse deployment