FREMONT, Calif.— AMAX, an award-winning provider of Deep Learning, Inference and GPU computing solutions, is showcasing its revolutionary Deep Learning solutions at NVIDIA’s GPU Technology Conference (GTC) in booth #517 from March 19-21 at the San Jose McEnery Convention Center, San Jose, California.
AMAX is a trusted GPU solutions provider for leading global AI and enterprise companies. As an NVIDIA Elite Partner, AMAX offers the most comprehensive line of GPU-integrated solutions optimized for deep learning at any scale — from high-performance workstations for development to large-scale production-ready GPU clusters complete with integrated high-density compute, storage and networking. AMAX can custom-configure its solutions based on each customer’s compute requirements, stage of development or deployment, and performance goals, scaling as those needs grow and evolve.
ServMax™ DGS-1 NVLink Optimized Deep Learning Server – The DGS-1 comes pre-installed with eight NVIDIA Tesla V100 SXM2 GPUs interconnected via NVIDIA NVLink technology, enabling superior strong-scaling performance for HPC and hyperscale applications and allowing hardware to be optimized on the fly for AI and DL training, inference, HPC compute, rendering and virtualization workloads. Built on Intel’s latest Xeon Scalable processors, the system provides a 56 percent increase in memory bandwidth and a 54 percent increase in CPU-to-CPU (UPI) bandwidth compared to the previous Intel processor generation.
ServMax™ X-110G – A compact 1U server featuring a new-generation Intel Xeon E processor and a single NVIDIA Tesla T4 GPU, optimized for applications such as deep learning inference, video surveillance, cybersecurity and autonomous vehicles.
ServMax™ LCS-1 Liquid Cooling GPU Workstation – The LCS-1 brings real GPU computing power to any environment in a silent, mobile platform. The sleek single-blade, dual-GPU system is fully immersed and passively cooled within a resilient metal chassis built to withstand demanding working environments. Ideal for both office use and unpredictable remote conditions, the LCS-1 delivers on-site processing power and control for critical applications.
ServMax™ XP-240L 2U Liquid-Cooled Inference Server – The ServMax™ XP-240L enables deployment in data centers that require liquid cooling, delivering system-level power efficiency to optimize data center power usage effectiveness (PUE). It features four dual-processor (DP) Intel Xeon Scalable nodes, each supporting one single-slot GPU. This solution is ideal for a wide range of HPC, cloud, data center and inference applications.
For AI companies interested in deploying their software as a turnkey on-premises appliance, AMAX offers OEM services to design, validate and mass-produce GPU-integrated workstation or server appliances featuring unique custom-branding options. AMAX’s field-proven NPI (New Product Introduction) program enables AI startups to quickly bring new products to market for a competitive advantage.
As an Elite member of the NVIDIA Partner Network program, AMAX has demonstrated a proven track record of expertise in GPU computing and is committed to providing the most cutting-edge solutions to meet accelerated computing needs at any scale. Contact AMAX or visit www.amax.com to learn more about AMAX’s Deep Learning and Inference solutions or to schedule a meeting at GTC 2019.
AMAX is an award-winning global leader in application-tailored data center, HPC and Deep Learning solutions designed for maximum efficiency and optimal performance. Whether serving a Fortune 1000 company seeking significant cost savings through greater efficiency across global data centers or a software startup seeking an experienced manufacturing partner to design and launch a flagship product, AMAX is a trusted solutions provider, delivering the results needed to meet each customer’s specific metrics for success.