Scalable Inference System

Bringing AI implementation into the production environment


The TANK-870AI from ICP Deutschland is a powerful and scalable inference system that allows AI applications to be deployed close to the application, locally at the edge. It comes with a pre-installed Linux Ubuntu 16.04 LTS operating system and ready-to-use software; the hardware is based on an Intel® Skylake or Kaby Lake CPU with up to 32GB of pre-installed RAM. The TANK-870AI is already equipped with a 1TB 2.5” HDD and can accommodate an additional SSD with RAID 0/1 functionality.


Equipped with the Mustang AI accelerator card, installed via the two PCIe x8 expansion slots


The software used is the Open Visual Inference & Neural Network Optimization toolkit from Intel®, known as OpenVINO. It combines several tools, such as the Intel® Deep Learning Deployment Toolkit, optimized computer vision libraries, the Intel® Media SDK, OpenCL™ graphics drivers and runtimes, as well as common topologies like AlexNet, GoogLeNet and more. OpenVINO helps optimize trained deep learning models, such as those from Caffe, MXNet and TensorFlow, for deployment across hardware such as the CPU, GPU, or the Mustang Intel® Movidius or FPGA AI acceleration cards.
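To illustrate this workflow, below is a minimal sketch of loading a model that has already been converted to OpenVINO's Intermediate Representation and running it on a target device through the pre-2022 Inference Engine Python API. The model file names, input data and device name ("CPU", "GPU", "MYRIAD", "HDDL", "FPGA") are placeholders chosen for the example, not details from the article.

```python
# Minimal sketch: load an OpenVINO IR model and run inference on a chosen device.
import numpy as np
from openvino.inference_engine import IECore

ie = IECore()

# Read a model already converted to OpenVINO IR (.xml topology + .bin weights).
# "alexnet.xml"/"alexnet.bin" are placeholder file names.
net = ie.read_network(model="alexnet.xml", weights="alexnet.bin")
input_blob = next(iter(net.input_info))
output_blob = next(iter(net.outputs))

# Compile the network for the target device, e.g. a Movidius-based accelerator ("MYRIAD").
exec_net = ie.load_network(network=net, device_name="MYRIAD")

# Run inference on a dummy input shaped like the model's input (N, C, H, W).
n, c, h, w = net.input_info[input_blob].input_data.shape
result = exec_net.infer(inputs={input_blob: np.zeros((n, c, h, w), dtype=np.float32)})
print(result[output_blob].shape)
```

Switching between the CPU, GPU or an accelerator card in this kind of setup is a matter of changing the device name passed when loading the network, which is the portability OpenVINO is built around.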
