QNAP Mustang-F100
Intel Vision Accelerator Design with Intel Arria 10 FPGA
As QNAP NAS evolves to support a wider range of applications (including surveillance, virtualization, and AI), you not only need more storage space on your NAS but also greater processing power to optimize targeted workloads. The Mustang-F100 is a PCIe-based accelerator card built around the programmable Intel Arria 10 FPGA, providing the performance and versatility of FPGA acceleration. It can be installed in a PC or a compatible QNAP NAS to boost performance, making it an ideal choice for AI deep learning inference workloads.
Available Models: Mustang-F100-A10-R10
High-performance PCIe FPGA accelerator card with Intel Arria 10 GX1150, 8 GB DDR4-2400 memory, and a PCIe Gen3 x8 interface
OpenVINO toolkit
The OpenVINO toolkit is built for convolutional neural network (CNN) workloads; it extends those workloads across Intel hardware and maximizes performance. It can optimize pre-trained deep learning models from frameworks such as Caffe, MXNet, and TensorFlow into an Intermediate Representation (IR), and then execute the inference engine heterogeneously across Intel hardware such as CPUs, GPUs, the Intel Movidius Neural Compute Stick, and FPGAs.
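To make these two stages concrete, here is a minimal sketch assuming a classic OpenVINO install whose Model Optimizer is exposed as the mo.py script and a TensorFlow frozen graph named frozen_model.pb (both names are placeholders; paths and flags vary by release). It converts the model to IR and then checks which Intel devices the Inference Engine can see:

```python
# Minimal sketch of the two OpenVINO stages described above.
# Assumptions (placeholders): the Model Optimizer is reachable as mo.py and the
# pre-trained model is a TensorFlow frozen graph named frozen_model.pb;
# exact paths and flags vary between OpenVINO releases.
import subprocess

from openvino.inference_engine import IECore  # classic (pre-2022) Python API

# Stage 1: Model Optimizer - convert the pre-trained model into an IR (.xml + .bin pair).
subprocess.run(
    ["python3", "mo.py",
     "--input_model", "frozen_model.pb",
     "--data_type", "FP16",              # FP16 is the usual choice for FPGA/VPU targets
     "--output_dir", "ir_model"],
    check=True,
)

# Stage 2: Inference Engine - load the IR and list the Intel devices that could run it.
ie = IECore()
print("Devices visible to the Inference Engine:", ie.available_devices)
net = ie.read_network(model="ir_model/frozen_model.xml",
                      weights="ir_model/frozen_model.bin")
print("Network inputs:", list(net.inputs))   # newer releases expose net.input_info instead
```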
Get deep learning acceleration on Intel-based Server/PC
You can install the Mustang-F100 in a PC/workstation running Linux (Ubuntu) to gain computational acceleration for demanding applications such as deep learning inference, video streaming, and data center workloads. As an ideal acceleration solution for real-time AI inference, the Mustang-F100 can also work with the Intel OpenVINO toolkit to optimize inference workloads for image classification and computer vision.
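As a hedged illustration of addressing the card from such a Linux host, the sketch below assumes an OpenVINO release that still ships the FPGA plugin (newer releases have dropped it) and a pre-converted IR model; the file names are placeholders, and "HETERO:FPGA,CPU" lets layers the FPGA cannot run fall back to the CPU.

```python
# Sketch: run inference on the Mustang-F100 through OpenVINO's heterogeneous plugin.
# Assumptions: an OpenVINO release that still includes the FPGA plugin with a suitable
# bitstream loaded, and a pre-converted IR model (squeezenet.xml/.bin are placeholders).
import numpy as np
from openvino.inference_engine import IECore

ie = IECore()
net = ie.read_network(model="squeezenet.xml", weights="squeezenet.bin")
# "HETERO:FPGA,CPU" sends supported layers to the FPGA and falls back to the CPU otherwise.
exec_net = ie.load_network(network=net, device_name="HETERO:FPGA,CPU")

input_name = next(iter(net.inputs))                       # net.input_info on newer releases
frame = np.zeros(net.inputs[input_name].shape, dtype=np.float32)  # stand-in for a real image
result = exec_net.infer(inputs={input_name: frame})

top_blob = next(iter(result.values()))
print("Top-1 class index:", int(np.argmax(top_blob)))
```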
QNAP NAS as an Inference Server
The OpenVINO toolkit extends workloads across Intel hardware (including accelerators) and maximizes performance. When used with QNAP's OpenVINO Workflow Consolidation Tool (OWCT), an Intel-based QNAP NAS becomes an ideal inference server that helps organizations quickly build an inference system. Providing a model optimizer and an inference engine, the OpenVINO toolkit is easy to use and flexible for high-performance, low-latency computer vision that improves deep learning inference. AI developers can deploy trained models on a QNAP NAS for inference and install the Mustang-F100 to achieve optimal inference performance.
Note:
1. QTS 4.4.0 (or later) and OWCT v1.1.0 are required for the QNAP NAS.
2. When FPGA card computing is used on the QNAP NAS, the VM pass-through function will be disabled. To avoid potential data loss, make sure that all ongoing NAS tasks are finished before restarting.
Easy-to-manage Inference Engine with QNAP OWCT
Workflow: upload a video file, then download the inference result.
Check Compatible NAS Models
28-Bay | TS-2888X |
24-Bay | TVS-2472XU-RP |
16-Bay | TVS-1672XU-RP |
12-Bay | TVS-1272XU-RP |
8-Bay | TVS-872XU, TVS-872XU-RP |
Main FPGA | Intel Arria 10 GX1150 FPGA |
Operating Systems | PC: Ubuntu 16.04.3 LTS 64-bit, CentOS 7.4 64-bit, Windows 10 (more operating systems coming soon); NAS: QTS (installing the Mustang Card User Driver from the QTS App Center is required) |
Voltage Regulator and Power Supply | Intel Enpirion Power Solutions |
Memory | 8 GB on-board DDR4 |
Dataplane Interface | PCI Express x8 Compliant with PCI Express Specification V3.0 |
Power Consumption (W) | <60W |
Operating Temperature & Relative Humidity | 5°C ~ 60°C (ambient temperature); 5% ~ 90% relative humidity |
Cooling | Active fan: (50 x 50 x 10 mm) x 2 |
Dimensions | 169.5 mm x 68.7 mm x 33.7 mm |
Power Connector | Preserved PCIe 6-pin 12V external power |
Dip Switch/LED indicator | Up to 8 cards are supported on operating systems other than QTS; the QNAP TS-2888X NAS supports up to 4 cards. Manually assign a card ID number (0 to 7) to the Mustang-F100 using the rotary switch; the assigned ID is shown on the card's LED display after power-up. |