QNAP MUSTANG-F100-A10-R10 FPGA Accelerator Card
| Brand Name | QNAP |
| --- | --- |
| Country of Origin | Taiwan |
| Depth | 6.7" |
| Height | 2.7" |
| Width | 1.3" |
| Form Factor | Plug-in Card |
| Manufacturer | QNAP Systems |
| Manufacturer Part Number | MUSTANG-F100-A10-R10 |
| Product Model | MUSTANG-F100-A10-R10 |
| Product Name | MUSTANG-F100-A10-R10 FPGA Accelerator Card |
| Product Type | FPGA Accelerator Card |

Marketing Information

Intel® Vision Accelerator Design with Intel® Arria® 10 FPGA

As QNAP NAS evolves to support a wider range of applications (including surveillance, virtualization, and AI), you not only need more storage space on your NAS but also greater processing power to optimize targeted workloads. The Mustang-F100 is a PCIe-based accelerator card built around the programmable Intel® Arria® 10 FPGA, providing the performance and versatility of FPGA acceleration. It can be installed in a PC or a compatible QNAP NAS, making it a strong choice for AI deep learning inference workloads.

OpenVINO™ toolkit

The OpenVINO™ toolkit accelerates convolutional neural network (CNN) workloads, extending them across Intel® hardware to maximize performance.

Get deep learning acceleration on an Intel-based server/PC

You can install the Mustang-F100 in a PC or workstation running Linux® (Ubuntu®) to accelerate workloads such as deep learning inference, video streaming, and data center applications. As an acceleration solution for real-time AI inference, the Mustang-F100 also works with the Intel® OpenVINO™ toolkit to optimize inference workloads for image classification and computer vision.

QNAP NAS as an Inference Server

The OpenVINO™ toolkit extends workloads across Intel® hardware (including accelerators) and maximizes performance. When used with QNAP's OpenVINO™ Workflow Consolidation Tool, an Intel®-based QNAP NAS serves as an ideal inference server that helps organizations quickly build an inference system. Providing a model optimizer and an inference engine, the OpenVINO™ toolkit is easy to use and flexible enough for high-performance, low-latency computer vision. AI developers can deploy trained models on a QNAP NAS for inference, and install the Mustang-F100 to achieve optimal inference performance.
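The OpenVINO™ workflow mentioned above (convert a trained model with the model optimizer, then run it through the inference engine on an accelerator) can be sketched in Python. This is a minimal illustration, not vendor code: the model path, input size, and the `HETERO:FPGA,CPU` device name (used by older OpenVINO releases that shipped an FPGA plugin; current releases have dropped it) are assumptions, so the sketch falls back to CPU and degrades gracefully when the toolkit or device is unavailable.

```python
import numpy as np

def preprocess(image: np.ndarray, size: int = 224) -> np.ndarray:
    """Toy preprocessing: center-crop to `size`, scale to [0, 1],
    and reorder HWC -> NCHW as most OpenVINO image models expect."""
    h, w, _ = image.shape
    top, left = max(0, (h - size) // 2), max(0, (w - size) // 2)
    crop = image[top:top + size, left:left + size, :]
    tensor = crop.astype(np.float32) / 255.0            # scale to [0, 1]
    return np.transpose(tensor, (2, 0, 1))[np.newaxis]  # HWC -> NCHW, add batch dim

try:
    from openvino.runtime import Core  # OpenVINO >= 2022.1 Python API

    core = Core()
    # "model.xml" is a placeholder for an IR file produced by the model optimizer.
    model = core.read_model("model.xml")
    # Older OpenVINO releases exposed FPGA cards via an FPGA plugin; fall
    # back to CPU when no such device is reported.
    device = "HETERO:FPGA,CPU" if "FPGA" in core.available_devices else "CPU"
    compiled = core.compile_model(model, device)
    result = compiled(preprocess(np.zeros((224, 224, 3), dtype=np.uint8)))
except Exception:
    pass  # OpenVINO not installed or no model file; preprocessing above still works
```

In practice the deployed model and device name come from your own pipeline; the point is only that the model optimizer produces an IR (`.xml`/`.bin`) which the runtime then compiles for whichever accelerator the plugin stack reports.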
Related products (CPU Accelerators / System Cache Controllers)
- Xilinx Alveo U250 FPGA Accelerator Card: $10,607.16
- Xilinx Alveo U50 Data Center Accelerator: $3,784.51
- Xilinx Alveo U200 FPGA Accelerator Card with Active Cooling: $6,817.65
- Intel QuickAssist Adapter 8950: $839.51