Path to this page: ./wip/ncnn
Neural network inference computing framework
Branch: CURRENT
Version: 20240820nb1
Package name: ncnn-20240820nb1
Maintainer: triaxx

ncnn is a high-performance neural network inference computing framework
optimized for mobile platforms. ncnn has been designed from the beginning with
deployment and use on mobile phones in mind, and it has no third-party
dependencies. It is cross-platform and runs faster than all known open-source
frameworks on mobile phone CPUs. Developers can easily deploy deep learning
models to mobile platforms with ncnn, creating intelligent apps and bringing
artificial intelligence to users' fingertips. ncnn is currently used in many
Tencent applications, such as QQ, Qzone, WeChat, and Pitu.
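
As a rough sketch of the deployment workflow mentioned above, the following
C++ fragment loads a model converted to ncnn's .param/.bin format and runs a
single forward pass. The file names and the blob names "data" and "prob" are
placeholders that depend on the particular model; the header location may also
vary with how ncnn is installed.

    // Minimal ncnn inference sketch (assumptions: model files and blob
    // names are hypothetical; header path depends on the installation).
    #include <ncnn/net.h>

    #include <cstdio>
    #include <vector>

    int main()
    {
        ncnn::Net net;
        if (net.load_param("model.param") != 0 ||
            net.load_model("model.bin") != 0)
            return 1;

        // Dummy BGR pixel buffer standing in for a real image.
        const int w = 227, h = 227;
        std::vector<unsigned char> pixels(w * h * 3, 128);

        // Wrap the pixels in an ncnn::Mat, resizing to the network input size.
        ncnn::Mat in = ncnn::Mat::from_pixels_resize(
            pixels.data(), ncnn::Mat::PIXEL_BGR, w, h, 227, 227);

        ncnn::Extractor ex = net.create_extractor();
        ex.input("data", in);

        ncnn::Mat out;
        ex.extract("prob", out);

        std::printf("output width: %d\n", out.w);
        return 0;
    }

The program would be linked against libncnn provided by this package.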
Master sites:
Version history:
- (2024-12-14) Updated to version: ncnn-20240820nb1
- (2024-11-26) Package added to pkgsrc.se, version ncnn-20240820 (created)