Deploying Ultralytics YOLOv8 on a Raspberry Pi with NCNN

Introduction

I have recently been learning to deploy YOLO models in C++. For anyone without a GPU or NPU at hand, accelerating inference on the CPU matters all the more. I previously got YOLOv8 inference working on Windows with ONNX Runtime; interested readers can see my earlier CSDN post, "windows环境下 C++ onnxruntime框架yolov8推理". This post covers NCNN.

Model Preparation

Download the yolov8n weights from GitHub:

yolov8n.pt

Convert the .pt weights to NCNN format with the official Ultralytics export API:

from ultralytics import YOLO

model = YOLO('yolov8n.pt')
model.export(format='ncnn')

Raspberry Pi Environment Setup

You can refer to the following CSDN post:

树莓派5 yolov8 ncnn部署记录-CSDN博客

Full Code

Based on the CSDN post 树莓派5 yolov8 ncnn部署记录-CSDN博客.

Header File

#pragma once
#ifndef YOLOV8_H
#define YOLOV8_H

#include <opencv2/core/core.hpp>
#include <ncnn/net.h>

struct Config
{
    int imgsize;            // square input size fed to the network
    float IoU;              // IoU threshold used for NMS
    float confidence;       // minimum score for a detection to be kept
    const char* para_path;  // path to the ncnn .param file
    const char* bin_path;   // path to the ncnn .bin file
};


struct Object
{
    cv::Rect_<float> rect;  // bounding box in the original image
    int label;              // class index into class_names
    float prob;             // confidence score
};


class YoloV8
{
public:
    YoloV8(Config cfg);
    void load(Config cfg);
    void detect(const cv::Mat& rgb, std::vector<Object>& objects, Config cfg);
    void draw(cv::Mat& rgb, const std::vector<Object>& objects);
private:
    ncnn::Net model;
    float scale;                     // letterbox resize scale
    int Padwl, Padwr, Padht, Padhd;  // letterbox padding: left, right, top, bottom
    int num_thread;                  // number of ncnn inference threads
    float norm_vals[3];              // per-channel input normalization factors
};

#endif // YOLOV8_H

Source File

#include "yoloV8.h"
#include <ncnn/net.h>
#include <iostream>

#include <string>
#include <cmath>
#include <opencv2/opencv.hpp>
#include <stdio.h>
#include <thread>
using namespace std;

// The 80 COCO class names, in dataset order.
const char* class_names[] = {
    "person", "bicycle", "car", "motorcycle", "airplane", "bus", "train", "truck", "boat", "traffic light",
    "fire hydrant", "stop sign", "parking meter", "bench", "bird", "cat", "dog", "horse", "sheep", "cow",
    "elephant", "bear", "zebra", "giraffe", "backpack", "umbrella", "handbag", "tie", "suitcase", "frisbee",
    "skis", "snowboard", "sports ball", "kite", "baseball bat", "baseball glove", "skateboard", "surfboard",
    "tennis racket", "bottle", "wine glass", "cup", "fork", "knife", "spoon", "bowl", "banana", "apple",
    "sandwich", "orange", "broccoli", "carrot", "hot dog", "pizza", "donut", "cake", "chair", "couch",
    "potted plant", "bed", "dining table", "toilet", "tv", "laptop", "mouse", "remote", "keyboard", "cell phone",
    "microwave", "oven", "toaster", "sink", "refrigerator", "book", "clock", "vase", "scissors", "teddy bear",
    "hair drier", "toothbrush"
};