This article covers the question of how to implement a USB camera on Linux. Plenty of people run into trouble with this in real projects, so the following walks through how to handle those situations. Read it carefully and hopefully you will come away with something you can use.
I have been doing development on embedded Linux for many years now, at least nine by my count, and over that time I have written quite a few programs that read from an external USB camera or a CMOS camera module, capture video in real time, push the frames to a front end, or run face analysis on the images. My first attempt used QCamera, which I gave up on almost immediately; after some searching everything pointed to the v4l2 video framework, so after a lot of tinkering and many attempts I finally got it working, and I have kept refining it over several years since. Whatever program you write, getting the basic functionality up is always quick and easy, but making it solid takes a lot of time and effort to cover all the different scenarios. Take loading a camera with v4l2: you have to specify a device file to read from, yet on site you cannot expect the user to pick it for you, and frequent unplugging and replugging changes the device file name, so you need a mechanism that automatically finds the device file of the camera you want, for example a timer that runs a Linux command to check, and on different platforms the command may even differ slightly. If the machine has several cameras and you need to tell left from right, about the only option is to distinguish them by the order in which they are powered up after a power cycle.
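As a rough illustration of that auto-discovery idea, here is a minimal sketch that periodically rescans /dev for entries named video* with a QTimer. It uses QDir instead of spawning "ls /dev/", assumes Qt 5 for the lambda connection, and the function names and the 5-second interval are just assumptions for this example; the class later in this article does the same job with QProcess.

// A minimal sketch of periodic camera discovery, assuming Qt 5; the helper
// names and the rescan interval are illustrative only.
#include <QDir>
#include <QStringList>
#include <QTimer>
#include <QObject>
#include <QDebug>

static QStringList scanVideoDevices()
{
    // /dev entries whose name starts with "video" are the candidate
    // V4L2 capture nodes, e.g. /dev/video0, /dev/video1, ...
    QStringList devices;
    const QStringList names = QDir("/dev").entryList(QStringList() << "video*",
                                                     QDir::System | QDir::NoDotAndDotDot);
    foreach (const QString &name, names) {
        devices << QString("/dev/%1").arg(name);
    }
    return devices;
}

// Rescan every few seconds so a camera that was unplugged and plugged back
// in (possibly under a new device file name) is picked up again.
void startDeviceWatcher(QObject *parent)
{
    QTimer *timer = new QTimer(parent);
    QObject::connect(timer, &QTimer::timeout, [] {
        qDebug() << "video devices:" << scanVideoDevices();
    });
    timer->start(5000);
}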
Processing flow of the Linux approach:
1. Call the wrapper function findCamera to look up the camera device file name in real time.
2. Call ::open to open the device file.
3. Call the wrapper function initCamera to initialize the camera parameters (pixel format, resolution, and so on).
4. Call ::select to wait until the camera has produced a frame, then dequeue a buffer frame from the buffer queue.
5. The buffered frame is in YUYV format and has to be converted to RGB24 and then to a QImage (a conversion sketch follows this list).
6. Use the resulting image for drawing, face analysis, and so on.
7. Close the device file.
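The YUYV to RGB24 step mentioned in step 5 is not spelled out in the class code further down, so here is a minimal sketch of one common way to do it, using integer BT.601 coefficients; the helper function is an illustrative assumption, and the buffer and member names in the commented line are borrowed from the class for context.

// A minimal YUYV (4:2:2) -> RGB24 conversion sketch using integer BT.601
// coefficients; yuyvToRgb24() is an illustrative helper, not part of the
// original class.
#include <QImage>

static inline uchar clampByte(int v)
{
    return (uchar)(v < 0 ? 0 : (v > 255 ? 255 : v));
}

// src holds width * height * 2 bytes laid out as Y0 U Y1 V (two pixels share
// one U/V pair); dst must hold width * height * 3 bytes of packed RGB.
static void yuyvToRgb24(const uchar *src, uchar *dst, int width, int height)
{
    for (int i = 0; i < width * height / 2; i++) {
        int y0 = src[0], u = src[1] - 128, y1 = src[2], v = src[3] - 128;
        dst[0] = clampByte(y0 + ((359 * v) >> 8));             // R0
        dst[1] = clampByte(y0 - ((88 * u + 183 * v) >> 8));    // G0
        dst[2] = clampByte(y0 + ((454 * u) >> 8));             // B0
        dst[3] = clampByte(y1 + ((359 * v) >> 8));             // R1
        dst[4] = clampByte(y1 - ((88 * u + 183 * v) >> 8));    // G1
        dst[5] = clampByte(y1 + ((454 * u) >> 8));             // B1
        src += 4;
        dst += 6;
    }
}

// Wrapping the RGB24 buffer in a QImage; copy() detaches the image from the
// raw buffer so it can be reused for the next frame:
// QImage image = QImage(buff_rgb24, cameraWidth, cameraHeight,
//                       cameraWidth * 3, QImage::Format_RGB888).copy();

For a plain USB webcam the BT.601 coefficients above are the usual assumption; cameras that output BT.709 video would need the other coefficient set.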
Supports real-time capture from USB cameras on Windows, Linux, and embedded Linux.
Supports multi-threaded real-time capture from multiple USB cameras (a short usage sketch follows this list).
On embedded Linux devices, automatically finds the USB device file and loads it.
The device file name can also be set manually; once set, that device file is used instead.
Supports a face recognition interface on embedded Linux devices and draws face rectangles in real time.
Provides the usual open, pause, resume, close, and snapshot functions.
Two OSD labels can be configured, each with its own text, color, font size, position, and so on.
Can be used as a video surveillance system.
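To give an idea of how such a class is meant to be driven, here is a hypothetical usage sketch for the CameraLinux capture thread whose implementation follows. Only the signal names appear in the code below; the constructor signature, the setCameraName() setter and the receiving slots are assumptions for illustration.

// Hypothetical usage of the CameraLinux capture thread; constructor,
// setCameraName() and the two slots are assumptions, not confirmed API.
CameraLinux *camera = new CameraLinux(this);
camera->setCameraName("auto");                  // or a fixed name such as "/dev/video0"
connect(camera, SIGNAL(receiveImage(QImage)),
        this,   SLOT(updateImage(QImage)));     // draw each captured frame
connect(camera, SIGNAL(snapImage(QImage)),
        this,   SLOT(saveImage(QImage)));       // handle snapshot requests
camera->start();                                // run() loops until stopped is set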
void CameraLinux::run()
{
    while (!stopped) {
        if (!cameraOk) {
            msleep(10);
            continue;
        }

        if (isPause) {
            //Assume everything is fine while paused and keep the timestamp fresh
            lastTime = QDateTime::currentDateTime();
            msleep(10);
            continue;
        }

        QImage image = readImage();
        if (!image.isNull()) {
            if (isSnap) {
                emit snapImage(image);
                isSnap = false;
            }

            if (findFaceOne) {
                findFace(image);
            }

            if (findFaceRect) {
                image = drawFace(image);
            }

            lastTime = QDateTime::currentDateTime();
            emit receiveImage(image);
        }

        msleep(interval);
    }

    this->closeCamera();
    this->initData();
}

QDateTime CameraLinux::getLastTime() const
{
    return this->lastTime;
}

QString CameraLinux::getCameraName() const
{
    return this->cameraName;
}

int CameraLinux::getCameraWidth() const
{
    return this->cameraWidth;
}

int CameraLinux::getCameraHeight() const
{
    return this->cameraHeight;
}

void CameraLinux::sleep(int msec)
{
    if (msec > 0) {
        QTime endTime = QTime::currentTime().addMSecs(msec);
        while (QTime::currentTime() < endTime) {
            QCoreApplication::processEvents(QEventLoop::AllEvents, 100);
        }
    }
}

void CameraLinux::initData()
{
    stopped = false;
    isPause = false;
    isSnap = false;
    cameraOk = false;
    cameraHwnd = -1;
    errorCount = 0;
}

void CameraLinux::readData()
{
    QStringList cameraNames;
    while (!process->atEnd()) {
        //Read the command output line by line; entries starting with "video" are camera device files
        QString line = process->readLine();
        if (line.startsWith("video")) {
            line = line.replace("\n", "");
            cameraNames << QString("/dev/%1").arg(line);
        }
    }

    if (cameraNames.count() > 0) {
        cameraName = cameraNames.first();
        emit receiveCamera(cameraNames);
        qDebug() << TIMEMS << cameraNames;
    }
}

bool CameraLinux::initCamera()
{
    //If no device file name was specified (default "auto"), search for one
    if (cameraName == "auto") {
        findCamera();
    }

    //Wait a moment so the search has time to return a device file
    sleep(300);
    return openCamera();
}

void CameraLinux::findCamera()
{
    if (process->state() == QProcess::NotRunning) {
        process->start("ls /dev/");
    }
}

bool CameraLinux::openCamera()
{
#ifdef Q_OS_LINUX
    if (cameraName.length() > 5) {
        cameraHwnd = ::open(cameraName.toUtf8().data(), O_RDWR | O_NONBLOCK, 0);
    }

    if (cameraHwnd < 0) {
        qDebug() << TIMEMS << "open camera error";
        return false;
    }

    //Query the device capabilities
    struct v4l2_capability capability;
    if (::ioctl(cameraHwnd, VIDIOC_QUERYCAP, &capability) < 0) {
        qDebug() << TIMEMS << "error in VIDIOC_QUERYCAP";
        ::close(cameraHwnd);
        return false;
    }

    if (!(capability.capabilities & V4L2_CAP_VIDEO_CAPTURE)) {
        qDebug() << TIMEMS << "it is not a video capture device";
        ::close(cameraHwnd);
        return false;
    }

    if (!(capability.capabilities & V4L2_CAP_STREAMING)) {
        qDebug() << TIMEMS << "it does not support streaming";
        ::close(cameraHwnd);
        return false;
    }

    if (capability.capabilities == 0x4000001) {
        qDebug() << TIMEMS << "capabilities" << "V4L2_CAP_VIDEO_CAPTURE | V4L2_CAP_STREAMING";
    }

    //Set the video input source
    int input = 0;
    if (::ioctl(cameraHwnd, VIDIOC_S_INPUT, &input) < 0) {
        qDebug() << TIMEMS << "error in VIDIOC_S_INPUT";
        ::close(cameraHwnd);
        return false;
    }

    //Set the pixel format and resolution
    struct v4l2_format format;
    format.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    //Other formats are possible: V4L2_PIX_FMT_YUV420, V4L2_PIX_FMT_YUYV (4:2:2), V4L2_PIX_FMT_RGB565
    format.fmt.pix.pixelformat = V4L2_PIX_FMT_YUYV;
    //Some hardware produces a garbled image unless this is set to V4L2_FIELD_NONE
    format.fmt.pix.field = V4L2_FIELD_INTERLACED;
    format.fmt.pix.width = cameraWidth;
    format.fmt.pix.height = cameraHeight;

    int bpp = 16;
    //format.fmt.pix.bytesperline = width * bpp / 8;
    //format.fmt.pix.sizeimage = cameraWidth * cameraHeight * bpp / 8;
    if (::ioctl(cameraHwnd, VIDIOC_S_FMT, &format) < 0) {
        ::close(cameraHwnd);
        return false;
    }

    //Read the format back to check whether the settings were applied
    if (::ioctl(cameraHwnd, VIDIOC_G_FMT, &format) < 0) {
        qDebug() << TIMEMS << "error in VIDIOC_G_FMT";
        ::close(cameraHwnd);
        return false;
    }

    //Print the width and height again to confirm they were really applied
    struct v4l2_pix_format pix = format.fmt.pix;
    quint32 pixelformat = pix.pixelformat;
    qDebug() << TIMEMS << "cameraWidth" << cameraWidth << "cameraHeight" << cameraHeight << "width" << pix.width << "height" << pix.height;
    qDebug() << TIMEMS << "pixelformat" << QString("%1%2%3%4").arg(QChar(pixelformat & 0xFF)).arg(QChar((pixelformat >> 8) & 0xFF)).arg(QChar((pixelformat >> 16) & 0xFF)).arg(QChar((pixelformat >> 24) & 0xFF));

    //Overwrite the width and height with the values actually in effect
    cameraWidth = pix.width;
    cameraHeight = pix.height;

    //Set the streaming parameters (frame rate)
    struct v4l2_streamparm streamparm;
    streamparm.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    streamparm.parm.capture.timeperframe.numerator = 1;
    streamparm.parm.capture.timeperframe.denominator = 25;
    streamparm.parm.capture.capturemode = 0;
    if (::ioctl(cameraHwnd, VIDIOC_S_PARM, &streamparm) < 0) {
        qDebug() << TIMEMS << "error in VIDIOC_S_PARM";
        ::close(cameraHwnd);
        return false;
    }

    if (::ioctl(cameraHwnd, VIDIOC_G_PARM, &streamparm) < 0) {
        qDebug() << TIMEMS << "error in VIDIOC_G_PARM";
        ::close(cameraHwnd);
        return false;
    }

    //Request and manage the capture buffers
    struct v4l2_requestbuffers requestbuffers;
    requestbuffers.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    requestbuffers.memory = V4L2_MEMORY_MMAP;
    requestbuffers.count = 1;
    if (::ioctl(cameraHwnd, VIDIOC_REQBUFS, &requestbuffers) < 0) {
        qDebug() << TIMEMS << "error in VIDIOC_REQBUFS";
        ::close(cameraHwnd);
        return false;
    }

    buff_yuv422 = (uchar *)malloc(cameraWidth * cameraHeight * bpp / 8);
    buff_yuv420 = (uchar *)malloc(cameraWidth * cameraHeight * bpp / 8);
    buff_rgb24 = (uchar *)malloc(cameraWidth * cameraHeight * 24 / 8);
    buff_img = (ImgBuffer *)calloc(1, sizeof(ImgBuffer));
    if (buff_img == NULL) {
        qDebug() << TIMEMS << "error in calloc";
        ::close(cameraHwnd);
        return false;
    }

    struct v4l2_buffer buffer;
    for (int index = 0; index < 1; index++) {
        buffer.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        buffer.memory = V4L2_MEMORY_MMAP;
        buffer.index = index;
        if (::ioctl(cameraHwnd, VIDIOC_QUERYBUF, &buffer) < 0) {
            qDebug() << TIMEMS << "error in VIDIOC_QUERYBUF";
            ::free(buff_img);
            ::close(cameraHwnd);
            return false;
        }

        buff_img[index].length = buffer.length;
        buff_img[index].start = (quint8 *)mmap(NULL, buffer.length, PROT_READ | PROT_WRITE, MAP_SHARED, cameraHwnd, buffer.m.offset);
        if (MAP_FAILED == buff_img[index].start) {
            qDebug() << TIMEMS << "error in mmap";
            ::free(buff_img);
            ::close(cameraHwnd);
            return false;
        }

        //Queue the buffer so the driver can fill it
        if (::ioctl(cameraHwnd, VIDIOC_QBUF, &buffer) < 0) {
            qDebug() << TIMEMS << "error in VIDIOC_QBUF";
            for (int i = 0; i <= index; i++) {
                munmap(buff_img[i].start, buff_img[i].length);
            }
            ::free(buff_img);
            ::close(cameraHwnd);
            return false;
        }
    }

    enum v4l2_buf_type type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    if (::ioctl(cameraHwnd, VIDIOC_STREAMON, &type) < 0) {
        qDebug() << TIMEMS << "error in VIDIOC_STREAMON";
        for (int i = 0; i < 1; i++) {
            munmap(buff_img[i].start, buff_img[i].length);
        }
        ::free(buff_img);
        ::close(cameraHwnd);
        return false;
    }

    cameraOk = true;
#endif
    qDebug() << TIMEMS << "open camera ok";
    return cameraOk;
}

void CameraLinux::closeCamera()
{
#ifdef Q_OS_LINUX
    if (cameraOk && buff_img != NULL) {
        //Stop capturing
        enum v4l2_buf_type type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        if (::ioctl(cameraHwnd, VIDIOC_STREAMOFF, &type) < 0) {
            qDebug() << TIMEMS << "error in VIDIOC_STREAMOFF";
        }

        //Unmap the memory-mapped buffers
        for (int i = 0; i < 1; i++) {
            munmap((buff_img)[i].start, (buff_img)[i].length);
        }

        //Close the device file
        ::close(cameraHwnd);
        qDebug() << TIMEMS << "close camera ok";
    }

    //Release resources
    ::free(buff_img);
    buff_img = NULL;
    ::free(buff_yuv422);
    buff_yuv422 = NULL;
    ::free(buff_yuv420);
    buff_yuv420 = NULL;
    ::free(buff_rgb24);
    buff_rgb24 = NULL;

    cameraOk = false;
    cameraHwnd = -1;
#endif
}

int CameraLinux::readFrame()
{
    int index = -1;
#ifdef Q_OS_LINUX
    //Wait until the camera has captured a frame
    for (;;) {
        fd_set fds;
        struct timeval tv;
        FD_ZERO(&fds);
        FD_SET(cameraHwnd, &fds);
        tv.tv_sec = 2;
        tv.tv_usec = 0;

        int r = ::select(cameraHwnd + 1, &fds, NULL, NULL, &tv);
        if (-1 == r) {
            if (EINTR == errno) {
                continue;
            }
            return -1;
        } else if (0 == r) {
            return -1;
        } else {
            //A frame is ready; leave the loop
            break;
        }
    }

    //Dequeue a buffer frame from the buffer queue
    struct v4l2_buffer buffer;
    buffer.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    buffer.memory = V4L2_MEMORY_MMAP;
    if (::ioctl(cameraHwnd, VIDIOC_DQBUF, &buffer) < 0) {
        qDebug() << TIMEMS << "error in VIDIOC_DQBUF";
        return -1;
    }

    memcpy(buff_yuv422, (uchar *)buff_img[buffer.index].start, buff_img[buffer.index].length);

    //Re-queue the buffer so the driver can reuse it
    if (::ioctl(cameraHwnd, VIDIOC_QBUF, &buffer) < 0) {
        qDebug() << TIMEMS << "error in VIDIOC_QBUF";
        return -1;
    }

    index = buffer.index;
#endif
    return index;
}
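The readImage() method called from run() is not shown above. Under the assumption that it simply combines readFrame() with a YUYV to RGB24 conversion such as the one sketched earlier, it could look roughly like this:

// A possible readImage() implementation; this is an assumption, since the
// original code does not include it. yuyvToRgb24() is the conversion helper
// sketched earlier in this article.
QImage CameraLinux::readImage()
{
    QImage image;
#ifdef Q_OS_LINUX
    if (readFrame() >= 0) {
        // buff_yuv422 now holds one YUYV frame; convert it to RGB24 and wrap
        // it in a QImage, copying so the buffer can be reused immediately.
        yuyvToRgb24(buff_yuv422, buff_rgb24, cameraWidth, cameraHeight);
        image = QImage(buff_rgb24, cameraWidth, cameraHeight,
                       cameraWidth * 3, QImage::Format_RGB888).copy();
    }
#endif
    return image;
}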
That wraps up this look at how to implement a USB camera on Linux. Thanks for reading.