
Capturing Raw YUV420p Camera Data with the Android NDK

First of all, frameworks/av/camera/Camera.cpp (the legacy Camera API) is obsolete; don't use it for new code. That said, migrating off the old Camera stack is not cheap either, so many companies never bother.
This post first introduces the common raw data formats and then shows how to use the NDK camera API; the next post will dig into the source code.
The overall call path looks roughly like this:
CameraManager → CameraService → Camera HAL v3 → Sensor/Driver.

Common raw video data formats

A video is essentially a sequence of still images. Thanks to persistence of vision, at around 24 frames per second the human eye can no longer pick out individual frames.
Encoding uses algorithms to work out the relationships between successive images and compress them.
Decoding is the reverse process: the compressed data is restored, frame by frame, into images, which are then displayed.

yuv420p

This is the most common format. Let's take an example.
A 4x2 image is stored as follows.
First comes the Y plane, which has one sample per pixel:
YYYY
YYYY
Next comes the U plane; every four Y samples (one 2x2 block) share a single U sample:
UU
Then the V plane, in the same way:
VV
The final layout in memory:

YYYY
YYYY
UU
VV

A 5x3 image shows what happens with odd dimensions. The Y plane still has one sample per pixel:

YYYYY
YYYYY
YYYYY

The chroma planes are subsampled by two in each direction, with the dimensions rounded up (the usual convention), so the U and V planes are each 3x2:

UUU
UUU
VVV
VVV

In total the frame takes 15 + 6 + 6 = 27 bytes of memory.
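To make the size arithmetic concrete, here is a small standalone helper (my own illustration, not part of the project code) that computes the plane sizes of a yuv420p frame, rounding the chroma dimensions up for odd sizes:

#include <cstddef>
#include <cstdio>

// Sizes of the three planes of a yuv420p (I420) frame.
struct Yuv420pSizes {
    std::size_t ySize;      // width * height
    std::size_t chromaSize; // ceil(width/2) * ceil(height/2), for U and for V
    std::size_t total;      // ySize + 2 * chromaSize
};

Yuv420pSizes yuv420pSizes(int width, int height) {
    const std::size_t ySize = static_cast<std::size_t>(width) * height;
    const std::size_t chromaSize =
            static_cast<std::size_t>((width + 1) / 2) * ((height + 1) / 2);
    return {ySize, chromaSize, ySize + 2 * chromaSize};
}

int main() {
    const auto s = yuv420pSizes(5, 3);
    // Prints: Y=15 U=6 V=6 total=27
    std::printf("Y=%zu U=%zu V=%zu total=%zu\n", s.ySize, s.chromaSize, s.chromaSize, s.total);
    return 0;
}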
YU12

YU12 (also known as I420) is exactly the planar layout described above: the whole Y plane, then the whole U plane, then the whole V plane. For a 4x8 image:
YYYY
YYYY
YYYY
YYYY
YYYY
YYYY
YYYY
YYYY
UU
UU
UU
UU
VV
VV
VV
VV

YV12

YV12 is the same layout with the chroma planes swapped: the whole V plane comes before the whole U plane. For the same 4x8 image:
YYYY
YYYY
YYYY
YYYY
YYYY
YYYY
YYYY
YYYY
VV
VV
VV
VV
UU
UU
UU
UU

yuv420sp

The difference from yuv420p is that the planar format stores U and V in two separate planes, while the semi-planar format interleaves the U and V samples in a single chroma plane.
yuv420sp comes in two variants: NV12 and NV21.
NV12
For a 4x8 image:

YYYY
YYYY
YYYY
YYYY
YYYY
YYYY
YYYY
YYYY
UVUV
UVUV
UVUV
UVUV

NV21
For the same 4x8 image, with V before U in each pair:

YYYY
YYYY
YYYY
YYYY
YYYY
YYYY
YYYY
YYYY
VUVU
VUVU
VUVU
VUVU
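Because the NDK may hand you either layout, it is useful to know how to normalize between them. Below is a minimal sketch (my own illustration, not code from this project) that de-interleaves an NV21 buffer into I420/yuv420p; it assumes even width and height:

#include <algorithm>
#include <cstdint>
#include <vector>

// Convert NV21 (Y plane + interleaved V,U plane) into I420 (Y, U, V planes).
// Assumes width and height are even.
std::vector<uint8_t> nv21ToI420(const uint8_t *nv21, int width, int height) {
    const int ySize = width * height;
    const int chromaSize = (width / 2) * (height / 2);
    std::vector<uint8_t> i420(ySize + 2 * chromaSize);

    // The Y plane is identical in both layouts.
    std::copy(nv21, nv21 + ySize, i420.begin());

    const uint8_t *vu = nv21 + ySize;  // interleaved V,U,V,U,...
    uint8_t *u = i420.data() + ySize;  // I420: U plane first...
    uint8_t *v = u + chromaSize;       // ...then the V plane
    for (int i = 0; i < chromaSize; ++i) {
        v[i] = vu[2 * i];              // V comes first in each NV21 pair
        u[i] = vu[2 * i + 1];
    }
    return i420;
}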

Wrapping it up in code

cmake


# For more information about using CMake with Android Studio, read the
# documentation: https://d.android.com/studio/projects/add-native-code.html.
# For more examples on how to use CMake, see https://github.com/android/ndk-samples.

# Sets the minimum CMake version required for this project.
cmake_minimum_required(VERSION 3.22.1)

# Declares the project name. The project name can be accessed via ${ PROJECT_NAME},
# Since this is the top level CMakeLists.txt, the project name is also accessible
# with ${CMAKE_PROJECT_NAME} (both CMake variables are in-sync within the top level
# build script scope).
project(openslLearn VERSION 0.1.0 LANGUAGES C CXX)

# Set the C++ standard.
set(CMAKE_CXX_STANDARD 23)           # use the C++23 standard
set(CMAKE_CXX_STANDARD_REQUIRED ON)  # make the requested standard mandatory
set(CMAKE_CXX_EXTENSIONS OFF)        # disable compiler extensions (pure standard C++)

# Creates and names a library, sets it as either STATIC
# or SHARED, and provides the relative paths to its source code.
# You can define multiple libraries, and CMake builds them for you.
# Gradle automatically packages shared libraries with your APK.
#
# In this top level CMakeLists.txt, ${CMAKE_PROJECT_NAME} is used to define
# the target library name; in the sub-module's CMakeLists.txt, ${PROJECT_NAME}
# is preferred for the same purpose.
#
# In order to load a library into your app from Java/Kotlin, you must call
# System.loadLibrary() and pass the name of the library defined here;
# for GameActivity/NativeActivity derived applications, the same library name must be
# used in the AndroidManifest.xml file.

# First library: collect its source files.
file(GLOB_RECURSE LEARN01_SOURCES CONFIGURE_DEPENDS
        "src/learn01/*.cpp"
        "src/learn01/*.c"
)
add_library(${CMAKE_PROJECT_NAME} SHARED ${LEARN01_SOURCES})

# Header include paths.
target_include_directories(${CMAKE_PROJECT_NAME}
        PUBLIC ${CMAKE_CURRENT_SOURCE_DIR}/include/learn01
        PUBLIC ${CMAKE_CURRENT_SOURCE_DIR}/include/logging
)

# Specifies libraries CMake should link to your target library. You
# can link libraries from various origins, such as libraries defined in this
# build script, prebuilt third-party libraries, or Android system libraries.
target_link_libraries(${CMAKE_PROJECT_NAME}
        # List libraries to link to the target library
        android
        log
        OpenSLES
)

# Second library (openslLearn2).
file(GLOB_RECURSE LEARN02_SOURCES CONFIGURE_DEPENDS
        "src/learn02/*.cpp"
        "src/learn02/*.c"
        "src/sqlite/*.cpp"
        "src/sqlite/*.c"
)
set(LIBRARY_NAME2 ${CMAKE_PROJECT_NAME}2)
message("LIBRARY_NAME2: ${LIBRARY_NAME2}")
add_library(${LIBRARY_NAME2} SHARED ${LEARN02_SOURCES})  # uses a different set of sources

target_include_directories(${LIBRARY_NAME2}
        PUBLIC ${CMAKE_CURRENT_SOURCE_DIR}/include/sqlite
        PUBLIC ${CMAKE_CURRENT_SOURCE_DIR}/include/learn02
        PUBLIC ${CMAKE_CURRENT_SOURCE_DIR}/include/logging
)
find_package(oboe REQUIRED CONFIG)
target_link_libraries(${LIBRARY_NAME2}
        android
        log
        aaudio
        oboe::oboe
        camera2ndk
        mediandk
)

头文件

//
// Created by 29051 on 2025/10/25.
//
#ifndef OPENSL_LEARN_CAMERA_HPP
#define OPENSL_LEARN_CAMERA_HPP

extern "C" {
#include <camera/NdkCameraManager.h>
#include <media/NdkImageReader.h>
}

#include <string>
#include <fstream>

#include "logging.hpp"

class NDKCamera {
private:
    int mWidth;
    int wHeight;
    ACameraManager *aCameraManager = nullptr;
    ACameraDevice *device = nullptr;
    ACameraCaptureSession *session = nullptr;
    AImageReader *aImageReader = nullptr;
    ACaptureSessionOutputContainer *aCaptureSessionOutputContainer = nullptr;
    ACaptureSessionOutput *sessionOutput = nullptr;
    std::string yuvPath;
    std::ofstream *yuvStream = nullptr;

public:
    NDKCamera(int width, int height, std::string yuvPath);
    ~NDKCamera();

    /**
     * Print this camera's capabilities (supported YUV resolutions and FPS ranges).
     */
    void printCameraCapabilities(const char *cameraId);
};

#endif //OPENSL_LEARN_CAMERA_HPP
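The header pulls in logging.hpp, which is not shown in this post. As a rough sketch of what it might contain, assuming it is nothing more than a thin printf-style wrapper around __android_log_print (the logger::info / logger::error names are taken from how the source file calls it):

// logging.hpp (hypothetical minimal version)
#ifndef OPENSL_LEARN_LOGGING_HPP
#define OPENSL_LEARN_LOGGING_HPP

#include <android/log.h>

namespace logger {
    // Forward the printf-style arguments straight to logcat at INFO priority.
    template <typename... Args>
    void info(const char *tag, const char *fmt, Args... args) {
        __android_log_print(ANDROID_LOG_INFO, tag, fmt, args...);
    }

    // Same, at ERROR priority.
    template <typename... Args>
    void error(const char *tag, const char *fmt, Args... args) {
        __android_log_print(ANDROID_LOG_ERROR, tag, fmt, args...);
    }
}

#endif //OPENSL_LEARN_LOGGING_HPP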

Source file

//
// Created by 29051 on 2025/10/25.
//
#include "NDKCamera.hpp"

#include <utility>

const char *const TAG = "NDKCamera";

/**
 * CameraManager → CameraService → Camera HAL v3 → Sensor/Driver
 * @param width
 * @param height
 */
NDKCamera::NDKCamera(int width, int height, std::string yuvPath)
        : mWidth(width), wHeight(height), yuvPath(std::move(yuvPath)) {
    logger::info(TAG, "width: %d, height: %d, yuvPath: %s", this->mWidth, this->wHeight, this->yuvPath.c_str());
    this->yuvStream = new std::ofstream(this->yuvPath, std::ios::binary);
    if (!this->yuvStream->is_open()) {
        logger::error(TAG, "failed to open the output file...");
        return;
    }
    aCameraManager = ACameraManager_create();
    if (aCameraManager == nullptr) {
        logger::error(TAG, "aCameraManager is null");
        return;
    }
    ACameraIdList *cameraIdList = nullptr;
    camera_status_t status = ACameraManager_getCameraIdList(aCameraManager, &cameraIdList);
    if (status != ACAMERA_OK) {
        logger::error(TAG, "getCameraIdList failed");
        return;
    }
    if (cameraIdList->numCameras <= 0) {
        logger::error(TAG, "this device has no camera");
        return;
    }
    for (int i = 0; i < cameraIdList->numCameras; i++) {
        logger::info(TAG, "index: %d, cameraId: %s", i, cameraIdList->cameraIds[i]);
    }
    // Index 1 is usually the front camera; make sure the device actually has more than one camera.
    const char *cameraId = cameraIdList->cameraIds[1];
    this->printCameraCapabilities(cameraId);
    ACameraDevice_StateCallbacks deviceStateCallbacks = {
            .context = nullptr,
            .onDisconnected = [](void *, ACameraDevice *aCameraDevice) -> void {},
            .onError = [](void *, ACameraDevice *aCameraDevice, int errorCode) -> void {},
    };
    status = ACameraManager_openCamera(aCameraManager, cameraId, &deviceStateCallbacks, &device);
    if (status != ACAMERA_OK) {
        logger::error(TAG, "openCamera failed");
        return;
    }
    media_status_t mediaStatus = AImageReader_new(width, height, AIMAGE_FORMAT_YUV_420_888, 4, &aImageReader);
    if (mediaStatus != AMEDIA_OK) {
        logger::error(TAG, "AImageReader_new failed");
        return;
    }
    AImageReader_ImageListener imageListener = {
            .context = this,
            .onImageAvailable = [](void *context, AImageReader *reader) -> void {
                AImage *image = nullptr;
                media_status_t mediaStatus = AImageReader_acquireNextImage(reader, &image);
                if (mediaStatus != AMEDIA_OK || image == nullptr) {
                    logger::error(TAG, "failed to acquire the current YUV frame");
                    AImage_delete(image);
                    return;
                }
                int32_t width = 0, height = 0;
                mediaStatus = AImage_getWidth(image, &width);
                if (mediaStatus != AMEDIA_OK) {
                    logger::error(TAG, "failed to get the frame width");
                    AImage_delete(image);
                    return;
                }
                mediaStatus = AImage_getHeight(image, &height);
                if (mediaStatus != AMEDIA_OK) {
                    logger::error(TAG, "failed to get the frame height");
                    AImage_delete(image);
                    return;
                }
                // ==========
                const auto *ndkCamera = reinterpret_cast<NDKCamera *>(context);
                for (int plane = 0; plane < 3; ++plane) {
                    uint8_t *planeData = nullptr;
                    int planeDataLen = 0;
                    if (AImage_getPlaneData(image, plane, &planeData, &planeDataLen) != AMEDIA_OK) {
                        logger::error(TAG, "AImage_getPlaneData failed plane=%d", plane);
                        AImage_delete(image);
                        return;
                    }
                    int rowStride = 0, pixelStride = 0;
                    AImage_getPlaneRowStride(image, plane, &rowStride);
                    AImage_getPlanePixelStride(image, plane, &pixelStride);
                    int planeWidth = (plane == 0) ? width : (width + 1) / 2;
                    int planeHeight = (plane == 0) ? height : (height + 1) / 2;
                    // Write row by row, honouring pixelStride, so the output is contiguous Y, then U, then V.
                    for (int y = 0; y < planeHeight; ++y) {
                        const uint8_t *rowPtr = planeData + y * rowStride;
                        if (pixelStride == 1) {
                            // Tightly packed: write planeWidth bytes directly.
                            ndkCamera->yuvStream->write(reinterpret_cast<const char *>(rowPtr), planeWidth);
                        } else {
                            // Samples are interleaved/padded: pick one byte every pixelStride bytes.
                            for (int x = 0; x < planeWidth; ++x) {
                                ndkCamera->yuvStream->put(rowPtr[x * pixelStride]);
                            }
                        }
                    }
                }
                AImage_delete(image);
                logger::info(TAG, "yuv width: %d, height: %d", width, height);
            },
    };
    AImageReader_setImageListener(aImageReader, &imageListener);
    ANativeWindow *window = nullptr;
    mediaStatus = AImageReader_getWindow(aImageReader, &window);
    if (mediaStatus != AMEDIA_OK) {
        logger::error(TAG, "AImageReader_getWindow failed");
        return;
    }
    ACaptureRequest *request = nullptr;
    status = ACameraDevice_createCaptureRequest(device, TEMPLATE_PREVIEW, &request);
    if (status != ACAMERA_OK) {
        logger::error(TAG, "ACameraDevice_createCaptureRequest failed");
        return;
    }
    // Set the target FPS range.
    int32_t range[2] = {30, 30}; // fixed 30 fps
    ACaptureRequest_setEntry_i32(request, ACAMERA_CONTROL_AE_TARGET_FPS_RANGE, 2, range);
    ACameraOutputTarget *aCameraOutputTarget = nullptr;
    status = ACameraOutputTarget_create(window, &aCameraOutputTarget);
    if (status != ACAMERA_OK) {
        logger::error(TAG, "ACameraOutputTarget_create failed");
        return;
    }
    status = ACaptureRequest_addTarget(request, aCameraOutputTarget);
    if (status != ACAMERA_OK) {
        logger::error(TAG, "ACaptureRequest_addTarget failed");
        return;
    }
    ACameraCaptureSession_stateCallbacks sessionStateCallbacks = {
            .context = nullptr,
            .onClosed = [](void *context, ACameraCaptureSession *session) -> void {
                logger::info(TAG, "onClosed...");
            },
            .onReady = [](void *context, ACameraCaptureSession *session) -> void {
                logger::info(TAG, "onReady...");
            },
            .onActive = [](void *context, ACameraCaptureSession *session) -> void {
                logger::info(TAG, "onActive...");
            },
    };
    ACameraCaptureSession_captureCallbacks captureCallbacks = {
            .context = nullptr,
            .onCaptureStarted = [](void *context, ACameraCaptureSession *session,
                                   const ACaptureRequest *request, int64_t timestamp) -> void {
                logger::info(TAG, "onCaptureStarted timestamp: %lld", static_cast<long long>(timestamp));
            },
            .onCaptureProgressed = [](void *context, ACameraCaptureSession *session,
                                      ACaptureRequest *request, const ACameraMetadata *result) -> void {
                logger::info(TAG, "onCaptureProgressed...");
            },
            .onCaptureCompleted = [](void *context, ACameraCaptureSession *session,
                                     ACaptureRequest *request, const ACameraMetadata *result) -> void {
                ACameraMetadata_const_entry fpsEntry = {};
                if (ACameraMetadata_getConstEntry(result, ACAMERA_CONTROL_AE_TARGET_FPS_RANGE, &fpsEntry) == ACAMERA_OK) {
                    if (fpsEntry.count >= 2) {
                        int32_t minFps = fpsEntry.data.i32[0];
                        int32_t maxFps = fpsEntry.data.i32[1];
                        logger::info(TAG, "onCaptureCompleted current fps range: [%d, %d]", minFps, maxFps);
                    }
                }
            },
            .onCaptureFailed = [](void *context, ACameraCaptureSession *session,
                                  ACaptureRequest *request, ACameraCaptureFailure *failure) -> void {
                logger::info(TAG, "onCaptureFailed frameNumber: %lld, reason: %d, sequenceId: %d, wasImageCaptured: %d",
                             static_cast<long long>(failure->frameNumber), failure->reason,
                             failure->sequenceId, failure->wasImageCaptured);
            },
            .onCaptureSequenceCompleted = [](void *context, ACameraCaptureSession *session,
                                             int sequenceId, int64_t frameNumber) -> void {
                logger::info(TAG, "onCaptureSequenceCompleted sequenceId: %d, frameNumber: %lld",
                             sequenceId, static_cast<long long>(frameNumber));
            },
            .onCaptureSequenceAborted = [](void *context, ACameraCaptureSession *session,
                                           int sequenceId) -> void {
                logger::info(TAG, "onCaptureSequenceAborted sequenceId: %d", sequenceId);
            },
            .onCaptureBufferLost = [](void *context, ACameraCaptureSession *session,
                                      ACaptureRequest *request, ACameraWindowType *window, int64_t frameNumber) -> void {
                logger::info(TAG, "onCaptureBufferLost frameNumber: %lld", static_cast<long long>(frameNumber));
            },
    };
    status = ACaptureSessionOutputContainer_create(&aCaptureSessionOutputContainer);
    if (status != ACAMERA_OK) {
        logger::error(TAG, "ACaptureSessionOutputContainer_create failed");
        return;
    }
    status = ACaptureSessionOutput_create(window, &sessionOutput);
    if (status != ACAMERA_OK) {
        logger::error(TAG, "ACaptureSessionOutput_create failed");
        return;
    }
    status = ACaptureSessionOutputContainer_add(aCaptureSessionOutputContainer, sessionOutput);
    if (status != ACAMERA_OK) {
        logger::error(TAG, "ACaptureSessionOutputContainer_add failed");
        return;
    }
    status = ACameraDevice_createCaptureSession(device, aCaptureSessionOutputContainer, &sessionStateCallbacks, &session);
    if (status != ACAMERA_OK) {
        logger::error(TAG, "ACameraDevice_createCaptureSession failed");
        return;
    }
#if __ANDROID_API__ >= 33
    ACameraCaptureSession_captureCallbacksV2 captureCallbacksV2 = {
            .context = nullptr,
            .onCaptureStarted = [](void *context, ACameraCaptureSession *session,
                                   const ACaptureRequest *request, int64_t timestamp, int64_t frameNumber) -> void {},
            .onCaptureProgressed = [](void *context, ACameraCaptureSession *session,
                                      ACaptureRequest *request, const ACameraMetadata *result) -> void {},
            .onCaptureCompleted = [](void *context, ACameraCaptureSession *session,
                                     ACaptureRequest *request, const ACameraMetadata *result) -> void {},
            .onCaptureFailed = [](void *context, ACameraCaptureSession *session,
                                  ACaptureRequest *request, ACameraCaptureFailure *failure) -> void {},
            .onCaptureSequenceCompleted = [](void *context, ACameraCaptureSession *session,
                                             int sequenceId, int64_t frameNumber) -> void {},
            .onCaptureSequenceAborted = [](void *context, ACameraCaptureSession *session,
                                           int sequenceId) -> void {},
            .onCaptureBufferLost = [](void *context, ACameraCaptureSession *session,
                                      ACaptureRequest *request, ACameraWindowType *window, int64_t frameNumber) -> void {},
    };
    status = ACameraCaptureSession_setRepeatingRequestV2(session, &captureCallbacksV2, 1, &request, nullptr);
    if (status != ACAMERA_OK) {
        logger::error(TAG, "ACameraCaptureSession_setRepeatingRequestV2 failed");
        return;
    }
#else
    status = ACameraCaptureSession_setRepeatingRequest(session, &captureCallbacks, 1, &request, nullptr);
    if (status != ACAMERA_OK) {
        logger::error(TAG, "ACameraCaptureSession_setRepeatingRequest failed");
        return;
    }
#endif
}
NDKCamera::~NDKCamera() {
    logger::info(TAG, "~NDKCamera...");
    if (this->aImageReader != nullptr) {
        AImageReader_delete(this->aImageReader);
    }
    if (session != nullptr) {
        ACameraCaptureSession_close(session);
    }
    if (device != nullptr) {
        ACameraDevice_close(device);
    }
    if (aCameraManager != nullptr) {
        ACameraManager_delete(aCameraManager);
    }
    if (this->yuvStream != nullptr) {
        this->yuvStream->close();
    }
    if (this->aCaptureSessionOutputContainer != nullptr) {
        ACaptureSessionOutputContainer_free(this->aCaptureSessionOutputContainer);
    }
    if (this->sessionOutput != nullptr) {
        ACaptureSessionOutput_free(this->sessionOutput);
    }
}

void NDKCamera::printCameraCapabilities(const char *const cameraId) {
    ACameraMetadata *metadata = nullptr;
    camera_status_t status = ACameraManager_getCameraCharacteristics(this->aCameraManager, cameraId, &metadata);
    if (status != ACAMERA_OK) {
        logger::error(TAG, "failed to get camera characteristics");
        return;
    }
    ACameraMetadata_const_entry entry = {};
    if (ACameraMetadata_getConstEntry(metadata, ACAMERA_SCALER_AVAILABLE_STREAM_CONFIGURATIONS, &entry) == ACAMERA_OK) {
        logger::info(TAG, "supported resolutions:");
        for (uint32_t i = 0; i + 3 < entry.count; i += 4) {
            int32_t format = entry.data.i32[i + 0];
            int32_t width = entry.data.i32[i + 1];
            int32_t height = entry.data.i32[i + 2];
            int32_t isInput = entry.data.i32[i + 3];
            if (isInput == 0 && format == AIMAGE_FORMAT_YUV_420_888) {
                logger::info(TAG, "format: %d, width: %d, height: %d, isInput: %d", format, width, height, isInput);
            }
        }
    }
    if (ACameraMetadata_getConstEntry(metadata, ACAMERA_CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES, &entry) == ACAMERA_OK) {
        logger::info(TAG, "supported fps ranges:");
        for (uint32_t i = 0; i + 1 < entry.count; i += 2) {
            logger::info(TAG, "fps range: [%d, %d]", entry.data.i32[i], entry.data.i32[i + 1]);
        }
    }
    ACameraMetadata_free(metadata);
}

Exposing it to Kotlin

extern "C"
JNIEXPORT jlong JNICALL
Java_io_github_opensllearn_utils_Utils_initCamera(JNIEnv *env, jobject, jint width, jint height, jstring pcmPath) {
    NDKCamera *ndkCamera = nullptr;
    try {
        jboolean isCopy = false;
        const char *const pcmPathStr = env->GetStringUTFChars(pcmPath, &isCopy);
        ndkCamera = new NDKCamera(width, height, pcmPathStr);
        // ReleaseStringUTFChars must always be paired with GetStringUTFChars,
        // regardless of whether the JVM actually made a copy.
        env->ReleaseStringUTFChars(pcmPath, pcmPathStr);
    } catch (const std::exception &e) {
        delete ndkCamera;
        ndkCamera = nullptr;
        env->ThrowNew(env->FindClass("java/lang/RuntimeException"), e.what());
    }
    return reinterpret_cast<jlong>(ndkCamera);
}
extern "C"
JNIEXPORT void JNICALL
Java_io_github_opensllearn_utils_Utils_releaseCamera(JNIEnv *, jobject, jlong ptr) {
    const auto *const ndkKCamera = reinterpret_cast<NDKCamera *>(ptr);
    delete ndkKCamera;
}

That's the capture side done. If you want to render the frames later, you can pass a Surface into native code and use OpenGL: convert the yuv420p data to RGB first and then hand it to OpenGL. It is not very complicated.
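For reference, this is roughly what that conversion looks like on the CPU — a sketch using the common BT.601 full-range formulas and assuming even width and height; in a real renderer you would usually do this in a fragment shader instead:

#include <algorithm>
#include <cstdint>

// Convert one full-range BT.601 YUV sample to RGB.
inline void yuvToRgb(uint8_t yv, uint8_t uv, uint8_t vv,
                     uint8_t &r, uint8_t &g, uint8_t &b) {
    const float y = static_cast<float>(yv);
    const float u = static_cast<float>(uv) - 128.0f;
    const float v = static_cast<float>(vv) - 128.0f;
    auto clamp255 = [](float x) {
        return static_cast<uint8_t>(std::min(255.0f, std::max(0.0f, x)));
    };
    r = clamp255(y + 1.402f * v);
    g = clamp255(y - 0.344f * u - 0.714f * v);
    b = clamp255(y + 1.772f * u);
}

// Expand a yuv420p frame into interleaved RGB (3 bytes per pixel).
void i420ToRgb(const uint8_t *yuv, int width, int height, uint8_t *rgb) {
    const uint8_t *yPlane = yuv;
    const uint8_t *uPlane = yuv + width * height;
    const uint8_t *vPlane = uPlane + (width / 2) * (height / 2);
    for (int row = 0; row < height; ++row) {
        for (int col = 0; col < width; ++col) {
            // Each 2x2 block of Y pixels shares one U and one V sample.
            const int chromaIdx = (row / 2) * (width / 2) + (col / 2);
            uint8_t *px = rgb + (row * width + col) * 3;
            yuvToRgb(yPlane[row * width + col], uPlane[chromaIdx], vPlane[chromaIdx],
                     px[0], px[1], px[2]);
        }
    }
}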

Core logic

for (int plane = 0; plane < 3; ++plane) {
    uint8_t *planeData = nullptr;
    int planeDataLen = 0;
    if (AImage_getPlaneData(image, plane, &planeData, &planeDataLen) != AMEDIA_OK) {
        logger::error(TAG, "AImage_getPlaneData failed plane=%d", plane);
        AImage_delete(image);
        return;
    }
    int rowStride = 0, pixelStride = 0;
    AImage_getPlaneRowStride(image, plane, &rowStride);
    AImage_getPlanePixelStride(image, plane, &pixelStride);
    int planeWidth = (plane == 0) ? width : (width + 1) / 2;
    int planeHeight = (plane == 0) ? height : (height + 1) / 2;
    // Write row by row, honouring pixelStride, so the output is contiguous Y, then U, then V.
    for (int y = 0; y < planeHeight; ++y) {
        const uint8_t *rowPtr = planeData + y * rowStride;
        if (pixelStride == 1) {
            // Tightly packed: write planeWidth bytes directly.
            ndkCamera->yuvStream->write(reinterpret_cast<const char *>(rowPtr), planeWidth);
        } else {
            // Samples are interleaved/padded: pick one byte every pixelStride bytes.
            for (int x = 0; x < planeWidth; ++x) {
                ndkCamera->yuvStream->put(rowPtr[x * pixelStride]);
            }
        }
    }
}

AIMAGE_FORMAT_YUV_420_888: the trailing 888 means that Y, U and V each take 8 bits (one byte).
This deliberately loose format covers both the yuv420p and yuv420sp layouts described above; which one you actually get depends on the device.

int32_t planes = 0;
AImage_getNumberOfPlanes(image, &planes);

AImage_getNumberOfPlanes returns the number of planes in the image; for AIMAGE_FORMAT_YUV_420_888 it is 3 (Y, U and V).
AImage_getPlaneData(image, plane, &planeData, &planeDataLen) then returns the buffer of the requested plane.
planeData points at the plane's bytes; you can think of it as a 2-D array flattened row by row into one contiguous buffer, and planeDataLen is the length of that flattened buffer.
For example, a Y plane:


planeData
|
YYYY
YYYY

And another example:

planeData
|
UPUP

Now we get to the interesting part, so stay sharp! But first, an AI-generated joke.

"Tweaking code until the small hours, the existential question hits: what's the point of grinding through YUV formats and fiddling with AImage?" "If a rich lady burst in right now, slapped me and said 'stop grinding on this junk', then tossed me a black card with 'I'm taking you around the world', I'd delete the compiler on the spot!"
"Stuck for another two hours debugging YUV420P conversion, staring blankly at the screen: how much is any of this actually worth?" "Then the daydream: a rich lady pushes the door open, slaps me, and says, all domineering, 'stop fighting with pixels', then drags me out with 'we're off to the Maldives to sunbathe'. Ah well, dream's over, back to fixing bugs."
"While writing the AImage data-extraction code, the urge to give up: besides hair loss, what do these niche skills get me?" "If a rich lady came over, gave me a gentle slap and said 'stop studying, it's useless', then added 'I'll take you around the world', I'd drag the project folder into the recycle bin right now, no hesitation!"

Dream's over!
AImage_getPlaneRowStride returns the number of bytes per row, and that count can include invalid (padding) bytes.
Like this:

planeData
|
UPUP

Here P is the invalid data, which is why the next function comes into play.
AImage_getPlanePixelStride gives the distance in bytes between two consecutive valid samples within a row.
When that stride is greater than 1 you have to copy the plane one byte at a time.
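As a side note, the pixel stride of plane 1 also tells you which concrete layout the device handed you, so in some cases you can copy the chroma data in bulk instead of byte by byte. A small sketch of that check (my own illustration, using the same AImage calls as above):

#include <media/NdkImage.h>

// Rough classification of a YUV_420_888 image based on plane 1 (the U plane).
// Pixel stride 1 means the chroma planes are tightly packed (I420-like);
// pixel stride 2 means U and V are interleaved (NV12/NV21-like).
const char *classifyYuvLayout(AImage *image) {
    int32_t pixelStride = 0;
    if (AImage_getPlanePixelStride(image, 1, &pixelStride) != AMEDIA_OK) {
        return "unknown";
    }
    if (pixelStride == 1) {
        return "I420-like (planar chroma)";
    }
    if (pixelStride == 2) {
        return "NV12/NV21-like (interleaved chroma)";
    }
    return "unknown";
}

Whether an interleaved buffer is NV12 or NV21 depends on whether plane 1 starts with U or V, which the stride alone cannot tell you.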
That's it.

