How to calculate OpenGL camera image vertices, knowing OpenCV extrinsic calibration parameters?


So, I did intrinsic and extrinsic camera calibration, and I got the intrinsic parameters (fx, fy, cx, cy) and the extrinsics [R|t]. The function cv2.drawFrameAxes plots the axes just fine, with the length that I set.
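For context, this is roughly what the setup looks like. It is only a minimal, runnable sketch: the intrinsics, distortion, marker size, corner pixels and frame below are placeholders, not my real calibration data.

    import numpy as np
    import cv2

    # Placeholder intrinsics (fx, fy, cx, cy) and zero distortion -- not my real values.
    fx, fy, cx, cy = 800.0, 800.0, 640.0, 360.0
    K = np.array([[fx, 0, cx], [0, fy, cy], [0, 0, 1]], dtype=np.float64)
    dist = np.zeros(5)

    # 3D marker corners in world coordinates (marker centre = world origin)
    # and made-up 2D corners as they would come out of marker detection.
    s = 0.10  # marker side length in metres (placeholder)
    obj_pts = np.array([[-s / 2,  s / 2, 0], [ s / 2,  s / 2, 0],
                        [ s / 2, -s / 2, 0], [-s / 2, -s / 2, 0]])
    img_pts = np.array([[600.0, 300.0], [700.0, 305.0],
                        [695.0, 400.0], [598.0, 396.0]])

    frame = np.zeros((720, 1280, 3), dtype=np.uint8)  # stand-in for the camera image

    ok, rvec, tvec = cv2.solvePnP(obj_pts, img_pts, K, dist)
    cv2.drawFrameAxes(frame, K, dist, rvec, tvec, 0.1)  # this part works as expected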

  1. I don't understand the translation vector t. It should be in world coordinates. The magnitude of that vector should be the distance from the camera to the center of the ArUco marker (which I took as the origin of the world coordinate system). My vector looks like this:

    T= [[-0.58791781]
        [-0.1707134 ]
        [ 2.31275417]]
    

    ...but when I measure the distance, it is about 1.4 meters. What is the reason for this difference? (The way I read R and t is sketched after this list.)

  2. I would like to render a sphere in OpenGL and, on the part the camera points at, render the camera's image. The texture coordinates would go from [0,0] to [1,1], but how do I calculate the vertex coordinates of the image corners (my rough attempt is in the sketch after this list)? The camera's orientation and position in OpenGL are not a problem.
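To make the two questions concrete, here is how I currently read R and t and what I imagine the corner computation should look like. It is only a sketch under my own assumptions: the intrinsics K, the rotation vector, the image size and the depth d are placeholders, and I assume the convention X_cam = R @ X_world + t. Please correct it if that interpretation is wrong.

    import numpy as np
    import cv2

    rvec = np.array([[0.1], [-0.2], [0.05]])                        # placeholder rotation vector
    tvec = np.array([[-0.58791781], [-0.1707134], [2.31275417]])    # my t from above
    K = np.array([[800.0,   0.0, 640.0],                            # placeholder intrinsics
                  [  0.0, 800.0, 360.0],
                  [  0.0,   0.0,   1.0]])

    R, _ = cv2.Rodrigues(rvec)  # 3x3 rotation, world -> camera

    # Question 1: with X_cam = R @ X_world + t, the world origin (marker centre)
    # sits at t in camera coordinates, so its distance from the camera should be
    # norm(t) (~2.39 for my vector), and the camera centre in world coordinates
    # should be -R^T t.
    dist_to_marker = np.linalg.norm(tvec)
    cam_pos_world = (-R.T @ tvec).ravel()

    # Question 2: back-project the four image corners to a plane at depth d in
    # front of the camera, then move them into world coordinates, so they can be
    # used as OpenGL vertex positions for the textured quad ((0,0)..(1,1) UVs).
    W, H = 1280, 720   # image size (placeholder)
    d = 1.0            # chosen render distance in metres (placeholder)
    corners_px = np.array([[0, 0], [W, 0], [W, H], [0, H]], dtype=np.float64)

    K_inv = np.linalg.inv(K)
    verts_world = []
    for u, v in corners_px:
        ray_cam = K_inv @ np.array([u, v, 1.0])  # ray through the pixel, z = 1
        p_cam = ray_cam * d                      # point on that ray at depth d
        p_world = R.T @ (p_cam - tvec.ravel())   # camera frame -> world frame
        verts_world.append(p_world)
    verts_world = np.array(verts_world)          # 4x3 corner vertices in world coords

    # Note: OpenCV and OpenGL use different camera conventions (y down / z forward
    # vs y up / z backward), so a flip may still be needed on the OpenGL side.
    print(dist_to_marker, cam_pos_world, verts_world, sep="\n")

If the convention above is right, norm(t) is the camera-to-marker distance, which is exactly why the ~2.39 vs ~1.4 m mismatch in question 1 confuses me.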
