<?xml version="1.0" encoding="utf-8" standalone="yes"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:content="http://purl.org/rss/1.0/modules/content/">
  <channel>
    <title>深度模糊 (Depth Ambiguity) on zhaoyli&#39;s Blog</title>
    <link>https://zhaoylee.github.io/Blogs/tags/%E6%B7%B1%E5%BA%A6%E6%A8%A1%E7%B3%8A/</link>
    <description>Recent content in 深度模糊 (Depth Ambiguity) on zhaoyli&#39;s Blog</description>
    <generator>Hugo</generator>
    <language>zh-cn</language>
    <copyright>©2024 zhaoyli&#39;s Blog, https://zhaoylee.github.io/</copyright>
    <lastBuildDate>Mon, 16 Mar 2026 01:45:51 +0000</lastBuildDate>
    <atom:link href="https://zhaoylee.github.io/Blogs/tags/%E6%B7%B1%E5%BA%A6%E6%A8%A1%E7%B3%8A/index.xml" rel="self" type="application/rss+xml" />
    <item>
      <title>OBMO: One Bounding Box Multiple Objects for Monocular 3D Object Detection</title>
      <link>https://zhaoylee.github.io/Blogs/posts/plug_and_play/obmo--one-bounding-box-multiple-objects-for-monocular-3d-object-detection/</link>
      <pubDate>Sun, 15 Mar 2026 21:59:12 +0800</pubDate>
      <guid>https://zhaoylee.github.io/Blogs/posts/plug_and_play/obmo--one-bounding-box-multiple-objects-for-monocular-3d-object-detection/</guid>
      <description>&lt;hr&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;🏛️ Venue/Journal&lt;/strong&gt;: IEEE TIP&lt;br&gt;
&lt;strong&gt;📅 Year&lt;/strong&gt;: 2023&lt;br&gt;
&lt;strong&gt;💻 Code&lt;/strong&gt;: &lt;a href=&#34;https://github.com/mrsempress/OBMO_patchnet&#34;&gt;mrsempress/OBMO_patchnet&lt;/a&gt;&lt;br&gt;
&lt;strong&gt;📄 Paper&lt;/strong&gt;: &lt;a href=&#34;https://arxiv.org/pdf/2212.10049&#34;&gt;OBMO: One Bounding Box Multiple Objects for Monocular 3D Object Detection&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;hr&gt;
&lt;p&gt;This paper, &lt;strong&gt;OBMO: One Bounding Box Multiple Objects for Monocular 3D Object Detection&lt;/strong&gt;, published in IEEE TIP (2023), takes a strikingly sharp angle. Rather than reworking a complex network backbone, it goes straight at a pain point in the underlying mathematical and physical logic of monocular 3D object detection, and proposes a remarkably elegant plug-and-play training strategy.&lt;/p&gt;</description>
    </item>
  </channel>
</rss>
