市場調查報告書
商品編碼
1015123

新興圖像傳感器技術(2021-2031):應用和市場

Emerging Image Sensor Technologies 2021-2031: Applications and Markets

出版日期: | 出版商: IDTechEx Ltd. | 英文 307 Slides | 商品交期: 最快1-2個工作天內

簡介

標題
新興圖像傳感器技術(2021-2031):應用和市場
用於自動駕駛汽車、無人機、精準農業和工業自動化的創新圖像傳感器。包括有機光電探測器、短波紅外 (SWIR) 圖像傳感器、基於事件的視覺、高光譜成像、柔性 X 射線探測器和波前成像。

"到 2031 年,自主技術將引領新興圖像傳感器市場達到 3.6 億美元。"

圖像傳感是一項非常重要的功能,用於從網絡攝像頭和智能手機攝像頭到自動駕駛汽車和工業檢測等各種應用。IDTechEx 的這份報告全面探索了新興圖像傳感器的市場,涵蓋了從薄膜柔性光電探測器到基於事件的視覺的各種技術。

雖然用於可見光的傳統 CMOS 檢測器已經成熟,並且(至少對低價值應用而言)在一定程度上已商品化,但功能超出簡單獲取紅、綠、藍 (RGB) 強度值的更複雜圖像傳感器仍存在廣泛的機會。因此,目前正致力於開發能檢測超出人類視覺範圍的光的各個方面的新興圖像傳感器技術,包括在更寬的光譜範圍和更大的面積上成像、在每個像素處獲取光譜數據,以及同時提高時間分辨率和動態範圍。

這種機會在很大程度上源於機器視覺(即由計算算法執行圖像分析)的日益普及。機器學習需要盡可能多的輸入數據來建立有助於物體識別和分類的相關性,因此在不同的波長範圍內獲取光學信息(例如具有光譜分辨率的信息)是非常有利的。

當然,新興的圖像傳感器技術還提供許多其他好處。根據技術的不同,這可以包括以更低成本實現類似功能、更大的動態範圍、更高的時間分辨率、空間可變的靈敏度、高分辨率下的全局快門、減少散射的不利影響、柔性/共形性等。一個特別重要的趨勢是,為短波紅外(SWIR,1000 - 2000 nm)光譜區域的成像開發遠比昂貴的 InGaAs 傳感器便宜的替代品,這將使這種能力擴展到更廣泛的應用。其中包括自動駕駛汽車:SWIR 成像有助於區分在可見光譜中看起來相似的物體/材料,同時還能減少灰塵和霧氣造成的散射。

該報告涵蓋以下技術:

  • 矽混合圖像傳感器上的量子點
  • 矽混合圖像傳感器上的有機光電探測器
  • 新興的 SWIR 圖像傳感器技術
  • 有機和鈣鈦礦光電二極管(OPD 和 PPD)
  • 基於事件的視覺
  • 高光譜成像
  • 柔性 X 射線圖像傳感器
  • 波前成像
  • 混合圖像傳感器。在 CMOS 讀出電路頂部添加額外的光吸收層是一種混合方法,它利用有機半導體或量子點來增加對 SWIR 區域的光譜靈敏度。目前由昂貴的 InGaAs 傳感器主導,這項新技術有望大幅降低價格,從而將 SWIR 成像用於自動駕駛汽車等新應用。
  • 擴展範圍的矽。鑑於 InGaAs 傳感器的價格非常高,有相當大的動機開發成本低得多的替代品,可以檢測 SWIR 光譜區域低端的光。這種 SWIR 傳感器隨後可用於車輛,以減少散射,從而在霧和灰塵中提供更好的視野。
  • 薄膜光電探測器。在大面積上檢測光(而非僅在單個小型探測器上),對於獲取生物特徵數據非常理想;若器件具有柔性,還可用於透過皮膚成像。目前,矽的高成本意味著大面積圖像傳感器可能非常昂貴。然而,利用可溶液加工半導體的新興方法,提供了一種生產大面積共形光電探測器的引人注目的途徑。印刷有機光電探測器 (OPD) 是其中最成熟的方法,目前正積極探索屏下指紋檢測。
  • 基於事件的視覺:自動駕駛汽車、無人機和高速工業應用需要具有高時間分辨率的圖像傳感。然而,對於傳統的基於幀的成像,高時間分辨率會產生大量需要計算密集型處理的數據。基於事件的視覺,也稱為動態視覺傳感 (DVS),是解決這一挑戰的新興技術。這是一種獲取光學信息的全新思維方式,其中每個傳感器像素報告與強度變化相對應的時間戳。因此,基於事件的視覺可以將快速變化的圖像區域的更高時間分辨率與大大減少的數據傳輸和後續處理要求相結合。
  • 高光譜成像:從入射光中獲取盡可能多的信息,對需要物體識別的應用非常有利,因為分類算法可以處理更多數據。高光譜成像利用色散光學元件和圖像傳感器,在每個像素處獲取完整光譜,以生成 (x, y, λ) 數據立方體;這是一項相對成熟的技術,已在精準農業和工業過程檢驗中獲得採用。然而,目前大多數高光譜相機採用線掃瞄原理,而 SWIR 高光譜成像由於 InGaAs 傳感器成本高,僅限於相對小眾的應用。新興技術有望顛覆這兩個方面:快照成像提供了線掃瞄相機的替代方案,而上述新的 SWIR 傳感技術則有助於降低成本,推動更廣泛的應用採用。
  • 柔性 X 射線傳感器:X 射線傳感器非常成熟,對醫療和安全應用非常重要。然而,X 射線難以聚焦,這意味著傳感器需要覆蓋大面積;此外,由於矽不能有效吸收 X 射線,通常還需要使用閃爍體層。這兩個方面都增加了傳感器的尺寸和重量,使 X 射線探測器體積龐大且笨重。基於非晶矽背板的柔性 X 射線傳感器提供了一種引人注目的替代方案,因為它們更輕且可貼合曲面(尤其適用於對彎曲的身體部位成像)。展望未來,基於可溶液加工半導體的直接 X 射線傳感器可進一步減輕重量、降低複雜性,並有潛力提高空間分辨率。
  • 波前成像:波前(或相位)成像能夠從入射光中提取傳統傳感器所丟失的相位信息。該技術目前用於光學元件設計/檢驗和眼科等小眾應用。然而,最近的進展帶來了顯著的分辨率提升,將使這項技術獲得更廣泛的應用。生物成像是較有前景的新興應用之一:同時採集相位和強度信息可減少散射的影響,從而獲得更清晰的圖像。

總而言之,計算圖像分析的日益普及,為功能超越傳統 CMOS 傳感器的圖像傳感技術提供了巨大機遇。本報告全面概述了新興圖像傳感器技術的市場及相關技術發展,涵蓋從自動駕駛汽車到工業質量控制的多種應用。預計在未來十年內,許多這類令人興奮的創新成像技術將被迅速採用。

報告中包含以下信息:

  • 執行摘要和結論。
  • 對上述新興圖像傳感器技術的詳細技術分析。
  • 高度精細的 10 年市場預測,按技術劃分,隨後按應用劃分。這包括 40 多個單獨的預測類別。預測以數量和收入來表示。
  • 技術/商業準備情況評估,按技術和應用劃分。
  • 開發和採用各種新興圖像傳感技術的商業動機。
  • 針對每種圖像傳感技術的多個應用案例研究。
  • 每種圖像傳感技術的 SWOT 分析。
  • 每個技術類別中的主要參與者概覽。
  • 超過 25 份公司簡介,其中大部分基於近期的主要採訪。其中包括對當前狀態、技術、潛在市場和商業模式的討論,以及公司財務信息(如已披露)和我們的 SWOT 分析。
  • 精選的與新興圖像傳感器技術相關的學術研究亮點。

來自 IDTechEx 的分析師訪問

所有報告購買都包括與專家分析師最多 30 分鐘的電話時間,他將幫助您將報告中的關鍵發現與您正在解決的業務問題聯繫起來。這需要在購買報告後的三個月內使用。

目錄

1. 執行摘要

  • 1.1. 關鍵要點
  • 1.2. 傳統圖像傳感器:市場概覽
  • 1.3. 短波紅外 (SWIR) 成像的動機
  • 1.4. SWIR 成像:現有和新興技術選項
  • 1.5。SWIR 圖像傳感器的機遇
  • 1.6. SWIR 傳感器:應用概述
  • 1.7. OPD-on-CMOS 混合圖像傳感器
  • 1.8. 量子點作為光學傳感器材料
  • 1.9. QD/OPD-on-CMOS 探測器的前景
  • 1.10. 用於 SWIR 成像的 QD-Si 技術面臨的挑戰
  • 1.11. 薄膜有機和鈣鈦礦光電探測器概述
  • 1.12。有機光電探測器的應用。
  • 1.13. 高光譜成像簡介
  • 1.14. 高光譜成像概述
  • 1.15。什麼是基於事件的視覺?
  • 1.16。基於事件的視覺應用前景廣闊
  • 1.17. 基於事件的視覺概述
  • 1.18. 波前成像概述
  • 1.19. 柔性和直接 X 射線圖像傳感器概述
  • 1.20。新興圖像傳感器技術的 10 年市場預測
  • 1.21. 新興圖像傳感器技術的 10 年市場預測(按數量)
  • 1.22。新興圖像傳感器技術的 10 年市場預測(按數量、數據表)
  • 1.23。新興圖像傳感器技術的 10 年市場預測(按收入)
  • 1.24。新興圖像傳感器技術的 10 年市場預測(按收入、數據表)

2. 簡介

  • 2.1. 什麼是傳感器?
  • 2.2. 傳感器價值鏈示例:數碼相機
  • 2.3. 光電探測器工作原理
  • 2.4. 量化光電探測器和圖像傳感器的性能
  • 2.5. 從光中提取盡可能多的信息
  • 2.6。自動駕駛汽車將需要機器視覺
  • 2.7. 自動駕駛汽車採用趨勢
  • 2.8. 汽車的自動化程度如何?
  • 2.9. 全球自動駕駛汽車市場
  • 2.10. 不同的汽車自主級別需要多少個攝像頭
  • 2.11. 不斷增長的無人機使用為新興的圖像傳感器提供了廣闊的市場
  • 2.12. 無人機所需的新興圖像傳感器

3. 市場預測

  • 3.1. 市場預測方法
  • 3.2. 參數化預測曲線
  • 3.3. 確定總可尋址市場 (TAM)
  • 3.4. 確定收入
  • 3.5。10 年短波紅外 (SWIR) 圖像傳感器市場預測:按數量
  • 3.6. 10 年混合 OPD-on-CMOS 圖像傳感器市場預測:按數量
  • 3.7. 10 年混合 OPD-on-CMOS 圖像傳感器市場預測:按收入
  • 3.8. 10 年混合 QD-on-CMOS 圖像傳感器市場預測:按數量
  • 3.9. 10 年混合 QD-on-CMOS 圖像傳感器市場預測:按收入
  • 3.10. 10 年薄膜有機和鈣鈦礦光電探測器(OPD 和 PPD)市場預測:按數量
  • 3.11. 10 年薄膜有機和鈣鈦礦光電探測器(OPD 和 PPD)市場預測:按收入
  • 3.12. 10 年高光譜成像市場預測:按數量
  • 3.13. 10 年高光譜成像市場預測:按收入
  • 3.14. 10 年基於事件的視覺市場預測:按數量
  • 3.15。10 年基於事件的視覺市場預測:按收入
  • 3.16. 10 年波前成像市場預測:按數量
  • 3.17。10 年波前成像市場預測:按收入
  • 3.18. 10 年柔性 X 射線圖像傳感器市場預測:按數量

4. 已建立的可見光圖像傳感器(CCD 和 CMOS)簡要概述

  • 4.1. 傳統圖像傳感器:市場概覽
  • 4.2. CMOS 圖像傳感器 (CIS) 中的關鍵組件
  • 4.3. 傳感器架構:正面和背面照明
  • 4.4. 背照式 CMOS 圖像傳感器的工藝流程
  • 4.5。比較 CMOS 和 CCD 圖像傳感器
  • 4.6. 全局快門而非捲簾快門的好處

5. 短波紅外(SWIR)圖像傳感器

  • 5.1. 短波紅外 (SWIR) 成像的動機
  • 5.2. SWIR 成像減少光散射
  • 5.3. SWIR:現有和新興技術選項
  • 5.4. SWIR 成像的應用
    • 5.4.1. SWIR 成像的應用
    • 5.4.2. 利用 SWIR 成像確定含水量
    • 5.4.3. 用於自主移動的 SWIR
    • 5.4.4. SWIR 成像可實現更好的危險檢測
    • 5.4.5. SWIR 可透過矽晶圓進行成像
    • 5.4.6. 利用 SWIR 光進行溫度成像
    • 5.4.7。工業檢驗期間異物的可視化
    • 5.4.8。光譜化學傳感器
    • 5.4.9。用於工業過程優化的 SWIR 圖像傳感
    • 5.4.10。MULTIPLE(歐盟項目):重點領域、目標和參與者
    • 5.4.11。短波紅外光譜:可穿戴應用
    • 5.4.12。SWIR 光譜:通過可穿戴技術確定水和體溫
    • 5.4.13。SWIR 光譜:酒精檢測
    • 5.4.14。用於高光譜成像的 SWIR 圖像傳感器
    • 5.4.15. SWIR 傳感器:應用概述
    • 5.4.16。SWIR應用要求
  • 5.5。InGaAs 傳感器 - SWIR 成像的現有技術
    • 5.5.1. 現有長波長檢測:InGaAs
    • 5.5.2. 高分辨率、低成本紅外傳感器的挑戰
    • 5.5.3. InGaAs 傳感器設計:焊料凸點限制分辨率
    • 5.5.4。索尼提高 InGaAs 傳感器分辨率和光譜範圍
  • 5.6. 新興無機 SWIR 技術和參與者
    • 5.6.1. Trieye:創新的基於矽的 SWIR 傳感器
    • 5.6.2. OmniVision:使矽 CMOS 對 NIR 敏感 (II)
    • 5.6.3. SWOT 分析:SWIR 圖像傳感器(非混合、非 InGaAs)
    • 5.6.4。供應商概覽:新興的 SWIR 圖像傳感器
    • 5.6.5。公司簡介:SWIR 成像(不包括混合方法)

6. 混合 OPD-ON-CMOS 圖像傳感器(包括 SWIR)

  • 6.1. OPD-on-CMOS 混合圖像傳感器
  • 6.2. 混合有機/CMOS 傳感器
  • 6.3. 用於廣播級攝像機的混合有機/CMOS 傳感器
  • 6.4. 比較混合有機/CMOS 傳感器與背照式 CMOS 傳感器
  • 6.5。僅使用矽技術的 CMOS 全局快門的進展
  • 6.6. Fraunhofer FEP:SWIR OPD-on-CMOS 傳感器 (I)
  • 6.7. Fraunhofer FEP:SWIR OPD-on-CMOS 傳感器 (II)
  • 6.8. 學術研究:對較長波長紅外光敏感的扭曲雙層石墨烯
  • 6.9. OPD-on-CMOS 探測器的技術準備水平(按應用)
  • 6.10. OPD-on-CMOS 圖像傳感器的 SWOT 分析
  • 6.11. 供應商概覽:OPD-on-CMOS 混合圖像傳感器
  • 6.12. 公司簡介:OPD-on-CMOS

7. 混合 QD-ON-CMOS 圖像傳感器

  • 7.1. 量子點作為光學傳感器材料
  • 7.2. 硫化鉛作為量子點
  • 7.3. 量子點:材料系統的選擇
  • 7.4. 量子點在圖像傳感器中的應用和挑戰
  • 7.5。圖像傳感器中的 QD 層優勢 (I):提高傳感器靈敏度和增益
  • 7.6. QD-Si 混合圖像傳感器(II):減少厚度
  • 7.7. 探測率 (detectivity) 基準比較(一)
  • 7.8. 探測率基準比較(二)
  • 7.9。帶有全局快門的混合 QD-on-CMOS,用於 SWIR 成像。
  • 7.10。QD-Si 混合圖像傳感器:實現高分辨率全局快門
  • 7.11。QD-Si 混合圖像傳感器(IV):機器視覺結構光檢測的低功耗和高靈敏度?
  • 7.12。QD層是如何應用的?
  • 7.13. 溶液加工的優勢:易於與 ROIC CMOS 集成?
  • 7.14。QD 光學層:提高 QD 薄膜導電性的方法
  • 7.15。量子點:涵蓋從可見光到近紅外的範圍
  • 7.16。PbS QD、Si、聚合物、InGaAs、HgCdTe 等的SWIR 靈敏度...
  • 7.17。混合 QD-on-CMOS 圖像傳感器:處理
    • 7.17.1。QD-on-CMOS 的價值鍊和生產步驟
    • 7.17.2. 溶液加工的優勢:易於與 CMOS ROIC 集成?
    • 7.17.3。量子點薄膜:加工挑戰
    • 7.17.4。帶有石墨烯中間層的 QD-on-CMOS
    • 7.17.5。用於 SWIR 成像的 QD-Si 技術面臨的挑戰
    • 7.17.6。QD-on-CMOS 傳感器:持續的技術挑戰
    • 7.17.7。QD-on-CMOS 探測器的技術準備水平(按應用)
  • 7.18。混合 QD-on-CMOS 圖像傳感器:主要參與者
    • 7.18.1. SWIR Vision Systems:用於 SWIR 成像的混合量子點
    • 7.18.2. SWIR Vision Sensors:首批商用 QD-CMOS 相機
    • 7.18.3。IMEC:QD-on-CMOS 集成示例(一)
    • 7.18.4。IMEC:QD-on-CMOS 集成示例(二)
    • 7.18.5。RTI International:QD-on-CMOS 集成示例
    • 7.18.6. QD-on-CMOS 集成示例(ICFO 續)
    • 7.18.7. Emberion:QD-石墨烯短波紅外傳感器
    • 7.18.8。Emberion:QD-Graphene-Si 寬範圍 SWIR 傳感器
    • 7.18.9。Emberion:具有 400 至 2000 nm 光譜範圍的 VIS-SWIR 相機
    • 7.18.10。Qurv Technologies:從 ICFO 分拆出來的石墨烯/量子點圖像傳感器公司
    • 7.18.11。學術研究:來自漢陽大學的 QD-on-CMOS(韓國)
    • 7.18.12。學術研究:膠體量子點實現中紅外傳感
    • 7.18.13。學術研究:等離子納米立方體製造廉價的短波紅外相機
  • 7.19。總結:QD-on-CMOS 圖像傳感器
    • 7.19.1。總結:QD/OPD-on-CMOS 探測器
    • 7.19.2. QD-on-CMOS 圖像傳感器的 SWOT 分析
    • 7.19.3。供應商概覽:QD-on-CMOS 混合圖像傳感器
    • 7.19.4。公司簡介:混合 QD-on-CMOS 圖像傳感器

8. 薄膜光電探測器(有機和鈣鈦礦)

  • 8.1. 薄膜光電探測器(有機和鈣鈦礦)簡介
  • 8.2. 有機光電探測器(OPD)
  • 8.3. 薄膜光電探測器:優點和缺點
  • 8.4. 減少暗電流以增加動態範圍
  • 8.5。根據特定應用定制檢測波長
  • 8.6. 將 OPD 擴展到 NIR 區域:腔體的使用
  • 8.7. 以溶液法製造薄膜光電探測器的技術挑戰
  • 8.8. 薄膜光電探測器材料
  • 8.9. 薄膜有機和鈣鈦礦光電二極管(OPD 和 PPD):應用和主要參與者
    • 8.9.1。有機光電探測器的應用
    • 8.9.2. 用於生物識別安全的 OPD
    • 8.9.3. 用於醫學成像的噴塗有機光電二極管
    • 8.9.4. ISORG:採用 OPD 的「屏上指紋識別」
    • 8.9.5. ISORG:使用 TFT 有源矩陣的柔性 OPD 應用
    • 8.9.6。ISORG:第一條OPD生產線
    • 8.9.7。劍橋顯示技術:使用 OPD 進行脈搏血氧飽和度傳感
    • 8.9.8。Holst 中心:基於鈣鈦礦的圖像傳感器
    • 8.9.9。大面積 OPD 應用面臨的商業挑戰
    • 8.9.10。薄膜光電探測器應用技術要求
    • 8.9.11。薄膜OPD和PPD應用要求
    • 8.9.12. 薄膜 OPD 和 PPD 的應用評估
    • 8.9.13. 按應用劃分的有機和鈣鈦礦光電探測器的技術準備水平
  • 8.10. 有機和鈣鈦礦薄膜光電探測器(OPD 和 PPD):總結
    • 8.10.1. 總結:薄膜有機和鈣鈦礦光電探測器
    • 8.10.2. 大面積OPD圖像傳感器的SWOT分析
    • 8.10.3。供應商概覽:薄膜光電探測器
    • 8.10.4。公司簡介:有機光電二極管 (OPD)

9. 高光譜成像

  • 9.1. 高光譜成像簡介
  • 9.2. 獲取高光譜數據立方體的多種方法
  • 9.3. 用於高光譜數據採集的對比設備架構 (II)
  • 9.4. 線掃瞄(推掃式)相機非常適合傳送帶和衛星圖像
  • 9.5. 「推掃式」與舊式高光譜成像方法之間的比較
  • 9.6. 線掃瞄高光譜相機設計
  • 9.7. 快照高光譜成像
  • 9.8. 高光譜成像的照明
  • 9.9. 用於多光譜/高光譜圖像增強的全色銳化
  • 9.10。高光譜成像作為多光譜成像的發展
  • 9.11。高光譜和多光譜成像之間的權衡
  • 9.12。邁向寬帶高光譜成像
  • 9.13。高光譜成像:應用
    • 9.13.1。高光譜成像與精準農業
    • 9.13.2. 來自 UAV(無人機)的高光譜成像
    • 9.13.3. 農業無人機生態系統發展
    • 9.13.4。使用高光譜相機進行衛星成像
    • 9.13.5。歷史性的無人機投資創造了對高光譜成像的需求
    • 9.13.6。使用高光譜成像進行在線檢測
    • 9.13.7。使用在線高光譜成像進行目標識別
    • 9.13.8。使用高光譜成像對物體進行分類回收
    • 9.13.9。使用高光譜成像進行食品檢測
    • 9.13.10。用於皮膚診斷的高光譜成像
    • 9.13.11. 高光譜成像應用要求
  • 9.14。高光譜成像:主要參與者
    • 9.14.1。比較高光譜相機製造商
    • 9.14.2. Specim:線掃瞄成像的市場領導者
    • 9.14.3. Headwall Photonics
    • 9.14.4。Cubert:快照光譜成像的專家
    • 9.14.5。高光譜成像波長範圍
    • 9.14.6。高光譜波長範圍與光譜分辨率
    • 9.14.7。高光譜相機參數表
    • 9.14.8。分析和應用高光譜成像的公司
    • 9.14.9。Condi Food:使用高光譜成像進行食品質量監測
    • 9.14.10。Orbital Sidekick:衛星高光譜成像
    • 9.14.11。Gamaya:用於農業分析的高光譜成像
  • 9.15。總結:高光譜成像
    • 9.15.1。總結:高光譜成像
    • 9.15.2。SWOT 分析:高光譜成像
    • 9.15.3。供應商概覽:高光譜成像
    • 9.15.4. 公司簡介:高光譜成像

10. 基於事件的視覺(也稱為動態視覺感知)

  • 10.1. 什麼是基於事件的感知?
  • 10.2. 一般的基於事件的傳感:優點和缺點
  • 10.3. 什麼是基於事件的視覺?(一)
  • 10.4。什麼是基於事件的視覺?(三)
  • 10.5。基於事件的視覺數據是什麼樣的?
  • 10.6. 基於事件的視覺:利弊
  • 10.7. 基於事件的視覺傳感器可增加動態範圍
  • 10.8。基於事件的視覺傳感器的成本
  • 10.9. 基於事件的視覺軟件的重要性
  • 10.10. 基於事件的視覺應用
    • 10.10.1。基於事件的視覺應用前景廣闊
    • 10.10.2. 自動駕駛汽車的基於事件的視覺
    • 10.10.3. 基於事件的無人機 (UAV) 防撞視覺
    • 10.10.4。智能建築中的乘員跟蹤(跌倒檢測)
    • 10.10.5。增強/虛擬現實的基於事件的視覺
    • 10.10.6。用於光學對準/光束分析的基於事件的視覺
    • 10.10.7。基於事件的視覺應用要求
    • 10.10.8. 基於事件的視覺的技術準備水平(按應用)
  • 10.11. 基於事件的視覺:關鍵參與者
    • 10.11.1. 基於事件的視覺:公司格局
    • 10.11.2. IniVation:以有機增長為目標
    • 10.11.3。Prophesee:資金充足,目標是自主移動
    • 10.11.4。CelePixel:專注於硬件
    • 10.11.5. Insightness:針對無人機防撞的垂直集成模式
  • 10.12。總結:基於事件的視覺
    • 10.12.1。總結:基於事件的視覺
    • 10.12.2. SWOT 分析:基於事件的視覺
    • 10.12.3. 供應商概覽:基於事件的視覺
    • 10.12.4. 公司簡介:基於事件的視覺

11. 波前成像(也稱為相位成像)

  • 11.1. 波前成像的動機
  • 11.2. 傳統的 Shack-Hartmann 波前傳感器
  • 11.3. Phasics:波前成像的創新者
  • 11.4. Wooptix:光場和波前成像
  • 11.5。總結:波前成像
  • 11.6. SWOT 分析:波前成像
  • 11.7. 供應商概覽:波前成像傳感器
  • 11.8。公司簡介:波前成像

12. 柔性和直接 X 射線圖像傳感器

  • 12.1. 傳統的 X 射線傳感
  • 12.2. 基於非晶矽的柔性圖像傳感器
  • 12.3. 用於醫療成像的噴塗有機光電二極管
  • 12.4. 使用有機半導體進行直接 X 射線傳感
  • 12.5。Holst 中心開發基於鈣鈦礦的 X 射線傳感器 (I)
  • 12.6. Holst 中心開發基於鈣鈦礦的 X 射線傳感器(二)
  • 12.7. 柔性和直接 X 射線傳感器的技術準備水平
  • 12.8. 總結:柔性和直接 X 射線圖像傳感器
  • 12.9. SWOT 分析:柔性和直接 X 射線圖像傳感器
  • 12.10. 供應商概覽:柔性 X 射線圖像傳感器
  • 12.11. 公司簡介:柔性和直接 X 射線圖像傳感器
Product Code: ISBN 9781913899530

Title:
Emerging Image Sensor Technologies 2021-2031: Applications and Markets
Innovative image sensors for autonomous vehicles, UAVs, precision agriculture and industrial automation. Includes organic photodetectors, SWIR image sensors, event-based vision, hyperspectral imaging, flexible x-ray detectors, and wavefront imaging.

"Autonomous technologies will lead the market for emerging image sensors to $360 million by 2031."

Image sensing is a highly important capability, used in applications ranging from webcams and smartphone cameras to autonomous vehicles and industrial inspection. This report from IDTechEx comprehensively explores the market for emerging image sensors, covering a diverse range of technologies that span from thin-film flexible photodetectors to event-based vision.

While conventional CMOS detectors for visible light are well established and somewhat commoditized, at least for low value applications, there is an extensive opportunity for more complex image sensors that offer capabilities beyond that of simply acquiring red, green and blue (RGB) intensity values. As such, extensive effort is currently being devoted to developing emerging image sensor technologies that can detect aspects of light beyond human vision. This includes imaging over a broader spectral range, over a larger area, acquiring spectral data at each pixel, and simultaneously increasing temporal resolution and dynamic range.

Much of this opportunity stems from the ever-increasing adoption of machine vision, in which image analysis is performed by computational algorithms. Machine learning requires as much input data as possible to establish correlations that can facilitate object identification and classification, so acquiring optical information over a different wavelength range, or with spectral resolution for example, is highly advantageous.

Of course, emerging image sensor technologies offer many other benefits. Depending on the technology, these can include similar capabilities at a lower cost, increased dynamic range, improved temporal resolution, spatially variable sensitivity, global shutters at high resolution, a reduced influence of unwanted scattering, flexibility/conformality and more. A particularly important trend is the development of much cheaper alternatives to very expensive InGaAs sensors for imaging in the short-wave infra-red (SWIR, 1000 - 2000 nm) spectral region, which will open up this capability to a much wider range of applications. This includes autonomous vehicles, in which SWIR imaging assists with distinguishing objects/materials that appear similar in the visible spectrum, while also reducing scattering from dust and fog.

The report covers the following technologies:

  • Quantum dots on silicon hybrid image sensors
  • Organic photodetectors on silicon hybrid image sensors
  • Emerging SWIR image sensor technologies
  • Organic and perovskite photodiodes (OPDs and PPDs)
  • Event-based vision
  • Hyperspectral imaging
  • Flexible x-ray image sensors
  • Wavefront imaging
  • Hybrid image sensors. Adding an additional light absorbing layer on top of a CMOS read-out circuit is a hybrid approach that utilizes either organic semiconductors or quantum dots to increase the spectral sensitivity into the SWIR region. Currently dominated by expensive InGaAs sensors, this new technology promises a substantial price reduction and hence adoption of SWIR imaging for new applications such as autonomous vehicles.
  • Extended-range silicon. Given the very high price of InGaAs sensors, there is considerable motivation to develop much lower cost alternatives that can detect light towards the lower end of the SWIR spectral region. Such SWIR sensors could then be employed in vehicles to provide better vision through fog and dust due to reduced scattering.
  • Thin film photodetectors. Detection of light over a large area, rather than at a single small detector, is highly desirable for acquiring biometric data and, if flexible, for imaging through the skin. At present, the high cost of silicon means that large-area image sensors can be prohibitively expensive. However, emerging approaches that utilize solution processable semiconductors offer a compelling way to produce large-area conformal photodetectors. Printed organic photodetectors (OPDs) are the most developed approach, with under-display fingerprint detection being actively explored.
  • Event-based vision: Autonomous vehicles, drones and high-speed industrial applications require image sensing with a high temporal resolution. However, with conventional frame-based imaging a high temporal resolution produces vast amounts of data that require computationally intensive processing. Event-based vision, also known as dynamic vision sensing (DVS), is an emerging technology that resolves this challenge. It is a completely new way of thinking about obtaining optical information, in which each sensor pixel reports timestamps that correspond to intensity changes (a minimal sketch of this event-reporting principle follows this list). As such, event-based vision can combine greater temporal resolution of rapidly changing image regions with much reduced data transfer and subsequent processing requirements.
  • Hyperspectral imaging: Obtaining as much information as possible from incident light is highly advantageous for applications that require object identification, since classification algorithms have more data to work with. Hyperspectral imaging, in which a complete spectrum is acquired at each pixel to produce an (x, y, λ) data cube using a dispersive optical element and an image sensor, is a relatively established technology that has gained traction for precision agriculture and industrial process inspection (see the data-cube sketch after this list). However, at present most hyperspectral cameras work on a line-scan principle, while SWIR hyperspectral imaging is restricted to relatively niche applications due to the high cost of InGaAs sensors. Emerging technologies look set to disrupt both these aspects, with snapshot imaging offering an alternative to line-scan cameras and with the new SWIR sensing technologies outlined above facilitating cost reduction and adoption for a wider range of applications.
  • Flexible x-ray sensors: X-ray sensors are well-established and highly important for medical and security applications. However, the difficulty in focusing x-rays means that sensors need to cover a large area. Furthermore, since silicon cannot effectively absorb x-rays, a scintillator layer is commonly used. Both these aspects increase sensor size and weight, making x-ray detectors bulky and unwieldy. Flexible x-ray sensors based on an amorphous silicon backplane offer a compelling alternative, since they would be lighter and conformal (especially useful for imaging curved body parts). Looking further ahead, direct x-ray sensors based on solution processable semiconductors offer reduced weight and complexity along with the potential for higher spatial resolution.
  • Wavefront imaging: Wavefront (or phase) imaging enables the extraction of phase information from incident light that is lost by a conventional sensor. This technique is currently used for niche applications such as optical component design/inspection and ophthalmology. However, recent advances have led to significant resolution improvements which will allow this technology to be applied somewhat more widely. Biological imaging is one of the more promising emerging applications, in which collecting phase along with intensity reduces the influence of scattering and thus enables better defined images (a sketch of the classic Shack-Hartmann slope-to-wavefront reconstruction follows this list).
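
To make the event-based vision bullet above concrete, below is a minimal Python/NumPy sketch (our own illustration, not code from the report or from any sensor vendor) of the reporting principle: each pixel emits a timestamped event only when its log-intensity changes by more than a contrast threshold. The threshold value and the synthetic moving-square frames are assumptions made purely for the demonstration.

    import numpy as np

    def frames_to_events(frames, timestamps, threshold=0.2):
        """Convert a stack of intensity frames into DVS-style (t, x, y, polarity) events."""
        log_frames = np.log(frames.astype(np.float64) + 1e-6)
        reference = log_frames[0].copy()        # last log-intensity that triggered an event, per pixel
        events = []
        for frame, t in zip(log_frames[1:], timestamps[1:]):
            delta = frame - reference
            fired = np.abs(delta) >= threshold  # only pixels whose change exceeds the threshold report
            ys, xs = np.nonzero(fired)
            for x, y in zip(xs, ys):
                events.append((t, x, y, 1 if delta[y, x] > 0 else -1))
                reference[y, x] = frame[y, x]   # reset the reference only where an event fired
        return events

    # Synthetic demo: a bright square moving one pixel per frame on a static background.
    frames = np.full((5, 32, 32), 50, dtype=np.uint8)
    for i in range(5):
        frames[i, 10:14, 5 + i:9 + i] = 200
    events = frames_to_events(frames, timestamps=np.arange(5) * 1e-3)
    print(len(events), "events generated; unchanged pixels produce no data")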
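
Similarly, the following sketch (again illustrative only, with invented data and an invented reference spectrum) shows how a line-scan hyperspectral camera assembles an (x, y, λ) data cube one scan line at a time, and how the per-pixel spectra can then feed a simple spectral-angle classifier of the kind used for object identification.

    import numpy as np

    N_X, N_Y, N_BANDS = 64, 100, 30            # across-track pixels, scan lines, spectral bands

    def acquire_line(y):
        """Stand-in for one line-scan exposure: returns an (x, lambda) frame for scan line y."""
        rng = np.random.default_rng(y)
        return rng.random((N_X, N_BANDS))

    # Build the cube one scan line at a time, as the scene moves past the slit.
    cube = np.stack([acquire_line(y) for y in range(N_Y)], axis=0)      # shape (y, x, lambda)

    # Spectral-angle similarity of every pixel spectrum to a reference signature:
    # a smaller angle means the pixel looks more like the target material.
    reference = np.linspace(0.2, 0.9, N_BANDS)                          # invented target spectrum
    flat = cube.reshape(-1, N_BANDS)
    cosine = flat @ reference / (np.linalg.norm(flat, axis=1) * np.linalg.norm(reference))
    angle_map = np.arccos(np.clip(cosine, -1.0, 1.0)).reshape(N_Y, N_X)

    print("data cube shape (y, x, lambda):", cube.shape)
    print("pixels matching the reference spectrum:", int((angle_map < 0.3).sum()))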
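
Finally, for the wavefront imaging bullet, this sketch illustrates the textbook Shack-Hartmann principle referenced in the table of contents, not the proprietary methods of Phasics or Wooptix: each lenslet's focal-spot displacement is proportional to the local wavefront slope, and integrating the slopes recovers the phase profile that a conventional intensity sensor would lose. The focal length, lenslet pitch and test wavefront are assumed values.

    import numpy as np

    focal_length = 5e-3                        # lenslet focal length [m] (assumed)
    pitch = 150e-6                             # lenslet spacing [m] (assumed)

    # A known test wavefront (defocus-like parabola) to check the reconstruction against.
    x = np.arange(20) * pitch
    true_wavefront = 4e-6 * (x / x.max() - 0.5) ** 2                    # [m]

    # Forward model: each focal spot shifts by focal_length * (local wavefront slope).
    slopes = np.gradient(true_wavefront, pitch)
    spot_displacements = focal_length * slopes                          # what the sensor measures

    # Reconstruction: slopes from the measured displacements, then trapezoidal integration.
    recovered_slopes = spot_displacements / focal_length
    steps = 0.5 * (recovered_slopes[1:] + recovered_slopes[:-1]) * pitch
    recovered = np.concatenate(([0.0], np.cumsum(steps)))
    recovered += true_wavefront[0] - recovered[0]                       # remove the arbitrary piston offset

    print("max reconstruction error [nm]:", 1e9 * np.abs(recovered - true_wavefront).max())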

In summary, increasing adoption of computational image analysis provides a great opportunity for image sensing technologies that offer capabilities beyond conventional CMOS sensors. This report offers a comprehensive overview of the market for emerging image sensor technologies and associated technical developments, covering a multitude of applications that range from autonomous vehicles to industrial quality control. Expect to see many of these exciting and innovative imaging technologies being rapidly adopted over the next decade.

The following information is included within the report:

  • Executive summary & conclusions.
  • Detailed technical analysis of the emerging image sensor technologies outlined above.
  • Highly granular 10-year market forecasts, split by technology and subsequently by application. This includes over 40 individual forecast categories. Forecasts are expressed by both volume and revenue.
  • Technological/commercial readiness assessments, split by technology and application.
  • Commercial motivation for developing and adopting each of the emerging image sensing technologies.
  • Multiple application case studies for each image sensing technology.
  • SWOT analysis of each image sensing technology.
  • Overview of the key players within each technology category.
  • Over 25 company profiles, the majority based on recent primary interviews. These include a discussion of current status, technology, potential markets and business model, along with company financial information (where disclosed) and our SWOT analysis.
  • Selected highlights from academic research relevant to emerging image sensor technologies.

Analyst access from IDTechEx

All report purchases include up to 30 minutes telephone time with an expert analyst who will help you link key findings in the report to the business issues you're addressing. This needs to be used within three months of purchasing the report.

TABLE OF CONTENTS

1. EXECUTIVE SUMMARY

  • 1.1. Key takeaways
  • 1.2. Conventional image sensors: Market overview
  • 1.3. Motivation for short-wave infra-red (SWIR) imaging
  • 1.4. SWIR imaging: Incumbent and emerging technology options
  • 1.5. Opportunities for SWIR image sensors
  • 1.6. SWIR sensors: Application overview
  • 1.7. OPD-on-CMOS hybrid image sensors
  • 1.8. Quantum dots as optical sensor materials
  • 1.9. Prospects for QD/OPD-on-CMOS detectors
  • 1.10. Challenges for QD-Si technology for SWIR imaging
  • 1.11. Overview of thin film organic and perovskite photodetectors
  • 1.12. Applications of organic photodetectors.
  • 1.13. Introduction to hyperspectral imaging
  • 1.14. Overview of hyperspectral imaging
  • 1.15. What is event-based vision?
  • 1.16. Promising applications for event-based vision
  • 1.17. Overview of event-based vision
  • 1.18. Overview of wavefront imaging
  • 1.19. Overview of flexible and direct x-ray image sensors
  • 1.20. 10-year market forecast for emerging image sensor technologies
  • 1.21. 10-year market forecast for emerging image sensor technologies (by volume)
  • 1.22. 10-year market forecast for emerging image sensor technologies (by volume, data table)
  • 1.23. 10-year market forecast for emerging image sensor technologies (by revenue)
  • 1.24. 10-year market forecast for emerging image sensor technologies (by revenue, data table)

2. INTRODUCTION

  • 2.1. What is a sensor?
  • 2.2. Sensor value chain example: Digital camera
  • 2.3. Photodetector working principles
  • 2.4. Quantifying photodetector and image sensor performance
  • 2.5. Extracting as much information as possible from light
  • 2.6. Autonomous vehicles will need machine vision
  • 2.7. Trends in autonomous vehicle adoption
  • 2.8. What are the levels of automation in cars?
  • 2.9. Global autonomous car market
  • 2.10. How many cameras are needed at different automotive autonomy levels
  • 2.11. Growing drone uses provides extensive market for emerging image sensors
  • 2.12. Emerging image sensors required for drones

3. MARKET FORECASTS

  • 3.1. Market forecast methodology
  • 3.2. Parametrizing forecast curves
  • 3.3. Determining total addressable markets (TAMs)
  • 3.4. Determining revenues
  • 3.5. 10-year short-wave infra-red (SWIR) image sensors market forecast: by volume
  • 3.6. 10-year hybrid OPD-on-CMOS image sensors market forecast: by volume
  • 3.7. 10-year hybrid OPD-on-CMOS image sensors market forecast: by revenue
  • 3.8. 10-year hybrid QD-on-CMOS image sensors market forecast: by volume
  • 3.9. 10-year hybrid QD-on-CMOS image sensors market forecast: by revenue
  • 3.10. 10-year thin film organic and perovskite photodetectors (OPDs and PPDs) market forecast: by volume
  • 3.11. 10-year thin film organic and perovskite photodetectors (OPDs and PPDs) market forecast: by revenue
  • 3.12. 10-year hyperspectral imaging market forecast: by volume
  • 3.13. 10-year hyperspectral imaging market forecast: by revenue
  • 3.14. 10-year event-based vision market forecast: by volume
  • 3.15. 10-year event-based vision market forecast: by revenue
  • 3.16. 10-year wavefront imaging market forecast: by volume
  • 3.17. 10-year wavefront imaging market forecast: by revenue
  • 3.18. 10-year flexible x-ray image sensors market forecast: by volume

4. BRIEF OVERVIEW OF ESTABLISHED VISIBLE IMAGE SENSORS (CCD AND CMOS)

  • 4.1. Conventional image sensors: Market overview
  • 4.2. Key components in a CMOS image sensor (CIS)
  • 4.3. Sensor architectures: Front and backside illumination
  • 4.4. Process flow for back-side-illuminated CMOS image sensors
  • 4.5. Comparing CMOS and CCD image sensors
  • 4.6. Benefits of global rather than rolling shutters

5. SHORT-WAVE INFRA-RED (SWIR) IMAGE SENSORS

  • 5.1. Motivation for short-wave infra-red (SWIR) imaging
  • 5.2. SWIR imaging reduces light scattering
  • 5.3. SWIR: Incumbent and emerging technology options
  • 5.4. Applications for SWIR imaging
    • 5.4.1. Applications for SWIR imaging
    • 5.4.2. Identifying water content with SWIR imaging
    • 5.4.3. SWIR for autonomous mobility
    • 5.4.4. SWIR imaging enables better hazard detection
    • 5.4.5. SWIR enables imaging through silicon wafers
    • 5.4.6. Imaging temperature with SWIR light
    • 5.4.7. Visualization of foreign materials during industrial inspection
    • 5.4.8. Spectroscopic chemical sensors
    • 5.4.9. SWIR image sensing for industrial process optimization
    • 5.4.10. MULTIPLE (EU Project): Focus areas, targets and participants
    • 5.4.11. SWIR spectroscopy: Wearable applications
    • 5.4.12. SWIR spectroscopy: Determining water and body temperature via wearable technology
    • 5.4.13. SWIR spectroscopy: Alcohol detection
    • 5.4.14. SWIR image sensors for hyperspectral imaging
    • 5.4.15. SWIR sensors: Application overview
    • 5.4.16. SWIR application requirements
  • 5.5. InGaAs sensors - existing technology for SWIR imaging
    • 5.5.1. Existing long wavelength detection: InGaAs
    • 5.5.2. The challenge of high resolution, low cost IR sensors
    • 5.5.3. InGaAs sensor design: Solder bumps limit resolution
    • 5.5.4. Sony improve InGaAs sensor resolution and spectral range
  • 5.6. Emerging inorganic SWIR technologies and players
    • 5.6.1. Trieye: Innovative silicon based SWIR sensors
    • 5.6.2. OmniVision: Making silicon CMOS sensitive to NIR (II)
    • 5.6.3. SWOT analysis: SWIR image sensors (non-hybrid, non-InGaAs)
    • 5.6.4. Supplier overview: Emerging SWIR image sensors
    • 5.6.5. Company profiles: SWIR imaging (excluding hybrid approaches)

6. HYBRID OPD-ON-CMOS IMAGE SENSORS (INCLUDING FOR SWIR)

  • 6.1. OPD-on-CMOS hybrid image sensors
  • 6.2. Hybrid organic/CMOS sensor
  • 6.3. Hybrid organic/CMOS sensor for broadcast cameras
  • 6.4. Comparing hybrid organic/CMOS sensor with backside illumination CMOS sensor
  • 6.5. Progress in CMOS global shutter using silicon technology only
  • 6.6. Fraunhofer FEP: SWIR OPD-on-CMOS sensors (I)
  • 6.7. Fraunhofer FEP: SWIR OPD-on-CMOS sensors (II)
  • 6.8. Academic research: Twisted bilayer graphene sensitive to longer wavelength IR light
  • 6.9. Technology readiness level of OPD-on-CMOS detectors by application
  • 6.10. SWOT analysis of OPD-on-CMOS image sensors
  • 6.11. Supplier overview: OPD-on-CMOS hybrid image sensors
  • 6.12. Company profiles: OPD-on-CMOS

7. HYBRID QD-ON-CMOS IMAGE SENSORS

  • 7.1. Quantum dots as optical sensor materials
  • 7.2. Lead sulphide as quantum dots
  • 7.3. Quantum dots: Choice of the material system
  • 7.4. Applications and challenges for quantum dots in image sensors
  • 7.5. QD layer advantage in image sensor (I): Increasing sensor sensitivity and gain
  • 7.6. QD-Si hybrid image sensors(II): Reducing thickness
  • 7.7. Detectivity benchmarking (I)
  • 7.8. Detectivity benchmarking (II)
  • 7.9. Hybrid QD-on-CMOS with global shutter for SWIR imaging.
  • 7.10. QD-Si hybrid image sensors: Enabling high resolution global shutter
  • 7.11. QD-Si hybrid image sensors(IV): Low power and high sensitivity to structured light detection for machine vision?
  • 7.12. How is the QD layer applied?
  • 7.13. Advantage of solution processing: ease of integration with ROIC CMOS?
  • 7.14. QD optical layer: Approaches to increase conductivity of QD films
  • 7.15. Quantum dots: Covering the range from visible to near infrared
  • 7.16. SWIR sensitivity of PbS QDs, Si, polymers, InGaAs, HgCdTe, etc...
  • 7.17. Hybrid QD-on-CMOS image sensors: Processing
    • 7.17.1. Value chain and production steps for QD-on-CMOS
    • 7.17.2. Advantage of solution processing: Ease of integration with CMOS ROICs?
    • 7.17.3. Quantum dot films: Processing challenges
    • 7.17.4. QD-on-CMOS with graphene interlayer
    • 7.17.5. Challenges for QD-Si technology for SWIR imaging
    • 7.17.6. QD-on-CMOS sensors: Ongoing technical challenges
    • 7.17.7. Technology readiness level of QD-on-CMOS detectors by application
  • 7.18. Hybrid QD-on-CMOS image sensors: Key players
    • 7.18.1. SWIR Vision Systems: Hybrid quantum dots for SWIR imaging
    • 7.18.2. SWIR Vision Sensors: First commercial QD-CMOS cameras
    • 7.18.3. IMEC: QD-on-CMOS integration examples (I)
    • 7.18.4. IMEC: QD-on-CMOS integration examples (II)
    • 7.18.5. RTI International: QD-on-CMOS integration examples
    • 7.18.6. QD-on-CMOS integration examples (ICFO continued)
    • 7.18.7. Emberion: QD-graphene SWIR sensor
    • 7.18.8. Emberion: QD-Graphene-Si broadrange SWIR sensor
    • 7.18.9. Emberion: VIS-SWIR camera with 400 to 2000 nm spectral range
    • 7.18.10. Qurv Technologies: Graphene/quantum dot image sensor company spun off from ICFO
    • 7.18.11. Academic research: QD-on-CMOS from Hanyang University (South Korea)
    • 7.18.12. Academic research: Colloidal quantum dots enable mid-IR sensing
    • 7.18.13. Academic research: Plasmonic nanocubes make a cheap SWIR camera
  • 7.19. Summary: QD-on-CMOS image sensors
    • 7.19.1. Summary: QD/OPD-on-CMOS detectors
    • 7.19.2. SWOT analysis of QD-on-CMOS image sensors
    • 7.19.3. Supplier overview: QD-on-CMOS hybrid image sensors
    • 7.19.4. Company profiles: Hybrid QD-on-CMOS image sensors

8. THIN FILM PHOTODETECTORS (ORGANIC AND PEROVSKITE)

  • 8.1. Introduction to thin film photodetectors (organic and perovskite)
  • 8.2. Organic photodetectors (OPDs)
  • 8.3. Thin film photodetectors: Advantages and disadvantages
  • 8.4. Reducing dark current to increase dynamic range
  • 8.5. Tailoring the detection wavelength to specific applications
  • 8.6. Extending OPDs to the NIR region: Use of cavities
  • 8.7. Technical challenges for manufacturing thin film photodetectors from solution
  • 8.8. Materials for thin film photodetectors
  • 8.9. Thin film organic and perovskite photodiodes (OPDs and PPDs): Applications and key players
    • 8.9.1. Applications of organic photodetectors
    • 8.9.2. OPDs for biometric security
    • 8.9.3. Spray-coated organic photodiodes for medical imaging
    • 8.9.4. ISORG: 'Fingerprint on display' with OPDs
    • 8.9.5. ISORG: Flexible OPD applications using TFT active matrix
    • 8.9.6. ISORG: First OPD production line
    • 8.9.7. Cambridge display technology: Pulse oximetry sensing with OPDs
    • 8.9.8. Holst Center: Perovskite based image sensors
    • 8.9.9. Commercial challenges for large-area OPD adoption
    • 8.9.10. Technical requirements for thin film photodetector applications
    • 8.9.11. Thin film OPD and PPD application requirements
    • 8.9.12. Application assessment for thin film OPDs and PPDs
    • 8.9.13. Technology readiness level of organic and perovskite photodetectors by applications
  • 8.10. Organic and perovskite thin film photodetectors (OPDs and PPDs): Summary
    • 8.10.1. Summary: Thin film organic and perovskite photodetectors
    • 8.10.2. SWOT analysis of large area OPD image sensors
    • 8.10.3. Supplier overview: Thin film photodetectors
    • 8.10.4. Company profiles: Organic photodiodes (OPDs)

9. HYPERSPECTRAL IMAGING

  • 9.1. Introduction to hyperspectral imaging
  • 9.2. Multiple methods to acquire a hyperspectral data-cube
  • 9.3. Contrasting device architectures for hyperspectral data acquisition (II)
  • 9.4. Line-scan (pushbroom) cameras ideal for conveyor belts and satellite images
  • 9.5. Comparison between 'push-broom' and older hyperspectral imaging methods
  • 9.6. Line-scan hyperspectral camera design
  • 9.7. Snapshot hyperspectral imaging
  • 9.8. Illumination for hyperspectral imaging
  • 9.9. Pansharpening for multi/hyper-spectral image enhancement
  • 9.10. Hyperspectral imaging as a development of multispectral imaging
  • 9.11. Trade-offs between hyperspectral and multi spectral imaging
  • 9.12. Towards broadband hyperspectral imaging
  • 9.13. Hyperspectral imaging: Applications
    • 9.13.1. Hyperspectral imaging and precision agriculture
    • 9.13.2. Hyperspectral imaging from UAVs (drones)
    • 9.13.3. Agricultural drones ecosystem develops
    • 9.13.4. Satellite imaging with hyperspectral cameras
    • 9.13.5. Historic drone investment creates demand for hyperspectral imaging
    • 9.13.6. In-line inspection with hyperspectral imaging
    • 9.13.7. Object identification with in-line hyperspectral imaging
    • 9.13.8. Sorting objects for recycling with hyperspectral imaging
    • 9.13.9. Food inspection with hyperspectral imaging
    • 9.13.10. Hyperspectral imaging for skin diagnostics
    • 9.13.11. Hyperspectral imaging application requirements
  • 9.14. Hyperspectral imaging: Key players
    • 9.14.1. Comparing hyperspectral camera manufacturers
    • 9.14.2. Specim: Market leaders in line-scan imaging
    • 9.14.3. Headwall Photonics
    • 9.14.4. Cubert: Specialists in snapshot spectral imaging
    • 9.14.5. Hyperspectral imaging wavelength ranges
    • 9.14.6. Hyperspectral wavelength range vs spectral resolution
    • 9.14.7. Hyperspectral camera parameter table
    • 9.14.8. Companies analysing and applying hyperspectral imaging
    • 9.14.9. Condi Food: Food quality monitoring with hyperspectral imaging
    • 9.14.10. Orbital Sidekick: Hyperspectral imaging from satellites
    • 9.14.11. Gamaya: Hyperspectral imaging for agricultural analysis
  • 9.15. Summary: Hyperspectral imaging
    • 9.15.1. Summary: Hyperspectral imaging
    • 9.15.2. SWOT analysis: Hyperspectral imaging
    • 9.15.3. Supplier overview: Hyperspectral imaging
    • 9.15.4. Company profiles: Hyperspectral imaging

10. EVENT-BASED VISION (ALSO KNOWN AS DYNAMIC VISION SENSING)

  • 10.1. What is event-based sensing?
  • 10.2. General event-based sensing: Pros and cons
  • 10.3. What is event-based vision? (I)
  • 10.4. What is event-based vision? (III)
  • 10.5. What does event-based vision data look like?
  • 10.6. Event-based vision: Pros and cons
  • 10.7. Event-based vision sensors enable increased dynamic range
  • 10.8. Cost of event-based vision sensors
  • 10.9. Importance of software for event-based vision
  • 10.10. Applications for event-based vision
    • 10.10.1. Promising applications for event-based vision
    • 10.10.2. Event-based vision for autonomous vehicles
    • 10.10.3. Event-based vision for unmanned aerial vehicle (UAV) collision avoidance
    • 10.10.4. Occupant tracking (fall detection) in smart buildings
    • 10.10.5. Event-based vision for augmented/virtual reality
    • 10.10.6. Event-based vision for optical alignment / beam profiling
    • 10.10.7. Event-based vision application requirements
    • 10.10.8. Technology readiness level of event-based vision by application
  • 10.11. Event-based vision: Key players
    • 10.11.1. Event-based vision: Company landscape
    • 10.11.2. IniVation: Aiming for organic growth
    • 10.11.3. Prophesee: Well-funded and targeting autonomous mobility
    • 10.11.4. CelePixel: Focussing on hardware
    • 10.11.5. Insightness: Vertically integrated model targeting UAV collision avoidance
  • 10.12. Summary: Event-based vision
    • 10.12.1. Summary: Event-based vision
    • 10.12.2. SWOT analysis: Event-based vision
    • 10.12.3. Supplier overview: Event-based vision
    • 10.12.4. Company profiles: Event-based vision

11. WAVEFRONT IMAGING (ALSO KNOWN AS PHASE-BASED IMAGING)

  • 11.1. Motivation for wavefront imaging
  • 11.2. Conventional Shack-Hartmann wavefront sensors
  • 11.3. Phasics: Innovators in wavefront imaging
  • 11.4. Wooptix: Light-field and wavefront imaging
  • 11.5. Summary: Wavefront imaging
  • 11.6. SWOT analysis: Wavefront imaging
  • 11.7. Supplier overview: Wavefront imaging sensors
  • 11.8. Company profiles: Wavefront imaging

12. FLEXIBLE AND DIRECT X-RAY IMAGE SENSORS

  • 12.1. Conventional x-ray sensing
  • 12.2. Flexible image sensors based on amorphous-Si
  • 12.3. Spray-coated organic photodiodes for medical imaging
  • 12.4. Direct x-ray sensing with organic semiconductors
  • 12.5. Holst Center develop perovskite-based x-ray sensors (I)
  • 12.6. Holst Center develop perovskite-based x-ray sensors (II)
  • 12.7. Technology readiness level of flexible and direct x-ray sensors
  • 12.8. Summary: Flexible and direct x-ray image sensors
  • 12.9. SWOT analysis: Flexible and direct x-ray image sensors
  • 12.10. Supplier overview: Flexible x-ray image sensors
  • 12.11. Company profiles: Flexible and direct x-ray image sensors