Please use this identifier to cite or link to this item: http://hdl.handle.net/11455/51428
Title: A Study of Efficient Texture Synthesis Algorithms
Author: 王宗銘
Keywords: Information Engineering--Hardware Engineering
Basic Research
Texture Synthesis
Structural Textures
Edge Synthesis
Content Synthesis
Rendering
Computer Graphics
Abstract: Texture mapping is a popular technique for enriching the visual appearance of a scene. Texture synthesis algorithms, which aim to synthesize a large texture image from a small source texture, have been under intensive research. In this research, we intend to present algorithms and techniques that overcome four drawbacks encountered when synthesizing structural textures with conventional patch-based texture synthesis algorithms. We will partition synthesis into an "edge synthesis" step and a "content synthesis" step, replacing the conventional single-step process with a two-step process and thereby avoiding faulty results. In the "edge synthesis" step, we will extract edge information from the source texture using common edge detection algorithms and synthesize an "edge texture" image containing edge information only. In the "content synthesis" step, we will use the contents of the source texture to synthesize a new texture that contains content but no edges. The results of these two steps are then combined. Finally, we will apply a restoration process to remove interrupted object boundaries, so that the final synthesized textures have an appealing appearance. We will also propose a new texture-mixing algorithm: during synthesis, contents from various textures can be selected as the user wishes. By combining the "content synthesis" and "edge synthesis" steps, textures with versatile looks can be generated. In addition, we will provide a control map containing different diagrams to guide the combination, so that the features of the mixed textures follow the control map, producing interesting and visually attractive results. We intend to demonstrate that our method is superior to previous approaches, producing satisfactory results with no interrupted object boundaries in the final synthesized textures.
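The two-step pipeline described above can be sketched as follows. This is a minimal toy illustration in pure Python; the function names, the simple gradient edge detector, and the single-row tiling scheme are our own assumptions for illustration, not the project's actual implementation.

```python
import random

def edge_map(img, thresh=32):
    # Binary edge map from horizontal/vertical intensity differences,
    # a stand-in for the "common edge detection algorithms" mentioned.
    h, w = len(img), len(img[0])
    return [[1 if (x > 0 and abs(img[y][x] - img[y][x - 1]) >= thresh) or
                  (y > 0 and abs(img[y][x] - img[y - 1][x]) >= thresh)
             else 0
             for x in range(w)]
            for y in range(h)]

def synthesize_row(source, tiles_x, size=4, seed=0):
    # Step 1 ("edge synthesis"): choose each patch by matching edge maps
    # only.  Step 2 ("content synthesis"): copy pixel content from the
    # chosen source patch.  Real patch-based methods also blend seams and
    # run a restoration pass, which this sketch omits.
    rng = random.Random(seed)
    edges = edge_map(source)
    h, w = len(source), len(source[0])
    corners = [(y, x) for y in range(h - size + 1)
                      for x in range(w - size + 1)]
    out, prev_edge_col = [], None
    for _ in range(tiles_x):
        if prev_edge_col is None:
            y, x = rng.choice(corners)   # free choice for the first tile
        else:
            # smallest SSD between a candidate's left edge column and the
            # previous tile's right edge column
            y, x = min(corners, key=lambda c: sum(
                (edges[c[0] + i][c[1]] - prev_edge_col[i]) ** 2
                for i in range(size)))
        out.append([[source[y + i][x + j] for j in range(size)]
                    for i in range(size)])
        prev_edge_col = [edges[y + i][x + size - 1] for i in range(size)]
    return out
```

Matching on the binary edge map rather than on raw colours is what lets structure, not colour, drive patch selection, which is the point of separating the two steps.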
In addition, the proposed texture-mixing algorithm will allow us to synthesize versatile textures. Furthermore, we will derive an output texture exhibiting as many characteristics of the source textures as possible. Our method has the potential to significantly reduce synthesis time: our theoretical analysis demonstrates that at most 2/3 of the synthesis time can be saved. The anticipated result of this research is a set of efficient approaches that solve four potential problems encountered in structural texture synthesis.
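The abstract does not spell out the 2/3 analysis. One plausible back-of-the-envelope reading, which is our assumption and not the authors' derivation, is that matching on a single-channel edge map instead of three colour channels removes two thirds of the per-pixel comparison work:

```python
# Toy cost model (an illustrative assumption, not the project's analysis):
# per-pixel matching cost is proportional to the number of channels compared.
channels_rgb = 3    # conventional matching compares R, G, B values
channels_edge = 1   # edge-based matching compares one binary channel
reduction = 1 - channels_edge / channels_rgb
print(f"comparison work removed: {reduction:.0%}")  # → 67%
```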
As texture mapping is widely used and demands high-resolution textures, texture synthesis has become an important research topic. However, the patch-based texture synthesis algorithms in common use still suffer from four drawbacks: (1) broken boundaries; (2) lack of diversity in the synthesized textures; (3) somewhat long synthesis time; and (4) inconsistent boundaries. This research proposes concrete remedies for these shortcomings. To address broken boundaries, we plan to split the traditional single-step synthesis into two independent steps: edge synthesis and content synthesis. We will first use the edge information to synthesize an output containing edge information only; we will then perform content synthesis on this edge-only output to produce a texture with full content. Finally, we will repair this result to eliminate broken boundaries. To address the lack of diversity, during content synthesis we plan to mix the contents of multiple source textures so that the output varies; this mixing gives the synthesized texture the characteristics of several sources, diversifying the result. To address the long synthesis time, because we use only the edge information in the source texture, the amount of source data is reduced, which greatly shrinks the data structures built during synthesis, shortens matching time, and speeds up synthesis. To address inconsistent boundaries, we plan to introduce two-stage synthesis, avoiding the situation in conventional synthesis where colour differences during matching lead to unsuitable patches being selected and degrade the final quality; structural textures thus avoid inconsistent edges, improving visual coherence and avoiding visually jarring artifacts. Two-stage synthesis also reduces the number of repeated synthesis passes, improving effectiveness. We hope that, with this research, no broken boundaries will be visible in the output images of structural texture synthesis. Moreover, with the proposed texture-mixing method, the output texture will not only resemble the original source texture but also carry the characteristics of other textures, making the synthesized textures more diverse. As for synthesis time, our theoretical analysis suggests that roughly 2/3 of the synthesis time can be saved. This project proposes algorithms to effectively remedy the four drawbacks of patch-based texture synthesis listed above; if completed successfully, we believe this research will make a concrete contribution to the topic of texture synthesis.
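The control-map-guided mixing could look something like the following toy sketch in pure Python. The `mix` function and its tile-copy scheme are hypothetical illustrations; the actual algorithm would also enforce seam and edge consistency between neighbouring patches, which this sketch omits.

```python
def mix(sources, control, size=4):
    # For each cell of the control map, copy a size x size patch from the
    # source texture that the map selects, so the mixed output follows the
    # map's layout.  `sources` is a list of grayscale grids; `control` is a
    # grid of indices into `sources`.
    rows = []
    for cy, crow in enumerate(control):
        band = [[] for _ in range(size)]        # size output rows per cell row
        for cx, which in enumerate(crow):
            src = sources[which]
            # pick a source patch position that tiles with the output cell
            y = (cy * size) % (len(src) - size + 1)
            x = (cx * size) % (len(src[0]) - size + 1)
            for i in range(size):
                band[i].extend(src[y + i][x:x + size])
        rows.extend(band)
    return rows
```

With a checkerboard control map and two source textures, the mixed result alternates between the two sources in the same checkerboard layout, which is the sense in which "features of the mixed textures follow the control map".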
URI: http://hdl.handle.net/11455/51428
Other identifier: NSC93-2213-E005-018
Article link: http://grbsearch.stpi.narl.org.tw/GRB/result.jsp?id=1005796&plan_no=NSC93-2213-E005-018&plan_year=93&projkey=PB9308-1575&target=plan&highStr=*&check=0&pnchDesc=%E6%9C%89%E6%95%88%E7%9A%84%E6%9D%90%E8%B3%AA%E5%90%88%E6%88%90%E6%BC%94%E7%AE%97%E6%B3%95%E4%B9%8B%E7%A0%94%E7%A9%B6
Appears in Collections: Department of Computer Science and Engineering (資訊科學與工程學系所)

Files in this item:

For the full text, please visit the Airiti Library (華藝線上圖書館).

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.