SharpDX: BitmapFrameEncode.WriteSource takes longer on some images

I replaced some dodgy GDI+ routines with SharpDX. The code loads a bi-level (two-colour) TIFF image from a Stream, renders text onto it, and then saves it back to a Stream in TIFF format.
But the SharpDX code takes considerably longer to do the same thing, and I'd like to know whether I'm doing something wrong.
As you can see from the sample below, I have two different functions:

- BuildImageFromExistingImage
- SaveRenderedImage
```csharp
using System;
using System.Diagnostics;
using System.IO;
using SharpDX;
using SharpDX.Direct2D1;
using SharpDX.DirectWrite;
using SharpDX.DXGI;
using SharpDX.WIC;
using Factory = SharpDX.Direct2D1.Factory;
using FactoryType = SharpDX.Direct2D1.FactoryType;
using PixelFormat = SharpDX.WIC.PixelFormat;
using WicBitmap = SharpDX.WIC.Bitmap;

public class ImageCreator2
{
    private static ImagingFactory _wicFactory;
    private static Factory _d2DFactory;
    private static SharpDX.DirectWrite.Factory _dwFactory;
    private int _imageWidth = 1000, _imageHeight = 500;
    private readonly int _imageDpi = 96;

    public ImageCreator2()
    {
        _wicFactory = new ImagingFactory();
        _d2DFactory = new Factory(FactoryType.SingleThreaded);
        _dwFactory = new SharpDX.DirectWrite.Factory(SharpDX.DirectWrite.FactoryType.Shared);
    }

    private void RenderImage(WicRenderTarget renderTarget)
    {
        using (var blackBrush = new SolidColorBrush(renderTarget, Color4.Black))
        using (var tformat = new TextFormat(_dwFactory, "Arial", 30f))
        using (var tformat2 = new TextFormat(_dwFactory, "Arial", 11f))
        {
            renderTarget.BeginDraw();
            renderTarget.Clear(Color.White);
            renderTarget.DrawText("TEST", tformat, new RectangleF(300f, 30f, 100f, 20f), blackBrush);
            renderTarget.DrawText("MORE TEST", tformat2, new RectangleF(30f, 150f, 100f, 20f), blackBrush);
            renderTarget.DrawLine(new Vector2(0f, 25f), new Vector2(500f, 25f), blackBrush);
            renderTarget.DrawLine(new Vector2(0f, 210f), new Vector2(500f, 210f), blackBrush);
            renderTarget.EndDraw();
        }
    }

    public void BuildImageFromExistingImage(byte[] image, Stream systemStream)
    {
        using (var checkStream = new MemoryStream(image))
        using (var inDecoder = new BitmapDecoder(_wicFactory, checkStream, DecodeOptions.CacheOnDemand))
        using (var converter = new FormatConverter(_wicFactory))
        {
            if (inDecoder.FrameCount > 0)
            {
                using (var frame = inDecoder.GetFrame(0))
                {
                    converter.Initialize(frame, PixelFormat.Format32bppPRGBA,
                        BitmapDitherType.None, null, 0.0f, BitmapPaletteType.MedianCut);
                    _imageWidth = converter.Size.Width;
                    _imageHeight = converter.Size.Height;
                }
            }
            else
            {
                throw new Exception();
            }

            var renderProperties = new RenderTargetProperties(
                RenderTargetType.Software,
                new SharpDX.Direct2D1.PixelFormat(Format.Unknown, AlphaMode.Unknown),
                _imageDpi, _imageDpi,
                RenderTargetUsage.None,
                FeatureLevel.Level_DEFAULT);

            using (var wicBitmap = new WicBitmap(
                _wicFactory, converter, BitmapCreateCacheOption.CacheOnDemand))
            using (var renderTarget = new WicRenderTarget(_d2DFactory, wicBitmap, renderProperties))
            {
                RenderImage(renderTarget);

                using (var encoder = new BitmapEncoder(_wicFactory, ContainerFormatGuids.Tiff))
                {
                    encoder.Initialize(systemStream);
                    using (var bitmapFrameEncode = new BitmapFrameEncode(encoder))
                    {
                        var pixFormat = PixelFormat.Format32bppPRGBA;
                        bitmapFrameEncode.Initialize();
                        bitmapFrameEncode.SetSize(_imageWidth, _imageHeight);
                        bitmapFrameEncode.SetResolution(96, 96);
                        bitmapFrameEncode.SetPixelFormat(ref pixFormat);

                        // This takes 30-40ms per image.
                        var watch = new Stopwatch();
                        try
                        {
                            watch.Start();
                            bitmapFrameEncode.WriteSource(wicBitmap);
                        }
                        finally
                        {
                            watch.Stop();
                        }
                        Console.WriteLine("Saved real image in {0} ms.", watch.Elapsed.TotalMilliseconds);
                        bitmapFrameEncode.Commit();
                    }
                    encoder.Commit();
                }
            }
        }
    }

    public void SaveRenderedImage(Stream systemStream)
    {
        var renderProperties = new RenderTargetProperties(
            RenderTargetType.Default,
            new SharpDX.Direct2D1.PixelFormat(Format.Unknown, AlphaMode.Unknown),
            _imageDpi, _imageDpi,
            RenderTargetUsage.None,
            FeatureLevel.Level_DEFAULT);

        using (var wicBitmap = new WicBitmap(
            _wicFactory, _imageWidth, _imageHeight,
            PixelFormat.Format32bppBGR, BitmapCreateCacheOption.CacheOnDemand))
        using (var renderTarget = new WicRenderTarget(_d2DFactory, wicBitmap, renderProperties))
        {
            RenderImage(renderTarget);

            using (var encoder = new BitmapEncoder(_wicFactory, ContainerFormatGuids.Tiff))
            {
                encoder.Initialize(systemStream);
                using (var bitmapFrameEncode = new BitmapFrameEncode(encoder))
                {
                    bitmapFrameEncode.Initialize();
                    bitmapFrameEncode.SetSize(_imageWidth, _imageHeight);
                    bitmapFrameEncode.SetResolution(_imageDpi, _imageDpi);

                    // This takes 8-10ms per image.
                    var watch = new Stopwatch();
                    try
                    {
                        watch.Start();
                        bitmapFrameEncode.WriteSource(wicBitmap);
                    }
                    finally
                    {
                        watch.Stop();
                    }
                    Console.WriteLine("Saved generated image in {0} ms.", watch.Elapsed.TotalMilliseconds);
                    bitmapFrameEncode.Commit();
                }
                encoder.Commit();
            }
        }
    }
}
```
They are mostly identical and do roughly the same thing, except that the first (BuildImageFromExistingImage) takes an existing 1000x500 bi-level TIFF image to use as its base image, while the second (SaveRenderedImage) creates a similarly sized WIC bitmap from scratch.

The function that takes the existing image needs about 30-40ms to execute, most of which (~30ms) is spent in BitmapFrameEncode.WriteSource. This function is the equivalent of the GDI+ code it replaced.

The one that creates the WicBitmap from scratch takes 8-10ms to execute, with BitmapFrameEncode.WriteSource taking no significant time, and runs in roughly the same time as the GDI+ function it replaces. The only difference is that this function doesn't load a pre-existing image, which is what I need.
Why is BitmapFrameEncode.WriteSource (which appears to be a thin wrapper around IWICBitmapFrameEncode) so slow in BuildImageFromExistingImage compared to SaveRenderedImage?

My guess is that BuildImageFromExistingImage is slower because it performs an extra conversion on the input image (via the FormatConverter) to turn it into a pixel format that D2D can handle, and that the cost of that conversion is deferred until BitmapFrameEncode.WriteSource is called.
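If the conversion really is deferred, one way to test that guess (a sketch, based on the BuildImageFromExistingImage code above) is to force the converter's output to be materialised when the WIC bitmap is created, by passing BitmapCreateCacheOption.CacheOnLoad instead of CacheOnDemand:

```csharp
// Sketch: CacheOnLoad should decode and convert the source pixels when
// the bitmap is constructed, whereas CacheOnDemand defers that work.
// If the guess is right, the conversion cost moves here, out of WriteSource.
using (var wicBitmap = new WicBitmap(
    _wicFactory, converter, BitmapCreateCacheOption.CacheOnLoad))
using (var renderTarget = new WicRenderTarget(_d2DFactory, wicBitmap, renderProperties))
{
    RenderImage(renderTarget);
    // ... encode as before; any time still spent in WriteSource would then
    // be the TIFF encode itself rather than the format conversion.
}
```

This wouldn't be expected to reduce the total time, but it would show where the time is actually going.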
Is there something I'm doing wrong? Or is WIC (Windows Imaging Component) simply slow compared to GDI+-based calls?

Ideally, I need the first case (BuildImageFromExistingImage) to be as fast as the GDI+ code it replaces, and I'd expect it should be possible to make it as fast as, if not faster than, that GDI+ code.
Thanks for the tips and the explanation; I had a feeling 'WriteSource()' is where everything actually happens. 'CacheOnLoad' and 'CacheOnDemand' make no noticeable difference, and neither do 'Format32bppPBGRA' and 'Format32bppPRGBA'. I'll try 'BitmapPaletteType.Custom' and see what results I get. –
Does 'CacheOnLoad' change the timings at all? I wouldn't expect it to reduce the overall execution time, but it should have some effect on the 'WriteSource' call. Also, you may want to check the value of 'pixFormat' after calling 'bitmapFrameEncode.SetPixelFormat(ref pixFormat)'. If the encoder doesn't support the input image's format, a conversion will happen during the save, and that call tells you which format it will convert to. That would show you whether the save itself is incurring a conversion cost. – saucecontrol
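The check suggested in the comment above could look something like this (a sketch, inserted right after the existing SetPixelFormat call in BuildImageFromExistingImage):

```csharp
var pixFormat = PixelFormat.Format32bppPRGBA;
bitmapFrameEncode.SetPixelFormat(ref pixFormat);

// SetPixelFormat rewrites pixFormat to the closest format the encoder
// actually supports. If it differs from what we asked for, WIC will
// convert during the save, which would show up inside WriteSource/Commit.
if (pixFormat != PixelFormat.Format32bppPRGBA)
    Console.WriteLine("Encoder will convert to: {0}", pixFormat);
```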